WO2022095752A1 - Frame demultiplexing method, electronic device and storage medium - Google Patents


Info

Publication number
WO2022095752A1
PCT/CN2021/126345 · CN2021126345W
Authority
WO
WIPO (PCT)
Prior art keywords
frame
multimedia
electronic device
frames
target file
Prior art date
Application number
PCT/CN2021/126345
Other languages
English (en)
Chinese (zh)
Inventor
侯朋飞
苏多铎
杜晓
Original Assignee
华为技术有限公司 (Huawei Technologies Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 华为技术有限公司 (Huawei Technologies Co., Ltd.)
Publication of WO2022095752A1

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N 21/434 Disassembling of a multiplex stream, e.g. demultiplexing audio and video streams, extraction of additional data from a video stream; Remultiplexing of multiplex streams; Extraction or processing of SI; Disassembling of packetised elementary stream
    • H04N 21/439 Processing of audio elementary streams

Definitions

  • the embodiments of the present application relate to the field of multimedia technologies, and in particular, to a frame demultiplexing method, an electronic device, and a storage medium.
  • One of the core functions of the multimedia distribution technology is to demultiplex the multimedia data from the source device, and send the demultiplexed audio data and video data to the target device.
  • Embodiments of the present application provide a frame demultiplexing method, an electronic device, and a storage medium, so as to provide a method for computing demultiplexing offsets of audio frames and video frames in batches and improve the efficiency of reading audio frames and video frames. Therefore, the number of audio frames and video frames transmitted at one time can be increased, thereby improving the transmission efficiency of audio frames and video frames.
  • an embodiment of the present application provides a frame demultiplexing method, including:
  • the target file contains multiple multimedia frames; specifically, the target file may be a multimedia file, for example, a piece of video.
  • the multimedia frames may include audio frames and video frames.
  • the input time may include a time input by the user, for example, a time designated by the user when fast-forwarding or fast-rewinding.
  • the input time may also be any time when the above-mentioned multimedia video is played in sequence, which is not particularly limited in this embodiment of the present application.
  • the optimal read amount and the first multimedia frame are determined based on the input moment; specifically, the optimal read amount may be the number of multimedia frames that can be read this time.
  • the first multimedia frame corresponds to the input moment.
  • the plurality of second multimedia frames may be a plurality of data frames including the above-mentioned first multimedia frames.
  • for example, if the optimal read amount is 4 and the frame number of the first multimedia frame is 1001, then 4 frames are read starting from frame 1001, that is, frames 1001, 1002, 1003 and 1004.
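  • As an illustration only (the application does not provide code), the consecutive-frame read described above can be sketched in Python; the function name is hypothetical:

```python
def frames_to_read(first_frame: int, optimal_read: int) -> list[int]:
    # Frame numbers covered by one read: `optimal_read` consecutive
    # frames starting from the first multimedia frame.
    return list(range(first_frame, first_frame + optimal_read))

print(frames_to_read(1001, 4))  # [1001, 1002, 1003, 1004]
```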
  • the plurality of second multimedia frames are sent to the second electronic device.
  • the multiple multimedia frames include multiple video frames and multiple audio frames, and reading, based on the optimal read amount and starting from the first multimedia frame, multiple second multimedia frames in the target file includes:
  • the multiple second multimedia frames may be multiple second video frames.
  • the multiple second multimedia frames may be multiple second audio frames.
  • determining the optimal read amount based on the input moment includes:
  • the preset pending reading number may be a tentative optimal reading number.
  • the preset pending reading number is set as the optimal reading amount; exemplarily, assume the preset pending reading number is 4: if the number of multimedia frames from the input time to the end of the target file is greater than or equal to 4, the optimal reading amount can be set to 4, that is, 4 frames are read this time.
  • the preset pending read number is determined by the number of bits of the vector register and the data type of the multimedia frame.
  • the vector register may be provided by a single instruction multiple data (SIMD) instruction set, for example, NEON.
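  • For illustration, the relationship between the register width, the data type, and the read count described above can be sketched as follows. This is a sketch of the selection logic only, not the NEON implementation, and the function names are hypothetical:

```python
def pending_read_count(register_bits: int, dtype_bytes: int) -> int:
    # Tentative read count: how many frame-size elements fit in one
    # vector register (e.g. a 128-bit register holding 32-bit values -> 4).
    return register_bits // (8 * dtype_bytes)

def optimal_read_amount(remaining_frames: int, pending: int) -> int:
    # Read `pending` frames at once if enough frames remain before the
    # end of the file; otherwise fall back to reading one frame at a time.
    return pending if remaining_frames >= pending else 1

# 128-bit vector register, 4-byte (32-bit) frame sizes -> pending = 4
pending = pending_read_count(128, 4)
print(pending)                            # 4
print(optimal_read_amount(10, pending))   # 4
print(optimal_read_amount(3, pending))    # 1
```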
  • reading multiple second multimedia frames in the target file includes:
  • according to the frame offset vector, a plurality of second multimedia frames in the target file are read starting from the first multimedia frame.
  • sending the plurality of second multimedia frames to the second electronic device includes:
  • the plurality of second multimedia frames and the frame size of each second multimedia frame are sent to the second electronic device. Therefore, it is convenient for the target device to separate each second multimedia frame based on the frame size of each second multimedia frame.
  • an embodiment of the present application provides a frame demultiplexing apparatus, including:
  • the first acquisition module is used to acquire a target file, and the target file contains a plurality of multimedia frames;
  • the second obtaining module is used to obtain the input time
  • a calculation module for determining the optimal read amount and the first multimedia frame based on the input moment
  • a reading module configured to read a plurality of second multimedia frames in the target file based on the optimal read amount, starting from the first multimedia frame, wherein the second multimedia frames include the first multimedia frame;
  • the sending module is used for sending a plurality of second multimedia frames to the second electronic device.
  • the multiple multimedia frames include multiple video frames and multiple audio frames
  • the above-mentioned reading module is further configured to read, based on the optimal read amount and starting from the first video frame, a plurality of second video frames in the target file;
  • the above calculation module includes:
  • an acquisition unit used to acquire the preset pending read count
  • a statistical unit used to count the frame numbers of all multimedia frames from the input moment to the end moment of the target file
  • the determining unit is used to set the preset pending reading number as the optimal reading amount if the number of multimedia frames is greater than or equal to the preset pending reading number; if the number of multimedia frames is less than the preset pending reading number, the optimal reading amount is set to 1.
  • the above-mentioned preset pending read number is determined by the number of bits of the vector register and the data type of the multimedia frame.
  • the above-mentioned reading module includes:
  • a computing unit for determining the block offset based on the input moment
  • a retrieval unit for obtaining the frame size of each second multimedia frame based on the optimal read amount
  • a building unit for building a frame matrix based on the block offset and the frame size
  • an operation unit used to perform a vectorization operation on the frame matrix to obtain a frame offset vector
  • the reading unit is configured to read, starting from the first multimedia frame, a plurality of second multimedia frames in the target file according to the frame offset vector.
  • the above-mentioned sending module is further configured to send the plurality of second multimedia frames and the frame size of each second multimedia frame to the second electronic device.
  • an embodiment of the present application provides a first electronic device, including:
  • a memory, where the memory is used to store computer program code, and the computer program code includes instructions.
  • when the first electronic device reads the instructions from the memory, the first electronic device performs the following steps:
  • the target file contains multiple multimedia frames
  • based on the optimal read amount, a plurality of second multimedia frames in the target file are read starting from the first multimedia frame, wherein the second multimedia frames include the first multimedia frame;
  • the plurality of second multimedia frames are sent to the second electronic device.
  • the plurality of multimedia frames include a plurality of video frames and a plurality of audio frames, and the step of causing the above-mentioned first electronic device to read, based on the optimal read amount and starting from the first multimedia frame, a plurality of second multimedia frames in the target file includes:
  • the step of causing the above-mentioned first electronic device to determine the optimal reading amount based on the input time includes:
  • the preset pending reading number is set as the optimal reading amount
  • the optimal reading amount is set to 1.
  • the above-mentioned preset pending read number is determined by the number of bits of the vector register and the data type of the multimedia frame.
  • when the instructions are executed by the above-mentioned first electronic device, the step of causing the first electronic device to read, based on the optimal read amount and starting from the first multimedia frame, a plurality of second multimedia frames in the target file includes:
  • according to the frame offset vector, a plurality of second multimedia frames in the target file are read starting from the first multimedia frame.
  • causing the above-mentioned first electronic device to execute the step of sending a plurality of second multimedia frames to the second electronic device includes:
  • the plurality of second multimedia frames and the frame size of each second multimedia frame are sent to the second electronic device.
  • an embodiment of the present application provides a computer-readable storage medium, where a computer program is stored in the computer-readable storage medium, and when the computer program runs on a computer, the computer is caused to execute the method described in the first aspect.
  • an embodiment of the present application provides a computer program, which is used to execute the method described in the first aspect when the computer program is executed by a computer.
  • the program in the fifth aspect may be stored in whole or in part on a storage medium packaged with the processor, or may be stored in whole or in part in a memory not packaged with the processor.
  • FIG. 1 is a schematic diagram of an application scenario provided by an embodiment of the present application.
  • FIG. 2 is a schematic flowchart of a frame demultiplexing method provided by an embodiment of the present application
  • FIG. 3 is a schematic diagram of a frame matrix provided by an embodiment of the present application.
  • FIG. 4 is a schematic diagram of a frame offset vector provided by an embodiment of the present application.
  • FIG. 5 is a schematic structural diagram of a frame demultiplexing apparatus provided by an embodiment of the present application.
  • FIG. 6 is a schematic structural diagram of an electronic device provided by an embodiment of the present application.
  • first and second are only used for descriptive purposes, and should not be construed as indicating or implying relative importance or implicitly indicating the number of indicated technical features.
  • a feature defined as “first” or “second” may expressly or implicitly include one or more of that feature.
  • plural means two or more.
  • One of the core functions of the multimedia distribution technology is to demultiplex the multimedia data from the source device, and send the demultiplexed audio data and video data to the target device.
  • Fig. 1 is a flow chart of the source device sending multimedia data to the target device.
  • the multimedia file can be demultiplexed in the source device, thereby obtaining audio data and video data respectively, and the audio data and video data can then be read. Since only one frame of data can be read at a time, only a single audio frame or a single video frame can be obtained each time; for example, each time audio data is read, only one frame of audio data can be obtained, and each time video data is read, only one frame of video data can be obtained.
  • the source device sends the video frame and the audio frame to the target device through the transmission module, wherein only one frame of data can be sent at a time, for example, only one frame of audio data or one frame of video data can be sent at a time.
  • the source device can only demultiplex one frame of data (for example, one frame of audio or one frame of video) at a time. Therefore, the source device needs considerable time to send enough audio data and video data to the target device for playback, which introduces delay into the playback of audio data and video data on the target device and degrades the user's audio-visual experience.
  • the source device can only send one frame of data at a time when sending multimedia data, and the data transmission efficiency is low.
  • an embodiment of the present application proposes a frame demultiplexing method, whereby multiple multimedia data frames can be read in the source device at one time, improving data reading efficiency, and multiple multimedia data frames can be sent by the source device at one time, improving data transmission efficiency, thereby reducing the playback delay of the multimedia data on the target device and improving the user's audio-visual experience.
  • FIG. 2 is a schematic flowchart of an embodiment of a frame demultiplexing method provided by an embodiment of the present application, including:
  • Step 101 Acquire a target multimedia file in a source device.
  • the source device may be a first electronic device that plays multimedia files, such as a mobile phone, tablet, computer, etc., or other types of first electronic devices, which are not specifically limited in this application.
  • the multimedia file may include frames of multimedia data, and the frames of multimedia data may include frames of audio data and frames of video data. It can be understood that, in the embodiment of the present application, a multimedia file in the MP4 format is used as an example, but the embodiment of the present application does not limit the format of the multimedia file.
  • the storage of the audio data frames and video data frames in the MP4 format in the multimedia file may be implemented by a table index.
  • the video data frame may include a timing-to-frame index table, a frame-to-block index table, a block offset index table, and a frame size index table.
  • Table 1 shows the timing to frame index table.
  • serial number | number of frames | frame duration
    1 | 1 | 42
    2 | 1 | 41
    3 | 2 | 42
    4 | 1 | 41
    ... | ... | ...
  • each block can contain one or more frames.
  • the length of each block can vary, as can the lengths of the frames within the block.
  • serial number | block number | frames per block
    1 | 1 to 28 | 13
    2 | 29 | 12
    3 | 30 to 57 | 13
    4 | 58 | 12
    ... | ... | ...
  • the frame number of the specified frame can be used to search and obtain the corresponding block containing the specified frame, so that the specified frame can be found through the block.
  • for example, assume the specified frame is the 500th frame.
  • the 500th frame is greater than the total number of frames in the first 2 sets of blocks, but less than the total number of frames in the first 3 sets; therefore, the 500th frame is in a block of the set with serial number 3.
  • within that set, the 500th frame is greater than the total number of frames of the first 38 blocks, but less than the total number of frames of the first 39 blocks, so the 500th frame is in the block with block number 39.
  • 500 = 28×13 + 1×12 + 9×13 + 7; therefore, the 500th frame is the 7th frame in the block whose block number is 39.
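  • The frame-to-block lookup worked through above can be sketched in Python. The run-length table layout (first block, last block, frames per block) and the function name are illustrative assumptions mirroring Table 2:

```python
def locate_frame(frame_no, runs):
    # Find (block_number, index_within_block) for a 1-based frame number.
    # `runs` is a list of (first_block, last_block, frames_per_block) rows.
    seen = 0  # frames covered by the runs already scanned
    for first, last, per_block in runs:
        run_frames = (last - first + 1) * per_block
        if frame_no <= seen + run_frames:
            offset = frame_no - seen - 1          # 0-based within this run
            block = first + offset // per_block
            return block, offset % per_block + 1  # 1-based index in block
        seen += run_frames
    raise ValueError("frame number beyond table")

runs = [(1, 28, 13), (29, 29, 12), (30, 57, 13)]
print(locate_frame(500, runs))  # (39, 7): 7th frame in block 39
```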
  • Table 3 shows the block offset index table. Among them, the block offset is used to identify the position of each block in the multimedia file.
  • the block offset corresponding to the specified block can be obtained by searching by the sequence number of the specified block.
  • the frame size index table is shown in Table 4.
  • the frame size corresponding to the specified frame can be obtained by searching the frame number of the specified frame.
  • the corresponding block offset and frame size can be obtained by looking up Table 1 to Table 4, and then the corresponding frame offset can be obtained, which makes it possible to implement fast forward and fast rewind when playing multimedia files.
  • the block number (eg, block 39) corresponding to the 500th frame can be obtained through Table 2, so that the corresponding block offset can be obtained by searching through Table 3.
  • the frame number of the first frame in block 39 is 494
  • the frame number of the seventh frame in block 39 is 500. Therefore, the sizes of frames 494 to 499 can be obtained by searching Table 4.
  • the offsets of frames 494 to 500 can then be obtained respectively.
  • the offset of block 39 gives the offset of frame 494.
  • the offset of frame 495 can be obtained from the offset of block 39 and the frame size of frame 494.
  • the offset of frame 496 can be obtained from the offset of block 39, the frame size of frame 494, the frame size of frame 495, and so on.
  • the offset from frame 494 to frame 500 may be the playback address of the data frame, so that fast forward and fast rewind can be implemented.
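  • The chain of offsets derived above (each frame starts at the block offset plus the sizes of the frames that precede it) can be sketched as follows; the block offset and frame sizes here are hypothetical values, not data from the application:

```python
def frame_offsets(block_offset, frame_sizes):
    # Offsets of consecutive frames in one block: each frame starts
    # where the previous one ended, beginning at the block offset.
    offsets, pos = [], block_offset
    for size in frame_sizes:
        offsets.append(pos)
        pos += size
    return offsets

# hypothetical block offset and sizes for frames 494..500
sizes = [1200, 980, 1100, 1050, 990, 1210, 1020]
print(frame_offsets(100000, sizes))
# [100000, 101200, 102180, 103280, 104330, 105320, 106530]
```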
  • Step 102 Acquire the input time.
  • the input time can be any time input by the user.
  • the user can click any time on the playback progress bar of the multimedia file to achieve fast forward or fast rewind; the input time can also be a preset time, which is not specially restricted in this embodiment of the present application.
  • Step 103 Determine the current optimum reading amount based on the input moment.
  • the optimal read amount is used to identify the number of multimedia data frames that can be read this time; for example, n audio data frames or n video data frames can be read at a time, where n is the optimal read amount.
  • the read base m may be determined by the number of bits of the register and the data type of the multimedia data frame. Wherein, the read base m is used to identify the number of reads in units of blocks.
  • when the NEON instruction set is adopted: NEON is a 128-bit single instruction multiple data (Single Instruction Multiple Data, SIMD) instruction set of the ARM Cortex-A series processors, that is, NEON provides 128-bit vector registers.
  • the read base m can also be a different value, for example, 2, 8, 16, etc.; this embodiment of the present application does not make any special limitation on this.
  • the current optimal reading amount can be determined according to the reading base m.
  • Step 104 Determine the block offset based on the input moment.
  • the block offset can also be determined according to the input time.
  • specifically, Table 1 can be queried according to the input time to determine the frame number corresponding to the input time; then Table 2 can be queried according to the frame number to determine the block number corresponding to the input time; and by querying Table 3 according to the block number, the block offset can be determined.
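  • As a sketch of the first lookup in this chain (input time to frame number), assume the timing-to-frame table has been flattened to one duration per frame; the function name, durations, and time units are illustrative assumptions:

```python
import bisect

def frame_for_time(t, durations):
    # Map an input time to a 1-based frame number by accumulating
    # per-frame durations into a list of frame start times.
    starts, acc = [], 0
    for d in durations:
        starts.append(acc)
        acc += d
    # index of the last frame whose start time is <= t (1-based)
    return bisect.bisect_right(starts, t)

durations = [42, 41, 42, 42, 41]   # hypothetical per-frame durations
print(frame_for_time(0, durations))    # 1
print(frame_for_time(85, durations))   # 3
```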
  • Step 105 Determine the number of frames based on the current optimal read amount, and obtain the corresponding frame size according to the number of frames.
  • the number of frames read this time can be determined.
  • for example, if the optimal read amount this time is 4, the number of frames read this time is 4.
  • the frame size of each of the frames read this time can be obtained; taking 4 frames as an example, the frame sizes of the above 4 frames can be obtained.
  • Step 106 Construct a frame matrix based on the above block offset and frame size.
  • a frame matrix may also be constructed according to the block offset and frame size, where each row in the frame matrix can contain the block offset and frame sizes.
  • both the number of rows and the number of columns of the matrix correspond to the optimal read amount.
  • the frame matrix may be a 4*4 matrix; if the optimal reading amount is 8, the frame matrix may be an 8*8 matrix. The embodiment does not make any special limitation on this.
  • the optimal reading amount is 4 as an example for illustration. Since the optimal reading amount is 4, that is, 4 frames are read this time, a 4*4 frame matrix can be constructed.
  • Figure 3 shows a 4*4 frame matrix: the first row of the frame matrix is (block offset, 0, 0, 0), the second row is (block offset, first frame size, 0, 0), the third row is (block offset, first frame size, second frame size, 0), and the fourth row is (block offset, first frame size, second frame size, third frame size).
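  • A minimal sketch of constructing the Fig. 3 frame matrix, using NumPy arrays in place of vector registers; the block offset and frame sizes are hypothetical values:

```python
import numpy as np

def build_frame_matrix(block_offset, frame_sizes):
    # Row i holds the block offset followed by the sizes of the i-1
    # frames before frame i, padded with zeros (the Fig. 3 layout).
    # The last frame's own size is not needed for its offset.
    n = len(frame_sizes)
    m = np.zeros((n, n), dtype=np.int64)
    m[:, 0] = block_offset
    for i in range(1, n):
        m[i, 1:i + 1] = frame_sizes[:i]
    return m

m = build_frame_matrix(100000, [1200, 980, 1100, 1050])
print(m)
```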
  • Step 107 Perform vectorization calculation on the frame matrix to obtain a frame offset vector.
  • a vectorized calculation may be performed on the frame matrix, thereby obtaining a frame offset vector.
  • the vectorization calculation may be an accumulation and summation over the columns of the frame matrix, row by row, thereby obtaining a column vector, and the column vector may be the frame offset vector.
  • the 4 columns in the frame matrix are accumulated and summed, so that the column vector shown in FIG. 4 can be obtained, wherein the column vector can be the frame offset vector.
  • the data of each row in the column vector corresponds to the offset of each frame.
  • the value of the first row in the column vector is the block offset value, which corresponds to the offset of the first frame; the value of the second row is the block offset value + the first frame size, which corresponds to the offset of the second frame; the value of the third row is the block offset value + the first frame size + the second frame size, which corresponds to the offset of the third frame; and the value of the fourth row is the block offset value + the first frame size + the second frame size + the third frame size, which corresponds to the offset of the fourth frame.
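  • The reduction described above can be illustrated with NumPy standing in for the SIMD accumulation (the application targets NEON vector registers; the block offset and frame sizes here are hypothetical):

```python
import numpy as np

# frame matrix of Fig. 3 with a hypothetical block offset (100000)
# and hypothetical frame sizes (1200, 980, 1100)
m = np.array([
    [100000,    0,   0,    0],
    [100000, 1200,   0,    0],
    [100000, 1200, 980,    0],
    [100000, 1200, 980, 1100],
])

# one vectorized reduction over the columns of each row yields the
# frame offset vector of Fig. 4 (offsets of frames 1..4)
offsets = m.sum(axis=1)
print(offsets.tolist())  # [100000, 101200, 102180, 103280]
```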
  • Step 108 Acquire a multimedia data frame in the target multimedia file according to the frame offset vector, and send the multimedia data frame to the target device.
  • the target device may be an electronic device with a multimedia playback function, such as a mobile phone, a tablet, or a TV.
  • after the source device obtains the frame offset vector, it can obtain multiple multimedia data frames in the target multimedia file according to the frame offset of each frame in the frame offset vector, and can send the multiple multimedia data frames to the target device.
  • the frame offset vector may correspond to an audio frame offset vector or a video frame offset vector; therefore, each acquisition may be of video data frames or of audio data frames, which is not specially restricted in this embodiment of the present application.
  • when the source device sends the above-mentioned multimedia data frames to the target device, it can also send the frame size of each multimedia data frame to the target device, so that the target device can separate each multimedia data frame and thereby decode and play each frame of multimedia data.
  • the source device can set a buffer, which can be used to store multimedia data frames; the multimedia data frames stored in the buffer can be those obtained after this vectorization calculation. Next, the source device may send the multimedia data frames stored in the buffer, together with the frame size corresponding to each multimedia data frame, to the target device. After receiving them, the target device can sequentially separate each frame from the buffer according to the frame sizes.
  • the offset of the first frame in the buffer can be regarded as 0, and the first frame can be separated according to its frame size; then, the frame size of the first frame is used as the offset of the second frame, and the second frame can be separated according to its frame size; and so on, all the frames in the buffer can be separated.
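  • The frame separation at the target device described above can be sketched as follows; the buffer contents and frame sizes are illustrative:

```python
def split_buffer(buf: bytes, frame_sizes):
    # Separate concatenated frames: the first frame starts at offset 0,
    # and each later frame starts where the previous one ended.
    frames, pos = [], 0
    for size in frame_sizes:
        frames.append(buf[pos:pos + size])
        pos += size
    return frames

buf = b"AAAABBBCCCCC"
print(split_buffer(buf, [4, 3, 5]))  # [b'AAAA', b'BBB', b'CCCCC']
```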
  • the offsets of multiple multimedia data frames can be obtained, so that multiple multimedia data frames can be read at one time, which improves the efficiency of data reading; since multiple frames can also be sent at one time, the data sending efficiency can be improved as well.
  • FIG. 5 is a schematic structural diagram of an embodiment of a frame demultiplexing apparatus of the present application.
  • the frame demultiplexing apparatus 50 may include: a first acquisition module 51, a second acquisition module 52, a calculation module 53, a reading module 54 and a sending module 55;
  • the first acquisition module 51 is used to acquire a target file, and the target file contains a plurality of multimedia frames;
  • the second obtaining module 52 is used to obtain the input time
  • the calculation module 53 is used to determine the optimal read amount and the first multimedia frame based on the input moment
  • the reading module 54 is configured to read a plurality of second multimedia frames in the target file from the first multimedia frame based on the optimal reading amount, wherein the second multimedia frames include the first multimedia frame;
  • the sending module 55 is configured to send a plurality of second multimedia frames to the second electronic device.
  • the multiple multimedia frames include multiple video frames and multiple audio frames
  • the above-mentioned reading module 54 is further configured to read, based on the optimal reading amount and starting from the first video frame, a plurality of second video frames in the target file;
  • the above calculation module 53 includes: an acquisition unit 531, a statistics unit 532, and a determination unit 533;
  • Obtaining unit 531 used to obtain the preset pending reading number
  • Statistical unit 532 is used to count the frame numbers of all multimedia frames from the input moment to the end moment of the target file
  • Determining unit 533, used to set the preset pending reading number as the optimal reading amount if the number of multimedia frames is greater than or equal to the preset pending reading number; if the number of multimedia frames is less than the preset pending reading number, the optimal reading amount is set to 1.
  • the above-mentioned preset pending read number is determined by the number of bits of the vector register and the data type of the multimedia frame.
  • the above-mentioned reading module 54 includes: a calculation unit 541, a retrieval unit 542, a construction unit 543, an operation unit 544, and a reading unit 545;
  • a calculation unit 541, configured to determine the block offset based on the input moment
  • Retrieval unit 542 for obtaining the frame size of each second multimedia frame based on the optimal read amount
  • a construction unit 543 for constructing a frame matrix based on the block offset and the frame size
  • Operation unit 544, used to perform a vectorization operation on the frame matrix to obtain a frame offset vector
  • the reading unit 545 is configured to read a plurality of second multimedia frames in the target file from the first multimedia frame according to the frame offset vector.
  • the above-mentioned sending module 55 is further configured to send a plurality of second multimedia frames and the frame size of each second multimedia frame to the second electronic device.
  • each module of the frame demultiplexing apparatus shown in FIG. 5 is only a division of logical functions, and may be fully or partially integrated into a physical entity in actual implementation, or may be physically separated.
  • these modules can all be implemented in the form of software calling through processing elements; they can also all be implemented in hardware; some modules can also be implemented in the form of software calling through processing elements, and some modules can be implemented in hardware.
  • the computing module may be a separately established processing element, or may be integrated in a certain chip of an electronic device.
  • the implementation of other modules is similar.
  • all or part of these modules can be integrated together, and can also be implemented independently.
  • each step of the above-mentioned method or each of the above-mentioned modules can be completed by an integrated logic circuit of hardware in the processor element or an instruction in the form of software.
  • the above modules may be one or more integrated circuits configured to implement the above methods, for example: one or more application-specific integrated circuits (Application Specific Integrated Circuit; hereinafter: ASIC), or one or more digital signal processors (Digital Signal Processor; hereinafter: DSP), or one or more field programmable gate arrays (Field Programmable Gate Array; hereinafter: FPGA), etc.
  • these modules can be integrated together and implemented in the form of a system-on-a-chip (System-On-a-Chip; hereinafter referred to as: SOC).
  • FIG. 6 shows a schematic structural diagram of the first electronic device 100 .
  • the first electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, Antenna 2, mobile communication module 150, wireless communication module 160, audio module 170, speaker 170A, receiver 170B, microphone 170C, headphone jack 170D, sensor module 180, buttons 190, motor 191, indicator 192, camera 193, display screen 194 , and a subscriber identification module (subscriber identification module, SIM) card interface 195 and the like.
  • the sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, etc.
  • the structures illustrated in the embodiments of the present invention do not constitute a specific limitation on the first electronic device 100 .
  • the first electronic device 100 may include more or fewer components than shown, or combine some components, or split some components, or arrange the components differently.
  • the illustrated components may be implemented in hardware, software, or a combination of software and hardware.
  • the processor 110 may include one or more processing units. For example, the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc. Different processing units may be independent devices, or may be integrated in one or more processors.
  • the controller can generate an operation control signal according to the instruction operation code and timing signal, and complete the control of fetching and executing instructions.
  • a memory may also be provided in the processor 110 for storing instructions and data.
  • the memory in the processor 110 is a cache memory. This memory may hold instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs to use the instructions or data again, it can call them directly from this memory. Repeated accesses are avoided, and the waiting time of the processor 110 is reduced, thereby improving system efficiency.
  • the processor 110 may include one or more interfaces.
  • the interface may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, etc.
  • the I2C interface is a bidirectional synchronous serial bus that includes a serial data line (SDA) and a serial clock line (SCL).
  • the processor 110 may contain multiple sets of I2C buses.
  • the processor 110 can be respectively coupled to the touch sensor 180K, the charger, the flash, the camera 193 and the like through different I2C bus interfaces.
  • the processor 110 may couple the touch sensor 180K through the I2C interface, so that the processor 110 and the touch sensor 180K communicate through the I2C bus interface, so as to realize the touch function of the first electronic device 100 .
  • the I2S interface can be used for audio communication.
  • the processor 110 may contain multiple sets of I2S buses.
  • the processor 110 may be coupled with the audio module 170 through an I2S bus to implement communication between the processor 110 and the audio module 170 .
  • the audio module 170 can transmit audio signals to the wireless communication module 160 through the I2S interface, so as to realize the function of answering calls through a Bluetooth headset.
  • the PCM interface can also be used for audio communication, to sample, quantize, and encode an analog signal.
  • the audio module 170 and the wireless communication module 160 may be coupled through a PCM bus interface.
  • the audio module 170 can also transmit audio signals to the wireless communication module 160 through the PCM interface, so as to realize the function of answering calls through the Bluetooth headset. Both the I2S interface and the PCM interface can be used for audio communication.
  • the UART interface is a universal serial data bus used for asynchronous communication.
  • the bus may be a bidirectional communication bus that converts the data to be transmitted between serial and parallel forms.
  • a UART interface is typically used to connect the processor 110 with the wireless communication module 160 .
  • the processor 110 communicates with the Bluetooth module in the wireless communication module 160 through the UART interface to implement the Bluetooth function.
  • the audio module 170 can transmit audio signals to the wireless communication module 160 through the UART interface, so as to realize the function of playing music through the Bluetooth headset.
  • the MIPI interface can be used to connect the processor 110 with peripheral devices such as the display screen 194 and the camera 193 .
  • MIPI interfaces include camera serial interface (CSI), display serial interface (DSI), etc.
  • the processor 110 communicates with the camera 193 through a CSI interface, so as to implement the shooting function of the first electronic device 100 .
  • the processor 110 communicates with the display screen 194 through the DSI interface to implement the display function of the first electronic device 100 .
  • the GPIO interface can be configured by software.
  • the GPIO interface can be configured as a control signal or as a data signal.
  • the GPIO interface may be used to connect the processor 110 with the camera 193, the display screen 194, the wireless communication module 160, the audio module 170, the sensor module 180, and the like.
  • the GPIO interface can also be configured as I2C interface, I2S interface, UART interface, MIPI interface, etc.
  • the USB interface 130 is an interface that conforms to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type C interface, and the like.
  • the USB interface 130 can be used to connect a charger to charge the first electronic device 100, and can also be used to transmit data between the first electronic device 100 and peripheral devices. It can also be used to connect headphones to play audio through the headphones.
  • the interface can also be used to connect other electronic devices, such as AR devices.
  • the interface connection relationship between the modules illustrated in the embodiment of the present invention is only a schematic illustration, and does not constitute a structural limitation of the first electronic device 100 .
  • the first electronic device 100 may also adopt different interface connection manners in the foregoing embodiments, or a combination of multiple interface connection manners.
  • the charging management module 140 is used to receive charging input from the charger.
  • the charger may be a wireless charger or a wired charger.
  • the charging management module 140 may receive charging input from the wired charger through the USB interface 130 .
  • the charging management module 140 may receive wireless charging input through the wireless charging coil of the first electronic device 100 . While the charging management module 140 charges the battery 142 , it can also supply power to the electronic device through the power management module 141 .
  • the power management module 141 is used for connecting the battery 142 , the charging management module 140 and the processor 110 .
  • the power management module 141 receives the input from the battery 142 and/or the charging management module 140 and supplies power to the processor 110 , the internal memory 121 , the display screen 194 , the camera 193 , and the wireless communication module 160 .
  • the power management module 141 can also be used to monitor parameters such as battery capacity, battery cycle times, battery health status (leakage, impedance).
  • the power management module 141 may also be provided in the processor 110 .
  • the power management module 141 and the charging management module 140 may also be provided in the same device.
  • the wireless communication function of the first electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modulation and demodulation processor, the baseband processor, and the like.
  • Antenna 1 and Antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in the first electronic device 100 may be used to cover a single or multiple communication frequency bands. Different antennas can also be reused to improve antenna utilization.
  • the antenna 1 can be multiplexed as a diversity antenna of the wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
  • the mobile communication module 150 may provide a wireless communication solution including 2G/3G/4G/5G etc. applied on the first electronic device 100 .
  • the mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (LNA) and the like.
  • the mobile communication module 150 can receive electromagnetic waves from the antenna 1, filter and amplify the received electromagnetic waves, and transmit them to the modulation and demodulation processor for demodulation.
  • the mobile communication module 150 can also amplify the signal modulated by the modulation and demodulation processor, and then turn it into an electromagnetic wave for radiation through the antenna 1 .
  • at least part of the functional modules of the mobile communication module 150 may be provided in the processor 110 .
  • at least part of the functional modules of the mobile communication module 150 may be provided in the same device as at least part of the modules of the processor 110 .
  • the modem processor may include a modulator and a demodulator.
  • the modulator is used to modulate the low frequency baseband signal to be sent into a medium and high frequency signal.
  • the demodulator is used to demodulate the received electromagnetic wave signal into a low frequency baseband signal. Then the demodulator transmits the demodulated low-frequency baseband signal to the baseband processor for processing.
  • the low frequency baseband signal is processed by the baseband processor and passed to the application processor.
  • the application processor outputs sound signals through audio devices (not limited to the speaker 170A, the receiver 170B, etc.), or displays images or videos through the display screen 194 .
  • the modem processor may be a stand-alone device.
  • the modem processor may be independent of the processor 110, and may be provided in the same device as the mobile communication module 150 or other functional modules.
  • the wireless communication module 160 can provide wireless communication solutions applied on the first electronic device 100, including wireless local area network (WLAN) (such as a wireless fidelity (Wi-Fi) network), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), and the like.
  • the wireless communication module 160 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 160 receives electromagnetic waves via the antenna 2 , frequency modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 110 .
  • the wireless communication module 160 can also receive the signal to be sent from the processor 110, perform frequency modulation on it, amplify it, and convert it into electromagnetic waves for radiation through the antenna 2.
  • the antenna 1 of the first electronic device 100 is coupled with the mobile communication module 150, and the antenna 2 is coupled with the wireless communication module 160, so that the first electronic device 100 can communicate with the network and other devices through wireless communication technology.
  • the wireless communication technologies may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technology, etc.
  • the GNSS may include a global positioning system (GPS), a global navigation satellite system (GLONASS), a BeiDou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS), and/or a satellite based augmentation system (SBAS).
  • the first electronic device 100 implements a display function through a GPU, a display screen 194, an application processor, and the like.
  • the GPU is a microprocessor for image processing, and is connected to the display screen 194 and the application processor.
  • the GPU is used to perform mathematical and geometric calculations for graphics rendering.
  • Processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
  • Display screen 194 is used to display images, videos, and the like.
  • Display screen 194 includes a display panel.
  • the display panel may be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like.
  • the first electronic device 100 may include 1 or N display screens 194 , where N is a positive integer greater than 1.
  • the first electronic device 100 may implement a shooting function through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
  • the ISP is used to process the data fed back by the camera 193 .
  • when the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electrical signal, and the photosensitive element transmits the electrical signal to the ISP for processing, which converts it into an image visible to the naked eye.
  • ISP can also perform algorithm optimization on image noise, brightness, and skin tone.
  • ISP can also optimize the exposure, color temperature and other parameters of the shooting scene.
  • the ISP may be provided in the camera 193 .
  • Camera 193 is used to capture still images or video.
  • an object generates an optical image through the lens, and the image is projected onto the photosensitive element.
  • the photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor.
  • the photosensitive element converts the optical signal into an electrical signal, and then transmits the electrical signal to the ISP to convert it into a digital image signal.
  • the ISP outputs the digital image signal to the DSP for processing.
  • the DSP converts the digital image signal into an image signal in a standard format such as RGB or YUV.
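  • As an illustrative sketch only (not part of the disclosure), a YUV-to-RGB conversion of the kind mentioned above can be expressed as follows; the BT.601 full-range coefficients used here are an assumption, since the embodiment does not specify a color matrix:

```python
def yuv_to_rgb(y, u, v):
    """Convert one full-range YUV sample (assumed BT.601 coefficients)
    to an RGB triple, clamping each channel to [0, 255]."""
    r = y + 1.402 * (v - 128)
    g = y - 0.344136 * (u - 128) - 0.714136 * (v - 128)
    b = y + 1.772 * (u - 128)
    clamp = lambda x: max(0, min(255, int(round(x))))
    return clamp(r), clamp(g), clamp(b)

# A neutral gray sample stays neutral after conversion.
print(yuv_to_rgb(128, 128, 128))  # (128, 128, 128)
```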
  • the first electronic device 100 may include 1 or N cameras 193 , where N is a positive integer greater than 1.
  • the digital signal processor is used to process digital signals; in addition to digital image signals, it can also process other digital signals. For example, when the first electronic device 100 selects a frequency point, the digital signal processor is used to perform a Fourier transform on the energy of the frequency point, and the like.
  • Video codecs are used to compress or decompress digital video.
  • the first electronic device 100 may support one or more video codecs.
  • the first electronic device 100 can play or record videos in various encoding formats, such as: moving picture experts group (MPEG)-1, MPEG-2, MPEG-3, MPEG-4, and so on.
  • the NPU is a neural-network (NN) computing processor.
  • Applications such as intelligent cognition of the first electronic device 100 can be implemented through the NPU, for example: image recognition, face recognition, speech recognition, text understanding, and the like.
  • the external memory interface 120 can be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the first electronic device 100.
  • the external memory card communicates with the processor 110 through the external memory interface 120 to implement the data storage function, for example, to save files such as music and videos in the external memory card.
  • Internal memory 121 may be used to store computer executable program code, which includes instructions.
  • the internal memory 121 may include a storage program area and a storage data area.
  • the storage program area can store an operating system, an application program required for at least one function (such as a sound playback function, an image playback function, etc.), and the like.
  • the storage data area may store data (such as audio data, phone book, etc.) created during the use of the first electronic device 100 and the like.
  • the internal memory 121 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, universal flash storage (UFS), and the like.
  • the processor 110 executes various functional applications and data processing of the first electronic device 100 by executing the instructions stored in the internal memory 121 and/or the instructions stored in the memory provided in the processor.
  • the first electronic device 100 may implement audio functions through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, an application processor, and the like. Such as music playback, recording, etc.
  • the audio module 170 is used for converting digital audio information into analog audio signal output, and also for converting analog audio input into digital audio signal. Audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be provided in the processor 110 , or some functional modules of the audio module 170 may be provided in the processor 110 .
  • the speaker 170A, also referred to as a "loudspeaker," is used to convert audio electrical signals into sound signals.
  • the first electronic device 100 can listen to music through the speaker 170A, or listen to a hands-free call.
  • the receiver 170B, also referred to as an "earpiece," is used to convert audio electrical signals into sound signals.
  • the voice can be answered by placing the receiver 170B close to the human ear.
  • the microphone 170C, also called a "mic" or "sound transmitter," is used to convert sound signals into electrical signals. When making a call or sending a voice message, the user can speak with the mouth close to the microphone 170C to input a sound signal into the microphone 170C.
  • the first electronic device 100 may be provided with at least one microphone 170C. In other embodiments, the first electronic device 100 may be provided with two microphones 170C, which may implement a noise reduction function in addition to collecting sound signals. In other embodiments, the first electronic device 100 may further be provided with three, four or more microphones 170C to collect sound signals, reduce noise, identify sound sources, and implement directional recording functions.
  • the earphone jack 170D is used to connect wired earphones.
  • the earphone interface 170D may be the USB interface 130, or may be a 3.5 mm open mobile terminal platform (OMTP) standard interface or a cellular telecommunications industry association of the USA (CTIA) standard interface.
  • the pressure sensor 180A is used to sense pressure signals, and can convert the pressure signals into electrical signals.
  • the pressure sensor 180A may be provided on the display screen 194 .
  • the capacitive pressure sensor may include at least two parallel plates made of conductive material. When a force is applied to the pressure sensor 180A, the capacitance between the electrodes changes.
  • the first electronic device 100 determines the intensity of the pressure according to the change in capacitance. When a touch operation acts on the display screen 194, the first electronic device 100 detects the intensity of the touch operation according to the pressure sensor 180A.
  • the first electronic device 100 may also calculate the touched position according to the detection signal of the pressure sensor 180A.
  • touch operations acting on the same touch position but with different touch operation intensities may correspond to different operation instructions. For example, when a touch operation whose intensity is less than the first pressure threshold acts on the short message application icon, the instruction for viewing the short message is executed. When a touch operation with a touch operation intensity greater than or equal to the first pressure threshold acts on the short message application icon, the instruction to create a new short message is executed.
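  • The pressure-threshold dispatch described above can be sketched as follows; this is a hypothetical illustration only, and the threshold value, function name, and returned instruction names are assumptions rather than part of the disclosure:

```python
FIRST_PRESSURE_THRESHOLD = 0.5  # assumed normalized touch intensity

def handle_sms_icon_touch(intensity):
    """Map the intensity of a touch on the short message application icon
    to an operation instruction, as described above."""
    if intensity < FIRST_PRESSURE_THRESHOLD:
        return "view_short_message"
    # Intensity greater than or equal to the first pressure threshold.
    return "create_short_message"
```

A usage example: a light tap (intensity 0.2) opens the message for viewing, while a firm press (intensity 0.9) creates a new message.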
  • the gyro sensor 180B may be used to determine the motion attitude of the first electronic device 100 .
  • the angular velocities of the first electronic device 100 about three axes (i.e., the x, y, and z axes) may be determined by the gyro sensor 180B.
  • the gyro sensor 180B can be used for image stabilization. Exemplarily, when the shutter is pressed, the gyro sensor 180B detects the angle at which the first electronic device 100 shakes, calculates the distance that the lens module needs to compensate according to the angle, and allows the lens to counteract the shaking of the first electronic device 100 through reverse motion, thereby achieving image stabilization.
  • the gyro sensor 180B can also be used for navigation and somatosensory game scenarios.
  • the air pressure sensor 180C is used to measure air pressure.
  • the first electronic device 100 calculates the altitude through the air pressure value measured by the air pressure sensor 180C to assist in positioning and navigation.
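  • The pressure-to-altitude calculation can be sketched with the international barometric formula for a standard atmosphere; this sketch and its constants are an illustrative assumption, since the embodiment does not specify the formula used:

```python
def altitude_from_pressure(p_hpa, p0_hpa=1013.25):
    """Estimate altitude in meters from barometric pressure (hPa), using
    the international barometric formula (standard sea-level pressure
    1013.25 hPa assumed as the reference)."""
    return 44330.0 * (1.0 - (p_hpa / p0_hpa) ** (1.0 / 5.255))
```

For example, a reading of 900 hPa corresponds to roughly 1 km of altitude under standard conditions.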
  • the magnetic sensor 180D includes a Hall sensor.
  • the first electronic device 100 can detect the opening and closing of the flip holster by using the magnetic sensor 180D.
  • the first electronic device 100 can detect the opening and closing of the flip cover according to the magnetic sensor 180D, and further set features such as automatic unlocking upon flip opening according to the detected opening or closing state of the holster or the flip cover.
  • the acceleration sensor 180E can detect the magnitude of the acceleration of the first electronic device 100 in various directions (generally three axes).
  • when the first electronic device 100 is stationary, the magnitude and direction of gravity can be detected. The acceleration sensor 180E can also be used to identify the posture of the electronic device, and can be applied to applications such as switching between landscape and portrait modes and pedometers.
  • the first electronic device 100 may measure the distance through infrared or laser. In some embodiments, when shooting a scene, the first electronic device 100 can use the distance sensor 180F to measure the distance to achieve fast focusing.
  • Proximity light sensor 180G may include, for example, light emitting diodes (LEDs) and light detectors, such as photodiodes.
  • the light emitting diodes may be infrared light emitting diodes.
  • the first electronic device 100 emits infrared light to the outside through light emitting diodes.
  • the first electronic device 100 detects infrared reflected light from nearby objects using a photodiode. When sufficient reflected light is detected, it may be determined that there is an object near the first electronic device 100 . When insufficient reflected light is detected, the first electronic device 100 may determine that there is no object near the first electronic device 100 .
  • the first electronic device 100 can use the proximity light sensor 180G to detect that the user holds the first electronic device 100 close to the ear to talk, so as to automatically turn off the screen to save power.
  • the proximity light sensor 180G can also be used in holster mode and pocket mode to automatically unlock and lock the screen.
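  • The reflected-light decision described above can be sketched as follows; the threshold value and function names are hypothetical, introduced only to illustrate the "sufficient reflected light" test:

```python
PROXIMITY_THRESHOLD = 100  # assumed sensor reading meaning "sufficient" reflected IR

def object_nearby(reflected_ir):
    """True when the photodiode detects sufficient reflected infrared light,
    i.e., an object is near the first electronic device."""
    return reflected_ir >= PROXIMITY_THRESHOLD

def screen_state(reflected_ir):
    """Turn the screen off while the device is held close to the ear,
    to save power and avoid accidental touches."""
    return "off" if object_nearby(reflected_ir) else "on"
```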
  • the ambient light sensor 180L is used to sense ambient light brightness.
  • the first electronic device 100 can adaptively adjust the brightness of the display screen 194 according to the perceived ambient light brightness.
  • the ambient light sensor 180L can also be used to automatically adjust the white balance when taking pictures.
  • the ambient light sensor 180L can also cooperate with the proximity light sensor 180G to detect whether the first electronic device 100 is in the pocket, so as to prevent accidental touch.
  • the fingerprint sensor 180H is used to collect fingerprints.
  • the first electronic device 100 can use the collected fingerprint characteristics to realize fingerprint unlocking, accessing application locks, taking photos with fingerprints, answering incoming calls with fingerprints, and the like.
  • the temperature sensor 180J is used to detect the temperature.
  • the first electronic device 100 uses the temperature detected by the temperature sensor 180J to execute the temperature processing strategy. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold value, the first electronic device 100 reduces the performance of the processor located near the temperature sensor 180J, so as to reduce power consumption and implement thermal protection.
  • in some other embodiments, when the temperature is lower than another threshold, the first electronic device 100 heats the battery 142 to avoid abnormal shutdown of the first electronic device 100 caused by low temperature.
  • in some other embodiments, when the temperature is lower than still another threshold, the first electronic device 100 boosts the output voltage of the battery 142 to avoid abnormal shutdown caused by low temperature.
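  • The temperature processing strategy above can be sketched as follows; the threshold values and action names are assumptions chosen for illustration, since the embodiment does not disclose concrete thresholds:

```python
HIGH_TEMP_C = 45.0  # assumed thermal-protection threshold
LOW_TEMP_C = 0.0    # assumed low-temperature threshold

def thermal_policy(temp_c):
    """Return the actions the device takes for a reported temperature,
    following the strategy described above."""
    if temp_c > HIGH_TEMP_C:
        # Reduce performance of the processor near the sensor to cut power.
        return ["throttle_nearby_processor"]
    if temp_c < LOW_TEMP_C:
        # Heat the battery and boost its output voltage to avoid shutdown.
        return ["heat_battery", "boost_battery_voltage"]
    return []
```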
  • the touch sensor 180K is also called a "touch device."
  • the touch sensor 180K may be disposed on the display screen 194 , and the touch sensor 180K and the display screen 194 form a touch screen, also called a “touch screen”.
  • the touch sensor 180K is used to detect a touch operation on or near it.
  • the touch sensor can pass the detected touch operation to the application processor to determine the type of touch event.
  • Visual output related to touch operations may be provided through display screen 194 .
  • the touch sensor 180K may also be disposed on the surface of the first electronic device 100 , which is different from the position where the display screen 194 is located.
  • the bone conduction sensor 180M can acquire vibration signals.
  • the bone conduction sensor 180M can acquire the vibration signal of the vibrating bone mass of the human voice.
  • the bone conduction sensor 180M can also contact the pulse of the human body and receive the blood pressure beating signal.
  • the bone conduction sensor 180M can also be disposed in the earphone, combined with the bone conduction earphone.
  • the audio module 170 can analyze the voice signal based on the vibration signal of the vocal vibration bone block obtained by the bone conduction sensor 180M, so as to realize the voice function.
  • the application processor can analyze the heart rate information based on the blood pressure beat signal obtained by the bone conduction sensor 180M, and realize the function of heart rate detection.
  • the keys 190 include a power-on key, a volume key, and the like. The keys 190 may be mechanical keys or touch keys.
  • the first electronic device 100 may receive key input, and generate key signal input related to user settings and function control of the first electronic device 100 .
  • Motor 191 can generate vibrating cues.
  • the motor 191 can be used for vibrating alerts for incoming calls, and can also be used for touch vibration feedback.
  • touch operations acting on different applications can correspond to different vibration feedback effects.
  • the motor 191 can also correspond to different vibration feedback effects for touch operations on different areas of the display screen 194 .
  • touch operations in different application scenarios (for example, time reminders, receiving messages, alarm clocks, games, etc.) may also correspond to different vibration feedback effects.
  • the touch vibration feedback effect can also support customization.
  • the indicator 192 can be an indicator light, which can be used to indicate the charging state, the change of the power, and can also be used to indicate a message, a missed call, a notification, and the like.
  • the SIM card interface 195 is used to connect a SIM card.
  • the SIM card can be inserted into the SIM card interface 195 or pulled out from the SIM card interface 195 to achieve contact with and separation from the first electronic device 100 .
  • the first electronic device 100 may support 1 or N SIM card interfaces, where N is a positive integer greater than 1.
  • the SIM card interface 195 can support Nano SIM card, Micro SIM card, SIM card and so on. Multiple cards can be inserted into the same SIM card interface 195 at the same time. The types of the plurality of cards may be the same or different.
  • the SIM card interface 195 can also be compatible with different types of SIM cards.
  • the SIM card interface 195 is also compatible with external memory cards.
  • the first electronic device 100 interacts with the network through the SIM card to implement functions such as call and data communication.
  • the first electronic device 100 employs an eSIM, i.e., an embedded SIM card.
  • the eSIM card can be embedded in the first electronic device 100 and cannot be separated from the first electronic device 100 .
  • the interface connection relationship between the modules illustrated in the embodiment of the present invention is only a schematic illustration, and does not constitute a structural limitation of the first electronic device 100 .
  • the first electronic device 100 may also adopt different interface connection manners in the foregoing embodiments, or a combination of multiple interface connection manners.
  • the above-mentioned first electronic device 100 and the like include corresponding hardware structures and/or software modules for executing each function.
  • the embodiments of the present application can be implemented in hardware or a combination of hardware and computer software. Whether a function is performed by hardware or computer software driving hardware depends on the specific application and design constraints of the technical solution. Skilled artisans may implement the described functionality using different methods for each particular application, but such implementations should not be considered beyond the scope of the embodiments of the present invention.
  • the first electronic device 100 and the like may be divided into functional modules according to the foregoing method examples.
  • each functional module may be divided corresponding to each function, or two or more functions may be integrated into one processing module.
  • the above-mentioned integrated modules can be implemented in the form of hardware or in the form of software function modules. It should be noted that the division of modules in the embodiment of the present invention is schematic and is only a logical function division; there may be other division manners in actual implementation.
  • Each functional unit in the embodiments of the present application may be integrated into one processing unit, or each unit may exist physically alone, or two or more units may be integrated into one unit.
  • the above-mentioned integrated units may be implemented in the form of hardware, or may be implemented in the form of software functional units.
  • the integrated unit if implemented in the form of a software functional unit and sold or used as an independent product, may be stored in a computer-readable storage medium.
  • a computer-readable storage medium includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) or a processor to execute all or part of the steps of the methods described in the various embodiments of the present application.
  • the aforementioned storage medium includes media that can store program code, such as a flash memory, a removable hard disk, a read-only memory, a random access memory, a magnetic disk, or an optical disk.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
  • Compression Or Coding Systems Of Tv Signals (AREA)

Abstract

Embodiments of the present application, which belong to the technical field of multimedia, relate to a frame demultiplexing method, an electronic device, and a storage medium. The method comprises: obtaining a target file, the target file comprising a plurality of multimedia frames; obtaining an input time; determining an optimal read amount and a first multimedia frame according to the input time; starting from the first multimedia frame, reading a plurality of second multimedia frames in the target file according to the optimal read amount, the second multimedia frames comprising the first multimedia frame; and sending the plurality of second multimedia frames to a second electronic device. According to the method described in the embodiments of the present application, the computing capability and the reading capability for audio frames and video frames can be improved so as to increase the amount of audio frames and video frames sent, thereby improving the sending efficiency of audio frames and video frames.
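The read flow summarized in the abstract can be sketched as follows. This is only an illustrative sketch: the `MediaFrame` layout, the key-frame seek rule, and treating the optimal read amount as a fixed batch size are assumptions for the example, not the application's actual implementation.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class MediaFrame:
    timestamp: float   # presentation time in seconds
    is_key: bool       # True for a key (sync) frame
    payload: bytes     # encoded audio or video data

def read_batch(target_file: List[MediaFrame], input_time: float,
               optimal_amount: int) -> List[MediaFrame]:
    """Determine the first multimedia frame from the input time, then read
    up to optimal_amount frames (the "second multimedia frames", which
    include the first one) for sending to a second electronic device."""
    # Snap to the last key frame at or before input_time so that the
    # batch starts at an independently decodable frame.
    start = 0
    for i, frame in enumerate(target_file):
        if frame.timestamp > input_time:
            break
        if frame.is_key:
            start = i
    return target_file[start:start + optimal_amount]
```

Starting the batch at a key frame at or before the requested input time keeps the batch independently decodable, which is why demultiplexers for seekable containers commonly snap to sync samples in this way.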
PCT/CN2021/126345 2020-11-09 2021-10-26 Frame demultiplexing method, electronic device, and storage medium WO2022095752A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202011240224.1A CN114466238B (zh) Frame demultiplexing method, electronic device, and storage medium
CN202011240224.1 2020-11-09

Publications (1)

Publication Number Publication Date
WO2022095752A1 (fr)

Family

ID=81404210

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/126345 WO2022095752A1 (fr) 2020-11-09 2021-10-26 Frame demultiplexing method, electronic device, and storage medium

Country Status (2)

Country Link
CN (1) CN114466238B (fr)
WO (1) WO2022095752A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116721678A (zh) * 2022-09-29 2023-09-08 Honor Device Co., Ltd. Audio data monitoring method, electronic device, and medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104823450A (zh) * 2013-12-01 2015-08-05 LG Electronics Inc. Method and apparatus for transmitting and receiving broadcast signals for providing a trick play service
CN108184163A (zh) * 2017-12-29 2018-06-19 Shenzhen OCT Kale Technology Co., Ltd. Video playback method, storage medium, and player
US20180205978A1 (en) * 2017-01-19 2018-07-19 International Business Machines Corporation Video segment manager
CN109936763A (zh) * 2017-12-15 2019-06-25 Tencent Technology (Shenzhen) Co., Ltd. Video processing and publishing method
CN111741338A (zh) * 2020-07-22 2020-10-02 Shenzhen ZNV Technology Co., Ltd. HLS streaming media playback method, system, device, and storage medium

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
AU2003286491A1 (en) * 2002-10-18 2004-05-04 Global Epoint, Inc. Video and telemetry apparatus and methods
EP3125559B1 (fr) * 2010-08-17 2018-08-08 M&K Holdings Inc. Apparatus for decoding an intra prediction mode
EP2530948A1 (fr) * 2011-06-03 2012-12-05 Samsung Electronics Co., Ltd. Method and device for demultiplexing video and audio data of a multimedia file
US9894393B2 (en) * 2015-08-31 2018-02-13 Gopro, Inc. Video encoding for reduced streaming latency
CN106686445B (zh) * 2015-11-05 2019-06-11 Beijing Zhongguang Shangyang Technology Co., Ltd. Method for on-demand seeking in a multimedia file
CN111291610B (zh) * 2019-12-12 2024-05-28 Sangfor Technologies Inc. Video detection method, apparatus, device, and computer-readable storage medium
CN111526314B (zh) * 2020-04-24 2022-04-05 Honor Device Co., Ltd. Video shooting method and electronic device


Also Published As

Publication number Publication date
CN114466238A (zh) 2022-05-10
CN114466238B (zh) 2023-09-29

Similar Documents

Publication Publication Date Title
WO2020244623A1 (fr) Method for implementing a 3D mouse mode and related device
WO2021017909A1 (fr) Method, electronic device, and system for executing functions via an NFC tag
US20220232360A1 (en) Information processing method and device
CN114489533A (zh) Screen projection method and apparatus, electronic device, and computer-readable storage medium
WO2022116930A1 (fr) Content sharing method, electronic device, and storage medium
WO2022156555A1 (fr) Screen brightness adjustment method and apparatus, and terminal device
WO2022022319A1 (fr) Image processing method and system, electronic device, and chip system
CN114422340A (zh) Log reporting method, electronic device, and storage medium
WO2022199613A1 (fr) Synchronous playback method and apparatus
WO2022042768A1 (fr) Index display method, electronic device, and computer-readable storage medium
WO2022105674A1 (fr) Incoming call answering method, electronic device, and storage medium
WO2022095752A1 (fr) Frame demultiplexing method, electronic device, and storage medium
CN113593567A (zh) Method for converting video sound into text and related device
JP2022501968A (ja) File transfer method and electronic device
CN109285563B (zh) Voice data processing method and apparatus during online translation
WO2022135144A1 (fr) Self-adaptive display method, electronic device, and storage medium
WO2022033344A1 (fr) Video stabilization method, terminal device, and computer-readable storage medium
WO2022170854A1 (fr) Video call method and related device
CN114120987B (zh) Voice wake-up method, electronic device, and chip system
CN116939559A (zh) Bluetooth audio encoded data distribution method, electronic device, and storage medium
CN115297269B (zh) Exposure parameter determination method and electronic device
WO2022105670A1 (fr) Display method and terminal
CN115019803B (zh) Audio processing method, electronic device, and storage medium
CN113364067B (zh) Charging accuracy calibration method and electronic device
WO2024055881A1 (fr) Clock synchronization method, electronic device, system, and storage medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21888449

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21888449

Country of ref document: EP

Kind code of ref document: A1