CN114237941A - Data processing method, data processing device, storage medium and electronic equipment - Google Patents

Data processing method, data processing device, storage medium and electronic equipment

Info

Publication number
CN114237941A
Authority
CN
China
Prior art keywords
data
audio
target data
processing
module
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111580266.4A
Other languages
Chinese (zh)
Inventor
伍洋
张艳苹
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen TCL New Technology Co Ltd
Original Assignee
Shenzhen TCL New Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen TCL New Technology Co Ltd filed Critical Shenzhen TCL New Technology Co Ltd
Priority to CN202111580266.4A priority Critical patent/CN114237941A/en
Publication of CN114237941A publication Critical patent/CN114237941A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00 Arrangements for program control, e.g. control units
    • G06F 9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/46 Multiprogramming arrangements
    • G06F 9/54 Interprogram communication
    • G06F 9/546 Message passing systems or structures, e.g. queues
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 2209/00 Indexing scheme relating to G06F9/00
    • G06F 2209/54 Indexing scheme relating to G06F9/54
    • G06F 2209/548 Queue

Abstract

The application discloses a data processing method, a data processing device, a storage medium and electronic equipment. A first request parameter is called to sequentially acquire a first target data module and a second target data module while, at the same time, a second request parameter is called to acquire a third target data module; because the target data modules are acquired simultaneously, the low audio and video fluency caused by an improper data acquisition order is avoided.

Description

Data processing method, data processing device, storage medium and electronic equipment
Technical Field
The present application relates to the field of data processing technologies, and in particular, to a data processing method and apparatus, a storage medium, and an electronic device.
Background
As living standards improve, people increasingly enrich their lives by playing audio and video. Because fluency during playback is an important part of the user's playback experience, the requirements on audio and video playback fluency are becoming higher and higher.
Because the network architecture of current audio/video players is unreasonably designed, during the start-up and seek processes of a high-bit-rate source the network request first obtains the metadata located at the head of the file, then skips over the original audio/video data located in the middle of the file to preferentially obtain the index data located at the tail of the file, and only afterwards returns to the middle of the file to obtain the original audio/video data. These jumps force the network connection to be dropped and re-established, which lowers playback fluency.
Disclosure of Invention
The embodiment of the application provides a data processing method and device, a storage medium and electronic equipment, which can alleviate the currently low fluency of audio and video playback.
The embodiment of the application provides a data processing method, comprising the following steps:
when a data acquisition request is received, calling a first request parameter to sequentially acquire a first target data module and a second target data module, and calling a second request parameter to acquire a third target data module, wherein the second target data module comprises a plurality of first sub-target data and a plurality of second sub-target data;
acquiring a data processing interval corresponding to the data acquisition request according to the third target data module;
acquiring processing parameters of the second target data module according to the first request parameters;
and calling the processing parameters to process each first sub-target data and each second sub-target data positioned in the data processing interval.
The data acquisition request includes an audio and video playing request, the first target data module includes metadata of an audio and video file, the second target data module includes original video data and original audio data of the audio and video file, and the third target data module includes index data of the audio and video file. When the data acquisition request is received, the calling of the first request parameter to sequentially acquire the first target data module and the second target data module, and the calling of the second request parameter to acquire the third target data module, includes:
when an audio and video playing request is received, calling a first request parameter to sequentially obtain metadata of an audio and video file and original audio data and original video data of the audio and video file, and calling a second request parameter to obtain index data of the audio and video file.
The acquiring, according to the third target data module, of the data processing interval corresponding to the data acquisition request includes:
determining initial cache data and ending cache data corresponding to the data acquisition request according to the index data;
and determining an interval between the initial cache data and the ending cache data as a data processing interval.
The first sub-target data includes the original video data, the second sub-target data includes the original audio data, the processing parameters include a first processing parameter and a second processing parameter, and the invoking of the processing parameters to process each first sub-target data and each second sub-target data located in the data processing interval includes:
and alternately calling the first processing parameter and the second processing parameter to alternately buffer each original audio data and each original video data in an interval between the initial cache data and the ending cache data.
After the step of calling the processing parameters and processing each first sub-target data and each second sub-target data located in the data processing interval, the method further includes:
when the alternate buffering of one original audio data and one original video data is finished, obtaining one video data and one audio data;
inquiring the current caching progress of the video data and the audio data;
obtaining a cache progress difference value between the original video data and the original audio data according to the current cache progress of the video data and the audio data;
when the cache progress difference is larger than a preset value, in the process of caching subsequent video data and audio data, adjusting the calling interval duration of the first processing parameter and the second processing parameter so as to enable the cache progress difference between the subsequent video data and the audio data to be smaller than or equal to the preset value.
An embodiment of the present application further provides a data processing apparatus, including:
the calling module is used for calling a first request parameter to sequentially obtain a first target data module and a second target data module and calling a second request parameter to obtain a third target data module when a data obtaining request is received, wherein the second target data module comprises a plurality of first sub-target data and a plurality of second sub-target data;
the interval acquisition module is used for acquiring a data processing interval corresponding to the data acquisition request according to the third target data module;
the processing parameter acquisition module is used for acquiring the processing parameters of the second target data module according to the first request parameters;
and the processing module is used for calling the processing parameters and processing the first sub-target data and the second sub-target data which are positioned in the data processing interval.
The data acquisition request includes an audio and video playing request, the first target data module includes metadata of an audio and video file, the second target data module includes original video data and original audio data of the audio and video file, and the third target data module includes index data of the audio and video file; the calling module is specifically configured to:
when an audio and video playing request is received, calling a first request parameter to sequentially obtain metadata of an audio and video file and original audio data and original video data of the audio and video file, and calling a second request parameter to obtain index data of the audio and video file.
The interval acquisition module is specifically configured to:
determining initial cache data and ending cache data corresponding to the data acquisition request according to the index data;
and determining an interval between the initial cache data and the ending cache data as a data processing interval.
The embodiment of the application also provides a computer readable storage medium, wherein a plurality of instructions are stored in the computer readable storage medium, and the instructions are suitable for being loaded by a processor to execute any data processing method.
The embodiment of the application further provides an electronic device, which comprises a processor and a memory, wherein the processor is electrically connected with the memory, the memory is used for storing instructions and data, and the processor is used for executing the steps in any data processing method.
The embodiment of the application provides a data processing method, a data processing device, a storage medium and electronic equipment. When a data acquisition request is received, a first request parameter is called to sequentially acquire a first target data module and a second target data module (the latter including a plurality of first sub-target data and a plurality of second sub-target data), and a second request parameter is called to acquire a third target data module; a data processing interval corresponding to the data acquisition request is then acquired according to the third target data module, the processing parameters of the second target data module are acquired according to the first request parameter, and finally the processing parameters are called to process each first sub-target data and each second sub-target data located in the data processing interval. Because the first request parameter and the second request parameter are called simultaneously, the three target data modules are acquired at the same time, which avoids the low audio and video fluency caused by an improper data acquisition order.
Drawings
The technical solution and other advantages of the present application will become apparent from the detailed description of the embodiments of the present application with reference to the accompanying drawings.
Fig. 1 is a schematic flow chart of a data processing method according to an embodiment of the present application.
Fig. 2 is a schematic view of a data processing scenario provided in an embodiment of the present application.
Fig. 3a is a schematic diagram of a data acquisition process in the prior art according to an embodiment of the present application.
Fig. 3b is a schematic diagram of a data acquisition process according to an embodiment of the present application.
Fig. 4 is another schematic flow chart of the data processing method according to the embodiment of the present application.
Fig. 5 is a schematic structural diagram of a data processing apparatus according to an embodiment of the present application.
Fig. 6 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Fig. 7 is another schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The embodiment of the application provides a data processing method and device, a storage medium and electronic equipment.
As shown in fig. 1, fig. 1 is a schematic flow chart of a data processing method provided in an embodiment of the present application, where the data processing method is applied to an electronic device, and a specific flow may be as follows:
s101, when a data acquisition request is received, calling a first request parameter to sequentially acquire a first target data module and a second target data module, and calling a second request parameter to acquire a third target data module, wherein the second target data module comprises a plurality of first sub-target data and a plurality of second sub-target data.
For example, in fig. 2, when a user clicks an audio/video file control 2002 on a file selection interface 2001, the data acquisition request is triggered. The first request parameter and the second request parameter include code for data access, and the first, second and third target data modules are constituent parts of a network data file; when data acquisition is completed, a first playing screen 2003 is displayed on the screen of the electronic device.
Optionally, the data acquisition request is an audio/video playing request, the first target data module is the metadata of an audio/video file, the second target data module includes a plurality of first sub-target data and a plurality of second sub-target data (specifically, the first sub-target data is original video data and the second sub-target data is original audio data), and the third target data module is the index data of the audio/video file. Specifically, the metadata is a coding scheme for the network resource information and provides information about the resource; the original video data and original audio data are network data that have not yet been processed (e.g., buffered); the index data represents the mapping between the data acquisition request and the related information of the original audio data and original video data, including the playing positions to which the original audio data and original video data correspond.
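For illustration only, the following minimal Python sketch shows one way the three target data modules described above could be represented; every type and field name here is an assumption introduced for the example, not a structure defined by the application.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Metadata:
    """First target data module: describes the network resource (illustrative fields)."""
    duration_s: float
    video_codec: str
    audio_codec: str

@dataclass
class IndexEntry:
    """One entry of the third target data module: maps a playback position to raw data."""
    timestamp_s: float   # playback position of the frame pair
    video_offset: int    # location of the raw video frame (first sub-target data)
    audio_offset: int    # location of the raw audio frame (second sub-target data)

@dataclass
class RawFrame:
    """Element of the second target data module: not-yet-buffered audio or video data."""
    kind: str            # "video" or "audio"
    payload: bytes

Index = List[IndexEntry]  # the index data queried in steps S102/S402
```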
In the prior art, as shown in fig. 3a, when a server receives an audio/video playing request, a network request 300 first performs step S31 at the head of the audio/video file: acquiring the metadata 3001. It then jumps to the tail of the audio/video file to perform step S32: acquiring the index data 3002, and finally returns to the middle of the audio/video file (i.e., the position between the head and the tail) to perform step S33: acquiring the original audio data and original video data 3003. Because the network request 300 has to jump from the head of the audio/video file to the tail and then jump from the tail back to the middle, the network is disconnected and reconnected twice during data acquisition, so the audio/video playback fluency is low.
In this embodiment, as shown in fig. 3b, when the server receives an audio/video playing request, it invokes the first request parameter 301 to perform, at the head and the middle of the audio/video file in sequence, step S34: acquiring the metadata 3001, and step S35: acquiring the original audio data and original video data 3003; at the same time it invokes the second request parameter 302 to perform step S36: acquiring the index data 3002. That is, the first request parameter 301 and the second request parameter 302 acquire the metadata 3001, the original audio and video data 3003, and the index data 3002 asynchronously, so the acquisition order follows the front-to-back order of the positions in the audio/video file, and the low audio/video fluency caused by an improper data acquisition order is avoided.
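The asynchronous acquisition of fig. 3b can be sketched as two concurrent range requests, roughly as below. This is a minimal illustration assuming a hypothetical fetch(url, start, end) helper and illustrative byte ranges; none of these names come from the application.

```python
import threading

def fetch(url: str, start: int, end: int) -> bytes:
    """Hypothetical placeholder for an HTTP range request (bytes=start-end)."""
    return b""

def acquire_audio_video_file(url: str, head_len: int, tail_len: int, file_len: int) -> dict:
    result = {}

    def first_request():
        # first request parameter 301: head of the file first, then the middle, in order
        result["metadata"] = fetch(url, 0, head_len - 1)                  # S34
        result["raw_av"] = fetch(url, head_len, file_len - tail_len - 1)  # S35

    def second_request():
        # second request parameter 302: tail of the file, issued at the same time
        result["index"] = fetch(url, file_len - tail_len, file_len - 1)   # S36

    threads = [threading.Thread(target=first_request),
               threading.Thread(target=second_request)]
    for t in threads:
        t.start()   # both requests run concurrently, so no jump back is needed
    for t in threads:
        t.join()
    return result
```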
And S102, acquiring a data processing interval corresponding to the data acquisition request according to the third target data module.
The data processing interval is the interval in which the data to be processed is located; optionally, it is the interval between the initial cache data and the end cache data. Specifically, because the index data represents the mapping relationship between the data acquisition request and the related information of the original audio data and original video data, the index data can be queried to determine the initial cache data and end cache data corresponding to the data acquisition request, together with their playing positions, and the interval between the initial cache data and the end cache data is determined as the data processing interval. Optionally, the end cache data is the original audio data and original video data at the tail of the audio/video file.
As shown in fig. 2, in one embodiment, after the user clicks the audio/video file control 2002 on the file selection interface 2001, a first playing picture 2003 is displayed on the screen. At this time the position of the playing progress control 2004 is the playing position of the first initial cache data, so the data processing interval is the playing interval from the playing position of the first initial cache data to the playing position 2005 of the end cache data. If the user then drags the playing progress control 2004, the position of the dragged control becomes the playing position of the second initial cache data, and the data processing interval becomes the playing interval from the playing position of the second initial cache data (i.e., the position of the playing progress control 2004) to the playing position 2005 of the end cache data.
Further, when the user clicks the "return" control 2007, the current second playing screen 2006 is immediately closed and the file selection interface 2001 is shown again. When the user clicks the audio/video file control 2002 once more, the second playing screen 2006 is displayed on the screen with the playing progress control 2004 at the playing position of the second initial cache data, and the data processing interval is again the playing interval from the playing position of the second initial cache data (i.e., the position of the playing progress control 2004) to the playing position 2005 of the end cache data.
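As a rough sketch of step S102, and reusing the IndexEntry type assumed in the earlier sketch, the data processing interval could be looked up from the index data as follows; the lookup rule is one illustrative reading of the text, not the application's exact algorithm.

```python
def data_processing_interval(index, seek_position_s):
    """Return (initial cache data, end cache data) for the requested playback position."""
    # initial cache data: the first indexed frame pair at or after the requested position
    initial = next(entry for entry in index if entry.timestamp_s >= seek_position_s)
    # end cache data: the raw audio/video located at the tail of the audio/video file
    end = index[-1]
    return initial, end
```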
And S103, acquiring the processing parameters of the second target data module according to the first request parameters.
The processing parameters can be used for caching data; optionally, they are used for caching the original video data and original audio data, and once the original video data and original audio data are cached, the corresponding audio and video can be played through the player.
And S104, calling processing parameters, and processing the plurality of first sub-target data and the plurality of second sub-target data which are positioned in the data processing interval.
The processing parameters include a first processing parameter and a second processing parameter: the first processing parameter is called to cache the original video data to obtain playable video data, and the second processing parameter is called to cache the original audio data to obtain playable audio data.
Further, the step S104 further includes:
the first processing parameter and the second processing parameter are alternately called to alternately buffer a plurality of original audio data and a plurality of original video data located in an interval between the initial buffer data and the end buffer data.
In the prior art, the original audio data and the original video data are generally cached synchronously. Because the original audio data is cached quickly while the original video data is cached slowly, the cached audio data and video data can fall out of synchronization (that is, the audio data and the video data fall outside the cache), which triggers the network to automatically disconnect and reconnect and causes stuttering during audio and video playback.
In order to ensure that the buffered audio data and video data fall within the cache memory, in this embodiment, the first processing parameter and the second processing parameter are invoked in an alternating manner to perform alternating buffering on the original audio data and the original video data within the data processing interval.
For example, suppose there are 20 (frames of) original audio data and 20 (frames of) original video data in total in the interval between the initial cache data and the end cache data. The first processing parameter is first called to cache the 1st original video data; a preset interval duration (e.g., 0.2 s) after the 1st original video data starts to be processed, the second processing parameter is called to cache the 1st original audio data; 0.2 s after the 1st original audio data starts to be processed, the first processing parameter is called to cache the 2nd original video data; 0.2 s after the 2nd original video data starts to be processed, the second processing parameter is called to cache the 2nd original audio data; and so on, until the second processing parameter is called to cache the 20th original audio data 0.2 s after the 20th original video data starts to be processed.
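The alternation in this example can be sketched as below; buffer_video and buffer_audio stand in for the first and second processing parameters and are placeholders introduced here, not interfaces defined by the application.

```python
import time

def buffer_video(raw_video):
    """Placeholder for the first processing parameter (caches one raw video frame)."""
    pass

def buffer_audio(raw_audio):
    """Placeholder for the second processing parameter (caches one raw audio frame)."""
    pass

def buffer_alternately(raw_video_frames, raw_audio_frames, interval_s=0.2):
    # 1st video frame, wait 0.2 s, 1st audio frame, wait 0.2 s, 2nd video frame, and so on
    for raw_video, raw_audio in zip(raw_video_frames, raw_audio_frames):
        buffer_video(raw_video)
        time.sleep(interval_s)
        buffer_audio(raw_audio)
        time.sleep(interval_s)
```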
In one embodiment, to further ensure that the cached audio data and video data stay synchronized, after one video data and the corresponding audio data are obtained (e.g., the first video data and the first audio data), the current caching progress of the video data and the audio data is queried, and the caching progress difference between the original video data and the original audio data is obtained from it. When the caching progress difference is greater than a preset value, the call interval duration of the first processing parameter and the second processing parameter is adjusted while the subsequent video data and audio data are cached, so that the caching progress difference between the subsequent video data and audio data becomes less than or equal to the preset value.
Specifically, the caching progress is the caching positions corresponding to the cached video data and the cached audio data, for example, the caching position corresponding to the first video data is 0.4s, and the caching position corresponding to the first audio data is 0.1 s.
For example, the call interval duration of the first processing parameter and the second processing parameter is 0.2 s. After the first video data and the first audio data are obtained, the caching progress of the first video data is queried to be 0.4 s and that of the first audio data to be 0.1 s, so the caching progress difference between the first original video data and the first original audio data is 0.3 s. Because the preset value is 0.1 s, the caching progress difference is greater than the preset value, which indicates that the previously set call interval duration of the first processing parameter and the second processing parameter was unreasonable; the call interval duration is therefore increased to 0.4 s, after which the caching progress difference between the second video data and the second audio data is 0.05 s.
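The adjustment in this example can be sketched as a simple rule. Doubling the call interval duration mirrors the 0.2 s to 0.4 s change in the text, but the exact adjustment policy is an assumption.

```python
def adjust_call_interval(video_progress_s, audio_progress_s, interval_s, preset_s=0.1):
    """Return the call interval duration to use for the next frame pair."""
    progress_difference = abs(video_progress_s - audio_progress_s)
    if progress_difference > preset_s:
        # the previously set interval was unreasonable; widen it so the streams re-align
        return interval_s * 2
    return interval_s

# With the numbers above: adjust_call_interval(0.4, 0.1, 0.2) returns 0.4,
# because the 0.3 s progress difference exceeds the 0.1 s preset value.
```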
Optionally, in another embodiment, when the difference in the buffering progress between the original video data and the original audio data is greater than a preset value, a network disconnection-reconnection operation is performed, and subsequent original video data and subsequent original audio data continue to be buffered after the network disconnection-reconnection.
As shown in fig. 4, fig. 4 is another schematic flow chart of the data processing method provided in the embodiment of the present application, and the specific flow may be as follows:
s401, when a data acquisition request is received, calling a first request parameter to sequentially acquire a first target data module and a second target data module, and calling a second request parameter to acquire a third target data module.
For example, as shown in fig. 3b, when the server receives an audio/video playing request, it invokes the first request parameter 301 to perform, at the head and the middle of the audio/video file in sequence, step S34: acquiring the metadata 3001, and step S35: acquiring the original audio data and original video data 3003; at the same time it invokes the second request parameter 302 to perform step S36: acquiring the index data 3002.
S402, acquiring a data processing interval corresponding to the data acquisition request according to the third target data module.
For example, after the user drags the playing progress control 2004, the second initial cache data corresponding to the position of the dragged control is obtained through the index data. The data processing interval is then the playing interval from the playing position of the second initial cache data (i.e., the position of the playing progress control 2004) to the playing position 2005 of the end cache data, and this playing interval contains 20 (frames of) original audio data and 20 (frames of) original video data in total.
And S403, acquiring the processing parameters of the second target data module according to the first request parameters.
The processing parameters can be used for caching data; optionally, they are used for caching the original video data and original audio data, and once the original video data and original audio data are cached, the corresponding audio and video can be played through the player.
And S404, alternately calling the first processing parameter and the second processing parameter to alternately cache a plurality of original audio data and a plurality of original video data in an interval between the initial cache data and the ending cache data.
For example, the first processing parameter is first called to cache the 1st of the 20 original video data in the interval, and 0.2 s after the 1st original video data starts to be processed, the second processing parameter is called to cache the 1st of the 20 original audio data.
S405, after one video data and the corresponding audio data are obtained, the current caching progress of the video data and the audio data is queried, and the caching progress difference between the original video data and the original audio data is obtained.
For example, after the 1st video data and the 1st audio data are obtained, the caching progress of the 1st video data is queried to be 0.4 s and that of the 1st audio data to be 0.1 s, so the caching progress difference between the 1st original video data and the 1st original audio data is 0.3 s.
S406, judging whether the caching progress difference is greater than a preset value; if so, executing step S407, and if not, executing step S408.
For example, if the preset value is 0.1 s and the caching progress difference between the 1st original video data and the 1st original audio data is 0.3 s, the difference is greater than the preset value and step S407 is executed; if the caching progress difference between the 1st original video data and the 1st original audio data is 0.1 s, the difference is equal to the preset value and step S408 is executed.
And S407, in the process of caching the subsequent video data and the audio data, adjusting the calling interval duration of the first processing parameter and the second processing parameter so as to enable the caching progress difference between the subsequent video data and the audio data to be smaller than or equal to a preset value.
For example, the call interval duration of the first processing parameter and the second processing parameter is increased to 0.4s, so that the difference in the buffering progress of the second video data and the second audio data is 0.05s (less than 0.1 s).
And S408, caching each subsequent original video data and original audio data according to the initial call interval duration of the first processing parameter and the second processing parameter.
For example, the first processing parameter is called 0.2 s after the 1st original audio data starts to be processed to cache the 2nd original video data, the second processing parameter is called 0.2 s after the 2nd original video data starts to be processed to cache the 2nd original audio data, and so on, until the second processing parameter is called 0.2 s after the 20th original video data starts to be processed to cache the 20th original audio data.
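Putting steps S404 to S408 together, a minimal sketch of the whole caching loop might look like the following; the placeholder functions (buffer_video, buffer_audio, cache_progress) are assumptions rather than the application's actual interfaces.

```python
import time

def buffer_video(raw_video):  # placeholder for the first processing parameter
    pass

def buffer_audio(raw_audio):  # placeholder for the second processing parameter
    pass

def cache_progress(stream: str) -> float:  # placeholder: current cache position in seconds
    return 0.0

def cache_interval(raw_video_frames, raw_audio_frames, interval_s=0.2, preset_s=0.1):
    for raw_video, raw_audio in zip(raw_video_frames, raw_audio_frames):
        buffer_video(raw_video)          # S404: cache one raw video frame
        time.sleep(interval_s)
        buffer_audio(raw_audio)          # S404: then one raw audio frame
        time.sleep(interval_s)

        # S405: query the current caching progress of both streams
        difference = abs(cache_progress("video") - cache_progress("audio"))
        if difference > preset_s:        # S406 -> S407: widen the call interval duration
            interval_s *= 2
        # S406 -> S408: otherwise keep the initial call interval duration
```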
As can be seen from the above, in the data processing method provided by the application, when a data acquisition request is received, a first request parameter is called to sequentially acquire a first target data module and a second target data module (including a plurality of first sub-target data and a plurality of second sub-target data), and a second request parameter is called to acquire a third target data module; a data processing interval corresponding to the data acquisition request is then acquired according to the third target data module, the processing parameters of the second target data module are acquired according to the first request parameter, and finally the processing parameters are called to process each first sub-target data and each second sub-target data located in the data processing interval. Because the first request parameter and the second request parameter are called simultaneously, the three target data modules are acquired at the same time, which avoids the low audio and video fluency caused by an improper data acquisition order.
Referring to fig. 5, fig. 5 describes a data processing apparatus provided in an embodiment of the present application. The data processing apparatus may include a calling module 10, an interval acquisition module 20, a processing parameter acquisition module 30 and a processing module 40, wherein:
(1) calling module 10
The calling module 10 is configured to, when a data acquisition request is received, call a first request parameter to sequentially acquire a first target data module and a second target data module, and call a second request parameter to acquire a third target data module, where the second target data module includes a plurality of first sub-target data and a plurality of second sub-target data;
(2) section acquisition module 20
An interval obtaining module 20, configured to obtain, according to the third target data module, a data processing interval corresponding to the data obtaining request;
(3) process parameter acquisition module 30
A processing parameter obtaining module 30, configured to obtain a processing parameter of the second target data module according to the first request parameter;
(4) processing module 40
And the processing module 40 is configured to invoke the processing parameter, and process each first sub-target data and each second sub-target data located in the data processing interval.
As can be seen from the above, in the data processing apparatus provided by the application, when a data acquisition request is received, the calling module 10 calls the first request parameter to sequentially acquire the first target data module and the second target data module (including a plurality of first sub-target data and a plurality of second sub-target data) and calls the second request parameter to acquire the third target data module; the interval acquisition module 20 then acquires the data processing interval corresponding to the data acquisition request according to the third target data module, the processing parameter acquisition module 30 acquires the processing parameters of the second target data module according to the first request parameter, and finally the processing module 40 calls the processing parameters to process each first sub-target data and each second sub-target data located in the data processing interval. Because the first request parameter and the second request parameter are called simultaneously, the three target data modules are acquired at the same time, which avoids the low audio and video fluency caused by an improper data acquisition order.
Correspondingly, the embodiment of the invention also provides a data processing system, which comprises any one of the data processing devices provided by the embodiment of the invention, and the data processing device can be integrated in electronic equipment.
Since the data processing system may include any data processing apparatus provided in the embodiment of the present invention, beneficial effects that can be achieved by any data processing apparatus provided in the embodiment of the present invention can be achieved, for details, see the foregoing embodiment, and are not described herein again.
In addition, the embodiment of the application also provides electronic equipment, and the electronic equipment can be equipment such as a smart phone. As shown in fig. 6, the electronic device 600 includes a processor 601, a memory 602. The processor 601 is electrically connected to the memory 602.
The processor 601 is a control center of the electronic device 600, connects various parts of the whole electronic device by using various interfaces and lines, and performs various functions of the electronic device and processes data by running or loading an application program stored in the memory 602 and calling the data stored in the memory 602, thereby performing overall monitoring of the electronic device.
In this embodiment, the processor 601 in the electronic device 600 loads instructions corresponding to processes of one or more application programs into the memory 602 according to the following steps, and the processor 601 runs the application programs stored in the memory 602, thereby implementing various functions:
when a data acquisition request is received, calling a first request parameter to sequentially acquire a first target data module and a second target data module, and calling a second request parameter to acquire a third target data module, wherein the second target data module comprises a plurality of first sub-target data and a plurality of second sub-target data;
acquiring a data processing interval corresponding to the data acquisition request according to a third target data module;
acquiring processing parameters of a second target data module according to the first request parameters;
and calling the processing parameters, and processing the first sub-target data and the second sub-target data which are positioned in the data processing interval.
Fig. 7 is a specific block diagram of an electronic device according to an embodiment of the present invention, where the electronic device may be used to implement the data processing method provided in the foregoing embodiment. The electronic device 700 may be a smartphone or a tablet computer.
The RF circuit 710 is used for receiving and transmitting electromagnetic waves and converting between electromagnetic waves and electrical signals, thereby communicating with a communication network or other devices. The RF circuitry 710 may include various existing circuit elements for performing these functions, such as an antenna, a radio frequency transceiver, a digital signal processor, an encryption/decryption chip, a Subscriber Identity Module (SIM) card, memory, and so forth. The RF circuit 710 may communicate with various networks such as the internet, an intranet or a wireless network, or communicate with other devices over a wireless network. The wireless network may be a cellular telephone network, a wireless local area network or a metropolitan area network. The wireless network may use various communication standards, protocols and technologies, including but not limited to Global System for Mobile Communication (GSM), Enhanced Data GSM Environment (EDGE), Wideband Code Division Multiple Access (WCDMA), Code Division Multiple Access (CDMA), Time Division Multiple Access (TDMA), Wireless Fidelity (Wi-Fi) (e.g., Institute of Electrical and Electronics Engineers (IEEE) standards IEEE 802.11a, IEEE 802.11b, IEEE 802.11g and/or IEEE 802.11n), Voice over Internet Protocol (VoIP), Worldwide Interoperability for Microwave Access (WiMAX), other suitable protocols for short messaging and instant messaging, and any other suitable communication protocol, and may even include protocols that have not yet been developed.
The memory 720 may be used to store software programs and modules, and the processor 780 performs various functional applications and data processing by operating the software programs and modules stored in the memory 720. The memory 720 may include high-speed random access memory, and may also include non-volatile memory, such as one or more magnetic storage devices, flash memory, or other non-volatile solid-state memory. In some examples, memory 720 may further include memory located remotely from processor 780, which may be connected to electronic device 700 via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The input unit 730 may be used to receive input numeric or character information and generate keyboard, mouse, joystick, optical or trackball signal inputs related to user settings and function control. In particular, the input unit 730 may include a touch-sensitive surface 731 as well as other input devices 732. Touch-sensitive surface 731, also referred to as a touch display screen or touch pad, can collect touch operations by a user on or near touch-sensitive surface 731 (e.g., operations by a user on or near touch-sensitive surface 731 using a finger, stylus, or any other suitable object or attachment) and drive the corresponding connection device according to a predetermined program. Alternatively, the touch sensitive surface 731 may comprise two parts, a touch detection means and a touch controller. The touch detection device detects the touch direction of a user, detects a signal brought by touch operation and transmits the signal to the touch controller; the touch controller receives touch information from the touch sensing device, converts it to touch point coordinates, and sends the touch point coordinates to the processor 780, and can receive and execute commands from the processor 780. In addition, the touch-sensitive surface 731 can be implemented in a variety of types, including resistive, capacitive, infrared, and surface acoustic wave. The input unit 730 may also include other input devices 732 in addition to the touch-sensitive surface 731. In particular, other input devices 732 may include, but are not limited to, one or more of a physical keyboard, function keys (such as volume control keys, switch keys, etc.), a trackball, a mouse, a joystick, and the like.
The display unit 740 may be used to display information input by or provided to the user and various graphical user interfaces of the electronic device 700, which may be made up of graphics, text, icons, video, and any combination thereof. The Display unit 740 may include a Display panel 741, and optionally, the Display panel 741 may be configured in the form of an LCD (Liquid Crystal Display), an OLED (Organic Light-Emitting Diode), or the like. Further, touch-sensitive surface 731 can overlay display panel 741, such that when touch-sensitive surface 731 detects a touch event thereon or nearby, processor 780 can determine the type of touch event, and processor 780 can then provide a corresponding visual output on display panel 741 based on the type of touch event. Although in FIG. 7 the touch-sensitive surface 731 and the display panel 741 are implemented as two separate components to implement input and output functions, in some embodiments the touch-sensitive surface 731 and the display panel 741 may be integrated to implement input and output functions.
The electronic device 700 may also include at least one sensor 750, such as light sensors, motion sensors, and other sensors. Specifically, the light sensor may include an ambient light sensor that may adjust the brightness of the display panel 741 according to the brightness of ambient light, and a proximity sensor that may turn off the display panel 741 and/or a backlight when the electronic device 700 is moved to the ear. As one of the motion sensors, the gravity acceleration sensor can detect the magnitude of acceleration in each direction (generally, three axes), can detect the magnitude and direction of gravity when the mobile phone is stationary, and can be used for applications of recognizing the posture of the mobile phone (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), vibration recognition related functions (such as pedometer and tapping), and the like; as for other sensors such as a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor, which may be further configured to the electronic device 700, detailed descriptions thereof are omitted.
The audio circuit 760, speaker 761, and microphone 762 may provide an audio interface between a user and the electronic device 700. The audio circuit 760 can transmit the electrical signal converted from the received audio data to the speaker 761, and the electrical signal is converted into a sound signal by the speaker 761 and output; on the other hand, the microphone 762 converts the collected sound signal into an electric signal, converts the electric signal into audio data after being received by the audio circuit 760, processes the audio data by the audio data output processor 780, and transmits the processed audio data to, for example, another terminal via the RF circuit 710, or outputs the audio data to the memory 720 for further processing. The audio circuitry 760 may also include an earbud jack to provide communication of a peripheral headset with the electronic device 700.
The electronic device 700, via the transport module 770 (e.g., a Wi-Fi module), may assist a user in sending and receiving e-mail, browsing web pages, accessing streaming media, etc., which provides wireless broadband internet access to the user. Although fig. 7 shows the transmission module 770, it is understood that it does not belong to the essential constitution of the electronic device 700 and may be omitted entirely within the scope not changing the essence of the invention as needed.
The processor 780 is a control center of the electronic device 700, connects various parts of the entire mobile phone using various interfaces and lines, and performs various functions of the electronic device 700 and processes data by operating or executing software programs and/or modules stored in the memory 720 and calling data stored in the memory 720, thereby integrally monitoring the mobile phone. Optionally, processor 780 may include one or more processing cores; in some embodiments, processor 780 may integrate an application processor that handles primarily the operating system, user interface, applications, etc. and a modem processor that handles primarily wireless communications. It will be appreciated that the modem processor described above may not be integrated into processor 780.
The electronic device 700 also includes a power supply 790 (e.g., a battery) that provides power to various components, and in some embodiments may be logically coupled to the processor 780 via a power management system that may perform functions such as managing charging, discharging, and power consumption. The power supply 790 may also include any component including one or more dc or ac power sources, recharging systems, power failure detection circuitry, power converters or inverters, power status indicators, and the like.
Although not shown, the electronic device 700 may further include a camera (e.g., a front camera, a rear camera), a bluetooth module, and the like, which are not described in detail herein. Specifically, in this embodiment, the display unit of the electronic device is a touch screen display, the electronic device further includes a memory, and one or more programs, where the one or more programs are stored in the memory and configured to be executed by the one or more processors, and the one or more programs include instructions for:
when a data acquisition request is received, calling a first request parameter to sequentially acquire a first target data module and a second target data module, and calling a second request parameter to acquire a third target data module, wherein the second target data module comprises a plurality of first sub-target data and a plurality of second sub-target data;
acquiring a data processing interval corresponding to the data acquisition request according to a third target data module;
acquiring processing parameters of a second target data module according to the first request parameters;
and calling the processing parameters, and processing the first sub-target data and the second sub-target data which are positioned in the data processing interval.
In specific implementation, the above modules may be implemented as independent entities, or may be combined arbitrarily to be implemented as the same or several entities, and specific implementation of the above modules may refer to the foregoing method embodiments, which are not described herein again.
It will be understood by those skilled in the art that all or part of the steps of the methods of the above embodiments may be performed by instructions or by associated hardware controlled by the instructions, which may be stored in a computer readable storage medium and loaded and executed by a processor. To this end, the present invention provides a storage medium, in which a plurality of instructions are stored, and the instructions can be loaded by a processor to execute the steps in any one of the data processing methods provided by the embodiments of the present invention.
Wherein the storage medium may include: read Only Memory (ROM), Random Access Memory (RAM), magnetic or optical disks, and the like.
Since the instructions stored in the storage medium can execute the steps in any data processing method provided in the embodiment of the present invention, the beneficial effects that can be achieved by any data processing method provided in the embodiment of the present invention can be achieved, which are detailed in the foregoing embodiments and will not be described herein again.
The above operations can be implemented in the foregoing embodiments, and are not described in detail herein.
In summary, although the present application has been described with reference to the preferred embodiments, the above-described preferred embodiments are not intended to limit the present application, and those skilled in the art can make various changes and modifications without departing from the spirit and scope of the present application, so that the scope of the present application shall be determined by the scope of the appended claims.

Claims (10)

1. A data processing method, comprising:
when a data acquisition request is received, calling a first request parameter to sequentially acquire a first target data module and a second target data module, and calling a second request parameter to acquire a third target data module, wherein the second target data module comprises a plurality of first sub-target data and a plurality of second sub-target data;
acquiring a data processing interval corresponding to the data acquisition request according to the third target data module;
acquiring processing parameters of the second target data module according to the first request parameters;
and calling the processing parameters to process each first sub-target data and each second sub-target data positioned in the data processing interval.
2. The data processing method of claim 1, wherein the data acquisition request includes an audio/video play request, the first target data module includes metadata of an audio/video file, the second target data module includes original video data and original audio data of the audio/video file, the third target data includes index data of the audio/video file, and when the data acquisition request is received, the first request parameter is invoked to sequentially acquire the first target data module and the second target data module, and the second request parameter is invoked to acquire the third target data module, including:
when an audio and video playing request is received, calling a first request parameter to sequentially obtain metadata of an audio and video file and original audio data and original video data of the audio and video file, and calling a second request parameter to obtain index data of the audio and video file.
3. The data processing method according to claim 2, wherein the acquiring, according to the third target data module, the data processing interval corresponding to the data acquisition request includes:
determining initial cache data and ending cache data corresponding to the data acquisition request according to the index data;
and determining an interval between the initial cache data and the ending cache data as a data processing interval.
4. The data processing method according to claim 3, wherein the first sub-target data is the original video data, the second sub-target data is the original audio data, the processing parameters include a first processing parameter and a second processing parameter, and the invoking the processing parameters to process each first sub-target data and each second sub-target data located in the data processing interval includes:
and alternately calling the first processing parameter and the second processing parameter to alternately buffer each original audio data and each original video data in an interval between the initial cache data and the ending cache data.
5. The data processing method according to claim 4, further comprising, after the step of calling the processing parameter and processing each first sub-target data and each second sub-target data located in the data processing interval, the step of:
when the alternate buffering of one original audio data and one original video data is finished, obtaining one video data and one audio data;
inquiring the current caching progress of the video data and the audio data;
obtaining a cache progress difference value between the original video data and the original audio data according to the current cache progress of the video data and the audio data;
when the cache progress difference is larger than a preset value, in the process of caching subsequent video data and audio data, adjusting the calling interval duration of the first processing parameter and the second processing parameter so as to enable the cache progress difference between the subsequent video data and the audio data to be smaller than or equal to the preset value.
6. A data processing apparatus, comprising:
the calling module is used for calling a first request parameter to sequentially obtain a first target data module and a second target data module and calling a second request parameter to obtain a third target data module when a data obtaining request is received, wherein the second target data module comprises a plurality of first sub-target data and a plurality of second sub-target data;
the interval acquisition module is used for acquiring a data processing interval corresponding to the data acquisition request according to the third target data module;
the processing parameter acquisition module is used for acquiring the processing parameters of the second target data module according to the first request parameters;
and the processing module is used for calling the processing parameters and processing the first sub-target data and the second sub-target data which are positioned in the data processing interval.
7. The data processing apparatus according to claim 6, wherein the data acquisition request includes an audio/video play request, the first target data module includes metadata of an audio/video file, the first sub-target data includes original video data of the audio/video file, the second sub-target data includes original audio data of the audio/video file, the third target data includes index data of the audio/video file, and the calling module is specifically configured to:
when an audio and video playing request is received, calling a first request parameter to sequentially obtain metadata of an audio and video file and original audio data and original video data of the audio and video file, and calling a second request parameter to obtain index data of the audio and video file.
8. The data processing apparatus according to claim 6, wherein the interval acquisition module is specifically configured to:
determining initial cache data and ending cache data corresponding to the data acquisition request according to the index data;
and determining an interval between the initial cache data and the ending cache data as a data processing interval.
9. A computer-readable storage medium having stored thereon a plurality of instructions adapted to be loaded by a processor to perform the steps of the data processing method of any of claims 1 to 5.
10. An electronic device comprising a processor and a memory, the processor being electrically connected to the memory, the memory being configured to store instructions and data, the processor being configured to perform the steps of the data processing method according to any one of claims 1 to 5.
CN202111580266.4A 2021-12-22 2021-12-22 Data processing method, data processing device, storage medium and electronic equipment Pending CN114237941A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111580266.4A CN114237941A (en) 2021-12-22 2021-12-22 Data processing method, data processing device, storage medium and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111580266.4A CN114237941A (en) 2021-12-22 2021-12-22 Data processing method, data processing device, storage medium and electronic equipment

Publications (1)

Publication Number Publication Date
CN114237941A true CN114237941A (en) 2022-03-25

Family

ID=80761244

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111580266.4A Pending CN114237941A (en) 2021-12-22 2021-12-22 Data processing method, data processing device, storage medium and electronic equipment

Country Status (1)

Country Link
CN (1) CN114237941A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination