CN113115039A - Working frequency determination method and device and electronic equipment - Google Patents

Working frequency determination method and device and electronic equipment

Info

Publication number
CN113115039A
Authority
CN
China
Prior art keywords
sub
determining
frequency
processed
packet
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110396719.1A
Other languages
Chinese (zh)
Other versions
CN113115039B (en)
Inventor
孙振燕
李�荣
罗小伟
郭春磊
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Spreadtrum Communications Shanghai Co Ltd
Original Assignee
Spreadtrum Communications Shanghai Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Spreadtrum Communications Shanghai Co Ltd filed Critical Spreadtrum Communications Shanghai Co Ltd
Priority to CN202110396719.1A priority Critical patent/CN113115039B/en
Publication of CN113115039A publication Critical patent/CN113115039A/en
Application granted granted Critical
Publication of CN113115039B publication Critical patent/CN113115039B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/10Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
    • H04N19/134Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or criterion affecting or controlling the adaptive coding
    • H04N19/146Data rate or code amount at the encoder output
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/10Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
    • H04N19/134Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or criterion affecting or controlling the adaptive coding
    • H04N19/156Availability of hardware or computational resources, e.g. encoding based on power-saving criteria

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computing Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Compression Or Coding Systems Of Tv Signals (AREA)

Abstract

The application provides a working frequency determination method and apparatus, and an electronic device, relating to the field of terminal technologies. The working frequency determination method comprises the following steps: first, in response to each task start instruction and/or task termination instruction sent by an upper-layer application, each to-be-processed instance is updated according to those instructions. Then, when the number of updated to-be-processed instances is determined not to be 0, updated sub-packets of each to-be-processed instance sent by the upper-layer application are received. Finally, the total amount of video data contained in the received sub-packets is determined, and the working frequency for processing the sub-packets is determined from the available frequencies according to that total amount. In this way, the power consumption of the video codec module can be reduced while meeting different performance requirements.

Description

Working frequency determination method and device and electronic equipment
[ technical field ]
The present application relates to the field of terminal technologies, and in particular, to a method and an apparatus for determining a working frequency, and an electronic device.
[ background of the invention ]
Video coding refers to compressing original video data according to a certain standard or protocol to generate code stream data; video decoding refers to restoring code stream data into the original video data. Video coding and decoding technology is widely used in scenarios such as video playback, video recording, network video calls, and live streaming, and many common terminal devices, such as mobile phones and vehicles, include a video codec module.
Generally, the higher the frequency of the video codec module, the faster its operation speed and the higher the performance indexes it can support, such as resolution and frame rate. Therefore, to meet users' performance requirements, the frequency of the video codec module is usually set high. However, a high frequency setting also increases the power consumption of the video codec module, which leads to unnecessary power waste in scenarios with low performance requirements. How to set the frequency of the video codec module so as to reduce its power consumption while still meeting different performance requirements is a problem to be solved.
[ summary of the invention ]
The embodiments of the application provide a working frequency determination method and apparatus, and an electronic device, in which the working frequency of the video coding and decoding device is dynamically adjusted according to the performance requirements of different application scenarios. Therefore, the power consumption of the video codec module can be reduced while meeting different performance requirements.
In a first aspect, an embodiment of the present application provides a method for determining an operating frequency, applied to a video coding and decoding device, including: responding to each task start instruction and/or task termination instruction sent by an upper-layer application, and updating each to-be-processed instance according to each task start instruction and/or task termination instruction; when the number of the updated to-be-processed instances is determined not to be 0, receiving updated sub-packets of the to-be-processed instances sent by the upper-layer application, where a sub-packet is one data packet among all the to-be-processed data packets corresponding to a to-be-processed instance; determining the total amount of video data contained in the received sub-packets, and determining the processing frequency corresponding to that total amount of video data; and determining, from the available frequencies according to the processing frequency, the working frequency for processing the sub-packets.
In one possible implementation manner, updating each to-be-processed instance according to each task start instruction and/or task termination instruction includes: adding, according to each task start instruction, the instance corresponding to that task start instruction as a to-be-processed instance; and/or deleting, according to each task termination instruction, the instance corresponding to that task termination instruction from the to-be-processed instances.
In one possible implementation manner, determining the total amount of video data included in each received sub-packet includes: determining the resolution and frame rate of the video data contained in each sub-data packet; and determining the total amount of the video data contained in each sub-packet according to the resolution and the frame rate of the video data contained in each sub-packet.
In one possible implementation manner, determining the resolution and the frame rate of the video data included in each sub-packet includes: if the to-be-processed instance corresponding to the sub-packet is a to-be-encoded instance, determining the resolution and the frame rate of the original video data according to the original video data contained in the sub-packet; if the to-be-processed instance corresponding to the sub-packet is a to-be-decoded instance, determining the resolution of the original video data corresponding to the code stream data according to the header information of the code stream data contained in the sub-packet, and determining the frame rate of the original video data corresponding to the code stream data according to the decoding time stamp contained in the sub-packet.
In one possible implementation manner, determining, from the available frequencies according to the processing frequency, an operating frequency for processing each sub-packet includes: determining whether the available frequencies include a frequency greater than the processing frequency; if the available frequencies include frequencies greater than the processing frequency, determining the smallest of those frequencies as the working frequency for processing the sub-packets; and if all the available frequencies are smaller than the processing frequency, determining the largest available frequency as the working frequency for processing the sub-packets.
In one possible implementation manner, after the working frequency for processing each sub-packet is determined from the available frequencies, the method further includes: encoding and decoding the received sub-packets of each to-be-processed instance in sequence according to the working frequency.
In one possible implementation manner, the method further includes: and when the number of the updated to-be-processed examples is determined to be 0, terminating the encoding and decoding and powering down.
In a second aspect, an embodiment of the present application provides an operating frequency determining apparatus, including: a response module, configured to respond to each task start instruction and/or task termination instruction sent by the upper-layer application and update each to-be-processed instance according to each task start instruction and/or task termination instruction; a receiving module, configured to receive updated sub-packets of each to-be-processed instance sent by the upper-layer application when it is determined that the number of updated to-be-processed instances is not 0, where a sub-packet is one data packet among all the to-be-processed data packets corresponding to a to-be-processed instance; a first determining module, configured to determine the total amount of video data contained in the received sub-packets and determine the processing frequency corresponding to that total amount; and a second determining module, configured to determine, from the available frequencies according to the processing frequency, the working frequency for processing the sub-packets.
In a third aspect, an embodiment of the present application provides an electronic device, including: at least one processor; and at least one memory communicatively coupled to the processor, wherein: the memory stores program instructions executable by the processor, which when called by the processor are capable of performing the method as described above.
In a fourth aspect, embodiments of the present application provide a computer-readable storage medium storing computer instructions that cause the computer to perform the method described above.
In the above technical solution, first, in response to each task start instruction and/or task termination instruction sent by the upper-layer application, each to-be-processed instance is updated according to those instructions. Then, when the number of updated to-be-processed instances is determined not to be 0, updated sub-packets of each to-be-processed instance sent by the upper-layer application are received; a sub-packet is one of all the to-be-processed data packets corresponding to a to-be-processed instance. Finally, the total amount of video data contained in the received sub-packets is determined, the processing frequency corresponding to that total amount is determined, and the working frequency for processing the sub-packets is determined from the available frequencies according to the processing frequency. In this way, the power consumption of the video codec module can be reduced while meeting different performance requirements.
[ description of the drawings ]
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings needed to be used in the embodiments will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present application, and it is obvious for those skilled in the art to obtain other drawings based on these drawings without creative efforts.
Fig. 1 is a flowchart of an operating frequency determining method according to an embodiment of the present disclosure;
fig. 2 is a flowchart of another operating frequency determining method provided in an embodiment of the present application;
fig. 3 is a flowchart of another operating frequency determining method provided in an embodiment of the present application;
fig. 4 is a flowchart of another operating frequency determining method provided in the embodiment of the present application;
fig. 5 is a schematic structural diagram of an operating frequency determining apparatus according to an embodiment of the present application;
fig. 6 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
[ detailed description ]
For better understanding of the technical solutions of the present application, the following detailed descriptions of the embodiments of the present application are provided with reference to the accompanying drawings.
It should be understood that the embodiments described are only a few embodiments of the present application, and not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The terminology used in the embodiments of the present application is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used in the examples of this application and the appended claims, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
The working frequency determining method provided by the embodiment of the application can be applied to any video coding and decoding equipment. The video coding and decoding device can be any terminal device, such as a video coding and decoding module in a mobile phone, a vehicle, a monitoring device and the like.
Fig. 1 is a flowchart of an operating frequency determining method provided in an embodiment of the present application, and as shown in fig. 1, the operating frequency determining method may include:
step 101, responding to each task starting instruction and/or task ending instruction sent by the upper layer application, and updating each to-be-processed instance according to each task starting instruction and/or task ending instruction.
The operation modes of the video coding and decoding device are divided into two modes, including a single-instance mode and a multi-instance mode.
In the single-instance mode, the video coding and decoding device only supports one-way video coding or only supports one-way video decoding. The corresponding actual scene is, for example, a video recording, or a video playing, etc.
In the multiple-instance mode, the video coding and decoding device can support the alternation of multiple video coding and multiple video decoding in a time division multiplexing mode. The corresponding actual scene includes, for example, one path of video coding and one path of video decoding included in the video call, front and rear shooting video coding and preview video coding included in the automobile data recorder, and the like.
When the upper-layer application needs to call the video coding and decoding device for coding or decoding, it can send a task start instruction to the device to instruct it to add a new to-be-processed instance. The upper-layer application can then send the data packets corresponding to that to-be-processed instance to the device. After all the data packets corresponding to the to-be-processed instance have been coded or decoded, the upper-layer application can send a task termination instruction to the device to instruct it to delete the corresponding to-be-processed instance.
In the embodiment of the application, a counter count may be set that is automatically incremented or decremented whenever a to-be-processed instance is added or deleted, so that the number of to-be-processed instances at any moment is known. When the counter count indicates that the number of to-be-processed instances is 0, the encoding and decoding flow can be terminated and the codec device powered down.
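As a non-authoritative illustration of this bookkeeping (the class and method names below are not from the patent), a minimal Python sketch maintaining the counter as task start and task termination instructions arrive might look like this:

```python
class PendingInstances:
    """Tracks to-be-processed instances; mirrors the counter 'count' described above."""

    def __init__(self):
        self.instances = {}   # instance_id -> pending packet queue
        self.count = 0        # number of to-be-processed instances

    def on_task_start(self, instance_id):
        # A task start instruction adds a new to-be-processed instance.
        self.instances[instance_id] = []
        self.count += 1

    def on_task_terminate(self, instance_id):
        # A task termination instruction deletes the corresponding instance.
        if instance_id in self.instances:
            del self.instances[instance_id]
            self.count -= 1
        return self.count == 0   # True -> terminate codec and power down
```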
Step 102, receiving updated sub-packets of each to-be-processed instance sent by the upper-layer application when the number of updated to-be-processed instances is determined not to be 0.
In this embodiment of the present application, a sub-packet refers to one of all the to-be-processed data packets corresponding to a to-be-processed instance. That is, for each to-be-processed instance, the upper-layer application sends only one of that instance's pending data packets at a time, and that packet contains only one frame of the pending video data. It follows that the number of sub-packets received by the video codec device at a time equals the number of to-be-processed instances.
Step 103, determining the total amount of video data contained in each received sub-packet, and determining the processing frequency corresponding to the total amount of video data.
In this embodiment, the size of the video data included in each sub-packet may be determined according to the resolution and the frame rate of the video data included in the sub-packet. The resolution is the number of pixels of an image or video in both the length and width directions, and for example, 720p means 1280 × 720, and 1080p means 1920 × 1080. Frame rate is used to describe the number of frames per second a video is played/recorded. Resolution and frame rate are commonly used as measures of video display performance.
In the embodiment of the present application, the amount of video data is expressed by the product of the resolution and the frame rate: the larger this product, the larger the video data is considered to be. It can be understood that the more to-be-processed instances there are, and the larger the video data in the data packet corresponding to each instance, the higher the requirement on the operating frequency of the video coding and decoding device. Therefore, the working frequency of the device can be dynamically adjusted according to the total amount of video data contained in the received sub-packets.
First, the resolution and frame rate of the video data contained in each sub-packet may be determined.
In the embodiment of the present application, the determination methods of the resolution and the frame rate of the video data included in the sub-packets corresponding to the to-be-encoded instance and the to-be-decoded instance are different.
Specifically, if the to-be-processed instance corresponding to the sub-packet is a to-be-encoded instance, the data contained in the sub-packet is original video data, and the corresponding resolution and frame rate can be read directly.
If the to-be-processed instance corresponding to the sub-packet is a to-be-decoded instance, the data contained in the sub-packet is code stream data, that is, binary data generated by compressing the original video data. In this case, the header information of the code stream data can be parsed to obtain the resolution. For example, in the H.264 compression standard, resolution information is carried in the Sequence Parameter Set (SPS), so the resolution of the data to be decoded can be obtained by parsing the SPS.
Further, in addition to the code stream data, the sub-packet includes a Decoding Time Stamp (DTS). The DTS indicates the time interval, usually in milliseconds, at which the upper layer delivers one frame of code stream data to the codec device. Therefore, the frame rate of the video data contained in the sub-packet can be calculated according to the following formula:
R = 1000 / TS_D
where R represents the frame rate of the video data contained in the sub-packet, and TS_D represents the value of the decoding time stamp, in milliseconds.
Then, according to the resolution and frame rate of the video data contained in each sub-packet, the total amount of video data contained in each sub-packet is determined.
In the embodiment of the present application, the total amount of video data contained in each sub-packet may be calculated according to the following formula:
C = Σ_i (w_i × h_i × r_i)
where C denotes the total amount of video data, w_i denotes the width of the video data contained in the i-th sub-packet, h_i denotes its height, and r_i denotes its frame rate; the sum runs over all received sub-packets.
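As an illustration only, the following Python sketch computes the frame rate from the DTS interval and the total amount of video data C as defined above; the dictionary keys (width, height, frame_rate, dts_ms) are assumed field names, not taken from the patent:

```python
def frame_rate_from_dts(dts_interval_ms):
    # The DTS gives the per-frame interval in milliseconds, so R = 1000 / TS_D.
    return 1000.0 / dts_interval_ms

def total_video_data(sub_packets):
    # C = sum over i of w_i * h_i * r_i
    total = 0.0
    for pkt in sub_packets:
        rate = pkt["frame_rate"] if "frame_rate" in pkt else frame_rate_from_dts(pkt["dts_ms"])
        total += pkt["width"] * pkt["height"] * rate
    return total

# Example: one 720p@30 encode instance and one 1080p decode instance
# whose DTS interval is 40 ms (i.e. 25 fps)
packets = [
    {"width": 1280, "height": 720, "frame_rate": 30},
    {"width": 1920, "height": 1080, "dts_ms": 40},
]
C = total_video_data(packets)   # 1280*720*30 + 1920*1080*25 = 79,488,000
```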
Finally, a processing frequency corresponding to the total amount of video data is determined.
In the embodiment of the application, the video coding and decoding equipment has a plurality of available frequencies, and each frequency corresponds to one frequency point. For example, table 1 shows the correspondence between each available frequency and a frequency point of the video encoding and decoding device.
Device frequency point    0         1         2         3
Available frequency       176 MHz   256 MHz   384 MHz   512 MHz
TABLE 1
Further, each available frequency corresponds to a maximum video processing capability, expressed in terms of the resolution, frame rate, and number of instances of video data. For convenience of description, the embodiments of the present application express the maximum video processing capability of each available frequency in the form (resolution, frame rate, number of instances). For example, (720p, 30fps, 1) indicates that at most one channel of (720p, 30fps) video data can be encoded or decoded.
Based on this, the embodiment of the present application may utilize the corresponding relationship between each available frequency and the maximum video processing capability to calculate the processing frequency corresponding to the total amount of video data included in each sub-packet.
First, a reference frequency may be selected from various available frequencies of the video codec device. For example, the available frequency with the smallest frequency value, that is, the available frequency corresponding to frequency point 0, may be selected as the reference frequency.
Then, the processing frequency corresponding to the total amount of video data contained in each sub-packet can be calculated according to the following formula:
f = f_0 × C / (w_0 × h_0 × r_0)
where f is the processing frequency, f_0 is the reference frequency, w_0 is the width of the reference video data, h_0 is its height, and r_0 is its frame rate. C is the total amount of video data contained in the sub-packets, obtained in the preceding step. The reference video data is the video data corresponding to the maximum video processing capability of the reference frequency.
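A minimal sketch of this linear scaling is given below, assuming the reference frequency is the 176 MHz frequency point and that its maximum capability is one channel of (720p, 30fps) video; these reference values are illustrative assumptions, not stated limits of any actual device:

```python
def processing_frequency(total_video_data, ref_freq_hz=176e6,
                         ref_width=1280, ref_height=720, ref_frame_rate=30):
    # f = f0 * C / (w0 * h0 * r0): scale the reference frequency by the ratio of
    # the requested video data amount to the reference capability.
    ref_capacity = ref_width * ref_height * ref_frame_rate
    return ref_freq_hz * total_video_data / ref_capacity

processing_frequency(79_488_000)   # ~506 MHz for the example packets above
```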
Step 104, determining the working frequency for processing each sub-packet from the available frequencies according to the processing frequency.
It is understood that the processing frequency obtained in the foregoing steps is an ideal result calculated from a linear relationship between the frequency and the maximum video processing capability, and the available frequencies of the video codec device may not contain a value exactly equal to it. Therefore, the final working frequency is selected from the available frequencies based on the calculated processing frequency.
Specifically, it may be determined whether the available frequencies include a frequency greater than the processing frequency.
In one possible case, the available frequencies include frequencies greater than the processing frequency. In this case, the smallest of those frequencies may be determined as the working frequency for processing the sub-packets, so that redundant power consumption is avoided.
In another possible case, all the available frequencies are smaller than the processing frequency. In this case, the largest available frequency may be determined as the working frequency for processing the sub-packets. It should be noted that the processing capability required by the sub-packets then exceeds the upper limit that the video codec device can support.
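The selection rule just described can be sketched as follows; this is a simplified illustration, and the frequency list simply mirrors Table 1:

```python
def select_operating_frequency(processing_freq_hz, available_freqs_hz):
    # Prefer the smallest available frequency above the processing frequency;
    # if none exists, fall back to the largest available frequency.
    higher = [f for f in available_freqs_hz if f > processing_freq_hz]
    return min(higher) if higher else max(available_freqs_hz)

available = [176e6, 256e6, 384e6, 512e6]        # frequency points 0-3 from Table 1
select_operating_frequency(300e6, available)    # -> 384e6
select_operating_frequency(600e6, available)    # -> 512e6 (demand exceeds the device's upper limit)
```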
In the embodiment of the application, the video coding and decoding device updates each to-be-processed instance according to each task start instruction and/or task termination instruction sent by the upper-layer application. It then receives the updated sub-packets of each to-be-processed instance sent by the upper-layer application and dynamically adjusts the working frequency according to the total amount of video data contained in those sub-packets. In this way, the power consumption of the video codec module can be reduced while meeting different performance requirements.
Fig. 2 is a flowchart of another operating frequency determining method provided in the embodiment of the present application, and as shown in fig. 2, after step 104 in the embodiment shown in fig. 1 of the present application, the method may further include:
Step 201, encoding and decoding the received sub-packets of each to-be-processed instance in sequence according to the working frequency.
In this embodiment, the video encoding and decoding device may sequentially perform encoding and decoding processing on data included in each sub-packet according to the receiving sequence of each sub-packet.
For convenience of explanation, take the sub-packets of two to-be-processed instances as an example. As shown in fig. 3, if the video codec device receives the sub-packet of to-be-processed instance 1 first, then after the working frequency is determined, the encoding and decoding flow of the sub-packets of the two instances is as follows:
step 301, the example 1 to be processed acquires equipment.
In the embodiment of the present application, the acquiring device refers to locking a video encoding and decoding device. During one instance acquisition device, the video codec device can only process packets for that instance.
Step 302, start encoding/decoding.
If the sub-packet corresponding to to-be-processed instance 1 is a packet to be decoded, the video codec device starts the decoding operation at the working frequency obtained in the foregoing embodiment; otherwise, it starts the encoding operation.
In step 303, the encoding/decoding is finished.
In step 304, to-be-processed instance 1 releases the device.
Releasing the device means unlocking the video codec device. After the device is released, it can be acquired again.
Step 305, to-be-processed instance 2 acquires the device.
Step 306, start encoding/decoding.
In step 307, the encoding/decoding is finished.
Step 308, to-be-processed instance 2 releases the device.
In the embodiment of the present application, after all the sub-packets corresponding to all the to-be-processed instances are processed, the video encoding and decoding device may receive a new sub-packet corresponding to each to-be-processed instance sent by the upper layer application again.
After receiving the new sub-packets, the video codec device re-determines the working frequency and encodes or decodes each new sub-packet at the re-determined frequency. This cycle repeats until all data packets corresponding to a to-be-processed instance have been processed, at which point the upper-layer application sends the task termination instruction for that instance to the video codec device.
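The acquire/process/release cycle of fig. 3 can be modeled with a lock, as in the hedged sketch below; encode_fn and decode_fn are hypothetical stand-ins for the device's actual encode and decode entry points, which the patent text does not specify:

```python
import threading

device_lock = threading.Lock()  # models locking/unlocking the video codec device

def process_round(sub_packets, operating_freq_hz, encode_fn, decode_fn):
    """Processes one sub-packet per to-be-processed instance, in arrival order."""
    for pkt in sub_packets:
        with device_lock:                        # instance acquires the device (steps 301/305)
            if pkt.get("is_decode", False):
                decode_fn(pkt, operating_freq_hz)
            else:
                encode_fn(pkt, operating_freq_hz)
        # leaving the "with" block releases the device (steps 304/308)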
Fig. 4 is a flowchart of another operating frequency determining method according to an embodiment of the present application. For convenience of understanding, in the embodiment of the present application, a specific implementation flow is taken as an example to describe the operating frequency determining method provided by the present application.
Step 401, receiving a task start instruction sent by an upper layer application.
Step 402, allocating processing resources for the instance corresponding to the task start instruction, and incrementing the counter count.
Step 403, receiving the sub-data packet sent by the upper layer application.
Step 404, determining the working frequency according to the total amount of the video data of the sub-packets.
Step 405, encoding/decoding the sub-packet according to the determined working frequency.
Step 406, determining whether a task termination instruction sent by the upper layer application is received, and if so, executing step 407; otherwise, step 403 is performed.
Step 407, releasing the corresponding processing resources, and decrementing the counter count.
Step 408, judging whether the counter count is greater than 0, if so, executing step 403; otherwise, step 409 is performed.
Step 409, terminating the encoding and decoding, and powering down the device.
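Tying these steps together, one pass through steps 403-408 might be sketched as follows in a simplified single-threaded model, reusing the helpers sketched earlier; receive_sub_packets, encode_fn, and decode_fn remain hypothetical hooks standing in for the upper-layer interface and the codec hardware:

```python
def run_one_cycle(pending, receive_sub_packets, encode_fn, decode_fn):
    # Step 409: when no to-be-processed instance remains, terminate and power down.
    if pending.count == 0:
        return False

    sub_packets = receive_sub_packets(pending.count)          # step 403
    total = total_video_data(sub_packets)                     # step 404 (total amount C)
    freq = select_operating_frequency(processing_frequency(total),
                                      [176e6, 256e6, 384e6, 512e6])
    process_round(sub_packets, freq, encode_fn, decode_fn)    # step 405
    return True                                               # count > 0: loop back to step 403
```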
Fig. 5 is a schematic structural diagram of an operating frequency determining apparatus according to an embodiment of the present application. The operating frequency determining apparatus in this embodiment may be used as an operating frequency determining device to implement the operating frequency determining method provided in this embodiment. As shown in fig. 5, the operating frequency determining means may include: a response module 41, a receiving module 42, a first determining module 43 and a second determining module 44.
The response module 41 is configured to respond to each task start instruction and/or task termination instruction sent by the upper-layer application, and to update each to-be-processed instance according to each task start instruction and/or task termination instruction.
And the receiving module 42 is configured to receive the updated sub packets of each to-be-processed instance sent by the upper layer application when it is determined that the number of each to-be-processed instance after updating is not 0. The sub-packet is one of all the to-be-processed packets corresponding to the to-be-processed instance.
The first determining module 43 is configured to determine a total amount of video data included in each received sub-packet, and determine a processing frequency corresponding to the total amount of video data.
And a second determining module 44, configured to determine, according to the processing frequency, an operating frequency for processing each sub-packet from each available frequency.
In a specific implementation manner, when responding to each task start instruction and/or task termination instruction sent by the upper-layer application and updating each to-be-processed instance accordingly, the response module 41 is specifically configured to add, according to each task start instruction, the instance corresponding to that instruction as a to-be-processed instance, and/or to delete, according to each task termination instruction, the instance corresponding to that instruction from the to-be-processed instances.
In a specific implementation manner, the first determining module 43 is specifically configured to determine a resolution and a frame rate of video data included in each sub-packet when the first determining module is configured to determine a total amount of video data included in each received sub-packet. And determining the total amount of the video data contained in each sub-packet according to the resolution and the frame rate of the video data contained in each sub-packet.
In a specific implementation manner, the determining, by the first determining module 43, of the resolution and the frame rate of the video data included in each sub-packet includes: if the to-be-processed instance corresponding to the sub-packet is a to-be-encoded instance, determining the resolution and the frame rate of the original video data according to the original video data contained in the sub-packet; and if the to-be-processed instance corresponding to the sub-packet is a to-be-decoded instance, determining the resolution of the original video data corresponding to the code stream data according to the header information of the code stream data contained in the sub-packet, and determining the frame rate of the original video data corresponding to the code stream data according to the decoding time stamp contained in the sub-packet.
In a specific implementation manner, when the second determining module 44 is configured to determine, according to the processing frequency, an operating frequency for processing each sub-packet from each available frequency, specifically, to determine whether each available frequency includes an available frequency greater than the processing frequency. If the available frequencies include available frequencies greater than the processing frequency, the smallest available frequency of the available frequencies greater than the processing frequency is determined as the operating frequency for processing the sub-packets. And if the available frequencies are determined to be smaller than the processing frequency, determining the maximum available frequency in the available frequencies as the working frequency for processing each sub-packet.
In a specific implementation manner, the second determining module 44 is configured to determine, according to the processing frequency, a working frequency for processing each sub packet from each available frequency, and then sequentially encode and decode the received sub packets of each to-be-processed instance according to the working frequency.
In a specific implementation manner, when the receiving module 42 determines that the updated number of each to-be-processed instance is 0, the receiving module is further configured to terminate the codec and power down.
First, the response module 41 responds to each task start instruction and/or task termination instruction sent by the upper-layer application and updates each to-be-processed instance according to those instructions. Then, when the receiving module 42 determines that the number of updated to-be-processed instances is not 0, it receives the updated sub-packets of each to-be-processed instance sent by the upper-layer application. Finally, the first determining module 43 determines the total amount of video data contained in the received sub-packets and the corresponding processing frequency, and the second determining module 44 determines, from the available frequencies according to the processing frequency, the working frequency for processing the sub-packets. In this way, the power consumption of the video codec module can be reduced while meeting different performance requirements.
Fig. 6 is a schematic structural diagram of an electronic device according to an embodiment of the present application. As shown in fig. 6, the electronic device may include at least one processor; and at least one memory communicatively coupled to the processor, wherein: the memory stores program instructions executable by the processor, and the processor calls the program instructions to execute the working frequency determination method provided by the embodiment of the application.
The electronic device may be an operating frequency determining device, and the embodiment does not limit the specific form of the electronic device.
FIG. 6 illustrates a block diagram of an exemplary electronic device suitable for use in implementing embodiments of the present application. The electronic device shown in fig. 6 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present application.
As shown in fig. 6, the electronic device is in the form of a general purpose computing device. Components of the electronic device may include, but are not limited to: one or more processors 410, a memory 430, and a communication bus 440 that connects the various system components (including the memory 430 and the processors 410).
Communication bus 440 represents one or more of any of several types of bus structures, including a memory bus or memory controller, a peripheral bus, an accelerated graphics port, and a processor or local bus using any of a variety of bus architectures. These architectures include, but are not limited to, the Industry Standard Architecture (ISA) bus, Micro Channel Architecture (MCA) bus, enhanced ISA bus, Video Electronics Standards Association (VESA) local bus, and Peripheral Component Interconnect (PCI) bus.
Electronic devices typically include a variety of computer system readable media. Such media may be any available media that is accessible by the electronic device and includes both volatile and nonvolatile media, removable and non-removable media.
Memory 430 may include computer system readable media in the form of volatile Memory, such as Random Access Memory (RAM) and/or cache Memory. The electronic device may further include other removable/non-removable, volatile/nonvolatile computer system storage media. Although not shown in FIG. 6, a disk drive for reading from and writing to a removable, nonvolatile magnetic disk (e.g., a "floppy disk") and an optical disk drive for reading from or writing to a removable, nonvolatile optical disk (e.g., a Compact disk Read Only Memory (CD-ROM), a Digital versatile disk Read Only Memory (DVD-ROM), or other optical media) may be provided. In these cases, each drive may be connected to the communication bus 440 by one or more data media interfaces. Memory 430 may include at least one program product having a set (e.g., at least one) of program modules that are configured to carry out the functions of embodiments of the application.
A program/utility having a set (at least one) of program modules, including but not limited to an operating system, one or more application programs, other program modules, and program data, may be stored in memory 430; each of these, or some combination of them, may include an implementation of a network environment. The program modules generally perform the functions and/or methodologies of the embodiments described herein.
The electronic device may also communicate with one or more external devices (e.g., keyboard, pointing device, display, etc.), one or more devices that enable a user to interact with the electronic device, and/or any devices (e.g., network card, modem, etc.) that enable the electronic device to communicate with one or more other computing devices. Such communication may occur via communication interface 420. Furthermore, the electronic device may also communicate with one or more networks (e.g., a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public Network such as the Internet) via a Network adapter (not shown in FIG. 6) that may communicate with other modules of the electronic device via the communication bus 440. It should be appreciated that although not shown in FIG. 6, other hardware and/or software modules may be used in conjunction with the electronic device, including but not limited to: microcode, device drivers, Redundant processing units, external disk drive Arrays, disk array (RAID) systems, tape Drives, and data backup storage systems, among others.
The processor 410 executes various functional applications and data processing by executing programs stored in the memory 430, for example, implementing the operating frequency determination method provided by the embodiment of the present application.
The embodiment of the present application further provides a computer-readable storage medium, where the computer-readable storage medium stores computer instructions, and the computer instructions enable the computer to execute the method for determining an operating frequency provided in the embodiment of the present application.
The computer-readable storage medium described above may take any combination of one or more computer-readable media. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a Read Only Memory (ROM), an Erasable Programmable Read Only Memory (EPROM), a flash Memory, an optical fiber, a portable compact disc Read Only Memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
In the description herein, reference to the description of the term "one embodiment," "some embodiments," "an example," "a specific example," or "some examples," etc., means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the application. In this specification, the schematic representations of the terms used above are not necessarily intended to refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. Furthermore, various embodiments or examples and features of different embodiments or examples described in this specification can be combined and combined by one skilled in the art without contradiction.
Any process or method descriptions in flow charts or otherwise described herein may be understood as representing modules, segments, or portions of code which include one or more executable instructions for implementing steps of a custom logic function or process, and alternate implementations are included within the scope of the preferred embodiment of the present application in which functions may be executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those reasonably skilled in the art of the present application.
The word "if" as used herein may be interpreted as "at … …" or "when … …" or "in response to a determination" or "in response to a detection", depending on the context. Similarly, the phrases "if determined" or "if detected (a stated condition or event)" may be interpreted as "when determined" or "in response to a determination" or "when detected (a stated condition or event)" or "in response to a detection (a stated condition or event)", depending on the context.
It should be noted that the terminal according to the embodiments of the present application may include, but is not limited to, a Personal Computer (Personal Computer; hereinafter, referred to as PC), a Personal Digital Assistant (Personal Digital Assistant; hereinafter, referred to as PDA), a wireless handheld device, a Tablet Computer (Tablet Computer), a mobile phone, an MP3 player, an MP4 player, and the like.
In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one logical division, and there may be other divisions in actual implementation, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, or in a form of hardware plus a software functional unit.
The above description is only exemplary of the present application and should not be taken as limiting the present application, as any modification, equivalent replacement, or improvement made within the spirit and principle of the present application should be included in the scope of protection of the present application.

Claims (10)

1. An operating frequency determining method applied to a video coding and decoding device includes:
responding to each task start instruction and/or task termination instruction sent by an upper-layer application, and updating each to-be-processed instance according to each task start instruction and/or task termination instruction;
when the number of the updated to-be-processed instances is determined not to be 0, receiving updated sub-packets of the to-be-processed instances sent by the upper-layer application; the sub-packet is one data packet among all the to-be-processed data packets corresponding to the to-be-processed instance;
determining the total amount of video data contained in each received sub-data packet, and determining the processing frequency corresponding to the total amount of video data;
and determining the working frequency for processing each sub-data packet from each available frequency according to the processing frequency.
2. The method according to claim 1, wherein updating each to-be-processed instance according to each task start instruction and/or task termination instruction comprises:
adding an instance corresponding to each task starting instruction as a to-be-processed instance according to each task starting instruction; and/or the presence of a gas in the gas,
and deleting the instance corresponding to each task termination instruction from the to-be-processed instance according to each task termination instruction.
3. The method of claim 1, wherein determining the total amount of video data contained in each of the received subpackets comprises:
determining the resolution and frame rate of the video data contained in each sub-data packet;
and determining the total amount of the video data contained in each sub-packet according to the resolution and the frame rate of the video data contained in each sub-packet.
4. The method of claim 3, wherein determining the resolution and frame rate of the video data contained in each sub-packet comprises:
if the to-be-processed instance corresponding to the sub-packet is a to-be-encoded instance, determining the resolution and the frame rate of the original video data according to the original video data contained in the sub-packet;
if the to-be-processed instance corresponding to the sub-packet is a to-be-decoded instance, determining the resolution of the original video data corresponding to the code stream data according to the header information of the code stream data contained in the sub-packet, and determining the frame rate of the original video data corresponding to the code stream data according to the decoding time stamp contained in the sub-packet.
5. The method of claim 1, wherein determining an operating frequency for processing each sub-packet from the available frequencies according to the processing frequency comprises:
determining whether each available frequency includes an available frequency greater than the processing frequency;
if the available frequencies comprise available frequencies larger than the processing frequency, determining the minimum available frequency in the available frequencies larger than the processing frequency as the working frequency for processing the sub-packets;
and if all the available frequencies are determined to be smaller than the processing frequency, determining the maximum available frequency among the available frequencies as the working frequency for processing the sub-packets.
6. The method of claim 1, wherein after determining the operating frequency for processing the sub-packets from the available frequencies, the method further comprises:
and encoding and decoding the received sub-packets of each to-be-processed instance in sequence according to the working frequency.
7. The method of claim 1, further comprising:
and when the number of the updated to-be-processed examples is determined to be 0, terminating the encoding and decoding and powering down.
8. An operating frequency determining apparatus, comprising:
the response module is used for responding to each task start instruction and/or task termination instruction sent by the upper-layer application and updating each to-be-processed instance according to each task start instruction and/or task termination instruction;
a receiving module, configured to receive updated sub-packets of each to-be-processed instance sent by the upper-layer application when it is determined that the number of each to-be-processed instance after updating is not 0; the sub-packet is one data packet among all the to-be-processed data packets corresponding to the to-be-processed instance;
a first determining module, configured to determine a total amount of video data included in each received sub-packet, and determine a processing frequency corresponding to the total amount of video data;
and the second determining module is used for determining the working frequency for processing each sub-data packet from each available frequency according to the processing frequency.
9. An electronic device, comprising:
at least one processor; and
at least one memory communicatively coupled to the processor, wherein:
the memory stores program instructions executable by the processor, the processor invoking the program instructions to perform the method of any of claims 1 to 7.
10. A computer-readable storage medium storing computer instructions for causing a computer to perform the method of any one of claims 1 to 7.
CN202110396719.1A 2021-04-13 2021-04-13 Working frequency determination method and device and electronic equipment Active CN113115039B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110396719.1A CN113115039B (en) 2021-04-13 2021-04-13 Working frequency determination method and device and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110396719.1A CN113115039B (en) 2021-04-13 2021-04-13 Working frequency determination method and device and electronic equipment

Publications (2)

Publication Number Publication Date
CN113115039A true CN113115039A (en) 2021-07-13
CN113115039B CN113115039B (en) 2022-12-02

Family

ID=76716704

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110396719.1A Active CN113115039B (en) 2021-04-13 2021-04-13 Working frequency determination method and device and electronic equipment

Country Status (1)

Country Link
CN (1) CN113115039B (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1774929A (en) * 2003-04-15 2006-05-17 有限会社金泽大学Tlo Moving picture encoding or decoding processing system and moving picture encoding or decoding processing method
CN103051899A (en) * 2012-12-31 2013-04-17 青岛中星微电子有限公司 Method and device for video decoding
EP3422211A1 (en) * 2016-02-23 2019-01-02 Hangzhou Hikvision Digital Technology Co., Ltd. Data processing method and device

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1774929A (en) * 2003-04-15 2006-05-17 有限会社金泽大学Tlo Moving picture encoding or decoding processing system and moving picture encoding or decoding processing method
CN103051899A (en) * 2012-12-31 2013-04-17 青岛中星微电子有限公司 Method and device for video decoding
EP3422211A1 (en) * 2016-02-23 2019-01-02 Hangzhou Hikvision Digital Technology Co., Ltd. Data processing method and device

Also Published As

Publication number Publication date
CN113115039B (en) 2022-12-02

Similar Documents

Publication Publication Date Title
US11705924B2 (en) Low-latency encoding using a bypass sub-stream and an entropy encoded sub-stream
US10425782B2 (en) Voice messaging method and mobile terminal supporting voice messaging in mobile messenger service
CN107454416B (en) Video stream sending method and device
US10390049B2 (en) Electronic devices for sending a message and buffering a bitstream
EP3637782A1 (en) Method and device for encoding and decoding image data
CN109257646A (en) Method for processing video frequency, device, electronic equipment and computer-readable medium
EP2061255A1 (en) Information processing device and method
EP3697088A1 (en) Video sending and receiving method, device, and terminal
WO2019119950A1 (en) Video coding processing method and apparatus, and application having video coding function
CN112235597A (en) Method and device for synchronous protection of streaming media live broadcast audio and video and computer equipment
CN110602122A (en) Video processing method and device, electronic equipment and storage medium
US11196868B2 (en) Audio data processing method, server, client and server, and storage medium
US11908481B2 (en) Method for encoding live-streaming data and encoding device
CN103929682B (en) Method and device for setting key frames in video live broadcast system
WO2021254375A1 (en) Video partitioning method, transfer method, server, adaptor and storage medium
CN115550709A (en) Data processing method and electronic equipment
CN113115039B (en) Working frequency determination method and device and electronic equipment
CN107493478A (en) Encode frame per second method to set up and equipment
CN115119042A (en) Transmission system and transmission method
CN114760309A (en) Business interaction method, device, equipment and medium of terminal based on cloud service
CN104219537A (en) Video data processing method, device and system
CN103313017B (en) Multichannel kinescope method and system
CN112188213B (en) Encoding method, apparatus, computer device, and storage medium
CN101998376B (en) Information processing apparatus, information processing method, program and communication terminal
CN112751819B (en) Processing method and device for online conference, electronic equipment and computer readable medium

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant