CN118101962A - Encoding and decoding method, electronic device, computer-readable storage medium, and program product - Google Patents


Info

Publication number: CN118101962A
Authority: CN (China)
Prior art keywords: decoding, real, resource, video, time
Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis)
Application number: CN202410476337.3A
Other languages: Chinese (zh)
Inventor: 王海军 (Wang Haijun)
Current Assignee: Honor Device Co Ltd (listed assignee may be inaccurate)
Original Assignee: Honor Device Co Ltd
Application filed by Honor Device Co Ltd
Priority: CN202410476337.3A
Publication: CN118101962A

Classifications

  • Compression Or Coding Systems Of Tv Signals (AREA)

Abstract

The present application relates to the technical field of electronic devices, and provides an encoding and decoding method, an electronic device, a computer-readable storage medium, and a program product, applied to the electronic device. The method comprises the following steps: in a multi-channel encoding/decoding scenario in which occupied codec resources are encoding/decoding a first video and/or a second video, when the electronic device responds to an encoding/decoding request for a third video, if the remaining real-time or non-real-time codec resources are less than the resources required to encode/decode the third video, the electronic device can reclaim occupied codec resources in order to encode/decode the third video. In addition, when real-time codec resources are insufficient but non-real-time codec resources are sufficient, the method can switch from the first encoding/decoding mode to the second encoding/decoding mode, thereby improving the success rate of multi-channel encoding and decoding.

Description

Encoding and decoding method, electronic device, computer-readable storage medium, and program product
Technical Field
Embodiments of the present application relate to the field of electronic devices, and in particular to an encoding and decoding method, an electronic device, a computer-readable storage medium, and a program product.
Background
With the development of electronic device technology and the improvement of device performance, electronic devices can support more and more video functions. For example, in addition to video playback, an electronic device may also implement video functions such as video recording, video editing, and video casting. As a result, multi-channel encoding/decoding scenarios are common in the actual use of electronic devices.
A multi-channel codec scenario typically requires a relatively large amount of codec resources, because multiple videos need to be encoded/decoded simultaneously. However, the codec resources provided by current electronic devices are limited. This reduces the success rate of video encoding/decoding in multi-channel codec scenarios, makes the user interface prone to stuttering or freezing, and degrades the user experience.
Disclosure of Invention
Embodiments of the present application provide an encoding and decoding method, an electronic device, a computer-readable storage medium, and a program product, which can improve the success rate of video encoding/decoding in multi-channel codec scenarios, reduce stuttering and freezing of the user interface in such scenarios, and ensure a good user experience.
In order to achieve the above purpose, the embodiments of the present application adopt the following technical solutions:
In a first aspect, an encoding and decoding method is provided, applied to an electronic device. The electronic device supports a first encoding/decoding mode (real-time encoding/decoding) and a second encoding/decoding mode (non-real-time encoding/decoding). The first mode performs encoding/decoding using real-time codec resources; the second mode performs encoding/decoding using non-real-time codec resources. The method comprises the following steps:
The electronic device encodes/decodes one or more first videos using first real-time codec resources (occupied real-time resources) and/or encodes/decodes one or more second videos using first non-real-time codec resources (occupied non-real-time resources). If the electronic device then receives a request to encode/decode a third video in the first encoding/decoding mode, it responds by determining whether the second real-time codec resources, i.e. the currently unoccupied real-time codec resources, are less than the first required resources, where the first required resources are the real-time codec resources needed to encode/decode the third video in the first mode. If the second real-time codec resources are less than the first required resources, the electronic device reclaims part of the occupied first real-time codec resources to obtain third real-time codec resources, which comprise the second real-time codec resources plus the reclaimed first real-time codec resources. Further, if the third real-time codec resources are still less than the first required resources, the electronic device determines whether the second non-real-time codec resources, i.e. the currently unoccupied non-real-time codec resources, are equal to or greater than the second required resources, where the second required resources are the non-real-time codec resources needed to encode/decode the third video in the second mode. When the second non-real-time codec resources are equal to or greater than the second required resources, the electronic device uses them to encode/decode the third video.
Therefore, in this method, in a multi-channel scenario where the first video and/or the second video are being encoded/decoded, if there are currently not enough free real-time codec resources, the electronic device can reclaim occupied first real-time codec resources to obtain the third real-time codec resources, freeing up more real-time codec resources to support real-time encoding/decoding of the third video and thereby improving the success rate of video encoding/decoding in multi-channel scenarios.
Further, if the third real-time codec resources obtained after reclamation are still insufficient to support real-time encoding/decoding of the third video, the electronic device can instead perform non-real-time encoding/decoding of the third video, provided sufficient second non-real-time codec resources exist. That is, when real-time codec resources are insufficient but non-real-time codec resources are sufficient, the electronic device can further increase the encoding/decoding success rate by converting the request from real-time to non-real-time encoding/decoding.
Improving the encoding/decoding success rate in multi-channel scenarios also means that, when the third video needs to be displayed and played, user-interface stuttering or freezing caused by encoding/decoding failure can be avoided, ensuring a good user experience.
In a possible implementation manner of the first aspect, the codec method may further include:
if the third real-time codec resources are less than the first required resources and the second non-real-time codec resources are less than the second required resources, the electronic device reclaims the first non-real-time codec resources to obtain third non-real-time codec resources, which comprise the second non-real-time codec resources plus the reclaimed first non-real-time codec resources. If the third non-real-time codec resources are equal to or greater than the second required resources, the electronic device uses them to encode/decode the third video; if they are less than the second required resources, the electronic device does not encode/decode the third video.
In this implementation, when the third real-time codec resources obtained by reclamation are still less than the first required resources, so that the electronic device decides to switch from real-time to non-real-time encoding/decoding, and the remaining second non-real-time codec resources are also less than the second required resources, the electronic device additionally reclaims the occupied first non-real-time codec resources to obtain the third non-real-time codec resources. It then decides, based on the third non-real-time codec resources obtained after reclamation, whether to perform non-real-time encoding/decoding of the third video.
Therefore, when non-real-time codec resources are insufficient, reclamation frees up more non-real-time codec resources to support non-real-time encoding/decoding of the third video, which improves the encoding/decoding success rate in multi-channel scenarios, reduces user-interface stuttering and freezing, and ensures a good user experience.
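The admission flow of the first aspect, including the reclamation fallbacks just described, can be summarized in a short sketch. This is a minimal illustration under our own naming, not the claimed implementation: resources are modeled as a single abstract quantity per pool, and every identifier is hypothetical.

```java
// Hypothetical sketch of the first-aspect admission flow. All names are
// illustrative; the application does not define this API.
final class CodecAdmission {
    long freeRealtime;     // second real-time codec resources (unoccupied)
    long freeNonRealtime;  // second non-real-time codec resources (unoccupied)

    // Returns true if the third video can be encoded/decoded.
    boolean admitRealtimeRequest(long realtimeDemand, long nonRealtimeDemand) {
        if (freeRealtime >= realtimeDemand) {
            return true;                        // free real-time resources suffice
        }
        freeRealtime += reclaimRealtime();      // -> third real-time resources
        if (freeRealtime >= realtimeDemand) {
            return true;                        // reclamation freed enough
        }
        // Convert the request from real-time to non-real-time encoding/decoding.
        if (freeNonRealtime >= nonRealtimeDemand) {
            return true;
        }
        freeNonRealtime += reclaimNonRealtime();  // -> third non-real-time resources
        return freeNonRealtime >= nonRealtimeDemand;  // false: codec fails
    }

    long reclaimRealtime()    { return 0; }  // stub: reclaim from lower-priority apps
    long reclaimNonRealtime() { return 0; }  // stub: reclaim from lower-priority apps
}
```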
In a possible implementation manner of the first aspect, the codec method may further include:
While the third video is being encoded/decoded in the second encoding/decoding mode, if fourth real-time codec resources are equal to or greater than the first required resources, the electronic device uses the fourth real-time codec resources to encode/decode the video segments of the third video that have not yet been encoded/decoded; the fourth real-time codec resources are real-time codec resources that become unoccupied through resource release while the third video is being encoded/decoded in the second mode.
Occupied codec resources may be released at any time, as codec services complete normally or are stopped by the user, and once released they can be re-occupied. Therefore, in this implementation, to continue meeting the original requirement that the third video be encoded/decoded in real time, when the third video has been converted from real-time to non-real-time encoding/decoding, the electronic device tracks the fourth real-time codec resources released during the non-real-time encoding/decoding. If the fourth real-time codec resources are equal to or greater than the corresponding first required resources, the electronic device can switch back to real-time encoding/decoding for the segments of the third video that have not yet been encoded/decoded.
In a possible implementation of the first aspect, in order to learn promptly and accurately when real-time codec resources are released, the electronic device may monitor and acquire the released fourth real-time codec resources via a registered resource-monitoring callback function. On this basis, while encoding/decoding the third video in the second mode, using the fourth real-time codec resources to encode/decode the not-yet-processed segments when they are equal to or greater than the first required resources may include:
registering a resource-monitoring callback function; while the third video is being encoded/decoded in the second mode, using the callback function to monitor the release of real-time codec resources and obtain the fourth real-time codec resources; and, if the fourth real-time codec resources are equal to or greater than the first required resources, using them to encode/decode the video segments of the third video that have not yet been encoded/decoded.
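A minimal sketch of this callback mechanism follows. The listener interface and registration method are assumptions of ours; the application's media codec service and video HAL interfaces are internal, not public APIs.

```java
// Hypothetical sketch of the registered resource-monitoring callback.
// The listener interface and registration method are illustrative only.
final class SwitchBackMonitor {
    interface ResourceReleaseListener {
        void onRealtimeResourceReleased(long freedRealtime); // fourth real-time resources
    }

    private ResourceReleaseListener listener;

    void register(ResourceReleaseListener l) { listener = l; }  // stub registration

    void watchForSwitchBack(long realtimeDemand) {
        register(freed -> {
            if (freed >= realtimeDemand) {
                // Enough real-time resources were released while the third video
                // was handled non-real-time: move the segments that have not yet
                // been encoded/decoded back to the real-time path.
                switchRemainingSegmentsToRealtime();
            }
        });
    }

    private void switchRemainingSegmentsToRealtime() { /* stub */ }
}
```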
In a possible implementation of the first aspect, checking the current upper limit of the electronic device's codec capability includes, besides checking codec resources, checking whether the resolution of a single video stream is supported by the device. When a video's resolution exceeds the supported maximum, the electronic device cannot encode/decode it. Therefore, to ensure that encoding/decoding can proceed smoothly, the method further includes: checking whether the resolution of the third video exceeds the maximum resolution the electronic device can support.
On this basis, responding to the request to encode/decode the third video in the first mode and, if the second real-time codec resources are less than the first required resources, reclaiming the first real-time codec resources to obtain the third real-time codec resources may include: when the resolution of the third video does not exceed the maximum resolution supported by the electronic device, responding to the request and, if the second real-time codec resources are less than the first required resources, reclaiming the first real-time codec resources to obtain the third real-time codec resources.
In this implementation, the resolution check is placed before the codec-resource check in the checking sequence. If the third video exceeds the maximum resolution the electronic device can support, the device can immediately determine that encoding/decoding fails, without checking codec resources at all, which reduces device power consumption.
Moreover, compared with a checking sequence that checks codec resources first and video resolution afterwards, this ordering avoids the situation where resource reclamation disturbs existing video services and the encoding/decoding then fails anyway because the resolution is unsupported. That is, checking resolution first ensures that reclamation does not affect the encoding/decoding of other videos, and thus the normal operation of other video services, in cases where the resolution is not supported.
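On Android, the maximum size a device's decoders support can be queried through the public MediaCodecList/MediaCodecInfo API, as the sketch below shows. Whether the application's internal check works this way is not stated, so this is illustrative only.

```java
import android.media.MediaCodecInfo;
import android.media.MediaCodecList;

// Sketch of the "check resolution first" step using Android's public API.
// It queries whether any decoder for the given MIME type supports the size,
// before any resource reclamation is attempted.
final class ResolutionCheck {
    static boolean resolutionSupported(String mime, int width, int height) {
        MediaCodecList list = new MediaCodecList(MediaCodecList.REGULAR_CODECS);
        for (MediaCodecInfo info : list.getCodecInfos()) {
            if (info.isEncoder()) continue;               // decoders only here
            for (String type : info.getSupportedTypes()) {
                if (!type.equalsIgnoreCase(mime)) continue;
                MediaCodecInfo.VideoCapabilities vc =
                        info.getCapabilitiesForType(type).getVideoCapabilities();
                if (vc != null && vc.isSizeSupported(width, height)) {
                    return true;                          // some decoder supports it
                }
            }
        }
        return false;  // exceeds every decoder's maximum: fail fast, skip resource check
    }
}
```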
In a possible implementation of the first aspect, since different applications have different priorities depending on their service types and requirements, the electronic device may, in order not to disturb normal operation across services, perform resource reclamation according to the process priorities of the applications.
Based on this, performing resource recovery on the first real-time codec resource to obtain a third real-time codec resource may include:
determining the process priority of the first application corresponding to the first video, and the process priority of the third application corresponding to the third video; determining a first target application according to the process priorities, the first target application being a first application whose process priority is lower than that of the third application; and reclaiming the first real-time codec resources occupied by the first target application to obtain the third real-time codec resources, which comprise the first real-time codec resources occupied by the first target application plus the second real-time codec resources. In this implementation, the electronic device reclaims only the real-time codec resources occupied by first applications whose process priority is lower than that of the third application, ensuring that first applications with higher process priority continue to encode/decode normally, unaffected by the reclamation, and that their codec services keep operating normally.
In another possible implementation of the first aspect, the electronic device may likewise reclaim the first non-real-time codec resources by process priority. That is, reclaiming the first non-real-time codec resources to obtain the third non-real-time codec resources may include: determining the process priority of the second application corresponding to the second video, and the process priority of the third application corresponding to the third video; determining a second target application according to the process priorities, the second target application being a second application whose process priority is lower than that of the third application; and reclaiming the first non-real-time codec resources occupied by the second target application to obtain the third non-real-time codec resources, which comprise the first non-real-time codec resources occupied by the second target application plus the second non-real-time codec resources.
In this way, the electronic device reclaims only the non-real-time codec resources occupied by second applications whose process priority is lower than that of the third application, ensuring that second applications with higher process priority continue to encode/decode normally, unaffected by the reclamation, and that their codec services keep operating normally.
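A sketch of this priority-based victim selection follows. The Session type and the numeric priority scale are our assumptions; in the application, process priorities come from the activity manager.

```java
import java.util.List;

// Hypothetical sketch of priority-based resource reclamation: resources are
// reclaimed only from applications whose process priority is lower than the
// requester's. Session and the priority scale are illustrative.
final class PriorityReclaimer {
    static final class Session {
        final String app;
        final int processPriority;   // assumed: larger value = higher priority
        final long occupiedResources;
        Session(String app, int processPriority, long occupiedResources) {
            this.app = app;
            this.processPriority = processPriority;
            this.occupiedResources = occupiedResources;
        }
    }

    // Reclaim only from target applications with lower priority than the
    // third application requesting the codec.
    static long reclaim(List<Session> occupied, int requesterPriority) {
        long freed = 0;
        for (Session s : occupied) {
            if (s.processPriority < requesterPriority) {
                freed += s.occupiedResources;  // release this target's codec
            }
        }
        return freed;  // added to the unoccupied pool -> third codec resources
    }
}
```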
In a possible implementation of the first aspect, when an application specifies a particular codec mode in the codec-format information, this indicates that the application side has specific requirements for the encoding/decoding of that video, and those service requirements should not be disturbed. In addition, when real-time and non-real-time encoding/decoding coexist, the electronic device usually treats real-time encoding/decoding preferentially. Therefore, if the third application specifies secure encoding/decoding or low-delay encoding/decoding of the third video in the codec-format information, the electronic device does not convert the request from real-time to non-real-time encoding/decoding.
Meanwhile, because video encoding requires more computing-unit resources than video decoding and places higher demands on device performance, and given that the device treats real-time encoding/decoding preferentially, the electronic device may also choose not to convert from real-time to non-real-time encoding/decoding when the requested operation is encoding.
Based on this, the codec method may further include:
If the third real-time codec resources are less than the first required resources, and the codec format specifies secure or low-delay encoding/decoding of the third video, or the current operation is encoding, the third video is not encoded/decoded; the codec format is specified by the third application corresponding to the third video.
On this basis, if the third real-time codec resources are less than the first required resources and the second non-real-time codec resources are equal to or greater than the second required resources, occupying the second non-real-time codec resources to encode/decode the third video may include:
if the third real-time codec resources are less than the first required resources, it is determined from the codec format that neither secure nor low-delay encoding/decoding of the third video is specified, the current operation is decoding, and the second non-real-time codec resources are equal to or greater than the second required resources, occupying the second non-real-time codec resources to decode the third video.
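A sketch of this gate follows, phrased against Android's MediaFormat for concreteness. KEY_LOW_LATENCY is a real MediaFormat key (API 30); the isSecure and isEncode parameters stand in for however the third application marks its request, which the application does not spell out.

```java
import android.media.MediaFormat;

// Sketch of the "do not downgrade" gate: secure sessions, low-delay sessions,
// and encode sessions stay real-time; if real-time resources are insufficient
// they simply fail rather than convert to non-real-time.
final class DowngradeGate {
    static boolean mayConvertToNonRealtime(MediaFormat format,
                                           boolean isSecure, boolean isEncode) {
        boolean lowLatency = format.containsKey(MediaFormat.KEY_LOW_LATENCY)
                && format.getInteger(MediaFormat.KEY_LOW_LATENCY) == 1;
        return !isSecure && !lowLatency && !isEncode;
    }
}
```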
In a possible implementation of the first aspect, if the electronic device determines that the third video is to be encoded/decoded non-real-time, and during the codec-resource check it finds that the currently remaining non-real-time codec resources are insufficient for non-real-time encoding/decoding of the third video, it may likewise use resource reclamation to increase the encoding/decoding success rate.
Based on this, the codec method may further include:
in response to a request to encode/decode the third video in the second encoding/decoding mode: if the second non-real-time codec resources are equal to or greater than the second required resources, encoding/decoding the third video using the second non-real-time codec resources; if the second non-real-time codec resources are less than the second required resources, reclaiming the first non-real-time codec resources to obtain third non-real-time codec resources, which comprise the second non-real-time codec resources plus the reclaimed first non-real-time codec resources; if the third non-real-time codec resources are equal to or greater than the second required resources, encoding/decoding the third video using them; and if the third non-real-time codec resources are less than the second required resources, not encoding/decoding the third video.
In a possible implementation of the first aspect, whether the electronic device encodes/decodes a video in real time or non-real time may be determined by the requirements of the application, for example according to an identifier sent by the application. When the application does not indicate real-time or non-real-time encoding/decoding via such an identifier, the electronic device may default to real-time encoding/decoding, since it treats real-time encoding/decoding preferentially.
Therefore, the method may further include: receiving an encoding/decoding request for the third video; if the request carries a first identifier, determining from the first identifier whether to encode/decode the third video in the first or the second encoding/decoding mode; and if the request does not carry the first identifier, determining to encode/decode the third video in the first encoding/decoding mode.
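Android's closest public analog to this identifier is MediaFormat.KEY_PRIORITY, where 0 requests realtime and 1 requests non-realtime (best-effort) operation; mapping it onto the patent's first identifier is our assumption, not the claim. A sketch:

```java
import android.media.MediaFormat;

// Sketch of interpreting the request identifier via MediaFormat.KEY_PRIORITY.
// Treating a missing key as real-time mirrors the default described above.
final class ModeSelector {
    static boolean wantsRealtime(MediaFormat requestFormat) {
        if (!requestFormat.containsKey(MediaFormat.KEY_PRIORITY)) {
            return true;  // no identifier carried: default to real-time
        }
        return requestFormat.getInteger(MediaFormat.KEY_PRIORITY) == 0;
    }
}
```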
In a possible implementation manner of the first aspect, the codec method may further include:
if the second real-time codec resources or the third real-time codec resources are equal to or greater than the first required resources, encoding/decoding the third video using the second or the third real-time codec resources.
That is, when the second real-time codec resources are sufficient to support real-time encoding/decoding of the third video, the electronic device may use them directly, without any additional reclamation. Similarly, when the third real-time codec resources obtained after reclamation are sufficient to support real-time encoding/decoding of the third video, the electronic device need not convert to non-real-time encoding/decoding and directly encodes/decodes the third video using the third real-time codec resources.
In a possible implementation of the first aspect, if the encoded/decoded third video is a video to be displayed, the electronic device may display it on an interface after encoding/decoding completes. That is, the method may further include: displaying a first interface, the first interface including the encoded/decoded third video; or the first interface including a recording floating window, the recording floating window corresponding to the encoding/decoding of the third video.
In a possible implementation of the first aspect, if the encoded/decoded first video and/or second video are also videos to be displayed, the first interface may further include the encoded/decoded first video and/or the encoded/decoded second video.
In a possible implementation of the first aspect, the electronic device includes a media codec service, a codec resource query module, a video driver, and a resource reclamation service; responding to the request to encode/decode the third video in the first mode and, if the second real-time codec resources are less than the first required resources, reclaiming the first real-time codec resources to obtain the third real-time codec resources may include:
the media codec service, in response to the request to encode/decode the third video in the first mode, asks the codec resource query module to allocate real-time codec resources; the codec resource query module calls the video driver to check whether the second real-time codec resources are less than the first required resources; if so, the codec resource query module returns a no-memory indication to the media codec service, and the media codec service asks the resource reclamation service to reclaim resources; the resource reclamation service then reclaims the first real-time codec resources to obtain the third real-time codec resources.
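The module interaction can be sketched as follows. Every interface here is illustrative: the media codec service, codec resource query module, video driver, and resource reclamation service are internal components, and the method names are invented for the sketch.

```java
// Hypothetical sketch of the module interaction: the media codec service asks
// the codec resource query module (backed by the video driver), and on a
// "no memory" answer falls back to the resource reclamation service.
final class MediaCodecServiceSketch {
    interface ResourceQuery { boolean enoughRealtime(long demand); }   // calls video driver
    interface ReclaimService { void reclaimRealtime(int requesterPriority); }

    private final ResourceQuery query;
    private final ReclaimService reclaimer;

    MediaCodecServiceSketch(ResourceQuery query, ReclaimService reclaimer) {
        this.query = query;
        this.reclaimer = reclaimer;
    }

    boolean allocateRealtime(long demand, int requesterPriority) {
        if (query.enoughRealtime(demand)) {
            return true;  // second real-time resources cover the first demand
        }
        // Query module reported "no memory": reclaim, then re-check against
        // the third (post-reclamation) real-time resources.
        reclaimer.reclaimRealtime(requesterPriority);
        return query.enoughRealtime(demand);
    }
}
```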
In a possible implementation of the first aspect, the electronic device further includes an activity manager; the resource reclamation service reclaiming the first real-time codec resources to obtain the third real-time codec resources may include:
the resource reclamation service asks the activity manager for the process priority of the first application corresponding to the first video and determines the process priority of the third application corresponding to the third video; the resource reclamation service identifies first applications whose process priority is lower than that of the third application to obtain the first target application, and reclaims the first real-time codec resources occupied by the first target application to obtain the third real-time codec resources.
In a possible implementation of the first aspect, the electronic device includes a media codec service, a video HAL, and a video driver; when resource-release monitoring is realized by registering a resource-monitoring callback function, the method may include: the media codec service registers a resource-monitoring callback function with the video HAL and instructs the video HAL to monitor and report the release of real-time codec resources; while the media codec service encodes/decodes the third video in the second mode, the video HAL obtains the release information fed back by the video driver and reports it to the media codec service, the release information including the fourth real-time codec resources; if the media codec service determines that the fourth real-time codec resources are equal to or greater than the first required resources, it uses them to encode/decode the video segments of the third video that have not yet been encoded/decoded.
In a second aspect, the present application provides an electronic device comprising a display screen, one or more processors, and a memory, the display screen and the memory each being coupled to the processor; the display screen is used to display video; the memory stores computer program code comprising computer instructions which, when executed by the processor, cause the electronic device to perform the following steps:
Encoding/decoding one or more first videos using the first real-time encoding/decoding resources and/or encoding/decoding one or more second videos using the first non-real-time encoding/decoding resources; wherein the first real-time codec resource is an occupied real-time codec resource and the first non-real-time codec resource is an occupied non-real-time codec resource;
Responding to a request for encoding/decoding the third video according to the first encoding/decoding mode, and if the second real-time encoding/decoding resources are less than the first required resources, carrying out resource recovery on the first real-time encoding/decoding resources to obtain third real-time encoding/decoding resources; wherein the first required resource is a real-time encoding/decoding resource required for encoding/decoding the third video according to the first encoding/decoding mode; the second real-time encoding and decoding resource is unoccupied resource in the real-time encoding and decoding resource; the third real-time coding and decoding resources comprise second real-time coding and decoding resources and recovered first real-time coding and decoding resources;
If the third real-time codec resources are still less than the first required resources and the second non-real-time codec resources are equal to or greater than the second required resources, encoding/decoding the third video using the second non-real-time codec resources; wherein the second required resources are the non-real-time codec resources required to encode/decode the third video in the second encoding/decoding mode, and the second non-real-time codec resources are the unoccupied portion of the non-real-time codec resources.
In a possible implementation manner of the second aspect, the computer instructions, when executed by the processor, cause the electronic device to further perform the steps of: if the third real-time coding and decoding resources are less than the first required resources and the second non-real-time coding and decoding resources are less than the second required resources, carrying out resource recovery on the first non-real-time coding and decoding resources to obtain third non-real-time coding and decoding resources; wherein the third non-real-time codec resource comprises a second non-real-time codec resource and a recovered first non-real-time codec resource;
if the third non-real-time coding and decoding resource is equal to or more than the second required resource, coding/decoding the third video by using the third non-real-time coding and decoding resource; if the third non-real-time encoding/decoding resource is less than the second required resource, the third video is not encoded/decoded.
In a possible implementation manner of the second aspect, the computer instructions, when executed by the processor, cause the electronic device to further perform the steps of: in the process of encoding/decoding the third video according to the second encoding/decoding mode, if the fourth real-time encoding/decoding resource is equal to or more than the first required resource, encoding/decoding the video segment which is not encoded/decoded yet in the third video by using the fourth real-time encoding/decoding resource; the fourth real-time encoding/decoding resource is an unoccupied real-time encoding/decoding resource after resource release in the process of encoding/decoding the third video according to the second encoding/decoding mode.
In a possible implementation manner of the second aspect, the computer instructions, when executed by the processor, cause the electronic device to further perform the steps of: registering a resource monitoring callback function; during the process of encoding/decoding the third video according to the second encoding/decoding mode, the resource monitoring callback function is utilized to monitor the release of the real-time encoding/decoding resource, and a fourth real-time encoding/decoding resource is obtained; and if the fourth real-time coding and decoding resource is equal to or more than the first required resource, coding/decoding the video fragments which are not coded/decoded in the third video by utilizing the fourth real-time coding and decoding resource.
In a possible implementation manner of the second aspect, the computer instructions, when executed by the processor, cause the electronic device to further perform the steps of: checking whether the resolution of the third video exceeds the maximum resolution that can be supported by the electronic device; when the resolution of the third video does not exceed the maximum resolution supported by the electronic equipment, responding to a request for encoding/decoding the third video according to the first encoding/decoding mode, and if the second real-time encoding/decoding resource is less than the first required resource, recovering the first real-time encoding/decoding resource to obtain the third real-time encoding/decoding resource.
In a possible implementation manner of the second aspect, the computer instructions, when executed by the processor, cause the electronic device to further perform the steps of: determining the process priority of the first video corresponding to the first application and determining the process priority of the third video corresponding to the third application; determining a first target application according to the process priority; the first target application is a first application with a lower process priority than the third application; recovering a first real-time encoding and decoding resource occupied by a first target application to obtain a third real-time encoding and decoding resource; the third real-time encoding and decoding resources comprise first real-time encoding and decoding resources and second real-time encoding and decoding resources occupied by the first target application.
In a possible implementation manner of the second aspect, the computer instructions, when executed by the processor, cause the electronic device to further perform the steps of: determining the process priority of the second video corresponding to the second application and determining the process priority of the third video corresponding to the third application; determining a second target application according to the process priority; the second target application is a second application with a lower process priority than the third application; recovering the first non-real-time encoding and decoding resources occupied by the second target application to obtain third non-real-time encoding and decoding resources; the third non-real-time encoding and decoding resources comprise a first non-real-time encoding and decoding resource and a second non-real-time encoding and decoding resource occupied by the second target application.
In a possible implementation manner of the second aspect, the computer instructions, when executed by the processor, cause the electronic device to further perform the steps of: if the third real-time codec resources are less than the first required resources, and the codec format specifies secure or low-delay encoding/decoding of the third video, or the current operation is encoding, not encoding/decoding the third video; the codec format being specified by the third application corresponding to the third video;
if the third real-time codec resources are less than the first required resources, it is determined from the codec format that neither secure nor low-delay encoding/decoding of the third video is specified, the current operation is decoding, and the second non-real-time codec resources are equal to or greater than the second required resources, occupying the second non-real-time codec resources to decode the third video.
In a possible implementation manner of the second aspect, the computer instructions, when executed by the processor, cause the electronic device to further perform the steps of: responding to a request for encoding/decoding the third video according to a second encoding/decoding mode, and if the second non-real-time encoding/decoding resource is equal to or more than a second required resource, encoding/decoding the third video by using the second non-real-time encoding/decoding resource; if the second non-real-time coding and decoding resources are less than the second required resources, carrying out resource recovery on the first non-real-time coding and decoding resources to obtain third non-real-time coding and decoding resources; wherein the third non-real-time codec resource comprises a second non-real-time codec resource and a recovered first non-real-time codec resource;
if the third non-real-time coding and decoding resource is equal to or more than the second required resource, coding/decoding the third video by using the third non-real-time coding and decoding resource; if the third non-real-time encoding/decoding resource is less than the second required resource, the third video is not encoded/decoded.
In a possible implementation manner of the second aspect, the computer instructions, when executed by the processor, cause the electronic device to further perform the steps of: receiving a coding/decoding request for coding/decoding the third video; if the encoding/decoding request carries the first identifier, determining whether to encode/decode the third video according to a first encoding/decoding mode or encode/decode the third video according to a second encoding/decoding mode according to the first identifier; if the encoding/decoding request does not carry the first identifier, determining to encode/decode the third video according to the first encoding/decoding mode.
In a possible implementation manner of the second aspect, the computer instructions, when executed by the processor, cause the electronic device to further perform the steps of: and if the second real-time coding and decoding resource or the third real-time coding and decoding resource is equal to or more than the first required resource, coding/decoding the third video by using the second real-time coding and decoding resource or the third real-time coding and decoding resource.
In a possible implementation manner of the second aspect, the computer instructions, when executed by the processor, cause the electronic device to further perform the steps of: displaying a first interface, wherein the first interface comprises a third video after encoding/decoding; or the first interface includes a recording floating window, and the recording floating window corresponds to the encoding/decoding of the third video.
In a possible implementation manner of the second aspect, the computer instructions, when executed by the processor, cause the electronic device to further perform the steps of: the first interface further includes a first video after encoding/decoding and/or a second video after encoding/decoding.
In a possible implementation manner of the second aspect, the electronic device includes a media codec service, a codec resource query module, a video driver, and a resource reclamation service; the computer instructions, when executed by the processor, cause the electronic device to further perform the steps of: the media codec service, in response to the request to encode/decode the third video in the first encoding/decoding mode, asks the codec resource query module to allocate real-time codec resources; the codec resource query module calls the video driver to check whether the second real-time codec resources are less than the first required resources; if so, the codec resource query module returns a no-memory indication to the media codec service, and the media codec service asks the resource reclamation service to reclaim resources; and the resource reclamation service reclaims the first real-time codec resources to obtain the third real-time codec resources.
In a possible implementation manner of the second aspect, the electronic device further includes an activity manager; the computer instructions, when executed by the processor, cause the electronic device to further perform the steps of: the resource reclamation service asks the activity manager for the process priority of the first application corresponding to the first video and determines the process priority of the third application corresponding to the third video; the resource reclamation service identifies first applications whose process priority is lower than that of the third application to obtain the first target application, and reclaims the first real-time codec resources occupied by the first target application to obtain the third real-time codec resources.
In one possible implementation of the second aspect, the electronic device comprises a media codec service, a video HAL and a video driver; the computer instructions, when executed by the processor, cause the electronic device to further perform the steps of:
The media codec service registers a resource-monitoring callback function with the video HAL and instructs the video HAL to monitor and report the release of real-time codec resources; while the media codec service encodes/decodes the third video in the second encoding/decoding mode, the video HAL obtains the release information fed back by the video driver and reports it to the media codec service, the release information including the fourth real-time codec resources; and if the media codec service determines that the fourth real-time codec resources are equal to or greater than the first required resources, it uses them to encode/decode the video segments of the third video that have not yet been encoded/decoded.
In a third aspect, the present application provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor of an electronic device, causes the electronic device to perform the encoding and decoding method according to the first aspect and any one of its possible implementations.
In a fourth aspect, the present application provides a computer program product which, when run on a computer, causes the computer to perform the method according to the first aspect and any one of its possible implementations. The computer may be the electronic device described above.
It will be appreciated that the advantages achieved by the electronic device according to any of the possible implementations of the second aspect, the computer readable storage medium according to the third aspect, and the computer program product according to the fourth aspect may refer to the advantages as in the first aspect and any of the possible implementations thereof, and are not described here again.
Drawings
Fig. 1 is a first interface schematic diagram of a multi-channel codec scenario provided in an embodiment of the present application;
Fig. 2 is a second interface schematic diagram of a multi-channel codec scenario provided in an embodiment of the present application;
Fig. 3 is a third interface schematic diagram of a multi-channel codec scenario provided in an embodiment of the present application;
Fig. 4 is a fourth interface schematic diagram of a multi-channel codec scenario provided in an embodiment of the present application;
Fig. 5 is a fifth interface schematic diagram of a multi-channel codec scenario provided in an embodiment of the present application;
Fig. 6 is a conceptual diagram of a macroblock according to an embodiment of the present application;
Fig. 7 is a schematic diagram of an interface exhibiting stuttering in a multi-channel codec scenario according to an embodiment of the present application;
Fig. 8 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
Fig. 9 is a software structure block diagram of an electronic device according to an embodiment of the present application;
Fig. 10 is a schematic flow chart of an encoding and decoding method according to an embodiment of the present application;
Fig. 11 is a flow chart of converting real-time encoding/decoding to non-real-time encoding/decoding according to an embodiment of the present application;
Fig. 12 is an interaction flow chart of an encoding and decoding method according to an embodiment of the present application;
Fig. 13 is an interaction flow chart of converting non-real-time encoding/decoding back to real-time encoding/decoding according to an embodiment of the present application;
Fig. 14 is a flow chart of a conventional codec capability check according to an embodiment of the present application;
Fig. 15 is an interface schematic diagram of an encoding/decoding failure for a video whose resolution exceeds the supported maximum, according to an embodiment of the present application;
Fig. 16 is a flow chart of a codec capability check according to an embodiment of the present application;
Fig. 17 is a block diagram of a chip system according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described clearly and completely below with reference to the accompanying drawings. The terminology used in describing the embodiments is intended only to describe those particular embodiments and is not intended to limit the application.
In addition, to facilitate a clear description of the technical solutions, the words "first", "second", and so on are used in the embodiments of the present application to distinguish between identical or similar items having substantially the same functions and effects. Those skilled in the art will appreciate that these words do not limit quantity or execution order, and do not imply that the items are necessarily different. Unless otherwise indicated, "plurality" means two or more.
With the development of electronic device technology and the improvement of device performance, electronic devices can now support not only video playback but also video functions such as video recording, video editing, and video casting. Whichever video function is implemented, storing and transmitting video cannot be avoided.
Compared with other types of data, the data volume of video is typically large, and so is the amount of memory it requires. Data volume generally affects the efficiency of storage and transmission. Therefore, to facilitate storing and transmitting video, electronic devices compress it using compression techniques; this compression process is video encoding.
Correspondingly, because video is compression-encoded before transmission, the video files an electronic device acquires are encoded video files. To play a video normally, the electronic device must decode the file; this is the video decoding process. Video encoding and video decoding are collectively referred to as video encoding and decoding.
Meanwhile, as device performance improves and devices support more and more video functions, multi-channel codec scenarios are common in the actual use of electronic devices. A multi-channel codec scenario can be understood simply as one in which an electronic device encodes and/or decodes multiple videos at the same time. Hereinafter, for convenience of description, embodiments of the present application refer to video encoding or video decoding as "encoding/decoding".
Taking a mobile phone as an example, fig. 1 to fig. 4 respectively show interface diagrams of a multi-path codec scenario.
Referring to fig. 1, the multi-path codec scene shown in fig. 1 is a video editing scene. In the video editing scene, the video displayed in the editing preview area 101 above the video editing interface 100 needs to occupy one path of encoding/decoding, and the video displayed in the video track 102 below the video editing interface 100 needs to occupy another path of encoding/decoding.
Referring to fig. 2, the multi-channel codec scenario shown in fig. 2 is video recording while video is playing.
As shown in fig. 2, if the user taps the screen-record control in the control center of the mobile phone while a video is playing, the phone responds to the tap by triggering the screen-recording application to start recording. Meanwhile, the phone can present the ongoing recording on the interface through the recording floating window 201. In this multi-channel codec scenario, the recording triggered by the user, i.e. the recorded video corresponding to the recording floating window 201, occupies one codec path, and the video displayed in the display area 202 occupies another.
Referring to fig. 3 and 4, the multi-path codec scenes shown in fig. 3 and 4 are scenes in which two videos are simultaneously displayed. Fig. 3 is a view of a mobile phone supporting a split screen function, where two videos are displayed simultaneously in a split screen scene. As shown in fig. 3, in a split screen scenario, the mobile phone may display two videos on the first split screen 301 and the second split screen 302 at the same time. Then, the video displayed in the first sub-screen 301 occupies one path of encoding/decoding, and the video displayed in the second sub-screen 302 occupies the other path of encoding/decoding.
Fig. 4 shows a video displayed on the mobile phone in full screen, and simultaneously shows another video in the floating window 401. Thus, the video displayed in full screen occupies one path of encoding/decoding, and the video displayed in the floating window 401 occupies the other path of encoding/decoding.
It can be appreciated that when multiple videos play simultaneously on an electronic device, audio channel conflicts arise. Depending on the design, some electronic devices mix the audio output of all playing videos, so that the audio of every video is heard at once. Other devices, to avoid the muddled sound caused by mixed output, pause one or more of the videos when multiple videos are to be played.
That is, in some embodiments, when the electronic device performs multi-channel encoding/decoding on multiple videos to be displayed simultaneously, as in the scenes of fig. 3 and fig. 4, one of the videos is controlled to be in a paused state. Because of the channel conflict, the electronic device allows only one video to be in the playing state at any moment and outputs the audio of that video.
For example, in fig. 3, if the video displayed in the first split screen 301 plays normally, the video displayed in the second split screen 302 is paused; or, if the video in the second split screen 302 plays normally, the video in the first split screen 301 is paused. As another example, if the video displayed full-screen in fig. 4 plays normally, the video displayed in the floating window 401 is paused; or, if the video in the floating window 401 plays normally, the full-screen video is paused.
In addition, it will be appreciated that in some embodiments the multiple videos encoded/decoded in a multi-channel scenario may be requested by the same application, as in the video editing scene of fig. 1, or separately by different applications, as in the scenes of fig. 2 to fig. 4. For example, the scenario of fig. 2 involves two codec paths initiated by the video recording application and the video playback application respectively.
It should be noted that the scenarios of fig. 1 to fig. 4 are mostly two-path codec scenarios; they are merely examples of the embodiments of the present application and do not limit what counts as a multi-channel codec scenario.
For example, some video applications do not release their occupied codec resources after being switched from the foreground to the background, and therefore still occupy one codec path. If multiple background videos occupy codec resources, or the foreground is running video functions such as playback or recording, the electronic device is still in a multi-channel codec scenario. As another example, as described above, a video may be paused because of channel conflicts, yet still occupy one codec path in the background.
As another example, for some electronic devices, as shown in fig. 5, besides the video displayed in the display area 502 of the interface 501, the interface 501 may simultaneously include three floating windows 503, 504, and 505. One video is displayed in each of the floating windows 503 and 504, while the floating window 505 is a recording floating window, which likewise occupies one video codec path. In this case there are four video codec paths that the user can perceive directly. In some embodiments, the recording floating window may be hidden from view.
Currently, the mainstream video coding standards mainly include Advanced Video Coding (AVC) and High Efficiency Video Coding (HEVC).
AVC is also known as H.264, and HEVC as H.265. These mainstream video codec standards commonly use macroblock coding. Macroblock coding can be understood simply as dividing a video image into individual blocks, the macroblocks, which are encoded separately. That is, in a macroblock-based coding scheme, the minimum unit of encoding is a divided macroblock.
It should be noted that the size of a macroblock depends on the specific video codec standard, and the embodiments of the present application place no limitation on it. For example, the video codec standard H.264 uses 16×16 macroblocks as the minimum coding unit; a macroblock in H.264 is therefore 16×16, i.e. each macroblock corresponds to a 16×16-pixel image area.
Referring to fig. 6, a conceptual diagram of a macroblock is shown in an embodiment of the present application.
As shown in fig. 6, a video typically includes a plurality of video sequences (sequence), each consisting of multiple frames of video images. Each frame of video image in a video sequence may be encoded into one or more slices (slice), and each slice may be divided into a plurality of blocks, i.e. macroblocks. As shown in fig. 6, one slice includes seven macroblocks (macro block). In some embodiments, a macroblock may in turn be composed of multiple sub-blocks (sub-block) (sub-blocks are not shown in fig. 6). In general, a number of sub-blocks form a macroblock, a number of macroblocks form a slice, a number of slices form a frame of video image, a number of frames of video images form a video sequence, and a number of video sequences form a complete video.
However, different chips differ in video processing capability; for example, the video processing units (video processing unit, VPU) mounted on different systems on chip (system on chip, SOC) differ in processing capability. Therefore, electronic devices equipped with different chips may also perform differently in multi-path codec scenarios.
To guarantee the performance of electronic devices in multi-path codec scenarios, some existing chips introduce the concept of a macroblock upper limit at the kernel layer. The macroblock upper limit refers to the upper limit on the number of macroblocks that can be processed.
That is, the chip manufacturer defines in advance an upper limit on the number of macroblocks the chip can process simultaneously; once this limit is exceeded, no new codec instance can be created. It will be appreciated that the macroblock upper limit is equivalent to the upper limit of the codec capability of the electronic device: once it is exceeded, the electronic device cannot create a new codec to encode/decode video.
Thus, under the constraint of the macroblock upper limit, when an upper-layer application in the electronic device asks the bottom layer to create a new codec instance for encoding/decoding a video while the electronic device has already reached the upper limit of its codec capability, the kernel layer returns a no-memory (no_memory) error to the upper-layer application, indicating that there are currently not enough codec resources to support creating a new codec for encoding/decoding the video.
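For illustration only, the following is a minimal sketch of the admission check such a kernel layer might perform. All names and the error value are hypothetical stand-ins, since the real interface is chip-specific.

```java
// Hypothetical sketch of a kernel-layer admission check against the macroblock
// upper limit; the class, method names, and error value are illustrative only.
final class MacroblockBudget {
    static final int NO_MEMORY = -1;   // stand-in for the "no_memory" error returned upward
    private final int maxMbpf;         // chip-defined macroblock upper limit (per frame)
    private int usedMbpf;              // macroblocks per frame already committed

    MacroblockBudget(int maxMbpf) { this.maxMbpf = maxMbpf; }

    /** Returns 0 on success, NO_MEMORY if creating the codec would exceed the limit. */
    int tryCreateCodec(int requestedMbpf) {
        if (usedMbpf + requestedMbpf > maxMbpf) {
            return NO_MEMORY;          // not enough codec resources to create a new codec
        }
        usedMbpf += requestedMbpf;
        return 0;
    }
}
```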
A multi-path codec scenario, by definition, requires simultaneous encoding and/or decoding of multiple videos and therefore demands more codec resources; yet under the macroblock upper-limit constraint, the codec resources an electronic device can provide are limited. As a result, in existing multi-path codec scenarios, the electronic device is prone to failures in creating multi-path codecs and hence to failures in encoding/decoding video. In such a case, if the video needs to be displayed and played in the foreground, the encoding/decoding failure may cause the user interface to stall or freeze, degrading the user experience. This problem is more pronounced and more frequent on electronic devices equipped with low-end and mid-range chips.
Taking the video editing multi-path codec scenario shown in fig. 1 as an example, fig. 7 shows a schematic diagram of a user interface exhibiting a freeze in a multi-path codec scenario. As shown in fig. 7, because multi-path codec creation failed, no video content is displayed in the edit preview area 101 at the top of the video editing interface 100, which remains a black screen; that is, a freeze occurs.
Based on the above, an embodiment of the present application provides a codec method applied to an electronic device. The codec method provided by the embodiment of the present application can improve the success rate of video encoding/decoding in multi-path codec scenarios, thereby reducing stalls, freezes, and similar phenomena of the user interface in such scenarios and safeguarding the user experience.
In the codec method of the embodiment of the present application, the macroblock upper-limit constraint is used together with a resource reclamation mechanism. That is, when the electronic device has insufficient codec resources in a multi-path codec scenario (i.e., the macroblock upper limit has been reached) and cannot create a codec instance for encoding/decoding a video, the electronic device can further reclaim codec resources, i.e., reclaim codec resources that are currently occupied.
In this way, the electronic device can free up part of the codec resources through reclamation to create a new codec for the video that currently needs to be encoded/decoded, thereby improving the success rate of video encoding/decoding in multi-path codec scenarios, avoiding stalls and freezes of the user interface, and safeguarding the user experience.
Meanwhile, because different codec scenarios place different latency requirements on encoding/decoding, the encoding/decoding of video is divided into two codec modes: a first encoding/decoding mode and a second encoding/decoding mode, where the encoding/decoding latency of the first mode is lower than that of the second mode.
That is, the first encoding/decoding mode is mainly applied to latency-sensitive codec scenarios such as live video streaming and video recording. The second encoding/decoding mode is mainly applied to codec scenarios with relaxed latency requirements that need not pay much attention to the user interface, such as background video editing, transcoding, and single-frame capture. Meanwhile, where the first and second encoding/decoding modes coexist, the electronic device generally satisfies the first encoding/decoding mode first.
With the codec modes divided into the first and second encoding/decoding modes, the upper limit of the electronic device's codec capability is likewise measured separately for each mode. That is, the codec resources used to measure the upper limit of the codec capability are divided into first codec resources and second codec resources: the first encoding/decoding mode encodes/decodes video using the first codec resources, and the second mode using the second codec resources.
For ease of description, the first encoding/decoding mode is hereinafter referred to as real-time encoding/decoding (real-time codec), and the corresponding first codec resources as real-time codec resources; the second encoding/decoding mode is referred to as non-real-time encoding/decoding (non-real-time codec), and the corresponding second codec resources as non-real-time codec resources.
That is, in the embodiment of the present application, real-time encoding/decoding refers to the first encoding/decoding mode and is mainly used for latency-sensitive codec scenarios such as live video streaming and video recording. The real-time codec resources are used for real-time encoding/decoding of video; it can be understood that the upper limit of the electronic device's real-time codec capability is measured mainly by the real-time codec resources.
Non-real-time encoding/decoding refers to the second encoding/decoding mode and is mainly used for codec scenarios that need not pay much attention to the user interface, such as background video editing, transcoding, and single-frame capture. The non-real-time codec resources are used for non-real-time encoding/decoding of video; likewise, the upper limit of the device's non-real-time codec capability is measured mainly by the non-real-time codec resources. Because the latency of real-time encoding/decoding is lower than that of non-real-time encoding/decoding, the electronic device usually satisfies real-time encoding/decoding first when the two coexist.
In one embodiment, the codec resources of the electronic device (i.e., the upper limit of its codec capability) are currently measured by the number of macroblocks occupied per second (macro block per second, MBPS), the number of macroblocks occupied per frame (macro block per frame, MBPF), and the number of macroblocks occupied per frame in real time (real time-macro block per frame, RT-MBPF). Here, MBPS and RT-MBPF correspond to real-time codec resources, while MBPF covers both real-time and non-real-time codec resources, so MBPF ≥ RT-MBPF. It can be appreciated that MBPF measures real-time and non-real-time codec resources together, and the portion of MBPF usable for real-time encoding/decoding is called RT-MBPF.
That is, the real-time codec resources that the video needs to occupy can be determined by calculating the MBPS and RT-MBPF corresponding to the video. And the non-real-time coding and decoding resources required to be occupied by the video can be determined by calculating the MBPF corresponding to the video.
The upper limits of the codec resources that the electronic device can provide, i.e., its codec capability, are MAX_MBPS, MAX_RT-MBPF, and MAX_MBPF. It will be appreciated that the specific values of MAX_MBPS, MAX_RT-MBPF, and MAX_MBPF depend on the capability of the chip mounted in the electronic device, i.e., they are fixed values given by the chip, and the embodiments of the present application place no limitation on them.
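Purely to make the bookkeeping concrete, the three limits and the amounts currently occupied can be pictured as a small state holder; the field and method names below are hypothetical, and the actual maxima are the fixed values supplied by the chip.

```java
// Hypothetical bookkeeping for the chip-defined codec resource limits.
final class CodecResourceState {
    final long maxMbps;    // MAX_MBPS: macroblocks per second (real-time)
    final long maxRtMbpf;  // MAX_RT-MBPF: per-frame macroblocks usable for real-time codecs
    final long maxMbpf;    // MAX_MBPF: per-frame macroblocks overall (>= maxRtMbpf)

    long usedMbps, usedRtMbpf, usedMbpf;  // amounts currently occupied

    CodecResourceState(long maxMbps, long maxRtMbpf, long maxMbpf) {
        this.maxMbps = maxMbps;
        this.maxRtMbpf = maxRtMbpf;
        this.maxMbpf = maxMbpf;
    }

    long remainingMbps()   { return maxMbps - usedMbps; }
    long remainingRtMbpf() { return maxRtMbpf - usedRtMbpf; }
    long remainingMbpf()   { return maxMbpf - usedMbpf; }
}
```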
Therefore, when the electronic device determines in a multi-path codec scenario that the codec resources are insufficient (i.e., the macroblock upper limit has been reached) and a codec cannot be created for the video, the electronic device may further reclaim codec resources as needed, according to the specific codec mode the video requires (real-time or non-real-time) and the corresponding resources (the real-time codec resources MBPS and RT-MBPF, or the non-real-time codec resources MBPF), so as to free up enough codec resources to create a codec for the video. This improves the success rate of video encoding/decoding in multi-path codec scenarios and avoids user-interface stalls and freezes caused by encoding/decoding failures when a video needs to be displayed and played, thereby safeguarding the user experience.
In addition, in the conventional encoding/decoding approach, when an upper-layer application in the electronic device requests the bottom layer to encode/decode a video, it also sends an identifier to the bottom layer to indicate whether the bottom layer should encode/decode the video in real time.
For example, taking the identifier priority as an example, priority=0 may denote real-time encoding/decoding and priority=1 non-real-time encoding/decoding. The electronic device can thus use real-time or non-real-time codec resources for the video specifically according to the application's requirements.
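On Android, such an identifier is typically carried by the MediaFormat priority key, where 0 denotes real-time and 1 denotes non-real-time (best-effort) operation. A minimal sketch of an application tagging a decode request as non-real-time (the MIME type and resolution are placeholders):

```java
import android.media.MediaCodec;
import android.media.MediaFormat;
import java.io.IOException;

// Sketch: tag a decode request as non-real-time via the priority key
// (0 = real-time, 1 = non-real-time) before configuring the codec.
final class PriorityExample {
    static MediaCodec createNonRealTimeDecoder() throws IOException {
        MediaFormat format = MediaFormat.createVideoFormat(
                MediaFormat.MIMETYPE_VIDEO_AVC, 1920, 1080);
        format.setInteger(MediaFormat.KEY_PRIORITY, 1);  // request non-real-time handling

        MediaCodec decoder =
                MediaCodec.createDecoderByType(MediaFormat.MIMETYPE_VIDEO_AVC);
        decoder.configure(format, /* surface */ null, /* crypto */ null, /* flags */ 0);
        return decoder;
    }
}
```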
In practice, however, most existing applications do not send this identifier to the bottom layer. When an application does not specify whether a video is to be encoded/decoded in real time, the electronic device defaults to real-time encoding/decoding so as to respond quickly to the application's codec request.
It will be appreciated that, since sending the identifier to choose real-time or non-real-time encoding/decoding is left to the application's discretion, current electronic devices in practice default almost all videos to real-time encoding/decoding. As a result, the real-time codec resources in the electronic device reach their upper limit earlier than the non-real-time codec resources.
Accordingly, in the embodiment of the present application, if the electronic device still lacks sufficient real-time codec resources to create a new codec for real-time encoding/decoding of a video even after reclaiming real-time codec resources, the electronic device may further use the remaining non-real-time codec resources to encode/decode the video in non-real time.
In other words, when real-time codec resources are insufficient, the electronic device can convert the video from real-time encoding/decoding to non-real-time encoding/decoding, for example converting real-time encoding to non-real-time encoding, or real-time decoding to non-real-time decoding. In this way the electronic device further improves the encoding/decoding success rate. Although after the conversion the codec latency may no longer meet the application's requirements, the success rate of video encoding/decoding in multi-path codec scenarios is raised as far as possible, avoiding the user-interface stalls and freezes that would severely degrade the user experience.
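Putting the pieces together, the order of attempts described above can be sketched as follows. This reuses the hypothetical CodecResourceState sketched earlier; reclaim() is only a stand-in for the resource reclamation service discussed later.

```java
// Hypothetical sketch of the real-time -> non-real-time fallback order.
final class AdmissionPolicy {
    enum CodecMode { REAL_TIME, NON_REAL_TIME, FAILED }

    CodecMode admitRealTime(CodecResourceState s,
                            long needMbps, long needRtMbpf, long needMbpf) {
        if (s.remainingMbps() >= needMbps && s.remainingRtMbpf() >= needRtMbpf) {
            return CodecMode.REAL_TIME;      // free real-time resources suffice
        }
        reclaim(s);                          // try to recover occupied real-time resources
        if (s.remainingMbps() >= needMbps && s.remainingRtMbpf() >= needRtMbpf) {
            return CodecMode.REAL_TIME;      // reclamation freed enough
        }
        if (s.remainingMbpf() >= needMbpf) {
            return CodecMode.NON_REAL_TIME;  // downgrade; latency may fall short of the app's needs
        }
        return CodecMode.FAILED;
    }

    void reclaim(CodecResourceState s) {
        // Stand-in: would ask the resource reclamation service to release
        // codec resources held by lower-priority processes.
    }
}
```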
The electronic device may include at least one of a mobile phone, a foldable electronic device, a tablet computer, a desktop computer, a laptop computer, a handheld computer, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, a personal digital assistant (PDA), an augmented reality (AR) device, a virtual reality (VR) device, an artificial intelligence (AI) device, a wearable device, a vehicle-mounted device, a smart home device, or a smart city device. The embodiment of the present application does not limit the specific type of the electronic device.
As shown in fig. 8, an embodiment of the present application shows a schematic structural diagram of an electronic device.
The electronic device 800 may include a processor 810 (such as the system on chip SOC described above), an external memory interface 820, an internal memory 821, a universal serial bus (universal serial bus, USB) connector 830, a charge management module 840, a power management module 841, a battery 842, an antenna 1, an antenna 2, a mobile communication module 850, a wireless communication module 860, an audio module 870, a speaker 870A, a receiver 870B, a microphone 870C, an ear-headphone interface 870D, a sensor module 880, keys 890, a motor 891, an indicator 892, a camera module 893, a display 894, and a subscriber identity module (subscriber identification module, SIM) card interface 895, etc. The sensor module 880 may include, among other things, a pressure sensor 880A, a gyroscope sensor 880B, an air pressure sensor 880C, a magnetic sensor 880D, an acceleration sensor 880E, a distance sensor 880F, a proximity light sensor 880G, a fingerprint sensor 880H, a temperature sensor 880J, a touch sensor 880K, an ambient light sensor 880L, a bone conduction sensor 880M, and the like.
It should be understood that the illustrated structure of the embodiment of the present application does not constitute a specific limitation on the electronic device 800. In other embodiments of the application, electronic device 800 may include more or less components than illustrated, or certain components may be combined, or certain components may be split, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 810 (such as the system on a chip SOC described above) may include one or more processing units. For example, the processor 810 may include an application processor (application processor, AP), a modem processor, a graphics processing unit (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a video codec (such as the video processing unit VPU described above), a digital signal processor (digital signal processor, DSP), a baseband processor, and/or a neural-network processing unit (neural-network processing unit, NPU), etc.
Wherein the video codec is used to compress or decompress (i.e., encode or decode) digital video. The electronic device 800 may support one or more video codecs. In this way, the electronic device 800 may play or record video in a variety of encoding formats. Also, the various processing units of processor 810 may be separate devices or may be integrated into one or more processors.
The processor can generate operation control signals according to the instruction operation codes and the time sequence signals to finish the control of instruction fetching and instruction execution. In particular, in the embodiment of the present application, the encoding and decoding methods described in the embodiment of the present application may be executed by the processor 810. Furthermore, the codec method described in the embodiments of the present application may be performed by a video codec (e.g., the video processing unit VPU described above).
A memory may also be provided in the processor 810 for storing instructions and data. In some embodiments, the memory in the processor 810 may be a cache. This memory may hold instructions or data that the processor 810 has just used or uses repeatedly. If the processor 810 needs the instructions or data again, it can call them directly from this memory, which avoids repeated accesses, reduces the waiting time of the processor 810, and thereby improves system efficiency.
In some embodiments, the processor 810 may include one or more interfaces. The interfaces may include an inter-integrated circuit (inter-integrated circuit, I2C) interface, an inter-integrated circuit sound (inter-integrated circuit sound, I2S) interface, a pulse code modulation (pulse code modulation, PCM) interface, a universal asynchronous receiver/transmitter (universal asynchronous receiver/transmitter, UART) interface, a mobile industry processor interface (mobile industry processor interface, MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (subscriber identity module, SIM) interface, and/or a universal serial bus (universal serial bus, USB) interface, among others. The processor 810 may be connected to the touch sensor, the audio module, the wireless communication module, the display screen, the camera module, etc. through at least one of the above interfaces.
It should be understood that the connection between the modules illustrated in the embodiments of the present application is merely illustrative, and does not limit the structure of the electronic device 800. In other embodiments of the present application, the electronic device 800 may also employ different interfaces in the above embodiments, or a combination of interfaces.
The external memory interface 820 may be used to connect an external memory card, such as a Micro SD card, to enable expansion of the memory capabilities of the electronic device 800. The external memory card communicates with the processor 810 through an external memory interface 820 to implement data storage functions. For example, files such as music, video, etc. are stored in an external memory card. Or transfer files such as music, video, etc. from the electronic device to an external memory card.
The internal memory 821 may be used to store computer-executable program code, which includes instructions. The internal memory 821 may include a program storage area and a data storage area. The program storage area may store an operating system and an application required for at least one function (such as a sound playing function or an image playing function), etc. The data storage area may store data created during use of the electronic device 800 (such as audio data and a phonebook), etc. In addition, the internal memory 821 may include a high-speed random access memory, and may further include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, or a universal flash storage (universal flash storage, UFS). The processor 810 performs various functional methods or data processing of the electronic device 800 by running the instructions stored in the internal memory 821 and/or the instructions stored in the memory provided in the processor.
The USB connector 830 is an interface that meets the USB standard, and may be used to connect the electronic device 800 and a peripheral device, specifically, a Mini USB connector, a Micro USB connector, a USB Type C connector, etc.
The charge management module 840 is configured to receive a charge input from a charger. The charging management module 840 may also provide power to the electronic device through the power management module 841 while charging the battery 842.
The power management module 841 is configured to connect the battery 842, the charge management module 840 and the processor 810. The power management module 841 receives input from the battery 842 and/or the charge management module 840, and provides power to the processor 810, the internal memory 821, the display screen 894, the camera module 893, the wireless communication module 860, and the like.
The wireless communication function of the electronic device 800 may be implemented by the antenna 1, the antenna 2, the mobile communication module 850, the wireless communication module 860, a modem processor, a baseband processor, and the like.
The electronic device 800 may implement display functions through a GPU, a display screen 894, an application processor, and the like. The GPU is a microprocessor for image processing, and is connected to the display screen 894 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 810 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 894 is used to display images, videos, and the like. For example, the display screen 894 may be used to display the decoded video during playback. In some embodiments, the display screen 894 includes a display panel. The display panel may employ a liquid crystal display (liquid crystal display, LCD), an organic light-emitting diode (organic light-emitting diode, OLED), an active-matrix organic light-emitting diode (active-matrix organic light-emitting diode, AMOLED), a flexible light-emitting diode (flexible light-emitting diode, FLED), a MiniLED, a MicroLED, a Micro-OLED, a quantum dot light-emitting diode (quantum dot light-emitting diode, QLED), or the like. In some embodiments, the electronic device 800 may include one or more display screens 894.
The electronic device 800 may implement a camera function through the camera module 893, the ISP, a video codec (such as the video processing unit VPU described above), the GPU, the display screen 894, the application processor AP, the neural-network processing unit NPU, and the like.
Electronic device 800 may implement audio functionality through audio module 870, speaker 870A, receiver 870B, microphone 870C, ear speaker interface 870D, and an application processor. Such as music playing, recording, etc.
Keys 890 may include a power-on key, a volume key, etc. The motor 891 may generate a vibration alert. The indicator 892 may be an indicator light, may be used to indicate a state of charge, a change in charge, a message indicating a missed call, a notification, etc. The SIM card interface 895 is used to connect to a SIM card.
In some embodiments, the software system of an electronic device, such as the electronic device 800 described above, may employ a layered architecture, an event-driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture. In the following, the embodiments of the present application take the Android™ system with a layered architecture as an example to illustrate the software structure of the electronic device.
Fig. 9 shows a software architecture block diagram of an electronic device.
As shown in fig. 9, the layered architecture divides the software into several layers, each with a clear role and division of labor. The layers communicate with each other through software interfaces. In some embodiments, as shown in fig. 9, the Android™ system is divided into five layers, from top to bottom: an application layer, an application framework layer, a system runtime (Native) layer, a hardware abstraction layer (hardware abstraction layer, HAL), and a kernel layer.
The application layer may include a series of application packages. As shown in fig. 9, the application packages may include applications such as gallery, calendar, map, WLAN, music, call, video, recording, clipping, screen casting, and camera. It will be appreciated that, in some embodiments, the application layer may further include applications such as SMS, navigation, and Bluetooth (not shown in the figure), depending on the applications actually installed on the electronic device; the embodiments of the present application place no limitation on this.
The application framework layer provides an application programming interface (application programming interface, API) and programming framework for the application of the application layer. The application framework layer includes a number of predefined functions.
As shown in fig. 9, the application framework layer may include a resource reclamation service (ResourceManagerService), an activity manager, a window manager, a notification manager, an input manager, a view system, a content provider, and so on.
The resource reclamation service (ResourceManagerService) is a service within the multimedia service (MediaServer) that can be used for resource reclamation. In the embodiment of the present application, the resource reclamation service can be used to reclaim codec resources. For example, the resource reclamation service may reclaim real-time codec resources and/or non-real-time codec resources.
In a specific embodiment, the resource reclamation service may determine the reclamation target application according to the process priority corresponding to each application, so as to reclaim the codec resources occupied by the target application. For example, the resource reclamation service may reclaim the codec resources occupied by applications with lower process priority.
The activity manager may provide an activity management service (ActivityManagerService, AMS). The AMS may be used for starting, switching, and scheduling system components (e.g., activities, services, content providers, broadcast receivers), as well as for managing and scheduling application processes. In the embodiment of the present application, the activity manager is further used to manage the process priority of each application process.
In a particular embodiment, the resource reclamation service may communicate with the activity manager; in response to a request from the resource reclamation service, the activity manager feeds back the process priorities of the applications currently occupying codec resources. The resource reclamation service can then determine, according to the returned process priorities, a target application whose occupied codec resources should be reclaimed, and reclaim the codec resources occupied by that target application, as shown in the sketch below.
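A minimal sketch of such a target-selection policy, under the assumptions that the holder list comes from the activity manager and that a larger numeric value means a lower process priority (both assumptions, along with all names, are illustrative):

```java
import java.util.Comparator;
import java.util.List;
import java.util.Optional;

// Hypothetical sketch: pick the codec-resource holder with the lowest process
// priority as the reclamation target.
final class ReclaimPolicy {
    record Holder(int pid, int processPriority, long heldMbpf) {}

    /** Assumes a larger processPriority value means a lower-priority process. */
    Optional<Holder> pickTarget(List<Holder> holders) {
        return holders.stream()
                .max(Comparator.comparingInt(Holder::processPriority));
    }
}
```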
The window manager provides a window management service (WindowManagerService, WMS). The WMS may be used for window management, window animation management, and surface management, and serves as a transfer station for the input system.
The notification manager allows the application to display notification information in a status bar, can be used to communicate notification type messages, can automatically disappear after a short dwell, and does not require user interaction. Such as notification manager is used to inform that the download is complete, message alerts, etc. The notification manager may also be a notification in the form of a chart or scroll bar text that appears on the system top status bar, such as a notification of a background running application, or a notification that appears on the screen in the form of a dialog window. For example, a text message is prompted in a status bar, a prompt tone is emitted, the electronic device vibrates, and an indicator light blinks, etc.
The input manager may provide an input management service (InputManagerService, IMS). The IMS may be used to manage the inputs of the system, such as touch-screen input, key input, and sensor input. The IMS retrieves events from input device nodes and, through interaction with the WMS, distributes the events to the appropriate windows.
The view system includes visual controls, such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, a display interface including a text message notification icon may include a view displaying text and a view displaying a picture.
The content provider is used to store and retrieve data and make such data accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phonebooks, etc.
The system runtime layer can be divided into the Android runtime (Android runtime, ART) and native C/C++ libraries. As shown in fig. 9, the Android runtime includes core libraries and the runtime environment. The Android runtime is responsible for converting source code into machine code, mainly employing ahead-of-time (AOT) compilation and just-in-time (JIT) compilation techniques. The core libraries mainly provide the functions of the basic Java class libraries, such as basic data structures, mathematics, IO, tools, databases, and networking, and provide APIs for users to develop Android applications.
The native C/C++ libraries may include a plurality of functional modules, for example: a surface manager (surface manager), function libraries (e.g., libc), graphics libraries (e.g., OpenGL ES), databases (e.g., SQLite), and a browser engine (WebKit). The surface manager is used to manage the display subsystem and provides the fusion of 2D and 3D layers for multiple applications. OpenGL ES provides the drawing and manipulation of 2D and 3D graphics in applications. SQLite provides a lightweight relational database for the applications of the electronic device.
In an embodiment of the present application, the system runtime layer further includes a media codec service (media codec) and a codec resource query module (Acodec or Ccodec). An application may apply to the media codec service for creating a codec instance; that is, an application's codec request for encoding/decoding a video may be received by the media codec. The codec resource query module is a service for querying codec resources. Currently, whether the query mechanism used by the codec resource query module is Acodec or Ccodec depends on the specific chip mounted.
It should be noted that Acodec and Ccodec are both mechanisms for implementing the codec resource query; the implementation principle of the query is the same, and the difference lies in when the query happens. Currently, the process of encoding/decoding in response to an application's codec request is mainly divided into three stages: an initialization (init) stage, a configuration (configure) stage, and a start (start) stage. The Acodec mechanism typically queries the codec resources at the configure stage, while Ccodec typically queries them at the start stage.
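To make the timing difference concrete, a minimal sketch with illustrative names only; queryCodecResources() stands in for the underlying resource query:

```java
// Illustrative only: the three stages of serving a codec request, and the
// stage at which each query mechanism checks the codec resources.
abstract class CodecRequestFlow {
    abstract boolean queryCodecResources();  // stand-in for the Acodec/Ccodec query

    void serveWithAcodec() {
        init();
        // Acodec: resources are queried at the configure stage.
        if (!queryCodecResources()) throw new IllegalStateException("insufficient resources");
        configure();
        start();
    }

    void serveWithCcodec() {
        init();
        configure();
        // Ccodec: resources are queried at the start stage.
        if (!queryCodecResources()) throw new IllegalStateException("insufficient resources");
        start();
    }

    void init() {}
    void configure() {}
    void start() {}
}
```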
In general, it can be simply understood that Acodec and Ccodec are merely different chips' names for the same mechanism, and which one a given electronic device employs depends primarily on the chip it is configured with.
The hardware abstraction layer runs in a user space (user space), encapsulates the kernel layer driver, and provides a call interface to the upper layer. The kernel layer is a layer between hardware and software. As shown in fig. 9, the hardware abstraction layer may include a display HAL, an audio HAL, a camera HAL, a video HAL (video HAL), and so on. The kernel layer may include a display driver, an audio driver, a camera driver, a video driver (video kernel), and the like.
In the embodiment of the present application, in brief, the media codec receives and responds to an application's codec request for encoding/decoding a video and forwards the request to Acodec or Ccodec. Acodec or Ccodec then calls the video driver (video kernel) through the video HAL to check the current codec resources. Based on the checked codec resource situation, the media codec decides whether to perform resource reclamation, conversion from real-time to non-real-time encoding/decoding, and the like, so as to improve the success rate of video encoding/decoding in multi-path codec scenarios.
The following describes the codec method according to the embodiment of the present application in detail with reference to the accompanying drawings. The codec methods in the following embodiments may be implemented in the electronic device 800 having the above-described hardware configuration.
Fig. 10 shows a schematic flow chart of a coding and decoding method, which is applied to an electronic device and includes S1001-S1015. The following describes a coding and decoding method according to an embodiment of the present application with reference to fig. 10.
S1001, a first video is encoded/decoded in real time by occupying a first real-time encoding/decoding resource in the real-time encoding/decoding resources.
The first video is a video that is already occupying real-time codec resources for real-time encoding/decoding when the codec request corresponding to the third video arrives. That is, before the codec request corresponding to the third video arrives, the electronic device may have received a codec request, initiated by a user through a first application, to encode/decode the first video, and, in response to that request, may already be encoding/decoding the first video in real time while occupying real-time codec resources.
The real-time encoding/decoding is the first encoding/decoding method described above, and the explanation about the real-time encoding/decoding can be specifically described with reference to the above, which is not repeated here in the embodiments of the present application. Correspondingly, the real-time encoding and decoding resource is the first encoding and decoding resource. The real-time encoding/decoding (i.e., the first encoding/decoding mode) corresponds to the real-time encoding/decoding resource (i.e., the first encoding/decoding resource), and the electronic device needs to occupy the real-time encoding/decoding resource for encoding/decoding the video in real time.
In the embodiment of the present application, the real-time encoding/decoding resource occupied by the real-time encoding/decoding of the first video is referred to as a first real-time encoding/decoding resource. It may be understood that the first real-time codec resource is a real-time codec resource occupied by the first video performing real-time encoding/decoding, and the real-time codec resource includes the first real-time codec resource.
It will be appreciated that the amount of the first real-time codec resources depends on how many real-time codec resources the first video needs to occupy for real-time encoding/decoding. Depending on the actual situation, the first real-time codec resources may be part or all of the real-time codec resources. When the first video occupies all of the real-time codec resources for real-time encoding/decoding, the real-time codec resources equal the first real-time codec resources, meaning that all real-time codec resources are currently occupied and no idle real-time codec resources remain.
It should be noted that, based on the actual operation situation of the electronic device, the first video may be empty. That is, when the codec request corresponding to the third video arrives, the electronic device does not currently have a video being encoded/decoded in real time. At this time, the first real-time codec resource=0. Or the first video may also include one or more videos.
For example, when a codec request corresponding to the third video arrives, the electronic device may already occupy real-time codec resources and is performing real-time encoding/decoding on the first video a, the first video B, and the first video C. Then, the first video at this time includes the first video a, the first video B, and the first video C. In this case, the first real-time codec resource is the sum of the real-time codec resource occupied by the first video a, the real-time codec resource occupied by the first video B, and the real-time codec resource occupied by the first video C.
In some embodiments, the first application may be a video playback application, a video recording application, a video editing application, or a video clipping application, as determined by the actual operation of the electronic device; the embodiments of the present application place no limitation on this. Correspondingly, the first video may be a video that the user operates on and that currently needs to be played in a video playback application (such as a short-video application, a TV-drama application, or a social application). For example, the first video may be a short video played in a short-video application, or a drama or movie video played in a TV-drama application.
The first video may also be a video obtained when the user triggers a video recording application (such as the system's own screen-recording application) to start recording, or a video that the user operates on and needs to edit within a video editing application.
S1002, the first non-real-time encoding/decoding resource in the non-real-time encoding/decoding resources is occupied to encode/decode the second video in non-real-time.
The second video is a video for which the electronic device is already occupying non-real-time codec resources for non-real-time encoding/decoding when the codec request corresponding to the third video arrives. That is, before that request arrives, the electronic device may have received a codec request, initiated by a user through a second application, to encode/decode the second video, and, in response to that request, may already be encoding/decoding the second video in non-real time while occupying non-real-time codec resources.
The non-real-time encoding/decoding is the second encoding/decoding method described above, and the explanation of the non-real-time encoding/decoding is specifically described with reference to the above, and the embodiments of the present application are not described herein again. Correspondingly, the non-real-time coding and decoding resource is the second coding and decoding resource. The non-real-time encoding/decoding (i.e., the second encoding/decoding mode) corresponds to the non-real-time encoding/decoding resource (i.e., the second encoding/decoding resource), and the electronic device needs to occupy the non-real-time encoding/decoding resource for encoding/decoding the video.
Similarly, in the embodiment of the present application, the non-real-time codec resource occupied by non-real-time encoding/decoding the second video is referred to as a first non-real-time codec resource. It is understood that the first non-real-time codec resource is a non-real-time codec resource occupied by the second video for non-real-time encoding/decoding, and the non-real-time codec resource includes the first non-real-time codec resource.
The amount of the first non-real-time codec resources depends on how many non-real-time codec resources the second video needs to occupy for non-real-time encoding/decoding. Depending on the actual situation, the first non-real-time codec resources may be part or all of the non-real-time codec resources. When the second video occupies all of the non-real-time codec resources for non-real-time encoding/decoding, the non-real-time codec resources equal the first non-real-time codec resources, meaning that all non-real-time codec resources are currently occupied and no idle non-real-time codec resources remain.
Similarly, the second video may be empty based on the actual operating conditions of the electronic device. That is, when the codec request corresponding to the third video arrives, the electronic device does not currently have a video being encoded/decoded in non-real time. At this time, the first non-real-time codec resource=0. Or the second video may also include one or more videos.
For example, when a codec request corresponding to the third video arrives, the electronic device is currently performing non-real-time encoding/decoding on the second video a, the second video B, and the second video C. Then, the second video at this time includes a second video a, a second video B, and a second video C. In this case, the first non-real-time codec resource is the sum of the non-real-time codec resource occupied by the second video a, the non-real-time codec resource occupied by the second video B, and the non-real-time codec resource occupied by the second video C.
In some embodiments, the second application may likewise be a video playback application, a video recording application, a video editing application, or a video clipping application, again determined by the actual operation of the electronic device; the embodiments of the present application place no limitation on this. Correspondingly, like the first video, the second video may be a video that the user operates on and that currently needs to be played in a video playback application (such as a short-video application, a TV-drama application, or a social application); for example, a short video played in a short-video application, or a drama or movie video played in a TV-drama application. The second video may also be a video obtained when the user triggers a video recording application (such as the system's own screen-recording application) to start recording, or a video that the user operates on and needs to edit within a video editing application.
In some embodiments, when a codec request corresponding to the third video arrives, the electronic device may currently only have a case of encoding/decoding the first video in real time, may only have a case of encoding/decoding the second video in non-real time, and may also have a case of encoding/decoding the first video in real time and encoding/decoding the second video in non-real time at the same time.
Summarizing, based on the actual operation of the electronic device, the current codec situation may be: the electronic device occupies first real-time codec resources among the real-time codec resources to encode/decode the first video in real time, and/or occupies first non-real-time codec resources among the non-real-time codec resources to encode/decode the second video in non-real time.
Without distinguishing real-time encoding/decoding from non-real-time encoding/decoding, if the first video includes n videos and the second video includes m videos, then the electronic device is characterized as being in an n+m-way encoding/decoding scene.
In addition, it will be appreciated that there may also be situations where the electronic device is neither encoding/decoding a first video in real time nor encoding/decoding a second video in non-real time; in that case, the electronic device has no video being encoded/decoded at all, and both the first real-time codec resources and the first non-real-time codec resources may be 0. When the codec request corresponding to the third video then arrives and the third video is encoded/decoded successfully, the third video may be the only current codec path.
S1003, a codec request for encoding/decoding the third video is received.
The electronic device receives a codec request initiated by a third application to encode/decode a third video. The codec request may be a request generated as triggered by a user's operation within the third application. The third application may be a video playing application, a video recording application, a video editing application, a video clipping application, or the like.
Correspondingly, the third video may be a video that the user operates on and that currently needs to be played in a video playback application (such as a short-video application, a TV-drama application, or a social application). For example, the third video may be a short video played in a short-video application, or a drama or movie video played in a TV-drama application. The third video may also be a video obtained when the user triggers a video recording application (such as the system's own screen-recording application) to start recording, or a video that the user operates on and needs to edit within a video editing application.
S1004, determining whether the third video is real-time encoding/decoding or non-real-time encoding/decoding.
After the electronic device receives the codec request for encoding/decoding the third video, because two modes, real-time and non-real-time encoding/decoding, exist, the electronic device needs to further determine whether the third video is to be encoded/decoded in real time or in non-real time. If the electronic device determines that the third video is to be encoded/decoded in real time, it performs S1005 in response to the codec request; if it determines that the third video is to be encoded/decoded in non-real time, it performs S1010 in response to the codec request.
In some embodiments, the electronic device may determine whether the video corresponding to the codec request is real-time codec or non-real-time codec according to the first identifier carried by the codec request initiated by the application, e.g. the identifier priority.
That is, the electronic device may determine, according to the identifier priority carried by the codec request initiated by the third application, whether the request is for real-time or non-real-time encoding/decoding of the third video. In a specific embodiment, if the identifier priority indicates real-time encoding/decoding of the third video (as described above, priority=0), the codec request is a request for real-time encoding/decoding of the third video, and the electronic device performs S1005 in response to it.
If the identifier priority indicates non-real-time encoding/decoding of the third video (e.g., priority=1), the codec request is a request for non-real-time encoding/decoding of the third video, and the electronic device performs S1010 in response to it.
In another embodiment, if the codec request does not carry the first identifier, i.e. if the third application does not specify whether the third video is encoded/decoded in real time or not, the electronic device defaults to the codec request sent by the application being a request for encoding/decoding the third video in real time. Therefore, the electronic apparatus in this case executes S1005.
S1005, it is determined whether the second real-time codec resource is less than the first required resource.
That is, it is determined whether the second real-time codec resource is equal to or greater than the first required resource. The second real-time codec resource is currently unoccupied, i.e. the current remaining real-time codec resource. In addition, as described above, the first real-time codec resource is a real-time codec resource that has been currently occupied. Therefore, the sum of the first real-time codec resource and the second real-time codec resource can be understood as the real-time codec resource.
It will be appreciated that if the first video of the preamble is empty, then the first video does not occupy real-time codec resources of the electronic device. That is, all real-time codec resources are not occupied at this time. In this case, the first real-time codec resource is 0, and the second real-time codec resource is equal to the real-time codec resource.
And if the first video of the preamble is not empty, the first video occupies real-time encoding and decoding resources of the electronic device. At this time, the first real-time codec resource is not 0. Therefore, in this case, the second real-time codec resource is the real-time codec resource remaining after the first real-time codec resource is removed from the real-time codec resources.
The first required resources are the real-time codec resources required for real-time encoding/decoding of the third video. Currently, since the codec resources in the electronic device are specifically measured as MBPS, MBPF, and RT-MBPF, the real-time codec resources the third video needs to occupy can be determined by calculating the MBPS and RT-MBPF corresponding to the third video, and the non-real-time codec resources it needs to occupy by calculating the corresponding MBPF.
In the embodiment of the application, the calculation formulas of MBPS, RT-MBPF and MBPF are as follows:
MBPS=width*height/(block*block)*framerate。
RT-MBPF=width*height/(block*block)。
MBPF=width*height/(block*block)。
Here, width*height is the resolution of the video, block is the macroblock size, and framerate is the frame rate of the video (e.g., the third video). For example, taking the 16×16 macroblock of H.264 and a video with 1080×1920 resolution and a 30 Hz frame rate as an example, the real-time and non-real-time codec resources the video needs to occupy are as follows:
MBPS=1080*1920/(16*16)*30。
MBPF=1080*1920/(16*16)。
RT-MBPF=1080*1920/(16*16)。
In some embodiments, when a frame of the original 1080×1920 resolution is partitioned into 16×16 macroblocks, because 1080 is not an integer multiple of 16, partitioning by 16 would yield macroblocks that do not satisfy the 16×16 size. In this case, the electronic device may pad the resolution in advance so that it is an integer multiple of the macroblock size. That is, for a video of 1080×1920 resolution, MBPS=1088*1920/(16*16)*30, and MBPF and RT-MBPF are both 1088*1920/(16*16).
Based on this, after determining that the third video needs to be encoded/decoded in real time, the electronic device may calculate the MBPS and RT-MBPF corresponding to the third video from the resolution of the third video, its frame rate, and the macroblock size (e.g., 16×16 for H.264), thereby obtaining the first required resources corresponding to the third video.
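As a sanity check on the numbers above, a small sketch (helper names are hypothetical) that computes the three quantities, including the alignment padding from 1080 to 1088:

```java
// Sketch of the resource formulas above, with macroblock alignment padding.
final class CodecResourceCalculator {
    /** Rounds a dimension up to the next multiple of the macroblock size. */
    static int align(int dim, int block) {
        return ((dim + block - 1) / block) * block;
    }

    /** MBPF (and RT-MBPF, which uses the same formula): macroblocks per frame. */
    static long mbpf(int width, int height, int block) {
        return (long) (align(width, block) / block) * (align(height, block) / block);
    }

    /** MBPS: macroblocks per second = MBPF * frame rate. */
    static long mbps(int width, int height, int block, int frameRate) {
        return mbpf(width, height, block) * frameRate;
    }

    public static void main(String[] args) {
        // 1080x1920 @ 30 fps with 16x16 macroblocks: 1080 is padded to 1088.
        System.out.println(mbpf(1080, 1920, 16));      // 8160
        System.out.println(mbps(1080, 1920, 16, 30));  // 244800
    }
}
```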
Then, the electronic device judges whether the remaining second real-time codec resources are less than the first required resources corresponding to the third video. If the second real-time codec resources are determined to be equal to or greater than the first required resources (i.e., the judgment is no), the electronic device's currently remaining real-time codec resources are sufficient to support real-time encoding/decoding of the third video, and the electronic device performs S1006, directly using the second real-time codec resources to encode/decode the third video in real time.
If the judgment is yes, i.e., the second real-time codec resources are less than the first required resources, the electronic device's currently remaining real-time codec resources are insufficient to support real-time encoding/decoding of the third video, and the electronic device cannot directly use the second real-time codec resources for that purpose. At this point, if the third video needs to be displayed in the foreground, the interface may stall, as shown in fig. 7.
However, in the embodiment of the present application, if the electronic device determines that there are not enough real-time codec resources currently available for real-time encoding/decoding of the third video, the electronic device is further triggered to reclaim the occupied first real-time codec resources. That is, in the yes case, the electronic device is triggered to perform S1007 to free up more real-time codec resources through resource reclamation.
S1006, the third video is encoded/decoded in real time by using the second real-time encoding/decoding resource.
In the event that the electronic device determines that the remaining second real-time codec resource is equal to or greater than the first required resource, the electronic device utilizes this remaining second real-time codec resource to real-time encode/decode the third video. It can be understood that, based on the amount of real-time encoding/decoding resources actually required for real-time encoding/decoding of the third video, the electronic device may occupy part or all of the second real-time encoding/decoding resources when performing real-time encoding/decoding of the third video by using the second real-time encoding/decoding resources.
That is, if the second real-time encoding/decoding resource is equal to the first required resource, the electronic device occupies all the second real-time encoding/decoding resources to perform real-time encoding/decoding for the third video. At this time, the first real-time encoding and decoding resources are occupied by the first video, and the second real-time encoding and decoding resources are occupied by the third video, so that under the condition that no real-time encoding and decoding resources are released, no real-time encoding and decoding resources remain at present.
If the second real-time codec resource is greater than the first required resource, the electronic device can support real-time encoding/decoding of the third video by occupying only part of the second real-time codec resource. In this case, after the first required resource is deducted from the second real-time codec resource, the remainder constitutes the currently remaining real-time codec resource.
S1007, performing resource reclamation on the first real-time coding and decoding resources to obtain third real-time coding and decoding resources.
Resource reclamation refers to reclaiming codec resources, including reclaiming real-time codec resources and non-real-time codec resources. The electronic device performs resource recycling on the first real-time coding and decoding resource to recycle the real-time coding and decoding resource. The third real-time coding and decoding resources obtained by recycling comprise the first real-time coding and decoding resources recycled by the electronic equipment and the second real-time coding and decoding resources remained at present.
For example, the third real-time codec resource is the sum of the originally remaining second real-time codec resource and the reclaimed portion of the first real-time codec resource. Assume that, when the codec request for the third video arrives, the real-time codec resources of the electronic device that are already occupied for real-time encoding/decoding are resource A, and the remaining idle real-time codec resources are resource B. Then resource A is the first real-time codec resource in the embodiment of the present application, and resource B is the second real-time codec resource. When resource B is insufficient to support the electronic device in real-time encoding/decoding of the third video, the electronic device performs resource reclamation on resource A, and the reclaimed portion is denoted as resource C. Depending on the actual reclamation outcome, resource C is less than or equal to resource A. At this point, the third real-time codec resource is the sum of resource B and resource C, and the first real-time codec resource is updated to the difference between resource A and resource C.
It will be appreciated that, depending on the actual reclamation outcome, resource C may be less than or equal to resource A. In the case of resource C = resource A, all occupied first real-time codec resources are reclaimed, and the third real-time codec resource is then the entirety of the real-time codec resources. In addition, the electronic device may perform resource reclamation without successfully reclaiming any resources, so resource C may also equal 0. In the case of resource C = 0, the reclaimed first real-time codec resource is 0, and the third real-time codec resource is simply the second real-time codec resource.
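A minimal arithmetic sketch of this bookkeeping, with hypothetical variable names for resources A, B, and C:

```python
def reclaim_realtime(resource_a: int, resource_b: int, resource_c: int):
    # Resource C is whatever reclamation actually recovered; it may be
    # anything from 0 (nothing reclaimed) up to resource A (all reclaimed).
    assert 0 <= resource_c <= resource_a
    third_resource = resource_b + resource_c   # now available for the third video
    first_resource = resource_a - resource_c   # still occupied after reclamation
    return third_resource, first_resource
```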
S1008, judging whether the third real-time coding and decoding resources are less than the first required resources.
That is, it is determined whether the third real-time codec resource is equal to or greater than the first required resource. When the remaining real-time codec resources, i.e., the second real-time codec resource, were insufficient to support real-time encoding/decoding of the third video, the electronic device reclaimed real-time codec resources through resource reclamation to obtain the third real-time codec resource; however, reclamation does not guarantee that enough real-time codec resources to support real-time encoding/decoding of the third video have actually been recovered. Therefore, after completing the reclamation of real-time codec resources, the electronic device further needs to determine whether the third real-time codec resource is less than the first required resource.
If not, that is, if the third real-time codec resource obtained after resource reclamation is equal to or greater than the first required resource, enough real-time codec resources to support real-time encoding/decoding of the third video have been obtained through reclamation. The electronic device then executes S1009 and directly uses the third real-time codec resource to encode/decode the third video in real time.
If so, that is, if the third real-time codec resource obtained after resource reclamation is still less than the first required resource, the real-time codec resources available after reclamation remain insufficient to support real-time encoding/decoding of the third video. Then, in the embodiment of the present application, in order to increase the success rate of multi-channel encoding/decoding, the electronic device further performs S1010 in preparation for switching the third video from real-time to non-real-time encoding/decoding.
S1009, performing real-time encoding/decoding on the third video using the third real-time encoding/decoding resource.
When the electronic device determines that the third real-time codec resource obtained after resource reclamation is equal to or greater than the first required resource, it uses the third real-time codec resource to encode/decode the third video in real time. It can be understood that, depending on the amount of real-time codec resources actually required for real-time encoding/decoding of the third video, the electronic device may occupy part or all of the third real-time codec resource when encoding/decoding the third video in real time.
That is, if the third real-time encoding and decoding resources are equal to the first required resources, the electronic device occupies all the third real-time encoding and decoding resources to perform real-time encoding and decoding for the third video. At this time, the first real-time encoding and decoding resources are occupied by the first video, and the third real-time encoding and decoding resources including the second real-time encoding and decoding resources are also occupied by the third video, so that under the condition that no real-time encoding and decoding resources are released, no real-time encoding and decoding resources remain.
If the third real-time encoding and decoding resources are larger than the first required resources, the electronic device can support real-time encoding and decoding of the third video by only occupying a part of the third real-time encoding and decoding resources. At this time, the first required resource is removed from the third real-time codec resource, and the remaining real-time codec resource is the current remaining real-time codec resource.
S1010, judging whether the second non-real-time coding and decoding resources are less than the second required resources.
The second non-real-time codec resource is the remaining non-real-time codec resource, and similarly the first non-real-time codec resource is the non-real-time codec resource occupied by the second video as described above. Therefore, it can be simply understood that the sum of the first non-real-time codec resource and the second non-real-time codec resource is the non-real-time codec resource of the electronic device.
It will be appreciated that if the aforementioned second video does not exist, no non-real-time codec resources of the electronic device are occupied by it; that is, all non-real-time codec resources are unoccupied. In this case, the first non-real-time codec resource is 0, and the second non-real-time codec resource is equal to the entirety of the non-real-time codec resources of the electronic device.

If the aforementioned second video does exist, it occupies non-real-time codec resources of the electronic device, and the first non-real-time codec resource is not 0. In this case, the second non-real-time codec resource is what remains of the non-real-time codec resources after the first non-real-time codec resource is deducted.
The second required resource is a non-real-time codec resource required for non-real-time encoding/decoding of the third video. In the embodiment of the present application, the electronic device may determine the second required resource by calculating the MBPF corresponding to the third video through the resolution size and the macroblock size (for example, 16×16 of h.264) of the third video.
When the electronic device has completed reclamation of real-time codec resources but still determines that the currently remaining real-time codec resources, i.e., the third real-time codec resource, are insufficient to support real-time encoding/decoding of the third video, it attempts to switch to non-real-time encoding/decoding in order to improve the success rate of multi-channel encoding/decoding. Moreover, because applications typically do not specify an encoding/decoding mode (in which case real-time encoding/decoding is the default), far fewer videos are encoded/decoded in non-real time than in real time. That is, the number of second videos is usually much smaller than the number of first videos.

Thus, non-real-time codec resources will typically reach their upper limit later than real-time codec resources. Therefore, when the electronic device determines that the real-time codec resources are insufficient for real-time encoding/decoding of the third video, it may further prepare to switch the third video to non-real-time encoding/decoding.

However, although the non-real-time codec resources reach their upper limit relatively late, they are still finite, so the electronic device needs to further determine whether the currently remaining second non-real-time codec resource is less than the second required resource.
If not, that is, if the second non-real-time codec resource is equal to or greater than the second required resource, the currently remaining non-real-time codec resources are sufficient to support non-real-time encoding/decoding of the third video. In this case, the electronic device performs S1011 and directly encodes/decodes the third video in non-real time using the second non-real-time codec resource.

If so, that is, if the second non-real-time codec resource is less than the second required resource, the currently remaining idle non-real-time codec resources are insufficient to support non-real-time encoding/decoding of the third video. The electronic device then performs S1012 and begins reclaiming non-real-time codec resources, to determine whether enough can be freed for non-real-time encoding/decoding of the third video.
S1011, non-real-time encoding/decoding the third video using the second non-real-time encoding/decoding resource.
When the electronic device determines that the currently remaining non-real-time codec resources, i.e., the second non-real-time codec resource, are equal to or greater than the second required resource corresponding to the third video, the electronic device may occupy the second non-real-time codec resource to encode/decode the third video in non-real time. It can be understood that, depending on the amount of non-real-time codec resources actually required for non-real-time encoding/decoding of the third video, the electronic device may occupy part or all of the second non-real-time codec resource.
That is, if the second non-real-time encoding/decoding resource is equal to the second required resource, the electronic device needs to occupy all the second non-real-time encoding/decoding resource to perform non-real-time encoding/decoding for the third video. At this time, the first non-real-time encoding and decoding resources are occupied by the second video, and the second non-real-time encoding and decoding resources are also occupied by the third video, so that under the condition that no non-real-time encoding and decoding resources are released, no residual non-real-time encoding and decoding resources exist at present.
If the second non-real-time codec resource is greater than the second required resource, the electronic device can support non-real-time encoding/decoding of the third video by occupying only part of the second non-real-time codec resource. In this case, after the second required resource is deducted from the second non-real-time codec resource, the remainder constitutes the currently remaining non-real-time codec resource.
S1012, performing resource reclamation on the first non-real-time coding and decoding resources to obtain third non-real-time coding and decoding resources.
The electronic device performs resource reclamation on the first non-real-time codec resource, i.e., it reclaims non-real-time codec resources. The third non-real-time codec resource comprises the portion of the first non-real-time codec resource reclaimed by the electronic device and the currently remaining second non-real-time codec resource.
For example, the third non-real-time codec resource may be a sum of the recovered first non-real-time codec resource and the second non-real-time codec resource. Assume that the non-real-time codec resource occupied for non-real-time encoding/decoding in the non-real-time codec resource of the electronic device is resource D, and the remaining idle non-real-time codec resource is resource E. Then, the resource D is the first non-real-time codec resource in the embodiment of the present application, and the resource E is the second non-real-time codec resource in the embodiment of the present application.
When resource E is insufficient to support the electronic device in non-real-time encoding/decoding of the third video, the electronic device performs resource reclamation on resource D, and the reclaimed portion is denoted as resource F. Depending on the actual reclamation outcome, resource F is less than or equal to resource D. At this point, the third non-real-time codec resource is the sum of resource E and resource F, and the first non-real-time codec resource is updated to the difference between resource D and resource F.
It will be appreciated that resource F may be less than or equal to resource D based on actual resource reclamation. Then, in the case that the resource F is equal to the resource D, that is, all occupied non-real-time codec resources are recovered, the third non-real-time codec resource is now all the non-real-time codec resources.
In addition, there may be a case where the electronic device performs resource recovery but does not successfully recover the resource, so that the resource F may be equal to 0. In the case where the resource f=0, the recovered first non-real-time codec resource is 0, and the third non-real-time codec resource at this time can be understood as the second non-real-time codec resource.
S1013, judging whether the third non-real-time coding and decoding resource is less than the second required resource.
That is, it is determined whether the third non-real-time codec resource is equal to or greater than the second required resource. When the non-real-time codec resources were insufficient, the electronic device completed reclamation of non-real-time codec resources and obtained the third non-real-time codec resource; as before, however, reclamation does not guarantee that enough non-real-time codec resources to support non-real-time encoding/decoding of the third video have actually been recovered.

Therefore, the electronic device needs to further determine whether the third non-real-time codec resource is less than the second required resource corresponding to the third video.
If not, that is, if the third non-real-time codec resource is equal to or greater than the second required resource, enough non-real-time codec resources to support non-real-time encoding/decoding of the third video have been obtained through reclamation. The electronic device then performs S1015 and directly uses the reclaimed third non-real-time codec resource to encode/decode the third video in non-real time.
If so, that is, if the third non-real-time codec resource is less than the second required resource, the non-real-time codec resources available after reclamation remain insufficient to support non-real-time encoding/decoding of the third video.

In other words, even after non-real-time codec resource reclamation, the extreme case may still arise in which there are insufficient resources to support encoding/decoding of the third video. The electronic device can then only abandon encoding/decoding the third video and performs S1014. In this case, encoding/decoding of the third video fails.
S1014, the third video is not encoded/decoded.
When the electronic device determines that the reclaimed third non-real-time codec resource is still less than the second required resource, there are not enough resources for non-real-time encoding/decoding of the third video, and the electronic device cannot encode/decode the third video in non-real time. Meanwhile, the electronic device has previously determined that there are insufficient real-time codec resources to support real-time encoding/decoding of the third video; encoding/decoding of the third video therefore fails.
S1015, non-real-time encoding/decoding the third video using the third non-real-time encoding/decoding resource.
When the electronic device determines that the reclaimed third non-real-time codec resource is equal to or greater than the second required resource, the electronic device occupies the third non-real-time codec resource to encode/decode the third video in non-real time.
It can be appreciated that, according to the amount of non-real-time encoding/decoding resources required by the third video, the electronic device may occupy part or all of the third non-real-time encoding/decoding resources when performing non-real-time encoding/decoding on the third video by using the third non-real-time encoding/decoding resources.
That is, if the third non-real-time encoding/decoding resource is equal to the second required resource, the electronic device needs to occupy all the third non-real-time encoding/decoding resources to perform non-real-time encoding/decoding for the third video. At this time, the first non-real-time coding and decoding resources are occupied by the second video, and the third non-real-time coding and decoding resources including the second non-real-time coding and decoding resources are also occupied by the third video, so that under the condition that no non-real-time coding and decoding resources are released, no non-real-time coding and decoding resources remain at present.
If the third non-real-time encoding/decoding resource is larger than the second required resource, the electronic device can support non-real-time encoding/decoding of the third video by occupying only a part of the third non-real-time encoding/decoding resource. At this time, the second required resource is removed from the third non-real-time codec resource, and the remaining non-real-time codec resource is the non-real-time codec resource that is currently still remaining.
In some embodiments, after the electronic device successfully encodes/decodes a video by switching from real-time to non-real-time encoding/decoding, for example after it successfully encodes/decodes the third video using the second or third non-real-time codec resource, the electronic device may display a prompt to inform the user that the video has been switched to non-real-time encoding/decoding. For example, the prompt may read "the device has reached the upper limit of its real-time encoding/decoding capability; the current video is being encoded/decoded in non-real time".
Therefore, in a multi-channel encoding/decoding scenario in which the third video is encoded/decoded simultaneously with the first video and/or the second video, if there are not enough resources to encode/decode the third video, the electronic device can improve the success rate of encoding/decoding the third video through several means: reclaiming real-time codec resources, switching from real-time to non-real-time encoding/decoding, and reclaiming non-real-time codec resources. This improves the success rate of multi-channel encoding/decoding and, when the encoded/decoded video needs to be displayed at the front end, reduces interface freezes, thereby preserving the user experience.
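For illustration only, the overall S1005–S1015 cascade described above can be sketched as follows; the function and parameter names are hypothetical, and reclaimable amounts are passed in as plain numbers rather than obtained from a real reclamation service:

```python
def admit_third_video(first_req: int, second_req: int,
                      rt_idle: int, rt_reclaimable: int,
                      nrt_idle: int, nrt_reclaimable: int) -> str:
    if rt_idle >= first_req:                 # enough idle real-time resources
        return "real-time (S1006)"
    rt_after = rt_idle + rt_reclaimable      # S1007: reclaim real-time resources
    if rt_after >= first_req:                # S1008 check passes
        return "real-time after reclamation (S1009)"
    if nrt_idle >= second_req:               # S1010: fall back to non-real-time
        return "non-real-time (S1011)"
    nrt_after = nrt_idle + nrt_reclaimable   # S1012: reclaim non-real-time resources
    if nrt_after >= second_req:              # S1013 check passes
        return "non-real-time after reclamation (S1015)"
    return "encoding/decoding fails (S1014)"
```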
It should be noted that, for convenience of explanation and understanding of the scheme, the embodiment of the present application mainly describes the codec method provided by the embodiment of the present application from the perspective of the third video. That is, the embodiment of the present application defaults to the first video being a video that is encoded/decoded in real time (first encoding/decoding scheme) and the second video being a video that is not encoded/decoded in real time (second encoding/decoding scheme). Meanwhile, the embodiments of the present application are not described in detail for the case of resource recovery in the encoding/decoding process of the first video and the second video.
However, it can be understood that, based on the encoding and decoding method provided by the embodiment of the present application, the electronic device may also encode/decode the first video and/or the second video in the same manner as the third video.
That is, when a codec request corresponding to the first video and/or the second video arrives, the electronic device likewise determines whether the first video and/or the second video is to be encoded/decoded in real time (the first encoding/decoding mode) or in non-real time (the second encoding/decoding mode). Meanwhile, if there are not enough codec resources for encoding/decoding the first video and/or the second video, the electronic device may free up resources for creating the codec by reclaiming real-time codec resources, switching from real-time to non-real-time encoding/decoding, reclaiming non-real-time codec resources, and so on. For the specific processing flow of the first video and/or the second video, reference may be made to the description of encoding/decoding the third video in the embodiment of the present application; the principle is the same as in fig. 10 and is not repeated.
In addition, it should be noted that RT-MBPF resources are a subset of MBPF resources, with RT-MBPF corresponding to real-time codec resources, and that MBPF and RT-MBPF are calculated in the same manner. Therefore, when the electronic device determines through MBPF that there are not enough non-real-time codec resources to support non-real-time encoding/decoding of a video (e.g., the third video), there is certainly not enough RT-MBPF to support real-time encoding/decoding of that video.

Consequently, when the electronic device determines that there is not enough MBPF (i.e., non-real-time codec resources) for non-real-time encoding/decoding of a video (e.g., the third video), there is no need to further consider switching from non-real-time to real-time encoding/decoding, because there are necessarily not enough RT-MBPF (i.e., real-time codec resources).
In some embodiments, if one or more of the first video, the second video, and the third video is a video to be displayed, the electronic device may display that video on the interface after encoding/decoding it. Alternatively, if any of the first video, the second video, and the third video corresponds to encoding/decoding for a screen recording, the electronic device may display a corresponding recording floating window on the interface.

In a specific embodiment, if only the third video is a video to be displayed, the electronic device may display the first interface after encoding/decoding the third video (whether in real time or in non-real time), and the first interface includes the encoded/decoded third video. Alternatively, if the third video is encoded/decoded for a screen recording, the first interface may include a recording floating window corresponding to the encoding/decoding of the third video.
In another specific embodiment, if, in addition to the third video being a video to be displayed (or having a corresponding recording floating window to be displayed), the first video and/or the second video is also a video to be displayed, the electronic device may further include the encoded/decoded first video and/or the encoded/decoded second video in the first interface.
For example, as shown in fig. 1, the first interface may be a video editing interface 100, the encoded/decoded third video may be a video displayed in an editing preview area 101, and the video displayed in a video track 102 may be the encoded/decoded first video or the second video. Or the video displayed in the edit preview area 101 may be the first video or the second video, and the video displayed in the video track 102 is the encoded/decoded third video.
As another example, as shown in fig. 3, the video displayed in the first split screen 301 may be the encoded/decoded third video, and the video displayed in the second split screen 302 may be the encoded/decoded first video or the second video. Or the video displayed in the first split screen 301 is the first video or the second video after encoding/decoding, and the video displayed in the second split screen 302 is the third video after encoding/decoding.
Also for example, as shown in fig. 5, the first interface may be interface 501, where the display area 502, the floating window 503, the floating window 504, and the floating window 505 may each correspond to one of the first video, the second video, and the third video. For example, the encoded/decoded first video, second video, and third video may be displayed in one-to-one correspondence in the display area 502, the floating window 503, and the floating window 504. For another example, the encoded/decoded first video is displayed in the display area 502, the encoded/decoded second video is displayed in the floating window 503, and the floating window 504 is a recording floating window corresponding to the encoding/decoding of the third video.
It can be understood that the display content of the first interface is only an example in the embodiment of the present application. Whether the electronic device displays the encoded/decoded video itself in the first interface or displays a recording floating window corresponding to the encoding/decoding, and whether the video to be displayed is shown full-screen or in a floating window, depends mainly on the display requirements of the actual encoded/decoded video; the embodiment of the present application imposes no limitation here.
In some embodiments, different applications have different priorities based on their service types and service requirements. In order not to affect the normal operation of each service, the electronic device may perform resource reclamation according to the process priorities of the applications. In a specific embodiment, for an application A that currently sends a codec request, the electronic device only reclaims the codec resources occupied by an application B whose process priority is lower than that of application A, where the codec resources include real-time or non-real-time codec resources.
Therefore, in the embodiment of the present application, the electronic device performs resource reclamation on the first real-time codec resource to obtain the third real-time codec resource, which may include: determining the process priorities of the one or more first applications corresponding to the first video, and determining the process priority of the third application corresponding to the third video; determining a first target application from the first applications according to the process priorities, the first target application being a first application with a lower process priority than the third application; and reclaiming the first real-time codec resources occupied by the first target application to obtain the third real-time codec resource.
Specifically, on the premise that the third video is to be encoded/decoded in real time, if the currently remaining real-time codec resources, i.e., the second real-time codec resource, are insufficient to support real-time encoding/decoding of the third video, the electronic device is triggered to reclaim the currently occupied real-time codec resources, that is, to perform resource reclamation on the first real-time codec resource.

When the electronic device reclaims the first real-time codec resource, it first determines the candidate objects for reclamation. The first real-time codec resource is the real-time codec resource occupied by the first video for real-time encoding/decoding; therefore, the first application(s) that initiated encoding/decoding of the first video are the candidates for resource reclamation.

It will be appreciated that since there may be one or more first videos, there may be one or more corresponding first applications. However, it should be noted that the first applications and the first videos are not necessarily in one-to-one correspondence; multiple first videos may correspond to the same first application.

For example, in the video editing scenario shown in fig. 1, the two videos corresponding to the two codec channels are in fact both initiated by the same video editing application. Therefore, the number of first applications may be equal to or less than the number of first videos.

The electronic device then needs to determine the target object from the one or more candidates. Since only the codec resources occupied by applications with lower process priority can be reclaimed, the electronic device first determines the process priorities of the one or more candidates, i.e., the process priorities of the one or more first applications, and at the same time determines the process priority of the third application corresponding to the third video.

Finally, the electronic device compares the process priority of each first application with that of the third application and determines a first target application from the one or more first applications, namely a first application whose process priority is lower than that of the third application. This first target application is the target object of resource reclamation, and the electronic device reclaims the first real-time codec resources it occupies.
For example, assume that the first video includes first video A, first video B, and first video C; the real-time codec resources occupied by the three videos are first real-time codec resource A, first real-time codec resource B, and first real-time codec resource C, respectively; and the three videos correspond to first application A, first application B, and first application C, respectively. If the process priorities of first application B and first application C are both lower than that of the third application, then first application B and/or first application C may be the first target application. The electronic device may then reclaim the first real-time codec resources occupied by first application B and/or first application C, thereby obtaining the third real-time codec resource. In this case, the third real-time codec resource is the sum of the second real-time codec resource and first real-time codec resource B and/or first real-time codec resource C.
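For illustration only, selecting reclamation targets by process priority can be sketched as follows; the numeric priority scale and application names are hypothetical (a lower number means a lower priority here):

```python
def pick_reclaim_targets(occupants: dict, requester_priority: int) -> list:
    # occupants maps an application name to (process_priority, occupied_resource).
    # Only applications with a lower process priority than the requester
    # are eligible reclamation targets.
    return [app for app, (prio, _res) in occupants.items()
            if prio < requester_priority]

# E.g., a background app or floating window typically has a lower priority
# than a full-screen app.
occupants = {"first_app_A": (3, 2000), "first_app_B": (1, 1500),
             "first_app_C": (2, 1000)}
print(pick_reclaim_targets(occupants, requester_priority=3))
# ['first_app_B', 'first_app_C']
```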
Similarly, in the embodiment of the present application, the electronic device performs resource reclamation on the first non-real-time codec resource to obtain the third non-real-time codec resource, which may include: determining the process priorities of the one or more second applications corresponding to the second video, and determining the process priority of the third application corresponding to the third video; determining a second target application from the second applications according to the process priorities, the second target application being a second application with a lower process priority than the third application; and reclaiming the first non-real-time codec resources occupied by the second target application to obtain the third non-real-time codec resource.

That is, on the premise that the third video is to be encoded/decoded in non-real time, if the currently remaining non-real-time codec resources, i.e., the second non-real-time codec resource, are insufficient to support non-real-time encoding/decoding of the third video, the electronic device may likewise be triggered to reclaim the currently occupied non-real-time codec resources, that is, to perform resource reclamation on the first non-real-time codec resource.

When the electronic device reclaims the first non-real-time codec resource, it first determines the candidate objects for reclamation. The first non-real-time codec resource is the non-real-time codec resource occupied by the second video for non-real-time encoding/decoding; therefore, the second application(s) that initiated encoding/decoding of the second video are the candidates for resource reclamation.

It will also be appreciated that since there may be one or more second videos, there may be one or more corresponding second applications. A second application may also be an application that can initiate multiple video codec channels, such as a video editing application; thus the second applications are not necessarily in one-to-one correspondence with the second videos, and the number of second applications may be equal to or less than the number of second videos.

The electronic device then needs to determine the target object from the one or more candidates. That is, the electronic device determines the process priorities of the one or more second applications and the process priority of the third application corresponding to the third video, compares them, and determines from the one or more second applications a second application whose process priority is lower than that of the third application as the second target application. This second target application is the target object of resource reclamation, and the electronic device reclaims the first non-real-time codec resources it occupies.
For example, assume that the second video includes second video A, second video B, and second video C; the non-real-time codec resources occupied by the three videos are first non-real-time codec resource A, first non-real-time codec resource B, and first non-real-time codec resource C, respectively; and the three videos correspond to second application A, second application B, and second application C, respectively. If the process priorities of second application A and second application C are both lower than that of the third application, then second application A and/or second application C may be the second target application. The electronic device may then reclaim the first non-real-time codec resources occupied by second application A and/or second application C, thereby obtaining the third non-real-time codec resource. In this case, the third non-real-time codec resource is the sum of the second non-real-time codec resource and first non-real-time codec resource A and/or first non-real-time codec resource C.
The process priority of an application may be configured by the application itself or defined by the system; the embodiment of the present application imposes no limitation here. For example, a floating window may have a lower priority than a full-screen window, and a background application may have a lower priority than a floating window.

Therefore, whether the electronic device reclaims real-time codec resources (e.g., the first real-time codec resource) or non-real-time codec resources (e.g., the first non-real-time codec resource), it uses the process-priority relationships among applications to reclaim only the codec resources occupied by applications (e.g., a first application and/or a second application) whose process priority is lower than that of the third application. This ensures that applications with higher process priority can continue encoding/decoding normally without being affected by resource reclamation, so the normal operation of each codec service is not disturbed.
In some embodiments, the application may specify, in addition to real-time encoding/decoding and non-real-time encoding/decoding of the video based on the time delay requirement, software encoding/decoding (soft-decoding for short), low-latency encoding/decoding, secure encoding/decoding (secure decoder) and the like for the video according to other service requirements.
In some embodiments, since real-time and non-real-time encoding/decoding in the embodiments of the present application mainly rely on the VPU, they are essentially hardware encoding/decoding (hard decoding for short), whereas software encoding/decoding typically relies on the CPU. Software encoding/decoding is therefore fundamentally different from the real-time and non-real-time encoding/decoding of the embodiments of the present application. Accordingly, when the application side specifies software encoding/decoding of a video, the electronic device may preferentially soft-decode the video, and the codec method provided by the embodiment of the present application does not apply. That is, if the electronic device receives a request for software encoding/decoding of the third video, it responds to that request by software encoding/decoding the third video.
If the application side specifies low-latency encoding/decoding of the video, the application side has stricter requirements on codec latency. Meanwhile, the codec latency of real-time encoding/decoding is lower than that of non-real-time encoding/decoding. Therefore, when the application side specifies low-latency encoding/decoding of the video, even if the currently remaining non-real-time codec resources would be sufficient to support non-real-time encoding/decoding, the electronic device does not switch the video from real-time to non-real-time encoding/decoding, so as not to violate or affect the application side's latency requirement.
If the application side specifies secure encoding/decoding of the video, secure encoding/decoding is premised on ensuring data security. Therefore, in addition to the normal codec flow, secure encoding/decoding requires security protection; for example, additional encryption and decryption of the data may be required during encoding/decoding.

Thus, secure encoding/decoding may require higher computing power (the ability to process information) than other codec modes, and under secure encoding/decoding the electronic device may need to provide more computing power. Meanwhile, in current electronic devices, such as those running the Android TM system, the system by default favors real-time encoding/decoding when real-time and non-real-time encoding/decoding coexist. Therefore, to avoid any additional impact of switching to non-real-time encoding/decoding on secure encoding/decoding, the electronic device may refrain from switching from real-time to non-real-time encoding/decoding when the application side specifies secure encoding/decoding.
That is, if low-latency or secure encoding/decoding of the third video is specified in the codec request for real-time encoding/decoding of the third video, then even if the remaining real-time codec resources are insufficient to support real-time encoding/decoding of the third video, the electronic device does not go on to check the non-real-time codec resources in order to switch from real-time to non-real-time encoding/decoding.

In other words, if the third application specifies low-latency or secure encoding/decoding for the third video, the electronic device does not encode/decode the third video when the third real-time codec resource is less than the first required resource. The third real-time codec resource is obtained by reclaiming the first real-time codec resource after the electronic device determines that the second real-time codec resource is less than the first required resource.
Fig. 11 is a schematic diagram of the process of switching from real-time to non-real-time encoding/decoding; the switching process for the third video is described below with reference to fig. 11.

As shown in fig. 11, after determining that the third real-time codec resource is less than the first required resource, i.e., that real-time codec resources sufficient to support real-time encoding/decoding of the third video are unavailable even after reclamation, the electronic device may further determine whether the third application has specified an encoding/decoding mode, such as whether low-latency or secure encoding/decoding is specified for the third video. That is, after the electronic device determines through the judging step of S1008 that the third real-time codec resource is less than the first required resource, it further performs the judging step of "whether low-latency or secure encoding/decoding of the third video is specified".
In some embodiments, whether low-latency or secure encoding/decoding is specified may be determined by the electronic device from the codec-format information carried in the codec request. For example, the third application may carry a corresponding identifier in the format parameters to specify whether low-latency or secure encoding/decoding of the video is required.

Then, if the electronic device determines that the application has specified low-latency or secure encoding/decoding for the third video, the electronic device does not switch from real-time to non-real-time encoding/decoding, so as not to affect the application side's requirements. As shown in fig. 11, the electronic device performs S1014 and does not encode/decode the third video.

In this case, encoding/decoding of the third video fails because the encoding/decoding capability of the electronic device has reached its upper limit. If, on the other hand, the electronic device determines that the application has not specified low-latency or secure encoding/decoding for the third video, switching to non-real-time encoding/decoding does not affect the application side's requirements, and the switch can therefore improve the success rate of multi-channel encoding/decoding. At this point the electronic device executes S1010 and begins to query whether the non-real-time codec resources are sufficient to support non-real-time encoding/decoding of the third video.
It should be noted that, in the descriptions of steps S1008, S1009, S1010, S1011, S1014 and the like in fig. 11, the description of fig. 10 may be specifically referred to, and the principles are the same, which is not repeated in the embodiments of the present application. Meanwhile, in fig. 11, reference is made to the description of fig. 10 as above regarding the steps before S1008 and the steps after the yes branch of S1010. For example, steps S1001 to S1007 are further included before S1008. The steps following the yes branch of S1010 further include steps S1012, S1013, S1015, and the like. The implementation principle of these steps is the same, and the embodiments of the present application will not be repeated.
In general, when the application side specifies a mode such as software, secure, or low-latency encoding/decoding, the electronic device may choose not to switch the video from real-time to non-real-time encoding/decoding. Specifically, if the third application specifies secure or low-latency encoding/decoding of the third video, then when the real-time codec resources are insufficient to support real-time encoding/decoding of the third video, the electronic device will not further check the non-real-time codec resources in order to switch the third video from real-time to non-real-time encoding/decoding.
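A minimal guard-condition sketch of this rule, assuming hypothetical request flags that mirror the identifiers said to be carried in the format parameters:

```python
from dataclasses import dataclass

@dataclass
class CodecRequest:
    low_latency: bool = False  # hypothetical flag names
    secure: bool = False
    software: bool = False

def may_switch_to_non_realtime(req: CodecRequest) -> bool:
    # The real-time -> non-real-time fallback is only considered when none
    # of the modes that must not be degraded has been specified.
    return not (req.low_latency or req.secure or req.software)

print(may_switch_to_non_realtime(CodecRequest(secure=True)))  # False
print(may_switch_to_non_realtime(CodecRequest()))             # True
```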
Fig. 12 shows an interactive flow chart of a codec method. In the following, taking interaction of each module/service in the electronic device as an example, a coding and decoding method provided by the embodiment of the present application is described with reference to fig. 12.
As shown in fig. 12, an application (e.g., a first application, a second application, or a third application) sends a codec request for encoding/decoding a video (e.g., a first video, a second video, or a third video) to the media codec service (media codec). The codec request sent by the application may carry the identifier priority to indicate whether the video is to be encoded/decoded in real time or in non-real time. Meanwhile, the codec request may also carry an identifier specifying whether software, low-latency, or secure encoding/decoding of the video is required. Note that fig. 12 does not show the interaction flow related to software encoding/decoding.

After receiving the application's codec request, the media codec service (media codec) starts codec initialization. Meanwhile, the media codec service instructs the codec resource query module (Acodec/Ccodec) to create a codec object and instructs the resource reclamation service to save information about the codec object, including the application process identifier (Pid) and the application identity identifier (mUid). Whether the codec resource query module is Acodec or Ccodec depends on the chip installed in the electronic device.

Then, the media codec service (media codec) determines whether to encode/decode the video in real time or in non-real time. If the codec request sent by the application carries the identifier priority=0, the media codec service determines to encode/decode the video in real time; if priority=1, it determines to encode/decode the video in non-real time. If the codec request carries no priority identifier, the media codec service defaults to real-time encoding/decoding of the video.
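For illustration, the priority-identifier mapping just described can be sketched as follows (the dict-based format parameters are an assumption of this sketch):

```python
def select_codec_mode(format_params: dict) -> str:
    # priority=0 -> real-time; priority=1 -> non-real-time;
    # no priority identifier -> real-time by default.
    return "non-real-time" if format_params.get("priority") == 1 else "real-time"

print(select_codec_mode({"priority": 0}))  # real-time
print(select_codec_mode({"priority": 1}))  # non-real-time
print(select_codec_mode({}))               # real-time (default)
```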
After determining to real-time encode/decode the video, the media codec service (media codec) requests allocation of real-time codec resources to a video driver (video kernel) via a codec resource query module (Acodec/Ccodec), video HAL (video HAL). Similarly, after determining to non-real time encode/decode the video, a media codec service (media codec) may request allocation of non-real time codec resources to a video driver (video kernel) via a codec resource query module (Acodec/Ccodec), video HAL (video HAL).
After receiving a request from the media codec service (media codec) to allocate real-time codec resources, the video driver (video kernel) performs the real-time codec resource check. Similarly, after receiving a request to allocate non-real-time codec resources, the video driver (video kernel) performs the non-real-time codec resource check.
Taking the third video as an example, performing the real-time codec resource check may include: and judging whether the second real-time coding and decoding resources are less than the first required resources. If the second real-time codec resource is less than the first required resource, the video driver (video kernel) may determine that the current remaining real-time codec resource is insufficient to support real-time encoding/decoding of the third video.
Performing the non-real-time codec resource check may include: and judging whether the second non-real-time coding and decoding resources are less than the second required resources. If the second non-real-time codec resource is less than the second required resource, the video driver (video kernel) may determine that the currently remaining non-real-time codec resource is insufficient to support non-real-time encoding/decoding of the third video.
In either of these two cases, the video driver (video kernel) determines that resources cannot be allocated and returns no memory (no_memory) to the media codec service (media codec). Conversely, if the video driver (video kernel) determines that the second real-time codec resource is equal to or greater than the first required resource, or that the second non-real-time codec resource is equal to or greater than the second required resource, the video driver performs the resource allocation and returns resource allocation completion to the media codec service (media codec).
For example, as shown in fig. 12, for allocating non-real-time codec resources, "non-real-time codec resource allocation complete" may be returned. While for allocation of real-time codec resources, a "real-time codec resource allocation complete" may be returned (not shown in fig. 12). That is, fig. 12 shows only the interactive flow of the branch of returning no memory (no_memory) in the real-time codec resource check stage, and does not show the interactive flow of the branch of returning the completion of the resource allocation. And, fig. 12 does not show the interactive flow of the branch of returning no memory (no_memory) in the non-real-time codec resource checking stage, but only shows the interactive flow of the branch of returning the resource allocation completion.
As shown in fig. 12, after the media codec receives no memory, the media codec service requests real-time codec resource reclamation from the resource reclamation service. After receiving the request for recovering the real-time encoding and decoding resources, the resource recovery service requests the process priority of the feedback application from the activity manager. In some embodiments, since the resource reclamation service maintains codec object related information, the resource reclamation service may request the process priority of the feedback application from the activity manager through the application process identification (Pid) and the application identity identification (mUid).
For example, assuming that the first application and the second application have initiated the codec request before the third application, the resource reclamation service synchronously saves the application process identifications (Pid), application identities (mUid) of the first application and the second application after the codec resource query module (Acodec/Ccodec) creates corresponding codec objects for the first application and the second application. Further, the resource reclamation service may request feedback of the process priority of the first application and the process priority of the second application from the activity manager through application process identifications (Pid) and application identifications (mUid) of the first application and the second application.
Meanwhile, for comparison of priorities, the resource reclamation service may also request feedback of the process priority of the third application from the activity manager. Further, after the resource recycling service obtains the process priority of the application, the application with the lowest process priority can be determined as the target application (such as the first target application or the second target application) by comparing the process priorities.
Then, the resource reclamation service reclaims the resources of the determined target application, and feeds back corresponding information to the media codec service (media codec) after the reclamation is completed. As shown in fig. 12, in case of successful resource reclamation, "resource reclamation successful" may be fed back to the media codec service (media codec).
In case of resource reclamation failure, "resource reclamation failure" may be fed back to the media codec service (media codec). That is, if an exception occurs while the resource reclamation service is reclaiming the resources of the target application, the resource reclamation fails. Alternatively, if the resource reclamation service determines that there is no application with a lower process priority, for example the process priorities of the first application and the second application are both higher than the process priority of the current third application, then the resource reclamation service cannot perform resource reclamation. In both cases, as shown in fig. 12, the resource reclamation service may feed back "resource reclamation failure" to the media codec service (media codec).
In case of successful resource reclamation, the media codec service (media codec) re-requests the video driver (video kernel) for allocation of the codec resources, and the video driver (video kernel) re-determines whether the resource allocation can be successfully completed after the resource reclamation.
As shown in fig. 12, in case that the real-time codec resource recovery is successful, the media codec service (media codec) re-requests the video driver (video kernel) to allocate the real-time codec resource, and the video driver (video kernel) re-determines whether the real-time codec resource allocation can be successfully completed after the real-time codec resource recovery is completed.
For example, taking the third video as an example, performing the real-time codec resource check again after the real-time codec resources are reclaimed may include: determining whether the third real-time codec resource is less than the first required resource. Similarly, if the media codec service (media codec) triggers reclamation of non-real-time codec resources, after the non-real-time codec resources are reclaimed it may again request the video driver (video kernel) to perform the non-real-time codec resource check, which may include: determining whether the third non-real-time codec resource is less than the second required resource.
If there are enough real-time codec resources after the resource reclamation succeeds, the video driver (video kernel) may feed back real-time codec resource allocation complete to the media codec service (media codec), which may then begin real-time encoding/decoding (the interaction flow of this branch is not shown in fig. 12).
If the real-time codec resources are still insufficient after a successful reclamation, the video driver again returns no memory (no_memory), thereby triggering another reclamation request, until no reclaimable resources remain and the resource reclamation fails. That is, the resource reclamation of the embodiment of the present application can be regarded as a cyclic interactive process: the resource reclamation service determines only one target application (e.g., the first target application or the second target application) at a time, and reclaims only the codec resources occupied by that one target application at a time. Therefore, compared with reclaiming at once the codec resources occupied by all target applications with low process priority, this approach minimizes the impact of resource reclamation on other codec services.
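The cyclic reclaim-and-retry interaction can be sketched as below. This is a hedged illustration under assumed interfaces; tryAllocateRealTime and reclaimOneLowestPriorityTarget are hypothetical placeholders, not the real media framework API.

```java
// Minimal sketch of the allocate-reclaim loop: reclaim exactly one target
// application per iteration, then re-run the resource check, until either
// allocation succeeds or nothing reclaimable remains.
final class RealTimeAllocator {

    enum AllocResult { OK, NO_MEMORY }

    interface VideoKernel {
        AllocResult tryAllocateRealTime(long requiredResource);
    }

    interface ReclaimService {
        // Reclaims the resources of exactly ONE lowest-priority target;
        // returns false when no lower-priority application remains.
        boolean reclaimOneLowestPriorityTarget();
    }

    static AllocResult allocateWithReclaim(VideoKernel kernel,
                                           ReclaimService reclaim,
                                           long requiredResource) {
        while (true) {
            if (kernel.tryAllocateRealTime(requiredResource) == AllocResult.OK) {
                return AllocResult.OK;        // "resource allocation complete"
            }
            // no_memory: reclaim one target application, then retry.
            if (!reclaim.reclaimOneLowestPriorityTarget()) {
                return AllocResult.NO_MEMORY; // "resource reclamation failure"
            }
        }
    }
}
```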
If the real-time codec resource reclamation fails, this indicates that even with resource reclamation the video cannot obtain enough real-time codec resources for real-time encoding/decoding. Thus, in an embodiment of the present application, the media codec service (media codec) prepares to switch to non-real-time encoding/decoding. As shown in fig. 12, the media codec service then determines whether low-delay encoding/decoding or secure encoding/decoding is specified for the video.
If low-delay or secure encoding/decoding is specified, switching to non-real-time encoding/decoding may affect the service requirement of the application, so the media codec service (media codec) does not perform the switch and may directly inform the application that the codec failed. For example, the application may be informed that the codec capability has reached its upper limit and that the codec failed.
If low-delay or secure encoding/decoding is not specified, the media codec service (media codec) may convert the video to non-real-time encoding/decoding in order to increase the success rate of multi-channel encoding/decoding.
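Whether the switch is permitted can be condensed into a small decision sketch; the names below are illustrative assumptions that simply mirror the low-delay/secure check described above.

```java
// Hypothetical sketch of the fallback decision after real-time reclamation
// has failed: only videos without low-delay or secure requirements may be
// converted to non-real-time encoding/decoding.
final class FallbackPolicy {

    enum Decision { SWITCH_TO_NON_REAL_TIME, FAIL_CODEC }

    static Decision afterRealTimeReclaimFailed(boolean lowDelaySpecified,
                                               boolean secureSpecified) {
        if (lowDelaySpecified || secureSpecified) {
            // Switching would break the application's service requirement,
            // so report that the codec capability has reached its upper limit.
            return Decision.FAIL_CODEC;
        }
        return Decision.SWITCH_TO_NON_REAL_TIME;
    }
}
```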
At this time, as shown in fig. 12, a media codec service (media codec) may request allocation of non-real-time codec resources to a video driver (video kernel). After the video driver (video kernel) determines that there are enough non-real-time codec resources through the non-real-time codec resource check and completes allocation, it feeds back the information of the non-real-time codec resource allocation completion to the media codec service (media codec). Further, a media codec service (media codec) starts non-real-time encoding/decoding.
It will be appreciated that if the video driver (video kernel) determines through the non-real-time codec resource check that there are insufficient non-real-time codec resources to encode/decode the video in non-real time, the video driver (video kernel) may return no memory (no_memory) to inform the media codec service (media codec).
Further, the media codec service (media codec) may request non-real-time codec resource reclamation. It should be noted that this request interaction flow is not shown in fig. 12; its principle is the same as that of the real-time codec resource reclamation, so reference may be made to the real-time codec resource reclamation flow, and it is not described in detail in the embodiment of the present application.
In some embodiments, occupied codec resources may be released at any time upon normal completion of a codec service or upon a user trigger. For example, the resources occupied by the first video and/or the second video may be released normally when their encoding/decoding completes, so that the corresponding first real-time codec resource and/or first non-real-time codec resource becomes available for encoding/decoding subsequent videos.
Therefore, after the third video is changed from real-time encoding/decoding (the first encoding/decoding mode) to non-real-time encoding/decoding (the second encoding/decoding mode) due to insufficient real-time codec resources, if during the non-real-time encoding/decoding of the third video the fourth real-time codec resource remaining after resources are released is monitored to be equal to or more than the first required resource corresponding to the third video, the electronic device may switch the third video from non-real-time encoding/decoding back to real-time encoding/decoding. That is, the electronic device uses the fourth real-time codec resource to encode/decode in real time the video clips of the third video that have not yet been encoded/decoded.
It can be understood that the fourth real-time codec resource is the unoccupied real-time codec resource remaining after resources are released during the non-real-time encoding/decoding of the third video. For example, when the third video is converted to non-real-time encoding/decoding, the remaining real-time codec resource is the third real-time codec resource, and the third real-time codec resource is not occupied by other videos during the non-real-time encoding/decoding of the third video. The fourth real-time codec resource may then be understood as the sum of the third real-time codec resource and the fifth real-time codec resource, where the fifth real-time codec resource is the real-time codec resource released during the non-real-time encoding/decoding of the third video.
In one embodiment, a media codec service (media codec) may implement real-time codec resource release interception by registering a resource interception callback function in video HAL (video HAL). Fig. 13 shows an interactive flow chart for switching non-real-time codec back to real-time codec.
As shown in fig. 13, after converting the video to non-real-time encoding/decoding because real-time codec resources are insufficient, the media codec service (media codec) registers a resource listening callback function with the video HAL (video HAL) while the video is encoded/decoded in non-real time. The release of real-time codec resources is then monitored via the video HAL (video HAL).
Once the video driver (video kernel) determines that real-time codec resources have been released, it informs the video HAL (video HAL) of the release, and the video HAL (video HAL) reports the release to the media codec service. The release condition reported by the video HAL (video HAL) may include the released real-time codec resource, such as the fifth real-time codec resource described above. It may also include the total remaining real-time codec resource after the release, for example by directly reporting the fourth real-time codec resource.
Then, the media codec service (media codec) determines whether to switch back to real-time encoding/decoding according to the release condition reported by the video HAL (video HAL). In a specific embodiment, the media codec service (media codec) may determine to switch back after determining that the fourth real-time codec resource is equal to or more than the first required resource corresponding to the third video, and then encode/decode the third video in real time using the fourth real-time codec resource. If the fourth real-time codec resource remaining after the release is still less than the first required resource for real-time encoding/decoding of the third video, the media codec service (media codec) may determine not to switch back and continue non-real-time encoding/decoding of the third video.
Meanwhile, since part of the third video may already have been encoded/decoded during the non-real-time encoding/decoding process, the media codec service (media codec) uses the fourth real-time codec resource to encode/decode in real time only the video clips of the third video that have not yet been encoded/decoded.
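The switch-back mechanism of fig. 13 can be sketched as a release listener. All type and method names below are hypothetical; the sketch only assumes that the video HAL can deliver the remaining unoccupied real-time total (the fourth real-time codec resource) after each release.

```java
// Hypothetical sketch: a listener registered with the video HAL reports the
// remaining real-time resource after each release; once it covers the third
// video's requirement, the not-yet-decoded remainder moves back to real-time.
final class SwitchBackController {

    interface VideoHal {
        void registerReleaseListener(ReleaseListener listener);
    }

    interface ReleaseListener {
        // fourthResource = total unoccupied real-time resource after a release
        void onRealTimeResourceReleased(long fourthResource);
    }

    private final long firstRequiredResource; // real-time need of the third video
    private volatile boolean switchedBack = false;

    SwitchBackController(VideoHal hal, long firstRequiredResource) {
        this.firstRequiredResource = firstRequiredResource;
        hal.registerReleaseListener(this::onRelease);
    }

    private void onRelease(long fourthResource) {
        if (!switchedBack && fourthResource >= firstRequiredResource) {
            switchedBack = true;
            resumeRemainingClipsInRealTime(); // only the unprocessed clips
        }
        // Otherwise: keep encoding/decoding in non-real-time mode.
    }

    private void resumeRemainingClipsInRealTime() {
        // Placeholder: hand the remaining segments back to the real-time path.
    }
}
```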
In some embodiments, video encoding requires mobilizing more computing-unit resources than video decoding, and video encoding also places higher demands on device performance than video decoding.
Therefore, to avoid the conversion from real-time to non-real-time encoding/decoding degrading the video encoding effect while still improving the success rate of multi-channel encoding/decoding, the electronic device may limit the conversion to non-real-time to the case of video decoding. That is, the electronic device considers changing real-time decoding to non-real-time decoding only when the third video is being decoded in real time and the real-time codec resources are insufficient. When the third video is being encoded in real time, the electronic device does not consider converting the real-time encoding to non-real-time encoding even if the real-time codec resources are insufficient.
That is, if the electronic device determines that even after resource reclamation the third real-time codec resource is still less than the first required resource, the electronic device needs to determine whether the third video is currently being video decoded or video encoded before preparing to switch to non-real-time encoding/decoding. If the current codec resource request is for video encoding the third video, the electronic device does not trigger a transition to non-real-time encoding; the electronic device triggers a switch to non-real-time decoding only when the codec resource is currently requested for video decoding the third video.
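Combined with the earlier low-delay/secure condition, the complete fallback condition might look like the following sketch; the enum and parameter names are assumptions of this illustration.

```java
// Hypothetical sketch: the non-real-time conversion is triggered only for
// DECODE requests; encode requests fail outright when real-time resources
// remain insufficient after reclamation.
final class NonRealTimePolicy {

    enum CodecKind { ENCODE, DECODE }

    static boolean mayConvertToNonRealTime(CodecKind kind,
                                           boolean lowDelaySpecified,
                                           boolean secureSpecified) {
        if (kind == CodecKind.ENCODE) {
            return false; // encoding is resource-heavier; never downgraded here
        }
        return !lowDelaySpecified && !secureSpecified;
    }
}
```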
In some embodiments, when a conventional electronic device encodes/decodes video, in addition to determining whether it currently has the capability to encode/decode the video by detecting its codec resources, it also independently checks each channel of video to determine whether the resolution of that channel is within the maximum resolution range supported by the electronic device. It will be appreciated that the video resolution check is also a codec capability check.
If the resolution of the video currently requested for encoding/decoding exceeds the maximum resolution that the electronic device can support, the video is a super-resolution video for that electronic device. For example, if the maximum resolution the electronic device can support is 4096×2176, but the resolution of the third video is 4096×2404, then the third video is a super-resolution video for the electronic device.
For super-resolution video, even if the electronic device currently has enough codec resources, the electronic device cannot encode/decode the super-resolution video. In a specific embodiment, if the electronic device does not support encoding/decoding because the video is a super-resolution video, the kernel layer of the electronic device may feed back not-supported information to the upper-layer application; for example, the kernel layer may feed back no_supported to inform the upper layer that the super-resolution video cannot be encoded/decoded.
In a specific embodiment, the video driver (video kernel) may return not supported (no_supported) to the media codec service (media codec), and the media codec service (media codec) informs the application that the resolution of the video is not supported (no_supported) for encoding/decoding.
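A minimal sketch of the super-resolution check follows, assuming a per-dimension comparison against the device maximum (matching the 4096×2176 example above); the class and method names are hypothetical.

```java
// Hypothetical sketch of the super-resolution check: a video is rejected
// with no_supported when either dimension exceeds the device maximum.
final class ResolutionChecker {

    static final int MAX_WIDTH = 4096;  // example limits from the text
    static final int MAX_HEIGHT = 2176;

    // Returns true when the video is super-resolution for this device.
    static boolean isSuperResolution(int width, int height) {
        return width > MAX_WIDTH || height > MAX_HEIGHT;
    }

    public static void main(String[] args) {
        // Example from the text: 4096x2404 exceeds 4096x2176 in height.
        System.out.println(isSuperResolution(4096, 2404)); // true  -> no_supported
        System.out.println(isSuperResolution(4096, 2176)); // false -> proceed
    }
}
```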
However, according to conventional check logic, the check of the video resolution is typically placed after the codec resource check. For example, the step of checking the video resolution is the last step in the check flow.
Fig. 14 shows a flow chart of a codec capability check under conventional check logic. The check flow shown in fig. 14 is described below using the third video as an example.
As shown in fig. 14, the check flow under conventional check logic mainly includes step ①: checking the codec resources, and step ②: checking the video resolution.
Regarding step ①: checking the codec resources. As shown in fig. 14, checking the codec resources includes a real-time codec resource check and a non-real-time codec resource check. That is, according to conventional check logic, the electronic device first checks the codec resources.
If the electronic device determines that the real-time encoding/decoding is performed on the third video, the electronic device performs a real-time encoding/decoding resource check on the third video. If the electronic device determines to perform non-real-time encoding/decoding on the third video, the electronic device performs non-real-time encoding/decoding resource checking on the third video.
In the embodiment of the application, the real-time codec resource check mainly includes: determining whether the remaining idle second real-time codec resource among the real-time codec resources is less than the first required resource corresponding to real-time encoding/decoding of the third video. If the second real-time codec resource is less than the first required resource, the real-time codec resource check in the embodiment of the present application may further include: performing resource reclamation on the real-time codec resources (for example, reclaiming the first real-time codec resource to obtain the third real-time codec resource), and determining whether the reclaimed third real-time codec resource is less than the first required resource.
It can be appreciated that, since the real-time codec resources of an electronic device are mainly measured by MBPS and RT-MBPF, the electronic device performing the real-time codec resource check for the third video is equivalent to checking whether the MBPS required for the third video is within the defined max_mbps range and whether the RT-MBPF required for the third video is within the defined max_rt-MBPF range. As shown in fig. 14, the real-time codec resource check may include the MBPS and RT-MBPF checks.
That is, the electronic device may first determine whether the MBPS required for the third video is within the defined max_mbps range. If not, the electronic device determines no memory (no_memory). If so, the electronic device further determines whether the RT-MBPF required for the third video is within the defined max_rt-MBPF range; if not, the electronic device determines no memory (no_memory), and if so, this indicates that there are sufficient real-time codec resources. That is, the first required resources in the embodiment of the present application are the MBPS required by the third video and the RT-MBPF required by the third video.
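The two range tests can be sketched as follows; the field names echo max_mbps and max_rt-MBPF from the text, while the additive budget accounting is an assumption of this illustration.

```java
// Hypothetical sketch of the real-time resource check as two range tests,
// in the order described above: MBPS first, then RT-MBPF.
final class RealTimeResourceCheck {

    long usedMbps, maxMbps;     // macroblocks-per-second budget (max_mbps)
    long usedRtMbpf, maxRtMbpf; // real-time macroblocks-per-frame budget (max_rt-MBPF)

    // Returns true only when both required quantities fit the remaining budget.
    boolean hasRoomFor(long requiredMbps, long requiredRtMbpf) {
        if (usedMbps + requiredMbps > maxMbps) {
            return false;       // -> no_memory
        }
        if (usedRtMbpf + requiredRtMbpf > maxRtMbpf) {
            return false;       // -> no_memory
        }
        return true;            // enough real-time codec resources
    }
}
```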
In the embodiment of the application, the non-real-time codec resource check mainly includes: determining whether the remaining second non-real-time codec resource among the non-real-time codec resources is less than the second required resource required for non-real-time encoding/decoding of the third video.
Similarly, if the second non-real-time codec resource is less than the second required resource, the non-real-time codec resource check in the embodiment of the present application may further include: performing resource reclamation on the non-real-time codec resources (for example, reclaiming the first non-real-time codec resource to obtain the third non-real-time codec resource), and determining whether the reclaimed third non-real-time codec resource is less than the second required resource. That is, checking the non-real-time codec resources for the third video is equivalent to checking whether the MBPF required for the third video is within the defined max_mbpf range. The second required resource in the embodiment of the application is the MBPF required for the third video.
Then, if the electronic device still determines after resource reclamation that there are not enough codec resources to support encoding/decoding the third video, the electronic device may determine no memory (no_memory). In the no-memory (no_memory) case, the electronic device does not encode/decode the third video. In a specific embodiment, the codec resource check and the no-memory (no_memory) determination may be performed by the video kernel in the kernel layer. When the video kernel determines through the codec resource check that there are currently not enough codec resources to support encoding/decoding the third video, the video kernel may feed back no memory (no_memory) to the upper-layer application to inform it that there are currently no resources available to create a codec for encoding/decoding the third video.
If there are enough codec resources to support encoding/decoding the third video after resource reclamation, i.e., in the memory-available state, the electronic device may further check the video resolution.
As shown in fig. 14, in the memory-available state the electronic device performs step ②: checking the video resolution. If the electronic device determines by checking that the resolution of the third video does not exceed the maximum resolution that the electronic device can support, this indicates that the third video is not a super-resolution video. In this case, the electronic device may encode/decode the third video, and may normally occupy the currently remaining codec resources to create the corresponding codec for encoding/decoding the third video.
If the electronic device determines by checking that the resolution of the third video exceeds the maximum resolution that the electronic device can support, this indicates that the third video is a super-resolution video. In this case, the electronic device cannot encode/decode the third video. Thus, as shown in fig. 14, the electronic device determines not supported (no_supported) and does not encode/decode the third video. In a specific embodiment, the video resolution check may also be performed by the video kernel. When the video kernel determines by checking that the video currently required to be encoded/decoded is a super-resolution video, the video kernel may inform the upper-layer application by feeding back no_supported that the current video resolution is not supported for encoding/decoding.
Thus, if the electronic device performs resource reclamation in the codec resource check step because codec resources are insufficient, it may easily happen that the video still cannot be encoded/decoded even though codec resources have been reclaimed: not only does the currently requested video fail to be encoded/decoded, but the video services that were originally being encoded/decoded are also affected.
For example, if the third video is a super-resolution video, then even after the electronic device reclaims the first real-time codec resource occupied by the first video or the first non-real-time codec resource occupied by the second video, the electronic device is still unable to encode/decode the super-resolution third video. But the resources occupied by the first video or the second video have already been reclaimed, thereby affecting the encoding/decoding of the first video or the second video. For example, if the third video is a video that needs to be played in the foreground at this time, the third video may still fail to play even after the codec resources occupied by the first video or the second video are reclaimed. In this way, the electronic device fails to encode/decode the third video after a series of processes such as judging, reclaiming, and converting real-time encoding/decoding to non-real-time encoding/decoding, which not only consumes device resources and increases device power consumption, but also affects the original codec services of the first video or the second video, thereby affecting the user experience.
Taking a mobile phone as an example, fig. 15 shows an interface schematic diagram of a super-resolution codec failure.
When the mobile phone is recording the screen, the mobile phone may receive a click operation by which the user opens a super-resolution video in file management. As shown in interface 1501 of fig. 15, the resolution of this video is 4096×2404 and the frame rate is 24 fps.
The mobile phone displays interface 1502 in response to this click operation by the user. As shown in interface 1502 in fig. 15, because the currently remaining idle codec resources of the mobile phone are not enough to support decoding and playing the super-resolution video, the mobile phone reclaims the resources occupied by the screen-recording codec and displays the prompt message "the current video codec capability of the mobile phone has reached the upper limit and the screen recording has stopped" on interface 1502.
Meanwhile, interface 1502 displays the first frame image of the super-resolution video. However, since the video the user clicked to play is a super-resolution video, the mobile phone quickly switches to interface 1503 after briefly displaying interface 1502. In interface 1503, the super-resolution video is displayed as a black screen 1504 because it cannot be decoded and played. The mobile phone may also display a window 1505 in interface 1502 to remind the user that the recorded video has been saved. Therefore, because the mobile phone checks the resolution of the video after checking the codec resources (including resource reclamation), the video recording that was proceeding normally is reclaimed and interrupted, and the super-resolution video is still not played.
That is, when the codec capability is checked according to conventional check logic, there may be cases where the codec resources of another channel's video (e.g., the first video or the second video) are reclaimed while the current video (e.g., the third video) still fails due to the no_supported problem. Thus, under a condition in which the video could never be encoded/decoded because of no_supported, the video functions of other channels that were originally encoding/decoding normally are also affected, thereby affecting the user experience.
Therefore, to solve this problem, the embodiment of the present application changes the check logic: the step of checking the video resolution is moved ahead of the step of checking the codec resources.
That is, the electronic device in the embodiment of the present application first checks whether the resolution of the third video exceeds the maximum resolution that the electronic device can support. Only after determining that the third video is not a super-resolution video does the electronic device check the codec resources to determine whether the currently remaining codec resources can support encoding/decoding the third video.
It can be appreciated that, when checking the codec resources, the electronic device can determine whether resource reclamation is required according to the state of the codec resources. The electronic device can also determine whether the conversion from real-time to non-real-time encoding/decoding is needed according to the state of the codec resources and/or whether low-delay or secure encoding/decoding is specified.
Fig. 16 shows a flow chart of a codec capability check under the check logic provided by the embodiment of the present application.
The check flow shown in fig. 16 is described below using the third video as an example. As shown in fig. 16, the check flow provided in the embodiment of the present application mainly includes step ①: checking the video resolution, and step ②: checking the codec resources.
After the electronic device receives the codec request for the third video, it performs step ①: checking the video resolution. If the electronic device determines by checking that the resolution of the third video exceeds the maximum resolution that the electronic device can support, this indicates that the third video is a super-resolution video. In this case, the electronic device cannot encode/decode the third video. Thus, as shown in fig. 16, the electronic device determines not supported (no_supported) and does not encode/decode the third video.
If the electronic device determines by checking that the resolution of the third video does not exceed the maximum resolution that the electronic device can support, this indicates that the third video is not a super-resolution video. In this case, the electronic device may support encoding/decoding the third video, and then performs step ②: checking the codec resources.
Depending on the specific resource situation, checking the codec resources may include: determining whether the second real-time codec resource is less than the first required resource; performing resource reclamation on the first real-time codec resource to obtain the third real-time codec resource; determining whether the third real-time codec resource is less than the first required resource; determining whether the second non-real-time codec resource is less than the second required resource; performing resource reclamation on the first non-real-time codec resource to obtain the third non-real-time codec resource; determining whether the third non-real-time codec resource is less than the second required resource; and so on. It can be understood that the specific flow of the codec resource check step may refer to the above embodiments and the description of fig. 10; the principle is the same and is not repeated here.
That is, taking the third video as an example, before responding to a request for real-time or non-real-time encoding/decoding of the third video, the electronic device needs to determine whether the third video is a super-resolution video. If the third video is a super-resolution video, the electronic device directly refrains from real-time or non-real-time encoding/decoding of the third video, without needing to judge the state of the real-time or non-real-time codec resources. Only if the third video is not a super-resolution video does the electronic device further judge the state of the real-time or non-real-time codec resources, i.e., check the codec resources, to determine whether to encode/decode the third video in real time or non-real time.
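The reordered check flow of fig. 16 can be summarized in a short sketch; all types below are hypothetical stand-ins, and the entire resource check (including reclamation and the non-real-time fallback) is abstracted behind a single call.

```java
// Hypothetical sketch of the reordered capability check: resolution is
// verified first, so super-resolution requests fail fast with no_supported
// and never trigger resource checks, reclamation, or the fallback.
final class CapabilityCheck {

    enum Result { NO_SUPPORTED, NO_MEMORY, OK }

    record VideoRequest(int width, int height) {}

    interface Device {
        boolean isSuperResolution(int width, int height);
        boolean checkCodecResources(VideoRequest request); // incl. reclamation/fallback
    }

    static Result check(VideoRequest request, Device device) {
        // Step 1: video resolution check (now ahead of the resource check).
        if (device.isSuperResolution(request.width(), request.height())) {
            return Result.NO_SUPPORTED;
        }
        // Step 2: codec resource check, only for supportable resolutions.
        return device.checkCodecResources(request) ? Result.OK : Result.NO_MEMORY;
    }
}
```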
It can be seen that, by moving the step of checking the video resolution ahead of the step of checking the codec resources in the check sequence, if the video exceeds the maximum resolution that the electronic device can support, the electronic device can directly determine not supported (no_supported), and the codec creation for this video fails directly. In this way, in the no_supported case, the electronic device does not check the codec resources or perform the series of processes such as judging, resource reclamation, and converting real-time to non-real-time encoding/decoding, so device resources are not consumed and device power consumption is not increased. Meanwhile, the situation is avoided in which resources are reclaimed, affecting existing video services, while the codec still cannot be successfully created because of no_supported; thus the encoding/decoding of other videos is protected in the no_supported case.
Another embodiment of the present application provides an electronic device, including: a display screen, one or more processors, and memory. The display screen and the memory are respectively coupled with the processor; the display screen is used for displaying video; one or more computer program code, including computer instructions, is stored in the memory; the computer instructions stored in the memory, when executed by the processor, cause the electronic device to implement the codec method described in any of the above embodiments.
In some embodiments, the processor includes a video codec that, when executing computer instructions stored in the memory, causes the electronic device to perform the codec method recited in any of the embodiments above.
Another embodiment of the present application provides a computer-readable storage medium storing a computer program that, when executed by a processor in an electronic device, causes the electronic device to implement the codec method described in any one of the above embodiments.
Embodiments of the present application also provide a computer program product which, when run on a computer, causes the computer to perform the functions or steps of the method embodiments described above.
Embodiments of the present application also provide a chip system, as shown in FIG. 17, the chip system 1700 includes at least one processor 1701 and at least one interface circuit 1702. The processor 1701 and the interface circuit 1702 may be interconnected by wires. For example, the interface circuit 1702 may be used to receive signals from other devices, such as a memory of a computer. For another example, the interface circuit 1702 may be used to send signals to other devices, such as the processor 1701.
The interface circuit 1702 may, for example, read instructions stored in a memory and send the instructions to the processor 1701. The instructions, when executed by the processor 1701, may cause a computer to perform the various steps of the embodiments described above. Of course, the system-on-chip may also include other discrete devices, which are not particularly limited in accordance with embodiments of the present application.
It will be apparent to those skilled in the art from this description that, for convenience and brevity of description, only the above-described division of the functional modules is illustrated, and in practical application, the above-described functional allocation may be performed by different functional modules according to needs, i.e. the internal structure of the apparatus is divided into different functional modules to perform all or part of the functions described above.
In the several embodiments provided by the present application, it should be understood that the disclosed apparatus and method may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative, e.g., the division of modules or units is merely a logical function division, and there may be additional divisions when actually implemented, e.g., multiple units or components may be combined or integrated into another apparatus, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or units, which may be in electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and the parts shown as units may be one physical unit or a plurality of physical units, may be located in one place, or may be distributed in a plurality of different places. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a readable storage medium. Based on such understanding, the technical solution of the embodiments of the present application, in essence, or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product stored in a storage medium, including several instructions for causing a device (which may be a single-chip microcomputer, a chip, or the like) or a processor to perform all or part of the steps of the methods of the embodiments of the present application. The aforementioned storage medium includes: a U-disk, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disk, or other various media capable of storing program code.
The foregoing is merely illustrative of specific embodiments of the present application, but the scope of the present application is not limited thereto, and any changes or substitutions within the technical scope of the present application should be covered by the scope of the present application. Therefore, the protection scope of the application is subject to the protection scope of the claims.

Claims (20)

1. A codec method, characterized by being applied to an electronic device, wherein the electronic device comprises a first encoding/decoding mode and a second encoding/decoding mode; the encoding/decoding time delay of the first encoding/decoding mode is lower than the encoding/decoding time delay of the second encoding/decoding mode; wherein the first encoding/decoding mode encodes/decodes through real-time encoding/decoding resources, and the second encoding/decoding mode encodes/decodes through non-real-time encoding/decoding resources; the method comprises the following steps:
Encoding/decoding one or more first videos using the first real-time encoding/decoding resources and/or encoding/decoding one or more second videos using the first non-real-time encoding/decoding resources;
Responding to a request for encoding/decoding a third video according to the first encoding/decoding mode, and if the second real-time encoding/decoding resource is less than the first required resource, carrying out resource recovery on the first real-time encoding/decoding resource to obtain a third real-time encoding/decoding resource; wherein the first required resource is a real-time encoding/decoding resource required for encoding/decoding the third video according to the first encoding/decoding mode; the second real-time codec resource is an unoccupied real-time codec resource; the third real-time coding and decoding resource comprises the second real-time coding and decoding resource and the recovered first real-time coding and decoding resource;
if the third real-time coding and decoding resource is less than the first required resource and the second non-real-time coding and decoding resource is equal to or more than the second required resource, encoding/decoding the third video by using the second non-real-time coding and decoding resource; wherein the second required resource is a non-real-time encoding/decoding resource required for encoding/decoding the third video according to the second encoding/decoding mode; the second non-real-time codec resource is an unoccupied non-real-time codec resource.
2. The method according to claim 1, wherein the method further comprises:
if the third real-time coding and decoding resources are less than the first required resources and the second non-real-time coding and decoding resources are less than the second required resources, carrying out resource recovery on the first non-real-time coding and decoding resources to obtain third non-real-time coding and decoding resources; wherein the third non-real-time codec resource comprises the second non-real-time codec resource and the recovered first non-real-time codec resource;
if the third non-real-time coding and decoding resource is equal to or more than the second required resource, coding/decoding the third video by using the third non-real-time coding and decoding resource;
And if the third non-real-time coding and decoding resources are less than the second required resources, not coding/decoding the third video.
3. The method according to claim 1 or 2, characterized in that the method further comprises:
In the process of encoding/decoding the third video according to the second encoding/decoding mode, if a fourth real-time encoding/decoding resource is equal to or more than the first required resource, encoding/decoding a video segment which is not encoded/decoded yet in the third video by using the fourth real-time encoding/decoding resource; the fourth real-time encoding/decoding resource is an unoccupied real-time encoding/decoding resource after resource release in the process of encoding/decoding the third video according to the second encoding/decoding mode.
4. The method according to claim 3, wherein said encoding/decoding the video segments of the third video that have not been encoded/decoded by using the fourth real-time codec resource, if the fourth real-time codec resource is equal to or more than the first required resource during the encoding/decoding of the third video according to the second encoding/decoding mode, comprises:
registering a resource monitoring callback function;
during the process of encoding/decoding the third video according to the second encoding/decoding mode, the resource monitoring callback function is utilized to monitor the release of the real-time encoding/decoding resource, so as to obtain the fourth real-time encoding/decoding resource;
and if the fourth real-time coding and decoding resource is equal to or more than the first required resource, coding/decoding the video fragments which are not coded/decoded in the third video by utilizing the fourth real-time coding and decoding resource.
5. The method according to claim 1 or 2, characterized in that the method further comprises:
checking whether the resolution of the third video exceeds a maximum resolution supportable by the electronic device;
And responding to the request for encoding/decoding the third video according to the first encoding/decoding mode, if the second real-time encoding/decoding resource is less than the first required resource, performing resource recovery on the first real-time encoding/decoding resource to obtain a third real-time encoding/decoding resource, including:
And when the resolution of the third video does not exceed the maximum resolution which can be supported by the electronic equipment, responding to a request for encoding/decoding the third video according to the first encoding/decoding mode, and if the second real-time encoding/decoding resource is less than the first required resource, carrying out resource recovery on the first real-time encoding/decoding resource to obtain a third real-time encoding/decoding resource.
6. The method according to claim 1 or 2, wherein the performing resource reclamation on the first real-time codec resource to obtain a third real-time codec resource comprises:
Determining the process priority of the first video corresponding to the first application and determining the process priority of the third video corresponding to the third application;
determining a first target application according to the process priority; wherein the first target application is a first application having the process priority lower than that of the third application;
recovering a first real-time encoding and decoding resource occupied by the first target application to obtain a third real-time encoding and decoding resource; the third real-time encoding and decoding resource comprises a first real-time encoding and decoding resource occupied by the first target application and the second real-time encoding and decoding resource.
7. The method of claim 2, wherein the performing resource reclamation on the first non-real-time codec resource to obtain a third non-real-time codec resource comprises:
determining the process priority of the second video corresponding to the second application and determining the process priority of the third video corresponding to the third application;
Determining a second target application according to the process priority; wherein the second target application is a second application having the process priority lower than that of the third application;
Recovering the first non-real-time encoding and decoding resources occupied by the second target application to obtain third non-real-time encoding and decoding resources; the third non-real-time coding and decoding resource comprises a first non-real-time coding and decoding resource occupied by the second target application and the second non-real-time coding and decoding resource.
8. The method according to claim 1 or 2, characterized in that the method further comprises:
If the third real-time encoding/decoding resource is less than the first required resource, and the third video is designated to be encoded/decoded safely or encoded/decoded with low delay according to the encoding/decoding format, or the third video is currently encoded, the third video is not encoded; wherein the codec format is specified by a third application corresponding to the third video;
and wherein, if the third real-time coding and decoding resource is less than the first required resource and the second non-real-time coding and decoding resource is equal to or more than the second required resource, the encoding/decoding of the third video by using the second non-real-time coding and decoding resource comprises the following steps:
And if the third real-time coding and decoding resources are less than the first required resources, and the safe coding/decoding or the low-delay coding/decoding of the third video is not specified according to the coding and decoding format, and the third video is currently subjected to video decoding, and the second non-real-time coding and decoding resources are equal to or more than the second required resources, the third video is decoded by using the second non-real-time coding and decoding resources.
9. The method according to claim 1, wherein the method further comprises:
Responding to a request for encoding/decoding the third video according to the second encoding/decoding mode, and if the second non-real-time encoding/decoding resource is equal to or more than a second required resource, encoding/decoding the third video by using the second non-real-time encoding/decoding resource;
If the second non-real-time coding and decoding resources are less than the second required resources, carrying out resource recovery on the first non-real-time coding and decoding resources to obtain third non-real-time coding and decoding resources; wherein the third non-real-time codec resource comprises the second non-real-time codec resource and the recovered first non-real-time codec resource;
if the third non-real-time coding and decoding resource is equal to or more than the second required resource, coding/decoding the third video by using the third non-real-time coding and decoding resource;
And if the third non-real-time coding and decoding resources are less than the second required resources, not coding/decoding the third video.
10. The method according to claim 1 or 9, characterized in that the method further comprises:
receiving a coding/decoding request for coding/decoding the third video;
If the coding and decoding request carries a first identifier, determining whether to code/decode the third video according to the first coding and decoding mode or to code/decode the third video according to the second coding and decoding mode according to the first identifier;
And if the encoding/decoding request does not carry the first identifier, determining to encode/decode the third video according to the first encoding/decoding mode.
11. The method according to claim 1 or 2, characterized in that the method further comprises:
and if the second real-time coding and decoding resource or the third real-time coding and decoding resource is equal to or more than the first required resource, coding/decoding the third video by using the second real-time coding and decoding resource or the third real-time coding and decoding resource.
12. The method according to claim 1 or 2, characterized in that the method further comprises:
Displaying a first interface, wherein the first interface comprises a third video after encoding/decoding; or the first interface comprises a recording screen floating window, and the recording screen floating window corresponds to the encoding/decoding of the third video.
13. The method of claim 12, wherein the first interface further comprises a first encoded/decoded video and/or a second encoded/decoded video.
14. The method of claim 1, wherein the electronic device comprises a media codec service, a codec resource query module, a video driver, and a resource reclamation service; and responding to the request for encoding/decoding the third video according to the first encoding/decoding mode, if the second real-time encoding/decoding resource is less than the first required resource, performing resource recovery on the first real-time encoding/decoding resource to obtain a third real-time encoding/decoding resource, including:
the media codec service requests the codec resource query module to allocate the real-time codec resource in response to a request for encoding/decoding the third video in the first encoding/decoding manner;
The coding and decoding resource inquiry module calls the video driver and checks whether the second real-time coding and decoding resource is less than the first required resource;
If the second real-time encoding and decoding resources are less than the first required resources, the encoding and decoding resource inquiry module returns no memory to the media encoding and decoding service, and the media encoding and decoding service requests the resource recycling service to recycle resources;
and the resource recycling service performs resource recycling on the first real-time coding and decoding resource to obtain the third real-time coding and decoding resource.
15. The method of claim 14, wherein the electronic device further comprises an activity manager;
The resource recycling service performs resource recycling on the first real-time coding and decoding resource to obtain the third real-time coding and decoding resource, and the method comprises the following steps:
The resource recycling service requests the activity manager to acquire the process priority of the first video corresponding to the first application and determines the process priority of the third video corresponding to the third application;
and the resource recycling service determines a first application of which the process priority is lower than that of the third application to obtain a first target application, and recycles first real-time encoding and decoding resources occupied by the first target application to obtain third real-time encoding and decoding resources.
16. The method of claim 4, wherein the electronic device comprises a media codec service, a video hardware abstraction layer, and a video driver; in the process of encoding/decoding the third video according to the second encoding/decoding manner, if a fourth real-time encoding/decoding resource is equal to or more than the first required resource, encoding/decoding a video segment in the third video that has not been encoded/decoded by using the fourth real-time encoding/decoding resource, including:
The media codec service registers the resource monitoring callback function to the video hardware abstraction layer to instruct the video hardware abstraction layer to monitor and report the release of the real-time codec resource;
In the process that the media coding/decoding service codes/decodes the third video according to the second coding/decoding mode, the video hardware abstraction layer acquires the release condition of the video driving feedback and reports the release condition to the media coding/decoding service; wherein the release condition includes the fourth real-time codec resource;
And if the media coding and decoding service determines that the fourth real-time coding and decoding resource is equal to or more than the first required resource, the media coding and decoding service utilizes the fourth real-time coding and decoding resource to code/decode the video fragments which are not coded/decoded yet in the third video.
17. An electronic device, comprising: a display screen, one or more processors, and a memory, the display screen and the memory being coupled to the processors, respectively; the display screen is used for displaying videos;
the memory has stored therein one or more computer program code comprising computer instructions; the computer instructions, when executed by the processor, cause the electronic device to perform the codec method of any one of claims 1-16.
18. The electronic device of claim 17, wherein the processor comprises a video codec, the computer instructions being executed by the video codec.
19. A computer readable storage medium, on which a computer program is stored, characterized in that the computer program, when executed by a processor of an electronic device, causes the electronic device to perform the codec method of any one of claims 1-16.
20. A computer program product comprising a computer program which, when executed by a processor in an electronic device, causes the electronic device to perform the codec method of any one of claims 1-16.
CN202410476337.3A 2024-04-19 2024-04-19 Encoding and decoding method, electronic device, computer-readable storage medium, and program product Pending CN118101962A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202410476337.3A CN118101962A (en) 2024-04-19 2024-04-19 Encoding and decoding method, electronic device, computer-readable storage medium, and program product

Publications (1)

Publication Number Publication Date
CN118101962A true CN118101962A (en) 2024-05-28

Family

ID=91142281

Country Status (1)

Country Link
CN (1) CN118101962A (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106658018A (en) * 2016-12-20 2017-05-10 天脉聚源(北京)传媒科技有限公司 Intelligent soft decoding method and device
CN112351276A (en) * 2020-11-04 2021-02-09 北京金山云网络技术有限公司 Video encoding method and device and video decoding method and device
US11178395B1 (en) * 2020-06-10 2021-11-16 Whatsapp Llc Methods, mediums, and systems for dynamically selecting codecs
CN115964162A (en) * 2022-12-05 2023-04-14 浩云科技股份有限公司 Encoding and decoding resource allocation method and device
CN116170629A (en) * 2021-11-24 2023-05-26 华为技术有限公司 Method for transmitting code stream, electronic equipment and computer readable storage medium
US20230221969A1 (en) * 2020-09-16 2023-07-13 Alibaba Group Holding Limited Encoding scheduling method, server, client, and system for acquiring remote desktop

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination