CN116055802B - Image frame processing method and electronic equipment


Info

Publication number
CN116055802B
Authority
CN
China
Prior art keywords
frame
image
rate
loss rate
electronic device
Prior art date
Legal status
Active
Application number
CN202210859314.1A
Other languages
Chinese (zh)
Other versions
CN116055802A (en)
Inventor
郑家兴
白帆
Current Assignee
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date
Filing date
Publication date
Application filed by Honor Device Co Ltd filed Critical Honor Device Co Ltd
Priority to CN202410161133.0A priority Critical patent/CN118138836A/en
Priority to CN202210859314.1A priority patent/CN116055802B/en
Publication of CN116055802A publication Critical patent/CN116055802A/en
Application granted granted Critical
Publication of CN116055802B publication Critical patent/CN116055802B/en


Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/44Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
    • H04N21/4402Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display
    • H04N21/440281Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display by altering the temporal resolution, e.g. by frame skipping
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14Digital output to display device ; Cooperation and interconnection of the display device with other functional units
    • G06F3/1454Digital output to display device ; Cooperation and interconnection of the display device with other functional units involving copying of the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/72Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72403User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
    • H04M1/7243User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality with interactive means for internal management of messages
    • H04M1/72439User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality with interactive means for internal management of messages for image or video messaging
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/72Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72448User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions
    • H04M1/72454User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions according to context-related or environment-related conditions

Landscapes

  • Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Environmental & Geological Engineering (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • General Business, Economics & Management (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)

Abstract

The application provides an image frame processing method and an electronic device. The method includes the following steps: receiving a first image frame sent by a second electronic device, where the first image frame belongs to a target application on the first electronic device and the timestamp corresponding to the first image frame is a first timestamp; determining a first frame rate at which the first electronic device receives image data, according to the timestamps of a first number of image frames belonging to the target application that were received most recently as of the time indicated by the first timestamp; obtaining a first frame loss rate corresponding to the first image frame according to the first frame rate and a target frame rate, where the target frame rate is the frame rate at which the first electronic device sends image data to the target application; updating an accumulated frame loss rate according to the first frame loss rate; and discarding the first image frame if the updated accumulated frame loss rate is greater than or equal to a first value. In this way, whether to discard the current image frame is determined according to the accumulated proportion of redundant frames, so that consecutive frame-dropping operations are reduced, frame skipping in the video picture is avoided, and user experience is improved.

Description

Image frame processing method and electronic equipment
Technical Field
The present disclosure relates to the field of terminal devices, and in particular, to an image frame processing method and an electronic device.
Background
When a user is on a video call on a mobile phone and also needs a computer to complete some work, it is inconvenient to operate the mobile phone and the computer at the same time. In this case, the mobile phone can be placed in collaboration with the computer and the call video on the mobile phone switched to the computer, so that the computer controls the mobile phone to conduct the video call.
In this scenario, if the frame rate at which the computer captures video images is greater than the frame rate at which the mobile phone sends image frames to the application to which the video call belongs, the mobile phone needs to perform frame-dropping processing.
In the related art, the frame-dropping processing used in this scenario produces consecutive frame-dropping operations, which cause frame skipping in the video picture on the mobile phone and result in a poor user experience.
Disclosure of Invention
To solve the above technical problem, the application provides an image frame processing method and an electronic device. By accumulating the proportion of redundant frames among the video image frames actually received by the electronic device and deciding whether to discard the current image frame according to this accumulated proportion, consecutive frame-dropping operations are reduced, frame skipping in the video picture is avoided, and user experience is improved.
In a first aspect, the present application provides an image frame processing method. The method is applied to a first electronic device and includes the following steps: receiving a first image frame sent by a second electronic device, where the first image frame belongs to a target application on the first electronic device and the timestamp corresponding to the first image frame is a first timestamp; determining a first frame rate at which the first electronic device receives image data, according to the timestamps of a first number of image frames belonging to the target application that were received most recently as of the time indicated by the first timestamp; obtaining a first frame loss rate corresponding to the first image frame according to the first frame rate and a target frame rate, where the target frame rate is the frame rate at which the first electronic device sends image data to the target application; updating an accumulated frame loss rate according to the first frame loss rate; and discarding the first image frame if the updated accumulated frame loss rate is greater than or equal to a first value. In this way, the proportion of redundant frames among the video image frames actually received by the electronic device is accumulated, and whether to discard the current image frame is determined according to this accumulated proportion, so that consecutive frame-dropping operations are reduced, frame skipping in the video picture is avoided, and user experience is improved.
According to the first aspect, determining the first frame rate at which the first electronic device receives image data, according to the timestamps of the first number of image frames belonging to the target application that were received most recently as of the time indicated by the first timestamp, includes: acquiring a time difference between the first timestamp and a first target timestamp, where the first target timestamp is the smallest among the timestamps of the first number of image frames; and determining the first frame rate of the image data currently received by the first electronic device based on the time difference and the first number. In this way, the actual frame rate corresponding to each image frame can be obtained.
According to the first aspect, obtaining the first frame loss rate corresponding to the first image frame according to the first frame rate and the target frame rate, where the target frame rate is the frame rate at which the first electronic device sends image data to the target application, includes: acquiring a frame rate difference between the first frame rate and the target frame rate; and determining the first frame loss rate according to the frame rate difference and the first frame rate, where the first frame loss rate is equal to the quotient of the frame rate difference divided by the first frame rate. In this way, the frame loss rate corresponding to each image frame can be obtained.
According to the first aspect, updating the accumulated frame loss rate according to the first frame loss rate includes: acquiring a first accumulated frame loss rate corresponding to the image frame preceding the first image frame; and determining the updated accumulated frame loss rate according to the first accumulated frame loss rate and the first frame loss rate, where the updated accumulated frame loss rate is equal to the sum of the first accumulated frame loss rate and the first frame loss rate. In this way, the accumulated frame loss rate up to the current image frame in each video call can be obtained, providing a basis for deciding whether to discard the current image frame.
According to the first aspect, if the updated accumulated frame loss rate is greater than or equal to the first value, after discarding the first image frame, the method further includes: updating the value of the accumulated frame loss rate to the difference between the accumulated frame loss rate and the first value. In this way, the accumulated frame loss rate is prevented from overflowing.
According to the first aspect, after updating the value of the accumulated frame loss rate to the difference between the accumulated frame loss rate and the first value, the method further includes: receiving a second image frame sent by the second electronic device, where the second image frame belongs to the target application and the second timestamp corresponding to the second image frame is greater than the first timestamp; determining a second frame rate at which the first electronic device receives image data, according to the timestamps of the first number of image frames belonging to the target application that were received most recently as of the time indicated by the second timestamp; obtaining a second frame loss rate corresponding to the second image frame according to the second frame rate and the target frame rate; updating the accumulated frame loss rate according to the second frame loss rate; and if the updated accumulated frame loss rate is less than or equal to a second value, sending the second image frame to the target application, where the second value is less than the first value. In this way, when the accumulated frame loss rate is less than or equal to the second value, the current image frame is not discarded.
According to the first aspect, after sending the second image frame to the target application so that the target application displays the second image frame, the method further includes: updating the value of the accumulated frame loss rate to the difference between the accumulated frame loss rate and the second value. In this way, the accumulated frame loss rate is prevented from overflowing.
According to the first aspect, after updating the value of the accumulated frame loss rate to the difference between the accumulated frame loss rate and the first value, the method further includes: receiving a third image frame sent by the second electronic device, where the third image frame belongs to the target application and the third timestamp corresponding to the third image frame is greater than the first timestamp; determining a third frame rate at which the first electronic device receives image data, according to the timestamps of the first number of image frames belonging to the target application that were received most recently as of the time indicated by the third timestamp; obtaining a third frame loss rate corresponding to the third image frame according to the third frame rate and the target frame rate; updating the accumulated frame loss rate according to the third frame loss rate; and if the updated accumulated frame loss rate is less than the first value and greater than the second value, sending the third image frame to the target application.
According to the first aspect, the first number is less than a preset value.
According to a first aspect, the first electronic device is a smart phone and the second electronic device is a personal computer, tablet or smart screen.
In a second aspect, the present application provides an electronic device, comprising: a memory and a processor, the memory coupled to the processor; the memory stores program instructions that, when executed by the processor, cause the electronic device to perform the image frame processing method of any of the first aspects.
In a third aspect, the present application provides a computer readable storage medium comprising a computer program which, when run on an electronic device, causes the electronic device to perform the image frame processing method of any one of the preceding first aspects.
In a fourth aspect, the present application provides a memory management method. The method is applied to a second electronic device and includes the following steps: receiving camera parameters sent by a first electronic device, where the camera parameters include width information and height information of the captured image; determining a first target capacity according to the width information and the height information; applying for a first memory space and a second memory space, where the capacity of each of the first memory space and the second memory space is equal to the first target capacity; and after a first image frame is captured, alternately storing first image data corresponding to the first image frame and at least one group of first related image data obtained by processing the first image frame into the first memory space and the second memory space. In this way, memory is allocated once and then reused many times, and is released only when the camera is closed after the video call ends, which reduces the time spent on frequent memory allocation during image frame processing and reduces the processing latency of each image frame in the video call.
According to the fourth aspect, after alternately storing the first image data corresponding to the first image frame and the at least one group of first related image data obtained by processing the first image frame into the first memory space and the second memory space, the method further includes: acquiring a second image frame; judging whether the amount of data corresponding to the second image frame is equal to the first target capacity; and if so, alternately storing second image data corresponding to the second image frame and at least one group of second related image data obtained by processing the second image frame into the first memory space and the second memory space.
According to the fourth aspect, after alternately storing the first image data corresponding to the first image frame and the at least one group of first related image data obtained by processing the first image frame into the first memory space and the second memory space, the method further includes: acquiring a third image frame; judging whether the length of third image data corresponding to the third image frame is equal to the first target capacity; if not, determining a second target capacity according to the length of the third image data; applying for a third memory space and a fourth memory space, where the capacity of each of the third memory space and the fourth memory space is equal to the second target capacity; and alternately storing the third image data corresponding to the third image frame and at least one group of third related image data obtained by processing the third image frame into the third memory space and the fourth memory space.
According to the fourth aspect, after alternately storing the first image data corresponding to the first image frame and the at least one group of first related image data obtained by processing the first image frame into the first memory space and the second memory space, the method further includes: receiving an instruction for closing the camera; and releasing the first memory space and the second memory space.
In a fifth aspect, the present application provides an electronic device, comprising: a memory and a processor, the memory coupled to the processor; the memory stores program instructions that, when executed by the processor, cause the electronic device to perform the memory management method of any implementation of the fourth aspect.
In a sixth aspect, the present application provides a computer-readable storage medium comprising a computer program which, when run on an electronic device, causes the electronic device to perform the memory management method of any implementation of the fourth aspect.
Drawings
Fig. 1 is a schematic structural diagram of an exemplary electronic device 100;
fig. 2 is an exemplary software architecture block diagram of the electronic device 100 according to an embodiment of the present application;
fig. 3 is a schematic diagram exemplarily illustrating two users performing instant messaging through mobile phones;
FIG. 4 is a schematic diagram of the user B in FIG. 3 collaborating with the computer C;
FIG. 5 is a flowchart illustrating an exemplary method of image frame processing;
FIG. 6 is a flowchart illustrating an exemplary memory management method;
fig. 7 is a diagram illustrating another exemplary flow of a memory management method.
Detailed Description
The following description of the embodiments of the present application will be made clearly and fully with reference to the accompanying drawings, in which it is evident that the embodiments described are some, but not all, of the embodiments of the present application. All other embodiments, which can be made by one of ordinary skill in the art based on the embodiments herein without making any inventive effort, are intended to be within the scope of the present application.
The term "and/or" is herein merely an association relationship describing an associated object, meaning that there may be three relationships, e.g., a and/or B, may represent: a exists alone, A and B exist together, and B exists alone.
The terms first and second and the like in the description and in the claims of embodiments of the present application are used for distinguishing between different objects and not necessarily for describing a particular sequential order of objects. For example, the first target object and the second target object, etc., are used to distinguish between different target objects, and are not used to describe a particular order of target objects.
In the embodiments of the present application, words such as "exemplary" or "such as" are used to mean serving as examples, illustrations, or descriptions. Any embodiment or design described herein as "exemplary" or "for example" should not be construed as preferred or advantageous over other embodiments or designs. Rather, the use of words such as "exemplary" or "such as" is intended to present related concepts in a concrete fashion.
In the description of the embodiments of the present application, unless otherwise indicated, the meaning of "a plurality" means two or more. For example, the plurality of processing units refers to two or more processing units; the plurality of systems means two or more systems.
When a user performs a video call through a first electronic device and also needs a second electronic device to complete some work, it is inconvenient for the user to operate two electronic devices at the same time. In this case, the first electronic device and the second electronic device can be placed in collaboration, and the audio and video switched to the second electronic device, so that the second electronic device controls the first electronic device to conduct the video call.
The image frame processing method in the embodiment of the application can be applied to first electronic equipment, and the first electronic equipment can be, for example, electronic equipment such as a smart phone and a tablet. The structure of the first electronic device may be as shown in fig. 1.
Fig. 1 is a schematic structural diagram of an exemplary electronic device 100. It should be understood that the electronic device 100 shown in fig. 1 is only one example of an electronic device, and that the electronic device 100 may have more or fewer components than shown in the figure, may combine two or more components, or may have a different configuration of components. The various components shown in fig. 1 may be implemented in hardware, software, or a combination of hardware and software, including one or more signal processing and/or application-specific integrated circuits.
Referring to fig. 1, an electronic device 100 may include: processor 110, internal memory 121, universal serial bus (universal serial bus, USB) interface 130, charge management module 140, power management module 141, battery 142, antenna 1, antenna 2, mobile communication module 150, wireless communication module 160, audio module 170, speaker 170A, receiver 170B, microphone 170C, headset interface 170D, sensor module 180, indicator 192, camera 193, etc.
The processor 110 may include one or more processing units, such as: the processor 110 may include an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a memory, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, and/or a neural network processor (neural-network processing unit, NPU), etc. Wherein the different processing units may be separate devices or may be integrated in one or more processors.
The controller may be a neural hub and a command center of the electronic device 100, among others. The controller can generate operation control signals according to the instruction operation codes and the time sequence signals to finish the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover a single or multiple communication bands. Different antennas may also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed into a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution for wireless communication including 2G/3G/4G/5G, etc., applied to the electronic device 100. The mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA), etc.
The wireless communication module 160 may provide solutions for wireless communication including wireless local area network (wireless local area networks, WLAN) (e.g., wireless fidelity (wireless fidelity, wi-Fi) network), bluetooth (BT), global navigation satellite system (global navigation satellite system, GNSS), frequency modulation (frequency modulation, FM), near field wireless communication technology (near field communication, NFC), infrared technology (IR), etc., as applied to the electronic device 100.
The electronic device 100 implements display functions through a GPU, a display screen 194, an application processor, and the like. The GPU is a microprocessor for image processing, and is connected to the display 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 194 is used to display images, videos, and the like. The display 194 includes a display panel. The display panel may employ a liquid crystal display (liquid crystal display, LCD), an organic light-emitting diode (OLED), or the like. In some embodiments, the electronic device 100 may include 1 or N display screens 194, N being a positive integer greater than 1.
The electronic device 100 may implement photographing functions through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
The ISP is used to process data fed back by the camera 193. For example, when photographing, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electric signal, and the camera photosensitive element transmits the electric signal to the ISP for processing and is converted into an image visible to naked eyes. ISP can also optimize the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in the camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image onto the photosensitive element. The photosensitive element may be a charge coupled device (charge coupled device, CCD) or a Complementary Metal Oxide Semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, which is then transferred to the ISP to be converted into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard RGB, YUV, or the like format. In some embodiments, electronic device 100 may include 1 or N cameras 193, N being a positive integer greater than 1.
Video codecs are used to compress or decompress digital video. The electronic device 100 may support one or more video codecs. In this way, the electronic device 100 may play or record video in a variety of encoding formats, such as: moving picture experts group (MPEG) 1, MPEG2, MPEG3, MPEG4, etc.
The software system of the electronic device 100 may employ a layered architecture, an event driven architecture, a micro-core architecture, a micro-service architecture, or a cloud architecture. In this embodiment, taking an Android (Android) system with a hierarchical architecture as an example, a software structure of the electronic device 100 is illustrated.
Fig. 2 is a software structural block diagram of the electronic device 100 of the embodiment of the present application, which is exemplarily shown.
The layered architecture of the electronic device 100 divides the software into several layers, each with a distinct role and division of labor. The layers communicate with each other through a software interface. In some embodiments, the Android system may include an application layer, an application framework layer, a system layer, a kernel layer, and the like.
The application layer may include a series of application packages.
As shown in fig. 2, the application package may include applications such as a camera, gallery, phone call, map, WLAN, bluetooth, video, instant messaging module, etc. The instant communication module may be any application module capable of performing an instant video call, for example. The user can carry out video call with other users through the instant communication module of the first electronic equipment.
The application framework layer provides an application programming interface (application programming interface, API) and programming framework for application programs of the application layer. The application framework layer includes a number of predefined functions.
As shown in FIG. 2, the application framework layer may include a window manager, a content provider, a resource manager, a view system, an image frame processing module, and the like.
The window manager is used for managing window programs. The window manager can acquire the size of the display screen, judge whether a status bar exists, lock the screen, intercept the screen and the like.
The content provider is used to store and retrieve data and make such data accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phonebooks, etc.
The view system includes visual controls, such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, a display interface including a text message notification icon may include a view displaying text and a view displaying a picture.
The resource manager provides various resources for the application program, such as localization strings, icons, pictures, layout files, video files, and the like.
The image frame processing module is used to execute the image frame processing method. When a user conducts a video call with other users through the instant messaging module of the first electronic device, the video call can be switched collaboratively to the second electronic device; the second electronic device uses its own camera to capture the image frames of the video call and sends them to the first electronic device. The image frame processing module of the first electronic device receives the image frames sent by the second electronic device and processes them using the image frame processing method of the embodiments of the present application. After an image frame is processed by the image frame processing module, if the processing result is to discard the image frame, the image frame processing module discards it and does not send it to the instant messaging module; if the processing result is to display the image frame, the image frame processing module sends it to the instant messaging module, and the instant messaging module displays it.
The application framework layer may also include a telephony manager (not shown in fig. 2), among other things. The telephony manager is used to provide the communication functions of the electronic device 100. Such as the management of call status (including on, hung-up, etc.).
Android run time includes a core library and virtual machines. Android run time is responsible for scheduling and management of the Android system.
The core library consists of two parts: one part comprises the functions that the Java language needs to call, and the other part is the core library of Android.
The application layer and the application framework layer run in a virtual machine. The virtual machine executes java files of the application program layer and the application program framework layer as binary files. The virtual machine is used for executing the functions of object life cycle management, stack management, thread management, security and exception management, garbage collection and the like.
The system library may include a plurality of functional modules. As shown in FIG. 2, in an embodiment of the present application, a system library may include a surface manager, media libraries (Media Libraries), a three-dimensional graphics processing library (e.g., OpenGL ES), a two-dimensional graphics engine (i.e., 2D graphics engine, e.g., SGL), etc.
The surface manager is used to manage the display subsystem and provides a fusion of 2D and 3D layers for multiple applications.
The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like.
The 2D graphics engine is a drawing engine for 2D drawing.
Media libraries support a variety of commonly used audio, video format playback and recording, still image files, and the like. The media library may support a variety of audio and video encoding formats, such as MPEG4, h.264, MP3, AAC, AMR, JPG, PNG, etc.
The kernel layer is a layer between hardware and software.
As shown in fig. 2, the kernel layer may include modules such as a display driver, a Bluetooth driver, a Wi-Fi driver, an audio driver, a sensor driver, and the like.
It will be appreciated that the layers and components contained in the layers in the software structure shown in fig. 2 do not constitute a specific limitation on the electronic device 100. In other embodiments of the present application, electronic device 100 may include more or fewer layers than shown, and more or fewer components may be included in each layer, as the present application is not limited.
The present application will be described in detail with reference to examples.
The application scenario of the image frame processing method in the embodiment of the present application will be described by taking the first electronic device as a mobile phone and the second electronic device as a computer as an example.
Fig. 3 is a schematic diagram exemplarily illustrating two users performing instant messaging through mobile phones. Referring to fig. 3, user A and user B communicate through mobile phone A and mobile phone B. The same instant messaging application is installed on both mobile phone A and mobile phone B; for example, the instant messaging application can be any application capable of performing an instant video call.
Fig. 4 is a schematic diagram of the user B in fig. 3 collaborating with the computer C. Referring to fig. 4, after the mobile phone B is placed in collaboration with the computer C, the call video on the mobile phone B is switched to the computer C. The camera in the computer C captures the video image frames of user B and sends them to the mobile phone B, so the camera in the mobile phone B no longer needs to capture the video image frames of user B. The mobile phone B processes the video image frames sent by the computer C using the image frame processing method.
In the collaboration scenario shown in fig. 4, the mobile phone B is referred to as the center-side device, and the computer C is referred to as the end-side device (or device-side device). It should be noted that, although the end-side device in fig. 4 is a computer, this is merely an illustrative example; the end-side device may alternatively be a tablet, a smart screen, or another electronic device.
For each instant messaging application, the center-side device can set one or more video streams with different frame rates, and the end-side device captures images during the video call at the maximum frame rate among all the video streams of that instant messaging application.
For example, assume that for an instant messaging application on mobile phone B, mobile phone B sets two video streams with different frame rates: one displays user B's video on the local display of mobile phone B, and the other sends user B's video to mobile phone A. The computer C then captures the video image frames of user B with reference to whichever of the two video streams has the larger frame rate.
For example, assume that mobile phone B sets two video streams with different frame rates, where the frame rate of the video stream displaying user B's video on the local display of mobile phone B is frame rate 1, and the frame rate of the video stream sending user B's video to mobile phone A is frame rate 2, with frame rate 1 greater than frame rate 2. Then the computer C captures the video image frames of user B at frame rate 1 and sends the captured frames to the mobile phone B at frame rate 1; that is, the frame rate at which the mobile phone B receives video image frames from the computer C is frame rate 1. For the video stream that displays user B's video on the local display of mobile phone B, since its frame rate equals frame rate 1, mobile phone B can send the received video image frames directly to that video stream. However, for the video stream that sends user B's video to mobile phone A, since its frame rate is less than frame rate 1, mobile phone B needs to perform frame-dropping processing on the video images received from the computer C and then send the remaining video image frames to that video stream.
For example, suppose that the computer C sends 30 image frames per second to the mobile phone B, while the mobile phone B sends 15 image frames per second to the mobile phone A. The mobile phone B then needs to discard 15 image frames per second and send the remaining 15 frames to the mobile phone A.
The image frame processing method according to the embodiment of the present application will be described below by taking the mobile phone B in fig. 4 as an example.
Fig. 5 is a flowchart illustrating an exemplary image frame processing method. In this embodiment, the image frame processing method is applied to the mobile phone B, and the method may include the following steps:
s501, the mobile phone B receives an image frame sent by the computer C, wherein the image frame belongs to the application 1 on the mobile phone B, and the timestamp corresponding to the image frame is the timestamp 1.
S502, determining the actual frame rate at which the mobile phone B receives image data, according to the timestamps of the x image frames belonging to the application 1 that were received most recently as of the time indicated by the timestamp 1.
Wherein x is a natural number.
Assuming that the current image frame is the i-th image frame, that its corresponding timestamp is timeStamp(i), and that the timestamp corresponding to the (i−x)-th image frame is timeStamp(i−x), the actual frame rate curFrameRate(i) at the current image frame can be calculated by the following formula (1) (with the timestamps expressed in seconds):

curFrameRate(i) = x / (timeStamp(i) − timeStamp(i−x))    formula (1)
For example, with i = 11 and x = 10, the actual frame rate when the mobile phone B receives the 11th image frame of the video can be calculated. No actual frame rate needs to be calculated for the first image frame of the video, and its frame loss rate dropRate(1) defaults to 0. When calculating the actual frame rate of the second image frame, x = 1; when calculating the actual frame rates of the third to tenth image frames, x may take any natural number from 1 to 9 that is smaller than i.
It should be noted that an excessively large value of x may affect the accuracy of the actual frame rate; therefore, the value of x needs to be kept within a certain range.
In one example, x is less than or equal to 60.
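Illustratively, the frame rate estimation of S502 can be sketched as follows. This is a non-limiting sketch: it assumes the timestamps are in milliseconds and uses a sliding window of at most 60 earlier timestamps; the class name FrameRateEstimator is not from the patent and is used only for illustration.

```java
import java.util.ArrayDeque;
import java.util.Deque;

/** Sketch of S502: estimate the actual receive frame rate from the
 *  timestamps of the most recently received frames (formula (1)). */
public class FrameRateEstimator {
    private static final int MAX_X = 60;          // x <= 60, per the text
    private final Deque<Long> stamps = new ArrayDeque<>();

    /** Called for each received frame; timestamp in milliseconds (assumption).
     *  Returns curFrameRate(i) in frames per second, or 0 for the first frame. */
    public double onFrameReceived(long timeStampMs) {
        double curFrameRate = 0;
        if (!stamps.isEmpty()) {
            int x = stamps.size();                // number of earlier frames in the window
            long oldest = stamps.peekFirst();     // timeStamp(i - x)
            curFrameRate = x * 1000.0 / (timeStampMs - oldest);  // formula (1)
        }
        stamps.addLast(timeStampMs);
        if (stamps.size() > MAX_X) {
            stamps.pollFirst();                   // keep the window at x <= 60
        }
        return curFrameRate;
    }
}
```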
S503, according to the actual frame rate and the target frame rate, obtaining a frame loss rate 1 corresponding to the image frame, wherein the target frame rate is the frame rate of the mobile phone B sending the image data to the application 1.
Assuming that the target frame rate is targetFrameRate, the frame loss rate dropRate(i) corresponding to the i-th image frame can be calculated by the following formula (2):

dropRate(i) = (curFrameRate(i) − targetFrameRate) / curFrameRate(i)    formula (2)
S504, updating the accumulated frame loss rate according to the frame loss rate 1.
The accumulated frame loss rate is obtained by accumulating the frame loss rates of all image frames sent by the computer C and received by the mobile phone B.
The accumulated frame loss rate dropRate can be calculated by the following formula (3):

dropRate = Σ dropRate(i)    formula (3)
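Illustratively, formulas (2) and (3) reduce to two floating-point operations, matching the floating-point approach described later in this embodiment; the method name below is illustrative.

```java
/** Sketch of formulas (2) and (3): the per-frame drop rate and its
 *  accumulation, computed directly with floating-point operations. */
static double updateAccumulatedDropRate(double accumulated,
                                        double curFrameRate,
                                        double targetFrameRate) {
    double dropRate = (curFrameRate - targetFrameRate) / curFrameRate; // formula (2)
    return accumulated + dropRate;                                     // formula (3)
}
```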
S505, judging whether the updated accumulated frame loss rate is greater than or equal to a first value; if so, executing step S507; otherwise, executing step S506.
S506, judging whether the updated accumulated frame loss rate is less than or equal to a second value; if so, executing step S509; otherwise, executing step S511.
S507, discarding the image frame.
Thus, the application 1 does not need to process the image frame.
S508, updating the value of the accumulated frame loss rate to the difference between the accumulated frame loss rate and the first value, and returning to step S501.
For example, if the first value is 1, then dropRate = dropRate − 1.
S509, the image frame is transmitted to the application 1.
Thus, the application 1 needs to process the image frame.
In the case where the updated accumulated frame loss rate is less than or equal to the second value, if the interval at which the application 1 processes image frames has elapsed but the mobile phone B has not yet received a new image frame, the mobile phone B may send the most recently received image frame to the application 1, so that the application 1 can process image frames at the set frame rate.
S510, updating the value of the accumulated frame loss rate to be the difference between the accumulated frame loss rate and the second value, and returning to the step S501.
For example, if the second value is −1, then dropRate = dropRate − (−1) = dropRate + 1.
S511, the image frame is transmitted to the application 1, and the process returns to step S501.
In this step, the accumulated frame loss rate keeps the value calculated in step S504 unchanged.
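Illustratively, the decision logic of steps S505 to S511 can be sketched as follows, taking the first value as 1 and the second value as −1 as in the examples of this embodiment; the class and method names are illustrative, and the frame type is left abstract.

```java
/** Sketch of steps S505 to S511 with firstValue = 1 and secondValue = -1. */
class FrameDropper {
    static final double FIRST_VALUE = 1.0;
    static final double SECOND_VALUE = -1.0;
    private double accumulatedDropRate = 0.0;

    void handleFrame(Object imageFrame, double dropRate) {
        accumulatedDropRate += dropRate;                  // S504: formula (3)
        if (accumulatedDropRate >= FIRST_VALUE) {         // S505
            // S507: discard the frame (do not forward it to application 1)
            accumulatedDropRate -= FIRST_VALUE;           // S508
        } else if (accumulatedDropRate <= SECOND_VALUE) { // S506
            sendToApplication(imageFrame);                // S509
            accumulatedDropRate -= SECOND_VALUE;          // S510: here equal to +1
        } else {
            sendToApplication(imageFrame);                // S511: rate unchanged
        }
    }

    private void sendToApplication(Object imageFrame) {
        // hand the frame to the target application for display
    }
}
```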
Assuming that the video image frames received by the mobile phone B from the computer C include image frames a1 to a100 and that x = 10, the mobile phone B processes image frames a1 to a100 as follows:
The frame loss rate of the image frames a1 to a10 is 0, and the cumulative frame loss rate is also 0.
For the image frame a11, the actual frame rate and the accumulated frame loss rate are calculated according to formulas (1) to (3); if the accumulated frame loss rate corresponding to the image frame a11 is greater than −1 and less than 1, the image frame a11 is sent directly to the instant messaging application, assumed here to be the application 1.
The image frames a12 to a20 are assumed to be processed in the same way as the image frame a11.
For the image frame a21, the actual frame rate and the accumulated frame loss rate are calculated according to formulas (1) to (3); assuming that the accumulated frame loss rate corresponding to the image frame a21 is greater than or equal to 1, the mobile phone B discards the image frame a21 and does not send it to the application 1. The accumulated frame loss rate is then updated as dropRate = dropRate − 1.
The image frames a22 to a30 are assumed to be processed in the same way as the image frame a11.
For the image frame a31, the actual frame rate and the accumulated frame loss rate are calculated according to formulas (1) to (3); assuming that the accumulated frame loss rate corresponding to the image frame a31 is less than or equal to −1, the mobile phone B sends the image frame a31 to the application 1. The accumulated frame loss rate is then updated as dropRate = dropRate + 1.
……
With the above image frame processing method, the proportion of redundant frames among the video image frames actually received by the electronic device is accumulated, and whether to discard the current image frame is determined according to this accumulated proportion, which reduces consecutive frame-dropping operations, avoids frame skipping in the video picture, and improves user experience.
In addition, this solution directly uses floating-point operations, so the processing is simple.
In addition, the error between the accumulated frame loss rate calculated by the image frame processing method of this embodiment and the theoretical frame loss rate is mainly the floating-point calculation error. The frame-dropping criterion dropRate is accumulated from the frame rate measured as each frame arrives, so each received frame influences subsequent frame-dropping decisions through the current frame rate, which makes the frame loss rate closer to the theoretical frame loss rate. The frame-dropping processing of this image frame processing method is therefore more accurate and reasonable.
Next, the memory management method according to the embodiment of the present application will be described by replacing the computer C in fig. 4 with the tablet D. That is, in the following embodiment, the center-side device is the mobile phone B, and the end-side device is the tablet D. The hardware structure of the tablet D may be the structure shown in fig. 1. In the software structure of the tablet D, the application framework layer includes a memory management module, which may be used to execute the memory management method in the embodiment of the present application.
In the collaboration scenario shown in fig. 4 (with the tablet D in place of the computer C), the mobile phone B sends the camera parameters to the tablet D, and the tablet D captures the image frames of the video call and sends them to the mobile phone B. After capturing an image frame, the tablet D stores it in memory; the tablet D then performs at least one color conversion on the image frame, and the image data obtained after each color conversion is also stored in memory. After all color conversion steps are finished, the tablet D encodes the resulting image data and sends the encoded image data to the mobile phone B.
In the related art, memory is allocated to store the image data of each image frame captured by the tablet D and the image data obtained from each color conversion of that frame, and the memory is released after use.
For example, after the tablet D captures the image frame a0, it applies for the memory 1 and stores the image frame a0 in the memory 1;
then, the tablet D takes the image frame a0 out of the memory 1, releases the memory 1, applies for the memory 2, performs the first color conversion on the image frame a0 to obtain image data a1, and stores the image data a1 in the memory 2;
then, the tablet D takes the image data a1 out of the memory 2, releases the memory 2, applies for the memory 3, performs the second color conversion on the image data a1 to obtain image data a2, and stores the image data a2 in the memory 3;
then, the tablet D takes the image data a2 out of the memory 3, releases the memory 3, encodes the image data a2 to obtain encoded image data a3, applies for the memory 4, and stores the encoded image data a3 in the memory 4;
then, the tablet D takes the encoded image data a3 out of the memory 4 and sends it to the mobile phone B.
The above timings for releasing each memory are only exemplary; the embodiment of the present application does not limit when the memory is released.
With the above memory allocation approach, the tablet D needs to apply for memory several times for each captured image frame, which consumes time and results in a larger processing delay for each image frame.
The embodiment of the present application provides a memory management method that can reduce the end-side device's processing delay for captured image frames in the above collaboration scenario.
Fig. 6 is a flowchart illustrating an exemplary memory management method. In this embodiment, the memory management method may be applied to the tablet D shown in fig. 4. Referring to fig. 6, during a collaborative video call between the tablet D and the mobile phone B, the method may include the following steps:
s601, the tablet D receives camera parameters sent by the mobile phone B, wherein the camera parameters comprise width information and height information of an acquired image.
In this embodiment, the width of the image is represented by width, and the height of the image is represented by height. The camera parameters sent by the mobile phone B to the tablet D include width and height.
The camera parameters may also include information such as resolution of the image, frame rate of the captured image, and color space.
S602, determining a first target capacity according to the width information and the height information.
The inventors found that the size of the memory to be allocated is relatively fixed and is related only to the width information and the height information of the captured image.
The size of the first target capacity, denoted memorySize, can be calculated from the width and the height by formula (4), where memorySize represents the memory capacity.
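Illustratively, and as an assumption (formula (4) itself is not reproduced here), if the camera delivers NV21 (YUV 4:2:0) frames, the first target capacity could be computed as follows; the patent's actual formula (4) may differ.

```java
/** Assumption: NV21 (YUV 4:2:0) camera frames, i.e. width*height luma bytes
 *  plus width*height/2 chroma bytes; the patent's formula (4) may differ. */
static int firstTargetCapacity(int width, int height) {
    return width * height * 3 / 2;   // memorySize, in bytes
}
```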
S603, applying for a first memory space and a second memory space, wherein the capacity of the first memory space and the capacity of the second memory space are equal to the first target capacity.
In this step, two memory spaces are allocated in a single allocation procedure.
S604, after the first image frame is acquired, the first image data corresponding to the first image frame and at least one group of first related image data obtained by processing the first image frame are alternately stored in a first memory space and a second memory space.
For example, after the tablet D captures the image frame a0, it stores the image frame a0 in the first memory space;
then, the tablet D takes the image frame a0 out of the first memory space, performs the first color conversion on it to obtain image data a1, and stores the image data a1 in the second memory space;
then, the tablet D takes the image data a1 out of the second memory space, performs the second color conversion on it to obtain image data a2, and stores the image data a2 in the first memory space;
then, the tablet D takes the image data a2 out of the first memory space, encodes it to obtain encoded image data a3, applies for the memory 4, and stores the encoded image data a3 in the memory 4;
then, the tablet D takes the encoded image data a3 out of the memory 4 and sends it to the mobile phone B.
After the image frame a0 has been processed and the encoded image data a3 has been sent to the mobile phone B, the tablet D does not release the first memory space and the second memory space, but uses them directly when processing subsequently captured image frames.
For images of the same size captured after the image frame a0, such as the image frame b0, the image frame c0, and the image frame d0 …, the tablet D stores the original frame data and the processed image data in the same manner as for the image frame a0.
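Illustratively, this alternating (ping-pong) use of the two preallocated buffers can be sketched as follows; the buffer names and the concrete conversion steps are illustrative assumptions, not the patent's own naming.

```java
/** Sketch of S604: the raw frame and every intermediate conversion result
 *  alternate between two buffers allocated once (ping-pong buffering). */
class PingPongBuffers {
    private final byte[] bufA;   // first memory space
    private final byte[] bufB;   // second memory space

    PingPongBuffers(int firstTargetCapacity) {
        bufA = new byte[firstTargetCapacity];
        bufB = new byte[firstTargetCapacity];
    }

    byte[] process(byte[] rawFrame) {
        System.arraycopy(rawFrame, 0, bufA, 0, rawFrame.length); // a0 -> first space
        firstColorConversion(bufA, bufB);   // a1 -> second space
        secondColorConversion(bufB, bufA);  // a2 -> first space
        return encode(bufA);                // a3 goes to a separate buffer ("memory 4")
    }

    private void firstColorConversion(byte[] src, byte[] dst)  { /* e.g. NV21 -> I420 */ }
    private void secondColorConversion(byte[] src, byte[] dst) { /* e.g. I420 -> NV12 */ }
    private byte[] encode(byte[] src) { return new byte[0];     /* encoder stub */ }
}
```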
If the width and height of the image frames captured by the tablet D remain unchanged throughout a collaborative video call, the first memory space and the second memory space are not released until the video call ends and the camera is closed.
If the width and height of the image frames captured by the tablet D change during a collaborative video call, the tablet D releases the first memory space and the second memory space and applies for two new memory spaces.
Therefore, the process shown in fig. 7 may be used for the second frame and the subsequent image frames acquired during the collaborative video call.
Fig. 7 is a diagram illustrating another exemplary flow of the memory management method. Referring to fig. 7, during the collaborative video call between the tablet D and the mobile phone B, the method may further include the following steps:
s701, acquiring image frames.
S702, judging whether the data amount corresponding to the image frame is equal to the first target capacity, if so, executing step S703, otherwise, executing step S704.
S703, alternately storing the second image data corresponding to the image frame, and at least one group of second related image data obtained by processing the image frame, in the first memory space and the second memory space, and ending the processing of this frame.
S704, determining a second target capacity according to the length of the image data corresponding to the image frame.
S705, applying for a third memory space and a fourth memory space, wherein the capacity of the third memory space and the capacity of the fourth memory space are equal to the second target capacity.
The first memory space and the second memory space may be released after step S704 and before step S705, or after step S705.
S706, alternately storing the second image data corresponding to the image frame, and at least one group of second related image data obtained by processing the image frame, in the third memory space and the fourth memory space, and ending the processing of this frame.
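Continuing the sketch above, the size check of fig. 7 might look as follows; the function name onFrame and the use of std::unique_ptr are this sketch's own choices, not taken from the text.

```cpp
#include <cstddef>
#include <cstdint>
#include <memory>

// Sketch of the fig. 7 flow (reuses PingPongPipeline from the previous
// sketch): keep the existing spaces while the frame's data amount matches
// the current target capacity; otherwise release them and apply for two
// new spaces sized to the new frame.
void onFrame(std::unique_ptr<PingPongPipeline>& pipeline,
             const uint8_t* raw, std::size_t len) {
    if (!pipeline || pipeline->first.size() != len) {        // S702 / S704
        pipeline = std::make_unique<PingPongPipeline>(len);  // S705: old spaces freed
    }
    pipeline->processFrame(raw, len);  // S703 / S706: alternate storage as before
}
```

When the camera is closed at the end of the call, pipeline.reset() would release the spaces, matching the camera-close release described later.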
For example, after processing the image frame a0 described above, the tablet D continues to acquire the image frame b0.
The tablet D judges whether the data amount corresponding to the image frame b0 is equal to the first target capacity; since it is, the tablet D stores the image frame b0 in the second memory space;
then, the tablet D takes the image frame b0 out of the second memory space, performs the first color conversion processing on it to obtain image data b1, and stores the image data b1 in the first memory space;
then, the tablet D takes the image data b1 out of the first memory space, performs the second color conversion processing on it to obtain image data b2, and stores the image data b2 in the second memory space;
then, the tablet D takes the image data b2 out of the second memory space, encodes it to obtain encoded image data b3, applies for a memory 4, and stores the encoded image data b3 in the memory 4;
finally, the tablet D takes the encoded image data b3 out of the memory 4 and sends it to the mobile phone B.
Now suppose that, after the image frame b0 is processed, the width and/or the height of the images acquired by the tablet D changes, so that the capacity of the memory to be applied for changes as well.
After processing the image frame b0 described above, the tablet D continues to acquire the image frame c0.
The tablet D judges whether the data amount corresponding to the image frame c0 is equal to the first target capacity; since it is not, the tablet D determines the second target capacity according to the length of the image data corresponding to the image frame c0;
the tablet D applies for a third memory space and a fourth memory space, wherein the capacity of the third memory space and the capacity of the fourth memory space are equal to the second target capacity;
the tablet D stores the image frame c0 in the third memory space;
then, the tablet D takes the image frame c0 out of the third memory space, performs the first color conversion processing on it to obtain image data c1, and stores the image data c1 in the fourth memory space;
then, the tablet D takes the image data c1 out of the fourth memory space, performs the second color conversion processing on it to obtain image data c2, and stores the image data c2 in the third memory space;
then, the tablet D takes the image data c2 out of the third memory space, encodes it to obtain encoded image data c3, and stores the encoded image data c3 in the memory 4;
finally, the tablet D takes the encoded image data c3 out of the memory 4 and sends it to the mobile phone B.
And so on.
In one example, when the collaborative video call ends, the memory management method may further include the following steps:
receiving an instruction for closing the camera;
and releasing the first memory space and the second memory space.
According to the memory management method provided by the embodiments of this application, once the memory has been applied for, if subsequent image frames can keep using it (that is, the size of the acquired image frames is unchanged), the memory is not released until the camera is closed after the video call ends. Memory therefore does not need to be applied for and released for every acquired image frame, which reduces the image processing time.
In other words, after a single application the memory can be used many times and is released only when the camera is closed after the video call ends; this avoids the time spent on frequently applying for memory during image frame processing and reduces the processing delay of each frame of image.
An embodiment of this application further provides an electronic device, including a memory and a processor, the memory being coupled to the processor, wherein the memory stores program instructions that, when executed by the processor, cause the electronic device to execute the image frame processing method described above.
An embodiment of this application further provides an electronic device, including a memory and a processor, the memory being coupled to the processor, wherein the memory stores program instructions that, when executed by the processor, cause the electronic device to execute the memory management method described above.
It will be appreciated that the electronic device, in order to achieve the above-described functions, includes corresponding hardware and/or software modules that perform the respective functions. The steps of an algorithm for each example described in connection with the embodiments disclosed herein may be embodied in hardware or a combination of hardware and computer software. Whether a function is implemented as hardware or computer software driven hardware depends upon the particular application and design constraints imposed on the solution. Those skilled in the art may implement the described functionality using different approaches for each particular application in conjunction with the embodiments, but such implementation is not to be considered as outside the scope of this application.
The present embodiment also provides a computer storage medium having stored therein computer instructions which, when executed on an electronic device, cause the electronic device to perform the above-described related method steps to implement the image frame processing method in the above-described embodiments.
The present embodiment also provides a computer storage medium having stored therein computer instructions which, when executed on an electronic device, cause the electronic device to execute the above-mentioned related method steps to implement the memory management method in the above-mentioned embodiments.
The present embodiment also provides a computer program product which, when run on a computer, causes the computer to perform the above-described related steps to implement the image frame processing method in the above-described embodiments.
The present embodiment also provides a computer program product, which when run on a computer, causes the computer to perform the above-mentioned related steps to implement the memory management method in the above-mentioned embodiments.
In addition, the embodiment of the application also provides a device, which can be a chip, a component or a module, and the device can comprise a processor and a memory which are connected; the memory is used for storing computer-executable instructions, and when the device is running, the processor can execute the computer-executable instructions stored in the memory, so that the chip executes the image frame processing method in each method embodiment.
The electronic device, the computer storage medium, the computer program product, or the chip provided in this embodiment are used to execute the corresponding methods provided above, so that the beneficial effects thereof can be referred to the beneficial effects in the corresponding methods provided above, and will not be described herein.
It will be appreciated by those skilled in the art that, for convenience and brevity of description, only the above-described division of the functional modules is illustrated, and in practical application, the above-described functional allocation may be performed by different functional modules according to needs, i.e. the internal structure of the apparatus is divided into different functional modules to perform all or part of the functions described above.
In the several embodiments provided in this application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the apparatus embodiments described above are merely illustrative, e.g., the division of modules or units is merely a logical function division, and there may be additional divisions when actually implemented, e.g., multiple units or components may be combined or integrated into another apparatus, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or units, which may be in electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and the parts shown as units may be one physical unit or a plurality of physical units, may be located in one place, or may be distributed in a plurality of different places. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in each embodiment of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
Any of the embodiments of this application, and any features within the same embodiment, may be freely combined. Any such combination is within the scope of this application.
The integrated units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a readable storage medium. Based on such understanding, the technical solution of the embodiments of the present application may be essentially or a part contributing to the prior art or all or part of the technical solution may be embodied in the form of a software product stored in a storage medium, including several instructions to cause a device (may be a single-chip microcomputer, a chip or the like) or a processor (processor) to perform all or part of the steps of the methods of the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read Only Memory (ROM), a random access memory (random access memory, RAM), a magnetic disk, or an optical disk, or other various media capable of storing program codes.
The steps of a method or algorithm described in connection with the embodiments disclosed herein may be embodied in hardware, or in software instructions executed by a processor. The software instructions may consist of corresponding software modules, which may be stored in random access memory (RAM), flash memory, read-only memory (ROM), erasable programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), registers, a hard disk, a removable disk, a compact disc read-only memory (CD-ROM), or any other form of storage medium known in the art. An exemplary storage medium is coupled to the processor such that the processor can read information from, and write information to, the storage medium. In the alternative, the storage medium may be integral to the processor. The processor and the storage medium may reside in an ASIC.
Those skilled in the art will appreciate that in one or more of the examples described above, the functions described in the embodiments of the present application may be implemented in hardware, software, firmware, or any combination thereof. When implemented in software, these functions may be stored on or transmitted over as one or more instructions or code on a computer-readable medium. Computer-readable media includes both computer storage media and communication media including any medium that facilitates transfer of a computer program from one place to another. A storage media may be any available media that can be accessed by a general purpose or special purpose computer.
The embodiments of the present application have been described above with reference to the accompanying drawings, but the present application is not limited to the above-described embodiments, which are merely illustrative and not restrictive, and many forms may be made by those of ordinary skill in the art without departing from the spirit of the present application and the scope of the claims, which are also within the protection of the present application.

Claims (10)

1. An image frame processing method, characterized in that it is applied to a first electronic device, the first electronic device cooperates with a second electronic device, a video call on the first electronic device is switched to the second electronic device, and the second electronic device collects video image frames and sends the video image frames to the first electronic device; the method comprises the following steps:
Receiving a first image frame sent by a second electronic device, wherein the first image frame belongs to a target application on the first electronic device, and a time stamp corresponding to the first image frame is a first time stamp;
determining, according to the time stamps of a first number of image frames that belong to the target application and were received most recently, a first frame rate at which the first electronic device receives image data, wherein the determining comprises: acquiring a time difference between the first time stamp and a first target time stamp, the first target time stamp being the smallest among the time stamps of the first number of image frames; and determining, according to the time difference and the first number, the first frame rate at which the electronic device currently receives image data;
obtaining a first frame loss rate corresponding to the first image frame according to the first frame rate and a target frame rate, the target frame rate being the frame rate at which the first electronic device sends image data to the target application, wherein the obtaining comprises: acquiring a frame rate difference between the first frame rate and the target frame rate; and determining the first frame loss rate according to the frame rate difference and the first frame rate, the first frame loss rate being equal to the frame rate difference divided by the first frame rate;
Updating an accumulated frame loss rate according to the first frame loss rate;
and discarding the first image frame if the updated accumulated frame loss rate is greater than or equal to a first value.
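As a hedged, self-contained C++ sketch of the frame-dropping logic in claim 1 (with the accumulator updates of claims 2 and 3): the class name, the millisecond timestamps, the choice of first value = 1.0, the rate estimate of intervals over time span, and the clamping of the accumulator at zero are all assumptions of this sketch, not specified by the claims.

```cpp
#include <cstddef>
#include <cstdint>
#include <deque>

// Sketch of claim 1: drop frames so that the delivery rate approaches the
// target rate, by accumulating a per-frame loss rate.
class FrameDropper {
public:
    FrameDropper(double targetFps, std::size_t window)
        : targetFps_(targetFps), window_(window) {}

    // Returns true if the frame with this timestamp (ms) should be discarded.
    bool onFrame(int64_t timestampMs) {
        timestamps_.push_back(timestampMs);
        if (timestamps_.size() > window_) timestamps_.pop_front();
        if (timestamps_.size() < 2) return false;  // not enough history yet

        // First frame rate: derived from the time difference between the
        // newest timestamp and the smallest timestamp in the window, and
        // the number of frames (here counted as intervals) in the window.
        double spanSec = (timestamps_.back() - timestamps_.front()) / 1000.0;
        if (spanSec <= 0.0) return false;
        double receiveFps = static_cast<double>(timestamps_.size() - 1) / spanSec;

        // First frame loss rate = (receive rate - target rate) / receive rate.
        accumulated_ += (receiveFps - targetFps_) / receiveFps;
        if (accumulated_ < 0.0) accumulated_ = 0.0;  // assumption: no negative debt
        if (accumulated_ >= kFirstValue) {
            accumulated_ -= kFirstValue;  // claim 3: subtract the first value
            return true;                  // discard; do not deliver to the app
        }
        return false;                     // deliver to the target application
    }

private:
    static constexpr double kFirstValue = 1.0;  // assumed "first value"
    double targetFps_;
    std::size_t window_;        // the "first number" of recent frames
    double accumulated_ = 0.0;  // accumulated frame loss rate
    std::deque<int64_t> timestamps_;
};
```

For instance, with frames arriving at 60 fps and a target of 30 fps, each frame adds (60 − 30)/60 = 0.5 to the accumulator, so every second frame is discarded and the delivered rate settles at the target.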
2. The method of claim 1, wherein updating the accumulated frame loss rate based on the first frame loss rate comprises:
acquiring a first accumulated frame loss rate corresponding to a previous image frame of the first image frame;
and determining an updated accumulated frame loss rate according to the first accumulated frame loss rate and the first frame loss rate, wherein the updated accumulated frame loss rate is equal to the sum of the first accumulated frame loss rate and the first frame loss rate.
3. The method of claim 1, wherein after discarding the first image frame when the updated accumulated frame loss rate is greater than or equal to the first value, so that the target application does not display the first image frame, the method further comprises:
and updating the value of the accumulated frame loss rate to be the difference between the accumulated frame loss rate and the first value.
4. The method of claim 3, wherein after updating the value of the accumulated frame loss rate to the difference between the accumulated frame loss rate and the first value, the method further comprises:
receiving a second image frame sent by the second electronic device, wherein the second image frame belongs to the target application, and a second timestamp corresponding to the second image frame is larger than the first timestamp;
Determining a second frame rate at which the first electronic device receives image data, according to the time stamps of the first number of image frames that belong to the target application and were received most recently as of the time indicated by the second time stamp;
obtaining a second frame loss rate corresponding to the second image frame according to the second frame rate and the target frame rate;
updating the accumulated frame loss rate according to the second frame loss rate;
and if the updated accumulated frame loss rate is smaller than or equal to a second value, sending the second image frame to the target application, wherein the second value is smaller than the first value.
5. The method of claim 4, wherein after sending the second image frame to the target application so that the target application displays the second image frame, the method further comprises:
and updating the value of the accumulated frame loss rate to be the difference between the accumulated frame loss rate and the second value.
6. The method of claim 3, wherein after updating the value of the accumulated frame loss rate to the difference between the accumulated frame loss rate and the first value, the method further comprises:
receiving a third image frame sent by the second electronic device, wherein the third image frame belongs to the target application, and a third timestamp corresponding to the third image frame is larger than the first timestamp;
Determining a third frame rate of the image data received by the first electronic device according to the time stamp of the first number of image frames which belong to the target application and are received recently at the time indicated by the third time stamp;
obtaining a third frame loss rate corresponding to the third image frame according to the third frame rate and the target frame rate;
updating the accumulated frame loss rate according to the third frame loss rate;
and if the updated accumulated frame loss rate is smaller than the first value and larger than the second value, transmitting the third image frame to the target application.
7. The method of claim 1, wherein the first number is less than a preset value.
8. The method of claim 1, wherein the first electronic device is a smart phone and the second electronic device is a personal computer, tablet, or smart screen.
9. An electronic device, comprising:
a memory and a processor, the memory coupled with the processor;
the memory stores program instructions that, when executed by the processor, cause the electronic device to perform the image frame processing method of any of claims 1-8.
10. A computer readable storage medium comprising a computer program, characterized in that the computer program, when run on an electronic device, causes the electronic device to perform the image frame processing method according to any of claims 1-8.
CN202210859314.1A 2022-07-21 2022-07-21 Image frame processing method and electronic equipment Active CN116055802B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202410161133.0A CN118138836A (en) 2022-07-21 2022-07-21 Image frame processing method and electronic equipment
CN202210859314.1A CN116055802B (en) 2022-07-21 2022-07-21 Image frame processing method and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210859314.1A CN116055802B (en) 2022-07-21 2022-07-21 Image frame processing method and electronic equipment

Related Child Applications (1)

Application Number Title Priority Date Filing Date
CN202410161133.0A Division CN118138836A (en) 2022-07-21 2022-07-21 Image frame processing method and electronic equipment

Publications (2)

Publication Number Publication Date
CN116055802A CN116055802A (en) 2023-05-02
CN116055802B true CN116055802B (en) 2024-03-08

Family

ID=86114046

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202410161133.0A Pending CN118138836A (en) 2022-07-21 2022-07-21 Image frame processing method and electronic equipment
CN202210859314.1A Active CN116055802B (en) 2022-07-21 2022-07-21 Image frame processing method and electronic equipment

Family Applications Before (1)

Application Number Title Priority Date Filing Date
CN202410161133.0A Pending CN118138836A (en) 2022-07-21 2022-07-21 Image frame processing method and electronic equipment

Country Status (1)

Country Link
CN (2) CN118138836A (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117478929B (en) * 2023-12-28 2024-03-08 昆明中经网络有限公司 Novel media exquisite image processing system based on AI large model

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108495142A (en) * 2018-04-11 2018-09-04 腾讯科技(深圳)有限公司 Method for video coding and device
CN108566550A (en) * 2018-07-03 2018-09-21 广州视源电子科技股份有限公司 Aging test method and device and electronic equipment
CN108600794A (en) * 2018-05-21 2018-09-28 深圳市梦网科技发展有限公司 A kind of bearing calibration of frame loss rate, device and terminal
CN108632624A (en) * 2017-12-18 2018-10-09 百富计算机技术(深圳)有限公司 Image processing method, device, terminal device and readable storage medium storing program for executing
CN109451248A (en) * 2018-11-23 2019-03-08 广州酷狗计算机科技有限公司 Processing method, device, terminal and the storage medium of video data
CN110832870A (en) * 2018-10-30 2020-02-21 深圳市大疆创新科技有限公司 Data processing method and equipment and pass-through glasses
CN110913118A (en) * 2018-09-17 2020-03-24 腾讯数码(天津)有限公司 Video processing method, device and storage medium
CN111932463A (en) * 2020-08-26 2020-11-13 腾讯科技(深圳)有限公司 Image processing method, device, equipment and storage medium
CN112822505A (en) * 2020-12-31 2021-05-18 杭州星犀科技有限公司 Audio and video frame loss method, device, system, storage medium and computer equipment
CN113556505A (en) * 2020-04-23 2021-10-26 杭州海康威视数字技术股份有限公司 Data processing method and device, electronic equipment and readable storage medium
CN114493982A (en) * 2022-02-17 2022-05-13 深圳欧克曼技术有限公司 Video processing method and device for preventing frame loss


Also Published As

Publication number Publication date
CN118138836A (en) 2024-06-04
CN116055802A (en) 2023-05-02

Similar Documents

Publication Publication Date Title
WO2022258024A1 (en) Image processing method and electronic device
JP7085014B2 (en) Video coding methods and their devices, storage media, equipment, and computer programs
CN116055786B (en) Method for displaying multiple windows and electronic equipment
CN116320783B (en) Method for capturing images in video and electronic equipment
CN114071197A (en) Screen projection data processing method and device
CN116055802B (en) Image frame processing method and electronic equipment
CN115802146B (en) Method for capturing images in video and electronic equipment
CN115802148B (en) Method for acquiring image and electronic equipment
CN116708753B (en) Method, device and storage medium for determining preview blocking reason
CN116052701B (en) Audio processing method and electronic equipment
CN114945019B (en) Data transmission method, device and storage medium
WO2022193141A1 (en) Multimedia file playing method and related apparatus
CN115460343A (en) Image processing method, apparatus and storage medium
CN114793283A (en) Image encoding method, image decoding method, terminal device, and readable storage medium
CN116055868A (en) Shooting method and related equipment
CN115776532B (en) Method for capturing images in video and electronic equipment
CN117082295B (en) Image stream processing method, device and storage medium
CN115802147B (en) Method for capturing images in video and electronic equipment
WO2022206600A1 (en) Screen projection method and system, and related apparatus
CN116028383B (en) Cache management method and electronic equipment
CN117560574B (en) Shooting method, electronic equipment and readable storage medium
CN117641116B (en) Method for controlling frame rate of camera and electronic equipment
WO2022089621A1 (en) Image frame storage method, photographing method, and electronic device
CN114745542A (en) Encoding method, electronic device, communication system, storage medium, and program product
CN116149870A (en) Screen information reporting method and system, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant