CN112004086B - Video data processing method and device - Google Patents

Video data processing method and device

Info

Publication number
CN112004086B
Authority
CN
China
Prior art keywords
chip
image data
frame
target video
control unit
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010851392.8A
Other languages
Chinese (zh)
Other versions
CN112004086A (en)
Inventor
范泽华
郑超
陈江川
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN202010851392.8A
Publication of CN112004086A
Priority to PCT/CN2021/102574
Application granted
Publication of CN112004086B
Legal status: Active
Anticipated expiration

Classifications

    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00 - Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/10 - Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
    • H04N19/102 - Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or selection affected or controlled by the adaptive coding
    • H04N19/132 - Sampling, masking or truncation of coding units, e.g. adaptive resampling, frame skipping, frame interpolation or high-frequency transform coefficient masking

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Controls And Circuits For Display Device (AREA)
  • Television Systems (AREA)

Abstract

The application discloses a video data processing method and apparatus, applied to an electronic device comprising an AP, a frame insertion chip, a display screen and a memory. The method comprises the following steps: the AP sends image data of a target video to the frame insertion chip; the frame insertion chip performs frame interpolation processing on the received image data of the target video; and the frame insertion chip sends the image data after the target video frame insertion to the display screen and sends target information to the AP, where the target information is used to store the image data after the target video frame insertion into the memory. With the embodiments of the application, a video can be reused after frame insertion, which ultimately reduces power consumption.

Description

Video data processing method and device
Technical Field
The present application relates to the field of electronic technologies, and in particular, to a method and an apparatus for processing video data.
Background
At present, video frame interpolation is performed in real time: frames are interpolated while the video is played and are transmitted synchronously to the display screen for display. To implement real-time video frame interpolation, a frame insertion chip, typically a digital signal processor (DSP), is bridged between the application processor (AP) and the display screen. The interpolated frames are used only for the current playback, and the interpolation must be performed again the next time the video is played, which limits the reuse of a video after frame insertion.
Disclosure of Invention
The embodiment of the application provides a video data processing method and device.
In a first aspect, an embodiment of the present application provides a video data processing method, which is applied to an electronic device, where the electronic device includes an AP, a frame insertion chip, a display screen, and a memory, and the method includes:
the AP sends image data of a target video to the frame insertion chip;
the frame interpolation chip performs frame interpolation processing on the received image data of the target video;
the frame insertion chip sends the image data after the target video frame insertion to the display screen and sends target information to the AP, where the target information is used to store the image data after the target video frame insertion into the memory.
In a second aspect, an embodiment of the present application provides a video data processing apparatus, which is applied to an electronic device including a display screen and a memory, where the apparatus includes an AP and a frame insertion chip;
the AP is used for sending image data of a target video to the frame interpolation chip;
the frame interpolation chip is used for performing frame interpolation processing on the received image data of the target video;
the frame insertion chip is further configured to send the image data after the target video frame insertion to the display screen and to send target information to the AP, where the target information is used to store the image data after the target video frame insertion into the memory.
In a third aspect, an embodiment of the present application provides an electronic device, including a processor, a memory, a communication interface, a display screen, and one or more programs, where the processor includes an AP and a frame insertion chip, the one or more programs are stored in the memory and configured to be executed by the AP and the frame insertion chip, and the programs include instructions for performing the steps of any method of the first aspect of the embodiments of the present application.
In a fourth aspect, the present application provides a computer-readable storage medium, where the computer-readable storage medium stores a computer program for electronic data exchange, where the computer program makes a computer perform part or all of the steps described in any one of the methods of the first aspect of the present application.
In a fifth aspect, the present application provides a computer program product, wherein the computer program product includes a non-transitory computer-readable storage medium storing a computer program, and the computer program is operable to cause a computer to perform some or all of the steps as described in any one of the methods of the first aspect of the embodiments of the present application. The computer program product may be a software installation package.
It can be seen that, in the embodiments of the present application, the AP first sends image data of a target video to the frame insertion chip, and the frame insertion chip performs frame interpolation processing on the received image data. The frame insertion chip then sends the image data after the target video frame insertion to the display screen for display and, at the same time, sends target information to the AP, where the target information is used to store the image data after the target video frame insertion into the memory. This enables the electronic device to subsequently read back the image data after the target video frame insertion, so that a video can be reused after frame insertion, ultimately reducing power consumption.
Drawings
To describe the technical solutions in the embodiments of the present application or the prior art more clearly, the drawings used in the description of the embodiments or the prior art are briefly introduced below. Evidently, the drawings in the following description show only some embodiments of the present application, and a person skilled in the art may derive other drawings from them without creative effort.
Fig. 1 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
Fig. 2 is a schematic diagram of the conventional connection among an AP, a frame insertion chip and a display screen;
Fig. 3 is a schematic diagram of the connection among an AP, a frame insertion chip and a display screen according to an embodiment of the present application;
Fig. 4 is a schematic diagram of a software structure of an electronic device according to an embodiment of the present application;
Fig. 5 is a schematic flowchart of a video data processing method according to an embodiment of the present application;
Fig. 6 is a schematic diagram of determining a transition frame according to an embodiment of the present application;
Fig. 7 is a schematic diagram of another way of determining a transition frame according to an embodiment of the present application;
Fig. 8 is a schematic diagram of yet another way of determining a transition frame according to an embodiment of the present application;
Fig. 9 is a schematic diagram of an example according to an embodiment of the present application;
Fig. 10 is a schematic diagram of another example according to an embodiment of the present application;
Fig. 11 is a schematic structural diagram of a conventional frame interpolation chip;
Fig. 12 is a schematic structural diagram of a frame interpolation chip according to an embodiment of the present application;
Fig. 13 is a schematic structural diagram of a video data processing apparatus according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described below with reference to the accompanying drawings.
The electronic device may be a portable electronic device that also provides other functionality, such as personal digital assistant and/or music player functionality, for example a mobile phone, a tablet computer, or a wearable electronic device with wireless communication capability (such as a smart watch). Exemplary embodiments of the portable electronic device include, but are not limited to, portable electronic devices running an iOS, Android, Microsoft or other operating system. The portable electronic device may also be another portable electronic device, such as a laptop computer. It should also be understood that, in other embodiments, the electronic device may not be a portable electronic device but a desktop computer.
In a first section, the software and hardware operating environment of the technical solution disclosed in the present application is described as follows.
Fig. 1 shows a schematic structural diagram of an electronic device 100. The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a Universal Serial Bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a compass 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a Subscriber Identity Module (SIM) card interface 195, and the like.
It is to be understood that the illustrated structure of the embodiment of the present application does not specifically limit the electronic device 100. In other embodiments of the present application, electronic device 100 may include more or fewer components than shown, or some components may be combined, some components may be split, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 110 may include one or more processing units, such as: the processor 110 may include an Application Processor (AP), a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a video codec, a Digital Signal Processor (DSP), a baseband processor, and the like. Wherein the different processing units may be separate components or may be integrated in one or more processors. In some embodiments, the electronic device 100 may also include one or more processors 110. The processor 110 may generate an operation control signal according to the instruction operation code and the timing signal, and complete the control of instruction fetching and instruction execution. In other embodiments, a memory may also be provided in processor 110 for storing instructions and data.
In some embodiments, processor 110 may include one or more interfaces. The interfaces may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit audio (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a SIM card interface, and/or a USB interface.
It should be understood that the connection relationship between the modules illustrated in the embodiment of the present application is only an exemplary illustration, and does not limit the structure of the electronic device 100. In other embodiments of the present application, the electronic device 100 may also adopt different interface connection manners or a combination of multiple interface connection manners in the above embodiments.
The charging management module 140 is configured to receive charging input from a charger. The power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The mobile communication module 150 may provide a solution for 2G/3G/4G/5G wireless communication applied to the electronic device 100. The wireless communication module 160 may provide solutions for wireless communication applied to the electronic device 100, including a wireless local area network (WLAN) (e.g., a wireless fidelity (Wi-Fi) network), Bluetooth (BT), and the like.
The electronic device 100 implements display functions via the GPU, the display screen 194, and the application processor. The GPU is a microprocessor for image processing, and is connected to the display screen 194 and an application processor. The display screen 194 is used to display images, videos, and the like. The display screen 194 includes a display panel. In some embodiments, the electronic device 100 may include 1 or more display screens 194.
The existing connection relationship between the AP, the frame insertion chip and the display screen is shown in fig. 2, as shown in the figure, the output interface of the AP is connected with the input interface of the frame insertion chip, and the output interface of the frame insertion chip is connected with the input interface of the display screen.
In the present application, the connection relationship between the AP, the frame insertion chip, and the display screen is as shown in fig. 3, as shown in the figure, the output interface of the AP is connected with the input interface of the frame insertion chip, the first output interface of the frame insertion chip is connected with the input interface of the AP, and the second output interface of the frame insertion chip is connected with the input interface of the display screen.
Therefore, in the present application, a read-back channel from the frame insertion chip to the AP is added, so that the frame insertion chip can feed back to the AP the image data that allows the electronic device to subsequently read back the frame-interpolated video. A video that has already undergone frame interpolation therefore does not need to be interpolated again, which further reduces power consumption.
The electronic device 100 may implement a photographing function through the ISP, the camera 193, the video codec, the GPU, the display screen 194, and the application processor, etc. The ISP is used to process the data fed back by the camera 193. In some embodiments, the ISP may be provided in camera 193. The camera 193 is used to capture still images or video. In some embodiments, the electronic device 100 may include 1 or more cameras 193.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to extend the storage capability of the electronic device 100.
Internal memory 121 may be used to store one or more computer programs, including instructions. The processor 110 may execute the instructions stored in the internal memory 121 to enable the electronic device 100 to execute the frame interpolation processing method provided in some embodiments of the present application, as well as various applications, data processing, and the like. In addition, the internal memory 121 may include a high-speed random access memory and may also include a nonvolatile memory. In some embodiments, the processor 110 may cause the electronic device 100 to execute the frame interpolation processing method provided in the embodiments of the present application, as well as other applications and data processing, by executing instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor 110.
The electronic device 100 may implement audio functions, such as music playing and recording, through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the earphone interface 170D, the application processor, and the like.
The sensor module 180 may include at least one sensor, as shown in particular in FIG. 1.
For example, fig. 4 shows a block diagram of a software structure of the electronic device 100. The layered architecture divides the software into several layers, each layer having a clear role and division of labor. The layers communicate with each other through a software interface. In some embodiments, the Android system is divided into four layers, an application layer, an application framework layer, an Android runtime (Android runtime) and system library, and a kernel layer from top to bottom. The application layer may include a series of application packages.
The application layer may include at least one application.
The application framework layer provides an Application Programming Interface (API) and a programming framework for the application program of the application layer. The application framework layer includes a number of predefined functions.
The application framework layer includes a window manager for managing window programs, and a content provider for storing and retrieving data and making it accessible to applications. It also includes a view system comprising visual controls, such as controls for displaying text and controls for displaying pictures; the view system may be used to build applications, and a display interface may be composed of one or more views. The application framework layer further includes a telephony manager for providing communication functions of the electronic device 100, and a resource manager that provides various resources for applications, such as localized strings, icons, pictures, layout files and video files. Finally, it includes a notification manager that allows applications to display notification information in the status bar; such notifications can be used to convey notification-type messages, may disappear automatically after a short stay, and require no user interaction. The notification manager may also present notifications in the form of a chart or scroll-bar text in the system status bar at the top of the screen.
The Android runtime comprises a core library and a virtual machine, and is responsible for scheduling and managing the Android system.
The core library comprises two parts: one part consists of the functions that the Java language needs to call, and the other part is the core library of Android.
The application layer and the application framework layer run in the virtual machine.
The system library may include a plurality of functional modules, for example: a surface manager, media libraries, a three-dimensional graphics processing library (e.g., OpenGL ES), and a 2D graphics engine (e.g., SGL).
The kernel layer is a layer between hardware and software. The kernel layer comprises at least a display driver, a camera driver, an audio driver, and a sensor driver.
In the second section, the technical solutions disclosed in the embodiments of the present application are described below.
Referring to fig. 5, fig. 5 is a schematic flowchart of a video data processing method applied to the electronic device according to an embodiment of the present disclosure.
Step 501: the AP sends the image data of the target video to the frame insertion chip.
Step 502: the frame insertion chip performs frame interpolation processing on the received image data of the target video.
Step 503: the frame insertion chip sends the image data after the target video frame insertion to the display screen and sends target information to the AP, where the target information is used to store the image data after the target video frame insertion into the memory.
Optionally, after step 503, the method further comprises: after receiving the image data after the target video frame insertion, the display screen performs display processing.
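For illustration, the data path of steps 501 to 503 (together with the optional display step) can be sketched as follows. This is a minimal sketch under assumed interfaces: the class names, method names and the stub interpolate function are hypothetical stand-ins for the hardware behavior described above, not an actual driver API.

```python
# Minimal sketch of steps 501-503; all names here are illustrative.

def interpolate(frames):
    """Stub for the chip's video frame interpolation algorithm."""
    return frames  # placeholder: a real chip inserts transition frames


class Display:
    def show(self, frames):
        print(f"displaying {len(frames)} frames")


class Memory:
    def __init__(self):
        self.data = {}

    def store(self, key, frames):
        self.data[key] = frames


class FrameInsertionChip:
    def __init__(self, display, ap):
        self.display, self.ap = display, ap

    def process(self, frames):
        out = interpolate(frames)           # step 502: frame interpolation
        self.display.show(out)              # step 503: to the display screen
        self.ap.receive_target_info(out)    # step 503: target info to the AP


class AP:
    def __init__(self, memory):
        self.memory = memory

    def send_video(self, chip, frames):     # step 501
        chip.process(frames)

    def receive_target_info(self, interpolated):
        # Target information here is the interpolated image data itself;
        # storing it lets a later playback skip re-interpolation.
        self.memory.store("target_video", interpolated)


mem = Memory()
ap = AP(mem)
chip = FrameInsertionChip(Display(), ap)
ap.send_video(chip, ["frame0", "frame1"])   # prints: displaying 2 frames
```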
The target video may be a video downloaded by the electronic device from a video source provider, or a video captured by the electronic device through the camera module.
Optionally, the frame interpolation processing includes: the frame insertion chip performs frame interpolation processing based on a video frame interpolation algorithm.
In one embodiment, the video frame interpolation algorithm is as follows: determining N first feature points based on a current frame image and a next frame image, where the first feature points are pixel points of the next frame image that have changed relative to the current frame image, and N is a positive integer;
determining a motion vector of each first feature point to obtain N first motion vectors;
determining a first transition frame image based on the current frame image, the next frame image and the N first motion vectors;
inserting the first transition frame image between the current frame image and the next frame image.
Optionally, the determining a motion vector of each first feature point to obtain N first motion vectors includes:
determining a motion vector of each first feature point based on the pixel information of the first feature point in the current frame image and the pixel information of the first feature point in the next frame image, to obtain the N first motion vectors.
The pixel information of a first feature point comprises the position information and brightness information of the first feature point in the image.
The next frame image is the frame image immediately following the current frame image.
Optionally, the determining a first transition frame image based on the current frame image, the next frame image and the N first motion vectors includes:
determining pixel information of the first transition frame image based on the current frame image, the next frame image, and the N first motion vectors;
generating the first transition frame image based on pixel information of the first transition frame image.
For example, assume that the current frame image and the next frame image of the operation interface are as shown in fig. 6, with 4 first feature points (the 2 × 2 gray squares in the figure). If the position information of the 4 first feature points in the current frame image is (a11, b11), (a12, b12), (a13, b13) and (a14, b14), respectively, and the position information of the 4 first feature points in the next frame image is (a21, b21), (a22, b22), (a23, b23) and (a24, b24), respectively, then the motion vectors of the 4 first feature points are (a21-a11, b21-b11), (a22-a12, b22-b12), (a23-a13, b23-b13) and (a24-a14, b24-b14), respectively. Finally, the first transition frame image can be generated from the current frame image, the next frame image and the determined motion vectors, as shown in fig. 6.
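Under the assumptions that the matched feature-point positions in both frames are already known (the matching step itself is not sketched) and that frames are grayscale NumPy arrays, the construction of the first transition frame image can be illustrated roughly as follows; the half-way placement corresponds to the (a21-a11, b21-b11)-style motion vectors above.

```python
import numpy as np

def first_transition_frame(cur, nxt, pts_cur, pts_nxt):
    """Sketch of the first variant: move each feature point half-way
    along its motion vector between the current and next frames.

    cur, nxt : H x W grayscale frames (np.uint8)
    pts_cur, pts_nxt : N x 2 integer arrays of matched (row, col)
        positions of the N first feature points in each frame.
    """
    motion = pts_nxt - pts_cur                       # N first motion vectors
    trans = cur.copy()
    for (r, c), (dr, dc) in zip(pts_cur, motion):
        trans[r, c] = nxt[r, c]                      # vacate the old position
        trans[r + dr // 2, c + dc // 2] = cur[r, c]  # place at the midpoint
    return trans

# One feature moving 4 pixels to the right appears 2 pixels along the
# way in the transition frame, as in fig. 6.
cur = np.zeros((8, 8), np.uint8)
nxt = np.zeros((8, 8), np.uint8)
cur[1, 1] = 255
nxt[1, 5] = 255
mid = first_transition_frame(cur, nxt, np.array([[1, 1]]), np.array([[1, 5]]))
assert mid[1, 3] == 255
```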
In another embodiment, the video frame interpolation algorithm is as follows: determining M second feature points based on a current frame image and the previous K frame images of the current frame image, where the second feature points are pixel points of the current frame image that have changed relative to the previous K frame images, and M and K are positive integers;
determining a motion vector of each second feature point to obtain M second motion vectors;
determining a second transition frame image based on the current frame image and the M second motion vectors;
inserting the second transition frame image between the current frame image and a next frame image.
The previous frame image is the frame image immediately preceding the current frame image.
Optionally, if K = 1, the determining a motion vector of each second feature point to obtain M second motion vectors includes: determining the motion vector of each second feature point based on the pixel information of the second feature point in the current frame image and the pixel information of the second feature point in the previous frame image, to obtain the M second motion vectors.
The pixel information of a second feature point comprises the position information and brightness information of the second feature point in the image.
Optionally, if K ≥ 2, the determining a motion vector of each second feature point to obtain M second motion vectors includes: determining the motion vector of each second feature point based on the pixel information of the second feature point in the current frame image and the pixel information of the second feature point in a target image, to obtain the M second motion vectors, where the target image is one of the previous K frame images and is separated from the current frame image by W frame images, with W = [(K + 2)/2] - 2.
For example, assume that K = 1 and that the current frame image and the previous frame image of the operation interface are as shown in fig. 7, with 4 second feature points (the 2 × 2 gray squares in the figure). If the position information of the 4 second feature points in the previous frame image is (c11, d11), (c12, d12), (c13, d13) and (c14, d14), respectively, and the position information of the 4 second feature points in the current frame image is (c21, d21), (c22, d22), (c23, d23) and (c24, d24), respectively, then the motion vectors of the 4 second feature points are (c21-c11, d21-d11), (c22-c12, d22-d12), (c23-c13, d23-d13) and (c24-c14, d24-d14), respectively. Finally, the second transition frame image can be generated from the current frame image, the previous frame image and the determined motion vectors, as shown in fig. 7.
For another example, assume that K = 4 and that the current frame image and the target image (which precedes the current frame image and is separated from it by 1 frame image) of the operation interface are as shown in fig. 8, with 4 second feature points (the 2 × 2 gray squares in the figure). If the position information of the 4 second feature points in the target image is (e11, f11), (e12, f12), (e13, f13) and (e14, f14), respectively, and the position information of the 4 second feature points in the current frame image is (e21, f21), (e22, f22), (e23, f23) and (e24, f24), respectively, then the motion vectors of the 4 second feature points are [(e21-e11)/2, (f21-f11)/2], [(e22-e12)/2, (f22-f12)/2], [(e23-e13)/2, (f23-f13)/2] and [(e24-e14)/2, (f24-f14)/2], respectively. Finally, the second transition frame image can be generated from the current frame image, the target image and the determined motion vectors, as shown in fig. 8.
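A corresponding sketch for this second variant, under the same assumptions (matched positions given, grayscale NumPy frames). The span term encodes the two worked examples above: the raw motion vector for K = 1, and division by W + 1 = 2 for K = 4. The background fill from the reference frame is our own assumption, not specified in the text.

```python
import numpy as np

def second_transition_frame(cur, ref, pts_ref, pts_cur, k):
    """Sketch of the second variant: extrapolate the motion observed
    between a reference frame and the current frame one frame forward.

    ref is the previous frame when k == 1; otherwise it is the target
    image separated from the current frame by W = [(k + 2)/2] - 2
    frames, i.e. W + 1 frame steps.
    """
    span = 1 if k == 1 else (k + 2) // 2 - 1      # W + 1 frame steps
    per_frame = (pts_cur - pts_ref) // span       # M second motion vectors
    trans = cur.copy()
    for (r, c), (dr, dc) in zip(pts_cur, per_frame):
        trans[r, c] = ref[r, c]                   # assumed background fill
        trans[r + dr, c + dc] = cur[r, c]         # one frame of motion
    return trans

# K = 4 example from fig. 8: a displacement of 4 pixels over 2 frame
# steps gives a per-frame step of 2.
cur = np.zeros((8, 8), np.uint8)
tgt = np.zeros((8, 8), np.uint8)
tgt[2, 1] = 255
cur[2, 5] = 255
nxt_est = second_transition_frame(
    cur, tgt, np.array([[2, 1]]), np.array([[2, 5]]), k=4)
assert nxt_est[2, 7] == 255
```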
It can be seen that, in the embodiments of the present application, the AP first sends image data of a target video to the frame insertion chip, and the frame insertion chip performs frame interpolation processing on the received image data. The frame insertion chip then sends the image data after the target video frame insertion to the display screen for display and, at the same time, sends target information to the AP, where the target information is used to store the image data after the target video frame insertion into the memory. This enables the electronic device to subsequently read back the image data after the target video frame insertion, so that a video can be reused after frame insertion, ultimately reducing power consumption.
In an implementation manner of the present application, the target information is identification information used to indicate that the target video has been subjected to frame interpolation processing, and after the frame insertion chip sends the target information to the AP, the method further includes:
the AP acquires the image data after the target video frame insertion from a service device and stores the image data after the target video frame insertion into the memory.
Optionally, before the AP acquires the image data after the target video frame insertion from the service device, the method further includes:
the AP sends the target video to the service device in a target time period so that the service device performs frame interpolation processing on the target video, where the target time period is a non-working time period or an idle time period of the electronic device.
In an embodiment, the service device is integrated in the electronic device and includes an algorithm module and a storage module, the algorithm module storing the video frame interpolation algorithm.
The service device may be integrated in the processor, or may be a module independent of the processor.
Specifically, as shown in fig. 9, the AP transmits the target video to the service device; after receiving the target video, the service device performs frame interpolation processing on it based on the video frame interpolation algorithm to obtain the image data after the target video frame insertion; the service device stores this image data in its storage module; and the AP acquires the image data after the target video frame insertion from the storage module of the service device and stores it in the memory.
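As a rough sketch of this identification-information variant: the ServiceDevice interface, the is_idle_period check and the run_interpolation stub below are hypothetical names for the behavior just described, not an actual API.

```python
def is_idle_period():
    """Stand-in for the electronic device's idle-period check."""
    return True

def run_interpolation(frames):
    """Stand-in for the service device's frame interpolation algorithm."""
    return frames

class ServiceDevice:
    """Service device with an algorithm module and a storage module."""
    def __init__(self):
        self.storage_module = {}

    def interpolate_and_store(self, video_id, frames):
        self.storage_module[video_id] = run_interpolation(frames)

    def fetch(self, video_id):
        return self.storage_module[video_id]

def handle_identification_info(ap_memory, service, video_id, frames):
    # The chip only flagged that the video was interpolated; the heavy
    # work is deferred to an idle period, off the display path.
    if is_idle_period():
        service.interpolate_and_store(video_id, frames)
        ap_memory[video_id] = service.fetch(video_id)

memory = {}
handle_identification_info(memory, ServiceDevice(), "v1", ["f0", "f1"])
assert "v1" in memory
```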
It can be seen that, in the embodiment of the present application, after performing frame insertion processing on a target video, a frame insertion chip feeds back target information to an AP, and then after receiving the target information, the AP sends the target video to a service device in an idle period, so that the service device performs frame insertion processing on the target video in the idle period, and finally the AP acquires image data after frame insertion of the target video from the service device and stores the image data in a memory, so that the video can be reused after frame insertion, and finally the purpose of reducing power consumption is achieved.
In another embodiment, the service device is a cloud server that includes a cloud algorithm and a cloud storage, where the cloud algorithm includes the video frame interpolation algorithm and the cloud storage is used for storing the image data after the target video frame insertion.
Specifically, as shown in fig. 10, the AP sends the target video to the cloud server; the cloud server performs frame interpolation processing on the target video based on the video frame interpolation algorithm to obtain the image data after the target video frame insertion; the cloud server stores this image data in its cloud storage; and the AP acquires the image data after the target video frame insertion from the cloud storage of the cloud server and stores it in the memory.
It can be seen that, in this embodiment, after performing frame interpolation processing on the target video, the frame insertion chip feeds back identification information to the AP. After receiving the identification information, the AP sends the target video to the cloud server in an idle period, so that the cloud server performs frame interpolation processing on the target video during that period. Finally, the AP acquires the image data after the target video frame insertion from the cloud server and stores it in the memory, so that the video can be reused after frame insertion, ultimately reducing power consumption.
In an implementation manner of the present application, the target information is the image data after the target video frame insertion, and after the frame insertion chip sends the target information to the AP, the method further includes: the AP stores the image data after the target video frame insertion into the memory.
It can be seen that, in this embodiment, the frame insertion chip sends the image data after the video frame insertion to the display screen and, at the same time, sends the image data after the target video frame insertion to the AP, so that the AP stores it in the memory. The video can therefore be reused after frame insertion, ultimately reducing power consumption.
In an implementation manner of the present application, after the AP stores the image data after the target video frame insertion in the memory, the method further includes:
when a playing instruction for the target video is detected, the AP acquires the image data after the target video frame insertion from the memory and sends it to the frame insertion chip;
the frame insertion chip sends the received image data after the target video frame insertion to the display screen;
after receiving the image data after the target video frame insertion, the display screen performs display processing.
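A rough sketch of this replay path; the cache lookup and the pass-through and fallback helpers are illustrative names under assumed interfaces, not a real chip API.

```python
def chip_forward(frames, display):
    """The chip relays already-interpolated frames unchanged."""
    display.append(frames)

def chip_interpolate_and_display(frames, display):
    """Placeholder for the normal interpolation flow of steps 501-503."""
    display.append(["interpolated"] + frames)

def on_play_instruction(memory, display, video_id, raw_frames):
    cached = memory.get(video_id)          # read-back from the memory
    if cached is not None:
        chip_forward(cached, display)      # no re-interpolation needed
    else:
        chip_interpolate_and_display(raw_frames, display)

# A repeat playback hits the cache and skips interpolation entirely.
screen = []
on_play_instruction({"v1": ["f0", "mid", "f1"]}, screen, "v1", ["f0", "f1"])
assert screen == [["f0", "mid", "f1"]]
```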
It can be seen that, in this embodiment, when the electronic device detects a playing instruction for the target video again, the frame insertion chip does not need to perform frame interpolation processing; the image data after the target video frame insertion is transmitted directly to the display screen for display, which reduces the operations of the frame insertion chip and further reduces the power consumption of the electronic device.
The structure of an existing frame interpolation chip is shown in fig. 11; it includes an input control unit, a compression processing unit and an output control unit, where the input control unit is connected with the compression processing unit, and the compression processing unit is connected with the output control unit.
In an implementation manner of the present application, as shown in fig. 12, the frame interpolation chip includes an input control unit, a compression processing unit, an output control unit, and a storage control unit;
the frame inserting device comprises an input control unit, a compression processing unit, an output control unit, a storage control unit and a memory, wherein an input interface of a frame inserting chip is connected with the input control unit, the input control unit is connected with the compression processing unit, the compression processing unit is respectively connected with the output control unit and the storage control unit, the output control unit is respectively connected with a first output interface of the frame inserting chip, a second output interface of the frame inserting chip and the storage control unit, and the storage control unit is connected with the memory.
Further, the storage control unit is also connected with the storage.
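The connection relationships of fig. 12 can be restated as a small adjacency map. The unit names follow the description above; the dictionary representation itself is only an illustration of the topology, not part of the chip.

```python
# Data paths inside the modified frame insertion chip of fig. 12.
FRAME_CHIP_TOPOLOGY = {
    "input_interface": ["input_control_unit"],
    "input_control_unit": ["compression_processing_unit"],
    "compression_processing_unit": ["output_control_unit",
                                    "storage_control_unit"],  # added path
    "output_control_unit": ["first_output_interface",    # read-back to AP
                            "second_output_interface",   # to display screen
                            "storage_control_unit"],     # added path
    "storage_control_unit": ["memory"],
}
```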
Further, the electronic equipment further comprises a shooting module, and the target video is a video shot by the shooting module in real time.
It can be seen that, in the embodiments of the present application, a storage control unit is added to the frame insertion chip, extending the chip from storing only 1 to 2 frames of content to storing more content. In addition, a path from the compression processing unit to the storage control unit and a path from the output control unit to the storage control unit are added, so that the storage control unit can not only communicate with the compression processing unit but also cooperate with the output control unit, improving the functionality of the frame insertion chip.
It will be appreciated that, in order to implement the above functions, the electronic device comprises corresponding hardware and/or software modules for performing each function. The present application can be implemented in hardware, or in a combination of hardware and computer software, in conjunction with the exemplary algorithm steps described in connection with the embodiments disclosed herein. Whether a function is performed by hardware or by computer software driving hardware depends on the particular application and the design constraints of the technical solution. Skilled artisans may implement the described functionality in different ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In this embodiment, the electronic device may be divided into functional modules according to the above method example, for example, each functional module may be divided corresponding to each function, or two or more functions may be integrated into one processing module. The integrated module may be implemented in the form of hardware. It should be noted that the division of the modules in this embodiment is schematic, and is only a logic function division, and there may be another division manner in actual implementation.
In the case of dividing each functional module with corresponding functions, fig. 13 shows a schematic diagram of a video data processing apparatus applied to an electronic device including a display screen and a memory, as shown in fig. 13, the video data processing apparatus may include: an AP1301 and an interpolation chip 1302.
Among other things, the AP 1301 may be used to support the electronic device in performing step 501 described above, and/or other processes of the techniques described herein.
The frame insertion chip 1302 may be used to support the electronic device in performing steps 502 and 503 described above, and/or other processes of the techniques described herein.
It should be noted that all relevant contents of each step related to the above method embodiment may be referred to the functional description of the corresponding functional module, and are not described herein again.
The electronic device provided by this embodiment is used to execute the above video data processing method and can therefore achieve the same effects as the implementations described above.
In the case of an integrated unit, the electronic device may comprise a processing module, a storage module and a communication module. The processing module may be configured to control and manage the actions of the electronic device, for example to support the electronic device in performing the steps performed by the AP 1301 and the frame insertion chip 1302. The storage module may be used to support the electronic device in storing program code, data, and the like. The communication module may be used to support communication between the electronic device and other devices.
The processing module may be a processor or a controller. It may implement or execute the various illustrative logical blocks, modules, and circuits described in connection with this disclosure. The processor may also be a combination that implements computing functions, for example a combination of one or more microprocessors, or a combination of a digital signal processor (DSP) and a microprocessor. The storage module may be a memory. The communication module may specifically be a radio frequency circuit, a Bluetooth chip, a Wi-Fi chip, or another device that interacts with other electronic devices.
In an embodiment, when the processing module is a processor and the storage module is a memory, the electronic device according to this embodiment may be a device having the structure shown in fig. 1.
The present embodiment also provides a computer storage medium, in which computer instructions are stored, and when the computer instructions are run on an electronic device, the electronic device is caused to execute the relevant method steps to implement the video data processing method in the foregoing embodiment.
The present embodiment also provides a computer program product, which when running on a computer, causes the computer to execute the relevant steps described above, so as to implement the video data processing method in the above embodiments.
In addition, an apparatus, which may specifically be a chip, a component or a module, may include a processor and a memory connected to each other, where the memory is used to store computer-executable instructions. When the apparatus runs, the processor may execute the computer-executable instructions stored in the memory, so that the chip executes the video data processing method in the above method embodiments.
The electronic device, the computer storage medium, the computer program product, or the chip provided in this embodiment are all configured to execute the corresponding method provided above, and therefore, the beneficial effects that can be achieved by the electronic device, the computer storage medium, the computer program product, or the chip may refer to the beneficial effects in the corresponding method provided above, and are not described herein again.
Through the description of the foregoing embodiments, those skilled in the art will understand that, for convenience and brevity of description, only the division of the functional modules is illustrated. In practical applications, the above functions may be allocated to different functional modules as needed; that is, the internal structure of the device may be divided into different functional modules to complete all or part of the functions described above.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other manners. For example, the described apparatus embodiments are merely illustrative: the division into modules or units is only a division by logical function, and there may be other division manners in actual implementation; for example, a plurality of units or components may be combined or integrated into another apparatus, or some features may be omitted or not executed. In addition, the shown or discussed mutual couplings or direct couplings or communication connections may be indirect couplings or communication connections through some interfaces, devices or units, and may be electrical, mechanical or in other forms.
Units described as separate parts may or may not be physically separate, and parts displayed as units may be one physical unit or a plurality of physical units, may be located in one place, or may be distributed to a plurality of different places. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a readable storage medium. Based on such understanding, the technical solutions of the embodiments of the present application, in essence or in the part contributing to the prior art, or in whole or in part, may be embodied in the form of a software product. The software product is stored in a storage medium and includes several instructions for enabling a device (which may be a single-chip microcomputer, a chip, or the like) or a processor to execute all or part of the steps of the methods of the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
The above description is only for the specific embodiments of the present application, but the scope of the present application is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present application, and shall be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (8)

1. A video data processing method, applied to an electronic device comprising an application processor (AP), a frame insertion chip, a display screen and a memory, the method comprising:
the AP sends image data of a target video to the frame insertion chip;
the frame interpolation chip performs frame interpolation processing on the received image data of the target video;
the frame insertion chip sends the image data after the target video frame insertion to the display screen and sends target information to the AP, wherein the target information is used to store the image data after the target video frame insertion into the memory, and the target information is the image data after the target video frame insertion;
the frame insertion chip comprises an input control unit, a compression processing unit, an output control unit and a storage control unit; an input interface of the frame insertion chip is connected with the input control unit, the input control unit is connected with the compression processing unit, the compression processing unit is connected with the output control unit and the storage control unit, respectively, the output control unit is connected with a first output interface of the frame insertion chip, a second output interface of the frame insertion chip and the storage control unit, respectively, and the storage control unit is connected with the memory.
2. The method of claim 1, wherein after the frame insertion chip sends the target information to the AP, the method further comprises: the AP stores the image data after the target video frame insertion into the memory.
3. The method of claim 1 or 2, wherein after the AP stores the image data after the target video frame insertion in the memory, the method further comprises:
when a playing instruction of the target video is detected, the AP acquires the image data after the target video is inserted from the memory and sends the image data after the target video is inserted to the frame insertion chip;
the frame inserting chip sends the received image data after the target video frame inserting to the display screen;
after receiving the image data after the target video frame insertion, the display screen performs display processing.
4. The method of claim 1, wherein the electronic device further comprises a shooting module, and the target video is a video shot by the shooting module in real time.
5. The method according to any one of claims 1-4, wherein an output interface of the AP is connected with an input interface of the frame insertion chip, a first output interface of the frame insertion chip is connected with the input interface of the AP, and a second output interface of the frame insertion chip is connected with an input interface of the display screen.
6. A video data processing apparatus, applied to an electronic device comprising a display screen and a memory, the apparatus comprising an application processor (AP) and a frame insertion chip, wherein:
the AP is used for sending image data of a target video to the frame interpolation chip;
the frame interpolation chip is used for performing frame interpolation processing on the received image data of the target video;
the frame insertion chip is further configured to send the image data after the target video frame insertion to the display screen and to send target information to the AP, where the target information is used to store the image data after the target video frame insertion into the memory, and the target information is the image data after the target video frame insertion;
the frame insertion chip comprises an input control unit, a compression processing unit, an output control unit and a storage control unit; an input interface of the frame insertion chip is connected with the input control unit, the input control unit is connected with the compression processing unit, the compression processing unit is connected with the output control unit and the storage control unit, respectively, the output control unit is connected with a first output interface of the frame insertion chip, a second output interface of the frame insertion chip and the storage control unit, respectively, and the storage control unit is connected with the memory.
7. An electronic device comprising a processor, a memory, a communication interface, a display screen, and one or more programs, the processor comprising an application processor (AP) and a frame insertion chip, the one or more programs being stored in the memory and configured to be executed by the AP and the frame insertion chip, the programs comprising instructions for performing the steps of the method of any one of claims 1-5.
8. A computer-readable storage medium, storing a computer program for electronic data exchange, wherein the computer program causes a computer to perform the method according to any one of claims 1-5.
CN202010851392.8A 2020-08-21 2020-08-21 Video data processing method and device Active CN112004086B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202010851392.8A CN112004086B (en) 2020-08-21 2020-08-21 Video data processing method and device
PCT/CN2021/102574 WO2022037251A1 (en) 2020-08-21 2021-06-26 Video data processing method and apparatus

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010851392.8A CN112004086B (en) 2020-08-21 2020-08-21 Video data processing method and device

Publications (2)

Publication Number Publication Date
CN112004086A CN112004086A (en) 2020-11-27
CN112004086B (en) 2022-11-11

Family

ID=73473102

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010851392.8A Active CN112004086B (en) 2020-08-21 2020-08-21 Video data processing method and device

Country Status (2)

Country Link
CN (1) CN112004086B (en)
WO (1) WO2022037251A1 (en)

Families Citing this family (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112004086B (en) * 2020-08-21 2022-11-11 Oppo广东移动通信有限公司 Video data processing method and device
CN113835657A (en) * 2021-09-08 2021-12-24 维沃移动通信有限公司 Display method and electronic equipment
CN113852776B (en) * 2021-09-08 2024-06-04 维沃移动通信有限公司 Frame inserting method and electronic equipment
CN114285956A (en) * 2021-12-28 2022-04-05 维沃移动通信有限公司 Video sharing circuit, method and device and electronic equipment
CN114338953A (en) * 2021-12-28 2022-04-12 维沃移动通信有限公司 Video processing circuit, video processing method and electronic device
CN114338954A (en) * 2021-12-28 2022-04-12 维沃移动通信有限公司 Video generation circuit, method and electronic equipment
CN114285959A (en) * 2021-12-28 2022-04-05 维沃移动通信有限公司 Image processing circuit, method and device, electronic equipment and chip
CN114285978A (en) * 2021-12-28 2022-04-05 维沃移动通信有限公司 Video processing method, video processing device and electronic equipment
CN114285958B (en) * 2021-12-28 2024-07-19 维沃移动通信有限公司 Image processing circuit, image processing method, and electronic apparatus
CN114443894A (en) * 2022-01-05 2022-05-06 荣耀终端有限公司 Data processing method and device, electronic equipment and storage medium
CN114500853A (en) * 2022-02-25 2022-05-13 维沃移动通信有限公司 Electronic device and image display method
CN117111864A (en) * 2022-05-17 2023-11-24 荣耀终端有限公司 Multi-screen data processing method, electronic device and readable storage medium
CN115514902A (en) * 2022-09-09 2022-12-23 维沃移动通信有限公司 Image processing circuit and electronic device
CN116886996B (en) * 2023-09-06 2023-12-01 浙江富控创联技术有限公司 Digital village multimedia display screen broadcasting system

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110874128A (en) * 2018-08-31 2020-03-10 上海瑾盛通信科技有限公司 Visualized data processing method and electronic equipment

Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI326433B (en) * 2006-09-19 2010-06-21 Ind Tech Res Inst Method for saving interpolation data
JP4438795B2 (en) * 2006-12-28 2010-03-24 株式会社日立製作所 Video conversion device, video display device, and video conversion method
US8244713B2 (en) * 2007-07-12 2012-08-14 International Business Machines Corporation Content management system that retrieves data from an external data source and creates one or more objects in the repository
CN101977218B (en) * 2010-10-20 2013-11-27 深圳市融创天下科技股份有限公司 Internet playing file transcoding method and system
CN104469241B (en) * 2014-11-28 2018-01-16 中国航空无线电电子研究所 A kind of device for realizing video frame rate conversion
US10904535B2 (en) * 2017-04-01 2021-01-26 Intel Corporation Video motion processing including static scene determination, occlusion detection, frame rate conversion, and adjusting compression ratio
CN108600783B (en) * 2018-04-23 2021-03-30 深圳齐心好视通云计算有限公司 Frame rate adjusting method and device and terminal equipment
CN111083417B (en) * 2019-12-10 2021-10-19 Oppo广东移动通信有限公司 Image processing method and related product
CN111050149A (en) * 2019-12-24 2020-04-21 苏州乐梦光电科技有限公司 Video processing method, device and equipment for projection system and storage medium
CN111147787B (en) * 2019-12-27 2021-05-04 Oppo广东移动通信有限公司 Method for processing interpolation frame and related equipment
CN111225150B (en) * 2020-01-20 2021-08-10 Oppo广东移动通信有限公司 Method for processing interpolation frame and related product
CN112004086B (en) * 2020-08-21 2022-11-11 Oppo广东移动通信有限公司 Video data processing method and device

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110874128A (en) * 2018-08-31 2020-03-10 上海瑾盛通信科技有限公司 Visualized data processing method and electronic equipment

Also Published As

Publication number Publication date
WO2022037251A1 (en) 2022-02-24
CN112004086A (en) 2020-11-27

Similar Documents

Publication Publication Date Title
CN112004086B (en) Video data processing method and device
CN108401124B (en) Video recording method and device
CN113726950B (en) Image processing method and electronic equipment
CN113254120B (en) Data processing method and related device
CN116501210B (en) Display method, electronic equipment and storage medium
CN113556598A (en) Multi-window screen projection method and electronic equipment
CN110297917B (en) Live broadcast method and device, electronic equipment and storage medium
CN111813490A (en) Method and device for processing interpolation frame
CN112767231B (en) Layer composition method and device
CN112328941A (en) Application screen projection method based on browser and related device
CN116052618B (en) Screen refresh rate switching method and electronic equipment
CN116055786A (en) Method for displaying multiple windows and electronic equipment
CN113225616A (en) Video playing method and device, computer equipment and readable storage medium
CN114040252B (en) Display frame rate control method and device, computer readable medium and electronic equipment
CN117631950A (en) Split screen display method and related device
CN115686403A (en) Display parameter adjusting method, electronic device, chip and readable storage medium
CN112260845B (en) Method and device for accelerating data transmission
CN115437723A (en) Application scene fast switching method and device, electronic equipment and storage medium
CN115086888A (en) Message notification method and device and electronic equipment
CN116077940B (en) Drawing processing method and related device in game application
CN117695626B (en) Game data identification method, equipment and storage medium
CN116664375B (en) Image prediction method, device, equipment and storage medium
CN114168096B (en) Display method and system of output picture, mobile terminal and storage medium
CN117082295B (en) Image stream processing method, device and storage medium
CN117724779A (en) Method for generating interface image and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant