CN110874128A - Visualized data processing method and electronic equipment
- Publication number: CN110874128A
- Application number: CN201811010330.3A
- Authority: CN (China)
- Prior art keywords: chip, visual data, frame image, target, frame
- Legal status: Granted
Classifications
- G—PHYSICS
  - G06—COMPUTING; CALCULATING OR COUNTING
    - G06F—ELECTRIC DIGITAL DATA PROCESSING
      - G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
        - G06F1/26—Power supply means, e.g. regulation thereof
          - G06F1/32—Means for saving power
            - G06F1/3203—Power management, i.e. event-based initiation of a power-saving mode
              - G06F1/3234—Power saving characterised by the action undertaken
- G—PHYSICS
  - G06—COMPUTING; CALCULATING OR COUNTING
    - G06F—ELECTRIC DIGITAL DATA PROCESSING
      - G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
        - G06F3/14—Digital output to display device; Cooperation and interconnection of the display device with other functional units
          - G06F3/1407—General aspects irrespective of display type, e.g. determination of decimal point position, display with fixed or driving decimal point, suppression of non-significant zeros
Abstract
The present application relates to a visualized data processing method and an electronic device. In the method, a first chip transmits to-be-processed visual data to a second chip that has lower power consumption than the first chip; the second chip performs frame interpolation on the to-be-processed visual data to obtain target visual data and transmits the target visual data to a display screen; and the display screen outputs the target visual data. Because the visual data are handed to the lower-power second chip for frame interpolation, power consumption can be reduced.
Description
Technical Field
The present application relates to the field of computer technologies, and in particular, to a visualized data processing method and an electronic device.
Background
With the development of computer technology, frame interpolation has emerged: the frame rate of visual data is raised by increasing the number of frames it contains, which avoids smearing on the screen while the displayed content changes. At present, frame interpolation mainly relies on motion-estimation algorithms; such algorithms must handle many special cases, are complex, and therefore have the problem of high power consumption.
Disclosure of Invention
Embodiments of the present application provide a visual data processing method and an electronic device that can reduce power consumption.
A visual data processing method, comprising:
transmitting the visual data to be processed to a second chip through the first chip;
performing frame interpolation processing on the visual data to be processed through the second chip to obtain target visual data, and transmitting the target visual data to a display screen;
outputting the target visualization data through the display screen;
wherein the first chip consumes more power than the second chip.
An electronic device comprises a first chip, a second chip and a display screen which are connected in sequence;
the first chip is used for transmitting visual data to be processed to the second chip;
the second chip is used for performing frame interpolation processing on the visual data to be processed to obtain target visual data and transmitting the target visual data to a display screen;
the display screen is used for outputting the target visualization data;
wherein the first chip consumes more power than the second chip.
According to the visual data processing method and the electronic device, the first chip transmits the to-be-processed visual data to the second chip, which has lower power consumption; the second chip performs frame interpolation on the to-be-processed visual data to obtain target visual data and transmits the target visual data to the display screen; and the display screen outputs the target visual data. Because the visual data are handed to the lower-power second chip for frame interpolation, power consumption can be reduced.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description are only some embodiments of the present application; for those skilled in the art, other drawings can be obtained from these drawings without creative effort.
FIG. 1 is a block diagram showing an internal configuration of an electronic apparatus according to an embodiment;
FIG. 2 is a schematic diagram showing an internal configuration of an electronic apparatus according to an embodiment;
FIG. 3 is a flow diagram of a visualization data processing method in one embodiment;
FIG. 4 is a flow diagram of a process for frame insertion by a second chip in one embodiment;
FIG. 5 is a flow diagram of a visualization data processing method in one embodiment;
fig. 6 is a block diagram of a partial structure of a mobile phone in one embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
It will be understood that, as used herein, the terms "first," "second," and the like may be used herein to describe various elements, but these elements are not limited by these terms. These terms are only used to distinguish one element from another. For example, a first chip may be referred to as a second chip, and similarly, a second chip may be referred to as a first chip, without departing from the scope of the present application. The first chip and the second chip are both chips, but they are not the same chip.
Fig. 1 is a schematic diagram of an internal structure of an electronic device in one embodiment. As shown in fig. 1, the electronic device 100 may include a first chip 110, a second chip 120, and a display screen 130. The first chip 110 may be a Central Processing Unit (CPU) module. The second chip 120 may be a low-power chip with a visual data frame interpolation function; the power consumption of the first chip 110 is higher than that of the second chip 120. The display screen 130 may be a liquid crystal display, an organic light-emitting diode (OLED) display, or the like. The second chip 120 is connected between the first chip 110 and the display screen 130, and the first chip 110 may also be directly connected to the display screen 130.
The first chip 110 may transmit the to-be-processed visual data to the second chip 120; the second chip 120 performs frame interpolation on the to-be-processed visual data to obtain target visual data and transmits the target visual data to the display screen 130; and the display screen 130 outputs the target visual data.
Fig. 2 is a schematic diagram illustrating an internal structure of the electronic device in one embodiment. As shown in fig. 2, the electronic device includes a first chip 110 and a second chip 120 that are connected, wherein the first chip 110 has higher power consumption than the second chip 120. The first chip 110 is used to support the operation of the electronic device. The second chip 120 is used to provide a visual data frame interpolation function. The second chip 120 includes a microcontroller 122, a processor 124, an internal memory 126, and an interface 128. The microcontroller 122 is connected to the first chip 110 via a serial peripheral interface and is configured to provide control capability, support the operation of the second chip 120, and transmit target visual data to the display screen 130 via the interface 128. The processor 124 is connected to the microcontroller 122 and the internal memory 126, respectively, and is configured to perform frame interpolation on the to-be-processed visual data to obtain the target visual data. The internal memory 126 is connected to the microcontroller 122 and the processor 124, respectively, and is used for storing a computer program, the to-be-processed visual data transmitted by the first chip 110, the target visual data, and the like. The interface 128 is connected to the microcontroller 122 and to the display screen 130 via a MIPI (Mobile Industry Processor Interface) interface, and is used for transmitting the target visual data to the display screen 130.
The first chip 110 may transmit the to-be-processed visual data to the second chip 120; the second chip 120 performs frame interpolation on the to-be-processed visual data to obtain target visual data and transmits the target visual data to the display screen 130; and the display screen 130 outputs the target visual data.
The visual data may be a video, a UI (User Interface), or a continuous picture composed of multiple frames of still images. The electronic device can perform frame interpolation on the visual data to increase its frame rate; the to-be-processed visual data is visual data that has not yet been frame-interpolated. The second chip may be a low-power chip, that is, a chip capable of reducing power consumption and thus extending battery life. Frame interpolation refers to the operation of increasing the number of still-image frames in the visual data to raise its frame rate so that the visual data is displayed more smoothly. The second chip 120 of the electronic device may receive the to-be-processed visual data transmitted by the first chip 110, perform frame interpolation on it, and transmit the resulting target visual data to the display screen 130, which outputs it; that is, the frame-interpolated visual data is displayed on the display screen 130.
In an embodiment, the electronic device performs frame interpolation on the to-be-processed visualization data through the second chip 120, and the process of obtaining the target visualization data may further include: the second chip 120 sequentially obtains a first frame image and a second frame image which are adjacent to each other from the to-be-processed visual data transmitted by the first chip 110, inputs the first frame image and the second frame image into the neural network to obtain a corresponding target frame image, and obtains target visual data according to the first frame image, the target frame image and the second frame image.
After receiving the to-be-processed visual data transmitted by the first chip 110, the second chip 120 may sequentially acquire adjacent first and second frame images from the to-be-processed visual data, that is, sequentially take every two adjacent frames as the first frame image and the second frame image, respectively. The electronic device may train a neural network capable of performing frame interpolation. In one embodiment, the electronic device may further optimize the trained neural network and configure a corresponding acceleration algorithm for it; performing frame interpolation with the neural network and the acceleration algorithm together improves interpolation efficiency while reducing power consumption. The second chip 120 stores a neural network capable of performing frame interpolation, and the electronic device inputs the first frame image and the second frame image into the neural network through the second chip 120 to obtain a corresponding target frame image.
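As a minimal sketch, the pair-wise processing described above can be written as follows, assuming the trained neural network is available to the second chip as a callable `interp_net(first, second)` that returns the intermediate frame; the function and variable names are illustrative and not taken from the patent.

```python
# Minimal sketch of the second chip's frame interpolation loop (illustrative names).
def interpolate_visual_data(frames, interp_net):
    """frames: ordered frame images of the to-be-processed visual data.
    Returns the target visual data with one interpolated frame per adjacent pair."""
    if len(frames) < 2:
        return list(frames)
    target_data = [frames[0]]
    for first_frame, second_frame in zip(frames, frames[1:]):
        target_frame = interp_net(first_frame, second_frame)   # predicted intermediate frame
        target_data += [target_frame, second_frame]            # keep first/target/second order
    return target_data
```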
In one embodiment, before the electronic device inputs the first frame image and the second frame image to the neural network through the second chip 120, the second chip 120 may be further configured to perform an initialization process on the neural network. Specifically, the electronic device may obtain parameters of the to-be-processed visual data, such as a frame rate, a resolution, a picture quality, and a number of video channels, and adjust parameters of the neural network according to the parameter values, so as to improve accuracy of the target frame image.
In one embodiment, the first chip 110 may be further configured to transmit the adjacent first frame image and the second frame image in the to-be-processed visual data to the second chip 120; the second chip 120 may further be configured to input the first frame image and the second frame image into the neural network to obtain corresponding target frame images, and sequentially transmit the first frame image, the target frame image, and the second frame image to the display screen 130.
The first chip 110 sequentially obtains a first frame image and a second frame image adjacent to each other in the to-be-processed visual data, and transmits the first frame image and the second frame image to the second chip 120. The second chip 120 is configured to, after receiving the first frame image and the second frame image transmitted by the first chip 110, input the first frame image and the second frame image into a neural network, obtain a corresponding intermediate frame image as a target frame image, and sequentially transmit the first frame image, the target frame image, and the second frame image to the display screen 130 for display.
In one embodiment, the first chip 110 may further control the second chip 120 to start the visualized data processing function when the currently running application is detected as the preset application.
The electronic device may preset application programs whose visual data requires frame interpolation. When the electronic device detects that the currently running applications include a preset application, the first chip 110 may be configured to control the second chip 120 to start the visual data processing function.
In one embodiment, the first chip 110 may also be used to detect the remaining power of the electronic device, and to control the second chip 120 to turn off the visual data processing function when the remaining power is lower than a power threshold.
The remaining power of the electronic device is the proportion of the available charge in the battery to its nominal capacity. The power threshold may be determined according to actual usage requirements. The electronic device may detect the remaining power in real time, and when it falls below the power threshold, the first chip 110 may be configured to control the second chip 120 to turn off the visual data processing function.
FIG. 3 is a flow diagram of a visualization data processing method in one embodiment. The visualized data processing method in this embodiment is described by taking the above electronic device as an example. As shown in fig. 3, the visualized data processing method comprises steps 302 to 306. Wherein:
Step 302, transmitting the to-be-processed visual data to the second chip through the first chip.
Visual data refers to interface data that can be output to a display screen for presentation. Specifically, the visual data may be a video, a UI (User Interface), or a continuous picture composed of multiple frames of still images. For example, the visual data may be interface data of interactive operations on the home screen, interface data of any application program such as a game or a social application, or interface data of a played video. The visual data may contain data of multiple application windows; the electronic device can composite these windows according to their positions and output the composited visual data to the display screen for display. The electronic device can perform frame interpolation on the visual data to increase its frame rate; the to-be-processed visual data is visual data that has not yet been frame-interpolated. The first chip may be a CPU of the electronic device. The second chip is a chip that has a visual data processing and conversion function and consumes less power than the first chip. In particular, the second chip may be a low-power chip, that is, a chip capable of reducing power consumption and thus extending battery life. The second chip may be integrated on the first chip of the electronic device, or may be disposed in the electronic device in the form of a printed circuit board (PCB).
The electronic device can transmit the visual data to be processed to the second chip through the first chip. For example, when the electronic device plays a video, the electronic device may transmit the video to the second chip through the first chip.
Step 304, performing frame interpolation on the to-be-processed visual data through the second chip to obtain target visual data, and transmitting the target visual data to a display screen.
Frame interpolation refers to the operation of increasing the number of still-image frames in the visual data to raise its frame rate so that the visual data is displayed more smoothly. Specifically, frame interpolation adds one frame of image between every two frames of the visual data, shortening the display time of each frame and raising the frame rate of the original visual data. For example, when the to-be-processed visual data has a frame rate of 30 frames per second, i.e. each frame is displayed for about 0.033 s, frame interpolation can produce target visual data at 60 frames per second, i.e. each frame is displayed for about 0.017 s, which makes the visual data look noticeably smoother. The target visual data is the visual data after frame interpolation. Specifically, the second chip is a chip integrated with a frame interpolation algorithm; it may, for example, store a frame interpolation model trained with a deep learning algorithm such as a DNN (Deep Neural Network) or CNN (Convolutional Neural Network), or with a decision tree or random forest, and the second chip may perform frame interpolation on the to-be-processed visual data according to this model. The display screen may be a liquid crystal display, an organic light-emitting diode (OLED) display, or the like.
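The frame-period arithmetic above can be written out explicitly; this is just the standard relation between frame rate and display time, not a formula taken from the patent.

```latex
T = \frac{1}{f}, \qquad
f = 30\ \text{fps} \;\Rightarrow\; T \approx 0.033\ \text{s}, \qquad
f' = 2f = 60\ \text{fps} \;\Rightarrow\; T' = \frac{1}{f'} \approx 0.017\ \text{s}.
```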
The second chip of the electronic device can receive the to-be-processed visual data transmitted by the first chip, perform frame interpolation on it, and transmit the resulting target visual data to the display screen.
Step 306, outputting the target visualization data through the display screen.
The display screen outputs the target visual data, that is, the frame-interpolated visual data is displayed on the display screen. Specifically, the electronic device may sequentially output the images contained in the target visual data to the display screen according to the frame rate of the target visual data. The user can then view the displayed visual data or interact with it.
In the embodiments provided by the present application, the to-be-processed visual data is transmitted to the second chip through the first chip, the second chip performs frame interpolation on it to obtain target visual data and transmits the target visual data to the display screen, and the display screen outputs the target visual data. Because frame interpolation is performed by the second chip, which has lower power consumption, stuttering of the visual data is reduced and its smoothness improved while power consumption is lowered, the workload of the first chip is reduced, and battery life is extended.
As shown in fig. 4, in an embodiment, performing frame interpolation on the to-be-processed visual data through the second chip to obtain the target visual data may include steps 402 to 406. Wherein:
Step 402, sequentially acquiring, through the second chip, a first frame image and a second frame image which are adjacent to each other from the to-be-processed visual data transmitted by the first chip.
Specifically, after receiving the to-be-processed visual data transmitted by the first chip, the second chip may sequentially acquire a first frame image and a second frame image that are adjacent in the to-be-processed visual data, that is, sequentially take two adjacent frames of images in the to-be-processed visual data as the first frame image and the second frame image, respectively.
Step 404, inputting the first frame image and the second frame image into a neural network to obtain a corresponding target frame image.
The electronic device may train a neural network capable of performing frame interpolation. Specifically, during training, the first and last images of any three consecutive frames of visual data are input into the neural network as training images; the neural network predicts an intermediate frame image from them, a loss function is computed between the predicted intermediate frame image and the real intermediate frame image (the middle image of the three consecutive frames), and the parameters of the neural network are adjusted according to the loss function, so that the trained network can subsequently output accurate intermediate frame images. In one embodiment, the electronic device may further optimize the trained neural network and configure a corresponding acceleration algorithm for it; performing frame interpolation with the neural network and the acceleration algorithm together improves interpolation efficiency while reducing power consumption. The second chip stores a neural network capable of performing frame interpolation, and the electronic device inputs the first frame image and the second frame image into the neural network through the second chip to obtain the corresponding target frame image.
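A minimal training sketch of the triplet scheme described above is shown below, assuming a PyTorch-style setup; the patent does not specify the framework, the network architecture, or the exact loss, so the toy convolutional model, the L1 loss, and the `triplet_loader` are all assumptions made for illustration.

```python
# Toy training sketch for the intermediate-frame predictor (all names illustrative).
import torch
import torch.nn as nn
import torch.nn.functional as F

class InterpNet(nn.Module):
    """Predicts the middle frame from the two outer frames of a triplet."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(6, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 3, 3, padding=1),
        )

    def forward(self, first, second):
        # Stack the two outer frames along the channel dimension
        return self.net(torch.cat([first, second], dim=1))

def train_step(model, optimizer, first, middle, second):
    """first/middle/second: consecutive frames, shape (N, 3, H, W), values in [0, 1]."""
    predicted = model(first, second)       # predicted intermediate frame image
    loss = F.l1_loss(predicted, middle)    # compare with the real intermediate frame image
    optimizer.zero_grad()
    loss.backward()                        # adjust network parameters according to the loss
    optimizer.step()
    return loss.item()

model = InterpNet()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
# for first, middle, second in triplet_loader:   # hypothetical loader of 3-frame samples
#     train_step(model, optimizer, first, middle, second)
```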
In one embodiment, before the electronic device inputs the first frame image and the second frame image to the neural network through the second chip, the electronic device may further include: and initializing the neural network through the second chip. Specifically, the electronic device may obtain parameters of the to-be-processed visual data, such as a frame rate, a resolution, a picture quality, and a number of video channels, and adjust parameters of the neural network according to the parameter values, so as to improve accuracy of the target frame image.
Step 406, obtaining the target visualization data according to the first frame image, the target frame image and the second frame image.
Specifically, after the target frame image is obtained, the electronic device may output the first frame image, the target frame image and the second frame image in sequence through the second chip as the target visual data. Alternatively, the first chip may obtain the target frame image output by the second chip and then transmit the target visual data, assembled from the first frame image, the target frame image and the second frame image, to the display screen. The electronic device may also transmit the first frame image to the display screen through the first chip, then transmit the target frame image to the display screen through the second chip, and then transmit the second frame image to the display screen through the first chip.
The adjacent first and second frame images sequentially acquired from the to-be-processed visual data are input into the neural network through the second chip to obtain the corresponding target frame image, and the target visual data obtained from the first frame image, the target frame image and the second frame image is output to the display screen for display. Because the frame interpolation is performed by the second chip with lower power consumption, the frame rate of the visual data can be increased while power consumption is reduced.
In one embodiment, the provided visualized data processing method may further include steps 502 to 506. Wherein:
Step 502, transmitting adjacent first and second frame images of the to-be-processed visual data to the second chip through the first chip.
The electronic device may sequentially acquire, through the first chip, a first frame image and a second frame image which are adjacent to each other in the to-be-processed visual data, and transmit them to the second chip.
Step 504, inputting the first frame image and the second frame image into a neural network through the second chip to obtain a corresponding target frame image.
The second chip may input the first frame image and the second frame image into the neural network after receiving the first frame image and the second frame image transmitted by the first chip, and obtain a corresponding intermediate frame image as a target frame image.
Step 506, sequentially transmitting the first frame image, the target frame image and the second frame image to the display screen.
The electronic device may transmit the first frame image, the target frame image and the second frame image to the display screen for display through the second chip. Alternatively, the electronic device may transmit the first frame image to the display screen through the first chip, then transmit the target frame image to the display screen through the second chip, and then transmit the second frame image to the display screen through the first chip. Compared with transmitting only the first frame image and the second frame image directly to the display screen, this makes the visual data smoother while consuming less power.
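A sketch of the second, interleaved variant (the first chip sends the original frames and the second chip injects the interpolated frame in between) is given below; the sender callables are hypothetical stand-ins for the actual chip-to-display transfer paths.

```python
# Interleaved delivery sketch; the sender callables are illustrative, not a real driver API.
def display_with_interpolation(frames, interp_net, send_from_first_chip, send_from_second_chip):
    """frames: decoded frames held by the first chip.
    interp_net(a, b): neural-network inference performed on the second chip."""
    for first_frame, second_frame in zip(frames, frames[1:]):
        send_from_first_chip(first_frame)                              # first chip -> display screen
        send_from_second_chip(interp_net(first_frame, second_frame))   # second chip -> display screen
    if frames:
        send_from_first_chip(frames[-1])   # final frame has no successor to pair with
```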
In an embodiment, the first chip may be further configured to control the second chip to start the visual data processing function when it detects that the currently running application program is a preset application program.
An application program is a computer program used to perform one or more specific tasks. Application programs may include, but are not limited to, social communication applications, game applications, educational applications, audio-visual entertainment applications such as video players, and the like. The electronic device may preset application programs whose visual data requires frame interpolation. Specifically, the electronic device may control the second chip through the first chip to start the visual data processing function when it detects that the currently running applications include a preset application; or when the windows of the current visual data contain a window of a preset application; or when all windows of the current visual data belong to preset applications. The preset application may be set according to actual requirements and is not limited here; for example, it may be a video player, a game application, a webcast application, and the like, but is not limited thereto.
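The window-based conditions above could be expressed as a small check like the following; `PRESET_APPS` and the window list are assumptions standing in for whatever the first chip actually exposes, and the strictness flag selects between the "contains a preset window" and "all windows are preset" variants.

```python
# Illustrative check used by the first chip to decide whether to enable interpolation.
PRESET_APPS = {"video_player", "game", "webcast"}   # hypothetical preset application list

def should_enable_interpolation(current_windows, require_all=False):
    """current_windows: application identifiers of the windows in the current visual data."""
    hits = [w in PRESET_APPS for w in current_windows]
    return bool(hits) and (all(hits) if require_all else any(hits))
```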
In one embodiment, the provided visualized data processing method further comprises: detecting the remaining power of the electronic device through the first chip, and, when the remaining power is lower than a power threshold, controlling the second chip through the first chip to turn off the visual data processing function.
The remaining power of the electronic device is the proportion of the available charge in the battery to its nominal capacity. The power threshold may be determined according to actual usage requirements; for example, it may be 10%, 15%, 20%, 30%, or the like, without limitation. The electronic device can detect the remaining power in real time, and when it falls below the power threshold, the first chip controls the second chip to turn off the visual data processing function.
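A corresponding sketch of the battery-based switch is below; the 20% threshold and the `second_chip` handle are illustrative assumptions, not values from the patent.

```python
# Illustrative battery check run by the first chip.
POWER_THRESHOLD = 0.20   # hypothetical threshold: 20% of nominal capacity

def update_interpolation_for_battery(remaining_ratio, second_chip):
    """remaining_ratio: available charge divided by nominal capacity, in [0.0, 1.0]."""
    if remaining_ratio < POWER_THRESHOLD:
        second_chip.disable_interpolation()   # turn the visual data processing function off
```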
In one embodiment, the provided visualized data processing method may further include: when a frame synchronization signal generated by the display screen is received, if the display screen does not receive the image transmitted by the first chip, a target frame image generated by the second chip is acquired for displaying.
The display screen refreshes from top to bottom, and each time it refreshes it emits a VSync (vertical synchronization) signal, i.e. a frame synchronization signal. After generating the frame synchronization signal, the display screen needs a new image to refresh with; if no image transmitted by the first chip has been received at that moment, the display screen of the electronic device may acquire the target frame image that the second chip obtained by inputting the adjacent first and second frame images into the neural network, and refresh with that target frame image. This avoids stuttering while the visual data is displayed and makes the visual data smoother.
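This fallback can be sketched as a small refresh handler; the chip and display handles below are hypothetical, intended only to show the "use the interpolated frame when the regular frame is late" decision.

```python
# Illustrative VSync handler; poll_frame/latest_target_frame/refresh are assumed APIs.
def on_vsync(display, first_chip, second_chip):
    """Called whenever the display screen emits its frame synchronization (VSync) signal."""
    frame = first_chip.poll_frame()                # regular frame, if it arrived in time
    if frame is None:
        frame = second_chip.latest_target_frame()  # fall back to the interpolated target frame
    if frame is not None:
        display.refresh(frame)                     # refresh the screen with whichever frame we have
```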
In an embodiment, after the electronic device generates the target frame image through the second chip, the first frame image and the target frame image, and the target frame image and the second frame image, may each be input into the neural network for a further round of frame interpolation, yielding a first target frame image and a second target frame image; the first frame image, the first target frame image, the second target frame image and the second frame image are then output in sequence.
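A sketch of this two-level interpolation is given below; it follows the output order listed in the paragraph literally (first frame, first target frame, second target frame, second frame), and whether the intermediate target frame itself is also emitted is left open by the text, so that choice is an assumption.

```python
# Two-pass interpolation sketch; interp_net is the same neural network callable as before.
def interpolate_twice(first_frame, second_frame, interp_net):
    target = interp_net(first_frame, second_frame)      # intermediate frame from the first pass
    first_target = interp_net(first_frame, target)      # between first frame and target frame
    second_target = interp_net(target, second_frame)    # between target frame and second frame
    return [first_frame, first_target, second_target, second_frame]  # order as described in the text
```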
It should be understood that although the steps in the flow charts of fig. 3 to 5 are shown in an order indicated by the arrows, they are not necessarily performed in that order. Unless explicitly stated otherwise herein, the steps are not strictly limited in order and may be performed in other orders. Moreover, at least some of the steps in fig. 3 to 5 may include multiple sub-steps or stages, which are not necessarily performed at the same moment but may be performed at different moments, and which need not be performed sequentially but may be performed in turn or alternately with other steps or with sub-steps or stages of other steps.
The embodiments of the present application also provide an electronic device. As shown in fig. 6, for convenience of explanation only the parts related to the embodiments of the present application are shown; for undisclosed technical details, please refer to the method part of the embodiments of the present application. The electronic device may be any terminal device including a mobile phone, a tablet computer, a PDA (Personal Digital Assistant), a POS (Point of Sales) terminal, a vehicle-mounted computer, a wearable device, and the like; the electronic device is taken to be a mobile phone as an example:
fig. 6 is a block diagram of a partial structure of a mobile phone related to the electronic device provided in an embodiment of the present application. Referring to fig. 6, the handset includes: a Radio Frequency (RF) circuit 610, a memory 620, an input unit 630, a display unit 640, a sensor 650, an audio circuit 660, a wireless fidelity (WiFi) module 670, a first chip 110, a second chip 120, and a power supply 690. The first chip 110 consumes more power than the second chip 120. Those skilled in the art will appreciate that the handset configuration shown in fig. 6 is not limiting and may include more or fewer components than shown, combine some components, or arrange the components differently.
The RF circuit 610 may be used for receiving and transmitting signals during information transmission or a call; it may receive downlink information from a base station and forward it to the first chip 110 for processing, and may transmit uplink data to the base station. Typically, the RF circuit includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a Low Noise Amplifier (LNA), a duplexer, and the like. In addition, the RF circuit 610 may also communicate with networks and other devices via wireless communication. The wireless communication may use any communication standard or protocol, including but not limited to Global System for Mobile communication (GSM), General Packet Radio Service (GPRS), Code Division Multiple Access (CDMA), Wideband Code Division Multiple Access (WCDMA), Long Term Evolution (LTE), e-mail, Short Messaging Service (SMS), and the like.
The memory 620 may be used to store software programs and modules, and the first chip 110 executes various functional applications and data processing of the mobile phone by running the software programs and modules stored in the memory 620. The memory 620 may mainly include a program storage area and a data storage area, wherein the program storage area may store an operating system, an application program required by at least one function (such as a sound playing function or an image playing function), and the like; the data storage area may store data created according to the use of the mobile phone (such as audio data or an address book), and the like. Further, the memory 620 may include high-speed random access memory and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage device.
The input unit 630 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the cellular phone 600. Specifically, the input unit 630 may include a touch panel 631 and other input devices 632. The touch panel 631, which may also be referred to as a touch screen, may collect touch operations performed by a user on or near the touch panel 631 (e.g., operations performed by the user on or near the touch panel 631 using any suitable object or accessory such as a finger, a stylus, etc.) and drive the corresponding connection device according to a preset program. In one embodiment, the touch panel 631 may include two parts of a touch detection device and a touch controller. The touch detection device detects the touch direction of a user, detects a signal brought by touch operation and transmits the signal to the touch controller; the touch controller receives touch information from the touch sensing device, converts the touch information into touch point coordinates, sends the touch point coordinates to the first chip 110, and can receive and execute commands sent by the first chip 110. In addition, the touch panel 631 may be implemented using various types, such as resistive, capacitive, infrared, and surface acoustic wave. The input unit 630 may include other input devices 632 in addition to the touch panel 631. In particular, other input devices 632 may include, but are not limited to, one or more of a physical keyboard, function keys (such as volume control keys, switch keys, etc.), and the like.
The display unit 640 may be used to display information input by the user or provided to the user, as well as various menus of the mobile phone. The display unit 640 may include the display screen 130. In one embodiment, the display screen 130 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED) display, or the like. In one embodiment, the touch panel 631 may cover the display screen 130; when the touch panel 631 detects a touch operation on or near it, the touch operation is transmitted to the first chip 110 to determine the type of the touch event, and the first chip 110 then provides a corresponding visual output on the display screen 130 according to the type of the touch event. Although in fig. 6 the touch panel 631 and the display screen 130 are two separate components implementing the input and output functions of the mobile phone, in some embodiments the touch panel 631 and the display screen 130 may be integrated to implement both functions.
The handset 600 may also include at least one sensor 650, such as a light sensor, a motion sensor, and other sensors. Specifically, the light sensor may include an ambient light sensor that adjusts the brightness of the display screen 130 according to the ambient light, and a proximity sensor that turns off the display screen 130 and/or the backlight when the mobile phone is moved to the ear. The motion sensor may include an acceleration sensor, which can detect the magnitude of acceleration in each direction and, when the phone is stationary, the magnitude and direction of gravity; it can be used for applications that recognize the posture of the phone (such as switching between landscape and portrait) and for vibration-recognition functions (such as a pedometer or tap detection). The mobile phone may also be provided with other sensors such as a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor.
WiFi is a short-range wireless transmission technology. Through the WiFi module 670 the mobile phone can help the user receive and send e-mails, browse web pages, access streaming media, and the like, providing wireless broadband Internet access. Although fig. 6 shows the WiFi module 670, it is understood that it is not an essential component of the handset 600 and may be omitted as desired.
The first chip 110 is the control center of the mobile phone; it connects the various parts of the whole phone through various interfaces and lines, and performs the phone's functions and processes its data by running or executing the software programs and/or modules stored in the memory 620 and calling the data stored in the memory 620, thereby monitoring the phone as a whole. In one embodiment, the first chip 110 may include one or more processing units. In one embodiment, the first chip 110 may integrate an application processor, which mainly handles the operating system, user interface, application programs and the like, and a modem processor, which mainly handles wireless communication. It is to be understood that the modem processor may also not be integrated into the first chip 110. The second chip 120 may be a low-power chip with a visual data frame interpolation function.
The handset 600 further includes a power supply 690 (e.g., a battery) for supplying power to the components, and preferably, the power supply may be logically connected to the first chip 110 through a power management system, so as to implement functions of managing charging, discharging, and power consumption through the power management system.
In one embodiment, the handset 600 may also include a camera, a bluetooth module, and the like.
The first chip 110 may transmit the to-be-processed visual data to the second chip 120; the second chip 120 performs frame interpolation on the to-be-processed visual data to obtain target visual data and transmits the target visual data to the display screen 130; and the display screen 130 outputs the target visual data.
Any reference to memory, storage, a database, or other media used herein may include non-volatile and/or volatile memory. Non-volatile memory can include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory. Volatile memory can include random access memory (RAM), which acts as an external cache. By way of illustration and not limitation, RAM is available in a variety of forms, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), Synchlink DRAM (SLDRAM), Rambus Direct RAM (RDRAM), Direct Rambus Dynamic RAM (DRDRAM), and Rambus Dynamic RAM (RDRAM).
The above embodiments express only several implementations of the present application and are described in relatively specific detail, but they should not be construed as limiting the scope of the application. It should be noted that a person skilled in the art can make several variations and improvements without departing from the concept of the present application, and these fall within its scope of protection. Therefore, the protection scope of this patent shall be subject to the appended claims.
Claims (10)
1. A method of visual data processing, comprising:
transmitting the visual data to be processed to a second chip through the first chip;
performing frame interpolation processing on the visual data to be processed through the second chip to obtain target visual data, and transmitting the target visual data to a display screen;
outputting the target visualization data through the display screen;
wherein the first chip consumes more power than the second chip.
2. The method according to claim 1, wherein the performing, by the second chip, frame interpolation on the to-be-processed visualization data to obtain target visualization data comprises:
sequentially acquiring a first frame image and a second frame image which are adjacent to each other from the visual data to be processed transmitted by the first chip through the second chip;
inputting the first frame image and the second frame image into a neural network to obtain corresponding target frame images;
and obtaining the target visualization data according to the first frame image, the target frame image and the second frame image.
3. The method of claim 1, wherein transmitting the visual data to be processed to the second chip via the first chip comprises:
transmitting a first frame image and a second frame image which are adjacent in the visual data to be processed to the second chip through the first chip;
the performing frame interpolation processing on the to-be-processed visual data through the second chip to obtain target visual data and transmitting the target visual data to a display screen comprises:
inputting the first frame image and the second frame image into a neural network through the second chip to obtain the corresponding target frame images;
and sequentially transmitting the first frame image, the target frame image and the second frame image to the display screen.
4. The method of claim 1, further comprising:
and when the currently running application program is detected to be a preset application program, controlling the second chip to start a visual data processing function through the first chip.
5. The method of claim 1, further comprising:
detecting the residual capacity of the electronic equipment through the first chip;
and when the residual electric quantity is lower than an electric quantity threshold value, controlling the second chip to close the visualized data processing function through the first chip.
6. An electronic device is characterized by comprising a first chip, a second chip and a display screen which are sequentially connected;
the first chip is used for transmitting visual data to be processed to the second chip;
the second chip is used for performing frame interpolation processing on the visual data to be processed to obtain target visual data and transmitting the target visual data to a display screen;
the display screen is used for outputting the target visualization data;
wherein the first chip consumes more power than the second chip.
7. The electronic device according to claim 6, wherein the second chip is further configured to sequentially acquire a first frame image and a second frame image that are adjacent to each other from the visual data to be processed transmitted by the first chip; and inputting the first frame image and the second frame image into a neural network to obtain corresponding target frame images, and obtaining the target visualization data according to the first frame image, the target frame images and the second frame images.
8. The electronic device according to claim 6, wherein the first chip is further configured to control the second chip to start a visual data processing function when a visual data processing request is received or it is detected that a currently running application is a preset application.
9. The electronic device of claim 6, wherein the first chip is further configured to detect a remaining power of the electronic device; and when the residual electric quantity is lower than an electric quantity threshold value, controlling the second chip to close the visualized data processing function through the first chip.
10. The electronic device of claim 6, wherein the second chip comprises:
the memory is used for storing the visual data to be processed and the target visual data transmitted by the first chip;
the processor is connected with the memory and is used for performing frame interpolation processing on the visual data to be processed to obtain the target visual data;
and the microcontroller is connected with the processor and is used for transmitting the target visual data to a display screen through an interface.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201811010330.3A CN110874128B (en) | 2018-08-31 | 2018-08-31 | Visualized data processing method and electronic equipment |
Publications (2)
Publication Number | Publication Date |
---|---|
CN110874128A (en) | 2020-03-10
CN110874128B (en) | 2021-03-30
Family
ID=69715761
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201811010330.3A Active CN110874128B (en) | 2018-08-31 | 2018-08-31 | Visualized data processing method and electronic equipment |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110874128B (en) |
2018-08-31: application CN201811010330.3A filed in CN; granted as CN110874128B, status Active.
Patent Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101552866A (en) * | 2008-03-31 | 2009-10-07 | 株式会社东芝 | A video signal interpolating device, a video display device and a video signal interpolating method |
US20140002732A1 (en) * | 2012-06-29 | 2014-01-02 | Marat R. Gilmutdinov | Method and system for temporal frame interpolation with static regions excluding |
CN103702059A (en) * | 2013-12-06 | 2014-04-02 | 乐视致新电子科技(天津)有限公司 | Frame rate conversion control method and device |
CN104363502A (en) * | 2014-10-28 | 2015-02-18 | 深圳创维-Rgb电子有限公司 | Method and device for protecting OSD (on-screen display) pictures |
CN105828183A (en) * | 2015-01-04 | 2016-08-03 | 华为技术有限公司 | Video frame processing method, video processing chip and motion estimate and motion compensation MEMC chip |
CN108121524A (en) * | 2017-12-19 | 2018-06-05 | 广东欧珀移动通信有限公司 | The adjusting method and device, electronic equipment of electronic equipment image display preview frame per second |
CN108322685A (en) * | 2018-01-12 | 2018-07-24 | 广州华多网络科技有限公司 | Video frame interpolation method, storage medium and terminal |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114003120A (en) * | 2020-07-28 | 2022-02-01 | Oppo广东移动通信有限公司 | Electronic device |
CN112004086A (en) * | 2020-08-21 | 2020-11-27 | Oppo广东移动通信有限公司 | Video data processing method and device |
CN112004086B (en) * | 2020-08-21 | 2022-11-11 | Oppo广东移动通信有限公司 | Video data processing method and device |
CN112135081A (en) * | 2020-09-11 | 2020-12-25 | Oppo广东移动通信有限公司 | Mode control method and device, frame insertion chip and electronic equipment |
CN114153410A (en) * | 2021-11-29 | 2022-03-08 | 上海顺久电子科技有限公司 | Display control method and related device for multi-window pictures in same screen |
CN114153410B (en) * | 2021-11-29 | 2024-08-20 | 上海顺久电子科技有限公司 | Display control method and related device for multi-window pictures in same screen |
CN114285958A (en) * | 2021-12-28 | 2022-04-05 | 维沃移动通信有限公司 | Image processing circuit, image processing method, and electronic device |
Also Published As
Publication number | Publication date |
---|---|
CN110874128B (en) | 2021-03-30 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| GR01 | Patent grant | |