CN114071227A - Data processing method and device - Google Patents

Data processing method and device

Info

Publication number
CN114071227A
CN114071227A
Authority
CN
China
Prior art keywords
frame
controller
image frames
application program
camera assembly
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010784824.8A
Other languages
Chinese (zh)
Inventor
刘立军
Current Assignee
Beijing Xiaomi Mobile Software Co Ltd
Original Assignee
Beijing Xiaomi Mobile Software Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Xiaomi Mobile Software Co Ltd
Priority to CN202010784824.8A
Publication of CN114071227A
Legal status: Pending

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/44 Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs
    • H04N21/4402 Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/60 Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client
    • H04N21/63 Control signaling related to video distribution between client, server and network components; Network processes for video distribution between server and clients or between remote clients, e.g. transmitting basic layer and enhancement layers over different transmission paths, setting up a peer-to-peer communication via Internet between remote STB's; Communication protocols; Addressing
    • H04N21/647 Control signaling between network components and server or clients; Network processes for video distribution between server and clients, e.g. controlling the quality of the video stream, by dropping packets, protecting content from unauthorised alteration within the network, monitoring of network load, bridging between two different networks, e.g. between IP and wireless
    • H04N21/64784 Data processing by the network
    • H04N21/64792 Controlling the complexity of the content stream, e.g. by dropping packets

Abstract

The present disclosure provides a data processing method and apparatus. The method is applied to a terminal and comprises: receiving a request sent by an application program to invoke a camera assembly; controlling the camera assembly to operate through a controller and performing frame dropping on part of the image frames captured by the camera assembly; and sending the image frames acquired from the controller that were not discarded by the controller to the application program. Because the controller performs the frame dropping operation, the application program need not perform it as in the related art, which reduces the processing load of the application program and its CPU usage. Moreover, compared with sending all image frames captured by the camera assembly to the application program, only part of the captured image frames are sent, reducing the number of frames to be transmitted and thereby reducing both the CPU usage and the power consumption of image frame transmission.

Description

Data processing method and device
Technical Field
The present disclosure relates to the field of computer communication technologies, and in particular, to a data processing method and apparatus.
Background
The terminal is provided with a camera, and when an application program in the terminal calls the camera, image frames acquired by the camera are acquired and displayed according to the image frames.
Sometimes, an application program performs frame dropping on the received image frames to reduce the number of image frames to be rendered and thus the processing load of image rendering and other operations. However, performing frame dropping in the application program itself imposes a large CPU load on the processor.
Disclosure of Invention
In order to overcome the problems in the related art, the present disclosure provides a data processing method and apparatus.
According to a first aspect of the embodiments of the present disclosure, there is provided a data processing method applied to a terminal, the terminal being installed with an application program, a camera component, and a controller for controlling the camera component, the method including:
receiving a request sent by the application program for calling the camera component;
controlling the camera assembly to operate through the controller, and performing frame loss processing on a part of image frames acquired by the camera assembly;
sending image frames acquired from the controller that are not discarded by the controller to the application.
Optionally, the performing, by the controller, frame dropping processing on the partial image frames acquired by the camera assembly includes:
determining, by the controller, frame numbers of image frames that need to be discarded from a set of image frames acquired by the camera assembly in a future unit time according to a target frame rate;
and when the camera assembly acquires the image in the future unit time, frame loss processing is carried out on the image frame with the frame number acquired by the camera assembly through the controller.
Optionally, the determining, by the controller, frame numbers of image frames that need to be discarded in a set of image frames acquired by the camera assembly in a future unit time according to a target frame rate includes:
determining, by the controller, a number of image frames of the set of image frames that need to be discarded based on a frame rate at which the camera assembly acquires image frames and the target frame rate;
determining, by the controller, the number of frame numbers from a set of frame numbers for the set of image frames.
Optionally, the determining, by the controller, the number of frame numbers from the set of frame numbers of the group of image frames includes any one of:
determining frame number intervals according to the frame rate of the image frames collected by the camera assembly and the target frame rate through the controller, and extracting the number of frame numbers from the frame number set according to the frame number intervals;
and randomly determining the number of frame numbers from the frame number set through the controller.
Optionally, the method further comprises any one of:
acquiring the target frame rate sent by the application program, and sending the target frame rate to the controller;
and acquiring the target frame rate from system configuration, and sending the target frame rate to the controller.
Optionally, the request for invoking the camera component carries the target frame rate;
the acquiring the target frame rate sent by the application program includes:
and acquiring the target frame rate carried by the request.
Optionally, the processing of frame dropping on the partial image frames acquired by the camera assembly through the controller includes any one of:
in response to determining that the scene of the image frame used by the application program is a preset scene, performing frame loss processing on the partial image frame through the controller;
and in response to determining that the application program is a non-shooting type application program, performing frame loss processing on the partial image frame through the controller.
According to a second aspect of the embodiments of the present disclosure, there is provided a data processing apparatus applied to a terminal installed with an application program, a camera component, and a controller for controlling the camera component, the apparatus including:
a receiving module configured to receive a request sent by the application program for invoking the camera component;
a control module configured to control the camera assembly to operate by the controller;
the frame loss module is configured to perform frame loss processing on a part of image frames acquired by the camera assembly;
a first sending module configured to send image frames acquired from the controller that are not discarded by the controller to the application.
Optionally, the frame dropping module includes:
a determination submodule configured to determine, by the controller, frame numbers of image frames that need to be discarded from a set of image frames acquired by the camera assembly in a future unit time, according to a target frame rate;
an acquisition sub-module configured to perform frame dropping processing on the image frame with the frame number acquired by the camera assembly through the controller when the camera assembly performs image acquisition within the future unit time.
Optionally, the determining sub-module includes:
a first determining unit configured to determine, by the controller, a number of image frames in the set of image frames that need to be discarded according to a frame rate at which the camera assembly acquires the image frames and the target frame rate;
a second determining unit configured to determine, by the controller, the number of frame numbers from a set of frame numbers of the group of image frames.
Optionally, the second determining unit includes:
a first determining subunit configured to determine, by the controller, a frame number interval according to a frame rate at which the camera component acquires image frames and the target frame rate;
an extraction subunit configured to extract the number of frame numbers from the set of frame numbers at intervals according to the frame number interval; alternatively,
a second determining subunit configured to determine the number of frame numbers from the set of frame numbers randomly by the controller.
Optionally, the apparatus further comprises:
a first obtaining module configured to obtain the target frame rate sent by the application program;
a second transmitting module configured to transmit the target frame rate to the controller; alternatively,
a second obtaining module configured to obtain the target frame rate from a system configuration;
a third transmitting module configured to transmit the target frame rate to the controller.
Optionally, the request for invoking the camera component carries the target frame rate;
the second obtaining module is configured to obtain the target frame rate carried by the request.
Optionally, the frame dropping module includes any one of:
a first frame loss sub-module configured to perform, by the controller, frame loss processing on the part of the image frame in response to determining that a scene in which the image frame is used by the application is a preset scene;
and the second frame loss sub-module is configured to perform frame loss processing on the partial image frame through the controller in response to determining that the application program is a non-shooting type application program.
According to a third aspect of embodiments of the present disclosure, there is provided a non-transitory computer readable storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of the method of any one of the above first aspects.
According to a fourth aspect of the embodiments of the present disclosure, there is provided a terminal, including:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to:
receiving a request sent by the application program for calling the camera component;
controlling the camera assembly to operate through the controller, and performing frame loss processing on a part of image frames acquired by the camera assembly;
sending image frames acquired from the controller that are not discarded by the controller to the application.
The technical scheme provided by the embodiment of the disclosure can have the following beneficial effects:
in the embodiments of the present disclosure, the terminal receives a request from the application program to invoke the camera assembly, controls the camera assembly to operate through the controller, performs frame dropping on part of the image frames captured by the camera assembly, and sends the image frames acquired from the controller that were not discarded by the controller to the application program. Because the controller performs the frame dropping operation, the application program need not perform it as in the related art, which reduces the processing load of the application program and its CPU usage. Moreover, compared with sending all image frames captured by the camera assembly to the application program, only part of the captured image frames are sent, reducing the number of frames to be transmitted and thereby reducing both the CPU usage and the power consumption of image frame transmission.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
FIG. 1 is a flow diagram illustrating a method of data processing in accordance with an exemplary embodiment;
FIG. 2 is a block diagram illustrating a data processing apparatus according to an exemplary embodiment;
fig. 3 is a schematic diagram illustrating a structure of a terminal according to an exemplary embodiment.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The implementations described in the exemplary embodiments below are not intended to represent all implementations consistent with the present disclosure. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present disclosure, as detailed in the appended claims.
The terminology used in the present disclosure is for the purpose of describing particular embodiments only and is not intended to be limiting of the disclosure. As used in this disclosure and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items.
It is to be understood that although the terms first, second, third, etc. may be used herein to describe various information, such information should not be limited to these terms. These terms are only used to distinguish one type of information from another. For example, first information may also be referred to as second information, and similarly, second information may also be referred to as first information, without departing from the scope of the present disclosure. The word "if" as used herein may be interpreted as "when", "upon", or "in response to a determination", depending on the context.
Fig. 1 is a flowchart illustrating a data processing method according to an exemplary embodiment, in which the method illustrated in fig. 1 is applied to a terminal installed with an application program, a camera assembly, and a controller for controlling the camera assembly, the method including:
in step 101, a request sent by an application to invoke a camera component is received.
In the embodiments of the present disclosure, the terminal is provided with an application program, a camera assembly, and a controller for controlling the camera assembly, and the application program can invoke the camera assembly to obtain the image frames it captures. The controller is a physical device and may be, for example, a single-chip microcomputer. The camera assembly may include hardware such as a camera. The method is applicable to various terminals, such as mobile phones, tablets, notebooks, and wearable electronic devices.
In some scenarios, such as photographing, face recognition, gesture recognition, or payment scenarios, the application may need to invoke the camera component, and it sends a request to invoke the camera component. Accordingly, the terminal receives the request sent by the application to invoke the camera component.
In step 102, the controller controls the camera assembly to operate, and performs frame loss processing on a part of image frames acquired by the camera assembly.
After receiving a request for calling the camera component sent by the application program, the terminal can send a corresponding command to the controller, so that the controller controls the camera component to operate after receiving the command, and frame loss processing is performed on a part of image frames acquired by the camera component.
In some embodiments, frame dropping may be understood as not sending the dropped image frames to the application, or as neither storing the dropped image frames nor sending them to the application.
In some embodiments, the controller determines a target frame rate that indicates the number of image frames the controller needs to return per unit time. The target frame rate is less than a maximum frame rate at which the camera assembly acquires image frames. For example, the maximum frame rate at which the camera assembly acquires image frames is 60 frames per second, and the target frame rate may be 45 frames per second, 30 frames per second, 20 frames per second, 15 frames per second, and so on. The target frame rate can be set as desired.
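As a small illustration of the relationship above: the number of frames the controller must drop each second is simply the difference between the capture frame rate and the target frame rate. The following sketch is illustrative only; the function name and validation are not prescribed by the patent.

```python
def frames_dropped_per_second(capture_fps: int, target_fps: int) -> int:
    """Number of frames to drop per second so that exactly target_fps
    frames are returned to the application per unit time.

    The target frame rate must be positive and below the rate at which
    the camera assembly captures image frames (hypothetical check).
    """
    if not 0 < target_fps < capture_fps:
        raise ValueError("target frame rate must be below the capture rate")
    return capture_fps - target_fps
```

For the example rates above, a 60 fps capture rate with a 30 fps target drops 30 frames per second, and with a 45 fps target drops 15.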
Based on this, the controller can perform frame dropping on the image frames captured by the camera assembly in two steps. First, according to the target frame rate, the controller determines the frame numbers of the image frames to be discarded from a group of image frames that the camera assembly will capture in a future unit time. Second, when the camera assembly captures images during that unit time, the controller drops the image frames bearing those frame numbers. The controller then forwards the image frames captured by the camera assembly that were not discarded, so that the terminal obtains a subset of the captured frames.
For the first step, the controller may determine, according to the frame rate at which the camera assembly acquires the image frames and the target frame rate, the number of image frames that need to be discarded in a group of image frames acquired by the camera assembly in a unit time in the future, and determine the number of frame numbers from the set of frame numbers of the group of image frames.
Assuming the camera assembly captures images at a frame rate of N frames per second, where N is a positive integer, the set of frame numbers for the group of image frames captured in each second may be defined as [1, 2, 3, ..., N]. Alternatively, numbering may continue across seconds, with the frame number of the first image frame captured in the next second equal to the frame number of the last image frame captured in the current second plus one; for example, the set of frame numbers for the group captured in the first second is [1, 2, 3, ..., N], and the set for the group captured in the second second is [N+1, N+2, N+3, ..., 2N].
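The two numbering conventions above can be sketched as follows. This is illustrative Python; the function names are not from the patent.

```python
def per_second_frame_numbers(n: int) -> list[int]:
    """Numbering restarts each second: the set is [1, 2, ..., N]."""
    return list(range(1, n + 1))


def continuous_frame_numbers(second_index: int, n: int) -> list[int]:
    """Numbering continues across seconds; second_index is 0-based,
    so second 0 yields [1..N] and second 1 yields [N+1..2N]."""
    start = second_index * n + 1
    return list(range(start, start + n))
```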
The controller can determine the required number of frame numbers from the frame number set in various ways. For example, the controller may select that number of frame numbers at random, or it may determine a frame number interval from the frame rate at which the camera assembly captures image frames and the target frame rate, and then extract that number of frame numbers from the set at that interval.
For example, if the camera assembly captures image frames at 60 frames per second and the target frame rate is 30 frames per second, the controller determines from these two rates that the frame number interval is 1, and extracts the frame numbers 2, 4, 6, 8, ..., 60 from the frame number set [1, 2, 3, ..., 60], thirty frame numbers in total; the image frames with frame numbers 2, 4, 6, 8, ..., 60 are the frames to be dropped.
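Both selection strategies described above can be sketched as follows. This is one illustrative reading of the interval method, matching the 60 fps to 30 fps example (an interval of one kept frame between dropped frames); it is not the patent's own code, and it assumes the capture rate is divisible by the drop count.

```python
import random


def drop_numbers_by_interval(capture_fps: int, target_fps: int) -> list[int]:
    """Pick evenly spaced frame numbers to drop from [1..capture_fps].

    Assumes capture_fps is divisible by the drop count, as in the
    60 fps -> 30 fps example (drop count 30, spacing 2).
    """
    drop_count = capture_fps - target_fps
    step = capture_fps // drop_count  # spacing between dropped frame numbers
    return [step * i for i in range(1, drop_count + 1)]


def drop_numbers_at_random(capture_fps: int, target_fps: int) -> list[int]:
    """Randomly pick the required number of frame numbers to drop."""
    drop_count = capture_fps - target_fps
    return sorted(random.sample(range(1, capture_fps + 1), drop_count))
```

With a 60 fps capture rate and a 30 fps target, the interval method yields the frame numbers 2, 4, 6, ..., 60, matching the example above.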
In some embodiments, the terminal may further obtain a target frame rate sent by the application program, and send the target frame rate to the controller.
In one implementation, the application program has a default target frame rate; or the application program offers several frame rates and takes the one selected by the user as the target frame rate; or the application program takes a frame rate entered directly by the user as the target frame rate.
In an application, the application program may send the target frame rate when calling the camera component, so that the controller acquires the target frame rate. For example, when the application program sends a request for calling the camera component, the application program defines that the request carries the target frame rate, in which case the terminal acquires the target frame rate carried by the request and sends the acquired target frame rate to the controller.
In another implementation, the terminal provides a system configuration interface in which a target frame rate can be set for the application program. Accordingly, the terminal obtains the target frame rate from the system configuration and sends it to the controller.
In some embodiments, scenes in which an application uses image frames are preset. In a preset scene, the image frames captured by the camera assembly are typically processed with image recognition techniques, and the requirements on the displayed image content and display quality are low, so the data processing method of the embodiments of the present disclosure may be used. There are various preset scenes, such as a face recognition scene, a gesture recognition scene, and a scan-to-pay scene.
After receiving a request from the application program to invoke the camera assembly, the terminal controls the camera assembly to operate through the controller, and performs frame dropping on part of the captured image frames through the controller, only when the scene in which the application program uses the image frames is determined to be a preset scene.
If the terminal determines that the scene in which the application program uses the image frames is not a preset scene, such as a photographing or video recording scene, it controls the camera assembly to operate through the controller and obtains all image frames captured by the camera assembly through the controller.
In some embodiments, the photographing type application has a function of photographing an image, and the photographing type application is various, such as a camera application, an instant chat application having a video call function, and the like.
During use of a shooting-type application program, clear images need to be displayed on the terminal, and the requirements on displayed image content and display quality are high; in this case, the terminal may acquire all image frames captured by the camera assembly through the controller.
In contrast, for the non-shooting type application program, the requirements on the image content and the image display effect displayed on the terminal are not very high, and in this case, the terminal may execute the data processing method provided by the embodiment of the present disclosure. There are various non-photographing type application programs such as a payment application, a shopping application having a scanning function, and the like.
After the terminal receives a request for calling the camera assembly sent by an application program, if the application program sending the request is determined to be a non-photographing type application program, the camera assembly is controlled to operate through the controller, and frame dropping processing is carried out on partial image frames collected by the camera assembly.
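The two triggering conditions described above, a preset recognition-style scene or a non-shooting-type application, can be combined into a simple gate. The scene names and the either/or combination below are illustrative assumptions, not mandated by the patent.

```python
# Scenes in which display quality matters little and controller-side
# frame dropping is acceptable (illustrative names).
PRESET_SCENES = {"face_recognition", "gesture_recognition", "scan_payment"}


def should_drop_frames(scene: str, is_shooting_app: bool) -> bool:
    """Drop frames when the frames feed a preset recognition-style scene,
    or when the requesting application is not a shooting-type app."""
    return scene in PRESET_SCENES or not is_shooting_app
```

A camera app in a photographing scene would receive all frames, while a payment app scanning a code would receive the reduced stream.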
In step 103, image frames acquired from the controller that are not discarded by the controller are sent to the application.
In the android camera framework, a business process, a camera service process and a camera Hardware Abstraction Layer (HAL) process are involved, wherein the business process can be understood as a process in an application that sends a request for invoking a camera component.
The controller carries out frame loss processing on partial image frames collected by the camera assembly, and the image frames which are not lost are finally transmitted to the business process through the HAL process and the camera service process. Because only a part of image frames collected by the camera assembly are transmitted, the number of the image frames needing to be transmitted is reduced, the occupation of the CPU by the image frame transmission is reduced, and the power consumption of the image frame transmission is reduced.
Experimental results show that, compared with the approach described in the background, data processing with the method of the embodiments of the present disclosure reduces the combined CPU occupancy of the business process, the camera service process, and the HAL process by 5% or more, a very noticeable reduction.
In the embodiments of the present disclosure, the terminal receives a request from the application program to invoke the camera assembly, controls the camera assembly to operate through the controller, performs frame dropping on part of the image frames captured by the camera assembly, and sends the image frames acquired from the controller that were not discarded by the controller to the application program. Because the controller performs the frame dropping operation, the application program need not perform it as in the related art, which reduces the processing load of the application program and its CPU usage. Moreover, compared with sending all captured image frames to the application program, only part of them are sent, reducing the number of frames to be transmitted and thereby reducing both the CPU usage and the power consumption of image frame transmission.
While, for purposes of simplicity of explanation, the foregoing method embodiments have been described as a series of acts or combination of acts, it will be appreciated by those skilled in the art that the present disclosure is not limited by the order of acts, as some steps may, in accordance with the present disclosure, occur in other orders and concurrently.
Further, those skilled in the art should also appreciate that the embodiments described in the specification are exemplary embodiments and that acts and modules referred to are not necessarily required by the disclosure.
Corresponding to the embodiment of the application function implementation method, the disclosure also provides an embodiment of an application function implementation device and a corresponding terminal.
Fig. 2 is a block diagram illustrating a data processing apparatus according to an exemplary embodiment, which is applied to a terminal installed with an application program, a camera component, and a controller for controlling the camera component, the apparatus including: a receiving module 21, a control module 22, a frame dropping module 23, and a first sending module 24; wherein:
the receiving module 21 is configured to receive a request sent by the application program for calling the camera component;
the control module 22 configured to control the operation of the camera assembly through the controller;
the frame dropping module 23 is configured to perform frame dropping processing on a part of the image frames acquired by the camera assembly;
the first sending module 24 is configured to send image frames acquired from the controller that are not discarded by the controller to the application.
In an alternative embodiment, on the basis of the data processing apparatus shown in fig. 2, the frame dropping module 23 may include: a determining submodule and an acquiring submodule; wherein:
the determining submodule is configured to determine, by the controller, frame numbers of image frames which need to be discarded in a set of image frames acquired by the camera assembly in a future unit time according to a target frame rate;
the acquiring submodule is configured to perform, through the controller, frame dropping processing on the image frames with the determined frame numbers when the camera assembly performs image acquisition in the future unit time.
In an optional embodiment, the determining submodule may include: a first determining unit and a second determining unit; wherein:
the first determining unit is configured to determine, by the controller, the number of image frames in the set of image frames that need to be discarded according to the frame rate at which the camera assembly acquires the image frames and the target frame rate;
the second determining unit is configured to determine, by the controller, the number of frame numbers from a set of frame numbers of the group of image frames.
In an optional embodiment, the second determining unit may include: a first determining subunit and an extracting subunit, or a second determining subunit; wherein:
the first determining subunit is configured to determine, by the controller, a frame number interval according to the frame rate at which the camera component acquires image frames and the target frame rate;
the extracting subunit is configured to extract the number of frame numbers from the set of frame numbers at intervals of the frame number interval.
In an alternative embodiment, the second determining subunit may be configured to randomly determine, through the controller, the number of frame numbers from the set of frame numbers.
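A minimal sketch of the two frame-number selection strategies just described, interval-based extraction and random selection. The function and parameter names (`frames_to_drop`, `capture_rate`, `target_rate`, `randomize`) are assumptions for illustration, not identifiers from the disclosure:

```python
import random

def frames_to_drop(capture_rate: int, target_rate: int,
                   randomize: bool = False) -> list[int]:
    """Return the frame numbers (1-based, within one future unit time of
    capture) that the controller should discard to hit the target rate."""
    if target_rate >= capture_rate:
        return []  # nothing needs to be dropped
    num_to_drop = capture_rate - target_rate
    all_numbers = list(range(1, capture_rate + 1))
    if randomize:
        # Second strategy: pick the required number of frame numbers at random.
        return sorted(random.sample(all_numbers, num_to_drop))
    # First strategy: derive a frame number interval from the two rates and
    # extract frame numbers at that interval.
    interval = capture_rate / num_to_drop
    return [round(i * interval) for i in range(1, num_to_drop + 1)]

# e.g. a 30 fps camera throttled to a 20 fps target drops every third frame:
print(frames_to_drop(30, 20))  # → [3, 6, 9, 12, 15, 18, 21, 24, 27, 30]
```

During capture in that unit time, the controller then simply discards each frame whose number appears in the returned list.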
In an optional embodiment, the apparatus may further include: a first obtaining module and a second sending module; wherein:
the first obtaining module is configured to obtain the target frame rate sent by the application program;
the second sending module is configured to send the target frame rate to the controller.
In an optional embodiment, the apparatus may further include: a second obtaining module and a third sending module; wherein:
the second obtaining module is configured to obtain the target frame rate from a system configuration;
the third sending module is configured to send the target frame rate to the controller.
In an optional embodiment, the request to invoke the camera component carries the target frame rate;
the first obtaining module may be configured to obtain the target frame rate carried by the request.
In an alternative embodiment, on the basis of the data processing apparatus shown in fig. 2, the frame dropping module 23 includes any one of: a first frame dropping submodule and a second frame dropping submodule; wherein:
the first frame dropping submodule is configured to perform, through the controller, frame dropping processing on the part of the image frames in response to determining that the scene in which the application program uses the image frames is a preset scene;
the second frame dropping submodule is configured to perform, through the controller, frame dropping processing on the part of the image frames in response to determining that the application program is a non-shooting type application program.
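The two alternative triggers above amount to a simple gate on whether controller-side frame dropping is enabled at all. The scene names and application-type values below are illustrative assumptions, not values from the disclosure:

```python
def controller_should_drop(scene: str, app_type: str,
                           preset_scenes: frozenset = frozenset({"video_call", "qr_scan"})) -> bool:
    """Either trigger alone enables controller-side frame dropping: the
    image frames are used in a preset scene, or the calling application
    is not a shooting-type application."""
    return scene in preset_scenes or app_type != "shooting"

print(controller_should_drop("video_call", "shooting"))  # → True  (preset scene)
print(controller_should_drop("preview", "face_unlock"))  # → True  (non-shooting app)
print(controller_should_drop("preview", "shooting"))     # → False (full frame rate kept)
```
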
For the device embodiments, since they substantially correspond to the method embodiments, reference may be made to the partial description of the method embodiments for relevant points. The above-described embodiments of the apparatus are merely illustrative, and the units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the modules can be selected according to actual needs to achieve the purpose of the disclosed solution. One of ordinary skill in the art can understand and implement it without inventive effort.
Accordingly, in one aspect, an embodiment of the present disclosure provides a terminal, including: a processor; a memory for storing processor-executable instructions; wherein the processor is configured to:
receiving a request sent by the application program for calling the camera component;
controlling the camera assembly to operate through the controller, and performing frame loss processing on a part of image frames acquired by the camera assembly;
sending image frames acquired from the controller that are not discarded by the controller to the application.
Fig. 3 is a block diagram illustrating a terminal 1600 according to an exemplary embodiment. For example, the apparatus 1600 may be a user device, which may be embodied as a mobile phone, a computer, a digital broadcast terminal, a messaging device, a game console, a tablet device, a medical device, a fitness device, a personal digital assistant, or a wearable device such as a smart watch, smart glasses, a smart bracelet, or smart running shoes.
Referring to fig. 3, apparatus 1600 may include one or more of the following components: processing component 1602, memory 1604, power component 1606, multimedia component 1608, audio component 1610, input/output (I/O) interface 1612, sensor component 1614, and communications component 1616.
The processing component 1602 generally controls overall operation of the device 1600, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. The processing component 1602 may include one or more processors 1620 to execute instructions to perform all or part of the steps of the methods described above. Further, the processing component 1602 can include one or more modules that facilitate interaction between the processing component 1602 and other components. For example, the processing component 1602 can include a multimedia module to facilitate interaction between the multimedia component 1608 and the processing component 1602.
The memory 1604 is configured to store various types of data to support operation at the device 1600. Examples of such data include instructions for any application or method operating on device 1600, contact data, phonebook data, messages, pictures, videos, and so forth. The memory 1604 may be implemented by any type of volatile or non-volatile memory device or combination thereof, such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disk.
A power supply component 1606 provides power to the various components of the device 1600. The power components 1606 may include a power management system, one or more power sources, and other components associated with generating, managing, and distributing power for the device 1600.
The multimedia component 1608 includes a screen providing an output interface between the apparatus 1600 and a user. In some embodiments, the screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive input signals from the user. The touch panel includes one or more touch sensors to sense touches, slides, and gestures on the touch panel. The touch sensor may not only sense the boundary of a touch or slide action but also detect the duration and pressure associated with the touch or slide operation. In some embodiments, the multimedia component 1608 includes a front-facing camera and/or a rear-facing camera. The front-facing camera and/or the rear-facing camera may receive external multimedia data when the apparatus 1600 is in an operation mode, such as a photographing mode or a video mode. Each of the front camera and the rear camera may be a fixed optical lens system or have focusing and optical zoom capability.
The audio component 1610 is configured to output and/or input an audio signal. For example, audio component 1610 includes a Microphone (MIC) configured to receive external audio signals when apparatus 1600 is in an operational mode, such as a call mode, recording mode, and voice recognition mode. The received audio signal may further be stored in the memory 1604 or transmitted via the communications component 1616. In some embodiments, audio component 1610 further includes a speaker for outputting audio signals.
The I/O interface 1612 provides an interface between the processing component 1602 and peripheral interface modules, such as keyboards, click wheels, buttons, and the like. These buttons may include, but are not limited to: a home button, a volume button, a start button, and a lock button.
Sensor assembly 1614 includes one or more sensors for providing status assessment of various aspects to device 1600. For example, sensor assembly 1614 can detect an open/closed state of device 1600, the relative positioning of components, such as a display and keypad of device 1600, a change in position of device 1600 or a component of device 1600, the presence or absence of user contact with device 1600, orientation or acceleration/deceleration of device 1600, and a change in temperature of device 1600. The sensor assembly 1614 may include a proximity sensor configured to detect the presence of a nearby object without any physical contact. The sensor assembly 1614 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor assembly 1614 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communications component 1616 is configured to facilitate communications between the apparatus 1600 and other devices in a wired or wireless manner. The device 1600 may access a wireless network based on a communication standard, such as WiFi, 2G, or 3G, or a combination thereof. In an exemplary embodiment, the communication component 1616 receives broadcast signals or broadcast related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the aforementioned communication component 1616 further includes a Near Field Communication (NFC) module to facilitate short-range communications. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, infrared data association (IrDA) technology, Ultra Wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the apparatus 1600 may be implemented by one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), controllers, micro-controllers, microprocessors or other electronic components for performing the above-described methods.
In an exemplary embodiment, there is also provided a non-transitory computer readable storage medium, such as the memory 1604 including instructions that, when executed by the processor 1620 of the apparatus 1600, enable the apparatus 1600 to perform a data processing method, the method comprising: receiving a request sent by the application program for calling the camera component; controlling the camera assembly to operate through the controller, and performing frame loss processing on a part of image frames acquired by the camera assembly; sending image frames acquired from the controller that are not discarded by the controller to the application.
The non-transitory computer readable storage medium may be a ROM, a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This disclosure is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (16)

1. A data processing method applied to a terminal installed with an application program, a camera assembly, and a controller for controlling the camera assembly, the method comprising:
receiving a request sent by the application program for invoking the camera assembly;
controlling the camera assembly to operate through the controller, and performing frame dropping processing on a part of the image frames acquired by the camera assembly;
sending the image frames acquired from the controller that are not discarded by the controller to the application program.
2. The method of claim 1, wherein the performing, by the controller, frame dropping processing on the partial image frames captured by the camera assembly comprises:
determining, by the controller, frame numbers of image frames that need to be discarded from a set of image frames acquired by the camera assembly in a future unit time according to a target frame rate;
and when the camera assembly performs image acquisition in the future unit time, performing, by the controller, frame dropping processing on the image frames with the determined frame numbers.
3. The method of claim 2, wherein determining, by the controller, frame numbers of image frames that need to be discarded from a set of image frames acquired by the camera assembly in a future unit of time according to a target frame rate comprises:
determining, by the controller, a number of image frames of the set of image frames that need to be discarded based on a frame rate at which the camera assembly acquires image frames and the target frame rate;
determining, by the controller, the number of frame numbers from a set of frame numbers for the set of image frames.
4. The method of claim 3, wherein said determining, by said controller, said number of frame numbers from a set of frame numbers for said set of image frames comprises any of:
determining, by the controller, a frame number interval according to the frame rate at which the camera assembly acquires image frames and the target frame rate, and extracting the number of frame numbers from the set of frame numbers at intervals of the frame number interval;
and randomly determining the number of frame numbers from the frame number set through the controller.
5. The method of claim 2, further comprising any of:
acquiring the target frame rate sent by the application program, and sending the target frame rate to the controller;
and acquiring the target frame rate from system configuration, and sending the target frame rate to the controller.
6. The method of claim 5, wherein the request to invoke the camera component carries the target frame rate;
the acquiring the target frame rate sent by the application program includes:
and acquiring the target frame rate carried by the request.
7. The method of claim 1, wherein the performing, by the controller, frame dropping processing on the part of the image frames acquired by the camera assembly comprises any one of:
in response to determining that the scene of the image frame used by the application program is a preset scene, performing frame loss processing on the partial image frame through the controller;
and in response to determining that the application program is a non-shooting type application program, performing frame loss processing on the partial image frame through the controller.
8. A data processing apparatus applied to a terminal installed with an application program, a camera assembly, and a controller for controlling the camera assembly, the apparatus comprising:
a receiving module configured to receive a request sent by the application program for invoking the camera component;
a control module configured to control the camera assembly to operate by the controller;
the frame loss module is configured to perform frame loss processing on a part of image frames acquired by the camera assembly;
a first sending module configured to send image frames acquired from the controller that are not discarded by the controller to the application.
9. The apparatus of claim 8, wherein the frame loss module comprises:
a determination submodule configured to determine, by the controller, frame numbers of image frames that need to be discarded from a set of image frames acquired by the camera assembly in a future unit time, according to a target frame rate;
an acquisition sub-module configured to perform frame dropping processing on the image frame with the frame number acquired by the camera assembly through the controller when the camera assembly performs image acquisition within the future unit time.
10. The apparatus of claim 9, wherein the determining sub-module comprises:
a first determining unit configured to determine, by the controller, a number of image frames in the set of image frames that need to be discarded according to a frame rate at which the camera assembly acquires the image frames and the target frame rate;
a second determining unit configured to determine, by the controller, the number of frame numbers from a set of frame numbers of the group of image frames.
11. The apparatus of claim 10, wherein the second determining unit comprises:
a first determining subunit configured to determine, by the controller, a frame number interval according to a frame rate at which the camera component acquires image frames and the target frame rate;
an extraction subunit configured to extract the number of frame numbers from the set of frame numbers at intervals of the frame number interval; alternatively,
a second determining subunit configured to determine the number of frame numbers from the set of frame numbers randomly by the controller.
12. The apparatus of claim 9, further comprising:
a first obtaining module configured to obtain the target frame rate sent by the application program;
a second transmitting module configured to transmit the target frame rate to the controller; alternatively,
a second obtaining module configured to obtain the target frame rate from a system configuration;
a third transmitting module configured to transmit the target frame rate to the controller.
13. The apparatus of claim 12, wherein the request to invoke the camera component carries the target frame rate;
the first obtaining module is configured to obtain the target frame rate carried by the request.
14. The apparatus of claim 8, wherein the frame loss module comprises any one of:
a first frame loss sub-module configured to perform, by the controller, frame loss processing on the part of the image frame in response to determining that a scene in which the image frame is used by the application is a preset scene;
and the second frame loss sub-module is configured to perform frame loss processing on the partial image frame through the controller in response to determining that the application program is a non-shooting type application program.
15. A non-transitory computer readable storage medium, on which a computer program is stored, which, when executed by a processor, performs the steps of the method of any one of claims 1 to 7.
16. A terminal, comprising:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to:
receiving a request sent by the application program for calling the camera component;
controlling the camera assembly to operate through the controller, and performing frame loss processing on a part of image frames acquired by the camera assembly;
sending image frames acquired from the controller that are not discarded by the controller to the application.
CN202010784824.8A 2020-08-06 2020-08-06 Data processing method and device Pending CN114071227A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010784824.8A CN114071227A (en) 2020-08-06 2020-08-06 Data processing method and device


Publications (1)

Publication Number Publication Date
CN114071227A (en) 2022-02-18

Family

ID=80232555

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010784824.8A Pending CN114071227A (en) 2020-08-06 2020-08-06 Data processing method and device

Country Status (1)

Country Link
CN (1) CN114071227A (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101159862A (en) * 2007-11-29 2008-04-09 北京中星微电子有限公司 Frame rate control method and device
CN101217657A (en) * 2008-01-08 2008-07-09 北京中星微电子有限公司 A frame rate control method and device
CN104702968A (en) * 2015-02-17 2015-06-10 华为技术有限公司 Frame loss method for video frame and video sending device
CN107079135A (en) * 2016-01-29 2017-08-18 深圳市大疆创新科技有限公司 Method of transmitting video data, system, equipment and filming apparatus
JP2019029699A (en) * 2017-07-25 2019-02-21 キヤノン株式会社 Imaging apparatus, control method thereof, and program



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination