CN111193869A - Image data processing method, image data processing device and mobile terminal


Info

Publication number
CN111193869A
Authority
CN
China
Prior art keywords: processed, description information, buffer block, target, image buffer
Prior art date
Legal status: Granted (the status listed is an assumption and is not a legal conclusion)
Application number
CN202010023765.2A
Other languages
Chinese (zh)
Other versions
CN111193869B (en)
Inventor
林飞
Current Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN202010023765.2A
Publication of CN111193869A
Application granted
Publication of CN111193869B
Legal status: Active
Anticipated expiration


Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80: Camera processing pipelines; Components thereof
    • H04N23/84: Camera processing pipelines; Components thereof for processing colour signals

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Image Processing (AREA)
  • Studio Devices (AREA)

Abstract

The application discloses an image data processing method, an image data processing device, a mobile terminal and a computer-readable storage medium. The method is applied to a mobile terminal having more than two cameras and comprises the following steps: in the process of generating a night scene image, every image buffer block exchanged between the application and the hardware abstraction layer carries initial description information describing the maximum size, and when the hardware abstraction layer calls the corresponding pipeline flow to process an image buffer block, the initial description information carried by that block is changed to target description information describing the real size. The mobile terminal can thus transmit raw images of different sizes while generating the night scene image, so that optical zoom is supported during night scene shooting.

Description

Image data processing method, image data processing device and mobile terminal
Technical Field
The present application relates to the field of image processing technologies, and in particular to an image data processing method, an image data processing apparatus, a mobile terminal, and a computer-readable storage medium.
Background
To meet users' shooting requirements, a modern mobile terminal is often provided with several different types of cameras; a typical example is a mobile terminal equipped with a main camera, a wide-angle camera, a telephoto camera, and the like. In a night scene shooting scenario, to obtain a clear night scene image, the mobile terminal often acquires multiple raw images through a camera, synthesizes them into a single raw image, and then sends that image to the driver for a format conversion operation to obtain a YUV image. This process can usually only be performed on the main camera, so existing night scene shooting cannot support optical zoom.
Disclosure of Invention
In view of the foregoing, the present application provides an image data processing method, an image data processing apparatus, a mobile terminal and a computer readable storage medium, which can support optical zooming during night scene shooting.
A first aspect of the present application provides an image data processing method, where the image data processing method is applied to a mobile terminal, where the mobile terminal has two or more cameras, and the image data processing method includes:
when the hardware abstraction layer of the mobile terminal receives a to-be-processed image buffer block transmitted by a specified application, analyzing the to-be-processed image buffer block to obtain description information carried by the to-be-processed image buffer block, and recording the description information as to-be-processed description information, wherein the description information is used for describing the size and the type of the image buffer block;
if the to-be-processed description information matches preset initial description information, determining a target pipeline flow among more than one pipeline flow according to the type of the to-be-processed image buffer block indicated by the to-be-processed description information;
replacing the to-be-processed description information carried by the to-be-processed image buffer block with preset target description information, wherein the target description information is used for describing the real size;
calling the target pipeline flow to process the replaced image buffer block to be processed to obtain a processed image buffer block;
replacing the target description information carried by the processed image buffer block with the description information to be processed;
and transmitting the processed image buffer block carrying the to-be-processed description information to the specified application.
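For orientation only, the following is a minimal C++ sketch of these six steps. All type and function names here (StreamDescriptor, ImageBuffer, onBufferFromApp, and so on) are illustrative stand-ins assumed for this sketch, not interfaces defined by the present application.
```cpp
// buffer_dispatch.cpp: hypothetical sketch of the six steps above. All
// names are illustrative stand-ins, not the terminal's actual interfaces.
#include <cstdint>
#include <cstdio>

enum class StreamType { Output, Input };   // Output: fill raw data; Input: raw2yuv

struct StreamDescriptor {                  // the "description information"
    uint32_t width = 0, height = 0;
    StreamType type = StreamType::Output;
    bool operator==(const StreamDescriptor& o) const {
        return width == o.width && height == o.height && type == o.type;
    }
};

struct ImageBuffer {                       // the "image buffer block"
    StreamDescriptor stream;               // carried description information
    // pixel payload omitted
};

// Maximum-size initial descriptors agreed with the application at startup
// (4000x3000 is an assumed example size).
static const StreamDescriptor kRawOutputStream{4000, 3000, StreamType::Output};
static const StreamDescriptor kRawInputStream{4000, 3000, StreamType::Input};

// Stand-ins for helpers elaborated in the detailed description.
static StreamDescriptor realSizeStreamOfTargetCamera(StreamType t) {
    return {3264, 2448, t};               // assumed real size of the target camera
}
static void runTargetPipeline(ImageBuffer& buf) {
    std::printf("pipeline runs at %ux%u\n",
                (unsigned)buf.stream.width, (unsigned)buf.stream.height);
}
static void returnToApp(const ImageBuffer& buf) {
    std::printf("returned to app described as %ux%u\n",
                (unsigned)buf.stream.width, (unsigned)buf.stream.height);
}

void onBufferFromApp(ImageBuffer& pending) {
    StreamDescriptor pendingDesc = pending.stream;   // step 1: parse the descriptor
    if (pendingDesc == kRawOutputStream ||
        pendingDesc == kRawInputStream) {            // step 2: matches initial info
        // step 3: swap in the real-size descriptor of the target camera
        pending.stream = realSizeStreamOfTargetCamera(pendingDesc.type);
        runTargetPipeline(pending);                  // step 4: fill or raw2yuv
        pending.stream = pendingDesc;                // step 5: restore the descriptor
    }
    returnToApp(pending);                            // step 6: hand back to the app
}

int main() {
    ImageBuffer buf{kRawOutputStream};               // a first-stage buffer from the app
    onBufferFromApp(buf);
    return 0;
}
```
The point the sketch makes explicit is that the real-size descriptor swapped in at step 3 is restored at step 5, so the application only ever sees buffers described at the agreed maximum size.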
A second aspect of the present application provides an image data processing apparatus applied to a mobile terminal having two or more cameras, the image data processing apparatus including:
the analysis unit is used for analyzing the to-be-processed image buffer block when the hardware abstraction layer of the mobile terminal receives the to-be-processed image buffer block transmitted by the specified application, obtaining the description information carried by the to-be-processed image buffer block and recording it as the to-be-processed description information, wherein the description information is used for describing the size and the type of the image buffer block;
a determining unit, configured to determine a target pipeline flow in more than one pipeline flow according to a type of a to-be-processed image buffer block indicated by the to-be-processed description information if the to-be-processed description information matches preset initial description information;
a first replacing unit, configured to replace to-be-processed description information carried by the to-be-processed image buffer block with preset target description information, where the target description information is used to describe a real size;
the calling unit is used for calling the target pipeline flow to process the replaced image buffer block to be processed to obtain a processed image buffer block;
the second replacement unit is used for replacing the target description information carried by the processed image buffer block with the description information to be processed;
a sending unit, configured to transmit the processed image buffer block carrying the to-be-processed description information to the specified application.
A third aspect of the present application provides a mobile terminal comprising a memory, a processor, two or more cameras, and a computer program stored in the memory and executable on the processor, the processor implementing the steps of the method according to the first aspect when executing the computer program.
A fourth aspect of the present application provides a computer readable storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of the method of the first aspect as described above.
A fifth aspect of the application provides a computer program product comprising a computer program which, when executed by one or more processors, performs the steps of the method as described in the first aspect above.
As can be seen from the above, in the solution of the present application, for a mobile terminal having more than two cameras, when the hardware abstraction layer of the mobile terminal receives a to-be-processed image buffer block transmitted by a specified application, it may first analyze the to-be-processed image buffer block to obtain the description information carried by the block, recorded as the to-be-processed description information. If the to-be-processed description information matches preset initial description information, a target pipeline flow is determined among more than one pipeline flow according to the type of the to-be-processed image buffer block indicated by the to-be-processed description information, and the to-be-processed description information carried by the block is replaced with preset target description information, where the target description information is used to describe the real size. The target pipeline flow is then called to process the replaced to-be-processed image buffer block to obtain a processed image buffer block, the target description information carried by the processed image buffer block is replaced back with the to-be-processed description information, and finally the processed image buffer block carrying the to-be-processed description information is transmitted to the specified application. According to this scheme, the mobile terminal can transmit raw images of different sizes in the night scene generation process, so that optical zoom is supported during night scene shooting.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present application, and those skilled in the art can obtain other drawings based on these drawings without creative effort.
FIG. 1 is a schematic diagram of an implementation flow of an image data processing method provided in an embodiment of the present application;
FIG. 2 is a schematic diagram of the data flow in an image data processing method provided in an embodiment of the present application;
FIG. 3 is a schematic diagram of the interaction flow of the APP and the HAL in the first stage of an image data processing method provided in an embodiment of the present application;
FIG. 4 is a schematic diagram of the interaction flow of the APP and the HAL in the second stage of an image data processing method provided in an embodiment of the present application;
FIG. 5 is a block diagram of an image data processing apparatus provided in an embodiment of the present application;
FIG. 6 is a schematic structural diagram of a mobile terminal provided in an embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
In order to explain the technical solution of the present application, the following description will be given by way of specific examples.
Example one
The following describes an image data processing method provided in an embodiment of the present application. The image data processing method is applied to a mobile terminal having two or more different types of cameras, where the camera types include a main camera, a wide-angle camera, a telephoto camera, and the like, and are not limited herein. Referring to fig. 1, an image data processing method in an embodiment of the present application includes:
Step 101, when the hardware abstraction layer of the mobile terminal receives a to-be-processed image buffer block transmitted by a specified application, analyzing the to-be-processed image buffer block to obtain the description information carried by it, and recording that description information as the to-be-processed description information;
in the embodiment of the present application, the participants involved in the night scene shooting process include a specific Application (APP), a Hardware Abstraction Layer (HAL), and an Algorithm Processing Service (APS). Referring to fig. 2, fig. 2 shows the data flow during night scene shooting: the APP sends data to be filled to the HAL; HAL is filled to obtain a multi-frame raw image and returns the multi-frame raw image to APP; the APP sends the multi-frame raw image to the APS; synthesizing a plurality of raw images by the APS to obtain a raw image, and returning the raw image to the APP; the APP sends the raw image of the frame to the HAL; and performing raw2YUV operation by the HAL to convert the raw image into a YUV image and sending the YUV image to the APP, so that the operation related to night scene shooting is basically completed, and the operation of converting the YUV image into a JPEG image is performed subsequently, which is not described herein any more. The primary concern of embodiments of the present application is the interaction of APP with the HAL, where APP is specifically referred to as a camera application.
As can be seen from this data flow, the interaction between the APP and the HAL is divided into two stages: in the first stage, the HAL fills in raw image data; in the second stage, the operation performed by the HAL is the raw2yuv operation. In both stages, the APP acts as the initiator of the interaction with the HAL, that is, the data flow of each stage is "APP → HAL → APP". Specifically, the data transmitted during night scene shooting is a buffer, that is, an image buffer block; each buffer carries a corresponding stream, that is, description information, which is used to describe the type of the image buffer block and the size of the image buffer block. Therefore, when the HAL receives a to-be-processed image buffer block transmitted by the APP, in order to determine which stage of operation is currently being executed, the HAL may analyze the received to-be-processed image buffer block, obtain the description information it carries, record that description information as the to-be-processed description information, and analyze the to-be-processed description information to determine the current stage, that is, the operation the HAL should currently execute.
Specifically, for the interaction between the APP and the HAL to proceed smoothly, the interaction protocol between the two should be agreed upon before they start interacting, for example, the size of the buffers they transmit, and what data the HAL should return to the APP for each specific type of buffer. Therefore, when a start event of the APP is monitored, that is, when the APP starts, initial description information may first be configured, where the buffer size described by the initial description information is the maximum size. That is, the buffers are configured to be transmitted between the APP and the HAL based on the maximum size. The maximum size may be determined as follows: when the start event of the specified application is monitored, the image size corresponding to each camera of the mobile terminal is obtained, the image sizes corresponding to the cameras are compared to determine the maximum size, and the initial description information is then configured based on the maximum size.
Specifically, considering that the APP and the HAL have two interaction stages, there may be two pieces of initial description information: first initial description information (denoted raw_output_stream) corresponding to the first stage, and second initial description information (denoted raw_input_stream) corresponding to the second stage. The image buffer block sizes described by the first and second initial description information are the same, namely the maximum size; the described types differ: the first initial description information describes the image buffer block as an output type, and the second initial description information describes it as an input type. In addition to the initial description information, the mobile terminal may configure other description information, such as prv_output_stream for preview, and yuv0_output_stream, yuv1_output_stream, yuv2_output_stream, and the like for the offline pipeline flow, which is not limited herein.
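A minimal sketch of this startup configuration, assuming hypothetical CameraInfo and StreamDescriptor types and example sensor sizes; only the maximum-size comparison and the per-camera real-size descriptors follow the text above.
```cpp
// stream_config.cpp: hypothetical sketch of the startup configuration.
// CameraInfo, StreamDescriptor and the sizes below are assumptions.
#include <cstdint>
#include <map>
#include <string>
#include <vector>

enum class StreamType { Output, Input };

struct StreamDescriptor {
    uint32_t width = 0, height = 0;
    StreamType type = StreamType::Output;
};

struct CameraInfo {
    std::string name;
    uint32_t width, height;   // full-resolution image size of this camera
};

struct StreamConfig {
    StreamDescriptor rawOutputStream;                  // first initial info (stage one)
    StreamDescriptor rawInputStream;                   // second initial info (stage two)
    std::map<std::string, StreamDescriptor> realSize;  // per-camera real_size_stream
};

StreamConfig configureStreams(const std::vector<CameraInfo>& cameras) {
    // Compare the image sizes of all cameras to determine the maximum size.
    uint32_t maxW = 0, maxH = 0;
    for (const CameraInfo& cam : cameras) {
        if (uint64_t(cam.width) * cam.height > uint64_t(maxW) * maxH) {
            maxW = cam.width;
            maxH = cam.height;
        }
    }
    StreamConfig cfg;
    // Both initial descriptors are configured at the maximum size.
    cfg.rawOutputStream = {maxW, maxH, StreamType::Output};
    cfg.rawInputStream  = {maxW, maxH, StreamType::Input};
    // One target (real-size) descriptor per camera.
    for (const CameraInfo& cam : cameras)
        cfg.realSize[cam.name] = {cam.width, cam.height, StreamType::Output};
    return cfg;
}

int main() {
    // Example sizes only; a real terminal would query its camera list.
    StreamConfig cfg = configureStreams({{"main", 4000, 3000},
                                         {"telephoto", 3264, 2448},
                                         {"wide", 3264, 2448}});
    (void)cfg;
    return 0;
}
```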
Step 102, if the to-be-processed description information matches preset initial description information, determining a target pipeline flow among more than one pipeline flow according to the type of the to-be-processed image buffer block indicated by the to-be-processed description information;
in this embodiment of the application, when the to-be-processed description information matches with the initial description information, it is known that the to-be-processed description information is currently in the first stage or the second stage, and a target pipeline flow may be determined in one or more pipeline flows (pipeline) according to a type (e.g., an input type or an output type) of the to-be-processed image buffer block indicated by the to-be-processed description information. Specifically, more than one pipeline flow may be created when a start event of the APP is monitored, that is, when the APP starts. The number of the assembly line processes is determined by the number of cameras of the mobile terminal and the interaction stage where the HAL and the APP are located: considering that there are two interactive stages where the HAL and the APP are located, each interactive stage corresponds to a type of pipeline flow, where the pipeline flow corresponding to the first stage is a real-time pipeline flow (real time pipeline), and the pipeline flow corresponding to the second stage is an off-line pipeline flow (offline pipeline); and considering that the number of the cameras is multiple, each camera corresponds to a type of pipeline flow. The mobile terminal has three cameras, including a main camera, a telephoto camera and a wide-angle camera as an example, and six flow lines to be created are provided, which are respectively:
a real-time pipeline flow corresponding to the main camera;
an offline pipeline flow corresponding to the main camera;
a real-time pipeline flow corresponding to the telephoto camera;
an offline pipeline flow corresponding to the telephoto camera;
a real-time pipeline flow corresponding to the wide-angle camera;
and an offline pipeline flow corresponding to the wide-angle camera.
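The per-camera creation referenced above can be sketched as follows, again with hypothetical names (Pipeline, createPipelines):
```cpp
// pipeline_create.cpp: hypothetical sketch, one real-time and one offline
// pipeline flow per camera, created when the application starts.
#include <map>
#include <string>
#include <utility>
#include <vector>

enum class PipelineKind { RealTime, Offline };

struct Pipeline {
    std::string camera;   // the camera this pipeline flow serves
    PipelineKind kind;    // RealTime: raw fill; Offline: raw2yuv
};

using PipelineTable = std::map<std::string, std::pair<Pipeline, Pipeline>>;

PipelineTable createPipelines(const std::vector<std::string>& cameras) {
    PipelineTable table;
    for (const std::string& cam : cameras) {
        table[cam] = {Pipeline{cam, PipelineKind::RealTime},
                      Pipeline{cam, PipelineKind::Offline}};
    }
    return table;
}

int main() {
    // Three cameras yield the six pipeline flows enumerated above.
    PipelineTable table = createPipelines({"main", "telephoto", "wide"});
    (void)table;
    return 0;
}
```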
Based on this, considering that there are multiple pipeline flows, this step needs to determine a target pipeline flow among more than one pipeline flow according to the type of the to-be-processed image buffer block indicated by the to-be-processed description information. Specifically, the current shooting magnification of the mobile terminal is obtained first; according to the shooting magnification, the mobile terminal can determine a corresponding master pipeline index, which identifies the camera operating at the current shooting magnification. Through this process the target camera can be determined, and the target pipeline flow can then be determined among more than one pipeline flow according to the target camera and the type of the to-be-processed image buffer block indicated by the to-be-processed description information, that is, either the real-time pipeline flow or the offline pipeline flow corresponding to the target camera is selected to be called.
Specifically, the type of an image buffer block (buffer) includes an output type and an input type. The output type is described by raw_output_stream and is used to instruct the HAL to perform the image data filling operation; that is, if the description information carried by the to-be-processed image buffer block is raw_output_stream, the interaction is in the first stage, and the corresponding pipeline flow is a real-time pipeline flow. The input type is described by raw_input_stream and is used to instruct the HAL to perform the image data conversion operation; that is, if the description information carried by the buffer is raw_input_stream, the interaction is in the second stage, and the corresponding pipeline flow is an offline pipeline flow. Based on this, the determining a target pipeline flow among one or more pipeline flows according to the target camera and the type of the to-be-processed image buffer block indicated by the to-be-processed description information includes:
if the type of the to-be-processed image buffer block indicated by the to-be-processed description information is an output type, determining the real-time pipeline flow of the target camera as the target pipeline flow;
and if the type of the to-be-processed image buffer block indicated by the to-be-processed description information is an input type, determining the offline pipeline flow of the target camera as the target pipeline flow.
As can be seen, when the description information carried by the to-be-processed image buffer block is raw_output_stream, the real-time pipeline flow of the target camera is determined as the target pipeline flow; and when the description information carried by the to-be-processed image buffer block is raw_input_stream, the offline pipeline flow of the target camera is determined as the target pipeline flow, where the target camera is determined according to the current shooting magnification.
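The selection rule can be condensed into the following sketch; the zoom thresholds and camera names are assumptions for illustration, since the text does not fix particular magnification ranges.
```cpp
// pipeline_select.cpp: hypothetical sketch of target-pipeline selection.
// The shooting magnification picks the target camera (the "master pipeline
// index"), and the buffer type picks real-time versus offline.
#include <string>

enum class StreamType { Output, Input };
enum class PipelineKind { RealTime, Offline };

struct TargetPipeline {
    std::string camera;
    PipelineKind kind;
};

// Assumed zoom thresholds; a real terminal would derive these from its
// camera configuration.
std::string cameraForMagnification(double zoom) {
    if (zoom < 1.0) return "wide";
    if (zoom < 3.0) return "main";
    return "telephoto";
}

TargetPipeline selectTargetPipeline(double zoom, StreamType bufferType) {
    TargetPipeline target;
    target.camera = cameraForMagnification(zoom);
    // raw_output_stream (output type): fill raw data -> real-time pipeline.
    // raw_input_stream (input type): raw2yuv -> offline pipeline.
    target.kind = (bufferType == StreamType::Output) ? PipelineKind::RealTime
                                                     : PipelineKind::Offline;
    return target;
}

int main() {
    // A 5x zoom with an input-type buffer selects the telephoto camera's
    // offline pipeline flow.
    TargetPipeline t = selectTargetPipeline(5.0, StreamType::Input);
    (void)t;
    return 0;
}
```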
Step 103, replacing the to-be-processed description information carried by the to-be-processed image buffer block with preset target description information, wherein the target description information is used for describing the real size;
In the embodiment of the present application, the image sizes corresponding to different cameras differ; that is, there are actually multiple pieces of target description information, each corresponding to one camera, and the target description information corresponding to the target camera can be determined among them to describe the real size of the image obtained by the target camera. Specifically, the target description information is configured based on the image sizes corresponding to the respective cameras. When the to-be-processed description information is the first initial description information or the second initial description information, since the image buffer block size they describe is the maximum size while the pipeline flows are created based on the real sizes corresponding to the cameras, in order for the to-be-processed image buffer block to be processed normally by a pipeline flow, the to-be-processed description information carried by the to-be-processed image buffer block needs to be replaced with preset target description information (denoted real_size_stream), where the target description information is used to describe the real size and is associated with the target camera.
Step 104, calling the target pipeline flow to process the replaced image buffer block to be processed to obtain a processed image buffer block;
In this embodiment of the application, when the target pipeline flow is a real-time pipeline flow, the HAL executes the image data filling operation by calling the target pipeline flow; specifically, it fills the raw data acquired by the target camera into the to-be-processed image buffer block, thereby obtaining a raw image. When the target pipeline flow is an offline pipeline flow, the HAL executes the image data conversion operation by calling the target pipeline flow; specifically, it converts the raw data carried by the to-be-processed image buffer block into YUV data, thereby realizing the conversion from a raw image to a YUV image. It should be noted that the to-be-processed image buffer block in this step specifically refers to the to-be-processed image buffer block on which the description information replacement operation has been performed, that is, the image buffer block carrying the target description information describing the real size.
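A sketch of this dispatch, with captureRaw and raw2yuv as hypothetical stand-ins for the sensor capture and format-conversion back ends:
```cpp
// pipeline_run.cpp: hypothetical sketch of step 104. The real-time pipeline
// fills the buffer with raw sensor data; the offline pipeline converts the
// carried raw data to YUV. Both back ends are stubbed stand-ins.
#include <cstdint>
#include <vector>

enum class PipelineKind { RealTime, Offline };

struct ImageBuffer {
    std::vector<uint8_t> payload;
};

static std::vector<uint8_t> captureRaw() {
    return std::vector<uint8_t>(64, 0);     // placeholder for sensor raw data
}

static std::vector<uint8_t> raw2yuv(const std::vector<uint8_t>& raw) {
    return raw;                             // actual format conversion omitted
}

void runPipeline(PipelineKind kind, ImageBuffer& buf) {
    if (kind == PipelineKind::RealTime) {
        buf.payload = captureRaw();         // image data filling operation
    } else {
        buf.payload = raw2yuv(buf.payload); // image data conversion operation
    }
}

int main() {
    ImageBuffer buf;
    runPipeline(PipelineKind::RealTime, buf);  // stage one: obtain a raw image
    runPipeline(PipelineKind::Offline, buf);   // stage two: raw image -> YUV
    return 0;
}
```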
Step 105, replacing the target description information carried by the processed image buffer block with the to-be-processed description information;
In this embodiment of the present application, after the target pipeline flow is called and a processed image buffer block is obtained, the processed image buffer block needs to be transmitted back to the APP. Considering that buffers are configured to be transmitted between the APP and the HAL based on the maximum size, while the currently obtained processed image buffer block actually carries the target description information describing the real size, the description information carried by the processed image buffer block needs to be replaced before it is transmitted back to the APP; specifically, the target description information carried by the processed image buffer block is replaced with the to-be-processed description information. In this way, the processed image buffer block obtained after the replacement can be transmitted based on the maximum size.
Step 106, transmitting the processed image buffer block carrying the to-be-processed description information to the specified application.
In this embodiment of the present application, the processed image buffer block carrying the to-be-processed description information may be transmitted back to the APP. Specifically, referring to fig. 2, if the processed image buffer block was obtained by calling a real-time pipeline flow, that is, the HAL performed a raw image capture operation, the APP further transmits the processed image buffer block to the APS for the subsequent multi-frame raw image synthesis operation. Optionally, when the target pipeline flow called by the HAL is a real-time pipeline flow, after the pipeline is called and the processed image buffer block is obtained, the HAL writes the real size into metadata associated with the processed image buffer block, so that when the APP transfers the processed image buffer block to the APS, the APS can also read the real size from the metadata to perform the image data synthesis operation.
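A minimal sketch of this optional metadata step, assuming a hypothetical key-value Metadata container; the key names real_width and real_height are illustrative, not actual metadata tags:
```cpp
// metadata_write.cpp: hypothetical sketch of the optional metadata step.
// After a real-time pipeline run, the HAL records the real size so the APS
// can read it when synthesizing the raw frames.
#include <cstdint>
#include <map>
#include <string>
#include <utility>

struct Metadata {
    std::map<std::string, uint32_t> entries;
};

struct ProcessedBuffer {
    Metadata meta;
    // pixel payload omitted
};

// HAL side: write the real size into the buffer's associated metadata.
void writeRealSize(ProcessedBuffer& buf, uint32_t width, uint32_t height) {
    buf.meta.entries["real_width"]  = width;
    buf.meta.entries["real_height"] = height;
}

// APS side: recover the real size before the multi-frame synthesis.
std::pair<uint32_t, uint32_t> readRealSize(const ProcessedBuffer& buf) {
    return {buf.meta.entries.at("real_width"),
            buf.meta.entries.at("real_height")};
}

int main() {
    ProcessedBuffer buf;
    writeRealSize(buf, 3264, 2448);   // example real size of the target camera
    auto [w, h] = readRealSize(buf);
    (void)w; (void)h;
    return 0;
}
```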
Referring to fig. 3, fig. 3 shows an example of the interaction process between the APP and the HAL in the first stage: when a user inputs a shooting instruction, the APP sends a to-be-processed buffer (to-be-processed image buffer block) to the HAL, and the description information carried by the to-be-processed buffer is raw_output_stream (the first initial description information); on determining that the APP has transmitted a to-be-processed buffer carrying raw_output_stream, the HAL knows that the interaction is currently in the first stage; at this time, based on the target camera (determined by the master pipeline index corresponding to the current shooting magnification), the HAL knows the real size of the to-be-processed buffer and changes the description information carried by the to-be-processed buffer from raw_output_stream to real_size_stream (the target description information); the HAL then calls the real-time pipeline flow of the target camera to process the to-be-processed buffer now carrying real_size_stream, where the specific processing operation is the image data filling operation, obtaining a processed buffer (processed image buffer block) carrying real_size_stream; finally, the description information carried by the processed buffer is changed from real_size_stream back to raw_output_stream, and the buffer is sent to the APP.
Referring to FIG. 4, FIG. 4 shows an example of the interaction process between the APP and the HAL in the second stage: the APP sends a to-be-processed buffer (to-be-processed image buffer block) to the HAL, and the description information carried by the to-be-processed buffer is raw_input_stream (the second initial description information); on determining that the APP has transmitted a to-be-processed buffer carrying raw_input_stream, the HAL knows that the interaction is currently in the second stage; at this time, based on the target camera (determined by the master pipeline index corresponding to the current shooting magnification), the HAL knows the real size of the to-be-processed buffer and changes the description information carried by the to-be-processed buffer from raw_input_stream to real_size_stream (the target description information); the HAL then calls the offline pipeline flow of the target camera to process the to-be-processed buffer now carrying real_size_stream, where the specific processing operation is the image conversion (raw2yuv) operation, obtaining a processed buffer (processed image buffer block) carrying real_size_stream; finally, the description information carried by the processed buffer is changed from real_size_stream back to raw_input_stream, and the buffer is sent to the APP.
As can be seen from the above, in the embodiment of the present application, although images acquired by different types of cameras differ in size, by configuring the APP and the HAL to transmit image buffer blocks based on the maximum size and changing the described size of an image buffer block to the real size when a pipeline flow is called, raw images of different sizes can all be transmitted through the same channel. That is, the mobile terminal can transmit raw images of different sizes during night scene shooting, so that optical zoom is supported when shooting night scenes.
It should be understood that, the sequence numbers of the steps in the foregoing embodiments do not imply an execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present application.
Example two
In a second embodiment of the present application, an image data processing apparatus is provided, where the image data processing apparatus may be integrated in a mobile terminal, and the mobile terminal has two or more cameras, as shown in fig. 5, an image data processing apparatus 500 in the embodiment of the present application includes:
an analyzing unit 501, configured to, when the hardware abstraction layer of the mobile terminal receives the to-be-processed image buffer block transmitted by the specified application, analyze the to-be-processed image buffer block to obtain description information carried by the to-be-processed image buffer block, and record the description information as to-be-processed description information, where the description information is used to describe a size and a type of the image buffer block;
a determining unit 502, configured to determine a target pipeline flow in more than one pipeline flow according to a type of a to-be-processed image buffer block indicated by the to-be-processed description information if the to-be-processed description information matches preset initial description information;
a first replacing unit 503, configured to replace to-be-processed description information carried by the to-be-processed image buffer block with preset target description information, where the target description information is used to describe a real size;
a calling unit 504, configured to call the target pipeline flow to process the replaced to-be-processed image buffer block, so as to obtain a processed image buffer block;
a second replacing unit 505, configured to replace the target description information carried by the processed image buffer block with the to-be-processed description information;
a sending unit 506, configured to transmit the processed image buffer block carrying the to-be-processed description information to the specified application.
Optionally, the image data processing apparatus 500 further includes:
a configuration unit, configured to configure the initial description information and the target description information when a start event of the specified application is monitored;
and the creating unit is used for respectively creating a corresponding real-time pipeline flow and an offline pipeline flow for each camera of the mobile terminal.
Optionally, the configuration unit includes:
a size obtaining subunit, configured to obtain, when a start event of the specified application is monitored, an image size corresponding to each camera of the mobile terminal;
the size determining subunit is used for comparing the image sizes corresponding to the cameras to determine the maximum size;
and the size configuration subunit is used for configuring the initial description information based on the maximum size and configuring the target description information based on the image size corresponding to each camera.
Optionally, the determining unit 502 includes:
the magnification acquiring subunit is used for acquiring the current shooting magnification of the mobile terminal;
the target camera determining subunit is used for determining a target camera according to the shooting magnification;
and the target pipeline flow determining subunit is used for determining a target pipeline flow in more than one pipeline flow according to the target camera and the type of the to-be-processed image buffer block indicated by the to-be-processed description information.
Optionally, the type of the image buffer block includes an output type and an input type, where the output type is used to instruct the hardware abstraction layer to perform an image data filling operation, and the input type is used to instruct the hardware abstraction layer to perform an image data conversion operation. Accordingly, the target pipeline flow determining subunit is specifically configured to determine the real-time pipeline flow of the target camera as the target pipeline flow if the type of the to-be-processed image buffer block indicated by the to-be-processed description information is an output type, and to determine the offline pipeline flow of the target camera as the target pipeline flow if that type is an input type.
Optionally, the image data processing apparatus 500 further includes:
a writing unit, configured to write, if the type of the to-be-processed image buffer block indicated by the to-be-processed description information is an output type, the real size into metadata associated with the processed image buffer block through the hardware abstraction layer, so as to instruct the algorithm processing service to perform the image data synthesis operation on the processed image buffer block based on the metadata.
As can be seen from the above, in the embodiment of the present application, although images acquired by different types of cameras differ in size, by configuring the APP and the HAL to transmit image buffer blocks based on the maximum size and changing the described size of an image buffer block to the real size when a pipeline flow is called, raw images of different sizes can all be transmitted through the same channel. That is, the mobile terminal can transmit raw images of different sizes during night scene shooting, so that optical zoom is supported when shooting night scenes.
Example three
Referring to fig. 6, the mobile terminal 6 in the embodiment of the present application includes: a memory 601, one or more processors 602 (only one shown in fig. 6), two or more cameras 603 (three shown in fig. 6 as an example), and a computer program stored in the memory 601 and executable on the processor 602. The memory 601 is used for storing software programs and units, and the processor 602 executes various functional applications and data processing by running the software programs and units stored in the memory 601, so as to acquire resources corresponding to preset events. Specifically, the processor 602 implements the following steps by running the computer program stored in the memory 601:
when the hardware abstraction layer of the mobile terminal receives the to-be-processed image buffer block transmitted by the specified application, analyzing the to-be-processed image buffer block to obtain the description information carried by the to-be-processed image buffer block, and recording the description information as the to-be-processed description information, wherein the description information is used for describing the size and the type of the image buffer block;
if the to-be-processed description information is matched with preset initial description information, determining a target pipeline flow in more than one pipeline flow according to the type of the to-be-processed image buffer block indicated by the to-be-processed description information;
replacing the to-be-processed description information carried by the to-be-processed image buffer block with preset target description information, wherein the target description information is used for describing the real size;
calling the target pipeline flow to process the replaced image buffer block to be processed to obtain a processed image buffer block;
replacing the target description information carried by the processed image buffer block with the description information to be processed;
and transmitting the processed image buffer block carrying the to-be-processed description information to the specified application.
Assuming that the above is the first possible implementation, in a second possible implementation provided on the basis of the first possible implementation, the processor 602, by executing the above computer program stored in the memory 601, implements the following further steps:
when the starting event of the specified application is monitored, the initial description information and the target description information are configured;
and respectively creating a corresponding real-time pipeline flow and an offline pipeline flow for each camera 603 of the mobile terminal.
In a third possible implementation manner provided on the basis of the second possible implementation manner, the configuring the initial description information and the target description information when the start event of the specified application is monitored includes:
when the starting event of the specified application is monitored, acquiring the image size corresponding to each camera 603 of the mobile terminal;
comparing the image sizes corresponding to the cameras 603 to determine the maximum size;
configuring the initial description information based on the maximum size;
and configuring the target description information based on the image size corresponding to each camera.
In a fourth possible implementation manner provided on the basis of the second possible implementation manner, the determining, among more than one pipeline flow, a target pipeline flow according to the type of the to-be-processed image buffer block indicated by the to-be-processed description information includes:
acquiring the current shooting magnification of the mobile terminal;
determining a target camera according to the shooting magnification;
and determining a target pipeline flow in more than one pipeline flow according to the target camera and the type of the to-be-processed image buffer block indicated by the to-be-processed description information.
In a fifth possible implementation manner provided on the basis of the fourth possible implementation manner, the types of the image buffer block include an output type and an input type, where the output type is used to instruct the hardware abstraction layer to perform an image data filling operation, and the input type is used to instruct the hardware abstraction layer to perform an image data conversion operation; accordingly, the determining a target pipeline flow among more than one pipeline flow according to the target camera and the type of the to-be-processed image buffer block indicated by the to-be-processed description information includes:
if the type of the to-be-processed image buffer block indicated by the to-be-processed description information is an output type, determining the real-time pipeline flow of the target camera as the target pipeline flow;
and if the type of the to-be-processed image buffer block indicated by the to-be-processed description information is an input type, determining the offline pipeline flow of the target camera as the target pipeline flow.
In a sixth possible implementation manner provided on the basis of the fifth possible implementation manner, if the type of the to-be-processed image buffer block indicated by the to-be-processed description information is an output type, the processor 602 further implements the following steps when executing the computer program stored in the memory 601:
the hardware abstraction layer writes the real size into metadata associated with the processed image buffer block to instruct an algorithmic processing service to perform an image data synthesis operation on the processed image buffer block based on the metadata.
It should be understood that, in the embodiment of the present application, the processor 602 may be a Central Processing Unit (CPU), and may also be another general-purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor.
The memory 601 may include a read-only memory and a random access memory, and provides instructions and data to the processor 602. Some or all of the memory 601 may also include a non-volatile random access memory. For example, the memory 601 may also store device class information.
As can be seen from the above, in the embodiment of the present application, although images acquired by different types of cameras differ in size, by configuring the APP and the HAL to transmit image buffer blocks based on the maximum size and changing the described size of an image buffer block to the real size when a pipeline flow is called, raw images of different sizes can all be transmitted through the same channel. That is, the mobile terminal can transmit raw images of different sizes during night scene shooting, so that optical zoom is supported when shooting night scenes.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated, and in practical applications, the above-mentioned functions may be distributed as different functional units and modules according to needs, that is, the internal structure of the apparatus may be divided into different functional units or modules to implement all or part of the above-mentioned functions. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working processes of the units and modules in the system may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative units and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or as combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and the design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the above-described system embodiments are merely illustrative, and for example, the division of the above-described modules or units is only one logical functional division, and in actual implementation, there may be another division, for example, multiple units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
The integrated unit may be stored in a computer-readable storage medium if it is implemented in the form of a software functional unit and sold or used as a separate product. Based on such understanding, all or part of the flow in the methods of the embodiments described above can be realized by a computer program, which can be stored in a computer-readable storage medium and, when executed by a processor, realizes the steps of the method embodiments described above. The computer program includes computer program code, which may be in source code form, object code form, an executable file, or some intermediate form. The computer-readable storage medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disk, a computer-readable memory, a Read-Only Memory (ROM), a Random Access Memory (RAM), an electrical carrier signal, a telecommunication signal, a software distribution medium, and the like. It should be noted that the contents of the computer-readable storage medium can be appropriately increased or decreased according to the requirements of legislation and patent practice in the jurisdiction; for example, in some jurisdictions, according to legislation and patent practice, the computer-readable storage medium does not include electrical carrier signals and telecommunication signals.
The above embodiments are only used to illustrate the technical solutions of the present application, and not to limit the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present application and are intended to be included within the scope of the present application.

Claims (10)

1. An image data processing method is applied to a mobile terminal, the mobile terminal is provided with more than two cameras, and the image data processing method comprises the following steps:
when a hardware abstraction layer of the mobile terminal receives a to-be-processed image buffer block transmitted by a specified application, analyzing the to-be-processed image buffer block, obtaining description information carried by the to-be-processed image buffer block, and recording the description information as to-be-processed description information, wherein the description information is used for describing the size and the type of the image buffer block;
if the to-be-processed description information is matched with preset initial description information, determining a target pipeline flow in more than one pipeline flow according to the type of the to-be-processed image buffer block indicated by the to-be-processed description information;
replacing the to-be-processed description information carried by the to-be-processed image buffer block with preset target description information, wherein the target description information is used for describing the real size;
calling the target pipeline flow to process the replaced image buffer block to be processed to obtain a processed image buffer block;
replacing the target description information carried by the processed image buffer block with the to-be-processed description information;
and transmitting the processed image buffer block carrying the to-be-processed description information to the specified application.
2. The image data processing method according to claim 1, characterized in that the image data processing method further comprises:
when a starting event of the specified application is monitored, configuring the initial description information and the target description information;
and respectively creating a corresponding real-time pipeline flow and an offline pipeline flow for each camera of the mobile terminal.
3. The image data processing method according to claim 2, wherein the configuring the initial description information and the target description information when the start event of the specified application is monitored comprises:
when a starting event of the specified application is monitored, acquiring an image size corresponding to each camera of the mobile terminal;
comparing the image sizes corresponding to the cameras to determine the maximum size;
configuring the initial description information based on the maximum size;
and configuring the target description information based on the image size corresponding to each camera.
4. The image data processing method according to claim 2, wherein said determining a target pipeline flow among more than one pipeline flows according to the type of the to-be-processed image buffer block indicated by the to-be-processed description information comprises:
acquiring the current shooting magnification of the mobile terminal;
determining a target camera according to the shooting magnification;
and determining a target pipeline flow in more than one pipeline flow according to the target camera and the type of the buffer block of the image to be processed indicated by the description information to be processed.
5. The image data processing method according to claim 4, wherein the type of the image buffer block includes an output type and an input type, wherein the output type is used for instructing the hardware abstraction layer to perform an image data padding operation, and the input type is used for instructing the hardware abstraction layer to perform an image data conversion operation; correspondingly, the determining a target pipeline flow in more than one pipeline flows according to the target camera and the type of the to-be-processed image buffer block indicated by the to-be-processed description information includes:
if the type of the buffer block of the image to be processed indicated by the description information to be processed is an output type, determining a real-time pipeline flow of the target camera as the target pipeline flow;
and if the type of the buffer block of the image to be processed indicated by the description information to be processed is an input type, determining the offline pipeline flow of the target camera as the target pipeline flow.
6. The image data processing method according to claim 5, wherein if the type of the to-be-processed image buffer block indicated by the to-be-processed description information is an output type, the image data processing method further comprises:
the hardware abstraction layer writes the real size to metadata associated with the processed image buffer block to instruct an algorithmic processing service to perform an image data synthesis operation on the processed image buffer block based on the metadata.
7. An image data processing apparatus applied to a mobile terminal having two or more cameras, the image data processing apparatus comprising:
the analysis unit is used for analyzing the image buffer block to be processed when a hardware abstraction layer of the mobile terminal receives the image buffer block to be processed transmitted by a specified application, obtaining description information carried by the image buffer block to be processed and recording the description information as description information to be processed, wherein the description information is used for describing the size and the type of the image buffer block;
the determining unit is used for determining a target pipeline flow in more than one pipeline flow according to the type of the to-be-processed image buffer block indicated by the to-be-processed description information if the to-be-processed description information is matched with preset initial description information;
the first replacing unit is used for replacing the to-be-processed description information carried by the to-be-processed image buffer block with preset target description information, wherein the target description information is used for describing the real size;
the calling unit is used for calling the target pipeline flow to process the replaced image buffer block to be processed to obtain a processed image buffer block;
the second replacing unit is used for replacing the target description information carried by the processed image buffer block with the description information to be processed;
a sending unit, configured to transmit the processed image buffer block carrying the to-be-processed description information to the specified application.
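Collapsing the six units of claim 7 into a single handler gives the following end-to-end sketch, reusing the hypothetical BufferType, PipelineFlow, CameraPipelines, and SelectTargetPipeline from the earlier sketches. The essential move is the swap of description information around the pipeline call: the application only ever sees the initial, maximum-size description, while the pipeline works with the real size.

```cpp
// Description information: size plus buffer type, as parsed from a buffer.
struct Description { int width; int height; BufferType type; };

struct ImageBuffer {
    Description desc;
    // ... pixel storage elided ...
};

class ImageDataProcessor {
public:
    ImageDataProcessor(Description initial, Description target)
        : initial_(initial), target_(target) {}

    // One pass through the six units of claim 7.
    void OnBufferFromApp(ImageBuffer& buf, const CameraPipelines& cam) {
        Description pending = buf.desc;                  // parsing unit
        if (!SizeMatches(pending, initial_)) return;     // no match: leave as-is
        PipelineFlow flow =
            SelectTargetPipeline(cam, pending.type);     // determining unit
        buf.desc = target_;                              // first replacing unit
        Process(flow, buf);                              // calling unit
        buf.desc = pending;                              // second replacing unit
        SendToApp(buf);                                  // sending unit
    }

private:
    static bool SizeMatches(const Description& a, const Description& b) {
        return a.width == b.width && a.height == b.height;
    }
    void Process(PipelineFlow, ImageBuffer&) { /* invoke the pipeline */ }
    void SendToApp(ImageBuffer&) { /* return buffer to the application */ }

    Description initial_;  // preset initial description (maximum size)
    Description target_;   // preset target description (real size)
};
```

The swap-back performed by the second replacing unit is what keeps the application's view of the buffer stable while raw frames of differing real sizes pass through the pipeline.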
8. The image data processing apparatus according to claim 7, characterized in that the image data processing apparatus further comprises:
a configuration unit, configured to configure the initial description information when a start event of the specified application is monitored;
and a creating unit, configured to respectively create a corresponding real-time pipeline flow and a corresponding offline pipeline flow for each camera of the mobile terminal.
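A sketch of claim 8's creating unit, again reusing the hypothetical types above: on the specified application's start event, one real-time and one offline pipeline flow are created per camera. In a real HAL each PipelineFlow would be configured against the camera's streams; here they are default-constructed placeholders.

```cpp
#include <vector>

// Create a real-time and an offline pipeline flow for each camera.
std::vector<CameraPipelines> CreatePipelinesForAllCameras(int cameraCount) {
    std::vector<CameraPipelines> pipelines;
    pipelines.reserve(static_cast<size_t>(cameraCount));
    for (int i = 0; i < cameraCount; ++i) {
        pipelines.push_back(CameraPipelines{PipelineFlow{}, PipelineFlow{}});
    }
    return pipelines;
}
```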
9. A mobile terminal comprising a memory, a processor, two or more cameras, and a computer program stored in the memory and executable on the processor, wherein the steps of the method according to any of claims 1 to 6 are implemented when the computer program is executed by the processor.
10. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, carries out the steps of the method according to any one of claims 1 to 6.
CN202010023765.2A 2020-01-09 2020-01-09 Image data processing method, image data processing device and mobile terminal Active CN111193869B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010023765.2A CN111193869B (en) 2020-01-09 2020-01-09 Image data processing method, image data processing device and mobile terminal

Publications (2)

Publication Number Publication Date
CN111193869A 2020-05-22
CN111193869B CN111193869B (en) 2021-08-06

Family

ID=70709991

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010023765.2A Active CN111193869B (en) 2020-01-09 2020-01-09 Image data processing method, image data processing device and mobile terminal

Country Status (1)

Country Link
CN (1) CN111193869B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115474061A (en) * 2021-06-10 2022-12-13 广州视源电子科技股份有限公司 Image data transmission method and device, terminal equipment and storage medium

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050140787A1 (en) * 2003-11-21 2005-06-30 Michael Kaplinsky High resolution network video camera with massively parallel implementation of image processing, compression and network server
CN104065937A (en) * 2014-06-20 2014-09-24 中国电子科技集团公司第四十四研究所 Real-time high-speed image pre-processing method for CMOS image sensor
CN106775902A (en) * 2017-01-25 2017-05-31 北京奇虎科技有限公司 A kind of method and apparatus of image procossing, mobile terminal
CN107391106A (en) * 2017-06-09 2017-11-24 深圳市金立通信设备有限公司 The initial method and terminal of camera parameter
CN108833804A (en) * 2018-09-20 2018-11-16 Oppo广东移动通信有限公司 Imaging method, device and electronic equipment
CN109167930A (en) * 2018-10-11 2019-01-08 Oppo广东移动通信有限公司 Image display method, device, electronic equipment and computer readable storage medium
CN109194855A (en) * 2018-09-20 2019-01-11 Oppo广东移动通信有限公司 Imaging method, device and electronic equipment
WO2019060430A1 (en) * 2017-09-21 2019-03-28 Qualcomm Incorporated Fully extensible camera processing pipeline interface

Also Published As

Publication number Publication date
CN111193869B (en) 2021-08-06

Similar Documents

Publication Number Title
CN111225153B (en) Image data processing method, image data processing device and mobile terminal
WO2021175055A1 (en) Video processing method and related device
CN111917988B (en) Remote camera application method, system and medium of cloud mobile phone
TWI595786B (en) Timestamp-based audio and video processing method and system thereof
US20140244858A1 (en) Communication system and relaying device
CN109068059B (en) Method for calling camera, mobile terminal and storage medium
US20180352089A1 (en) High-Quality Audio/Visual Conferencing
WO2021218318A1 (en) Video transmission method, electronic device and computer readable medium
KR20080103929A (en) Encoding multi-media signals
CN115484403B (en) Video recording method and related device
CN110913138B (en) Continuous shooting method, device, terminal and computer readable storage medium
CN108174084A (en) panoramic video processing method and terminal device
CN111988526B (en) Mobile terminal and image data processing method
JP2019079468A (en) Image processing system, control method therefor, and program
CN111193869B (en) Image data processing method, image data processing device and mobile terminal
JP2013175824A (en) Electronic camera
CN102754448A (en) Data processing unit and data encoding device
CN104618675B (en) Kinescope method and device
US7620236B2 (en) Image processing method and apparatus
CN115278306B (en) Video editing method and device
CN114666477A (en) Video data processing method, device, equipment and storage medium
US9392036B2 (en) Terminal device and communication system
CN111741222A (en) Image generation method, image generation device and terminal equipment
WO2019091423A1 (en) An image‐processing microprocessor for supporting an application processor and multiple cameras
JP2006148972A (en) Communications terminal with camera

Legal Events

Code Title
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant