CN110958399B - High dynamic range image HDR realization method and related product


Info

Publication number: CN110958399B
Application number: CN201911253913.3A
Authority: CN (China)
Prior art keywords: hdr, camera, camera application, module, party
Legal status: Active (assumed status; not a legal conclusion)
Other languages: Chinese (zh)
Other versions: CN110958399A
Inventors: 杨平平, 方攀, 陈岩
Current Assignee: Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee: Guangdong Oppo Mobile Telecommunications Corp Ltd
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN201911253913.3A
Publication of CN110958399A
Application granted
Publication of CN110958399B

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70: Circuitry for compensating brightness variation in the scene
    • H04N23/73: Circuitry for compensating brightness variation in the scene by influencing the exposure time
    • H04N23/741: Circuitry for compensating brightness variation in the scene by increasing the dynamic range of the image compared to the dynamic range of the electronic image sensors

Abstract

The embodiments of the present application disclose a high dynamic range image (HDR) implementation method and a related product, applied to an electronic device. The electronic device includes an operating system and a media service module; a third-party camera application is provided at the application layer of the operating system, and a camera hardware abstraction module is provided at the hardware abstraction layer of the operating system. The method includes: the third-party camera application sends an HDR processing request to the camera hardware abstraction module; the camera hardware abstraction module acquires first camera application data to be processed, calls an algorithm module that implements the HDR function to process the first camera application data into second camera application data, and sends the second camera application data to a preset target; and the preset target achieves the HDR effect according to the second camera application data. This helps the third-party camera application achieve the HDR effect and improves the proportion of usable shots.

Description

High dynamic range image HDR realization method and related product
Technical Field
The present application relates to the technical field of electronic devices, and in particular to a high dynamic range image HDR implementation method and a related product.
Background
As technology advances, users expect photos to reproduce the real scene ever more faithfully. To meet this demand, the High-Dynamic Range (HDR) image has emerged. An HDR image is generated, by an image synthesis technique, from multiple Low-Dynamic Range (LDR) images taken with different exposure times. Compared with an ordinary image, an HDR image provides a wider dynamic range and more image detail, and better reflects the visual effect of the real environment.
On the existing Android platform, a third-party camera application has only basic access to the underlying layer. If more underlying functions are needed, there is no corresponding standard for mapping the underlying capability to third-party access. As a result, the HDR path available to an upper-layer third-party camera application is usually time-consuming and costly, the matched image regions are relatively small, and the exposure-fusion result is poor.
Disclosure of Invention
The embodiments of the present application provide a high dynamic range image HDR implementation method and a related product, aiming to improve the efficiency and convenience with which a third-party camera application implements HDR.
In a first aspect, an embodiment of the present application provides a method for implementing a high dynamic range image HDR, applied to an electronic device, where the electronic device includes an operating system and a media service module, a third-party camera application is provided at the application layer of the operating system, and a camera hardware abstraction module is provided at the hardware abstraction layer of the operating system; the method includes:
the third-party camera application sends an HDR processing request to the camera hardware abstraction module;
the camera hardware abstraction module receives the HDR processing request, acquires first camera application data to be processed, calls an algorithm module that implements the HDR function to process the first camera application data into second camera application data, and sends the second camera application data to a preset target, where the algorithm module that implements the HDR function is opened to the third-party camera application by the operating system at the request of the third-party camera application, made through the media service module;
the preset target achieves the HDR effect according to the second camera application data.
In a second aspect, an embodiment of the present application provides an apparatus for implementing a high dynamic range image HDR, applied to an electronic device, where the electronic device includes an operating system and a media service module, a third-party camera application is provided at the application layer of the operating system, and a camera hardware abstraction module is provided at the hardware abstraction layer of the operating system; the HDR implementation apparatus includes a processing unit, a communication unit, and a storage unit, wherein
the processing unit is configured for the third-party camera application to send an HDR processing request to the camera hardware abstraction module; for the camera hardware abstraction module to receive the HDR processing request, acquire first camera application data to be processed, call an algorithm module that implements the HDR function to process the first camera application data into second camera application data, and send the second camera application data to a preset target, where the algorithm module that implements the HDR function is opened to the third-party camera application by the operating system at the request of the third-party camera application made through the media service module; and for the preset target to achieve the HDR effect according to the second camera application data.
In a third aspect, an embodiment of the present application provides an electronic device, including a processor, a memory, a communication interface, and one or more programs, where the one or more programs are stored in the memory and configured to be executed by the processor, and the programs include instructions for executing the steps of any method of the first aspect of the embodiments of the present application.
In a fourth aspect, an embodiment of the present application provides a computer-readable storage medium, where the computer-readable storage medium stores a computer program for electronic data exchange, and the computer program causes a computer to perform part or all of the steps described in any method of the first aspect of the present application.
In a fifth aspect, an embodiment of the present application provides a computer program product, where the computer program product includes a non-transitory computer-readable storage medium storing a computer program, and the computer program is operable to cause a computer to perform part or all of the steps described in any method of the first aspect of the present application. The computer program product may be a software installation package.
It can be seen that the embodiments of the present application provide a high dynamic range image HDR implementation method and a related product, applied to an electronic device, where the electronic device includes an operating system and a media service module, a third-party camera application is provided at the application layer of the operating system, and a camera hardware abstraction module is provided at the hardware abstraction layer of the operating system. The method includes: the third-party camera application sends an HDR processing request to the camera hardware abstraction module; the camera hardware abstraction module receives the HDR processing request, acquires first camera application data to be processed, calls an algorithm module that implements the HDR function to process the first camera application data into second camera application data, and sends the second camera application data to a preset target, where the algorithm module that implements the HDR function is opened to the third-party camera application by the operating system at the request of the third-party camera application made through the media service module; and the preset target achieves the HDR effect according to the second camera application data. In this way, the media platform makes full use of the underlying hardware capability, which helps the third-party camera application achieve the HDR effect, enriches the HDR effects it can produce, improves the proportion of usable shots, and reduces maintenance cost.
Drawings
To illustrate the technical solutions in the embodiments of the present application or in the prior art more clearly, the drawings needed to describe the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the present application, and a person skilled in the art may derive other drawings from them without creative effort.
FIG. 1 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
FIG. 2A is a schematic flowchart of a method for implementing a high dynamic range image HDR according to an embodiment of the present application;
FIG. 2B is a schematic flowchart of a method for enabling an algorithm module of the HDR function according to an embodiment of the present application;
FIG. 3 is a schematic flowchart of another method for implementing a high dynamic range image HDR according to an embodiment of the present application;
FIG. 4 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
FIG. 5 is a block diagram of the functional units of an apparatus for implementing a high dynamic range image HDR according to an embodiment of the present application.
Detailed Description
To make the technical solutions of the present application better understood, the technical solutions in the embodiments of the present application are described clearly and completely below with reference to the drawings in the embodiments. Obviously, the described embodiments are only some, not all, of the embodiments of the present application. All other embodiments obtained by a person skilled in the art based on the embodiments of the present application without creative effort shall fall within the protection scope of the present application.
The terms "first," "second," and the like in the description and claims of the present application and in the above-described drawings are used for distinguishing between different objects and not for describing a particular order. Furthermore, the terms "include" and "have," as well as any variations thereof, are intended to cover non-exclusive inclusions. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those steps or elements listed, but may alternatively include other steps or elements not listed, or inherent to such process, method, article, or apparatus.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the application. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. It is explicitly and implicitly understood by one skilled in the art that the embodiments described herein can be combined with other embodiments.
The electronic device in the embodiments of the present application may include various handheld devices with wireless communication functions, vehicle-mounted devices, wearable devices, computing devices, or other processing devices connected to a wireless modem, as well as various forms of user equipment (UE), mobile stations (MS), terminal devices, and the like.
At present, on the Android platform a third-party camera application can access the underlying camera data through the standard Android application programming interface (API). However, if it wants to use more of the enhanced functions of the underlying layer, or images processed by the underlying algorithms, there is no corresponding standard interface that maps the underlying capability to third-party access. The photographing schemes of today's popular handset manufacturers generally select an MF-HDR mode, while other applications mostly involve video scenarios, where the interval between the corresponding requests to the underlying module is longer, the exposure interval is longer, and the motion between frames is larger; as a result, the HDR implementation process takes longer, the matched image regions are relatively small, and the exposure-fusion result is poor.
In view of the foregoing problems, the embodiments of the present application provide a method for processing camera data and a related apparatus. The embodiments of the present application are described in detail below with reference to the accompanying drawings.
Referring to FIG. 1, FIG. 1 is a schematic structural diagram of an electronic device provided in an embodiment of the present application, applied to an electronic device 100. The electronic device 100 includes an operating system and a media service module; the operating system may be an Android system; a third-party camera application is provided at the application layer of the operating system, and a camera hardware abstraction module is provided at the hardware abstraction layer of the operating system. In addition, the hardware abstraction layer of the operating system is further provided with a media policy module and an algorithm management module, and the camera hardware abstraction module is communicatively connected with the media policy module. The native architecture of the operating system further includes a framework layer, a driver layer, a hardware layer, and the like. The framework layer includes the application interfaces of the various native applications (such as the native camera application programming interface), the application services (such as the native camera service), and a framework-layer interface (such as the Google HAL3 interface); the hardware abstraction layer includes hardware-abstraction-layer interfaces (such as HAL3.0) and the hardware abstraction modules of the various native applications (such as the camera hardware abstraction module); the driver layer includes various drivers (such as the screen display driver and the audio driver) and is used to enable the various hardware of the electronic device (such as the image signal processor ISP and the front-end image sensor).
The media service module is independent of the operating system. The third-party camera application can communicate with the media service module through a media management module, and the media service module can communicate with the media policy module through the Android-native information link formed by the application interface, the application service, the framework-layer interface, the hardware-abstraction-layer interface, and the hardware abstraction module. The media policy module communicates with the algorithm management module; the algorithm management module maintains the Android-native algorithm library, which contains the enhancement functions supported by the various native applications; for the native camera application, enhancement functions such as binocular shooting, beautification, sharpening, and night vision are supported. In addition, the media service module can also communicate directly with the media policy module or the algorithm management module.
Based on the above architecture, the media service module can enable an algorithm module in the algorithm library through the Android-native information link, the media policy module, and the algorithm management module, or enable an algorithm module in the algorithm library directly through the algorithm management module, thereby opening the enhancement functions associated with native applications to the third-party camera application.
Based on the above architecture, the media service module may also invoke the corresponding driver to enable certain hardware through the Android-native information link, through a first information link composed of the media policy module and the camera hardware abstraction module, or through a second information link composed of the media policy module, the algorithm management module, and the camera hardware abstraction module, thereby opening the hardware associated with native applications to the third-party camera application.
The following describes embodiments of the present application in detail.
Referring to FIG. 2A, FIG. 2A is a schematic flowchart of a method for implementing a high dynamic range image HDR according to an embodiment of the present application, applied to an electronic device, where the electronic device includes an operating system and a media service module, a third-party camera application is provided at the application layer of the operating system, and a camera hardware abstraction module is provided at the hardware abstraction layer of the operating system; the method includes the following steps:
S201: the third-party camera application sends an HDR processing request to the camera hardware abstraction module.
S202: the camera hardware abstraction module receives the HDR processing request, acquires first camera application data to be processed, calls an algorithm module that implements the HDR function to process the first camera application data into second camera application data, and sends the second camera application data to a preset target.
The preset target includes the third-party camera application.
The algorithm module that implements the HDR function is opened to the third-party camera application by the operating system at the request of the third-party camera application, made through the media service module.
The first camera application data includes a control command that instructs the processing; when data processing is actually performed, the data processed may be image data.
S203: the preset target achieves the HDR effect according to the second camera application data.
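To make the interaction in steps S201 to S203 easier to follow, the following minimal Java sketch models the request path with plain interfaces. It is an illustration only: the type and method names (HdrAlgorithmModule, PresetTarget, CameraHalModule, and so on) are assumptions made for this sketch and are not interfaces defined by the Android platform or by this application.

```java
// Minimal sketch of the S201-S203 flow; all names are illustrative assumptions.
interface HdrAlgorithmModule {
    byte[] process(byte[] firstCameraData);          // HDR algorithm module opened via the media service module
}

interface PresetTarget {
    void achieveHdrEffect(byte[] secondCameraData);  // S203: the preset target consumes the processed data
}

class CameraHalModule {
    private final HdrAlgorithmModule algorithm;
    private final PresetTarget target;

    CameraHalModule(HdrAlgorithmModule algorithm, PresetTarget target) {
        this.algorithm = algorithm;
        this.target = target;
    }

    // S202: receive the HDR processing request, take the first camera application data,
    // run the HDR algorithm module, and forward the second camera application data.
    void onHdrProcessingRequest(byte[] firstCameraData) {
        byte[] secondCameraData = algorithm.process(firstCameraData);
        target.achieveHdrEffect(secondCameraData);
    }
}

class ThirdPartyCameraApp {
    // S201: the third-party camera application issues the HDR processing request.
    void requestHdr(CameraHalModule hal, byte[] pendingData) {
        hal.onHdrProcessingRequest(pendingData);
    }
}
```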
It can be seen that the embodiments of the present application provide a high dynamic range image HDR implementation method and a related product, applied to an electronic device, where the electronic device includes an operating system and a media service module, a third-party camera application is provided at the application layer of the operating system, and a camera hardware abstraction module is provided at the hardware abstraction layer of the operating system. The method includes: the third-party camera application sends an HDR processing request to the camera hardware abstraction module; the camera hardware abstraction module receives the HDR processing request, acquires first camera application data to be processed, calls an algorithm module that implements the HDR function to process the first camera application data into second camera application data, and sends the second camera application data to a preset target, where the algorithm module that implements the HDR function is opened to the third-party camera application by the operating system at the request of the third-party camera application made through the media service module; and the preset target achieves the HDR effect according to the second camera application data. In this way, the media platform makes full use of the underlying hardware capability, which helps the third-party camera application achieve the HDR effect, enriches the HDR effects it can produce, improves the proportion of usable shots, and reduces maintenance cost.
In one possible example, the camera hardware abstraction module acquiring the first camera application data to be processed includes: the camera hardware abstraction module calling a camera driver to acquire local image data through a camera; and/or the camera hardware abstraction module receiving network image data sent by the third-party camera application.
In one specific implementation, the camera hardware abstraction module calls the camera driver to acquire local image data I, image data II, and image data III through the camera.
In another specific implementation, the camera hardware abstraction module calls the camera driver to acquire first local image data through the camera and, at the same time, receives first network image data sent by the third-party camera application.
As can be seen, in this example, the camera hardware abstraction module acquires the first camera application data to be processed, where the first camera application data includes local image data acquired by the camera driver through the camera and/or network image data sent by the third-party camera application, which helps to diversify the sources of the image data.
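As a rough illustration of the two data sources described in this example, the sketch below lets a hypothetical HAL-side helper combine frames captured through the camera driver with network frames pushed by the third-party application. The CameraDriver interface and all method names are assumptions made for the sketch, not part of the described system.

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of gathering the first camera application data from both sources; names are assumptions.
interface CameraDriver {
    List<byte[]> captureLocalFrames(int count);      // local image data acquired through the camera
}

class FirstCameraDataCollector {
    private final CameraDriver driver;
    private final List<byte[]> networkFrames = new ArrayList<>();  // network image data from the third-party app

    FirstCameraDataCollector(CameraDriver driver) {
        this.driver = driver;
    }

    void onNetworkImageData(byte[] frame) {          // "and/or" path: network image data
        networkFrames.add(frame);
    }

    List<byte[]> collect(int localCount) {
        List<byte[]> all = new ArrayList<>(driver.captureLocalFrames(localCount));
        all.addAll(networkFrames);                   // both sources feed the HDR algorithm module
        return all;
    }
}
```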
In one possible example, the preset target achieving the HDR effect according to the second camera application data includes: when the preset target is the third-party camera application, the third-party camera application receiving the second camera application data; the third-party camera application detecting long-exposure frame data and short-exposure frame data in the second camera application data; the third-party camera application performing exposure fusion on the long-exposure frame data and the short-exposure frame data to generate an HDR data frame; and the third-party camera application encoding the HDR data frame to generate an HDR video and/or photo.
The third-party camera application performs image matching on the acquired images and determines the image pixels; if it is determined that the image is over-exposed, the lower value of the HDR frame data is selected; if it is determined that the image is under-exposed, the higher value of the HDR frame data is selected; and if the exposure of the image is determined to be normal, the average value of the HDR frame data is selected.
The third-party camera application encoding the HDR data frame to generate an HDR video and/or photo includes: the third-party camera application encodes the HDR data frame into a plurality of JPEG files, performs sharpness detection on the plurality of JPEG files, and selects the sharpest JPEG for encoding into the final picture, as sketched below.
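The sharpness selection just described can be sketched as follows. The variance-of-gradient measure used here over decoded grayscale pixels is only an assumed stand-in for whatever sharpness detection the application actually uses; class and method names are likewise illustrative.

```java
// Sketch of picking the sharpest of several decoded candidates. The sharpness measure
// (mean squared horizontal gradient over grayscale pixels) is an illustrative assumption.
final class SharpestFrameSelector {
    static double sharpness(int[] gray, int width, int height) {
        double sum = 0;
        long count = 0;
        for (int y = 0; y < height; y++) {
            for (int x = 1; x < width; x++) {
                int diff = gray[y * width + x] - gray[y * width + x - 1];  // horizontal gradient
                sum += (double) diff * diff;
                count++;
            }
        }
        return count == 0 ? 0 : sum / count;   // larger value means more high-frequency detail
    }

    // Returns the index of the candidate with the highest sharpness score.
    static int selectSharpest(int[][] candidates, int width, int height) {
        int best = 0;
        double bestScore = Double.NEGATIVE_INFINITY;
        for (int i = 0; i < candidates.length; i++) {
            double score = sharpness(candidates[i], width, height);
            if (score > bestScore) {
                bestScore = score;
                best = i;
            }
        }
        return best;
    }
}
```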
In a specific implementation, after receiving the second camera application data, the third-party camera application detects that the second camera application data includes long-exposure frame data and short-exposure frame data; the third-party camera application performs exposure fusion on the long-exposure frame data and the short-exposure frame data to generate an HDR data frame; if the third-party camera application determines that the image is over-exposed, the lower value of the frame data is selected; and an HDR photo is generated by encoding the HDR data frame.
As can be seen, in this example, the preset target achieves the HDR effect according to the second camera application data: when the preset target is the third-party camera application, the third-party camera application receives the second camera application data, detects long-exposure frame data and short-exposure frame data in it, performs exposure fusion on the long-exposure frame data and the short-exposure frame data to generate an HDR data frame, and encodes the HDR data frame to generate an HDR video and/or photo. This allows the third-party camera application to flexibly perform HDR processing of videos or photos, and helps improve the accuracy and convenience of the HDR implementation.
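The per-pixel selection rule described in this example (take the lower value where the scene is over-exposed, the higher value where it is under-exposed, and the average otherwise) can be sketched as follows. Frames are modeled as 8-bit grayscale arrays and the two thresholds are assumptions chosen for illustration; they are not values specified by this application.

```java
// Sketch of fusing a long-exposure and a short-exposure frame into one HDR data frame.
// Frames are 8-bit grayscale arrays; the thresholds are illustrative assumptions.
final class ExposureFusion {
    private static final int OVEREXPOSED_ABOVE = 240;   // assumed clipping threshold
    private static final int UNDEREXPOSED_BELOW = 16;   // assumed noise-floor threshold

    static int[] fuse(int[] longExposure, int[] shortExposure) {
        if (longExposure.length != shortExposure.length) {
            throw new IllegalArgumentException("frames must be image-matched to the same size");
        }
        int[] hdrFrame = new int[longExposure.length];
        for (int i = 0; i < hdrFrame.length; i++) {
            int lo = longExposure[i];
            int sh = shortExposure[i];
            if (Math.max(lo, sh) >= OVEREXPOSED_ABOVE) {
                hdrFrame[i] = Math.min(lo, sh);          // over-exposed: keep the lower value
            } else if (Math.min(lo, sh) <= UNDEREXPOSED_BELOW) {
                hdrFrame[i] = Math.max(lo, sh);          // under-exposed: keep the higher value
            } else {
                hdrFrame[i] = (lo + sh) / 2;             // normal exposure: keep the average
            }
        }
        return hdrFrame;
    }
}
```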
In one possible example, the preset target achieving the HDR effect according to the second camera application data includes: when the preset target is the camera hardware abstraction module, the camera hardware abstraction module detecting long-exposure frame data and short-exposure frame data in the second camera application data; the camera hardware abstraction module performing exposure fusion on the long-exposure frame data and the short-exposure frame data to generate an HDR data frame; and the camera hardware abstraction module encoding the HDR data frame to generate an HDR video and/or photo.
The camera hardware abstraction module performs image matching on the acquired images and determines the image pixels; if it is determined that the image is over-exposed, the lower value of the HDR frame data is selected; if it is determined that the image is under-exposed, the higher value of the HDR frame data is selected; and if the exposure of the image is determined to be normal, the average value of the HDR frame data is selected.
After the camera hardware abstraction module encodes the HDR data frame to generate the HDR video and/or photo, the method further includes: the camera hardware abstraction module sending the processed HDR file to the third-party application (APP).
In a specific implementation, the camera hardware abstraction module detects long-exposure frame data and short-exposure frame data in the second camera application data, performs exposure fusion on the long-exposure frame data and the short-exposure frame data to generate an HDR data frame, and encodes the HDR data frame to generate an HDR video and/or photo.
As can be seen, in this example, when the preset target is the camera hardware abstraction module, the camera hardware abstraction module detects long-exposure frame data and short-exposure frame data in the second camera application data, performs exposure fusion on them to generate an HDR data frame, and encodes the HDR data frame to generate an HDR video and/or photo. This helps reduce the bandwidth used by the HDR implementation and limits the degree of free control given to the third-party application, thereby improving the security of the underlying layer.
In one possible example, before the third-party camera application sends the HDR processing request to the camera hardware abstraction module, the method further includes: the third-party camera application acquiring, from the media service module, the preset underlying-camera capability list of the electronic device; the third-party camera application querying the underlying-camera capability list through the media service module to obtain all the underlying core capabilities that support the third-party camera application; the third-party camera application determining that the underlying core capabilities include the HDR function and setting a preset function configuration character for the media service module; the media service module sending the preset function configuration character to the camera hardware abstraction module; and the camera hardware abstraction module opening, for the third-party application, the usage permission of the algorithm module of the HDR function according to the preset function configuration character.
The hardware abstraction layer of the operating system is further provided with a media policy module and an algorithm management module; the algorithm management module manages the algorithm module library, and the camera hardware abstraction module is communicatively connected with the media policy module.
In a specific implementation, the third-party camera application acquires the preset underlying-camera capability list of the electronic device from the media service module, queries the underlying-camera capability list to obtain all the underlying core capabilities that support the third-party camera application, determines that the underlying core capabilities include the HDR function, and sets a preset function configuration character for the media service module; the media service module sends the preset function configuration character to the camera hardware abstraction module, and the camera hardware abstraction module then opens, for the third-party application, the usage permission of the algorithm module of the HDR function according to the preset function configuration character.
As can be seen, in this example, the third-party camera application acquires the preset underlying-camera capability list of the electronic device from the media service module, queries the list through the media service module for all the underlying core capabilities that support the third-party camera application, determines that they include the HDR function, and sets a preset function configuration character for the media service module; the media service module sends the preset function configuration character to the camera hardware abstraction module, which opens, for the third-party application, the usage permission of the algorithm module of the HDR function according to the preset function configuration character. This preprocessing facilitates flexible adaptation of the algorithm to different scenarios while opening the HDR function to the third-party application.
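To show how this preprocessing might look from the application side, the sketch below queries an assumed capability list, checks it for HDR, and hands a configuration character down through the media service module. The MediaServiceModule interface, its methods, and the capability-string check are all assumptions made for the sketch.

```java
import java.util.List;

// Sketch of the pre-processing flow: query the underlying capability list and, if HDR is
// present, pass a preset function configuration character toward the camera HAL module.
interface MediaServiceModule {
    List<String> getUnderlyingCameraCapabilities(String appId);  // preset underlying-camera capability list
    void setFunctionConfiguration(String configCharacter);       // forwarded to the camera hardware abstraction module
}

final class HdrCapabilityNegotiator {
    static boolean enableHdrIfSupported(MediaServiceModule mediaService, String appId) {
        List<String> capabilities = mediaService.getUnderlyingCameraCapabilities(appId);
        boolean hdrSupported = capabilities.stream().anyMatch(c -> c.toUpperCase().contains("HDR"));
        if (hdrSupported) {
            // The configuration character tells the HAL to open the HDR algorithm module
            // to this third-party application; the string format follows the later JSON example.
            mediaService.setFunctionConfiguration("{\"HDR-Video\": 1}");
        }
        return hdrSupported;
    }
}
```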
In one possible example, the hardware abstraction layer of the operating system is further provided with a media policy module and an algorithm management module, and the algorithm management module is used to manage the algorithm modules in the algorithm library; the camera hardware abstraction module opening, for the third-party application, the usage permission of the algorithm module of the HDR function according to the preset function configuration character includes: the camera hardware abstraction module receiving the preset function configuration character from the media service module and sending a first HDR parameter to the media policy module; the media policy module converting the first HDR parameter into a second HDR parameter recognizable by the algorithm management module and sending the second HDR parameter to the algorithm management module; the algorithm management module receiving the second HDR parameter and opening, for the third-party application, the usage permission of the algorithm module of the HDR function in the algorithm library according to the second HDR parameter; the camera hardware abstraction module, after the usage permission has been opened, selecting an HDR linear communication model; and the camera hardware abstraction module enabling parsing of the hardware HDR algorithm according to the HDR linear communication model.
The second HDR parameter can be recognized and processed by the underlying module.
The camera hardware abstraction module selecting the HDR linear communication model means that the camera hardware abstraction module selects a suitable underlying software HDR linear communication model for the configuration of the upper layer.
In a specific implementation, as shown in FIG. 2B, after receiving the preset function configuration character from the media service module, the camera hardware abstraction module sends a first HDR parameter to the media policy module; the media policy module converts the first HDR parameter into a second HDR parameter recognizable by the algorithm management module and sends the second HDR parameter to the algorithm management module; the algorithm management module receives the second HDR parameter and opens, for the third-party application, the usage permission of the algorithm module of the HDR function in the algorithm library according to the second HDR parameter; after the usage permission has been opened, the camera hardware abstraction module selects an HDR linear communication model and enables parsing of the hardware HDR algorithm according to the HDR linear communication model.
As can be seen, in this example, the camera hardware abstraction module receives the preset function configuration character from the media service module and sends a first HDR parameter to the media policy module; the media policy module converts the first HDR parameter into a second HDR parameter recognizable by the algorithm management module and sends it to the algorithm management module; the algorithm management module receives the second HDR parameter and opens, for the third-party application, the usage permission of the algorithm module of the HDR function in the algorithm library according to the second HDR parameter; after the usage permission has been opened, the camera hardware abstraction module selects an HDR linear communication model and enables parsing of the hardware HDR algorithm according to the HDR linear communication model. This helps improve how well the HDR algorithm is adapted to the upper-layer module and improves the efficiency with which the third-party camera application uses HDR.
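The parameter hand-off in this example can be sketched as a short chain of plain classes: the HAL emits a first HDR parameter, the media policy module translates it into a form the algorithm management module accepts, and the HAL then selects a linear communication model. Everything here, including the string-based parameter encoding, is an assumption made purely to illustrate the ordering of the steps.

```java
// Sketch of the first-parameter to second-parameter conversion chain; all names and the
// string-based encoding are illustrative assumptions.
final class MediaPolicyModule {
    // Convert the first HDR parameter into a second HDR parameter the underlying module can recognize.
    String convert(String firstHdrParameter) {
        return "algo:" + firstHdrParameter.trim().toLowerCase();
    }
}

final class AlgorithmManagementModule {
    private boolean hdrPermissionOpened = false;

    void openHdrPermission(String secondHdrParameter) {
        // Open the algorithm module of the HDR function in the algorithm library for the third-party application.
        hdrPermissionOpened = secondHdrParameter.startsWith("algo:");
    }

    boolean isHdrPermissionOpened() {
        return hdrPermissionOpened;
    }
}

final class CameraHalHdrSetup {
    String selectedModel;

    void configure(MediaPolicyModule policy, AlgorithmManagementModule algoMgmt, String firstHdrParameter) {
        String secondHdrParameter = policy.convert(firstHdrParameter);
        algoMgmt.openHdrPermission(secondHdrParameter);
        if (algoMgmt.isHdrPermissionOpened()) {
            // After the permission is opened, select an HDR linear communication model suited to the
            // upper-layer configuration and enable parsing of the hardware HDR algorithm.
            selectedModel = "linear-communication-model";
        }
    }
}
```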
In one possible example, the third-party camera application determining that the underlying core capabilities include the HDR function and setting the preset function configuration character for the media service module includes: the media service module collecting, in JavaScript Object Notation (JSON), all the underlying core capabilities supported by the third-party camera application; the third-party camera application receiving the upper-case character string sent by the media service module and determining that the underlying core capabilities include the HDR function; when the third-party camera application determines that a camera video-shooting service is requested, the third-party camera application setting the JSON string {"HDR-Video": 1} for the media service module; and when the third-party camera application determines that a camera photo-taking service is requested, the third-party camera application setting the JSON string {"HDR-Capture": 1} for the media service module.
The JSON string {"HDR-Video": 1} is used to indicate that the media platform supports the HDR video capability for the third-party camera application.
The JSON string {"HDR-Capture": 1} is used to indicate that the media platform supports the HDR photo capability for the third-party camera application.
If the third-party camera application receives an empty string sent by the media service module, it determines that the underlying core capabilities do not include the HDR function.
In a specific implementation, the third-party camera application sends, to the media service module, a request for the media platform to determine all the underlying core capabilities supported by the third-party camera application; the media service module collects, in JSON, all the underlying core capabilities supported by the third-party camera application; the third-party camera application receives the upper-case character string sent by the media service module and determines that the underlying core capabilities include the HDR function; and when the third-party camera application determines that a camera video-shooting service is requested, it sets the JSON string {"HDR-Video": 1} for the media service module so that an adapted HDR scheme can subsequently be configured and selected for the third-party camera application.
In another specific implementation, the third-party camera application sends, to the media service module, a request for the media platform to determine all the underlying core capabilities supported by the third-party camera application; the media service module collects, in JSON, all the underlying core capabilities supported by the third-party camera application; the third-party camera application receives the upper-case character string sent by the media service module and determines that the underlying core capabilities include the HDR function; and when the third-party camera application determines that a camera photo-taking service is requested, it sets the JSON string {"HDR-Capture": 1} for the media service module so that an adapted HDR scheme can subsequently be configured and selected for the third-party camera application.
As can be seen, in this example, the third-party camera application sends to the media service module a request to determine that the underlying core capabilities include the HDR function, and different preset character strings are set according to the different services of the third-party camera application. This makes full use of the underlying hardware capability to achieve the HDR effect; at the same time, different kinds of HDR functions can be converted into an MF-HDR method adapted to the third-party camera application, which helps improve the proportion of usable video shots, and more HDR data frames are generated in the same amount of time, so that HDR motion jitter is reduced at a high frame rate and the sharpness of the result is improved.
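For illustration, the following sketch simply chooses between the two JSON configuration strings mentioned above according to the requested camera service. The ServiceType enum and the helper class are assumptions; the string contents follow the examples given in this embodiment.

```java
// Sketch of selecting the JSON configuration string for the media service module;
// the enum and helper are illustrative assumptions.
public final class HdrJsonConfig {
    enum ServiceType { VIDEO, PHOTO, NONE }

    static String forService(ServiceType service) {
        switch (service) {
            case VIDEO:
                return "{\"HDR-Video\": 1}";    // media platform supports HDR video for this application
            case PHOTO:
                return "{\"HDR-Capture\": 1}";  // media platform supports HDR photo for this application
            default:
                return "";                      // empty string: HDR is not among the underlying core capabilities
        }
    }

    public static void main(String[] args) {
        System.out.println(forService(ServiceType.VIDEO));
        System.out.println(forService(ServiceType.PHOTO));
    }
}
```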
Consistent with the embodiment shown in FIG. 2A, referring to FIG. 3, FIG. 3 is a schematic flowchart of another method for implementing a high dynamic range image HDR provided in an embodiment of the present application, applied to an electronic device, where the electronic device includes an operating system and a media service module, a third-party camera application is provided at the application layer of the operating system, and a camera hardware abstraction module is provided at the hardware abstraction layer of the operating system; as shown in the figure, the method for implementing the high dynamic range image HDR includes:
S301: the third-party camera application sends an HDR processing request to the camera hardware abstraction module.
S302: the camera hardware abstraction module receives the HDR processing request, acquires first camera application data to be processed, calls an algorithm module that implements the HDR function to process the first camera application data into second camera application data, and sends the second camera application data to the third-party camera application.
S303: after receiving the second camera application data, the third-party camera application detects that the second camera application data includes long-exposure frame data and short-exposure frame data.
S304: the third-party camera application performs exposure fusion on the long-exposure frame data and the short-exposure frame data to generate an HDR data frame.
S305: the third-party camera application encodes the HDR data frame to generate an HDR video and/or photo.
It can be seen that the embodiments of the present application provide a high dynamic range image HDR implementation method and a related product, applied to an electronic device, where the electronic device includes an operating system and a media service module, a third-party camera application is provided at the application layer of the operating system, and a camera hardware abstraction module is provided at the hardware abstraction layer of the operating system. The method includes: the third-party camera application sends an HDR processing request to the camera hardware abstraction module; the camera hardware abstraction module receives the HDR processing request, acquires first camera application data to be processed, calls an algorithm module that implements the HDR function to process the first camera application data into second camera application data, and sends the second camera application data to the third-party camera application, where the algorithm module that implements the HDR function is opened to the third-party camera application by the operating system at the request of the third-party camera application made through the media service module; and the third-party camera application achieves the HDR effect according to the second camera application data. In this way, the media platform makes full use of the underlying hardware capability, which helps the third-party camera application achieve the HDR effect, enriches the HDR effects it can produce, improves the proportion of usable shots, and reduces maintenance cost.
In addition, after receiving the second camera application data, the third-party camera application detects that the second camera application data includes long-exposure frame data and short-exposure frame data, performs exposure fusion on the long-exposure frame data and the short-exposure frame data to generate an HDR data frame, and encodes the HDR data frame to generate an HDR video and/or photo. This allows the third-party camera application to flexibly perform HDR processing of videos or photos and helps improve the accuracy and convenience of the HDR implementation.
Consistent with the embodiments shown in FIG. 1, FIG. 2A, and FIG. 3, referring to FIG. 4, FIG. 4 is a schematic structural diagram of an electronic device 400 provided in an embodiment of the present application. As shown in the figure, the electronic device 400 includes an application processor 410, a memory 420, a communication interface 430, and one or more programs 421, where the one or more programs 421 are stored in the memory 420 and configured to be executed by the application processor 410, and the one or more programs 421 include instructions for performing the following steps:
the third-party camera application sends an HDR processing request to the camera hardware abstraction module;
the camera hardware abstraction module receives the HDR processing request, acquires first camera application data to be processed, calls an algorithm module that implements the HDR function to process the first camera application data into second camera application data, and sends the second camera application data to the third-party camera application, where the algorithm module that implements the HDR function is opened to the third-party camera application by the operating system at the request of the third-party camera application made through the media service module;
the third-party camera application achieves the HDR effect according to the second camera application data.
It can be seen that the embodiments of the present application provide a high dynamic range image HDR implementation method and a related product, applied to an electronic device, where the electronic device includes an operating system and a media service module, a third-party camera application is provided at the application layer of the operating system, and a camera hardware abstraction module is provided at the hardware abstraction layer of the operating system. The method includes: the third-party camera application sends an HDR processing request to the camera hardware abstraction module; the camera hardware abstraction module receives the HDR processing request, acquires first camera application data to be processed, calls an algorithm module that implements the HDR function to process the first camera application data into second camera application data, and sends the second camera application data to a preset target, where the algorithm module that implements the HDR function is opened to the third-party camera application by the operating system at the request of the third-party camera application made through the media service module; and the preset target achieves the HDR effect according to the second camera application data. In this way, the media platform makes full use of the underlying hardware capability, which helps the third-party camera application achieve the HDR effect, enriches the HDR effects it can produce, improves the proportion of usable shots, and reduces maintenance cost.
In one possible example, in terms of the camera hardware abstraction module acquiring the first camera application data to be processed, the instructions in the programs are specifically configured to perform the following operations: the camera hardware abstraction module calls a camera driver to acquire local image data through a camera; and/or the camera hardware abstraction module receives network image data sent by the third-party camera application.
In one possible example, in terms of the preset target achieving the HDR effect according to the second camera application data, the instructions in the programs are specifically configured to perform the following operations: when the preset target is the third-party camera application, the third-party camera application receives the second camera application data; the third-party camera application detects long-exposure frame data and short-exposure frame data in the second camera application data; the third-party camera application performs exposure fusion on the long-exposure frame data and the short-exposure frame data to generate an HDR data frame; and the third-party camera application encodes the HDR data frame to generate an HDR video and/or photo.
In one possible example, in terms of the preset target achieving the HDR effect according to the second camera application data, the instructions in the programs are specifically configured to perform the following operations: when the preset target is the camera hardware abstraction module, the camera hardware abstraction module detects long-exposure frame data and short-exposure frame data in the second camera application data; the camera hardware abstraction module performs exposure fusion on the long-exposure frame data and the short-exposure frame data to generate an HDR data frame; and the camera hardware abstraction module encodes the HDR data frame to generate an HDR video and/or photo.
In one possible example, before the third-party camera application sends the HDR processing request to the camera hardware abstraction module, the instructions in the programs are further configured to perform the following operations: the third-party camera application acquires, from the media service module, the preset underlying-camera capability list of the electronic device; the third-party camera application queries the underlying-camera capability list through the media service module to obtain all the underlying core capabilities that support the third-party camera application; the third-party camera application determines that the underlying core capabilities include the HDR function and sets a preset function configuration character for the media service module; the media service module sends the preset function configuration character to the camera hardware abstraction module; and the camera hardware abstraction module opens, for the third-party application, the usage permission of the HDR function according to the preset function configuration character.
In one possible example, the hardware abstraction layer of the operating system is further provided with a media policy module and an algorithm management module, the camera hardware abstraction module is connected with the media policy module, and the media policy module is connected with the algorithm management module; in terms of the camera hardware abstraction module opening, for the third-party application, the usage permission of the HDR function according to the preset function configuration character, the instructions in the programs are specifically configured to perform the following operations: the camera hardware abstraction module receives the preset function configuration character from the media service module and sends a first HDR parameter to the media policy module; the media policy module converts the first HDR parameter into a second HDR parameter recognizable by the algorithm management module and sends the second HDR parameter to the algorithm management module; the algorithm management module receives the second HDR parameter and opens, for the third-party application, the usage permission of the algorithm module of the HDR function in the algorithm library according to the second HDR parameter; after the usage permission has been opened, the camera hardware abstraction module selects an HDR linear communication model; and the camera hardware abstraction module enables parsing of the hardware HDR algorithm according to the HDR linear communication model.
In one possible example, in terms of the third-party camera application determining that the underlying core capabilities include the HDR function and setting the preset function configuration character for the media service module, the instructions in the programs are specifically configured to perform the following operations: the media service module collects, in JavaScript Object Notation (JSON), all the underlying core capabilities supported by the third-party camera application; the third-party camera application receives the upper-case character string sent by the media service module and determines that the underlying core capabilities include the HDR function; when the third-party camera application determines that a camera video-shooting service is requested, the third-party camera application sets the JSON string {"HDR-Video": 1} for the media service module, where the JSON string {"HDR-Video": 1} is used to indicate that the media platform supports the HDR video capability for the third-party camera application; and when the third-party camera application determines that a camera photo-taking service is requested, the third-party camera application sets the JSON string {"HDR-Capture": 1} for the media service module, where the JSON string {"HDR-Capture": 1} is used to indicate that the media platform supports the HDR photo capability for the third-party camera application.
The foregoing describes the solutions of the embodiments of the present application mainly from the perspective of the method-side execution process. It can be understood that, to implement the above functions, the electronic device includes corresponding hardware structures and/or software modules for performing the respective functions. A person skilled in the art will readily appreciate that the units and algorithm steps of the examples described in connection with the embodiments provided herein can be implemented by hardware or by a combination of hardware and computer software. Whether a function is performed by hardware or by computer software driving hardware depends on the particular application and the design constraints of the technical solution. A skilled person may implement the described functionality in different ways for each particular application, but such implementations should not be considered to go beyond the scope of the present application.
In the embodiments of the present application, the electronic device may be divided into functional units according to the above method examples; for example, each functional unit may correspond to one function, or two or more functions may be integrated into one processing unit. The integrated unit may be implemented in the form of hardware or in the form of a software functional unit. It should be noted that the division of units in the embodiments of the present application is schematic and is merely a logical function division; other division manners are possible in actual implementation.
FIG. 5 is a block diagram of the functional units of an apparatus 500 for implementing a high dynamic range image HDR according to an embodiment of the present application. The HDR implementation apparatus 500 is applied to an electronic device, where the electronic device includes an operating system and a media service module, a third-party camera application is provided at the application layer of the operating system, and a camera hardware abstraction module is provided at the hardware abstraction layer of the operating system. The HDR implementation apparatus 500 includes a processing unit 501, a communication unit 502, and a storage unit 503, wherein
the processing unit 501 is configured for the third-party camera application to send an HDR processing request to the camera hardware abstraction module; for the camera hardware abstraction module to receive the HDR processing request, acquire first camera application data to be processed, call an algorithm module that implements the HDR function to process the first camera application data into second camera application data, and send the second camera application data to the third-party camera application, where the algorithm module that implements the HDR function is opened to the third-party camera application by the operating system at the request of the third-party camera application made through the media service module; and for the third-party camera application to achieve the HDR effect according to the second camera application data.
It can be seen that the embodiments of the present application provide a high dynamic range image HDR implementation method and a related product, applied to an electronic device, where the electronic device includes an operating system and a media service module, a third-party camera application is provided at the application layer of the operating system, and a camera hardware abstraction module is provided at the hardware abstraction layer of the operating system. The method includes: the third-party camera application sends an HDR processing request to the camera hardware abstraction module; the camera hardware abstraction module receives the HDR processing request, acquires first camera application data to be processed, calls an algorithm module that implements the HDR function to process the first camera application data into second camera application data, and sends the second camera application data to a preset target, where the algorithm module that implements the HDR function is opened to the third-party camera application by the operating system at the request of the third-party camera application made through the media service module; and the preset target achieves the HDR effect according to the second camera application data. In this way, the media platform makes full use of the underlying hardware capability, which helps the third-party camera application achieve the HDR effect, enriches the HDR effects it can produce, improves the proportion of usable shots, and reduces maintenance cost.
It can be understood that, since the method embodiments and the apparatus embodiments are different presentations of the same technical concept, the content of the method embodiments in the present application applies correspondingly to the apparatus embodiments and is not repeated here.
In a possible example, in terms of the third-party camera application acquiring the bottom-layer camera capability list preset by the electronic device from the media service module, the processing unit 501 is specifically configured for: the third-party camera application to send a media platform version acquisition request carrying an authentication code to the media service module; the media service module to verify the authentication code; the media service module, after the verification passes, to return the media platform version information to the third-party camera application; the third-party camera application to send a request for the bottom-layer camera capability list preset by the electronic device to the media service module; and the media service module to collect the bottom-layer camera capability list and send it to the third-party camera application.
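A minimal sketch of this authenticated query sequence is shown below. The MediaServiceClient interface, its method names, and the null-on-failure convention are assumptions made purely for illustration and are not part of this application.

// Hypothetical client-side view of the version check and capability query.
import java.util.List;

interface MediaServiceClient {
    String getPlatformVersion(String authCode);       // media service verifies the code, returns version info
    List<String> getUnderlyingCameraCapabilities();    // preset bottom-layer camera capability list
}

public final class CapabilityQuery {
    public static List<String> fetchCapabilities(MediaServiceClient media, String authCode) {
        // Step 1: request the media platform version, carrying the authentication code.
        String version = media.getPlatformVersion(authCode);
        if (version == null) {
            throw new IllegalStateException("authentication failed, no capability list available");
        }
        // Step 2: with a verified session, request the preset bottom-layer camera capability list.
        return media.getUnderlyingCameraCapabilities();
    }
}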
In one possible example, before the third-party camera application sends an HDR processing request to the camera hardware abstraction module, the processing unit 501 is further specifically configured for: the third-party camera application to acquire the bottom-layer camera capability list preset by the electronic device from the media service module; the third-party camera application to query the bottom-layer camera capability list from the media service module to acquire all the bottom-layer core capabilities supporting the third-party camera application; the third-party camera application to determine that all the bottom-layer core capabilities comprise the HDR function and to set a preset function configuration character for the media service module; the media service module to send the preset function configuration character to the camera hardware abstraction module; and the camera hardware abstraction module to open the use permission of the algorithm module of the HDR function to the third-party application according to the preset function configuration character.
In a possible example, in terms of the third-party camera application acquiring the bottom-layer camera capability list preset by the electronic device from the media service module, the processing unit 501 is specifically configured for: the third-party camera application to send a media platform version acquisition request carrying an authentication code to the media service module; the media service module to verify the authentication code; the media service module, after the verification passes, to return the media platform version information to the third-party camera application; and the third-party camera application to acquire the bottom-layer camera capability list preset by the electronic device through the media service module.
In a possible example, in terms of the third-party camera application determining that all the bottom-layer core capabilities comprise the HDR function and setting a preset function configuration character for the media service module, the processing unit 501 is specifically configured for: the media service module to collect all the bottom-layer core capabilities supported for the third-party camera application in JavaScript Object Notation (JSON); the third-party camera application to receive the character string sent by the media service module and to determine that all the bottom-layer core capabilities comprise the HDR function; the third-party camera application, when it determines to request the camera for a video shooting service, to set the JSON string {"HDR-Video": 1} to the media service module, wherein the JSON string {"HDR-Video": 1} indicates that the media platform supports the HDR video capability for the third-party camera application; and the third-party camera application, when it determines to request the camera for a photo taking service, to set the JSON string {"HDR-Capture": 1} to the media service module, wherein the JSON string {"HDR-Capture": 1} indicates that the media platform supports the HDR photo capability for the third-party camera application.
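The two configuration strings quoted above are ordinary JSON objects, so a sketch of how an application might build them with Android's bundled org.json package follows; whether the actual media service module accepts exactly this representation is an assumption of this illustration.

// Builds the preset function configuration character as a JSON string.
import org.json.JSONException;
import org.json.JSONObject;

public final class HdrConfig {
    /** Returns {"HDR-Video": 1} for a video request, {"HDR-Capture": 1} for a photo request. */
    public static String buildConfigCharacter(boolean videoService) throws JSONException {
        JSONObject config = new JSONObject();
        config.put(videoService ? "HDR-Video" : "HDR-Capture", 1);
        return config.toString();
    }
}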
In one possible example, in terms of the camera hardware abstraction module acquiring the first camera application data to be processed, the processing unit 501 is specifically configured for: the camera hardware abstraction module to call the camera driver to acquire local image data through the camera; and/or the camera hardware abstraction module to receive network image data sent by the third-party camera application.
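As a small illustration of the two data paths just described, the following sketch picks between driver-captured data and data supplied by the application; the FrameSource interface and the selection rule are invented for this example and are not specified by the application.

// Hypothetical selection between the local camera driver path and the network image path.
interface FrameSource { byte[] nextFrame(); }

public final class FirstCameraData {
    /** Prefer image data pushed by the third-party camera application; otherwise capture via the driver. */
    public static byte[] acquire(FrameSource cameraDriver, byte[] networkImageData) {
        if (networkImageData != null) {
            return networkImageData;      // network image data sent by the third-party camera application
        }
        return cameraDriver.nextFrame();  // local image data acquired through the camera driver
    }
}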
In a possible example, in terms of the preset target achieving an HDR effect according to the second camera application data, the processing unit 501 is specifically configured for: when the preset target is the third-party camera application, the third-party camera application to receive the second camera application data; the third-party camera application to detect long-exposure frame data and short-exposure frame data in the second camera application data; the third-party camera application to perform exposure fusion on the long-exposure frame data and the short-exposure frame data to generate an HDR data frame; and the third-party camera application to encode the HDR data frame to generate an HDR video and/or photo.
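The exposure-fusion step can be pictured with a deliberately simplified sketch that blends one long-exposure and one short-exposure 8-bit frame per pixel. The actual algorithm module used by the platform is not disclosed in this application, so this is only an illustrative stand-in.

// Simplified per-pixel exposure fusion of two grayscale frames with values in 0..255.
public final class ExposureFusion {
    /** Dark pixels trust the long exposure; bright (likely clipped) pixels trust the short one. */
    public static int[] fuse(int[] longExposure, int[] shortExposure) {
        int[] hdrFrame = new int[longExposure.length];
        for (int i = 0; i < longExposure.length; i++) {
            // Weight in [0,1]: the brighter the long-exposure pixel, the more weight
            // is shifted to the short exposure to recover highlight detail.
            double w = longExposure[i] / 255.0;
            hdrFrame[i] = (int) Math.round((1.0 - w) * longExposure[i] + w * shortExposure[i]);
        }
        return hdrFrame;
    }
}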
In a possible example, in terms of the preset target achieving an HDR effect according to the second camera application data, the processing unit 501 is specifically configured for: when the preset target is the camera hardware abstraction module, the camera hardware abstraction module to detect long-exposure frame data and short-exposure frame data in the second camera application data; the camera hardware abstraction module to perform exposure fusion on the long-exposure frame data and the short-exposure frame data to generate an HDR data frame; and the camera hardware abstraction module to encode the HDR data frame to generate an HDR video and/or photo.
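For the final encoding step, a sketch of packing a fused ARGB frame into a JPEG photo with Android's Bitmap API is given below. The frame layout, the output path, and the quality value are illustration choices, not requirements of this application, which does not specify the encoder.

// Encodes a fused ARGB_8888 HDR data frame (one int per pixel) as a JPEG file on Android.
import android.graphics.Bitmap;
import java.io.FileOutputStream;
import java.io.IOException;

public final class HdrPhotoEncoder {
    public static void encode(int[] hdrFrame, int width, int height, String path) throws IOException {
        Bitmap bitmap = Bitmap.createBitmap(hdrFrame, width, height, Bitmap.Config.ARGB_8888);
        try (FileOutputStream out = new FileOutputStream(path)) {
            bitmap.compress(Bitmap.CompressFormat.JPEG, 95, out); // 95 = quality, an arbitrary choice
        }
    }
}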
In one possible example, before the third-party camera application sends an HDR processing request to the camera hardware abstraction module, the processing unit 501 is further specifically configured for: the third-party camera application to acquire the bottom-layer camera capability list preset by the electronic device from the media service module; the third-party camera application to query the bottom-layer camera capability list from the media service module to acquire all the bottom-layer core capabilities supporting the third-party camera application; the third-party camera application to determine that all the bottom-layer core capabilities comprise the HDR function and to set a preset function configuration character for the media service module; the media service module to send the preset function configuration character to the camera hardware abstraction module; and the camera hardware abstraction module to open the use permission of the HDR function to the third-party application according to the preset function configuration character.
In one possible example, the hardware abstraction layer of the operating system is further provided with a media policy module and an algorithm management module, the camera hardware abstraction module is connected to the media policy module, and the media policy module is connected to the algorithm management module; in terms of the camera hardware abstraction module opening the use permission of the HDR function to the third-party application according to the preset function configuration character, the processing unit 501 is specifically configured for: the camera hardware abstraction module to receive the preset function configuration character from the media service module and to send a first HDR parameter to the media policy module; the media policy module to convert the first HDR parameter into a second HDR parameter recognizable by the algorithm management module and to send the second HDR parameter to the algorithm management module; the algorithm management module to receive the second HDR parameter and to open the use permission of the algorithm module of the HDR function in the algorithm library to the third-party application according to the second HDR parameter; the camera hardware abstraction module, after the use permission is opened, to select an HDR linear communication model; and the camera hardware abstraction module to enable parsing of the hardware HDR algorithm according to the HDR linear communication model.
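The parameter hand-off chain described above can be sketched as follows. The parameter classes, the module interfaces, and the rule used to derive the second HDR parameter from the configuration character are all invented for illustration; the application does not define the content of either parameter.

// Hypothetical hand-off: camera HAL -> media policy module -> algorithm management module.
final class FirstHdrParam  { final String configCharacter; FirstHdrParam(String c) { configCharacter = c; } }
final class SecondHdrParam { final boolean video; SecondHdrParam(boolean v) { video = v; } }

interface AlgorithmManager { void openHdrAlgorithm(SecondHdrParam param); }

final class MediaPolicyModule {
    private final AlgorithmManager algorithmManager;
    MediaPolicyModule(AlgorithmManager algorithmManager) { this.algorithmManager = algorithmManager; }

    /** Convert the HAL-level parameter into one the algorithm manager understands and forward it. */
    void forward(FirstHdrParam first) {
        SecondHdrParam second = new SecondHdrParam(first.configCharacter.contains("HDR-Video"));
        algorithmManager.openHdrAlgorithm(second);
    }
}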
In a possible example, in terms of the third-party camera application determining that all the bottom-layer core capabilities comprise the HDR function and setting a preset function configuration character for the media service module, the processing unit 501 is specifically configured for: the media service module to collect all the bottom-layer core capabilities supported for the third-party camera application in JavaScript Object Notation (JSON); the third-party camera application to receive the character string sent by the media service module and to determine that all the bottom-layer core capabilities comprise the HDR function; the third-party camera application, when it determines to request the camera for a video shooting service, to set the JSON string {"HDR-Video": 1} to the media service module, wherein the JSON string {"HDR-Video": 1} indicates that the media platform supports the HDR video capability for the third-party camera application; and the third-party camera application, when it determines to request the camera for a photo taking service, to set the JSON string {"HDR-Capture": 1} to the media service module, wherein the JSON string {"HDR-Capture": 1} indicates that the media platform supports the HDR photo capability for the third-party camera application.
Embodiments of the present application also provide a computer storage medium, where the computer storage medium stores a computer program for electronic data exchange, the computer program enabling a computer to execute part or all of the steps of any one of the methods described in the above method embodiments, and the computer includes an electronic device.
Embodiments of the present application also provide a computer program product comprising a non-transitory computer-readable storage medium storing a computer program, the computer program being operable to cause a computer to perform some or all of the steps of any of the methods described in the above method embodiments. The computer program product may be a software installation package, and the computer includes an electronic device.
It should be noted that, for simplicity of description, the above-mentioned method embodiments are described as a series of acts or combination of acts, but those skilled in the art will recognize that the present application is not limited by the order of acts described, as some steps may occur in other orders or concurrently depending on the application. Further, those skilled in the art should also appreciate that the embodiments described in the specification are preferred embodiments and that the acts and modules referred to are not necessarily required in this application.
In the foregoing embodiments, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative; for instance, the division of the units is only a division by logical function, and other divisions may be adopted in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted or not executed. In addition, the mutual coupling, direct coupling or communication connection shown or discussed may be indirect coupling or communication connection through some interfaces, apparatuses or units, and may be in electrical or other forms.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit may be stored in a computer-readable memory if it is implemented in the form of a software functional unit and sold or used as a stand-alone product. Based on such understanding, the technical solution of the present application, in essence, or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product stored in a memory, which includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the methods of the embodiments of the present application. The aforementioned memory includes various media capable of storing program code, such as a USB flash drive, a Read-Only Memory (ROM), a Random Access Memory (RAM), a removable hard disk, a magnetic disk, or an optical disk.
Those skilled in the art will appreciate that all or part of the steps in the methods of the above embodiments may be implemented by related hardware instructed by a program, and the program may be stored in a computer-readable memory, which may include a flash memory disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, an optical disk, and the like.
The embodiments of the present application have been described in detail above, and specific examples are used herein to illustrate the principles and implementations of the present application; the above description of the embodiments is only intended to help understand the method and core concept of the present application. Meanwhile, for a person skilled in the art, there may be variations in the specific implementations and the application scope according to the idea of the present application. In summary, the content of this specification should not be construed as limiting the present application.

Claims (10)

1. A high dynamic range image HDR realization method, applied to an electronic device, wherein the electronic device comprises an operating system and a media service module, the media service module is provided independently of the operating system, an application layer of the operating system is provided with a third-party camera application, and a hardware abstraction layer of the operating system is provided with a camera hardware abstraction module; the method comprises:

the third-party camera application sending an HDR processing request to the camera hardware abstraction module;

the camera hardware abstraction module receives the HDR processing request, acquires first camera application data to be processed, calls an algorithm module realizing the HDR function to process the first camera application data to obtain second camera application data, and sends the second camera application data to a preset target, wherein the algorithm module realizing the HDR function is opened to the third-party camera application by the operating system upon a request sent by the third-party camera application through the media service module; and

the preset target achieves an HDR effect according to the second camera application data.
2. The method of claim 1, wherein the camera hardware abstraction module acquiring the first camera application data to be processed comprises:

the camera hardware abstraction module calls a camera driver to acquire local image data through a camera; and/or

the camera hardware abstraction module receives network image data sent by the third-party camera application.
3. The method of claim 1, wherein the preset target achieving an HDR effect according to the second camera application data comprises:

when the preset target is the third-party camera application, the third-party camera application receives the second camera application data;

the third-party camera application detects long-exposure frame data and short-exposure frame data in the second camera application data;

the third-party camera application performs exposure fusion on the long-exposure frame data and the short-exposure frame data to generate an HDR data frame; and

the third-party camera application encodes the HDR data frame to generate an HDR video and/or photo.
4. The method of claim 1, wherein the preset target achieving an HDR effect according to the second camera application data comprises:

when the preset target is the camera hardware abstraction module, the camera hardware abstraction module detects long-exposure frame data and short-exposure frame data in the second camera application data;

the camera hardware abstraction module performs exposure fusion on the long-exposure frame data and the short-exposure frame data to generate an HDR data frame; and

the camera hardware abstraction module encodes the HDR data frame to generate an HDR video and/or photo.
5. The method of claim 1, wherein prior to the third party camera application sending an HDR processing request to the camera hardware abstraction module, the method further comprises:
the third-party camera application acquires a bottom-layer camera capability list preset by the electronic device from the media service module;

the third-party camera application queries the bottom-layer camera capability list from the media service module to acquire all the bottom-layer core capabilities supporting the third-party camera application;

the third-party camera application determines that all the bottom-layer core capabilities comprise the HDR function, and sets a preset function configuration character for the media service module;

the media service module sends the preset function configuration character to the camera hardware abstraction module; and

the camera hardware abstraction module opens the use permission of the HDR function to the third-party application according to the preset function configuration character.
6. The method according to claim 5, wherein a media policy module and an algorithm management module are further provided at the hardware abstraction layer of the operating system, the camera hardware abstraction module is connected to the media policy module, and the media policy module is connected to the algorithm management module; the camera hardware abstraction module opening the use permission of the HDR function to the third-party application according to the preset function configuration character comprises:
the camera hardware abstraction module receives the preset function configuration character from the media service module and sends a first HDR parameter to the media policy module;

the media policy module converts the first HDR parameter into a second HDR parameter recognizable by the algorithm management module and sends the second HDR parameter to the algorithm management module;

the algorithm management module receives the second HDR parameter, and opens the use permission of the algorithm module of the HDR function in the algorithm library to the third-party application according to the second HDR parameter;

after the use permission is opened, the camera hardware abstraction module selects an HDR linear communication model; and

the camera hardware abstraction module enables parsing of the hardware HDR algorithm according to the HDR linear communication model.
7. The method of claim 5, wherein the third-party camera application determining that all the bottom-layer core capabilities comprise the HDR function, and setting the preset function configuration character for the media service module, comprises:

the media service module collects all the bottom-layer core capabilities supported for the third-party camera application in JavaScript Object Notation (JSON);

the third-party camera application receives the character string sent by the media service module and determines that all the bottom-layer core capabilities comprise the HDR function;

when the third-party camera application determines to request the camera for a video shooting service, the third-party camera application sets the JSON string {"HDR-Video": 1} to the media service module, wherein the JSON string {"HDR-Video": 1} indicates that the media platform supports the HDR video capability for the third-party camera application; and

when the third-party camera application determines to request the camera for a photo taking service, the third-party camera application sets the JSON string {"HDR-Capture": 1} to the media service module, wherein the JSON string {"HDR-Capture": 1} indicates that the media platform supports the HDR photo capability for the third-party camera application.
8. A high dynamic range image HDR realization apparatus, applied to an electronic device, wherein the electronic device comprises an operating system and a media service module, the media service module is provided independently of the operating system, an application layer of the operating system is provided with a third-party camera application, and a hardware abstraction layer of the operating system is provided with a camera hardware abstraction module; the high dynamic range image HDR realization apparatus comprises a processing unit, a communication unit and a storage unit, wherein:
the processing unit is configured for the third-party camera application to send an HDR processing request to the camera hardware abstraction module; for the camera hardware abstraction module to receive the HDR processing request, acquire first camera application data to be processed, call an algorithm module realizing the HDR function to process the first camera application data to obtain second camera application data, and send the second camera application data to a preset target, wherein the algorithm module realizing the HDR function is opened to the third-party camera application by the operating system upon a request sent by the third-party camera application through the media service module; and for the preset target to achieve an HDR effect according to the second camera application data.
9. An electronic device, comprising a processor, a memory, a communication interface, and one or more programs stored in the memory and configured to be executed by the processor, the one or more programs comprising instructions for performing the steps in the method of any one of claims 1-7.
10. A computer-readable storage medium, wherein the computer-readable storage medium stores a computer program for electronic data exchange, and the computer program causes a computer to perform the method according to any one of claims 1-7.
CN201911253913.3A 2019-12-09 2019-12-09 High dynamic range image HDR realization method and related product Active CN110958399B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911253913.3A CN110958399B (en) 2019-12-09 2019-12-09 High dynamic range image HDR realization method and related product

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911253913.3A CN110958399B (en) 2019-12-09 2019-12-09 High dynamic range image HDR realization method and related product

Publications (2)

Publication Number Publication Date
CN110958399A CN110958399A (en) 2020-04-03
CN110958399B true CN110958399B (en) 2021-06-29

Family

ID=69980480

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911253913.3A Active CN110958399B (en) 2019-12-09 2019-12-09 High dynamic range image HDR realization method and related product

Country Status (1)

Country Link
CN (1) CN110958399B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI730746B (en) * 2020-04-30 2021-06-11 群邁通訊股份有限公司 Device and method for generating high dynamic range image, readable storage medium
CN112153282B (en) * 2020-09-18 2022-03-01 Oppo广东移动通信有限公司 Image processing chip, method, storage medium and electronic device
CN114745495B (en) * 2021-01-07 2023-06-23 北京小米移动软件有限公司 Image generation method, device and storage medium
CN113852762B (en) * 2021-09-27 2022-07-26 荣耀终端有限公司 Algorithm calling method and algorithm calling device

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102662665A (en) * 2012-03-29 2012-09-12 福州瑞芯微电子有限公司 Input subsystem enabling Android to support various sensors
CN103546689A (en) * 2013-10-11 2014-01-29 Tcl集团股份有限公司 Resolution ratio acquiring method and device of external camera of android system
CN108762815A (en) * 2018-05-16 2018-11-06 北京麟卓信息科技有限公司 A kind of Android running environment implementation methods based on non-virtualized architectural framework
CN108833804A (en) * 2018-09-20 2018-11-16 Oppo广东移动通信有限公司 Imaging method, device and electronic equipment
CN110177218A (en) * 2019-06-28 2019-08-27 广州鲁邦通物联网科技有限公司 A kind of image processing method of taking pictures of Android device

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101997912A (en) * 2010-10-27 2011-03-30 苏州凌霄科技有限公司 Mandatory access control device based on Android platform and control method thereof
CN103093777B (en) * 2012-12-28 2015-12-23 Tcl康钛汽车信息服务(深圳)有限公司 A kind of method and system adopting android system control DVD equipment
US10397322B2 (en) * 2017-11-21 2019-08-27 Sunmeet Singh Jolly Mobile and computer applications, systems and methods for large group travel and event management

Also Published As

Publication number Publication date
CN110958399A (en) 2020-04-03

Similar Documents

Publication Publication Date Title
CN110958399B (en) High dynamic range image HDR realization method and related product
CN109218628B (en) Image processing method, image processing device, electronic equipment and storage medium
KR102149187B1 (en) Electronic device and control method of the same
CN110753187B (en) Camera control method and device
CN109068059B (en) Method for calling camera, mobile terminal and storage medium
CN113727035B (en) Image processing method, system, electronic device and storage medium
CN111447370B (en) Camera access method, camera access device, terminal equipment and readable storage medium
CN110955541B (en) Data processing method, device, chip, electronic equipment and readable storage medium
CN114125284A (en) Image processing method, electronic device, and storage medium
CN111314606B (en) Photographing method and device, electronic equipment and storage medium
CN114286117A (en) Multi-platform multi-application live broadcast method and system, live broadcast equipment and storage medium
CN110990088B (en) Data processing method and related equipment
CN108898650B (en) Human-shaped material creating method and related device
CN114945019B (en) Data transmission method, device and storage medium
CN110941413B (en) Display screen generation method and related device
CN114186203A (en) Account authentication method and device
CN107682556B (en) Information display method and equipment
CN115908151A (en) Data processing method and device, computer equipment and storage medium
CN114630152A (en) Parameter transmission method and device for image processor and storage medium
CN116668773B (en) Method for enhancing video image quality and electronic equipment
CN111026893A (en) Intelligent terminal, image processing method and computer-readable storage medium
CN116723416B (en) Image processing method and electronic equipment
CN116668836B (en) Photographing processing method and electronic equipment
CN115767287B (en) Image processing method and electronic equipment
CN114630153B (en) Parameter transmission method and device for application processor and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant