CN110933275A - Photographing method and related equipment - Google Patents

Photographing method and related equipment

Info

Publication number
CN110933275A
CN110933275A (application number CN201911253067.5A; granted as CN110933275B)
Authority
CN
China
Prior art keywords
data
hardware abstraction
application data
abstraction layer
module
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201911253067.5A
Other languages
Chinese (zh)
Other versions
CN110933275B (en)
Inventor
陈岩
方攀
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN201911253067.5A priority Critical patent/CN110933275B/en
Publication of CN110933275A publication Critical patent/CN110933275A/en
Application granted granted Critical
Publication of CN110933275B publication Critical patent/CN110933275B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/57: Mechanical or electrical details of cameras or camera modules specially adapted for being embedded in other devices
    • H04N 23/60: Control of cameras or camera modules
    • H04N 23/66: Remote control of cameras or camera parts, e.g. by remote control devices
    • H04N 23/80: Camera processing pipelines; Components thereof
    • H04N 5/00: Details of television systems
    • H04N 5/222: Studio circuitry; Studio devices; Studio equipment
    • H04N 5/262: Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects; Cameras specially adapted for the electronic generation of special effects

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)
  • Telephone Function (AREA)

Abstract

The application discloses a photographing method and related equipment, applied to an electronic device that includes a media service module and an operating system, wherein an application layer of the operating system is provided with a third-party application. The method comprises the following steps: the third-party application sends a data request to a hardware abstraction layer of the operating system; the hardware abstraction layer receives the data request, acquires original application data to be processed, and calls an image fusion algorithm to process the original application data to obtain target application data, wherein the image fusion algorithm is one that the third-party application has requested the operating system, in advance and through the media service module, to open to the third-party application; the hardware abstraction layer then sends the target application data to the third-party application. By adopting the embodiments of the application, the quality of images shot by the third-party application can be improved.

Description

Photographing method and related equipment
Technical Field
The present application relates to the field of electronic technologies, and in particular, to a photographing method and a related device.
Background
Photographing has become an indispensable function of electronic devices and a field in which electronics manufacturers compete. On the Android platform, a third-party application can currently access the bottom layer only through an Application Programming Interface (API); it cannot call an algorithm inside the system to process application data, and can therefore only passively receive the application data delivered by the bottom layer. How to let a third-party application call an algorithm inside the system to process application data is thus a technical problem to be solved.
Disclosure of Invention
The embodiments of the application provide a photographing method and related equipment, which enable a third-party application to process application data by calling an algorithm inside the system.
In a first aspect, an embodiment of the present application provides a photographing method, which is applied to an electronic device, where the electronic device includes a media service module and an operating system, and an application layer of the operating system is provided with a third-party application; the method comprises the following steps:
the third party application sends a data request to a hardware abstraction layer of the operating system;
the hardware abstraction layer receives the data request, acquires original application data to be processed, and calls an image fusion algorithm to process the original application data to obtain target application data, wherein the image fusion algorithm is one that the third-party application has requested the operating system, in advance and through the media service module, to open to the third-party application; the original application data comprises image data and detection data, the image data being collected by a camera of the electronic device and the detection data being detected by a detection sensor of the electronic device;
the hardware abstraction layer sends the target application data to the third party application.
In a second aspect, an embodiment of the present application provides a photographing apparatus, which is applied to an electronic device, where the electronic device includes a media service module and an operating system, and an application layer of the operating system is provided with a third-party application; the device comprises:
a processing unit configured to: control the third-party application to send a data request to a hardware abstraction layer of the operating system; control the hardware abstraction layer to receive the data request, acquire original application data to be processed, and call an image fusion algorithm to process the original application data to obtain target application data, wherein the image fusion algorithm is one that the third-party application has requested the operating system, in advance and through the media service module, to open to the third-party application, the original application data comprises image data and detection data, the image data is collected by a camera of the electronic device, and the detection data is detected by a detection sensor of the electronic device; and control the hardware abstraction layer to send the target application data to the third-party application.
In a third aspect, embodiments of the present application provide an electronic device, which includes a processor, a memory, a communication interface, and one or more programs, stored in the memory and configured to be executed by the processor, the programs including instructions for performing some or all of the steps described in the method according to the first aspect of the embodiments of the present application.
In a fourth aspect, an embodiment of the present application provides a chip, including a processor configured to call and run a computer program from a memory, so that a device equipped with the chip performs some or all of the steps described in any method of the first aspect of the embodiments of the present application.
In a fifth aspect, the present application provides a computer-readable storage medium, where the computer-readable storage medium is used to store a computer program, where the computer program is executed by a processor to implement part or all of the steps described in the method according to the first aspect of the present application.
In a sixth aspect, embodiments of the present application provide a computer program product, where the computer program product includes a non-transitory computer-readable storage medium storing a computer program, where the computer program is operable to cause a computer to perform some or all of the steps described in the method according to the first aspect of the embodiments of the present application. The computer program product may be a software installation package.
It can be seen that, in the embodiments of the present application, a third-party application set in the application layer of the android system of an electronic device sends a data request to the hardware abstraction layer of the operating system; the hardware abstraction layer receives the data request, acquires original application data to be processed, and calls an image fusion algorithm to process the original application data to obtain target application data, wherein the image fusion algorithm is one that the third-party application has requested the operating system, in advance and through the media service module, to open to the third-party application, the original application data comprises image data and detection data, the image data is collected by a camera of the electronic device, and the detection data is detected by a detection sensor of the electronic device; the hardware abstraction layer then sends the target application data to the third-party application. The technical scheme provided by the application therefore enables a third-party application to call an algorithm inside the system to process application data.
These and other aspects of the present application will be more readily apparent from the following description of the embodiments.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the present application; for those skilled in the art, other drawings can be obtained from these drawings without creative effort.
Fig. 1A is a schematic diagram of a hardware structure of an electronic device according to an embodiment of the present disclosure;
fig. 1B is a schematic diagram of a software architecture of an electronic device according to an embodiment of the present application;
fig. 2 is a schematic flowchart of a photographing method according to an embodiment of the present application;
fig. 3 is a schematic flowchart of a photographing method according to an embodiment of the present application;
fig. 4 is a schematic structural diagram of an electronic device provided in an embodiment of the present application;
fig. 5 is a schematic structural diagram of a photographing device according to an embodiment of the present application.
Detailed Description
In order to make the technical solutions better understood by those skilled in the art, the technical solutions in the embodiments of the present application are described below clearly and completely with reference to the drawings in the embodiments of the present application. Obviously, the described embodiments are only some, not all, of the embodiments of the present application. All other embodiments obtained by a person skilled in the art from these embodiments without creative effort shall fall within the protection scope of the present application.
The following are detailed below.
The terms "first," "second," "third," and "fourth," etc. in the description and claims of this application and in the accompanying drawings are used for distinguishing between different objects and not for describing a particular order. Furthermore, the terms "include" and "have," as well as any variations thereof, are intended to cover non-exclusive inclusions. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those steps or elements listed, but may alternatively include other steps or elements not listed, or inherent to such process, method, article, or apparatus.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the application. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. It is explicitly and implicitly understood by one skilled in the art that the embodiments described herein can be combined with other embodiments.
Hereinafter, some terms in the present application are explained to facilitate understanding by those skilled in the art.
As shown in fig. 1A, fig. 1A is a schematic structural diagram of the hardware of an electronic device provided in an embodiment of the present application. The electronic device includes a processor, a memory, a signal processor, a communication interface, a display screen, a speaker, a microphone, a Random Access Memory (RAM), a camera module, a sensor, an Infrared (IR) sensor, and the like. The memory, the signal processor, the display screen, the speaker, the microphone, the RAM, the camera module, the sensor, and the IR sensor are connected to the processor, and the communication interface is connected to the signal processor.
The display screen may be a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED) display, an Active-Matrix Organic Light-Emitting Diode (AMOLED) display, or the like.
The camera module may include a standard camera and an infrared camera, which is not limited herein. A camera may be a front camera or a rear camera, which is not limited herein.
Wherein the sensor comprises at least one of: light sensitive sensors, gyroscopes, IR sensors, fingerprint sensors, pressure sensors, etc. Among them, the light sensor, also called an ambient light sensor, is used to detect the ambient light brightness. The light sensor may include a light sensitive element and an analog to digital converter. The photosensitive element is used for converting collected optical signals into electric signals, and the analog-to-digital converter is used for converting the electric signals into digital signals. Optionally, the light sensor may further include a signal amplifier, and the signal amplifier may amplify the electrical signal converted by the photosensitive element and output the amplified electrical signal to the analog-to-digital converter. The photosensitive element may include at least one of a photodiode, a phototransistor, a photoresistor, and a silicon photocell.
The processor is the control center of the electronic device. It connects all parts of the device through various interfaces and lines, and performs the device's functions and processes data by running or executing the software programs and/or modules stored in the memory and calling the data stored in the memory, thereby monitoring the electronic device as a whole.
The processor may integrate an application processor and a modem processor, wherein the application processor mainly handles operating systems, user interfaces, application programs, and the like, and the modem processor mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor.
The memory is used to store software programs and/or modules, and the processor executes the various functional applications and data processing of the electronic device by running the software programs and/or modules stored in the memory. The memory mainly comprises a program storage area and a data storage area: the program storage area may store the operating system, the software program required by at least one function, and the like; the data storage area may store data created according to the use of the electronic device, and the like. Furthermore, the memory may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage device.
As shown in fig. 1B, fig. 1B is a schematic diagram of the software architecture of an electronic device according to an embodiment of the present application. As shown in fig. 1B, the electronic device includes a media service module (OMedia Service) and an android system; the application layer of the android system is provided with a third-party application and a media software development kit module (OMedia SDK), and the hardware abstraction layer of the android system is provided with a media policy module (OMedia Strategy), an algorithm management module (AlgoManager), and a camera hardware abstraction module (Camera HAL). The third-party application is communicatively connected to the media software development kit module, the media software development kit module to the media service module, the media service module to the camera hardware abstraction module, the camera hardware abstraction module to the media policy module, and the media policy module to the algorithm management module. In addition, the media service module may be further communicatively connected to the media policy module and/or the algorithm management module.
The media software development kit module includes a control interface; it can obtain information such as capability values and configured capability values, stores no static configuration information, communicates with the media service module via Binder, and passes the third-party application's configuration information to the media service module.
The media service module resides in a system service and, after the electronic device is started, runs, authenticates, and responds to the configuration requests of third-party applications so that configuration information can reach the bottom layer. In the present application, the media service module obtains the data request of the third-party application and sets the data processing scheme.
The media policy module is a bottom-layer policy module. It sends the information configured by the media service module down to the bottom layer, converting it into capabilities the bottom layer can identify; it keeps the third-party application from being directly coupled to, or seeing, the capabilities of the bottom layer; and it converts requests from the upper layer into a dedicated pipeline and retrieves algorithm information.
The algorithm management module enables the capability configuration information issued by the upper layer and invokes the corresponding algorithm.
Wherein the third party application may directly notify the media service module that data processing or continuous shooting is required.
The electronic device of the embodiments of the present application adopts a framework based on a media platform (OMedia), so that a third-party application can use the bottom layer's continuous-shooting pipeline to receive clear captured images rather than a preview stream. Through the media platform, the media service module and the hardware abstraction layer can be configured to use system functions such as the high resolution, denoising, and beautifying provided by the Image Signal Processor (ISP) and the system software.
Meanwhile, since using the high-resolution processing provided by the image signal processor and the system software may make image delivery too slow, the problem can be addressed as follows: the bottom layer first reports a thumbnail to the third-party application for display, then sends the clear YUV image; after receiving it, the third-party application performs its own post-processing and JPEG generation.
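The thumbnail-first delivery described above can be sketched as follows. This is a minimal illustration, not the patent's actual implementation; all class, method, and event names are invented for clarity.

```java
import java.util.ArrayList;
import java.util.List;

/** Hypothetical sketch of the thumbnail-first delivery order described above. */
public class ThumbnailFirstPipeline {
    /** Records the order in which artifacts reach the third-party application. */
    public static List<String> deliver() {
        List<String> events = new ArrayList<>();
        // 1. The bottom layer reports a small thumbnail first, so the app can show something quickly.
        events.add("thumbnail-displayed");
        // 2. The full clear YUV frame follows once the slow high-resolution processing finishes.
        events.add("yuv-received");
        // 3. The third-party app then performs its own post-processing and JPEG encoding.
        events.add("jpeg-encoded");
        return events;
    }

    public static void main(String[] args) {
        System.out.println(deliver());
    }
}
```

The key design point is ordering: the cheap artifact (thumbnail) masks the latency of the expensive one (full-resolution YUV).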
As shown in fig. 2, fig. 2 is a schematic flowchart of a photographing method provided in an embodiment of the present application, and is applied to the electronic device shown in fig. 1A and 1B, where the electronic device includes a media service module and an operating system, and an application layer of the operating system is provided with a third-party application; the method comprises the following steps:
step 201: the third party application sends a data request to a hardware abstraction layer of the operating system.
Among these, third-party applications include, for example, photo apps such as "figure show", BeautyCam, and Camera FV-5. The operating system may be, for example, an Android system, an iOS system, or a Linux system.
For example, when a third-party application installed on the electronic device needs to use the image fusion function provided by the android system, it sends a data request to the hardware abstraction layer of the operating system; the data request may ask that images shot by the camera be fused to obtain an image of higher definition. Optionally, the data request may further include denoising and/or beautifying.
Step 202: the hardware abstraction layer receives the data request, acquires original application data to be processed, and calls an image fusion algorithm to process the original application data to obtain target application data, wherein the image fusion algorithm is one that the third-party application has requested the operating system, in advance and through the media service module, to open to the third-party application; the original application data comprises image data and detection data, the image data being collected by a camera of the electronic device and the detection data being detected by a detection sensor of the electronic device.
The detection sensor may be, for example, a gyroscope, a photosensitive sensor, a temperature sensor, a humidity sensor, a distance sensor, or an infrared sensor. Correspondingly, the detection data detected by the gyroscope is anti-shake data, which can be used to stabilize the captured image; the detection data detected by the photosensitive sensor is illumination data, which can be used to compensate the lighting of the captured image; the detection data detected by the temperature sensor is temperature data, which can be used to adjust the temperature of the captured image; and so on, which will not be detailed herein.
The image fusion algorithm may be stored in an algorithm library, and the algorithm library further includes other algorithms, such as an image enhancement algorithm, an image denoising algorithm, and the like.
Step 203: the hardware abstraction layer sends the target application data to the third party application.
It can be seen that, in the embodiments of the present application, a third-party application set in the application layer of the android system of an electronic device sends a data request to the hardware abstraction layer of the operating system; the hardware abstraction layer receives the data request, acquires original application data to be processed, and calls an image fusion algorithm to process the original application data to obtain target application data, wherein the image fusion algorithm is one that the third-party application has requested the operating system, in advance and through the media service module, to open to the third-party application, the original application data comprises image data and detection data, the image data is collected by a camera of the electronic device, and the detection data is detected by a detection sensor of the electronic device; the hardware abstraction layer then sends the target application data to the third-party application. The technical scheme provided by the application therefore enables a third-party application to call an algorithm inside the system to process application data.
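Steps 201-203 can be sketched as a single request/response round trip. The "fusion" below is a toy per-pixel average of two frames, standing in for the patent's unspecified image fusion algorithm; all names are illustrative, not the patent's actual API.

```java
/**
 * Minimal sketch of steps 201-203 with a toy fusion algorithm
 * (per-pixel average of two equally sized frames).
 */
public class PhotoFlowSketch {
    /** Toy stand-in for the system's image fusion algorithm. */
    static int[] fuse(int[] frameA, int[] frameB) {
        int[] out = new int[frameA.length];
        for (int i = 0; i < frameA.length; i++) {
            out[i] = (frameA[i] + frameB[i]) / 2;
        }
        return out;
    }

    /** Step 202: the HAL receives the request, gathers the raw frames, and invokes the algorithm. */
    static int[] handleDataRequest(int[] cameraFrameA, int[] cameraFrameB) {
        return fuse(cameraFrameA, cameraFrameB);
    }

    public static void main(String[] args) {
        // Step 201: the third-party app issues a data request (two raw frames here).
        int[] target = handleDataRequest(new int[]{10, 20}, new int[]{30, 40});
        // Step 203: the HAL returns the target application data to the app.
        System.out.println(target[0] + "," + target[1]); // prints 20,30
    }
}
```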
In an implementation manner of the present application, the sending, by the hardware abstraction layer, the target application data to the third-party application includes:
the hardware abstraction layer determining a transmission bandwidth between the hardware abstraction layer and the application layer;
the hardware abstraction layer compresses the target application data based on the transmission bandwidth, wherein the size of the compressed target application data is smaller than that of the target application data before compression;
and the hardware abstraction layer sends the compressed target application data to the third-party application.
Further, before the hardware abstraction layer compresses the target application data based on the transmission bandwidth, the method further comprises:
the hardware abstraction layer determines a transmission time of the target application data based on the transmission bandwidth and a size of the target application data;
the hardware abstraction layer determines that a transfer time of the target application data is greater than or equal to a first threshold.
It can be seen that, in the embodiment of the present application, when the transmission time of the target application data is greater than or equal to the first threshold, the target application data is compressed and then transmitted, so that the transmission speed of the target application data is increased, and the processing speed of the whole photographing process is also increased.
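The compression decision above reduces to simple arithmetic: estimate the transfer time from the data size and the measured bandwidth, and compress only when that estimate reaches the first threshold. The sketch below assumes byte/millisecond units and an invented threshold value; neither is specified by the patent.

```java
/**
 * Sketch of the bandwidth-aware compression decision.
 * Units and the example threshold are illustrative assumptions.
 */
public class CompressionPolicy {
    /** Estimated transfer time in ms for sizeBytes over a link of bandwidthBytesPerMs. */
    static long transferMillis(long sizeBytes, long bandwidthBytesPerMs) {
        return sizeBytes / bandwidthBytesPerMs;
    }

    /** Compress only when the estimated transfer time is >= the first threshold. */
    static boolean shouldCompress(long sizeBytes, long bandwidthBytesPerMs, long thresholdMs) {
        return transferMillis(sizeBytes, bandwidthBytesPerMs) >= thresholdMs;
    }

    public static void main(String[] args) {
        // A 12 MB frame over a 1000 bytes/ms link takes ~12000 ms: compress.
        System.out.println(shouldCompress(12_000_000L, 1_000L, 5_000L)); // true
        // A 2 MB frame over the same link takes ~2000 ms: send as-is.
        System.out.println(shouldCompress(2_000_000L, 1_000L, 5_000L)); // false
    }
}
```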
In an implementation manner of the present application, the hardware abstraction layer is provided with a camera hardware abstraction module and an algorithm management module, and the camera hardware abstraction module is connected to the algorithm management module;
the hardware abstraction layer receives the data request, obtains original application data to be processed, and calls an image fusion algorithm to process the original application data to obtain target application data, and the method comprises the following steps:
the camera hardware abstraction module receives the data request, acquires original application data to be processed and sends the original application data to the algorithm management module;
and the algorithm management module receives the original application data, and processes the original application data by using an image fusion algorithm to obtain target application data.
It can be seen that, in the embodiment of the present application, the third-party application may directly use the camera hardware abstraction module provided by the system to obtain the original application data, and directly use the algorithm management module provided by the system to call the image fusion algorithm to process the original application data.
In one implementation manner of the present application, the hardware abstraction layer is provided with a camera hardware abstraction module, a media policy module and an algorithm management module; the camera hardware abstraction module is connected with the algorithm management module through the media strategy module;
the hardware abstraction layer receives the data request, obtains original application data to be processed, and calls an image fusion algorithm to process the original application data to obtain target application data, and the method comprises the following steps:
the camera hardware abstraction module receives the data request, acquires original application data to be processed, and sends the original application data to the algorithm management module through the media strategy module;
and the algorithm management module receives the original application data, and processes the original application data by using an image fusion algorithm to obtain target application data.
It can be seen that, in the embodiment of the present application, the third-party application may directly use the camera hardware abstraction module provided by the system to obtain the original application data, send the original application data to the algorithm management module through the media policy module, and call the image fusion algorithm to process the original application data.
In an implementation manner of the present application, the hardware abstraction layer is provided with a camera hardware abstraction module, a media policy module and an algorithm management module, the camera hardware abstraction module is connected to the media policy module, and the media policy module is connected to the algorithm management module;
the image fusion algorithm is specifically opened to the third-party application by the following operations:
the media policy module receives first function configuration information from the media service module, wherein the first function configuration information comprises description information of the image fusion function; the media policy module converts the first function configuration information into second function configuration information that the algorithm management module can identify, and sends the second function configuration information to the algorithm management module;
and the algorithm management module receives the second function configuration information and opens the use permission of the third-party application for the image fusion algorithm of the operating system according to the second function configuration information.
It can be seen that, in the embodiments of the present application, the media policy module receives first function configuration information from the media service module, the first function configuration information comprising description information of the image fusion function; it converts this into second function configuration information that the algorithm management module can identify and sends that to the algorithm management module. The algorithm management module receives the second function configuration information and, according to it, opens the third-party application's permission to use the operating system's image fusion algorithm. This enables the third-party application to directly use the image fusion algorithm provided by the system.
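The two-step hand-off above can be sketched as a translation plus a permission grant. Everything here is hypothetical: the description string, the numeric algorithm ID, and the permission set are invented stand-ins for the patent's first/second function configuration information.

```java
import java.util.HashSet;
import java.util.Set;

/** Hypothetical sketch of the configuration hand-off and permission opening. */
public class AlgoPermissionSketch {
    /** Media policy module: translate the descriptive first config into an ID the algorithm manager understands. */
    static int toSecondConfig(String firstConfigDescription) {
        // 1001 is an invented algorithm ID for the image fusion function.
        return "image-fusion".equals(firstConfigDescription) ? 1001 : -1;
    }

    /** Algorithm management module: record which algorithm IDs the third-party app may use. */
    static final Set<Integer> openedAlgorithms = new HashSet<>();

    static void openPermission(int algorithmId) {
        openedAlgorithms.add(algorithmId);
    }

    public static void main(String[] args) {
        int secondConfig = toSecondConfig("image-fusion");
        openPermission(secondConfig);
        System.out.println(openedAlgorithms.contains(1001)); // prints true
    }
}
```

The indirection matters: the third-party application only ever names a capability; the numeric ID and the algorithm itself stay hidden in the bottom layer, which is the decoupling the media policy module exists to provide.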
In an implementation manner of the present application, before the third-party application sends the data request to the hardware abstraction layer of the operating system, the method further includes:
the third-party application sends a media platform version acquisition request carrying an authentication code to the media service module;
the media service module receives the media platform version acquisition request and verifies the authentication code;
and after the verification passes, the media service module sends the media platform version information to the third-party application.
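The handshake above can be sketched as a simple verify-then-respond exchange. The HMAC scheme, the shared secret, and the version string below are illustrative assumptions; the patent does not specify how the authentication code is constructed.

```python
import hashlib
import hmac

SECRET_KEY = b"platform-secret"          # hypothetical shared secret
PLATFORM_VERSION = "media-platform/2.1"  # hypothetical version string

def make_auth_code(app_id: str) -> str:
    """Authentication code the third-party application would present."""
    return hmac.new(SECRET_KEY, app_id.encode(), hashlib.sha256).hexdigest()

def get_platform_version(app_id: str, auth_code: str):
    """Media service side: return version info only if the code verifies."""
    expected = make_auth_code(app_id)
    if hmac.compare_digest(expected, auth_code):
        return PLATFORM_VERSION
    return None  # verification failed: no version information is returned

code = make_auth_code("com.example.camera")
print(get_platform_version("com.example.camera", code))   # media-platform/2.1
print(get_platform_version("com.example.camera", "bad"))  # None
```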
Therefore, in this example, authentication is performed before the third-party application requests the system to open the use permission of the image fusion algorithm, which helps ensure that the target-effect algorithm is opened securely.
In an implementation manner of the present application, after the media service module sends the media platform version information to the third-party application, the method further includes:
the third party application receives the media platform version information and sends a capability acquisition request carrying the media platform version information to the media service module;
the media service module receives the capability acquisition request, inquires an application capability list of the media platform version information, and sends the application capability list to the third-party application;
the third-party application receives the application capability list and queries it to obtain a plurality of native functions that the current media platform supports for the third-party application; and determines, among the plurality of native functions, the image fusion function selected to be opened.
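The capability negotiation above can be sketched as a version-keyed lookup. The capability table, version strings, and function names below are invented for illustration; only the query-then-select shape comes from the text.

```python
# Hypothetical mapping from platform version to exposed native functions.
CAPABILITY_TABLE = {
    "media-platform/2.1": ["image_fusion", "night_mode", "hdr"],
    "media-platform/1.0": ["hdr"],
}

def get_capabilities(version: str):
    """Media service side: application capability list for a version."""
    return CAPABILITY_TABLE.get(version, [])

def select_function(capabilities, wanted: str):
    """Application side: pick the function to open, if the platform has it."""
    return wanted if wanted in capabilities else None

caps = get_capabilities("media-platform/2.1")
print(select_function(caps, "image_fusion"))  # image_fusion
```

On an older platform version the same query would return `None`, so the application can fall back gracefully instead of requesting an algorithm the system cannot open.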
As can be seen, in this example, after the authentication code passes verification, the media platform version information is returned to the third-party application; the third-party application then requests the application capability list from the media service module and selects the image fusion function to be opened, which helps it accurately select an open image fusion algorithm to process the image.
In an implementation manner of the present application, the detection data includes anti-shake data, the image data includes N frames of first images, and N is a positive integer; the hardware abstraction layer receives the data request, obtains original application data to be processed, and calls an image fusion algorithm to process the original application data to obtain target application data, and the method comprises the following steps:
the hardware abstraction layer divides the N frames of first images into N frames of second images and N frames of third images based on the anti-shake data, wherein the N frames of first images are in one-to-one correspondence with the N frames of second images and the N frames of third images respectively;
the hardware abstraction layer carries out image compensation on the N frames of second images based on the N frames of third images to obtain N frames of fourth images;
and the hardware abstraction layer calls an image fusion algorithm to process the N frames of fourth images to obtain target application data.
Further, the hardware abstraction layer segments the N frames of the first image into N frames of the second image and N frames of the third image based on the anti-shake data, including:
the hardware abstraction layer determines N centroids of the N first images, wherein the N centroids are in one-to-one correspondence with the N first images;
the hardware abstraction layer divides the N frames of first images using the N centroids as circle centers and a as the radius, obtaining N frames of second images and N frames of third images. The N frames of third images are the N circular areas obtained by taking the N centroids as circle centers and a as the radius; the N frames of second images are the N areas of the N frames of first images outside those circular areas; and the N circular areas and the N remaining areas each correspond one-to-one with the N frames of first images.
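For a single frame, the circular segmentation above can be sketched with a boolean mask. This sketch assumes a single-channel image and uses the intensity-weighted centroid, which is one plausible reading of "centroid"; the patent does not specify how it is computed.

```python
import numpy as np

def split_by_centroid(image: np.ndarray, a: float):
    """Split one frame into a circular region of radius `a` around the
    centroid (the 'third image') and the remainder (the 'second image')."""
    h, w = image.shape
    total = image.sum()
    ys, xs = np.mgrid[0:h, 0:w]
    # Intensity-weighted centroid of the frame.
    cy = (ys * image).sum() / total
    cx = (xs * image).sum() / total
    inside = (ys - cy) ** 2 + (xs - cx) ** 2 <= a ** 2
    third = np.where(inside, image, 0)   # circular region (third image)
    second = np.where(inside, 0, image)  # remainder (second image)
    return second, third

img = np.ones((9, 9))
second, third = split_by_centroid(img, a=2.0)
# Every pixel belongs to exactly one of the two regions.
print(np.array_equal(second + third, img))  # True
```

Running this over N frames, one call per frame, yields the one-to-one correspondence the text describes.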
It can be seen that, in the embodiment of the present application, before the image fusion algorithm processing is performed on the N frames of the first image, the original application data is processed in combination with the detection data acquired by other hardware (for example, a gyroscope), so that target application data with better quality can be obtained.
In an implementation manner of the present application, the processing, by the hardware abstraction layer, the N frames of fourth images by calling an image fusion algorithm to obtain target application data includes:
the hardware abstraction layer collects image characteristics included by the N frames of fourth images;
the hardware abstraction layer carries out registration on the N frames of fourth images based on the image characteristics to obtain registration parameters;
the hardware abstraction layer adjusts the displacement of the N frames of fourth images based on the registration parameters to obtain N frames of fifth images;
and the hardware abstraction layer fuses the N frames of fifth images based on an image fusion function to obtain target application data.
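The four steps above (collect features, register, adjust displacement, fuse) can be sketched end to end. This is a toy pipeline under stated assumptions: registration here is an integer-shift estimate from intensity centroids, standing in for a real feature-based method, and fusion is an equal-weight average (one form of linear weighted fusion).

```python
import numpy as np

def centroid(img):
    """Intensity-weighted centroid, used as a stand-in image feature."""
    ys, xs = np.mgrid[0:img.shape[0], 0:img.shape[1]]
    s = img.sum()
    return (ys * img).sum() / s, (xs * img).sum() / s

def register_and_fuse(frames):
    ref = frames[0]
    ref_c = centroid(ref)
    aligned = [ref]
    for f in frames[1:]:
        cy, cx = centroid(f)
        # Registration parameters: the shift mapping this frame onto the reference.
        dy, dx = round(ref_c[0] - cy), round(ref_c[1] - cx)
        # Displacement adjustment of the frame.
        aligned.append(np.roll(f, (dy, dx), axis=(0, 1)))
    # Equal-weight fusion of the aligned frames.
    return np.mean(aligned, axis=0)

base = np.zeros((8, 8)); base[3, 3] = 1.0
shifted = np.roll(base, (1, 1), axis=(0, 1))
fused = register_and_fuse([base, shifted])
print(fused[3, 3])  # 1.0 -- the shifted frame is realigned before fusing
```

Without the registration step, the same two frames would fuse to 0.5 at two different pixels; aligning first is what lets multi-frame fusion sharpen rather than blur.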
The image features mainly include color features, texture features, shape features and spatial relationship features of the image. The color feature is a global feature describing surface properties of a scene corresponding to an image or an image area; texture features are also global features that also describe the surface properties of the scene corresponding to the image or image area; the shape features are represented by two types, one is outline features, the other is region features, the outline features of the image mainly aim at the outer boundary of the object, and the region features of the image are related to the whole shape region; the spatial relationship characteristic refers to the mutual spatial position or relative direction relationship among a plurality of targets segmented from the image, and these relationships can be also divided into a connection/adjacency relationship, an overlapping/overlapping relationship, an inclusion/containment relationship, and the like.
Wherein the image fusion function comprises at least one of: linear weighted image fusion function, pyramid image fusion function, principal component transformation image fusion function, wavelet transformation fusion function.
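Of the fusion families listed, linear weighted fusion is the simplest: the output is a per-pixel weighted sum of the inputs with weights normalised to sum to one. A minimal sketch:

```python
import numpy as np

def linear_weighted_fusion(images, weights):
    """Fuse same-shaped images as a per-pixel weighted sum."""
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()                        # normalise so weights sum to 1
    stack = np.stack(images).astype(float)
    # Contract the weight vector against the frame axis of the stack.
    return np.tensordot(w, stack, axes=1)

a = np.full((2, 2), 10.0)
b = np.full((2, 2), 30.0)
print(linear_weighted_fusion([a, b], [1, 3]))  # all entries 25.0
```

The pyramid, principal-component, and wavelet variants listed above differ in that they fuse in a transformed domain rather than directly on pixel values, but each still reduces N aligned frames to one output frame.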
Specifically, the registering, by the hardware abstraction layer, the N frames of fourth images based on the image features to obtain registration parameters, including:
the hardware abstraction layer selects one frame from the N frames of fourth images as a reference image and selects another frame as a calibration image;
the hardware abstraction layer determining a geometric transformation relationship between image features comprised by the reference image and image features comprised by the calibration image;
the hardware abstraction layer takes the geometric transformation relationship as the registration parameter.
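Given matched feature points in the reference and calibration images, the geometric transformation can be estimated by least squares. The sketch below restricts the transformation to a 2-D translation, for which the least-squares solution is simply the mean point difference; an affine or projective model would follow the same pattern with more parameters.

```python
import numpy as np

def estimate_translation(ref_pts: np.ndarray, cal_pts: np.ndarray) -> np.ndarray:
    """Least-squares translation mapping calibration points onto reference
    points; for pure translation this is the mean pointwise difference."""
    return (ref_pts - cal_pts).mean(axis=0)

ref = np.array([[10.0, 12.0], [20.0, 25.0], [5.0, 7.0]])
cal = ref + np.array([2.0, -3.0])       # calibration image shifted by (2, -3)
print(estimate_translation(ref, cal))   # [-2.  3.]
```

The returned vector is the registration parameter: applying it to the calibration image's coordinates maps them onto the reference image's coordinate frame.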
As shown in fig. 3, fig. 3 is a schematic flowchart of a photographing method provided in an embodiment of the present application, and is applied to the electronic device shown in fig. 1A and fig. 1B, where the method includes:
step 301: and the third-party application sends a media platform version acquisition request carrying an authentication code to the media service module.
Step 302: the media service module receives the media platform version acquisition request and verifies that the authentication code is valid.
Step 303: and the media service module sends the media platform version information to the third-party application.
Step 304: and the third-party application receives the media platform version information and sends a capability acquisition request carrying the media platform version information to the media service module.
Step 305: and the media service module receives the capability acquisition request, inquires an application capability list of the media platform version information and sends the application capability list to the third-party application.
Step 306: the third-party application receives the application capability list and queries it to obtain a plurality of native functions that the current media platform supports for the third-party application; and determines, among the plurality of native functions, the image fusion function selected to be opened.
Step 307: the third party application sends a data request to a hardware abstraction layer of the operating system.
Step 308: and the camera hardware abstraction module receives the data request, acquires original application data to be processed and sends the original application data to the algorithm management module through the media strategy module.
Step 309: the algorithm management module receives the original application data and processes it with an image fusion algorithm to obtain target application data, where the image fusion algorithm is one whose opening the third-party application has requested from the operating system in advance through the media service module, the original application data includes image data and detection data, the image data is acquired by a camera of the electronic device, and the detection data is detected by a detection sensor of the electronic device.
Step 310: the algorithm management module determines a transmission bandwidth between the hardware abstraction layer and the application layer.
Step 311: and the algorithm management module compresses the target application data based on the transmission bandwidth, wherein the size of the compressed target application data is smaller than that of the target application data before compression.
Step 312: and the algorithm management module sends the compressed target application data to the third-party application.
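Steps 310 to 312 above amount to sizing the payload against the link before sending it. The sketch below is illustrative only: the quality policy, the time budget, and the truncation stand-in for a real codec are all assumptions.

```python
def choose_quality(data_size: int, bandwidth_bps: float, deadline_s: float) -> float:
    """Return a compression ratio in (0, 1] so the data fits the time budget."""
    budget = bandwidth_bps * deadline_s  # bytes deliverable within the deadline
    if data_size <= budget:
        return 1.0                       # link is fast enough: no compression
    return budget / data_size            # shrink to fit the budget

def compress(data: bytes, ratio: float) -> bytes:
    # Stand-in for a real codec: truncation guarantees the compressed
    # size is smaller than the size before compression, as step 311 requires.
    keep = max(1, int(len(data) * ratio))
    return data[:keep]

payload = b"x" * 1000
ratio = choose_quality(len(payload), bandwidth_bps=100, deadline_s=2)
out = compress(payload, ratio)
print(len(out) < len(payload))  # True -- compressed data is smaller
```

In a real system the codec would be lossy image compression (e.g. quality-scaled JPEG) rather than truncation; the bandwidth-driven decision is the part the steps describe.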
It should be noted that, for the specific implementation process of the present embodiment, reference may be made to the specific implementation process described in the above method embodiment, and a description thereof is omitted here.
In accordance with the embodiments shown in fig. 2 and fig. 3, referring to fig. 4, fig. 4 is a schematic structural diagram of an electronic device according to an embodiment of the present disclosure. The electronic device comprises an operating system; an application layer of the operating system is provided with a third-party application, and a hardware abstraction layer of the operating system is provided with a hardware abstraction module, a media policy module and an algorithm management module, the algorithm management module being used for managing an algorithm library. The hardware abstraction module is in communication connection with the media policy module, and the media policy module is in communication connection with the algorithm management module. As shown, the electronic device also includes a processor, a memory, a communication interface, and one or more programs, where the one or more programs are stored in the memory and configured to be executed by the processor, the programs including instructions for performing the following steps:
controlling the third-party application to send a data request to a hardware abstraction layer of the operating system;
controlling the hardware abstraction layer to receive the data request, acquire original application data to be processed, and call an image fusion algorithm to process the original application data to obtain target application data, where the image fusion algorithm is one whose opening the third-party application has requested from the operating system in advance through the media service module, the original application data includes image data and detection data, the image data is acquired by a camera of the electronic device, and the detection data is detected by a detection sensor of the electronic device;
and controlling the hardware abstraction layer to send the target application data to the third-party application.
In an implementation manner of the present application, in controlling the hardware abstraction layer to send the target application data to the third-party application, the program includes instructions specifically configured to:
controlling the hardware abstraction layer to determine a transmission bandwidth between the hardware abstraction layer and the application layer;
controlling the hardware abstraction layer to compress the target application data based on the transmission bandwidth, wherein the size of the compressed target application data is smaller than that of the target application data before compression;
and controlling the hardware abstraction layer to send the compressed target application data to the third-party application.
In an implementation manner of the present application, the hardware abstraction layer is provided with a camera hardware abstraction module and an algorithm management module, and the camera hardware abstraction module is connected to the algorithm management module;
in controlling the hardware abstraction layer to receive the data request, obtain original application data to be processed, and call an image fusion algorithm to process the original application data to obtain target application data, the program includes instructions specifically for executing the following steps:
controlling the camera hardware abstraction module to receive the data request, acquiring original application data to be processed, and sending the original application data to the algorithm management module;
and controlling the algorithm management module to receive the original application data, and processing the original application data by using an image fusion algorithm to obtain target application data.
In one implementation manner of the present application, the hardware abstraction layer is provided with a camera hardware abstraction module, a media policy module and an algorithm management module; the camera hardware abstraction module is connected with the algorithm management module through the media strategy module;
in controlling the hardware abstraction layer to receive the data request, obtain original application data to be processed, and call an image fusion algorithm to process the original application data to obtain target application data, the program includes instructions specifically for executing the following steps:
controlling the camera hardware abstraction module to receive the data request, acquiring original application data to be processed, and sending the original application data to the algorithm management module through the media policy module;
and controlling the algorithm management module to receive the original application data, and processing the original application data by using an image fusion algorithm to obtain target application data.
In an implementation manner of the present application, the detection data includes anti-shake data, the hardware abstraction layer is provided with a camera hardware abstraction module, a media policy module and an algorithm management module, the camera hardware abstraction module is connected to the media policy module, and the media policy module is connected to the algorithm management module;
the image fusion algorithm is specifically opened to the third-party application by the following operations:
the media policy module receives first function configuration information from the media service module, wherein the first function configuration information comprises description information of an image fusion function; converting the first function configuration information into second function configuration information which can be identified by the algorithm management module, and sending the second function configuration information to the algorithm management module;
and the algorithm management module receives the second function configuration information and opens the use permission of the third-party application for the image fusion algorithm of the operating system according to the second function configuration information.
In an implementation manner of the present application, before controlling the third-party application to send a data request to a hardware abstraction layer of the operating system, the program includes instructions further for performing the following steps:
controlling the third-party application to send a media platform version acquisition request carrying an authentication code to the media service module;
controlling the media service module to receive the media platform version acquisition request and verify that the authentication code is valid;
and controlling the media service module to send the media platform version information to the third-party application.
In an implementation manner of the present application, after controlling the media service module to send the media platform version information to the third-party application, the program includes instructions further configured to:
controlling the third-party application to receive the media platform version information and send a capability acquisition request carrying the media platform version information to the media service module;
controlling the media service module to receive the capability acquisition request, inquiring an application capability list of the media platform version information, and sending the application capability list to the third-party application;
controlling the third-party application to receive the application capability list and query it to obtain a plurality of native functions that the current media platform supports for the third-party application; and determining, among the plurality of native functions, the image fusion function selected to be opened.
In an implementation manner of the present application, the detection data includes anti-shake data, the image data includes N frames of first images, and N is a positive integer; the hardware abstraction layer receives the data request, obtains original application data to be processed, and calls an image fusion algorithm to process the original application data to obtain target application data, wherein the program comprises instructions specifically used for executing the following steps:
controlling the hardware abstraction layer to divide the N frames of first images into N frames of second images and N frames of third images based on the anti-shake data, wherein the N frames of first images are in one-to-one correspondence with the N frames of second images and the N frames of third images respectively;
controlling the hardware abstraction layer to perform image compensation on the N frames of second images based on the N frames of third images to obtain N frames of fourth images;
and controlling the hardware abstraction layer to call an image fusion algorithm to process the N frames of fourth images to obtain target application data.
In an implementation manner of the present application, the hardware abstraction layer invokes an image fusion algorithm to process the N frames of fourth images to obtain target application data, where the program includes instructions specifically configured to perform the following steps:
controlling the hardware abstraction layer to collect image characteristics included by the N frames of fourth images;
controlling the hardware abstraction layer to register the N frames of fourth images based on the image characteristics to obtain registration parameters;
controlling the hardware abstraction layer to adjust the displacement of the N frames of fourth images based on the registration parameters to obtain N frames of fifth images;
and controlling the hardware abstraction layer to fuse the N frames of fifth images based on an image fusion function to obtain target application data.
It should be noted that, for the specific implementation process of the present embodiment, reference may be made to the specific implementation process described in the above method embodiment, and a description thereof is omitted here.
The above embodiments mainly introduce the scheme of the embodiments of the present application from the perspective of the method-side implementation process. It is understood that, in order to realize the above-mentioned functions, the electronic device comprises corresponding hardware structures and/or software modules for performing the respective functions. Those of skill in the art would readily appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as hardware or as combinations of hardware and computer software. Whether a function is performed by hardware or by computer software driving hardware depends upon the particular application and the design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiment of the present application, the electronic device may be divided into the functional units according to the method example, for example, each functional unit may be divided corresponding to each function, or two or more functions may be integrated into one processing unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
It should be noted that the division of the unit in the embodiment of the present application is schematic, and is only a logic function division, and there may be another division manner in actual implementation.
The following is an embodiment of the apparatus of the present application, which is used to execute the method implemented by the embodiment of the method of the present application. Referring to fig. 5, fig. 5 is a schematic structural diagram of a photographing apparatus provided in an embodiment of the present application, and is applied to an electronic device, where the electronic device includes a media service module and an operating system, and an application layer of the operating system is provided with a third-party application; the device comprises:
a processing unit 501, configured to: control the third-party application to send a data request to a hardware abstraction layer of the operating system; control the hardware abstraction layer to receive the data request, acquire original application data to be processed, and call an image fusion algorithm to process the original application data to obtain target application data, where the image fusion algorithm is one whose opening the third-party application has requested from the operating system in advance through the media service module, the original application data includes image data and detection data, the image data is acquired by a camera of the electronic device, and the detection data is detected by a detection sensor of the electronic device; and control the hardware abstraction layer to send the target application data to the third-party application.
In an implementation manner of the present application, in controlling the hardware abstraction layer to send the target application data to the third-party application, the processing unit 501 is specifically configured to:
controlling the hardware abstraction layer to determine a transmission bandwidth between the hardware abstraction layer and the application layer;
controlling the hardware abstraction layer to compress the target application data based on the transmission bandwidth, wherein the size of the compressed target application data is smaller than that of the target application data before compression;
and controlling the hardware abstraction layer to send the compressed target application data to the third-party application.
In an implementation manner of the present application, the hardware abstraction layer is provided with a camera hardware abstraction module and an algorithm management module, and the camera hardware abstraction module is connected to the algorithm management module;
in controlling the hardware abstraction layer to receive the data request, obtain original application data to be processed, and call an image fusion algorithm to process the original application data to obtain target application data, the processing unit 501 is specifically configured to:
controlling the camera hardware abstraction module to receive the data request, acquiring original application data to be processed, and sending the original application data to the algorithm management module;
and controlling the algorithm management module to receive the original application data, and processing the original application data by using an image fusion algorithm to obtain target application data.
In one implementation manner of the present application, the hardware abstraction layer is provided with a camera hardware abstraction module, a media policy module and an algorithm management module; the camera hardware abstraction module is connected with the algorithm management module through the media strategy module;
in controlling the hardware abstraction layer to receive the data request, obtain original application data to be processed, and call an image fusion algorithm to process the original application data to obtain target application data, the processing unit 501 is specifically configured to:
controlling the camera hardware abstraction module to receive the data request, acquiring original application data to be processed, and sending the original application data to the algorithm management module through the media policy module;
and controlling the algorithm management module to receive the original application data, and processing the original application data by using an image fusion algorithm to obtain target application data.
In an implementation manner of the present application, the detection data includes anti-shake data, the hardware abstraction layer is provided with a camera hardware abstraction module, a media policy module and an algorithm management module, the camera hardware abstraction module is connected to the media policy module, and the media policy module is connected to the algorithm management module;
the image fusion algorithm is specifically opened to the third-party application by the following operations:
the media policy module receives first function configuration information from the media service module, wherein the first function configuration information comprises description information of an image fusion function; converting the first function configuration information into second function configuration information which can be identified by the algorithm management module, and sending the second function configuration information to the algorithm management module;
and the algorithm management module receives the second function configuration information and opens the use permission of the third-party application for the image fusion algorithm of the operating system according to the second function configuration information.
In an implementation manner of the present application, before controlling the third-party application to send a data request to a hardware abstraction layer of the operating system, the processing unit 501 is further configured to:
controlling the third-party application to send a media platform version acquisition request carrying an authentication code to the media service module;
controlling the media service module to receive the media platform version acquisition request and verify that the authentication code is valid;
and controlling the media service module to send the media platform version information to the third-party application.
In an implementation manner of the present application, after controlling the media service module to send the media platform version information to the third-party application, the processing unit 501 is further configured to:
controlling the third-party application to receive the media platform version information and send a capability acquisition request carrying the media platform version information to the media service module;
controlling the media service module to receive the capability acquisition request, inquiring an application capability list of the media platform version information, and sending the application capability list to the third-party application;
controlling the third-party application to receive the application capability list and query it to obtain a plurality of native functions that the current media platform supports for the third-party application; and determining, among the plurality of native functions, the image fusion function selected to be opened.
In an implementation manner of the present application, the detection data includes anti-shake data, the image data includes N frames of first images, and N is a positive integer; the hardware abstraction layer receives the data request, obtains original application data to be processed, and calls an image fusion algorithm to process the original application data to obtain target application data, and the processing unit 501 is specifically configured to:
controlling the hardware abstraction layer to divide the N frames of first images into N frames of second images and N frames of third images based on the anti-shake data, wherein the N frames of first images are in one-to-one correspondence with the N frames of second images and the N frames of third images respectively;
controlling the hardware abstraction layer to perform image compensation on the N frames of second images based on the N frames of third images to obtain N frames of fourth images;
and controlling the hardware abstraction layer to call an image fusion algorithm to process the N frames of fourth images to obtain target application data.
In an implementation manner of the present application, the hardware abstraction layer invokes an image fusion algorithm to process the N frames of fourth images to obtain target application data, and the processing unit 501 is specifically configured to:
controlling the hardware abstraction layer to collect image characteristics included by the N frames of fourth images;
controlling the hardware abstraction layer to register the N frames of fourth images based on the image characteristics to obtain registration parameters;
controlling the hardware abstraction layer to adjust the displacement of the N frames of fourth images based on the registration parameters to obtain N frames of fifth images;
and controlling the hardware abstraction layer to fuse the N frames of fifth images based on an image fusion function to obtain target application data.
The photographing apparatus may further include a storage unit 503 for storing program codes and data of the electronic device. The processing unit 501 may be a processor, the communication unit 502 may be a touch display screen or a transceiver, and the storage unit 503 may be a memory.
It can be understood that, since the method embodiments and the apparatus embodiments are different presentations of the same technical concept, the content of the method embodiments in the present application applies correspondingly to the apparatus embodiments, and is not repeated here.
Embodiments of the present application further provide a chip, where the chip includes a processor configured to call and run a computer program from a memory, so that a device in which the chip is installed performs some or all of the steps described for the electronic device in the above method embodiments.
Embodiments of the present application also provide a computer storage medium, where the computer storage medium stores a computer program for electronic data exchange, and the computer program enables a computer to execute part or all of the steps of any method described in the above method embodiments, the computer including an electronic device.
Embodiments of the present application also provide a computer program product comprising a non-transitory computer-readable storage medium storing a computer program, the computer program being operable to cause a computer to perform some or all of the steps of any method described in the above method embodiments. The computer program product may be a software installation package, and the computer includes an electronic device.
It should be noted that, for simplicity of description, the above method embodiments are described as a series of combined actions, but those skilled in the art will recognize that the present application is not limited by the order of the actions described, as some steps may be performed in other orders or concurrently in accordance with the present application. Further, those skilled in the art should also appreciate that the embodiments described in the specification are preferred embodiments, and that the actions and modules involved are not necessarily required by the present application.
In the foregoing embodiments, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative; for instance, the division of the units described above is only a division by logical function, and other divisions may be used in practice: multiple units or components may be combined or integrated into another system, or some features may be omitted or not executed. In addition, the mutual coupling, direct coupling, or communication connection shown or discussed may be an indirect coupling or communication connection through certain interfaces, devices, or units, and may be electrical or take other forms.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
If the integrated unit is implemented in the form of a software functional unit and sold or used as a stand-alone product, it may be stored in a computer-readable memory. Based on such understanding, the technical solution of the present application, in essence, or the part that contributes to the prior art, or all or part of the technical solution, may be embodied in the form of a software product stored in a memory and including several instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the methods of the embodiments of the present application. The aforementioned memory includes various media capable of storing program code, such as a USB flash drive, a Read-Only Memory (ROM), a Random Access Memory (RAM), a removable hard disk, or a magnetic or optical disk.
Those skilled in the art will appreciate that all or part of the steps in the methods of the above embodiments may be implemented by a program instructing associated hardware; the program may be stored in a computer-readable memory, which may include a flash disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, an optical disk, and the like.
The embodiments of the present application have been described in detail above, and specific examples have been used herein to illustrate the principles and implementations of the present application; the above description of the embodiments is only intended to help understand the method and core concept of the present application. Meanwhile, a person skilled in the art may, according to the idea of the present application, make changes to the specific implementations and the application scope. In summary, the content of this specification should not be construed as limiting the present application.

Claims (11)

1. A photographing method, applied to an electronic device, wherein the electronic device comprises a media service module and an operating system, and a third-party application is installed at an application layer of the operating system; the method comprises:
the third party application sends a data request to a hardware abstraction layer of the operating system;
the hardware abstraction layer receives the data request, acquires original application data to be processed, and invokes an image fusion algorithm to process the original application data to obtain target application data, wherein the image fusion algorithm is an algorithm that the third-party application has requested the operating system, in advance through the media service module, to open to the third-party application, the original application data comprises image data and detection data, the image data is captured by a camera of the electronic device, and the detection data is detected by a detection sensor of the electronic device;
the hardware abstraction layer sends the target application data to the third party application.
2. The method according to claim 1, wherein the detection data comprises anti-shake data, the image data comprises N frames of first images, and N is a positive integer; the step in which the hardware abstraction layer receives the data request, acquires the original application data to be processed, and invokes the image fusion algorithm to process the original application data to obtain the target application data comprises:
the hardware abstraction layer divides the N frames of first images into N frames of second images and N frames of third images based on the anti-shake data, wherein the N frames of first images are in one-to-one correspondence with the N frames of second images and the N frames of third images respectively;
the hardware abstraction layer carries out image compensation on the N frames of second images based on the N frames of third images to obtain N frames of fourth images;
and the hardware abstraction layer invokes an image fusion algorithm to process the N frames of fourth images to obtain target application data.
3. The method according to claim 2, wherein the step in which the hardware abstraction layer invokes an image fusion algorithm to process the N frames of fourth images to obtain target application data comprises:
the hardware abstraction layer extracts the image features contained in the N frames of fourth images;
the hardware abstraction layer registers the N frames of fourth images based on the image features to obtain registration parameters;
the hardware abstraction layer adjusts the displacement of the N frames of fourth images based on the registration parameters to obtain N frames of fifth images;
and the hardware abstraction layer fuses the N frames of fifth images based on an image fusion function to obtain target application data.
4. The method according to claim 1, wherein the step in which the hardware abstraction layer sends the target application data to the third-party application comprises:
the hardware abstraction layer determines a transmission bandwidth between the hardware abstraction layer and the application layer;
the hardware abstraction layer compresses the target application data based on the transmission bandwidth, wherein the size of the compressed target application data is smaller than that of the target application data before compression;
and the hardware abstraction layer sends the compressed target application data to the third-party application.
5. The method according to claim 1 or 2, wherein the hardware abstraction layer is provided with a camera hardware abstraction module and an algorithm management module, and the camera hardware abstraction module is connected with the algorithm management module;
the hardware abstraction layer receives the data request, obtains original application data to be processed, and calls an image fusion algorithm to process the original application data to obtain target application data, and the method comprises the following steps:
the camera hardware abstraction module receives the data request, acquires original application data to be processed and sends the original application data to the algorithm management module;
and the algorithm management module receives the original application data, and processes the original application data by using an image fusion algorithm to obtain target application data.
6. The method according to claim 1 or 2, wherein the hardware abstraction layer is provided with a camera hardware abstraction module, a media policy module, and an algorithm management module; the camera hardware abstraction module is connected with the algorithm management module through the media policy module;
the hardware abstraction layer receives the data request, obtains original application data to be processed, and calls an image fusion algorithm to process the original application data to obtain target application data, and the method comprises the following steps:
the camera hardware abstraction module receives the data request, acquires the original application data to be processed, and sends the original application data to the algorithm management module through the media policy module;
and the algorithm management module receives the original application data, and processes the original application data by using an image fusion algorithm to obtain target application data.
7. The method according to claim 1, wherein the hardware abstraction layer is provided with a camera hardware abstraction module, a media policy module and an algorithm management module, the camera hardware abstraction module is connected with the media policy module, and the media policy module is connected with the algorithm management module;
the image fusion algorithm is specifically opened to the third-party application by the following operations:
the media policy module receives first function configuration information from the media service module, wherein the first function configuration information comprises description information of an image fusion function; and the media policy module converts the first function configuration information into second function configuration information recognizable by the algorithm management module, and sends the second function configuration information to the algorithm management module;
and the algorithm management module receives the second function configuration information and opens the use permission of the third-party application for the image fusion algorithm of the operating system according to the second function configuration information.
8. A photographing apparatus, applied to an electronic device, wherein the electronic device comprises a media service module and an operating system, and a third-party application is installed at an application layer of the operating system; the apparatus comprises:
a processing unit configured to: control the third-party application to send a data request to a hardware abstraction layer of the operating system; control the hardware abstraction layer to receive the data request, acquire original application data to be processed, and invoke an image fusion algorithm to process the original application data to obtain target application data, wherein the image fusion algorithm is an algorithm that the third-party application has requested the operating system, in advance through the media service module, to open to the third-party application, the original application data comprises image data and detection data, the image data is captured by a camera of the electronic device, and the detection data is detected by a detection sensor of the electronic device; and control the hardware abstraction layer to send the target application data to the third-party application.
9. A chip, comprising: a processor for calling and running a computer program from a memory so that a device on which the chip is installed performs the method of any one of claims 1-7.
10. An electronic device comprising a processor, a memory, a communication interface, and one or more programs stored in the memory and configured to be executed by the processor, the programs comprising instructions for performing the steps in the method of any of claims 1-7.
11. A computer-readable storage medium, wherein the computer-readable storage medium stores a computer program for electronic data exchange, and the computer program causes a computer to perform the method according to any one of claims 1-7.
CN201911253067.5A 2019-12-09 2019-12-09 Photographing method and related equipment Active CN110933275B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911253067.5A CN110933275B (en) 2019-12-09 2019-12-09 Photographing method and related equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911253067.5A CN110933275B (en) 2019-12-09 2019-12-09 Photographing method and related equipment

Publications (2)

Publication Number Publication Date
CN110933275A true CN110933275A (en) 2020-03-27
CN110933275B CN110933275B (en) 2021-07-23

Family

ID=69857679

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911253067.5A Active CN110933275B (en) 2019-12-09 2019-12-09 Photographing method and related equipment

Country Status (1)

Country Link
CN (1) CN110933275B (en)

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111491102A (en) * 2020-04-22 2020-08-04 Oppo广东移动通信有限公司 Detection method and system for photographing scene, mobile terminal and storage medium
CN112199127A (en) * 2020-10-10 2021-01-08 Oppo(重庆)智能科技有限公司 Image data processing method and device, mobile terminal and storage medium
CN112672046A (en) * 2020-12-18 2021-04-16 闻泰通讯股份有限公司 Storage method and device for continuous shooting image, electronic equipment and storage medium
WO2021115038A1 (en) * 2019-12-09 2021-06-17 Oppo广东移动通信有限公司 Application data processing method and related apparatus
CN113347378A (en) * 2021-06-02 2021-09-03 展讯通信(天津)有限公司 Video recording method and device
CN113645409A (en) * 2021-08-16 2021-11-12 展讯通信(上海)有限公司 Photographing processing method and device, photographing method, device and system and terminal equipment
CN113746998A (en) * 2020-05-29 2021-12-03 北京小米移动软件有限公司 Image processing method, device, equipment and storage medium
CN114125284A (en) * 2021-11-18 2022-03-01 Oppo广东移动通信有限公司 Image processing method, electronic device, and storage medium
WO2022161106A1 (en) * 2021-01-29 2022-08-04 华为技术有限公司 Electronic device, data transmission method between electronic device and another electronic device, and medium
CN114995591A (en) * 2021-10-30 2022-09-02 荣耀终端有限公司 Sensor registration method, control system and related equipment
WO2023123787A1 (en) * 2021-12-28 2023-07-06 北京小米移动软件有限公司 Image processing method, apparatus, electronic device, and storage medium
CN116708988A (en) * 2022-02-25 2023-09-05 荣耀终端有限公司 Electronic equipment and shooting method and medium thereof
CN116723394A (en) * 2022-02-28 2023-09-08 荣耀终端有限公司 Multi-shot strategy scheduling method and related equipment thereof

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070112441A1 (en) * 2002-03-28 2007-05-17 Amir Shahindoust Modular layer for abstracting peripheral hardware characteristics
CN104657956A (en) * 2015-03-16 2015-05-27 龙旗电子(惠州)有限公司 Method for realizing smart phone picture beautifying function
CN108496198A (en) * 2017-10-09 2018-09-04 华为技术有限公司 A kind of image processing method and equipment
CN109669782A (en) * 2017-10-13 2019-04-23 阿里巴巴集团控股有限公司 Hardware abstraction layer multiplexing method, device, operating system and equipment
CN110169056A (en) * 2016-12-12 2019-08-23 华为技术有限公司 A kind of method and apparatus that dynamic 3 D image obtains
CN110177218A (en) * 2019-06-28 2019-08-27 广州鲁邦通物联网科技有限公司 A kind of image processing method of taking pictures of Android device

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070112441A1 (en) * 2002-03-28 2007-05-17 Amir Shahindoust Modular layer for abstracting peripheral hardware characteristics
CN104657956A (en) * 2015-03-16 2015-05-27 龙旗电子(惠州)有限公司 Method for realizing smart phone picture beautifying function
CN110169056A (en) * 2016-12-12 2019-08-23 华为技术有限公司 A kind of method and apparatus that dynamic 3 D image obtains
CN108496198A (en) * 2017-10-09 2018-09-04 华为技术有限公司 A kind of image processing method and equipment
CN109669782A (en) * 2017-10-13 2019-04-23 阿里巴巴集团控股有限公司 Hardware abstraction layer multiplexing method, device, operating system and equipment
CN110177218A (en) * 2019-06-28 2019-08-27 广州鲁邦通物联网科技有限公司 A kind of image processing method of taking pictures of Android device

Cited By (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2021115038A1 (en) * 2019-12-09 2021-06-17 Oppo广东移动通信有限公司 Application data processing method and related apparatus
CN111491102B (en) * 2020-04-22 2022-01-07 Oppo广东移动通信有限公司 Detection method and system for photographing scene, mobile terminal and storage medium
CN111491102A (en) * 2020-04-22 2020-08-04 Oppo广东移动通信有限公司 Detection method and system for photographing scene, mobile terminal and storage medium
CN113746998A (en) * 2020-05-29 2021-12-03 北京小米移动软件有限公司 Image processing method, device, equipment and storage medium
CN112199127A (en) * 2020-10-10 2021-01-08 Oppo(重庆)智能科技有限公司 Image data processing method and device, mobile terminal and storage medium
CN112672046A (en) * 2020-12-18 2021-04-16 闻泰通讯股份有限公司 Storage method and device for continuous shooting image, electronic equipment and storage medium
WO2022161106A1 (en) * 2021-01-29 2022-08-04 华为技术有限公司 Electronic device, data transmission method between electronic device and another electronic device, and medium
CN113347378A (en) * 2021-06-02 2021-09-03 展讯通信(天津)有限公司 Video recording method and device
CN113645409B (en) * 2021-08-16 2022-08-19 展讯通信(上海)有限公司 Photographing processing method and device, photographing method, device and system and terminal equipment
CN113645409A (en) * 2021-08-16 2021-11-12 展讯通信(上海)有限公司 Photographing processing method and device, photographing method, device and system and terminal equipment
CN114995591B (en) * 2021-10-30 2023-01-20 荣耀终端有限公司 Sensor registration method, control system and related equipment
CN114995591A (en) * 2021-10-30 2022-09-02 荣耀终端有限公司 Sensor registration method, control system and related equipment
CN114125284A (en) * 2021-11-18 2022-03-01 Oppo广东移动通信有限公司 Image processing method, electronic device, and storage medium
CN114125284B (en) * 2021-11-18 2023-10-31 Oppo广东移动通信有限公司 Image processing method, electronic device and storage medium
WO2023123787A1 (en) * 2021-12-28 2023-07-06 北京小米移动软件有限公司 Image processing method, apparatus, electronic device, and storage medium
CN116708988A (en) * 2022-02-25 2023-09-05 荣耀终端有限公司 Electronic equipment and shooting method and medium thereof
CN116723394A (en) * 2022-02-28 2023-09-08 荣耀终端有限公司 Multi-shot strategy scheduling method and related equipment thereof
CN116723394B (en) * 2022-02-28 2024-05-10 荣耀终端有限公司 Multi-shot strategy scheduling method and related equipment thereof

Also Published As

Publication number Publication date
CN110933275B (en) 2021-07-23

Similar Documents

Publication Publication Date Title
CN110933275B (en) Photographing method and related equipment
EP3410390B1 (en) Image processing method and device, computer readable storage medium and electronic device
CN107302663B (en) Image brightness adjusting method, terminal and computer readable storage medium
CN108024065B (en) Terminal shooting method, terminal and computer readable storage medium
KR102566998B1 (en) Apparatus and method for determining image sharpness
CN110930335B (en) Image processing method and electronic equipment
KR20170098089A (en) Electronic apparatus and operating method thereof
CN107948505B (en) Panoramic shooting method and mobile terminal
KR102495763B1 (en) Electronic device and method for correcting images corrected by a first image processing mode in external electronic device using a second image processing mode
CN111064895B (en) Virtual shooting method and electronic equipment
CN107566749B (en) Shooting method and mobile terminal
CN107623818B (en) Image exposure method and mobile terminal
CN108513067B (en) Shooting control method and mobile terminal
CN112840634B (en) Electronic device and method for obtaining image
CN110933313B (en) Dark light photographing method and related equipment
KR102397924B1 (en) Electronic device and method for correcting images based on image feature information and image correction scheme
CN109672827A (en) For combining the electronic device and its method of multiple images
CN113411498B (en) Image shooting method, mobile terminal and storage medium
CN108616687B (en) Photographing method and device and mobile terminal
CN115514876B (en) Image fusion method, electronic device, storage medium and computer program product
CN108156392B (en) Shooting method, terminal and computer readable storage medium
CN107734269B (en) Image processing method and mobile terminal
CN107395971B (en) Image acquisition method, image acquisition equipment and computer-readable storage medium
CN111028192B (en) Image synthesis method and electronic equipment
JP6563694B2 (en) Image processing apparatus, image processing system, image composition apparatus, image processing method, and program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant