CN110933312B - Photographing control method and related product

Photographing control method and related product

Info

Publication number
CN110933312B
Authority
CN
China
Prior art keywords: target, image, service module, media service, shooting
Prior art date
Legal status
Active
Application number
CN201911253078.3A
Other languages
Chinese (zh)
Other versions
CN110933312A (en)
Inventor
陈岩
方攀
Current Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN201911253078.3A
Publication of CN110933312A
Application granted
Publication of CN110933312B

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60: Control of cameras or camera modules
    • H04N23/62: Control of parameters via user interfaces
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70: Circuitry for compensating brightness variation in the scene
    • H04N23/71: Circuitry for evaluating the brightness variation

Abstract

The embodiments of the present application disclose a photographing control method and a related product, applied to an electronic device. The electronic device includes a media service module and an operating system, and an application layer of the operating system is provided with a third-party application. The method includes the following steps: the third-party application sends a photographing request to the media service module; the media service module receives the photographing request and parses it to obtain target shooting parameters, which are sent to a driving layer of the operating system; the driving layer drives a camera of the electronic device to shoot according to the target shooting parameters to obtain multiple frames of shot images; the media service module selects the target shot image with the best image quality from the multiple frames of shot images and sends it to the third-party application; and the third-party application receives the target shot image. With the embodiments of the present application, the shooting data of the bottom layer can be actively screened, improving image shooting efficiency.

Description

Photographing control method and related product
Technical Field
The application relates to the technical field of photographing control, in particular to a photographing control method and a related product.
Background
With the widespread use of electronic devices (such as mobile phones and tablet computers), electronic devices support more and more applications and ever more powerful functions. They are developing in the direction of diversification and personalization and have become indispensable electronic products in users' daily lives.
At present, the camera function (Camera APP) invoked by a third-party application actively issues a photographing request, the bottom layer obtains shooting data according to the photographing request, and the shooting data is then sent directly to the application layer for processing.
Disclosure of Invention
The embodiment of the application provides a photographing control method and a related product, which can actively screen photographing data of a bottom layer, and improve the image photographing efficiency.
In a first aspect, an embodiment of the present application provides a photographing control method, which is applied to an electronic device, where the electronic device includes a media service module and an operating system, and an application layer of the operating system is provided with a third-party application; the method comprises the following steps:
the third-party application sends a photographing request to the media service module;
the media service module receives the photographing request and analyzes the photographing request to obtain target photographing parameters; sending the target shooting parameters to a driving layer of the operating system;
the driving layer drives a camera of the electronic equipment to shoot according to the target shooting parameters to obtain multi-frame shot images, and the multi-frame shot images are sent to the media service module;
the media service module selects a target shooting image with the best image quality from the multi-frame shooting images and sends the target shooting image to the third-party application;
and the third-party application receives the target shooting image.
In a second aspect, an embodiment of the present application provides a photographing control apparatus, which is applied to an electronic device, where the electronic device includes a media service module and an operating system, and an application layer of the operating system is provided with a third-party application; the apparatus comprising:
the third-party application is used for sending a photographing request to the media service module;
the media service module is used for receiving the photographing request and analyzing the photographing request to obtain target photographing parameters; transmitting the target photographing parameters to a driver layer of the operating system;
the driving layer is used for driving a camera of the electronic equipment to shoot according to the target shooting parameters to obtain multi-frame shot images and sending the multi-frame shot images to the media service module;
the media service module is further configured to select a target captured image with the best image quality from the multiple frames of captured images, and send the target captured image to the third-party application;
the third-party application is further used for receiving the target shooting image.
In a third aspect, an embodiment of the present application provides an electronic device, including a processor, a memory, a communication interface, and one or more programs, where the one or more programs are stored in the memory and configured to be executed by the processor, and the program includes instructions for executing the steps in the first aspect of the embodiment of the present application.
In a fourth aspect, an embodiment of the present application provides a computer-readable storage medium, where the computer-readable storage medium stores a computer program for electronic data exchange, where the computer program enables a computer to perform some or all of the steps described in the first aspect of the embodiment of the present application.
In a fifth aspect, embodiments of the present application provide a computer program product, where the computer program product includes a non-transitory computer-readable storage medium storing a computer program, where the computer program is operable to cause a computer to perform some or all of the steps as described in the first aspect of the embodiments of the present application. The computer program product may be a software installation package.
The embodiment of the application has the following beneficial effects:
It can be seen that the photographing control method and related product described in the embodiments of the present application are applied to an electronic device. The electronic device includes a media service module and an operating system, and an application layer of the operating system is provided with a third-party application. The third-party application sends a photographing request to the media service module; the media service module receives the photographing request, parses it to obtain target shooting parameters, and sends the target shooting parameters to a driving layer of the operating system; the driving layer drives a camera of the electronic device to shoot according to the target shooting parameters to obtain multiple frames of shot images and sends them to the media service module; the media service module selects the target shot image with the best image quality from the multiple frames of shot images and sends it to the third-party application; and the third-party application receives the target shot image. Thus, when a photographing request is received, the media service module can actively select the optimal shooting data from the shooting data captured by the driving layer, which is equivalent to preprocessing the shooting data. On the one hand, this avoids the drawback of handing a large amount of bottom-layer shooting data to the third-party application for processing, whose power consumption is difficult to control; on the other hand, the third-party application completes the preprocessing of the shot image through the media service module, which better improves the shooting effect and enhances the user experience.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art are briefly introduced below. It is obvious that the drawings in the following description show only some embodiments of the present application, and those skilled in the art can derive other drawings from them without creative effort.
Fig. 1A is a schematic structural diagram of an electronic device according to an embodiment of the present disclosure;
fig. 1B is a schematic diagram of a software structure of an electronic device according to an embodiment of the present application;
fig. 1C is a schematic flowchart of a photographing control method according to an embodiment of the present application;
fig. 1D is a schematic flowchart of another photographing control method provided in the embodiment of the present application;
fig. 2 is a schematic flowchart of another photographing control method provided in the embodiment of the present application;
fig. 3 is a schematic structural diagram of another electronic device provided in an embodiment of the present application;
fig. 4 is a block diagram of the functional units of a photographing control apparatus according to an embodiment of the present application.
Detailed Description
In order to make the technical solutions of the present application better understood, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The terms "first," "second," and the like in the description and claims of the present application and in the above-described drawings are used for distinguishing between different objects and not for describing a particular order. Furthermore, the terms "include" and "have," as well as any variations thereof, are intended to cover non-exclusive inclusions. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those steps or elements listed, but may alternatively include other steps or elements not listed, or inherent to such process, method, article, or apparatus.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the application. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. It is explicitly and implicitly understood by one skilled in the art that the embodiments described herein can be combined with other embodiments.
The electronic device related to the embodiments of the present application may include various handheld devices with wireless communication functions (smart phones, tablet computers, etc.), vehicle-mounted devices (navigators, vehicle-mounted refrigerators, vehicle-mounted vacuum cleaners, etc.), wearable devices (smart watches, smart bracelets, wireless headsets, augmented reality/virtual reality devices, smart glasses), computing devices or other processing devices connected to wireless modems, as well as various forms of User Equipment (UE), Mobile Stations (MS), terminal devices, and the like. For convenience of description, the above-mentioned devices are collectively referred to as electronic devices. The embodiments of the present application apply to an electronic device with an operating system. The operating system is a computer program that manages the computer's hardware and software resources, and it may be at least one of the following: Android, Windows, Apple's operating system, Symbian, HarmonyOS (Hongmeng), Linux, etc., which are not limited herein.
Currently, taking the Android platform as an example, a third-party application can access underlying algorithm data through the standard Android Application Programming Interface (API). However, if it wants to use more underlying enhanced functions, or data processed by underlying algorithms, there is no corresponding standard interface that exposes these underlying capabilities to third parties. Replacing the existing interfaces in order to call the underlying functions increases development and maintenance costs, involves a large amount of code (2k to 3k lines), and every time a function is added the SDK has to be updated.
The following describes embodiments of the present application in detail.
Referring to fig. 1A, fig. 1A is a schematic structural diagram of an electronic device disclosed in an embodiment of the present application, the electronic device 100 includes a storage and processing circuit 110, and a sensor 170 connected to the storage and processing circuit 110, where:
the electronic device 100 may include control circuitry, which may include storage and processing circuitry 110. The storage and processing circuitry 110 may be a memory, such as a hard drive memory, a non-volatile memory (e.g., flash memory or other electronically programmable read-only memory used to form a solid state drive, etc.), a volatile memory (e.g., static or dynamic random access memory, etc.), etc., and the embodiments of the present application are not limited thereto. Processing circuitry in storage and processing circuitry 110 may be used to control the operation of electronic device 100. The processing circuitry may be implemented based on one or more microprocessors, microcontrollers, digital signal processors, baseband processors, power management units, audio codec chips, application specific integrated circuits, display driver integrated circuits, and the like.
The storage and processing circuitry 110 may be used to run software in the electronic device 100, such as an Internet browsing application, a Voice Over Internet Protocol (VOIP) telephone call application, an email application, a media playing application, operating system functions, and so forth. Such software may be used to perform control operations such as, for example, camera-based image capture, ambient light measurement based on an ambient light sensor, proximity sensor measurement based on a proximity sensor, information display functionality based on status indicators such as status indicator lights of light emitting diodes, touch event detection based on a touch sensor, functionality associated with displaying information on multiple (e.g., layered) display screens, operations associated with performing wireless communication functionality, operations associated with collecting and generating audio signals, control operations associated with collecting and processing button press event data, and other functions in the electronic device 100, to name a few.
The electronic device 100 may include input-output circuitry 150. The input-output circuit 150 may be used to enable the electronic device 100 to input and output data, i.e., to allow the electronic device 100 to receive data from an external device and also to allow the electronic device 100 to output data from the electronic device 100 to the external device. The input-output circuit 150 may further include a sensor 170. The sensor 170 may include an ultrasonic module, and may further include an ambient light sensor, a proximity sensor based on light and capacitance, a touch sensor (for example, based on a light touch sensor and/or a capacitive touch sensor, where the touch sensor may be a part of a touch display screen, or may be used independently as a touch sensor structure), an acceleration sensor, a temperature sensor, and other sensors.
Input-output circuit 150 may also include one or more display screens, such as display screen 130. The display 130 may include one or a combination of liquid crystal display, organic light emitting diode display, electronic ink display, plasma display, display using other display technologies. The display screen 130 may include an array of touch sensors (i.e., the display screen 130 may be a touch display screen). The touch sensor may be a capacitive touch sensor formed by a transparent touch sensor electrode (e.g., an Indium Tin Oxide (ITO) electrode) array, or may be a touch sensor formed using other touch technologies, such as acoustic wave touch, pressure sensitive touch, resistive touch, optical touch, and the like, and the embodiments of the present application are not limited thereto.
The electronic device 100 may also include an audio component 140. The audio component 140 may be used to provide audio input and output functionality for the electronic device 100. The audio components 140 in the electronic device 100 may include a speaker, a microphone, a buzzer, a tone generator, and other components for generating and detecting sound.
The communication circuit 120 may be used to provide the electronic device 100 with the capability to communicate with external devices. The communication circuit 120 may include analog and digital input-output interface circuits, and wireless communication circuits based on radio frequency signals and/or optical signals. The wireless communication circuitry in communication circuitry 120 may include radio-frequency transceiver circuitry, power amplifier circuitry, low noise amplifiers, switches, filters, and antennas. For example, the wireless Communication circuitry in Communication circuitry 120 may include circuitry to support Near Field Communication (NFC) by transmitting and receiving Near Field coupled electromagnetic signals. For example, the communication circuit 120 may include a near field communication antenna and a near field communication transceiver. The communications circuitry 120 may also include a cellular telephone transceiver and antenna, a wireless local area network transceiver circuitry and antenna, and so forth.
The electronic device 100 may further include a battery, power management circuitry, and other input-output units 160. The input-output unit 160 may include buttons, joysticks, click wheels, scroll wheels, touch pads, keypads, keyboards, cameras, light emitting diodes and other status indicators, and the like.
A user may input commands through input-output circuitry 150 to control the operation of electronic device 100, and may use output data of input-output circuitry 150 to enable receipt of status information and other outputs from electronic device 100.
As shown in fig. 1B, the electronic device of fig. 1A according to the embodiment of the present application may include a media service module and an operating system. An application layer of the operating system is provided with a third-party application and a media management module (also referred to as a media interface module). A hardware abstraction layer of the operating system is provided with a hardware abstraction module (here, a native module, such as a native camera hardware abstraction module), a media policy module and an algorithm management module. In addition, the native architecture of the operating system further includes a framework layer and a driver layer. The framework layer includes the application interfaces of various native applications (such as a native camera application programming interface), application services (such as a native camera service), and a framework layer interface (such as the Google HAL3 interface); the hardware abstraction layer includes a hardware abstraction layer interface (such as HAL3.0) and the hardware abstraction modules of the various native applications (such as the camera hardware abstraction module); the driver layer includes various drivers (e.g., a screen display driver, an audio driver, etc.) for enabling the various hardware of the electronic device, such as the image signal processor (ISP) and the front-end image sensors.
The media service module is independent of the operating system. Third-party applications can communicate with the media service module through the media management module, and the media service module can communicate with the media policy module through a native information link formed by the application interface, the application service, the framework layer interface, the hardware abstraction layer interface and the hardware abstraction module. The media policy module communicates with the algorithm management module, and the algorithm management module maintains a native algorithm library. The algorithm library comprises the enhancement functions supported by the various native applications; for example, for the native camera application, it contains the algorithms that allow the native camera application to realize various enhancement functions such as binocular shooting, beautification, sharpening and night vision. In addition, the media service module can also communicate directly with the media policy module or the algorithm management module.
Based on the above framework, the media service module may enable an algorithm module in the algorithm library through the native information link, the media policy module and the algorithm management module; or directly through the media policy module and the algorithm management module; or directly through the algorithm management module, thereby opening the enhancement functions associated with native applications to third-party applications.
Based on the above framework, the media service module may invoke the corresponding driver to enable certain hardware through the native information link, through a first information link composed of the media policy module and the hardware abstraction module, or through a second information link composed of the media policy module, the algorithm management module and the hardware abstraction module, thereby opening the hardware associated with the native applications to third-party applications.
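For illustration only, the choice among these links can be pictured with the Java sketch below; the enum values, class name and method are invented for this example and do not correspond to the patent's modules or to real Android framework classes.

```java
// Hypothetical sketch of how the media service module might choose among the
// three information links described above. All names are illustrative only.
enum LinkPath {
    NATIVE_LINK,          // application interface -> application service -> framework layer -> HAL -> hardware abstraction module
    POLICY_LINK,          // first link: media policy module -> hardware abstraction module
    POLICY_ALGORITHM_LINK // second link: media policy module -> algorithm management module -> hardware abstraction module
}

final class MediaServiceRouter {
    /** Chooses a link for enabling an enhancement algorithm or a piece of hardware. */
    LinkPath chooseLink(boolean needsNativeCompatibility, boolean needsAlgorithm) {
        if (needsNativeCompatibility) {
            return LinkPath.NATIVE_LINK; // go through the native information link
        }
        return needsAlgorithm ? LinkPath.POLICY_ALGORITHM_LINK : LinkPath.POLICY_LINK;
    }
}
```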
Based on the above architecture, the electronic device may implement the following functions:
the third-party application sends a photographing request to the media service module;
the media service module receives the photographing request and analyzes the photographing request to obtain target photographing parameters; sending the target shooting parameters to a driving layer of the operating system;
the driving layer drives a camera of the electronic equipment to shoot according to the target shooting parameters to obtain multi-frame shot images, and the multi-frame shot images are sent to the media service module;
the media service module selects a target shooting image with the best image quality from the multi-frame shooting images and sends the target shooting image to the third-party application;
and the third-party application receives the target shooting image.
It can be seen that, in the electronic device described in the embodiments of the present application, the electronic device includes a media service module and an operating system, and an application layer of the operating system is provided with a third-party application. The third-party application sends a photographing request to the media service module; the media service module receives the photographing request, parses it to obtain target shooting parameters, and sends the target shooting parameters to a driving layer of the operating system; the driving layer drives a camera of the electronic device to shoot according to the target shooting parameters to obtain multiple frames of shot images and sends them to the media service module; the media service module selects the target shot image with the best image quality from the multiple frames of shot images and sends it to the third-party application; and the third-party application receives the target shot image. Thus, when a photographing request is received, the media service module can actively select the optimal shooting data from the shooting data captured by the driving layer, which is equivalent to preprocessing the shooting data. On the one hand, this avoids the drawback of handing a large amount of bottom-layer shooting data to the third-party application for processing, whose power consumption is difficult to control; on the other hand, the third-party application completes the preprocessing of the shot image through the media service module, which better improves the shooting effect and enhances the user experience.
Referring to fig. 1C, fig. 1C is a schematic flowchart of a photographing control method provided in an embodiment of the present application, and as shown in the drawing, the method is applied to the electronic device shown in fig. 1A, where the electronic device includes a media service module and an operating system, and an application layer of the operating system is provided with a third-party application; the photographing control method comprises the following steps:
101. and the third-party application sends a photographing request to the media service module.
In this embodiment, the third-party application may be a foreground application, specifically an application that can invoke the camera function, for example, WeChat, QQ, or Douyin (TikTok).
102. The media service module receives the photographing request and analyzes the photographing request to obtain target photographing parameters; and sending the target shooting parameters to a driving layer of the operating system.
The target shooting parameters may be at least one of the following: sensitivity, white balance parameter, exposure time, anti-shake parameter, distortion correction parameter, beauty parameter, shooting mode, shooting time, shooting location, watermark setting parameter, resolution, age, facial attractiveness score, expression, and the like, which are not limited herein. The photographing instruction comprises instruction content (for example, the desired photographing effect, environmental parameters, and the like). When the photographing instruction issued by the third-party application is received, the media service module can parse the photographing instruction (instruction content) to obtain the target shooting parameters, and a shot image suited to the effect requested by the photographing instruction can be captured with these target shooting parameters.
In one possible example, the photographing instruction carries a target environment parameter and a target attribute setting parameter; the step 102 of analyzing the photographing request to obtain the target photographing parameter may include the following steps:
21. the media service module determines a first shooting parameter corresponding to the target environment parameter according to a mapping relation between a preset environment parameter and a shooting parameter;
22. and taking the first shooting parameter and the target attribute setting parameter as the target shooting parameter.
In this embodiment, the environmental parameter may be at least one of the following: ambient light level, temperature, humidity, weather, ambient color temperature, magnetic field interference intensity, etc., without limitation. The target attribute setting parameter may be at least one of the following: shooting time, shooting location, watermark setting parameter, and the like, which are not limited herein. The first shooting parameter may be at least one of the following: sensitivity, white balance parameter, exposure time, anti-shake parameter, distortion correction parameter, beauty parameter, shooting mode, and the like, which are not limited herein. The photographing instruction carries the target environment parameters and the target attribute setting parameters. In a specific implementation, a mapping relationship between preset environment parameters and shooting parameters may be stored in the electronic device in advance; the media service module may then determine the first shooting parameter corresponding to the target environment parameter according to this mapping relationship, and may use the first shooting parameter together with the target attribute setting parameter as the target shooting parameters.
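As a minimal sketch of steps 21-22 (assuming, for illustration, that the environment parameter is an ambient light level; the class names, table entries and numeric values below are invented, not taken from the patent), the preset mapping can be modelled as a lookup table whose result is merged with the attribute setting parameters carried by the photographing instruction:

```java
import java.util.Map;
import java.util.TreeMap;

// Hypothetical sketch of steps 21-22: look up the first shooting parameters from a
// preset environment-parameter mapping, then combine them with the attribute
// setting parameters carried by the photographing instruction.
final class TargetParamsBuilder {

    /** First shooting parameters derived from the environment (sensitivity, exposure, ...). */
    record FirstShootingParams(int iso, long exposureTimeNs, String whiteBalance) {}

    /** Attribute setting parameters carried directly by the instruction (time, location, watermark, ...). */
    record AttributeParams(String shootingTime, String location, boolean watermark) {}

    /** The combined target shooting parameters sent on to the driving layer. */
    record TargetShootingParams(FirstShootingParams capture, AttributeParams attributes) {}

    // Preset mapping: ambient light level (lux, lower bound) -> first shooting parameters.
    private static final TreeMap<Integer, FirstShootingParams> PRESET = new TreeMap<>(Map.of(
            0,    new FirstShootingParams(1600, 40_000_000L, "incandescent"),
            50,   new FirstShootingParams(400,  20_000_000L, "auto"),
            1000, new FirstShootingParams(100,  5_000_000L,  "daylight")));

    TargetShootingParams build(int ambientLightLux, AttributeParams attrs) {
        // floorEntry picks the preset row whose lower bound is closest below the measured value.
        FirstShootingParams first = PRESET.floorEntry(Math.max(0, ambientLightLux)).getValue();
        return new TargetShootingParams(first, attrs);
    }
}
```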
103. The driving layer drives a camera of the electronic equipment to shoot according to the target shooting parameters to obtain multi-frame shooting images, and the multi-frame shooting images are sent to the media service module.
The driving layer can drive the camera to shoot according to the target shooting parameters to obtain multiple frames of shot images. Because shaking or environmental changes may occur during shooting, the image quality of each shot image differs. The driving layer can send the multiple frames of shot images to the media service module, and the media service module can perform the subsequent screening operation.
104. And the media service module selects a target shooting image with the best image quality from the multi-frame shooting images and sends the target shooting image to the third-party application.
The media service module may select, according to preset rules, the one of the multiple shot images with the best image quality and send this target shot image to the third-party application. For example, the media service module may perform image quality evaluation on each shot image and select the image with the largest evaluation value as the target shot image with the best image quality.
In one possible example, in step 104, the media service module may select a target captured image with the best image quality from the multiple captured images, and may include the following steps:
41. carrying out overall image quality evaluation on the multiple frames of shot images to obtain a plurality of image quality evaluation values, wherein each shot image corresponds to one image quality evaluation value;
42. selecting an image quality evaluation value larger than a preset threshold value from the plurality of image quality evaluation values to obtain at least one target image quality evaluation value, and acquiring a shot image corresponding to the at least one target image quality evaluation value to obtain at least one first shot image;
43. performing local image quality evaluation on each shot image in the at least one first shot image to obtain at least one target evaluation value, wherein each shot image corresponds to one target evaluation value;
44. and determining the maximum value of the at least one target evaluation value, and acquiring a target shot image corresponding to the maximum value.
In a specific implementation, the preset threshold may be set by the user or by system default. Overall image quality evaluation may be performed on each of the multiple shot images to obtain multiple image quality evaluation values, one for each shot image. Specifically, an overall image quality evaluation algorithm may be used to evaluate each shot image using at least one image quality evaluation index, which may be at least one of the following: information entropy, average gray level, contrast, sharpness, edge preservation, etc., without limitation. The overall image quality evaluation algorithm can be stored in the algorithm management module in advance, and the media service module can call it through the algorithm management module.
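To make the kind of whole-image indices listed above concrete, the sketch below computes average gray level, contrast (standard deviation) and information entropy for a grayscale frame and folds them into a single evaluation value; the normalisation and the equal weighting of the three indices are assumptions made for this example, not the patent's evaluation algorithm.

```java
// Illustrative whole-image quality score from average gray level, contrast
// (standard deviation) and information entropy of a grayscale frame.
final class GlobalQuality {

    /** pixels: grayscale values in [0, 255]. Returns a score roughly in [0, 1]. */
    static double evaluate(int[][] pixels) {
        long count = 0, sum = 0;
        long[] histogram = new long[256];
        for (int[] row : pixels) {
            for (int p : row) {
                sum += p;
                histogram[p]++;
                count++;
            }
        }
        double mean = (double) sum / count;

        double variance = 0;
        for (int[] row : pixels) {
            for (int p : row) variance += (p - mean) * (p - mean);
        }
        double contrast = Math.sqrt(variance / count);

        double entropy = 0;
        for (long h : histogram) {
            if (h == 0) continue;
            double prob = (double) h / count;
            entropy -= prob * (Math.log(prob) / Math.log(2));
        }

        // Normalise each index to [0, 1] and average them (equal weights assumed).
        double meanScore = 1 - Math.abs(mean - 128) / 128.0; // prefer mid-tone exposure
        double contrastScore = Math.min(contrast / 64.0, 1.0);
        double entropyScore = entropy / 8.0;                  // 8 bits is the maximum entropy
        return (meanScore + contrastScore + entropyScore) / 3.0;
    }
}
```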
Further, the image quality evaluation values larger than a preset threshold may be selected from the multiple image quality evaluation values to obtain at least one target image quality evaluation value, so that only good-quality images are retained, and the shot images corresponding to the at least one target image quality evaluation value may be acquired to obtain at least one first shot image. Local image quality evaluation may be understood as performing image quality evaluation only on a target region or a partial feature of an image, for example, expression evaluation or local region evaluation. The larger the target evaluation value is, the more appealing the corresponding image is likely to be to the user. The local image quality evaluation algorithm may be used to evaluate each frame of the first shot images to obtain at least one target evaluation value, one per first shot image; the maximum value may then be selected from the at least one target evaluation value, and the target shot image corresponding to this maximum value may be uploaded to the third-party application, that is, presented in the third-party application or saved. The local image quality evaluation algorithm can be stored in the algorithm management module in advance, and the media service module can call it through the algorithm management module.
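Putting steps 41-44 together, a simplified selection routine might look like the following; the Frame type, the two scoring functions and the threshold value are placeholders for whatever overall and local image quality evaluation algorithms the algorithm management module actually provides.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.ToDoubleFunction;

// Simplified sketch of steps 41-44: filter frames by an overall quality threshold,
// then pick the frame with the highest local (e.g. expression / region) score.
final class BestFrameSelector {

    static <Frame> Frame select(List<Frame> frames,
                                ToDoubleFunction<Frame> globalScore,
                                ToDoubleFunction<Frame> localScore,
                                double threshold) {
        // Steps 41-42: overall evaluation, keep only the frames above the preset threshold.
        List<Frame> candidates = new ArrayList<>();
        for (Frame f : frames) {
            if (globalScore.applyAsDouble(f) > threshold) candidates.add(f);
        }
        if (candidates.isEmpty()) candidates = frames; // fall back to all frames

        // Steps 43-44: local evaluation on the remaining frames, return the maximum.
        Frame best = candidates.get(0);
        double bestScore = localScore.applyAsDouble(best);
        for (Frame f : candidates) {
            double s = localScore.applyAsDouble(f);
            if (s > bestScore) { bestScore = s; best = f; }
        }
        return best;
    }
}
```

The fallback to all frames is a design choice of this sketch, so a photo is still returned even when no frame clears the threshold.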
Further, in a possible example, the step 41, performing overall image quality evaluation on the multiple frames of captured images to obtain multiple image quality evaluation values, includes:
411. the media service module determines a first definition corresponding to a shot image i, wherein the shot image i is any one of the multiple frames of shot images;
412. dividing a shot image i into a plurality of regions;
413. determining the distribution density of the characteristic points corresponding to each of the plurality of areas to obtain a plurality of distribution densities of the characteristic points;
414. performing mean square error operation according to the distribution densities of the plurality of characteristic points to obtain a target mean square error;
415. determining a target definition adjusting coefficient corresponding to the target mean square error according to a mapping relation between a preset mean square error and a definition adjusting coefficient;
416. adjusting the first definition according to the target definition adjusting coefficient to obtain a second definition;
417. and determining the image quality evaluation value corresponding to the second definition according to a preset mapping relation between the definition and the image quality evaluation value.
Take the shot image i as an example, where the shot image i is any one of the multiple frames of shot images. The first definition (sharpness) corresponding to the shot image i can be determined, and the shot image i can be divided into a plurality of regions, whose areas may be equal or different. The number of feature points in each region can be determined, and the feature point distribution density of each region can be computed from the number of feature points in that region and its area, thereby obtaining a plurality of feature point distribution densities.
Furthermore, a mean square error operation can be performed on the plurality of feature point distribution densities to obtain a target mean square error. A mapping relationship between preset mean square errors and definition adjustment coefficients can be pre-stored in the electronic device, where the definition adjustment coefficient may range from -0.35 to 0.35. The target definition adjustment coefficient corresponding to the target mean square error can then be determined from this mapping relationship, and the first definition can be adjusted according to the target definition adjustment coefficient to obtain the second definition, that is, second definition = (1 + target definition adjustment coefficient) × first definition. A mapping relationship between preset definitions and image quality evaluation values can also be pre-stored in the electronic device, so that the image quality evaluation value corresponding to the second definition can be determined from this mapping relationship. On the one hand, image quality can be preliminarily evaluated through definition; on the other hand, the preliminary evaluation result can be adjusted through the correlation between neighbourhoods (the mean square error); finally, normalisation is performed through the preset mapping between definition and image quality evaluation value, that is, the image quality evaluation value is constrained to a certain range, for example 0 to 1 or 0 to 100, so that image quality evaluation can be achieved with a certain accuracy.
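The adjustment described in steps 413-417 can be illustrated with the short sketch below; the per-region density values, the threshold-based mapping from mean square error to adjustment coefficient and the initial definition score are all invented for the example.

```java
// Illustrative sketch of steps 413-417: compute the mean square error of the
// per-region feature point densities, map it to a definition (sharpness)
// adjustment coefficient in [-0.35, 0.35], and adjust the first definition.
final class DefinitionAdjustment {

    static double adjustmentCoefficient(double mse) {
        // Hypothetical preset mapping from mean square error to coefficient.
        if (mse < 0.5) return 0.15;   // evenly distributed features: raise the score
        if (mse < 2.0) return 0.0;
        return -0.2;                  // very uneven distribution: lower the score
    }

    public static void main(String[] args) {
        double[] featureDensities = {0.8, 1.1, 0.9, 1.2}; // feature points per unit area, one per region
        double mean = 0;
        for (double d : featureDensities) mean += d;
        mean /= featureDensities.length;

        double mse = 0;                                   // step 414: mean square error
        for (double d : featureDensities) mse += (d - mean) * (d - mean);
        mse /= featureDensities.length;

        double firstDefinition = 0.72;                    // step 411: initial sharpness score
        double coefficient = adjustmentCoefficient(mse);  // step 415
        double secondDefinition = (1 + coefficient) * firstDefinition; // step 416
        System.out.printf("mse=%.3f, second definition=%.3f%n", mse, secondDefinition);
    }
}
```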
Further, in a possible example, the step 43 of performing local image quality evaluation on each captured image of the at least one first captured image to obtain at least one target evaluation value may include the following steps:
431. the media service module extracts a target of a first shot image j to obtain a target j, wherein the first shot image j is any one of the at least one first shot image;
432. extracting characteristic parameters of the target j to obtain at least one characteristic parameter set;
433. and determining a target evaluation value corresponding to the first shot image j according to the at least one characteristic parameter set.
The target may be a person, an animal or another object. The feature parameter set may include at least one of the following: expression, resolution, feature points, feature contour, etc., without limitation. In a specific implementation, take the first shot image j as an example, where the first shot image j is any one of the at least one first shot image. Target extraction may be performed on the first shot image j to obtain the target j; the target extraction algorithm may be based on at least one of the following: HOG features, LBP features, Haar-like features, etc., without limitation. Further, feature parameter extraction may be performed on the target j, that is, only the feature parameters of the target in the shot image are extracted, to obtain at least one feature parameter set, where each feature parameter set may include at least one feature parameter. The target evaluation value corresponding to the first shot image j may then be determined according to the at least one feature parameter set: each feature parameter set may correspond to one evaluation result, and the evaluation results corresponding to the different feature parameter sets may be weighted to obtain the final evaluation result.
Further, in a possible example, the at least one feature parameter set is a plurality of feature parameter sets, each feature parameter set corresponds to a category label, and the step 433 of determining the target evaluation value corresponding to the first captured image j according to the at least one feature parameter set may include the following steps:
4331. determining an evaluation value corresponding to each characteristic parameter set in the plurality of characteristic parameter sets to obtain a plurality of evaluation values, wherein each characteristic parameter set corresponds to one evaluation value;
4332. determining a weight corresponding to each feature parameter set in the plurality of feature parameter sets according to a mapping relation between a preset category label and the weight to obtain a plurality of target weights;
4333. and performing weighting operation according to the plurality of evaluation values and the plurality of target weights to obtain a target evaluation value corresponding to the shot image j.
In this embodiment of the present application, the category label may be customized by the user or set by system default; the category label represents the attribute dimension to which a feature parameter belongs and may be at least one of the following: expression, feature point distribution density, area, contrast, etc., without limitation. The mapping relationship between preset category labels and weights can be stored in the electronic device in advance. In a specific implementation, the electronic device may determine the evaluation value corresponding to each of the plurality of feature parameter sets to obtain a plurality of evaluation values, one per feature parameter set; for example, if a feature parameter is a target expression value, the evaluation value corresponding to that expression value may be determined according to a preset mapping relationship between expression values and evaluation values. Further, the weight corresponding to each of the plurality of feature parameter sets may be determined according to the mapping relationship between preset category labels and weights to obtain a plurality of target weights (the sum of the target weights may be 1), and a weighting operation may then be performed on the plurality of evaluation values and the plurality of target weights to obtain the target evaluation value corresponding to the shot image j. The specific formula is as follows:
target evaluation value = evaluation value 1 × target weight 1 + evaluation value 2 × target weight 2 + … + evaluation value n × target weight n
where evaluation value 1, evaluation value 2, …, evaluation value n are the evaluation values corresponding to the different category labels of the target, target weight 1 is the weight corresponding to evaluation value 1, target weight 2 is the weight corresponding to evaluation value 2, and so on.
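A minimal sketch of steps 4331-4333 follows; the category labels, evaluation values and weights are illustrative assumptions (with the weights chosen to sum to 1, as noted above).

```java
import java.util.List;
import java.util.Map;

// Sketch of steps 4331-4333: each feature parameter set yields one evaluation value,
// its category label yields one weight, and the target evaluation value is the weighted sum.
final class LocalEvaluation {

    record ScoredFeatureSet(String categoryLabel, double evaluationValue) {}

    // Preset mapping between category label and weight (assumed here to sum to 1).
    private static final Map<String, Double> WEIGHTS =
            Map.of("expression", 0.5, "featureDensity", 0.3, "contrast", 0.2);

    static double targetEvaluationValue(List<ScoredFeatureSet> sets) {
        double total = 0;
        for (ScoredFeatureSet s : sets) {
            total += s.evaluationValue() * WEIGHTS.getOrDefault(s.categoryLabel(), 0.0);
        }
        return total; // evaluation value 1 × weight 1 + ... + evaluation value n × weight n
    }

    public static void main(String[] args) {
        double score = targetEvaluationValue(List.of(
                new ScoredFeatureSet("expression", 0.9),
                new ScoredFeatureSet("featureDensity", 0.7),
                new ScoredFeatureSet("contrast", 0.6)));
        System.out.println(score); // 0.9*0.5 + 0.7*0.3 + 0.6*0.2 = 0.78
    }
}
```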
105. And the third-party application receives the target shooting image.
The third-party application may receive the target captured image and store the target captured image, or further process the target captured image through the third-party application.
In one possible example, after the step 105, the following steps may be further included:
a1, the third-party application acquires target image processing parameters;
and A2, processing the target shooting image according to the target image processing parameters to obtain a final image.
In this embodiment of the present application, the target image processing parameter may be at least one of the following: beauty, anti-shake, special effects treatment (e.g., changing hair style, wearing glasses, etc.), size adjustment, background blurring, etc., without limitation. In specific implementation, the target image processing parameters can be obtained, the target shot image is processed according to the target image processing parameters to obtain a final image, and the final image can be displayed.
In one possible example, the step a1 of obtaining the target image processing parameter may include the following steps:
a11, the third-party application detects the trigger operation of the target control of the third-party application;
a12, determining a target image processing parameter corresponding to the target control according to a preset mapping relation between the control and the image processing parameter.
The third-party application can include different controls, each corresponding to different image processing parameters, so that each control achieves a different image processing effect, for example, changing the hairstyle, adding glasses, applying an ancient-costume effect, and the like, without limitation herein.
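Steps A11-A12 amount to a lookup keyed by the control the user triggered; the sketch below, with invented control identifiers and parameter fields, shows the mapping and a placeholder for applying the resulting parameters to the target shot image.

```java
import java.util.Map;

// Sketch of steps A11-A12: the control the user triggers is looked up in a preset
// control -> image-processing-parameter mapping, and the resulting parameters are
// applied to the target shot image to obtain the final image.
final class ControlMapping {

    record ProcessingParams(boolean beauty, boolean backgroundBlur, String overlay) {}

    private static final Map<String, ProcessingParams> CONTROL_TO_PARAMS = Map.of(
            "btn_beauty",   new ProcessingParams(true,  false, ""),
            "btn_portrait", new ProcessingParams(false, true,  ""),
            "btn_glasses",  new ProcessingParams(false, false, "glasses"));

    static byte[] onControlTriggered(String controlId, byte[] targetImage) {
        ProcessingParams params = CONTROL_TO_PARAMS.get(controlId);   // step A12
        if (params == null) return targetImage;                       // unknown control: no-op
        return applyProcessing(targetImage, params);                  // step A2: produce the final image
    }

    private static byte[] applyProcessing(byte[] image, ProcessingParams params) {
        // Placeholder: a real implementation would run beauty / blur / overlay filters here.
        return image;
    }
}
```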
For example, as shown in fig. 1D, for a face image, based on the photographing control method provided in the embodiment of the present application, the following steps may be implemented:
s1, the third party application issues a photographing request;
s2, the media service module receives the photographing request and performs preliminary analysis on the request, and extracts target photographing parameters;
s3, the media service module sends the target shooting parameters to the driving layer;
s4, the driving layer drives the camera to shoot according to the target shooting parameters, obtaining 4 to 6 frames of data that are uploaded to the media service module as shooting data frames;
s5, the media service module calls a definition recognition algorithm to detect the data frames uploaded by the driving layer, and 3 frames of images with better definition are selected as alternative frames;
s6, the media service module calls a facial expression detection algorithm to detect the facial expressions in the alternative images, and selects the data frame with the best effect and transmits the data frame to a third party application;
and S7, the third-party application further post-processes the data frame with the best effect and stores it as the photographing result, as sketched in the example below.
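Read end to end, the S1-S7 example corresponds to a selection pipeline like the one sketched below; the frame type and the two scoring functions stand in for the driver-layer output and for the definition/expression detection algorithms, and none of these names are APIs defined by the patent.

```java
import java.util.Comparator;
import java.util.List;
import java.util.function.ToDoubleFunction;

// End-to-end sketch of the S1-S7 example flow above.
final class PhotoPipeline<Frame> {

    Frame selectBest(List<Frame> capturedFrames,                // S4: 4-6 frames from the driving layer
                     ToDoubleFunction<Frame> sharpness,         // S5: definition recognition algorithm
                     ToDoubleFunction<Frame> expressionScore) { // S6: facial expression detection
        // S5: keep the three sharpest frames as candidate frames.
        Comparator<Frame> bySharpness = Comparator.comparingDouble(sharpness);
        List<Frame> candidates = capturedFrames.stream()
                .sorted(bySharpness.reversed())
                .limit(3)
                .toList();
        // S6-S7: among the candidates, return the frame with the best expression,
        // which is then handed to the third-party application.
        return candidates.stream()
                .max(Comparator.comparingDouble(expressionScore))
                .orElseThrow();
    }
}
```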
It can be seen that the photographing control method described in the embodiments of the present application is applied to an electronic device. The electronic device includes a media service module and an operating system, and an application layer of the operating system is provided with a third-party application. The third-party application sends a photographing request to the media service module; the media service module receives the photographing request, parses it to obtain target shooting parameters, and sends the target shooting parameters to a driving layer of the operating system; the driving layer drives a camera of the electronic device to shoot according to the target shooting parameters to obtain multiple frames of shot images and sends them to the media service module; the media service module selects the target shot image with the best image quality from the multiple frames of shot images and sends it to the third-party application; and the third-party application receives the target shot image. Thus, when a photographing request is received, the media service module can actively select the optimal shooting data from the shooting data captured by the driving layer, which is equivalent to preprocessing the shooting data. On the one hand, this avoids the drawback of handing a large amount of bottom-layer shooting data to the third-party application for processing, whose power consumption is difficult to control; on the other hand, the third-party application completes the preprocessing of the shot image through the media service module, which better improves the shooting effect and enhances the user experience.
Referring to fig. 2 in a manner consistent with the embodiment shown in fig. 1C, fig. 2 is a schematic flowchart of a photographing control method provided in an embodiment of the present application, and as shown in the figure, the method is applied to an electronic device shown in fig. 1A, where the electronic device includes a media service module and an operating system, and an application layer of the operating system is provided with a third-party application; the photographing control method comprises the following steps:
201. and the third-party application sends a photographing request to the media service module.
202. The media service module receives the photographing request and analyzes the photographing request to obtain target photographing parameters; and sending the target shooting parameters to a driving layer of the operating system.
203. The driving layer drives the camera of the electronic equipment to shoot according to the target shooting parameters to obtain multi-frame shot images, and the multi-frame shot images are sent to the media service module.
204. And the media service module selects a target shooting image with the best image quality from the multi-frame shooting images and sends the target shooting image to the third-party application.
205. And the third-party application receives the target shooting image.
206. And the third party application acquires target image processing parameters.
207. And the third-party application processes the target shot image according to the target image processing parameters to obtain a final image.
For the detailed description of steps 201-207, reference may be made to steps 101-105 described above with respect to fig. 1C, which are not repeated here.
It can be seen that, in the photographing control method described in the embodiments of the present application, when a photographing request is received, the media service module actively selects the optimal shooting data from the shooting data captured by the driver layer, which is equivalent to preprocessing the shooting data; the selected data is then further processed for effects to obtain the effect image desired by the user.
Referring to fig. 3, fig. 3 is a schematic structural diagram of an electronic device according to an embodiment of the present disclosure, where as shown in the figure, the electronic device includes a processor, a memory, a communication interface, and one or more programs, the electronic device includes a media service module and an operating system, an application layer of the operating system is provided with a third-party application, and the one or more programs are stored in the memory and configured to be executed by the processor, where in an embodiment of the present disclosure, the program includes instructions for performing the following steps:
the third-party application sends a photographing request to the media service module;
the media service module receives the photographing request and analyzes the photographing request to obtain target photographing parameters; sending the target shooting parameters to a driving layer of the operating system;
the driving layer drives a camera of the electronic equipment to shoot according to the target shooting parameters to obtain multi-frame shot images, and the multi-frame shot images are sent to the media service module;
the media service module selects a target shooting image with the best image quality from the multi-frame shooting images and sends the target shooting image to the third-party application;
and the third-party application receives the target shooting image.
It can be seen that, in the electronic device described in the embodiments of the present application, the electronic device includes a media service module and an operating system, and an application layer of the operating system is provided with a third-party application. The third-party application sends a photographing request to the media service module; the media service module receives the photographing request, parses it to obtain target shooting parameters, and sends the target shooting parameters to a driving layer of the operating system; the driving layer drives a camera of the electronic device to shoot according to the target shooting parameters to obtain multiple frames of shot images and sends them to the media service module; the media service module selects the target shot image with the best image quality from the multiple frames of shot images and sends it to the third-party application; and the third-party application receives the target shot image. Thus, when a photographing request is received, the media service module can actively select the optimal shooting data from the shooting data captured by the driving layer, which is equivalent to preprocessing the shooting data. On the one hand, this avoids the drawback of handing a large amount of bottom-layer shooting data to the third-party application for processing, whose power consumption is difficult to control; on the other hand, the third-party application completes the preprocessing of the shot image through the media service module, which better improves the shooting effect and enhances the user experience.
In one possible example, the media service module selects a target captured image with the best image quality from the multiple frames of captured images, and includes:
carrying out overall image quality evaluation on the multiple frames of shot images to obtain a plurality of image quality evaluation values, wherein each shot image corresponds to one image quality evaluation value;
selecting an image quality evaluation value larger than a preset threshold value from the plurality of image quality evaluation values to obtain at least one target image quality evaluation value, and acquiring a shot image corresponding to the at least one target image quality evaluation value to obtain at least one first shot image;
performing local image quality evaluation on each shot image in the at least one first shot image to obtain at least one target evaluation value, wherein each shot image corresponds to one target evaluation value;
and determining the maximum value of the at least one target evaluation value, and acquiring a target shot image corresponding to the maximum value.
In one possible example, the photographing instruction carries a target environment parameter and a target attribute setting parameter;
the analyzing the photographing request to obtain target photographing parameters comprises:
the media service module determines a first shooting parameter corresponding to the target environment parameter according to a mapping relation between a preset environment parameter and a shooting parameter;
and taking the first shooting parameter and the target attribute setting parameter as the target shooting parameter.
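The mapping-based parsing can be illustrated as follows; the choice of ambient brightness (in lux) as the target environment parameter, the brightness bands, and the ISO/exposure values are assumptions used only to make the lookup concrete.

```python
# Illustrative mapping from an environment parameter (here, ambient brightness
# in lux) to a first shooting parameter; the bands and values are assumptions.
PRESET_ENV_TO_SHOOTING = [
    ((0, 50),    {"iso": 1600, "exposure_ms": 66}),   # dark scene
    ((50, 500),  {"iso": 400,  "exposure_ms": 33}),   # indoor
    ((500, 1e9), {"iso": 100,  "exposure_ms": 8}),    # bright / outdoor
]

def resolve_target_params(request):
    """Combine the mapped first shooting parameter with the attribute settings
    carried by the request (e.g. resolution, flash) into the target parameters."""
    lux = request["environment"]["brightness_lux"]
    first_params = next(params for (lo, hi), params in PRESET_ENV_TO_SHOOTING
                        if lo <= lux < hi)
    return {**first_params, **request["attribute_settings"]}

target = resolve_target_params({
    "environment": {"brightness_lux": 120},
    "attribute_settings": {"resolution": "4000x3000", "flash": False},
})
```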
In one possible example, the performing of the overall image quality evaluation on the multiple frames of captured images to obtain multiple image quality evaluation values includes the following (an illustrative sketch follows these steps):
the media service module determines a first definition corresponding to a shot image i, wherein the shot image i is any one of the multiple frames of shot images;
dividing the shot image i into a plurality of regions;
determining the distribution density of the characteristic points corresponding to each of the plurality of areas to obtain a plurality of distribution densities of the characteristic points;
performing mean square error operation according to the distribution densities of the plurality of characteristic points to obtain a target mean square error;
determining a target definition adjusting coefficient corresponding to the target mean square error according to a mapping relation between a preset mean square error and a definition adjusting coefficient;
adjusting the first definition according to the target definition adjusting coefficient to obtain a second definition;
and determining the image quality evaluation value corresponding to the second definition according to a preset mapping relation between the definition and the image quality evaluation value.
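One possible realization of this evaluation is sketched below using OpenCV. The Laplacian-variance sharpness measure, the corner detector used for feature points, the 4x4 grid, and the two mappings (mean square error to adjustment coefficient, adjusted definition to evaluation value) are assumptions; the embodiment only requires that such preset mappings exist.

```python
import cv2
import numpy as np

def overall_quality(image_bgr, grid=4):
    """Sketch of the overall evaluation: sharpness discounted according to how
    unevenly feature points are spread over the frame. Constants are assumed."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    # First definition (sharpness): variance of the Laplacian is a common proxy.
    first_sharpness = cv2.Laplacian(gray, cv2.CV_64F).var()
    # Detect feature points once, then count them per region of a grid.
    corners = cv2.goodFeaturesToTrack(gray, maxCorners=500,
                                      qualityLevel=0.01, minDistance=7)
    h, w = gray.shape
    densities = np.zeros(grid * grid)
    region_area = (h / grid) * (w / grid)
    if corners is not None:
        for x, y in corners.reshape(-1, 2):
            idx = int(y * grid / h) * grid + int(x * grid / w)
            densities[idx] += 1.0 / region_area
    # Mean square error of the per-region feature point distribution densities.
    target_mse = float(np.mean((densities - densities.mean()) ** 2))
    # Assumed mapping from the MSE to a definition adjustment coefficient:
    # the more uneven the distribution, the more the sharpness is discounted.
    coeff = 1.0 / (1.0 + target_mse * 1e4)
    second_sharpness = first_sharpness * coeff
    # Assumed mapping from the adjusted definition to an evaluation value in [0, 1].
    return float(1.0 - np.exp(-second_sharpness / 100.0))
```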
In one possible example, the performing the local image quality evaluation on each of the at least one first captured image to obtain at least one target evaluation value includes:
the media service module extracts a target of a first shot image j to obtain a target j, wherein the first shot image j is any one of the at least one first shot image;
extracting characteristic parameters of the target j to obtain at least one characteristic parameter set;
and determining a target evaluation value corresponding to the first shot image j according to the at least one characteristic parameter set.
In one possible example, the at least one feature parameter set is a plurality of feature parameter sets, each feature parameter set corresponds to a category label, and the determining of the target evaluation value corresponding to the first captured image j according to the at least one feature parameter set includes the following (an illustrative sketch follows these steps):
determining an evaluation value corresponding to each characteristic parameter set in the plurality of characteristic parameter sets to obtain a plurality of evaluation values, wherein each characteristic parameter set corresponds to one evaluation value;
determining a weight corresponding to each feature parameter set in the plurality of feature parameter sets according to a mapping relation between a preset category label and the weight to obtain a plurality of target weights;
and performing a weighting operation according to the plurality of evaluation values and the plurality of target weights to obtain the target evaluation value corresponding to the first shot image j.
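A sketch of the weighted local evaluation is given below. It assumes that target extraction and feature extraction have already produced one feature parameter set per category label; the category labels, preset weights, and per-set scoring function are illustrative assumptions.

```python
import numpy as np

# Assumed category labels and preset weights; the embodiment leaves these open.
PRESET_LABEL_WEIGHTS = {"face": 0.5, "texture": 0.3, "color": 0.2}

def local_quality(feature_sets, evaluate):
    """Sketch of the weighted local evaluation.

    feature_sets maps a category label to the feature parameters extracted from
    the target region of the image; evaluate turns one set into an evaluation value.
    """
    scores, weights = [], []
    for label, params in feature_sets.items():
        scores.append(evaluate(label, params))
        # Look up the target weight for this category label in the preset mapping.
        weights.append(PRESET_LABEL_WEIGHTS.get(label, 0.0))
    weights = np.asarray(weights)
    if weights.sum() == 0:
        return 0.0
    # Weighting operation over the per-set evaluation values.
    return float(np.dot(scores, weights) / weights.sum())
```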
In one possible example, the method further comprises:
the third party application acquires target image processing parameters;
and processing the target shot image according to the target image processing parameters to obtain a final image.
In one possible example, the obtaining, by the third-party application, of the target image processing parameters includes the following (a sketch of the control-to-parameter mapping follows these steps):
the third-party application detects a trigger operation of a target control for the third-party application;
and determining target image processing parameters corresponding to the target control according to a preset mapping relation between the control and the image processing parameters.
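The control-to-parameter lookup can be sketched as follows; the control identifiers and the image processing parameters in the preset mapping are assumptions, as is the caller-supplied function that applies them to the target shot image.

```python
# Assumed mapping between UI controls of the third-party application and the
# image processing parameters applied to the returned target shot image.
PRESET_CONTROL_TO_PARAMS = {
    "beauty_button": {"smooth_skin": 0.6, "whiten": 0.3},
    "hdr_toggle":    {"tone_map": "hdr"},
    "bw_filter":     {"saturation": 0.0},
}

def on_control_triggered(control_id, target_image, apply_params):
    """When a target control is triggered, look up its processing parameters
    and apply them to the target shot image to obtain the final image."""
    params = PRESET_CONTROL_TO_PARAMS.get(control_id, {})
    return apply_params(target_image, params)
```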
The above description has introduced the solution of the embodiments of the present application mainly from the perspective of the method-side implementation process. It can be understood that, in order to realize the above functions, the electronic device includes corresponding hardware structures and/or software modules for performing the respective functions. Those skilled in the art will readily appreciate that the units and algorithm steps of each example described in connection with the embodiments provided herein can be implemented by hardware or by a combination of hardware and computer software. Whether a function is performed by hardware or by computer software driving hardware depends on the particular application and design constraints of the technical solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments of the present application, the electronic device may be divided into functional units according to the above method example; for example, each functional unit may be divided corresponding to each function, or two or more functions may be integrated into one processing unit. The integrated unit may be implemented in the form of hardware or in the form of a software functional unit. It should be noted that the division of units in the embodiments of the present application is schematic and is only a division of logical functions, and there may be other division manners in actual implementation.
Fig. 4 is a block diagram of functional units of a photographing control apparatus 400 according to an embodiment of the present application. The photographing control apparatus 400 is applied to an electronic device, the electronic device includes a media service module and an operating system, and an application layer of the operating system is provided with a third-party application; the apparatus 400 includes a third-party application 401, a media service module 402, and a driver layer 403, wherein:
the third-party application 401 is configured to send a photographing request to the media service module;
the media service module 402 is configured to receive the photographing request, analyze the photographing request to obtain target photographing parameters, and send the target photographing parameters to a driver layer of the operating system;
the driving layer 403 is configured to drive a camera of the electronic device to perform shooting according to the target shooting parameter, obtain multiple frames of shot images, and send the multiple frames of shot images to the media service module;
the media service module 402 is further configured to select a target captured image with the best image quality from the multiple frames of captured images, and send the target captured image to the third-party application;
the third-party application 401 is further configured to receive the target captured image.
It can be seen that the photographing control apparatus described in the embodiment of the present application is applied to an electronic device, the electronic device includes a media service module and an operating system, and an application layer of the operating system is provided with a third-party application. The third-party application sends a photographing request to the media service module; the media service module receives the photographing request, analyzes the photographing request to obtain target photographing parameters, and sends the target photographing parameters to a driving layer of the operating system; the driving layer drives a camera of the electronic device to shoot according to the target photographing parameters to obtain multiple frames of shot images and sends the multiple frames of shot images to the media service module; the media service module selects a target shot image with the best image quality from the multiple frames of shot images and sends the target shot image to the third-party application; and the third-party application receives the target shot image. Therefore, when a photographing request is received, the media service module can actively select the optimal shot data from the shot data produced by the driving layer, which is equivalent to pre-processing the shot data. On the one hand, this avoids the drawback that handing a large amount of bottom-layer shot data to the third-party application for processing makes power consumption difficult to control; on the other hand, the third-party application completes the pre-processing of the shot image through the media service module, which can better improve the shooting effect and enhance the user experience.
In one possible example, in terms of selecting a target captured image with the best image quality from the multiple frames of captured images, the media service module 402 is specifically configured to:
carrying out overall image quality evaluation on the multiple frames of shot images to obtain a plurality of image quality evaluation values, wherein each shot image corresponds to one image quality evaluation value;
selecting an image quality evaluation value larger than a preset threshold value from the plurality of image quality evaluation values to obtain at least one target image quality evaluation value, and acquiring a shot image corresponding to the at least one target image quality evaluation value to obtain at least one first shot image;
performing local image quality evaluation on each shot image in the at least one first shot image to obtain at least one target evaluation value, wherein each shot image corresponds to one target evaluation value;
and determining the maximum value of the at least one target evaluation value, and acquiring a target shot image corresponding to the maximum value.
In one possible example, the photographing request carries a target environment parameter and a target attribute setting parameter;
in the aspect of analyzing the photographing request to obtain a target photographing parameter, the media service module 402 is specifically configured to:
determining a first shooting parameter corresponding to the target environment parameter according to a mapping relation between a preset environment parameter and the shooting parameter;
and taking the first shooting parameter and the target attribute setting parameter as the target shooting parameter.
In one possible example, in terms of the overall image quality evaluation of the multiple frames of captured images to obtain multiple image quality evaluation values, the media service module 402 is specifically configured to:
determining a first definition corresponding to a shot image i, wherein the shot image i is any one of the multiple frames of shot images;
dividing the shot image i into a plurality of regions;
determining the distribution density of the characteristic points corresponding to each of the plurality of areas to obtain a plurality of distribution densities of the characteristic points;
performing mean square error operation according to the distribution densities of the plurality of characteristic points to obtain a target mean square error;
determining a target definition adjusting coefficient corresponding to the target mean square error according to a mapping relation between a preset mean square error and a definition adjusting coefficient;
adjusting the first definition according to the target definition adjusting coefficient to obtain a second definition;
and determining the image quality evaluation value corresponding to the second definition according to a preset mapping relation between the definition and the image quality evaluation value.
In one possible example, in terms of the local image quality evaluation performed on each captured image of the at least one first captured image to obtain at least one target evaluation value, the media service module 402 is specifically configured to:
performing target extraction on a first shot image j to obtain a target j, wherein the first shot image j is any one of the at least one first shot image;
extracting characteristic parameters of the target j to obtain at least one characteristic parameter set;
and determining a target evaluation value corresponding to the first shot image j according to the at least one characteristic parameter set.
In a possible example, the at least one feature parameter set is a plurality of feature parameter sets, each feature parameter set corresponds to a category label, and in the aspect of determining the target evaluation value corresponding to the first captured image j according to the at least one feature parameter set, the media service module 402 is specifically configured to:
determining an evaluation value corresponding to each characteristic parameter set in the plurality of characteristic parameter sets to obtain a plurality of evaluation values, wherein each characteristic parameter set corresponds to one evaluation value;
determining a weight corresponding to each feature parameter set in the plurality of feature parameter sets according to a mapping relation between a preset category label and the weight to obtain a plurality of target weights;
and performing a weighting operation according to the plurality of evaluation values and the plurality of target weights to obtain the target evaluation value corresponding to the first shot image j.
In one possible example, the third-party application 401 is further configured to:
acquire target image processing parameters;
and process the target shot image according to the target image processing parameters to obtain a final image.
In a possible example, the third-party application 401 obtains target image processing parameters, specifically:
the third-party application 401 detects a trigger operation of a target control for the third-party application;
and determining target image processing parameters corresponding to the target control according to a preset mapping relation between the control and the image processing parameters.
It can be understood that the functions of each program module of the photographing control apparatus in this embodiment can be specifically implemented according to the method in the foregoing method embodiment, and the specific implementation process thereof can refer to the related description of the foregoing method embodiment, which is not described herein again.
Embodiments of the present application also provide a computer storage medium, where the computer storage medium stores a computer program for electronic data exchange, the computer program enabling a computer to execute part or all of the steps of any one of the methods described in the above method embodiments, and the computer includes an electronic device.
Embodiments of the present application also provide a computer program product comprising a non-transitory computer readable storage medium storing a computer program operable to cause a computer to perform some or all of the steps of any of the methods as described in the above method embodiments. The computer program product may be a software installation package, the computer comprising an electronic device.
It should be noted that, for simplicity of description, the above-mentioned method embodiments are described as a series of acts or combination of acts, but those skilled in the art will recognize that the present application is not limited by the order of acts described, as some steps may occur in other orders or concurrently depending on the application. Further, those skilled in the art should also appreciate that the embodiments described in the specification are preferred embodiments and that the acts and modules referred to are not necessarily required in this application.
In the foregoing embodiments, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus may be implemented in other manners. For example, the above-described apparatus embodiments are merely illustrative; the division of the units is only a division of logical functions, and other divisions may be adopted in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted or not executed. In addition, the shown or discussed mutual coupling, direct coupling, or communication connection may be an indirect coupling or communication connection through some interfaces, devices, or units, and may be in electrical or other forms.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit may be stored in a computer-readable memory if it is implemented in the form of a software functional unit and sold or used as a stand-alone product. Based on such understanding, the technical solution of the present application in essence, or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product, which is stored in a memory and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the methods of the embodiments of the present application. The aforementioned memory includes: a USB flash drive, a Read-Only Memory (ROM), a Random Access Memory (RAM), a removable hard disk, a magnetic disk, an optical disk, and other media capable of storing program codes.
Those skilled in the art will appreciate that all or part of the steps in the methods of the above embodiments may be implemented by a program instructing relevant hardware, and the program may be stored in a computer-readable memory, which may include: a flash memory disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, an optical disk, and the like.
The foregoing detailed description of the embodiments of the present application has been presented to illustrate the principles and implementations of the present application, and the above description of the embodiments is only provided to help understand the method and the core concept of the present application; meanwhile, for a person skilled in the art, according to the idea of the present application, there may be variations in the specific embodiments and the application scope, and in summary, the content of the present specification should not be construed as a limitation to the present application.

Claims (11)

1. A photographing control method, characterized in that the method is applied to electronic equipment, wherein the electronic equipment comprises a media service module and an operating system, the media service module is arranged independently of the operating system, and an application layer of the operating system is provided with a third-party application; the method comprises the following steps:
the third-party application sends a photographing request to the media service module;
the media service module receives the photographing request and analyzes the photographing request to obtain target photographing parameters; sending the target shooting parameters to a driving layer of the operating system;
the driving layer drives a camera of the electronic equipment to shoot according to the target shooting parameters to obtain multi-frame shot images, and the multi-frame shot images are sent to the media service module;
the media service module selects a target shooting image with the best image quality from the multi-frame shooting images and sends the target shooting image to the third-party application;
and the third-party application receives the target shooting image.
2. The method of claim 1, wherein the media service module selects a target captured image with the best image quality from the plurality of captured images, and comprises:
carrying out overall image quality evaluation on the multiple frames of shot images to obtain a plurality of image quality evaluation values, wherein each shot image corresponds to one image quality evaluation value;
selecting an image quality evaluation value larger than a preset threshold value from the plurality of image quality evaluation values to obtain at least one target image quality evaluation value, and acquiring a shot image corresponding to the at least one target image quality evaluation value to obtain at least one first shot image;
performing local image quality evaluation on each shot image in the at least one first shot image to obtain at least one target evaluation value, wherein each shot image corresponds to one target evaluation value;
and determining the maximum value of the at least one target evaluation value, and acquiring a target shot image corresponding to the maximum value.
3. The method of claim 1, wherein the photographing request carries target environment parameters and target attribute setting parameters;
the analyzing the photographing request to obtain target photographing parameters comprises:
the media service module determines a first shooting parameter corresponding to the target environment parameter according to a mapping relation between a preset environment parameter and a shooting parameter;
and taking the first shooting parameter and the target attribute setting parameter as the target shooting parameter.
4. The method according to claim 2, wherein the performing the overall image quality evaluation on the plurality of frames of captured images to obtain a plurality of image quality evaluation values comprises:
the media service module determines a first definition corresponding to a shot image i, wherein the shot image i is any one of the multiple frames of shot images;
dividing a shot image i into a plurality of regions;
determining the distribution density of the characteristic points corresponding to each of the plurality of areas to obtain a plurality of distribution densities of the characteristic points;
performing mean square error operation according to the distribution densities of the plurality of characteristic points to obtain a target mean square error;
determining a target definition adjusting coefficient corresponding to the target mean square error according to a mapping relation between a preset mean square error and a definition adjusting coefficient;
adjusting the first definition according to the target definition adjusting coefficient to obtain a second definition;
and determining the image quality evaluation value corresponding to the second definition according to a preset mapping relation between the definition and the image quality evaluation value.
5. The method according to claim 2, wherein the performing a local image quality evaluation on each of the at least one first captured image to obtain at least one target evaluation value comprises:
the media service module extracts a target of a first shot image j to obtain a target j, wherein the first shot image j is any one of the at least one first shot image;
extracting characteristic parameters of the target j to obtain at least one characteristic parameter set;
and determining a target evaluation value corresponding to the first shot image j according to the at least one characteristic parameter set.
6. The method of claim 5, wherein the at least one feature parameter set is a plurality of feature parameter sets, each feature parameter set corresponds to a category label, and the determining the target evaluation value corresponding to the first captured image j according to the at least one feature parameter set comprises:
the media service module determines an evaluation value corresponding to each characteristic parameter set in the plurality of characteristic parameter sets to obtain a plurality of evaluation values, wherein each characteristic parameter set corresponds to one evaluation value;
determining a weight corresponding to each feature parameter set in the plurality of feature parameter sets according to a mapping relation between a preset category label and the weight to obtain a plurality of target weights;
and performing a weighting operation according to the plurality of evaluation values and the plurality of target weights to obtain the target evaluation value corresponding to the first shot image j.
7. The method according to any one of claims 1-6, further comprising:
the third party application acquires target image processing parameters;
and processing the target shot image according to the target image processing parameters to obtain a final image.
8. The method of claim 7, wherein the third-party application obtains target image processing parameters, comprising:
the third-party application detects a trigger operation of a target control for the third-party application;
and determining target image processing parameters corresponding to the target control according to a preset mapping relation between the control and the image processing parameters.
9. The photographing control device is applied to electronic equipment, the electronic equipment comprises a media service module and an operating system, the media service module is independent of the operating system, and an application layer of the operating system is provided with a third-party application; the device comprises:
the third-party application is used for sending a photographing request to the media service module;
the media service module is used for receiving the photographing request and analyzing the photographing request to obtain target photographing parameters; transmitting the target photographing parameters to a driver layer of the operating system,
the driving layer is used for driving a camera of the electronic equipment to shoot according to the target shooting parameters to obtain multi-frame shot images and sending the multi-frame shot images to the media service module;
the media service module is further configured to select a target captured image with the best image quality from the multiple frames of captured images, and send the target captured image to the third-party application;
the third-party application is further used for receiving the target shooting image.
10. An electronic device comprising a processor, a memory for storing one or more programs and configured for execution by the processor, the programs comprising instructions for performing the steps in the method of any of claims 1-8.
11. A computer-readable storage medium, characterized in that a computer program for electronic data exchange is stored, wherein the computer program causes a computer to perform the method according to any one of claims 1-8.
CN201911253078.3A 2019-12-09 2019-12-09 Photographing control method and related product Active CN110933312B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911253078.3A CN110933312B (en) 2019-12-09 2019-12-09 Photographing control method and related product

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911253078.3A CN110933312B (en) 2019-12-09 2019-12-09 Photographing control method and related product

Publications (2)

Publication Number Publication Date
CN110933312A CN110933312A (en) 2020-03-27
CN110933312B true CN110933312B (en) 2021-07-06

Family

ID=69857681

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911253078.3A Active CN110933312B (en) 2019-12-09 2019-12-09 Photographing control method and related product

Country Status (1)

Country Link
CN (1) CN110933312B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111556249A (en) * 2020-05-19 2020-08-18 青岛海信移动通信技术股份有限公司 Image processing method based on ink screen, terminal and storage medium
CN111666124B (en) * 2020-05-29 2023-10-27 平安科技(深圳)有限公司 Image acquisition device calling method, device, computer device and storage medium
CN111901521B (en) * 2020-06-26 2021-12-10 深圳蚂里奥技术有限公司 Scene self-adaptive system, method and terminal based on image processing

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9148582B2 (en) * 2012-06-29 2015-09-29 Intel Corporation Method and system for perfect shot imaging from multiple images
CN105578042B (en) * 2015-12-18 2019-04-30 深圳市金立通信设备有限公司 A kind of transmission method and terminal of image data
CN105959530A (en) * 2016-04-26 2016-09-21 乐视控股(北京)有限公司 Method and system for invoking a camera function according to an individualized property of an application
CN106331504B (en) * 2016-09-30 2020-03-17 北京小米移动软件有限公司 Shooting method and device
CN108540726B (en) * 2018-05-15 2020-05-05 Oppo广东移动通信有限公司 Method and device for processing continuous shooting image, storage medium and terminal
CN110049244A (en) * 2019-04-22 2019-07-23 惠州Tcl移动通信有限公司 Image pickup method, device, storage medium and electronic equipment

Also Published As

Publication number Publication date
CN110933312A (en) 2020-03-27

Similar Documents

Publication Publication Date Title
CN110139033B (en) Photographing control method and related product
CN110020622B (en) Fingerprint identification method and related product
CN107862265B (en) Image processing method and related product
CN107172364B (en) Image exposure compensation method and device and computer readable storage medium
CN107679482B (en) Unlocking control method and related product
CN110113515B (en) Photographing control method and related product
CN110933312B (en) Photographing control method and related product
CN109508321B (en) Image display method and related product
CN109002787B (en) Image processing method and device, storage medium and electronic equipment
WO2019052329A1 (en) Facial recognition method and related product
CN110312032B (en) Audio playing method and device, electronic equipment and computer readable storage medium
CN107729889B (en) Image processing method and device, electronic equipment and computer readable storage medium
CN110191303B (en) Video call method, device and apparatus based on screen sound production and computer readable storage medium
CN110245607B (en) Eyeball tracking method and related product
CN110363702B (en) Image processing method and related product
CN111563466A (en) Face detection method and related product
CN113282317B (en) Optical fingerprint parameter upgrading method and related product
CN110198421B (en) Video processing method and related product
CN114302088A (en) Frame rate adjusting method and device, electronic equipment and storage medium
CN109639981B (en) Image shooting method and mobile terminal
CN112950525A (en) Image detection method and device and electronic equipment
CN110796673B (en) Image segmentation method and related product
CN111416936B (en) Image processing method, image processing device, electronic equipment and storage medium
CN110162264B (en) Application processing method and related product
CN110796147B (en) Image segmentation method and related product

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant