CN117135448A - Shooting method and electronic equipment

Shooting method and electronic equipment

Info

Publication number
CN117135448A
Authority
CN
China
Prior art keywords
camera
mode
link library
data stream
application program
Prior art date
Legal status
Granted
Application number
CN202310209672.2A
Other languages
Chinese (zh)
Other versions
CN117135448B (en)
Inventor
张丹
Current Assignee
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date
Filing date
Publication date
Application filed by Honor Device Co Ltd filed Critical Honor Device Co Ltd
Priority to CN202310209672.2A priority Critical patent/CN117135448B/en
Publication of CN117135448A publication Critical patent/CN117135448A/en
Application granted
Publication of CN117135448B publication Critical patent/CN117135448B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/667Camera operation mode switching, e.g. between still and video, sport and normal or high- and low-resolution modes
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)

Abstract

The application relates to the field of terminals and provides a shooting method and an electronic device. The method includes: detecting a first operation on the camera; in response to the first operation, acquiring a first data stream and first information; if the first information is the name information of the application program of the camera and the first data stream includes a first identifier, determining that the shooting mode of the camera is a video mode; if the first information is the name information of the application program of the camera and the first data stream includes a first image resolution and a first image format, determining that the shooting mode of the camera is a preview mode; if the first information is the name information of the application program of the camera and the first data stream includes a second image resolution and a second image format, determining that the shooting mode of the camera is a photographing mode; running, based on the shooting mode of the camera, a first link library file corresponding to the shooting mode of the camera; and displaying a first interface. Based on the scheme of the application, the camera can respond quickly to the user's operation, improving the user experience.

Description

Shooting method and electronic equipment
Technical Field
The present application relates to the field of terminals, and in particular, to a photographing method and an electronic device.
Background
With the development of photographing functions in electronic devices, camera applications are used more and more widely. The starting speed of the camera, or the speed of switching between different camera modes (such as a preview mode, a video recording mode, or a photographing mode), affects the user's photographing experience. Currently, starting the camera takes a long time; for example, in a scenario where the camera needs to be frequently turned on and off, the user can clearly perceive the waiting process when the camera starts, resulting in a poor photographing experience.
Therefore, how to increase the start-up speed of the camera becomes a problem to be solved.
Disclosure of Invention
The application provides a shooting method and electronic equipment, which can shorten the time for starting a camera application program or shorten the time for switching the shooting mode of a camera; the camera can quickly respond to the operation of the user, and the user experience is improved.
In a first aspect, a photographing method is provided, applied to an electronic device, and includes:
detecting a first operation on the camera;
responding to the first operation, acquiring a first data stream and first information, wherein the first data stream is a data stream sent by the camera to a hardware abstraction layer, and the first information is name information of an application program for starting the camera;
If the first information is name information of a third party application program, determining that a shooting mode of the camera is a first mode, wherein the first mode is a mode of calling the camera by the third party application program;
if the first information is name information of an application program of the camera and the first data stream comprises a first identifier, determining that a shooting mode of the camera is a video mode, wherein the first identifier is used for indicating the video mode;
if the first information is name information of an application program of the camera and the first data stream comprises a first image resolution and a first image format, determining that a shooting mode of the camera is a preview mode;
if the first information is name information of an application program of the camera and the first data stream comprises a second image resolution and a second image format, determining that a shooting mode of the camera is a photographing mode, wherein the second image format comprises the first image format and the second image resolution is larger than the first image resolution;
based on the shooting mode of the camera, running a first link library file corresponding to the shooting mode of the camera, wherein the first link library file is used for initializing the running environment of the electronic equipment when the electronic equipment runs the shooting mode of the camera;
And displaying a first interface, wherein the first interface is an interface of a shooting mode of the camera.
In the embodiment of the application, the electronic device can identify the shooting scene of the camera application, that is, the shooting mode of the camera that needs to be started, according to the first data stream and the first information; for example, the shooting modes may include: the first mode in which a third-party application calls the camera, a photographing mode, a preview mode, and a video recording mode. Based on the identified shooting mode, the electronic device runs only the first dynamic link library file corresponding to that shooting mode; it does not run the dynamic link library files corresponding to the other shooting modes. In other words, redundant link library files are not run, and the dynamic link library file is run only for the shooting mode that needs to be started. This avoids the long time consumption caused by running all dynamic link library files, so the time for starting the camera application or for switching the shooting mode of the camera can be shortened; the camera can respond quickly to the user's operation, and the user experience is improved.
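For illustration only, the mode-decision logic described above can be sketched in C++ as follows; the type names, field names, the native camera name parameter, and the resolution threshold are assumptions made for this sketch and are not defined by the present application.

```
#include <optional>
#include <string>

// Illustrative types; the present application does not define concrete structures.
enum class ShootingMode { ThirdPartyCall, Recording, Preview, Photo };

struct FirstDataStream {
    std::optional<int> operation_mode;  // the "first identifier"; set only for video recording
    int image_width = 0;                // image resolution
    int image_height = 0;
    std::string image_format;           // e.g. "YUV420" or "JPEG"
};

// first_info carries the name information of the application that starts the camera.
ShootingMode DetermineShootingMode(const std::string& first_info,
                                   const FirstDataStream& stream,
                                   const std::string& native_camera_name) {
    if (first_info != native_camera_name) {
        // First mode: a third-party application calls the camera.
        return ShootingMode::ThirdPartyCall;
    }
    if (stream.operation_mode.has_value()) {
        // The first identifier is present, so this is the video recording mode.
        return ShootingMode::Recording;
    }
    // Photographing uses the larger "second" resolution; preview uses the smaller "first" one.
    // The threshold below is purely illustrative.
    if (stream.image_width * stream.image_height >= 4096 * 3072) {
        return ShootingMode::Photo;
    }
    return ShootingMode::Preview;
}
```

In this sketch, the optional operation-mode value stands in for the first identifier, and the resolution comparison stands in for distinguishing the first (preview) and second (photographing) image resolutions.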
With reference to the first aspect, in some implementations of the first aspect, if the first information is name information of an application of the camera, and the first data stream includes the first identifier, the first image format, and the second image resolution, a shooting mode of the camera is the video recording mode.
With reference to the first aspect, in certain implementations of the first aspect, the hardware abstraction layer includes a camera hardware abstraction layer, the camera hardware abstraction layer includes a characteristic pool management module, the characteristic pool management module includes a shooting scene identification module and a dynamic link library, the shooting scene identification module is used for determining a shooting mode of the camera based on the first data stream and the first information, the dynamic link library includes a plurality of link library files, and the plurality of link library files includes the first link library file.
With reference to the first aspect, in certain implementations of the first aspect, the camera hardware abstraction layer further includes a characteristic topology map management module, where the characteristic topology map management module is configured to receive the first data stream sent by the camera; the characteristic topology map management module includes the characteristic pool management module.
With reference to the first aspect, in certain implementations of the first aspect, the electronic device does not run a link library file in the dynamic link library other than the first link library file before the first interface is displayed.
With reference to the first aspect, in certain implementations of the first aspect, the first operation is an operation to turn on the camera.
With reference to the first aspect, in certain implementations of the first aspect, the first operation is an operation to switch a shooting mode of the camera.
In a second aspect, an electronic device is provided, the electronic device comprising one or more processors and memory; the memory is coupled to the one or more processors, the memory for storing computer program code, the computer program code comprising computer instructions that the one or more processors call to cause the electronic device to perform:
detecting a first operation on the camera;
responding to the first operation, acquiring a first data stream and first information, wherein the first data stream is a data stream sent by the camera to a hardware abstraction layer, and the first information is name information of an application program for starting the camera;
if the first information is name information of a third party application program, determining that a shooting mode of the camera is a first mode, wherein the first mode is a mode of calling the camera by the third party application program;
If the first information is name information of an application program of the camera and the first data stream comprises a first identifier, determining that a shooting mode of the camera is a video mode, wherein the first identifier is used for indicating the video mode;
if the first information is name information of an application program of the camera and the first data stream comprises a first image resolution and a first image format, determining that a shooting mode of the camera is a preview mode;
if the first information is name information of an application program of the camera and the first data stream comprises a second image resolution and a second image format, determining that a shooting mode of the camera is a photographing mode, wherein the second image format comprises the first image format and the second image resolution is larger than the first image resolution;
based on the shooting mode of the camera, running a first link library file corresponding to the shooting mode of the camera, wherein the first link library file is used for initializing the running environment of the electronic equipment when the electronic equipment runs the shooting mode of the camera;
and displaying a first interface, wherein the first interface is an interface of a shooting mode of the camera.
With reference to the second aspect, in some implementations of the second aspect, if the first information is name information of an application of the camera, and the first data stream includes the first identifier, the first image format, and the second image resolution, a shooting mode of the camera is the video recording mode.
With reference to the second aspect, in certain implementations of the second aspect, the hardware abstraction layer includes a camera hardware abstraction layer, the camera hardware abstraction layer includes a characteristic pool management module, the characteristic pool management module includes a shooting scene identification module and a dynamic link library, the shooting scene identification module is configured to determine a shooting mode of the camera based on the first data stream and the first information, the dynamic link library includes a plurality of link library files, and the plurality of link library files includes the first link library file.
With reference to the second aspect, in certain implementations of the second aspect, the camera hardware abstraction layer further includes a characteristic topology map management module, where the characteristic topology map management module is configured to receive the first data stream sent by the camera; the characteristic topology map management module includes the characteristic pool management module.
With reference to the second aspect, in some implementations of the second aspect, the electronic device does not run a link library file of the dynamic link library other than the first link library file before the first interface is displayed.
With reference to the second aspect, in certain implementations of the second aspect, the first operation is an operation to turn on the camera.
With reference to the second aspect, in certain implementations of the second aspect, the first operation is an operation to switch a shooting mode of the camera.
In a third aspect, an electronic device is provided, comprising means for performing the method of photographing in the first aspect or any of the implementation forms of the first aspect.
In a fourth aspect, an electronic device is provided that includes one or more processors and memory; the memory is coupled with the one or more processors, the memory for storing computer program code comprising computer instructions that the one or more processors call to cause the electronic device to perform the method of photographing in the first aspect or any of the implementations of the first aspect.
In a fifth aspect, there is provided a chip system applied to an electronic device, the chip system comprising one or more processors for invoking computer instructions to cause the electronic device to perform the method of photographing in the first aspect or any of the implementations of the first aspect.
In a sixth aspect, there is provided a computer readable storage medium storing computer program code which, when executed by an electronic device, causes the electronic device to perform the method of photographing in the first aspect or any of the implementations of the first aspect.
In a seventh aspect, there is provided a computer program product comprising: computer program code which, when run by an electronic device, causes the electronic device to perform the method of photographing of the first aspect or any of the implementations of the first aspect.
In the embodiment of the application, the electronic device can identify the shooting scene of the camera application, that is, the shooting mode of the camera that needs to be started, according to the first data stream and the first information; for example, the shooting modes may include: the first mode in which a third-party application calls the camera, a photographing mode, a preview mode, and a video recording mode. Based on the identified shooting mode, the electronic device runs only the first dynamic link library file corresponding to that shooting mode; it does not run the dynamic link library files corresponding to the other shooting modes. In other words, redundant link library files are not run, and the dynamic link library file is run only for the shooting mode that needs to be started. This avoids the long time consumption caused by running all dynamic link library files, so the time for starting the camera application or for switching the shooting mode of the camera can be shortened; the camera can respond quickly to the user's operation, and the user experience is improved.
Drawings
FIG. 1 is a schematic diagram of a hardware system suitable for use in an electronic device of the present application;
FIG. 2 is a schematic diagram of an application scenario suitable for use in embodiments of the present application;
FIG. 3 is a schematic diagram of another application scenario suitable for use in embodiments of the present application;
FIG. 4 is a schematic diagram of a prior art method of photographing;
FIG. 5 is a schematic diagram of a software system suitable for use with the electronic device of the present application;
fig. 6 is a schematic diagram of a photographing method according to an embodiment of the present application;
FIG. 7 is a schematic diagram of a characteristic topology provided by an embodiment of the present application;
FIG. 8 is a schematic diagram of a characteristic topology after a trimming process according to an embodiment of the present application;
FIG. 9 is an interactive flowchart of a method of capturing images provided by an embodiment of the present application;
FIG. 10 is a schematic diagram of another photographing method according to an embodiment of the present application;
FIG. 11 is a schematic diagram of a graphical user interface provided by an embodiment of the present application;
fig. 12 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
fig. 13 is a schematic structural diagram of another electronic device according to an embodiment of the present application.
Detailed Description
In embodiments of the present application, the following terms "first", "second", etc. are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defining "a first" or "a second" may explicitly or implicitly include one or more such feature. In the description of the present embodiment, unless otherwise specified, the meaning of "plurality" is two or more.
In order to facilitate understanding of the embodiments of the present application, related concepts related to the embodiments of the present application will be briefly described.
1. Link library
In computers, some files are dedicated to storing reusable code blocks, for example, commonly used functions or classes; such files are commonly referred to as library files, or simply "libraries". A link library is a binary file generated by compiling and packaging a library file.
Illustratively, link libraries may include dynamic link libraries and static link libraries. A dynamic link library (dynamic link library, DLL) refers to a library file that is linked in a dynamic linking manner; a static link library refers to a library file that is linked in a static linking manner.
For example, the dynamic link library includes files with the suffix ".so".
For ease of understanding, in embodiments of the present application, the "link library" described above may be referred to as a "link library file"; the above-described "dynamic link library" may be referred to as a "dynamic link library file".
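As background only, the following C++ sketch shows the usual way a ".so" dynamic link library file is loaded and a symbol resolved at run time on a Linux/Android system using the standard dlopen/dlsym interfaces; the library path and the exported function name are hypothetical.

```
#include <dlfcn.h>

#include <cstdio>

int main() {
    // Hypothetical library path; ".so" is the usual suffix of a dynamic link library file.
    void* handle = dlopen("/vendor/lib64/libexample.so", RTLD_NOW);
    if (handle == nullptr) {
        std::fprintf(stderr, "dlopen failed: %s\n", dlerror());
        return 1;
    }
    // Resolve a hypothetical symbol exported by the library and call it.
    using InitFn = int (*)();
    auto init = reinterpret_cast<InitFn>(dlsym(handle, "example_init"));
    if (init != nullptr) {
        init();
    }
    dlclose(handle);
    return 0;
}
```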
2. Topology map
The topology map may also be referred to as a topology structure diagram; the topology map is used to represent the paths (or links) taken during data transmission.
The photographing method and the electronic device provided in the embodiments of the present application will be described below with reference to the accompanying drawings.
Fig. 1 shows a hardware system suitable for use in the electronic device of the application.
The electronic device 100 may be a mobile phone, a smart screen, a tablet computer, a wearable electronic device, an in-vehicle electronic device, an augmented reality (augmented reality, AR) device, a Virtual Reality (VR) device, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, a personal digital assistant (personal digital assistant, PDA), a projector, etc., and the specific type of the electronic device 100 is not limited in the embodiments of the present application.
The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (universal serial bus, USB) interface 130, a charge management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, keys 190, a motor 191, an indicator 192, a camera 193, a display 194, and a subscriber identity module (subscriber identification module, SIM) card interface 195, etc. The sensor module 180 may include a pressure sensor 180A, a gyro sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
The configuration shown in fig. 1 does not constitute a specific limitation on the electronic apparatus 100. In other embodiments of the application, electronic device 100 may include more or fewer components than those shown in FIG. 1, or electronic device 100 may include a combination of some of the components shown in FIG. 1, or electronic device 100 may include sub-components of some of the components shown in FIG. 1. The components shown in fig. 1 may be implemented in hardware, software, or a combination of software and hardware.
Processor 110 may include one or more processing units. For example, the processor 110 may include at least one of the following processing units: application processors (application processor, AP), modem processors, graphics processors (graphics processing unit, GPU), image signal processors (image signal processor, ISP), controllers, video codecs, digital signal processors (digital signal processor, DSP), baseband processors, neural-Network Processors (NPU). The different processing units may be separate devices or integrated devices. The controller can generate operation control signals according to the instruction operation codes and the time sequence signals to finish the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that the processor 110 has just used or recycled. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory. Repeated accesses are avoided and the latency of the processor 110 is reduced, thereby improving the efficiency of the system.
In some embodiments, the processor 110 may include one or more interfaces. For example, the processor 110 may include at least one of the following interfaces: inter-integrated circuit (inter-integrated circuit, I2C) interfaces, inter-integrated circuit audio (inter-integrated circuit sound, I2S) interfaces, pulse code modulation (pulse code modulation, PCM) interfaces, universal asynchronous receiver transmitter (universal asynchronous receiver/transmitter, UART) interfaces, mobile industry processor interfaces (mobile industry processor interface, MIPI), general-purpose input/output (GPIO) interfaces, SIM interfaces, USB interfaces.
Illustratively, in an embodiment of the present application, the processor 110 may be configured to perform the method of photographing provided by the embodiment of the present application; for example, a first operation on the camera is detected; in response to the first operation, acquiring a first data stream and first information, wherein the first data stream is a data stream sent by the camera to a hardware abstraction layer, and the first information is name information of an application program for starting the camera; if the first information is the name information of the third party application program, determining that the shooting mode of the camera is a first mode, wherein the first mode is a mode of calling the camera by the third party application program; if the first information is the name information of the application program of the camera and the first data stream comprises a first identifier, determining that the shooting mode of the camera is a video mode, wherein the first identifier is used for indicating the video mode; if the first information is the name information of the application program of the camera and the first data stream comprises the first image resolution and the first image format, determining that the shooting mode of the camera is a preview mode; if the first information is the name information of the application program of the camera and the first data stream comprises a second image resolution and a second image format, determining that the shooting mode of the camera is a photographing mode, wherein the second image format comprises the first image format and the second image resolution is larger than the first image resolution; based on the shooting mode of the camera, running a first link library file corresponding to the shooting mode of the camera, wherein the first link library file is used for initializing the running environment of the electronic equipment when the electronic equipment runs the shooting mode of the camera; and displaying a first interface, wherein the first interface is an interface of the shooting mode of the camera.
The connection relationships between the modules shown in fig. 1 are merely illustrative, and do not constitute a limitation on the connection relationships between the modules of the electronic device 100. Alternatively, the modules of the electronic device 100 may also use a combination of the various connection manners in the foregoing embodiments.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modem processor, the baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover a single or multiple communication bands. Different antennas may also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed into a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The electronic device 100 may implement display functions through a GPU, a display screen 194, and an application processor. The GPU is a microprocessor for image processing, and is connected to the display 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
Illustratively, the display screen 194 may be used to display images or video. The display screen 194 includes a display panel. The display panel may employ a liquid crystal display (liquid crystal display, LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini light-emitting diode (Mini LED), a Micro light-emitting diode (Micro LED), a Micro OLED (Micro OLED), or a quantum dot LED (quantum dot light emitting diodes, QLED). In some embodiments, the electronic device 100 may include 1 or N display screens 194, where N is a positive integer greater than 1.
Illustratively, the electronic device 100 may implement a photographing function through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
Illustratively, the ISP is used to process data fed back by the camera 193. For example, when photographing, the shutter is opened, light is transmitted to the camera photosensitive element through the camera, the optical signal is converted into an electric signal, and the camera photosensitive element transmits the electric signal to the ISP for processing and is converted into an image visible to naked eyes. The ISP can carry out algorithm optimization on noise, brightness and color of the image, and can optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in the camera 193.
Illustratively, a camera 193 (which may also be referred to as a lens) is used to capture still images or video. The shooting function can be realized by triggering and starting through an application program instruction, such as shooting and acquiring an image of any scene. The camera may include imaging lenses, filters, image sensors, and the like. Light rays emitted or reflected by the object enter the imaging lens, pass through the optical filter and finally are converged on the image sensor. The imaging lens is mainly used for converging and imaging light emitted or reflected by all objects (also called a scene to be shot and a target scene, and also called a scene image expected to be shot by a user) in a shooting view angle; the optical filter is mainly used for filtering out redundant light waves (such as light waves except visible light, such as infrared light) in the light; the image sensor may be a charge coupled device (charge coupled device, CCD) or a Complementary Metal Oxide Semiconductor (CMOS) phototransistor. The image sensor is mainly used for photoelectrically converting a received optical signal into an electrical signal, and then transmitting the electrical signal to the ISP to be converted into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard RGB, YUV, or the like format.
Illustratively, the digital signal processor is configured to process digital signals, and may process other digital signals in addition to digital image signals. For example, when the electronic device 100 selects a frequency bin, the digital signal processor is used to fourier transform the frequency bin energy, or the like.
Illustratively, video codecs are used to compress or decompress digital video. The electronic device 100 may support one or more video codecs. In this way, the electronic device 100 may play or record video in a variety of encoding formats, such as: moving picture experts group (moving picture experts group, MPEG) 1, MPEG2, MPEG3, and MPEG4.
Illustratively, the gyroscopic sensor 180B may be used to determine a motion pose of the electronic device 100. In some embodiments, the angular velocity of electronic device 100 about three axes (i.e., x-axis, y-axis, and z-axis) may be determined by gyro sensor 180B. The gyro sensor 180B may be used for photographing anti-shake. For example, when the shutter is pressed, the gyro sensor 180B detects the shake angle of the electronic device 100, calculates the distance to be compensated by the lens module according to the angle, and makes the lens counteract the shake of the electronic device 100 through the reverse motion, so as to realize anti-shake. The gyro sensor 180B can also be used for scenes such as navigation and motion sensing games.
For example, the acceleration sensor 180E may detect the magnitude of acceleration of the electronic device 100 in various directions (typically, x-axis, y-axis, and z-axis). The magnitude and direction of gravity may be detected when the electronic device 100 is stationary. The acceleration sensor 180E may also be used to recognize the gesture of the electronic device 100 as an input parameter for applications such as landscape switching and pedometer.
Illustratively, a distance sensor 180F is used to measure distance. The electronic device 100 may measure the distance by infrared or laser. In some embodiments, for example, in a shooting scene, the electronic device 100 may range using the distance sensor 180F to achieve fast focus.
Illustratively, ambient light sensor 180L is used to sense ambient light level. The electronic device 100 may adaptively adjust the brightness of the display 194 based on the perceived ambient light level. The ambient light sensor 180L may also be used to automatically adjust white balance when taking a photograph. Ambient light sensor 180L may also cooperate with proximity light sensor 180G to detect whether electronic device 100 is in a pocket to prevent false touches.
Illustratively, the fingerprint sensor 180H is used to capture a fingerprint. The electronic device 100 may utilize the collected fingerprint feature to perform functions such as unlocking, accessing an application lock, taking a photograph, and receiving an incoming call.
Illustratively, the touch sensor 180K is also referred to as a "touch device". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen. The touch sensor 180K is used to detect a touch operation acting on or near it. The touch sensor 180K may communicate the detected touch operation to the application processor to determine the touch event type. Visual output related to the touch operation may be provided through the display screen 194. In other embodiments, the touch sensor 180K may also be disposed on the surface of the electronic device 100 at a different location than the display screen 194.
Currently, the time required for starting the camera is long; for example, in a scene of switching on and off the camera, the user can obviously perceive the waiting process when the camera is started, so that the photographing experience of the user is poor.
Scene one: in the process of starting the camera, the time consumption is long; the user can obviously perceive that a short black screen phenomenon appears.
Illustratively, as shown in fig. 2, the graphical user interface (graphical user interface, GUI) shown in fig. 2 (a) is the desktop 210 of the electronic device; the electronic device detects a click operation on the control 220 of the camera application on the desktop, as shown in fig. 2 (b); after the electronic device detects a click operation on the control 220 of the camera application on the desktop, a black screen may appear during the process of opening the camera application, as shown in (c) of fig. 2; the photo preview interface 230 is displayed after the black screen, as shown in fig. 2 (d).
Scene II: when the camera application switches from the preview mode to the photographing mode, the waiting time is long. For example, in the preview mode, the electronic device detects that the user clicks the shutter key; there is a noticeable delay between clicking the shutter key and displaying the thumbnail of the captured image. The user usually regards the display of the thumbnail of the captured image as the completion of one shot, so the waiting time perceived by the user is long.
Illustratively, as shown in fig. 3, the display interface 240 shown in (a) of fig. 3 is a display interface of the preview mode of the camera application, and the display interface 240 includes a photographing control 241. The electronic device detects a click operation on the photographing control 241, as shown in (b) of fig. 3; when the electronic device detects the click operation on the photographing control 241, the camera application switches from the preview mode to the photographing mode. The electronic device detects a release (lift) operation on the photographing control 241, as shown in (c) of fig. 3; after the electronic device detects the release operation on the photographing control 241 and runs the photographing mode to generate a thumbnail of the captured image, the display interface 250 is displayed, as shown in (d) of fig. 3. Currently, the user can clearly perceive that the waiting time between the display interface shown in (c) of fig. 3 and the display interface 250 shown in (d) of fig. 3 is long, resulting in a poor photographing experience.
Scene III: when the camera is invoked by a third-party application (e.g., a payment application scanning a code, or a third-party camera application), the start-up also takes a long time; the user can clearly perceive a brief black screen.
At present, the existing shooting method loads all link libraries, that is, all link libraries inherent to the chip and all customized link libraries; as a result, the time required to start the camera application is long, the user waits longer for the camera application to start, and the user experience is poor.
The flow of the conventional photographing method is described below with reference to fig. 4.
Fig. 4 is a flow chart of a conventional photographing method. The method 300 as shown in fig. 4 includes S310 to S350; s310 to S350 are described below, respectively.
S310, detecting an operation on the camera application.
Optionally, an operation is detected indicating that the camera application is turned on.
Illustratively, a click operation on the camera application is detected, as shown in (b) of fig. 2.
Optionally, a click operation is detected on a control of the camera functionality in the third party application.
Illustratively, when the electronic device is running an instant messaging application, a click operation of a control of the camera function is detected; or when the electronic equipment runs the payment application program, detecting clicking operation of a control of the camera function; alternatively, a click operation on the third party camera application is detected.
It should be appreciated that the above is illustrated with a click operation; in the embodiment of the application, the clicking operation can also be an operation of starting the camera application program through voice indication; or, indicating an operation of starting the camera application program through other instructions; the present application is not limited in any way.
S320, the camera application program sends a data stream.
S330, sending the data stream to the hardware abstraction layer through the application framework layer.
S340, data flow configuration processing.
It should be appreciated that the data stream configuration process may refer to initializing the operating environment of the electronic device.
The implementation of the data stream configuration process in S340 may include S341 to S343.
S341, loading all the link libraries.
Alternatively, all the link libraries may include link libraries inherent to the load chip and link libraries customized by the load; the link library of the chip platform can refer to a link library inherent to the chip; the inherent link library of the chip is a link library which is necessarily loaded when the chip is operated, and the link library cannot be not loaded under the control of a user; the customized link library may refer to a self-developed (or preconfigured) link library for different electronic devices.
It should be understood that the link library refers to a binary file generated by compiling and packaging an open source library file.
For example, the chip-inherent link library may include, but is not limited to, the following:
a frame synchronization library file (achorsync.so), a demux library file (demux.so), a fusion library file (fusion.so), a universal library file (generic.so), a multi-frame super-resolution library file (mfsr.so), a copy library file (memcpy.so), a serializer library file (serializer.so), and the like.
For example, the customized link library may include, but is not limited to, the following files:
a high dynamic range algorithm library file (rawhdr.so), a super night scene algorithm library file (rawsupernight.so), and the like.
It should be appreciated that a customized link library refers to a link library in the electronic device that is not inherent to the chip; that is, it is not a chip-inherent link library.
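To make the legacy behavior of S341 concrete, the following C++ sketch loads every library in both categories up front; the directory path and the grouping are illustrative, and the file names follow the examples listed above.

```
#include <dlfcn.h>

#include <string>
#include <vector>

// In the existing method (S341), every chip-inherent link library and every
// customized link library is loaded, regardless of which camera mode is being
// started. File names follow the examples above; the directory is illustrative.
void LoadAllLinkLibraries() {
    const std::vector<std::string> all_libraries = {
        // chip-inherent link libraries
        "achorsync.so", "demux.so", "fusion.so", "generic.so",
        "mfsr.so", "memcpy.so", "serializer.so",
        // customized link libraries
        "rawhdr.so", "rawsupernight.so",
    };
    for (const std::string& name : all_libraries) {
        // Loading every library up front is what makes camera start-up slow.
        dlopen(("/vendor/lib64/camera/" + name).c_str(), RTLD_NOW);
    }
}
```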
S342, selecting a characteristic topological graph.
The characteristic topology map can be understood as the logical execution order of the loaded link libraries; for example, 4 different characteristic topology maps are shown in (a), (b), (c), and (d) of fig. 7.
It should be noted that selecting a characteristic topology map can be understood as selecting the execution logic of the link libraries loaded in S341.
S343, cutting the characteristic topological graph.
The cropping process may refer to a process of determining one topology map from a plurality of characteristic topology maps.
For example, the characteristic topology map may be cropped based on a request initiation instruction in the electronic device.
It should be understood that a characteristic (feature) may refer to the implementation of some function of the camera application; the topology maps that can be selected based on one characteristic may include a plurality of topology maps, and cropping the characteristic topology map means determining, from the plurality of topology maps, the topology map that is finally adopted. For example, executing S342 based on a certain characteristic results in the 4 different topology structures shown in fig. 7; executing S343 further determines, from the selected characteristic topology maps, the topology map finally adopted for that characteristic; the schematic diagram shown in fig. 8 is obtained based on the link libraries that need to be loaded and the finally adopted topology map. As shown in fig. 8, each of F-1, F-2, F-3, F-4, and F-5 represents a characteristic; a characteristic can be understood as a certain function of the camera application, and its implementation relies on the link library corresponding to that characteristic. By loading the link library, the camera application can realize a certain function, and the running environment of the electronic device is initialized.
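As an illustration of the cropping step, the following C++ sketch models a characteristic topology as an ordered list of features and selects one candidate according to a request key; the structure and the selection rule are assumptions for this sketch, not the actual topology representation.

```
#include <string>
#include <vector>

// A characteristic topology is modeled here as an ordered list of features
// (e.g. "F-1" ... "F-5"), i.e. the execution order of the link libraries that
// implement them. The request key is an illustrative stand-in for the request
// initiation instruction mentioned above.
struct FeatureTopology {
    std::string request_type;           // hypothetical key carried by the request
    std::vector<std::string> features;  // execution order, e.g. {"F-1", "F-3", "F-5"}
};

// Cropping: keep only the topology that matches the current request.
const FeatureTopology* CropTopology(const std::vector<FeatureTopology>& candidates,
                                    const std::string& request_type) {
    for (const FeatureTopology& topology : candidates) {
        if (topology.request_type == request_type) {
            return &topology;
        }
    }
    return nullptr;  // no matching topology
}
```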
S350, instantiation processing.
The electronic device allocates running memory to run the characteristic topology map obtained after the cropping process.
It should be understood that instantiation processing may refer to the process of materializing an abstract object.
It should be noted that an analysis of the existing photographing method shown in fig. 4 shows that the main source of the time consumed when starting the camera application or switching between different photographing modes is loading the link libraries of the chip platform and loading the customized characteristic (feature) dynamic link libraries, i.e., S341 in fig. 4. Because the existing shooting method loads all the chip-inherent link libraries and all the customized link libraries, that is, it loads all the dynamic link libraries when initializing the running environment of the electronic device, and loading all the dynamic link libraries takes a long time, starting the camera application or switching the camera mode of the camera application takes a long time; the user waits longer, and the user experience is poor.
In view of the above, the embodiments of the present application provide a shooting method and an electronic device. According to the shooting method provided by the embodiments of the present application, the shooting scene of the electronic device is identified according to the first data stream and the first information; for example, if the first information is the name information of a third party application program, determining that the shooting mode of the camera is a first mode, wherein the first mode is a mode in which the third party application program calls the camera; if the first information is the name information of the application program of the camera and the first data stream comprises a first identifier, determining that the shooting mode of the camera is a video mode, wherein the first identifier is used for indicating the video mode; if the first information is the name information of the application program of the camera and the first data stream comprises a first image resolution and a first image format, determining that the shooting mode of the camera is a preview mode; if the first information is the name information of the application program of the camera and the first data stream comprises a second image resolution and a second image format, determining that the shooting mode of the camera is a photographing mode; based on the shooting mode of the camera, running a first link library file corresponding to the shooting mode of the camera, wherein the first link library file is used for initializing the running environment of the electronic device when the electronic device runs the shooting mode of the camera; and displaying a first interface, wherein the first interface is an interface of the shooting mode of the camera. With the scheme of the application, redundant link library files are not run, and the dynamic link library file is run only for the shooting mode that needs to be started; thus, the long time consumption caused by running all dynamic link library files can be avoided; therefore, the time for starting the camera application or for switching the shooting mode of the camera can be shortened; the camera can respond quickly to the user's operation, and the user experience is improved.
Fig. 5 is a schematic diagram of a software system of an electronic device according to an embodiment of the present application.
As shown in fig. 5, an application layer 410, an application framework layer 420, a hardware abstraction layer 430, a driver layer 440, and a hardware layer 450 may be included in the system architecture.
Illustratively, the application layer 410 may include a camera application, a third party application; wherein the third party application has a control therein that invokes the camera application.
Optionally, the application layer 410 may also include gallery, calendar, call, map, navigation, WLAN, bluetooth, music, video, short message, etc. applications.
Illustratively, the application framework layer 420 provides an application programming interface (application programming interface, API) and programming framework for application-layer applications; the application framework layer may include some predefined functions.
For example, window manager, content provider, resource manager, notification manager, and view system, etc., are included in the application framework layer 420.
Wherein, the window manager is used for managing window programs; the window manager may obtain the display screen size, determine if there are status bars, lock screens, and intercept screens.
The content provider is used to store and retrieve data and make such data accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, and phonebooks.
The resource manager provides various resources for the application program, such as localization strings, icons, pictures, layout files, and video files.
The notification manager allows the application to display notification information in a status bar, can be used to communicate notification type messages, can automatically disappear after a short dwell, and does not require user interaction. Such as a notification manager, is used for download completion notification and message alerting. The notification manager may also manage notifications that appear in the system top status bar in the form of charts or scroll bar text, such as notifications for applications running in the background. The notification manager may also manage notifications that appear on the screen in the form of dialog windows, such as prompting text messages in status bars, sounding prompts, vibrating electronic devices, and flashing lights.
The view system includes visual controls, such as controls to display text and controls to display pictures. The view system may be used to build applications. The display interface may be composed of one or more views, for example, a display interface including a text notification icon may include a view displaying text and a view displaying a picture.
Illustratively, the hardware abstraction layer 430 is used to abstract hardware.
For example, the hardware abstraction layer 430 includes a camera hardware abstraction layer 431. The camera hardware abstraction layer 431 includes a characteristic topology map management module and a dynamic link library; the characteristic topology map management module includes a characteristic pool management module; the characteristic pool management module includes a shooting scene identification module and the dynamic link library. The characteristic topology map management module is responsible for the processes related to creating, selecting, and executing the characteristic topology map; the characteristic pool management module handles the processes related to loading all the link libraries; the shooting scene identification module is used to identify the shooting scene of the camera application. For example, shooting scenes include, but are not limited to: a preview scene of the camera application, a photographing scene of the camera application, a video recording scene of the camera application, and a scene in which a third-party application calls the camera application, such as a code-scanning scene of a payment application or a photographing scene of a third-party camera application.
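The module hierarchy described above can be pictured roughly with the following C++ sketch; the class names, member names, and the string-based recognizer result are descriptive assumptions rather than real interfaces.

```
#include <map>
#include <string>
#include <vector>

// Rough structural sketch of the camera hardware abstraction layer 431 described
// above; the class and member names are descriptive assumptions, not real APIs.
class ShootingSceneRecognizer {
public:
    // Determines the shooting scene (preview / photographing / video recording /
    // third-party call) from the application name and the stream configuration.
    std::string Recognize(const std::string& app_name,
                          const std::map<std::string, std::string>& stream_config) const {
        (void)app_name;
        (void)stream_config;
        return "preview";  // placeholder result for the sketch
    }
};

class FeaturePoolManager {                     // characteristic pool management module
public:
    ShootingSceneRecognizer scene_recognizer;  // shooting scene identification module
    // Dynamic link library: shooting mode -> link library files needed by that mode.
    std::map<std::string, std::vector<std::string>> dynamic_link_library;
};

class FeatureTopologyManager {                 // characteristic topology map management module
public:
    FeaturePoolManager characteristic_pool;    // characteristic pool management module
};

class CameraHardwareAbstractionLayer {         // camera hardware abstraction layer 431
public:
    FeatureTopologyManager topology_manager;
};
```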
Illustratively, the driver layer 440 is used to provide drivers for different hardware devices.
For example, the drive layer may include a camera drive.
Illustratively, the hardware layer 450 is located at the bottom layer of the software system.
For example, hardware layer 450 may include a camera module; the camera module may be used to capture images.
The photographing method provided by the embodiment of the present application is described in detail below with reference to fig. 6 to 10.
Fig. 6 is a schematic flowchart of a photographing method provided by an embodiment of the present application. The method 500 may be performed by the electronic device shown in fig. 1; the method 500 includes S510 to S550, and S510 to S550 are described in detail below, respectively.
S510, an operation for the camera application is detected.
Case one
Optionally, an operation to turn on the camera application is detected.
Illustratively, an operation to the camera application is detected, and the camera application is started in response to the operation.
For example, as shown in (b) of fig. 2, a click operation on the camera application is detected, and the camera application is started.
It should be noted that, the camera 220 shown in (b) in fig. 2 may refer to a camera in the electronic device; i.e. a camera application that is self-contained in the electronic device.
Case two
Optionally, a camera application in the electronic device is running, and an operation to switch a camera mode of the camera application is detected.
For example, the electronic device is running in preview mode to display a preview interface; when the electronic equipment detects the operation of the shutter key, the electronic equipment is switched to a photographing mode, and the photographing mode is operated.
For example, the electronic device is running in preview mode to display a preview interface; when the electronic equipment detects the operation of the video control, the electronic equipment operates a preview mode to display a video preview interface; when the electronic equipment detects the operation of the shutter key, the electronic equipment is switched to a video mode and operates the video mode.
For example, the electronic device is running a video mode; when the electronic equipment detects the operation of the snap shot shutter key, the electronic equipment is switched to a photographing mode, and the photographing mode is operated.
Case three
Optionally, an operation of invoking the camera application by the third party application is detected.
Illustratively, an operation of a camera functionality control in the third party application is detected, and the camera application is started in response to the operation.
For example, while the electronic device is running an instant messaging type application, a click operation of a control of the camera function is detected.
For example, when the electronic device is running a payment type application, a click operation of a control of the camera function is detected; for example, a click operation on a scan code control in a payment type application is detected.
For example, a click operation on the third party camera application is detected, indicating that the third party camera application is started.
It should be appreciated that the above is illustrated with a click operation; in the embodiment of the application, the clicking operation can also be an operation of starting the camera application program through voice indication; or, indicating an operation of starting the camera application program through other instructions; the present application is not limited in any way.
S520, the camera application transmits the data stream.
Optionally, in response to the operation on the camera application, the camera application sends a data stream; the data stream includes configuration information, where the configuration information includes information such as the requested image format, the image resolution size, or an identifier of the camera mode.
Illustratively, if a photographing mode operation of starting the camera application is detected, the camera application transmits a data stream; wherein, the data stream comprises configuration information; the configuration information comprises an image format and an image resolution size; the image format may be: YUV420, or JPEG, etc.; the image resolution size may be 4096×3072.
Illustratively, if a record mode operation to turn on the camera application is detected, the camera application transmits a data stream; wherein, the data stream comprises configuration information; the configuration information comprises an image format, an image resolution and an identification of a shooting mode; the image format may be: YUV420, etc.; the image resolution size may be 4096×3072; the flag 1 may be used to indicate a recording mode.
For example, the data stream includes information in the "operation mode" field; operation mode indicates a recording mode.
It should be noted that, the image format and the image resolution of the video mode and the photographing mode may be the same; for the video mode, identification information for indicating the video mode may be included in the data stream.
Illustratively, if a preview mode operation is detected that turns on the camera application, the camera application transmits a data stream; wherein, the data stream comprises configuration information; the configuration information comprises an image format and an image resolution size; the image format may be: YUV420; the image resolution size may be 1440 x 1080.
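Putting the three examples above side by side, a possible shape of the configuration information is sketched below in C++; the field names are assumptions, and the values mirror the examples given for the photographing, video recording, and preview modes.

```
#include <optional>
#include <string>

// Illustrative shape of the configuration information carried in the data stream;
// the field names are assumptions, and the values mirror the examples above.
struct StreamConfiguration {
    std::string image_format;           // e.g. "YUV420" or "JPEG"
    int width = 0;                      // image resolution size
    int height = 0;
    std::optional<int> operation_mode;  // identifier of the camera mode; set for video recording
};

// Photographing mode: larger resolution, no camera-mode identifier.
const StreamConfiguration kPhotoConfig{"JPEG", 4096, 3072, std::nullopt};
// Video recording mode: carries the identifier (the value 1 here is illustrative).
const StreamConfiguration kRecordConfig{"YUV420", 4096, 3072, 1};
// Preview mode: smaller resolution, no identifier.
const StreamConfiguration kPreviewConfig{"YUV420", 1440, 1080, std::nullopt};
```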
S530, sending the data stream to the hardware abstraction layer through the application framework layer.
Optionally, the camera application in the application layer sends a data stream to the application framework layer; the application framework layer sends a data stream to the hardware abstraction layer; wherein the data stream comprises the configuration information described above.
S540, data flow configuration processing.
It should be appreciated that data flow configuration may refer to initializing an operating environment for an electronic device.
In the embodiment of the present application, implementation manners of the data stream configuration processing in S540 may include S541 to S544.
S541, identifying the shooting scene of the camera application program based on the configuration information and/or the package name in the data stream.
It should be appreciated that, in the embodiments of the present application, the shooting scene of the camera application can be identified based on the configuration information and/or the package name in the data stream; that is, the camera mode of the camera application that needs to be started can be identified. Based on that camera mode, the electronic device can load the dynamic link library in a targeted manner and skip the dynamic link libraries that are not needed for starting the camera mode, thereby removing the loading of redundant link libraries and loading only the dynamic link library for the camera mode of the camera application being started. This avoids the long time consumed by loading all dynamic link library files, shortens the time needed to start the camera mode of the camera application, enables the camera mode to be started quickly, and improves the user experience.
The package name may refer to the package name of the application package (APK) that opens the camera; the package name of the APK may be used to distinguish between the camera application (e.g., the camera application built into the electronic device, which may be referred to as the native camera) and third party applications (e.g., a third party camera application).
Optionally, based on the package name of the APK, a shooting scene of the camera application is identified.
For example, if the package name of the APK indicates a third party application program, the shooting scene is that the third party application program calls the camera application program; the electronic device can load the dynamic link library corresponding to the third party application program; the third party application program can be a payment application program, an instant messaging application program, a video conference application program, a third party camera application program, or the like.
For example, if the package name of the APK is the payment application, the dynamic link library corresponding to the payment application is loaded.
For example, if the package name of the APK is an instant messaging application, loading a dynamic link library corresponding to the instant messaging application.
For example, if the package name of the APK is a video conference application, a dynamic link library corresponding to the video conference application is loaded.
For example, if the package name of the APK is the third-party camera application program, the dynamic link library corresponding to the third-party camera application program is loaded.
Illustratively, if the package name of the APK indicates the camera application (for example, the package name may be camera), the shooting scene is identified as the camera application being opened directly; the electronic device may load the dynamic link library corresponding to the camera application.
Optionally, the shooting scene of the camera application is identified based on the configuration information in the data stream and the package name of the APK.
Illustratively, if the package name of the APK indicates the camera application (for example, the package name may be camera), and the configuration information includes an image format and an image resolution size, where the image format may be YUV420 or JPEG, etc., and the image resolution size may be 4096×3072, then the photographing mode of the camera application is loaded.
For example, if the package name of the APK indicates the camera application (for example, the package name may be camera), and the configuration information includes an image format, an image resolution size and an identifier of the shooting mode, where the image format may be YUV420 or JPEG, etc., the image resolution size may be 4096×3072, and the identifier 1 indicates the video recording mode, then the video recording mode of the camera application is loaded.
For example, if the package name of the APK indicates the camera application (for example, the package name may be camera), and the configuration information includes an image format and an image resolution size, where the image format may be YUV420 and the image resolution size may be 1440×1080, then the preview mode of the camera application is loaded.
Alternatively, the shooting scene of the camera application may be identified based on the configuration information in the data stream.
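The identification logic of S541 can be sketched as follows, assuming the DataStream type sketched above; the package name "com.example.camera" and the 1440×1080 preview threshold are hypothetical values used only to illustrate the decision order (third party call, recording identifier, preview-sized resolution, otherwise photographing mode).

```cpp
// Minimal sketch of S541, assuming the DataStream/StreamConfiguration types sketched above.
enum class ShootingScene { kThirdPartyCall, kPreview, kPhoto, kVideo };

ShootingScene IdentifyScene(const DataStream& stream,
                            const std::string& nativeCameraPackage) {
    if (stream.packageName != nativeCameraPackage) {
        // Package name indicates a third party application invoking the camera.
        return ShootingScene::kThirdPartyCall;
    }
    if (stream.config.hasOperationMode && stream.config.operationMode == 1) {
        // The identifier 1 in the data stream indicates the video recording mode.
        return ShootingScene::kVideo;
    }
    if (stream.config.width <= 1440 && stream.config.height <= 1080) {
        // Preview-sized resolution (e.g. 1440x1080) -> preview mode.
        return ShootingScene::kPreview;
    }
    // Full-size resolution (e.g. 4096x3072) without a recording identifier -> photographing mode.
    return ShootingScene::kPhoto;
}
```

With this sketch, IdentifyScene(stream, "com.example.camera") returns kVideo whenever the recording identifier is present, mirroring the case distinctions of S5421 to S5424 below.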
Optionally, S542 is then executed: based on the shooting scene, a dynamic link library corresponding to the shooting scene is loaded; for example, S542 may include S5421 to S5424.
S5421, if the shooting scene is the shooting mode of starting the camera application program, loading a dynamic link library of the shooting mode.
Illustratively, the electronic device is running in a preview mode to display a preview interface; when the electronic device detects the operation of the shutter key, the electronic device starts a photographing mode of the camera application program.
Illustratively, the electronic device is running the video recording mode; when the electronic device detects an operation on the snapshot shutter key, the electronic device starts the photographing mode of the camera application program. For example, if the shooting scene is identified as starting the photographing mode of the camera application program based on the configuration information and the package name of the APK, only the dynamic link library required by the photographing mode of the camera application program may be loaded; for example, the .so file corresponding to the photographing mode of the camera application may be loaded.
It should be understood that the electronic device may not need to load all the dynamic link libraries, but only load the dynamic link libraries corresponding to the photographing mode of the camera application program.
For example, when the shooting scene is a shooting mode in which the camera application program is directly started, the package name of the APK may be the camera; the configuration information comprises an image format and an image resolution; the image format may be: YUV420, or JPEG, etc.; the image resolution size may be 4096×3072.
S5422, if the shooting scene is a video mode of starting the camera application program, loading a dynamic link library of the video mode.
Illustratively, the electronic device is running the preview mode and displaying a preview interface; when the electronic device detects an operation on the video recording control, the electronic device runs the preview mode to display a video preview interface; when the electronic device detects an operation on the shutter key, the electronic device starts the video recording mode of the camera application program. For example, if the shooting scene is identified as starting the video recording mode of the camera application program based on the configuration information and the package name of the APK, the dynamic link library required by the video recording mode of the camera application program may be loaded; for example, the .so file corresponding to the video recording mode of the camera application may be loaded.
For example, when the shooting scene is starting the video recording mode of the camera application program, the package name of the APK may be camera; the configuration information comprises an image format, an image resolution size and an identifier of the shooting mode; the image format may be YUV420, JPEG, etc.; the image resolution size may be 4096×3072; the identifier 1 may be used to indicate the video recording mode.
S5423, if the shooting scene is a preview mode of starting the camera application program, loading a dynamic link library of the preview mode.
Illustratively, a click operation on the camera application is detected, a preview mode of the camera application is opened, and a preview interface is displayed.
For example, if the shooting scene is identified, based on the configuration information and the package name of the APK, as starting the preview mode of the camera application program, the dynamic link library required by the preview mode of the camera application program may be loaded; for example, the .so file corresponding to the preview mode of the camera application may be loaded.
For example, when the shooting scene is starting the preview mode of the camera application, the package name of the APK may be camera; the configuration information comprises an image format and an image resolution size; the image format may be YUV420; the image resolution size may be 1440×1080.
It should be noted that, for shooting scenes of the camera application program, the dynamic link library to be loaded differs depending on whether the photographing mode, the video recording mode or the preview mode of the camera application program is started, and each mode corresponds to a relatively large number of dynamic link libraries that need to be loaded. Therefore, for shooting scenes of the camera application program, the photographing mode, the video recording mode and the preview mode can be further distinguished; in this way, the link libraries to be loaded can be determined more accurately, the time consumed in loading link libraries when the camera application program is started is shortened, and the starting speed of the camera application program is improved.
S5424, if the shooting scene is that a third party application program starts the camera application program, loading the dynamic link library used by the third party application program to call the camera.
For example, if the shooting scene is identified, based on the package name of the APK, as a third party application program starting the camera application program, the link library used by the third party application program to call the camera may be loaded; for example, the .so file corresponding to the link library through which the third party application calls the camera may be loaded.
It should be noted that when a third party application program opens the camera application program, the camera functions that the third party application program can invoke are limited; therefore, the link libraries that need to be loaded when the third party application program opens the camera application program are also limited. For this reason, in the shooting scene in which a third party application program starts the camera application program, it may be unnecessary to further distinguish whether the third party application program starts the photographing mode, the video recording mode or the preview mode of the camera application program.
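The selective loading of S5421 to S5424 can be sketched with dlopen, opening only the .so file of the identified scene and reusing the ShootingScene enum from the sketch above; the library file names below are placeholders, since the patent only states that the .so file corresponding to the mode is loaded.

```cpp
// Minimal sketch of loading only the link library needed by the identified scene.
// The .so file names are hypothetical placeholders.
#include <dlfcn.h>
#include <map>
#include <stdexcept>
#include <string>

void* LoadSceneLibrary(ShootingScene scene) {
    static const std::map<ShootingScene, std::string> kSceneToLib = {
        {ShootingScene::kPreview,        "libcam_preview.so"},
        {ShootingScene::kPhoto,          "libcam_capture.so"},
        {ShootingScene::kVideo,          "libcam_video.so"},
        {ShootingScene::kThirdPartyCall, "libcam_thirdparty.so"},
    };
    const std::string& path = kSceneToLib.at(scene);
    // Only the library for this scene is opened; the other .so files stay unloaded.
    void* handle = dlopen(path.c_str(), RTLD_NOW | RTLD_LOCAL);
    if (handle == nullptr) {
        throw std::runtime_error(std::string("dlopen failed: ") + dlerror());
    }
    return handle;
}
```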
S543, selecting a characteristic topological graph.
Wherein, the characteristic topological graph can be understood as the execution logic sequence of the loaded link library; for example, fig. 7 shows 4 different characteristic topologies.
It should be noted that selecting a characteristic topology map may be understood as selecting the execution logic of the link library loaded in S542.
Alternatively, the characteristic topology map may be selected according to the data stream; for example, it may be determined, based on the configuration information included in the data stream, that the video recording mode of the camera application is to be started; the video recording mode may include a video preview state, a recording state and a video snapshot state; according to the configuration information of the data stream, the topology maps corresponding to the video preview state, the recording state and the video snapshot state can be selected.
S544, cutting the characteristic topological graph.
Optionally, the data stream may be parsed to obtain parsing information; the parsing information may be different from the configuration information; the selected characteristic topology maps are cropped according to the parsing information to determine the topology map that is finally adopted.
For example, it may be determined, based on the configuration information in the data stream, that the video recording mode of the camera application is to be started; through further parsing of the data stream, the preview state of the video recording mode of the camera application is determined. S543 may obtain the topology maps corresponding to the video preview state, the recording state and the video snapshot state; S544 then determines, from these, the topology map of the preview state of the video recording mode.
Illustratively, the characteristic topology maps may be cropped based on the data stream; the cropping process may be understood as selecting a target topology map from the at least one determined characteristic topology map; the at least one characteristic topology map corresponds to one characteristic.
It should be understood that a characteristic may refer to implementing a certain function of the camera application; the topology maps that can be selected based on one characteristic may include a plurality of topology maps; cropping the characteristic topology maps means determining, from the plurality of topology maps, the topology map that is finally adopted. For example, executing S543 for a certain characteristic yields the 4 different topology structures shown in fig. 7; executing S544 then determines, from the selected characteristic topology maps, the topology map finally adopted for that characteristic. Based on the loaded link libraries and the finally adopted topology maps, the schematic diagram shown in fig. 8 is obtained; as shown in fig. 8, any one of F-1, F-2, F-3, F-4 or F-5 represents a characteristic; a characteristic may be understood as a certain function of the camera application; realizing a characteristic depends on the link library corresponding to that characteristic. By loading the link library, the camera application program can realize the corresponding function, and the running environment of the electronic device is initialized.
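S543 and S544 can be pictured as a two-stage filter over candidate topology maps: first select the candidates for the mode indicated by the data stream, then crop them down to the one topology map for the parsed state. The Topology structure and the prefix-matching rule below are assumptions for illustration only.

```cpp
// Minimal sketch of S543 (select) and S544 (crop); types and matching rules are illustrative.
#include <string>
#include <vector>

struct Topology {
    std::string state;               // e.g. "video_preview", "video_recording", "video_snapshot"
    std::vector<std::string> nodes;  // execution order over the loaded link library
};

// S543: keep only the topology maps belonging to the mode indicated by the data stream.
std::vector<Topology> SelectTopologies(const std::vector<Topology>& all,
                                       const std::string& modePrefix) {
    std::vector<Topology> selected;
    for (const auto& t : all) {
        if (t.state.rfind(modePrefix, 0) == 0) {  // state name starts with modePrefix
            selected.push_back(t);
        }
    }
    return selected;
}

// S544: crop the selection down to the single topology map for the parsed state.
const Topology* CropTopology(const std::vector<Topology>& selected,
                             const std::string& parsedState) {
    for (const auto& t : selected) {
        if (t.state == parsedState) {
            return &t;  // the finally adopted topology map, instantiated in S550
        }
    }
    return nullptr;
}
```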
S550, instantiation processing.
Running memory is applied for, and the cropped characteristic topology map is run.
It should be appreciated that instantiation processing may refer to the process of materializing an abstract object; for example, running memory is applied for, and the cropped characteristic topology map is run.
For example, with the camera opening method provided by the embodiment of the present application, when the electronic device detects a click operation on the camera application program, the electronic device can load only the dynamic link library of the preview mode, thereby opening the camera quickly.
For example, with the shooting method provided by the embodiment of the present application, when the electronic device detects a click operation on the shutter key in the preview display interface, the electronic device can load only the dynamic link library of the photographing mode, so that the thumbnail image can be displayed quickly and the photographing response speed is improved.
For example, with the shooting method provided by the embodiment of the present application, when a click operation on the video recording control is detected in the photographing preview interface of the electronic device, the electronic device can load only the dynamic link library of the preview mode, so that the video preview interface is displayed quickly.
For example, with the shooting method provided by the embodiment of the present application, when the electronic device detects that a third party application program calls the camera, the electronic device can load only the dynamic link library of the third party application program, so that the camera can be started quickly; for example, the electronic device is running the payment application program and detects an operation on the "scan" control in the payment application program; the electronic device can load the dynamic link library corresponding to the payment application program calling the camera, so that the code scanning interface is displayed quickly.
In the embodiment of the present application, the shooting scene of the camera application program can be identified based on the configuration information and/or the package name in the data stream; that is, the camera mode of the camera application that needs to be started can be identified. Based on that camera mode, the electronic device can load the dynamic link library in a targeted manner and skip the dynamic link libraries that are not needed for starting the camera mode, thereby removing the loading of redundant link libraries and loading only the dynamic link library for the camera mode of the camera application being started. This avoids the long time consumed by loading all dynamic link library files, shortens the time needed to start the camera mode of the camera application, enables the camera mode to be started quickly, and improves the user experience.
Fig. 9 is an interactive flowchart of a photographing method according to an embodiment of the present application. As shown in fig. 9, the method includes S601 to S613; s601 to S613 are described in detail below, respectively.
It should be noted that the characteristic topological graph management module includes a characteristic pool management module; the characteristic pool management module includes a shooting scene identification module and a dynamic link library. The characteristic topological graph management module is responsible for the processes related to creating, selecting and executing the characteristic topological graphs; the characteristic pool management module handles the processes related to loading the link libraries; the shooting scene identification module is used for identifying the shooting scene of the camera application program; the dynamic link library includes the link libraries that need to be loaded dynamically; see the relevant description of fig. 5, which is not repeated here.
It should be appreciated that the data stream configuration process is required before the camera application program turns on a certain function (e.g., video, photo, or preview, etc.); the data stream configuration processing refers to initializing the running environment of the electronic equipment; the operation environment initialization comprises initialization processing of a characteristic topological graph management module; the initialization processing of the characteristic topological graph management module comprises the following steps: s601 to S613.
S601, the characteristic topology management module acquires a data stream.
Optionally, an operation on the camera application program in the application layer is detected, and the camera application program in the application layer sends a data stream to the hardware abstraction layer through the application framework layer; the characteristic topological graph management module in the hardware abstraction layer acquires the data stream; the data stream may include configuration information; the configuration information comprises information such as the image format, the image resolution, or the shooting mode identifier; see the description of S520 in fig. 6, which is not repeated here.
S602, the characteristic topological graph management module creates a characteristic pool.
It should be understood that a characteristic may be understood as a certain function of the camera application; a characteristic pool can be understood as a set of functions of the camera application; creating a characteristic pool may be understood as creating the set of functions of the camera application.
S603, the characteristic topological graph management module sends an instruction and a data stream for creating a characteristic pool to the characteristic pool management module.
Optionally, configuration information may be included in the data stream; the configuration information comprises information such as image format, image resolution, or shooting mode identification; see the description of S520 in fig. 6, and will not be repeated here.
S604, the characteristic pool management module sends a data stream to the shooting scene identification module.
Optionally, configuration information may be included in the data stream; the configuration information comprises information such as image format, image resolution, or shooting mode identification; see the description of S520 in fig. 6, and will not be repeated here.
S605, the shooting scene identification module identifies the current shooting scene based on the configuration information in the data stream and/or the package name of the APK.
Optionally, the shooting scene identification module may receive the data stream sent by the characteristic pool management module; in addition, the shooting scene identification module can obtain, through a data interface, the package name of the APK that starts the camera application program.
Optionally, the shooting scene identification module identifies the current shooting scene based on the package name of the APK.
For example, if the camera application is opened, the package name of the APK may be that of the camera application (e.g., camera); if a third party application program starts the camera application program, the package name of the APK may be that of the third party application program.
Alternatively, the shooting scene identification module may identify the current shooting scene based on the configuration information in the data stream.
Optionally, the shooting scene identification module identifies the current shooting scene based on the configuration information in the data stream and the package name of the APK.
It should be noted that, the implementation manner of identifying the current shooting scene may be referred to the description of S541 in fig. 6, which is not repeated here.
S606, the shooting scene identification module sends shooting scene information to the characteristic pool management module.
S607, the characteristic pool management module sends a loading instruction to the dynamic link library.
Alternatively, the characteristic pool management module may send different loading instructions based on the information of different shooting scenes.
S608, running a link library corresponding to the loading instruction.
Optionally, a link library corresponding to the shooting scene can be determined based on the loading instruction, and only the link library required by the shooting scene is operated in the dynamic link library; other link libraries not needed for the shooting scene may not be run.
S609, the dynamic link library sends a loading completion instruction to the characteristic pool management module.
The loading completion instruction may be used to indicate that loading of the corresponding link library in the dynamic link library is completed.
Alternatively, after loading of the link libraries required by the shooting scene is completed, a loading completion instruction may be sent to the characteristic pool management module.
It is to be understood that the initialization process of the characteristic pool management module may include S604 to S609 described above.
S610, the characteristic pool management module selects a characteristic topological graph.
Optionally, the implementation of S610 may be described with reference to S543 in fig. 6; and will not be described in detail herein.
S611, the characteristic pool management module cuts the characteristic topological graph.
Alternatively, the implementation of S611 may be described with reference to S544 in fig. 6; and will not be described in detail herein.
S612, the characteristic pool management module sends the result of the cropping processing to the characteristic topological graph management module.
Alternatively, the characteristic pool management module may send the information of the cropped topology map to the characteristic topological graph management module.
S613, instantiation processing.
Optionally, the implementation of S613 may be described with reference to S550 in fig. 6; and will not be described in detail herein.
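Putting the S601 to S613 interaction together, a rough sketch of the two modules of fig. 9 might look as follows, reusing the helpers sketched earlier (IdentifyScene, LoadSceneLibrary, SelectTopologies, CropTopology); the class names mirror the module names of fig. 9, and the member layout and the package name "com.example.camera" are assumptions, not the patent's actual design.

```cpp
// Minimal sketch of the fig. 9 interaction; an illustration built on the earlier sketches.
class CharacteristicPoolManager {
 public:
    // S604-S609: identify the shooting scene, then load only the matching link library.
    void* Initialize(const DataStream& stream) {
        ShootingScene scene = IdentifyScene(stream, "com.example.camera");  // S605 (hypothetical package name)
        return LoadSceneLibrary(scene);                                     // S607-S609
    }
};

class CharacteristicTopologyManager {
 public:
    // S601-S603: receive the data stream and create the characteristic pool, then drive
    // selection (S610), cropping (S611-S612) and hand the result to instantiation (S613).
    const Topology* Configure(const DataStream& stream,
                              const std::vector<Topology>& allTopologies,
                              const std::string& modePrefix,
                              const std::string& parsedState) {
        pool_.Initialize(stream);                                 // S603-S609
        selected_ = SelectTopologies(allTopologies, modePrefix);  // S610
        return CropTopology(selected_, parsedState);              // S611-S612
    }

 private:
    CharacteristicPoolManager pool_;  // characteristic pool management module
    std::vector<Topology> selected_;  // candidates kept alive for the returned pointer
};
```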
In the embodiment of the present application, the shooting scene of the camera application program can be identified based on the configuration information and/or the package name in the data stream; that is, the camera mode of the camera application that needs to be started can be identified. Based on that camera mode, the electronic device can load the dynamic link library in a targeted manner and skip the dynamic link libraries that are not needed for starting the camera mode, thereby removing the loading of redundant link libraries and loading only the dynamic link library for the camera mode of the camera application being started. This avoids the long time consumed by loading all dynamic link library files, shortens the time needed to start the camera mode of the camera application, enables the camera mode to be started quickly, and improves the user experience.
Fig. 10 is a schematic flowchart of a photographing method provided by an embodiment of the present application. The method 700 may be performed by the electronic device shown in fig. 1; the method 700 includes S710 to S750, and S710 to S750 are described in detail below, respectively.
S710, detecting a first operation on the camera.
Optionally, the first operation is an operation to turn on the camera.
Alternatively, the first operation is an operation of switching a shooting mode of the camera.
S720, responding to the first operation, and acquiring a first data stream and first information.
The first data stream is a data stream sent by the camera to the hardware abstraction layer, and the first information is name information of an application program for starting the camera.
Alternatively, the first data stream may be the data stream shown in fig. 6, see the related description of fig. 6, which is not repeated here.
In an embodiment of the present application, a shooting mode of a camera may be determined according to a first data stream and first information; alternatively, determining the photographing mode of the camera according to the first data stream and the first information may include any one of S731 to S734.
S731, if the first information is name information of the third party application, determining that the shooting mode of the camera is the first mode.
The first mode is a mode that a third party application program calls a camera.
Optionally, the implementation may be referred to in S5424 of fig. 6, which is not described herein.
S732, if the first information is the name information of the application program of the camera, and the first data stream includes the first identifier, determining that the shooting mode of the camera is a video mode.
The first identifier is used for indicating a video recording mode.
Optionally, the implementation may be referred to in S5422 in fig. 6, which is not described herein.
S733, if the first information is the name information of the application program of the camera, and the first data stream includes the first image resolution and the first image format, determining that the shooting mode of the camera is a preview mode.
Alternatively, the implementation may be described with reference to S5423 in fig. 6, which is not described herein.
S734, if the first information is the name information of the application program of the camera, and the first data stream includes the second image resolution and the second image format, determining that the shooting mode of the camera is a shooting mode, wherein the second image format includes the first image format, and the second image resolution is greater than the first image resolution.
Alternatively, the implementation may be described with reference to S5421 in fig. 6, which is not described herein.
S740, based on the shooting mode of the camera, running a first link library file corresponding to the shooting mode of the camera.
The first link library file is used for initializing the running environment of the electronic equipment when the electronic equipment runs the shooting mode of the camera.
It should be understood that running, based on the shooting mode of the camera, the first link library file corresponding to the shooting mode of the camera can be understood as running only the first link library file related to the shooting mode of the camera. Assume that the dynamic link library comprises link library file 1, link library file 2 and link library file 3; link library file 1 corresponds to the preview mode of the camera; link library file 2 corresponds to the photographing mode of the camera; link library file 3 corresponds to the video recording mode of the camera. If it is determined, according to the first data stream and the first information, that the shooting mode of the camera is the preview mode, the electronic device only runs link library file 1 in the dynamic link library and does not run link library file 2 or link library file 3. In other words, the link library files in the dynamic link library are run selectively based on the shooting mode of the camera.
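The selective running of S740 can be sketched as follows, reusing the LoadSceneLibrary helper above; resolving an "InitRuntime" entry point with dlsym is purely an assumption for illustration, since the patent only states that the first link library file is run to initialize the running environment.

```cpp
// Minimal sketch of S740: only the link library file of the determined shooting mode is run.
// The "InitRuntime" symbol name is a hypothetical placeholder.
#include <dlfcn.h>

void RunFirstLinkLibrary(ShootingScene mode) {
    void* handle = LoadSceneLibrary(mode);  // only this mode's .so is opened
    using InitFn = void (*)();
    auto init = reinterpret_cast<InitFn>(dlsym(handle, "InitRuntime"));
    if (init != nullptr) {
        init();  // initializes the running environment for this shooting mode
    }
    // Link library files of the other shooting modes are never opened or run.
}
```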
S750, displaying the first interface.
The first interface is an interface of a shooting mode of the camera.
Optionally, if the first information is name information of an application program of the camera, and the first data stream includes the first identifier, the first image format and the second image resolution, the shooting mode of the camera is a video mode.
Optionally, the hardware abstraction layer includes a camera hardware abstraction layer, the camera hardware abstraction layer includes a property pool management module, the property pool management module includes a shooting scene recognition module and a dynamic link library, the shooting scene recognition module is used for determining a shooting mode of the camera based on the first data stream and the first information, the dynamic link library includes a plurality of link library files, and the plurality of link library files includes a first link library file.
Alternatively, the implementation may be referred to in fig. 5 or fig. 9, and will not be described herein.
Optionally, the camera hardware abstraction layer further includes a characteristic topological graph management module, where the characteristic topological graph management module is configured to receive a first data stream sent by the camera; the characteristic topological graph management module comprises a characteristic pool management module.
Optionally, the electronic device does not run the link library files in the dynamic link library other than the first link library file before the first interface is displayed.
In the embodiment of the present application, the electronic device can identify the shooting scene of the camera application program, that is, the shooting mode of the camera that needs to be started, according to the first data stream and the first information; for example, the shooting mode may include the first mode in which a third party application program calls the camera, the photographing mode, the preview mode and the video recording mode. Based on the shooting mode, the electronic device can run, in a targeted manner, the first dynamic link library file corresponding to that shooting mode; in other words, when running a certain shooting mode of the camera, the electronic device only runs the dynamic link library file corresponding to that shooting mode and does not run the dynamic link library files corresponding to the other shooting modes. Running of redundant link library files is thereby removed, and the dynamic link library file is run only for the shooting mode that needs to be started. This avoids the long time consumed by running all dynamic link library files, shortens the time needed to start the camera application program or to switch the shooting mode of the camera, enables the camera to respond quickly to the user's operation, and improves the user experience.
It should be understood that the above description is intended to aid those skilled in the art in understanding the embodiments of the present application, and is not intended to limit the embodiments of the present application to the specific values or particular scenarios illustrated. It will be apparent to those skilled in the art from the foregoing description that various equivalent modifications or variations can be made, and such modifications or variations are intended to be within the scope of the embodiments of the present application.
An exemplary interface diagram in an electronic device is described below in connection with fig. 11.
Illustratively, the graphical user interface (graphical user interface, GUI) shown in (a) of fig. 11 is the desktop 810 of the electronic device; the electronic device detects a click operation on the control 820 of the settings application on the desktop 810, as shown in (b) of fig. 11; after the electronic device detects the click operation on the control 820 of the settings application on the desktop 810, another GUI as shown in (c) of fig. 11 may be displayed; the GUI shown in (c) of fig. 11 may be the display interface of the settings application, and the display interface may include controls such as wireless network, Bluetooth, battery, or camera, for example, the control 830 of the camera; the electronic device detects a click operation on the control 830 of the camera, as shown in (d) of fig. 11; after the electronic device detects the click operation on the control 830 of the camera, the setting display interface of the camera is displayed; the setting display interface of the camera may include a quick-start control 840, as shown in (e) of fig. 11; the electronic device detects a click operation on the quick-start control 840, as shown in (f) of fig. 11; after the electronic device detects the click operation on the quick-start control 840, the electronic device executes the shooting method provided by the embodiment of the present application when it subsequently detects an operation of opening the camera application program.
It should be noted that the foregoing is illustrative of a display interface in an electronic device, and the present application is not limited thereto.
It should be understood that the above description is intended to aid those skilled in the art in understanding the embodiments of the present application, and is not intended to limit the embodiments of the present application to the specific values or particular scenarios illustrated. It will be apparent to those skilled in the art from the foregoing description that various equivalent modifications or variations can be made, and such modifications or variations are intended to be within the scope of the embodiments of the present application.
The shooting method provided by the embodiment of the application is described in detail above with reference to fig. 1 to 11; an embodiment of the device of the present application will be described in detail with reference to fig. 12 and 13. It should be understood that the apparatus in the embodiments of the present application may perform the methods of the foregoing embodiments of the present application, that is, specific working procedures of the following various products may refer to corresponding procedures in the foregoing method embodiments.
Fig. 12 is a schematic structural diagram of an electronic device according to an embodiment of the present application. The electronic device 1000 includes a processing module 1010 and a display module 1020.
Wherein, the processing module 1010 is configured to: detecting a first operation on the camera; responding to the first operation, acquiring a first data stream and first information, wherein the first data stream is a data stream sent by the camera to a hardware abstraction layer, and the first information is name information of an application program for starting the camera; if the first information is name information of a third party application program, determining that a shooting mode of the camera is a first mode, wherein the first mode is a mode of calling the camera by the third party application program; if the first information is name information of an application program of the camera and the first data stream comprises a first identifier, determining that a shooting mode of the camera is a video mode, wherein the first identifier is used for indicating the video mode; if the first information is name information of an application program of the camera and the first data stream comprises a first image resolution and a first image format, determining that a shooting mode of the camera is a preview mode; if the first information is name information of an application program of the camera and the first data stream comprises a second image resolution and a second image format, determining that a shooting mode of the camera is a shooting mode, wherein the second image format comprises the first image format and the second image resolution is larger than the first image resolution; based on the shooting mode of the camera, running a first link library file corresponding to the shooting mode of the camera, wherein the first link library file is used for initializing the running environment of the electronic equipment when the electronic equipment runs the shooting mode of the camera; the display module 1020 is configured to: and displaying a first interface, wherein the first interface is an interface of a shooting mode of the camera.
Optionally, as an embodiment, if the first information is name information of an application program of the camera, and the first data stream includes the first identifier, the first image format and the second image resolution, the shooting mode of the camera is the video recording mode.
Optionally, as an embodiment, the hardware abstraction layer includes a camera hardware abstraction layer, and the camera hardware abstraction layer includes a property pool management module, where the property pool management module includes a shooting scene identification module and a dynamic link library, where the shooting scene identification module is configured to determine a shooting mode of the camera based on the first data stream and the first information, and the dynamic link library includes a plurality of link library files, and the plurality of link library files includes the first link library file.
Optionally, as an embodiment, the camera hardware abstraction layer further includes a characteristic topology map management module, where the characteristic topology map management module is configured to receive the first data stream sent by the camera; the characteristic topological graph management module comprises the characteristic pool management module.
Optionally, as an embodiment, before displaying the first interface, the electronic device does not run a link library file in the dynamic link library except the first link library file.
Optionally, as an embodiment, the first operation is an operation to turn on the camera.
Optionally, as an embodiment, the first operation is an operation of switching a shooting mode of the camera.
The electronic device 1000 described above is embodied in the form of functional modules. The term "module" herein may be implemented in software and/or hardware, and is not specifically limited thereto.
For example, a "module" may be a software program, a hardware circuit, or a combination of both that implements the functionality described above. The hardware circuitry may include application specific integrated circuits (application specific integrated circuit, ASICs), electronic circuits, processors (e.g., shared, proprietary, or group processors, etc.) and memory for executing one or more software or firmware programs, merged logic circuits, and/or other suitable components that support the described functions.
Thus, the elements of the examples described in the embodiments of the present application can be implemented in electronic hardware, or in a combination of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
Fig. 13 shows a schematic structural diagram of another electronic device provided by the present application. The dashed line in fig. 13 indicates that the unit or the module is optional; the electronic device 1100 may be used to implement the method of photographing described in the above method embodiments.
The electronic device 1100 includes one or more processors 1101, and the one or more processors 1101 may support the electronic device 1100 in implementing the photographing method in the method embodiments. The processor 1101 may be a general purpose processor or a special purpose processor. For example, the processor 1101 may be a central processing unit (central processing unit, CPU), a digital signal processor (digital signal processor, DSP), an application specific integrated circuit (application specific integrated circuit, ASIC), a field programmable gate array (field programmable gate array, FPGA), or another programmable logic device such as discrete gates, transistor logic, or discrete hardware components.
Optionally, the processor 1101 may be configured to control the electronic device 1100, execute a software program, and process data of the software program. The electronic device 1100 may also include a communication unit 1105 to enable input (reception) and output (transmission) of signals.
For example, the electronic device 1100 may be a chip, the communication unit 1105 may be an input and/or output circuit of the chip, or the communication unit 1105 may be a communication interface of the chip, which may be an integral part of a terminal device or other electronic device.
For another example, the electronic device 1100 may be a terminal device, and the communication unit 1105 may be a transceiver of the terminal device. The electronic device 1100 may include one or more memories 1102, on which a program 1104 is stored; the program 1104 may be executed by the processor 1101 to generate instructions 1103, so that the processor 1101 performs the shooting method described in the above method embodiments according to the instructions 1103.
Optionally, the memory 1102 may also have data stored therein.
Optionally, the processor 1101 may also read data stored in the memory 1102, which may be stored at the same memory address as the program 1104, or which may be stored at a different memory address than the program 1104.
Alternatively, the processor 1101 and the memory 1102 may be provided separately or may be integrated together, for example, on a System On Chip (SOC) of the terminal device.
Illustratively, the memory 1102 may be used to store a related program 1104 of the photographing method provided in the embodiment of the present application, and the processor 1101 may be used to call the related program 1104 of the photographing method stored in the memory 1102 when performing photographing, to perform the photographing method of the embodiment of the present application; for example, a first operation on the camera is detected; responding to the first operation, acquiring a first data stream and first information, wherein the first data stream is a data stream sent by the camera to a hardware abstraction layer, and the first information is name information of an application program for starting the camera; if the first information is name information of a third party application program, determining that a shooting mode of the camera is a first mode, wherein the first mode is a mode of calling the camera by the third party application program; if the first information is name information of an application program of the camera and the first data stream comprises a first identifier, determining that a shooting mode of the camera is a video mode, wherein the first identifier is used for indicating the video mode; if the first information is name information of an application program of the camera and the first data stream comprises a first image resolution and a first image format, determining that a shooting mode of the camera is a preview mode; if the first information is name information of an application program of the camera and the first data stream comprises a second image resolution and a second image format, determining that a shooting mode of the camera is a shooting mode, wherein the second image format comprises the first image format and the second image resolution is larger than the first image resolution; based on the shooting mode of the camera, running a first link library file corresponding to the shooting mode of the camera, wherein the first link library file is used for initializing the running environment of the electronic equipment when the electronic equipment runs the shooting mode of the camera; and displaying a first interface, wherein the first interface is an interface of a shooting mode of the camera.
Optionally, the present application also provides a computer program product which, when executed by the processor 1101, implements the method of photographing in any of the method embodiments of the present application.
For example, the computer program product may be stored in the memory 1102, such as the program 1104, and the program 1104 is ultimately converted into an executable object file that can be executed by the processor 1101 through preprocessing, compiling, assembling, and linking processes.
Optionally, the present application further provides a computer readable storage medium, on which a computer program is stored, which when executed by a computer, implements the method of shooting according to any one of the method embodiments of the present application. The computer program may be a high-level language program or an executable object program.
For example, the computer-readable storage medium is, for example, the memory 1102. The memory 1102 may be volatile memory or nonvolatile memory, or the memory 1102 may include both volatile and nonvolatile memory. The nonvolatile memory may be a read-only memory (ROM), a programmable ROM (PROM), an erasable PROM (EPROM), an electrically erasable PROM (EEPROM), or a flash memory. The volatile memory may be random access memory (random access memory, RAM), which acts as an external cache. By way of example, and not limitation, many forms of RAM are available, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchlink DRAM (SLDRAM), and direct rambus RAM (DR RAM).
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
It will be clear to those skilled in the art that, for convenience and brevity of description, specific working procedures of the above-described systems, apparatuses and units may refer to corresponding procedures in the foregoing method embodiments, and are not repeated herein.
In the several embodiments provided by the present application, it should be understood that the disclosed systems, devices, and methods may be implemented in other manners. For example, the above-described embodiments of the electronic device are merely illustrative, e.g., the division of the modules is merely a logical functional division, and there may be additional divisions when actually implemented, e.g., multiple units or components may be combined or integrated into another system, or some features may be omitted, or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or units, which may be in electrical, mechanical or other form.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit.
It should be understood that, in various embodiments of the present application, the size of the sequence number of each process does not mean that the execution sequence of each process should be determined by its functions and internal logic, and should not constitute any limitation on the implementation process of the embodiments of the present application.
In addition, the term "and/or" herein is merely an association relation describing an association object, and means that three kinds of relations may exist, for example, a and/or B may mean: a exists alone, A and B exist together, and B exists alone. In addition, the character "/" herein generally indicates that the front and rear associated objects are an "or" relationship.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present application may be embodied essentially or in a part contributing to the prior art or in a part of the technical solution, in the form of a software product stored in a storage medium, comprising several instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) to perform all or part of the steps of the method according to the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a read-only memory (ROM), a random access memory (random access memory, RAM), a magnetic disk, or an optical disk, or other various media capable of storing program codes.
The foregoing is merely illustrative of the present application, and the present application is not limited thereto, and any person skilled in the art will readily recognize that variations or substitutions are within the scope of the present application. Therefore, the protection scope of the present application should be defined by the claims, and the above description is only a preferred embodiment of the technical solution of the present application, and is not intended to limit the protection scope of the present application. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present application should be included in the protection scope of the present application.

Claims (10)

1. A method of photographing, applied to an electronic device, comprising:
detecting a first operation on the camera;
responding to the first operation, acquiring a first data stream and first information, wherein the first data stream is a data stream sent by the camera to a hardware abstraction layer, and the first information is name information of an application program for starting the camera;
if the first information is name information of a third party application program, determining that a shooting mode of the camera is a first mode, wherein the first mode is a mode of calling the camera by the third party application program;
if the first information is name information of an application program of the camera and the first data stream comprises a first identifier, determining that a shooting mode of the camera is a video mode, wherein the first identifier is used for indicating the video mode;
if the first information is name information of an application program of the camera and the first data stream comprises a first image resolution and a first image format, determining that a shooting mode of the camera is a preview mode;
if the first information is name information of an application program of the camera and the first data stream comprises a second image resolution and a second image format, determining that a shooting mode of the camera is a shooting mode, wherein the second image format comprises the first image format and the second image resolution is larger than the first image resolution;
based on the shooting mode of the camera, running a first link library file corresponding to the shooting mode of the camera, wherein the first link library file is used for initializing the running environment of the electronic equipment when the electronic equipment runs the shooting mode of the camera;
and displaying a first interface, wherein the first interface is an interface of a shooting mode of the camera.
2. The method of claim 1, wherein if the first information is name information of an application program of the camera and the first data stream includes the first identifier, the first image format and the second image resolution, the shooting mode of the camera is the video recording mode.
3. The method of claim 1 or 2, wherein the hardware abstraction layer comprises a camera hardware abstraction layer, the camera hardware abstraction layer comprising a property pool management module, the property pool management module comprising a shooting scene identification module and a dynamic link library, the shooting scene identification module to determine a shooting mode of the camera based on the first data stream and the first information, the dynamic link library comprising a plurality of link library files, the plurality of link library files comprising the first link library file.
4. The method of claim 3, wherein the camera hardware abstraction layer further comprises a characteristic topology map management module, the characteristic topology map management module to receive the first data stream sent by the camera; the characteristic topological graph management module comprises the characteristic pool management module.
5. The method of claim 3 or 4, wherein the electronic device does not run a link library file of the dynamic link library other than the first link library file prior to displaying the first interface.
6. The method of any one of claims 1 to 5, wherein the first operation is an operation to turn on the camera.
7. The method according to any one of claims 1 to 5, wherein the first operation is an operation of switching a shooting mode of the camera.
8. An electronic device, comprising:
one or more processors and memory;
the memory is coupled with the one or more processors, the memory for storing computer program code comprising computer instructions that the one or more processors invoke to cause the electronic device to perform the method of any of claims 1-7.
9. A chip system, applied to an electronic device, wherein the chip system comprises one or more processors, and the one or more processors are configured to invoke computer instructions to cause the electronic device to perform the method of any one of claims 1 to 7.
10. A computer-readable storage medium, characterized in that the computer-readable storage medium stores a computer program which, when executed by a processor, causes the processor to perform the method of any one of claims 1 to 7.
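
The claims above describe a small decision pipeline: the shooting scene identification module maps the first information (the name of the calling application) and the contents of the first data stream (a video identifier, or an image resolution and image format) to one of four shooting modes, and the electronic device then runs only the link library file in the dynamic link library that corresponds to that mode, leaving the other link library files untouched until the first interface has been displayed. The C++ sketch below illustrates that flow under stated assumptions; the function names, library file names, and the resolution/format thresholds are invented for illustration and are not taken from the patent, and dlopen()/dlsym() are used only as a generic way of running a shared library on a Linux-based system such as Android.

```cpp
// Illustrative sketch only; module, library, and threshold names are assumptions.
#include <dlfcn.h>

#include <cstdio>
#include <string>

enum class ShootingMode { ThirdParty, VideoRecording, Preview, Photo };

// Simplified stand-in for the "first data stream" configuration.
struct DataStream {
    bool hasVideoIdentifier;  // the "first identifier" indicating video mode
    int width;
    int height;
    std::string imageFormat;  // e.g. "YUV_420_888" for preview, "JPEG" for photo
};

// Hypothetical rule: a photo stream requests a larger resolution and an extra
// still-image format compared with the preview stream.
static bool looksLikePhotoStream(const DataStream& s) {
    return s.imageFormat == "JPEG" && s.width * s.height > 1920 * 1080;
}

// Map the caller's package name ("first information") and the data stream to a mode.
ShootingMode identifyShootingMode(const std::string& callerPackage,
                                  const std::string& cameraAppPackage,
                                  const DataStream& stream) {
    if (callerPackage != cameraAppPackage) return ShootingMode::ThirdParty;
    if (stream.hasVideoIdentifier) return ShootingMode::VideoRecording;
    if (looksLikePhotoStream(stream)) return ShootingMode::Photo;
    return ShootingMode::Preview;
}

// Run only the link library file that matches the identified mode; the other
// .so files in the dynamic link library are not loaded at this point.
void* runModeLinkLibrary(ShootingMode mode) {
    const char* libName = nullptr;
    switch (mode) {
        case ShootingMode::ThirdParty:     libName = "libmode_thirdparty.so"; break;
        case ShootingMode::VideoRecording: libName = "libmode_video.so";      break;
        case ShootingMode::Preview:        libName = "libmode_preview.so";    break;
        case ShootingMode::Photo:          libName = "libmode_photo.so";      break;
    }
    void* handle = dlopen(libName, RTLD_NOW);
    if (handle == nullptr) {
        std::fprintf(stderr, "dlopen(%s) failed: %s\n", libName, dlerror());
        return nullptr;
    }
    // The caller would dlsym() the library's init entry point here to set up
    // the running environment for the selected shooting mode.
    return handle;
}
```

Used this way, the selective dlopen() call is what realises the behaviour of claim 5: before the first interface is displayed, no link library file other than the one matching the identified shooting mode has been run.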
CN202310209672.2A 2023-02-24 2023-02-24 Shooting method and electronic equipment Active CN117135448B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310209672.2A CN117135448B (en) 2023-02-24 2023-02-24 Shooting method and electronic equipment

Publications (2)

Publication Number Publication Date
CN117135448A (en) 2023-11-28
CN117135448B (en) 2024-07-12

Family

ID=88857064

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310209672.2A Active CN117135448B (en) 2023-02-24 2023-02-24 Shooting method and electronic equipment

Country Status (1)

Country Link
CN (1) CN117135448B (en)

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2009083732A1 (en) * 2007-12-31 2009-07-09 Symbian Software Limited Preloading dynamic link libraries
CN105338249A (en) * 2015-11-24 2016-02-17 努比亚技术有限公司 Independent camera system-based shooting method and mobile terminal
WO2017088603A1 (en) * 2015-11-24 2017-06-01 努比亚技术有限公司 Photographing method of mobile terminal and mobile terminal
WO2021151350A1 (en) * 2020-01-31 2021-08-05 华为技术有限公司 Method and apparatus for loading dynamic link library
WO2022105759A1 (en) * 2020-11-20 2022-05-27 华为技术有限公司 Video processing method and apparatus, and storage medium
WO2022105758A1 (en) * 2020-11-20 2022-05-27 华为技术有限公司 Path identification method and apparatus
CN114237720A (en) * 2021-11-12 2022-03-25 深圳市普渡科技有限公司 Loading device, apparatus and method for depth camera and storage medium

Also Published As

Publication number Publication date
CN117135448B (en) 2024-07-12

Similar Documents

Publication Publication Date Title
CN114205522B (en) Method for long-focus shooting and electronic equipment
CN113806105B (en) Message processing method, device, electronic equipment and readable storage medium
WO2023056795A1 (en) Quick photographing method, electronic device, and computer readable storage medium
CN115442517B (en) Image processing method, electronic device, and computer-readable storage medium
CN116152122B (en) Image processing method and electronic device
CN111159604A (en) Picture resource loading method and device
CN113660408A (en) Anti-shake method and device for video shooting
CN115115679A (en) Image registration method and related equipment
CN116668836B (en) Photographing processing method and electronic equipment
CN115238255A (en) Unlocking method and electronic equipment
CN117077703A (en) Image processing method and electronic equipment
CN115460343B (en) Image processing method, device and storage medium
CN117135448B (en) Shooting method and electronic equipment
CN116347217A (en) Image processing method, device and storage medium
CN116723383A (en) Shooting method and related equipment
CN115686182A (en) Processing method of augmented reality video and electronic equipment
CN117135447B (en) Photographing processing method and electronic equipment
CN117082339B (en) Shooting mode switching method and device, electronic equipment and readable storage medium
CN117135268B (en) Shooting method, electronic device, chip and storage medium
CN116723382B (en) Shooting method and related equipment
CN114245011B (en) Image processing method, user interface and electronic equipment
WO2023035868A1 (en) Photographing method and electronic device
CN113179362B (en) Electronic device and image display method thereof
CN117499526B (en) Shooting method, electronic device, chip system and computer readable storage medium
CN116522400B (en) Image processing method and terminal equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant