CN116528063B - Shooting method, readable storage medium and electronic device - Google Patents


Info

Publication number: CN116528063B
Authority: CN (China)
Prior art keywords: application, resolution, camera, camera service, list
Legal status: Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Application number: CN202310809507.0A
Other languages: Chinese (zh)
Other versions: CN116528063A
Inventors: 尚义卓, 张祎, 刘航, 邓嘉凯, 王硕强
Current assignee: Honor Device Co Ltd (the listed assignees may be inaccurate; Google has not performed a legal analysis)
Original assignee: Honor Device Co Ltd

Events:
Application filed by Honor Device Co Ltd
Priority to CN202310809507.0A
Publication of CN116528063A
Application granted
Publication of CN116528063B
Legal status: Active

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/95: Computational photography systems, e.g. light-field imaging systems
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04M: TELEPHONIC COMMUNICATION
    • H04M 1/00: Substation equipment, e.g. for use by subscribers
    • H04M 1/72: Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M 1/724: User interfaces specially adapted for cordless or mobile telephones
    • H04M 1/72403: User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
    • H04M 1/7243: User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality with interactive means for internal management of messages
    • H04M 1/72439: User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality with interactive means for internal management of messages for image or video messaging

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computing Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • General Business, Economics & Management (AREA)
  • Human Computer Interaction (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Telephonic Communication Services (AREA)

Abstract

This application relates to the field of terminal technology and discloses a shooting method, a readable storage medium, and an electronic device. The shooting method comprises the following steps: when a first application installed on the electronic device calls a camera service, a first acquisition parameter for calling the camera service is sent, the first acquisition parameter comprising a first resolution; when a camera service module on the electronic device determines that a second acquisition parameter exists among the stored preset acquisition parameters corresponding to the first application, the camera service module acquires image data using the second acquisition parameter and sends the image data to the first application, the second acquisition parameter comprising a second resolution having the same aspect ratio as the first resolution. Because the camera service module acquires image data according to the second acquisition parameter and sends it to the first application, the image obtained by the first application is clearer, abnormalities such as stuttering do not occur, and the corresponding function of the first application is realized.

Description

Shooting method, readable storage medium and electronic device
Technical Field
The present application relates to the field of terminal technologies, and in particular, to a shooting method, a readable storage medium, and an electronic device.
Background
When capturing an image with a camera, an application in an electronic device typically first obtains multiple sets of acquisition parameters (e.g., resolution, frame rate, aspect ratio) supported by the camera of the electronic device, and then selects one of these sets to capture the image. However, in some scenarios, the acquisition parameters selected by the application may not match the software or the camera of the electronic device, which causes problems such as stuttering of the electronic device, crashing of the application, abnormal stretching of the captured image, or low resolution of the captured image, and degrades the user experience.
Disclosure of Invention
Embodiments of the present application provide a shooting method, a readable storage medium, and an electronic device.
In a first aspect, an embodiment of the present application provides a shooting method applied to an electronic device. The method includes: detecting that a first application installed on the electronic device calls a camera service, and acquiring a first acquisition parameter sent by the first application when calling the camera service, the first acquisition parameter comprising a first resolution; determining that a second acquisition parameter exists among preset acquisition parameters of the first application, the second acquisition parameter comprising a second resolution having the same aspect ratio as the first resolution; and providing the camera service to the first application using the second acquisition parameter.
It will be appreciated that in some embodiments of the application, the first acquisition parameter may also be referred to hereinafter as a target acquisition parameter, and the preset acquisition parameter may also be referred to hereinafter as a customized acquisition parameter. When the first application invokes the camera service, it requests that the camera service acquire image data at the first resolution, but the first resolution is not necessarily suitable for the first application, which may cause abnormal stretching or lack of clarity in the images the first application obtains. The second resolution may be, for example, a resolution tuned by the developer to suit the first application, so that image data obtained at the second resolution does not suffer from abnormal stretching or lack of clarity. Therefore, by replacing the first resolution issued by the first application with the second resolution when acquiring image data, the first application can better realize its corresponding functions.
In a possible implementation of the first aspect, the first resolution is different from the second resolution.
It will be appreciated that in some embodiments of the application, the first resolution is selected by the first application, and the first application does not have a complete set of rules for selecting a suitable resolution, so the first resolution will differ from the second resolution. In other embodiments, the first resolution selected by the first application may be the same as the second resolution. Hereinafter, the first resolution may also be referred to as a target resolution, and the second resolution may also be referred to as a custom resolution.
In a possible implementation of the first aspect, acquiring the first acquisition parameter sent by the first application when calling the camera service includes: sending a list of acquisition parameters supported by the camera to the first application, and receiving the first acquisition parameter selected by the first application from that list.
It will be appreciated that in some embodiments of the application, the first application may select a first acquisition parameter from a list of acquisition parameters supported by the camera to invoke the camera service to acquire image data via the first acquisition parameter.
In a possible implementation of the first aspect, the electronic device stores a first list, where the first list includes an identifier of an application and the preset acquisition parameters corresponding to that identifier.

It will be appreciated that in some embodiments of the application, the first list may also be referred to as a whitelist, and the identifier of an application may be, for example, the application package name of the corresponding application. The first list may include, for example, the package name of an application and the preset acquisition parameters of the corresponding application. The first list may be stored, for example, in the camera service of the electronic device.

In a possible implementation of the first aspect, determining that the second acquisition parameter exists among the preset acquisition parameters of the first application includes: acquiring a first identifier, sent by the first application when calling the camera service, that identifies the first application; determining that a second identifier identical to the first identifier exists in the first list; and determining the second acquisition parameter from the preset acquisition parameters corresponding to the second identifier.

It will be appreciated that in some embodiments of the present application, the first identifier may be, for example, the application package name of the first application, and the second identifier may likewise be a package name stored in the first list. If the package name of the first application is in the first list, the first list stores preset acquisition parameters for the first application, and the camera service can determine the second acquisition parameter from those preset acquisition parameters.

In a possible implementation of the first aspect, the first identifier is the application package name of the first application, the second identifier is an application package name stored in the first list, and the preset acquisition parameters match the application corresponding to the application package name stored in the first list.
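As an illustration of the lookup described above, the following runnable sketch matches the first identifier against the first list and selects a second acquisition parameter whose aspect ratio equals that of the requested resolution. All names, package names, and list entries are assumptions for illustration, not the patent's actual implementation:

```python
from math import gcd

def aspect_ratio(size):
    """Reduce a (width, height) pair to its lowest-terms aspect ratio."""
    w, h = size
    g = gcd(w, h)
    return (w // g, h // g)

# Hypothetical first list (whitelist): application identifier (package name)
# mapped to preset acquisition parameters; only resolutions are shown here,
# though a real entry could also carry frame rate, etc.
FIRST_LIST = {
    "com.example.chat": [(1024, 512)],          # 2:1, 512-aligned
    "com.example.conference": [(1920, 1080)],   # 16:9
}

def find_second_parameter(first_identifier, first_resolution):
    """Return a preset resolution whose aspect ratio matches the requested
    one, or None when the application is not in the first list."""
    presets = FIRST_LIST.get(first_identifier)  # second-identifier match
    if presets is None:
        return None
    target = aspect_ratio(first_resolution)
    for preset in presets:
        if aspect_ratio(preset) == target:
            return preset
    return None
```

For example, a request of (640, 320) from "com.example.chat" would resolve to (1024, 512), while an application absent from the list gets None and keeps its own choice.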
In a possible implementation of the first aspect, the first application includes at least one of the following applications: conference applications, instant messaging applications, live applications, map applications, and navigation applications.
It is to be appreciated that in some embodiments of the present application, the first application is, for example, an application that can invoke a camera service to obtain an image, such as a conference application, an instant messaging application, a live application, a map application, a navigation application, and the like.
In a second aspect, an embodiment of the present application provides an electronic device, including a camera service module and a first application;
The camera service module detects that the first application calls the camera service and obtains a first acquisition parameter sent by the first application when calling the camera service, the first acquisition parameter comprising a first resolution; the camera service module determines that a second acquisition parameter exists among the preset acquisition parameters of the first application, the second acquisition parameter comprising a second resolution having the same aspect ratio as the first resolution; and the camera service module provides the camera service to the first application using the second acquisition parameter.
In a third aspect, an embodiment of the present application provides an electronic device, including: a memory storing instructions; and at least one processor configured to execute the instructions to cause the electronic device to implement the shooting method of the above first aspect or any possible implementation of the first aspect.

In a fourth aspect, an embodiment of the present application provides a readable storage medium having stored thereon instructions that, when executed on a computer, cause the computer to perform the shooting method of the above first aspect or any possible implementation of the first aspect.

In a fifth aspect, the present application provides a program product which, when run on an electronic device, causes the electronic device to carry out the shooting method of the above first aspect or any possible implementation of the first aspect.
It should be appreciated that the advantages of the second aspect to the fifth aspect may refer to the description of the first aspect, and are not described herein.
Drawings
Fig. 1 shows a schematic view of a scenario in which a user scans a two-dimensional code through a chat application on a mobile phone;
FIG. 2 illustrates a scenario in which a user is engaged in a video call through a conferencing application on a mobile phone;
FIG. 3 illustrates an implementation flow chart of an application invoking a camera service to take a picture;
FIG. 4a is a schematic diagram illustrating a scenario in which a user normally scans a two-dimensional code through a chat application on a mobile phone, according to some embodiments of the present application;
FIG. 4b is a diagram illustrating a comparison of the scenes of a user's video call through a conferencing application on a cell phone, according to some embodiments of the present application;
FIG. 5 illustrates a schematic block diagram of a system software architecture of a handset, according to some embodiments of the application;
FIG. 6a illustrates an interactive flow diagram for an application invoking a camera service to obtain image data, according to some embodiments of the application;
FIG. 6b illustrates a schematic diagram of a storage profile, according to some embodiments of the application;
FIG. 6c illustrates a configuration file, according to some embodiments of the application;
FIG. 7 illustrates an interactive flow diagram for updating a whitelist, in accordance with some embodiments of the application;
fig. 8 illustrates a schematic diagram of a mobile phone, according to some embodiments of the application.
Detailed Description
Illustrative embodiments of the application include, but are not limited to, a shooting method, a readable storage medium, an electronic device.
In order to make the objects, technical solutions and advantages of the embodiments of the present application more clear, the technical solutions of the embodiments of the present application will be described in detail below with reference to the accompanying drawings and specific embodiments of the present application.
As described above, in some scenarios, the acquisition parameters selected by the application may not match the software or the camera of the electronic device, which causes problems such as stuttering of the electronic device, crashing of the application, abnormal stretching of the captured image, or low resolution of the captured image, and degrades the user experience.
For example, fig. 1 shows a schematic view of a scenario in which a user scans a two-dimensional code through a third-party application on a mobile phone.

Referring to fig. 1, a user uses a third-party application on a mobile phone 100, which may be, for example, a chat application, through which the user scans two-dimensional code information. The 01 interface of the mobile phone 100 shows the prompt "place the two-dimensional code/barcode in the frame to scan it automatically", and the two-dimensional code image 10 that the user wants to scan is displayed in the frame. However, because the acquisition parameters selected by the chat application do not match the camera of the mobile phone, the two-dimensional code image 10 is abnormally stretched. For example, the acquisition parameters supported by the camera's underlying layer may include resolutions with a 2:1 ratio, and the chat application selects the 640×320 resolution. But the chat application requires 512-aligned resolutions, i.e., resolutions whose dimensions are multiples of 512, such as 1024×512. As a result, the resolution the chat application requires does not match the resolution it selected; therefore, when the chat application acquires the two-dimensional code image 10, the image is displayed abnormally stretched, which degrades the user experience.
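The alignment mismatch described above can be expressed as simple arithmetic. The sketch below is illustrative only; the patent does not define "512-aligned" precisely for the underlying camera stack, so it is read here literally as both dimensions being multiples of 512:

```python
def is_512_aligned(size):
    """Illustrative reading of the 512-alignment requirement described
    above: both dimensions must be multiples of 512."""
    w, h = size
    return w % 512 == 0 and h % 512 == 0

# 640x320 has the right 2:1 ratio but is not 512-aligned, while 1024x512
# satisfies both the ratio and the alignment requirement.
```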
For another example, fig. 2 shows a schematic view of a scenario in which a user is engaged in a video call through a conferencing application on a mobile phone.
Referring to fig. 2, when a user conducts a video call through a conference application on the mobile phone 100, the 03 interface on the mobile phone 100 displays an image 40 of the user. It will be appreciated that the image 40 is blurred during the video call because the acquisition parameters selected by the conference application do not match the front-facing camera 30 of the mobile phone 100. For example, when the user conducts a video call using the conference application, the conference application selects a camera resolution ratio of 16:9, and the 16:9 resolutions supported by the camera's underlying layer on the mobile phone 100 include 1920×1080, 1600×900, 1280×720, 960×540, and 640×360. The conference application will typically select the lowest resolution for the video call, e.g., 640×360, which may result in insufficient sharpness of the image during the video call.
FIG. 3 illustrates a flowchart of an implementation of an application invoking a camera service to take a picture.
Referring to fig. 3, the process includes:
301, an application program obtains a collection parameter list supported by a camera.
For example, the application may obtain all acquisition parameters supported by the camera of the cell phone 100 from the camera driver when invoking the camera service to take a picture. In some embodiments, the camera-supported acquisition parameters of the handset 100 may have multiple sets, each of which may include resolution, scale, frame rate, etc.
302, the application program selects a target acquisition parameter from the acquisition parameter list, and invokes the camera service through the target acquisition parameter.
For example, the application selects a target acquisition parameter (which may also be referred to as a first acquisition parameter in other embodiments) that satisfies its conditions from the list of acquisition parameters reported by the camera. In the embodiment of fig. 1, the chat application invokes the camera service to obtain image data with a resolution ratio of 2:1, and the 2:1 resolutions in the acquisition parameter list reported by the camera include 640×320, 720×360, and 1024×512. The application will often select a lower resolution to invoke the camera service, for example 640×320. Likewise, in the embodiment of fig. 2, the conference application selects a 16:9 resolution, and the 16:9 resolutions reported by the camera include 1920×1080, 1600×900, 1280×720, 960×540, and 640×360; the conference application will typically select a lower resolution, for example invoking the camera service at 640×360.

In other embodiments, the application may instead select a larger resolution, or select a resolution at random, but the selected resolution may still fail to match the application's needs.
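The default selection behavior described in step 302 can be sketched as follows. This runnable illustration (names and data structures assumed, not the patent's code) filters the camera-reported sizes by aspect ratio and picks the smallest one:

```python
from math import gcd

def pick_lowest(supported, ratio):
    """Keep only the reported sizes with the requested lowest-terms aspect
    ratio (e.g. (16, 9)) and return the one with the fewest pixels."""
    matching = [
        (w, h) for (w, h) in supported
        if (w // gcd(w, h), h // gcd(w, h)) == ratio
    ]
    return min(matching, key=lambda s: s[0] * s[1]) if matching else None

# Acquisition parameter lists as reported in the two examples above.
reported_16_9 = [(1920, 1080), (1600, 900), (1280, 720), (960, 540), (640, 360)]
reported_2_1 = [(640, 320), (720, 360), (1024, 512)]
```

With the lists above, `pick_lowest` yields 640×360 for 16:9 and 640×320 for 2:1, matching the default behavior the description attributes to the applications.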
303, the camera service initiates the camera to capture an image according to the target capture parameters.
The application program typically selects the target acquisition parameters, then sends the target acquisition parameters to the camera service, and the camera service obtains image data according to the target acquisition parameters and reports the collected image data to the application program.
304, the application program obtains and displays the image data reported by the camera service.
The application program, for example, obtains image data from a camera service to realize the corresponding function. For example, the conference application acquires image data through a camera service to realize a function of video call. The chat application obtains the image data of the two-dimension code through the camera service so as to realize the function of identifying the two-dimension code.
In the above embodiment, after the application obtains the acquisition parameter list reported by the camera, it selects a target acquisition parameter from the list. However, the resolution in the target acquisition parameter selected by the application may not match the application's own requirements, which easily causes problems such as stuttering of the electronic device, crashing of the application, abnormal stretching of the captured image, or low resolution of the captured image. For example, in the embodiment of fig. 1, the resolution selected by the chat application does not match the resolution the chat application itself requires, resulting in abnormal stretching when the chat application scans the two-dimensional code. In the embodiment of fig. 2, the resolution selected by the conference application differs from the resolution it requires, resulting in insufficient sharpness of the image acquired by the conference application.
To solve the above problems, the present application provides a shooting method. In this method, the electronic device may store a whitelist for the first application, where the whitelist includes an identifier of the first application and at least one set of customized acquisition parameters corresponding to the first application; the customized acquisition parameters are parameters that do not cause the electronic device to stutter, the captured image to stretch abnormally, or the captured image to have low resolution. It will be appreciated that the customized acquisition parameters can be set according to the needs of the application itself as well as the needs of the user. When the module that implements the shooting function in the electronic device (for example, the camera service) obtains the target acquisition parameter selected by the first application, and the first application is in the whitelist stored on the electronic device, the camera service can select, from the customized acquisition parameters corresponding to the first application, a second acquisition parameter whose resolution ratio matches that of the target acquisition parameter, and start the camera to acquire images with the second acquisition parameter.
It should be appreciated that the target acquisition parameters may include at least one of resolution, frame rate, and scale. For convenience of description, the following description will be presented with the target acquisition parameters as resolution.
With this scheme, when an application in the whitelist captures images through the camera, the camera service can start the camera with the application's customized acquisition parameters, avoiding problems such as stuttering of the electronic device, crashing of the application, abnormal stretching of the captured image, or low resolution of the captured image, thereby improving the user experience.
For example, in the embodiment of fig. 1, suppose the whitelist stored on the mobile phone 100 includes the chat application, the chat application requires a 2:1 resolution ratio when scanning codes, and the customized 2:1 resolution in the whitelist is 1024×512. When invoking the camera service, the chat application will select the 640×320 resolution from the acquisition parameters provided by the camera; however, the camera service can replace the resolution selected by the chat application with the chat application's customized resolution. That is, the camera service replaces the selected 640×320 with 1024×512 and acquires the image at 1024×512, thereby avoiding abnormal stretching when the chat application scans the two-dimensional code. As shown in fig. 4a, when the chat application collects images at 1024×512, it can scan the two-dimensional code normally and obtain a normal two-dimensional code image 10'.
For another example, in the embodiment of fig. 2, the whitelist stored by the mobile phone 100 includes the conference application, whose customized 16:9 resolution is 1920×1080. When the conference application invokes the camera service, the camera service replaces the 640×360 resolution selected by the conference application with the customized 1920×1080 resolution, acquiring images at 1920×1080 so that the conference application's images in the video call are clearer. As shown in fig. 4b, when a user conducts a video call using the conference application, the conference application can display a clearer image 40' by acquiring image data at 1920×1080.
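Putting the two examples together, the substitution performed by the camera service can be sketched as follows. Package names, whitelist entries, and function names are assumptions for illustration only:

```python
from math import gcd

# Hypothetical whitelist: package name -> {aspect ratio: customized resolution}.
WHITELIST = {
    "com.example.chat": {(2, 1): (1024, 512)},
    "com.example.conference": {(16, 9): (1920, 1080)},
}

def resolve_capture_resolution(package_name, selected):
    """If the caller is whitelisted and a customized resolution with the same
    aspect ratio exists, substitute it; otherwise keep the app's choice."""
    w, h = selected
    g = gcd(w, h)
    custom = WHITELIST.get(package_name, {}).get((w // g, h // g))
    return custom if custom is not None else selected
```

A non-whitelisted application, or a whitelisted one requesting a ratio with no customized entry, simply keeps the resolution it selected, which matches the fallback behavior implied by the description.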
Fig. 5 shows a schematic block diagram of the system software architecture of the mobile phone 100 according to an embodiment of the present application.
The software system of the mobile phone 100 may employ a layered architecture, an event driven architecture, a micro-core architecture, a micro-service architecture, or a cloud architecture. Taking an android system with a layered architecture as an example, the embodiment of the application illustrates the software structure of the mobile phone 100.
The layered architecture divides the software into several layers, each with a clear role and division of labor. The layers communicate with each other through software interfaces. In some embodiments, the Android system is divided into four layers, which are, from top to bottom, the application layer, the application framework layer, the Android Runtime (ART) and system libraries, and the kernel layer.
As shown in fig. 5, the application layer may include a series of application packages.
The application package may include camera, gallery, chat, talk, map, navigation, WLAN, bluetooth, music, conferencing, short message, etc. applications.
The first application may include, for example, an application capable of calling a camera to achieve functions of shooting, scanning codes, or acquiring images, such as chat, map, navigation, and conference.
The application framework layer provides an application programming interface (application programming interface, API) and programming framework for application programs of the application layer. The application framework layer includes a number of predefined functions.
The application framework layer may include the camera service (CameraService), the window manager service (WMS), the view system, the graphics system, the surface compositor (SurfaceFlinger), the display engine service, the resource manager, the notification manager, and the like.

The camera service is the core process module of the camera framework; it mainly provides API interface functions to the application layer and calls down to the camera hardware abstraction layer via HIDL (Hardware Interface Definition Language).
For example, in an embodiment of the present application, a white list of electronic devices may be stored in a camera service, where when a first application in the white list invokes the camera service, the camera service can obtain a target resolution selected by the first application, and obtain a custom resolution corresponding to the target resolution of the first application in the white list. The camera service replaces the target resolution selected by the first application with the custom resolution described above and invokes the camera to acquire image data at the custom resolution.
It will be appreciated that the whitelist in the camera service may also be updated over time. For example, for a first application that is not yet stored in the whitelist of the electronic device, the electronic device may download, by way of a silent upgrade, a configuration file for that application, the configuration file including the package name of the first application and its custom resolution. The whitelist is then updated accordingly.
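The patent does not specify the configuration file's format; the sketch below assumes a simple JSON layout (package name plus customized resolutions) purely to illustrate how such an update could be merged into the whitelist:

```python
import json

# Hypothetical configuration file contents; all field names are assumed.
config_text = """
{
  "package": "com.example.live",
  "custom_resolutions": [{"ratio": "16:9", "width": 1280, "height": 720}]
}
"""

def merge_config(whitelist, text):
    """Parse a downloaded configuration file and add (or replace) the
    corresponding whitelist entry."""
    cfg = json.loads(text)
    entry = {
        tuple(int(x) for x in item["ratio"].split(":")): (item["width"], item["height"])
        for item in cfg["custom_resolutions"]
    }
    whitelist[cfg["package"]] = entry
    return whitelist

updated = merge_config({}, config_text)
```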
WMS for managing windows. The WMS may obtain the size of the display screen, determine if there is a status bar, lock the screen, intercept the screen, etc.
The view system includes visual controls, such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views.
The graphic system is a general programming software package in the system, and is composed of graphic I/O devices, and the basic functions are graphic element generation, attribute setting and the like.
SurfaceFlinger, which may also be referred to as the interface compositor, accepts the graphical display data provided by the WMS, composites it, and outputs it to the display device for display. SurfaceFlinger may use the hardware composer (HWC) to composite surfaces.
The display engine service may invoke the hardware capabilities provided by the ambient light sensor in the hardware abstraction layer to obtain the illumination intensity of the ambient light.
The resource manager provides various resources for the application program, such as localization strings, icons, pictures, layout files, video files, and the like.
The notification manager allows an application to display notification information in the status bar; it can be used to convey notification-type messages, which can disappear automatically after a short stay without user interaction. For example, the notification manager is used to notify that a download is complete, provide message alerts, etc. The notification manager may also present notifications in the form of a chart or scroll-bar text in the system status bar at the top of the screen, such as notifications from background applications, or notifications that appear on the screen as a dialog window. For example, a text message may be prompted in the status bar, a prompt tone emitted, the electronic device vibrated, or an indicator light blinked.
The Android runtime includes a core library and a virtual machine. The Android runtime is responsible for scheduling and management of the Android system. The core library consists of two parts: one part is the functions that the Java language needs to call, and the other part is the core library of Android.
The application layer and the application framework layer run in a virtual machine. The virtual machine executes java files of the application program layer and the application program framework layer as binary files. The virtual machine is used for executing the functions of object life cycle management, stack management, thread management, security and exception management, garbage collection and the like.
The system library may include a plurality of functional modules. For example: surface manager (surface manager), media library (media library), three-dimensional graphics processing library (e.g., open graphics library (open graphics library, OpenGL), or open graphics library for embedded systems (open graphics library for embedded systems, OpenGL ES)), 2D graphics engine (e.g., skia graphics library (skia graphics library, SGL)), etc. It is understood that an Android application may call the SGL or OpenGL ES interface to draw and render a UI interface.
The surface manager is used to manage the display subsystem and provides a fusion of 2D and 3D layers for multiple applications.
Media libraries support a variety of commonly used audio, video format playback and recording, still image files, and the like. The media library may support a variety of audio and video encoding formats, such as MPEG4, h.264, MP3, AAC, AMR, JPG, PNG, etc.
The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like.
The 2D graphics engine is a drawing engine for 2D drawing.
It will be appreciated that graphics rendering within an Android application is divided into two types, 2D and 3D. 2D rendering can be implemented by Skia, which can also call part of OpenGL ES to achieve simple 3D effects.
The hardware abstraction layer, i.e. the HAL layer, is a package over the hardware drivers that provides a unified, universal interface exposing hardware capabilities to the upper layers. As shown in fig. 5, the hardware abstraction layer includes a CPU HAL, a GPU HAL, a sensor HAL, a display HAL, a camera HAL, and the like. For example, in an embodiment of the present application, the camera HAL is able to obtain the custom resolution of the first application from the camera service and activate the camera to obtain image data, e.g. image frames, video frames, etc., at the custom resolution. The camera HAL returns the image data to the camera service, which in turn sends the image data to the first application, thereby realizing the corresponding functionality of the first application.
The kernel layer is a layer between hardware and software. The kernel layer at least comprises a CPU driver, a GPU driver, a display driver, a sensor driver, a camera driver and the like. For example, in the embodiment of the application, when the first application invokes the camera service, the camera driver can report all acquisition parameters supported by the camera to the first application, and the first application invokes the camera service by selecting acquisition parameters required by the first application from all the acquisition parameters.
The shooting method of the present application will be described in detail with reference to the system software architecture of the mobile phone 100.
Fig. 6a illustrates an interactive flow diagram of an application program invoking a camera service to obtain image data, according to some embodiments of the present application.
It will be appreciated that in some embodiments of the application, the first application may be, for example, a conference application, and the acquisition parameters reported by the camera driver may include, for example, a resolution list. The following describes the embodiment flow of the application program to acquire image data in a scenario where a user clicks a conference application to perform a video conference.
As shown in fig. 6a, the process includes:
the first application sends 601 a request to acquire a resolution list to a camera service in response to a user operation.
Illustratively, the first application, when invoking the camera service, will send a request to the camera service to obtain the resolution list.
For example, for the scenario shown in fig. 4a, the first application is a chat application, and when the user scans the two-dimensional code image through the chat application, the chat application sends a request for obtaining the resolution list to the camera service.
For another example, for the scenario shown in fig. 4b, the first application is a conference application, and the function of the conference application for video call may be started when the user needs to conduct a video conference. The conference application invokes the camera service to implement the video call in response to the user initiating the video call. When the conference application invokes the camera service, a request to obtain a resolution list is sent to the camera service.
The camera service obtains 602 a resolution list from the camera driver and sends 602 to the first application.
Illustratively, when the camera service receives the request for acquiring the resolution list sent by the first application, the camera service acquires from the camera driver the resolution list of all resolutions supported by the camera and sends the resolution list to the first application. For example, the aspect ratios of the resolutions supported by the camera include 2:1, 4:3, 5:4, 16:9, etc. Each ratio may include a plurality of resolutions; for example, the 2:1 resolutions include 640×320, 720×360, and 1024×512, and the 16:9 resolutions include 1920×1080, 1600×900, 1280×720, 960×540, and 640×360.
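The grouping of supported resolutions by aspect ratio described above can be sketched in Python as follows. This is an illustrative sketch only, not the camera driver's actual data structure; the function name is an assumption:

```python
from fractions import Fraction
from collections import defaultdict

def group_by_ratio(resolutions):
    """Group (width, height) pairs by their reduced aspect ratio."""
    groups = defaultdict(list)
    for w, h in resolutions:
        r = Fraction(w, h)  # e.g. 1920x1080 reduces to 16/9
        groups[f"{r.numerator}:{r.denominator}"].append((w, h))
    return dict(groups)

supported = [(640, 320), (720, 360), (1024, 512),
             (1920, 1080), (1600, 900), (1280, 720), (960, 540), (640, 360)]
groups = group_by_ratio(supported)
print(groups["2:1"])   # [(640, 320), (720, 360), (1024, 512)]
print(groups["16:9"])  # [(1920, 1080), (1600, 900), (1280, 720), (960, 540), (640, 360)]
```

The first application would then pick its target resolution from the group matching the ratio it needs.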
603, the first application selects a target resolution.
Illustratively, the first application may select a resolution it requires as the target resolution after acquiring the resolution list sent by the camera service.
For example, for the scenario shown in fig. 4a, the first application is a chat application. The resolution ratio required by the chat application to turn on the code scanning function is 2:1, so the chat application selects the target resolution from the resolutions in the resolution list having a ratio of 2:1. For example, the chat application selects the 640×320 resolution from 640×320, 720×360, and 1024×512 as the target resolution.
For another example, for the scenario shown in fig. 4b, the first application is a conference application that requires a resolution of 16:9 when initiating the video call function. The conference application will select the target resolution from the resolution list reported by the camera service with a ratio of 16:9. For example, the conferencing application selects one resolution from 1920×1080, 1600×900, 1280×720, 960×540, and 640×360. The conferencing application will typically select the smallest resolution, e.g. 640 x 360 resolution as the target resolution to invoke the camera service.
In other embodiments, the first application may also select other resolutions as the target resolution. For example, the conference application excludes the smallest resolution from the 16:9 resolution list and then selects the lowest remaining resolution, e.g., the 960×540 resolution, as the target resolution. After excluding the lowest resolution from the 2:1 resolution list, the chat application selects one of the remaining resolutions, for example, the 720×360 resolution, as the target resolution. The first application may also select the target resolution in other ways. However, the first application may still fail to select a relatively suitable resolution as the target resolution; that is, the first application has no explicit selection method capable of choosing a suitable resolution as the target resolution.
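The selection strategies described above (take the smallest matching resolution, or exclude the smallest and take the lowest remaining one) can be sketched as follows. The function name and strategy labels are illustrative assumptions, not part of the Android API:

```python
def pick_target(resolutions, ratio, strategy="smallest"):
    """Pick a target resolution among those matching the wanted aspect ratio."""
    # keep only resolutions whose w:h equals the wanted ratio, sorted by area
    candidates = sorted([(w, h) for w, h in resolutions
                         if w * ratio[1] == h * ratio[0]],
                        key=lambda wh: wh[0] * wh[1])
    if strategy == "smallest":
        return candidates[0]
    if strategy == "smallest_but_one":  # exclude the very smallest, take the next
        return candidates[1] if len(candidates) > 1 else candidates[0]
    raise ValueError(f"unknown strategy: {strategy}")

res_list = [(1920, 1080), (1600, 900), (1280, 720), (960, 540), (640, 360)]
print(pick_target(res_list, (16, 9)))                      # (640, 360)
print(pick_target(res_list, (16, 9), "smallest_but_one"))  # (960, 540)
```

As the text notes, neither heuristic guarantees a resolution that actually suits the application, which is what the whitelist mechanism below addresses.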
The first application sends 604 call parameters to the camera service.
Illustratively, when the first application invokes the camera service, it sends call parameters to the camera service, where the call parameters may include, for example, the target resolution selected by the first application, an identifier of the invoked camera, and the package name of the first application. The camera identifier is used to determine the camera that the first application needs to invoke. For example, the mobile phone 100 includes a front camera and a rear camera; the camera identifier of the front camera is 0, and the camera identifier of the rear camera is 1. When the first application calls the camera service, if the camera identifier sent to the camera service is 0, it indicates that the first application needs to invoke the front camera.
For example, for the scenario shown in fig. 4a, the first application is a chat application. When the chat application starts the code scanning function to call the camera service, the call parameters sent to the camera service include the application package name of the chat application, the invoked camera identifier 1 (i.e., the rear camera), and the target resolution 640×320.
For another example, for the scenario shown in fig. 4b, the first application is a conference application. When the conference application starts the video call function to call the camera service, the call parameters sent to the camera service include the application package name of the conference application, the invoked camera identifier 0 (i.e., the front camera), and the target resolution 640×360.
605, the camera service determines whether the first application is in a white list.
It will be appreciated that a white list of applications is stored in the camera service, which includes the package names of applications and their custom resolutions. For example, the white list includes the package name and custom resolution of the chat application, and the package name and custom resolution of the conference application. The camera service may determine whether the first application is in the white list based on the package name sent by the first application. The custom resolution may be determined based on the resolution ratio, with different ratios having different custom resolutions. For example, a chat application may have a custom resolution of 1024×512 at 2:1 and a custom resolution of 1920×1080 at 16:9.
It will be appreciated that the custom resolution may be determined, for example, from the application itself and from debugging. For example, if the chat application needs 512 alignment at the time of code scanning, the 2:1 resolution of the chat application can be set to 1024×512 in the configuration file. Alternatively, if a video-call resolution of 1920×1080 in the conference application can meet the user's requirement for video-call clarity, the 16:9 resolution of the conference application can be set to 1920×1080.
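A whitelist keyed by package name and resolution ratio, as described above, can be sketched as follows. The package names here are hypothetical placeholders, not the real package names of any application:

```python
from math import gcd

# Hypothetical whitelist structure: {package_name: {aspect_ratio: custom_resolution}}
WHITELIST = {
    "com.example.chat":    {"2:1": (1024, 512), "16:9": (1920, 1080)},
    "com.example.meeting": {"16:9": (1920, 1080)},
}

def custom_resolution(pkg, target):
    """Return the custom resolution matching the app's target ratio, or None."""
    w, h = target
    g = gcd(w, h)
    ratio = f"{w // g}:{h // g}"  # e.g. 640x320 -> "2:1"
    return WHITELIST.get(pkg, {}).get(ratio)

print(custom_resolution("com.example.chat", (640, 320)))   # (1024, 512)
print(custom_resolution("com.example.other", (640, 320)))  # None
```

A lookup that returns None corresponds to the "not in white list" branch of step 605.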
If the determination is yes, for example, the camera service determines that the first application is in the white list through the package name of the first application, then execution 607, the camera service replaces the target resolution with the custom resolution.
If the determination is negative, for example, the camera service determines, through the package name of the first application, that the first application is not in the white list, then execution 606 starts the camera based on the target resolution.
It will be appreciated that the whitelist of the mobile phone 100 may be obtained, for example, by a developer determining the customized resolution of a first application by debugging the first application, packaging the package name and customized resolution of the first application, and storing the result in the camera service as the whitelist. Moreover, the whitelist stored in the camera service can be updated. For example, if a first application in the mobile phone 100 is not stored in the whitelist of the camera service, the target resolution it selects when invoking the camera service may be mismatched with the camera, which affects the first application's ability to implement its corresponding function. The developer may obtain a configuration file of the first application after debugging it to determine its customized resolution. The configuration file of the first application includes the package name and the customized resolution of the first application, and is used to update the whitelist in the camera service.
By way of example, FIG. 6b shows a schematic diagram of storing the configuration file, according to an embodiment of the application. As shown in fig. 6b, the parameter cloud pushing server generates a configuration file and sends it to the camera service. The camera service saves the configuration file in extensible markup language (extensible markup language, XML) format to update the whitelist. After the mobile phone 100 is powered on and the camera service is started, the camera service can implement resolution replacement through the updated white list. The updating of the white list of the camera service is described in detail below.
It should be understood that in other embodiments, the mobile phone 100 may store the configuration file in other formats of files or forms, which is not limited herein.
Illustratively, FIG. 6c shows a configuration file, in accordance with an embodiment of the present application.
As shown in fig. 6c, in the XML configuration file stored by the camera service, the version number version is 6.1.22.105, the file name is "camera_config_cure.xml", the product name product is "ROM", and the application package name pkg in the configuration file is huiyi (the package name of the conference application). The identifier of the invoked front camera is 0 with a custom resolution of 1920×1080, and the identifier of the rear camera is 1 with a custom resolution of 1920×1080. Here "34 1920 1080 OUTPUT" represents the custom resolution in preview mode (indicated by 34), while "35 1920 1080 OUTPUT" and "33 1920 1080 OUTPUT" represent the custom resolution at the time of shooting (indicated by 33 or 35).
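A minimal sketch of parsing one stream-configuration entry such as "34 1920 1080 OUTPUT" follows. The mapping of format code 34 to preview and 33/35 to shooting is taken from the example above; the function name and dictionary shape are assumptions:

```python
# Assumed meaning of the format codes, per the example in the text
STREAM_MODES = {33: "capture", 34: "preview", 35: "capture"}

def parse_stream_config(entry):
    """Parse one '<format> <width> <height> <direction>' configuration entry."""
    fmt, w, h, direction = entry.split()
    return {"mode": STREAM_MODES[int(fmt)],
            "resolution": (int(w), int(h)),
            "direction": direction}

print(parse_stream_config("34 1920 1080 OUTPUT"))
# {'mode': 'preview', 'resolution': (1920, 1080), 'direction': 'OUTPUT'}
```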
Illustratively, the version number encoding rule described above takes the form aa.bb.yy.rrr. aa represents the major version number of the parameter format, corresponding to a major version of the system. bb represents the minor version number of the parameter format, which is modified when a compatible change occurs in the parameter file format, starting from 10. yy represents the year of release, taking the last two digits, e.g., 22 for a 2022 release. rrr represents the number of releases within a year, starting from 100 each year, with 1 added for each release. The storage path of the configuration file is "product configuration bin/product_h/etc/camera". It should be appreciated that in other embodiments, the configuration file may be stored under other paths, which is not limited herein.
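Splitting an aa.bb.yy.rrr version number into its four fields, as described above, can be sketched as follows; the function name is an assumption:

```python
def parse_version(version):
    """Split an aa.bb.yy.rrr version string into its four fields."""
    aa, bb, yy, rrr = (int(part) for part in version.split("."))
    return {"major": aa, "minor": bb, "year": 2000 + yy, "release": rrr}

print(parse_version("6.1.22.105"))
# {'major': 6, 'minor': 1, 'year': 2022, 'release': 105}
```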
The camera service activates the camera based on the target resolution 606.
Illustratively, when the camera service determines that the first application is not in the white list, it may start the camera directly with the target resolution selected by the first application. For example, if the conference application is not stored in the white list of the camera service, the camera service may invoke the camera directly to acquire image data at the target resolution 640×360 selected by the conference application.
It will be appreciated that the camera service transmits the target resolution 640 x 360 to the camera HAL, through which the camera is activated to acquire corresponding image data at a resolution of 640 x 360.
607, the camera service replaces the target resolution with the custom resolution.
For example, if the camera service determines that the first application is in the white list according to the package name of the first application, the target resolution selected by the first application is replaced by the customized resolution.
For example, for the scenario shown in FIG. 4a, the first application is a chat application, the custom resolution of the chat application at 2:1 in the white list is 1024×512, and the target resolution that the chat application sends to the camera service is 640×320. At this point the camera service may replace the target resolution 640×320 of the chat application with the corresponding custom resolution 1024×512, and acquire image data at the custom resolution.
For another example, for the scenario shown in fig. 4b, the first application is a conference application that is in the whitelist of the camera service, its custom resolution at 16:9 in the whitelist is 1920×1080, and the target resolution that the conference application sends to the camera service is 640×360. The camera service may then replace the target resolution 640×360 of the conference application with the custom resolution 1920×1080, thereby acquiring image data at the custom resolution.
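The decision in steps 605 to 608 (replace the target resolution with the custom one when the application is whitelisted for that ratio, otherwise keep the target) can be sketched as follows; package names and the whitelist layout are hypothetical:

```python
from math import gcd

def resolve_resolution(pkg, target, whitelist):
    """Return the custom resolution if the app is whitelisted for the target's ratio, else the target."""
    w, h = target
    g = gcd(w, h)
    ratio = f"{w // g}:{h // g}"
    custom = whitelist.get(pkg, {}).get(ratio)
    return custom if custom is not None else target

wl = {"com.example.meeting": {"16:9": (1920, 1080)}}
print(resolve_resolution("com.example.meeting", (640, 360), wl))  # (1920, 1080)
print(resolve_resolution("com.example.maps", (640, 360), wl))     # (640, 360)
```

Whichever resolution this yields is then handed to the camera HAL, which starts the camera with it.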
608, the camera service activates the camera based on the custom resolution.
It will be appreciated that after the camera service replaces the target resolution with the custom resolution, the custom resolution is sent to the camera HAL and the camera is started by the camera HAL to acquire image data.
For example, for the scenario shown in fig. 4a, when the chat application invokes the camera service, the camera service sends the custom resolution 1024 x 512 of the chat application to the camera HAL, which initiates the camera to acquire image data at a resolution of 1024 x 512.
For another example, for the scenario shown in fig. 4b, when the conference application invokes the camera service, the camera service sends the custom resolution 1920×1080 of the conference application to the camera HAL, which initiates the camera to acquire image data at 1920×1080 resolution.
The camera service obtains image data from the camera 609 and sends the image data to the first application.
Illustratively, after the camera acquires image data according to the resolution issued by the camera service, the camera service acquires the image data from the camera. The camera service then sends the image data to the first application, thereby implementing the functionality of the first application.
For example, for the scenario shown in fig. 4a, the camera service obtains image data from the camera and then sends the image data to the chat application, which obtains the image data identifying the two-dimensional code.
For another example, for the scenario shown in fig. 4b, the camera service obtains image data from the camera and then sends the image data to the conference application, which obtains the image data to implement the function of the video call.
It can be appreciated that, according to the embodiment of the present application, when the first application issues the selected target resolution to the camera service, the camera service determines whether the first application is in the white list according to the package name of the application program, and thereby determines whether to replace the target resolution issued by the first application. Thus, the embodiment of the application can add the package name and corresponding customized resolution of the first application to the white list, so that the camera service can start the camera to acquire image data at the customized resolution of the first application, and the image data acquired by the first application can meet both its own requirements and those of the user.
For the white list in the camera service in the embodiment of fig. 6a, the cloud pushing update can be performed by the parameter cloud pushing server.
For example, FIG. 7 illustrates an interactive flow diagram for updating a whitelist, according to some embodiments of the application.
As shown in fig. 7, the flow includes:
701, the parameter cloud pushing server packages the configuration file.
For example, for a first application not stored in the white list, if there is a need to adjust its acquisition parameters when the camera service is invoked, the white list in the camera service may be updated in a cloud pushing manner. For convenience of description, the following takes the resolution as an example of the acquisition parameters. For example, the map application is not stored in the whitelist of the electronic device; after the developer has debugged the custom resolution of the map application, the parameter cloud pushing server may package the application package name and custom resolution of the map application into a configuration file through a packaging tool.
It will be appreciated that in other embodiments, the configuration file may include an application package name and a custom resolution for at least one first application.
702, the parameter cloud pushing server sends the broadcast information to the system upgrade client.
Illustratively, after packaging the configuration file, the parameter cloud pushing server transmits broadcast information including an update instruction, so that the electronic device can receive the broadcast information and update the whitelist.
703, the system upgrade client listens for the broadcast information to obtain the parameter update instruction. Illustratively, the electronic device may implement software download and update functions via over-the-air (OTA) technology. Taking the mobile phone 100 as an example, the system upgrade client (OTA update client, OUC) in the mobile phone 100 can monitor the broadcast information sent by the parameter cloud pushing server, thereby implementing the over-the-air technology. It will be appreciated that the OUC is an application installed, for example, in the application layer of the system of the mobile phone 100 for operating system upgrade management.
For example, after the system upgrade client in the mobile phone 100 monitors the broadcast information sent by the parameter cloud pushing server, the parameter update instruction may be obtained from the broadcast information.
The system upgrade client downloads the configuration file 704.
Illustratively, after the system upgrade client obtains the parameter update instruction from the broadcast information, it may download the configuration file packaged by the parameter cloud pushing server.
It will be appreciated that the system upgrade client may download the parameter file as a silent upgrade; for example, the mobile phone 100 may automatically download the configuration file between 2:00 and 4:00 a.m. when connected to the Internet. Alternatively, the mobile phone 100 pops up a prompt box to prompt the user to upgrade the system, and downloads the configuration file once the user clicks to authorize it.
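The silent-upgrade time window mentioned above can be checked with a sketch like the following; the exact window boundaries are an assumption based on the description:

```python
from datetime import time

def in_silent_window(now, start=time(2, 0), end=time(4, 0)):
    """True if the given time of day falls inside the assumed silent-upgrade window."""
    return start <= now <= end

print(in_silent_window(time(3, 30)))  # True
print(in_silent_window(time(9, 0)))   # False
```

A real implementation would also require an active Internet connection before starting the download, as the text describes.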
705, the system upgrade client sends a configuration file to the camera service.
The system upgrade client, illustratively, downloads the configuration file and then sends the configuration file to the camera service. It will be appreciated that when the system upgrade client is run, the camera service of the application framework layer may be accessed through the application programming interface and the configuration file sent to the camera service to update the camera service.
The camera service updates 706 the whitelist according to the configuration file.
Illustratively, the camera service may update the whitelist after acquiring the configuration file. For example, when the system upgrade client downloads the configuration file of the map application to the camera service, the white list in the camera service includes the package name of the map application and the custom resolution of the map application, thereby completing the update of the white list.
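Merging a downloaded configuration file's entries into the whitelist (step 706) can be sketched as follows; the entry format and package names are illustrative assumptions:

```python
def update_whitelist(whitelist, config_entries):
    """Merge (package, ratio, custom_resolution) entries into the whitelist in place."""
    for pkg, ratio, res in config_entries:
        whitelist.setdefault(pkg, {})[ratio] = res
    return whitelist

wl = {"com.example.chat": {"2:1": (1024, 512)}}
update_whitelist(wl, [("com.example.maps", "16:9", (1280, 720))])
print(wl["com.example.maps"])  # {'16:9': (1280, 720)}
```

Existing entries survive the merge, so a pushed configuration file only needs to carry the applications being added or changed.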
By updating the white list of the electronic device through the parameter cloud pushing server, the problem that the resolution selected by an application program not stored in the white list does not match the camera of the electronic device can be resolved in a timely manner.
Fig. 8 shows a schematic structural diagram of a mobile phone 100 according to an embodiment of the present application.
The handset 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (universal serial bus, USB) interface 130, a charge management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, keys 190, a motor 191, an indicator 192, a camera 193, a display 194, and a subscriber identity module (subscriber identification module, SIM) card interface 195, etc. The sensor module 180 may include a pressure sensor 180A, a gyro sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It should be understood that the structure illustrated in the embodiments of the present application does not constitute a specific limitation on the mobile phone 100. In other embodiments of the application, the mobile phone 100 may include more or fewer components than shown, certain components may be combined or split, or a different arrangement of components may be used. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units, such as: the processor 110 may include an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, and/or a neural network processor (neural-network processing unit, NPU), etc. Wherein the different processing units may be separate devices or may be integrated in one or more processors.
The controller can generate operation control signals according to the instruction operation codes and the time sequence signals to finish the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that the processor 110 has just used or recycled. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory. Repeated accesses are avoided and the latency of the processor 110 is reduced, thereby improving the efficiency of the system.
In some embodiments, the processor 110 may include one or more interfaces. The interfaces may include an integrated circuit (inter-integrated circuit, I2C) interface, an integrated circuit built-in audio (inter-integrated circuit sound, I2S) interface, a pulse code modulation (pulse code modulation, PCM) interface, a universal asynchronous receiver transmitter (universal asynchronous receiver/transmitter, UART) interface, a mobile industry processor interface (mobile industry processor interface, MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (subscriber identity module, SIM) interface, and/or a universal serial bus (universal serial bus, USB) interface, among others.
The I2C interface is a bidirectional synchronous serial bus, which includes a serial data line (SDA) and a Serial Clock Line (SCL). In some embodiments, the processor 110 may contain multiple sets of I2C buses.
The UART interface is a universal serial data bus for asynchronous communications. The bus may be a bi-directional communication bus. It converts the data to be transmitted between serial communication and parallel communication.
The MIPI interface may be used to connect the processor 110 to peripheral devices such as a display, camera, etc. The MIPI interfaces include camera serial interfaces (camera serial interface, CSI), display serial interfaces (display serial interface, DSI), and the like.
The GPIO interface may be configured by software. The GPIO interface may be configured as a control signal or as a data signal. In some embodiments, a GPIO interface may be used to connect the processor 110 with a camera, display screen, wireless communication module 160, sensor module 180, or the like. The GPIO interface may also be configured as an I2C interface, an I2S interface, a UART interface, an MIPI interface, etc.
The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a mini USB interface, a micro USB interface, a USB Type C interface, or the like. The USB interface 130 may be used to connect to a charger to charge the mobile phone 100, or may be used to transfer data between the mobile phone 100 and a peripheral device. And can also be used for connecting with a headset, and playing audio through the headset. The interface may also be used to connect other electronic devices, such as AR devices, etc.
It should be understood that the connection relationship between the modules illustrated in the embodiment of the present application is only illustrative, and is not limited to the structure of the mobile phone 100. In other embodiments of the present application, the mobile phone 100 may also use different interfacing manners, or a combination of multiple interfacing manners in the above embodiments.
The charge management module 140 is configured to receive a charge input from a charger. The charger can be a wireless charger or a wired charger. In some wired charging embodiments, the charge management module 140 may receive a charging input of a wired charger through the USB interface 130. In some wireless charging embodiments, the charge management module 140 may receive wireless charging input through a wireless charging coil of the cell phone 100. The charging management module 140 may also supply power to the electronic device through the power management module 141 while charging the battery 142.
The power management module 141 is used for connecting the battery 142, and the charge management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140 and provides power to the processor 110, the internal memory 121, the display, the camera, the wireless communication module 160, and the like. The power management module 141 may also be configured to monitor battery capacity, battery cycle number, battery health (leakage, impedance) and other parameters. In other embodiments, the power management module 141 may also be provided in the processor 110. In other embodiments, the power management module 141 and the charge management module 140 may be disposed in the same device.
The wireless communication function of the mobile phone 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the handset 100 may be used to cover a single or multiple communication bands. Different antennas may also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed into a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide wireless communication solutions applied to the handset 100, including 2G/3G/4G/5G. The mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (LNA), and the like. The mobile communication module 150 may receive electromagnetic waves from the antenna 1, filter and amplify the received electromagnetic waves, and transmit the processed signals to the modem processor for demodulation. The mobile communication module 150 may also amplify a signal modulated by the modem processor and convert it into electromagnetic waves for radiation through the antenna 1. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the same device as at least some of the modules of the processor 110.
The wireless communication module 160 may provide wireless communication solutions applied to the mobile phone 100, including wireless local area network (WLAN) (e.g., a wireless fidelity (Wi-Fi) network), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), and the like. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, demodulates and filters the electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, frequency-modulate and amplify it, and convert it into electromagnetic waves for radiation via the antenna 2.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to extend the memory capabilities of the handset 100. The external memory card communicates with the processor 110 through an external memory interface 120 to implement data storage functions. For example, files such as music, video, etc. are stored in an external memory card.
The internal memory 121 may be used to store computer-executable program code, which includes instructions. The internal memory 121 may include a program storage area and a data storage area. The program storage area may store the operating system and application programs required for at least one function (such as a sound playing function and an image playing function). The data storage area may store data created during use of the handset 100 (e.g., audio data, a phonebook), and the like. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory such as at least one magnetic disk storage device, a flash memory device, or a universal flash storage (UFS). The processor 110 performs the various functional applications and data processing of the mobile phone 100 by executing instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor.
The pressure sensor 180A is used to sense a pressure signal and may convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. There are various types of pressure sensors 180A, such as resistive pressure sensors, inductive pressure sensors, and capacitive pressure sensors. A capacitive pressure sensor may comprise at least two parallel plates of conductive material; the capacitance between the electrodes changes when a force is applied to the pressure sensor 180A, and the handset 100 determines the strength of the pressure from the change in capacitance. When a touch operation is applied to the display 194, the mobile phone 100 detects the intensity of the touch operation through the pressure sensor 180A. The mobile phone 100 may also calculate the position of the touch based on the detection signal of the pressure sensor 180A.
The gyro sensor 180B may be used to determine the motion gesture of the cell phone 100.
The air pressure sensor 180C is used to measure air pressure. In some embodiments, the handset 100 calculates altitude from the barometric pressure value measured by the air pressure sensor 180C to assist in positioning and navigation.
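One common way to derive altitude from a barometric reading is the international barometric formula. The sketch below is an illustration only, not taken from the patent; the function name and the standard sea-level pressure constant are assumptions:

```python
# Illustrative sketch: approximate altitude from barometric pressure using
# the international barometric formula. The constants assume the standard
# atmosphere; real devices typically calibrate against a reference pressure.

SEA_LEVEL_PRESSURE_HPA = 1013.25  # standard sea-level pressure (assumed)

def altitude_from_pressure(pressure_hpa: float,
                           sea_level_hpa: float = SEA_LEVEL_PRESSURE_HPA) -> float:
    """Return the approximate altitude in meters for a measured pressure in hPa."""
    return 44330.0 * (1.0 - (pressure_hpa / sea_level_hpa) ** (1.0 / 5.255))

# At standard sea-level pressure the computed altitude is approximately 0 m.
print(round(altitude_from_pressure(1013.25), 1))
```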
The magnetic sensor 180D may include, for example, a hall sensor.
The acceleration sensor 180E can detect the magnitude of the acceleration of the mobile phone 100 in various directions (typically three axes). The magnitude and direction of gravity can be detected when the handset 100 is stationary. The acceleration sensor 180E may also be used to recognize the posture of the electronic device, and is applied to landscape/portrait screen switching, pedometers, and other applications.
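The landscape/portrait switching mentioned above can be sketched as a comparison of the gravity components reported by the three axes when the device is roughly stationary. The function name, axis convention, and classification rule below are assumptions for illustration, not the patented method:

```python
# Illustrative sketch: classify device orientation from a 3-axis
# accelerometer reading (m/s^2). When the device is stationary, the axis
# carrying most of gravity indicates how the device is held.

def detect_orientation(ax: float, ay: float, az: float) -> str:
    """Return 'flat', 'landscape', or 'portrait' from acceleration components."""
    if abs(az) > abs(ax) and abs(az) > abs(ay):
        return "flat"        # gravity mostly along z: lying on a table
    if abs(ax) > abs(ay):
        return "landscape"   # gravity mostly along the long edge
    return "portrait"        # gravity mostly along the short edge

print(detect_orientation(0.1, 9.8, 0.2))  # gravity along y -> portrait
```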
A distance sensor 180F for measuring a distance. The cell phone 100 may measure the distance by infrared or laser. In some embodiments, the cell phone 100 may range using the distance sensor 180F to achieve quick focus.
The proximity light sensor 180G may include, for example, a light emitting diode (LED) and a light detector, such as a photodiode. The light emitting diode may be an infrared light emitting diode. The mobile phone 100 emits infrared light outward through the light emitting diode and uses the photodiode to detect infrared light reflected from nearby objects. When sufficient reflected light is detected, the mobile phone 100 can determine that there is an object nearby; when insufficient reflected light is detected, the mobile phone 100 can determine that there is no object nearby. The mobile phone 100 can use the proximity light sensor 180G to detect that the user is holding the mobile phone 100 close to the ear during a call, so as to automatically turn off the screen to save power.
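The proximity decision described above amounts to comparing the photodiode's reflected-light reading against a threshold. The sketch below is illustrative only; the threshold value and function names are assumptions, not from the patent:

```python
# Illustrative sketch: threshold-based proximity detection. A strong
# infrared reflection implies a nearby object (e.g. the user's ear), and
# the screen can be turned off during a call to save power.

REFLECTION_THRESHOLD = 0.6  # normalized photodiode reading (assumed value)

def object_nearby(reflected_light: float) -> bool:
    """Return True when enough reflected infrared light is detected."""
    return reflected_light >= REFLECTION_THRESHOLD

def should_turn_off_screen(reflected_light: float, in_call: bool) -> bool:
    # The screen is turned off only during a call with the phone near the ear.
    return in_call and object_nearby(reflected_light)

print(should_turn_off_screen(0.8, in_call=True))  # True
```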
The ambient light sensor 180L is used to sense ambient light level. The cell phone 100 may adaptively adjust the brightness of the display 194 based on perceived ambient light levels. The ambient light sensor 180L may also be used to automatically adjust white balance when taking a photograph. The ambient light sensor 180L may also cooperate with the proximity light sensor 180G to detect if the handset 100 is in a pocket to prevent false touches.
The fingerprint sensor 180H is used to collect a fingerprint. The mobile phone 100 can utilize the collected fingerprint characteristics to realize fingerprint unlocking, access an application lock, fingerprint photographing, fingerprint incoming call answering and the like.
The temperature sensor 180J is for detecting temperature.
The touch sensor 180K is also referred to as a "touch device". The touch sensor 180K may be disposed on the display screen 194; together, the touch sensor 180K and the display screen 194 form a touch screen, also called a "touchscreen". The touch sensor 180K is used to detect a touch operation acting on or near it. The touch sensor may pass the detected touch operation to the application processor to determine the touch event type. Visual output related to the touch operation may be provided through the display 194. In other embodiments, the touch sensor 180K may be disposed on the surface of the mobile phone 100 at a location different from that of the display 194.
The bone conduction sensor 180M may acquire a vibration signal. In some embodiments, the bone conduction sensor 180M may acquire the vibration signal of the bone mass vibrating with the human voice. The bone conduction sensor 180M may also contact the human pulse to receive a blood pressure pulsation signal.
The SIM card interface 195 is used to connect a SIM card. The SIM card may be inserted into the SIM card interface 195 or removed from the SIM card interface 195 to achieve contact with and separation from the handset 100. The handset 100 may support 1 or N SIM card interfaces, N being a positive integer greater than 1. The SIM card interface 195 may support nano-SIM cards, micro-SIM cards, and the like. The same SIM card interface 195 may be used to insert multiple cards simultaneously. The types of the multiple cards may be the same or different. The SIM card interface 195 may also be compatible with different types of SIM cards. The SIM card interface 195 may also be compatible with external memory cards. The mobile phone 100 interacts with the network through the SIM card to realize functions such as calling and data communication. In some embodiments, the handset 100 employs an eSIM, i.e., an embedded SIM card. The eSIM card can be embedded in the handset 100 and cannot be separated from the handset 100.
In the drawings, some structural or methodological features may be shown in a particular arrangement and/or order. However, it should be understood that such a particular arrangement and/or ordering may not be required. Rather, in some embodiments, these features may be arranged in a different manner and/or order than shown in the illustrative figures. Additionally, the inclusion of structural or methodological features in a particular figure is not meant to imply that such features are required in all embodiments, and in some embodiments, may not be included or may be combined with other features.
It should be noted that, in the embodiments of the present application, each unit/module mentioned in each device is a logic unit/module. Physically, one logic unit/module may be one physical unit/module, may be a part of one physical unit/module, or may be implemented by a combination of multiple physical units/modules; the physical implementation of the logic unit/module itself is not the most important consideration, and the combination of functions implemented by the logic units/modules is the key to solving the technical problem posed by the present application. Furthermore, in order to highlight the innovative part of the present application, the above device embodiments do not introduce units/modules that are less closely related to solving the technical problem posed by the present application; this does not indicate that the above device embodiments do not have other units/modules.
It should be noted that in the examples and descriptions of this patent, relational terms such as first and second are used solely to distinguish one entity or action from another entity or action, without necessarily requiring or implying any actual such relationship or order between such entities or actions. Moreover, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
While the application has been shown and described with reference to certain preferred embodiments thereof, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the application.

Claims (7)

1. A photographing method applied to an electronic device, the method comprising:
detecting that a first application installed on the electronic device calls a camera service, and acquiring a first acquisition parameter sent by the first application when calling the camera service, wherein the first acquisition parameter comprises a first resolution;
determining that a second acquisition parameter exists in preset acquisition parameters of the first application, wherein the second acquisition parameter comprises a second resolution having the same aspect ratio as the first resolution, the first resolution being less than the second resolution;
wherein the electronic device stores a first list, the first list comprising application identifiers and preset acquisition parameters corresponding to the identifiers;
and wherein the determining that a second acquisition parameter exists in the preset acquisition parameters of the first application comprises:
acquiring a first identifier sent by the first application when calling the camera service, the first identifier identifying the first application;
determining that a second identifier which is the same as the first identifier exists in the first list;
determining the second acquisition parameter from the preset acquisition parameters corresponding to the second identifier;
and providing the camera service for the first application by using the second acquisition parameters.
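The method of claim 1 amounts to a lookup-and-substitute step in the camera service: match the calling application's identifier against the first list, then serve the camera with a preset resolution of the same aspect ratio but a higher pixel count. The sketch below is an illustration only, not the patented implementation; the package names, resolutions, and function names are assumed example values:

```python
# Illustrative sketch of claim 1: when an application requests the camera
# with a first resolution, look the application up in a preset list keyed
# by application identifier (e.g. package name). If a preset resolution
# with the same aspect ratio but a higher pixel count exists, provide the
# camera service with that second resolution instead.

from fractions import Fraction

# First list: application identifier -> preset acquisition parameters
# (assumed example entries, not from the patent).
PRESET_PARAMETERS = {
    "com.example.conference": [(1280, 720), (1920, 1080)],
}

def select_resolution(app_id, requested):
    """Return the preset resolution to use, or the requested one if no match."""
    req_w, req_h = requested
    for preset_w, preset_h in PRESET_PARAMETERS.get(app_id, []):
        same_ratio = Fraction(preset_w, preset_h) == Fraction(req_w, req_h)
        if same_ratio and preset_w * preset_h > req_w * req_h:
            # Second acquisition parameter: same aspect ratio, higher resolution.
            return (preset_w, preset_h)
    return requested  # no match: keep the first acquisition parameter

print(select_resolution("com.example.conference", (640, 360)))  # (1280, 720)
```

Both 640x360 and 1280x720 are 16:9, so the higher-resolution preset is substituted; an application absent from the list keeps its requested resolution.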
2. The method of claim 1, wherein the obtaining the first acquisition parameter sent by the first application invoking the camera service comprises:
sending an acquisition parameter list supported by a camera to the first application, and receiving the first acquisition parameter selected by the first application from the acquisition parameter list.
3. The method of claim 1, wherein the first identifier is an application package name of the first application, the second identifier is an application package name stored in the first list, and the preset acquisition parameters match the application corresponding to the application package name stored in the first list.
4. A method according to any one of claims 1 to 3, wherein the first application comprises at least one of the following applications:
conference applications, instant messaging applications, live applications, map applications, and navigation applications.
5. An electronic device comprising a camera service module and a first application;
the camera service module detects that the first application calls a camera service and acquires a first acquisition parameter sent by the first application when calling the camera service, wherein the first acquisition parameter comprises a first resolution;
the camera service module determines that a second acquisition parameter exists in preset acquisition parameters of the first application, wherein the second acquisition parameter comprises a second resolution having the same aspect ratio as the first resolution; the first resolution is less than the second resolution;
the camera service module stores a first list, wherein the first list comprises an application identifier and preset acquisition parameters corresponding to the identifier;
the camera service module determines that a second acquisition parameter exists in the preset acquisition parameters of the first application, and the method comprises the following steps:
the camera service module obtains a first identifier of the first application, which is sent by the first application calling the camera service;
the camera service module determines that a second identifier which is the same as the first identifier exists in the first list;
the camera service module determines the second acquisition parameter from the preset acquisition parameters corresponding to the second identifier;
the camera service module provides the camera service for the first application using the second acquisition parameter.
6. An electronic device comprising a memory for storing instructions;
at least one processor configured to execute the instructions to cause the electronic device to implement the photographing method of any one of claims 1 to 4.
7. A computer-readable storage medium having instructions stored thereon which, when executed on a computer, cause the computer to perform the photographing method of any one of claims 1 to 4.
CN202310809507.0A 2023-07-04 2023-07-04 Shooting method, readable storage medium and electronic device Active CN116528063B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310809507.0A CN116528063B (en) 2023-07-04 2023-07-04 Shooting method, readable storage medium and electronic device

Publications (2)

Publication Number Publication Date
CN116528063A CN116528063A (en) 2023-08-01
CN116528063B true CN116528063B (en) 2023-11-03

Family

ID=87401604

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310809507.0A Active CN116528063B (en) 2023-07-04 2023-07-04 Shooting method, readable storage medium and electronic device

Country Status (1)

Country Link
CN (1) CN116528063B (en)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104503655A (en) * 2014-11-28 2015-04-08 晨星半导体股份有限公司 Application program interface display control method and device
CN107172345A (en) * 2017-04-07 2017-09-15 深圳市金立通信设备有限公司 A kind of image processing method and terminal
CN108037819A (en) * 2017-12-05 2018-05-15 广东美晨通讯有限公司 A kind of mobile terminal reduces method, mobile terminal and the storage device of power consumption
CN109068059A (en) * 2018-08-27 2018-12-21 Oppo广东移动通信有限公司 A kind of method, mobile terminal and storage medium calling camera
CN110012217A (en) * 2018-12-06 2019-07-12 香港乐蜜有限公司 The determination method, apparatus and electronic equipment of acquisition resolution
CN111212227A (en) * 2020-01-13 2020-05-29 广州视源电子科技股份有限公司 Method, device and equipment for configuring video resolution of camera and storage medium
CN113179369A (en) * 2021-04-08 2021-07-27 重庆传音通讯技术有限公司 Shot picture display method, mobile terminal and storage medium
CN115202599A (en) * 2022-06-24 2022-10-18 中国银行股份有限公司 Screen projection display method and related device
CN116074623A (en) * 2022-05-30 2023-05-05 荣耀终端有限公司 Resolution selecting method and device for camera

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5516662B2 (en) * 2011-12-05 2014-06-11 株式会社ニコン Imaging device



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant