CN116489494A - Shooting method, readable medium and electronic device - Google Patents

Shooting method, readable medium and electronic device

Info

Publication number
CN116489494A
CN116489494A CN202310369097.2A
Authority
CN
China
Prior art keywords
camera
application
screen
physical
shooting
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310369097.2A
Other languages
Chinese (zh)
Inventor
武文斌
张东
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Honor Device Co Ltd filed Critical Honor Device Co Ltd
Priority to CN202310369097.2A priority Critical patent/CN116489494A/en
Publication of CN116489494A publication Critical patent/CN116489494A/en
Pending legal-status Critical Current

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/57Mechanical or electrical details of cameras or camera modules specially adapted for being embedded in other devices
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/02Constructional features of telephone sets
    • H04M1/0202Portable telephone sets, e.g. cordless phones, mobile phones or bar type handsets
    • H04M1/0206Portable telephones comprising a plurality of mechanically joined movable body parts, e.g. hinged housings
    • H04M1/0208Portable telephones comprising a plurality of mechanically joined movable body parts, e.g. hinged housings characterized by the relative motions of the body parts
    • H04M1/0214Foldable telephones, i.e. with body parts pivoting to an open position around an axis parallel to the plane they define in closed position
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/02Constructional features of telephone sets
    • H04M1/0202Portable telephone sets, e.g. cordless phones, mobile phones or bar type handsets
    • H04M1/026Details of the structure or mounting of specific components
    • H04M1/0264Details of the structure or mounting of specific components for a camera module assembly
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/72Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72403User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/90Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M2250/00Details of telephonic subscriber devices
    • H04M2250/52Details of telephonic subscriber devices including functional features of a camera

Landscapes

  • Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Telephone Function (AREA)
  • Studio Devices (AREA)

Abstract

The application relates to the technical field of terminals and discloses a shooting method, a readable medium and an electronic device. The electronic device includes a first screen and a plurality of physical cameras, where the first screen is a folding screen. The method includes the following steps: a first application receives a first shooting instruction of a second application, where the first shooting instruction includes a front shooting view angle identifier; when the first screen is in the unfolded state, the first application starts a first physical camera in response to the first shooting instruction; when the first screen is in the folded state, the first application starts a second physical camera in response to the first shooting instruction; the second application completes the first shooting instruction through the physical camera started by the first application; and the second application is independent of the first application. As a result, the developer of the second application does not need to write instructions that determine, according to the folding form of the first screen, which physical camera responds to the same shooting view angle identifier, which reduces development cost.

Description

Shooting method, readable medium and electronic device
This application is a divisional application. The filing number of the original application is 202111651361.9, the filing date of the original application is December 30, 2021, and the entire contents of the original application are incorporated herein by reference.
Technical Field
The application relates to the technical field of terminals, in particular to a shooting method, a readable medium and electronic equipment.
Background
With the development of flexible screen technology, foldable electronic devices equipped with flexible screens are becoming more and more widespread. A foldable electronic device is generally configured with a plurality of physical cameras, and when the electronic device is in different forms, for example the folded state and the unfolded state, the logical front camera/logical rear camera used by the electronic device may be different physical cameras. Therefore, an application program needs to determine the physical camera used as the logical front/rear camera according to the form of the electronic device.
Disclosure of Invention
In view of this, embodiments of the present application provide a shooting method, a readable medium and an electronic device, in which system software/a service/a module of the electronic device determines, according to the folding form of the electronic device, the physical camera corresponding to the logical camera that an application program wants to start, which reduces the development cost of the application program.
In a first aspect, an embodiment of the present application provides a shooting method applied to an electronic device, where the electronic device includes a first screen and a second screen located on different sides of the electronic device, the first screen is a folding screen, at least one first physical camera is disposed on the same side as the first screen, and at least one second physical camera is disposed on the same side as the second screen. The method includes the following steps: a first application receives a shooting instruction of a second application, where the shooting instruction includes a shooting view angle identifier, and the shooting view angle includes front shooting and rear shooting; the first application determines the physical camera for completing the shooting instruction according to the shooting view angle identifier, the folding form of the first screen, and a pre-stored correspondence between each physical camera and each shooting view angle identifier for the different folding forms of the first screen; the second application completes the shooting instruction through the determined physical camera; and the second application is independent of the first application.
In this embodiment of the present application, a first application in the electronic device (for example, system software/a service/a module/a system application of the electronic device, such as the camera provider below) determines, according to the folding form of the folding screen (the first screen) of the electronic device, the physical camera corresponding to a given shooting view angle identifier (that is, a logical camera identifier below, such as a logical front camera identifier or a logical rear camera identifier), so that the second application can collect images/videos through the determined physical camera. Therefore, the developer of the second application does not need to write instructions that determine, according to the folding form of the folding screen of the electronic device, which physical camera responds to the same shooting view angle identifier, which reduces development cost.
In a possible implementation of the first aspect, the determining, by the first application, the physical camera for completing the shooting instruction according to the shooting view angle identifier, the folding form of the first screen, and the pre-stored correspondence between each physical camera and each shooting view angle identifier for the different folding forms of the first screen includes: the first application determines that the shooting instruction is to be completed through the first physical camera when the shooting view angle identifier is the front shooting identifier and the first screen is in the unfolded state; and the first application determines that the shooting instruction is to be completed through the second physical camera when the shooting view angle identifier is the front shooting identifier and the first screen is in the folded state.
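The selection logic above can be sketched as a simple table lookup. This is a hypothetical illustration, not the patent's actual implementation; the identifier values (front shooting = 1) and the camera names are assumptions following the examples given later in the description:

```python
# Hypothetical sketch of the first application's camera selection.
# Identifier values follow the description's examples: front = 1, rear = 0.
FRONT, REAR = 1, 0
UNFOLDED, FOLDED = "unfolded", "folded"

# Pre-stored correspondence between (view angle identifier, folding form)
# and the physical camera(s); names here are assumed for illustration.
CORRESPONDENCE = {
    (FRONT, UNFOLDED): ["first_physical_camera"],
    (FRONT, FOLDED): ["second_physical_camera"],
}

def determine_camera(view_id, fold_state):
    """Return the physical camera(s) that complete the shooting instruction."""
    return CORRESPONDENCE[(view_id, fold_state)]
```

The second application only ever passes the view angle identifier; the fold-state branch lives entirely in this lookup.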
In a possible implementation of the first aspect, the method further includes: when the folding form of the first screen changes, the first application switches the physical camera for completing the shooting instruction.
In a possible implementation of the first aspect, the switching, by the first application, of the physical camera for completing the shooting instruction when the folding form of the first screen changes includes: when the first screen is switched from the unfolded state to the folded state, the first application switches the physical camera for completing the shooting instruction from the first physical camera to the second physical camera; and when the first screen is switched from the folded state to the unfolded state, the first application switches the physical camera for completing the shooting instruction from the second physical camera to the first physical camera.
In other words, if the folding form of the first screen of the electronic device changes while the second application is collecting images or videos through the determined physical camera, the first application can switch to the physical camera corresponding to the changed folding form. Therefore, after the folding form of the electronic device changes, the second application still collects images or videos through the physical camera corresponding to the current folding form of the electronic device, which improves user experience.
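The switching behavior can be sketched as a small state machine that reacts to fold-state change events. This is a hypothetical illustration under the same assumed camera names as above; the patent does not specify this code:

```python
# Hypothetical sketch: the first application reacts to a fold-state change
# by switching the active physical camera, so the second application keeps
# receiving frames from the camera that matches the current form.
class CameraSession:
    # (current camera, new fold state) -> camera to switch to
    SWITCH_MAP = {
        ("first_physical_camera", "folded"): "second_physical_camera",
        ("second_physical_camera", "unfolded"): "first_physical_camera",
    }

    def __init__(self, active_camera):
        self.active_camera = active_camera

    def on_fold_state_changed(self, new_state):
        # Switch only when the current camera no longer matches the form.
        target = self.SWITCH_MAP.get((self.active_camera, new_state))
        if target is not None:
            self.active_camera = target
        return self.active_camera
```

Keeping the switch inside the session means the second application's capture loop never observes the form change directly.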
In a possible implementation of the first aspect, the electronic device further includes at least one third physical camera disposed on a different side from the first screen, and the method further includes: the first application determines that the shooting instruction is to be completed through the third physical camera and/or the second physical camera when the shooting view angle identifier is the rear shooting identifier and the first screen is in the unfolded state; and the first application determines that the shooting instruction is to be completed through the third physical camera when the shooting view angle identifier is the rear shooting identifier and the first screen is in the folded state.
That is, when the shooting view angle of the second application is rear shooting (that is, when a logical rear camera is used to collect images or videos), the electronic device completes the shooting instruction through the third physical camera when the first screen is in the folded state, and through the third physical camera and/or the second physical camera when the first screen is in the unfolded state.
In a possible implementation of the first aspect, the method further includes:
when an exception of the first application occurs and is recovered, the first application re-determines the physical camera for completing the shooting instruction based on the shooting view angle identifier, the folding form of the first screen, and the pre-stored correspondence between each physical camera and each shooting view angle identifier for the different forms of the first screen.
That is, when an exception of the first application occurs and is recovered, for example after the first application is restarted because it can no longer provide its service, the first application can re-determine the physical camera for completing the shooting instruction based on the shooting view angle identifier, the folding form of the first screen, and the pre-stored correspondence between each physical camera and each shooting view angle identifier for the different forms of the first screen. This avoids the situation in which the second application cannot collect images or videos through the correct physical camera because the first application was restarted, which improves user experience.
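The recovery path can be sketched as re-reading the current fold state on restart rather than trusting any pre-crash choice, since the form may have changed while the first application was down. A hypothetical illustration; the function and parameter names are assumptions:

```python
# Hypothetical sketch of recovery after the first application restarts:
# re-read the pending view angle identifier and the *current* fold state,
# then re-determine the physical camera from the stored correspondence.
def recover(pending_view_id, read_fold_state, correspondence):
    fold_state = read_fold_state()  # query the fold state again (it may have changed)
    return correspondence[(pending_view_id, fold_state)]
```
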
In a possible implementation of the first aspect, the method further includes:
the first application obtains a folded form of the first screen from the third application.
In a possible implementation of the first aspect, the operating system of the electronic device includes a hardware abstraction layer and an application framework layer, where the first application is located in the hardware abstraction layer and the third application is located in the application framework layer.
In a possible implementation of the first aspect, the second application includes any one of the following applications: camera applications, instant messaging applications, browser applications, video conferencing applications.
In a second aspect, an embodiment of the present application provides a readable medium, where the readable medium includes instructions that, when executed by a processor of an electronic device, cause the electronic device to implement the shooting method of the first aspect or any possible implementation of the first aspect.
In a third aspect, an embodiment of the present application provides an electronic device, including: a plurality of physical cameras;
a memory for storing instructions for execution by one or more processors of the electronic device; and at least one processor configured to execute instructions to cause the electronic device to select at least one of the plurality of physical cameras to complete the shooting instruction by any one of the shooting methods provided in the first aspect and possible implementations of the first aspect.
Drawings
Fig. 1 illustrates a schematic structural diagram of a foldable cellular phone 100, according to some embodiments of the present application;
FIG. 2A illustrates a schematic view of the front side (A side of FIG. 1) of a cellular telephone 100 in an unfolded state, according to some embodiments of the present application;
FIG. 2B illustrates a schematic view of the back side (B/C side of FIG. 1) of the handset 100 in an unfolded state, according to some embodiments of the application;
Fig. 3A illustrates a schematic view of the front side (C side shown in fig. 1) of a cellular phone 100 in a folded state, according to some embodiments of the present application;
FIG. 3B illustrates a schematic view of the back side (B side of FIG. 1) of a cellular phone 100 in a folded state, according to some embodiments of the present application;
FIG. 4A illustrates a schematic view of a user's self-timer scene with the handset 100 in an unfolded state, according to some embodiments of the application;
fig. 4B illustrates a schematic view of a user's self-timer scene with the handset 100 in a folded state, according to some embodiments of the application;
FIG. 5 illustrates a flow diagram of a method of capturing images, according to some embodiments of the present application;
fig. 6 illustrates a software architecture diagram of a mobile phone 100, according to some embodiments of the present application;
FIG. 7 illustrates an interactive process diagram of a shooting method, according to some embodiments of the present application;
fig. 8A illustrates a display interface diagram of a mobile phone 100 when a user switches to a logical front-facing camera, according to some embodiments of the present application;
FIG. 8B illustrates a display interface diagram of a cell phone 100 when the camera 011 is activated, according to some embodiments of the present application;
FIG. 9 illustrates an interactive process diagram of a shooting method, according to some embodiments of the present application;
Fig. 10 illustrates a schematic diagram of a mobile phone 100, according to some embodiments of the present application.
Detailed Description
Illustrative embodiments of the present application include, but are not limited to, a photographing method, a readable medium, and an electronic device.
The following describes the technical scheme of the embodiments of the present application with reference to the accompanying drawings.
For easy understanding, an electronic device to which the photographing method provided in some embodiments of the present application is applicable will be described first.
It is understood that the physical camera mentioned in the present application refers to an entity camera installed on an electronic device or externally connected to the electronic device. In some embodiments, different physical cameras/camera units typically have different physical camera identifications through which applications in the electronic device may invoke corresponding physical cameras to capture image data.
It can be appreciated that, in contrast to the physical camera described above, a logical camera in this application is a conceptual camera defined for different usage scenarios. When a user collects images or videos through an application with an image or video capture function on the electronic device, the capturing directions of the physical cameras used differ, and the logical cameras can accordingly be divided into a logical front camera (for example, for a front shooting view angle, whose capturing direction faces the same side as the display screen currently used by the user) and a logical rear camera (for example, for a rear shooting view angle, whose capturing direction faces away from the display screen currently used by the user in the electronic device).
It will be appreciated that in other embodiments, the logic camera may also include other types of logic cameras, such as a logic depth camera for capturing 3-dimensional images, as determined by the function of the camera, without limitation.
Specifically, for example, fig. 1 illustrates a schematic structural diagram of a foldable cellular phone 100, according to some embodiments of the present application; FIG. 2A illustrates a schematic view of the front side (A side of FIG. 1) of a cellular telephone 100 in an unfolded state, according to some embodiments of the present application; FIG. 2B illustrates a schematic view of the back side (the B/C side of FIG. 1) of a cellular telephone 100 in an unfolded state, according to some embodiments of the present application; fig. 3A illustrates a schematic view of the front side (C side shown in fig. 1) of a cellular phone 100 in a folded state, according to some embodiments of the present application; fig. 3B illustrates a schematic view of the back side (B side shown in fig. 1) of the mobile phone 100 in a folded state, according to some embodiments of the present application.
When the mobile phone 100 is in the unfolded state, that is, when the folding screen 1941 is in the unfolded state, referring to fig. 1 and 2A, the logical front camera of the mobile phone 100 is a camera unit 1931, and referring to fig. 2B, the logical rear camera of the mobile phone 100 is a camera unit 1933; when the mobile phone 100 is in the folded state, that is, the folding screen 1941 is in the folded state, referring to fig. 3A, the logical front camera of the mobile phone 100 is the camera unit 1932, and referring to fig. 3B, the logical rear camera of the mobile phone 100 is the camera unit 1933. Therefore, the same physical camera can be used as a logic front camera or a logic rear camera under different use scenes.
It will be appreciated that a folding screen is a foldable, bendable display screen made of a flexible material, such as a flexible screen made of Organic Light-Emitting diodes (OLED), or the like.
It should be understood that the foregoing structure and configuration of the mobile phone 100 are merely examples; in other embodiments, the mobile phone 100 may be another type of foldable electronic device, for example one in which the B/C plane of the mobile phone 100 shown in fig. 1 folds along the M-M direction, which is not limited herein.
It will be appreciated that the aforementioned unfolded state and folded state of the mobile phone 100 are merely examples, and in some embodiments, when the included angle between the left half screen and the right half screen of the display 1941 is greater than the preset unfolded angle, it may be determined that the mobile phone 100 is in the unfolded state; when the included angle between the left half screen and the right half screen of the display 1941 is smaller than the preset folding angle, it is determined that the mobile phone 100 is in the folded state, which is not limited herein.
It should be understood that the foregoing distribution of the camera units of the mobile phone 100 is merely an example, and in other embodiments, the mobile phone 100 may include more or fewer camera units, and the camera units may be disposed at other positions of the mobile phone 100, which is not limited herein.
It is understood that a camera unit may include any number of cameras, for example, camera unit 1931 may include a conventional camera for capturing 2-dimensional images and a depth camera for capturing 3-dimensional images.
As described above, when the mobile phone 100 is in different forms, the logical front camera of the mobile phone 100 is a different physical camera, so an application program installed on the mobile phone 100 needs to select the corresponding physical camera as the logical front camera of the mobile phone 100 according to the form of the mobile phone 100. For example, referring to fig. 4A and 4B, when the user uses the camera application to take a self-portrait, the camera application selects the camera unit 1931 as the logical front camera when the mobile phone 100 is in the unfolded state, that is, it collects an image of the user through the camera unit 1931; when the mobile phone 100 is in the folded state, the camera application selects the camera unit 1932 as the logical front camera, that is, it collects an image of the user through the camera unit 1932.
That is, to ensure that an application program installed on the mobile phone 100 uses the corresponding physical camera to collect images or videos when the mobile phone 100 is in different forms (the folded state and the unfolded state), the developer of each application program that uses the cameras of the mobile phone 100 needs to integrate instructions that detect the form of the mobile phone 100 and select the corresponding physical camera as the logical camera according to that form, which increases the development cost of the application program. For example, still referring to fig. 4A and 4B, to ensure that during a self-portrait the camera application can, in both the unfolded state and the folded state, use the physical camera disposed on the same side as the display screen currently used by the user, the developer needs to integrate, in the camera application, instructions that obtain the form of the mobile phone 100 and determine, according to that form and the correspondence between each logical camera and each physical camera in the different forms of the mobile phone 100, the physical camera corresponding to each logical camera in the current form. Only then can the camera application collect images or videos through the camera unit 1931 when the mobile phone 100 is in the unfolded state and through the camera unit 1932 when the mobile phone 100 is in the folded state.
In view of this, an embodiment of the present application provides a shooting method in which system software of the mobile phone 100, such as a Camera Provider, establishes the correspondence between the logical front camera and the physical cameras, and between the logical rear camera and the physical cameras, for the different forms of the mobile phone 100. When an application program in the mobile phone 100 wants to collect images or videos through the logical front camera or the logical rear camera, it only needs to provide the identifier of the logical front camera or the logical rear camera to the system software; the system software then determines, according to the correspondence, the physical camera through which the application program collects images or videos, and starts that camera through its physical camera identifier. The developer of the application program therefore does not need to separately write, for the application program, instructions that obtain the form of the mobile phone 100 and determine the physical camera corresponding to each logical camera according to that form, which reduces the development cost of the application program.
For example, the mobile phone 100 includes three physical camera units, namely the camera unit 1931, the camera unit 1932 and the camera unit 1933. For the different forms of the mobile phone 100, the correspondence between each logical camera and each physical camera is shown in Table 1:
Table 1: correspondence between each logical camera and each physical camera for the different forms of the mobile phone 100

Form of mobile phone 100 | Logical front camera (identifier: 1) | Logical rear camera (identifier: 0)
Unfolded state           | Camera unit 1931                     | Camera unit 1933
Folded state             | Camera unit 1932                     | Camera unit 1933
As shown in Table 1, the system software of the mobile phone 100 may set the identifier of the logical front camera (the front shooting identifier) to 1 and the identifier of the logical rear camera (the rear shooting identifier) to 0. For example, referring to fig. 4A and 4B, in the scenario where the user uses the camera to take a self-portrait, the camera application only needs to provide the logical front camera identifier "1" to the system software of the mobile phone 100; the application program then collects images or videos through the camera unit 1931 when the mobile phone 100 is in the unfolded state shown in fig. 4A, and through the camera unit 1932 when the mobile phone 100 is in the folded state shown in fig. 4B. The camera application developer does not need to integrate, in the camera application, separate instructions for collecting images or videos through the camera unit 1931 in the unfolded state and through the camera unit 1932 in the folded state, which reduces the development cost of the camera application.
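Table 1 amounts to a lookup keyed by the phone's form and the logical camera identifier. A hypothetical sketch of that lookup, using the identifier values 1/0 and camera unit numbers from the table:

```python
# Table 1 expressed as a lookup; a hypothetical illustration, not code
# from the patent. Identifiers per the description: front = 1, rear = 0.
FRONT_ID, REAR_ID = 1, 0

TABLE_1 = {
    ("unfolded", FRONT_ID): "camera_unit_1931",
    ("unfolded", REAR_ID): "camera_unit_1933",
    ("folded", FRONT_ID): "camera_unit_1932",
    ("folded", REAR_ID): "camera_unit_1933",
}

def resolve(form, logical_id):
    """Map (form of mobile phone 100, logical camera id) to a physical camera unit."""
    return TABLE_1[(form, logical_id)]
```

Note that the rear identifier resolves to the same unit in both forms, while the front identifier does not, which is exactly why the mapping must live below the application.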
It is to be understood that the correspondence between each logical camera and each physical camera of the mobile phone 100 shown in Table 1 is merely an example; in other embodiments, other correspondences are also possible, for example, the logical rear camera may also include the camera unit 1932 when the mobile phone 100 is in the unfolded state, which is not limited herein.
It will be appreciated that setting the logical front camera identifier to 1 is merely an example; in other embodiments, the logical front camera identifier may be another identifier, which is not limited herein.
It will be appreciated that, in other embodiments, when the mobile phone 100 uses different physical cameras as the logical rear camera in its different forms, the system software of the mobile phone 100 may likewise provide a unified logical rear camera identifier, for example 0, for the physical cameras that may serve as the logical rear camera, so that an application program in the mobile phone 100 can collect images or videos through the corresponding physical camera in each form of the mobile phone 100 according to the logical rear camera identifier, which is not limited herein.
It will be appreciated that, in the system software of the mobile phone 100, the camera provider may provide the service through which applications in the mobile phone 100 access the interfaces of the various physical cameras/camera units of the mobile phone 100. The following describes the technical solution of the embodiments of the present application taking the camera provider as an example of the system software that implements it.
Specifically, fig. 5 illustrates a flow diagram of a method of capturing images implemented by a camera provider, according to some embodiments of the present application. As shown in fig. 5, the process includes the steps of:
s501: a request is received for an application to start a camera.
That is, the camera provider triggers the shooting method provided in the embodiment of the present application when receiving a request of the application program to start the camera.
It will be appreciated that the request sent by the application program to start the camera includes a logical camera identifier to be started, for example, a logical front camera identifier, a logical rear camera identifier, a logical depth camera identifier, and the like. In some embodiments, the logical front camera identification may be set to 1 and the logical rear camera identification to 0.
Specifically, for the scenario illustrated in fig. 4A and 4B, the start-camera request sent by the camera application and received by the camera provider may include the logical front camera identifier 1.
S502: the method comprises the steps of obtaining the form of the mobile phone 100, and determining a physical camera to be started according to the form of the mobile phone 100 and the logical camera identification.
That is, after receiving the start-camera request sent by an application program, the camera provider obtains the form of the mobile phone 100, and determines the physical camera corresponding to the logical camera identifier in the current form according to the form of the mobile phone 100, the logical camera identifier in the start-camera request, and the correspondence between each logical camera and each physical camera of the mobile phone 100 in different forms.
For example, for the scenario shown in fig. 4A and 4B, a logical front-facing camera identification 1 may be included in the start-up camera request sent by the camera application to the camera provider. Referring to fig. 4A, when detecting that the mobile phone 100 is in the unfolded state, the camera provider determines that the physical camera corresponding to the logical front camera identifier 1 is a camera unit 1931; referring to fig. 4B, when detecting that the mobile phone 100 is in the folded state, the camera provider determines that the physical camera corresponding to the logical front camera identifier 1 is the camera unit 1932.
It can be appreciated that, in other embodiments, in the case where the logical camera to be started by the application is the logical rear camera (identifier 0), the camera provider may determine that the physical camera corresponding to the logical camera identifier 0 is the camera unit 1933 when the mobile phone 100 is in the folded state, and that the physical camera corresponding to the logical camera identifier 0 is the camera unit 1933 and/or the camera unit 1932 when the mobile phone 100 is in the unfolded state. That is, in the unfolded state, the application program may collect an image or a video through the camera unit 1933 or the camera unit 1932 alone, or through collaborative shooting by the camera unit 1933 and the camera unit 1932.
S503: and starting the determined physical camera.
That is, the camera provider starts the corresponding physical camera according to the physical camera identifier of the physical camera determined in step S502, and provides an interface for the application program to collect images or videos through the physical camera.
For example, in the scenario shown in fig. 4A, the camera provider may activate the camera unit 1931 and provide an interface for the camera application to capture images or video through the camera unit 1931; in the scenario illustrated in fig. 4B, the camera provider may activate the camera unit 1932 and provide an interface for camera applications to capture images or video via the camera unit 1932.
It will be appreciated that after the camera provider activates the corresponding physical camera and provides an interface through which the application program captures images or video, the application program may capture images or video through the physical camera.
It should be understood that performing steps S501 to S503 by the camera provider is only an example; in other embodiments, the process may be performed by other applications/modules/services in the mobile phone 100, for example, application software at the application layer or a service at the application framework layer, which is not limited herein.
It should be understood that the foregoing execution sequence of steps S501 to S503 is only an example, and in other embodiments, other sequences may be adopted, and partial steps may be combined or split, which is not limited herein.
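For ease of understanding, the flow of steps S501 to S503 can be sketched as follows. This is a minimal illustration only: the form values "unfolded"/"folded" follow fig. 4A and 4B, while the table, function, and camera-unit names are hypothetical and do not correspond to the actual camera provider implementation.

```python
# Hypothetical correspondence between (form, logical camera identifier) and
# physical camera units, per the embodiment of fig. 4A/4B:
# logical front camera identifier = 1, logical rear camera identifier = 0.
CAMERA_MAP = {
    ("unfolded", 1): ["camera_unit_1931"],
    ("folded",   1): ["camera_unit_1932"],
    ("folded",   0): ["camera_unit_1933"],
    # In the unfolded state the rear units may shoot collaboratively.
    ("unfolded", 0): ["camera_unit_1933", "camera_unit_1932"],
}

def handle_start_camera_request(logical_id, get_form):
    """S501: a start-camera request carrying logical_id is received.
    S502: obtain the form and look up the corresponding physical camera(s).
    S503: return the physical camera(s) the caller should start."""
    form = get_form()                        # S502: obtain the form of the phone
    return CAMERA_MAP[(form, logical_id)]    # S503: camera(s) to be started

# The same logical identifier 1 resolves to different physical cameras per form.
print(handle_start_camera_request(1, lambda: "unfolded"))  # ['camera_unit_1931']
print(handle_start_camera_request(1, lambda: "folded"))    # ['camera_unit_1932']
```

The key point matching the embodiment is that the application supplies only the logical identifier; the form lookup happens entirely inside the provider.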
By the method provided in the embodiment of the present application, an application in the mobile phone 100 can, through the same logical camera identifier, collect images or videos through the corresponding physical camera when the mobile phone 100 is in different forms, and the developer of the application does not need to integrate into the application instructions for determining the physical camera corresponding to the logical camera according to the form of the mobile phone 100, which reduces the development cost of the application.
For easy understanding, a software architecture of the mobile phone 100 suitable for the photographing method provided in the embodiment of the present application is described below.
Specifically, fig. 6 illustrates a software architecture diagram of a mobile phone 100, according to some embodiments of the present application.
Referring to fig. 6, the software architecture of the mobile phone 100 includes an application layer 01, an application framework layer 02, a hardware abstraction layer 03, and a kernel layer 04, wherein:
The application layer 01 includes applications of the mobile phone 100, such as a camera and/or third-party application 011, that can capture images or video through a camera of the mobile phone 100. The camera and/or third-party application 011 may access the interfaces of the physical cameras of the mobile phone 100 by sending a start-camera request to the camera service 025 in the application framework layer 02. In some embodiments, the camera and/or third-party application 011 includes, but is not limited to, a camera application, an instant messaging application, a browser application, a video conferencing application, and the like. As previously described, the start-camera request may include the identifier of the logical camera to be started, for example, the logical front camera identifier 1 or the logical rear camera identifier 0.
It will be appreciated that in other implementations, where there is only one physical camera corresponding to the logical camera, the start-up camera request may also include a physical camera identifier, which is not limited herein.
It will be appreciated that in other embodiments, the application layer 01 may include further applications, such as "information," "calendar," "memo," etc., without limitation.
The application framework layer 02 provides an application programming interface (application programming interface, API) and programming framework for the application of the application layer 01 so that the application can interact with services/applications/modules/drivers in the hardware abstraction layer 03 and the kernel layer 04. In some embodiments, the application framework layer 02 may include:
The screen folding management (HwFoldScreenManager) 021 may determine the form of the mobile phone 100 based on the sensor data acquired through the sensor service 022, and send the form to the system service management 023 so that each system service or application may acquire the folded (screen) form of the mobile phone 100.
In some embodiments, the screen folding management 021 may send a notification of the change in the form of the mobile phone 100 to each service/module in the system service management (SystemManager) 023 in the case where it determines that the form of the mobile phone 100 has changed, for example, that the mobile phone 100 has switched from the unfolded state to the folded state, so that the system service/application may respond according to the notification. For example, upon receiving a notification that the mobile phone 100 has switched from the unfolded state to the folded state, the camera provider 032 switches the physical camera corresponding to the logical front camera identifier 1 from the camera unit 1931 to the camera unit 1932.
The sensor service 022 is used to provide, to applications/services in the mobile phone 100, an interface for acquiring sensor data of the mobile phone 100. In some embodiments, the sensor service 022 may include a posture sensor service (Posture Sensor) 0221, a hinge sensor service (Hinge Sensor) 0222, and a hall sensor service (Hall Sensor) 0223. The posture sensor service 0221 can acquire gyroscope data and thereby determine the rotation angle of the mobile phone 100. The hinge sensor service 0222 can determine the form of the mobile phone 100, such as the folding angle, through the angle determination unit 031 and the hinge sensor driver 041. The hall sensor service 0223 may obtain hall sensor data of the mobile phone 100 through the hall sensor driver 042; in some embodiments, the hall sensor data may be used to determine the form of the mobile phone 100.
It is understood that in other embodiments, the sensor service 022 may also include more or fewer sensor services, and is not limited herein.
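As an illustration of how hinge sensor data may be mapped to a form, the following sketch classifies a fold angle into the folded or unfolded state, in the spirit of the angle determination unit 031. The 90-degree threshold, the angle convention (0 degrees = fully folded, 180 degrees = flat), and the function name are assumptions for illustration; the embodiment does not specify concrete thresholds.

```python
# Hypothetical boundary between the folded and unfolded states; the actual
# value used by the angle determination unit 031 is not given in the text.
FOLD_THRESHOLD_DEG = 90

def form_from_hinge_angle(angle_deg):
    """Classify the phone's form from the hinge angle reported by the
    hinge sensor driver 041 (assumed range: 0 = folded shut, 180 = flat)."""
    if not 0 <= angle_deg <= 180:
        raise ValueError("hinge angle out of range")
    return "unfolded" if angle_deg >= FOLD_THRESHOLD_DEG else "folded"

print(form_from_hinge_angle(170))  # unfolded
print(form_from_hinge_angle(15))   # folded
```

In a real system the raw angle would typically also be debounced or combined with hall sensor data before a form change is reported, which this sketch omits.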
The system service management 023 is configured to manage and run various services in the mobile phone 100. For example, in some embodiments, the system service management 023 may include a camera service agent (HwCameraServiceProxy) 0231 and a camera folding screen extension module (HwCameraFoldExtension) 0232. The camera service agent 0231 is used to start the camera service (Camera Service) 025. The camera folding screen extension module 0232 is used to acquire the form of the mobile phone 100 from the screen folding management 021 and send the form of the mobile phone 100 to the camera post-processing 024.
It is understood that in other embodiments, the system service management 023 may include more or fewer services, and embodiments of the present application are not limited in this respect.
The camera post-processing 024 is used to provide an interaction interface between the camera folding screen extension module 0232 and the hardware abstraction layer (Hardware Abstraction Layer, HAL) 03. For example, the camera post-processing 024 may receive the form of the mobile phone 100 sent by the camera folding screen extension module 0232 and forward it to the camera provider 032 in the hardware abstraction layer 03 via the HAL interface definition language (HAL Interface Definition Language, HIDL).
The camera service 025 is configured to provide, to the applications in the application layer 01, an interface for accessing the camera provider 032. For example, in some embodiments, the camera service 025 may forward the identifier of the camera to be started (for example, the logical front camera identifier 1 or the logical rear camera identifier 0), sent to it by the camera/third-party application 011, to the camera provider 032.
It will be appreciated that in other embodiments, more or fewer modules may be included in the application framework layer 02, and embodiments of the present application are not limited thereto.
The hardware abstraction layer 03 is used to provide the application framework layer 02 or the application layer 01 with an access interface to the hardware of the mobile phone 100. For example, in some embodiments, the hardware abstraction layer 03 may include an angle determination unit 031 and a camera provider 032. The angle determination unit 031 is used for determining the folding angle of the mobile phone 100 according to the data of the hinge sensor. The camera provider 032 is configured to determine the physical camera corresponding to the logical camera identifier according to the logical camera identifier or physical camera identifier sent by the application program or the camera service 025 in the application layer 01, the form of the mobile phone 100 acquired from the camera post-processing 024, and the correspondence between each logical camera and each physical camera, and to provide the application layer 01 with an access interface to the physical camera.
It will be appreciated that in other embodiments, the hardware abstraction layer 03 may also include more modules, such as Bluetooth, audio, and video modules, which are not limited herein.
The kernel layer 04 is configured to provide an access interface for hardware of the mobile phone 100 to the hardware abstraction layer 03, the application framework layer 02 and the application layer 01, so that the hardware abstraction layer 03, the application framework layer 02 and the application layer 01 can obtain data from the hardware of the mobile phone 100 or send data to the hardware of the mobile phone 100 through the interface. For example, in some embodiments, the kernel layer 04 may include a hinge sensor driver 041, a hall sensor driver 042, a camera unit 1931 driver 043, a camera unit 1932 driver 044, and a camera unit 1933 driver 045, each of which may be used to drive corresponding hardware so that applications/services/modules in the handset 100 may obtain data from or send data through the corresponding hardware. For example, the camera unit 1931 driver 043 may provide image data collected by the camera unit 1931 to the camera/third party application 011.
It will be appreciated that the software architecture of the mobile phone 100 shown in fig. 6 is merely an example, and in other embodiments, the software architecture of the mobile phone 100 may include more or fewer modules, merge or split partial modules, change the positions of partial modules, and use other software architectures, which are not limited in this application.
In order to make the technical solution of the present application clearer, the following describes an interaction flow of the shooting method provided in the embodiment of the present application in combination with a software architecture diagram of the mobile phone 100 shown in fig. 6.
Specifically, fig. 7 illustrates an interaction flow diagram of a shooting method according to some embodiments of the present application. As shown in fig. 7, the interactive flow includes the following steps.
S701: the system service management 023 initiates a camera service 025. That is, the system service manager 023 initiates the camera service 025 through the camera service agent 0231 when the handset 100 is started.
It will be appreciated that in some embodiments, after the camera service 025 is started, the application in the handset 100 may send a request to start the camera to the camera provider 032 via the camera service 025.
S702: the system service manager 023 initiates a folding screen service 021. That is, when the mobile phone 100 is started, the system service manager 023 starts the screen folding manager 021, so that other services/modules/application programs can acquire the form of the mobile phone 100 provided by the screen folding manager 021 by registering the monitoring service in the screen folding manager 021.
It will be appreciated that in some embodiments, the system service management 023 may also start, when the mobile phone 100 is powered on, other services that implement related functions of the mobile phone 100, such as the camera folding screen extension module 0232, which is not limited herein.
S703: the system service management 023 sends a startup completion notification to the camera service agent 0231.
That is, the system service management 023 sends a startup completion notification to the camera service agent 0231 when the startup of the mobile phone 100 is complete, that is, when the related services in the mobile phone 100, for example, the camera service 025 and the screen folding management 021, have all been started.
It will be appreciated that in some embodiments, the system service management 023 may also send the startup completion notification to other services/applications/modules, for example, to the camera service 025, the camera provider 032, or the camera folding screen extension module 0232.
S704: the camera service agent 0231 transmits a startup completion notification to the camera folding screen extension module 0232.
That is, after receiving the notification that the startup of the mobile phone 100 is complete, sent by the system service management 023, the camera service agent 0231 forwards the startup completion notification to the camera folding screen extension module 0232.
It will be appreciated that in other embodiments, the startup completion notification may also be sent by the system service management 023 directly to the camera folding screen extension module 0232, which is not limited herein.
S705: the camera folding screen extension module 0232 registers the modality of the listening mobile phone 100 with the screen folding management 021.
The camera folding screen expansion module 0232, after receiving the notification that the mobile phone 100 has been started, determines that the screen folding management 021 has been started, registers the form of monitoring the mobile phone 100 with the screen folding management 021.
It can be understood that the camera folding screen expansion module 0232 can acquire the form of the mobile phone 100 through the screen folding management 021 after registering the form of the monitoring mobile phone 100 with the screen folding management 021.
It can be appreciated that, in some embodiments, after the camera folding screen extension module 0232 registers the morphology of the mobile phone 100 with the screen folding management 021, the screen folding management 021 actively sends a notification of the morphology change of the mobile phone 100 to the camera folding screen extension module 0232 when the morphology of the mobile phone 100 changes.
S706: the camera folding screen extension module 0232 obtains the form of the cell phone 100 from the screen folding management 021 and sends to the camera post-processing 024.
The camera folding screen expansion module 0232 acquires the form of the mobile phone 100 after the screen folding management 021 registers the form of the monitoring mobile phone 100 successfully, and sends the form to the camera post-processing 024.
S707: the camera post-processing 024 sends the modality of the handset 100 to the camera provider 032.
That is, when the camera post-processing 024 receives the form of the mobile phone 100, it forwards the received form to the camera provider 032.
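The registration and propagation of the form in steps S705 to S707 can be sketched as follows, with simplified stand-ins for the screen folding management 021 and the camera folding screen extension module 0232. All class, method, and variable names here are hypothetical illustrations, not the actual service interfaces.

```python
class ScreenFoldManager:                 # stands in for screen folding management 021
    def __init__(self, form="unfolded"):
        self._form, self._listeners = form, []
    def register_listener(self, cb):     # S705: register to listen for the form
        self._listeners.append(cb)
    def get_form(self):                  # S706: query the current form
        return self._form
    def set_form(self, form):            # S711: on a form change, notify listeners
        if form != self._form:
            self._form = form
            for cb in self._listeners:
                cb(form)

received = []                            # forms that reach camera post-processing 024

def forward_to_post_processing(form):    # extension module 0232 forwards the form;
    received.append(form)                # 024 would then pass it on via HIDL (S707)

mgr = ScreenFoldManager()
mgr.register_listener(forward_to_post_processing)   # S705
forward_to_post_processing(mgr.get_form())          # S706: initial form
mgr.set_form("folded")                              # later change (S711/S712)
print(received)  # ['unfolded', 'folded']
```

The listener pattern means the extension module queries the form once at registration and thereafter receives changes passively, matching the flow of fig. 7.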
S708: the camera 011 sends a request to activate a camera to the camera service 025.
That is, the camera 011 sends a request to activate the camera to the camera service 025. The request for starting the camera may include a logical camera identifier of the camera to be started, for example, in the case where the camera to be started by the camera 011 is a logical front-end camera, the request for starting the camera may include a logical front-end camera identifier, for example, 1.
Specifically, referring to fig. 8A, the camera 011, upon detecting a user operation on the control 81 for switching to the logical front camera, sends a request to start the logical front camera to the camera service 025, which may include the logical front camera identifier 1.
It will be appreciated that, in some embodiments in which the default logical camera of the camera 011, or the logical camera in use when the camera 011 last exited, is the logical front camera, referring to fig. 8B, after the user starts the camera 011 by clicking the camera 011 icon 82, the camera 011 sends a request to start the logical front camera to the camera service 025.
S709: the camera service 025 sends a request to the camera provider 032 to activate the camera.
The camera service 025, upon receiving a request to activate the camera sent by the camera 011, forwards the request to the camera provider 032.
S710: the camera provider 032 activates the corresponding physical camera according to the camera identification and the morphology of the handset 100.
The camera provider 032 determines that the physical camera to be started is a camera unit 1931 or a camera unit 1932 according to the camera identifier to be started in the camera starting request, for example, a logic front camera identifier 1; according to the correspondence between each logic front camera and the physical camera of the mobile phone 100 in different forms, the starting camera unit 1931 or the camera unit 1932 is determined, for example, referring to fig. 4A, when the mobile phone 100 is in an unfolded state, the starting physical camera is determined to be the camera unit 1931, for example, referring to fig. 4B, and when the mobile phone 100 is in a folded state, the starting physical camera is determined to be the camera unit 1932; and finally, starting the determined physical camera, and providing an interface for accessing the determined physical camera to the camera 011 so that the camera 011 can acquire images or videos from the physical camera.
S711: the screen fold management 021 detects whether the form of the mobile phone 100 is changed.
That is, the screen fold management 021 detects whether the form of the mobile phone 100 is changed. If the change of the form of the mobile phone 100 is detected, indicating that a new physical camera needs to be switched, turning to step S712, and sending the form of the mobile phone 100 to the camera folding screen expansion module 0232; otherwise, it is described that a new physical camera is not required to be switched, and whether the form of the mobile phone 100 is changed is continuously detected.
S712: the camera folding screen extension module 0232 sends the form of the cell phone 100 to the camera post-processing 024.
S713: the camera post-processing 024 sends the modality of the handset 100 to the camera provider 032.
S714: the camera provider 032 switches the physical camera according to the form of the mobile phone 100.
That is, the camera provider 032 switches the physical camera used by the camera 011 according to the received form of the mobile phone 100. For example, when the mobile phone 100 switches from the folded state to the unfolded state, the physical camera is switched from the camera unit 1932 to the camera unit 1931; when the mobile phone 100 switches from the unfolded state to the folded state, the physical camera is switched from the camera unit 1931 to the camera unit 1932.
It can be appreciated that in some embodiments, the camera provider 032 may also actively obtain the form of the mobile phone 100 from the camera folding screen extension module 0232, detect whether the form of the mobile phone 100 has changed, and switch the physical camera after detecting the change.
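The switching behavior of step S714 can be sketched as follows for the logical front camera. The mapping and class names are illustrative assumptions; a real implementation would additionally stop and start the corresponding camera drivers rather than just update a field.

```python
# Hypothetical mapping for the logical front camera (identifier 1),
# per fig. 4A/4B of the embodiment.
FRONT_CAMERA_BY_FORM = {"unfolded": "camera_unit_1931", "folded": "camera_unit_1932"}

class CameraProvider:
    """Toy stand-in for the camera provider 032's S714 switching logic."""
    def __init__(self):
        self.active = None
    def on_form_changed(self, form):
        target = FRONT_CAMERA_BY_FORM[form]
        if target != self.active:        # switch only when the backing unit differs
            self.active = target         # real code would tear down/bring up drivers
        return self.active

provider = CameraProvider()
print(provider.on_form_changed("folded"))    # camera_unit_1932
print(provider.on_form_changed("unfolded"))  # camera_unit_1931
```

Because the application holds only the logical identifier, this switch is transparent to the camera 011: its preview simply continues from the newly selected physical camera.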
It should be understood that the foregoing execution sequence of steps S701 to S714 is merely an example, and in other embodiments, the execution sequence of each step may be adjusted, for example, the execution sequence of step S701 and step S702 may be interchanged, or part of the steps may be combined or split, which is not limited in the embodiments of the present application.
By the shooting method provided in the embodiment of the present application, when the mobile phone 100 is in different forms, an application in the mobile phone 100 can, through the same logical camera identifier, collect images or videos through the physical camera corresponding to the logical camera, and the developer of the application does not need to integrate into the application instructions for determining the physical camera corresponding to the logical camera according to the form of the mobile phone 100, which reduces the development cost of the application.
The foregoing embodiment describes the interaction process of shooting by the mobile phone 100 under the condition that each service/application program of the mobile phone 100 operates normally. However, in some embodiments, some modules of the mobile phone 100 may be abnormally restarted, become unresponsive, or freeze during operation. In order to ensure that the mobile phone 100 can still start the corresponding physical camera according to its form based on an application's start-camera request even when some services are abnormal, the embodiment of the present application further provides a shooting method in which, after an abnormality occurs in some modules of the mobile phone 100, the form of the mobile phone 100 is re-acquired upon recovery of those services, and the corresponding physical camera is started according to the form of the mobile phone 100. The following describes the technical solution of the embodiment of the present application by taking, as an example, the case where the camera provider 032 becomes abnormal while the camera 011 is using the camera.
Specifically, fig. 9 illustrates an interaction process schematic of a shooting method according to some embodiments of the present application. As shown in fig. 9, the interactive process includes the following steps.
S901: the camera provider 032 sends a notification of an online or offline to the camera service 025.
That is, in the case where the camera provider 032 cannot continue providing the service due to an abnormality, for example, where the camera provider 032 restarts or freezes due to a software failure, the camera provider sends an offline notification to the camera service 025, and upon recovery from the abnormality, sends an online notification to the camera service 025.
It will be appreciated that in other embodiments, in the event that the camera provider 032 cannot send an offline notification to the camera service 025, the offline notification of the camera provider 032 may also be sent to the camera service 025 by other modules in the handset 100, which is not limited herein.
S902: the camera service 025 sends the status of the camera provider 032 to the camera service proxy 0231.
That is, when the camera service 025 receives the online or offline notification of the camera provider 032, it sends the status of the camera provider 032 to the camera service agent 0231.
It will be appreciated that in some embodiments, the status of the camera provider 032 may include that the camera provider 032 is online and that the camera provider 032 is offline.
S903: the camera service agent 0231 transmits the status of the camera provider to the camera folding screen extension module 0232.
That is, the camera service agent 0231, upon receiving the status of the camera provider 032, forwards the status of the camera provider 032 to the camera folding screen extension module 0232.
S904: the camera folding screen extension module 0232 detects whether the status of the camera provider 032 is online. If so, the method indicates that the mobile phone 100 is required to be provided to the camera provider 032 again, and the process goes to step S905; otherwise, the camera provider 032 is indicated to be malfunctioning and not processing.
S905: the camera folding screen extension module 0232 obtains the form of the mobile phone 100 from the screen folding management 021 and sends the form of the mobile phone 100 to the camera post-processing 024.
That is, when the camera provider 032 comes online, the camera folding screen extension module 0232 acquires the form of the mobile phone 100 from the screen folding management 021 and sends the form of the mobile phone 100 to the camera post-processing 024.
S906: the camera post-processing 024 sends the modality of the handset 100 to the camera provider 032.
That is, the camera post-processing 024, after receiving the form of the mobile phone 100, forwards the form of the mobile phone 100 to the camera provider 032.
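The recovery handling of steps S904 to S906 can be sketched as follows. The status strings and function names are assumptions for illustration; the point is only that a re-send of the form is triggered by the online status and suppressed by the offline status.

```python
def on_provider_status(status, get_form, send_form_to_provider):
    """S904: react to the camera provider's status.
    On "online", re-acquire the form (S905) and push it toward the
    provider via camera post-processing (S905/S906); on "offline",
    the provider is faulty and nothing is done."""
    if status == "online":
        form = get_form()              # S905: re-acquire from screen folding mgmt 021
        send_form_to_provider(form)    # S905/S906: forward via camera post-processing 024
        return form
    return None                        # offline: no processing

sent = []
on_provider_status("online", lambda: "folded", sent.append)   # provider recovered
on_provider_status("offline", lambda: "folded", sent.append)  # provider faulty
print(sent)  # ['folded']
```

This makes the provider's view of the form self-healing: whatever state it lost while offline is refreshed the moment it reports online.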
S907: the camera provider 032 starts the corresponding physical camera according to the logical camera identification transmitted by the camera 011 and the form of the mobile phone 100.
For example, when the camera identifier sent by the camera 011 is 1, that is, when the camera to be started is the logical front camera, the camera provider 032 starts the camera unit 1931 when the mobile phone 100 is in the unfolded state, and starts the camera unit 1932 when the mobile phone 100 is in the folded state.
It should be understood that the foregoing execution of steps S901 to S907 is merely an example, and in other embodiments, the execution order of the steps may be adjusted, or part of the steps may be combined or split, which is not limited herein.
According to the shooting method provided in the embodiment of the present application, after the camera provider 032 recovers from an abnormality, the form of the mobile phone 100 is re-acquired and sent to the camera provider 032, so that after the abnormality is eliminated, the camera provider 032 can start the corresponding physical camera based on the identifier of the logical camera to be started by the camera 011, the form of the mobile phone 100, and the correspondence between each logical camera and each physical camera of the mobile phone 100 in different forms, which improves the user experience.
It will be appreciated that, in other embodiments, after an abnormality occurs in another module/service of the mobile phone 100 and is recovered, the form of the mobile phone 100 may be sent to the camera provider 032 by a similar method, so as to ensure that after the abnormality is eliminated, the camera provider 032 can start the corresponding physical camera based on the identifier of the logical camera to be started by the camera 011, the form of the mobile phone 100, and the correspondence between each logical camera and each physical camera of the mobile phone 100 in different forms, which is not limited herein.
Further, fig. 10 illustrates a schematic diagram of a mobile phone 100, according to some embodiments of the present application. As shown in fig. 10, the mobile phone 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (universal serial bus, USB) interface 130, a charge management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, keys 190, a motor 191, an indicator 192, a camera 193, a display 194, a subscriber identity module (subscriber identification module, SIM) card interface 195, and the like. The sensor module 180 may include a pressure sensor 180A, a gyro sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
The processor 110 may include one or more processing units, such as: the processor 110 may include an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, and/or a neural network processor (neural-network processing unit, NPU), etc. Wherein the different processing units may be separate devices or may be integrated in one or more processors.
The controller can generate operation control signals according to the instruction operation codes and the time sequence signals to finish the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that the processor 110 has just used or used cyclically. If the processor 110 needs to use the instruction or data again, it can be called directly from the memory. This avoids repeated accesses and reduces the waiting time of the processor 110, thereby improving system efficiency. In some embodiments, the processor 110 may call and execute the execution instructions of the shooting method provided in the embodiments of the present application stored in the memory, so as to implement the shooting method provided in the embodiments of the present application.
In some embodiments, the processor 110 may include one or more interfaces. The interfaces may include an integrated circuit (inter-integrated circuit, I2C) interface, an integrated circuit built-in audio (inter-integrated circuit sound, I2S) interface, a pulse code modulation (pulse code modulation, PCM) interface, a universal asynchronous receiver transmitter (universal asynchronous receiver/transmitter, UART) interface, a mobile industry processor interface (mobile industry processor interface, MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (subscriber identity module, SIM) interface, and/or a universal serial bus (universal serial bus, USB) interface, among others.
The I2C interface is a bi-directional synchronous serial bus comprising a serial data line (serial data line, SDA) and a serial clock line (serial clock line, SCL). In some embodiments, the processor 110 may contain multiple sets of I2C buses. The processor 110 may be coupled to the touch sensor 180K, a charger, a flash, the camera 193, etc., respectively, through different I2C bus interfaces. For example, the processor 110 may be coupled to the touch sensor 180K through an I2C interface, so that the processor 110 and the touch sensor 180K communicate through the I2C bus interface to implement the touch function of the mobile phone 100.
The I2S interface may be used for audio communication. In some embodiments, the processor 110 may contain multiple sets of I2S buses. The processor 110 may be coupled to the audio module 170 via an I2S bus to enable communication between the processor 110 and the audio module 170. In some embodiments, the audio module 170 may transmit an audio signal to the wireless communication module 160 through the I2S interface, to implement a function of answering a call through the bluetooth headset.
PCM interfaces may also be used for audio communication to sample, quantize and encode analog signals. In some embodiments, the audio module 170 and the wireless communication module 160 may be coupled through a PCM bus interface. In some embodiments, the audio module 170 may also transmit audio signals to the wireless communication module 160 through the PCM interface to implement a function of answering a call through the bluetooth headset. Both the I2S interface and the PCM interface may be used for audio communication.
The UART interface is a universal serial data bus for asynchronous communications. The bus may be a bi-directional communication bus. It converts the data to be transmitted between serial communication and parallel communication. In some embodiments, a UART interface is typically used to connect the processor 110 with the wireless communication module 160. For example: the processor 110 communicates with a bluetooth module in the wireless communication module 160 through a UART interface to implement a bluetooth function. In some embodiments, the audio module 170 may transmit an audio signal to the wireless communication module 160 through a UART interface, to implement a function of playing music through a bluetooth headset.
The MIPI interface may be used to connect the processor 110 to peripheral devices such as a display 194, a camera 193, and the like. The MIPI interfaces include camera serial interfaces (camera serial interface, CSI), display serial interfaces (display serial interface, DSI), and the like. In some embodiments, processor 110 and camera 193 communicate through a CSI interface to implement the camera function of cell phone 100. The processor 110 and the display 194 communicate via a DSI interface to implement the display function of the handset 100.
The GPIO interface may be configured by software. The GPIO interface may be configured as a control signal or as a data signal. In some embodiments, a GPIO interface may be used to connect the processor 110 with the camera 193, the display 194, the wireless communication module 160, the audio module 170, the sensor module 180, and the like. The GPIO interface may also be configured as an I2C interface, an I2S interface, a UART interface, an MIPI interface, etc.
The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type-C interface, or the like. The USB interface 130 may be used to connect a charger to charge the mobile phone 100, or to transfer data between the mobile phone 100 and a peripheral device. It can also be used to connect a headset and play audio through the headset. The interface may also be used to connect other electronic devices, such as AR devices.
The charge management module 140 is configured to receive a charge input from a charger. The charging management module 140 may also supply power to the mobile phone 100 through the power management module 141 while charging the battery 142.
The power management module 141 is used for connecting the battery 142, and the charge management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140 to power the processor 110, the internal memory 121, the display 194, the camera 193, the wireless communication module 160, and the like.
The wireless communication function of the mobile phone 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals.
The mobile communication module 150 may provide a solution for wireless communication including 2G/3G/4G/5G, etc. applied to the handset 100. The mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA), etc. The mobile communication module 150 may receive electromagnetic waves from the antenna 1, perform processes such as filtering, amplifying, and the like on the received electromagnetic waves, and transmit the processed electromagnetic waves to the modem processor for demodulation. The mobile communication module 150 can amplify the signal modulated by the modem processor, and convert the signal into electromagnetic waves through the antenna 1 to radiate. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be provided in the same device as at least some of the modules of the processor 110.
The wireless communication module 160 may provide solutions for wireless communication applied to the handset 100, including wireless local area network (wireless local area networks, WLAN) (e.g., wireless fidelity (wireless fidelity, Wi-Fi) network), Bluetooth (bluetooth, BT), global navigation satellite system (global navigation satellite system, GNSS), frequency modulation (frequency modulation, FM), near field communication (near field communication, NFC), infrared (infrared, IR) technology, etc. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, demodulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, frequency-modulate and amplify it, and convert it into electromagnetic waves for radiation via the antenna 2.
The mobile phone 100 implements display functions through a GPU, a display 194, an application processor, and the like. The GPU is a microprocessor for image processing, and is connected to the display 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 194 is used to display images, videos, and the like. The display 194 includes a display panel. The display panel may employ a liquid crystal display (liquid crystal display, LCD), an organic light-emitting diode (organic light-emitting diode, OLED), an active-matrix organic light-emitting diode (active-matrix organic light-emitting diode, AMOLED), a flexible light-emitting diode (flexible light-emitting diode, FLED), a Mini-LED, a Micro-OLED, a quantum dot light-emitting diode (quantum dot light-emitting diode, QLED), or the like. In some embodiments, the cell phone 100 may include 1 or N display screens 194, N being a positive integer greater than 1. For example, in some implementations, the display 194 may include the aforementioned display 1941 disposed on the A-side and the display 1942 disposed on the C-side.
The camera 193 is used to capture still images or video. An optical image of an object is generated through the lens and projected onto the photosensitive element. The photosensitive element may be a charge coupled device (charge coupled device, CCD) or a complementary metal-oxide-semiconductor (complementary metal-oxide-semiconductor, CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, and then transfers the electrical signal to the ISP to be converted into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard format such as RGB or YUV. In some embodiments, the cell phone 100 may include 1 or N cameras 193, N being a positive integer greater than 1. For example, in some embodiments, the camera 193 may include a camera unit 1931, a camera unit 1932, and a camera unit 1933, and each camera unit may include at least one camera.
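As an arithmetic illustration of the DSP's final format-conversion step described above, the sketch below converts a single full-range BT.601 YUV pixel to RGB. The exact conversion matrix a given DSP uses is an assumption here, not something stated in this application:

```python
def yuv_to_rgb(y, u, v):
    """Convert one full-range BT.601 YUV pixel (0-255 per channel) to RGB.

    Illustrative only: real ISP/DSP pipelines may use different
    coefficients, value ranges, or fixed-point arithmetic.
    """
    d = u - 128  # chroma offset around the neutral value
    e = v - 128

    def clamp(x):
        return max(0, min(255, round(x)))

    r = clamp(y + 1.402 * e)
    g = clamp(y - 0.344136 * d - 0.714136 * e)
    b = clamp(y + 1.772 * d)
    return r, g, b
```

A neutral-chroma pixel (u = v = 128) maps to a grey pixel whose three RGB channels all equal y.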
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to extend the memory capabilities of the handset 100. The external memory card communicates with the processor 110 through an external memory interface 120 to implement data storage functions. For example, files such as music, video, etc. are stored in an external memory card.
The internal memory 121 may be used to store computer executable program code including instructions. The internal memory 121 may include a program memory area and a data memory area. The program storage area may store an operating system, application programs required for at least one function, and the like. The data storage area may store data created during use of the handset 100, etc. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (universal flash storage, UFS), and the like. The processor 110 executes various functional applications of the handset 100 by executing instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor 110. In some embodiments, the internal memory 121 may be used to store the correspondence between each logical camera and physical camera of the mobile phone 100 in different forms.
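The stored correspondence between logical cameras and physical cameras under different folding forms can be pictured as a small lookup table. The sketch below is illustrative only; the identifiers (view-angle and fold-form strings, camera names) are hypothetical and not taken from this application:

```python
# Hypothetical correspondence between a shooting view angle plus a folding
# form and the physical camera to start (all names are illustrative).
CAMERA_MAP = {
    ("front", "unfolded"): "first_physical_camera",   # same side as the first screen
    ("front", "folded"):   "second_physical_camera",  # same side as the second screen
    ("rear",  "unfolded"): "third_physical_camera",
    ("rear",  "folded"):   "third_physical_camera",
}

def select_physical_camera(view_angle: str, fold_form: str) -> str:
    """Pick the physical camera that should respond to a shooting
    instruction, given its view-angle identifier and the fold form."""
    return CAMERA_MAP[(view_angle, fold_form)]
```

Under this table, a front-shooting instruction is served by the first physical camera when the device is unfolded and by the second physical camera when it is folded, mirroring the selection described in the claims.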
The handset 100 may implement audio functions, such as music playing and recording, through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the earphone interface 170D, the application processor, and the like.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals.
The speaker 170A, also referred to as a "horn," is used to convert audio electrical signals into sound signals.
The receiver 170B, also referred to as an "earpiece", is used to convert the audio electrical signal into a sound signal.
The microphone 170C, also referred to as a "mic" or "sound transmitter", is used to convert sound signals into electrical signals.
The earphone interface 170D is used to connect a wired earphone.
The pressure sensor 180A is used to sense a pressure signal and may convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. There are various types of pressure sensors 180A, such as resistive pressure sensors, inductive pressure sensors, and capacitive pressure sensors. A capacitive pressure sensor may comprise at least two parallel plates of conductive material. When a force is applied to the pressure sensor 180A, the capacitance between the electrodes changes, and the handset 100 determines the strength of the pressure from the change in capacitance. When a touch operation acts on the display 194, the handset 100 detects the intensity of the touch operation through the pressure sensor 180A. The handset 100 may also calculate the position of the touch based on the detection signal of the pressure sensor 180A. In some embodiments, touch operations that act on the same touch location but with different touch operation intensities may correspond to different operation instructions.
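For intuition about how a capacitance change can map to a press strength, here is a toy parallel-plate model. The spring constant, plate area, and capacitance values are arbitrary illustrative assumptions, not values from this application:

```python
EPSILON_0 = 8.854e-12  # vacuum permittivity, F/m

def plate_gap(capacitance, area, eps_r=1.0):
    """Gap of an ideal parallel-plate capacitor, from C = eps_r*eps_0*A/d."""
    return eps_r * EPSILON_0 * area / capacitance

def press_force(c_rest, c_pressed, area, k=1000.0):
    """Toy estimate of press force: a press narrows the plate gap, raising
    the capacitance; the sensor stack is modelled as a spring, F = k*delta_d.
    The spring constant k is an arbitrary illustrative value."""
    return k * (plate_gap(c_rest, area) - plate_gap(c_pressed, area))
```

Doubling the capacitance halves the inferred gap, so a press that raises the capacitance yields a positive force estimate.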
The hall sensor 180C is used to sense a magnetic field change, and can convert the magnetic field change into an electrical signal. For example, in some embodiments, hall sensors may be disposed on both sides of the display 1941 such that the cell phone 100 may determine the angle between the left and right halves of the display 1941 based on the hall sensor data.
The hinge sensor 180D is used to obtain an angular change of the mobile phone 100 during folding or unfolding, so that the mobile phone 100 can determine the form of the mobile phone 100 according to the angular change.
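Determining the device form from the hinge angle, as described above, can be sketched as a simple threshold classification. The angle thresholds and form names below are assumptions for illustration:

```python
def fold_form(hinge_angle_deg, folded_below=60.0, unfolded_above=120.0):
    """Classify the folding form of the device from the hinge angle in
    degrees. The thresholds are illustrative assumptions, not values
    specified in this application."""
    if hinge_angle_deg <= folded_below:
        return "folded"
    if hinge_angle_deg >= unfolded_above:
        return "unfolded"
    return "half_folded"
```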
The acceleration sensor 180E can detect the magnitude of the acceleration of the mobile phone 100 in various directions (typically along three axes), and can detect the magnitude and direction of gravity when the mobile phone 100 is stationary. It can also be used to recognize the attitude of the electronic device, and is applied to landscape/portrait switching, pedometers, and other applications.
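As a sketch of the landscape/portrait switching mentioned above, the following compares the gravity components along the device's short (x) and long (y) axes. The axis convention is an assumption; real devices also consider the z axis and apply hysteresis:

```python
def orientation(ax, ay):
    """Infer portrait vs. landscape from the gravity components (m/s^2)
    along the device's x (short) and y (long) axes while stationary.
    Simplified sketch: gravity dominating the long axis means portrait."""
    return "portrait" if abs(ay) >= abs(ax) else "landscape"
```

Holding the phone upright puts most of gravity (about 9.8 m/s^2) on the y axis, so the sketch reports portrait.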
The ambient light sensor 180L is used to sense ambient light level. The cell phone 100 may adaptively adjust the brightness of the display 194 based on perceived ambient light levels. The ambient light sensor 180L may also be used to automatically adjust white balance when taking a photograph. The ambient light sensor 180L may also cooperate with the proximity light sensor 180G to detect if the handset 100 is in a pocket to prevent false touches.
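The adaptive brightness adjustment described above amounts to mapping ambient illuminance to a panel brightness. The response curve and the nit/lux limits below are illustrative assumptions, not values from this application:

```python
def display_brightness(lux, min_nits=2.0, max_nits=500.0, max_lux=10000.0):
    """Map ambient illuminance (lux) to a display brightness (nits).
    A square-root curve is used so brightness rises quickly in dim light;
    the curve shape and limits are illustrative assumptions."""
    frac = min(max(lux, 0.0), max_lux) / max_lux  # clamp to [0, 1]
    return min_nits + (max_nits - min_nits) * frac ** 0.5
```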
The fingerprint sensor 180H is used to collect a fingerprint. The mobile phone 100 can use the collected fingerprint characteristics to implement fingerprint unlocking, application-lock access, fingerprint photographing, fingerprint-based call answering, and the like. In the embodiments of the present application, the main space and the privacy space may use different unlocking fingerprints for verified login, so as to ensure the security of private data in the privacy space. On the mobile phone 100, the user can enter the main space and the privacy space by verifying different fingerprints, respectively.
The touch sensor 180K is also referred to as a "touch device". The touch sensor 180K may be disposed on the display screen 194, and together with the display screen 194 forms a touch screen, also called a "touchscreen". The touch sensor 180K is used to detect a touch operation acting on or near it. The touch sensor may pass the detected touch operation to the application processor to determine the type of the touch event. Visual output related to the touch operation may be provided through the display 194. In other embodiments, the touch sensor 180K may be disposed on the surface of the mobile phone 100 at a location different from that of the display 194.
The keys 190 include a power-on key, a volume key, etc. The keys 190 may be mechanical keys. Or may be a touch key. The handset 100 may receive key inputs, generating key signal inputs related to user settings and function control of the handset 100.
The motor 191 may generate a vibration cue.
The indicator 192 may be an indicator light, and may be used to indicate the charging state and changes in battery level, or to indicate messages, missed calls, notifications, and the like.
The SIM card interface 195 is used to connect a SIM card.
It should be understood that the structure of the mobile phone 100 shown in the embodiments of the present application does not constitute a specific limitation on the mobile phone 100. In other embodiments of the present application, the handset 100 may include more or fewer components than shown, or certain components may be combined, or certain components may be split, or different arrangements of components may be provided. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
It will be appreciated that the embodiments of the present application are described by taking the mobile phone 100 as an example, but the technical solutions of the embodiments of the present application are also applicable to other foldable electronic devices, including, but not limited to, laptop computers, tablet computers, wearable devices, head-mounted displays, mobile email devices, portable game consoles, portable music players, reader devices, and the like.
Embodiments of the mechanisms disclosed herein may be implemented in hardware, software, firmware, or a combination of these implementations. Embodiments of the present application may be implemented as a computer program or program code that is executed on a programmable system including at least one processor, a storage system (including volatile and non-volatile memory and/or storage elements), at least one input device, and at least one output device.
Program code may be applied to input instructions to perform the functions described herein and generate output information. The output information may be applied to one or more output devices in a known manner. For purposes of this application, a processing system includes any system having a processor, such as a digital signal processor (DSP), a microcontroller, an application-specific integrated circuit (ASIC), or a microprocessor.
The program code may be implemented in a high level procedural or object oriented programming language to communicate with a processing system. Program code may also be implemented in assembly or machine language, if desired. Indeed, the mechanisms described in the present application are not limited in scope to any particular programming language. In either case, the language may be a compiled or interpreted language.
In some cases, the disclosed embodiments may be implemented in hardware, firmware, software, or any combination thereof. The disclosed embodiments may also be implemented as instructions carried by or stored on one or more transitory or non-transitory machine-readable (e.g., computer-readable) storage media, which may be read and executed by one or more processors. For example, the instructions may be distributed over a network or through other computer-readable media. Thus, a machine-readable medium may include any mechanism for storing or transmitting information in a form readable by a machine (e.g., a computer), including but not limited to floppy diskettes, optical disks, compact disc read-only memories (CD-ROMs), magneto-optical disks, read-only memories (ROMs), random access memories (RAMs), erasable programmable read-only memories (EPROMs), electrically erasable programmable read-only memories (EEPROMs), magnetic or optical cards, flash memory, or tangible machine-readable memory used to transmit information over the Internet in the form of electrical, optical, acoustical, or other propagated signals (e.g., carrier waves, infrared signals, digital signals, etc.). Thus, a machine-readable medium includes any type of machine-readable medium suitable for storing or transmitting electronic instructions or information in a form readable by a machine (e.g., a computer).
In the drawings, some structural or methodological features may be shown in a particular arrangement and/or order. However, it should be understood that such a particular arrangement and/or ordering may not be required. Rather, in some embodiments, these features may be arranged in a different manner and/or order than shown in the illustrative figures. Additionally, the inclusion of structural or methodological features in a particular figure is not meant to imply that such features are required in all embodiments, and in some embodiments, may not be included or may be combined with other features.
It should be noted that, in the embodiments of the present application, each unit/module is a logical unit/module. Physically, one logical unit/module may be one physical unit/module, may be a part of one physical unit/module, or may be implemented by a combination of multiple physical units/modules; the physical implementation of the logical unit/module itself is not the most important, and the combination of functions implemented by these logical units/modules is the key to solving the technical problem posed by the present application. Furthermore, to highlight the innovative part of the present application, the above device embodiments do not introduce units/modules that are less closely related to solving the technical problem presented by the present application; this does not mean that the above device embodiments do not contain other units/modules.
It should be noted that in the examples and descriptions of this patent, relational terms such as first and second, and the like are used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Moreover, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
While the present application has been shown and described with reference to certain preferred embodiments thereof, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the present application.

Claims (12)

1. A shooting method applied to a foldable electronic device, characterized in that the electronic device comprises a first screen and a second screen respectively located on the front and the back of the electronic device in an unfolded state, the first screen being a folding screen, wherein at least one first physical camera is disposed on the same side of the electronic device as the first screen, and at least one second physical camera is disposed on the same side of the electronic device as the second screen; and
the method comprises the following steps:
a first application receives a first shooting instruction from a second application, wherein the first shooting instruction comprises a view angle identifier of front shooting;
in the case that the first screen is in the unfolded state, in response to the first shooting instruction, the first application starts the first physical camera;
in the case that the first screen is in the folded state, in response to the first shooting instruction, the first application starts the second physical camera;
the second application completes the first shooting instruction through a physical camera started by the first application;
wherein the second application is independent of the first application.
2. The method of claim 1, wherein the first application starting the first physical camera in response to the first shooting instruction with the first screen in the unfolded state comprises:
the first application determines, according to the view angle identifier of front shooting, the folding form of the first screen, and a pre-stored correspondence between each physical camera and the shooting view angle identifiers of the first screen in different folding forms, that the first physical camera responds to the first shooting instruction, and starts the first physical camera.
3. The method of claim 1, wherein the first application starting the second physical camera in response to the first shooting instruction with the first screen in the folded state comprises:
the first application determines, according to the view angle identifier of front shooting, the folding form of the first screen, and the pre-stored correspondence between each physical camera and the shooting view angle identifiers of the first screen in different folding forms, that the second physical camera responds to the first shooting instruction, and starts the second physical camera.
4. The method as recited in claim 1, further comprising:
in the case that the folding form of the first screen changes, the first application switches the started physical camera.
5. The method of claim 4, wherein the first application switching the started physical camera in the case that the folding form of the first screen changes comprises:
in the case that the first screen is switched from the unfolded state to the folded state, the first application switches the started physical camera from the first physical camera to the second physical camera;
in the case that the first screen is switched from the folded state to the unfolded state, the first application switches the started physical camera from the second physical camera to the first physical camera.
6. The method of claim 1, wherein the electronic device further comprises at least one third physical camera disposed on a different side from the first screen; and
The method further comprises the steps of:
the first application receives a second shooting instruction from the second application, wherein the second shooting instruction comprises a view angle identifier of rear shooting;
in the case that the first screen is in an unfolded state, in response to the second shooting instruction, the first application starts at least one of the third physical camera and the second physical camera;
in the case that the first screen is in a folded state, in response to the second shooting instruction, the first application starts the third physical camera;
and the second application completes the second shooting instruction through a physical camera started by the first application.
7. The method according to any one of claims 1 to 6, further comprising:
in the case that the physical camera completing the first shooting instruction becomes abnormal and then recovers, the first application restarts the physical camera.
8. The method according to any one of claims 1 to 6, further comprising:
the first application obtains the folding form of the first screen from a third application.
9. The method of claim 8, wherein the operating system of the electronic device comprises a hardware abstraction layer and an application framework layer; and
The first application is arranged at the hardware abstraction layer;
the third application is disposed on the application framework layer.
10. The method of claim 9, wherein the second application comprises any one of the following applications: camera applications, instant messaging applications, browser applications, video conferencing applications.
11. A readable medium, comprising instructions that, when executed by a processor of an electronic device, cause the electronic device to implement the shooting method of any one of claims 1 to 10.
12. An electronic device, comprising:
a plurality of physical cameras;
a memory for storing instructions for execution by one or more processors of the electronic device; and
at least one processor for executing the instructions to cause the electronic device to select at least one of the plurality of physical cameras to complete a shooting instruction by the shooting method of any one of claims 1 to 10.
CN202310369097.2A 2021-12-30 2021-12-30 Shooting method, readable medium and electronic device Pending CN116489494A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310369097.2A CN116489494A (en) 2021-12-30 2021-12-30 Shooting method, readable medium and electronic device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202310369097.2A CN116489494A (en) 2021-12-30 2021-12-30 Shooting method, readable medium and electronic device
CN202111651361.9A CN115022495B (en) 2021-12-30 2021-12-30 Photographing method, readable medium, and electronic device

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CN202111651361.9A Division CN115022495B (en) 2021-12-30 2021-12-30 Photographing method, readable medium, and electronic device

Publications (1)

Publication Number Publication Date
CN116489494A true CN116489494A (en) 2023-07-25




Also Published As

Publication number Publication date
CN115022495B (en) 2023-04-07
CN115022495A (en) 2022-09-06

Similar Documents

Publication Publication Date Title
CN114679537B (en) Shooting method and terminal
CN115022495B (en) Photographing method, readable medium, and electronic device
KR102422114B1 (en) Screen control method, electronic device, and storage medium
CN114125786B (en) Message synchronization method, readable medium and electronic device
CN112771900B (en) Data transmission method and electronic equipment
CN114115769B (en) Display method and electronic equipment
CN111182614B (en) Method and device for establishing network connection and electronic equipment
CN112751954B (en) Operation prompting method and electronic equipment
CN109523609B (en) Content editing method and terminal
CN113168461A (en) Method for deleting security service and electronic equipment
CN110543287A (en) Screen display method and electronic equipment
CN110780929B (en) Method for calling hardware interface and electronic equipment
CN111897465B (en) Popup display method, device, equipment and storage medium
CN116048436B (en) Application interface display method, electronic device and storage medium
JP7204902B2 (en) File transfer method and electronic device
CN114077519B (en) System service recovery method and device and electronic equipment
CN114004732A (en) Image editing prompting method and device, electronic equipment and readable storage medium
CN116909474B (en) Device identification method and related device
CN111475363A (en) Card death recognition method and electronic equipment
CN117376475B (en) Shooting switching method, electronic device, chip and readable storage medium
CN117750340A (en) Update method, readable medium, program product, and electronic device
CN117708055A (en) File mounting method, medium, program product and electronic device
CN116339569A (en) Split screen display method, folding screen device and computer readable storage medium
CN118337905A (en) Note generation method, readable storage medium, program product, and electronic device
CN118092745A (en) Content acquisition method, readable storage medium, program product, and electronic device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination