CN114401340A - Collaborative shooting method, electronic device and medium thereof - Google Patents


Info

Publication number
CN114401340A
CN114401340A (application CN202111682873.1A)
Authority
CN
China
Prior art keywords
screen
mobile phone
picture
shooting
module
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202111682873.1A
Other languages
Chinese (zh)
Other versions
CN114401340B (en)
Inventor
刘智鹍
聂星星
王拣贤
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Honor Device Co Ltd filed Critical Honor Device Co Ltd
Priority to CN202111682873.1A, granted as CN114401340B
Priority to CN202311176831.XA, published as CN117424954A
Publication of CN114401340A
Application granted
Publication of CN114401340B
Legal status: Active

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M1/00 Substation equipment, e.g. for use by subscribers
    • H04M1/72 Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724 User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72403 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
    • H04M1/72427 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality for supporting games or graphical animations
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M1/00 Substation equipment, e.g. for use by subscribers
    • H04M1/02 Constructional features of telephone sets
    • H04M1/0202 Portable telephone sets, e.g. cordless phones, mobile phones or bar type handsets
    • H04M1/0206 Portable telephones comprising a plurality of mechanically joined movable body parts, e.g. hinged housings
    • H04M1/0208 Portable telephones comprising a plurality of mechanically joined movable body parts, e.g. hinged housings characterized by the relative motions of the body parts
    • H04M1/0214 Foldable telephones, i.e. with body parts pivoting to an open position around an axis parallel to the plane they define in closed position
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M1/00 Substation equipment, e.g. for use by subscribers
    • H04M1/02 Constructional features of telephone sets
    • H04M1/0202 Portable telephone sets, e.g. cordless phones, mobile phones or bar type handsets
    • H04M1/026 Details of the structure or mounting of specific components
    • H04M1/0264 Details of the structure or mounting of specific components for a camera module assembly
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M1/00 Substation equipment, e.g. for use by subscribers
    • H04M1/02 Constructional features of telephone sets
    • H04M1/0202 Portable telephone sets, e.g. cordless phones, mobile phones or bar type handsets
    • H04M1/026 Details of the structure or mounting of specific components
    • H04M1/0266 Details of the structure or mounting of specific components for a display module assembly
    • H04M1/0268 Details of the structure or mounting of specific components for a display module assembly including a flexible display panel
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/57 Mechanical or electrical details of cameras or camera modules specially adapted for being embedded in other devices
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/63 Control of cameras or camera modules by using electronic viewfinders
    • H04N23/631 Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M2250/00 Details of telephonic subscriber devices
    • H04M2250/16 Details of telephonic subscriber devices including more than one display unit
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M2250/00 Details of telephonic subscriber devices
    • H04M2250/52 Details of telephonic subscriber devices including functional features of a camera

Abstract

The present disclosure relates to the field of electronic technologies, and in particular to a collaborative shooting method, an electronic device, and a medium thereof. With the collaborative shooting method, when a photographer shoots a subject with a folding-screen mobile phone, the subject can also intuitively see the shooting effect through the mirror-image picture displayed on the outer screen of the phone. The mirror-image picture displayed on the outer screen is the picture displayed on the inner screen after mirror processing; that is, for the subject, the picture displayed on the outer screen is similar to what the subject would see when looking in a mirror. The subject can therefore adjust his or her posture and position directly according to the picture displayed on the outer screen, so as to obtain a better shooting effect.

Description

Collaborative shooting method, electronic device and medium thereof
Technical Field
The present disclosure relates to the field of electronic technologies, and in particular, to a collaborative shooting method, an electronic device, and a medium thereof.
Background
At present, when people take pictures with the rear camera of a folding-screen mobile phone, the subject cannot view the shooting preview picture through the outer screen of the phone. As a result, the subject cannot intuitively see his or her imaging effect in the rear camera and can only move or adjust posture according to the instructions or prompts of others in order to improve the imaging effect. For example, as shown in fig. 1(A), a photographer 10 takes a picture of a subject 20 with a folding-screen mobile phone 100; the outer screen A of the folding-screen mobile phone 100 displays a standby screen (e.g., date and time), and the subject 20 cannot view the shooting preview picture through the outer screen A, which results in a poor user experience.
Disclosure of Invention
Embodiments of the present application provide a collaborative shooting method, an electronic device, and a medium thereof.
In a first aspect, an embodiment of the present application provides a collaborative shooting method applied to an electronic device. The electronic device includes a first camera, a first screen (e.g., the inner screen of a mobile phone), and a second screen (e.g., the outer screen of the mobile phone), where the first screen and the second screen are located on different sides of the electronic device. The first camera and the first screen are located on different sides of the electronic device (e.g., the rear camera of a mobile phone and its inner screen are on different sides of the phone), while the first camera and the second screen are located on the same side of the electronic device (e.g., the rear camera of a mobile phone and its outer screen are on the same side of the phone);
the method comprises the following steps:
the first screen displays a first shot picture captured by the first camera, and a first instruction is detected; and
in response to the first instruction, the second screen displays a second shot picture, where the second shot picture is a mirror image of the first shot picture, and the first shot picture and the second shot picture have the same size ratio.
The order in which the first screen displays the first shot picture and the electronic device detects the first instruction is not limited. That is, the first screen may display the first shot picture captured by the first camera after the electronic device detects the first instruction; for example, after the electronic device detects the user's instruction to start the collaborative shooting function, the first shot picture may be displayed on the first screen. Alternatively, the first screen may display the first shot picture before the electronic device detects the first instruction; for example, when the user has opened the phone's camera application but has not yet given the instruction to start the collaborative shooting function, the first screen may display the first shot picture first.
In this method, during shooting the electronic device can display the first shot picture on the first screen and, according to a user instruction, simultaneously display the second shot picture on the second screen. Since the second shot picture is a mirror image of the first shot picture, the subject can intuitively see his or her own imaging effect through the second shot picture on the second screen, which improves the user's shooting experience.
In a possible implementation manner of the first aspect, the first screen displays a display interface of the camera application, and the display interface of the camera application includes the first shot picture.
In a possible implementation manner of the first aspect, the display interface of the camera application further includes a first control, and the collaborative shooting instruction is detected when a first operation performed by a user on the first control is detected.
In some embodiments, the first control may be a button.
That is, the user can start the collaborative shooting function of the electronic device through the first control on the camera application display interface. In some possible implementations, the first operation may be the user clicking the first control, a voice instruction directed at the first control, or the like, which is not limited in this application.
In a possible implementation manner of the first aspect, the camera application includes a plurality of shooting modes, and
the second screen displaying the second shot picture in response to the first instruction includes:
controlling the second screen to display the second shot picture when it is determined that the current shooting mode of the camera application supports the collaborative shooting function.
That is, because some shooting modes of the camera application may not support the collaborative shooting function, in which case the second screen cannot display the second shot picture, it is necessary to determine that the current mode of the camera application supports the collaborative shooting function before the second screen displays the second shot picture, so that the coordinated display of the first screen and the second screen can be realized.
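The mode check described above can be sketched as follows. This is an illustrative sketch only: the mode names, the set of supported modes, and the function `handle_first_instruction` are all hypothetical, since the embodiment describes the behavior but not a concrete API.

```python
# Shooting modes assumed (hypothetically) to support collaborative shooting;
# the description lists photographing, recording, portrait, and movie modes.
COLLABORATIVE_MODES = {"photo", "video", "portrait", "movie"}

def handle_first_instruction(current_mode: str) -> str:
    """Decide what the second (outer) screen should do when the first
    instruction is detected, given the camera application's current mode."""
    if current_mode in COLLABORATIVE_MODES:
        # Current mode supports the function: coordinate the two screens.
        return "display_mirrored_picture"
    # e.g. the self-timer mode uses the front camera and is not supported,
    # so the outer screen is left unchanged.
    return "keep_idle"
```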
In one possible implementation manner of the first aspect, the shooting mode includes one or more of a photographing mode, a recording mode, a portrait mode, or a movie mode.
In a possible implementation manner of the first aspect, the method further includes:
detecting that a photographing mode of a camera application is switched from a first photographing mode supporting a collaborative photographing function to a second photographing mode not supporting the collaborative photographing function;
the first control is not displayed on a display interface of the camera application.
It can be understood that the first control is used to turn the collaborative shooting function on or off; therefore, when a camera mode does not support the collaborative shooting function, the first control need not be displayed on the camera application interface. Accordingly, when the shooting mode of the camera application is switched from a mode that supports the collaborative shooting function to one that does not, the camera interface may stop displaying the first control.
In addition, not displaying the first control in shooting modes that do not support the collaborative shooting function effectively avoids the current shooting mode becoming unusable because the user touches the first control by mistake, for example leaving the camera application in a black-screen state.
In a possible implementation manner of the first aspect, the electronic device further includes a collaborative shooting function module, a first screen display module, and a second screen display module, where
the first instruction is detected by the collaborative shooting function module; and
in response to the first instruction, the second screen displaying the second shot picture includes:
the second screen display module receiving the first instruction from the collaborative shooting function module through the first screen display module; and
in response to receiving the first instruction, the second screen display module controlling the second screen to display the second shot picture.
That is, the display content of the first screen is controlled by the first screen display module, and the display content of the second screen is controlled by the second screen display module. The electronic device detects the first instruction through the collaborative shooting function module; after detecting it, the collaborative shooting function module sends the first instruction to the second screen display module through the first screen display module, so as to coordinate the display on the second screen with that on the first screen.
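The instruction path described above — collaborative shooting function module, then first screen display module, then second screen display module — can be sketched as a simple forwarding chain. All class and method names here are hypothetical; the embodiment names the modules but not their interfaces.

```python
class SecondScreenDisplayModule:
    """Controls what the second (outer) screen shows."""
    def __init__(self):
        self.second_screen_content = None

    def on_first_instruction(self, mirrored_picture):
        # Drives the second screen in response to the forwarded instruction.
        self.second_screen_content = mirrored_picture


class FirstScreenDisplayModule:
    """Controls the first (inner) screen; relays the instruction onward."""
    def __init__(self, second_screen_module):
        self.second_screen_module = second_screen_module

    def forward_first_instruction(self, mirrored_picture):
        self.second_screen_module.on_first_instruction(mirrored_picture)


class CollaborativeShootingModule:
    """Detects the first instruction; display control happens downstream."""
    def __init__(self, first_screen_module):
        self.first_screen_module = first_screen_module

    def detect_first_instruction(self, mirrored_picture):
        self.first_screen_module.forward_first_instruction(mirrored_picture)
```

Wiring the three modules together and firing a detection delivers the mirrored picture to the second screen through the first screen display module, matching the path in the text.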
In a possible implementation manner of the first aspect, the electronic device further includes a view module and an image synthesis module, and
the second screen display module controlling the second screen to display the second shot picture in response to receiving the first instruction includes:
the second screen display module, in response to receiving the first instruction, obtaining from the view module a layer carrying the second shot picture and displaying that layer on the second screen, where the second shot picture carried in the layer is obtained by the view module from the image synthesis module, and the second shot picture is obtained by the image synthesis module performing mirror processing on the first shot picture.
In some embodiments, the view module may be a View framework, and the image synthesis module may be a SurfaceFlinger module.
In a possible implementation manner of the first aspect, the electronic device further includes a preview data stream module that stores the data of the first shot picture, and
the image synthesis module generates the second shot picture as follows:
the image synthesis module obtains a copy of the data of the first shot picture from the preview data stream module; and
the image synthesis module performs mirror processing on the copied data to obtain the data of the second shot picture.
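The two steps above (copying the preview data, then mirroring the copy) amount to a left-right flip of each pixel row. A minimal sketch, assuming a frame is modeled as a list of pixel rows; the function name is hypothetical:

```python
def mirror_frame(frame):
    """Return a left-right mirrored copy of a frame.

    The original preview data is left untouched (only the copy is flipped),
    and the height and width, hence the aspect ratio, are unchanged."""
    copied = [list(row) for row in frame]  # copy of the preview data
    return [row[::-1] for row in copied]   # horizontal flip of each row
```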
In a second aspect, embodiments of the present application further provide an electronic device, which includes a memory storing computer program instructions and a processor coupled to the memory, where the computer program instructions, when executed by the processor, cause the electronic device to implement the collaborative shooting method of any one of the first aspects.
In a third aspect, an embodiment of the present application further provides a computer-readable storage medium, where a computer program is stored, and when executed by a processor, the computer program implements the collaborative shooting method according to any one of the first aspect.
In a fourth aspect, an embodiment of the present application further provides a computer program product, which, when running on an electronic device, causes the electronic device to execute the collaborative shooting method according to any one of the first aspects.
It can be understood that the beneficial effects of the second to fourth aspects can be found in the description of the first aspect and are not repeated here.
Drawings
In order to illustrate the technical solutions in the embodiments of the present application more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. It is obvious that the drawings in the following description show only some embodiments of the present application, and that those skilled in the art can obtain other drawings from them without inventive effort.
FIG. 1 illustrates an application scenario diagram of the present solution;
fig. 2 shows two states of the mobile phone 100, where fig. 2(A) shows the outer screen of the folding-screen mobile phone 100 in the unfolded state, fig. 2(B) shows the inner screen of the folding-screen mobile phone 100 in the unfolded state, and fig. 2(C) shows the folding-screen mobile phone 100 in the folded state;
fig. 3 is a schematic diagram of the inner and outer screen display interfaces of a mobile phone, where figs. 3(A) and 3(B) respectively show an example of the inner and outer screen display interfaces for the coordinated inner and outer screen display provided in an embodiment of the present application;
fig. 4 is a schematic diagram of the inner and outer screen display interfaces of a mobile phone with coordinated inner and outer screen display, where figs. 4(A) and 4(B) respectively show an example of the inner and outer screen display interfaces provided in an embodiment of the present application;
fig. 5 is a schematic diagram of the inner and outer screen display interfaces of a mobile phone with coordinated inner and outer screen display, where figs. 5(A) and 5(B) respectively show an example of the inner and outer screen display interfaces provided in an embodiment of the present application;
fig. 6 is a schematic diagram of the inner and outer screen display interfaces of a mobile phone with coordinated inner and outer screen display, where figs. 6(A) and 6(B) respectively show an example of the inner and outer screen display interfaces provided in an embodiment of the present application;
fig. 7 is a flowchart illustrating the cooperative shooting method according to the present application;
FIG. 8 is a diagram illustrating a mirror processing method according to an embodiment of the present application;
fig. 9 is a diagram showing a software architecture of the cellular phone 100 according to an embodiment of the present application;
FIG. 10 is an interactive diagram showing cooperation between the various software structures of FIG. 9 to implement the method of the present application;
fig. 11 is a schematic diagram illustrating a hardware structure of the mobile phone 100 according to an embodiment of the present application.
Detailed Description
Various aspects of the illustrative embodiments will be described using terms commonly employed by those skilled in the art.
It should be noted that the collaborative shooting method of the present application is applicable to any multi-screen electronic device that has a shooting function and has screens on opposite sides, such as a folding-screen mobile phone, a tablet computer, or a smart wearable device; the present application is not limited in this respect. For convenience of description, the camera application in the folding-screen mobile phone 100 is taken as an example below, and the folding-screen mobile phone 100 is simply referred to as the mobile phone 100. It can be understood that the technical solution of the present application is also applicable to other applications with a shooting function on an electronic device and is not limited to the camera application.
It can be understood that one configuration of the mobile phone 100 may be as shown in fig. 2, where figs. 2(A) and 2(B) show the unfolded state of the mobile phone 100 and fig. 2(C) shows its folded state. Specifically, as shown in fig. 2(A), the mobile phone 100 has an outer screen A, which can display information such as the current time, network status, and battery level, and a camera P1 is disposed on the outer screen A. Fig. 2(B) shows the inner screen of the mobile phone 100; it can be seen that when the mobile phone 100 is in the unfolded state, the inner screen can be divided into inner screen area B-1 and inner screen area B-2. The camera P2 may be disposed on inner screen area B-1 or inner screen area B-2 of the mobile phone 100, which is not limited in this application.
Fig. 2(C) shows the mobile phone 100 in the folded state; it can be seen that inner screen area B-1 and inner screen area B-2 are no longer visible and the outer screen A faces outward.
The above schematically shows the mobile phone 100 in the unfolded state and the folded state, respectively. It should be noted that, for convenience of description, inner screen area B-1 and inner screen area B-2 of the mobile phone 100 are collectively referred to as the inner screen B, and the unfolding angle and state of the inner screen B are taken as the unfolding angle and state of the mobile phone 100. For example, in fig. 2(B) the inner screen B of the mobile phone 100 is in the unfolded state, so the mobile phone 100 is also in the unfolded state; in fig. 2(C) the inner screen B is in the folded state, so the mobile phone 100 is also in the folded state.
For convenience of description, the following assumes that the photographer faces the inner screen B and the subject faces the outer screen A; accordingly, the camera P1 is referred to as the rear camera and the camera P2 as the front camera. It can be understood that the collaborative shooting method of the present application is also applicable to a scenario in which the photographer faces the outer screen A and the subject faces the inner screen B; in that case, the camera P1 would be the front camera and the camera P2 the rear camera.
As mentioned in the background, when people take pictures with the rear camera P1 of the mobile phone 100, the subject cannot intuitively observe his or her imaging effect, which affects the subject's user experience. To solve this technical problem, some embodiments of the present application provide a collaborative shooting method suitable for a folding-screen electronic device.
Specifically, in the collaborative shooting method of some embodiments of the present application, the shot picture displayed on the inner screen B of the mobile phone 100 is mirror-processed and then displayed at the same scale on the outer screen A of the folding-screen mobile phone 100 (hereinafter referred to as the mirror image). In this way, during shooting, the subject can view the picture currently captured by the camera P1 through the outer screen A, which makes it convenient to adjust his or her posture. The mirror processing and the method by which the inner and outer screens display the shot picture at the same scale are described in detail below.
For example, as shown in fig. 1(B), when the photographer 10 photographs the subject 20 with the rear camera P1 of the mobile phone 100, the shot picture 101a captured by the camera P1 is displayed on the inner screen B facing the photographer, and the mirror image 101b of the shot picture 101a (left-right symmetric to the shot picture 101a) is displayed on the outer screen A of the folding-screen mobile phone 100 facing the subject. In this way, the mirror image 101b displayed on the outer screen A corresponds to what the subject would see of himself or herself in a mirror; that is, the information the subject 20 sees in the mirror image 101b on the outer screen A, such as the subject's posture and position in the overall shot picture, is the same as what the user 10 sees in the shot picture 101a on the inner screen. Moreover, so that a complete mirror image of the shot picture 101a can be displayed on the outer screen A, the size ratio (e.g., aspect ratio) of the shot picture 101a and the mirror image 101b is the same.
In this way, when the photographer shoots the subject with the folding-screen mobile phone 100, the subject can also intuitively see the shooting effect through the mirror image displayed on the outer screen A. Because the mirror image displayed on the outer screen A is the picture displayed on the inner screen B after mirror processing, the picture the subject sees on the outer screen A is similar to what the subject would see when looking in a mirror, and the subject can directly adjust his or her posture and position according to the picture displayed on the outer screen A to obtain a better shooting effect.
It can be understood that the user can turn on the collaborative shooting function by clicking a button related to collaborative shooting in the camera application interface, so that shooting proceeds according to the collaborative shooting method of the present application. In addition, the user may also turn on the collaborative shooting function by a voice instruction or the like, which is not limited here.
For example, figs. 3(A) and 3(B) show how the mobile phone 100 turns on the collaborative shooting function. As shown in fig. 3(A), after the photographer 10 opens the camera application, the inner screen B of the mobile phone 100 displays the camera application interface, which includes the currently shot picture and function buttons located at the two sides of the interface; from top to bottom, the function buttons include a settings button, a brightness-adjustment button, a flash button, a smart object-recognition button, a beauty button, and a collaborative shooting button 102. After the user clicks the collaborative shooting button 102, the mobile phone 100 turns on the collaborative shooting function, and the outer screen A of the mobile phone 100 displays the mirror image 101b of the shot picture 101a on the inner screen B.
After the mobile phone 100 turns on the collaborative shooting function, the user can select a specific shooting mode through the function buttons related to shooting modes on the right side of the camera application interface, such as a photographing-mode button, a recording-mode button, and a movie-mode button, to meet the shooting requirement. It should be noted that, whichever camera mode the user selects for shooting, the pictures displayed on the inner screen B and the outer screen A of the mobile phone 100 have the same proportions.
For example, as shown in fig. 4(A), when the user selects the photographing mode 105 from the camera modes on the right side of the inner screen B of the mobile phone 100, the shot picture 101a is displayed on the inner screen B with an aspect ratio of 4:3; as shown in fig. 4(B), the mirror image 101b of the shot picture 101a is displayed on the outer screen A of the mobile phone 100, and its aspect ratio is also 4:3.
As another example, as shown in fig. 5(A), when the user selects the video recording mode 106 from the camera modes on the right side of the inner screen B of the mobile phone 100, the shot picture 101a is displayed on the inner screen B with an aspect ratio of 21:9; as shown in fig. 5(B), the mirror image 101b of the shot picture 101a is displayed on the outer screen A of the mobile phone 100, and its aspect ratio is also 21:9.
It can be understood that the user can select other modes from the camera modes on the right side of the inner screen B of the mobile phone 100, for example a self-timer mode, a movie mode, or a portrait mode (not shown in the figure), which is not limited in this application. However, whichever camera mode the user selects, as long as that mode supports the collaborative shooting function, the pictures displayed on the inner screen B and the outer screen A of the mobile phone 100 have the same ratio, namely the ratio corresponding to that camera mode. For example, as shown in figs. 4 and 5, the aspect ratio for the photographing mode is 4:3 and the aspect ratio for the recording mode is 21:9.
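The same-ratio behavior can be illustrated numerically. A minimal sketch, using the 4:3 and 21:9 ratios from the examples above; the mode names and the fitting function are hypothetical, and the largest-fit rule is an assumption, since the description does not specify how the outer screen sizes the mirror image within its bounds:

```python
from fractions import Fraction

# Aspect ratios taken from the examples in the description; the
# string mode names are illustrative only.
MODE_ASPECT = {"photo": Fraction(4, 3), "video": Fraction(21, 9)}

def fit_mirror_picture(mode, outer_w, outer_h):
    """Largest (w, h) that fits within the outer screen while keeping the
    current mode's aspect ratio, so the inner and outer pictures share the
    same proportions."""
    ratio = MODE_ASPECT[mode]
    if Fraction(outer_w, outer_h) >= ratio:
        # Outer screen is relatively wider than the picture: height-limited.
        h = outer_h
        w = int(outer_h * ratio)
    else:
        # Outer screen is relatively narrower: width-limited.
        w = outer_w
        h = int(outer_w / ratio)
    return w, h
```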
In addition, it can be understood that in some embodiments, after the user selects a mode that does not support the collaborative shooting function from the camera modes, the collaborative shooting function of the mobile phone 100 is automatically turned off, and the outer screen A no longer displays a mirror image of the shot picture on the inner screen B. For example, as shown in figs. 6(A) and 6(B), after the user clicks the lens-flip button 104 and turns on the front camera P2 to enter the self-timer mode, the collaborative shooting function of the mobile phone 100 is automatically turned off, because the collaborative shooting function only supports shooting with the rear camera P1. At this point the collaborative shooting button 102 is no longer displayed on the inner screen B; only the shot picture 101a captured by the front camera P2 is displayed on the inner screen B, and no shot picture is displayed on the outer screen A (alternatively, information such as the date and time shown in fig. 1(A) may be displayed, which is not limited here). It can be understood that other camera modes may likewise not support the collaborative shooting function, and this application does not limit which modes they are.
In addition, in other embodiments, after the user selects a shooting mode in which the camera application does not support the collaborative shooting function, the collaborative shooting button 102 may be grayed out and rendered inoperable.
In addition, it can be further understood that, in some embodiments, in order to improve the user experience, after the user exits from the camera application in the camera mode supporting the collaborative photographing function, if the user opens the camera application again within a preset time period (for example, 15 minutes), the collaborative photographing function of the camera application is still in an open state, so that the user does not need to click the collaborative photographing function button 102 again to open the collaborative photographing function.
The above describes the interface change condition corresponding to the mobile phone 100 when the user performs the related touch operation and uses the cooperative photographing function of the mobile phone 100. In order to better understand the collaborative shooting method of the present application, a specific implementation process of the collaborative shooting method of the present application is described below with reference to fig. 7. It is understood that the following description is also given by taking a camera application in the mobile phone 100 as an example, and the steps of the method shown in fig. 7 can be implemented by the processor 110 of the mobile phone 100 executing the relevant programs.
Specifically, fig. 7 illustrates a collaborative shooting method according to some embodiments of the present application. As shown in fig. 7, the method includes:
701, an opening instruction of the collaborative photographing function is detected in the camera application.
As previously described, the user may click the collaborative shooting function button in the camera application interface to turn on the collaborative shooting function. For example, as shown in fig. 3(a)-(B), the user may click the collaborative shooting function button 102 on the left side of the camera application interface on the inner screen B of the mobile phone 100 to start the collaborative shooting function.
702, determine whether the current camera mode supports the collaborative shooting function. If it does, proceed to 703: the mobile phone 100 generates a mirror image of the shooting picture of the inner screen B and displays it on the outer screen A. If it does not, proceed to 704: the outer screen A of the mobile phone 100 does not display a mirror image of the shooting picture of the inner screen B.
For example, when the current camera mode is the photographing mode shown in fig. 4(a), since the photographing mode supports the collaborative shooting function, in this mode, as shown in fig. 4(B), the outer screen A displays the mirror picture 101b of the shooting picture 101a on the inner screen B. For another example, when the current camera mode is the self-timer mode shown in fig. 6(a), since the self-timer mode does not support the collaborative shooting function, in this mode, as shown in fig. 6(B), the outer screen A does not display a mirror image of the shooting picture 101a of the inner screen B, but displays information such as the current date and time. It is understood that in some embodiments, the outer screen A may also display other screens, such as a default lock screen or the desktop, which is not limited in this application.
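The branch in steps 702-704 can be sketched as follows. This is a minimal illustration, not the patent's actual implementation: the mode names, the function name and the placeholder return strings are invented for the example; the set of supported modes follows the modes listed for fig. 9.

```python
# Illustrative sketch of the decision in steps 702-704: camera modes that
# support collaborative shooting mirror the inner-screen picture to outer
# screen A; others fall back to the date/time or lock screen display.
SUPPORTED_MODES = {"photo", "portrait", "video", "movie"}  # per fig. 9

def outer_screen_content(camera_mode: str, inner_picture: str) -> str:
    """Return what outer screen A shows for the current camera mode."""
    if camera_mode in SUPPORTED_MODES:        # step 702 -> 703
        return f"mirror({inner_picture})"
    return "date/time or lock screen"         # step 702 -> 704

print(outer_screen_content("photo", "101a"))   # mirror(101a)
print(outer_screen_content("selfie", "101a"))  # date/time or lock screen
```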
It can be understood that various existing technical solutions can be adopted to perform the mirror image processing on the shot picture. For example, in some embodiments, the specific manner of obtaining the mirror image by the mobile phone 100 mirroring the shot image may be: the mobile phone 100 horizontally inverts the shot picture to obtain a mirror image of the shot picture. For example, assuming that the shot picture of the inner screen B and the corresponding pixel matrix obtained by the mobile phone 100 are as shown in fig. 8(a), the mobile phone 100 performs mirror inversion transformation on the pixel matrix corresponding to the shot picture by using the following formula (1) to obtain the pixel matrix corresponding to the mirror picture shown in fig. 8(B), and then the mobile phone 100 outputs the mirror picture matrix to the outer screen a by using the image processor to obtain the mirror picture shown in fig. 8 (B).
Wherein, the formula (1) is as follows:
M′ = S(sx, sy) × M (1)
where M is the pixel matrix corresponding to the shot picture, M′ is the pixel matrix corresponding to the mirror picture, and S(sx, sy) are the mirror inversion transform coefficients, with sx = -1 and sy = 1. That is, the vertical coordinate of each pixel in the pixel matrix of the shot picture is kept unchanged and the horizontal coordinate is multiplied by -1, so that the coordinates of each pixel of the shot picture are reflected about the y axis (i.e., horizontally inverted).
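For a row-major pixel matrix, the horizontal inversion of formula (1) amounts to reversing each row. A minimal sketch (the function name and sample matrix are illustrative, not from the patent):

```python
# Horizontal mirror inversion per formula (1): each pixel keeps its
# vertical position and its horizontal position is reflected about the
# y axis, which for a row-major pixel matrix reverses every row.

def mirror_horizontal(pixels):
    """Return the mirror picture matrix M' for a pixel matrix M."""
    return [list(reversed(row)) for row in pixels]

M = [[1, 2, 3],
     [4, 5, 6]]
print(mirror_horizontal(M))  # [[3, 2, 1], [6, 5, 4]]
```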
In addition, it can be understood that, as described above, after the user starts the collaborative photographing function, a mirror image of the photographed image of the inner screen B is displayed on the display interface of the outer screen a of the mobile phone, and the length-width ratio of the mirror image displayed by the outer screen a is consistent with that of the photographed image displayed by the inner screen B in the current camera mode.
It can be understood that different camera modes correspond to different image aspect ratios. In some embodiments of the present application, the correspondence between the camera modes of the mobile phone 100 that support the collaborative shooting function and the image aspect ratios is shown in table 1 below:
TABLE 1
Camera mode             Image aspect ratio
Photographing mode      4:3
Video recording mode    21:9
Movie mode              16:9
Then, the mobile phone 100 mirrors the shot picture to obtain the mirror picture and displays it on the outer screen A. It can be understood that, since the size of the outer screen A differs from that of the inner screen B, when displaying the mirror picture on the outer screen A, the mobile phone 100 first fits the mirror picture to the width of the outer screen A and then displays it at the aspect ratio of the current camera mode.
For example, when the user shoots in the photographing mode, the picture displayed on the inner screen B of the mobile phone 100 has an aspect ratio of 4:3 (see fig. 4(a)); this picture is mirrored, fitted to the width of the outer screen A in the manner shown in fig. 4(B), and then displayed on the outer screen A at the 4:3 ratio.
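The width-first adaptation described above can be sketched as follows. The screen width used here is a made-up example value, and the function name is invented; the patent does not give concrete pixel dimensions.

```python
# Hedged sketch of width-first fitting: the mirror picture is scaled to
# the outer screen's width, and its height follows from the aspect ratio
# of the current camera mode (table 1).
from fractions import Fraction

def fit_to_outer_screen(outer_width: int, aspect_w: int, aspect_h: int):
    """Return (width, height) of the mirror picture on outer screen A."""
    height = outer_width * Fraction(aspect_h, aspect_w)
    return outer_width, int(height)

# Photographing mode, 4:3, on a hypothetical 1080 px-wide outer screen:
print(fit_to_outer_screen(1080, 4, 3))   # (1080, 810)
# Video recording mode, 21:9:
print(fit_to_outer_screen(1080, 21, 9))
```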
In addition, in order to further improve the user experience, when the user changes the resolution of the shot picture displayed by the inner screen B in the same camera mode, the resolution of the mirror image displayed by the outer screen a can be correspondingly changed at the same time, so that the resolution of the shot picture displayed by the inner screen B and the resolution of the mirror image displayed by the outer screen a are kept consistent.
By the above method, during shooting, the person being photographed can intuitively view, on the outer screen A of the mobile phone 100, his or her image as captured by the rear camera P1, and adjust posture and position according to the picture displayed on the outer screen A, thereby achieving a better shooting result and improving the user's shooting experience.
In order to more clearly understand the implementation details of the above collaborative shooting method on the mobile phone 100, the following describes, with reference to fig. 9 to 10, how the software and hardware components in the mobile phone 100 cooperate to implement the method. The details are as follows:
fig. 9 is a block diagram illustrating a software architecture of a mobile phone 100 implementing the collaborative shooting method according to some embodiments of the present application. As shown in fig. 9, the software framework of the handset 100 includes an application layer 10, an application framework layer 20, a hardware abstraction layer 30, and a hardware layer 40.
The application layer 10 includes a camera application 1, and the camera application 1 includes a plurality of camera modes, such as a photographing mode 11, a portrait mode 12, a recording mode 13, a movie mode 14, a self-timer mode, and the like. The photo mode 11, the portrait mode 12, the video mode 13, and the movie mode 14 all support the collaborative photo function 1105.
It is understood that in some embodiments, the photographing mode 11 may include a smart object recognition function 1101, an artificial intelligence (AI) photography function 1102, a filter function 1103, a flash function 1104, and the collaborative shooting function 1105, and the user may select among these functions as desired.
For example, in the photographing mode 11, the user may turn on the smart object recognition function 1101 to recognize objects in the image. Specifically, after the user enables smart object recognition in the photographing mode 11, the user only needs to point the camera at the object to be photographed; when the mobile phone 100 recognizes the object, its name (e.g., "azalea") or category (e.g., "dog") is displayed on the inner screen B. In other embodiments, the user may also turn on the filter function 1103 to beautify the photographed object. This is not limited by the present application.
In some embodiments, portrait mode 12 may include a collaborative photographing function 1105, and may also include other functions, such as a beauty function, a special effects function (not shown in the figures), and the like. When the user selects the portrait mode 12 to perform photographing, the portrait mode enables the face of the person to be photographed to be kept clear while blurring the background, thereby highlighting the subject.
In some embodiments, the video recording mode 13 may include a multi-view recording function 1301, a flash function 1302, a filter function 1303, the collaborative shooting function 1105, and the like. When the user turns on the multi-view recording function 1301, the front camera P2 and the rear camera P1 of the mobile phone 100 are turned on simultaneously, so that the user can record multiple objects at the same time with both cameras. For example, after starting the multi-view recording function 1301, the user can simultaneously record the picture captured by the front camera P2 and the picture captured by the rear camera P1, so as to record the scene in detail. In addition, the user may turn on the flash function 1302 to improve imaging in darker scenes; for example, when shooting in a dark place, turning on the flash function 1302 increases the exposure and improves the imaging of the photographed object.
In some embodiments, the movie mode 14 may include a color lookup table (LUT) function 1401, a 4K high dynamic range (HDR) function 1402, a slow motion function 1403, the flash function 1104, the collaborative shooting function 1105, and the like. The LUT function 1401 essentially adjusts the colors of the photographed object so that the resulting picture is richer in color. In the 4K HDR function 1402, 4K refers to the resolution (4096 pixels × 2160 pixels), and HDR is an image rendering technique that reproduces the illumination of the scene, including the reflection and refraction of light, so that objects appear more realistic; 4K HDR is thus used to improve the quality of the shot picture, increasing both its resolution and its realism.
The slow motion function 1403 is intended to make shooting more engaging: it enhances the aesthetic appeal and expressiveness of the footage captured by the user.
In addition, through corresponding control logic, the camera application turns the above functions on and off, controls the collaborative display of the inner and outer screens of the mobile phone 100, and detects the unfolded state of the inner screen B.
Specifically, the camera application 1 controls the on and off of the cooperative photographing function by the cooperative photographing function on/off rule 15.
When the touch operation of the user detected by the mobile phone 100 conforms to the cooperative photographing function opening and closing rule 15, the camera application opens the cooperative photographing function 1105. In some embodiments of the present application, the user may trigger the collaborative photographing function on/off rule 15 by clicking the collaborative photographing function button 102 on the mobile phone screen B as shown in fig. 3(a), so as to turn on the collaborative photographing function 1105 of the camera application.
In other embodiments, when the collaborative shooting function is already on, the user may click the collaborative shooting function button 102 again to trigger the on/off rule 15 and turn off the collaborative shooting function 1105 of the camera application. It is understood that in some embodiments, the display state of the collaborative shooting function button 102 may change with the on/off state of the function; for example, the button 102 is in a raised state when the function is on and is restored to its original state when the function is off. This is not limited by the present application.
When the user starts the collaborative shooting function 1105, the camera application 1 controls the collaborative display of the inner and outer screens of the mobile phone through the user interface (UI) control module 16.
Specifically, the camera application 1 controls the display of the inner screen B through the mobile phone inner screen B display module 1601, and controls the display of the outer screen A through the mobile phone outer screen A display module 1602. For example, when the user opens the camera application 1 but has not yet started the collaborative shooting function, the mobile phone inner screen B display module 1601 initializes the inner screen B, i.e., determines the display area of the inner screen B in the current camera mode. For example, if the default camera mode when opening the camera application 1 is the photographing mode, the mobile phone inner screen B display module 1601 determines that the display area of the inner screen B needs to satisfy: an aspect ratio of 4:3, displayed centered.
When the user starts the cooperative shooting function, the cooperative shooting function 1105 of the camera application 1 sends a cooperative shooting instruction to the mobile phone inner screen B display module 1601, and then the mobile phone inner screen B display module 1601 initiates a request for cooperative display (i.e. a request for initializing a preview stream) to the mobile phone outer screen a display module 1602, where the request for initializing the preview stream includes an instruction for determining a display area (or a preview area) of the mobile phone outer screen a and performing mirror processing on an image displayed in the display area of the mobile phone outer screen a. It is understood that in some embodiments, the mobile phone inner screen B also performs inner screen initialization according to the cooperative photographing instruction. In other embodiments, the initialization of the inner screen B of the mobile phone may be completed when the camera application is started. This is not limited by the present application.
The mobile phone outer screen A display module 1602 responds to the preview stream initialization request and initializes the preview area of the outer screen A: it adds a mirror processing tag to the preview area, and the View framework 21 generates a layer corresponding to the preview area. The manner in which the View framework 21 generates this layer is described below.
In addition, the camera application 1 detects whether the inner screen B of the mobile phone 100 is in the unfolded state through the mobile phone inner screen B unfolded state detection module 1603. Specifically, the module 1603 determines whether the inner screen B is unfolded based on the sensor data of the sensor module 43 in the hardware layer 40. The specific process is described below in conjunction with fig. 10. It is to be understood that the application layer 10 may also include other applications, such as a video application and a chat application, which is not limited in this application.
The application framework layer 20 of the mobile phone 100 includes a View framework 21 and a SurfaceFlinger 22.
In some embodiments, the View framework 21 is configured to create a corresponding display layer according to the determined display area, so that the layer can subsequently carry the display data to be shown on the outer screen A.
For example, in the photographing mode described above, the display area of the inner screen B satisfies: an aspect ratio of 4:3, displayed centered. The display area of the outer screen A correspondingly satisfies the same conditions: an aspect ratio of 4:3, displayed centered. The View framework 21 therefore creates a layer with an aspect ratio of 4:3 for the outer screen A, which serves as the carrier of the mirror picture and is displayed on the outer screen A.
In other embodiments, the View framework 21 is further configured to request the SurfaceFlinger 22 to mirror the content to be displayed in the preview area, attach the resulting mirrored content to the generated layer, and then send the layer to the mobile phone outer screen A display module 1602 for display.
In some embodiments, the SurfaceFlinger 22 is configured to obtain the data to be displayed from the hardware abstraction layer 30, process the display images accordingly, and send the processed display data to the View framework 21. In the embodiment of the present application, in response to the mirroring instruction sent by the View framework 21, the SurfaceFlinger 22 acquires the content to be displayed in the preview area from the preview data stream module 32 and mirrors it. In other embodiments, the SurfaceFlinger 22 also sends the mirrored content to the View framework 21.
For example, the SurfaceFlinger 22 acquires the copy data of the shot picture from the preview data stream module 32, mirrors it to obtain the mirror picture data, and sends the mirror picture data to the View framework 21.
It is understood that the application framework layer 20 may also include other modules, such as a view manager (not shown) and a wireless fidelity (Wi-Fi) module (not shown), which is not limited in this application.
The hardware abstraction layer 30 includes a photographing data stream module 31, a preview data stream module 32, and a video data stream module 33. The photographing data stream module 31 is used to store the images captured by the camera application in camera modes such as the photographing mode 11 and the portrait mode 12. The video data stream module 33 is used to store the videos captured by the camera application in camera modes such as the video recording mode 13 and the movie mode 14.
The preview data stream module 32 is configured to store the shooting picture displayed on the inner screen B when the camera application is in any of the above camera modes, i.e., the image captured by the camera before the user clicks the shoot or record button, such as the image preview stream displayed on the inner screen B in fig. 4(a) or fig. 5(a). It can be understood that, to implement the collaborative display on the inner and outer screens described above, in some embodiments of the present application the preview data stream module 32 stores the shooting picture of the inner screen B and copies it at the same time, so that the mirror picture can be generated from the copy.
It is understood that in some embodiments of the present application, the stored image or video data of the photographing data streaming module 31, the preview data streaming module 32 and the video data streaming module 33 can be implemented based on the following memory shown in fig. 11, which should not be construed as a limitation.
The hardware layer 40 includes a sensor module 43, an Image Signal Processor (ISP) driver 41, and a Display Port (DP) driver 42. In some embodiments of the present application, the sensor module 43 includes a gyroscope sensor and an acceleration sensor, which are described below. In other embodiments, the sensor module 43 may further include other sensors, such as a pressure sensor (not shown), a touch sensor (not shown), an image sensor (not shown), and the like, which is not limited in this application.
The ISP driver 41 is used to process the signal transmitted by the image sensor of the mobile phone 100. For example, it converts the electrical signal of an image fed back by the image sensor into an image visible to the naked eye. The ISP driver 41 may also perform algorithmic optimization for image noise, brightness, and the like, optimize parameters such as the exposure and color temperature of the shooting scene, or perform anti-shake processing on the image to enhance its display effect.
The DP driver 42 is an interface for the external display to access the mobile phone 100, and the mobile phone 100 can be connected to the external display through the DP driver 42 and synchronously display the image displayed on the mobile phone 100 on the external display.
It is understood that the above software structure is only exemplary and not limiting to the software structure of the mobile phone 100, and in other embodiments, the mobile phone 100 may have more or less structures, which is not limited in this application.
In order to more intuitively understand the process of cooperatively displaying the mobile phone internal screen and the mobile phone external screen by the cooperation of the software modules, the process of displaying the mirror image of the mobile phone internal screen B on the mobile phone external screen a is described below by taking the interaction diagram shown in fig. 10 as an example.
Specifically, as shown in fig. 10, the above process includes:
1001, after the cooperative shooting function 1105 detects that the cooperative shooting function start rule is satisfied, the cooperative shooting function is started.
1002, the cooperative shooting function 1105 sends a cooperative shooting instruction to the display module 1601 of the mobile phone inner screen B.
1003, a mobile phone inner screen B display module 1601 performs inner screen initialization.
1004, the mobile phone inner screen B display module 1601 sends an instruction to initialize the outer screen preview area to the mobile phone outer screen a display module 1602.
1005, the mobile phone outer screen a display module 1602 initializes the outer screen preview area and adds a mirror image tag to the preview area.
In some embodiments, the mirror tag may be a character string, a number, or the like, for example, the mirror tag may be "x _ flag," which is not limited in this application.
1006, the mobile phone outer screen A display module 1602 sends the preview area information carrying the mirror tag to the View framework 21.
It will be appreciated that in some embodiments, the information for the preview area may include the coordinates of the preview area.
It is understood that in some embodiments, the View framework 21 creates a layer corresponding to the preview area coordinates for the mobile phone outer screen A according to the preview area coordinates carrying the mirror tag, so that the layer can subsequently carry the mirror picture and be displayed in the preview area.
1007, the View framework 21 creates a layer corresponding to the preview area.
The process of specifically creating the layer may refer to the above related description, and is not described herein again.
1008, the View framework 21 sends an instruction to the SurfaceFlinger 22 to mirror the display content corresponding to the preview area.
It can be understood that the SurfaceFlinger 22 mirrors the shot picture to be displayed in the preview area according to the mirror tag carried by the preview area information. The mirroring process is described above and is not repeated here.
1009, the SurfaceFlinger 22 generates an instruction to acquire the copy data of the inner-screen shot picture.
1010, the SurfaceFlinger 22 sends the instruction to acquire the copy data of the inner-screen shot picture to the preview data stream module 32.
1011, the preview data stream module 32 generates the copy data of the inner-screen shot picture.
In some embodiments, the preview data stream module 32 generates, according to the instruction sent by the SurfaceFlinger 22, bitmap copy data of the shot picture stored by the mobile phone inner screen B display module 1601, where a bitmap is data constituted by the pixel information of an image, for example, the matrix of pixel coordinates shown in fig. 8.
1012, the preview data stream module 32 sends the copy data of the inner-screen shot picture to the SurfaceFlinger 22.
1013, the SurfaceFlinger 22 mirrors the copy data of the inner-screen shot picture to obtain the mirror picture data.
The specific mirroring process may refer to the above related description, which is not described herein again.
1014, the SurfaceFlinger 22 sends the mirror picture data to the View framework 21.
1015, the View framework 21 attaches the mirror picture data to the layer described above.
1016, the View framework 21 sends the layer carrying the mirror picture to the mobile phone outer screen A display module 1602.
1017, the mobile phone outer screen A display module 1602 displays the layer carrying the mirror picture in the preview area.
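The interaction in fig. 10 can be condensed into the following sketch, with each software module reduced to a plain function. All function names, the "x_flag" tag and the sample pixel matrix are illustrative; the real modules are Android components, not Python functions.

```python
# Illustrative end-to-end sketch of steps 1001-1017 in fig. 10.

def preview_stream_copy(inner_picture):        # steps 1010-1012
    """Preview data stream module: copy data of the inner-screen shot."""
    return [row[:] for row in inner_picture]

def surface_flinger_mirror(copy_data):         # step 1013
    """SurfaceFlinger: mirror the copied picture data horizontally."""
    return [list(reversed(row)) for row in copy_data]

def view_framework_compose(mirror_data):       # steps 1007, 1015
    """View framework: attach the mirror data to the preview-area layer."""
    return {"layer": "preview_area", "mirror_tag": "x_flag", "data": mirror_data}

def outer_screen_display(layer):               # steps 1016-1017
    """Outer screen A display module: show the layer in the preview area."""
    return layer["data"]

inner_shot = [[1, 2, 3], [4, 5, 6]]
shown = outer_screen_display(
    view_framework_compose(surface_flinger_mirror(preview_stream_copy(inner_shot))))
print(shown)  # [[3, 2, 1], [6, 5, 4]]
```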
A schematic diagram of a hardware structure 1100 of the mobile phone 100 for implementing the above method is described with reference to fig. 11.
As shown in fig. 11, the mobile phone 100 may include a processor 110, an external memory interface 120, an internal memory 121, a Universal Serial Bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, a sensor module 180, a button 190, a display 194, and the like. Wherein the sensor module 180 may include a gyro sensor 180A, an acceleration sensor 180B, and a touch sensor 180C, etc.
It is to be understood that the illustrated structure of the embodiment of the present invention does not specifically limit the mobile phone 100. In other embodiments of the present application, the handset 100 may include more or fewer components than shown, or some components may be combined, some components may be separated, or a different arrangement of components may be used. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
A memory may also be provided in processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that have just been used or recycled by the processor 110. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory. Avoiding repeated accesses reduces the latency of the processor 110, thereby increasing the efficiency of the system. In some embodiments of the present application, the processor 110 may be configured to implement a mirror process for the shot of the mobile phone inner panel B.
In some embodiments, the processor 110 may include one or more interfaces. The external memory interface 120 may be used to connect an external memory, such as a Micro SD card, to extend the storage capability of the mobile phone 100. The external memory communicates with the processor 110 through the external memory interface 120 to implement data storage functions. For example, the external memory may store the shot picture before the user clicks the photograph or record button, or store the video saved after the user stops recording.
The internal memory 121 may be used to store computer-executable program code, which includes instructions. The internal memory 121 may include a program storage area and a data storage area. The program storage area may store the program code corresponding to the collaborative shooting method, so that when the program code is executed, the mobile phone 100 can perform the collaborative shooting method of the present application; the data storage area may store the shot picture or its mirror picture involved in the collaborative shooting method. In some embodiments, the data storage area may be divided into several logical partitions to store the image or video data related to the collaborative shooting method.
The wireless communication function of the mobile phone 100 can be realized by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modem processor, the baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. The mobile communication module 150 may provide a solution including wireless communication of 2G/3G/4G/5G, etc. applied to the handset 100. The wireless communication module 160 may provide a solution for wireless communication applied to the mobile phone 100, including Wireless Local Area Networks (WLANs) (e.g., wireless fidelity (Wi-Fi) networks), bluetooth (blue tooth, BT), Global Navigation Satellite System (GNSS), Frequency Modulation (FM), Near Field Communication (NFC), Infrared (IR), and so on.
The gyroscope sensor 180A and the acceleration sensor 180B in the sensor module 180 are used to obtain the unfolding angle of the inner screen B, so that the mobile phone 100 can determine the unfolding state of the inner screen B according to that angle.
In some embodiments, the mobile phone 100 may have a plurality of gyroscope sensors 180A and a plurality of acceleration sensors 180B, disposed on the outer screen A and the inner screen B of the mobile phone 100 respectively. For example, one set of sensors (a gyroscope sensor 180A and an acceleration sensor 180B) is disposed on the outer screen A of the mobile phone 100, and another set is disposed on the inner screen B. The mobile phone 100 then determines the unfolding angle of the inner screen B by combining the data acquired by the gyroscope sensors and the acceleration sensors on the two screens.
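One simple way to derive an unfolding angle from two accelerometers is to compare the gravity vectors each screen half measures: the angle between them tracks the hinge angle. This is an assumption-laden illustration, not the patent's algorithm; the exact mapping depends on sensor mounting, and real devices also fuse gyroscope data to reject motion noise:

```python
import math

def fold_angle_deg(accel_outer, accel_inner):
    """Estimate the angle (degrees) between the gravity vectors measured by
    the accelerometers on the two screen halves via the dot-product formula."""
    dot = sum(a * b for a, b in zip(accel_outer, accel_inner))
    norm = (math.sqrt(sum(a * a for a in accel_outer))
            * math.sqrt(sum(b * b for b in accel_inner)))
    # Clamp to [-1, 1] to guard against floating-point drift before acos.
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))

# Both halves sense gravity in the same direction: 0 degrees apart.
print(round(fold_angle_deg((0, 0, 9.8), (0, 0, 9.8))))   # 0
# Opposite directions: 180 degrees apart.
print(round(fold_angle_deg((0, 0, 9.8), (0, 0, -9.8))))  # 180
```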
The touch sensor 180C may be disposed on the display screen 194; together they form a touch-sensitive display, also called a "touch screen". The touch sensor 180C is used to detect a touch operation applied on or near it, and can pass the detected touch operation to the application processor to determine the touch event type. For example, the touch sensor 180C detects the user clicking the collaborative shooting function button 102 in fig. 3(a) and passes the operation to the processor 110, which determines that the touch event is "turn on the collaborative shooting function". When the user clicks the button 102 again, the touch sensor 180C passes the operation to the processor 110, which determines that the touch event is "turn off the collaborative shooting function". In some embodiments, visual output related to touch operations may be provided through the display screen 194. In other embodiments, the touch sensor 180C may be disposed on a surface of the mobile phone 100 at a location different from that of the display screen 194. This is not limited by the present application.
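The tap-to-toggle behavior described above can be modeled as a tiny state machine; the class and method names below are illustrative placeholders, not identifiers from the patent:

```python
class CollaborativeShootingToggle:
    """Flips the collaborative shooting function on each button tap and
    reports the resulting touch-event interpretation."""

    def __init__(self):
        self.enabled = False

    def on_button_tap(self):
        # Each tap inverts the current state, mirroring the processor's
        # "turn on" / "turn off" event determination described above.
        self.enabled = not self.enabled
        return ("turn on the collaborative shooting function" if self.enabled
                else "turn off the collaborative shooting function")

toggle = CollaborativeShootingToggle()
print(toggle.on_button_tap())  # first tap enables the function
print(toggle.on_button_tap())  # second tap disables it again
```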
An embodiment of the present application further provides an electronic device, including: at least one processor, a memory, and a computer program stored in the memory and executable on the at least one processor, the processor implementing the steps of any of the various method embodiments described above when executing the computer program.
The embodiments of the present application further provide a computer-readable storage medium in which a computer program is stored; when the computer program is executed by a processor, the steps of the above method embodiments are implemented.
The embodiments of the present application further provide a computer program product which, when run on a mobile terminal, enables the mobile terminal to implement the steps in the above method embodiments.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, all or part of the flow of the methods of the embodiments described above can be realized by a computer program, which can be stored in a computer-readable storage medium and, when executed by a processor, realizes the steps of the method embodiments described above. The computer program comprises computer program code, which may be in the form of source code, object code, an executable file, some intermediate form, etc. The computer-readable medium may include at least: any entity or device capable of carrying computer program code to a photographing apparatus/terminal apparatus, a recording medium, a computer memory, a read-only memory (ROM), a random access memory (RAM), an electrical carrier signal, a telecommunication signal, and a software distribution medium, such as a USB flash drive, a removable hard disk, a magnetic disk, or an optical disk. In certain jurisdictions, in accordance with legislation and patent practice, computer-readable media may not include electrical carrier signals or telecommunication signals.
In the above embodiments, each embodiment is described with its own emphasis; for parts not described or detailed in a certain embodiment, reference may be made to the related descriptions of other embodiments.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus/network device and method may be implemented in other ways. For example, the above-described apparatus/network device embodiments are merely illustrative, and for example, the division of the modules or units is only one logical division, and there may be other divisions when actually implementing, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not implemented. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In the description above, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
It will be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It should also be understood that the term "and/or" as used in this specification and the appended claims refers to any and all possible combinations of one or more of the associated listed items and includes such combinations.
As used in this specification and the appended claims, the term "if" may be interpreted contextually as "when", "upon", "in response to determining", or "in response to detecting". Similarly, the phrase "if it is determined" or "if [a described condition or event] is detected" may be interpreted contextually to mean "upon determining", "in response to determining", "upon detecting [the described condition or event]", or "in response to detecting [the described condition or event]".
Furthermore, in the description of the present application and the appended claims, the terms "first", "second", "third", and the like are used for distinguishing between descriptions and are not to be understood as indicating or implying relative importance.
Reference throughout this specification to "one embodiment" or "some embodiments" or the like means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the present application. Thus, appearances of the phrases "in one embodiment", "in some embodiments", "in other embodiments", and the like in various places throughout this specification are not necessarily all referring to the same embodiment, but rather mean "one or more but not all embodiments", unless specifically stated otherwise. The terms "including", "comprising", "having", and variations thereof mean "including but not limited to", unless expressly specified otherwise.
The above-mentioned embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present application and are intended to be included within the scope of the present application.

Claims (11)

1. A collaborative shooting method, applied to an electronic device comprising a first camera, a first screen and a second screen, wherein the first screen and the second screen are located on different sides of the electronic device;
the method comprises the following steps:
the first screen displays a first shot picture captured by the first camera, and a first instruction is detected;
in response to the first instruction, the second screen displays a second shot picture, wherein the second shot picture is a mirror image of the first shot picture, and the first shot picture and the second shot picture have the same aspect ratio.
2. The method of claim 1, wherein the first screen displays a display interface of a camera application, and the display interface of the camera application comprises the first shot picture.
3. The method of claim 2, wherein the display interface of the camera application further comprises a first control, and the first instruction is detected upon detecting a first operation performed by a user on the first control.
4. The method according to claim 2 or 3, wherein the camera application comprises a plurality of shooting modes, and
the displaying, by the second screen, of the second shot picture in response to the first instruction comprises:
controlling the second screen to display the second shot picture in a case where it is determined that the current shooting mode of the camera application supports the collaborative shooting function.
5. The method of claim 4, further comprising:
detecting that the shooting mode of the camera application is switched from a first shooting mode that supports the collaborative shooting function to a second shooting mode that does not support the collaborative shooting function; and
no longer displaying the first control on the display interface of the camera application.
6. The method of claim 4 or 5, wherein the shooting mode comprises one or more of a picture taking mode, a video recording mode, a portrait mode, or a movie mode.
7. The method according to any one of claims 1 to 6, wherein the electronic device further comprises a collaborative shooting function module, a first screen display module, and a second screen display module; and
the first instruction is detected by the collaborative shooting function module; and
the displaying, by the second screen, of the second shot picture in response to the first instruction comprises:
the second screen display module receives the first instruction from the collaborative shooting function module through the first screen display module; and
the second screen display module, in response to receiving the first instruction, controls the second screen to display the second shot picture.
8. The method of claim 7, wherein the electronic device further comprises a view module and an image synthesis module, and
the controlling, by the second screen display module, of the second screen to display the second shot picture in response to receiving the first instruction comprises:
the second screen display module, in response to receiving the first instruction, acquires a layer bearing the second shot picture from the view module, and displays the layer bearing the second shot picture on the second screen, wherein the second shot picture borne in the layer is acquired by the view module from the image synthesis module, and the second shot picture is obtained by the image synthesis module by mirroring the first shot picture.
9. The method of claim 8, wherein the electronic device further comprises a preview data stream module, the preview data stream module storing data of the first shot picture; and
the image synthesis module generates the second shot picture by the following steps:
the image synthesis module acquires copy data of the first shot picture from the preview data stream module; and
the image synthesis module performs mirror processing on the copy data to obtain data of the second shot picture.
10. A readable medium having stored thereon instructions that, when executed on an electronic device, cause the electronic device to execute the collaborative shooting method according to any one of claims 1 to 9.
11. An electronic device, comprising:
a memory for storing instructions for execution by one or more processors of the electronic device; and
a processor, being one of the processors of the electronic device, configured to execute the collaborative shooting method according to any one of claims 1 to 9.
CN202111682873.1A 2021-12-31 2021-12-31 Collaborative shooting method, electronic equipment and medium thereof Active CN114401340B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202111682873.1A CN114401340B (en) 2021-12-31 2021-12-31 Collaborative shooting method, electronic equipment and medium thereof
CN202311176831.XA CN117424954A (en) 2021-12-31 2021-12-31 Collaborative shooting method, electronic equipment and medium thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111682873.1A CN114401340B (en) 2021-12-31 2021-12-31 Collaborative shooting method, electronic equipment and medium thereof

Related Child Applications (1)

Application Number Title Priority Date Filing Date
CN202311176831.XA Division CN117424954A (en) 2021-12-31 2021-12-31 Collaborative shooting method, electronic equipment and medium thereof

Publications (2)

Publication Number Publication Date
CN114401340A true CN114401340A (en) 2022-04-26
CN114401340B CN114401340B (en) 2023-09-26

Family

ID=81229238

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202311176831.XA Pending CN117424954A (en) 2021-12-31 2021-12-31 Collaborative shooting method, electronic equipment and medium thereof
CN202111682873.1A Active CN114401340B (en) 2021-12-31 2021-12-31 Collaborative shooting method, electronic equipment and medium thereof

Family Applications Before (1)

Application Number Title Priority Date Filing Date
CN202311176831.XA Pending CN117424954A (en) 2021-12-31 2021-12-31 Collaborative shooting method, electronic equipment and medium thereof

Country Status (1)

Country Link
CN (2) CN117424954A (en)


Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107333047A (en) * 2017-08-24 2017-11-07 维沃移动通信有限公司 A kind of image pickup method, mobile terminal and computer-readable recording medium
CN107613196A (en) * 2017-09-05 2018-01-19 珠海格力电器股份有限公司 A kind of self-timer method and its device, electronic equipment
CN108234875A (en) * 2018-01-15 2018-06-29 广东欧珀移动通信有限公司 Shoot display methods, device, mobile terminal and storage medium
CN108833769A (en) * 2018-05-18 2018-11-16 Oppo广东移动通信有限公司 Shoot display methods, device, mobile terminal and storage medium
CN110933306A (en) * 2019-11-29 2020-03-27 维沃移动通信有限公司 Method for sharing shooting parameters and electronic equipment
CN111385456A (en) * 2018-12-27 2020-07-07 北京小米移动软件有限公司 Photographing preview method and device and storage medium


Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023124178A1 (en) * 2021-12-31 2023-07-06 荣耀终端有限公司 Method for displaying preview image, apparatus, and readable storage medium
CN116048436A (en) * 2022-06-17 2023-05-02 荣耀终端有限公司 Application interface display method, electronic device and storage medium
CN116048436B (en) * 2022-06-17 2024-03-08 荣耀终端有限公司 Application interface display method, electronic device and storage medium
CN117156270A (en) * 2023-02-03 2023-12-01 荣耀终端有限公司 Photographing processing method and related device

Also Published As

Publication number Publication date
CN117424954A (en) 2024-01-19
CN114401340B (en) 2023-09-26

Similar Documents

Publication Publication Date Title
CN114401340B (en) Collaborative shooting method, electronic equipment and medium thereof
CN108833753B (en) Image acquisition and application method, terminal and computer readable storage medium
CN111541845B (en) Image processing method and device and electronic equipment
CN109345485B (en) Image enhancement method and device, electronic equipment and storage medium
CN102055834B (en) Double-camera photographing method of mobile terminal
WO2020238380A1 (en) Panoramic photography method and terminal device
CN108495032B (en) Image processing method, image processing device, storage medium and electronic equipment
JP7000050B2 (en) Imaging control device and its control method
WO2021047077A1 (en) Image processing method, apparatus, and device based on multiple photographing modules, and medium
CN108632543B (en) Image display method, image display device, storage medium and electronic equipment
CN114092364A (en) Image processing method and related device
CN112153272B (en) Image shooting method and electronic equipment
WO2022077970A1 (en) Method and apparatus for adding special effects
CN111246106B (en) Image processing method, electronic device, and computer-readable storage medium
CN106791390B (en) Wide-angle self-timer real-time preview method and user terminal
CN105554372A (en) Photographing method and device
WO2022252660A1 (en) Video capturing method and electronic device
CN112287852A (en) Face image processing method, display method, device and equipment
WO2022206499A1 (en) Image capture method and apparatus, electronic device and readable storage medium
CN115002339A (en) Collaborative photographing method, electronic device, storage medium, and program product
WO2015194084A1 (en) Information processing device, information processing system, and information processing method and program
CN113329172A (en) Shooting method and device and electronic equipment
CN113840070A (en) Shooting method, shooting device, electronic equipment and medium
CN110086998B (en) Shooting method and terminal
WO2021185374A1 (en) Image capturing method and electronic device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant