CN113747056A - Photographing method and device and electronic equipment - Google Patents

Photographing method and device and electronic equipment

Info

Publication number
CN113747056A
Authority
CN
China
Prior art keywords
electronic device
instruction
electronic equipment
control
shooting
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110839839.4A
Other languages
Chinese (zh)
Inventor
刘梦楠 (Liu Mengnan)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Honor Device Co Ltd filed Critical Honor Device Co Ltd
Priority to CN202110839839.4A priority Critical patent/CN113747056A/en
Publication of CN113747056A publication Critical patent/CN113747056A/en
Priority to PCT/CN2022/094200 priority patent/WO2023000802A1/en
Pending legal-status Critical Current

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60 Control of cameras or camera modules
    • H04N 23/63 Control of cameras or camera modules by using electronic viewfinders
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60 Control of cameras or camera modules
    • H04N 23/66 Remote control of cameras or camera parts, e.g. by remote control devices
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 7/00 Television systems
    • H04N 7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W 76/00 Connection management
    • H04W 76/10 Connection setup
    • H04W 76/14 Direct-mode setup

Landscapes

  • Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Telephone Function (AREA)

Abstract

The application provides a photographing method and apparatus and an electronic device, relating to the field of terminal technologies. In the photographing method, a first electronic device and a second electronic device are in communication connection. The first electronic device may synchronize a first preview image to the second electronic device. In this way, before the first electronic device takes the photograph, the person being photographed can see the shot composition formed by his or her own posture and the scenery at his or her location. The person being photographed can then adjust posture and position according to his or her own shooting intention, so as to reach a satisfactory composition. The first electronic device then takes the photograph based on the adjusted composition, so that the captured target image closely matches the shooting intention of the person being photographed, improving that person's satisfaction with the target image.

Description

Photographing method and device and electronic equipment
Technical Field
The present application relates to the field of terminal technologies, and in particular, to a photographing method and apparatus, and an electronic device.
Background
In daily life, when people encounter beautiful scenery or memorable moments, they can record them by taking a photograph.
In general, when user A wants to be photographed, user A stands at the shooting position and strikes a pose; user B, standing at another position, then points a hand-held electronic device at user A to capture an image. In the preview screen of the electronic device, the posture of user A and the scenery around user A form a shot composition. User B then takes the photograph based on the composition in the preview screen to obtain a picture.
However, user B typically cannot know user A's shooting intention, for example, whether user A's pose is in place or whether user A's position is reasonable. As a result, the photograph taken by user B matches user A's shooting intention poorly, and user A is dissatisfied with the taken photograph.
Disclosure of Invention
The application provides a photographing method and apparatus and an electronic device, which enable the person being photographed to see the shooting preview picture and to adjust his or her own posture and position according to personal shooting intention, so as to reach a satisfactory shot composition.
In a first aspect, the present application provides a photographing method applied to a photographing system, where the photographing system includes a first electronic device and a second electronic device, and the first electronic device includes a camera. The photographing method provided by the application includes the following steps. The first electronic device and the second electronic device establish a communication connection. The first electronic device displays a first preview interface, where the first preview interface includes a first preview image acquired by the camera. The first electronic device synchronizes the first preview image to the second electronic device. The second electronic device displays a second preview interface, where the second preview interface includes the first preview image. The first electronic device obtains a shooting instruction and captures a target image with the camera. The first electronic device sends the target image to the second electronic device through the communication connection. The second electronic device displays the target image on the second preview interface. The second electronic device receives a confirmation instruction, where the confirmation instruction is used to confirm the target image, and sends the confirmation instruction to the first electronic device; the first electronic device saves the target image in response to the confirmation instruction. Alternatively, the second electronic device receives a cancel instruction, where the cancel instruction is used to cancel the target image, and sends the cancel instruction to the first electronic device; the first electronic device does not save the target image in response to the cancel instruction.
As can be seen, the first electronic device may synchronize the first preview image to the second electronic device. In this way, before the first electronic device takes the photograph, the person being photographed can see the shot composition formed by his or her own posture and the scenery at his or her location. The person being photographed can then adjust posture and position according to his or her own shooting intention, so as to reach a satisfactory composition. The first electronic device then takes the photograph based on the adjusted composition, so that the captured target image closely matches the shooting intention of the person being photographed, improving that person's satisfaction with the target image.
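To make the message flow above concrete, the following Kotlin sketch models the handful of messages exchanged over the communication connection and the first device's handling of the confirmation and cancel instructions. The opcodes, framing, and function names are illustrative assumptions; the application does not specify a wire format.

```kotlin
import java.io.DataInputStream
import java.io.DataOutputStream

// Hypothetical message tags for the link between the two devices (assumed, not from the patent).
object Op {
    const val PREVIEW_FRAME: Byte = 1   // first device -> second device: preview frame bytes
    const val TARGET_IMAGE: Byte = 2    // first device -> second device: captured target image
    const val CONFIRM: Byte = 3         // second device -> first device: keep the target image
    const val CANCEL: Byte = 4          // second device -> first device: discard the target image
}

// Length-prefixed framing: a 1-byte opcode, a 4-byte payload length, then the payload.
fun send(out: DataOutputStream, op: Byte, payload: ByteArray = ByteArray(0)) {
    out.writeByte(op.toInt())
    out.writeInt(payload.size)
    out.write(payload)
    out.flush()
}

// First-device side: after sending TARGET_IMAGE, wait for CONFIRM or CANCEL and
// save or drop the captured bytes accordingly.
fun awaitDecision(input: DataInputStream, targetImage: ByteArray, save: (ByteArray) -> Unit) {
    val op = input.readByte()
    input.skipBytes(input.readInt())          // decision messages carry no payload here
    when (op) {
        Op.CONFIRM -> save(targetImage)       // persist the target image
        Op.CANCEL -> Unit                     // do not save; return to the preview
        else -> error("unexpected opcode $op")
    }
}
```

The same assumed framing is reused in the later sketch of the second device's receive loop under the fifth aspect.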
In one possible implementation, the first preview interface further includes a first control, and the first electronic device and the second electronic device establish a communication connection, including: when the first electronic device receives the triggering operation aiming at the first control, the first electronic device establishes communication connection with the second electronic device.
In this way, the photographer can establish the communication connection between the first electronic device and the second electronic device simply by triggering the first control, which is convenient and fast.
Further, when the first electronic device receives a trigger operation on the first control, the first electronic device establishing a communication connection with the second electronic device includes: when the first electronic device receives the trigger operation on the first control, the first electronic device turns on the short-range communication function and broadcasts the identifier of the first electronic device. The second electronic device detects the broadcast of the first electronic device and displays the identifier of the first electronic device. When the second electronic device receives a trigger operation on the identifier of the first electronic device, the first electronic device establishes the communication connection with the second electronic device.
In this way, the photographer can establish the communication connection between the first electronic device and the second electronic device simply by triggering the first control, which is convenient and fast.
Further, when the first electronic device receives a trigger operation on the first control, the first electronic device turning on the short-range communication function and broadcasting the identifier of the first electronic device includes: when the first electronic device receives the trigger operation on the first control, the first electronic device displays a hotspot setting interface, where the hotspot setting interface includes a button for turning on the hotspot; and when the first electronic device receives a trigger operation on the button for turning on the hotspot, the first electronic device turns on the short-range communication function and broadcasts the identifier of the first electronic device. When the second electronic device receives a trigger operation on the identifier of the first electronic device, the first electronic device establishing the communication connection with the second electronic device includes: when the second electronic device receives the trigger operation on the identifier of the first electronic device, the second electronic device displays a first input box; and when the hotspot key is obtained in the first input box, the second electronic device establishes the communication connection with the first electronic device.
In this way, the second electronic device establishes the communication connection with the first electronic device only after it obtains the hotspot key, which improves the privacy of the photographing process.
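As one possible realization of the hotspot steps on an Android phone, the sketch below uses the platform's local-only hotspot API to obtain an SSID (playing the role of the broadcast identifier) and a pre-shared key (the hotspot key entered in the first input box). Whether the first electronic device actually uses this API is an assumption; the call requires API level 26 or higher and the appropriate WiFi and location permissions.

```kotlin
import android.content.Context
import android.net.wifi.WifiManager

// A minimal sketch of "turn on the hotspot and broadcast the device identifier",
// assuming the first electronic device is an Android phone. This is illustrative only.
fun openHotspot(context: Context, onReady: (ssid: String?, key: String?) -> Unit) {
    val wifi = context.getSystemService(Context.WIFI_SERVICE) as WifiManager
    wifi.startLocalOnlyHotspot(object : WifiManager.LocalOnlyHotspotCallback() {
        override fun onStarted(reservation: WifiManager.LocalOnlyHotspotReservation) {
            // getWifiConfiguration() is deprecated on recent API levels but still shows the idea:
            // the SSID plays the role of the broadcast identifier, and the pre-shared key is the
            // "hotspot key" the person being photographed types into the first input box.
            val config = reservation.wifiConfiguration
            onReady(config?.SSID, config?.preSharedKey)
        }

        override fun onFailed(reason: Int) {
            onReady(null, null) // hotspot could not be started
        }
    }, null)
}
```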
Still further, the second preview interface further includes a second control, and the second electronic device detecting the broadcast of the first electronic device and displaying the identifier of the first electronic device includes: when the second electronic device receives a trigger operation on the second control, the second electronic device displays a hotspot search interface; the second electronic device detects the identifier broadcast by the first electronic device and displays the identifier of the first electronic device on the hotspot search interface.
In this way, the person being photographed can establish the communication connection between the first electronic device and the second electronic device simply by triggering the second control, which is convenient and fast.
In another optional implementation, the second preview interface further includes a second control, and when the first electronic device receives a trigger operation on the first control, the first electronic device establishing a communication connection with the second electronic device includes: when the first electronic device receives the trigger operation on the first control, the first electronic device displays a hotspot search interface, where the hotspot search interface includes an identifier of a target WiFi. When the first electronic device receives a trigger operation on the identifier of the target WiFi, the first electronic device accesses the target WiFi. The first electronic device sends, to a server of the target WiFi, information indicating that the first electronic device is the main shooting device. When the second electronic device receives a trigger operation on the second control, the second electronic device displays a second hotspot search interface, where the second hotspot search interface includes the identifier of the target WiFi. When the second electronic device receives a trigger operation on the identifier of the target WiFi, the second electronic device accesses the target WiFi. The second electronic device sends, to the server of the target WiFi, information indicating that the second electronic device is the photographed device. The first electronic device establishes the communication connection with the second electronic device in response to a connection instruction, received from the server of the target WiFi, for connecting with the second electronic device.
In this way, when the first electronic device serves as the main shooting device, the second electronic device serves as the photographed device, and both devices are connected to the same target WiFi, the communication connection between the first electronic device and the second electronic device can be established, which is convenient and fast.
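A minimal sketch of the role-registration step in this variant is shown below, assuming both devices have already joined the target WiFi. The rendezvous server's address, port, and the one-line text protocol are illustrative assumptions; the application only requires that each device report its role and that the server then instructs the two devices to connect.

```kotlin
import java.io.PrintWriter
import java.net.Socket

// Roles declared to the server of the target WiFi.
enum class Role { MAIN_SHOOTING_DEVICE, PHOTOGRAPHED_DEVICE }

// Registers this device's role with an assumed rendezvous server and returns the server's
// reply, which is assumed to eventually name the peer to connect to (e.g. "PEER <ip> <port>").
fun registerRole(serverHost: String, serverPort: Int, deviceId: String, role: Role): String =
    Socket(serverHost, serverPort).use { socket ->
        PrintWriter(socket.getOutputStream(), true).println("REGISTER $deviceId ${role.name}")
        socket.getInputStream().bufferedReader().readLine() ?: ""
    }
```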
In another optional implementation, the first electronic device obtaining a shooting instruction includes: the first electronic device generates the shooting instruction in response to a trigger operation on the shooting button in the first preview interface; in this way, the shooting instruction can be triggered manually by the photographer. Alternatively, the second electronic device sends the shooting instruction to the first electronic device in response to a trigger operation on a side key, and the first electronic device receives the shooting instruction; in this way, the shooting instruction can be triggered manually by the person being photographed. Alternatively, the first electronic device recognizes voice information instructing shooting and generates the shooting instruction; in this way, the shooting instruction can be triggered by the voice of the photographer. Alternatively, the second electronic device recognizes voice information instructing shooting and sends the shooting instruction to the first electronic device, and the first electronic device receives the shooting instruction; in this way, the shooting instruction can be triggered by the voice of the person being photographed. Alternatively, the first electronic device recognizes a body movement instructing shooting and generates the shooting instruction; in this way, the shooting instruction can be triggered by a body movement of the person being photographed.
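The different ways of producing the shooting instruction can all feed a single capture path on the first electronic device, as in the sketch below; the trigger names are illustrative assumptions and not an API defined by the application.

```kotlin
// Possible sources of the shooting instruction (assumed names for illustration).
sealed interface ShootTrigger {
    object LocalShutterButton : ShootTrigger                  // photographer taps the shooting button
    object RemoteSideKey : ShootTrigger                       // instruction received from the second device
    data class VoiceCommand(val phrase: String) : ShootTrigger
    data class BodyGesture(val gesture: String) : ShootTrigger
}

// Whatever produced the instruction, the first electronic device performs the same capture.
class ShootingController(private val takePicture: () -> ByteArray) {
    fun onTrigger(trigger: ShootTrigger): ByteArray {
        println("shooting instruction from: $trigger")  // every source leads to the same capture
        return takePicture()
    }
}
```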
In an optional implementation, after the first electronic device saves the target image in response to the confirmation instruction, or after the first electronic device does not save the target image in response to the cancel instruction, the photographing method provided by the application further includes: the first electronic device displays the first preview interface, where the first preview interface includes a second preview image acquired by the first electronic device with the camera; the first electronic device synchronizes the second preview image to the second electronic device.
In this way, after the previously captured target image has been processed, the next target image can be captured.
In an optional implementation, the second preview interface further includes a confirm control and a discard control. The second electronic device receiving the confirmation instruction includes: the second electronic device receives the confirmation instruction in response to a trigger operation on the confirm control. Alternatively, the second electronic device receiving the cancel instruction includes: the second electronic device receives the cancel instruction in response to a trigger operation on the discard control.
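A small sketch of how the confirm control and the discard control on the second electronic device could map to the two instructions sent back over the connection; sendDecision is an assumed helper that serializes and transmits the chosen instruction.

```kotlin
// Wiring of the two controls to the instructions returned to the first electronic device.
class DecisionControls(private val sendDecision: (confirmed: Boolean) -> Unit) {
    fun onConfirmControlTapped() = sendDecision(true)   // confirmation instruction: keep the target image
    fun onDiscardControlTapped() = sendDecision(false)  // cancel instruction: discard the target image
}
```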
In an optional implementation, the first preview interface further includes a switching control, where the switching control is used to instruct switching of the photographing identity, and the method provided by the present application further includes: when the first electronic device receives a trigger operation on the switching control, the first electronic device sends a switching instruction to the second electronic device. The second electronic device responds to the switching instruction by displaying prompt information prompting that the first electronic device requests to switch the photographing identity. When the second electronic device receives a trigger operation agreeing to switch the photographing identity, the second electronic device switches to being the main shooting device and sends an agreement instruction to the first electronic device. Upon receiving the agreement instruction from the second electronic device, the first electronic device switches to being the photographed device.
In this way, the second electronic device can be switched from the photographed device to the main shooting device for image acquisition and shooting, and the first electronic device can be switched from the main shooting device to the photographed device, which synchronously displays the images acquired by the second electronic device.
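The identity switch can be sketched as a two-message handshake, as below; the message and role names are assumptions used only to illustrate that the switch takes effect on the first electronic device only after the agreement instruction arrives.

```kotlin
// Assumed messages for the identity-switch handshake.
sealed interface SwitchMessage {
    object RequestSwitch : SwitchMessage   // first device -> second device
    object AgreeSwitch : SwitchMessage     // second device -> first device
}

enum class PhotoRole { MAIN_SHOOTING, PHOTOGRAPHED }

class RoleHolder(var role: PhotoRole) {
    // Called on the first electronic device when a message arrives from the second device.
    fun onMessage(msg: SwitchMessage) {
        if (msg is SwitchMessage.AgreeSwitch && role == PhotoRole.MAIN_SHOOTING) {
            role = PhotoRole.PHOTOGRAPHED   // stop capturing, start mirroring the peer's preview
        }
    }
}
```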
In a second aspect, an embodiment of the present application further provides an electronic device. The electronic device includes a camera and further includes: a first communication unit, configured to establish a communication connection with a second electronic device; and a first display unit, configured to display a first preview interface, where the first preview interface includes a first preview image acquired by the camera. The first communication unit is further configured to synchronize the first preview image to the second electronic device through the communication connection. A first processing unit is configured to obtain a shooting instruction and capture a target image with the camera. The first communication unit is further configured to send the target image to the second electronic device. The first communication unit is further configured to receive a confirmation instruction from the second electronic device, where the confirmation instruction is used to confirm the target image; the first processing unit is further configured to respond to the confirmation instruction; and a first storage unit is configured to save the target image. Alternatively, the first communication unit is further configured to receive, from the second electronic device, a cancel instruction for the target image, where the cancel instruction is used to cancel the target image; and the first processing unit is further configured to not save the target image in response to the cancel instruction.
In an optional implementation manner, the first communication unit is specifically configured to establish a communication connection with the second electronic device when a trigger operation for the first control is received.
Further, the first communication unit is specifically configured to: when a trigger operation on the first control is received, turn on the short-range communication function and broadcast an identifier of the electronic device; and when the second electronic device receives a trigger operation on the identifier of the electronic device, establish a communication connection with the second electronic device.
Furthermore, the first display unit is further configured to display a hotspot setting interface when a trigger operation on the first control is received, where the hotspot setting interface includes a button for turning on the hotspot. The first communication unit is specifically configured to, when a trigger operation on the button for turning on the hotspot is received, turn on the short-range communication function and broadcast the identifier of the electronic device.
In an optional implementation manner, the first display unit is further configured to display a hotspot search interface when a trigger operation for the first control is received, where the hotspot search interface includes an identifier of the target WiFi. The first communication unit is further used for accessing the target WiFi when the trigger operation aiming at the identification of the target WiFi is received. The first communication unit is further used for sending information used for indicating that the electronic equipment is the main shooting equipment to a server of the target WiFi. The first communication unit is further used for receiving a connection instruction with the second electronic device from the server of the target WiFi. And the first communication unit is also used for responding to the connection instruction and establishing communication connection with the second electronic equipment.
In an optional implementation manner, the first processing unit is specifically configured to generate the shooting instruction in response to a trigger of a shooting button in the first preview interface. Or, the first communication unit is used for receiving a shooting instruction from the second electronic equipment. Or the first processing unit is specifically used for recognizing voice information used for instructing shooting and generating a shooting instruction. Or, the first processing unit is specifically configured to recognize a limb movement for instructing the photographing, and generate a photographing instruction.
In an optional implementation manner, the first display unit is further configured to display a first preview interface, where the first preview interface includes a second preview image acquired by the electronic device using a camera. And the first communication unit is also used for synchronizing the second preview image to the second electronic equipment.
In an optional implementation manner, the first preview interface further includes a switching control, and the switching control is used to instruct to switch the photographing identity. The first communication unit is further used for sending a switching instruction to the second electronic device when receiving the triggering operation of the switching control. The first communication unit is also used for receiving the agreement instruction from the second electronic equipment to switch to the shot equipment.
In a third aspect, an embodiment of the present application further provides an electronic device, where the electronic device includes: a second communication unit, configured to establish a communication connection with a first electronic device. The second communication unit is further configured to receive, through the communication connection, a first preview image synchronized by the first electronic device. A second display unit is configured to display a second preview interface, where the second preview interface includes the first preview image. The second communication unit is further configured to receive a captured target image from the first electronic device. The second display unit is further configured to display the target image on the second preview interface. The second communication unit is further configured to receive a confirmation instruction and send the confirmation instruction to the first electronic device, where the confirmation instruction is used to confirm the target image. Alternatively, the second communication unit is further configured to receive a cancel instruction and send the cancel instruction to the first electronic device, where the cancel instruction is used to cancel the target image.
In an optional implementation manner, the second communication unit is specifically configured to, in a case that the broadcast of the first electronic device is detected, display an identifier of the first electronic device. And when a trigger operation of the identification of the first electronic equipment is received, establishing communication connection with the first electronic equipment.
Further, the second display unit is further configured to display a first input box when a trigger operation on the identifier of the first electronic device is received. The second communication unit is specifically configured to establish the communication connection with the first electronic device when the hotspot key is obtained in the first input box.
Further, the second preview interface further includes a second control, and the second display unit is specifically configured to display a hotspot search interface when a trigger operation on the second control is received, and, when the identifier broadcast by the first electronic device is detected, display the identifier of the first electronic device on the hotspot search interface.
In an optional implementation manner, the second preview interface further includes a second control, and the second display unit is further configured to display a second hotspot search interface when a trigger operation for the second control is received, where the second hotspot search interface includes an identifier of the target WiFi. The second communication unit is further configured to access the target WiFi when receiving a trigger operation for the identifier of the target WiFi. And the second communication unit is further used for sending information used for indicating that the electronic equipment is the photographed equipment to a server of the target WiFi. And the second communication unit is also used for receiving a connection instruction with the first electronic device from the server of the target WiFi. And the second communication unit is also used for responding to the connection instruction and establishing communication connection with the first electronic equipment.
In an optional implementation manner, the second communication unit is further configured to send a shooting instruction to the first electronic device in response to a triggering operation of the side key. Or the second communication unit is also used for sending a shooting instruction to the first electronic equipment when the voice information used for instructing shooting is identified.
In an optional implementation, the second communication unit is further configured to receive a second preview image synchronized from the first electronic device.
In an optional implementation, the second preview interface further includes a confirm control and a discard control, and the electronic device further includes: a second processing unit, specifically configured to receive a confirmation instruction in response to a trigger operation on the confirm control, or to receive a cancel instruction in response to a trigger operation on the discard control.
In an optional implementation manner, the second communication unit is further configured to receive a handover instruction from the first electronic device. And the second display unit is also used for responding to the switching instruction and displaying prompt information for prompting the first electronic equipment to request to switch the photographing identity. The electronic device further includes: the second processing unit is used for switching to the main shooting device when receiving the triggering operation of agreeing to switch the shooting identity; and the second communication unit is further used for sending an agreement instruction to the first electronic device, wherein the agreement instruction is used for indicating the first electronic device to be switched into the photographed device.
In a fourth aspect, the present application further provides a photographing method applied to a first electronic device, where the first electronic device includes a camera. The method comprises the following steps: the first electronic device and the second electronic device establish a communication connection. The first electronic equipment displays a first preview interface, and the first preview interface comprises a first preview image acquired by a camera. The first electronic device synchronizes the first preview image to the second electronic device through the communication connection. The first electronic equipment obtains a shooting instruction and obtains a target image by shooting through the camera. The first electronic device sends the target image to the second electronic device. The first electronic device receives a confirmation instruction from the second electronic device, wherein the confirmation instruction is used for confirming the target image. The first electronic device saves the target image in response to the confirmation instruction. Or the first electronic equipment receives a cancel instruction of the target image from the second electronic equipment, wherein the cancel instruction is used for canceling the target image; and in response to the cancel instruction, the target image is not saved.
In a fifth aspect, the present application further provides a photographing method applied to a second electronic device. The method comprises the following steps: the second electronic device establishes a communication connection with the first electronic device. The second electronic device receives the first preview image synchronized by the first electronic device through the communication connection. And the second electronic equipment displays a second preview interface, wherein the second preview interface comprises the first preview image. The second electronic equipment receives the target image shot from the first electronic equipment. And the second electronic equipment displays the target image on the second preview interface. And the second electronic equipment receives the confirmation instruction and sends the confirmation instruction to the first electronic equipment, wherein the confirmation instruction is used for confirming the target image. Or the second electronic device receives a cancel instruction and sends the cancel instruction to the first electronic device, wherein the cancel instruction is used for canceling the target image.
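For the fifth aspect, a minimal receive loop on the second electronic device might look like the sketch below, reusing the opcodes assumed in the earlier message-format sketch: preview frames are displayed as they arrive, and a received target image is displayed so that the user can then confirm or cancel it.

```kotlin
import java.io.DataInputStream

// Second-device (photographed device) receive loop; opcodes 1 and 2 follow the assumed
// framing introduced earlier (1-byte opcode, 4-byte length, payload).
fun subjectDeviceLoop(
    input: DataInputStream,
    showPreview: (ByteArray) -> Unit,
    showTarget: (ByteArray) -> Unit
) {
    while (true) {
        val op = input.readByte()
        val payload = ByteArray(input.readInt()).also { input.readFully(it) }
        when (op.toInt()) {
            1 -> showPreview(payload)   // PREVIEW_FRAME: update the second preview interface
            2 -> showTarget(payload)    // TARGET_IMAGE: UI then sends CONFIRM or CANCEL back
            else -> return              // unknown opcode: stop, for this illustration
        }
    }
}
```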
In a sixth aspect, an embodiment of the present application further provides a photographing apparatus, including a processor and a memory, where the memory is used to store code instructions; the processor is configured to execute the code instructions to cause the electronic device to perform the photographing method as described in the first aspect or any implementation manner of the first aspect.
In a seventh aspect, an embodiment of the present application further provides a computer-readable storage medium, where instructions are stored, and when the instructions are executed, the instructions cause a computer to execute the photographing method described in the first aspect or any implementation manner of the first aspect.
In an eighth aspect, embodiments of the present application further provide a computer program product, which includes a computer program and when the computer program is executed, causes a computer to execute the photographing method described in the first aspect or any implementation manner of the first aspect.
It should be understood that the second aspect to the eighth aspect of the present application correspond to the technical solutions of the first aspect of the present application, and the beneficial effects achieved by the aspects and the corresponding possible implementations are similar and will not be described again.
In addition, the first control may be the trigger region of "main shooting pairing" in the following embodiments, the second control may be the trigger region of "photographed pairing" in the following embodiments, and the button for turning on the hotspot may be the first switch button 207 in the following embodiments.
Drawings
Fig. 1 is a schematic view of a scene where a user a and a user B are located according to an embodiment of the present application;
FIG. 2 is a schematic interface diagram of a photographing method;
FIG. 3 is a diagram of a hardware system architecture of an electronic device;
FIG. 4 is a diagram of a software system architecture of an electronic device;
fig. 5 is a schematic interface diagram of the mobile phone 100 during the pairing process;
fig. 6 is a schematic interface diagram of the mobile phone 200 during the pairing process;
fig. 7 is a schematic interface diagram of the mobile phone 100 and the mobile phone 200 during the photographing process;
fig. 8 is a schematic diagram of an interface for triggering the user B to trigger the button of the mobile phone 200 to take a picture;
FIG. 9 is a schematic view of an interface for user B to give up taking a photo;
FIG. 10 is a schematic view of an interface for user B to confirm a photograph taken;
fig. 11 is an interface diagram illustrating that the mobile phone 100 is switched to be the photographed device and the mobile phone 200 is switched to be the main photographing device to photograph the user a;
fig. 12 is a schematic interface diagram of the user A confirming the photo taken;
Fig. 13 is a flowchart of a photographing method according to an embodiment of the present application;
fig. 14 is a schematic structural diagram of an electronic device 1400 provided in an embodiment of the present application;
fig. 15 is a schematic structural diagram of an electronic device 1500 according to an embodiment of the present disclosure;
fig. 16 is a schematic hardware structure diagram of an electronic device according to an embodiment of the present disclosure;
fig. 17 is a schematic structural diagram of a chip according to an embodiment of the present application.
Detailed Description
In the embodiments of the present application, terms such as "first" and "second" are used to distinguish identical or similar items having substantially the same function and effect. For example, a first value and a second value are merely used to distinguish different values, and no order between them is limited. Those skilled in the art will appreciate that the terms "first," "second," and the like do not limit quantity or execution order, and do not indicate a difference in importance.
It is noted that, in the present application, words such as "exemplary" or "for example" are used to mean exemplary, illustrative, or descriptive. Any embodiment or design described herein as "exemplary" or "e.g.," is not necessarily to be construed as preferred or advantageous over other embodiments or designs. Rather, use of the word "exemplary" or "such as" is intended to present concepts related in a concrete fashion.
In the present application, "at least one" means one or more, "a plurality" means two or more. "and/or" describes the association relationship of the associated objects, meaning that there may be three relationships, e.g., a and/or B, which may mean: a exists alone, A and B exist simultaneously, and B exists alone, wherein A and B can be singular or plural. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship. "at least one of the following" or similar expressions refer to any combination of these items, including any combination of the singular or plural items. For example, at least one (one) of a, b, or c, may represent: a, b, c, a-b, a-c, b-c, or a-b-c, wherein a, b, c may be single or multiple.
In daily life, when people encounter beautiful scenery or memorable moments, they can record them by taking a photograph. As shown in fig. 1, when user B stands at the photographing position and strikes a pose for the photograph, user A can take out an electronic device to photograph user B. As shown in fig. 2 (a), the electronic device of user A displays an icon 202 of a camera application on a system desktop 201. User A can hold the electronic device so that its camera is aimed at user B, and in response to user A's trigger operation on the icon 202 of the camera application, the electronic device opens the camera application and displays an image preview interface. As shown in (b) of fig. 2, the first preview interface 203 includes a shooting button 204, a display frame 205, and the captured first preview image. The captured first preview image includes the posture of user B and the scenery at user B's location, and the display frame 205 is used to display a thumbnail of the captured photograph. In response to the user's trigger operation on the shooting button 204, the electronic device photographs the captured first preview image, thereby completing the shot. As shown in fig. 2 (c), a thumbnail of the taken photograph is displayed within the display frame 205. As shown in (d) of fig. 2, in response to user A's trigger operation on the display frame 205, the electronic device may display the taken photograph on the image display interface 206.
However, user A typically cannot know user B's shooting intention, for example, whether user B's pose is in place or whether user B's position is reasonable. As a result, the photograph taken by user A matches user B's shooting intention poorly, and user B is dissatisfied with the taken photograph.
In view of this, an embodiment of the present application provides a photographing method in which a one-to-one paired communication connection is first established between the electronic device of user A and the electronic device of user B. When user B stands at the photographing position and strikes a pose for the photograph, the electronic device of user B may open the camera application and display an image preview interface in response to user B's trigger operation. The electronic device of user A may display the captured first preview image according to steps (a)-(c) in fig. 2, where the captured first preview image includes the posture of user B and the scenery at user B's location. The electronic device of user A displays the captured first preview image and also sends it to the electronic device of user B. The image preview interface of the electronic device of user B can thus also show the first preview image captured by the electronic device of user A. In this way, user B can adjust posture, position, and the like based on the captured first preview image, so that the shot composition matches user B's shooting intention. The electronic device of user A may then complete the photographing in response to a trigger operation by user A or upon receiving a shooting instruction from the electronic device of user B. Understandably, the taken photograph then matches user B's shooting intention well, improving user B's satisfaction with the taken photograph.
It is understood that the electronic device may be a mobile phone, a wearable device, a tablet computer (Pad), a computer with a wireless transceiving function, a virtual reality (VR) terminal device, an augmented reality (AR) terminal device, and so on. The embodiments of the present application do not limit the specific technology or the specific device form adopted by the electronic device.
In order to better understand the embodiments of the present application, the structure of the electronic device according to the embodiments of the present application is described below. Fig. 3 is a schematic structural diagram of an electronic device to which the embodiments of the present application are applied. As shown in fig. 3, the electronic device 100 may include a processor 110, an internal memory 121, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a microphone 170C, a key 190, a camera 193, and a display screen 194. It is to be understood that the illustrated structure of this embodiment does not constitute a specific limitation on the electronic device 100. In other embodiments of the present application, the electronic device 100 may include more or fewer components than shown, or some components may be combined, some components may be split, or the components may be arranged differently. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 110 may include one or more processing units, such as: the processor 110 may include an Application Processor (AP), a modem processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a video codec, a Digital Signal Processor (DSP), a baseband processor, a Display Processing Unit (DPU), and/or a neural-Network Processing Unit (NPU), etc. The different processing units may be separate devices or may be integrated into one or more processors. In some embodiments, the electronic device 100 may also include one or more processors 110. The processor may be, among other things, a neural center and a command center of the electronic device 100. The processor can generate an operation control signal according to the instruction operation code and the timing signal to complete the control of instruction fetching and instruction execution. A memory may also be provided in processor 110 for storing instructions and data.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modem processor, the baseband processor, and the like. Antenna 1 and antenna 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover a single communication band or multiple communication bands. Different antennas may also be multiplexed to improve antenna utilization. For example, antenna 1 and antenna 2 may transmit Bluetooth signals or WiFi signals.
The mobile communication module 150 may provide a solution including 2G/3G/4G/5G wireless communication applied to the electronic device 100. The mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier, etc. The mobile communication module 150 may receive the electromagnetic wave from the antenna 1, filter, amplify, etc. the received electromagnetic wave, and transmit the electromagnetic wave to the modem processor for demodulation. The mobile communication module 150 may also amplify the signal modulated by the modem processor, and convert the signal into electromagnetic wave through the antenna 1 to radiate the electromagnetic wave. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating a low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then passes the demodulated low frequency baseband signal to a baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs a sound signal through an audio device (not limited to the speaker 170A, the receiver 170B, etc.) or displays an image or video through the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 150 or other functional modules, independent of the processor 110.
The wireless communication module 160 may provide solutions for wireless communication applied to the electronic device 100, including wireless local area network (WLAN), Bluetooth, global navigation satellite system (GNSS), frequency modulation (FM), NFC, infrared (IR), and the like. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering processing on the electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, perform frequency modulation and amplification on the signal, and convert the signal into electromagnetic waves through the antenna 2 for radiation.
In some embodiments, the antenna 1 of the electronic device 100 is coupled to the mobile communication module 150 and the antenna 2 is coupled to the wireless communication module 160, so that the electronic device 100 can communicate with networks and other devices through wireless communication technologies. The wireless communication technologies may include GSM, GPRS, CDMA, WCDMA, TD-SCDMA, LTE, GNSS, WLAN, NFC, FM, and/or IR technologies, among others. The GNSS may include a global positioning system (GPS), a global navigation satellite system (GLONASS), a BeiDou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS), and/or a satellite based augmentation system (SBAS).
The electronic device 100 may implement display functions via the GPU, the display screen 194, and the application processor. The application processor may include an NPU and a DPU. The GPU is a microprocessor for image processing and is connected to the display screen 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute instructions to generate or change display information. The NPU is a neural-network (NN) computing processor that processes input information quickly by drawing on the structure of biological neural networks, for example the transfer mode between neurons in the human brain, and can also learn continuously by itself. Applications such as intelligent recognition of the electronic device 100 can be realized through the NPU, for example image recognition, face recognition, speech recognition, and text understanding. The DPU, also called a display sub-system (DSS), is used to adjust the color of the display screen 194, which it may do through a three-dimensional look-up table (3D LUT). The DPU may also perform scaling, noise reduction, contrast enhancement, backlight brightness management, HDR processing, Gamma adjustment of display parameters, and the like on the picture.
The display screen 194 is used to display images, video, and the like. The display screen 194 includes a display panel. The display panel may adopt a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini LED, a Micro LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), and the like. In some embodiments, the electronic device 100 may include 1 or N display screens 194, where N is a positive integer greater than 1.
The electronic device 100 may implement a capture function via the ISP, one or more cameras 193, video codec, GPU, one or more display screens 194, and application processor, among others.
The electronic device 100 may implement audio functions, such as music playing and voice capturing, through the audio module 170, the speaker 170A, the microphone 170C, the application processor, and the like. The audio module 170 is configured to convert digital audio information into an analog audio signal for output, and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or some functional modules of the audio module 170 may be disposed in the processor 110. The speaker 170A, also called a "loudspeaker", is used to convert an audio electrical signal into a sound signal. The microphone 170C may collect voice.
The software system of the electronic device 100 may employ a layered architecture, an event-driven architecture, a micro-kernel architecture, a micro-service architecture, or a cloud architecture. The embodiments of the present application take an Android system with a layered architecture as an example to illustrate the software structure of the electronic device 100. Fig. 4 is a block diagram of the software structure of an electronic device to which the embodiments of the present application are applied. The layered architecture divides the software system of the electronic device 100 into several layers, each with a clear role and division of labor. The layers communicate with each other through software interfaces. In some embodiments, the Android system may be divided into five layers: an application layer (applications), an application framework layer (application framework), an Android runtime and system libraries, a hardware abstraction layer (HAL), and a kernel layer (kernel).
The application layer may include a series of application packages, and the application layer runs applications by calling the application programming interfaces (APIs) provided by the application framework layer. As shown in fig. 4, the application packages may include applications such as camera, gallery, WLAN, Bluetooth, music, and video.
The application framework layer provides an API and programming framework for the applications of the application layer. The application framework layer includes a number of predefined functions. As shown in FIG. 4, the application framework layers may include a window manager, content provider, view system, phone manager, resource manager, notification manager, and the like.
The window manager is used for managing window programs. The window manager can obtain the size of the display screen, determine whether a status bar exists, lock the screen, take screenshots, and the like. The content provider is used to store and retrieve data and make it accessible to applications. The data may include video, images, and the like. The view system includes visual controls, such as controls for displaying text and controls for displaying pictures. The view system may be used to build applications. A display interface may be composed of one or more views. The resource manager provides various resources, such as localized strings, icons, pictures, layout files, and video files, to applications. The notification manager allows an application to display notification information in the status bar.
The surface manager is used to manage the display subsystem and provide fusion of 2D and 3D layers for multiple applications. The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, composition, layer processing and the like. The 2D graphics engine is a drawing engine for 2D drawing.
The hardware abstraction layer can contain a plurality of library modules, and the library modules can be a camera library module, an image library module and the like. The Android system can load corresponding library modules for the equipment hardware, and then the purpose that the application program framework layer accesses the equipment hardware is achieved. The device hardware may include, for example, a display screen, a camera, etc. in the electronic device.
The kernel layer is a layer between hardware and software. The kernel layer is used for driving hardware so that the hardware works. The kernel layer at least includes a display driver, a camera driver, and the like, which is not limited in the embodiments of the present application.
The following describes the technical solutions of the present application and how to solve the above technical problems with specific embodiments. The following embodiments may be implemented independently or in combination, and details of the same or similar concepts or processes may not be repeated in some embodiments.
The following describes the photographing method provided in an embodiment of the present application, taking user A as the photographer, user B as the person being photographed, the electronic device of user A as the mobile phone 100, and the electronic device of user B as the mobile phone 200 as an example.
Before taking a picture, the mobile phone 100 of user A and the mobile phone 200 of user B need to establish a one-to-one communication connection in advance to complete the pairing of the mobile phone 200 with the mobile phone 100. Figs. 5-6 are schematic diagrams of the interfaces of the mobile phone 100 and the mobile phone 200 during the pairing process.
Illustratively, as shown in fig. 5 (a), the mobile phone 100 of user A displays an icon 202 of a camera application on a system desktop 201. In response to user A's trigger operation on the icon 202 of the camera application, the mobile phone 100 opens the camera application to display a first preview interface, and the mobile phone 100 starts to acquire a first preview image. User A may hold the mobile phone 100 so that its camera is aimed at user B, so that the captured first preview image includes user B. As shown in fig. 5 (b), the first preview interface 203 includes the captured first preview image, a shooting button 204, a display frame 205, a trigger area of "main shooting pairing", and a trigger area of "photographed pairing". The captured first preview image includes the posture of user B and the scenery at user B's location, and the display frame 205 is used to display a thumbnail of the captured target image.
In one embodiment, as shown in (b) and (c) of fig. 5, the mobile phone 100, acting as the main shooting device, may jump to display the hotspot setting interface 206 in response to user A's trigger operation on the trigger area of "main shooting pairing". In another embodiment, the mobile phone 100, acting as the main shooting device, may instead display a text prompt message (not shown in fig. 5) of "please turn on the hotspot" on the first preview interface 203 in response to user A's trigger operation on the trigger area of "main shooting pairing". In this case, after viewing the text prompt message "please turn on the hotspot", the user can open the hotspot setting interface 206 by himself or herself. A first switch button 207 for turning the WiFi hotspot on or off is displayed on the hotspot setting interface 206, and the first switch button 207 is initially in the off state. As shown in (c) and (d) of fig. 5, the mobile phone 100 may turn on the first switch button 207 in response to user A's trigger operation on the first switch button 207. The mobile phone 100 can then convert a received mobile communication signal (such as GPRS, 3G, 4G, or 5G) into a WiFi signal for broadcasting. The broadcast WiFi signal carries the identifier "HONOR-3C" of the mobile phone 100.
As shown in fig. 6 (a), the mobile phone 200 of the user B also displays an icon 602 of the camera application on a system desktop 601. The mobile phone 200 may open the camera application and display a second preview interface 603 in response to the trigger operation of the user B on the icon 602 of the camera application. As shown in fig. 6 (b), a trigger area of "photographed pairing" and a trigger area of "master shooting pairing" are displayed in the second preview interface 603. In one embodiment, the mobile phone 200, in response to the trigger operation of the user B on the "photographed pairing" trigger area, serves as a photographed device and opens the hotspot search interface 604 to search for nearby WiFi signals. In another embodiment, the mobile phone 200 may, in response to the trigger operation of the user B on the trigger area of the "photographed pairing", serve as the photographed device and display a text prompt message of "please open the hotspot search interface", so that after viewing this message the user can open the hotspot search interface 604 of the mobile phone 200 by himself, and the mobile phone 200 searches for nearby WiFi signals. Further, as shown in (c) of fig. 6, a list of searched hotspot names is displayed on the hotspot search interface 604, where the list includes the identifier "HONOR-3C" of the mobile phone 100.
As shown in fig. 6 (d), the mobile phone 200 may display a key input interface 606 in response to the trigger operation of the user B on the identifier "HONOR-3C". A first input box 607 and a join button 608 are included in the key input interface 606. The mobile phone 200 receives the key for connecting to the WiFi hotspot of the mobile phone 100, which is input by the user B in the first input box 607, and then the mobile phone 200 sends a pairing request to the mobile phone 100 in response to the trigger operation of the user B on the join button 608. The pairing request carries the communication address of the mobile phone 200 and the identifier "HONOR 50".
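As a counterpart to the sketch above, the photographed device might send its identifier and communication address as follows; the message format and port are again illustrative assumptions.

```kotlin
import java.net.Socket

// The photographed device (e.g. "HONOR 50") sends its identifier and
// communication address to the master shooting device over the hotspot.
fun sendPairingRequest(masterIp: String, ownAddress: String,
                       identifier: String = "HONOR 50", port: Int = 52000) {
    Socket(masterIp, port).use { socket ->
        socket.getOutputStream().bufferedWriter().apply {
            write("$identifier|$ownAddress\n")
            flush()
        }
    }
}
```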
As shown in fig. 5 (e), the mobile phone 100 receives the pairing request from the mobile phone 200 and displays a first prompt message 208 in the hotspot setting interface 206. Within the first prompt message 208 are displayed a text message "to be paired with: HONOR 50", a cancel button, and a confirmation control 209. The mobile phone 100 may, in response to the trigger operation of the user A on the confirmation control 209, establish a one-to-one communication connection with the mobile phone 200 according to the communication address of the mobile phone 200, thereby completing pairing. As shown in fig. 5 (f), after the pairing is successful, the mobile phone 100 cancels the display of the first prompt message 208 and displays a second prompt message 210 "pairing with HONOR 50 has been successful" to prompt the user A that the pairing is successful. The second prompt message 210 may also be other text information for indicating that the pairing is successful, which is only an example here.
As shown in (e) in fig. 6, after the pairing is successful, the mobile phone 200 displays a third prompt message 609 "pairing with HONOR-3C is successful" on the hotspot search interface 604 to prompt the user B that the pairing is successful. Here, "HONOR-3C" is the identifier of the mobile phone 100, and the third prompt message 609 may also be other text information for indicating that the pairing is successful, which is only an example here.
It should be noted that the WiFi signal may also be replaced by a bluetooth signal, and the like, and is not limited herein.
It should be noted that what is described above with reference to fig. 5 to fig. 6 is only one pairing manner of the mobile phone 200 and the mobile phone 100, and another pairing manner of the mobile phone 200 and the mobile phone 100 may also be:
illustratively, as shown in fig. 5 (b), the mobile phone 100 may, in response to the trigger operation of the user A on the trigger area of the "master shooting pairing", serve as a master shooting device and jump to a hotspot search interface (not shown in fig. 5). In another embodiment, the mobile phone 100 may, in response to the trigger operation of the user A on the trigger area of the "master shooting pairing", serve as the master shooting device and display a text prompt message (not shown in fig. 5) of "please open the hotspot search interface" on the first preview interface 203. In this way, after viewing the text prompt message of "please open the hotspot search interface", the user can open the hotspot search interface (not shown in fig. 5) of the mobile phone 100 by himself. The mobile phone 100 searches for nearby WiFi signals and displays a list of searched hotspot names on the hotspot search interface, where the list includes the name "Wlan 1" of the target WiFi. The mobile phone 100 connects to the target WiFi named "Wlan 1" in response to a trigger operation on the name "Wlan 1" of the target WiFi. In addition, the mobile phone 200 may connect to the target WiFi named "Wlan 1" in the same manner as the mobile phone 100. It is understood that the mobile phones 100 and 200 may also connect to a target WiFi with another name, which is merely an example.
The mobile phone 100 sends the photographing identity of the mobile phone 100 as the master shooting device to the server of "Wlan 1". The mobile phone 200 sends the photographing identity of the mobile phone 200 as a photographed device, a pairing request, and the communication address of the mobile phone 200 to the server of "Wlan 1". When the server of the target WiFi detects that the mobile phone 100 and the mobile phone 200 are connected to the same target WiFi, and that the mobile phone 100 serves as the master shooting device and the mobile phone 200 serves as the photographed device, the server sends a connection instruction with the mobile phone 200 to the mobile phone 100 and/or sends a connection instruction with the mobile phone 100 to the mobile phone 200.
Taking as an example that the server of the target WiFi sends a connection instruction with the mobile phone 200 to the mobile phone 100, after the mobile phone 100 receives the pairing request, as shown in (e) in fig. 5, the first prompt message 208 is displayed in the hotspot setting interface 206. Within the first prompt message 208 are displayed a text message "to be paired with: HONOR 50", a cancel button, and a confirmation control 209. The mobile phone 100 may send a confirmation instruction to the server of the target WiFi in response to the trigger operation of the user A on the confirmation control 209. The server of the target WiFi receives the confirmation instruction from the mobile phone 100 and issues the communication address of the mobile phone 200 to the mobile phone 100. The mobile phone 100 establishes a one-to-one communication connection with the mobile phone 200 according to the communication address of the mobile phone 200, thereby completing pairing.
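A rough sketch of the matchmaking the "Wlan 1" server could perform is given below; the Registration class and the notify callback are illustrative assumptions and not part of the claimed method.

```kotlin
// Once one device has registered as the master shooting device and another as the
// photographed device on the same WiFi, the server notifies each of the peer's address.
data class Registration(val identifier: String, val address: String, val isMaster: Boolean)

class PairingServer {
    private var master: Registration? = null
    private var photographed: Registration? = null

    fun register(r: Registration, notify: (target: Registration, peer: Registration) -> Unit) {
        if (r.isMaster) master = r else photographed = r
        val m = master ?: return
        val p = photographed ?: return
        notify(m, p)   // connection instruction for the master shooting device
        notify(p, m)   // connection instruction for the photographed device
    }
}
```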
After the mobile phone 200 and the mobile phone 100 are successfully paired, the user can take a picture. The following describes a process of taking a picture of the user B by the mobile phone 100 with reference to fig. 7 to 10.
Illustratively, as shown in fig. 7 (a), after the pairing between the mobile phone 200 and the mobile phone 100 is successful, the mobile phone 100 transmits the acquired first preview image to the mobile phone 200 by wireless transmission based on the established communication connection. Specifically, the wireless transmission manner may be RTP transmission, TCP transmission, or the like, which is not limited here. As shown in fig. 7 (b), the mobile phone 200 receives the first preview image from the mobile phone 100. Further, the mobile phone 200 displays the first preview image on the second preview interface 603 of the mobile phone 200. In this way, the mobile phone 200 displays the first preview image acquired by the mobile phone 100 on the second preview interface 603, that is, the mobile phone 100 synchronizes the first preview image to the mobile phone 200 for display. Thus, the user B can view, in the second preview interface 603 of the mobile phone 200, the posture of the user B and the scenery where the user B is located.
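The following minimal Kotlin sketch shows one possible way to push a preview frame over a plain TCP socket as a length-prefixed JPEG; RTP packetization, frame-rate control, and error handling are omitted, and the framing is an assumption of this sketch.

```kotlin
import android.graphics.Bitmap
import java.io.ByteArrayOutputStream
import java.io.DataOutputStream
import java.net.Socket

// Send one preview frame from the mobile phone 100 to the mobile phone 200:
// a 4-byte length prefix followed by the JPEG-compressed frame.
fun sendPreviewFrame(socket: Socket, frame: Bitmap, quality: Int = 70) {
    val jpeg = ByteArrayOutputStream().also {
        frame.compress(Bitmap.CompressFormat.JPEG, quality, it)
    }.toByteArray()
    DataOutputStream(socket.getOutputStream()).apply {
        writeInt(jpeg.size)   // length prefix
        write(jpeg)           // JPEG payload
        flush()
    }
}
```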
If the user B is dissatisfied with his posture, the personal posture can be adjusted, and the adjusted posture is shown in fig. 7 (c). Further, as shown in (c) of fig. 7, the mobile phone 100 can perform a photographing operation in response to the trigger operation of the user A on the shooting button 204. As shown in (d) of fig. 7, after the mobile phone 100 successfully takes a picture, the mobile phone 100 stops capturing the first preview image, displays the photographed target image on the first preview interface 203, and sends the photographed target image to the mobile phone 200. As shown in fig. 7 (d), the mobile phone 100 further displays a fourth prompt message 211 "photo sharing is successful, waiting for the other party's confirmation" on the first preview interface 203. The fourth prompt message 211 may also be other text information for indicating that the photo sharing is successful, which is only an example here. The mobile phone 200 receives the target image from the mobile phone 100, and as shown in fig. 7 (e), the target image received by the mobile phone 200 is displayed on the second preview interface 603. The second preview interface 603 has interface content different from that of the first preview interface 203. For example, the first preview interface 203 has the shooting button 204, while the second preview interface 603 does not; the second preview interface 603 includes an abandon control 610 and a confirmation control 611, while the first preview interface 203 does not. Further, the user B can view the target image photographed by the mobile phone 100 in the second preview interface 603 of the mobile phone 200.
It is to be understood that, in fig. 7 described above, the user B views, before the mobile phone 100 takes the picture, the photograph composition formed by the posture of the user B and the landscape where the user B is located. The user B can therefore adjust his posture and position according to the personal photographing intention, so as to achieve a photograph composition that satisfies the user B. Furthermore, the user B can instruct the mobile phone 100 of the user A to take a picture based on the adjusted photograph composition, so that the matching degree between the target image taken by the mobile phone 100 of the user A and the photographing intention of the user B is high, thereby improving the satisfaction of the user B with the photographed target image.
It should be noted that the manner of triggering the mobile phone 100 to take a picture is not limited to the above manner, and may also include, but is not limited to, the following three manners:
the first manner: the above-described "the mobile phone 100 can perform a photographing operation in response to the trigger operation of the user A on the shooting button 204" may be replaced with: as shown in fig. 8, the user B may trigger a side key 612 of the mobile phone 200, and the mobile phone 200 sends a photographing instruction to the mobile phone 100 in response to the trigger operation of the user B on the side key 612. The mobile phone 100 receives the photographing instruction from the mobile phone 200 and performs the photographing operation. As can be seen, the first manner of triggering the mobile phone 100 to take a picture is: the user B triggers a button of the mobile phone 200 to trigger the photographing.
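A sketch of the first manner, assuming the side key maps to a volume key event and that sendInstruction() writes a text command over the paired connection (both assumptions of this sketch):

```kotlin
import android.view.KeyEvent

const val CMD_CAPTURE = "CAPTURE"   // hypothetical command string

// Called from the photographed device's Activity.onKeyDown(); returning true consumes the event.
fun onSideKey(keyCode: Int, sendInstruction: (String) -> Unit): Boolean {
    return if (keyCode == KeyEvent.KEYCODE_VOLUME_DOWN || keyCode == KeyEvent.KEYCODE_VOLUME_UP) {
        sendInstruction(CMD_CAPTURE)   // the mobile phone 100 shoots on receiving this
        true
    } else {
        false
    }
}
```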
The second manner: the above-described "the mobile phone 100 can perform a photographing operation in response to the trigger operation of the user A on the shooting button 204" may be replaced with: the user B inputs, to the mobile phone 200, voice information for instructing photographing; for example, the voice information may be "eggplant", "photograph", or the like. The mobile phone 200 recognizes the voice information input by the user B for instructing photographing and sends a photographing instruction to the mobile phone 100, or the mobile phone 100 itself recognizes the voice information input by the user B for instructing photographing. Thus, the mobile phone 100 performs the photographing operation according to the photographing instruction. As can be seen, the second manner of triggering the mobile phone 100 to take a picture is: the mobile phone 200 or the mobile phone 100 recognizes the voice information of the user B to trigger the photographing (not shown in the drawings).
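A sketch of the keyword check the second manner relies on, applied to whatever phrases a speech recognizer returns; the keyword list and the callback are illustrative assumptions.

```kotlin
// Phrases that are treated as a photographing instruction in this sketch.
val captureKeywords = listOf("eggplant", "photograph")

fun onSpeechResults(phrases: List<String>, sendInstruction: (String) -> Unit) {
    val triggered = phrases.any { phrase ->
        captureKeywords.any { keyword -> phrase.contains(keyword, ignoreCase = true) }
    }
    if (triggered) {
        sendInstruction("CAPTURE")   // forwarded to (or handled by) the master shooting device
    }
}
```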
The third manner: the above-described "the mobile phone 100 can perform a photographing operation in response to the trigger operation of the user A on the shooting button 204" may be replaced with: the user B makes a body motion for instructing the mobile phone 100 to shoot, for example, a "nodding motion" or a "clapping motion", and then poses again for photographing. The mobile phone 100 detects, through the acquired first preview image, the body motion of the user B for instructing the mobile phone 100 to shoot, waits for a preset time (such as 3 s or 5 s), and then performs the photographing operation. As can be seen, the third manner of triggering the mobile phone 100 to take a picture is: the mobile phone 100 recognizes the motion of the user B to trigger the mobile phone 100 to take a picture (not shown in the drawings).
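A sketch of the delayed shutter used in the third manner; the gesture-recognition step itself is outside the scope of this sketch, which only shows the preset wait before shooting.

```kotlin
import android.os.Handler
import android.os.Looper

// After the master shooting device recognizes a nodding or clapping motion in the
// first preview image, wait a preset time (3 s here) and then trigger the capture.
fun onGestureDetected(capture: () -> Unit, delayMs: Long = 3_000L) {
    Handler(Looper.getMainLooper()).postDelayed({ capture() }, delayMs)
}
```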
How the user B handles the photographed target image will be explained below with reference to fig. 9 and 10.
On the one hand, if the user B, after viewing the target image photographed by the mobile phone 100 in the second preview interface 603 of the mobile phone 200, is not sufficiently satisfied with the target image, the user B may trigger the abandon control 610 in the second preview interface 603, as shown in (a) of fig. 9. As shown in fig. 9 (b), the mobile phone 200 receives a cancel instruction in response to the trigger operation of the user B on the abandon control 610, and transmits the cancel instruction to the mobile phone 100. As shown in (c) of fig. 9, the mobile phone 100 receives the cancel instruction from the mobile phone 200, cancels the display of the fourth prompt message 211 together with the captured target image, does not save the target image, and continues to capture a second preview image. The mobile phone 100 displays the acquired second preview image on the first preview interface 203. The mobile phone 100 sends the acquired second preview image to the mobile phone 200, and the mobile phone 200 receives the second preview image and displays it on the second preview interface 603.
On the other hand, if the user B, after viewing the target image photographed by the mobile phone 100 in the second preview interface 603 of the mobile phone 200, is satisfied with the target image, the user B may trigger the confirmation control 611 in the second preview interface 603, as shown in (a) of fig. 10. As shown in (b) in fig. 10, the mobile phone 200 receives a confirmation instruction in response to the trigger operation of the user B on the confirmation control 611, and sends the confirmation instruction to the mobile phone 100. As shown in fig. 10 (c), the mobile phone 100 receives the confirmation instruction from the mobile phone 200 and stores the photographed target image. In this way, the mobile phone 100 stores the target image with which the user B is satisfied, so as to facilitate subsequent review of the target image. Then, the mobile phone 100 cancels the display of the fourth prompt message 211 and displays a thumbnail of the photographed target image in the display frame 205. The mobile phone 100 continues to capture the second preview image and displays the captured second preview image on the first preview interface 203. The mobile phone 100 sends the acquired second preview image to the mobile phone 200, and the mobile phone 200 receives the second preview image and displays it on the second preview interface 603.
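For illustration, the master shooting device's handling of the two instructions could look like the following sketch; the command strings, the save location, and resumePreview() are assumptions.

```kotlin
import android.graphics.Bitmap
import java.io.File

fun onSubjectInstruction(command: String, target: Bitmap, saveDir: File, resumePreview: () -> Unit) {
    when (command) {
        "CONFIRM" -> File(saveDir, "IMG_${System.currentTimeMillis()}.jpg")
            .outputStream()
            .use { target.compress(Bitmap.CompressFormat.JPEG, 95, it) }   // save the target image
        "CANCEL" -> Unit   // discard: the target image is simply not written to storage
    }
    resumePreview()        // in both cases, continue capturing the second preview image
}
```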
It should be understood that the above description takes the user A as the photographer and the user B as the subject as an example. If, after the user B has been photographed, the user A wants to exchange shooting identities with the user B, that is, the user A becomes the subject and the user B becomes the photographer, how the mobile phone 200 and the mobile phone 100 exchange the shooting identities of the user A and the user B, and how the mobile phone 200 photographs the user A after the exchange, are described below with reference to fig. 11 and 12.
As shown in (a) of fig. 11, the mobile phone 100 may send a switching instruction to the mobile phone 200 in response to the trigger operation of the user A on a switching control 212 on the first preview interface 203 of the mobile phone 100. As shown in (b) of fig. 11, the mobile phone 200 receives the switching instruction from the mobile phone 100 and displays a fifth prompt message 613 on the second preview interface 603 of the mobile phone 200, where the fifth prompt message is used for requesting to switch the photographing identity. For example, the fifth prompt message 613 includes text information of "the other party requests to switch the photographing identity", a cancel button, and a confirmation control 614. As shown in (b) and (d) of fig. 11, if the user B agrees to switch the identity, the mobile phone 200 switches from the photographed device to the master shooting device in response to the trigger operation of the user B on the confirmation control 614. The mobile phone 200 starts to capture a third preview image, and the captured third preview image is displayed on the second preview interface 603 of the mobile phone 200. The third preview image acquired by the mobile phone 200 includes the posture of the user A and the scenery of the position where the user A is located.
As shown in fig. 11 (c), the mobile phone 200 transmits an approval instruction to the mobile phone 100, and the mobile phone 100 receives the approval instruction from the mobile phone 200 and switches from the master shooting device to the photographed device. As shown in fig. 11 (e), the mobile phone 100 receives the third preview image transmitted from the mobile phone 200 and displays the received third preview image on the first preview interface 203 of the mobile phone 100. In this way, the user A can view, in the first preview interface 203 of the mobile phone 100, the posture of the user A and the scenery of the position where the user A is located.
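A compact sketch of the identity-switch handshake between the two devices; the role enumeration and the command strings are purely illustrative assumptions.

```kotlin
enum class Role { MASTER, PHOTOGRAPHED }

class ShootingSession(var role: Role, private val send: (String) -> Unit) {

    fun requestSwitch() = send("SWITCH_REQUEST")           // sent by the current master (mobile phone 100)

    fun agreeSwitch() {                                     // user confirms on the prompted device (mobile phone 200)
        role = Role.MASTER
        send("SWITCH_AGREE")
    }

    fun onMessage(msg: String) {
        when (msg) {
            "SWITCH_REQUEST" -> { /* show the fifth prompt message and wait for the user */ }
            "SWITCH_AGREE" -> role = Role.PHOTOGRAPHED      // the former master becomes the photographed device
        }
    }
}
```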
As shown in (d) in fig. 11, the mobile phone 200 can perform a photographing operation in response to the trigger operation of the user B on a shooting button 616. After the mobile phone 200 successfully takes a picture, the mobile phone 200 stops acquiring the third preview image and sends the photographed target image to the mobile phone 100; at the same time, the mobile phone 200 displays a fourth prompt message (not shown in the drawings) of "photo sharing is successful, waiting for the other party's confirmation" on the second preview interface 603. The fourth prompt message may also be other text information for indicating that the photo sharing is successful, which is only an example here. The mobile phone 100 receives the target image from the mobile phone 200, and as shown in fig. 12 (a), after the mobile phone 100 receives the target image, the target image is displayed on the first preview interface 203, and an abandon control 214 and a confirmation control 215 are displayed on the first preview interface 203. Further, the user A can view the target image photographed by the mobile phone 200 in the first preview interface 203 of the mobile phone 100.
It is to be understood that, in fig. 11 and 12 described above, the user A views, before the mobile phone 200 takes the picture, the photograph composition formed by the posture of the user A and the landscape where the user A is located. The user A can therefore adjust his posture and position according to the personal photographing intention, so as to achieve a photograph composition that satisfies the user A. Furthermore, the user A can instruct the mobile phone 200 of the user B to take a picture based on the adjusted photograph composition, so that the matching degree between the target image taken by the mobile phone 200 of the user B and the photographing intention of the user A is high, thereby improving the satisfaction of the user A with the photographed target image.
For example, the manner of triggering the mobile phone 200 to take a picture is not limited to the above manner, and may also be any of the other manners described above for triggering the mobile phone 100 to take a picture, which are not described herein again.
The following describes the flow of the photographing method in fig. 5 to 12, taking the user a as the photographer, the user B as the subject, the electronic device of the user a as the mobile phone 100, and the electronic device of the user B as the mobile phone 200 as examples.
Fig. 13 is a flowchart illustrating a photographing method according to an embodiment of the present application. As shown in fig. 13, the photographing method may include the steps of:
S1301: the mobile phone 100 displays a first preview interface, where the first preview interface includes a first preview image acquired by the mobile phone 100 and a first control.
Illustratively, the mobile phone 100 opens the camera application and displays the first preview interface in response to the trigger operation of the user A on the icon of the camera application. It will be appreciated that the first preview interface may be the first preview interface 203 of the mobile phone 100 in (b) of fig. 5 described above. It is understood that the user B may pose at the photographing position, and the user A may hold the mobile phone 100 with the camera of the mobile phone 100 aimed at the user B. In this way, the mobile phone 100 can capture an image containing the user B. Here, the first control may be the trigger area of the "master shooting pairing" in (b) in fig. 5.
S1302: the cell phone 200 displays a second preview interface, wherein the second preview interface includes a second control.
For example, the mobile phone 200 may open the camera application and display the second preview interface in response to the trigger operation of the user B on the icon of the camera application on the system desktop. It is to be understood that the second preview interface may be the second preview interface 603 of the mobile phone 200 in (b) of fig. 6 described above; the second control may be the trigger area of the "photographed pairing" in (b) of fig. 6 described above.
S1303: the mobile phone 100 serves as a master shooting device in response to the trigger operation of the user A on the first control.
S1304: the mobile phone 200 serves as a photographed device in response to the trigger operation of the user B on the second control.
S1305: a one-to-one communication connection is established between the handset 100 and the handset 200.
It is to be appreciated that S1305 above describes the pairing process between the mobile phone 100 and the mobile phone 200. The specific pairing process of the mobile phone 100 and the mobile phone 200 has a plurality of possible implementations, two of which are described as follows:
the first pairing procedure: the mobile phone 100 starts a WiFi hot spot function in response to the operation of the user a, and converts the received mobile communication signal (such as GPRS, 3G, 4G, or 5G) into a WiFi signal to be broadcast. The mobile phone 200 searches for a WiFi signal broadcast by the mobile phone 100, and after receiving a key input by the user B at the mobile phone 200, sends a pairing request to the mobile phone 100, where the pairing request carries a communication address of the mobile phone 200. The mobile phone 100 receives the pairing request from the mobile phone 200, and establishes one-to-one communication connection with the mobile phone 200 according to the communication address of the mobile phone 200, thereby completing pairing between the mobile phone 100 and the mobile phone 200.
It should be noted that, the interface operation and the specific implementation manner of the first pairing process may refer to fig. 5 and fig. 6, and the above description of fig. 5 and fig. 6, which are not repeated herein.
The second pairing process: the mobile phone 100 searches for a nearby target WiFi and connects to the target WiFi. The mobile phone 100 sends the photographing identity of the mobile phone 100 as the master shooting device to the server of the target WiFi. The mobile phone 200 searches for a nearby target WiFi and connects to the same target WiFi as the mobile phone 100. The mobile phone 200 sends the photographing identity of the mobile phone 200 as a photographed device and a pairing request to the server of the target WiFi, where the pairing request carries the communication address of the mobile phone 200. When the server of the target WiFi detects that the mobile phone 100 and the mobile phone 200 are connected to the same target WiFi, and that the mobile phone 100 serves as the master shooting device and the mobile phone 200 serves as the photographed device, the server sends the pairing request of the mobile phone 200 to the mobile phone 100. After receiving the pairing request, the mobile phone 100 establishes a communication connection with the mobile phone 200 according to the communication address of the mobile phone 200.
It should be noted that, for a specific implementation manner of the second pairing process, reference may be made to the description of another pairing manner of the mobile phone 200 and the mobile phone 100 in the foregoing embodiment, which is not described herein again.
In addition, in the above S1303-S1305, the order of S1304 and S1305 may be interchanged, and the specific implementation principle may be: the mobile phone 100 serves as the master shooting device in response to the trigger operation of the user A on the first control. Further, the mobile phone 100 displays the hotspot setting interface, automatically starts the WiFi hotspot function, and converts the received mobile communication signal (such as GPRS, 3G, 4G or 5G) into a WiFi signal for broadcasting. The mobile phone 200 searches for the WiFi signal broadcast by the mobile phone 100, and after receiving the key input by the user B on the mobile phone 200, sends a pairing request to the mobile phone 100, where the pairing request carries the communication address of the mobile phone 200. The mobile phone 100 receives the pairing request from the mobile phone 200 and establishes a one-to-one communication connection with the mobile phone 200 according to the communication address of the mobile phone 200, thereby completing the pairing of the mobile phone 100 and the mobile phone 200. After the pairing is successful, the mobile phone 200 automatically serves as the photographed device.
S1306: the mobile phone 100 sends the collected first preview image to the mobile phone 200 through the established communication connection, wherein the collected first preview image includes the user B.
Since the handset 100 and the handset 200 have already established a one-to-one communication connection, the handset 100 can transmit the acquired first preview image to the handset 200. The interface schematic diagram for implementing S1306 may be (a) - (b) in fig. 7.
S1307: the mobile phone 200 receives the first preview image from the mobile phone 100 and displays the received first preview image on the second preview interface.
Further, the user B can browse the acquired first preview image on the second preview interface of the mobile phone 200. The interface schematic diagram for implementing S1307 may be (a) - (b) in fig. 7.
S1308: the mobile phone 100 captures a target image and transmits the captured target image to the mobile phone 200.
If the user B is not satisfied with his posture, the personal posture can be adjusted. After the user B is satisfied with the posture, the mobile phone 100 takes a picture and transmits the photographed target image to the mobile phone 200. Specifically, the manners of triggering the mobile phone 100 to take a picture include, but are not limited to, the following four manners: the first manner: the user A triggers the mobile phone 100 to take a picture; the second manner: the user B triggers a side key of the mobile phone 200 to trigger photographing; the third manner: the user B triggers photographing by inputting voice information to the mobile phone 200; and the fourth manner: the user B inputs a body motion to the mobile phone 100 to trigger the mobile phone 100 to take a picture.
It should be noted that, specific implementation manners of the four triggered photographing modes herein can refer to (c) and (d) in fig. 7, fig. 8, and specific descriptions of the four triggered photographing modes in the foregoing embodiment, and are not described herein again.
S1309: the cell phone 200 receives the target image from the cell phone 100 and displays the target image and the confirmation control on the second preview interface.
Thus, the user B can look up the target image captured by the mobile phone 100 on the second preview interface. An interface diagram of the second preview interface displaying the target image and the confirmation control may refer to (e) in fig. 7.
S1310: the mobile phone 200 receives a confirmation instruction in response to the triggering operation of the confirmation control, and sends the confirmation instruction to the mobile phone 100, wherein the confirmation instruction is used for confirming the target image.
For S1309-S1310, if the target image captured by the mobile phone 100 matches the photographing intention of the user B, the user B may trigger the confirmation control. As such, the cell phone 200 sends a confirmation instruction to the cell phone 100 in response to the triggering operation of the confirmation control.
S1311: the mobile phone 100 receives the confirmation instruction from the mobile phone 200 and stores the photographed target image.
Specifically, the specific implementation manner and the interface schematic diagram of S1310 and S1311 may refer to fig. 10 and the foregoing embodiment, and the description of fig. 10 is omitted here for brevity.
In addition, referring to fig. 9, a discard control may be further displayed in the second preview interface, and if the target image captured by the mobile phone 100 does not meet the photographing intention of the user B, the user B may trigger the discard control. In this way, the mobile phone 200 receives a cancel instruction in response to the trigger operation of the abandon control, and sends the cancel instruction to the mobile phone 100. Wherein the cancel instruction is used for indicating that the target image is not saved. The mobile phone 100 receives the cancel instruction from the mobile phone 200, does not save the photographed target image, and further acquires a second preview image. The mobile phone 100 sends the acquired second preview image to the mobile phone 200, and the mobile phone 200 receives the second preview image and displays the second preview image on the second preview interface 603.
In addition, the mobile phone 100 may further send a switching instruction to the mobile phone 200, and the mobile phone 200 receives the switching instruction from the mobile phone 100 and displays, on the second preview interface, a prompt message for instructing switching of the photographing identity. In response to an input trigger operation, the mobile phone 200 serves as the master shooting device and transmits an approval instruction to the mobile phone 100. The mobile phone 100 receives the approval instruction and serves as the photographed device. In this way, the user B can, as the photographer holding the mobile phone 200, cooperate with the mobile phone 100 to take a picture of the user A as the subject.
It should be noted that, for the flow in which the user B, as the photographer holding the mobile phone 200, cooperates with the mobile phone 100 to take a picture of the user A as the subject, reference may be made to fig. 11 to 12 and the description of fig. 11 to 12 above, which are not repeated herein.
In addition, in the above description of the photographing method provided in the embodiment of the present application, the trigger operation may include: a click operation, a long-press operation, a gesture trigger operation, and the like, which are not limited herein.
Referring to fig. 14, an embodiment of the present application further provides an electronic device 1400, where the electronic device 1400 includes a camera, and the electronic device 1400 further includes: a first communication unit 1401, a first display unit 1402, a first processing unit 1403, and a first storage unit 1404.
The first communication unit 1401 is configured to establish a communication connection with a second electronic device. The first display unit 1402 is configured to display a first preview interface, where the first preview interface includes a first preview image acquired by the camera. The first communication unit 1401 is further configured to synchronize the first preview image to the second electronic device through the communication connection. The first processing unit 1403 is configured to obtain a shooting instruction and obtain a target image through shooting with the camera. The first communication unit 1401 is further configured to send the target image to the second electronic device. The first communication unit 1401 is further configured to receive a confirmation instruction from the second electronic device, where the confirmation instruction is used for confirming the target image; the first processing unit 1403 is further configured to respond to the confirmation instruction; and the first storage unit 1404 is configured to store the target image. Alternatively, the first communication unit 1401 is further configured to receive a cancel instruction for the target image from the second electronic device, where the cancel instruction is used for canceling the target image; and the first processing unit 1403 is further configured not to save the target image in response to the cancel instruction.
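For readability, the unit decomposition described above can be rendered roughly as the following Kotlin interfaces; the names and signatures are illustrative only and do not limit the structure of the electronic device 1400.

```kotlin
// Illustrative rendering of the units of the electronic device 1400.
interface FirstCommunicationUnit {
    fun establishConnection(peerAddress: String)
    fun syncPreviewImage(frame: ByteArray)
    fun sendTargetImage(image: ByteArray)
    fun receiveInstruction(): String          // e.g. a confirmation or cancel instruction
}

interface FirstDisplayUnit {
    fun showPreviewInterface(frame: ByteArray)
}

interface FirstProcessingUnit {
    fun capture(): ByteArray                  // obtains the target image via the camera
}

interface FirstStorageUnit {
    fun save(image: ByteArray)
}
```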
In an optional implementation, the first communication unit 1401 is specifically configured to establish the communication connection with the second electronic device when a trigger operation for the first control is received.
Further, the first communication unit 1401 is specifically configured to, when the trigger operation for the first control is received, start a near field communication function and broadcast the identifier of the electronic device 1400; and, when a trigger operation on the identifier of the electronic device 1400 is received, establish the communication connection with the second electronic device.
Further, the first display unit 1402 is further configured to display a hot spot setting interface when receiving a trigger operation for the first control, where the hot spot setting interface includes a button for opening a hot spot. The first communication unit 1401 is specifically configured to, when receiving a trigger operation for a button for turning on a hot spot, turn on the near field communication function and broadcast an identifier of the electronic device 1400.
In an optional implementation, the first display unit 1402 is further configured to display a hotspot search interface when a trigger operation for the first control is received, where the hotspot search interface includes an identifier of a target WiFi. The first communication unit 1401 is further configured to access the target WiFi upon receiving a trigger operation for the identifier of the target WiFi. The first communication unit 1401 is further configured to send, to the server of the target WiFi, information indicating that the electronic device 1400 is the master shooting device. The first communication unit 1401 is further configured to receive, from the server of the target WiFi, a connection instruction with the second electronic device. The first communication unit 1401 is further configured to establish the communication connection with the second electronic device in response to the connection instruction.
In an optional implementation, the first processing unit 1403 is specifically configured to generate the shooting instruction in response to a trigger operation on a shooting button in the first preview interface. Alternatively, the first communication unit 1401 is configured to receive the shooting instruction from the second electronic device. Alternatively, the first processing unit 1403 is specifically configured to recognize voice information for instructing shooting and generate the shooting instruction. Alternatively, the first processing unit 1403 is specifically configured to recognize a limb motion for instructing shooting and generate the shooting instruction.
In an optional implementation, the first display unit 1402 is further configured to display the first preview interface, where the first preview interface includes a second preview image captured by the electronic device 1400 using the camera. The first communication unit 1401 is further configured to synchronize the second preview image to the second electronic device.
In an optional implementation, the first preview interface further includes a switching control, where the switching control is used to instruct switching of the photographing identity. The first communication unit 1401 is further configured to send a switching instruction to the second electronic device when a trigger operation on the switching control is received. The first communication unit 1401 is further configured to receive an approval instruction from the second electronic device, so that the electronic device 1400 switches to a photographed device.
Referring to fig. 15, an electronic device 1500 is further provided in an embodiment of the present application. The electronic device 1500 includes: a second communication unit 1501, configured to establish a communication connection with a first electronic device. The second communication unit 1501 is further configured to receive, through the communication connection, a first preview image synchronized by the first electronic device. A second display unit 1502 is configured to display a second preview interface, where the second preview interface includes the first preview image. The second communication unit 1501 is further configured to receive a photographed target image from the first electronic device. The second display unit 1502 is further configured to display the target image on the second preview interface. The second communication unit 1501 is further configured to receive a confirmation instruction and send the confirmation instruction to the first electronic device, where the confirmation instruction is used for confirming the target image. Alternatively, the second communication unit 1501 is further configured to receive a cancel instruction and send the cancel instruction to the first electronic device, where the cancel instruction is used for canceling the target image.
In an optional implementation, when the broadcast of the first electronic device is detected, the identifier of the first electronic device is displayed; and when a trigger operation on the identifier of the first electronic device is received, the second communication unit 1501 establishes the communication connection with the first electronic device.
Further, the second display unit 1502 is further configured to display a first input box when a trigger operation on the identifier of the first electronic device is received. The second communication unit 1501 is specifically configured to establish the communication connection with the first electronic device when the key of the hotspot is obtained in the first input box.
Further, the second preview interface further includes a second control, and the second display unit 1502 is specifically configured to display a hotspot search interface when a trigger operation for the second control is received; and, when the identifier broadcast by the first electronic device is detected, display the identifier of the first electronic device on the hotspot search interface.
In an optional implementation, the second preview interface further includes a second control, and the second display unit 1502 is further configured to display a second hotspot search interface when a trigger operation for the second control is received, where the second hotspot search interface includes an identifier of a target WiFi. The second communication unit 1501 is further configured to access the target WiFi when a trigger operation for the identifier of the target WiFi is received. The second communication unit 1501 is further configured to send, to the server of the target WiFi, information indicating that the electronic device 1500 is a photographed device. The second communication unit 1501 is further configured to receive, from the server of the target WiFi, a connection instruction with the first electronic device. The second communication unit 1501 is further configured to establish the communication connection with the first electronic device in response to the connection instruction.
In an optional implementation, the second communication unit 1501 is further configured to send a shooting instruction to the first electronic device in response to a trigger operation on a side key. Alternatively, the second communication unit 1501 is further configured to send the shooting instruction to the first electronic device when voice information for instructing shooting is recognized.
In an optional implementation, the second communication unit 1501 is further configured to receive a synchronized second preview image from the first electronic device.
In an optional implementation, the second preview interface further includes a confirmation control and an abandon control, and the electronic device 1500 further includes: a second processing unit 1503, specifically configured to receive the confirmation instruction in response to a trigger operation on the confirmation control, or receive the cancel instruction in response to a trigger operation on the abandon control.
In an optional implementation, the second communication unit 1501 is further configured to receive a switching instruction from the first electronic device.
The second display unit 1502 is further configured to display, in response to the switching instruction, a prompt message for prompting that the first electronic device requests to switch the photographing identity. The electronic device 1500 further includes: the second processing unit 1503, configured to switch the electronic device 1500 to a master shooting device when a trigger operation agreeing to switch the photographing identity is received; and the second communication unit 1501 is further configured to send an approval instruction to the first electronic device, where the approval instruction is used to instruct the first electronic device to switch to a photographed device.
Fig. 16 is a schematic diagram of a hardware structure of an electronic device according to an embodiment of the present disclosure, and as shown in fig. 16, the electronic device includes a processor 1601, a communication line 1604, and at least one communication interface (the communication interface 1603 is exemplarily illustrated in fig. 16).
The processor 1601 may be a general-purpose central processing unit (CPU), a microprocessor, an application-specific integrated circuit (ASIC), or one or more integrated circuits for controlling the execution of the programs of the solutions of the present application.
The communication lines 1604 may include circuitry to communicate information between the above-described components.
The communication interface 1603 may be any apparatus such as a transceiver for communicating with other devices or communication networks, such as Ethernet or a wireless local area network (WLAN).
Possibly, the electronic device can also include a memory 1602.
The memory 1602 may be a read-only memory (ROM) or another type of static storage device capable of storing static information and instructions, a random access memory (RAM) or another type of dynamic storage device capable of storing information and instructions, an electrically erasable programmable read-only memory (EEPROM), a compact disc read-only memory (CD-ROM) or other optical disc storage (including a compact disc, a laser disc, an optical disc, a digital versatile disc, a Blu-ray disc, or the like), a magnetic disk storage medium or other magnetic storage device, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer, but is not limited thereto. The memory may exist independently and be connected to the processor through the communication line 1604. The memory may also be integrated with the processor.
The memory 1602 is configured to store computer-executable instructions for implementing the solutions of the present application, and execution is controlled by the processor 1601. The processor 1601 is configured to execute the computer-executable instructions stored in the memory 1602, thereby implementing the photographing method provided by the embodiments of the present application.
Possibly, the computer executed instructions in the embodiments of the present application may also be referred to as application program codes, which are not specifically limited in the embodiments of the present application.
In a specific implementation, as an embodiment, the processor 1601 may include one or more CPUs, such as CPU 0 and CPU 1 in fig. 16.
In a specific implementation, as an embodiment, the electronic device may include a plurality of processors, such as the processor 1601 and the processor 1605 in fig. 16. Each of these processors may be a single-core (single-CPU) processor or a multi-core (multi-CPU) processor. A processor here may refer to one or more devices, circuits, and/or processing cores for processing data (for example, computer program instructions).
Exemplarily, fig. 17 is a schematic structural diagram of a chip provided in an embodiment of the present application. Chip 170 includes one or more (including two) processors 1710 and a communication interface 1730.
In some embodiments, memory 1740 stores the following elements: an executable module or a data structure, or a subset thereof, or an expanded set thereof.
In an embodiment of the present application, memory 1740 may include both read-only memory and random access memory, and provides instructions and data to processor 1710. A portion of memory 1740 may also include non-volatile random access memory (NVRAM).
In the illustrated embodiment, the processor 1710, the communication interface 1730, and the memory 1740 are coupled together through a bus system 1720. The bus system 1720 may include a power bus, a control bus, a status signal bus, and the like, in addition to a data bus. For ease of description, the various buses are labeled as the bus system 1720 in fig. 17.
The method described in the embodiments of the present application may be applied to the processor 1710 or implemented by the processor 1710. The processor 1710 may be an integrated circuit chip having a signal processing capability. In implementation, the steps of the above method may be completed by an integrated logic circuit of hardware in the processor 1710 or by instructions in the form of software. The processor 1710 may be a general-purpose processor (for example, a microprocessor or a conventional processor), a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or another programmable logic device, a discrete gate, a transistor logic device, or a discrete hardware component, and the processor 1710 may implement or perform the methods, steps, and logical blocks disclosed in the embodiments of the present application.
The steps of the method disclosed in connection with the embodiments of the present application may be directly performed by a hardware decoding processor, or performed by a combination of hardware and software modules in a decoding processor. The software module may be located in a storage medium mature in the art, such as a random access memory, a read-only memory, a programmable read-only memory, or an electrically erasable programmable read-only memory (EEPROM). The storage medium is located in the memory 1740, and the processor 1710 reads the information in the memory 1740 and completes the steps of the above method in combination with its hardware.
In the above embodiments, the instructions stored by the memory for execution by the processor may be implemented in the form of a computer program product. The computer program product may be written in the memory in advance, or may be downloaded in the form of software and installed in the memory.
The computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, the procedures or functions according to the embodiments of the present application are generated in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another computer-readable storage medium, for example, from one website, computer, server, or data center to another website, computer, server, or data center in a wired manner (for example, a coaxial cable, an optical fiber, or a digital subscriber line (DSL)) or a wireless manner (for example, infrared, radio, or microwave). The computer-readable storage medium may be any available medium accessible by a computer, or a data storage device such as a server or a data center integrating one or more available media. The available media may include, for example, magnetic media (for example, a floppy disk, a hard disk, or a magnetic tape), optical media (for example, a digital versatile disc (DVD)), or semiconductor media (for example, a solid state disk (SSD)).
The embodiment of the application also provides a computer readable storage medium. The methods described in the above embodiments may be implemented in whole or in part by software, hardware, firmware, or any combination thereof. Computer-readable media may include computer storage media and communication media, and may include any medium that can communicate a computer program from one place to another. A storage medium may be any target medium that can be accessed by a computer.
As one possible design, the computer-readable medium may include a compact disc read-only memory (CD-ROM), a RAM, a ROM, an EEPROM, or other optical disc storage; the computer-readable medium may include a magnetic disk memory or another magnetic disk storage device. Also, any connection is properly termed a computer-readable medium. For example, if the software is transmitted from a website, server, or other remote source using a coaxial cable, an optical fiber cable, a twisted pair, a DSL, or wireless technologies such as infrared, radio, and microwave, then the coaxial cable, the optical fiber cable, the twisted pair, the DSL, or the wireless technologies such as infrared, radio, and microwave are included in the definition of medium. Disk and disc, as used herein, include a compact disc (CD), a laser disc, an optical disc, a digital versatile disc (DVD), a floppy disk, and a Blu-ray disc, where disks usually reproduce data magnetically, while discs reproduce data optically with lasers.
Combinations of the above should also be included within the scope of computer-readable media. The above description is only for the specific embodiments of the present invention, but the scope of the present invention is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present invention, and the changes or substitutions should be covered within the scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (27)

1. A photographing method, applied to a photographing system, wherein the photographing system comprises first electronic equipment and second electronic equipment, the first electronic equipment comprises a camera, and the method comprises the following steps:
the first electronic equipment and the second electronic equipment establish communication connection;
the first electronic equipment displays a first preview interface, wherein the first preview interface comprises a first preview image acquired by the camera;
the first electronic device synchronizes the first preview image to the second electronic device;
the second electronic equipment displays a second preview interface, wherein the second preview interface comprises the first preview image;
the first electronic equipment acquires a shooting instruction and obtains a target image by utilizing the camera;
the first electronic device sends the target image to the second electronic device through the communication connection;
the second electronic equipment displays the target image on the second preview interface;
receiving a confirmation instruction at the second electronic device, wherein the confirmation instruction is used for confirming the target image and sending the confirmation instruction to the first electronic device; the first electronic equipment responds to the confirmation instruction and saves the target image;
or receiving a cancel instruction at the second electronic device, wherein the cancel instruction is used for canceling the target image, and the second electronic device sends the cancel instruction to the first electronic device; and the first electronic equipment responds to the cancel instruction and does not save the target image.
2. The photographing method of claim 1, wherein the first preview interface further comprises a first control, and the first electronic device and the second electronic device establish a communication connection, comprising:
when the first electronic device receives a triggering operation aiming at the first control, the first electronic device establishes communication connection with the second electronic device.
3. The photographing method according to claim 2, wherein when the first electronic device receives a trigger operation for the first control, the first electronic device establishes a communication connection with the second electronic device, and the method includes:
when the first electronic device receives a trigger operation aiming at the first control, the first electronic device starts a near field communication function and broadcasts an identifier of the first electronic device;
the second electronic equipment displays the identification of the first electronic equipment;
and when the second electronic equipment receives the trigger operation of the identifier of the first electronic equipment, the first electronic equipment and the second electronic equipment establish communication connection.
4. The photographing method according to claim 3, wherein the starting, by the first electronic device, of the near field communication function and the broadcasting of the identifier of the first electronic device when the first electronic device receives the trigger operation on the first control comprise:
when the first electronic device receives a trigger operation on the first control, the first electronic device displays a hotspot setting interface, wherein the hotspot setting interface comprises a button for turning on a hotspot; and when the first electronic device receives a trigger operation on the button for turning on the hotspot, the first electronic device starts the near field communication function and broadcasts the identifier of the first electronic device;
and the establishing, by the first electronic device, of a communication connection with the second electronic device when the second electronic device receives a trigger operation on the identifier of the first electronic device comprises:
when the second electronic device receives a trigger operation on the identifier of the first electronic device, the second electronic device displays a first input box; and when a key of the hotspot is obtained in the first input box, the second electronic device establishes a communication connection with the first electronic device.
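Claim 4 adds a key check: the connection is only completed once the hotspot key typed into the first input box is accepted. A minimal sketch of that check follows, assuming the first device keeps a salted hash of the key; the hashing scheme is a choice of this example, not something the claim specifies.

```python
import hashlib
import os

def make_key_record(hotspot_key: str):
    """First device: keep only a salted digest of the hotspot key."""
    salt = os.urandom(16)
    return salt, hashlib.sha256(salt + hotspot_key.encode()).digest()

def key_matches(candidate: str, salt: bytes, digest: bytes) -> bool:
    """First device: accept the second device only if the key entered in the
    first input box matches the stored record."""
    return hashlib.sha256(salt + candidate.encode()).digest() == digest
```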
5. The photographing method according to claim 3 or 4, wherein the second preview interface further comprises a second control, and the detecting, by the second electronic device, of the near field communication signal of the first electronic device and the displaying of the identifier of the first electronic device comprise:
when the second electronic device receives a trigger operation on the second control, the second electronic device displays a hotspot search interface;
and the second electronic device detects the identifier of the first electronic device broadcast by the first electronic device, and displays the identifier of the first electronic device on the hotspot search interface.
6. The photographing method of claim 2, wherein the second preview interface further comprises a second control, and the establishing, by the first electronic device, of a communication connection with the second electronic device when the first electronic device receives a trigger operation on the first control comprises:
when the first electronic device receives a trigger operation on the first control, the first electronic device displays a hotspot search interface, wherein the hotspot search interface comprises an identifier of a target WiFi;
when the first electronic device receives a trigger operation on the identifier of the target WiFi, the first electronic device accesses the target WiFi;
the first electronic device sends, to a server of the target WiFi, information used for indicating that the first electronic device is a main shooting device;
when the second electronic device receives a trigger operation on the second control, the second electronic device displays a second hotspot search interface, wherein the second hotspot search interface comprises the identifier of the target WiFi;
when the second electronic device receives a trigger operation on the identifier of the target WiFi, the second electronic device accesses the target WiFi;
the second electronic device sends, to the server of the target WiFi, information used for indicating that the second electronic device is a photographed device;
and the first electronic device establishes a communication connection with the second electronic device in response to a connection instruction, from the server of the target WiFi, for connecting with the second electronic device.
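Claim 6 replaces the hotspot route with a shared WiFi network whose server pairs a device registered as the main shooting device with a device registered as the photographed device and then issues a connection instruction to each. The rendezvous service below is a rough illustration; the role strings, the port and the wire format are assumptions.

```python
import socket
import threading

def run_rendezvous(port: int = 50600) -> None:
    """Pair one 'MAIN' registration with one 'SUBJECT' registration and send
    each side a connection instruction carrying the peer's address."""
    pending = {}            # role -> (connection, address)
    lock = threading.Lock()

    def handle(conn: socket.socket, addr) -> None:
        role = conn.recv(32).decode().strip()          # "MAIN" or "SUBJECT"
        other = "SUBJECT" if role == "MAIN" else "MAIN"
        with lock:
            if other in pending:
                peer_conn, peer_addr = pending.pop(other)
                conn.sendall(f"CONNECT {peer_addr[0]}".encode())
                peer_conn.sendall(f"CONNECT {addr[0]}".encode())
                conn.close()
                peer_conn.close()
            else:
                pending[role] = (conn, addr)           # wait for the other role

    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind(("", port))
        srv.listen()
        while True:
            conn, addr = srv.accept()
            threading.Thread(target=handle, args=(conn, addr), daemon=True).start()
```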
7. The photographing method according to claim 1, wherein the acquiring, by the first electronic device, of a shooting instruction comprises:
the first electronic device generates the shooting instruction in response to triggering of a shooting button in the first preview interface;
or the second electronic device sends the shooting instruction to the first electronic device in response to a trigger operation on a side key, and the first electronic device receives the shooting instruction;
or the first electronic device recognizes voice information used for indicating shooting, and generates the shooting instruction;
or the second electronic device recognizes voice information used for indicating shooting and sends the shooting instruction to the first electronic device, and the first electronic device receives the shooting instruction;
or the first electronic device recognizes a limb motion used for indicating shooting, and generates the shooting instruction.
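Claim 7 enumerates several sources of the shooting instruction (the on-screen shooting button, a side key or recognized voice on the second device relayed over the connection, and voice or limb-motion recognition on the first device), all of which end in the same capture action. A sketch of that fan-in, with method names invented for this example:

```python
from typing import Callable

class CaptureController:
    """Funnels every instruction source into the same capture call."""

    def __init__(self, capture: Callable[[], bytes]):
        self._capture = capture  # whatever actually drives the camera

    def on_shutter_button(self) -> bytes:       # shooting button in the preview UI
        return self._trigger("local button")

    def on_remote_instruction(self) -> bytes:   # side key or voice on the second device
        return self._trigger("remote instruction")

    def on_voice_command(self) -> bytes:        # voice recognized on the first device
        return self._trigger("local voice")

    def on_gesture(self) -> bytes:              # limb motion recognized on the first device
        return self._trigger("gesture")

    def _trigger(self, source: str) -> bytes:
        print(f"shooting instruction from: {source}")
        return self._capture()
```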
8. The photographing method according to claim 1, wherein after the first electronic device saves the target image in response to the confirmation instruction, or after the first electronic device does not save the target image in response to the cancel instruction, the method further comprises:
the first electronic device displays the first preview interface, wherein the first preview interface comprises a second preview image acquired by the first electronic device by using the camera;
and the first electronic device synchronizes the second preview image to the second electronic device.
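Claim 8 implies a simple session cycle: once the target image has been saved or discarded, the first device returns to showing and synchronizing a live preview (the second preview image). One way to express that cycle, with state and event names invented for this sketch:

```python
from enum import Enum, auto

class SessionState(Enum):
    PREVIEWING = auto()   # live preview images are being synchronized
    REVIEWING = auto()    # the target image is shown on the second device

def next_state(state: SessionState, event: str) -> SessionState:
    """Confirm and cancel both lead back to live preview, as in claim 8."""
    if state is SessionState.PREVIEWING and event == "shot_taken":
        return SessionState.REVIEWING
    if state is SessionState.REVIEWING and event in ("confirm", "cancel"):
        return SessionState.PREVIEWING
    return state
```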
9. The photographing method of claim 1, wherein the second preview interface further comprises a confirmation control and a discard control, and
the receiving, by the second electronic device, of the confirmation instruction comprises: the second electronic device receives the confirmation instruction in response to a trigger operation on the confirmation control;
or the receiving, by the second electronic device, of the cancel instruction comprises: the second electronic device receives the cancel instruction in response to a trigger operation on the discard control.
10. The photographing method of claim 1, wherein the first preview interface further comprises a switching control, the switching control is used for indicating switching of the photographing identity, and the method further comprises:
when the first electronic device receives a trigger operation on the switching control, the first electronic device sends a switching instruction to the second electronic device;
the second electronic device displays, in response to the switching instruction, prompt information used for prompting that the first electronic device requests to switch the photographing identity;
when the second electronic device receives a trigger operation for agreeing to switch the photographing identity, the second electronic device switches to be the main shooting device and sends an agreement instruction to the first electronic device;
and the first electronic device receives the agreement instruction from the second electronic device and switches to be the photographed device.
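Claim 10 describes a short handshake for swapping the photographing identity: a switching instruction, a prompt on the other side, and an agreement that flips both roles. A compact sketch, where SWITCH_REQUEST, SWITCH_AGREE and SWITCH_DENY are names chosen for this example:

```python
def request_switch(send, recv) -> str:
    """Run on the current main shooting device; send/recv wrap the
    existing communication connection."""
    send("SWITCH_REQUEST")
    return "photographed" if recv() == "SWITCH_AGREE" else "main"

def answer_switch(recv, send, user_agrees: bool) -> str:
    """Run on the current photographed device once the prompt has been shown."""
    assert recv() == "SWITCH_REQUEST"
    if user_agrees:
        send("SWITCH_AGREE")
        return "main"          # this device becomes the main shooting device
    send("SWITCH_DENY")
    return "photographed"
```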
11. An electronic device, wherein the electronic device comprises a camera, and the electronic device further comprises:
a first communication unit, configured to establish a communication connection with a second electronic device;
a first display unit, configured to display a first preview interface, wherein the first preview interface comprises a first preview image acquired by the camera;
wherein the first communication unit is further configured to synchronize the first preview image to the second electronic device through the communication connection;
a first processing unit, configured to acquire a shooting instruction and obtain a target image by shooting with the camera;
wherein the first communication unit is further configured to send the target image to the second electronic device;
the first communication unit is further configured to receive a confirmation instruction from the second electronic device, wherein the confirmation instruction is used for confirming the target image, and the first processing unit is further configured to respond to the confirmation instruction; and a first storage unit, configured to save the target image;
or the first communication unit is further configured to receive, from the second electronic device, a cancel instruction used for canceling the target image, and the first processing unit is further configured not to save the target image in response to the cancel instruction.
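Claim 11 partitions the first electronic device into a communication unit, a display unit, a processing unit and a storage unit. One possible, purely illustrative mapping of those units onto plain classes (the display unit is omitted, and a generic link object with send and recv methods is assumed):

```python
class FirstCommunicationUnit:
    """Wraps the communication connection to the second electronic device."""
    def __init__(self, link):
        self.link = link                      # assumed to expose send()/recv()
    def sync_preview(self, frame: bytes) -> None:
        self.link.send(("PREVIEW", frame))
    def send_target(self, image: bytes) -> None:
        self.link.send(("TARGET", image))
    def wait_reply(self) -> str:
        kind, _payload = self.link.recv()
        return kind                           # "CONFIRM" or "CANCEL"

class FirstStorageUnit:
    def __init__(self):
        self.saved = []
    def save(self, image: bytes) -> None:
        self.saved.append(image)

class FirstProcessingUnit:
    def __init__(self, comm, store, camera):
        self.comm, self.store, self.camera = comm, store, camera
    def handle_shooting_instruction(self) -> None:
        image = self.camera()                 # obtain the target image
        self.comm.send_target(image)
        if self.comm.wait_reply() == "CONFIRM":
            self.store.save(image)            # saved only after confirmation
```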
12. The electronic device according to claim 11, wherein the first preview interface further comprises a first control, and the first communication unit is specifically configured to establish a communication connection with the second electronic device when a trigger operation on the first control is received.
13. The electronic device according to claim 12, wherein the first communication unit is specifically configured to: when a trigger operation on the first control is received, start a near field communication function and broadcast an identifier of the electronic device; and establish a communication connection with the second electronic device when a trigger operation on the identifier of the electronic device is received.
14. The electronic device according to claim 13, wherein the first display unit is further configured to display a hotspot setting interface when a trigger operation on the first control is received, wherein the hotspot setting interface comprises a button for turning on a hotspot;
and the first communication unit is specifically configured to, when a trigger operation on the button for turning on the hotspot is received, start the near field communication function and broadcast the identifier of the electronic device.
15. The electronic device of claim 12,
the first display unit is further configured to display a hotspot search interface when a trigger operation on the first control is received, wherein the hotspot search interface comprises an identifier of a target WiFi;
the first communication unit is further configured to access the target WiFi when a trigger operation on the identifier of the target WiFi is received;
the first communication unit is further configured to send, to a server of the target WiFi, information used for indicating that the electronic device is a main shooting device;
the first communication unit is further configured to receive, from the server of the target WiFi, a connection instruction for connecting with the second electronic device;
and the first communication unit is further configured to establish a communication connection with the second electronic device in response to the connection instruction.
16. The electronic device according to claim 11, wherein the first processing unit is specifically configured to generate a shooting instruction in response to triggering of a shooting button in the first preview interface;
or the first communication unit is further configured to receive a shooting instruction from the second electronic device;
or the first processing unit is specifically configured to recognize voice information used for indicating shooting, and generate a shooting instruction;
or the first processing unit is specifically configured to recognize a limb motion used for indicating shooting, and generate the shooting instruction.
17. The electronic device of claim 11, wherein the first display unit is further configured to display the first preview interface, and wherein the first preview interface includes a second preview image captured by the electronic device with the camera;
the first communication unit is further configured to synchronize the second preview image to the second electronic device.
18. The electronic device of claim 11, wherein the first preview interface further comprises a switching control, and the switching control is used for indicating switching of the photographing identity, wherein
the first communication unit is further configured to send a switching instruction to the second electronic device when a trigger operation on the switching control is received;
and the first communication unit is further configured to receive an agreement instruction from the second electronic device and switch to be the photographed device.
19. An electronic device, wherein the electronic device comprises:
a second communication unit, configured to establish a communication connection with a first electronic device;
wherein the second communication unit is further configured to receive, through the communication connection, a first preview image synchronized by the first electronic device;
a second display unit, configured to display a second preview interface, wherein the second preview interface comprises the first preview image;
wherein the second communication unit is further configured to receive a target image shot by the first electronic device;
the second display unit is further configured to display the target image on the second preview interface;
the second communication unit is further configured to receive a confirmation instruction and send the confirmation instruction to the first electronic device, wherein the confirmation instruction is used for confirming the target image;
or the second communication unit is further configured to receive a cancel instruction and send the cancel instruction to the first electronic device, wherein the cancel instruction is used for canceling the target image.
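Claim 19 is the counterpart decomposition for the photographed device: its communication unit receives the synchronized preview and the target image, its display unit shows them, and the confirmation or cancel instruction travels back over the same connection. A matching illustrative sketch, under the same assumptions as the claim 11 example:

```python
class SecondCommunicationUnit:
    """Receives synchronized frames and returns the user's decision."""
    def __init__(self, link):
        self.link = link                      # assumed to expose send()/recv()
    def next_message(self):
        return self.link.recv()               # ("PREVIEW" | "TARGET", payload)
    def send_decision(self, confirmed: bool) -> None:
        self.link.send(("CONFIRM" if confirmed else "CANCEL", b""))

class SecondDisplayUnit:
    def show_preview(self, frame: bytes) -> None:
        print(f"preview frame: {len(frame)} bytes")
    def show_target(self, image: bytes) -> None:
        print(f"target image: {len(image)} bytes")
```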
20. The electronic device according to claim 19, wherein the second communication unit is specifically configured to: when a broadcast of the first electronic device is detected, display an identifier of the first electronic device; and establish a communication connection with the first electronic device when a trigger operation on the identifier of the first electronic device is received.
21. The electronic device according to claim 20, wherein the second display unit is further configured to display a first input box when a trigger operation on the identifier of the first electronic device is received;
and the second communication unit is specifically configured to establish a communication connection with the first electronic device when the key of the hotspot is obtained in the first input box.
22. The electronic device according to claim 20 or 21, wherein the second preview interface further comprises a second control, and the second display unit is specifically configured to display a hotspot search interface when a trigger operation on the second control is received, and to display the identifier of the first electronic device on the hotspot search interface when the identifier broadcast by the first electronic device is detected.
23. The electronic device of claim 19, wherein the second preview interface further includes a second control, and the second display unit is further configured to display a second hotspot search interface when a trigger operation for the second control is received, where the second hotspot search interface includes an identifier of a target WiFi;
the second communication unit is further configured to access the target WiFi when receiving a trigger operation for the identifier of the target WiFi;
the second communication unit is further configured to send, to the server of the target WiFi, information used for indicating that the electronic device is a photographed device;
the second communication unit is further configured to receive, from the server of the target WiFi, a connection instruction for connecting with the first electronic device;
the second communication unit is further used for responding to the connection instruction and establishing communication connection with the first electronic equipment.
24. The electronic device according to claim 19, wherein the second communication unit is further configured to send a shooting instruction to the first electronic device in response to a trigger operation on a side key;
or the second communication unit is further configured to send a shooting instruction to the first electronic device when voice information used for indicating shooting is recognized.
25. The electronic device of claim 19, wherein the second communication unit is further configured to receive a synchronized second preview image from the first electronic device.
26. The electronic device of claim 19, wherein the second preview interface further comprises a confirmation control and a discard control, and the electronic device further comprises: a second processing unit, specifically configured to receive the confirmation instruction in response to a trigger operation on the confirmation control, or to receive the cancel instruction in response to a trigger operation on the discard control.
27. The electronic device of claim 19,
the second communication unit is further configured to receive a switching instruction from the first electronic device;
the second display unit is further configured to display, in response to the switching instruction, prompt information used for prompting that the first electronic device requests to switch the photographing identity;
the electronic device further comprises: a second processing unit, configured to switch to be the main shooting device when a trigger operation for agreeing to switch the photographing identity is received;
and the second communication unit is further configured to send an agreement instruction to the first electronic device, wherein the agreement instruction is used for instructing the first electronic device to switch to be the photographed device.
CN202110839839.4A | Priority date: 2021-07-23 | Filing date: 2021-07-23 | Photographing method and device and electronic equipment | Legal status: Pending | Publication: CN113747056A (en)

Priority Applications (2)

Application Number | Priority Date | Filing Date | Title
CN202110839839.4A (publication CN113747056A) | 2021-07-23 | 2021-07-23 | Photographing method and device and electronic equipment
PCT/CN2022/094200 (publication WO2023000802A1) | 2021-07-23 | 2022-05-20 | Photographing method and apparatus, and electronic device

Applications Claiming Priority (1)

Application Number | Priority Date | Filing Date | Title
CN202110839839.4A (publication CN113747056A) | 2021-07-23 | 2021-07-23 | Photographing method and device and electronic equipment

Publications (1)

Publication Number | Publication Date
CN113747056A | 2021-12-03

Family

ID=78729155

Family Applications (1)

Application Number | Title | Priority Date | Filing Date
CN202110839839.4A (publication CN113747056A, Pending) | Photographing method and device and electronic equipment | 2021-07-23 | 2021-07-23

Country Status (2)

Country Link
CN (1) CN113747056A (en)
WO (1) WO2023000802A1 (en)

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20080015258A (en) * 2006-08-14 2008-02-19 삼성테크윈 주식회사 Digital image processing method supporting remote control mode using bluetooth communication
CN103051840A (en) * 2012-12-31 2013-04-17 广东欧珀移动通信有限公司 Method and system of preview shooting
CN104869315A (en) * 2015-05-28 2015-08-26 魅族科技(中国)有限公司 Photographing control method and terminal
CN105611176B (en) * 2016-03-08 2019-05-24 北京珠穆朗玛移动通信有限公司 A kind of photographic method, system and device
CN107205108A (en) * 2016-03-18 2017-09-26 中兴通讯股份有限公司 The photographic method and device of a kind of mobile terminal
CN113747056A (en) * 2021-07-23 2021-12-03 荣耀终端有限公司 Photographing method and device and electronic equipment

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20070010851A (en) * 2005-07-20 2007-01-24 엘지전자 주식회사 A method of remote photographing using bluetooth for mobile station
CN104320586A (en) * 2014-11-07 2015-01-28 广东欧珀移动通信有限公司 Photographing method and system and terminals
CN105554752A (en) * 2015-11-27 2016-05-04 东莞酷派软件技术有限公司 Hotspot sharing method and related equipment
CN112181344A (en) * 2020-10-19 2021-01-05 Oppo广东移动通信有限公司 Device calling method, device calling apparatus, interaction system, electronic device, and storage medium

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023000802A1 (en) * 2021-07-23 2023-01-26 荣耀终端有限公司 Photographing method and apparatus, and electronic device
CN115052005A (en) * 2022-06-16 2022-09-13 维沃移动通信有限公司 Synchronous display method, synchronous display device, electronic apparatus, and storage medium
CN115052005B (en) * 2022-06-16 2024-05-10 维沃移动通信有限公司 Synchronous display method, synchronous display device, electronic equipment and storage medium

Also Published As

Publication number Publication date
WO2023000802A1 (en) 2023-01-26

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20211203