CN117202108A - Multimedia file sharing method, electronic equipment and communication system - Google Patents


Info

Publication number
CN117202108A
CN117202108A (application CN202210612621.XA)
Authority
CN
China
Prior art keywords
interface
multimedia file
message
control
user
Prior art date
Legal status
Pending
Application number
CN202210612621.XA
Other languages
Chinese (zh)
Inventor
张亚运
谢小灵
祝炎明
胡德启
于升升
Current Assignee
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd
Priority to CN202210612621.XA
Publication of CN117202108A
Legal status: Pending


Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04W: WIRELESS COMMUNICATION NETWORKS
    • H04W 4/00: Services specially adapted for wireless communication networks; Facilities therefor
    • H04W 4/06: Selective distribution of broadcast services, e.g. multimedia broadcast multicast service [MBMS]; Services to user groups; One-way selective calling services
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04L: TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 51/00: User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail
    • H04L 51/07: User-to-user messaging characterised by the inclusion of specific contents
    • H04L 51/10: Multimedia information

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Telephonic Communication Services (AREA)

Abstract

The application discloses a multimedia file sharing method, an electronic device, and a communication system. The first device sends portrait information together with its own address information. The second device receives the portrait information and the address information of the first device, and compares the portrait information with its stored image data. When they match, the second device sends its own address information and the portrait identifier corresponding to the portrait to the first device. The first device displays the portrait identifier and a control for triggering sharing. In response to an operation on the control, the first device sends the photo or video to the second device. As a result, photos and videos can be transferred without repeated user operations, which simplifies use. Moreover, because the photo or video is transferred directly between the devices rather than through a third-party application, the transfer rate is improved.

Description

Multimedia file sharing method, electronic equipment and communication system
Technical Field
The embodiments of this application relate to the technical field of multimedia files, and in particular to a multimedia file sharing method, an electronic device, and a communication system.
Background
As living standards improve, more and more users like to record their lives by taking photos or videos, and they also like to share those photos or videos with others, for example sharing photos of a gathering with friends. The existing sharing method generally works as follows. Application A is installed on both devices (for example, device A and device B). After the two devices connect to the network, user A opens the gallery, and device A displays the gallery interface. User A selects a photo or video on the gallery interface and taps the icon of application A. In response, device A displays the interface of application A. User A selects a friend on that interface, and device A enters the chat interface between user A and the friend. On the chat interface, user A adds the photo or video to the input box and taps the send control to send it to the friend. On device B, user B (user A's friend) needs to open application A, after which device B enters the chat interface with user A. On that chat interface, user B operates on the photo or video displayed on the interface to obtain it. As can be seen, sharing a photo or video in this way requires cumbersome user operations, and device A has to transmit the photo or video to device B through application A, so the sharing rate is low.
Disclosure of Invention
The multimedia file sharing method, the electronic device, and the communication system provided by the embodiments of this application enable quick sharing of multimedia files.
To achieve the above purpose, the embodiments of this application adopt the following technical solutions.
In a first aspect, an embodiment of this application provides a multimedia file sharing method. The executing entity of the method may be an electronic device, or a component located in the electronic device (for example, a chip system or a processor); the method is described below taking a first device as the executing entity. The method includes: the first device displays a first interface on which one or more portraits and a first control are displayed. The first control corresponds to one second device or to a plurality of second devices, and is used to trigger sharing of a multimedia file with the one or more second devices corresponding to the first control, where the multimedia file is a media file containing the one or more portraits. The first device receives a first operation on the first control. In response to the first operation, the first device sends the multimedia file to the one or more second devices.
In this embodiment, when the first device receives the user's operation on the first control, the first device sends the multimedia file (such as a photo or video) to the one or more second devices. The photo or video is transferred without repeated user operations, which simplifies use. In addition, the photo or video is transferred directly between the devices rather than through a third-party application, which improves the transfer rate. Moreover, the photo or video is stored directly in the gallery of the device rather than in some other folder, making it easy for the user to find.
Further, the second device stores the photo or video in the database of its gallery. When the user opens the gallery of the second device, the gallery interface is displayed and the photo or video can be shown on it, so the user can find the content shared by the first device directly in the gallery of the second device.
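The first aspect above can be sketched in code. The following is a minimal illustration, not the patented implementation: all class and field names (`FirstDevice`, `control_targets`, `on_first_operation`) are hypothetical, and the "send" is modeled as an in-memory delivery rather than a network transfer.

```python
from dataclasses import dataclass, field

@dataclass
class SecondDevice:
    """Hypothetical peer that receives shared files."""
    address: str
    received: list = field(default_factory=list)

    def receive(self, multimedia_file: bytes) -> None:
        # The file lands directly in this device's store (its "gallery"),
        # with no third-party application in between.
        self.received.append(multimedia_file)

@dataclass
class FirstDevice:
    """Sketch of the first aspect: one first control maps to one or
    more second devices that matched a portrait in the file."""
    control_targets: dict = field(default_factory=dict)

    def on_first_operation(self, control_id: str, multimedia_file: bytes) -> int:
        """In response to an operation on the first control, send the
        multimedia file to every second device bound to that control."""
        targets = self.control_targets.get(control_id, [])
        for device in targets:
            device.receive(multimedia_file)
        return len(targets)
```

A single tap on the control thus fans the file out to all matched devices, which is the "no frequent user operation" property the text describes.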
In some implementations, identification information of the one or more portraits is also displayed on the first interface; there are one or more first controls, and the identification information of each portrait corresponds to one first control. Displaying the identification information of the portraits on the interface makes it clear which portraits appear in the photo or video.
In one implementation, one or more portraits correspond to the same identification information on the first interface. For example, three portraits, portrait A, portrait B, and portrait C, are displayed in a photograph, and the identification information may be the avatar of account a on a second device. On the first interface, both portrait A and portrait B are then displayed in correspondence with account a. In this way, the identification information displayed on the first device's interface makes clear which second device corresponds to each portrait appearing in the photo or video.
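The many-portraits-to-one-identifier grouping described above can be sketched as a small helper. This is an illustration only; the function name and the mapping shape are assumptions, not the patent's data model.

```python
def controls_for_interface(portrait_to_account: dict) -> dict:
    """Group portraits that share the same identification information,
    so that one first control can represent one account (one second
    device) even when several portraits in the photo belong to it."""
    controls: dict = {}
    for portrait, account in portrait_to_account.items():
        controls.setdefault(account, []).append(portrait)
    return controls
```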
In one implementation, the first interface is a browsing interface, that is, an interface on which the user browses the multimedia file.
In one implementation, the first interface is a photo shooting interface, that is, an interface for capturing a photo. A second control is also displayed on the first interface and is used to trigger capturing the photo; the second control and the first control are the same control. In this embodiment, the first device can preview and recognize the portraits on the photo shooting interface, so the identification information of the portraits is obtained while the first device is focusing. The user does not need to trigger recognition deliberately, which simplifies the procedure and effectively improves the photo sharing rate.
In one implementation, the first interface is a video shooting interface, that is, an interface for recording video. In response to the first operation, the first device sends the multimedia file to the one or more second devices; specifically, while recording the video, the first device transmits the video recorded so far to the one or more second devices.
In one implementation, the multimedia file includes at least one of: a photo, a video.
In some implementations, before the first device displays the first interface, the method further includes: the first device displays a second interface on which one or more portraits, and identification information of the one or more portraits, are displayed.
In some implementations, before the first device displays the first interface, the method further includes: the first device sends a first message to the one or more second devices, where the first message carries identification information of the one or more portraits and is used to request communication. The second device is configured to send a second message to the first device when the identification information of the one or more portraits matches its pre-stored information, where the second message is used to indicate that the second device is a target device of the first device.
In one implementation, the first message carries address information of the first device, and the second message carries address information of the second device.
In this embodiment, after receiving the first message sent by the first device, the second device determines, according to the first message, whether the information in the first message is consistent with the information it has stored. If it is, the second device sends the first device a second message carrying the second device's address information. A communication connection is thus established between the two devices: the first device can communicate with the second device according to the second device's address information, and vice versa, which ensures the security and speed of the subsequent photo transfer.
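The second-device side of this handshake can be sketched as follows. The message field names (`portrait_info`, `type`, `address`) are illustrative assumptions, and "comparison" is modeled as simple equality in place of real face matching.

```python
def handle_first_message(first_message: dict, stored_image_data: dict,
                         my_address: str):
    """Second-device handler for the first message. stored_image_data
    maps a portrait identifier to the image data this device has
    pre-stored for that portrait."""
    for portrait_id, image_data in stored_image_data.items():
        # "Comparison is consistent": the received portrait information
        # matches the locally stored image data.
        if first_message["portrait_info"] == image_data:
            # The second message marks this device as a target device and
            # carries its own address information back to the first device.
            return {"type": "second_message",
                    "address": my_address,
                    "portrait_id": portrait_id}
    return None  # no match, so no second message is sent
```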
In one implementation, the second message carries identification information of a target user, determined by the second device according to the identification information of the one or more portraits. The method further includes: the first device receives the second messages sent by the one or more second devices, and the identification information of the target user is also displayed on the first interface displayed by the first device.
In one implementation, the method further includes: the first device encrypts a third message to obtain the first message, where the third message carries the identification information of the one or more portraits. In another implementation, the third message carries the identification information of the one or more portraits and the address information of the first device.
In one implementation, the method further includes: the second device encrypts a fourth message to obtain the second message, where the fourth message carries the identification information of the one or more portraits. In another implementation, the fourth message carries the identification information of the one or more portraits and the address information of the second device.
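The encrypt-then-send construction of the first message can be sketched as below. The XOR cipher is a deliberate placeholder so the example stays self-contained; a real device would use an authenticated scheme such as AES-GCM, and the payload field names are assumptions.

```python
import json

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # Placeholder symmetric cipher: XOR with a repeating key. Chosen only
    # to keep the sketch dependency-free; not suitable for real use.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

def build_first_message(portrait_ids, address, key: bytes) -> bytes:
    # The third message carries the portrait identification information
    # and the first device's address; encrypting it yields the first message.
    third_message = json.dumps({"portraits": portrait_ids,
                                "address": address}).encode()
    return xor_cipher(third_message, key)

def open_first_message(first_message: bytes, key: bytes) -> dict:
    # The receiver decrypts with the shared key to recover the third message.
    return json.loads(xor_cipher(first_message, key).decode())
```

The fourth-message/second-message pair on the second device would follow the same pattern in the opposite direction.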
In some implementations, after the first device sends the multimedia file to the second device in response to the first operation, the method further includes: the first device sends prompt information to the second device, where the prompt information is used to notify the second device that the multimedia file has been updated.
In one implementation, in response to the first operation, the first device sends the multimedia file to the second device as follows: in response to the first operation, the first device sends the multimedia file to a server, and the server is configured to receive the multimedia file and to provide the second device with an interface for obtaining it.
In this embodiment, the first device sends the photo to the server, which stores it in a shared folder: a storage area on the server shared by the first device and the second device. The second device can obtain the photo from the server; specifically, the second device operates on the shared folder on the gallery interface to open it and obtain the photo. The second device then stores the photo in the database of its gallery, so the user can find the photo directly in the gallery, which makes it easy to locate. In addition, during the transfer the first device does not need to compress the photo to satisfy the traffic or resource limits of a third-party application; it can upload the original photo directly to the server, so the photo reaches the gallery of the second device losslessly, and the photo looks the same on both devices.
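The server-relay variant can be sketched as a shared folder held on the server. The class and method names are illustrative; the shared folder is modeled as an in-memory dictionary.

```python
class SharedFolderServer:
    """Hypothetical relay server: the shared folder is a storage area
    on the server shared by the first device and the second devices."""
    def __init__(self):
        self.shared_folder: dict[str, bytes] = {}

    def upload(self, name: str, original: bytes) -> None:
        # The first device uploads the original file; no compression is
        # applied, so the copy the second device receives is lossless.
        self.shared_folder[name] = original

    def fetch(self, name: str) -> bytes:
        # The interface through which a second device obtains the file
        # after opening the shared folder from its gallery interface.
        return self.shared_folder[name]
```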
In one implementation, after the first device sends the multimedia file to the one or more second devices in response to the first operation, the method further includes: the first device sends a third message to the second device, where the third message indicates that the multimedia file is in a protected state within a first preset time.
In one implementation, the protected state is a state in which the file cannot be acquired or used, and includes at least one of: a state in which screen recording is disabled, a state in which copying is disabled, and a state in which sharing is disabled.
In some implementations, after the first device sends the third message to the second device, the method further includes: when the multimedia file was sent by mistake, the first device receives a second operation on the multimedia file within the first preset time. In response to the second operation, the first device sends a fourth message to the second device requesting deletion of the multimedia file, and the second device is configured to delete the multimedia file based on the fourth message.
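The protected state and the deletion window can be sketched together. Times are passed in explicitly to keep the example deterministic; the class and method names are assumptions, not the patent's terminology.

```python
PROTECTED_STATE = {"no_screen_recording", "no_copying", "no_sharing"}

class ReceivedFile:
    """Second-device record of a shared file that stays protected
    (cannot be screen-recorded, copied, or re-shared) for a preset
    time window after it is received."""
    def __init__(self, data: bytes, received_at: float, preset_time: float):
        self.data = data
        # Third message semantics: protected within the first preset time.
        self.protected_until = received_at + preset_time
        self.deleted = False

    def is_protected(self, now: float) -> bool:
        return not self.deleted and now < self.protected_until

    def handle_fourth_message(self, now: float) -> bool:
        """Delete a mistakenly sent file if the deletion request from
        the first device arrives within the preset time."""
        if now < self.protected_until:
            self.data = b""
            self.deleted = True
        return self.deleted
```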
In a second aspect, an embodiment of this application provides a multimedia file sharing method applied to a communication system, where the communication system includes a first device and one or more second devices. The method includes: the first device displays a first interface on which one or more portraits and a first control are displayed. The first control corresponds to one second device or to a plurality of second devices, and is used to trigger sharing of a multimedia file with the one or more second devices corresponding to the first control, where the multimedia file is a media file containing the one or more portraits. The first device receives a first operation on the first control. In response to the first operation, the first device sends the multimedia file to the one or more second devices.
In this embodiment, when the first device receives the user's operation on the first control, the first device sends the multimedia file (such as a photo or video) to the one or more second devices. The photo or video is transferred without repeated user operations, which simplifies use. In addition, the photo or video is transferred directly between the devices rather than through a third-party application, which improves the transfer rate. Moreover, the photo or video is stored directly in the gallery of the device rather than in some other folder, making it easy for the user to find.
Further, the second device stores the photo or video in the database of its gallery. When the user opens the gallery of the second device, the gallery interface is displayed and the photo or video can be shown on it, so the user can find the content shared by the first device directly in the gallery of the second device.
In some implementations, identification information of the one or more portraits is also displayed on the first interface; there are one or more first controls, and the identification information of each portrait corresponds to one first control. Displaying the identification information of the portraits on the interface makes it clear which portraits appear in the photo or video.
In one implementation, one or more portraits correspond to the same identification information on the first interface. For example, three portraits, portrait A, portrait B, and portrait C, are displayed in a photograph, and the identification information may be the avatar of account a on a second device. On the first interface, both portrait A and portrait B are then displayed in correspondence with account a. In this way, the identification information displayed on the first device's interface makes clear which second device corresponds to each portrait appearing in the photo or video.
In one implementation, the first interface is a browsing interface, that is, an interface on which the user browses the multimedia file.
In one implementation, the first interface is a photo shooting interface, that is, an interface for capturing a photo. A second control is also displayed on the first interface and is used to trigger capturing the photo; the second control and the first control are the same control. In this embodiment, the first device can preview and recognize the portraits on the photo shooting interface, so the identification information of the portraits is obtained while the first device is focusing. The user does not need to trigger recognition deliberately, which simplifies the procedure and effectively improves the photo sharing rate.
In one implementation, the first interface is a video shooting interface, that is, an interface for recording video. In response to the first operation, the first device sends the multimedia file to the one or more second devices; specifically, while recording the video, the first device transmits the video recorded so far to the one or more second devices.
In one implementation, the multimedia file includes at least one of: a photo, a video.
In some implementations, before the first device displays the first interface, the method further includes: the first device displays a second interface on which one or more portraits, and identification information of the one or more portraits, are displayed.
In some implementations, before the first device displays the first interface, the method further includes: the first device sends a first message to the one or more second devices, where the first message carries identification information of the one or more portraits and is used to request communication. The one or more second devices compare the identification information of the one or more portraits with pre-stored information. When the identification information of the one or more portraits is consistent with the pre-stored information, the one or more second devices send the first device a second message, which is used to indicate that the second device is a target device of the first device.
In some implementations, the first message carries address information of the first device, and the second message carries address information of the second device.
In this embodiment, after receiving the first message sent by the first device, the second device determines, according to the first message, whether the information in the first message is consistent with the information it has stored. If it is, the second device sends the first device a second message carrying the second device's address information. A communication connection is thus established between the two devices: the first device can communicate with the second device according to the second device's address information, and vice versa, which ensures the security and speed of the subsequent photo transfer.
In some implementations, the method further includes: when the identification information of the one or more portraits is consistent with the pre-stored information, the second device determines one or more target users and identification information of the target users. The first device receives a second message carrying the identification information of the target user, and the identification information of the target user is also displayed on the first interface displayed by the first device.
In one implementation, the method further includes: the first device encrypts a third message to obtain the first message, where the third message carries the identification information of the one or more portraits. In another implementation, the third message carries the identification information of the one or more portraits and the address information of the first device.
In one implementation, the method further includes: the second device encrypts a fourth message to obtain the second message, where the fourth message carries the identification information of the one or more portraits. In another implementation, the fourth message carries the identification information of the one or more portraits and the address information of the second device.
In some implementations, after the first device sends the multimedia file to the second device in response to the first operation, the method further includes: the first device sends prompt information to the second device, where the prompt information is used to notify the second device that the multimedia file has been updated. The second device receives the prompt information and displays a third interface on which the prompt information is displayed.
In some implementations, the communication system further includes a server. In response to the first operation, the first device sends the multimedia file to the second device as follows: in response to the first operation, the first device sends the multimedia file to the server. The server receives the multimedia file and is configured to provide the second device with an interface for obtaining it.
In some implementations, after the server receives the multimedia file, the method further includes: the server stores the multimedia file in a shared folder, that is, a storage area on the server shared by the first device and the one or more second devices.
In some implementations, the method further includes: the second device displays a fourth interface on which an icon of the shared folder is displayed. The second device receives a second operation on the icon of the shared folder. In response to the second operation, the second device obtains the multimedia file from the server.
In this embodiment, the first device sends the photo to the server, which stores it in the shared folder: a storage area on the server shared by the first device and the second device. The second device can obtain the photo from the server; specifically, the second device operates on the shared folder on the gallery interface to open it and obtain the photo. The second device then stores the photo in the database of its gallery, so the user can find the photo directly in the gallery, which makes it easy to locate. In addition, during the transfer the first device does not need to compress the photo to satisfy the traffic or resource limits of a third-party application; it can upload the original photo directly to the server, so the photo reaches the gallery of the second device losslessly, and the photo looks the same on both devices.
In some implementations, the second device displays the fourth interface as follows: after the second device satisfies a first condition, the second device displays the fourth interface.
In some implementations, the first condition includes at least one of: the second device receives prompt information sent by the first device, where the prompt information notifies the second device that the multimedia file has been updated; or the second device receives a third operation, that is, an operation for triggering the display of the icon of the shared folder.
In some implementations, in response to the second operation, the second device obtains the multimedia file from the server as follows: in response to the second operation, the second device displays a fifth interface on which an input box for entering verification information is displayed. The second device receives a fourth operation on the input box. In response to the fourth operation, the second device sends the server a first request carrying the verification information, where the first request is used to obtain the multimedia file. The server sends the multimedia file to the second device according to the first request, and the second device receives the multimedia file.
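The server side of this verification exchange can be sketched as follows. The class name, the stored verification code, and the request shape are all assumptions made for illustration; a real server would compare codes in constant time and authenticate the transport.

```python
class VerifyingServer:
    """Sketch of the first-request exchange: the server releases the
    multimedia file only when the verification information carried in
    the request matches what it expects."""
    def __init__(self, verification_info: str):
        self.verification_info = verification_info
        self.shared_folder: dict[str, bytes] = {}

    def handle_first_request(self, name: str, verification_info: str):
        if verification_info != self.verification_info:
            return None  # verification failed: the file is not returned
        # Verification passed: send the multimedia file to the second device.
        return self.shared_folder.get(name)
```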
In some implementations, after the first device sends the multimedia file to the second device in response to the first operation, the method further includes: the second device displays a sixth interface on which the multimedia file is displayed.
In one implementation, the sixth interface is an interface of a gallery application.
In some implementations, after the first device sends the multimedia file to the one or more second devices in response to the first operation, the method further includes: the first device sends a third message to the second device, where the third message indicates that the multimedia file is in a protected state within a first preset time.
In one implementation, the protected state is a state in which the file cannot be acquired or used, and includes at least one of: a state in which screen recording is disabled, a state in which copying is disabled, and a state in which sharing is disabled.
In some implementations, after the first device sends the third message to the second device, the method further includes: when the multimedia file was sent by mistake, the first device receives a fifth operation on the multimedia file within the first preset time. In response to the fifth operation, the first device sends a fourth message to the second device requesting deletion of the multimedia file, and the second device deletes the multimedia file according to the fourth message.
In a third aspect, an embodiment of the present application provides a method for sharing a multimedia file, where an execution body of the method may be an electronic device, or may be a component (for example, a chip system, or a processor) located in the electronic device, and the method is described below taking an example that the execution body is a first device, where the method includes: the first device sends a first message to one or more second devices, the first message carrying identification information of one or more portraits in the multimedia file, the first message being for requesting communication. The second device is configured to send a second message to the first device when the identification information of the one or more portraits matches the pre-stored information, the second message being used to characterize the second device as a target device for the first device. The first device receives a second message sent by the second device.
In the embodiment of the application, after the second device receives the first message sent by the first device, the second device determines whether the information in the first message is consistent with the information stored in the second device according to the first message. In the case of consistent information, the second device sends a second message to the first device, the second message being used to characterize the second device as the target device for the first device. The first device receives a second message sent by the second device. Thus, the communication connection between the first equipment and the second equipment is established, the first equipment and the second equipment are communicated, and the second equipment can also be communicated with the first equipment, so that the security and the rapidness of photo transmission are ensured in the process of subsequently transmitting photos.
In some implementations, the first message carries address information of the first device, and the second message carries address information of the second device. In this way, the first device can communicate with the second device according to the address information of the second device, and the second device can communicate with the first device according to the address information of the first device, which ensures secure and fast transmission when photos are subsequently transferred.
In some implementations, the first device displays a first interface, where one or more portraits in the multimedia file are displayed on the first interface. The first device further displays a first control on the first interface, where the first control corresponds to one second device or to a plurality of second devices, the first control is used to trigger sharing of a multimedia file with the one or more second devices corresponding to the first control, and the multimedia file is a media file containing the one or more portraits. The first device receives a first operation on the first control. In response to the first operation, the first device sends the multimedia file to the one or more second devices.
It can be seen that, when the first device receives the user's operation on the first control, the first device sends the multimedia file (for example, a photo or a video) to the one or more second devices. The photo or video is transmitted without requiring frequent operations by the user, which simplifies the user's operations. In addition, in the embodiment of the present application, the photo or video is transmitted directly between the devices rather than through a third-party application, which improves the transmission rate. Furthermore, the photo or video is stored directly in the gallery of the device rather than in other folders on the device, which makes it convenient for the user to find.
In addition, in the embodiment of the present application, the second device stores the photo or video in the database of its gallery. When the user opens the gallery of the second device, an interface of the gallery is displayed on the second device, and the photo or video can be displayed on that interface. In this way, the user can directly find the photo or video shared by the first device in the gallery of the second device.
In some specific implementations, the second message carries identification information of a target user, where the identification information of the target user is determined by the second device according to the identification information of the one or more portraits, and the method further includes: the first device receives the identification information of the target user and the address information of the second device. The first device further displays the identification information of the target user on the first interface. Displaying the identification information of a portrait on the interface of the first device makes it clear which person appears in the photo or video.
In a specific implementation, each of the one or more portraits corresponds to a piece of identification information on the first interface. For example, three portraits, portrait A, portrait B, and portrait C, are displayed on the photo. The identification information may be the avatar of account a of the second device. Then, on the first interface, both portrait A and portrait B are displayed in correspondence with account a. In this way, the identification information of a portrait is displayed on the interface of the first device, so that the second device corresponding to a person appearing in the photo or video can be clearly identified.
In a specific implementation, the method further includes: the first device encrypts a third message to obtain the first message, where the third message carries the identification information of the one or more portraits. In another specific implementation, the third message carries the identification information of the one or more portraits and the address information of the first device.
In a specific implementation, the method further includes: the second device encrypts a fourth message to obtain the second message, where the fourth message carries the identification information of the one or more portraits. In another specific implementation, the fourth message carries the identification information of the one or more portraits and the address information of the second device.
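The implementations above only say the third and fourth messages are "encrypted" to obtain the first and second messages, without naming a cipher. The sketch below stands in a toy HMAC-derived keystream for whatever cipher a real implementation would negotiate (for example AES-GCM); the shared key and payload fields are assumptions made for illustration only.

```python
import hashlib
import hmac
import json
import os

def _keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Derive a keystream of the given length from key and nonce via HMAC."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hmac.new(key, nonce + counter.to_bytes(4, "big"),
                        hashlib.sha256).digest()
        counter += 1
    return out[:length]

def encrypt_message(key: bytes, payload: dict) -> bytes:
    """Encrypt a third/fourth message (portrait IDs + address) into the
    first/second message: nonce followed by XOR-ed ciphertext."""
    plaintext = json.dumps(payload).encode()
    nonce = os.urandom(16)
    stream = _keystream(key, nonce, len(plaintext))
    return nonce + bytes(p ^ s for p, s in zip(plaintext, stream))

def decrypt_message(key: bytes, blob: bytes) -> dict:
    """Recover the inner message from the encrypted blob."""
    nonce, ciphertext = blob[:16], blob[16:]
    stream = _keystream(key, nonce, len(ciphertext))
    return json.loads(bytes(c ^ s for c, s in zip(ciphertext, stream)))

key = b"shared-device-key"
blob = encrypt_message(key, {"portrait_ids": ["face-A"],
                             "address": "192.168.0.10"})
```

A production implementation would also authenticate the ciphertext; the point here is only the structure of wrapping the third message into the first message.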
In some specific implementations, the first interface further displays identification information of the one or more portraits, there are one or more first controls, and the identification information of each portrait corresponds to one first control.
In a specific implementation, the first interface is a browsing interface, and the browsing interface is an interface on which the user browses the multimedia file.
In a specific implementation, the first interface is a photo shooting interface, and the photo shooting interface is an interface for capturing a photo. A second control is further displayed on the first interface and is used to trigger capture of the photo, and the second control and the first control are the same control. In the embodiment of the present application, the first device can preview and recognize a portrait in advance on the photo shooting interface, so that the identification information of the portrait is obtained while the first device is focusing. The user therefore does not need to deliberately trigger a recognition operation, which simplifies the procedure and effectively improves the photo sharing rate.
In a specific implementation, the first interface is a video shooting interface, and the video shooting interface is an interface for recording a video. That the first device sends the multimedia file to the one or more second devices in response to the first operation is specifically: in response to the first operation, the first device sends the recorded video to the one or more second devices while the video is being recorded.
In a specific implementation, the multimedia file includes at least one of the following: a photo or a video.
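Sending the recorded video while recording can be sketched as a chunked pipeline: each encoded chunk is forwarded to every second device as soon as it exists, instead of waiting for recording to finish. The chunk source and the send step below are illustrative placeholders for the camera pipeline and the network connection.

```python
def record_video_chunks():
    """Stand-in for the camera/encoder pipeline: yields chunks as produced."""
    for i in range(3):
        yield f"chunk-{i}".encode()

def share_while_recording(second_devices, outbox):
    """Forward each chunk to every second device the moment it is encoded."""
    for chunk in record_video_chunks():
        for device in second_devices:
            outbox.append((device, chunk))  # placeholder for a network send

outbox = []
share_while_recording(["phone-02", "phone-03"], outbox)
```

With this structure, the second devices begin receiving the video almost as soon as recording starts, which is what allows sharing to complete close to the moment recording ends.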
In some specific implementations, after the first device sends the multimedia file to the second device in response to the first operation, the method further includes: the first device sends prompt information to the second device, where the prompt information is used to prompt the second device that the multimedia file has been updated. The second device is configured to receive the prompt information, display a third interface, and display the prompt information on the third interface.
In some specific implementations, that the first device sends the multimedia file to the second device in response to the first operation is specifically: in response to the first operation, the first device sends the multimedia file to a server. The server is configured to receive the multimedia file and to provide the second device with an interface for obtaining the multimedia file.
In some implementations, the server is further configured to store the multimedia files in a shared folder, the shared folder being a storage area shared on the server by the first device and the one or more second devices.
In some implementations, the second device is configured to display a fourth interface, where an icon of the shared folder is displayed on the fourth interface. The second device is configured to receive a second operation on the icon of the shared folder. In response to the second operation, the second device is configured to obtain the multimedia file from the server.
In some specific implementations, the second device is further configured to display the fourth interface after the second device satisfies a first condition.
In the embodiment of the present application, the first device sends the photo to the server. The server stores the photo in a shared folder, which is a storage area shared on the server by the first device and the second device. The second device can obtain the photo from the server; specifically, the second device operates on the shared folder on the interface of the gallery to open it, and thereby obtains the photo from the shared folder. The second device stores the photo in the database of the gallery, so that when the user looks for the photo, the user can search for it directly in the gallery, which makes it easy to find. In addition, during transmission of the photo, the first device does not need to compress the photo to satisfy restrictions such as the traffic and resource limits of a third-party application, and can upload the original photo directly to the server, so that the photo is transmitted losslessly to the gallery of the second device, which effectively ensures that the photo looks the same on the first device and on the second device.
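The shared-folder flow just described can be sketched as follows: the first device uploads the original bytes, uncompressed, to a server-side folder shared with the second devices, and a second device later downloads the file into its gallery database. The server API, folder name, and file name here are invented for illustration.

```python
class SharedFolderServer:
    """Toy server holding shared folders: folder name -> {file name -> bytes}."""

    def __init__(self):
        self.folders = {}

    def upload(self, folder, name, data):
        # The original bytes are stored as-is: no third-party compression.
        self.folders.setdefault(folder, {})[name] = data

    def download(self, folder, name):
        return self.folders[folder][name]

class SecondDevice:
    """Toy second device: its gallery database maps file names to bytes."""

    def __init__(self):
        self.gallery_db = {}

    def fetch_shared(self, server, folder, name):
        # Opening the shared folder pulls the file straight into the gallery.
        self.gallery_db[name] = server.download(folder, name)

server = SharedFolderServer()
server.upload("family-share", "IMG_001.jpg", b"\xff\xd8original-bytes")
phone_02 = SecondDevice()
phone_02.fetch_shared(server, "family-share", "IMG_001.jpg")
```

Because the server never re-encodes the upload, the bytes in the second device's gallery are identical to the first device's original, which is the lossless-transfer property the paragraph above claims.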
In some implementations, the first condition includes at least one of the following: the second device receives the prompt information sent by the first device, where the prompt information is used to prompt the second device that the multimedia file has been updated; or the second device receives a third operation on the second device, where the third operation is an operation used to trigger display of the icon of the shared folder.
In some specific implementations, the second device is further configured to display a fifth interface in response to the second operation, where an input box is displayed on the fifth interface and is used to input verification information. The second device is further configured to receive a fourth operation on the input box. In response to the fourth operation, the second device is further configured to send a first request to the server, where the first request carries the verification information and is used to obtain the multimedia file. The server is configured to send the multimedia file to the second device according to the first request. The second device is further configured to receive the multimedia file.
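The server side of this verification step can be sketched as below: the first request carries the verification information typed into the input box, and the server returns the multimedia file only when that information checks out. The expected code, response shape, and error statuses are assumptions for the sketch, not part of the embodiments.

```python
def handle_first_request(expected_code, files, request):
    """Server: verify the first request, then return the requested file."""
    if request.get("verification") != expected_code:
        return {"status": "denied"}          # wrong or missing verification info
    name = request.get("file")
    if name not in files:
        return {"status": "not_found"}       # no such multimedia file
    return {"status": "ok", "data": files[name]}

files = {"IMG_001.jpg": b"photo-bytes"}
granted = handle_first_request("1234", files,
                               {"verification": "1234", "file": "IMG_001.jpg"})
denied = handle_first_request("1234", files,
                              {"verification": "0000", "file": "IMG_001.jpg"})
```

Gating the download on the verification information is what keeps the shared folder from being readable by any device that merely knows its name.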
In some specific implementations, the second device is further configured to display a sixth interface on which the multimedia file is displayed.
In one specific implementation, the sixth interface is an interface of a gallery application.
In some implementations, after the first device sends the multimedia file to the one or more second devices in response to the first operation, the method further includes: the first device sends a third message to the second device, where the third message is used to indicate that the multimedia file is in a protected state within a first preset time.
In a specific implementation, the protected state is a state in which the multimedia file cannot be acquired or used. The protected state includes at least one of the following: a state in which screen recording and screenshots are disabled, a state in which copying is disabled, and a state in which sharing is disabled.
In some specific implementations, after the first device sends the third message to the second device, the method further includes: when the multimedia file has been sent by mistake, the first device receives a fifth operation on the multimedia file within the first preset time. In response to the fifth operation, the first device sends a fourth message to the second device, where the fourth message is used to request deletion of the multimedia file. The second device is configured to delete the multimedia file according to the fourth message.
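The third and fourth messages above can be sketched together: a received file is marked protected for the first preset time, during which sharing (and similarly recording or copying) is blocked, and a fourth message from the sender deletes a mis-sent file within that window. The duration, field names, and gallery structure are illustrative assumptions.

```python
import time

class ReceivedFile:
    PRESET_SECONDS = 60.0  # "first preset time"; the value is an assumption

    def __init__(self, name, data, received_at=None):
        self.name, self.data = name, data
        self.received_at = time.monotonic() if received_at is None else received_at

    def is_protected(self, now=None):
        """Protected while the first preset time has not yet elapsed."""
        now = time.monotonic() if now is None else now
        return now - self.received_at < self.PRESET_SECONDS

    def can_share(self, now=None):
        """Sharing (like recording or copying) is blocked while protected."""
        return not self.is_protected(now)

def handle_fourth_message(gallery, message):
    """Second device: delete the mis-sent file named in the fourth message."""
    gallery.pop(message["delete"], None)

gallery = {"IMG_001.jpg": ReceivedFile("IMG_001.jpg", b"photo", received_at=0.0)}
protected_now = gallery["IMG_001.jpg"].is_protected(now=10.0)   # inside window
shareable_later = gallery["IMG_001.jpg"].can_share(now=120.0)   # window elapsed
handle_fourth_message(gallery, {"delete": "IMG_001.jpg"})
```

The protection window is what gives the sender time to notice a mis-send and issue the fourth message before the receiver can propagate the file further.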
In a fourth aspect, an embodiment of the present application provides an electronic device, including: a first display module, configured to display a first interface, where one or more portraits and a first control are displayed on the first interface, the first control corresponds to one second device or to a plurality of second devices, the first control is used to trigger sharing of a multimedia file with the one or more second devices corresponding to the first control, and the multimedia file is a media file containing the one or more portraits; a first receiving module, configured to receive a first operation on the first control; and a first sending module, configured to send, in response to the first operation, the multimedia file to the one or more second devices.
In the embodiment of the present application, when the first device receives the user's operation on the first control, the first device sends the multimedia file (for example, a photo or a video) to the one or more second devices. The photo or video is transmitted without requiring frequent operations by the user, which simplifies the user's operations. In addition, in the embodiment of the present application, the photo or video is transmitted directly between the devices rather than through a third-party application, which improves the transmission rate. Furthermore, the photo or video is stored directly in the gallery of the device rather than in other folders on the device, which makes it convenient for the user to find.
In addition, in the embodiment of the present application, the second device stores the photo or video in the database of its gallery. When the user opens the gallery of the second device, an interface of the gallery is displayed on the second device, and the photo or video can be displayed on that interface. In this way, the user can directly find the photo or video shared by the first device in the gallery of the second device.
In some specific implementations, the first interface further displays identification information of the one or more portraits, there are one or more first controls, and the identification information of each portrait corresponds to one first control. Displaying the identification information of a portrait on the interface of the first device makes it clear which person appears in the photo or video.
In a specific implementation, each of the one or more portraits corresponds to a piece of identification information on the first interface. For example, three portraits, portrait A, portrait B, and portrait C, are displayed on the photo. The identification information may be the avatar of account a of the second device. Then, on the first interface, both portrait A and portrait B are displayed in correspondence with account a. In this way, the identification information of a portrait is displayed on the interface of the first device, so that the second device corresponding to a person appearing in the photo or video can be clearly identified.
In a specific implementation, the first interface is a browsing interface, and the browsing interface is an interface on which the user browses the multimedia file.
In a specific implementation, the first interface is a photo shooting interface, and the photo shooting interface is an interface for capturing a photo. A second control is further displayed on the first interface and is used to trigger capture of the photo, and the second control and the first control are the same control.
In a specific implementation, the first interface is a video shooting interface, and the video shooting interface is an interface for recording a video. The first sending module is further configured to send, in response to the first operation, the recorded video to the one or more second devices while the video is being recorded.
In a specific implementation, the multimedia file includes at least one of the following: a photo or a video.
In some implementations, the electronic device further includes: a second display module, configured to display a second interface, where the second interface is used to display the identification information of the one or more portraits.
In some implementations, the electronic device further includes: a second sending module, configured to send a first message to the one or more second devices, where the first message carries the identification information of the one or more portraits and is used to request communication. The second device is configured to send a second message to the first device when the identification information of the one or more portraits matches pre-stored information, where the second message is used to indicate that the second device is a target device of the first device.
In a specific implementation, the first message carries the address information of the first device, and the second message carries the address information of the second device.
In a specific implementation, the second message carries identification information of a target user, where the identification information of the target user is determined by the second device according to the identification information of the one or more portraits, and the electronic device further includes: a second receiving module, configured to receive the second messages sent by the one or more second devices. The first device further displays the identification information of the target user on the first interface.
In some implementations, the electronic device further includes: a third sending module, configured to send prompt information to the second device, where the prompt information is used to prompt the second device that the multimedia file has been updated.
In a specific implementation, the first sending module is further configured to send, in response to the first operation, the multimedia file to a server. The server is configured to receive the multimedia file and to provide the second device with an interface for obtaining the multimedia file.
In a specific implementation, the electronic device further includes: a fourth sending module, configured to send a third message to the second device, where the third message is used to indicate that the multimedia file is in a protected state within a first preset time.
In a specific implementation, the protected state is a state in which the multimedia file cannot be acquired or used, and the protected state includes at least one of the following: a state in which screen recording and screenshots are disabled, a state in which copying is disabled, and a state in which sharing is disabled.
In some implementations, the electronic device further includes: a third receiving module, configured to receive, when the multimedia file has been sent by mistake, a second operation on the multimedia file within the first preset time; and a fifth sending module, configured to send, in response to the second operation, a fourth message to the second device, where the fourth message is used to request deletion of the multimedia file. The second device is configured to delete the multimedia file according to the fourth message.
In a fifth aspect, an embodiment of the present application provides an electronic device, including: a first sending module, configured to send a first message to one or more second devices, where the first message carries identification information of one or more portraits in a multimedia file and is used to request communication; the second device is configured to send a second message to the first device when the identification information of the one or more portraits matches pre-stored information, where the second message is used to indicate that the second device is a target device of the first device; and a first receiving module, configured to receive the second message sent by the second device.
In the embodiment of the present application, after the second device receives the first message sent by the first device, the second device determines, according to the first message, whether the information in the first message is consistent with the information stored on the second device. When the information is consistent, the second device sends a second message to the first device, where the second message is used to indicate that the second device is a target device of the first device. The first device receives the second message sent by the second device. In this way, a communication connection is established between the first device and the second device, so that the two devices can communicate with each other, which ensures secure and fast transmission when photos are subsequently transferred.
In some implementations, the first message carries address information of the first device, and the second message carries address information of the second device. In this way, the first device can communicate with the second device according to the address information of the second device, and the second device can communicate with the first device according to the address information of the first device, which ensures secure and fast transmission when photos are subsequently transferred.
In some implementations, the electronic device further includes: a first display module, configured to display a first interface, where one or more portraits are displayed on the first interface. The first device further displays a first control on the first interface, where the first control corresponds to one second device or to a plurality of second devices, the first control is used to trigger sharing of a multimedia file with the one or more second devices corresponding to the first control, and the multimedia file is a media file containing the one or more portraits. The electronic device further includes: a second receiving module, configured to receive a first operation on the first control; and a second sending module, configured to send, in response to the first operation, the multimedia file to the one or more second devices.
In the embodiment of the present application, when the first device receives the user's operation on the first control, the first device sends the multimedia file (for example, a photo or a video) to the one or more second devices. The photo or video is transmitted without requiring frequent operations by the user, which simplifies the user's operations. In addition, in the embodiment of the present application, the photo or video is transmitted directly between the devices rather than through a third-party application, which improves the transmission rate. Furthermore, the photo or video is stored directly in the gallery of the device rather than in other folders on the device, which makes it convenient for the user to find.
In addition, in the embodiment of the present application, the second device stores the photo or video in the database of its gallery. When the user opens the gallery of the second device, an interface of the gallery is displayed on the second device, and the photo or video can be displayed on that interface. In this way, the user can directly find the photo or video shared by the first device in the gallery of the second device.
In some implementations, the first message carries the identification information of the one or more portraits and the address information of the first device. The second device is further configured to determine one or more target users and identification information of the target users according to the identification information of the one or more portraits and pre-stored user information, and to send the identification information of the target users and the address information of the second device to the first device.
In some specific implementations, the second message carries the identification information of the target user, where the identification information of the target user is determined by the second device according to the identification information of the one or more portraits, and the electronic device further includes: a third receiving module, configured to receive the identification information of the target user and the address information of the second device. The first device further displays the identification information of the target user on the first interface. Displaying the identification information of a portrait on the interface of the first device makes it clear which person appears in the photo or video.
In a specific implementation, each of the one or more portraits corresponds to a piece of identification information on the first interface. For example, three portraits, portrait A, portrait B, and portrait C, are displayed on the photo. The identification information may be the avatar of account a of the second device. Then, on the first interface, both portrait A and portrait B are displayed in correspondence with account a. In this way, the identification information of a portrait is displayed on the interface of the first device, so that the second device corresponding to a person appearing in the photo or video can be clearly identified.
In some specific implementations, the first interface further displays identification information of the one or more portraits, there are one or more first controls, and the identification information of each portrait corresponds to one first control.
In a specific implementation, the first interface is a browsing interface, and the browsing interface is an interface on which the user browses the multimedia file.
In a specific implementation, the first interface is a photo shooting interface, and the photo shooting interface is an interface for capturing a photo. A second control is further displayed on the first interface and is used to trigger capture of the photo, and the second control and the first control are the same control.
In a specific implementation, the first interface is a video shooting interface, and the video shooting interface is an interface for recording a video. The second sending module is further configured to send, in response to the first operation, the recorded video to the one or more second devices while the video is being recorded.
In a specific implementation, the multimedia file includes at least one of the following: a photo or a video.
In some implementations, the electronic device further includes: a third sending module, configured to send prompt information to the second device, where the prompt information is used to prompt the second device that the multimedia file has been updated. The second device is configured to receive the prompt information, display a third interface, and display the prompt information on the third interface.
In some specific implementations, the second sending module is further configured to send, in response to the first operation, the multimedia file to a server. The server is configured to receive the multimedia file and to provide the second device with an interface for obtaining the multimedia file.
In some implementations, the server is further configured to store the multimedia files in a shared folder, the shared folder being a storage area shared on the server by the first device and the one or more second devices.
In some specific implementations, the second device is further configured to display a fourth interface, where an icon of the shared folder is displayed on the fourth interface. The second device is further configured to receive a second operation on the icon of the shared folder. The second device is further configured to obtain the multimedia file from the server in response to the second operation.
In some specific implementations, the second device is further configured to display the fourth interface after the second device satisfies a first condition.
In some implementations, the first condition includes at least one of the following: the second device receives the prompt information sent by the first device, where the prompt information is used to prompt the second device that the multimedia file has been updated; or the second device receives a third operation on the second device, where the third operation is an operation used to trigger display of the icon of the shared folder.
In some specific implementations, the second device is further configured to display a fifth interface in response to the second operation, where an input box is displayed on the fifth interface and is used to input verification information. The second device is further configured to receive a fourth operation on the input box. In response to the fourth operation, the second device is further configured to send a first request to the server, where the first request carries the verification information and is used to obtain the multimedia file. The server is configured to send the multimedia file to the second device according to the first request. The second device is further configured to receive the multimedia file.
In some specific implementations, the second device is further configured to display a sixth interface on which the multimedia file is displayed.
In one specific implementation, the sixth interface is an interface of a gallery application.
In some implementations, the electronic device further includes: a fourth sending module, configured to send a third message to the second device, where the third message is used to indicate that the multimedia file is in a protected state within a first preset time.
In a specific implementation, the protected state is a state in which the multimedia file cannot be acquired or used. The protected state includes at least one of the following: a state in which screen recording and screenshots are disabled, a state in which copying is disabled, and a state in which sharing is disabled.
In some implementations, the electronic device further includes: a fourth receiving module, configured to receive, when the multimedia file has been sent by mistake, a fifth operation on the multimedia file within the first preset time; and a fifth sending module, configured to send, in response to the fifth operation, a fourth message to the second device, where the fourth message is used to request deletion of the multimedia file. The second device is configured to delete the multimedia file according to the fourth message.
In a sixth aspect, an embodiment of the present application provides an electronic device, including: one or more processors; and a memory storing code, where the code, when executed by the one or more processors, causes the electronic device to perform the method in the first aspect or the method in the third aspect.
In a seventh aspect, an embodiment of the present application provides a communication system, including a first device and one or more second devices. The first device is configured to: display a first interface, where one or more portraits and a first control are displayed on the first interface, the first control corresponds to one second device or to a plurality of second devices, the first control is used to trigger sharing of a multimedia file with the one or more second devices corresponding to the first control, and the multimedia file is a media file containing the one or more portraits; receive a first operation on the first control; and send, in response to the first operation, the multimedia file to the one or more second devices.
In an eighth aspect, an embodiment of the present application provides a communication system, including a first device and one or more second devices. The first device is configured to display a first interface, where one or more portraits are displayed on the first interface, and to send a first message to the one or more second devices, where the first message is used to request communication. The one or more second devices are configured to establish a communication connection with the first device according to the first message. The first device further displays a first control on the first interface, where the first control corresponds to one second device or to a plurality of second devices, the first control is used to trigger sharing of a multimedia file with the one or more second devices corresponding to the first control, and the multimedia file is a media file containing the one or more portraits. The first device receives a first operation on the first control. In response to the first operation, the first device sends the multimedia file to the one or more second devices.
In a ninth aspect, an embodiment of the present application provides a computer-readable storage medium comprising computer instructions which, when run on a first device, cause the first device to perform the method of the first aspect or the method of the third aspect.
In a tenth aspect, an embodiment of the present application provides a computer program product which, when run on a computer, causes the computer to perform the method of the first aspect or the method of the third aspect.
For the embodiments and technical effects of the foregoing second through tenth aspects, reference may be made to the embodiments and technical effects of the foregoing first aspect.
Drawings
To illustrate the technical solutions of the embodiments of the present application more clearly, the drawings used in the description of the embodiments or the prior art are briefly introduced below. The following drawings show only some embodiments of the present application; a person skilled in the art may derive other drawings from them without inventive effort.
Fig. 1A is a schematic diagram of an interface of a mobile phone 01 according to an embodiment of the present application;
Fig. 1B is a schematic diagram of another interface of the mobile phone 01 according to an embodiment of the present application;
Fig. 1C is a schematic diagram of another interface of the mobile phone 01 according to an embodiment of the present application;
Fig. 1D is a schematic diagram of another interface of the mobile phone 01 according to an embodiment of the present application;
Fig. 1E is a schematic diagram of another interface of the mobile phone 01 according to an embodiment of the present application;
Fig. 1F is a schematic diagram of an interface of a mobile phone 02 according to an embodiment of the present application;
Fig. 1G is a schematic diagram of another interface of the mobile phone 02 according to an embodiment of the present application;
Fig. 1H is a schematic diagram of another interface of the mobile phone 02 according to an embodiment of the present application;
Fig. 2A is a schematic structural diagram of a communication system according to an embodiment of the present application;
Fig. 2B is a schematic structural diagram of another communication system according to an embodiment of the present application;
Fig. 3A is a schematic structural diagram of an electronic device according to an embodiment of the present application;
Fig. 3B is a software architecture diagram of an electronic device according to an embodiment of the present application;
Fig. 3C is a software architecture diagram of another electronic device according to an embodiment of the present application;
Fig. 4A is a flowchart of a multimedia file sharing method according to an embodiment of the present application;
Fig. 4B is another flowchart of a multimedia file sharing method according to an embodiment of the present application;
Fig. 4C is another flowchart of a multimedia file sharing method according to an embodiment of the present application;
Fig. 5A is a schematic diagram of another interface of the mobile phone 01 according to an embodiment of the present application;
Fig. 5B is a schematic diagram of another interface of the mobile phone 01 according to an embodiment of the present application;
Fig. 5C is a schematic diagram of another interface of the mobile phone 01 according to an embodiment of the present application;
Fig. 5D is a schematic diagram of another interface of the mobile phone 01 according to an embodiment of the present application;
Fig. 5E is a schematic diagram of another interface of the mobile phone 02 according to an embodiment of the present application;
Fig. 5F is a schematic diagram of another interface of the mobile phone 02 according to an embodiment of the present application;
Fig. 5G is a schematic diagram of another interface of the mobile phone 02 according to an embodiment of the present application;
Fig. 5H is a schematic diagram of another interface of the mobile phone 02 according to an embodiment of the present application;
Fig. 5I is a schematic diagram of another interface of the mobile phone 02 according to an embodiment of the present application;
Fig. 5J is a schematic diagram of another interface of the mobile phone 02 according to an embodiment of the present application;
Fig. 5K is a schematic diagram of another interface of the mobile phone 02 according to an embodiment of the present application;
Fig. 5L is a schematic diagram of another interface of the mobile phone 01 according to an embodiment of the present application;
Fig. 5M is a schematic diagram of another interface of the mobile phone 01 according to an embodiment of the present application;
Fig. 5N is a schematic diagram of another interface of the mobile phone 01 according to an embodiment of the present application;
Fig. 6A is a schematic diagram of another interface of the mobile phone 01 according to an embodiment of the present application;
Fig. 6B is a schematic diagram of another interface of the mobile phone 01 according to an embodiment of the present application;
Fig. 6C is a schematic diagram of another interface of the mobile phone 01 according to an embodiment of the present application;
Fig. 6D is a schematic diagram of another interface of the mobile phone 02 according to an embodiment of the present application;
Fig. 7 is a schematic structural diagram of still another electronic device according to an embodiment of the present application.
Detailed Description
The terms "first" and "second" below are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined with "first" or "second" may explicitly or implicitly include one or more such features. In the descriptions of the embodiments of the present application, unless otherwise specified, "a plurality of" means two or more.
At present, sharing a multimedia file between two devices may proceed as follows. Illustratively, the first device is a mobile phone 01 and the second device is a mobile phone 02. Fig. 1A is a schematic diagram of an interface of the mobile phone 01 according to an embodiment of the present application. As shown in Fig. 1A, after a first user takes a photo with the camera of the mobile phone 01, the photo is stored in the gallery. When the first user wants to send the most recently taken photo to the second user's mobile phone 02, the first user needs to click the control 111 on the interface 110 shown in Fig. 1A. The mobile phone 01 receives this operation and, in response, its interface jumps from the interface 110 shown in Fig. 1A to the interface 120 shown in Fig. 1B. The interface 120 includes a control 121 for sharing the photo. When the user clicks the control 121, the interface of the mobile phone 01 jumps, in response, to the interface 130 shown in Fig. 1C. The interface 130 displays a control 131 and application icons 132 for selection. The first user may share the photo with the second user through application A: the first user clicks the control 131 and then clicks the icon of application A. In response, the interface of the mobile phone 01 jumps from the interface 130 shown in Fig. 1C to the interface 140 shown in Fig. 1D, which is an interface of application A displaying the first user's friend information. After the first user selects the friend "XXX", that is, clicks the control 141 on the interface 140, the interface of the mobile phone 01 jumps to the interface 150 shown in Fig. 1E. The interface 150 is a chat interface between the first user and the second user, on which the first user shares the photo with the second user.
As can be seen, during the whole photo sharing process the user needs to click the control 121, the control 131, the application icon 132, the control 141, and so on, so the sharing operation is cumbersome; moreover, the mobile phone 01 has to transmit the photo or video to the mobile phone 02 through application A, so the sharing rate is low.
In addition, Fig. 1F is a schematic diagram of an interface of the mobile phone 02 according to an embodiment of the present application. As shown in Fig. 1F, the second user receives the photo shared by the first user through application A. If the second user wants to save the photo, the second user needs to long-press the photo on the interface 210 shown in Fig. 1F. In response, the mobile phone 02 displays the interface 220 shown in Fig. 1G, on which a control 221 for downloading the photo is displayed. The user clicks the control 221. In response, the interface of the mobile phone 02 jumps from the interface 220 shown in Fig. 1G to the interface 230 shown in Fig. 1H, and a prompt message "the picture has been saved to the /sdcard/pictures/application A folder" is displayed on the interface 230. As can be seen, the photo is stored in the application A folder on the mobile phone 02, rather than in the gallery of the mobile phone 02. The user therefore has to look through individual folders, which makes it inconvenient to find the photo.
To solve the technical problems of cumbersome sharing steps and a low sharing rate, an embodiment of the present application provides a multimedia file sharing method, which can be applied to the communication system shown in Fig. 2A. As shown in Fig. 2A, the communication system 100 may include a first device 11 and one or more second devices 12.
The electronic devices (e.g., the first device 11, the second device 121, the second device 122, … …, and the second device 12n, where n is a positive integer greater than or equal to 1) may each be a mobile phone, a tablet computer, a laptop computer, an ultra-mobile personal computer (Ultra-mobile Personal Computer, UMPC), a handheld computer, a netbook, a personal digital assistant (Personal Digital Assistant, PDA), or a wearable electronic device; the specific form of the electronic device is not particularly limited in the embodiments of the present application.
For example, as shown in Fig. 2A, for simplicity of description any one of the plurality of second devices (e.g., the second device 121, the second device 122, … …, the second device 12n) may simply be referred to as the second device 12. The first device 11 may be the mobile phone 01 and the second device 12 may be the mobile phone 02. The first device 11 communicates with the second device 12, specifically as follows. The user takes a photo or records a video using the first device 11, and the first device 11 may store the photo or video in its gallery. The first device 11 then shares the photo or video with the second device 12, specifically as follows. While the first device 11 is taking the photo or recording the video, or while the first device 11 displays an interface showing the photo or video, the first device 11 recognizes portrait information in the photo or video. The first device 11 transmits a broadcast carrying the portrait information and the address information of the first device 11. The second device 12 receives the broadcast and parses out the portrait information and the address information of the first device. The second device 12 compares the portrait information with stored image data. When the portrait information matches the stored image data, the second device 12 sends its own address information and a portrait identifier corresponding to the portrait to the first device 11. The first device 11 displays the portrait identifier and a control for triggering sharing on the display interface of the photo or video. In response to the first user's operation on the control, the gallery of the first device 11 transmits the photo or video to the gallery of the second device 12.
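The broadcast-and-match exchange described above can be sketched in a few lines. This is a hypothetical illustration only: the message layout, the cosine-similarity comparison, and names such as `MATCH_THRESHOLD` and `owner_face` are assumptions for the sketch, not part of any real device protocol.

```python
# Sketch of the discovery step: the first device broadcasts portrait
# feature vectors plus its address; a second device compares them with
# locally stored image data and replies on a match.
import json
import math

MATCH_THRESHOLD = 0.9  # assumed similarity cutoff


def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)


def make_broadcast(sender_addr, portraits):
    """First device: pack its address info and portrait feature vectors."""
    return json.dumps({"addr": sender_addr, "portraits": portraits})


def handle_broadcast(packet, own_addr, owner_face):
    """Second device: parse the broadcast and reply if a portrait matches."""
    msg = json.loads(packet)
    for portrait_id, feature in msg["portraits"].items():
        if cosine_similarity(feature, owner_face) >= MATCH_THRESHOLD:
            # Reply with own address info and the matched portrait identifier.
            return {"addr": own_addr, "portrait_id": portrait_id,
                    "reply_to": msg["addr"]}
    return None  # no match: stay silent


packet = make_broadcast("phone01.local", {"p1": [0.1, 0.9, 0.2]})
reply = handle_broadcast(packet, "phone02.local", [0.12, 0.88, 0.21])
```

With the feature vectors above, the second device's stored data closely matches portrait "p1", so it replies with its address and the portrait identifier; a non-matching device returns `None` and sends nothing.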
When the second user opens the gallery of the second device 12, an interface of the gallery is displayed on the second device 12, and the above photo or video may be displayed on that interface. In this way, the second user can find the photo or video shared by the first device 11 directly in the gallery of the second device 12. Therefore, the transmission of the photo or video does not require frequent user operations, which simplifies the user's operations. In addition, in this embodiment of the present application, the photo or video is transmitted between the galleries of the two devices without going through a third-party application, which improves the transmission rate. Furthermore, the photo or video is stored directly in the gallery of the device rather than in some other folder, which makes it easy for the user to find.
In other embodiments, the multimedia file sharing method provided in the embodiments of the present application may also be applied to the system shown in Fig. 2B. As shown in Fig. 2B, the system 100 may further include a server 13. The difference from the foregoing embodiment is that the first device 11 communicates with the server 13, and the server 13 communicates with the second device 12, specifically as follows. After the first device 11 displays the portrait identifier and the control for triggering sharing on the display interface of the photo or video, in response to the first user's operation on the control, the first device 11 uploads the photo or video to the server 13 and sends a notification to the second device 12 prompting that a photo or video update is available. The second user enables the shared folder function on the settings interface of the gallery of the second device 12. The shared folder is a storage area on the server 13 shared by the first device 11 and the second device 12. The second user opens the gallery on the second device 12, and the second device 12 displays a gallery interface on which the shared folder is shown. In response to the second user's operation on the shared folder, the second device 12 displays an interface of the shared folder on which the above photo or video is displayed. The second user may also operate on this interface to store the photo or video in the gallery of the second device 12. Thus, when the second user opens the gallery of the second device 12, an interface of the gallery is displayed and the above photo or video may be displayed on it. Therefore, the transmission of the photo or video does not require excessive user operations, which simplifies the user's operations.
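The server-mediated flow above (upload to a shared area, list the shared folder, save into the local gallery) can be sketched as follows. All class and method names here (`Server`, `upload`, `save_from_shared`, the `pair_id` key) are illustrative assumptions, not a real service API.

```python
# Sketch of the server-mediated sharing flow: the first device uploads a
# file to a shared storage area on the server; the second device lists
# the shared folder and saves a file into its own gallery.
class Server:
    def __init__(self):
        self.shared = {}  # shared storage area, keyed by device-pair id

    def upload(self, pair_id, name, data):
        """First device uploads a photo or video to the shared area."""
        self.shared.setdefault(pair_id, {})[name] = data

    def list_shared(self, pair_id):
        """Second device lists the contents of the shared folder."""
        return sorted(self.shared.get(pair_id, {}))

    def download(self, pair_id, name):
        return self.shared[pair_id][name]


class Device:
    def __init__(self):
        self.gallery = {}  # local gallery storage

    def save_from_shared(self, server, pair_id, name):
        """Second device stores a shared file into its local gallery."""
        self.gallery[name] = server.download(pair_id, name)


server = Server()
server.upload("phone01-phone02", "IMG_001.jpg", b"jpeg-bytes")
phone02 = Device()
phone02.save_from_shared(server, "phone01-phone02", "IMG_001.jpg")
```

The design point mirrored here is that the server only holds a shared area for the device pair; the file lands in the receiving device's gallery only after the second user's explicit save operation.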
In addition, in this embodiment of the present application, one device uploads the photo or video to the server, the server provides a shared storage space for the two devices and stores the uploaded photo or video in that space, and the other device can obtain the photo or video from the server. Throughout the process, the user does not need to operate frequently, no third-party application is needed for transmission, and the transmission rate is improved. Furthermore, the photo or video is stored directly in the gallery of the device rather than in some other folder, which makes it easy for the user to find.
The structure of the electronic device is described below. Fig. 3A is a schematic structural diagram of the electronic device.
As shown in fig. 3A, the electronic device 100 (e.g., the first device 11, the second device 121, the second device 122, … …, the second device 12 n) may include a processor 210, a memory 220, a universal serial bus (universal serial bus, USB) interface 230, a charge management module 240, a power management module 241, a battery 242, an antenna 1, a wireless communication module 250, a sensor module 260, a key 270, a camera 280, a display 290, and the like. The sensor module 260 may include a pressure sensor 260A, a gyroscope sensor 260B, an acceleration sensor 260C, a distance sensor 260F, a fingerprint sensor 260D, a touch sensor 260E, and the like.
It should be understood that the structure illustrated in the embodiments of the present application does not constitute a specific limitation on the electronic device. In other embodiments of the present application, the electronic device may include more or fewer components than illustrated, may combine certain components, may split certain components, or may have a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 210 may include one or more processing units such as, for example: the processor 210 may include an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, and/or a neural network processor (neural-network processing unit, NPU), etc. Wherein the different processing units may be separate devices or may be integrated in one or more processors.
The controller can generate operation control signals according to instruction operation codes and timing signals, to complete the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 210 for storing instructions and data. In some embodiments, the memory in the processor 210 is a cache. The memory may hold instructions or data that the processor 210 has just used or uses cyclically. If the processor 210 needs to use the instruction or data again, it can call it directly from the memory, which avoids repeated accesses and reduces the waiting time of the processor 210, thereby improving system efficiency.
In some embodiments, the processor 210 may include one or more interfaces. The interfaces may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a USB interface, among others.
It should be understood that the connection relationship between the modules illustrated in the embodiments of the present application is only illustrative, and does not limit the structure of the electronic device. In other embodiments of the present application, the electronic device may also use different interfacing manners, or a combination of multiple interfacing manners in the foregoing embodiments.
The charge management module 240 is configured to receive a charge input from a charger. The charger can be a wireless charger or a wired charger. In some wired charging embodiments, the charge management module 240 may receive a charging input of a wired charger through the USB interface 230. In some wireless charging embodiments, the charge management module 240 may receive wireless charging input through a wireless charging coil of the electronic device. The charging management module 240 may also provide power to the electronic device through the power management module 241 while charging the battery 242.
The power management module 241 is configured to connect the battery 242 and the charge management module 240 to the processor 210. The power management module 241 receives input from the battery 242 and/or the charge management module 240 and provides power to the processor 210, the memory 220, the display screen 290, the camera 280, the wireless communication module 250, and the like. The power management module 241 may also be configured to monitor parameters such as battery capacity, battery cycle count, and battery health (leakage, impedance). In other embodiments, the power management module 241 may alternatively be disposed in the processor 210. In other embodiments, the power management module 241 and the charge management module 240 may be disposed in the same device.
The wireless communication function of the electronic device may be implemented by the antenna 1, the wireless communication module 250, a modem processor, a baseband processor, and the like.
The antenna 1 is used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device may be used to cover a single or multiple communication bands. Different antennas may also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed into a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The wireless communication module 250 may provide solutions for wireless communication applied to the electronic device, including wireless local area networks (WLAN) (e.g., a wireless fidelity (Wi-Fi) network), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC) technology, infrared (IR) technology, and the like. The wireless communication module 250 may be one or more devices that integrate at least one communication processing module. The wireless communication module 250 receives electromagnetic waves via the antenna 1, performs filtering and frequency demodulation on the electromagnetic wave signals, and sends the processed signals to the processor 210. The wireless communication module 250 may also receive a to-be-sent signal from the processor 210, perform frequency modulation and amplification on it, and convert it into electromagnetic waves for radiation via the antenna 1.
In some embodiments, the wireless communication module of the first device transmits address information and portrait information of the first device to the wireless communication module of the second device. The wireless communication module of the second device sends the address information and the portrait information of the second device to the wireless communication module of the first device.
In some embodiments, the wireless communication module of the first device transmits the photograph or video acquired by the first device to the wireless communication module of the second device.
In some embodiments, the wireless communication module of the first device transmits the photograph or video acquired by the first device to the server.
In some embodiments, the antenna 1 and the wireless communication module 250 of the electronic device are coupled such that the electronic device can communicate with a network and other devices through wireless communication techniques. The wireless communication techniques may include the Global System for Mobile communications (global system for mobile communications, GSM), general packet radio service (general packet radio service, GPRS), code division multiple access (code division multiple access, CDMA), wideband code division multiple access (wideband code division multiple access, WCDMA), time division code division multiple access (time-division code division multiple access, TD-SCDMA), long term evolution (long term evolution, LTE), BT, GNSS, WLAN, NFC, FM, and/or IR techniques, among others. The GNSS may include a global satellite positioning system (global positioning system, GPS), a global navigation satellite system (global navigation satellite system, GLONASS), a beidou satellite navigation system (beidou navigation satellite system, BDS), a quasi zenith satellite system (quasi-zenith satellite system, QZSS) and/or a satellite based augmentation system (satellite based augmentation systems, SBAS).
The electronic device implements display functions through a GPU, a display screen 290, an application processor, and the like. The GPU is a microprocessor for image processing, and is connected to the display screen 290 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 210 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 290 is used for displaying images, videos, and the like. The display screen 290 includes a display panel. The display panel may employ a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the electronic device may include 1 or N display screens 290, where N is a positive integer greater than 1.
In some embodiments, the display screen 290 of the first device displays a main interface, an image capture interface, an application interface of the gallery, and a display interface of a photo or video of the first device. The display screen 290 of the second device displays a main interface, an image capture interface, an application interface of the gallery, and a display interface of a photo or video of the second device.
The electronic device may implement shooting functions through an ISP, a camera 280, a video codec, a GPU, a display screen 290, an application processor, and the like.
The ISP is used to process the data fed back by the camera 280. For example, when a photo is taken, the shutter is opened, light is transmitted to the camera's photosensitive element through the lens, the optical signal is converted into an electrical signal, and the photosensitive element transmits the electrical signal to the ISP for processing, where it is converted into an image visible to the naked eye. The ISP can also perform algorithm optimization on the noise and brightness of the image, and can optimize parameters such as the exposure and color temperature of the shooting scene. In some embodiments, the ISP may be provided in the camera 280.
The camera 280 is used to capture still images or video. An object generates an optical image through the lens, which is projected onto the photosensitive element. The photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, which is then transferred to the ISP to be converted into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard format such as RGB (red, green, blue) or YUV ("Y" stands for luminance (luma), i.e., the gray-scale value; "U" and "V" stand for chrominance (chroma)). In some embodiments, the electronic device may include 1 or N cameras 280, where N is a positive integer greater than 1.
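The RGB-to-YUV conversion mentioned above can be illustrated with the standard BT.601 full-range equations. This is a minimal sketch for one pixel; the actual pixel format and coefficients used by a given ISP/DSP may differ.

```python
# BT.601 full-range RGB -> YUV conversion for a single pixel (sketch).
def rgb_to_yuv(r, g, b):
    y = 0.299 * r + 0.587 * g + 0.114 * b        # luma (gray-scale value)
    u = -0.14713 * r - 0.28886 * g + 0.436 * b   # blue-difference chroma
    v = 0.615 * r - 0.51499 * g - 0.10001 * b    # red-difference chroma
    return y, u, v


# A pure white pixel has full luma and (near-)zero chroma.
y, u, v = rgb_to_yuv(255, 255, 255)
```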
In some embodiments, the camera 280 of the first device captures image information of user A and user B, so that the first device obtains a video or photo containing user A and user B.
In some embodiments, the processor 210 of the first device identifies a person on a video or photograph and generates identification information. Processor 210 may display the identification information via display screen 290.
The digital signal processor is used for processing digital signals, and can process other digital signals in addition to digital image signals. For example, when the electronic device selects a frequency bin, the digital signal processor is used to perform a Fourier transform on the frequency bin energy, and so on.
Video codecs are used to compress or decompress digital video. The electronic device may support one or more video codecs, so that it can play or record video in a variety of encoding formats, such as moving picture experts group (MPEG)-1, MPEG-2, MPEG-3, and MPEG-4.
The NPU is a neural-network (NN) computing processor. By drawing on the structure of biological neural networks, for example the transfer mode between neurons in the human brain, it can rapidly process input information and can also continuously self-learn. Applications such as intelligent cognition of the electronic device, for example image recognition, face recognition, speech recognition, and text understanding, can be implemented through the NPU.
The memory 220 may be used to store computer-executable program code, which includes instructions. The memory 220 may include a program storage area and a data storage area. The program storage area may store an operating system, an application required for at least one function (such as a sound playing function or an image playing function), and the like. The data storage area may store data created during use of the electronic device (such as audio data and a phonebook), and the like. In addition, the memory 220 may include a high-speed random access memory, and may also include a non-volatile memory such as at least one magnetic disk storage device, a flash memory device, or universal flash storage (UFS). The processor 210 performs various functional applications and data processing of the electronic device by executing instructions stored in the memory 220 and/or instructions stored in a memory provided in the processor.
In some embodiments, the memory 220 of the first device stores photographs or videos captured by the first device. The memory 220 of the second device stores photos or videos that the second device obtains from the first device. Specifically, the first device stores the photo or video in a database of a gallery and the second device stores the photo or video in a database of a gallery.
The pressure sensor 260A is configured to sense a pressure signal and convert it into an electrical signal. In some embodiments, the pressure sensor 260A may be disposed on the display screen 290. There are many types of pressure sensors 260A, such as resistive pressure sensors, inductive pressure sensors, and capacitive pressure sensors. A capacitive pressure sensor may comprise at least two parallel plates made of conductive material. When a force is applied to the pressure sensor 260A, the capacitance between the electrodes changes, and the electronic device determines the strength of the pressure from the change in capacitance. When a touch operation acts on the display screen 290, the electronic device detects the intensity of the touch operation through the pressure sensor 260A. The electronic device may also calculate the touch location based on the detection signal of the pressure sensor 260A. In some embodiments, touch operations that act on the same touch location but with different touch operation intensities may correspond to different operation instructions. For example, when a touch operation whose intensity is less than a first pressure threshold acts on the SMS application icon, an instruction for viewing the SMS message is executed; when a touch operation whose intensity is greater than or equal to the first pressure threshold acts on the SMS application icon, an instruction for creating a new SMS message is executed.
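The threshold behavior described above, where the same touch location maps to different instructions depending on touch intensity, can be sketched as a simple dispatch function. The threshold value and the action names are illustrative assumptions, not values from the embodiment.

```python
# Sketch: dispatch a touch on the SMS application icon by intensity.
FIRST_PRESSURE_THRESHOLD = 0.5  # assumed normalized threshold


def dispatch_touch_on_sms_icon(intensity):
    """Return the instruction triggered by a touch of the given intensity."""
    if intensity < FIRST_PRESSURE_THRESHOLD:
        return "view_message"        # light press: view the SMS message
    return "create_new_message"      # press at or above threshold: new SMS
```

Note that the boundary case is inclusive on the firm-press side: an intensity exactly equal to the threshold triggers the new-message instruction, matching "greater than or equal to" in the text.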
The gyro sensor 260B may be used to determine the motion posture of the electronic device. In some embodiments, the angular velocity of the electronic device about three axes (i.e., the x, y, and z axes) may be determined by the gyro sensor 260B. The gyro sensor 260B may be used for image stabilization during photographing. For example, when the shutter is pressed, the gyro sensor 260B detects the shake angle of the electronic device, calculates the distance that the lens module needs to compensate according to the angle, and lets the lens counteract the shake of the electronic device through reverse motion, thereby achieving anti-shake. The gyro sensor 260B may also be used for navigation and somatosensory gaming scenarios.
The acceleration sensor 260C may detect the magnitude of the acceleration of the electronic device in various directions (typically along three axes). The magnitude and direction of gravity can be detected when the electronic device is stationary. The acceleration sensor may also be used to recognize the posture of the electronic device, and is applied to landscape/portrait switching, pedometers, and other applications.
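As a purely illustrative sketch (not part of the patent), landscape/portrait switching can be inferred from the gravity components that the acceleration sensor reports while the device is stationary; the axis convention and the simple comparison are assumptions:

```python
def infer_orientation(ax: float, ay: float) -> str:
    """Infer screen orientation from gravity components along the device's
    x axis (short edge) and y axis (long edge). This is a simplification:
    real devices also debounce readings and handle the z axis."""
    if abs(ay) >= abs(ax):
        return "portrait"   # gravity mostly along the long edge
    return "landscape"      # gravity mostly along the short edge
```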
A distance sensor 260F is used for measuring distance. The electronic device may measure distance by infrared or laser. In some embodiments, in a shooting scenario, the electronic device may use the distance sensor 260F to measure distance in order to achieve quick focus. In some embodiments, the distance sensor 260F of the first device measures the distance to user A and user B to achieve quick focus.
The fingerprint sensor 260D is used to collect a fingerprint. The electronic equipment can utilize the collected fingerprint characteristics to realize fingerprint unlocking, access the application lock, fingerprint photographing, fingerprint incoming call answering and the like.
The touch sensor 260E is also referred to as a "touch device". The touch sensor 260E may be disposed on the display screen 290; the touch sensor 260E and the display screen 290 form a touch screen, also called a "touchscreen". The touch sensor 260E is used to detect a touch operation acting on or near it. The touch sensor 260E may pass the detected touch operation to the application processor to determine the type of touch event. Visual output related to the touch operation may be provided through the display screen 290. In other embodiments, the touch sensor 260E may also be disposed on the surface of the electronic device at a location different from that of the display screen 290.
In some embodiments, the touch sensor 260E of the first device detects touch operations acting on or near the interface of the first device. For example, the touch sensor 260E of the first device detects an operation of the first user on an interface such as a main interface, an image acquisition interface, an application interface of a gallery, a display interface of a photo or video, or the like of the first device.
The keys 270 include a power-on key, volume keys, etc. The keys 270 may be mechanical keys or touch keys. The electronic device may receive key inputs and generate key signal inputs related to user settings and function control of the electronic device.
Of course, the electronic device may also include other functional units, which are not limited by the embodiment of the present application.
In addition, for explanations of related terms and actions in the embodiments of the present application, the embodiments may refer to each other, without limitation. The message names of the interactions between the devices, or the parameter names in the messages, in the embodiments of the present application are merely examples; other names may be used in specific implementations, without limitation.
Referring to fig. 3B and 3C, fig. 3B and 3C are software structural block diagrams of an electronic device according to an embodiment of the present application.
As shown in fig. 3B and 3C, the layered architecture divides the software into several layers, each with a clear role and division of labor. The layers communicate with each other through software interfaces. In some possible embodiments, the Android system may be divided into four layers, which are, from top to bottom, the application layer, the application framework layer, the Android runtime (Android runtime) and system libraries, and the kernel layer.
The application layer may include a series of application packages.
As shown in fig. 3B and 3C, the application package may include applications such as an account application, Bluetooth, a device management application (an application with a device management function), navigation, memo, WLAN, short message, gallery, camera, calendar, and call.
The application framework layer provides an application programming interface and programming framework for the application of the application layer. The application framework layer includes a number of predefined functions.
As shown in fig. 3B and 3C, the application framework layer may include a window manager, a content provider, a view system, a phone manager, a resource manager, a notification manager, and the like. The application framework layer may further include: a discovery module and a Huawei Mobile Services core (Huawei Mobile Service Core, HMS Core) ecosystem component (which may be abbreviated as HMS Core, for example an HMS Core account management module).
The window manager is used for managing window programs. The window manager can acquire the size of the display screen, judge whether a status bar exists, lock the screen, intercept the screen and the like.
The content provider is used to store and retrieve data and make such data accessible to applications. Such data may include video, images, audio, calls made and received, browsing history and bookmarks, phonebooks, etc.
The view system includes visual controls, such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, a display interface including a text message notification icon may include a view displaying text and a view displaying a picture.
The telephony manager is used to provide the communication functions of the device, for example, the management of call status (including connected, hung up, etc.).
The resource manager provides various resources for the application program, such as localization strings, icons, pictures, layout files, video files, and the like.
The notification manager enables an application to display notification information in the status bar. It can be used to convey notification-type messages, which may automatically disappear after a short stay without user interaction. For example, the notification manager is used to notify of download completion, message alerts, etc. The notification manager may also present notifications in the top system status bar in the form of a chart or scroll-bar text, such as a notification of an application running in the background, or present notifications on the screen in the form of a dialog interface. For example, text information is prompted in the status bar, a prompt tone is emitted, the electronic device vibrates, or an indicator light blinks.
The Android runtime includes a core library and virtual machines. The Android runtime is responsible for the scheduling and management of the Android system.
The core library consists of two parts: one part comprises the functions that the Java language needs to call, and the other part comprises the Android core libraries.
The application layer and the application framework layer run in a virtual machine. The virtual machine executes java files of the application program layer and the application program framework layer as binary files. The virtual machine is used for executing the functions of object life cycle management, stack management, thread management, security and exception management, garbage collection and the like.
The system library may include a plurality of functional modules. For example: surface manager (surface manager), media Libraries (Media Libraries), three-dimensional graphics processing Libraries (e.g., openGL ES), 2D graphics engines (e.g., SGL), etc.
The surface manager is used to manage the display subsystem and provides a fusion of 2D and 3D layers for multiple applications.
The media libraries support playback and recording of a variety of commonly used audio and video formats, as well as still image files, and the like. The media libraries may support a variety of audio and video encoding formats, such as MPEG-4, H.264, MP3, AAC, AMR, JPG, and PNG.
The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like.
The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is a layer between hardware and software. The kernel layer contains at least a display driver, a camera driver, an audio driver, and a sensor driver.
It should be noted that the modules shown in fig. 3B and 3C may be integrated in the application framework layer, or may be integrated in a specific application program, such as a memo. The embodiment of the present application is not particularly limited.
In one embodiment, based on the communication system shown in fig. 2A, the specific working process of the software architecture shown in fig. 3B may be: (1) The first user takes a photo or records a video on the first device by operating the camera displayed on an interface of the first device. In response to the operation of the first user, the camera acquires a photo or video and stores media data such as the photo or video in the data management module of the kernel layer. (2) During the process of the camera acquiring the photo or video, the first device identifies portrait information in the photo or video, and the first device sends out a broadcast carrying the portrait information and the connection address of the first device. (3) The connection management module of the first device sends the broadcast to the connection management module of the second device. (4) After the connection management module of the second device receives the connection address of the first device and the portrait information, the portrait information is matched against the portrait information stored in the data management module. (5) After the matching succeeds, the data management module feeds back the successful matching result to the connection management module. (6) The connection management module sends the successfully matched portrait information and the connection address of the second device to the first device. (7) The connection management module of the first device sends the successfully matched portrait information to the data management module. (8) According to the successfully matched portrait information, the data management module displays a portrait identifier and a control for transmitting media on the photos or videos presented in the gallery. (9) After the first user operates the control, the first device receives the operation of the first user.
(10) In response to the operation, the connection management module of the first device establishes a connection with the link management module of the second device according to the connection address of the second device. (11) After the connection is established, the data management module sends media data (e.g., data of photos or videos) to the link management module. (12) The link management module transmits the media data to the link management module of the second device. (13) The link management module of the second device sends the media data to the connection management module. (14) The connection management module of the second device sends the media data to the data management module. (15) The data management module of the second device presents the media data in a gallery. In another implementation, before (15), the second user operates the gallery on the second device to obtain the photos or videos shared by the first device.
In another embodiment, based on the communication system shown in fig. 2B, the specific working process of the software architecture shown in fig. 3C may be: the manner of displaying the portrait identifier and the control for transmitting media on the photos or videos presented in the gallery is the same as in (1)-(8) above, and is not described again here. Then, the transmission process of media data such as photos or videos is as follows: (1) The first user takes a photo or records a video by means of the camera of the first device. (2) After the photo is taken or the video is recorded, media data such as the photo or video is stored in the data management module, and the data management module presents the media data through the gallery. (3) After the first user operates the control for transmitting media in the gallery, the first device receives the operation of the first user. (4) In response to the operation, the data management module of the first device transmits the connection address of the second device and the media data to the connection management module. (5) The connection management module transmits the connection address of the second device and the media data to the link management module. (6) The link management module reports the connection address of the second device and the media data to the server. (7) When the second user operates the shared folder displayed in the gallery on the second device, the second device receives the operation of the second user. (8) In response to the operation of the second user, the data management module of the second device sends an acquisition request to the connection management module, where the request carries the connection address of the second device. (9) The connection management module of the second device sends the request to the link management module.
(10) The link management module of the second device sends the request to the server, and the server acquires the media data according to the request. (11) The server sends the media data to the link management module. (12) The link management module of the second device transmits the media data to the connection management module. (13) The connection management module of the second device sends the media data to the data management module. (14) The data management module of the second device presents the media data in the gallery.
The multimedia file of the embodiment of the present application can be understood as a file having the form of image media. For example, multimedia files include photographs, videos, and the like. Taking a photo as an example, the embodiment of the application provides a multimedia file sharing method. Fig. 4A to fig. 4C are schematic flow diagrams of a multimedia file sharing method according to an embodiment of the present application.
As shown in fig. 4A, the method is applied to a communication system, where the communication system includes a first device and one or more second devices, and the method may be described in two stages, and specifically may include:
in the first stage, a first device takes and transmits a photograph.
S400, the first device receives an operation 1 of the first user on the camera on the first device.
This operation 1 can be understood as a triggering operation on the camera. For example, the operation 1 may include a click operation, a press operation, or the like. Illustratively, the first user clicks on an icon of the camera displayed on the primary interface of the first device. The first device receives a click operation of an icon of the camera by a first user.
S401, responding to operation 1, enabling the first device to enter an interface 1, wherein the interface 1 is a shooting preview interface. The interface 1 comprises a shooting mode and a control 1, wherein the control 1 is a shooting control.
In response to operation 1 of the first user, the first device enters interface 1. A shooting mode is displayed on interface 1. The shooting modes may include photographing, video recording, portrait, night scene, aperture, professional, etc. Fig. 5A is an interface schematic diagram of a mobile phone 01 according to an embodiment of the present application. When the first user selects the photographing mode, the mobile phone 01 (i.e., the first device) displays the interface 410 shown in fig. 5A, and the control 414 (i.e., control 1) is displayed on the interface 410. When the camera of the mobile phone 01 is aimed at user A and user B, the images of user A and user B are displayed on the interface 410 displayed by the mobile phone 01.
In some embodiments, the method for sharing a multimedia file provided by the embodiment of the present application further includes:
s402, when the first device detects that the portrait is displayed on the interface 1, the first device further displays identification information of the portrait on the interface 1.
Specifically, the distance sensor on the first device measures distance to achieve quick focusing, and one or more portraits are displayed on interface 1 of the first device. The camera of the first device collects portrait information for the one or more portraits. The first device obtains identification information of the portrait's face according to the portrait information. The first device also displays the identification information on interface 1, and the identification information may be displayed in the form of an identification frame. For example, a face recognition box 412 for user A and a face recognition box 411 for user B are displayed on the interface 410 shown in fig. 5A.
In the embodiment of the application, the first device can preview the portrait on the image acquisition interface in advance and identify the portrait, so that the identification information of the portrait is acquired during the focusing process of the first device. The user does not need to deliberately trigger an identification operation, which simplifies the procedure and effectively improves the efficiency of photo sharing.
S403, the first device receives operation 2 of the first user on the interface 1.
This operation 2 may be understood as a touch operation of the interface 1 by a user. For example, the touch operation of the user on the interface 1 may include a selection operation of a shooting mode on the interface 1 by the user and/or a click operation of the control 1 on the interface 1 by the user. For example, the first user selects a photographing mode on the interface 410 as shown in fig. 5A and clicks the control 414 (i.e., control 1) on the interface 410. Thus, the mobile phone 01 can take a picture of the user a and the user B.
S404, responding to the operation 2, the first device acquires the photo, the first device displays the interface 2, and the control 2 is displayed on the interface 2, wherein the control 2 is used for triggering the photo browsing operation.
As shown above, after the first user selects the photographing mode on the interface 410 as shown in fig. 5A and clicks the control 414 on the interface 410, the first device captures a photograph and stores the photograph in the database of the gallery. At this point, the first device displays control 421 on interface 420 as shown in FIG. 5B. Illustratively, information for the photograph may be displayed on control 421.
After the first device obtains the photograph, the first device may store the obtained photograph in a database of a gallery application of the first device. When a user needs to browse photos, the following way is possible:
in one particular implementation, the first user may browse the photos by operating the control 2 in the interface 2. The specific implementation can be as follows:
s405, the first device receives operation 3 of the control 2 by the first user.
This operation 3 can be understood as a touch operation of the control 2 by the user. Illustratively, the first user clicks control 421 (i.e., control 2) displayed on interface 420 (i.e., interface 2) as shown in fig. 5B, at which point handset 01 receives the first user's operation.
S406, in response to operation 3, the first device displays an interface 3, the interface 3 being for displaying a photograph.
Illustratively, after the first user clicks on the interface 420 shown in fig. 5B, in response to an operation by the first user, the first device displays an interface 430 as shown in fig. 5C, and a photograph is displayed on the interface 430.
In another specific implementation, the user may click the icon of the gallery application on the first device. In response to the user's operation, the first device displays an interface of the gallery application, on which the user may view the photo. Illustratively, the main interface of the first device is similar to the interface 450 shown in fig. 5F. The first user may click the icon 451 of the gallery application on the main interface 450. In response to the click operation of the first user, the interface of the first device jumps to an application interface 460 of the gallery application like that shown in fig. 5G. A plurality of folders are displayed on the interface 460 shown in fig. 5M, which may include all photos, videos, my favorites, recently deleted, screenshots, etc. When the first user clicks the all-photos folder, the first device displays the interface of the all-photos folder, on which the photos can be viewed.
In other embodiments, the identification information of the portrait is not displayed on the shooting preview interface, that is, the method for sharing the multimedia file provided in the embodiment of the present application does not include the step of S402, but displays the identification information of the portrait on the photo display interface, specifically, as shown in fig. 4A, the method for sharing the multimedia file provided in the embodiment of the present application further includes:
s407, when the first device detects the portrait in the photo on the interface 3, the first device further displays the identification information of the portrait on the interface 3.
The specific implementation can be that the first device detects the photo and obtains the portrait information of the photo. And the first equipment obtains the identification information of the portrait face according to the portrait information. The first device also displays identification information on the interface 3, which may be displayed in the form of an identification frame. For example, face recognition box 435 for user a and face recognition box 434 for user B are displayed on interface 430 as shown in fig. 5N.
After the first device detects the portrait in the photo, the first device displays the identification information of the portrait, and the first device sends the identification information of the portrait to other devices in the following steps:
s408, the first device sends a message 1, wherein the message 1 carries the identification information of the portrait and the address information of the first device. Accordingly, the second device receives message 1.
The identification information of the portrait may include face information. The face information may be understood as a face image, or a face feature.
In one particular implementation, the first device may send message 1 in a broadcast format. Of course, the first device may also send in other manners, and embodiments of the present application are not limited in detail. In a specific implementation, in order to avoid interference with other devices, the first device sends message 1 in a broadcast within a preset frequency range. The predetermined frequency range may be 88 megahertz to 108 megahertz, for example. Of course, the preset frequency range needs to be set according to actual requirements, and the embodiment of the application is not limited specifically.
In some specific implementations, in order to ensure information security, the first device may encrypt the identification information of the portrait and the address information of the first device to obtain encrypted information. The format of the encrypted information may be: address information, face information one, face information two, face information three, and so on. Illustratively, following the example above, the first device detects user A and user B. The first device encrypts the face information of user A, the face information of user B, and the address information of the first device to obtain encrypted information. The format of the encrypted information may be: address information of the first device, face information of user A, face information of user B.
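The layout sketched above (address information followed by the pieces of face information) might be serialized as below. This is a minimal sketch: the JSON encoding is an assumption, and the XOR keystream is only a stand-in for the unspecified encryption scheme, not a secure cipher:

```python
import json

def pack_message1(address: str, face_infos: list, key: bytes) -> bytes:
    """Serialize the first device's address followed by face information,
    then obscure it with a toy XOR keystream (placeholder for a real cipher)."""
    payload = json.dumps({"address": address, "faces": face_infos}).encode()
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(payload))

def unpack_message1(blob: bytes, key: bytes) -> dict:
    """Invert pack_message1 on the receiving (second) device."""
    payload = bytes(b ^ key[i % len(key)] for i, b in enumerate(blob))
    return json.loads(payload)
```

Because XOR with the same keystream is its own inverse, the second device recovers the address and face information exactly, matching the decryption step described in the next paragraph.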
If the first device encrypts the identification information of the portrait and the address information of the first device, the second device decrypts the identification information of the portrait and the address information of the first device after receiving the message 1, thereby obtaining the identification information of the portrait and the address information of the first device.
S409, the second device determines the target portrait according to the message 1.
Specifically, the second device compares the identification information of the portrait with a plurality of pieces of pre-stored portrait information. When the identification information is consistent with a first portrait among the plurality of pieces of portrait information, the second device determines the first portrait as the target portrait. The target portrait corresponds to the second device.
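A minimal sketch of this comparison, assuming face information is represented as a feature vector and consistency is judged by cosine similarity against a threshold (the representation, measure, and threshold are all illustrative assumptions):

```python
import math

def cosine(a, b):
    """Cosine similarity between two non-zero feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def find_target_portrait(received_feature, stored_portraits, threshold=0.95):
    """Return the identifier of the first stored portrait consistent with the
    received identification information, or None if nothing matches
    (in which case no reply message is sent)."""
    for name, feature in stored_portraits.items():
        if cosine(received_feature, feature) >= threshold:
            return name
    return None
```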
In a specific implementation, to reduce power consumption, the second device compares the identification information of the portrait with information present in the second device. The specific implementation mode is as follows:
in the first mode, the second device may compare the identification information of the portrait with the portrait information satisfying the condition 1 in the photos or videos applied to the gallery of the second device. Wherein, condition 1 may include: the method comprises the steps that a first item, the portraits before the preset sequencing are sequenced; illustratively, the figures in the gallery application rank the top 3 figures. A second item, a portrait associated with a user account head portrait of a second device; illustratively, the second user publishes the target photograph on the second device via the user account. The portrait in the target photo has an association relationship with the user account head portrait. For example, suppose figure 1 is a child of a second user who often publishes a work on a platform by user account on a second device, the work being a photograph of figure 1. In this way, the second device can determine that the association relationship exists between the user account head portrait and the portrait 1.
In a second mode, the second device logs in to the operating system using account 1. In the process of the second device logging in to the operating system with account 1, the second user may upload an avatar, and the uploaded avatar is the user account avatar of account 1. In this case, the second device may compare the one or more pieces of portrait information with the user account avatar.
S410, the second device sends a message 2 to the first device, wherein the message 2 carries address information of the second device and identification information of the target portrait. Accordingly, the first device receives message 2.
The sending manner of message 2 is similar to that of message 1 described in S408. For example, the second device may encrypt message 2, and so on. Details are described above and are not repeated here.
As described in S409, the second device compares the identification information of the portrait with the plurality of pieces of pre-stored portrait information. When the identification information is not consistent with any of the plurality of pieces of portrait information, the second device does not send message 2 to the first device.
In the embodiment of the application, after the second device receives the message 1 sent by the first device, the second device determines whether the information in the message 1 is consistent with the information stored in the second device according to the message 1. And under the condition that the information is consistent, the second device sends a message 2 to the first device, wherein the message 2 carries the address information of the second device. Thus, the communication connection between the first equipment and the second equipment is established, the first equipment can communicate with the second equipment according to the address information of the second equipment, and the second equipment can also communicate with the first equipment according to the address information of the first equipment, so that the security and the rapidity of photo transmission are ensured in the process of subsequently transmitting photos.
S411, displaying an interface 4 on the first device, wherein the interface 4 displays identification information of the target portrait and a control 3, and the control 3 is used for representing that the photo is transmitted to a second device corresponding to the target portrait.
The identification information of the target portrait and the control 3 may be displayed on interface 4 independently of each other. Illustratively, in one specific implementation, the control 3 may be an upload control corresponding to a plurality of portraits. The control 433 is displayed on the interface 430 shown in fig. 5L, and the control 433 is used to upload media data with one key. In the embodiment of the application, the control may be an upload control corresponding to a plurality of portraits, so that the user can complete the operation with one key, and the first device establishes communication with a plurality of second devices and sends the photo to the plurality of second devices, thereby improving the rate at which the first device shares photos with the plurality of second devices and improving the user experience.
Of course, the identification information of the target portrait may also be displayed on the control 3. Illustratively, in one particular implementation, the control 3 may be a one-to-one transmission control for each figure. Illustratively, the photograph is displayed on the interface 430 shown in FIG. 5C described above. Meanwhile, on the interface 430, display is made: user a's identification information (e.g., small a) and a control 432, the control 432 being configured to transmit the photograph to the device corresponding to user a (i.e., handset 2). User B's identification information (e.g., small B) and a control 431, the control 431 being used to transmit the photograph to the device corresponding to user B.
In the embodiment of the application, the control can be a transmission control corresponding to each portrait one by one, so that the user shares the photos with the appointed equipment, and the first equipment can send the photos to the appointed second equipment in a targeted manner, thereby effectively avoiding the interference to other equipment and improving the user experience.
In another specific implementation, control 414, as shown in FIG. 5A, is the same control as control 3.
S412, the first device receives an operation 4 of the first user on the control 3, where the operation 4 is an operation for triggering transmission of the photo.
The operation 4 may include a click operation, a press operation, or the like. The embodiment of the present application is not particularly limited.
S413, in response to operation 4, the first device sends a message 3 to the second device, where the message 3 is used to prompt the user for the photo update. Accordingly, the second device receives message 3.
The message 3 may be in text form, in speech form, etc. By way of example, this message 3 may be a "notification: photos are updated in your gallery, please see … … ".
As described above, if the control 3 is an upload control corresponding to a plurality of portraits, the first device sends the message 3 to a plurality of second devices; if the control 3 is a transmission control corresponding to each portrait one by one, the first device sends the message 3 to the second device corresponding to the control operated by the first user. For example, as shown in fig. 5D, when the first user clicks the control 432 on the interface 430 shown in fig. 5D, the first device sends the message 3 to the device of small A (i.e., handset 02) corresponding to the control 432.
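The routing rule in this paragraph — a one-touch upload control reaching every matched device, and a per-portrait control reaching only its own device — can be sketched as below. This is a minimal illustration under assumed names; the function, control descriptions, and notification text are hypothetical, not part of the application.

```python
# Hypothetical sketch of how message 3 could be routed, depending on
# whether control 3 is a one-touch upload control (all portraits) or a
# per-portrait transmission control. All names are illustrative.

NOTIFY_TEXT = "Notification: photos are updated in your gallery, please see ..."

def route_message(control, portrait_devices):
    """Return the (device, message) sends triggered by operation 4.

    control: {"type": "one_touch"} or {"type": "per_portrait", "portrait": name}
    portrait_devices: mapping from portrait name to its device address
    """
    if control["type"] == "one_touch":
        targets = list(portrait_devices.values())          # every matched device
    else:
        targets = [portrait_devices[control["portrait"]]]  # only the chosen one
    return [(device, NOTIFY_TEXT) for device in targets]

devices = {"small A": "handset-02", "small B": "handset-03"}
# A per-portrait control corresponding to user A reaches only handset-02:
sends = route_message({"type": "per_portrait", "portrait": "small A"}, devices)
```

With the same mapping, a one-touch control would produce one send per matched device instead of a single targeted send.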
S414, the second device displays an interface 5, where the interface 5 is used to display the message 3.
Along the above example, when the first user clicks the control 432 on the interface 430 shown in fig. 5D, the first device sends the message 3 to the device of small A (i.e., handset 02) corresponding to the control 432. At this point, the second device receives the message 3 and displays it on the interface 440 shown in fig. 5E, where a notification 441 is displayed: "Notification: photos are updated in your gallery, please see … …".
S415, in response to operation 4, the first device transmits the photograph.
The first device sends the photo, which may specifically include the following cases:
in the first case, the first device sends the photograph directly to the second device.
In one specific implementation, as shown in fig. 4B, S415 may be specifically implemented as follows. S4151, in response to operation 4, the first device sends a request 1 to the second device according to the address information of the second device, the request 1 requesting a connection to the second device; accordingly, the second device receives the request 1. S4152, the second device establishes a communication connection with the first device according to the request 1; for example, the second device may be connected to the first device via Bluetooth, or via a network. S4153, after the communication connection between the first device and the second device is established, the first device sends the photo to the second device; accordingly, the second device receives the photo and inserts it into the database of the gallery of the second device.
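The direct-transfer sequence S4151-S4153 can be sketched as follows. The classes, method names, and addresses are assumptions for illustration only; in a real implementation the connection would be a Bluetooth or network channel rather than a method call.

```python
# Minimal sketch of the direct-transfer sequence: the first device connects
# to the second device by its address, then sends the photo, which the
# receiver inserts into its gallery database. All names are illustrative.

class SecondDevice:
    def __init__(self, address):
        self.address = address
        self.connected_peers = set()
        self.gallery_db = []          # stands in for the gallery database

    def handle_request_1(self, peer_address):
        # S4152: establish the communication connection (e.g. Bluetooth/network)
        self.connected_peers.add(peer_address)
        return True

    def receive_photo(self, photo):
        # The received photo goes straight into the gallery database,
        # so the user can find it in the gallery directly.
        self.gallery_db.append(photo)

class FirstDevice:
    def __init__(self, address):
        self.address = address

    def send_photo(self, second_device, photo):
        # S4151: request 1 is sent to the second device's address
        if second_device.handle_request_1(self.address):
            # S4153: the original photo is sent uncompressed (lossless)
            second_device.receive_photo(photo)

phone_01 = FirstDevice("addr-01")
phone_02 = SecondDevice("addr-02")
phone_01.send_photo(phone_02, {"name": "family_photo.jpg", "bytes": b"..."})
```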
In the embodiment of the application, the first device sends the photo from its gallery to the gallery of the second device. The second device receives the photo sent by the gallery of the first device through its own gallery, so the photo shared by the first device is stored directly in the database of the second device's gallery, and the user can search for it directly in the gallery, which is convenient. In addition, during transmission, the first device does not need to compress the photo to satisfy the traffic or resource restrictions of a third-party application, and can share the original photo directly with the second device, so that the photo is transmitted losslessly to the gallery of the second device, effectively ensuring that the photo looks the same on the first device and the second device.
In a specific implementation, in order to keep the basic information of the photo clearly recorded on the second device, the first device sends the photo together with gallery information, and correspondingly, the second device receives the photo and the gallery information. The gallery information refers to additional information describing the photo, such as the capture time of the photo and the capture location of the photo. The second device inserts the photo into the database of its gallery and writes the gallery information under a designated folder, such as a camera/DCIM folder. In this way, both the photo sent by the first device and its additional information can be displayed in the gallery of the second device, so that the user can view the additional information directly when viewing the photo.
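A minimal sketch of receiving a photo together with its gallery information and recording the metadata under a designated folder might look like the following. The folder path and field names are illustrative assumptions.

```python
# Hypothetical sketch: insert the photo into the gallery database and
# write its additional information (capture time, capture location)
# under a designated folder such as camera/DCIM.

def store_with_metadata(gallery_db, folders, photo, gallery_info):
    """Insert the photo and record its additional information."""
    gallery_db.append(photo)
    record = {"photo": photo["name"], **gallery_info}
    folders.setdefault("camera/DCIM", []).append(record)
    return record

db, folders = [], {}
rec = store_with_metadata(
    db, folders,
    {"name": "IMG_0001.jpg"},
    {"capture_time": "2022-05-31 10:00", "capture_location": "Shenzhen"},
)
```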
In some embodiments, in order to avoid mistransmitted photos, the photo sharing method provided by the embodiment of the present application may further include: S416, the first device sends a message 4 to the second device, and correspondingly, the second device receives the message 4. The message 4 is used to indicate that the photo is in a protected state for a first preset time. The first preset time may be preset; for example, it may be 2 minutes or 5 minutes. A protected state may be understood as a state in which the photo cannot be acquired or used; illustratively, within the preset time the photo must not be screenshotted, screen-recorded, copied, shared, etc. S4161, when the first user confirms that the photo was mistransmitted, the first user performs an operation 5 on the photo within the first preset time. The first device receives the operation 5 and, in response, sends a request 2 to the second device, the request 2 requesting deletion of the photo; accordingly, the second device receives the request 2. S4162, the second device deletes the photo according to the request 2. In this way, the mistransmitted photo is not displayed in the gallery of the second device.
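The mistransmission safeguard can be sketched as a protection window around the received photo. Time is simulated with an injected clock value, and the class and constant names are assumptions rather than terms from the application.

```python
# Sketch of the protected state: within the first preset time the photo
# may not be captured, copied, or shared, and a deletion request from the
# sender (request 2) removes it. All names are illustrative.

PRESET_SECONDS = 120   # e.g. a first preset time of 2 minutes

class ProtectedPhoto:
    def __init__(self, photo, received_at):
        self.photo = photo
        self.received_at = received_at
        self.deleted = False

    def is_protected(self, now):
        return (now - self.received_at) < PRESET_SECONDS

    def can_share(self, now):
        # A protected or deleted photo must not be used.
        return not self.deleted and not self.is_protected(now)

    def handle_request_2(self, now):
        # Deletion succeeds inside the protection window, before the
        # photo could have been acquired or used.
        if self.is_protected(now):
            self.deleted = True
        return self.deleted

p = ProtectedPhoto({"name": "wrong.jpg"}, received_at=0)
```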
In the second case, the first device indirectly transmits the photograph to the second device.
In a specific implementation manner, the second device opens the shared-album function in advance. One setting method is that the second user starts the gallery application on the second device; the second device displays an application interface of the gallery application, which may include a control for opening the shared-album function. When the user selects the control, the second device has the shared-album function, so the second device can share photos, videos, etc. with other devices. Of course, the setting method is not limited to the above example; it may also be: a system settings interface of the second device includes an applications and services item, the gallery is found under this item, and the permission interface of the gallery may include a control for opening the shared-album function. When the user selects the control, the second device has the shared-album function. In the case where the second device has opened the shared-album function in advance, S415 may be specifically implemented as: S4154, in response to operation 4, the first device uploads the photo to the server; correspondingly, the server receives the photo. S4155, the server stores the photo in a shared folder. The shared folder may be understood as a storage area of the server shared by a plurality of devices (e.g., the first device and the second device). S4156, the shared folder is displayed on the gallery interface of the second device, and the second device acquires the photo from the shared folder and inserts it into the database of its gallery.
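The indirect path S4154-S4156 can be sketched as an upload to a server-side shared folder followed by a pull on a second device that has enabled the shared-album function. The classes and methods below are illustrative stand-ins, not the application's actual interfaces.

```python
# Minimal sketch of the indirect transfer: the first device uploads the
# original photo to the server's shared folder; a second device with the
# shared-album function enabled pulls it into its gallery database.

class Server:
    def __init__(self):
        self.shared_folder = []   # storage area shared by first and second devices

    def upload(self, photo):      # S4154 / S4155
        self.shared_folder.append(photo)

    def download_all(self):       # S4156
        return list(self.shared_folder)

class SecondDevice:
    def __init__(self, shared_album_enabled):
        self.shared_album_enabled = shared_album_enabled
        self.gallery_db = []

    def sync_from(self, server):
        if not self.shared_album_enabled:
            return 0              # the function must be opened in advance
        photos = server.download_all()
        self.gallery_db.extend(photos)
        return len(photos)

server = Server()
server.upload({"name": "original.jpg"})       # uploaded uncompressed
dev = SecondDevice(shared_album_enabled=True)
count = dev.sync_from(server)
```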
In the embodiment of the application, the first device sends the photo to the server. The server stores the photo in a shared folder, which is a storage area on the server shared by the first device and the second device. The second device may obtain the photo from the server; specifically, the second device operates the shared folder on the gallery interface to open it, thereby obtaining the photo from the shared folder. The second device stores the photo in the database of its gallery, so the user can search for the photo directly in the gallery, which is convenient. In addition, during transmission, the first device does not need to compress the photo to satisfy the traffic or resource restrictions of a third-party application, and can upload the original photo directly to the server, so that the photo is transmitted losslessly to the gallery of the second device, effectively ensuring that the photo looks the same on the first device and the second device.
In one particular implementation, the second device displays the shared folder after the second device satisfies the first condition. Wherein the first condition includes at least one of: the second equipment receives prompt information sent by the first equipment, wherein the prompt information is used for prompting the second equipment that the multimedia file is updated; the second device receives an operation to the second device, the operation being an operation for triggering display of an icon of the shared folder.
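The first condition in the preceding paragraph is a disjunction of two sub-conditions, which can be stated compactly as below; the flag names are illustrative only.

```python
# Sketch of the first condition: the shared folder is displayed once the
# second device has received the update prompt from the first device OR
# has received the user operation that triggers displaying its icon.
# "At least one of" the sub-conditions suffices.

def should_display_shared_folder(received_prompt, user_triggered_icon):
    return received_prompt or user_triggered_icon
```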
In the second stage, the second device obtains a photograph.
In the first stage, there are two cases in which the first device sends the photo: in the first case, the first device sends the photo directly to the second device; in the second case, the first device sends the photo indirectly to the second device. For these two cases, the specific implementation manner in which the second device obtains the photo is described respectively:
in the first case, the second device directly receives the photo sent by the first device and stores the photo in the database of the gallery of the second device. Along with the above example, the updated photograph is displayed on the interface 490 of the second device shown in FIG. 5K.
In the second case, the second device obtains a photo, as shown in fig. 4C, and the specific implementation manner is as follows:
S417, the second device starts the gallery application and displays an interface 6, where the interface 6 is an application interface of the gallery application. The shared folder is displayed on the interface 6.
The second user initiates a gallery application on the second device. Illustratively, when the second user clicks on the icon 451 of the gallery application on the main interface 450 of the second device as shown in fig. 5F, the second device receives a user operation and, in response to the user operation, the second device enters the application interface 460 of the gallery application as shown in fig. 5G, and the shared folder 461 is displayed on the interface 460.
S418, the second device receives operation 6 on the shared folder, the operation 6 being for opening the shared folder.
The operation 6 may include a click operation, a press operation, a multi-click operation, a multi-press operation, or a combination thereof. The embodiment of the present application is not particularly limited.
S419, in response to operation 6, the second device displays an interface 7, the interface 7 comprising an input box for inputting the authentication information and a control 4 for indicating authentication of the authentication information.
The verification information may be understood as information for verifying the identity of the user, and may include information such as a mobile phone number, an account name, an account nickname, and real-name authentication.
Illustratively, when the second user clicks on the shared folder 461 on the interface 460 shown in fig. 5G, the second device receives a user operation, and in response to the operation, the interface of the second device jumps from the interface 460 shown in fig. 5G to the interface 470 shown in fig. 5H. A window 471 is displayed on the interface 470, and an input box 4711 and a control 4712 are displayed on the window 471. As shown in fig. 5I, after the second user inputs authentication information in the input box 4711 shown in fig. 5I, the second user clicks the control 4712. The second device receives a user operation.
S420, the second device receives the operation 7 of the user on the interface 7.
This operation 7 can be understood as an input operation to the input box and a triggering operation to the control 4. The operation 7 may include a combination operation of an input operation and a trigger operation. Illustratively, the second user enters the authentication information within the input box. After entering the authentication information, the second user clicks on control 4, the operations of which may be collectively referred to as operation 7.
S421, in response to operation 7, the second device displays an interface 9, the interface 9 including a control 5, the control 5 being used to trigger acquisition of the photo.
Along with the above example, after the second user clicks on control 4712 shown in FIG. 5I, the second device validates the validation information and after the validation passes, the interface of the second device jumps to interface 480 shown in FIG. 5J, on which control 481 is displayed.
S422, the second device receives an operation 8 of the second user on the control 5, where the operation 8 is an operation for triggering to obtain a photo.
Similarly to the above, the operation 8 may include a click operation, a press operation, a multi-click operation, a multi-press operation, or a combination thereof. The embodiment of the present application is not particularly limited.
S423, in response to operation 8, the second device obtains a photograph.
Specifically, the second device downloads the photograph to a database of a gallery of the second device.
In both the first case and the second case, after the photo exists in the database of the gallery of the second device, the following S424 is performed:
S424, the second device displays an interface 9, and the photo is displayed on the interface 9.
Along with the above example, after the second user clicks the control 481 shown in fig. 5J, the second device receives an operation by the second user, and in response to the operation, the interface of the second device jumps from the interface 480 shown in fig. 5J to the interface 490 shown in fig. 5K, and an updated photograph is displayed at the interface 490.
In an actual application scenario, assume that the first device (handset 01) belongs to small P's dad and the second device (handset 02) belongs to small P's mom (small A). The handset 01 has the highest configuration and the best photographing effect. Small P's dad uses handset 01 to take a family group photo of small P's whole family, and shares the family group photo with small A's handset 02. Specifically, handset 01 shares the family group photo with handset 02 by adopting the above steps S400-S424. After receiving the family group photo, handset 02 displays it in its gallery. In this way, the high-quality family group photo also appears in the gallery of handset 02, consistent with the shooting experience on handset 01, thereby improving the user experience.
In another scenario, taking the case where the multimedia file is a video as an example, the multimedia file sharing method provided by the embodiment of the present application is described. Compared with the photo scenario, the differences are as follows:
In the first stage, the timing at which the first device detects the portrait specifically includes the following cases:
In case one, the portrait is detected before the first device performs video recording.
As described above in S402, when the first device detects that the portrait is displayed on the interface 1, the first device also displays the identification information of the portrait on the interface 1.
Specifically, when the first device performs ranging to achieve quick focusing, one or more portraits are displayed on the interface 1 of the first device. At this time, the camera of the first device collects the portrait information of the one or more portraits. The first device obtains the identification information of each portrait's face according to the portrait information, and also displays the identification information on the interface 1, which may be displayed in the form of an identification frame. Illustratively, a face recognition box 512 for user A and a face recognition box 511 for user B are displayed on the interface 510 as shown in FIG. 6A.
In case two, the portrait is detected while the first device is recording the video.
Specifically, when the first device detects that the portrait is displayed on the recording interface, the first device also displays identification information of the portrait on the recording interface.
Recording a video can be understood as combining individual image frames by timestamp. Thus, during recording, image frames are displayed on the recording interface with timestamps. When a first image frame is displayed on the recording interface at a first timestamp, the first device detects the portrait in the first image frame at the first timestamp and displays the identification information of the portrait on the recording interface. Similarly, when the recording interface displays a second image frame at a second timestamp, the first device detects the portrait in the second image frame at the second timestamp and displays its identification information on the recording interface. That is, the identification information of the portrait displayed by the first device on the recording interface changes as the portrait in the image frames changes.
Illustratively, the first user clicks control 514 on interface 510 as shown in FIG. 6A. The first device receives an operation of the first user, and in response to the operation, the first device records video for user A and user B. At this time, the display interface of the first device jumps from the interface 510 shown in fig. 6A to the interface 520 shown in fig. 6B. At some point, face recognition box 512 for user A and face recognition box 511 for user B are displayed on interface 520 shown in FIG. 6B.
In case three, a portrait is detected on the interface where the video is displayed.
As in S407, when the first device detects a portrait in the video on the interface 3, the first device also displays the identification information of the portrait on the interface 3.
After the first user clicks on control 524 on interface 520 shown in fig. 6B, the first device stops recording the video. And the first device stores the video in a database of the gallery. When the user needs to browse the video, the video may be browsed in the manner of S405-S406, which is not described herein.
Similarly to recording, playing a video may be understood as displaying individual image frames according to timestamps. Therefore, during playback, image frames are displayed with timestamps on the interface on which the video is displayed. When a first image frame is displayed on that interface at a first timestamp, the first device detects the portrait in the first image frame at the first timestamp and displays the identification information of the portrait on the interface. Similarly, when a second image frame is displayed at a second timestamp, the first device detects the portrait in the second image frame at the second timestamp and displays its identification information. That is, the identification information of the portrait displayed by the first device on the interface on which the video is displayed changes as the portrait in the image frames changes.
For example, when the first device displays the display interface of the video, the first device detects the interface and acquires the portrait information in the video. The first device obtains the identification information of each portrait's face according to the portrait information, and also displays the identification information on the interface 3, which may be displayed in the form of an identification frame. For example, at some point, a face recognition box 512 for user A and a face recognition box 511 for user B are displayed on the interface 530 as shown in FIG. 6C.
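The per-timestamp detection described for cases two and three can be sketched as follows: each image frame carries a timestamp, and the identification information shown on the interface follows whatever portraits appear in the frame at that timestamp. The detector here is a stand-in lookup, and all names are illustrative.

```python
# Sketch of per-timestamp portrait detection: the overlay shown on the
# recording (or playback) interface is recomputed frame by frame, so the
# displayed identification information changes as portraits enter or
# leave the shot.

def detect_portraits(frame):
    # Stand-in for camera-side face recognition on one image frame.
    return frame.get("faces", [])

def overlay_for_interface(frames):
    """Return, per timestamp, the identification boxes to display."""
    overlay = {}
    for frame in frames:               # frames arrive in timestamp order
        overlay[frame["ts"]] = detect_portraits(frame)
    return overlay

frames = [
    {"ts": 0.0, "faces": ["user A", "user B"]},
    {"ts": 0.5, "faces": ["user A"]},  # user B has left the shot
]
boxes = overlay_for_interface(frames)
```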
In the first stage, the first device displays the identification information of the target portrait in different ways, and specifically includes the following ways:
In a specific implementation manner, the control 3 may be a transmission control corresponding to each portrait one by one. Illustratively, the video is displayed on the interface 520 shown in FIG. 6B described above. Displayed on the interface 520 shown in fig. 6B are: the identification information of user A (e.g., small A) and a control 522, and the identification information of user B (e.g., small B) and a control 521. Illustratively, as shown in FIG. 6C, when the first user slides on the interface 530 in the direction of the arrow, the first device receives the sliding operation. In response to the sliding operation, the avatar of user A and its corresponding control 531 and the avatar of user B and its corresponding control 532 are displayed on the interface 530 of the first device.
In the embodiment of the application, the control may be a transmission control corresponding to each portrait one by one, so that the user shares the video with a designated device; the first device can send the video to the designated second device in a targeted manner, thereby effectively avoiding interference to other devices and improving the user experience.
In another specific implementation manner, the control 3 may be an upload control corresponding to a plurality of portraits. Illustratively, as shown in FIG. 6C, when the first user slides on the interface 530 in the direction of the arrow, the first device receives the sliding operation. In response to the sliding operation, the first device displays, as shown on the interface 530 of FIG. 6D, the avatar of user A and its corresponding control 531, the avatar of user B and its corresponding control 532, and a control 533, the control 533 being used to upload media data with one touch.
In the embodiment of the application, the control may be an upload control corresponding to a plurality of portraits, so that the user can complete the operation with one touch, causing the first device to establish communication with a plurality of second devices and send the video to the plurality of second devices, thereby increasing the rate at which the first device shares the video with the plurality of second devices and improving the user experience.
In the first stage, since the ways in which the first device displays the identification information of the target portrait differ, the ways in which the first device sends the video correspondingly differ, specifically including the following ways:
In a specific implementation manner, when the control 3 and the identification information of the target portrait are displayed on the preview interface, after the first device receives the operation of the first user on the control 3, and the first user then operates the shooting control on the preview interface, the first device records the video and sends the recorded video. Illustratively, the first user clicks the face recognition box 512 of user A shown in FIG. 6A, after which the first user clicks the control 514 shown in FIG. 6A. The mobile phone 01 receives the above operations of the first user, and in response, the mobile phone 01 records the video and sends the recorded video. In the embodiment of the application, after the user operates the transmission control and the shooting control, the first device records the video and sends the recorded video, so that the first device can share the video with other devices while recording it, thereby improving the sharing rate.
In another specific implementation manner, when the control 3 and the identification information of the target portrait are displayed on the recording interface, after the first device receives the operation of the first user on the control 3, the first device sends the recorded video. Illustratively, the first user clicks the control 514 shown in FIG. 6A. Thereafter, the mobile phone 01 starts recording the video, and its interface jumps to the interface 520 shown in fig. 6B. User A's identification information and control 522, and user B's identification information and control 521, are displayed on the interface 520. When the first user clicks the control 522, the mobile phone 01 receives the operation of the first user, and in response, the mobile phone 01 records the video and simultaneously sends the recorded video to the mobile phone 02 of user A. In the embodiment of the application, after the user operates the shooting control and then operates the transmission control, the first device records the video and sends the recorded video to other devices, so that the first device can share the video with other devices while recording it, thereby improving the sharing rate.
In another specific implementation manner, when the identification information of the control 3 and the target portrait is displayed on the recording interface, the first device receives the operation of the control 3 by the first user, and after the recording end control on the recording interface is operated, the first device sends the recorded video. Illustratively, the first user clicks control 514 shown in FIG. 6A. Thereafter, the mobile phone 01 starts recording the video. At this time, the interface of the mobile phone 01 jumps to the interface 520 shown in fig. 6B. User a's identification information and control 522, user B's identification information and control 521 are displayed on interface 520. When the first user clicks the control 522 and clicks the control 524, the mobile phone 01 receives the above operation of the first user, and in response to the operation, the mobile phone 01 transmits the recorded video to the mobile phone 02 of the user a. In the embodiment of the application, when the user operates the transmission control and the user operates the recording end control, the first device sends the recorded video to other devices, so that the first device immediately shares the video after the video is recorded, and the sharing rate is improved.
In yet another specific implementation manner, when the control 3 and the identification information of the target portrait are displayed on the browsing interface (i.e., the video playing interface), after the first device receives the operation of the first user on the control 3, the first device sends the recorded video. Illustratively, the first user operates the handset 01 to enter the browsing interface, i.e., the interface 530 as shown in fig. 6D. User A's identification information and control 522, and user B's identification information and control 521, are displayed on the interface 530. When the first user clicks the control 522, the mobile phone 01 receives the operation of the first user, and in response, the mobile phone 01 sends the recorded video to the mobile phone 02 of user A. In the embodiment of the application, after the user operates the recording end control and then operates the transmission control on the browsing interface, the first device sends the video to other devices, so that the first device can share the video with other devices while browsing it, thereby improving the user experience.
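The four implementations above differ only in which interface the transmission control is operated on, and hence in when the video is sent. A minimal lookup contrasting them is sketched below; the mode labels are illustrative descriptions, not terms from the application.

```python
# Sketch contrasting the four send timings, keyed by the interface on
# which the transmission control was operated. Labels are illustrative.

SEND_TIMING = {
    "preview":    "send frames while recording",          # control pressed before shooting
    "recording":  "send frames while recording",          # control pressed mid-recording
    "record_end": "send whole video after recording ends",
    "browsing":   "send stored video immediately",
}

def send_timing(interface):
    return SEND_TIMING[interface]
```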
The above description takes a mobile phone as an example of the electronic device; the multimedia file sharing method provided by the embodiment of the application can also be applied to other electronic devices such as tablets, which is not described in detail in the embodiment of the application.
It will be appreciated that in order to achieve the above-described functionality, the electronic device comprises corresponding hardware and/or software modules that perform the respective functionality. The present application can be implemented in hardware or a combination of hardware and computer software, in conjunction with the example algorithm steps described in connection with the embodiments disclosed herein. Whether a function is implemented as hardware or computer software driven hardware depends upon the particular application and design constraints imposed on the solution. Those skilled in the art may implement the described functionality using different approaches for each particular application in conjunction with the embodiments, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
The present embodiment may divide the electronic device into functional modules according to the above method example; for example, each functional module may correspond to one function, or two or more functions may be integrated into one processing module. The integrated modules may be implemented in hardware. It should be noted that the division of the modules in this embodiment is schematic and is merely a logical function division; in actual implementation, another division manner may be used.
For example, in one division, referring to fig. 7, the electronic device 700 may include: a first display module 701, configured to display a first interface, where one or more portraits and a first control are displayed on the first interface; the first control corresponds to one second device or to a plurality of second devices, and is configured to trigger sharing of a multimedia file with the one or more second devices corresponding to the first control, where the multimedia file is a media file containing the one or more portraits. A first receiving module 702 is configured to receive a first operation on the first control. A first sending module 703 is configured to send, in response to the first operation, the multimedia file to the one or more second devices.
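The module division of fig. 7 can be sketched as below: one module per function, composed into the device. The class and method names are assumptions for illustration, not the application's actual interfaces.

```python
# Illustrative sketch of the functional-module division of the electronic
# device 700: a display module, a receiving module, and a sending module,
# each covering one function of the method.

class FirstDisplayModule:
    def show_first_interface(self, portraits):
        # The first interface displays the portraits and a first control.
        return {"portraits": portraits, "controls": ["first_control"]}

class FirstReceivingModule:
    def receive_first_operation(self, control):
        return control == "first_control"

class FirstSendingModule:
    def send_multimedia_file(self, file, second_devices):
        return [(device, file) for device in second_devices]

class ElectronicDevice700:
    def __init__(self):
        self.display = FirstDisplayModule()
        self.receiver = FirstReceivingModule()
        self.sender = FirstSendingModule()

    def share(self, portraits, file, second_devices):
        self.display.show_first_interface(portraits)
        if self.receiver.receive_first_operation("first_control"):
            return self.sender.send_multimedia_file(file, second_devices)
        return []

sends = ElectronicDevice700().share(["user A"], "photo.jpg", ["handset-02"])
```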
In some specific implementation manners, the first interface further displays identification information of one or more portraits, the number of the first controls is one or more, and the identification information of each portrait corresponds to one first control.
In a specific implementation manner, the first interface is a browsing interface, and the browsing interface is an interface for a user to browse the multimedia file.
In one implementation, the first interface is a photo shooting interface, i.e., an interface for capturing a photo. A second control is also displayed on the first interface and is used to trigger capture of the photo; the second control and the first control are the same control.
In one implementation, the first interface is a video shooting interface, i.e., an interface for recording video. The first sending module 703 is further configured to: in response to the first operation, send the recorded video to the one or more second devices while the video is being recorded.
In one implementation, the multimedia file includes at least one of: a photo, a video.
In some implementations, the electronic device 700 further includes a second display module 704, configured to display a second interface, where identification information of the one or more portraits is displayed on the second interface.
In some implementations, the electronic device 700 further includes: the second sending module 705 is configured to send a first message to one or more second devices, where the first message carries identification information of one or more portraits, and the first message is used to request communication. The second device is configured to send a second message to the first device when the identification information of the one or more portraits matches the pre-stored information, the second message being used to characterize the second device as a target device for the first device.
In a specific implementation manner, the first message carries address information of the first device; the second message carries address information of the second device.
In one implementation, the second message carries identification information of a target user, where the identification information of the target user is determined by the second device according to the identification information of the one or more portraits. The electronic device 700 further includes a second receiving module 706, configured to receive the second messages sent by the one or more second devices. The first device further displays the identification information of the target user on the first interface.
In some implementations, the electronic device 700 further includes a third sending module 707, configured to send prompt information to the second device, where the prompt information is used to prompt the second device that the multimedia file has been updated.
In one implementation, the first sending module 703 is further configured to send, in response to the first operation, the multimedia file to a server. The server is configured to receive the multimedia file and to provide the second device with an interface for obtaining the multimedia file.
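The server role (receive the upload, then expose a fetch interface for second devices) can be sketched as a shared-folder store, in the spirit of the shared folder in claims 21-22; the class and method names are illustrative, not part of the disclosure:

```python
class ShareServer:
    """Hypothetical server: receives the multimedia file from the first
    device and provides second devices an interface to obtain it."""

    def __init__(self):
        self._shared_folder = {}  # file name -> file bytes

    def upload(self, file_name, data):
        # Called by the first device in response to the first operation.
        self._shared_folder[file_name] = data

    def fetch(self, file_name):
        # The interface through which a second device obtains the file.
        return self._shared_folder.get(file_name)

server = ShareServer()
server.upload("group_photo.jpg", b"jpeg-bytes")
data = server.fetch("group_photo.jpg")
```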
In one implementation, the electronic device 700 further includes a fourth sending module 708, configured to send a third message to the second device, where the third message is used to indicate that the multimedia file is in a protected state within a first preset time.
In one implementation, the protected state is a state in which the multimedia file cannot be acquired or used by other means, and the protected state includes at least one of: a no-screenshot/screen-recording state, a no-copying state, and a no-resharing state.
In some implementations, the electronic device 700 further includes a third receiving module 709, configured to receive, when the multimedia file has been sent by mistake, a second operation on the multimedia file within the first preset time, and a fifth sending module 710, configured to send, in response to the second operation, a fourth message to the second device, where the fourth message is used to request deletion of the multimedia file. The second device is configured to delete the multimedia file based on the fourth message.
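The misdelivery recovery path (a fourth message requesting deletion while the file is still in the protected state) might look like this on the second-device side; enforcing the window via a stored deadline is an assumption about how the first preset time could be implemented:

```python
class SecondDevice:
    """Hypothetical receiver: honors a deletion request (fourth message)
    only while the file is within its first preset (protected) time."""

    def __init__(self, protect_seconds):
        self.protect_seconds = protect_seconds
        self.files = {}  # name -> (data, protected_until)

    def receive_file(self, name, data, now):
        self.files[name] = (data, now + self.protect_seconds)

    def handle_fourth_message(self, name, now):
        entry = self.files.get(name)
        if entry and now <= entry[1]:  # still inside the protected window
            del self.files[name]       # delete per the fourth message
            return True
        return False                   # window elapsed, or unknown file

receiver = SecondDevice(protect_seconds=60)
receiver.receive_file("wrong.jpg", b"...", now=1000.0)
deleted = receiver.handle_fourth_message("wrong.jpg", now=1030.0)
```

Because the file is also unshareable and uncopyable during this window, a timely fourth message can fully retract a misdelivered file.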
The first display module 701 and the second display module 704 may be displays, for example, specifically the display screen 290 in the hardware structure shown in fig. 3A. The first receiving module 702, the first sending module 703, the second sending module 705, the second receiving module 706, the third sending module 707, the fourth sending module 708, the third receiving module 709, and the fifth sending module 710 may be components that interact with other electronic devices, such as a radio frequency circuit, a Bluetooth chip, or a Wi-Fi chip. For example, they may include the wireless communication module 250, the antenna 1, and the like in the hardware structure shown in fig. 3A.
It should be noted that, for all relevant content of each step in the above method embodiment, reference may be made to the functional description of the corresponding functional module; details are not repeated here.
There may be other divisions in actual implementation; for example, multiple units or components may be combined or integrated into another device, or some features may be omitted or not performed. In addition, the coupling, direct coupling, or communication connection shown or discussed may be an indirect coupling or communication connection via interfaces, devices, or units, and may be electrical, mechanical, or in other forms.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above division of functional modules is illustrated. In practical applications, the above functions may be allocated to different functional modules as needed; that is, the internal structure of the apparatus may be divided into different functional modules to perform all or part of the functions described above.
In the several embodiments provided by the present application, it should be understood that the disclosed apparatus and method may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative: the division of modules or units is merely a logical function division, and other division manners are possible in actual implementation.
The units described as separate parts may or may not be physically separate, and a part displayed as a unit may be one physical unit or a plurality of physical units, located in one place or distributed across a plurality of different places. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a readable storage medium. Based on such understanding, the technical solution of the embodiments of the present application (in essence, the part contributing to the prior art, or all or part of the technical solution) may be embodied in the form of a software product stored in a storage medium, including several instructions for causing an electronic device (which may be a single-chip microcomputer, a chip, or the like) or a processor to execute all or part of the steps of the method described in the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code: a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
The foregoing is merely illustrative of specific embodiments of the present application, and the scope of the present application is not limited thereto; any change or substitution within the technical scope of the present application shall be covered by the protection scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (33)

1. A method for sharing a multimedia file, the method comprising:
a first device displays a first interface, wherein one or more portraits and a first control are displayed on the first interface; the first control corresponds to one second device or to a plurality of second devices, the first control is used to trigger sharing of a multimedia file with the one or more second devices corresponding to the first control, and the multimedia file is a media file containing the one or more portraits;
the first device receives a first operation on the first control;
in response to the first operation, the first device transmits the multimedia file to one or more of the second devices.
2. The method of claim 1, wherein the first interface further displays identification information of one or more portraits, the number of the first controls being one or more, the identification information of each portrait corresponding to one of the first controls.
3. The method according to claim 1 or 2, wherein the first interface is a browsing interface, which is an interface for a user to browse the multimedia file.
4. The method according to claim 1 or 2, wherein the first interface is a photo taking interface, which is an interface for taking a photo; a second control is further displayed on the first interface, the second control is used to trigger capture of the photo, and the second control and the first control are the same control.
5. The method according to claim 1 or 2, wherein the first interface is a video capturing interface, the video capturing interface being an interface for recording video;
in response to the first operation, the first device transmitting the multimedia file to one or more of the second devices, comprising:
in response to the first operation, the first device transmits the recorded video to one or more of the second devices while recording the video.
6. The method of any one of claims 1-5, wherein the multimedia file comprises at least one of: photo, video.
7. The method of any of claims 1-6, further comprising, prior to the first device displaying a first interface:
the first device displays a second interface on which identification information of the one or more portraits is displayed.
8. The method of any of claims 1-7, further comprising, prior to the first device displaying a first interface:
the first device sends a first message to one or more second devices, wherein the first message carries identification information of one or more portraits, and the first message is used to request communication; the second device is configured to send a second message to the first device when the identification information of the one or more portraits is consistent with the pre-stored information, wherein the second message is used to characterize that the second device is a target device of the first device.
9. The method of claim 8, wherein the first message carries address information of the first device; the second message carries address information of the second device.
10. The method of claim 8 or 9, wherein the second message carries identification information of a target user, the identification information of the target user being determined by the second device according to the identification information of the one or more portraits, further comprising:
The first device receives the second messages sent by the one or more second devices;
the first device displays the first interface, and the first interface also displays the identification information of the target user.
11. The method of any of claims 1-10, wherein after the first device sends the multimedia file to the second device in response to the first operation, further comprising:
the first device sends prompt information to the second device, wherein the prompt information is used to prompt the second device that the multimedia file has been updated.
12. The method of any of claims 1-11, wherein the first device sending the multimedia file to the second device in response to the first operation comprises:
in response to the first operation, the first device sends the multimedia file to a server;
the server is used for receiving the multimedia file and providing an interface for obtaining the multimedia file for the second device.
13. The method of any of claims 1-12, wherein after the first device sends the multimedia file to one or more of the second devices in response to the first operation, further comprising:
The first device sends a third message to the second device, wherein the third message is used to indicate that the multimedia file is in a protected state within a first preset time, and the protected state comprises at least one of the following: a no-screenshot/screen-recording state, a no-copying state, and a no-resharing state.
14. The method of claim 13, further comprising, after the first device sends a third message to the second device:
when the multimedia file has been sent by mistake, the first device receives a second operation on the multimedia file within the first preset time;
in response to the second operation, the first device sends a fourth message to the second device, wherein the fourth message is used for requesting to delete the multimedia file;
the second device is configured to delete the multimedia file according to the fourth message.
15. A multimedia file sharing method applied to a communication system, wherein the communication system comprises a first device and one or more second devices, and the method comprises the following steps:
the first device displays a first interface, wherein one or more portraits and a first control are displayed on the first interface; the first control corresponds to one second device or to a plurality of second devices, the first control is used to trigger sharing of a multimedia file with the one or more second devices corresponding to the first control, and the multimedia file is a media file containing the one or more portraits;
The first device receives a first operation on the first control;
in response to the first operation, the first device transmits the multimedia file to one or more of the second devices.
16. The method of claim 15, wherein the first interface further displays identification information of one or more portraits, the number of first controls being one or more, the identification information of each portrait corresponding to one of the first controls.
17. The method of claim 15 or 16, further comprising, prior to the first device displaying a first interface:
the first device sends a first message to the one or more second devices, wherein the first message carries identification information of one or more portraits, and the first message is used to request communication;
the one or more second devices compare the identification information of the one or more portraits with pre-stored information;
and when the identification information of the one or more portraits is consistent with the pre-stored information, the one or more second devices send second messages to the first device, wherein the second messages are used for representing that the second device is a target device of the first device.
18. The method of claim 17, wherein the first message carries address information of the first device; the second message carries address information of the second device.
19. The method as recited in claim 17, further comprising:
the second device determines one or more target users and identification information of the target users when the identification information of the one or more portraits is consistent with the pre-stored information;
the first equipment receives the second message, wherein the second message carries the identification information of the target user;
the first device displays the first interface, and the first interface also displays the identification information of the target user.
20. The method of any of claims 15-19, wherein after the first device sends the multimedia file to the second device in response to the first operation, further comprising:
the first device sends prompt information to the second device, wherein the prompt information is used for prompting the second device that the multimedia file is updated;
the second device receives the prompt information;
the second device displays a third interface, and the prompt information is displayed on the third interface.
21. The method of any of claims 15-20, wherein the communication system further comprises a server, the first device sending the multimedia file to the second device in response to the first operation, comprising:
in response to the first operation, the first device sends the multimedia file to the server;
the server receives the multimedia file, and the server is used for providing an interface for obtaining the multimedia file for the second device.
22. The method of claim 21, further comprising, after the server receives the multimedia file: the server stores the multimedia files in a shared folder, which is a storage area shared by the first device and the one or more second devices on the server.
23. The method as recited in claim 22, further comprising:
the second device displays a fourth interface, and icons of the shared folders are displayed on the fourth interface;
The second device receives a second operation on the icon of the shared folder;
in response to the second operation, the second device obtains the multimedia file from the server.
24. The method of claim 23, wherein the second device displays a fourth interface comprising:
after the second device satisfies the first condition, the second device displays the fourth interface.
25. The method of claim 24, wherein the first condition comprises at least one of:
the second device receives prompt information sent by the first device, wherein the prompt information is used for prompting the second device that the multimedia file is updated;
the second device receives a third operation of the second device, wherein the third operation is an operation for triggering the display of the icon of the shared folder.
26. The method of claim 23, wherein the second device, in response to the second operation, obtains the multimedia file from the server, comprising:
in response to the second operation, the second device displays a fifth interface, wherein an input box is displayed on the fifth interface and is used to input verification information;
the second device receives a fourth operation on the input box;
in response to the fourth operation, the second device sends a first request to the server, wherein the first request carries the verification information, and the first request is used to obtain the multimedia file;
the server sends the multimedia file to the second device according to the first request;
the second device receives the multimedia file.
27. The method of any of claims 15-26, wherein after the first device sends the multimedia file to the second device in response to the first operation, further comprising:
the second device displays a sixth interface on which the multimedia file is displayed.
28. The method of claim 27, wherein the sixth interface is an interface of a gallery application.
29. The method of any of claims 15-28, wherein after the first device sends the multimedia file to one or more of the second devices in response to the first operation, further comprising:
the first device sends a third message to the second device, wherein the third message is used to indicate that the multimedia file is in a protected state within a first preset time, and the protected state comprises at least one of the following: a no-screenshot/screen-recording state, a no-copying state, and a no-resharing state.
30. The method of claim 29, further comprising, after the first device sends a third message to the second device:
when the multimedia file has been sent by mistake, the first device receives a fifth operation on the multimedia file within the first preset time;
in response to the fifth operation, the first device sends a fourth message to the second device, wherein the fourth message is used for requesting to delete the multimedia file;
the second device deletes the multimedia file according to the fourth message.
31. An electronic device, comprising: one or more processors; and a memory in which code is stored; the code, when executed by the one or more processors, causes the electronic device to perform the method of any one of claims 1-14.
32. A communication system, comprising a first device and one or more second devices, wherein:
the first device is configured to perform:
displaying a first interface, wherein one or more portraits and a first control are displayed on the first interface; the first control corresponds to one second device or to a plurality of second devices, the first control is used to trigger sharing of a multimedia file with the one or more second devices corresponding to the first control, and the multimedia file is a media file containing the one or more portraits;
Receiving a first operation of the first control;
and transmitting the multimedia file to one or more second devices in response to the first operation.
33. A computer readable storage medium comprising computer instructions which, when run on a first device, cause the first device to perform the method of any of claims 1-14.
CN202210612621.XA 2022-05-31 2022-05-31 Multimedia file sharing method, electronic equipment and communication system Pending CN117202108A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210612621.XA CN117202108A (en) 2022-05-31 2022-05-31 Multimedia file sharing method, electronic equipment and communication system


Publications (1)

Publication Number Publication Date
CN117202108A true CN117202108A (en) 2023-12-08

Family

ID=88992979


Country Status (1)

Country Link
CN (1) CN117202108A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination