CN110888854A - Content sharing method and electronic equipment


Info

Publication number
CN110888854A
Authority
CN
China
Prior art keywords
type
objects
input
electronic device
video
Prior art date
Legal status
Pending
Application number
CN201911204952.4A
Other languages
Chinese (zh)
Inventor
胡铁军
Current Assignee
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd
Priority to CN201911204952.4A
Publication of CN110888854A
PCT filing PCT/CN2020/131168 (published as WO2021104268A1)

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00: Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/90: Details of database functions independent of the retrieved data types
    • G06F16/95: Retrieval from the web
    • G06F16/958: Organisation or management of web site content, e.g. publishing, maintaining pages or automatic linking
    • G06F16/972: Access to data in other repository systems, e.g. legacy data or dynamic Web page generation
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00: Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/10: File systems; File servers
    • G06F16/17: Details of further file system functions
    • G06F16/176: Support for shared access to files; File sharing support

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Databases & Information Systems (AREA)
  • Data Mining & Analysis (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Embodiments of the present invention provide a content sharing method and an electronic device, applied to the field of communication technologies, to solve the problem that the process of viewing content shared between different applications is cumbersome and time-consuming for the user. The method includes: receiving a first input to a first object in an interface of a first application; in response to the first input, obtaining a second object from the first object; receiving a second input; and in response to the second input, sending the second object through a second application. The object type of each object in the first object is a text type, a picture type, an audio type, or a video type, and the object type of each object in the second object is a text type, a picture type, an audio type, or a video type. The method is particularly applicable to forwarding content between different applications.

Description

Content sharing method and electronic equipment
Technical Field
The embodiment of the invention relates to the technical field of communication, in particular to a content sharing method and electronic equipment.
Background
Application programs (apps) in electronic devices such as mobile phones and tablet computers can provide users with a large amount of rich information in the form of text, pictures, audio, video, and the like. In particular, a user may want to forward content from one application to another. For example, a user may want to forward text, pictures, or video from a news application to an instant messaging application.
Specifically, current electronic devices generally forward content from the sending-side application to the receiving-side application in the form of a link. Consequently, after the electronic device receives and displays the link in the receiving-side application, it must receive a user input on the link and then open the web page or application indicated by the link, that is, open the entire page to which the forwarded content belongs, before the forwarded content can be displayed to the user. The process of viewing content shared between different applications is therefore cumbersome and time-consuming for the user.
Disclosure of Invention
Embodiments of the present invention provide a content sharing method and an electronic device, to solve the problem that the process of viewing content shared between different applications is cumbersome and time-consuming for the user.
In order to solve the above technical problem, the embodiment of the present invention is implemented as follows:
in a first aspect, an embodiment of the present invention provides a content sharing method, where the method includes: receiving a first input to a first object in an interface of a first application; in response to the first input, obtaining a second object from the first object; receiving a second input; sending, by the second application, the second object in response to the second input; the object type of each object in the first objects is a text type, a picture type, an audio type or a video type, and the object type of each object in the second objects is a text type, a picture type, an audio type or a video type.
In a second aspect, an embodiment of the present invention further provides an electronic device, where the electronic device includes: the device comprises a receiving module, a processing module and a sending module; the receiving module is used for receiving a first input of a first object in an interface of a first application; the processing module is used for responding to the first input received by the receiving module and obtaining a second object according to the first object;
the receiving module is also used for receiving a second input; the sending module is used for responding to the second input received by the receiving module and sending the second object obtained by the processing module through a second application; the object type of each object in the first objects is a text type, a picture type, an audio type or a video type, and the object type of each object in the second objects is a text type, a picture type, an audio type or a video type.
In a third aspect, an embodiment of the present invention provides an electronic device, which includes a processor, a memory, and a computer program stored in the memory and being executable on the processor, where the computer program, when executed by the processor, implements the steps of the content sharing method according to the first aspect.
In a fourth aspect, an embodiment of the present invention provides a computer-readable storage medium, on which a computer program is stored, where the computer program, when executed by a processor, implements the steps of the content sharing method according to the first aspect.
In the embodiment of the invention, the second object can be obtained according to the first object through triggering of the first input of the first object in the interface of the first application. The object type of each object in the first objects is a text type, a picture type, an audio type or a video type, and the object type of each object in the second objects can be a text type, a picture type, an audio type or a video type. That is, the electronic device may share an object of which an object type is a text type, a picture type, an audio type, or a video type in different applications, instead of sharing the object in a linked form. As such, the electronic device may present an object of which the object type is a text type, a picture type, an audio type, or a video type through the second application, instead of presenting a link including the object. Therefore, the user can directly view or operate the object shared among different applications, and does not need to click the link and then search the corresponding object. Furthermore, the sending user who shares the content can focus on the information part that the sending user wants to share, and other useless information is omitted; and the receiving user receives the information which the sending user wants to share through the electronic equipment, so that the intention of the sending user is intuitively reflected. Therefore, the accuracy of the electronic equipment for sharing the content among different applications is improved, and the convenience of the user for checking the content shared among different applications is improved.
Drawings
Fig. 1 is a schematic diagram of an architecture of a possible android operating system according to an embodiment of the present invention;
fig. 2 is a schematic flow chart illustrating a content sharing method according to an embodiment of the present invention;
fig. 3 is a second schematic flow chart of the content sharing method according to the embodiment of the present invention;
fig. 4 is a schematic structural diagram of a possible electronic device according to an embodiment of the present invention;
fig. 5 is a schematic diagram of a hardware structure of an electronic device according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that "/" in this context means "or", for example, A/B may mean A or B; "and/or" herein is merely an association describing an associated object, and means that there may be three relationships, e.g., a and/or B, which may mean: a exists alone, A and B exist simultaneously, and B exists alone. "plurality" means two or more than two.
It should be noted that, in the embodiments of the present invention, words such as "exemplary" or "for example" are used to indicate an example, illustration, or explanation. Any embodiment or design described as "exemplary" or "for example" in the embodiments of the present invention should not be construed as being preferred or more advantageous than other embodiments or designs. Rather, the words "exemplary" and "for example" are intended to present a concept in a concrete manner.
The terms "first" and "second," and the like, in the description and in the claims of the present invention are used for distinguishing between different objects and not for describing a particular order of the objects. For example, the first object and the second object, etc. are for distinguishing different objects, not for describing a particular order of the objects.
According to the content sharing method provided by the embodiment of the invention, the second object can be obtained according to the first object through triggering the first input of the first object in the interface of the first application. The object type of each object in the first objects is a text type, a picture type, an audio type or a video type, and the object type of each object in the second objects can be a text type, a picture type, an audio type or a video type. That is, the electronic device may share an object of which an object type is a text type, a picture type, an audio type, or a video type in different applications, instead of sharing the object in a linked form. As such, the electronic device may present an object of which the object type is a text type, a picture type, an audio type, or a video type through the second application, instead of presenting a link including the object. Therefore, the user can directly view or operate the object shared among different applications, and does not need to click the link and then search the corresponding object. Furthermore, the sending user who shares the content can focus on the information part that the sending user wants to share, and other useless information is omitted; and the receiving user receives the information which the sending user wants to share through the electronic equipment, so that the intention of the sending user is intuitively reflected. Therefore, the accuracy of the electronic equipment for sharing the content among different applications is improved, and the convenience of the user for checking the content shared among different applications is improved.
The electronic device in the embodiment of the invention can be a mobile electronic device or a non-mobile electronic device. The mobile electronic device may be a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted terminal, a wearable device, an ultra-mobile personal computer (UMPC), a netbook or a Personal Digital Assistant (PDA), etc.; the non-mobile electronic device may be a Personal Computer (PC), a Television (TV), a teller machine, a self-service machine, or the like; the embodiments of the present invention are not particularly limited.
It should be noted that, in the content sharing method provided in the embodiments of the present invention, the execution subject may be an electronic device, a central processing unit (CPU) of the electronic device, or a control module in the electronic device for executing the content sharing method. In the embodiments of the present invention, the content sharing method is described by taking an electronic device that executes the method as an example.
The electronic device in the embodiments of the present invention may be an electronic device having an operating system. The operating system may be an Android operating system, an iOS operating system, or another possible operating system, which is not specifically limited in the embodiments of the present invention.
The following describes a software environment applied to the content sharing method provided by the embodiment of the present invention, taking an android operating system as an example.
Fig. 1 is a schematic diagram of an architecture of a possible android operating system according to an embodiment of the present invention. In fig. 1, the architecture of the android operating system includes 4 layers, which are respectively: an application layer, an application framework layer, a system runtime layer, and a kernel layer (specifically, a Linux kernel layer).
The application program layer comprises various application programs (including system application programs and third-party application programs) in an android operating system.
The application framework layer is the framework of applications; developers can develop applications based on the application framework layer while complying with its development rules. Such applications include, for example, system applications such as a system settings application, a system chat application, and a system camera application, as well as third-party applications such as a third-party settings application, a third-party camera application, and a third-party chat application.
The system runtime layer includes libraries (also called system libraries) and android operating system runtime environments. The library mainly provides various resources required by the android operating system. The android operating system running environment is used for providing a software environment for the android operating system.
The kernel layer is an operating system layer of an android operating system and belongs to the bottommost layer of an android operating system software layer. The kernel layer provides kernel system services and hardware-related drivers for the android operating system based on the Linux kernel.
Taking an android operating system as an example, in the embodiment of the present invention, a developer may develop a software program for implementing the content sharing method provided in the embodiment of the present invention based on the system architecture of the android operating system shown in fig. 1, so that the content sharing method may operate based on the android operating system shown in fig. 1. That is, the processor or the electronic device may implement the content sharing method provided by the embodiment of the present invention by running the software program in the android operating system.
The content sharing method provided in the embodiment of the present invention is described in detail below with reference to the flowchart of the content sharing method shown in fig. 2. Although the logical order of the content sharing method provided by the embodiments of the present invention is shown in the flowchart of the content sharing method, in some cases, the steps shown or described may be executed in an order different from that shown. For example, the content sharing method illustrated in fig. 2 may include S201-S204:
S201, the electronic device receives a first input to a first object in an interface of a first application.
In the embodiments of the present invention, the object type of an object may be a text type, a picture type, an audio type, or a video type; that is, an object is a piece of text, a picture, an audio clip, or a video clip. Specifically, the object type of each object in the first object is a text type, a picture type, an audio type, or a video type.
Optionally, in this embodiment of the present invention, the first object belongs to content indicated by the target link in the first application.
For example, the first input may be an input of a user selecting a first object from content of a first application.
It should be noted that the screen of the electronic device provided in the embodiments of the present invention may be a touch screen, and the touch screen may be configured to receive an input from a user and, in response to the input, display content corresponding to the input to the user. The first input may be a touch-screen input, a fingerprint input, a gravity input, a key input, or the like. The touch-screen input is an input on the touch screen of the electronic device such as a press input, a long-press input, a slide input, a click input, or a hover input (an input made by the user near the touch screen). The fingerprint input is an input on a fingerprint recognizer of the electronic device such as a sliding fingerprint, a long-press fingerprint, a single-click fingerprint, or a double-click fingerprint. The gravity input is an input such as shaking the electronic device in a specific direction or shaking it a specific number of times. The key input is an input such as a single-click input, a double-click input, a long-press input, or a combined-key input on a key of the electronic device such as a power key, a volume key, or a Home key. Specifically, the embodiments of the present invention do not limit the manner of the first input, which may be any implementable manner.
The content sharing method provided by the embodiment of the invention can be applied to the electronic equipment to share objects among two or more applications so as to realize content sharing among different applications. Specifically, at least two applications may be installed in the electronic device to support the electronic device to share an object between different applications. For example, the electronic device shares an object between a first application and a second application.
S202, responding to the first input, the electronic equipment obtains a second object according to the first object.
The content in the first object is the same as the content in the second object, and the object type of each object in the second object may be a text type, a picture type, an audio type or a video type.
Specifically, the number of objects of the first object is the same as or different from the number of objects of the second object.
Optionally, the object type of the object in the second object may be determined by the object type of the object in the first object. For example, the object type of one of the second objects is the same as the object type of the corresponding one of the first objects.
S203, the electronic equipment receives a second input.
The second input may be used to trigger the electronic device to share the second object through the second application, for example, to trigger the electronic device to determine which application is the second application.
Similarly, the description of the input mode of the second input may refer to the description of the input mode of the first input in the foregoing embodiment, and is not repeated herein in the embodiment of the present invention.
S204, in response to the second input, the electronic device sends the second object through the second application.
Optionally, the electronic device sends the second object through the second application, specifically, sends the second object to a certain communication contact, favorite, or folder in the second application.
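As an illustrative Android sketch of this sending step (not the patent's implementation), the second object can be handed to the chosen receiving-side application through the standard share intent; the function name, parameters, and the assumption that media is exposed via a content Uri are placeholders added here for illustration.

```kotlin
// Minimal sketch: forwarding a processed object ("second object") to another
// application via the standard Android share intent. The Uri is assumed to point
// at a file exposed by the sending app (e.g. through a FileProvider).
import android.content.Context
import android.content.Intent
import android.net.Uri

fun shareSecondObject(context: Context, text: String?, mediaUri: Uri?, mimeType: String) {
    val send = Intent(Intent.ACTION_SEND).apply {
        type = mimeType                          // e.g. "text/plain", "image/png", "video/mp4"
        text?.let { putExtra(Intent.EXTRA_TEXT, it) }
        mediaUri?.let {
            putExtra(Intent.EXTRA_STREAM, it)
            addFlags(Intent.FLAG_GRANT_READ_URI_PERMISSION)
        }
    }
    // The system chooser plays the role of the "forwarding selection box":
    // the user picks the receiving-side ("second") application.
    context.startActivity(Intent.createChooser(send, "Forward to"))
}
```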
Optionally, in this embodiment of the present invention, the application types of the application may include a news type, a social type, a game type, a video type, an instant messaging type, and the like, which is not specifically limited in this embodiment of the present invention.
Illustratively, the application type of the first application is the same as or different from the application type of the second application. For example, the application type of the first application is a news type, and the application type of the second application is an instant messaging type.
It is understood that, in the embodiment of the present invention, the first application is a sending-side application, and the second application is a receiving-side application.
It should be noted that electronic devices currently share content between different applications in the form of links. In this case, the user usually operates a forwarding control in the sending-side application (i.e., a forwarding control belonging to that application alone) to trigger the electronic device to send, as a link, the entire current page (e.g., a web page) or a message body that carries information about the application (e.g., the application's name or icon). However, the user usually only actually needs to share part of the content on a page, such as a particular piece of text, a particular picture, or a particular video clip. As a result, what the receiving-side application receives is not the text, picture, or video itself but a link, and the user has to click the link to open the entire page it indicates before reaching the content that was actually meant to be shared. Moreover, in an era of information overload, users are increasingly reluctant to open yet another link and tend to simply ignore links. Obviously, the way current electronic devices present content shared between different applications is not flexible enough, so the process of viewing such content is cumbersome and time-consuming for the user.
It can be understood that, in the embodiment of the present invention, the second object shared by the electronic device is a text, a picture, an audio, or a video, rather than a link. In this way, the second object received by the electronic device on the receiving side is the content itself, rather than the entire page in the first application or the body of the message containing information of the first application.
Optionally, the electronic device that shares the second object through the second application (denoted as electronic device 1) may be the same as or different from the electronic device that receives the second object through the second application (denoted as electronic device 2).
When the electronic device 1 is different from the electronic device 2, a second application is installed in both the electronic device 1 and the electronic device 2, for example, a certain instant messaging application (denoted as instant messaging application 1) is installed. Specifically, the electronic device 1 and the electronic device 2 may interact with a communication server corresponding to the instant messaging application 1, so as to forward an object through the respective instant messaging application 1 in the electronic device 1 and the electronic device 2.
Further, optionally, the electronic device may execute the content sharing method provided by the embodiment of the present invention when the fast forwarding function is started. For example, S201 may be implemented by "the electronic device receives a first input to a first object in the first application in a case that the fast forwarding function is turned on".
In the embodiments of the present invention, the electronic device may provide a shortcut for starting the fast forwarding function, such as a combination of physical keys or a gesture, so that the user can conveniently and quickly enable or exit the fast forwarding function.
Optionally, in the embodiments of the present invention, the fast forwarding function in the electronic device may be a system-level function of the electronic device, that is, a function available globally across applications rather than a function provided by a single application. The fast forwarding function may be used to support the electronic device in sharing content (i.e., sharing objects) between different applications. For example, the fast forwarding function may provide a global, application-independent forwarding control that enables the electronic device to share content between different applications.
It should be noted that, when the fast forwarding function of the electronic device is turned on, a first input of the first object in the first application by the user may trigger the electronic device to obtain the second object according to the first object, and further trigger the second object to be shared by the second application. At this time, even if no separate forwarding control is set in the first application, the electronic device may share the second object through the second application by using a system-level forwarding control provided by the fast forwarding function. Therefore, the controllability and flexibility of sharing content among different applications of the electronic equipment are improved.
Further, optionally, after the user triggers the electronic device to share the second object through the second application, the electronic device may turn off the fast forwarding function. For example, the user may trigger the electronic device to turn off the fast forwarding function through a preset combination of physical keys.
It should be noted that, in the content sharing method provided in the embodiment of the present invention, obtaining of the second object according to the first object may be triggered by a first input to the first object in the first application. The object type of each object in the first objects is a text type, a picture type, an audio type or a video type, and the object type of each object in the second objects can be a text type, a picture type, an audio type or a video type. That is, the electronic device may share an object of which an object type is a text type, a picture type, an audio type, or a video type in different applications, instead of sharing the object in a linked form. As such, the electronic device may present an object of which the object type is a text type, a picture type, an audio type, or a video type through the second application, instead of presenting a link including the object. Therefore, the user can directly view or operate the object shared among different applications, and does not need to click the link and then search the corresponding object. Furthermore, the sending user who shares the content can focus on the information part that the sending user wants to share, and other useless information is omitted; and the receiving user receives the information which the sending user wants to share through the electronic equipment, so that the intention of the sending user is intuitively reflected. Therefore, the accuracy of the electronic equipment for sharing the content among different applications is improved, and the convenience of the user for checking the content shared among different applications is improved.
Optionally, in this embodiment of the present invention, the first input includes a first sub-input and a second sub-input. Illustratively, as shown in fig. 3, the above S202 may be implemented by S202a and S202 b:
S202a, in response to the first sub-input, the electronic device acquires the first object.
Illustratively, the first sub-input is an input to a first object, such as a long press and select input by a user of a text, picture, audio or video in the first application.
For example, in the case that one of the first objects is text, the first sub-input may include a sliding input of the user over the text, that is, an input selecting the text to be shared. At this point, the electronic device may display the text in a particular manner, for example by highlighting the background of the selected text.
For example, in the case that one of the first objects is a picture, the first sub-input may include a long-press input on the picture, that is, an input selecting the picture to be shared. At this point, the electronic device may display the picture in a particular manner, for example by highlighting a frame around the picture.
After the user performs the first input on the first object, the electronic device may display a forwarding selection box to let the user select an application as the second application, and then trigger sharing, through the second application, of the second object obtained from the first object.
That is, the user may activate the fast forwarding function of the electronic device via a shortcut key or otherwise.
Specifically, for text, when the user makes a selection input on the text, the electronic device may automatically select and copy the text and pop up the forwarding selection box. After the user selects, through the forwarding selection box, the second application to forward to, the electronic device shares the copied text through the second application.
For a picture, when a user presses a certain picture for a long time, the electronic device can automatically select the picture and pop up a forwarding selection box, and after the user selects a second application to be forwarded to through the forwarding selection box, the selected picture is forwarded through the second application.
For video, when the user long-presses a video, the electronic device can automatically start recording; when the user releases the press, the recording stops and the forwarding selection box pops up. After the user selects, through the forwarding selection box, the second application to forward to, the recorded video is forwarded through the second application.
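For the text case above, a minimal Android sketch of the "select and copy" step is shown below; ClipboardManager is the standard clipboard API, and the share step can reuse the illustrative shareSecondObject() helper from the earlier sketch. Names are assumptions for illustration.

```kotlin
// Illustrative sketch: copy the user's text selection, after which the
// forwarding selection box (system chooser) can be shown.
import android.content.ClipData
import android.content.ClipboardManager
import android.content.Context

fun copySelectedText(context: Context, selected: CharSequence) {
    val clipboard = context.getSystemService(Context.CLIPBOARD_SERVICE) as ClipboardManager
    clipboard.setPrimaryClip(ClipData.newPlainText("shared_text", selected))
    // e.g. shareSecondObject(context, selected.toString(), null, "text/plain")
}
```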
S202b, responding to the second sub-input, the electronic device processes the first object into the second object according to the target information.
Optionally, the electronic device may provide a rule selection control for supporting the user in triggering the electronic device to determine the target information.
Wherein the second sub-input may be an input to a rule selection control. That is, the user may control at least one of the number of objects and the type of objects of the second object processed by the electronic device. For example, a user may select whether to merge multiple different types of objects in a first object into a second object of a certain type.
Further, optionally, the content sharing method provided in the embodiment of the present invention may further include S205:
S205, in response to the first sub-input, the electronic device displays at least one application identifier.
The second sub-input is an input on a target identifier in the at least one application identifier, where each application identifier indicates one application and the target identifier indicates the second application.
Optionally, the electronic device may employ a forwarding selection box to display at least one application identifier.
The content sharing method provided by the embodiments of the present invention allows the user to select, through the first sub-input, the first object to be shared according to the user's own needs, and allows the user, through the second sub-input, to trigger the electronic device to process the first object according to the target information selected by the user to obtain the second object. This improves the controllability of the electronic device in processing objects, so that the object shared between different applications is the object the user actually needs to share.
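To make the notion of "target information" concrete, here is a minimal Kotlin data-model sketch; the enum values and field names are assumptions added for illustration and do not appear in the patent.

```kotlin
// Illustrative data model for the target information collected by a rule
// selection control: whether to merge, which output type to produce, and
// how to arrange the objects.
enum class ObjectType { TEXT, PICTURE, AUDIO, VIDEO }

enum class ArrangementLayout { OVERLAPPING, LEFT_TO_RIGHT, TOP_TO_BOTTOM, TRIANGULAR, GRID_2X2, MATRIX_MXN }

data class TargetInfo(
    val mergeObjects: Boolean,                 // mode 1 (keep objects separate) vs. mode 2 (merge)
    val outputType: ObjectType? = null,        // desired object type of the second object, if forced
    val layout: ArrangementLayout = ArrangementLayout.TOP_TO_BOTTOM,  // target arrangement layout
    val order: List<Int> = emptyList()         // target arrangement order of the selected objects
)
```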
In a possible implementation manner, in the content sharing method provided in the embodiment of the present invention, "processing the first object as the second object according to the target information" in the above embodiment may be implemented through S1 or S2:
and S1, the electronic equipment takes the first object as the second object, and the object input quantity of the first object is the same as the quantity of the second objects.
Specifically, the number of objects of the first object is greater than or equal to 1 in S1.
Here, the manner in S1 may be referred to as mode 1. In mode 1, the target information is used to instruct the electronic device not to merge the objects in the first object.
S2, the electronic device merges the objects in the first object into the second object, where the number of objects in the first object is greater than 1 and the number of objects in the second object is equal to 1.
Here, the mode in S2 can be described as mode 2. In the mode 2, the target information is used to instruct the electronic device to merge a plurality of objects in the first object.
The content sharing method provided by the embodiment of the invention can support the electronic device to process the second object through different target information. Therefore, the controllability and the flexibility of the electronic equipment for processing the object are improved.
Optionally, the step S2 may be implemented by the step S2a:
S2a, the electronic device merges the objects in the first object into the second object according to the object type of each object in the first object.
Specifically, the target information is used to instruct the electronic device to obtain the second object by performing the S2a process.
Further, optionally, in the application scenario 1 provided in the embodiment of the present invention, in a case that the object types of all the objects in the first object are the first object type, the object type of the second object is the same as or different from the first object type.
For example, in the case that the object types of all the objects in the first object are text types, the object type of the second object processed by the electronic device may be a text type or a picture type; that is, the electronic device may synthesize multiple pieces of text into one picture. In the case that the object types of all the objects in the first object are picture types, the object type of the second object may be a picture type; that is, the electronic device synthesizes multiple pictures into one picture. In the case that the object types of all the objects in the first object are audio types, the object type of the second object may be an audio type; that is, the electronic device synthesizes multiple pieces of audio into one piece of audio. In the case that the object types of all the objects in the first object are video types, the object type of the second object may be a video type; that is, the electronic device synthesizes multiple videos into one video.
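For the picture-merging case, a minimal sketch of stitching several pictures into one long picture with the standard android.graphics APIs is shown below, assuming a simple top-to-bottom layout; scaling, spacing, and error handling are omitted, and the function name is illustrative.

```kotlin
// Sketch: merge several pictures into one long picture (top-to-bottom layout).
import android.graphics.Bitmap
import android.graphics.Canvas

fun stitchVertically(pictures: List<Bitmap>): Bitmap {
    val width = pictures.maxOf { it.width }
    val height = pictures.sumOf { it.height }
    val result = Bitmap.createBitmap(width, height, Bitmap.Config.ARGB_8888)
    val canvas = Canvas(result)
    var top = 0f
    for (bmp in pictures) {
        canvas.drawBitmap(bmp, 0f, top, null)   // each picture drawn below the previous one
        top += bmp.height
    }
    return result
}
```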
Further, optionally, in the application scenario 2 provided in the embodiment of the present invention, in a case that the first object includes objects of different object types, the object type of the second object is the second object type; wherein the second object type is an object type of one of the first objects.
Optionally, in this embodiment of the present invention, the application scenario 2 may include the following sub-scenarios:
sub-scenario 1: in the case where text and a picture are included in the first object, the second object type is a picture type. That is, the electronic device processes the text into pictures, and then synthesizes a plurality of pictures into one picture.
Sub-scenario 2: in the case where the text and the audio are included in the first object, the second object type is an audio type.
Sub-scenario 3: in the case where the first object includes a picture and audio, the second object type is an audio type.
Sub-scenario 4: in the case where the text, the picture, and the audio are included in the first object, the second object type is an audio type.
Optionally, in the case that the second object type is an audio type, an object in the first object whose object type is a text type or a picture type is processed into a thumbnail of the second object or into a content part of the second object. For example, in the case that the second object is a song, a text-type object in the first object is processed into the lyrics of the second object, and a picture-type object in the first object is processed into the playing background picture of the second object.
Sub-scenario 5: in the case where the text and the video are included in the first object, the second object type is a video type.
Sub-scene 6: in the case where the first object includes a picture and a video, the second object type is a video type.
Sub-scenario 7: in the case where text, pictures, and video are included in the first object, the second object type is a video type.
Sub-scene 8: in the case where the first object includes text, picture, audio, and video, the second object type is a video type.
It can be understood that, when the first object includes objects of different object types: if the first object includes a video, the second object type is a video type; if the first object includes audio but no video, the second object type is an audio type; and if the first object includes a picture but neither audio nor video, the second object type is a picture type.
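The rule above can be summarized in a few lines; the sketch below assumes the illustrative ObjectType enum introduced earlier and simply encodes the precedence video > audio > picture > text.

```kotlin
// Sketch: decide the object type of the merged second object from the mix of
// object types in the first object.
fun mergedType(inputTypes: Set<ObjectType>): ObjectType = when {
    ObjectType.VIDEO in inputTypes   -> ObjectType.VIDEO
    ObjectType.AUDIO in inputTypes   -> ObjectType.AUDIO
    ObjectType.PICTURE in inputTypes -> ObjectType.PICTURE
    else                             -> ObjectType.TEXT
}
```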
Optionally, in the case that the second object type is a video type, one object of a third object type in the first object is used as the first frame image of the second object, where the third object type is a text type or a picture type.
Specifically, in the case that the second object type is a video type, the first object includes an object whose object type is a video type, that is, the first object includes a video (denoted as an original video).
The electronic device may process an object with an object type of text in the first object into a picture, and then use the picture as a first frame image of the second object, that is, insert the picture into the original video to obtain the second object. Or, the electronic device may directly use the object with the object type of the picture type in the first object as the first frame image of the second object.
Optionally, the electronic device may insert the image obtained by processing a text-type object in the first object and a picture-type object in the first object into different positions in the original video, respectively, to obtain the second object.
Optionally, the electronic device may combine an image obtained by processing an object of which the object type is a text type in the first object and an object of which the object type is a picture type in the first object into one image, and then insert the image into the original video to obtain the second object.
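Processing a text-type object into a picture (for example, so that it can serve as the first frame image of the merged video) could be done on Android by rendering the text with StaticLayout; the sketch below is illustrative only, with arbitrary size and color values (StaticLayout.Builder requires API 23+).

```kotlin
// Sketch: render a text object into a Bitmap.
import android.graphics.Bitmap
import android.graphics.Canvas
import android.graphics.Color
import android.text.Layout
import android.text.StaticLayout
import android.text.TextPaint

fun textToBitmap(text: CharSequence, widthPx: Int): Bitmap {
    val paint = TextPaint().apply {
        isAntiAlias = true
        textSize = 42f
        color = Color.BLACK
    }
    val layout = StaticLayout.Builder.obtain(text, 0, text.length, paint, widthPx)
        .setAlignment(Layout.Alignment.ALIGN_NORMAL)
        .build()
    val bitmap = Bitmap.createBitmap(widthPx, layout.height, Bitmap.Config.ARGB_8888)
    val canvas = Canvas(bitmap)
    canvas.drawColor(Color.WHITE)       // white background behind the rendered text
    layout.draw(canvas)
    return bitmap
}
```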
Optionally, when the number of the objects of the first object is greater than 1, the objects in the first object are arranged according to a target layout, and the target layout includes at least one of a target arrangement order and a target arrangement layout.
Optionally, the target arrangement layout may include an overlapping arrangement, a left-to-right arrangement, a top-to-bottom arrangement, a triangular arrangement, a 2×2 grid arrangement, an M×N matrix arrangement, and the like.
Optionally, the target information selection control provided by the electronic device may not only support the user to select whether to merge the plurality of first objects and what object type to merge into, but also support the user to select a target layout for merging the plurality of first objects. That is, the user can select whether to merge the object in the first object and the target layout of the object in the first object through the second sub-input. Therefore, after the electronic equipment shares the object among different applications, the format of the object displayed to the user is the format meeting the requirements of the user, and the man-machine interaction performance is improved.
Optionally, the user may edit the first object while selecting it through the first sub-input, for example, by cropping an object of a picture type or by clipping an object of an audio type or a video type.
For example, after the electronic device enables the fast forwarding function, if the user selects text by sliding over words in the first application, the electronic device may select the corresponding text and highlight its background. If the user triggers the electronic device to select multiple text segments, the segments can be automatically spliced into one piece of text. In addition, after the text is selected, the electronic device can display function keys around it so that the user can choose to share the text directly or to convert the text into a picture and share the picture. Subsequently, the user may select the application identifier of the second application in the forwarding selection box and trigger the electronic device to share the processed text or picture through the second application.
For example, after the electronic device enables the fast forwarding function, if the user clicks a picture in the first application, a highlighted rectangular frame appears around the picture. Dragging a corner of the rectangular frame zooms the selection, and pressing and sliding the middle of the frame moves it, so the user can either capture and share any part of the picture or share the whole picture. If the user clicks multiple pictures in the first application, the electronic device may share the multiple pictures through the second application, and each of them may be edited by the user. Specifically, when the user triggers the electronic device to select multiple pictures, the electronic device can display function keys around them, so that the user can choose to share the pictures directly or to splice all of them into one long picture for sharing; the target arrangement layout and the target arrangement order of the pictures can be set by the user. Subsequently, the user may select the application identifier of the second application in the forwarding selection box and trigger the electronic device to share the processed picture through the second application.
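Cropping the region inside the highlighted rectangle before forwarding is a one-line operation with the standard Bitmap API; the sketch below assumes the selection rectangle has already been clamped to the source picture's bounds, and the function name is illustrative.

```kotlin
// Sketch: crop the selected region of a picture before it is forwarded.
import android.graphics.Bitmap
import android.graphics.Rect

fun cropToSelection(source: Bitmap, selection: Rect): Bitmap =
    Bitmap.createBitmap(source, selection.left, selection.top,
        selection.width(), selection.height())
```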
Optionally, in the embodiments of the present invention, the step S202a may be implemented by step S202a-1 or step S202a-2:
S202a-1, the electronic device obtains a third object by recording a target video in the first application, and obtains the other objects in the first object except the third object.
S202a-2, the electronic device intercepts the target video according to a video time axis to obtain the third object, and obtains the other objects in the first object except the third object.
For example, after the electronic device enables the fast forwarding function, if the user clicks and selects a video (e.g., the target video) in the first application, the electronic device may display a highlighted rectangular frame around the video together with a recording function key. If the user only wants to forward part of the video, the user can click the recording function key to trigger the electronic device to start recording a segment of the video. Specifically, the user clicks the recording function key a first time to trigger the electronic device to start recording, and clicks it a second time to stop recording. The user can trigger the electronic device to record multiple video segments, which are automatically spliced into one video segment. Alternatively, if the user wants to forward the whole video, the user can trigger the electronic device to forward the original video directly without operating the recording function key.
In addition, the user can trigger the electronic device to select the start position and the end position of the video clip to be intercepted by adjusting the video time axis of the video (e.g., the target video) in the first application, so that the system of the electronic device automatically intercepts the clip as the video-type object in the first object. Alternatively, the user may set a rectangular frame area on the video in the first application to select the start position and the end position of the desired clip, triggering the electronic device to quickly intercept the clip within that time period, which is faster than recording.
Subsequently, the user may select the application identifier of the second application in the forwarding selection box, and trigger the electronic device to share the processed video clip through the second application.
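The time-axis interception described above could, on Android, be realized by remuxing the samples inside the chosen time window. The following is a rough, hedged sketch rather than the patent's implementation: it copies samples between startUs and endUs from the source file into a new MP4 without re-encoding, so it cuts at the nearest sync frame rather than exactly at the chosen positions; the paths, buffer size, and function name are illustrative.

```kotlin
// Rough sketch: cut the segment [startUs, endUs] out of a video by remuxing with
// MediaExtractor/MediaMuxer (no re-encoding; frame-accurate trimming would need decoding).
import android.media.MediaCodec
import android.media.MediaExtractor
import android.media.MediaMuxer
import java.nio.ByteBuffer

fun trimClip(inputPath: String, outputPath: String, startUs: Long, endUs: Long) {
    val extractor = MediaExtractor().apply { setDataSource(inputPath) }
    val muxer = MediaMuxer(outputPath, MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4)

    // Map every source track (video/audio) to a muxer track.
    val trackMap = HashMap<Int, Int>()
    for (i in 0 until extractor.trackCount) {
        extractor.selectTrack(i)
        trackMap[i] = muxer.addTrack(extractor.getTrackFormat(i))
    }
    muxer.start()

    extractor.seekTo(startUs, MediaExtractor.SEEK_TO_PREVIOUS_SYNC)
    val buffer = ByteBuffer.allocate(1 shl 20)
    val info = MediaCodec.BufferInfo()
    while (true) {
        info.size = extractor.readSampleData(buffer, 0)
        if (info.size < 0 || extractor.sampleTime > endUs) break
        info.presentationTimeUs = extractor.sampleTime - startUs
        info.flags = extractor.sampleFlags
        muxer.writeSampleData(trackMap[extractor.sampleTrackIndex]!!, buffer, info)
        extractor.advance()
    }
    muxer.stop(); muxer.release(); extractor.release()
}
```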
The content sharing method provided by the embodiment of the invention can support the electronic equipment to quickly acquire the object to be shared, and improves the man-machine interaction performance of the content shared by the electronic equipment.
Fig. 4 is a schematic structural diagram of an electronic device according to an embodiment of the present invention. The electronic device 40 shown in fig. 4 includes: a receiving module 41, a processing module 42, and a sending module 43. The receiving module 41 is configured to receive a first input to a first object in an interface of a first application. The processing module 42 is configured to, in response to the first input received by the receiving module 41, obtain a second object according to the first object. The receiving module 41 is further configured to receive a second input. The sending module 43 is configured to, in response to the second input received by the receiving module 41, send, through the second application, the second object obtained by the processing module 42. The object type of each object in the first object is a text type, a picture type, an audio type, or a video type, and the object type of each object in the second object is a text type, a picture type, an audio type, or a video type.
Optionally, the first input includes a first sub-input and a second sub-input; a processing module 42, specifically configured to respond to the first sub-input, to obtain a first object; processing the first object into a second object according to the target information in response to the second sub-input; wherein the target information is used to indicate at least one of: an object type of each of the second objects, and a number of objects of the second object.
Optionally, the processing module 42 is specifically configured to use the first object as the second object, where the number of objects of the first object is equal to the number of objects of the second object; or merging the objects in the first objects into a second object, wherein the number of the objects in the first object is greater than 1, and the number of the objects in the second object is 1.
Optionally, the processing module 42 is specifically configured to merge the objects in the first objects into the second objects according to the object type of each object in the first objects.
Optionally, in a case that the object types of all the objects in the first object are the first object type, the object type of the second object is the same as or different from the first object type; in the case where the first object includes an object of a different object type, the object type of the second object is a second object type; wherein the second object type is an object type of one of the first objects.
Optionally, when the second object type is a video type, one object of a third object type in the first object is used as the first frame image of the second object, and the third object type is a text type or a picture type.
Optionally, when the number of the objects of the first object is greater than 1, the objects in the first object are arranged according to a target layout, and the target layout includes at least one of a target arrangement order and a target arrangement layout.
Optionally, the first object includes a third object of a video type; the processing module 42 is specifically configured to obtain a third object by recording the target video in the first application, and obtain other objects in the first object except the third object; or, the target video is intercepted according to the video time axis to obtain a third object, and other objects except the third object in the first object are obtained.
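Purely as an illustrative reading of the module structure above, the following Kotlin sketch models the receiving, processing, and sending responsibilities as plain types; all names are assumptions, and TargetInfo refers to the earlier illustrative target-information sketch.

```kotlin
// Hedged structural sketch of electronic device 40 (Fig. 4); names are illustrative.
sealed class SharedObject {
    data class Text(val value: String) : SharedObject()
    data class Picture(val uri: String) : SharedObject()
    data class Audio(val uri: String) : SharedObject()
    data class Video(val uri: String) : SharedObject()
}

class ContentSharingDevice(
    // Processing module 42: derives the second object(s) from the first object(s).
    private val processingModule: (List<SharedObject>, TargetInfo) -> List<SharedObject>,
    // Sending module 43: forwards the second object(s) through the chosen second application.
    private val sendingModule: (List<SharedObject>, String) -> Unit
) {
    private var secondObjects: List<SharedObject> = emptyList()

    // Receiving module 41, first input: the user selects the first object(s) in the first application.
    fun onFirstInput(firstObjects: List<SharedObject>, targetInfo: TargetInfo) {
        secondObjects = processingModule(firstObjects, targetInfo)
    }

    // Receiving module 41, second input: the user picks the second application.
    fun onSecondInput(secondApplicationPackage: String) {
        sendingModule(secondObjects, secondApplicationPackage)
    }
}
```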
According to the electronic device provided by the embodiment of the invention, the second object can be triggered to be obtained according to the first object through the first input of the first object in the interface of the first application. The object type of each object in the first objects is a text type, a picture type, an audio type or a video type, and the object type of each object in the second objects can be a text type, a picture type, an audio type or a video type. That is, the electronic device may share an object of which an object type is a text type, a picture type, an audio type, or a video type in different applications, instead of sharing the object in a linked form. As such, the electronic device may present an object of which the object type is a text type, a picture type, an audio type, or a video type through the second application, instead of presenting a link including the object. Therefore, the user can directly view or operate the object shared among different applications, and does not need to click the link and then search the corresponding object. Furthermore, the sending user who shares the content can focus on the information part that the sending user wants to share, and other useless information is omitted; and the receiving user receives the information which the sending user wants to share through the electronic equipment, so that the intention of the sending user is intuitively reflected. Therefore, the accuracy of the electronic equipment for sharing the content among different applications is improved, and the convenience of the user for checking the content shared among different applications is improved.
The electronic device 40 provided in the embodiment of the present invention can implement each process implemented by the electronic device in the above-mentioned content sharing method embodiment, and for avoiding repetition, details are not described here.
Fig. 5 is a schematic diagram of a hardware structure of an electronic device 100 according to an embodiment of the present invention, where the electronic device 100 includes, but is not limited to: radio frequency unit 101, network module 102, audio output unit 103, input unit 104, sensor 105, display unit 106, user input unit 107, interface unit 108, memory 109, processor 110, and power supply 111. Those skilled in the art will appreciate that the electronic device configuration shown in fig. 5 does not constitute a limitation of the electronic device, and that the electronic device may include more or fewer components than shown, or some components may be combined, or a different arrangement of components. In the embodiment of the present invention, the electronic device includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted electronic device, a wearable device, a pedometer, and the like.
Wherein, the processor 110 is configured to control the user input unit 107 to receive a first input to a first object in the interface of the first application; a processor 110, further configured to obtain a second object from the first object in response to the first input received by the user input unit 107; a processor 110, further configured to control the user input unit 107 to receive a second input; the processor 110 is further configured to share the second object through the second application in response to a second input received by the user input unit 107; the object type of each object in the first objects is a text type, a picture type, an audio type or a video type, and the object type of each object in the second objects is a text type, a picture type, an audio type or a video type.
According to the electronic device provided by the embodiment of the invention, the second object can be triggered to be obtained according to the first object through the first input of the first object in the interface of the first application. The object type of each object in the first objects is a text type, a picture type, an audio type or a video type, and the object type of each object in the second objects can be a text type, a picture type, an audio type or a video type. That is, the electronic device may share an object of which an object type is a text type, a picture type, an audio type, or a video type in different applications, instead of sharing the object in a linked form. As such, the electronic device may present an object of which the object type is a text type, a picture type, an audio type, or a video type through the second application, instead of presenting a link including the object. Therefore, the user can directly view or operate the object shared among different applications, and does not need to click the link and then search the corresponding object. Furthermore, the sending user who shares the content can focus on the information part that the sending user wants to share, and other useless information is omitted; and the receiving user receives the information which the sending user wants to share through the electronic equipment, so that the intention of the sending user is intuitively reflected. Therefore, the accuracy of the electronic equipment for sharing the content among different applications is improved, and the convenience of the user for checking the content shared among different applications is improved.
It should be understood that, in the embodiment of the present invention, the radio frequency unit 101 may be used for receiving and sending signals during message transmission or a call; specifically, downlink data received from a base station is forwarded to the processor 110 for processing, and uplink data is sent to the base station. Typically, the radio frequency unit 101 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 101 can also communicate with a network and other devices through a wireless communication system.
The electronic device provides wireless broadband internet access to the user via the network module 102, such as assisting the user in sending and receiving e-mails, browsing web pages, and accessing streaming media.
The audio output unit 103 may convert audio data received by the radio frequency unit 101 or the network module 102 or stored in the memory 109 into an audio signal and output as sound. Also, the audio output unit 103 may also provide audio output related to a specific function performed by the electronic apparatus 100 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output unit 103 includes a speaker, a buzzer, a receiver, and the like.
The input unit 104 is used to receive an audio or video signal. The input unit 104 may include a Graphics Processing Unit (GPU) 1041 and a microphone 1042. The graphics processor 1041 processes image data of still pictures or video obtained by an image capturing device (e.g., a camera) in a video capturing mode or an image capturing mode. The processed image frames may be displayed on the display unit 106. The image frames processed by the graphics processor 1041 may be stored in the memory 109 (or another storage medium) or transmitted via the radio frequency unit 101 or the network module 102. The microphone 1042 may receive sound and process it into audio data. In a phone call mode, the processed audio data may be converted into a format that can be transmitted to a mobile communication base station via the radio frequency unit 101 and output.
The electronic device 100 also includes at least one sensor 105, such as a light sensor, a motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor and a proximity sensor: the ambient light sensor can adjust the brightness of the display panel 1061 according to the brightness of ambient light, and the proximity sensor can turn off the display panel 1061 and/or the backlight when the electronic device 100 is moved to the ear. As one type of motion sensor, an accelerometer sensor can detect the magnitude of acceleration in each direction (generally on three axes) and the magnitude and direction of gravity when stationary, and can be used to identify the posture of the electronic device (for example, switching between landscape and portrait modes, related games, and magnetometer posture calibration) and to perform vibration-recognition-related functions (such as a pedometer or tapping). The sensors 105 may also include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, and the like, which are not described in detail herein.
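Purely as a minimal sketch of the orientation use mentioned above, and not the device's actual sensor logic, the following shows how raw accelerometer readings might be classified as portrait, landscape or flat; the function name and the comparison rule are illustrative assumptions.

```kotlin
// Minimal sketch (assumed thresholds): classify device posture from one
// accelerometer sample, as the accelerometer description above suggests.
import kotlin.math.abs

enum class Orientation { PORTRAIT, LANDSCAPE, FLAT }

fun classifyOrientation(ax: Float, ay: Float, az: Float): Orientation = when {
    abs(az) > abs(ax) && abs(az) > abs(ay) -> Orientation.FLAT      // gravity mostly on the z axis
    abs(ay) >= abs(ax)                     -> Orientation.PORTRAIT  // gravity along the long edge
    else                                   -> Orientation.LANDSCAPE // gravity along the short edge
}
```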
The display unit 106 is used to display information input by a user or information provided to the user. The Display unit 106 may include a Display panel 1061, and the Display panel 1061 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like.
The user input unit 107 may be used to receive input numeric or character information and to generate key signal inputs related to user settings and function control of the electronic device. Specifically, the user input unit 107 includes a touch panel 1071 and other input devices 1072. The touch panel 1071, also referred to as a touch screen, may collect touch operations performed by a user on or near it (for example, operations performed by the user on or near the touch panel 1071 using a finger, a stylus, or any suitable object or accessory). The touch panel 1071 may include two parts: a touch detection device and a touch controller. The touch detection device detects the position of the user's touch and the signal generated by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into touch point coordinates, sends the coordinates to the processor 110, and receives and executes commands sent by the processor 110. The touch panel 1071 may be implemented in various types, such as resistive, capacitive, infrared, and surface acoustic wave. In addition to the touch panel 1071, the user input unit 107 may include other input devices 1072. Specifically, the other input devices 1072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys and switch keys), a trackball, a mouse, and a joystick, which are not described in detail herein.
Further, the touch panel 1071 may be overlaid on the display panel 1061. When the touch panel 1071 detects a touch operation on or near it, the touch operation is transmitted to the processor 110 to determine the type of the touch event, and the processor 110 then provides a corresponding visual output on the display panel 1061 according to the type of the touch event. Although in fig. 5 the touch panel 1071 and the display panel 1061 are two independent components implementing the input and output functions of the electronic device, in some embodiments the touch panel 1071 and the display panel 1061 may be integrated to implement the input and output functions of the electronic device, which is not limited herein.
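As an illustration of the touch path just described (touch detection device, touch controller, processor), the sketch below uses hypothetical names and is not the panel's real driver interface.

```kotlin
// Hypothetical sketch of the touch path: raw panel data is converted into
// touch point coordinates and handed to a processing callback.
data class TouchPoint(val x: Float, val y: Float)

class TouchController(private val onTouch: (TouchPoint) -> Unit) {
    // Called by the touch detection layer with raw panel coordinates.
    fun onRawTouch(rawX: Float, rawY: Float, scale: Float) {
        val point = TouchPoint(rawX * scale, rawY * scale)  // convert to display coordinates
        onTouch(point)                                      // forward to the processor side
    }
}
```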
The interface unit 108 is an interface for connecting an external device to the electronic apparatus 100. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 108 may be used to receive input (e.g., data information, power, etc.) from an external device and transmit the received input to one or more elements within the electronic apparatus 100 or may be used to transmit data between the electronic apparatus 100 and the external device.
The memory 109 may be used to store software programs as well as various data. The memory 109 may mainly include a program storage area and a data storage area, wherein the program storage area may store an operating system, an application program required by at least one function (such as a sound playing function or an image playing function), and the like; the data storage area may store data (such as audio data and a phonebook) created according to the use of the electronic device, and the like. Further, the memory 109 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
The processor 110 is a control center of the electronic device, connects various parts of the entire electronic device using various interfaces and lines, performs various functions of the electronic device and processes data by operating or executing software programs and/or modules stored in the memory 109 and calling data stored in the memory 109, thereby performing overall monitoring of the electronic device. Processor 110 may include one or more processing units; preferably, the processor 110 may integrate an application processor, which mainly handles operating systems, user interfaces, application programs, etc., and a modem processor, which mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor 110.
The electronic device 100 may further include a power source 111 (such as a battery) for supplying power to each component, and preferably, the power source 111 may be logically connected to the processor 110 through a power management system, so as to implement functions of managing charging, discharging, and power consumption through the power management system.
In addition, the electronic device 100 includes some functional modules that are not shown, and are not described in detail herein.
Optionally, an embodiment of the present invention further provides an electronic device, which includes a processor 110, a memory 109, and a computer program that is stored in the memory 109 and is executable on the processor 110, and when the computer program is executed by the processor 110, the processes of the content sharing method embodiment are implemented, and the same technical effect can be achieved, and in order to avoid repetition, details are not described here again.
The embodiment of the present invention further provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the computer program implements each process of the content sharing method embodiment, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here. The computer-readable storage medium may be a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, a content sharing method, an article, or an apparatus that comprises a list of elements includes not only those elements but also other elements not expressly listed or inherent to such process, content sharing method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other like elements in a process, content sharing method, article, or apparatus that comprises the element.
Through the above description of the embodiments, those skilled in the art can clearly understand that the content sharing method in the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation. Based on such understanding, the technical solutions of the present application may be embodied in the form of a software product, which is stored in a storage medium (e.g., ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling an electronic device (e.g., a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the content sharing method according to the embodiments of the present application.
While the present embodiments have been described with reference to the accompanying drawings, it is to be understood that the invention is not limited to the precise embodiments described above, which are meant to be illustrative and not restrictive, and that various changes may be made therein by those skilled in the art without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (18)

1. A method for sharing content, the method comprising:
receiving a first input to a first object in an interface of a first application;
in response to the first input, obtaining a second object from the first object;
receiving a second input;
sending, by a second application, the second object in response to the second input;
the object type of each object in the first objects is a text type, a picture type, an audio type or a video type, and the object type of each object in the second objects is a text type, a picture type, an audio type or a video type.
2. The method of claim 1, wherein the first input comprises a first sub-input and a second sub-input;
the obtaining, in response to the first input, a second object from the first object, comprising:
in response to the first sub-input, acquiring the first object;
processing the first object into the second object according to target information in response to the second sub-input;
wherein the target information is used to indicate at least one of: an object type of each of the second objects, a number of objects of the second object.
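The following Kotlin sketch is one possible reading of claims 2 and 3, under assumed names (TargetInfo, process, mergeIntoOne): the target information optionally specifies the object type and the number of the second objects, and the first objects are either kept as they are or merged into a single second object.

```kotlin
// Assumed names throughout; this is not the claimed implementation.
enum class ObjectType { TEXT, PICTURE, AUDIO, VIDEO }

data class TargetInfo(
    val objectType: ObjectType? = null,  // desired type of each second object, if specified
    val objectCount: Int? = null         // desired number of second objects, if specified
)

// Keep the first objects unchanged, or merge them into a single second object
// when the target information asks for exactly one object (claim 3).
fun process(first: List<Pair<ObjectType, Any>>, target: TargetInfo): List<Pair<ObjectType, Any>> =
    if (target.objectCount == 1 && first.size > 1)
        listOf(mergeIntoOne(first, target.objectType))
    else
        first

// Hypothetical merge: the resulting type is the requested type, or otherwise
// the type of one of the first objects (claim 5's "second object type").
fun mergeIntoOne(first: List<Pair<ObjectType, Any>>, requested: ObjectType?): Pair<ObjectType, Any> =
    (requested ?: first.first().first) to first.map { it.second }
```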
3. The method of claim 2, wherein the processing the first object into the second object according to the target information comprises:
taking the first object as the second object, wherein the number of the objects of the first object is equal to that of the objects of the second object;
or,
merging the objects in the first objects into the second objects, wherein the number of the objects in the first objects is greater than 1, and the number of the objects in the second objects is 1.
4. The method of claim 3, wherein merging the objects in the first object into the second object comprises:
merging the objects in the first objects into the second objects according to the object type of each object in the first objects.
5. The method of claim 4,
in a case where the object types of all the objects in the first object are a first object type, the object type of the second object is the same as or different from the first object type;
in the case that objects of different object types are included in the first object, the object type of the second object is a second object type;
wherein the second object type is an object type of one of the first objects.
6. The method according to claim 5, wherein, in a case where the second object type is a video type, one object of a third object type among the first objects is taken as a first frame image of the second object, and the third object type is a text type or a picture type.
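As a hedged sketch of the merging rule in claims 4 to 6, with illustrative types that are not part of the claimed implementation, the code below merges a mixed selection into a single video-type second object and places a text or picture object as its first frame.

```kotlin
// Assumed types; renderAsFrame() stands in for drawing text or a picture as a frame.
sealed class MediaObject {
    data class Text(val value: String) : MediaObject()
    data class Picture(val bytes: ByteArray) : MediaObject()
    data class Video(val frames: List<ByteArray>) : MediaObject()
}

fun renderAsFrame(obj: MediaObject): ByteArray = when (obj) {
    is MediaObject.Text -> obj.value.toByteArray()   // a real system would rasterize the text
    is MediaObject.Picture -> obj.bytes
    is MediaObject.Video -> obj.frames.first()
}

// Claim 6's case: the first objects contain at least one text or picture object
// (the "third object type"), and that object becomes the first frame of the
// merged video-type second object.
fun mergeIntoVideo(firstObjects: List<MediaObject>): MediaObject.Video {
    val cover = firstObjects.first { it is MediaObject.Text || it is MediaObject.Picture }
    val remainingFrames = firstObjects.filterIsInstance<MediaObject.Video>().flatMap { it.frames }
    return MediaObject.Video(frames = listOf(renderAsFrame(cover)) + remainingFrames)
}
```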
7. The method of claim 3, wherein in a case that the number of the first objects is greater than 1, the objects in the first objects are arranged according to a target layout, and the target layout comprises at least one of a target arrangement order and a target arrangement layout.
8. The method of claim 2, wherein the first object comprises a third object of a video type;
the acquiring the first object comprises:
obtaining the third object by recording the target video in the first application, and obtaining other objects except the third object in the first object;
or, the target video is intercepted according to a video time axis to obtain the third object, and other objects except the third object in the first object are obtained.
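The sketch below illustrates, with hypothetical helper names (recordVideo and trimVideo are not real platform APIs), the two ways claim 8 obtains the video-type third object: recording a target video inside the first application, or intercepting an existing target video along its time axis.

```kotlin
// Hypothetical helpers; neither recordVideo() nor trimVideo() is a real platform API.
data class VideoClip(val sourcePath: String, val startMs: Long, val endMs: Long)

interface VideoSource {
    fun recordVideo(): VideoClip                                        // record inside the first application
    fun trimVideo(path: String, startMs: Long, endMs: Long): VideoClip  // clip along the video time axis
}

// Obtain the third object either by recording a new target video or by
// intercepting an existing one between two points on its time axis.
fun obtainThirdObject(source: VideoSource, existingPath: String?, startMs: Long, endMs: Long): VideoClip =
    if (existingPath == null) source.recordVideo()
    else source.trimVideo(existingPath, startMs, endMs)
```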
9. An electronic device, characterized in that the electronic device comprises: the device comprises a receiving module, a processing module and a sending module;
the receiving module is used for receiving a first input of a first object in an interface of a first application;
the processing module is used for responding to the first input received by the receiving module and obtaining a second object according to the first object;
the receiving module is further used for receiving a second input;
the sending module is configured to send, through a second application, the second object obtained by the processing module in response to the second input received by the receiving module;
the object type of each object in the first objects is a text type, a picture type, an audio type or a video type, and the object type of each object in the second objects is a text type, a picture type, an audio type or a video type.
10. The electronic device of claim 9, wherein the first input comprises a first sub-input and a second sub-input;
the processing module is specifically configured to respond to the first sub-input and acquire the first object; processing the first object into the second object according to target information in response to the second sub-input;
wherein the target information is used to indicate at least one of: an object type of each of the second objects, a number of objects of the second object.
11. The electronic device according to claim 10, wherein the processing module is specifically configured to use the first object as the second object, and a number of objects of the first object is equal to a number of objects of the second object;
or,
merging the objects in the first objects into the second objects, wherein the number of the objects in the first objects is greater than 1, and the number of the objects in the second objects is 1.
12. The electronic device according to claim 11, wherein the processing module is specifically configured to merge the objects in the first objects into the second objects according to an object type of each of the first objects.
13. The electronic device of claim 12,
in a case where the object types of all the objects in the first object are a first object type, the object type of the second object is the same as or different from the first object type;
in the case that objects of different object types are included in the first object, the object type of the second object is a second object type;
wherein the second object type is an object type of one of the first objects.
14. The electronic device according to claim 13, wherein, in a case where the second object type is a video type, one of the first objects of a third object type is used as the first frame image of the second object, and the third object type is a text type or a picture type.
15. The electronic device of claim 11, wherein in a case that the number of objects of the first object is greater than 1, the objects in the first object are arranged according to a target layout, and the target layout comprises at least one of a target arrangement order and a target arrangement layout.
16. The electronic device of claim 10, wherein the first object comprises a third object of a video type;
the processing module is specifically configured to obtain the third object by recording the target video in the first application, and obtain other objects in the first object except the third object; or, the target video is intercepted according to a video time axis to obtain the third object, and other objects except the third object in the first object are obtained.
17. An electronic device comprising a processor, a memory, and a computer program stored on the memory and executable on the processor, the computer program, when executed by the processor, implementing the steps of the content sharing method according to any one of claims 1 to 8.
18. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the steps of the content sharing method according to any one of claims 1 to 8.
CN201911204952.4A 2019-11-29 2019-11-29 Content sharing method and electronic equipment Pending CN110888854A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201911204952.4A CN110888854A (en) 2019-11-29 2019-11-29 Content sharing method and electronic equipment
PCT/CN2020/131168 WO2021104268A1 (en) 2019-11-29 2020-11-24 Content sharing method, and electronic apparatus

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911204952.4A CN110888854A (en) 2019-11-29 2019-11-29 Content sharing method and electronic equipment

Publications (1)

Publication Number Publication Date
CN110888854A true CN110888854A (en) 2020-03-17

Family

ID=69749636

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911204952.4A Pending CN110888854A (en) 2019-11-29 2019-11-29 Content sharing method and electronic equipment

Country Status (2)

Country Link
CN (1) CN110888854A (en)
WO (1) WO2021104268A1 (en)

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105955607B (en) * 2016-04-22 2020-06-19 北京小米移动软件有限公司 Content sharing method and device
CN106489129A (en) * 2016-09-29 2017-03-08 北京小米移动软件有限公司 The method and device that a kind of content is shared
CN108012197A (en) * 2017-12-15 2018-05-08 广州酷狗计算机科技有限公司 The method, apparatus and storage medium of sharing video frequency file
CN109491632A (en) * 2018-10-30 2019-03-19 维沃移动通信有限公司 A kind of resource sharing method and terminal
CN109933259A (en) * 2019-02-28 2019-06-25 维沃移动通信有限公司 A kind of content share method and mobile terminal
CN109948102B (en) * 2019-03-27 2021-05-25 维沃移动通信有限公司 Page content editing method and terminal
CN110233929A (en) * 2019-04-25 2019-09-13 维沃移动通信有限公司 A kind of display control method and terminal device
CN110471895B (en) * 2019-07-29 2022-07-05 维沃移动通信有限公司 Sharing method and terminal device
CN110888854A (en) * 2019-11-29 2020-03-17 维沃移动通信有限公司 Content sharing method and electronic equipment

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090304346A1 (en) * 2008-06-06 2009-12-10 Disney Enterprises, Inc. Messaging with user generated content related to video object timecode
CN102546835A (en) * 2012-03-08 2012-07-04 腾讯科技(深圳)有限公司 Method for sharing contents, terminal, server and system
CN103294748A (en) * 2013-01-22 2013-09-11 北京旭宁信息技术有限公司 Method for excerpting and editing Internet contents
CN110489031A (en) * 2019-07-26 2019-11-22 维沃移动通信有限公司 Content display method and terminal device

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
炎林数码 (Yanlin Digital): "How to post a video edited in Jianying to Douyin", HTTPS://JINGYAN.BAIDU.COM/ARTICLE/FA4125AC821E5D68AC7092E3.HTML *
百度经验 (Baidu Experience): "How to add text in Jianying", HTTPS://JINGYAN.BAIDU.COM/ARTICLE/8065F87F8FEF5D62312498E2.HTML *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2021104268A1 (en) * 2019-11-29 2021-06-03 维沃移动通信有限公司 Content sharing method, and electronic apparatus
CN114419527A (en) * 2022-04-01 2022-04-29 腾讯科技(深圳)有限公司 Data processing method, data processing equipment and computer readable storage medium
CN114419527B (en) * 2022-04-01 2022-06-14 腾讯科技(深圳)有限公司 Data processing method, equipment and computer readable storage medium

Also Published As

Publication number Publication date
WO2021104268A1 (en) 2021-06-03

Similar Documents

Publication Publication Date Title
CN111061574B (en) Object sharing method and electronic device
WO2020258929A1 (en) Folder interface switching method and terminal device
US11658932B2 (en) Message sending method and terminal device
US20220300302A1 (en) Application sharing method and electronic device
CN109614061B (en) Display method and terminal
WO2020238351A1 (en) Application downloading and classification method and terminal device
WO2021129536A1 (en) Icon moving method and electronic device
WO2020192299A1 (en) Information display method and terminal device
EP3699743B1 (en) Image viewing method and mobile terminal
CN111026299A (en) Information sharing method and electronic equipment
CN110908554B (en) Long screenshot method and terminal device
CN109828731B (en) Searching method and terminal equipment
CN109358931B (en) Interface display method and terminal
WO2020199783A1 (en) Interface display method and terminal device
CN111142747A (en) Group management method and electronic equipment
CN110865745A (en) Screen capturing method and terminal equipment
CN111273993B (en) Icon arrangement method and electronic equipment
WO2020215969A1 (en) Content input method and terminal device
CN111143299A (en) File management method and electronic equipment
CN111026350A (en) Display control method and electronic equipment
WO2021169954A1 (en) Search method and electronic device
CN110752981A (en) Information control method and electronic equipment
CN111383175A (en) Picture acquisition method and electronic equipment
CN111090529B (en) Information sharing method and electronic equipment
CN109992192B (en) Interface display method and terminal equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication
Application publication date: 20200317