CN117950767A - Content sharing method, system, electronic equipment and medium - Google Patents


Info

Publication number
CN117950767A
CN117950767A
Authority
CN
China
Prior art keywords
electronic device
control
window
user
application
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211295723.XA
Other languages
Chinese (zh)
Inventor
陈才龙
吴启明
田旭杨
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Priority claimed from CN202211295723.XA
Published as CN117950767A
Legal status: Pending

Abstract

The application relates to the field of communications technology and discloses a content sharing method, system, electronic device, and medium. The method comprises the following steps: after a first electronic device receives, from a second electronic device, a first operation in which a user long-presses a first control in a second window, the first electronic device monitors the input track that begins with the first operation; the second window is an application window projected by the second electronic device onto the first electronic device. When the first electronic device detects a movement track, input by the user, from the first position where the first control is located to a second position outside the second window, and no drag event corresponding to the first control has been acquired, it displays prompt information indicating that the application containing the first control does not support sharing the first object corresponding to the first control by dragging the control. With this scheme, the electronic device can issue a prompt when a file cannot be shared by dragging, so that the user understands why the drag failed, which improves the user experience.

Description

Content sharing method, system, electronic equipment and medium
Technical Field
The present application relates to the field of communications technologies, and in particular, to a content sharing method, a system, an electronic device, and a medium.
Background
With the development of electronic technology and the mobile Internet, a user may own several terminal devices at once, such as a mobile phone and a tablet computer. These terminal devices can be connected wirelessly or by wire so that they can be used cooperatively. However, some applications do not support sharing objects such as pictures and files across devices by dragging the controls that correspond to those objects. In that case, if the user drags such a control, the terminal device cannot respond to the drag action, and the user experience suffers.
Disclosure of Invention
In order to solve the above problems, the present application provides a content sharing method, a system, an electronic device, and a medium.
In a first aspect, the application provides a content sharing method applied to a first electronic device. The first electronic device displays a first window and a second window, where the second window is an application window projected by a second electronic device onto the first electronic device. The method comprises the following steps: after the first electronic device receives, from the second electronic device, a first operation in which the user long-presses a first control in the second window, monitoring the input track that begins with the first operation; when the first electronic device detects a movement track, input by the user, from the first position where the first control is located to a second position outside the second window, and no drag event corresponding to the first control has been acquired, displaying prompt information indicating that the application containing the first control does not support sharing the first object corresponding to the first control by dragging the control.
With this scheme, when the application containing the dragged first control does not support sharing the corresponding first object by dragging the control, the first electronic device can issue a prompt, so that the user understands why the drag failed, which improves the user experience. Moreover, by combining the drag track, the drag event, and the long-press event (that is, the first operation), the embodiments of the application can accurately identify the user's intention and avoid misjudgment.
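As an illustration only, the three-signal check described above (long-press event received, input track leaving the window, absence of a drag event) can be sketched as follows. The function name and signature are hypothetical and not part of the patent:

```python
def should_show_prompt(long_press_received: bool,
                       track_left_window: bool,
                       drag_event_received: bool) -> bool:
    """Decide whether the first electronic device should display the
    'application does not support drag sharing' prompt. All three
    signals must align: the long-press (first operation) arrived from
    the second device, the user's input track moved from the control's
    position to a position outside the second window, and no drag
    event for the control was acquired."""
    return long_press_received and track_left_window and not drag_event_received
```

Requiring all three signals together is what lets the scheme distinguish a genuine failed drag attempt from, say, an ordinary long press, avoiding the misjudgment mentioned above.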
In the application, the first control may be a control such as a file icon or a thumbnail, and the object corresponding to the first control may be the file represented by the file icon, the picture represented by the thumbnail, and so on.
In the application, if the first electronic device does receive a drag event while monitoring the user-input movement track from the first position of the first control to the second position outside the second window, then, upon detecting the position where the user releases the drag, it can store the object corresponding to the first control in a set storage location and display it at the release position.
The drag event may include a file name, a file size, a file path, a drag shadow, and the like of an object corresponding to the first control.
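The contents of such a drag event can be pictured as a simple record. The field names below are illustrative assumptions, since the description only lists the kinds of information carried:

```python
from dataclasses import dataclass

@dataclass
class DragEvent:
    """Information a drag event may carry about the dragged object
    (field names are illustrative, not from the patent)."""
    file_name: str      # e.g. "photo.jpg"
    file_size: int      # size in bytes
    file_path: str      # path of the object on the source device
    drag_shadow: bytes  # image rendered under the pointer while dragging

# Hypothetical example of the record a second device might populate:
event = DragEvent("photo.jpg", 2_048_576, "/DCIM/photo.jpg", b"")
```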
In one possible implementation, the second position is any position outside the second window.
It can be understood that, in the application, when the first electronic device detects the movement track from the first position where the first control is located to any position outside the second window and no drag event has been received, the prompt is displayed.
In one possible implementation, the first window is an application window of the first electronic device; the second position is any position of the first window.
It can be understood that, in the application, when the first electronic device detects the movement track from the first position where the first control is located to any position in the first window and no drag event has been received, the prompt is displayed.
In one possible implementation, the first electronic device detecting a movement track, input by the user, from the first position where the first control is located to the second position outside the second window includes: the first electronic device detecting that the mouse moves from the first position where the first control is located to the second position outside the second window, and determining that such a movement track has been detected provided the mouse button remains pressed throughout the move.
In one possible implementation, the first electronic device detecting a movement track, input by the user, from the first position where the first control is located to the second position outside the second window includes: the first electronic device detecting that the user's touch track moves from the first position where the first control is located to the second position outside the second window, and determining that such a movement track has been detected provided the touch contact remains unbroken throughout the move.
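The two implementations above (mouse held down, or touch contact unbroken) share the same shape and can be sketched with one check over input samples. This is a minimal sketch under assumed data shapes, not the patent's implementation:

```python
def is_move_out_track(samples, window_rect):
    """Check whether a sequence of input samples forms the movement
    track described above. Each sample is (x, y, pressed), where
    'pressed' means the mouse button is held (or the touch contact is
    unbroken). window_rect is (left, top, right, bottom) of the second
    window. The track qualifies when it starts inside the window,
    ends outside it, and stays pressed/continuous the whole way."""
    if not samples or not all(p for _, _, p in samples):
        return False
    left, top, right, bottom = window_rect

    def inside(x, y):
        return left <= x <= right and top <= y <= bottom

    x0, y0, _ = samples[0]
    x1, y1, _ = samples[-1]
    return inside(x0, y0) and not inside(x1, y1)
```

A release (or lifted finger) partway through invalidates the track, which matches the requirement that the pressed state hold "all the time in the moving-out process".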
In a second aspect, the application provides a content sharing method applied to a second electronic device, where the second electronic device is connected to a first electronic device. The method comprises the following steps: the second electronic device determines that a first control in a second window has been long-pressed, and sends to the first electronic device a first operation of long-pressing the first control; the second window is an application window projected by the second electronic device onto the first electronic device; when the second electronic device determines that the application containing the first control does not support sharing the first object corresponding to the first control by dragging the control, it does not send a drag event corresponding to the first control to the first electronic device.
In the application, the second electronic device can monitor the long-press event (that is, the first operation) in real time. After detecting the first operation, if it determines that the application containing the first control does not support sharing the first object by dragging the control, it withholds the drag event, which is how the first electronic device learns that such sharing is unsupported. Conversely, when it determines that the application does support sharing the first object by dragging the control, it sends the drag event to the first electronic device, so that the first electronic device executes the subsequent drag flow.
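The second device's dispatch decision can be sketched as below; `app_supports_drag_share` stands in for whatever capability check the application exposes and is an assumption of this illustration:

```python
def events_to_send(app_supports_drag_share: bool, drag_event):
    """After the long press (first operation) has already been sent,
    decide what else the second electronic device transmits: the drag
    event is forwarded only when the control's application supports
    sharing the object by dragging. Withholding it is precisely how
    the first device learns that drag sharing is unsupported."""
    return [drag_event] if app_supports_drag_share else []
```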
In one possible implementation, the second electronic device determining that the first control in the second window has been long-pressed includes: the second electronic device obtaining, from the first electronic device, position information corresponding to the user's operation on the second window; and the second electronic device determining, based on the position information, that the first control in the second window has been long-pressed.
In the application, the position information may be coordinate information. When the second electronic device receives, from the first electronic device, coordinate information that stays on the first control continuously for a set time, it can determine that the first control has been long-pressed.
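A minimal sketch of that long-press determination follows. The threshold and tolerance values are illustrative assumptions; the patent only speaks of "a set time":

```python
LONG_PRESS_MS = 500  # illustrative "set time" threshold
TOUCH_SLOP = 10      # illustrative positional tolerance, in pixels

def is_long_press(samples):
    """samples: list of (timestamp_ms, x, y) coordinate reports the
    second device receives from the first device. A long press is
    detected when the reported position stays on (near) its initial
    position continuously for at least the set time."""
    if not samples:
        return False
    t0, x0, y0 = samples[0]
    if any(abs(x - x0) > TOUCH_SLOP or abs(y - y0) > TOUCH_SLOP
           for _, x, y in samples):
        return False
    return samples[-1][0] - t0 >= LONG_PRESS_MS
```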
In a third aspect, the application provides a content sharing method in which a first electronic device and a second electronic device establish a connection. The first electronic device displays a first window and a second window, where the second window is an application window projected by the second electronic device onto the first electronic device. The second electronic device detects a first operation in which the user long-presses a first control in the second window, and sends the first operation to the first electronic device. After receiving the first operation, the first electronic device monitors the input track that begins with the first operation. When the first electronic device detects a movement track, input by the user, from the first position where the first control is located to a second position outside the second window, and no drag event corresponding to the first control has been acquired, it displays prompt information indicating that the application containing the first control does not support sharing the first object corresponding to the first control by dragging the control.
In one possible implementation, the second position is any position outside the second window.
In one possible implementation, the first window is an application window of the first electronic device; the second position is any position of the first window.
In one possible implementation, the first electronic device detecting a movement track, input by the user, from the first position where the first control is located to the second position outside the second window includes: the first electronic device detecting that the mouse moves from the first position where the first control is located to the second position outside the second window, and determining that such a movement track has been detected provided the mouse button remains pressed throughout the move.
In one possible implementation, the first electronic device detecting a movement track, input by the user, from the first position where the first control is located to the second position outside the second window includes: the first electronic device detecting that the user's touch track moves from the first position where the first control is located to the second position outside the second window, and determining that such a movement track has been detected provided the touch contact remains unbroken throughout the move.
In one possible implementation, the second electronic device detecting a first operation in which the user long-presses a first control in the second window includes: the second electronic device obtaining, from the first electronic device, position information corresponding to the user's operation in the second window; and the second electronic device determining, according to the position information, that the first control has been long-pressed, and thus that the first operation on the first control in the second window has been detected.
In a fourth aspect, the application provides a content sharing method applied to an electronic device. The electronic device displays a first window and a second window, where the first window is a first application window of the electronic device and the second window is a second application window of the electronic device. The method comprises the following steps: after the electronic device detects a first operation in which the user long-presses a first control in the first window, monitoring the input track that begins with the first operation; when the electronic device detects a movement track, input by the user, from the first position where the first control is located to a second position outside the first window, and determines that no drag event corresponding to the first control has been detected, displaying prompt information indicating that the application containing the first control does not support sharing the first object corresponding to the first control by dragging the control.
In the application, the first window and the second window may be windows of the same application or windows of different applications. That is, the first application and the second application may be the same application or different applications.
With this scheme, in a single-device scenario, when the application containing the dragged first control does not support sharing the corresponding first object by dragging the control, the electronic device can issue a prompt, so that the user understands why the drag failed, which improves the user experience. Moreover, by combining the drag track, the drag event, and the long-press event (that is, the first operation), the embodiments of the application can accurately identify the user's intention and avoid misjudgment.
In the application, if the electronic device does receive a drag event while detecting the user-input movement track from the first position of the first control to the second position outside the second window, then, upon detecting the position where the user releases the drag, it can store the object corresponding to the first control in a set storage location and display it at the release position.
The drag event may include a file name, a file size, a file path, a drag shadow, and the like of an object corresponding to the first control.
In one possible implementation, the second position is any position outside the first window.
In one possible implementation, the second position is any position of the second window.
In one possible implementation, the electronic device detecting a movement track, input by the user, from the first position where the first control is located to the second position outside the second window includes: the electronic device detecting that the mouse moves from the first position where the first control is located to the second position outside the second window, and determining that such a movement track has been detected provided the mouse button remains pressed throughout the move.
In one possible implementation, the electronic device detecting a movement track, input by the user, from the first position where the first control is located to the second position outside the second window includes: the electronic device detecting that the user's touch track moves from the first position where the first control is located to the second position outside the second window, and determining that such a movement track has been detected provided the touch contact remains unbroken throughout the move.
In one possible implementation, the first window is a floating window and/or the second window is a floating window.
In the application, one of the first window and the second window may be a floating window, or both may be floating windows.
In a fifth aspect, the application provides a content sharing method applied to a first electronic device, where the first electronic device is connected to a second electronic device, the first electronic device displays a first window, and the second electronic device displays a second window. The method comprises the following steps: when the first electronic device receives, from the second electronic device, the position information of a second position outside the display screen of the second electronic device together with the movement track from the first position of a first control in the second window to the second position, and does not receive a drag event corresponding to the first control, it displays prompt information indicating that the application containing the first control does not support sharing the first object corresponding to the first control by dragging the control.
It can be appreciated that when the mouse pointer (or a user's touch track) moves off the display screen of the second electronic device, it can traverse onto the display screen of the first electronic device.
That is, in the application, a movement track from the first position of the first control in the second window to the second position may refer to the user's input operation (for example, a mouse or touch track) from the first position of the first control to the position where it leaves the display screen of the second electronic device (that is, the second position), or to the user's input operation from the first position of the first control to the position where it enters the display screen of the first electronic device (that is, the second position).
With this scheme, in a keyboard-and-mouse traversal scenario, when the application containing the dragged first control does not support sharing the corresponding first object by dragging the control, the electronic device can issue a prompt, so that the user understands why the drag failed, which improves the user experience. Moreover, by combining the drag track, the drag event, and the long-press event (that is, the first operation), the embodiments of the application can accurately identify the user's intention and avoid misjudgment.
In the application, when the second electronic device detects a movement track, input by the user, from the first position of the first control to a second position outside its display screen, and determines that the application containing the dragged first control does support sharing the first object by dragging the control, it can send the coordinate information of the second position, the movement track, and the drag event to the first electronic device, so that the first electronic device executes the subsequent drag flow.
In the application, when the second electronic device detects such a movement track and determines that the application containing the dragged first control does not support sharing the first object by dragging the control, it can send only the coordinate information (position information) of the second position and the movement track to the first electronic device, so that the first electronic device can display the prompt after receiving the coordinate information. For example, as soon as the mouse moves off the display screen of the second electronic device and enters the display screen of the first electronic device, that is, reaches the second position, the first electronic device determines that only coordinate information has been received and no drag event, and can therefore display the prompt.
In the application, when the first electronic device receives the position information and the movement track from the first position where the first control is located to the second position outside the display screen of the second electronic device, it can check whether a drag event has been received. If no drag event has been received, it can conclude that the user performed a drag with a genuine drag intention, but that the application containing the first control on the second electronic device does not support sharing the first object by dragging the control. The first electronic device then displays prompt information to tell the user so.
In the application, when the first electronic device receives the position information and the movement track but no drag event, it may display the prompt information once it further detects a user-input track that moves into the first window.
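The first-device decision in this traversal scenario can be sketched as follows. The function name is hypothetical, and the `entered_first_window` condition models the variant just described, in which the device waits for the continuing track to move into the first window:

```python
def traversal_should_prompt(position_received: bool,
                            track_received: bool,
                            drag_event_received: bool,
                            entered_first_window: bool) -> bool:
    """First-device decision in the pointer-traversal scenario: the
    prompt is displayed once the second position's coordinates and the
    move-out track have arrived without an accompanying drag event,
    and (in this variant) the continuing track has entered the first
    window."""
    return (position_received and track_received
            and not drag_event_received and entered_first_window)
```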
The drag event may include a file name, a file size, a file path, a drag shadow, and the like of an object corresponding to the first control.
In a sixth aspect, the application provides a content sharing method in which a first electronic device is connected to a second electronic device, the first electronic device displays a first window, and the second electronic device displays a second window. When the second electronic device detects a movement track, input by the user, from the first position of a first control in the second window to a second position outside the display screen of the second electronic device, it sends the position information of the second position and the movement track to the first electronic device. When the first electronic device receives the position information of the second position and the movement track sent by the second electronic device and does not receive a drag event corresponding to the first control, the first electronic device displays prompt information indicating that the application containing the first control does not support sharing the first object corresponding to the first control by dragging the control.
In a seventh aspect, the application provides a content sharing system comprising a first electronic device and a second electronic device, where the first electronic device displays a first window and a second window, and the second window is an application window projected by the second electronic device onto the first electronic device. The second electronic device is configured to detect a first operation in which the user long-presses a first control in the second window and to send the first operation to the first electronic device. The first electronic device is configured to monitor, after receiving the first operation, the input track that begins with the first operation; and, when it detects a movement track, input by the user, from the first position where the first control is located to a second position outside the second window without having acquired a drag event corresponding to the first control, to display prompt information indicating that the application containing the first control does not support sharing the first object corresponding to the first control by dragging the control.
In one possible implementation, the first electronic device includes a first system application, and the second electronic device includes a control module and a second system application. The control module is configured to detect the first operation in which the user long-presses the first control in the second window and to send the first operation to the second system application. The second system application is configured to send the first operation to the first system application. The first system application is configured to monitor, after receiving the first operation, the input track that begins with the first operation; and, when it detects a movement track, input by the user, from the first position where the first control is located to a second position outside the second window without having acquired a drag event corresponding to the first control, to display prompt information indicating that the application containing the first control does not support sharing the first object corresponding to the first control by dragging the control.
In an eighth aspect, the application provides a content sharing system comprising a first electronic device and a second electronic device, where the first electronic device is connected to the second electronic device, the first electronic device displays a first window, and the second electronic device displays a second window. The second electronic device is configured to monitor, after detecting a first operation on a first control in the second window, the input track that begins with the first operation; and, upon detecting a movement track, input by the user, from the first position of the first control in the second window to a second position outside the display screen of the second electronic device, to send the position information of the second position and the movement track to the first electronic device. The first electronic device is configured to display, when it receives the position information of the second position and the movement track sent by the second electronic device without receiving a drag event corresponding to the first control, prompt information indicating that the application containing the first control does not support sharing the first object corresponding to the first control by dragging the control.
In a ninth aspect, the application provides an electronic device, namely a first electronic device, which displays a first window and a second window, where the second window is an application window projected by a second electronic device onto the first electronic device. The first electronic device is configured to monitor, after receiving from the second electronic device a first operation in which the user long-presses a first control in the second window, the input track that begins with the first operation;
and, when it detects a movement track, input by the user, from the first position where the first control is located to a second position outside the second window without having acquired a drag event corresponding to the first control, to display prompt information indicating that the application containing the first control does not support sharing the first object corresponding to the first control by dragging the control.
In a tenth aspect, the application provides an electronic device, namely a second electronic device. The second electronic device is configured to determine that a first control in a second window has been long-pressed and to send to the first electronic device a first operation in which the user long-presses the first control in the second window; the second window is an application window projected by the second electronic device onto the first electronic device. The second electronic device is further configured not to send a drag event corresponding to the first control to the first electronic device when it determines that the application containing the first control does not support sharing the first object corresponding to the first control by dragging the control.
In an eleventh aspect, the application provides an electronic device comprising a memory and a processor. The memory stores instructions to be executed by one or more processors of the electronic device; the processor, which is one of the one or more processors of the electronic device, is configured to execute the content sharing method.
In a twelfth aspect, the present application provides a readable storage medium having stored thereon instructions that, when executed on an electronic device, cause the electronic device to perform the content sharing method mentioned in the present application.
Drawings
FIG. 1a illustrates a schematic view of a screen-projection scenario, according to some embodiments of the application;
FIG. 1b illustrates a schematic view of a scenario of content sharing, according to some embodiments of the application;
FIG. 1c illustrates a schematic view of a scenario of content sharing, according to some embodiments of the application;
FIG. 2 illustrates a schematic view of a scenario in which drag is not supported, according to some embodiments of the application;
FIG. 3 illustrates a schematic view of a scenario in which drag is not supported, according to some embodiments of the application;
FIG. 4a illustrates a schematic view of a scenario in which drag is not supported, according to some embodiments of the application;
FIG. 4b illustrates a schematic view of a scenario in which drag is not supported, according to some embodiments of the application;
FIG. 5 is a schematic diagram illustrating a page display when dragging is not possible in a screen-projection scenario, according to some embodiments of the present application;
FIG. 6 is a schematic diagram illustrating a page display when dragging is not possible in a mouse-traversing scenario, according to some embodiments of the present application;
FIG. 7 is a schematic diagram of a page display in a single device scenario when dragging is disabled, according to some embodiments of the application;
FIG. 8a illustrates a software architecture diagram of an electronic device, according to some embodiments of the application;
FIG. 8b illustrates a software architecture diagram of an electronic device, according to some embodiments of the application;
FIG. 8c is a schematic diagram illustrating a content sharing method according to some embodiments of the application;
FIG. 9 is a schematic flow chart of a content sharing method in a screen-projection scenario according to some embodiments of the present application;
FIG. 10 is a schematic flow chart of a content sharing method in a screen-projection scenario according to some embodiments of the present application;
FIG. 11 is a flow chart illustrating a method for content sharing in a single device scenario according to some embodiments of the present application;
FIG. 12a is a schematic diagram illustrating a page display in a single device scenario when dragging is not possible, according to some embodiments of the application;
FIG. 12b is a flow chart of a content sharing method in a single device scenario according to some embodiments of the present application;
FIG. 13 is a schematic flow chart of a content sharing method in a mouse-traversing scenario according to some embodiments of the present application;
FIG. 14 is a schematic flow chart of a content sharing method in a mouse-traversing scenario according to some embodiments of the present application;
FIG. 15 is a schematic diagram showing a hardware structure of an electronic device according to some embodiments of the present application.
Detailed Description
Illustrative embodiments of the application include, but are not limited to, a content sharing method, system, electronic device, and medium.
The following first briefly describes a scenario of content sharing among multiple devices.
In some embodiments, as shown in fig. 1a, a multi-screen collaboration relationship is established between the mobile phone 100 and the computer 200, and a second window 201 may be displayed on the computer 200, where the second window 201 is used to display an application interface of the mobile phone 100. The mobile phone 100 currently displays an interface of a first communication application, and this communication application supports drag behavior, that is, it supports sharing objects such as files and pictures corresponding to a control by dragging the control in the application. For example, the user may share the picture corresponding to the thumbnail 202 in the communication application to the computer 200 by dragging the thumbnail 202 with a mouse. As shown in fig. 1b, when the user long-presses the thumbnail 202 with the mouse, a draggable shadow appears at the mouse arrow to indicate that the thumbnail can be dragged. For example, the user may drag the thumbnail 202 to the releasable area 203 on the desktop and then release it, as shown in fig. 1c. If some pictures in the application cannot be dragged because they have not been downloaded, the application may prompt the reason: for example, as shown in fig. 2, the application may display the prompt message "the file is not downloaded and does not support dragging" at the top of the communication application window. If the user drags the thumbnail 202 to an area where it cannot be released, such as the area 204 in a personal computer (PC) manager page, as shown in fig. 3, the mouse pointer is displayed as a no-drag mark to indicate to the user that the thumbnail 202 cannot be dropped in that area.
There are some applications that do not support sharing the object corresponding to a control by dragging the control. For example, as shown in fig. 4a, when the user long-presses a thumbnail 202 in another, second communication application that does not support dragging, no drag shadow appears at the mouse arrow. And as shown in fig. 4b, regardless of whether the user moves the mouse and releases it in the releasable area 203 or in the non-releasable area 204, the picture corresponding to the thumbnail 202 is not shared, and the application gives no prompt information. As a result, the user may not know that the application does not support dragging and may instead assume that some system fault on the computer or mobile phone is preventing the drag, so the user experience is poor.
In order to solve the above technical problems, an embodiment of the present application provides a content sharing method, which is applied to a first electronic device and a second electronic device, where the first electronic device is assumed to be a data dragging-in end (destination end) device, and the second electronic device is assumed to be a data dragging-out end (data source end) device. The first electronic device displays a first window and a second window, wherein the second window is an application window which is projected onto the first electronic device by the second electronic device. The first window may be a display window of a desktop application of the first electronic device, or may be a display window of another application of the first electronic device.
The content sharing method may include: the second electronic device monitors a long-press event and sends the long-press event to the first electronic device. The long-press event may be a first operation in which the user long-presses a first control in the second window, for example, an operation in which the user presses a first control (such as a file icon or a thumbnail) in the second window for longer than a set time.
After receiving the long-press event, the first electronic device continuously monitors the user's drag track (the user input track). When it detects that the drag track runs from the first position where the first control is located to a second position outside the second window, it further determines whether a drag event sent by the second electronic device has been received. If no drag event has been received, the user clearly dragged and therefore had a drag intention, but the application where the first control is located on the second electronic device does not support sharing the first object corresponding to the first control by dragging the first control. In that case, the first electronic device displays prompt information to inform the user that the application where the first control is located on the second electronic device does not support sharing the first object corresponding to the first control by dragging the first control.
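The destination-side flow just described — monitor the track after the long-press arrives, and once the track leaves the projected window decide between accepting the drop and showing the prompt — can be sketched as a small model. This is illustrative Python only; the class and method names are invented for the sketch and are not part of the patent:

```python
class DragSession:
    """Models the destination (drag-in) device's handling of one cross-device drag.

    The session starts when the source device forwards a long-press on a control.
    It then watches the input track; if the track leaves the projected window but
    no drag event ever arrived, the source app does not support dragging and a
    prompt should be shown.
    """

    def __init__(self, window_rect, control_pos):
        self.window_rect = window_rect  # (x, y, width, height) of the projected second window
        self.control_pos = control_pos  # first position: where the first control sits
        self.drag_event = None          # payload sent by the source if the app supports dragging

    def on_drag_event(self, payload):
        self.drag_event = payload

    def _inside_window(self, pos):
        x, y, w, h = self.window_rect
        px, py = pos
        return x <= px < x + w and y <= py < y + h

    def on_track_point(self, pos):
        """Returns an action once the track reaches a position outside the window."""
        if self._inside_window(pos):
            return None                 # still inside the projected window: keep monitoring
        if self.drag_event is None:
            return "show_prompt"        # drag intent, but the app never produced a drag event
        return "accept_drop"            # normal cross-device drag can proceed
```

For instance, a track that exits a 400x600 projected window without any drag event having arrived yields `"show_prompt"`, while the same track after `on_drag_event(...)` yields `"accept_drop"`.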
Based on the scheme, when the application where the first control dragged by the user is located does not support sharing of the first object corresponding to the first control in the mode of dragging the first control, the first electronic device can prompt, so that the user can know the reason that the user cannot drag conveniently, and user experience is improved. In addition, the embodiment of the application can accurately identify the user intention by combining the dragging track, the dragging event and the long-press event, and avoid the occurrence of misjudgment.
The drag event may include a file name, a file size, a file path, a drag shadow, and the like of an object corresponding to the first control.
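Such a drag event could be modeled as a small record; a minimal sketch, in which the field names and types are assumptions based on the list above rather than a format defined by the patent:

```python
from dataclasses import dataclass


@dataclass
class DragEvent:
    """Illustrative payload a source device might attach to a drag event."""
    file_name: str  # name of the object behind the dragged control
    file_size: int  # size in bytes
    file_path: str  # path of the object on the source device
    shadow: bytes   # drag shadow rendered under the pointer while dragging


# A hypothetical event for a picture behind a thumbnail control:
event = DragEvent("photo.jpg", 204800, "/sdcard/DCIM/photo.jpg", b"\x89PNG")
```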
In some embodiments, when the first electronic device detects that the drag track runs from the first position where the first control is located to a second position outside the second window and then receives a drag event, it may store the object corresponding to the first control in a set storage location and, when the user's drag-release position is detected, display the object at that release position.
It will be appreciated that the second position may be any position outside the second window, and may also be any position in the first window. For example, if the first window is a communication application window opened on the computer, the second position may be any position in that communication application window, or the position at which the mouse or the user's touch track first moves out of the second window.
It can be understood that, in the embodiment of the present application, if the user drags the first control with a mouse, the first electronic device may monitor that the mouse moves from the first position where the first control is located to a second position outside the second window while remaining pressed throughout, and thereby determine that the user's drag track runs from the first position where the first control is located to the second position outside the second window.
In some embodiments, the user may drag the first control through a touch screen (e.g., the touch screen of a mobile phone or tablet computer) or a touch area (e.g., the touch area of a notebook computer). Correspondingly, the first electronic device may determine that the user's drag track runs from the first position where the first control is located to the second position outside the second window by monitoring that the user's touch track moves from the first position to the second position while remaining continuous throughout.
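For both mouse and touch input, the check above reduces to "the press stayed continuous while the pointer moved from inside the window to outside it". A minimal sketch of that test, where the helper name and the `(x, y, pressed)` sample format are invented for illustration:

```python
def is_drag_out(track, window_rect):
    """track: list of (x, y, pressed) input samples from the long-press onward.

    Returns True when the input moved from inside the window to outside it
    while the mouse button / touch contact stayed down the whole time.
    """
    x, y, w, h = window_rect
    inside = lambda px, py: x <= px < x + w and y <= py < y + h
    if not track or not inside(*track[0][:2]):
        return False            # must start at the control, inside the window
    left_window = False
    for px, py, pressed in track:
        if not pressed:
            return left_window  # released: a drag-out only if we already exited
        if not inside(px, py):
            left_window = True
    return left_window
```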
It can be appreciated that after the second electronic device monitors the long press event, if it is determined that the application where the first control is located does not support sharing of the first object corresponding to the first control by dragging the first control, the dragging event corresponding to the first control is not sent to the first electronic device. And if the application where the first control is located is determined to support sharing of the first object corresponding to the first control in a mode of dragging the first control, sending a dragging event corresponding to the first control to the first electronic device.
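The source-side rule — forward a drag event only when the app supports drag sharing — amounts to a single branch. A minimal sketch, in which the `send` callable and the message shape are invented stand-ins for the first communication framework:

```python
def on_long_press(app_supports_drag, send):
    """Source (drag-out) device: after a long-press on a control, only apps
    that support drag sharing cause a drag event to be sent to the peer.
    `send` is a callable standing in for the communication framework."""
    if app_supports_drag:
        send({"type": "drag_event"})
        return "drag_event_sent"
    # No event is sent; the peer detects the missing event and shows a prompt.
    return "no_drag_event"
```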
It may be appreciated that in the embodiment of the present application, a user may use a mouse or other input device to select a dragged object (e.g., a file icon, a thumbnail image, etc. control) and a target location to be dragged.
Specifically, when the user provides input with a mouse, the user may move the mouse so that the mouse pointer moves over the object to be dragged, specify the object by pressing the left mouse button (or some other button used for this purpose), and then move the mouse so that the object is dragged on a single device or across devices. When input is provided through a touch screen or a touch area, the user may specify the object to be dragged with a continuous touch operation on the touch screen or touch area, and dragging on a single device or across devices is achieved through changes in the touch trajectory.
It can be appreciated that, in the embodiment of the present application, the content sharing method may be used in a scenario where the second electronic device projects its screen onto the first electronic device and drags across devices. For example, in the scenario shown in fig. 5, the second electronic device may be the mobile phone 100 and the first electronic device may be the computer 200. After the mobile phone 100 and the computer 200 are connected, a second window 201 is displayed on the computer 200, where the second window 201 is used to display a first communication application window of the mobile phone 100. Dragging a control using an input device (e.g., a mouse connected to the computer 200) can achieve cross-device sharing of objects such as pictures or files corresponding to the control, for example, sharing the picture corresponding to the thumbnail 202 from the mobile phone 100 to the computer 200. When the computer 200 determines that the drag track has moved from the first position corresponding to the thumbnail 202 to any position outside the second window 201 and determines that no drag event has been received, the computer 200 displays a prompt message, for example "the application is not adapted and does not support dragging", to remind the user of the reason the drag cannot currently be performed.
The embodiment of the application further provides a content sharing method for the mouse-traversing (keyboard-and-mouse crossing) scenario, in which the first electronic device is connected to the second electronic device, the first electronic device displays a first window, and the second electronic device displays a second window.
After the second electronic device monitors the first operation in which the user long-presses the first control in the second window, the second electronic device monitors the input track that takes the first operation as its starting operation. When the second electronic device detects a movement track input by the user from the first position of the first control in the second window to a second position outside the second window, it sends position information of the second position to the first electronic device. When the first electronic device receives the position information of the second position outside the second window sent by the second electronic device but does not receive a drag event corresponding to the first control, the first electronic device displays prompt information indicating that the application where the first control is located does not support sharing the first object corresponding to the first control by dragging the first control.
As shown in fig. 6, the second electronic device is the mobile phone 100 and the first electronic device is the computer 200. After the mobile phone 100 is connected to the computer 200, sharing of objects such as pictures or files across devices can be achieved with the same input device (for example, a mouse connected to the computer 200) through keyboard-and-mouse sharing, without screen projection being started; for example, the picture corresponding to the thumbnail 202 is shared from the mobile phone 100 to the computer 200. When the computer 200 receives the position information of the second position outside the second window sent by the second electronic device but does not receive a drag event corresponding to the thumbnail 202, the computer 200 displays a prompt message, for example "the application is not adapted and does not support dragging", to remind the user of the reason the drag cannot currently be performed.
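In the mouse-traversing scenario the roles shift: the second device watches the track itself and forwards only the exit position, and the first device pairs that position with the (possibly absent) drag event. A minimal sketch of the first device's side, where the message shapes are assumptions and not a protocol defined by the patent:

```python
class CrossingReceiver:
    """First device in the mouse-traversing scenario: the second device monitors
    the track and only forwards the position at which the pointer left its
    window; the first device combines that with any drag event received."""

    def __init__(self):
        self.exit_position = None
        self.drag_event = None

    def on_message(self, msg):
        if msg["type"] == "position":      # pointer left the second window
            self.exit_position = msg["pos"]
        elif msg["type"] == "drag_event":
            self.drag_event = msg

        if self.exit_position is None:
            return "wait"                  # no crossing reported yet
        return "accept_drop" if self.drag_event else "show_prompt"
```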
It can be understood that the foregoing is an explanation of a content sharing method between multiple devices, and the embodiment of the present application further provides a content sharing method, which may be used for drag processing in a single device. For example, the electronic device may display a first window and a second window, where the first window is a first application window of the electronic device and the second window is a second application window of the electronic device; the first application window and the second application window may be different windows of the same application or windows of different applications.
In some embodiments, the first application window may be a floating window and the second application window a normal window. In some embodiments, the first application window and the second application window may each be a floating window. In some embodiments, the first application window may be a normal window and the second application window a floating window.
The content sharing method for a single device may include: after the electronic equipment monitors a first operation of long-pressing a first control in a first window by a user, monitoring an input track taking the first operation as an initial operation; when the electronic device monitors a movement track input by a user from a first position where the first control is located to a second position outside the first window, if it is determined that a drag event corresponding to the first control is not monitored, displaying that the application where the first control is located does not support sharing of prompt information of a first object corresponding to the first control in a mode of dragging the first control. The second position may be any position outside the first window or any position inside the second window.
For example, as shown in fig. 7, the single device is the mobile phone 100, and by long-pressing and moving the thumbnail 202 in the first communication application window, the picture corresponding to the thumbnail 202 in the communication application can be shared into the floating window 205 of the second application. When the mobile phone 100 monitors a movement track from the first position where the thumbnail 202 is located to a second position in the floating window 205, if it determines that no drag event corresponding to the thumbnail 202 has been monitored, the mobile phone 100 may display a prompt message, for example "the application is not adapted and does not support dragging", to remind the user that the current application does not support dragging.
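On a single device the same check runs locally against two window rectangles. A minimal sketch, where the function name and the window geometry in the example are invented for illustration:

```python
def single_device_action(track_end, first_window, second_window, drag_event_seen):
    """track_end: current endpoint of the track started by the long-press.

    The prompt fires once the track leaves the first window (or enters the
    second window) without a drag event having been observed.
    Windows are (x, y, width, height) rectangles.
    """
    def inside(rect, pos):
        x, y, w, h = rect
        return x <= pos[0] < x + w and y <= pos[1] < y + h

    if inside(first_window, track_end) and not inside(second_window, track_end):
        return "monitoring"     # still within the source window: keep watching
    return "accept_drop" if drag_event_seen else "show_prompt"
```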
Before describing the content sharing method provided in the embodiments of the present application in detail, the electronic device mentioned in the embodiments of the present application is first described. It is understood that the electronic device in the embodiment of the present application may be any electronic device such as a mobile phone, a computer, a virtual reality (VR) device, a tablet computer, a wearable device, a vehicle-mounted device, an augmented reality (AR) device, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, or a personal digital assistant. The form of the electronic device in the embodiment of the application is not particularly limited.
In the embodiment of the application, the software system of the second electronic device is taken as an Android system, the software system of the first electronic device is taken as a Windows system as an example, and the software architecture of the second electronic device and the first electronic device in the embodiment of the application is schematically illustrated.
Fig. 8a shows a schematic software architecture of a second electronic device according to an embodiment of the present application. As shown in fig. 8a, the second electronic device may include an application layer, a framework layer, an Android runtime and system libraries, and a kernel layer.
The application layer may include a series of application packages. The applications may include third-party applications, such as camera applications and communication applications, and system applications, such as a PC assistant. The PC assistant may be configured to register a long-press event listener with the control module (view), register a drag event listener with the drag framework, and, after receiving a long-press event or drag event, send it through the first communication framework to other devices connected to the second electronic device.
The framework layer provides an application programming interface (application programming interface, API) and programming framework for the application programs of the application layer. The framework layer includes some predefined functions.
The framework layer may include a control module (view), a drag framework, and a first communication framework. The view is used to monitor whether a long-press event occurs and, if so, notify the PC assistant; the drag framework is used to monitor whether a drag event occurs and, if so, notify the PC assistant; the first communication framework is used to send and receive long-press events and drag events.
The framework layer may also include a window manager, content provider, view system, telephony manager, resource manager, notification manager, etc.
The window manager is used for managing window programs. The window manager can acquire the size of the display screen, judge whether a status bar exists, lock the screen, intercept the screen and the like.
The content provider is used to store and retrieve data and make such data accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phonebooks, etc.
The view system includes visual controls, such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, a display interface including a text message notification icon may include a view displaying text and a view displaying a picture.
The telephony manager is used to provide the communication functions of the electronic device 100, such as the management of call status (including connected, hung up, etc.).
The resource manager provides various resources for the application program, such as localization strings, icons, pictures, layout files, video files, and the like.
The notification manager allows the application to display notification information in a status bar, can be used to communicate notification type messages, can automatically disappear after a short dwell, and does not require user interaction. Such as notification manager is used to inform that the download is complete, message alerts, etc. The notification manager may also be a notification in the form of a chart or scroll bar text that appears on the system top status bar, such as a notification of a background running application, or a notification that appears on the screen in the form of a dialog window. For example, a text message is prompted in a status bar, a prompt tone is emitted, the electronic device vibrates, and an indicator light blinks, etc.
Android run time includes a core library and virtual machines. Android runtime is responsible for scheduling and management of the android system.
The core library consists of two parts: one part is the function interfaces that the Java language needs to call, and the other part is the core libraries of Android.
The application layer and the framework layer run in virtual machines. The virtual machine executes java files of the application layer and the framework layer as binary files. The virtual machine is used for executing the functions of object life cycle management, stack management, thread management, security and exception management, garbage collection and the like.
The system library may include a plurality of functional modules. For example: surface manager (surface manager), media Libraries (Media Libraries), three-dimensional graphics processing Libraries (e.g., openGL ES), 2D graphics engines (e.g., SGL), etc.
The surface manager is used to manage the display subsystem and provides a fusion of 2D and 3D layers for multiple applications.
The media libraries support playback and recording of a variety of commonly used audio and video formats, as well as still image files and the like. The media libraries may support a variety of audio and video encoding formats, such as MPEG-4, H.264, MP3, AAC, AMR, JPG, and PNG.
The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like.
The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is a layer between hardware and software. The kernel layer contains at least a display driver, a camera driver, an audio driver, and a sensor driver.
The software architecture of the first electronic device in the embodiment of the present application is schematically described below by taking the software system of the first electronic device as a Windows system as an example. Fig. 8b shows a schematic software architecture of a first electronic device according to an embodiment of the present application. As shown in fig. 8b, the first electronic device may include an application layer and a Windows framework.
The application layer may include a series of application packages. The applications may include third-party applications, such as camera applications and communication applications, and system applications, such as a PC manager application.
The Windows framework may include a second communication framework that is operable to receive the long press event and the drag event sent by the second electronic device and send the long press event and the drag event to the PC manager.
The content sharing method of the screen-casting scene in the embodiment of the application is briefly described based on the example that the second electronic device is an Android system and the first electronic device is a Windows system, wherein the second electronic device may be a mobile phone, and the first electronic device may be a computer. The first electronic device displays a first window and a second window, wherein the second window is an application window which is projected onto the first electronic device by the second electronic device. The first window may be a display window of a desktop application of the first electronic device, or may be a display window of another application of the first electronic device.
As shown in fig. 8c, the PC assistant of the mobile phone registers long-press event monitoring with the control module (view) and drag event monitoring with the drag framework. When the view of the mobile phone monitors a long-press event by the user, it notifies the PC assistant (i.e., sends the long-press event to the PC assistant); the PC assistant sends the long-press event to the first communication framework, which sends it to the second communication framework of the computer, which in turn sends it to the PC manager of the computer. When the drag framework detects a drag event, it notifies the PC assistant (i.e., sends the drag event to the PC assistant); the PC assistant sends the drag event to the first communication framework, which sends it to the second communication framework, which in turn sends it to the PC manager.
After receiving the long-press event, the PC manager of the computer monitors that the user's drag track runs from the first position of the first control in the second window to a second position outside the second window and, if no drag event is received, controls the display screen to display the prompt information.
The following describes a content sharing method in the embodiment of the present application based on the above-mentioned electronic device in the embodiment of the present application. Firstly, taking a screen-throwing scene as an example, a content sharing method in the embodiment of the application is described. Fig. 9 is an interactive flow diagram of a content sharing method according to an embodiment of the present application. The first electronic device is assumed to be a data dragging-in end device, and the second electronic device is assumed to be a data dragging-out end device. The first electronic device displays a first window and a second window, wherein the second window is an application window which is projected onto the first electronic device by the second electronic device. The first window may be a display window of a desktop application of the first electronic device, or may be a display window of another application of the first electronic device. As shown in fig. 9, the method includes:
901: the first electronic device and the second electronic device establish a connection.
The second electronic device and the first electronic device may be connected in a wired or wireless manner. Based on the established connection, the second electronic device and the first electronic device may be used cooperatively. In this embodiment, the wireless communication protocol used when the second electronic device and the first electronic device establish a connection wirelessly may be a wireless fidelity (Wi-Fi) protocol, a Bluetooth protocol, a ZigBee protocol, a near field communication (NFC) protocol, various cellular network protocols, or the like, which is not limited herein.
It can be understood that, in the embodiment of the present application, after the second electronic device and the first electronic device establish a connection, the first electronic device may display the second window, that is, an application window of the second electronic device, on the first electronic device. For example, in the scenario shown in fig. 5, the second electronic device may be the mobile phone 100 and the first electronic device may be the computer 200. After the mobile phone 100 and the computer 200 are connected, the second window 201 is displayed on the computer 200, and dragging a control using an input device (for example, a mouse connected to the computer 200) can achieve cross-device sharing of the object corresponding to a control such as the thumbnail 202, for example, sharing the picture corresponding to the thumbnail 202 from the mobile phone 100 to the computer 200.
902: The second electronic device monitors a first operation of the user to press the first control in the second window for a long time.
It may be appreciated that, in the embodiment of the present application, the first operation of the user pressing the first control in the second window for a long time, that is, the long-press event may be an operation that the time for which the first control (for example, a control such as a file icon, a picture thumbnail, etc.) in the second window is pressed by the user is longer than the set time.
For example, as shown in fig. 5, when the mobile phone 100 monitors that the user has pressed the thumbnail 202 in the communication application window for longer than the set time, it may determine that a long-press event has been monitored.
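The long-press test itself is just a hold-duration threshold. A minimal sketch of such a detector; the 500 ms default is an assumed value, since the patent only says "a set time":

```python
class LongPressDetector:
    """Flags a long press when a control stays pressed past a set threshold."""

    def __init__(self, threshold_ms=500):
        self.threshold_ms = threshold_ms
        self.pressed_at = None

    def on_press(self, t_ms):
        self.pressed_at = t_ms

    def on_release(self, t_ms):
        self.pressed_at = None  # released: no long press can fire anymore

    def check(self, now_ms):
        """Poll: has the current press lasted longer than the threshold?"""
        return self.pressed_at is not None and now_ms - self.pressed_at > self.threshold_ms
```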
903: The second electronic device sends the first operation to the first electronic device.
904: After the first electronic device receives the first operation, a moving track input by a user from a first position where the first control is located to a second position outside the second window is monitored.
It can be appreciated that in the embodiment of the present application, after receiving the first operation, the first electronic device may monitor, in real time, the user input track using the first operation as the starting operation. After the movement track input by the user from the first position of the first control to the second position outside the second window is monitored, the user can prove that the dragging intention exists, and whether a dragging event is received can be judged.
After the first electronic device receives the first operation, it may specifically monitor whether the user's input (such as a mouse track or touch track) moves out of the second window; that is, the second position may be any position outside the second window. Alternatively, it may monitor whether the user's input moves into the first window; that is, the second position may also be any position within the first window.
For example, in some embodiments, when the second window is a communication application window of the second electronic device projected on the first electronic device, and the first window is a desktop application window of the first electronic device, the first electronic device may, after receiving the first operation, specifically monitor whether the user's input (such as a mouse track or touch track) moves out of the communication application window, that is, the second position is any position outside the communication application window. The first electronic device may also specifically monitor whether the user's input moves into the desktop application window, that is, the second position is any position in the desktop application window.
It can be understood that, in the embodiment of the present application, if the user drags the first control with a mouse, the first electronic device may monitor that the mouse moves from the first position where the first control is located to the second position outside the second window while remaining pressed throughout, and thereby determine that the user's drag track runs from the first position to the second position outside the second window.

In some embodiments, the user may drag the first control through a touch screen (for example, the touch screen of a mobile phone or tablet computer) or a touch area (for example, the touch area of a notebook computer). Correspondingly, the first electronic device may determine the user's drag track by monitoring that the touch track moves from the first position where the first control is located to the second position outside the second window while remaining continuous throughout.
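The pressed-state (or continuous-touch) judgment described above can be sketched as follows. This is a hedged illustration with assumed names and a simplified rectangular window model, not the actual implementation:

```python
def inside(rect, x, y):
    """rect is (left, top, right, bottom) in screen coordinates."""
    left, top, right, bottom = rect
    return left <= x <= right and top <= y <= bottom

def left_window_while_pressed(samples, window_rect):
    """samples: list of (x, y, pressed) pointer samples in order.
    True if the track reaches a position outside the window (the second
    position) without the press ever being released first."""
    for x, y, pressed in samples:
        if not pressed:
            return False          # press released before leaving the window
        if not inside(window_rect, x, y):
            return True           # reached a second position outside the window
    return False
```

A track that stays pressed while crossing the window boundary is judged a drag; a track whose press is released inside the window is not.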
905: The first electronic device determines whether a drag event is received. If not, go to 906 to send out prompt information; if yes, go to 907 to execute the drag flow.
It can be appreciated that, after the second electronic device monitors the long-press event, if it determines that the application where the first control is located does not support sharing the first object corresponding to the first control by dragging the first control, it does not send the drag event corresponding to the first control to the first electronic device. If it determines that the application does support sharing the first object by dragging the first control, it sends the drag event corresponding to the first control to the first electronic device.
When the first electronic device monitors that the drag track runs from the first position where the first control is located to the second position outside the second window, if it determines that no drag event has been received, this shows that the user performed a drag and has a dragging intention, but that the application in which the first control of the second electronic device is located does not support sharing the first object corresponding to the first control by dragging it. At this time, the first electronic device displays prompt information to inform the user of this.
When the first electronic device monitors that the drag track runs from the first position where the first control is located to the second position outside the second window, if it determines that a drag event has been received, the drag flow can be executed directly: when the user's drag-release position is monitored, the object corresponding to the first control is stored in the set storage location and displayed at the release position.
906: The first electronic device sends out prompt information.
It will be appreciated that in some embodiments, the first electronic device may display a prompt, e.g., as shown in fig. 5, the first electronic device may display a prompt of "application not adapted, drag not supported".
907: The first electronic device executes the drag flow.
The drag flow may include storing an object corresponding to the first control in a set storage location and displaying the object in a drag release location of the user when the drag release location of the user is monitored.
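Steps 905-907 reduce to a single branch on whether a drag event accompanied the monitored track. A minimal illustrative sketch (function and string names are assumed, not from this application):

```python
def handle_track_out_of_window(drag_event_received):
    """Steps 905-907: the track has already left the second window; branch on
    whether the drag event for the first control was received."""
    if not drag_event_received:
        # Step 906: the source application does not support drag sharing
        return "prompt: application not adapted, drag not supported"
    # Step 907: proceed to store and display the dragged object
    return "execute drag flow"
```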
The content sharing method in the embodiment of the application is described below by taking the second electronic device as a mobile phone and the first electronic device as a computer as an example, and combining the software frameworks of the first electronic device and the second electronic device, wherein the first electronic device comprises a control module, a drag framework, a first communication framework and a PC assistant. The second electronic device includes a second communication framework and a PC manager.
The first electronic device is assumed to be a data dragging-in end device, and the second electronic device is assumed to be a data dragging-out end device. The first electronic device displays a first window and a second window, wherein the second window is an application window which is projected onto the first electronic device by the second electronic device. The first window may be a display window of a desktop application of the first electronic device, or may be a display window of another application of the first electronic device. Fig. 10 is a flow chart illustrating a content sharing method according to an embodiment of the present application. As shown in fig. 10, the method includes:
1001: a control module (view) monitors a first operation of a user long pressing a first control in a second window.
It may be appreciated that in the embodiment of the present application, the control module may monitor whether a long press event exists (i.e., the user long presses the first operation of the first control in the second window) through the onTouchEvent callback. It can be appreciated that in the embodiment of the present application, when the control module monitors that the user presses the first control in the second window for a set time, it may be determined that the first operation of pressing the first control in the second window for a long time is monitored.
It is to be appreciated that the first control can include a file icon, a picture thumbnail, and the like.
1002: The control module (view) sends a first operation to the PC assistant.
It will be appreciated that the PC assistant of the mobile phone may register long-press event listening with the view. When the view monitors that the user performs a long press, it notifies the PC assistant (i.e., sends the long-press event to the PC assistant).
1003: The PC assistant sends a first operation to the first communication frame.
1004: The first communication framework sends the first operation to the second communication framework.
1005: The second communication framework sends the first operation to the PC manager.
1006: The drag frame monitors for a drag event.
1007: The drag frame sends a drag event to the PC assistant.
It will be appreciated that the PC assistant of the mobile phone may register drag event listening with the drag framework. When the drag framework monitors a user drag event, it notifies the PC assistant (i.e., sends the drag event to the PC assistant).
1008: The PC assistant sends a drag event to the first communication frame.
1009: The first communication frame sends a drag event to the second communication frame.
1010: The second communication framework sends the drag event to the PC manager.
1011: After receiving the first operation, the PC manager monitors a moving track input by a user from a first position where the first control is located to a second position outside the second window.
1012: The PC manager determines whether a drag event is received. If not, go to 1013 to control the display screen to display the prompt information. If so, go to 1014 to execute the drag flow.
1013: The PC manager controls the display screen to display prompt information.
1014: The PC manager performs the drag flow.
The drag flow may include storing an object corresponding to the first control in a set storage location and displaying the object in a drag release location of the user when the drag release location of the user is monitored.
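The hop-by-hop forwarding in steps 1001-1005 (and likewise 1007-1010) can be modeled as a chain of modules that each pass the event to the next. This is a toy Python model with assumed class and method names, shown only to make the relay path concrete:

```python
class Hop:
    """One module in the event path; forwards each event to the next hop."""
    def __init__(self, name, next_hop=None):
        self.name = name
        self.next_hop = next_hop
        self.received = []

    def deliver(self, event):
        self.received.append(event)
        if self.next_hop is not None:
            self.next_hop.deliver(event)

# Build the chain of Fig. 10: view -> PC assistant -> first communication
# framework -> second communication framework -> PC manager.
pc_manager = Hop("PC manager")
second_comm = Hop("second communication framework", pc_manager)
first_comm = Hop("first communication framework", second_comm)
pc_assistant = Hop("PC assistant", first_comm)
view = Hop("control module (view)", pc_assistant)

# Steps 1001-1005: the long-press event reaches the PC manager.
view.deliver("long_press")
```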
The content sharing method for a single device in the embodiment of the application is described below by taking a mobile phone as an example of the electronic device. For example, the electronic device may display a first window and a second window, where the first window is a first application window of the electronic device and the second window is a second application window of the electronic device; the first application window and the second application window may be different windows of the same application or windows of different applications. Fig. 11 is a flow chart illustrating a content sharing method according to an embodiment of the present application. As shown in fig. 11, the method includes:
1101: a control module (view) monitors a first operation of a user long pressing a first control in a first window.
It may be appreciated that in the embodiment of the present application, when the electronic device monitors that the user presses the first control in the first window for a set time, it may be determined that the first operation (i.e., the long press event) of the user pressing the first control in the first window for a long time is monitored.
It is to be appreciated that the first control can include a file icon, a picture thumbnail, and the like.
1102: The control module (view) sends a first operation to the PC assistant.
1103: The drag frame monitors for a drag event.
It may be appreciated that in the embodiment of the present application, when determining that the first application where the first control is located supports sharing the first object corresponding to the first control by dragging the first control, the dragging frame may determine that a dragging event is monitored.
1104: The drag frame sends a drag event to the PC assistant.
1105: After receiving the first operation, the PC assistant monitors a movement track input by the user from the first position where the first control is located to the second position outside the first window.
It can be understood that, in the embodiment of the present application, if the user drags the first control with a mouse, the electronic device may monitor that the mouse moves from the first position where the first control is located to the second position outside the first window while remaining pressed throughout, and thereby determine the user's drag track.

In some embodiments, the user may drag the first control through a touch screen (for example, the touch screen of a mobile phone or tablet computer) or a touch area (for example, the touch area of a notebook computer). Correspondingly, the electronic device may determine the user's drag track by monitoring that the touch track moves from the first position where the first control is located to the second position outside the first window while remaining continuous throughout.

It will be appreciated that the second position may be any position outside the first window, and may also refer to any position of the second window.
For example, as shown in fig. 12a, the first window is a communication application window (i.e. a normal full screen display window), the second window may be a floating window of a transfer station, and then the second position may be any position in the floating window 207 of the transfer station.
In some embodiments, if the first window is a floating window of the transfer station and the second window is an application window, the second position may be any position outside the floating window 207 of the transfer station.
1106: The PC assistant determines whether a drag event is received. If not, go to 1107 to control the display screen to display prompt information. If so, go to 1108 to execute the drag flow.
1107: The PC assistant controls the display screen to display prompt information.
1108: The PC assistant performs a drag flow.
The drag flow may include storing an object corresponding to the first control in a set storage location and displaying the object in a drag release location of the user when the drag release location of the user is monitored.
In some embodiments, the first application window may be a floating window and the second application window a normal window. In some embodiments, both the first application window and the second application window may be floating windows. In some embodiments, the first application window may be a normal window and the second application window a floating window.
It will be appreciated that, in some embodiments, as shown in fig. 12a, when the user long-presses and moves the thumbnail 202 in the communication application window, the picture corresponding to the thumbnail 202 in the communication application can be shared into the floating window 207 of the transfer station. When the mobile phone 100 monitors that the movement track runs from the first position corresponding to the thumbnail 202 in the communication application window to any position of the floating window of the transfer station, if it determines that no drag event has been received, the mobile phone 100 displays prompt information, for example, "application not adapted, drag not supported", to remind the user that the current application does not support dragging.
The content sharing method for a single device in the embodiment of the present application is described below by taking the first window as a communication application window and the second window as a floating window of a transfer station as an example. Fig. 12b is a flow chart illustrating a content sharing method according to an embodiment of the present application. As shown in fig. 12b, the method includes:
1201: a control module (view) monitors a first operation of a first control in the user long press communication application window.
In some embodiments, as shown in fig. 12a, the first control may be a thumbnail 202 in the communication application window.
1202: The control module (view) sends a first operation to the PC assistant.
1203: The drag frame monitors for a drag event.
It may be appreciated that in the embodiment of the present application, when determining that the communication application supports sharing the first object corresponding to the first control by dragging the first control, the dragging frame may determine that a dragging event is monitored.
1204: The drag frame sends a drag event to the PC assistant.
1205: After receiving the first operation, the PC assistant monitors a movement track input by the user from the first position where the first control is located to the second position in the floating window of the transfer station.
1206: The PC assistant determines whether a drag event is received. If not, go to 1207 to control the display screen to display prompt information. If so, go to 1208 to execute the drag flow.
1207: The PC assistant controls the display screen to display prompt information.
1208: The PC assistant performs a drag flow.
In summary, in the embodiment of the present application, the electronic device may determine whether the application supports dragging by combining the drag track with whether a drag event has been acquired. When it determines that the application does not support dragging, the electronic device prompts the user with the reason why the drag cannot be performed, so that the user can understand why the object cannot be dragged, improving the user experience.
In addition, the embodiment of the application can accurately identify the user intention by combining the dragging track, the dragging event and the long-press event, and avoid the occurrence of misjudgment.
The following describes a content sharing method in the embodiment of the present application by taking a mouse-traversing scenario as an example. Fig. 13 is an interaction flow diagram of a content sharing method according to an embodiment of the present application. It is assumed that the first electronic device is the data drag-in end device and the second electronic device is the data drag-out end device. The first electronic device displays a first window and the second electronic device displays a second window.
As shown in fig. 13, the method includes:
1301: the first electronic device and the second electronic device establish a connection.
The second electronic device and the first electronic device may be connected in a wired or wireless manner. Based on the established connection, the second electronic device and the first electronic device may be used cooperatively. In this embodiment, the wireless communication protocol used when the second electronic device and the first electronic device establish the connection wirelessly may be a wireless fidelity (Wi-Fi) protocol, a Bluetooth protocol, a ZigBee protocol, a near field communication (NFC) protocol, various cellular network protocols, or the like, which is not limited herein.
1302: After the second electronic device monitors the first operation of the user long-pressing the first control in the second window, the second electronic device monitors the input track taking the first operation as the initial operation.
It may be appreciated that, in the embodiment of the present application, the first operation of the user long-pressing the first control in the second window, that is, the long-press event, may be an operation in which the first control (for example, a file control or a picture control) in the second window is pressed by the user for longer than a set time.
1303: The second electronic device monitors a movement track input by a user from a first position where the first control is located to a second position outside a display screen of the second electronic device.
It can be appreciated that in the embodiment of the present application, after detecting the first operation, the second electronic device may monitor, in real time, the user input track using the first operation as the initial operation.
It will be appreciated that the mouse or user touch trajectory may traverse to the display of the first electronic device as it moves out of the display of the second electronic device.
That is, in the present application, the movement track from the first position where the first control is located to the second position may refer to the user's input (such as a mouse track or touch track) moving from the first position of the first control to a position outside the display screen of the second electronic device (i.e., the second position), or to a position within the display screen of the first electronic device (i.e., the second position).
It can be understood that, in the embodiment of the present application, if the user drags the first control with a mouse, the second electronic device may monitor that the mouse moves from the first position where the first control is located to the second position outside the display screen of the second electronic device while remaining pressed throughout, and thereby determine the user's drag track.

In some embodiments, the user may drag the first control through a touch screen (for example, the touch screen of a mobile phone or tablet computer) or a touch area (for example, the touch area of a notebook computer). Correspondingly, the second electronic device may determine the user's drag track by monitoring that the touch track moves from the first position where the first control is located to the second position outside the display screen of the second electronic device while remaining continuous throughout.
1304: The second electronic device sends position information corresponding to the second position and a movement track input by a user from the first position of the first control to the second position outside the display screen of the second electronic device to the first electronic device.
It may be appreciated that, in some embodiments, when the first operation is monitored, the second electronic device may determine whether the application in which the first control is located supports sharing the first object corresponding to the first control by dragging the first control. If not, when the movement track input by the user from the first position where the first control is located to the second position outside the display screen of the second electronic device is monitored, only the position information corresponding to the second position and the movement track are sent to the first electronic device. If so, when that movement track is monitored, the position information corresponding to the second position, the movement track, and the drag event are all sent to the first electronic device.
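The sender-side behaviour of step 1304 can be sketched as follows; the position and track are always sent, and a drag event is added only when the source application supports drag sharing. All names here are illustrative assumptions:

```python
def build_messages(app_supports_drag, second_position, movement_track):
    """Messages the drag-out device sends to the drag-in device in step 1304."""
    messages = [("position", second_position), ("track", movement_track)]
    if app_supports_drag:
        # Only a drag-capable source application contributes a drag event.
        messages.append(("drag_event", True))
    return messages
```

The drag-in device can then infer an unsupported application from the presence of position and track messages without an accompanying drag event.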
1305: The first electronic device determines whether a drag event is received. If not, go to 1306 to send out prompt information; if yes, go to 1307 to execute the drag flow.
In some embodiments, when the first electronic device receives the position information and the movement track from the first position where the first control is located to the second position outside the display screen of the second electronic device, it may judge whether a drag event has been received. If no drag event has been received, it may be determined that the user performed a drag and has a dragging intention, but that the application in which the first control of the second electronic device is located does not support sharing the first object corresponding to the first control by dragging it. At this time, the first electronic device displays prompt information to inform the user of this.
In some embodiments, when the first electronic device receives the position information and the movement track but no drag event, it may display the prompt information when it subsequently monitors the user's input track moving into the first window, to inform the user that the application in which the first control of the second electronic device is located does not support sharing the first object corresponding to the first control by dragging it.
That is, the time when the first electronic device prompts may be the time when the position information and the movement track are received but no drag event is received. In some embodiments, it may also be the time when the mouse or input track is monitored to traverse or move into the first window.
In some embodiments, if the first electronic device receives the position information corresponding to the second position and the movement track from the first position where the first control is located to the second position outside the display screen of the second electronic device, and receives the drag event, the drag process may be executed.
1306: The first electronic device sends out prompt information.
It will be appreciated that in some embodiments, the first electronic device may display a hint, e.g., the first electronic device may display a hint of "application not adapted, not supporting dragging".
1307: The first electronic device executes the drag flow.
The drag flow may include storing an object corresponding to the first control in a set storage location and displaying the object in a drag release location of the user when the drag release location of the user is monitored.
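The drag flow described here has two effects: the object is stored at the set storage location, and it is displayed at the user's drag-release position. A minimal hedged sketch (names assumed):

```python
def execute_drag_flow(first_object, set_storage, release_position):
    """Drag flow: store the dragged object in the set storage location and
    report the user's drag-release position, where it will be displayed."""
    set_storage.append(first_object)
    return release_position
```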
The content sharing method in the embodiment of the application is described below by taking the second electronic device as a mobile phone and the first electronic device as a computer as an example, and combining the software frameworks of the first electronic device and the second electronic device, wherein the first electronic device comprises a control module, a drag framework, a first communication framework and a PC assistant. The second electronic device includes a second communication framework and a PC manager.
The first electronic device is assumed to be a data dragging-in end device, and the second electronic device is assumed to be a data dragging-out end device. The first electronic device displays a first window and the second electronic device displays a second window. Fig. 14 is a flow chart illustrating a content sharing method according to an embodiment of the present application. As shown in fig. 14, the method includes:
1401: a control module (view) monitors a first operation of a user long pressing a first control in a second window.
It may be appreciated that in the embodiment of the present application, the control module may monitor whether a long press event exists (i.e., the user long presses the first operation of the first control in the second window) through the onTouchEvent callback. It can be appreciated that in the embodiment of the present application, when the control module monitors that the user presses the first control in the second window for a set time, it may be determined that the first operation of pressing the first control in the second window for a long time is monitored.
It is to be appreciated that the first control can include a file icon, a picture thumbnail, and the like.
1402: The control module (view) sends a first operation to the PC assistant.
It will be appreciated that the PC assistant of the mobile phone may register long-press event listening with the view. When the view monitors that the user performs a long press, it notifies the PC assistant (i.e., sends the long-press event to the PC assistant).
1403: The PC assistant monitors a movement track input by a user from a first position where the first control is located to a second position outside a display screen of the second electronic device.
1404: The PC assistant sends the position information of the second position and the moving track from the first position of the first control to the second position outside the display screen of the second electronic device to the first communication frame.
1405: The first communication framework sends the position information of the second position and the movement track from the first position where the first control is located to the second position outside the display screen of the second electronic device to the second communication framework.
1406: The second communication frame sends the position information of the second position and the moving track from the first position of the first control to the second position outside the display screen of the second electronic device to the PC manager.
1407: The drag frame monitors for a drag event.
It may be appreciated that, in the embodiment of the present application, when the drag framework determines that the communication application supports sharing the first object corresponding to the first control by dragging the first control, it may determine that a drag event is monitored, and the drag event is sent to the first electronic device.
The drag frame may not send a drag event to the first electronic device when it is determined that the communication application does not support sharing of the first object corresponding to the first control by dragging the first control.
1408: The drag frame sends a drag event to the PC assistant.
It will be appreciated that the PC assistant of the mobile phone may register drag event listening with the drag framework. When the drag framework monitors a user drag event, it notifies the PC assistant (i.e., sends the drag event to the PC assistant).
1409: The PC assistant sends a drag event to the first communication frame.
1410: The first communication frame sends a drag event to the second communication frame.
1411: The second communication framework sends the drag event to the PC manager.
1412: The PC manager receives the position information of the second position and a moving track from the first position where the first control is located to the second position outside the display screen of the second electronic device.
1413: The PC manager determines whether a drag event is received. If not, go to 1414 to control the display screen to display the prompt information. If so, go to 1415 to execute the drag flow.
In some embodiments, when the PC manager receives the position information and the movement track from the first position where the first control is located to the second position outside the display screen of the second electronic device, it may judge whether a drag event has been received. If no drag event has been received, it may be determined that the user performed a drag and has a dragging intention, but that the application in which the first control of the second electronic device is located does not support sharing the first object corresponding to the first control by dragging it. At this time, the first electronic device displays prompt information to inform the user of this.
In some embodiments, after receiving the location information, the PC manager may determine whether a drag event is received when continuously monitoring a track input by the user and moved into the first window, and if the drag event is not received, display prompt information to prompt the user that an application where the first control of the second electronic device is located does not support sharing the first object corresponding to the first control by dragging the first control.
The first electronic device may display the prompt at the moment it receives the position information and the movement track from the first position where the first control is located to the second position outside the display screen of the second electronic device. In some embodiments, it may instead prompt at the moment the mouse pointer or input track crosses into or moves within the first window.
In some embodiments, if the first electronic device receives the position information corresponding to the second position and the movement track from the first position where the first control is located to the second position outside the display screen of the second electronic device, and receives the drag event, the drag process may be executed.
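The branching at steps 1413 to 1415 can be sketched as follows. This is only an illustrative sketch: the class name `PCManager`, the method names, and the string return values are assumptions for illustration, not identifiers from the embodiment.

```python
# Minimal sketch of steps 1413-1415: the PC manager checks whether a
# drag event arrived (step 1411) when it receives the position info and
# movement track (step 1412). All names here are illustrative.

class PCManager:
    def __init__(self):
        self.drag_event_received = False
        self.pending_event = None

    def on_drag_event(self, event):
        # 1411: the second communication framework forwarded a drag event,
        # meaning the source application supports drag sharing.
        self.drag_event_received = True
        self.pending_event = event

    def on_position_info(self, first_pos, second_pos, track):
        # 1412: position info of the second position plus the movement
        # track from the control to outside the phone's display screen.
        if self.drag_event_received:
            # 1415: execute the drag flow.
            return "drag"
        # 1414: no drag event -> the source application does not support
        # sharing this object by dragging; show the prompt information.
        return "prompt"


mgr = PCManager()
# No drag event was delivered, so the user sees the prompt.
assert mgr.on_position_info((10, 20), (900, 400), [(10, 20), (900, 400)]) == "prompt"

mgr2 = PCManager()
mgr2.on_drag_event({"object": "photo"})
assert mgr2.on_position_info((10, 20), (900, 400), []) == "drag"
```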
1414: The PC manager controls the display screen to display prompt information.
1415: The PC manager performs the drag flow.
The drag flow may include, when the user's drag release position is monitored, storing the object corresponding to the first control in a set storage location and displaying the object at the user's release position.
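The drag flow described above can be sketched as follows. The storage path, function name, and return shape are assumptions for illustration only; a real implementation would hand the render instruction to the window manager.

```python
# Illustrative sketch of the drag flow: on release, the dragged object
# is stored in a set storage location and shown at the release position.
import os
import tempfile

def perform_drag_flow(obj_name, obj_bytes, release_pos, storage_dir=None):
    """Store the dragged object and report where to render it."""
    # Store the object in a set storage location (a temp dir here).
    storage_dir = storage_dir or tempfile.mkdtemp(prefix="drag_share_")
    dest = os.path.join(storage_dir, obj_name)
    with open(dest, "wb") as f:
        f.write(obj_bytes)
    # The UI layer would now draw the object at the drop position;
    # here we simply return the render instruction.
    return {"path": dest, "render_at": release_pos}

result = perform_drag_flow("photo.jpg", b"\x89PNG...", (640, 360))
assert os.path.exists(result["path"])
assert result["render_at"] == (640, 360)
```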
Based on the scheme, when the application where the dragged first control is located does not support sharing the first object corresponding to the first control by dragging the first control, the first electronic device can display a prompt, so that the user can conveniently understand why the drag cannot be completed, improving the user experience. In addition, by combining the drag track, the drag event, and the long-press event, the embodiment of the application can accurately identify the user's intention and avoid misjudgment.
The application also provides a content sharing system, which comprises a first electronic device and a second electronic device. The first electronic device displays a first window and a second window, wherein the second window is an application window projected by the second electronic device onto the first electronic device; the second electronic device is used for monitoring a first operation of long-pressing a first control in the second window by a user and sending the first operation to the first electronic device; the first electronic device is used for monitoring an input track taking the first operation as an initial operation after receiving the first operation; and the first electronic device is used for, when monitoring a movement track input by the user from a first position where the first control is located to a second position outside the second window, if a drag event corresponding to the first control is not acquired, displaying prompt information indicating that the application where the first control is located does not support sharing the first object corresponding to the first control by dragging the first control.
The first electronic device includes a first system application; the second electronic device comprises a control module and a second system application. The control module is used for monitoring a first operation of long-pressing a first control in the second window by a user and sending the first operation to the second system application; the second system application is used for sending the first operation to the first system application; the first system application is used for monitoring an input track taking the first operation as an initial operation after receiving the first operation; and the first system application is used for, when monitoring a movement track input by the user from the first position where the first control is located to a second position outside the second window, if a drag event corresponding to the first control is not acquired, displaying prompt information indicating that the application where the first control is located does not support sharing the first object corresponding to the first control by dragging the first control.
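The module chain just described can be sketched as follows. The class and method names are illustrative assumptions: the control module detects the long press on the second device and relays it through the second system application to the first system application on the first device, which then starts monitoring the input track.

```python
# Illustrative sketch of the relay chain: control module -> second
# system application -> first system application. Names are assumptions.

class FirstSystemApp:
    def __init__(self):
        self.monitoring = False

    def receive(self, operation):
        if operation == "long_press_first_control":
            # Begin monitoring the input track that starts with this operation.
            self.monitoring = True

class SecondSystemApp:
    def __init__(self, first_system_app):
        self.peer = first_system_app

    def send(self, operation):
        # Forward the operation to the first electronic device.
        self.peer.receive(operation)

class ControlModule:
    def __init__(self, second_system_app):
        self.system_app = second_system_app

    def on_long_press(self):
        # Detected: the user long-pressed the first control in the second window.
        self.system_app.send("long_press_first_control")


first = FirstSystemApp()
ControlModule(SecondSystemApp(first)).on_long_press()
assert first.monitoring  # the first device now monitors the input track
```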
The application further provides a content sharing system, which comprises a first electronic device and a second electronic device that are connected. The first electronic device displays a first window, and the second electronic device displays a second window; the second electronic device is used for monitoring an input track taking a first operation as an initial operation after monitoring the first operation of long-pressing a first control in the second window by a user; the second electronic device is used for, when monitoring a movement track input by the user from a first position of the first control in the second window to a second position outside the display screen of the second electronic device, sending the position information of the second position and the movement track to the first electronic device; and the first electronic device is used for, when receiving the position information of the second position and the movement track sent by the second electronic device, if a drag event corresponding to the first control is not received, displaying prompt information indicating that the application where the first control is located does not support sharing the first object corresponding to the first control by dragging the first control.
The following describes a hardware structure of the electronic device provided by the embodiment of the present application by taking a mobile phone as an example. As shown in fig. 15, the mobile phone 10 may include a processor 110, a power module 140, a memory 180, a mobile communication module 130, a wireless communication module 120, a sensor module 190, an audio module 150, a camera 170, an interface module 160, keys 101, a display 102, and the like.
It should be understood that the structure illustrated in the embodiment of the present application does not constitute a specific limitation on the mobile phone 10. In other embodiments of the application, the handset 10 may include more or fewer components than shown, certain components may be combined or split, or the components may be arranged differently. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units; for example, its processing modules or processing circuits may include a central processing unit (CPU), a graphics processing unit (GPU), a digital signal processor (DSP), a micro-programmed control unit (MCU), an artificial intelligence (AI) processor, a field-programmable gate array (FPGA), and the like. The different processing units may be separate devices or may be integrated in one or more processors. A storage unit may be provided in the processor 110 for storing instructions and data. In some embodiments, the storage unit in the processor 110 is a cache.
The processor may be configured to execute the content sharing method provided by the embodiment of the present application.
The power module 140 may include a power source, a power management component, and the like. The power source may be a battery. The power management component is used for managing the charging of the power source and the supply of power from the power source to other modules. In some embodiments, the power management component includes a charge management module and a power management module. The charging management module is used for receiving charging input from the charger; the power management module is used for connecting the power source, the charging management module, and the processor 110. The power management module receives input from the power source and/or the charge management module and provides power to the processor 110, the display 102, the camera 170, the wireless communication module 120, and the like.
The mobile communication module 130 may include, but is not limited to, an antenna, a power amplifier, a filter, a low-noise amplifier (LNA), and the like. The mobile communication module 130 may provide solutions for wireless communication, including 2G/3G/4G/5G, applied to the handset 10. The mobile communication module 130 may receive electromagnetic waves from the antenna, perform processing such as filtering and amplifying on the received electromagnetic waves, and transmit the processed signals to a modem processor for demodulation. The mobile communication module 130 may also amplify the signal modulated by the modem processor and convert it into electromagnetic waves radiated through the antenna. In some embodiments, at least some of the functional modules of the mobile communication module 130 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 130 may be disposed in the same device as at least some of the modules of the processor 110. The wireless communication technologies may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division synchronous code division multiple access (TD-SCDMA), long term evolution (LTE), Bluetooth (BT), global navigation satellite system (GNSS), wireless local area network (WLAN), near field communication (NFC), frequency modulation (FM), infrared (IR) technology, and the like.
The GNSS may include the global positioning system (GPS), the global navigation satellite system (GLONASS), the BeiDou navigation satellite system (BDS), the quasi-zenith satellite system (QZSS), and/or satellite-based augmentation systems (SBAS).
The wireless communication module 120 may include an antenna, and transmit and receive electromagnetic waves via the antenna. The wireless communication module 120 may provide solutions for wireless communication applied to the handset 10, including wireless local area network (WLAN) (e.g., wireless fidelity (Wi-Fi) network), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), and the like. The handset 10 may communicate with networks and other devices via wireless communication technology.
In some embodiments, the mobile communication module 130 and the wireless communication module 120 of the handset 10 may also be located in the same module.
The display screen 102 is used for displaying human-computer interaction interfaces, images, videos, and the like. The display screen 102 includes a display panel. The display panel may employ a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like.
The sensor module 190 may include a proximity light sensor, a pressure sensor, a gyroscope sensor, a barometric sensor, a magnetic sensor, an acceleration sensor, a distance sensor, a fingerprint sensor, a temperature sensor, a touch sensor, an ambient light sensor, a bone conduction sensor, and the like. The gyroscope sensor can be used, for example, to determine whether the mobile phone is shaking.
The audio module 150 is used to convert digital audio information into an analog audio signal output, or to convert an analog audio input into a digital audio signal. The audio module 150 may also be used to encode and decode audio signals. In some embodiments, the audio module 150 may be disposed in the processor 110, or some functional modules of the audio module 150 may be disposed in the processor 110. In some embodiments, the audio module 150 may include a speaker, an earpiece, a microphone, and an earphone interface.
The camera 170 is used to capture still images or video. The object generates an optical image through the lens, which is projected onto the photosensitive element. The photosensitive element converts the optical signal into an electrical signal and then transfers the electrical signal to an image signal processor (ISP) to convert it into a digital image signal. The handset 10 may implement shooting functions through the ISP, the camera 170, a video codec, a graphics processing unit (GPU), the display 102, an application processor, and the like.
It can be appreciated that, in the embodiment of the present application, the camera 170 may include a main camera and a telephoto camera, and may also include other cameras. The main camera is typically a lens with a focal length of about 27 mm, used for shooting scenes with a medium viewing angle; the telephoto camera is a lens with a focal length of more than 50 mm, used for shooting close-up scenes.
The interface module 160 includes an external memory interface, a universal serial bus (USB) interface, a subscriber identity module (SIM) card interface, and the like. The external memory interface may be used to connect an external memory card, such as a Micro SD card, to extend the memory capabilities of the handset 10. The external memory card communicates with the processor 110 through the external memory interface to implement data storage functions. The universal serial bus interface is used for communication between the handset 10 and other electronic devices. The subscriber identity module card interface is used to communicate with a SIM card mounted in the handset 10, for example, by reading a telephone number stored in the SIM card or writing a telephone number to the SIM card.
In some embodiments, the handset 10 further includes keys 101, a motor, indicators, and the like. The keys 101 may include a volume key, an on/off key, and the like. The motor is used to generate a vibration effect on the handset 10, for example when the user's handset 10 receives an incoming call, so as to prompt the user to answer it. The indicators may include laser indicators, radio frequency indicators, LED indicators, and the like.
Embodiments of the present disclosure may be implemented in hardware, software, firmware, or a combination of these implementations. Embodiments of the application may be implemented as a computer program or program code that is executed on a programmable system comprising at least one processor, a storage system (including volatile and non-volatile memory and/or storage elements), at least one input device, and at least one output device.
Program code may be applied to input instructions to perform the functions described herein and generate output information. The output information may be applied to one or more output devices in a known manner. For the purposes of this application, a processing system includes any system having a processor such as, for example, a Digital Signal Processor (DSP), a microcontroller, an Application Specific Integrated Circuit (ASIC), or a microprocessor.
The program code may be implemented in a high level procedural or object oriented programming language to communicate with a processing system. Program code may also be implemented in assembly or machine language, if desired. Indeed, the mechanisms described in the present application are not limited in scope by any particular programming language. In either case, the language may be a compiled or interpreted language.
In some cases, the disclosed embodiments may be implemented in hardware, firmware, software, or any combination thereof. The disclosed embodiments may also be implemented as instructions carried by or stored on one or more transitory or non-transitory machine-readable (e.g., computer-readable) storage media, which may be read and executed by one or more processors. For example, the instructions may be distributed over a network or through other computer-readable media. Thus, a machine-readable medium may include any mechanism for storing or transmitting information in a form readable by a machine (e.g., a computer), including but not limited to floppy diskettes, optical disks, compact disc read-only memories (CD-ROMs), magneto-optical disks, read-only memories (ROMs), random access memories (RAMs), erasable programmable read-only memories (EPROMs), electrically erasable programmable read-only memories (EEPROMs), magnetic or optical cards, flash memory, or tangible machine-readable memory used to transmit information over the Internet via electrical, optical, acoustical, or other forms of propagated signals (e.g., carrier waves, infrared signals, digital signals, etc.). Thus, a machine-readable medium includes any type of machine-readable medium suitable for storing or transmitting electronic instructions or information in a form readable by a machine (e.g., a computer).
In the drawings, some structural or methodological features may be shown in a particular arrangement and/or order. However, it should be understood that such a particular arrangement and/or ordering may not be required. Rather, in some embodiments, these features may be arranged in a different manner and/or order than shown in the illustrative figures. Additionally, the inclusion of structural or methodological features in a particular figure is not meant to imply that such features are required in all embodiments, and in some embodiments, may not be included or may be combined with other features.
It should be noted that, in the embodiments of the present application, each unit/module mentioned in each device is a logical unit/module. Physically, one logical unit/module may be one physical unit/module, may be part of one physical unit/module, or may be implemented by a combination of multiple physical units/modules; the physical implementation of the logical unit/module itself is not the most important consideration, and the combination of functions implemented by these logical units/modules is the key to solving the technical problem posed by the present application. Furthermore, in order to highlight the innovative part of the present application, the above device embodiments do not introduce units/modules that are less closely related to solving the technical problem posed by the present application; this does not indicate that the above device embodiments do not have other units/modules.
It should be noted that in the examples and descriptions of this patent, relational terms such as first and second, and the like are used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Moreover, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element. While the application has been shown and described with reference to certain preferred embodiments thereof, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the scope of the application.

Claims (28)

1. The content sharing method is applied to first electronic equipment and is characterized in that the first electronic equipment displays a first window and a second window, wherein the second window is an application window of the second electronic equipment, which is projected onto the first electronic equipment;
The method comprises the following steps:
After the first electronic device receives a first operation of long-pressing a first control in the second window by a user sent by the second electronic device, monitoring an input track taking the first operation as an initial operation;
When the first electronic device monitors a movement track input by a user from a first position where the first control is located to a second position outside the second window, if a drag event corresponding to the first control is not acquired, displaying prompt information indicating that the application where the first control is located does not support sharing a first object corresponding to the first control by dragging the first control.
2. The method of claim 1, wherein the second location is any location outside the second window.
3. The method of claim 1, wherein the first window is an application window of the first electronic device;
the second position is any position of the first window.
4. A method according to any of claims 1-3, wherein the first electronic device monitoring a movement trajectory of a user input from a first position where the first control is located to a second position outside the second window, comprising:
The first electronic device monitors that the mouse moves out from a first position where the first control is located to a second position outside the second window, and determines a movement track from the first position where the first control is located to the second position outside the second window, which is input by a user, when the mouse is in a pressed state all the time in the moving-out process.
5. A method according to any of claims 1-3, wherein the first electronic device monitoring a movement trajectory of a user input from a first position where the first control is located to a second position outside the second window, comprising:
And under the condition that the first electronic equipment monitors that the touch track of the user moves out from the first position where the first control is positioned to the second position outside the second window and the touch track is always in a continuous state in the moving-out process, determining that the movement track of the user input from the first position where the first control is positioned to the second position outside the second window is monitored.
6. A content sharing method applied to a second electronic device, wherein the second electronic device is connected with a first electronic device, the method comprising:
The second electronic device determines that a first control in a second window is pressed for a long time, and sends a first operation of pressing the first control for a long time to the first electronic device; the second window is an application window which is projected to the first electronic device by the second electronic device;
And when the second electronic equipment determines that the application where the first control is located does not support sharing of the first object corresponding to the first control in a mode of dragging the first control, the dragging event corresponding to the first control is not sent to the first electronic equipment.
7. The method according to claim 6, wherein the second electronic device determining that a first control in a second window is pressed for a long time comprises:
The second electronic device obtains position information corresponding to the operation of the user on the second window, which is sent by the first electronic device;
the second electronic device determines that a first control in the second window is pressed for a long time based on the position information.
8. The content sharing method is characterized in that a first electronic device and a second electronic device are connected; the first electronic device displays a first window and a second window, wherein the second window is an application window which is projected onto the first electronic device by the second electronic device;
the second electronic device monitors a first operation of long-pressing a first control in the second window by a user, and sends the first operation to the first electronic device;
after the first electronic equipment receives the first operation, monitoring an input track taking the first operation as an initial operation;
When the first electronic device monitors a movement track input by a user from a first position where the first control is located to a second position outside the second window, if a drag event corresponding to the first control is not acquired, displaying prompt information indicating that the application where the first control is located does not support sharing a first object corresponding to the first control by dragging the first control.
9. The method of claim 8, wherein the second location is any location outside the second window.
10. The method of claim 8, wherein the first window is an application window of the first electronic device;
the second position is any position of the first window.
11. The method of any of claims 8-10, wherein the first electronic device monitoring a movement trajectory of a user input from a first location where the first control is located to a second location outside the second window comprises:
The first electronic device monitors that the mouse moves out from a first position where the first control is located to a second position outside the second window, and determines a movement track from the first position where the first control is located to the second position outside the second window, which is input by a user, when the mouse is in a pressed state all the time in the moving-out process.
12. The method of any of claims 8-10, wherein the first electronic device monitoring a movement trajectory of a user input from a first location where the first control is located to a second location outside the second window comprises:
And under the condition that the first electronic equipment monitors that the touch track of the user moves out from the first position where the first control is positioned to the second position outside the second window and the touch track is always in a continuous state in the moving-out process, determining that the movement track of the user input from the first position where the first control is positioned to the second position outside the second window is monitored.
13. The method according to any one of claims 8-12, wherein the second electronic device monitoring a first operation of long-pressing a first control in the second window by a user comprises:
the second electronic device obtains position information corresponding to the operation of the user in the second window, which is sent by the first electronic device;
And the second electronic equipment determines that the first control is pressed for a long time based on the position information, and determines that the first operation of pressing the first control in the second window for a long time by the user is monitored.
14. The content sharing method is applied to electronic equipment and is characterized in that the electronic equipment displays a first window and a second window, wherein the first window is a first application window of the electronic equipment, and the second window is a second application window of the electronic equipment;
The method comprises the following steps:
After the electronic equipment monitors a first operation of long-pressing a first control in the first window by a user, monitoring an input track taking the first operation as an initial operation;
When the electronic device monitors a movement track input by a user from a first position where the first control is located to a second position outside the first window, if it is determined that a drag event corresponding to the first control is not monitored, displaying prompt information indicating that the application where the first control is located does not support sharing a first object corresponding to the first control by dragging the first control.
15. The method of claim 14, wherein the second location is any location outside of the first window.
16. The method of claim 14, wherein the second position is any position of the second window.
17. The method of any of claims 14-16, wherein the electronic device monitoring a trajectory of movement of a user input from a first position where the first control is located to a second position outside the second window comprises:
The electronic equipment monitors that the mouse moves out from a first position where the first control is located to a second position outside the second window, and determines a movement track from the first position where the first control is located to the second position outside the second window, which is input by a user, under the condition that the mouse is always in a pressed state in the moving-out process.
18. The method of any of claims 14-16, wherein the electronic device monitoring a trajectory of movement of a user input from a first position where the first control is located to a second position outside the second window comprises:
and under the condition that the electronic equipment monitors that the touch track of the user moves out from the first position where the first control is positioned to the second position outside the second window and the touch track is always in a continuous state in the moving-out process, determining that the movement track of the user input from the first position where the first control is positioned to the second position outside the second window is monitored.
19. The method of any of claims 14-18, wherein the first window is a floating window and/or the second window is a floating window.
20. The content sharing method is applied to first electronic equipment and is characterized in that the first electronic equipment is connected with second electronic equipment, the first electronic equipment displays a first window, and the second electronic equipment displays a second window;
The method comprises the following steps:
When the first electronic device receives the position information of the second position outside the display screen of the second electronic device and the movement track from the first position of the first control in the second window to the second position, which are sent by the second electronic device, and does not receive a drag event corresponding to the first control, displaying prompt information indicating that the application where the first control is located does not support sharing the first object corresponding to the first control by dragging the first control.
21. A content sharing method, wherein a first electronic device is connected to a second electronic device, the first electronic device displays a first window, and the second electronic device displays a second window;
after the second electronic device detects a first operation of the user long-pressing a first control in the second window, monitoring an input track that takes the first operation as its initial operation;
when the second electronic device detects a movement track input by the user from a first position of the first control in the second window to a second position outside the display screen of the second electronic device, sending position information of the second position and the movement track to the first electronic device;
when the first electronic device receives the position information of the second position and the movement track from the second electronic device but has not received a drag event corresponding to the first control, displaying a prompt that the application in which the first control is located does not support sharing the first object corresponding to the first control by dragging the first control.
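The first-device side of the flow above can be sketched as follows. This is an illustrative sketch only, not the claimed implementation; the function name, the `drag_events` mapping, and the message parameters are all hypothetical. The key behavior is that receiving a movement track without a matching drag event triggers the prompt.

```python
def handle_track_message(drag_events, control_id, second_pos, track):
    """Hypothetical handler on the first electronic device.

    drag_events: mapping of control id -> drag event received so far.
    second_pos:  position outside the second device's display screen.
    track:       movement track reported by the second device.
    Returns a prompt string if no drag event arrived, else None.
    """
    if control_id not in drag_events:
        # track arrived but no drag event: the control's application
        # does not support sharing its object by dragging the control
        return ("the application hosting control %s does not support "
                "sharing its object by dragging the control" % control_id)
    return None  # drag event exists: proceed with normal drag sharing
```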
22. A content sharing system, comprising a first electronic device and a second electronic device, wherein the first electronic device displays a first window and a second window, and the second window is an application window projected onto the first electronic device by the second electronic device;
the second electronic device is configured to detect a first operation of the user long-pressing a first control in the second window and send the first operation to the first electronic device;
the first electronic device is configured to, after receiving the first operation, monitor an input track that takes the first operation as its initial operation;
and the first electronic device is configured to, upon detecting a movement track input by the user from a first position where the first control is located to a second position outside the second window, display, if no drag event corresponding to the first control is acquired, a prompt that the application in which the first control is located does not support sharing the first object corresponding to the first control by dragging the first control.
23. The system of claim 22, wherein the first electronic device comprises a first system application, and the second electronic device comprises a control module and a second system application;
the control module is configured to detect the first operation of the user long-pressing the first control in the second window and send the first operation to the second system application;
the second system application is configured to send the first operation to the first system application;
the first system application is configured to, after receiving the first operation, monitor an input track that takes the first operation as its initial operation;
and the first system application is configured to, upon detecting a movement track input by the user from the first position where the first control is located to the second position outside the second window, display, if no drag event corresponding to the first control is acquired, a prompt that the application in which the first control is located does not support sharing the first object corresponding to the first control by dragging the first control.
24. A content sharing system, comprising a first electronic device and a second electronic device, wherein the first electronic device is connected to the second electronic device, the first electronic device displays a first window, and the second electronic device displays a second window;
the second electronic device is configured to, after detecting a first operation of the user long-pressing a first control in the second window, monitor an input track that takes the first operation as its initial operation;
the second electronic device is configured to, upon detecting a movement track input by the user from a first position of the first control in the second window to a second position outside the display screen of the second electronic device, send position information of the second position and the movement track to the first electronic device;
and the first electronic device is configured to, upon receiving the position information of the second position and the movement track from the second electronic device, display, if no drag event corresponding to the first control is received, a prompt that the application in which the first control is located does not support sharing the first object corresponding to the first control by dragging the first control.
25. An electronic device, the electronic device being a first electronic device, wherein the first electronic device displays a first window and a second window, and the second window is an application window projected onto the first electronic device by a second electronic device;
the first electronic device is configured to, after receiving a first operation, sent by the second electronic device, of the user long-pressing a first control in the second window, monitor an input track that takes the first operation as its initial operation;
and the first electronic device is configured to, upon detecting a movement track input by the user from a first position where the first control is located to a second position outside the second window, display, if no drag event corresponding to the first control is acquired, a prompt that the application in which the first control is located does not support sharing the first object corresponding to the first control by dragging the first control.
26. An electronic device, the electronic device being a second electronic device;
the second electronic device is configured to, upon determining that a first control in a second window is long-pressed, send to a first electronic device a first operation of the user long-pressing the first control in the second window; the second window is an application window projected onto the first electronic device by the second electronic device;
and the second electronic device is configured to, upon determining that the application in which the first control is located does not support sharing the first object corresponding to the first control by dragging the first control, refrain from sending a drag event corresponding to the first control to the first electronic device.
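The second-device behavior of claim 26 — withholding the drag event when the control's application does not support drag sharing — can be sketched as below. This is a hedged illustration; `maybe_send_drag_event` and its parameters are hypothetical names, not part of the claimed design.

```python
def maybe_send_drag_event(app_supports_drag_share, send):
    """Send a drag event to the first device only if the application in
    which the control is located supports drag-based sharing.

    send: callable that transmits a message to the first electronic device.
    Returns True if a drag event was sent, False otherwise.
    """
    if app_supports_drag_share:
        send({"type": "drag_event"})
        return True
    # No drag event is sent; per the claims, the first device then
    # displays the "sharing by dragging is not supported" prompt.
    return False
```

The design choice here is that the absence of a message, rather than an explicit "unsupported" message, is what the first device interprets as the cue to show the prompt.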
27. An electronic device, comprising: a memory for storing instructions for execution by one or more processors of the electronic device, and a processor, being one of the one or more processors of the electronic device, for performing the content sharing method of any of claims 1-21.
28. A readable storage medium having stored thereon instructions that, when executed on an electronic device, cause the electronic device to perform the content sharing method of any of claims 1-21.
CN202211295723.XA 2022-10-21 2022-10-21 Content sharing method, system, electronic equipment and medium Pending CN117950767A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211295723.XA CN117950767A (en) 2022-10-21 2022-10-21 Content sharing method, system, electronic equipment and medium

Publications (1)

Publication Number Publication Date
CN117950767A (en) 2024-04-30

Family

ID=90803584



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination