CN116301538A - Interaction method, device, electronic equipment and storage medium - Google Patents

Interaction method, device, electronic equipment and storage medium

Info

Publication number
CN116301538A
CN116301538A (application CN202211721131.XA)
Authority
CN
China
Prior art keywords
drag
electronic device
sharing
drag object
opening
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211721131.XA
Other languages
Chinese (zh)
Inventor
王剑锋
李轩恺
魏曦
汤志斌
许达兴
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN202211721131.XA
Publication of CN116301538A
Legal status: Pending

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0486Drag-and-drop
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures

Abstract

The application discloses an interaction method, an interaction device, an electronic device, and a storage medium, relating to the technical field of electronic devices. The method is applied to a first electronic device that includes a display screen having an opening. The method comprises the following steps: displaying a current interface; if a drag operation acting on the current interface is detected, determining a drag object corresponding to the drag operation from the current interface and controlling the drag object to move based on the drag operation; and if the drag object moves into the opening based on the drag operation, sharing the drag object to a second electronic device. By enlisting the opening in the display screen in information sharing, the method gives the opening a new purpose and improves both the convenience of information sharing and the user's visual experience.

Description

Interaction method, device, electronic equipment and storage medium
Technical Field
The present disclosure relates to the field of electronic devices, and in particular, to an interaction method, an interaction device, an electronic device, and a storage medium.
Background
With the development of science and technology, electronic devices are used ever more widely, offer more and more functions, and have become one of the necessities of daily life. At the same time, users pursue a full-screen display, but photosensitive devices such as the front camera require an open area to be reserved on the screen to support their functions. This area encroaches on the display region, degrades the display effect, and leads to a poor user experience.
Disclosure of Invention
In view of the above, the present application proposes an interaction method, an interaction device, an electronic device, and a storage medium, so as to solve the above problem.
In a first aspect, an embodiment of the present application provides an interaction method applied to a first electronic device, where the first electronic device includes a display screen, and the display screen has an opening, and the method includes: displaying a current interface; if the drag operation acting on the current interface is detected, determining a drag object corresponding to the drag operation from the current interface, and controlling the drag object to move based on the drag operation; and if the drag object moves into the opening based on the drag operation, sharing the drag object to a second electronic device.
In a second aspect, an embodiment of the present application provides an interaction device applied to a first electronic device, where the first electronic device includes a display screen, and the display screen has an opening, and the device includes: the current interface display module is used for displaying a current interface; the drag object moving module is used for determining a drag object corresponding to the drag operation from the current interface and controlling the drag object to move based on the drag operation if the drag operation acting on the current interface is detected; and the drag object sharing module is used for sharing the drag object to the second electronic equipment if the drag object moves into the opening based on the drag operation.
In a third aspect, embodiments of the present application provide an electronic device comprising a display screen, a memory, and a processor, the display screen and the memory being coupled to the processor, the memory storing instructions that when executed by the processor perform the above-described method.
In a fourth aspect, embodiments of the present application provide a computer readable storage medium having program code stored therein, the program code being callable by a processor to perform the above method.
According to the interaction method, device, electronic device, and storage medium provided herein, a current interface is displayed; if a drag operation acting on the current interface is detected, a drag object corresponding to the drag operation is determined from the current interface and controlled to move based on the drag operation; and if the drag object moves into the opening of the display screen based on the drag operation, the drag object is shared to a second electronic device. In this way the opening, which originally detracted from full-screen display, participates in information sharing: a defect of the display is turned into an entry point for sharing, no area that would otherwise be used for display is sacrificed, and the opening acquires a new purpose. This brings users a different usage experience and improves both the convenience of information sharing and the visual experience.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are needed in the description of the embodiments will be briefly described below, it being obvious that the drawings in the following description are only some embodiments of the present application, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 shows a first schematic structural diagram of an electronic device according to an embodiment of the present application;
fig. 2 shows a second structural schematic diagram of the electronic device provided in the embodiment of the present application;
fig. 3 shows a third structural schematic diagram of the electronic device provided in the embodiment of the present application;
fig. 4 shows a fourth structural schematic diagram of an electronic device provided in an embodiment of the present application;
fig. 5 shows a fifth structural schematic diagram of an electronic device according to an embodiment of the present application;
FIG. 6 is a flow chart illustrating an interaction method according to an embodiment of the present application;
fig. 7 shows a first interaction schematic diagram of an electronic device provided in an embodiment of the present application;
fig. 8 shows a second interaction schematic diagram of an electronic device provided in an embodiment of the present application;
fig. 9 shows a first interface schematic diagram of an electronic device according to an embodiment of the present application;
Fig. 10 shows a second interface schematic diagram of an electronic device according to an embodiment of the present application;
fig. 11 shows a third interface schematic diagram of an electronic device according to an embodiment of the present application;
FIG. 12 is a flow chart illustrating an interaction method according to an embodiment of the present application;
FIG. 13 is a flow chart illustrating step S240 of the interaction method illustrated in FIG. 12 of the present application;
fig. 14 shows a fourth interface schematic diagram of an electronic device according to an embodiment of the present application;
FIG. 15 is a flow chart illustrating an interaction method according to an embodiment of the present application;
fig. 16 shows a schematic view of a scenario application of an electronic device provided in an embodiment of the present application;
FIG. 17 is a flow chart illustrating an interaction method according to an embodiment of the present application;
FIG. 18 is a flow chart illustrating an interaction method according to an embodiment of the present application;
FIG. 19 is a flow chart illustrating step S530 of the interaction method illustrated in FIG. 18 of the present application;
FIG. 20 illustrates a block diagram of an interaction device provided by an embodiment of the present application;
FIG. 21 shows a block diagram of an electronic device for performing an interaction method according to an embodiment of the present application;
fig. 22 shows a storage unit for storing or carrying program code implementing an interaction method according to an embodiment of the present application.
Detailed Description
In order to enable those skilled in the art to better understand the present application, the following description will make clear and complete descriptions of the technical solutions in the embodiments of the present application with reference to the accompanying drawings in the embodiments of the present application.
In electronic devices such as cell phones and tablet computers, the display screen is used to present text, pictures, icons, or video content. Typically, the electronic device includes a front panel, a rear cover, and a bezel. The front panel comprises an upper forehead area, a middle screen area, and a lower key area. In general, the forehead area houses an earphone sound outlet, a front camera, and other photosensitive devices; the middle screen area holds the display screen; and the lower key area carries one to three physical keys. With the development of technology, the lower key area has gradually been phased out, and the physical keys originally arranged there have been replaced by virtual keys in the display screen.
The earphone sound outlet, front camera, and other photosensitive devices in the forehead area are essential to the phone's functions and cannot easily be removed, which makes it very difficult to extend the display area of the screen over the forehead area. After a series of studies, the inventors found that an opening can be formed in the display screen and the photosensitive devices of the forehead area moved into it, so that the original forehead area becomes part of the displayable area of the screen, enlarging that area. It can be understood that the displayable area is the area that can be lit up for display.
Illustratively, the opening may be disposed on one or more edges of the display screen or in a non-edge region of the display screen, and it may be semicircular, rectangular, rounded-rectangular, circular, regular-polygonal, irregular, or the like. Illustratively, referring to fig. 1, the opening 140 may be a circular notch formed in a non-edge area of the display screen 130, where the hole formed by the circular notch is used to accommodate, as the photosensitive device, at least one front panel component among a projection component, a camera, a proximity light sensor, a receiver, a distance sensor, an ambient light brightness sensor, a temperature sensor, and a pressure sensor. Alternatively, the photosensitive device may be disposed below the display screen at a position corresponding to the opening 140; for example, the front camera is disposed below the display screen, aligned with the opening 140 in the thickness direction of the display screen, so that ambient light can enter the front camera through the opening to support photosensitive imaging. Referring to fig. 2, the opening 140 may be a semicircular or V-shaped notch formed at an edge of the display screen 130, with the hole formed by the notch accommodating the same kinds of front panel components as the photosensitive device.
Alternatively, the photosensitive device may be disposed below the display screen at a position corresponding to the opening 140; for example, the front camera is disposed below the display screen, aligned with the opening 140 in the thickness direction of the display screen, so that ambient light can enter the front camera through the opening to support photosensitive imaging. Referring to fig. 3, the opening 140 may be a semicircular notch formed in a non-edge area of the display screen 130, accommodating the same kinds of front panel components as the photosensitive device, with the same below-screen arrangement available as an alternative.
In addition to the middle area of the display screen, the opening may be formed in an edge area of the display screen. For example, referring to fig. 4, the opening 140 may be a circular notch formed in a left edge area of the display screen 130, where the hole formed by the circular notch is used to accommodate, as the photosensitive device, at least one front panel component among a projection component, a camera, a proximity sensor, an earpiece, a distance sensor, an ambient light brightness sensor, a temperature sensor, and a pressure sensor. Alternatively, the photosensitive device may be disposed below the display screen at a position corresponding to the opening 140; for example, the front camera is disposed below the display screen, aligned with the opening 140 in the thickness direction of the display screen, so that ambient light can enter the front camera through the opening to support photosensitive imaging. Referring to fig. 5, the opening 140 may likewise be a circular notch formed in a right edge region of the display screen 130, accommodating the same kinds of front panel components as the photosensitive device, with the same below-screen arrangement available as an alternative.
However, the inventors found through research that, because of photosensitive devices such as the front camera, an open area currently has to be reserved on the display screen to support those devices' functions; this encroaches on the display area, degrades the display effect, and leads to a poor user experience. In addition, when the electronic device shares information with another electronic device, a screen-casting window of the electronic device must be displayed on the other device and the information then selected from that window for sharing; this approach occupies desktop space, limits the usage scenarios, and involves complicated operation, so the convenience and visual experience of information sharing are poor. To address these problems, the inventors propose the interaction method, device, electronic device, and storage medium described herein: by enlisting the opening in the display screen in information sharing, the opening is given a new purpose, and the convenience of information sharing and the user's visual experience are improved. The interaction method is described in detail in the following embodiments.
Referring to fig. 6, fig. 6 is a schematic flow chart of an interaction method according to an embodiment of the present application. The method enlists the opening in the display screen in information sharing, giving the opening a new purpose and improving the convenience of information sharing and the user's visual experience. In a specific embodiment, the interaction method may be applied to the interaction device 200 shown in fig. 20 and to the first electronic device 100 (fig. 21) configured with the interaction device 200. The following describes the flow of this embodiment taking an electronic device as an example; it will be understood that the electronic device in this embodiment may be a smart phone, a tablet computer, a wearable electronic device, or the like, which is not limited herein. The flow shown in fig. 6 is described in detail below. In this embodiment, the first electronic device may include a display screen having an opening, and the interaction method may specifically include the following steps:
Step S110: and displaying the current interface.
In this embodiment, the electronic device may display the current interface. The current interface may be an interface displaying any content, or the current interface may be an interface displaying any operable (e.g., draggable) content, which is not limited herein.
In some implementations, the electronic device can display the current interface in response to an interface display instruction. Optionally, the electronic device may determine that an interface display instruction has been received, and display the current interface, in any of the following cases: target voice information is received; a target touch operation on a target physical key of the electronic device is detected; a target touch operation on a target virtual key of the electronic device is detected; a target shaking operation (such as a lifting operation) on the electronic device is detected; a target sliding operation on the display screen of the electronic device is detected; or the environment of the electronic device is detected to meet a preset environmental condition (such as the current time reaching a preset time, the current location being a preset place, or the current temperature reaching a preset temperature). None of these is limiting.
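The trigger conditions above amount to a dispatch from input events to the display action. A minimal, hypothetical sketch follows; the event names and the `show_current_interface` callback are illustrative stand-ins, not from the patent:

```python
# Hypothetical sketch: any of several input events counts as an
# "interface display instruction". Event names are illustrative only.
DISPLAY_TRIGGERS = {
    "target_voice",        # recognized target voice information
    "physical_key_touch",  # target touch on a target physical key
    "virtual_key_touch",   # target touch on a target virtual key
    "shake_or_lift",       # target shaking operation (e.g. lifting)
    "target_slide",        # target sliding operation on the display
    "environment_met",     # preset environmental condition satisfied
}

def handle_event(event: str, show_current_interface) -> bool:
    """Return True if the event was treated as a display instruction."""
    if event in DISPLAY_TRIGGERS:
        show_current_interface()
        return True
    return False
```

Any event outside the set is ignored, which mirrors the patent's open-ended "not limited herein" list of triggers.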
In some embodiments, the current interface displayed by the electronic device may include a system desktop, a negative-one-screen interface, a lock screen interface, a chat interface, a video playback interface, a browser interface, an album interface, and the like, without limitation. The content displayed by the current interface includes, but is not limited to, text, pictures, and characters.
As an implementation, the current interface displayed by the electronic device may be a full-screen display interface or a non-full-screen display interface. A full-screen display interface means that the displayed content of the current interface includes the peripheral area of the opening, such as the areas to the left of, to the right of, and above the opening; a non-full-screen display interface means that the displayed content does not include that peripheral area, which is not limited herein.
Step S120: and if the drag operation acting on the current interface is detected, determining a drag object corresponding to the drag operation from the current interface, and controlling the drag object to move based on the drag operation.
In this embodiment, the electronic device may determine, while displaying the current interface, whether a drag operation acting on the current interface is detected. If a drag operation acting on the current interface is detected, the electronic device responds to it by determining a drag object corresponding to the drag operation from the current interface and controlling the drag object to move based on the drag operation. If no drag operation on the current interface is detected, the current interface may be maintained and detection may continue.
In some embodiments, while displaying the current interface, the electronic device may detect a drag operation acting on the current interface in real time, at a preset time interval, at a preset time point, or according to other preset rules, and so on, which is not limited herein.
Optionally, in this embodiment, the drag object may include text displayed on the current interface, a picture displayed on the current interface, a video displayed on the current interface, a file displayed on the current interface, a website displayed on the current interface, an application window displayed on the current interface, a picture-in-picture displayed on the current interface, a window of the current interface, and the like, which are not limited herein.
In some embodiments, while the current interface is displayed, the user may select the object to be shared as the drag object through a pressing operation on the current interface, and then move the drag object through the drag operation. As an implementation, while the current interface is displayed, the object to be shared may be selected as the drag object by a long press, after which dragging the selected object controls its movement.
In some implementations, determining the drag object corresponding to the drag operation from the current interface can include: determining the object selected by the pressing and dragging operations from the current interface, and determining that object as the drag object. For example, if the selected object is a file in the current interface, the file may be determined to be the drag object; if the selected object is the window of the current interface, the current interface may be determined to be the drag object; if the selected object is a picture-in-picture in the current interface, the picture-in-picture may be determined to be the drag object.
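The selection step above is essentially a hit test: the object whose on-screen bounds contain the press point becomes the drag object. A minimal sketch, assuming axis-aligned bounding boxes; the object representation is a hypothetical illustration, not the patent's data model:

```python
# Hypothetical sketch: pick the drag object as the interface object
# whose bounding box contains the press point. Representation is
# illustrative only.
from dataclasses import dataclass

@dataclass
class InterfaceObject:
    name: str
    x: float   # left edge of the bounding box
    y: float   # top edge of the bounding box
    w: float   # width
    h: float   # height

def pick_drag_object(objects, press_x, press_y):
    """Return the first object under the press point, or None."""
    for obj in objects:
        if obj.x <= press_x <= obj.x + obj.w and obj.y <= press_y <= obj.y + obj.h:
            return obj
    return None  # the press landed on no draggable object
```

If the press lands on the interface's own window rather than a contained element, the whole current interface would be returned as the drag object, matching the window example above.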
In some implementations, the drag operation on the current interface can include: the single-finger drag operation acting on the current interface or the multi-finger drag operation acting on the current interface is not limited herein.
It can be understood that controlling the drag object to move based on the drag operation may include: controlling the drag object to move along the movement trajectory of the drag operation, i.e., the trajectory of the drag object on the display screen matches the trajectory of the drag operation on the display screen; and controlling the drag object to move at the movement speed of the drag operation, i.e., the speed of the drag object on the display screen matches the speed of the drag operation on the display screen.
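Keeping the object's trajectory and speed consistent with the drag amounts to mapping each touch sample to an object position one-to-one. A minimal sketch under that assumption (the grab-offset parameter is an illustrative detail, not from the patent):

```python
# Hypothetical sketch: the drag object's position tracks the touch
# point sample-for-sample, so its trajectory and speed on the display
# screen match those of the drag operation.
def follow_drag(touch_trajectory, grab_offset=(0.0, 0.0)):
    """Map a chronological list of touch points (x, y) to object
    positions, preserving trajectory and per-sample velocity."""
    dx, dy = grab_offset  # offset of the initial touch inside the object
    return [(tx - dx, ty - dy) for tx, ty in touch_trajectory]
```

Because each touch sample yields exactly one object position, the distance covered between consecutive samples (and hence the apparent speed) is identical for the finger and the object.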
Taking the drag object being the window of the current interface as an example, refer to fig. 7 and fig. 8: fig. 7 shows a first interaction schematic diagram of an electronic device provided in an embodiment of the present application, and fig. 8 shows a second interaction schematic diagram. As shown in fig. 7, the first electronic device displays the current interface, and the user may select the window of the current interface as drag object A by a single-finger upward slide, as shown in fig. 8.
Step S130: and if the drag object moves into the opening based on the drag operation, sharing the drag object to a second electronic device.
In this embodiment, in the process of controlling the movement of the drag object based on the drag operation, it may be detected whether the drag object moves into the opening based on the drag operation. If the drag object is detected to move into the opening based on the drag operation, the drag object can be shared to the second electronic device. If the drag object is detected not to move into the opening based on the drag operation, the drag object may not be shared to the second electronic device.
In some embodiments, the electronic device may be preconfigured with the position of the opening and store it. Then, while controlling the movement of the drag object based on the drag operation, the electronic device can detect the position of the drag object and determine whether it coincides with the position of the opening. If the two positions coincide, the drag object is determined to have moved into the opening based on the drag operation; otherwise, it is determined not to have moved into the opening.
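The coincidence check above can be sketched as a point-in-circle test against the stored hole position. The coordinates and radius below are illustrative assumptions; the patent only requires that the opening's position be preconfigured and stored:

```python
import math

# Hypothetical sketch of the drop test: the opening's centre and
# radius are preconfigured; the drag object is "in the opening" when
# its position coincides with the hole's circle. Values illustrative.
HOLE_CENTER = (540.0, 60.0)  # preset hole-centre coordinates (px)
HOLE_RADIUS = 30.0           # preset hole radius (px)

def moved_into_opening(drag_x: float, drag_y: float) -> bool:
    """True when the drag position lies within the opening."""
    hx, hy = HOLE_CENTER
    return math.hypot(drag_x - hx, drag_y - hy) <= HOLE_RADIUS
```

For a non-circular notch (semicircular, rectangular, V-shaped, as in figs. 1–5), the same check would be performed against that shape's bounds instead of a circle.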
In some embodiments, there may be one or more second electronic devices. When there are multiple second electronic devices, they may be of the same type (for example, all computers or all tablets) or of different types (for example, a mix of computers, tablets, and so on), which is not limited herein. In this embodiment, when there is one second electronic device, the drag object may be shared to that device; when there are multiple second electronic devices, the drag object may be shared to each of them.
In some implementations, sharing the drag object to the second electronic device may include: sharing the drag object to the second electronic device by file transfer; or sharing the drag object to the second electronic device by screen casting.
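The two sharing modes, fanned out over one or more second devices, can be sketched as follows. The transport calls here are placeholders, not a real transfer or casting API; the patent does not specify the underlying protocol:

```python
# Hypothetical sketch: once the drop into the opening is detected,
# share the drag object to every second electronic device, either by
# file transfer or by screen casting. Transports are stand-ins.
def share_drag_object(drag_object, second_devices, mode="file_transfer"):
    """Share drag_object to each device; return per-device results."""
    results = []
    for device in second_devices:
        if mode == "file_transfer":
            # placeholder for an actual file-transfer call
            results.append((device, "sent_file", drag_object))
        elif mode == "screen_cast":
            # placeholder for an actual screen-casting call
            results.append((device, "cast", drag_object))
        else:
            raise ValueError(f"unknown sharing mode: {mode}")
    return results
```

With a single second device the loop runs once; with several devices of the same or different types, each receives the object independently, matching the one-or-many behaviour described above.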
Referring to figs. 9-11: fig. 9 shows a first interface schematic diagram of an electronic device provided in an embodiment of the present application, fig. 10 a second, and fig. 11 a third. As shown in fig. 9, the current interface of the electronic device displays a drag object A, which can be moved toward the opening by the drag operation; in the process it may pass through the position shown in fig. 10 and then continue under the drag operation until it moves into the opening, as shown in fig. 11.
According to the interaction method provided by the embodiment of the present application, the current interface is displayed; if a drag operation acting on the current interface is detected, the drag object corresponding to the drag operation is determined from the current interface and controlled to move based on the drag operation; and if the drag object moves into the opening of the display screen of the electronic device based on the drag operation, the drag object is shared to the second electronic device. By coordinating the opening in the display screen with information sharing, a new use is given to the opening, improving both the convenience of information sharing and the visual experience of the user.
Referring to fig. 12, fig. 12 is a flow chart illustrating an interaction method according to an embodiment of the present application. The interaction method is applied to the first electronic device, and the first electronic device comprises a display screen, wherein the display screen is provided with an opening. The flow shown in fig. 12 is described in detail below; the interaction method may specifically include the following steps:
Step S210: displaying the current interface.
Step S220: if a drag operation acting on the current interface is detected, determining a drag object corresponding to the drag operation from the current interface, and controlling the drag object to move based on the drag operation.
The specific description of step S210 to step S220 refers to step S110 to step S120, and will not be repeated here.
Step S230: if the drag object moves to a first target position based on the drag operation, detecting the stationary duration of the drag object at the first target position, wherein the distance between the first target position and the position of the opening is within a first preset distance range.
Optionally, the electronic device may preset and store the first target position, where the distance between the first target position and the position of the opening is within the first preset distance range. That is, because this distance is within the first preset distance range, the first target position is located in the vicinity of the opening, i.e., relatively close to it.
In this embodiment, while controlling the movement of the drag object based on the drag operation, it may be detected whether the drag object moves to the first target position. If the drag object is detected to move to the first target position, this indicates that the drag object has moved into the vicinity of the opening, and the stationary duration of the drag object at the first target position may be detected. If the drag object is not detected to move to the first target position, this indicates that the drag object has not moved into the vicinity of the opening, and the stationary duration need not be detected.
In some embodiments, when the drag object is detected to move to the first target position based on the drag operation, it may be detected whether the position of the drag object changes. If the position of the drag object remains fixed, or changes only within a preset change range, the drag object is considered stationary at the first target position, and its stationary duration at the first target position may be accumulated; if the change of its position exceeds the preset change range, the drag object is not considered stationary at the first target position, and the accumulation of the stationary duration may be terminated.
In some embodiments, the electronic device may include a clock, and the stationary duration of the drag object at the first target position may be obtained by the clock.
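The accumulation of the stationary duration described above can be sketched as follows, assuming the clock supplies timestamped position samples while the drag object is at the first target position. The sample format, the `tolerance` value standing in for the preset change range, and the function name are illustrative assumptions.

```python
def stationary_duration(samples, tolerance=5.0):
    """Accumulate how long the drag object stays still at the first target
    position.

    samples: list of (timestamp_s, x, y) tuples reported by the clock while
    the drag object is at the first target position. Accumulation terminates
    as soon as the position drifts outside `tolerance` pixels (the "preset
    change range") of the first sample.
    """
    if not samples:
        return 0.0
    _, x0, y0 = samples[0]
    duration, prev_t = 0.0, samples[0][0]
    for t, x, y in samples[1:]:
        if abs(x - x0) <= tolerance and abs(y - y0) <= tolerance:
            duration += t - prev_t  # still within the preset change range
            prev_t = t
        else:
            break                   # moved away: terminate the accumulation
    return duration

# Samples jitter by a pixel or two, so the whole 3.5 s counts as stationary,
# which would reach a 3 s duration threshold and trigger the sharing flow.
samples = [(0.0, 200, 80), (1.0, 202, 81), (2.0, 201, 79), (3.5, 203, 80)]
print(stationary_duration(samples) >= 3.0)  # duration threshold reached
```

In practice the comparison against the duration threshold (step S240) would run continuously as samples arrive, rather than on a completed list.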
Step S240: if the stationary duration reaches a duration threshold, moving the drag object from the first target position into the opening, and sharing the drag object to a second electronic device.
In some embodiments, the electronic device may preset and store a duration threshold, which serves as the basis for evaluating the stationary duration. Therefore, in this embodiment, once the stationary duration of the drag object at the first target position is obtained, it may be compared with the duration threshold to determine whether it reaches the duration threshold. If the stationary duration reaches the duration threshold, the drag object may be automatically moved from the first target position into the opening and shared to the second electronic device. If the stationary duration does not reach the duration threshold, the drag object may remain displayed at the first target position.
As an implementation manner, when the stationary duration reaches the duration threshold, the drag object may be sucked into the opening, and the drag object may be shared to the second electronic device.
Optionally, the duration threshold may be, for example, 3 s or 4 s, which is not limited herein.
Referring to fig. 13, fig. 13 is a flow chart illustrating step S240 of the interaction method shown in fig. 12 of the present application. The flow shown in fig. 13 is described in detail below; the method may specifically include the following steps:
Step S241: if the stationary duration reaches a duration threshold, moving the drag object from the first target position into the opening, and displaying a target interface, wherein the target interface includes identifiers corresponding to a plurality of electronic devices to be selected, and the electronic devices to be selected have an association relationship with the first electronic device.
In some embodiments, if it is determined that the stationary duration reaches the duration threshold, the drag object may be automatically moved from the first target position into the opening, and a target interface may be displayed. That is, the electronic device may jump from the current interface to a new interface, i.e., the target interface. It will be appreciated that throughout this process, the user keeps pressing and dragging the drag object (without releasing it).
As an implementation manner, the target interface may be an infinite interface, and zooming of the target interface may be achieved by a single-finger up-and-down sliding operation acting on the target interface.
In some embodiments, the target interface may include identifiers corresponding to a plurality of electronic devices to be selected, where the plurality of electronic devices to be selected have an association relationship with the first electronic device. Optionally, the association relationship between an electronic device to be selected and the first electronic device may include: the electronic device to be selected is connected with the first electronic device, for example, the two are cooperative devices such as a smart phone, a tablet computer, a personal computer, or a smart television; and/or the electronic device to be selected and the first electronic device are logged in to the same account, such as a smart screen, a printer, or a router in the home, which is not limited herein.
As an implementation manner, if the stationary duration reaches the duration threshold, the drag object may be automatically moved from the first target position into the opening; an electronic device connected with the first electronic device may be determined as an electronic device to be selected, and/or an electronic device logged in to the same account as the first electronic device may be determined as an electronic device to be selected, so that a plurality of electronic devices to be selected are determined. Once the plurality of electronic devices to be selected are determined, the identifiers corresponding to them may be acquired and displayed in the target interface.
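Assembling the list of electronic devices to be selected, as described above, might look like the following sketch. The device records, field names, and account values are hypothetical; a real implementation would query the system's connection manager and account service rather than a hard-coded list.

```python
def candidate_devices(devices, own_account):
    """Return the devices to be selected: those currently connected to the
    first electronic device and/or logged in to the same account."""
    return [d for d in devices
            if d["connected"] or d["account"] == own_account]

# Hypothetical device records standing in for discovery results.
devices = [
    {"name": "tablet",   "connected": True,  "account": "user_a"},
    {"name": "smart_tv", "connected": False, "account": "user_a"},
    {"name": "printer",  "connected": False, "account": "user_b"},
]
# Both association relationships qualify: connection, or a shared account.
print([d["name"] for d in candidate_devices(devices, "user_a")])
```

The identifiers shown on the target interface would then be built from the returned records (icon, name, and so on).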
Optionally, the identifiers respectively corresponding to the plurality of electronic devices to be selected may include icons, appearance images, text, illustrations, and the like corresponding to the plurality of electronic devices to be selected, which is not limited herein.
In some embodiments, the target interface may also display a screen of the electronic device, where the displayed screen may be the target interface itself, or the screen shown before entering the target interface, and the like, which is not limited herein.
In some implementations, the target interface can also include an application page of an application program running on the first electronic device. Based on this, if the drag object moves into the application page on the target interface based on the drag operation, the drag object may be added to the application program. Optionally, while jumping from displaying the current interface to displaying the target interface, the drag operation acting on the drag object may be maintained throughout. On this basis, when the target interface is displayed, the drag object may move on the target interface based on the drag operation; when the user wants to add the drag object to the application page, the drag object may be dragged into that page, and accordingly, when the first electronic device detects that the drag object has moved into the application page based on the drag operation, it may add the drag object to the application page in response.
As an implementation manner, if the first electronic device runs a single application program, the application page of that application program may be displayed on the target interface. Then, when the drag object moves to that application page based on the drag operation, the drag object may be added to that application program.
As still another implementation manner, if the first electronic device runs a plurality of application programs, the application pages corresponding to the plurality of application programs may be displayed on the target interface. The drag object may then be moved, based on the drag operation, to one or several of these application pages. When the drag object moves to one application page, the drag object may be added to the application program corresponding to that page; when the drag object moves to several application pages in succession, the drag object may be added to each of the application programs corresponding to those pages.
Referring to fig. 14, fig. 14 shows a fourth interface schematic diagram of an electronic device according to an embodiment of the present application. As shown in fig. 14, the electronic device may display a target interface, where the target interface may include a screen B of the first electronic device, an identifier C corresponding to the electronic device to be selected, and an application page D of an application running on the first electronic device.
Step S242: determining the second electronic device from the plurality of electronic devices to be selected.
In this embodiment, in the process of displaying the identifiers corresponding to each of the plurality of electronic devices to be selected on the target interface, one or more electronic devices to be selected may be determined as the second electronic device from the plurality of electronic devices to be selected.
In some embodiments, in the process of displaying the identifiers corresponding to the plurality of electronic devices to be selected on the target interface, the second electronic device may be determined from the plurality of electronic devices to be selected by means of voice, or the second electronic device may be determined from the plurality of electronic devices to be selected by means of touch operation, which is not limited herein.
As an implementation manner, if, on the target interface, the drag object moves based on the drag operation to a target identifier among the identifiers corresponding to the plurality of electronic devices to be selected, the electronic device to be selected corresponding to the target identifier is determined as the second electronic device. Optionally, while the electronic device jumps from displaying the current interface to displaying the target interface, the user may keep the drag operation acting on the drag object throughout. On this basis, when the target interface is displayed, the drag object may move on the target interface based on the drag operation; when the user wants to designate one of the electronic devices to be selected as the second electronic device, the drag object may be dragged to the identifier corresponding to that device, and accordingly, when the first electronic device detects that the drag object has moved to the target identifier based on the drag operation, it may determine the corresponding electronic device to be selected as the second electronic device in response.
Optionally, the drag object moving to the target identifier based on the drag operation may include: the drag object moves onto the target identifier on the target interface based on the drag operation, or the drag object moves into the vicinity of the target identifier based on the drag operation.
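The matching of the drag object against a target identifier, including the "vicinity" case above, can be sketched as a nearest-identifier test. The coordinate fields and the `near` radius are illustrative assumptions.

```python
import math

def matched_identifier(drag_x, drag_y, identifiers, near=40.0):
    """Return the identifier the drag object moved onto or near (within
    `near` pixels), preferring the closest one; None if none qualifies."""
    best, best_d = None, near
    for ident in identifiers:
        d = math.hypot(drag_x - ident["x"], drag_y - ident["y"])
        if d <= best_d:
            best, best_d = ident, d
    return best

# Hypothetical identifier positions laid out on the target interface.
identifiers = [
    {"device": "tablet",   "x": 120.0, "y": 300.0},
    {"device": "smart_tv", "x": 360.0, "y": 300.0},
]
target = matched_identifier(130.0, 310.0, identifiers)
print(target["device"])  # the drag object ended up near the tablet's identifier
```

The matched identifier's device would then be determined as the second electronic device (step S242).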
Step S243: sharing the drag object to the second electronic device.
In this embodiment, in the case where the second electronic device is determined from the plurality of electronic devices to be selected, the drag object may be shared to the second electronic device.
According to the interaction method provided by the embodiment of the present application, the current interface is displayed; if a drag operation acting on the current interface is detected, the drag object corresponding to the drag operation is determined from the current interface and controlled to move based on the drag operation; if the drag object moves to the first target position based on the drag operation, the stationary duration of the drag object at the first target position is detected, wherein the distance between the first target position and the position of the opening is within a first preset distance range; and if the stationary duration reaches a duration threshold, the drag object is moved from the first target position into the opening and shared to the second electronic device. Compared with the interaction method shown in fig. 6, this embodiment further enters a new interface according to the stationary duration of the drag object and selects the sharing device in the new interface, which can improve the interaction experience of the user.
Referring to fig. 15, fig. 15 is a schematic flow chart of an interaction method according to an embodiment of the present application. The interaction method is applied to the first electronic device, and the first electronic device comprises a display screen, wherein the display screen is provided with an opening. In this embodiment, the first electronic device points to the second electronic device. The flow shown in fig. 15 is described in detail below; the interaction method may specifically include the following steps:
Step S310: displaying the current interface.
Step S320: if a drag operation acting on the current interface is detected, determining a drag object corresponding to the drag operation from the current interface, and controlling the drag object to move based on the drag operation.
For the specific description of step S310 to step S320, refer to step S110 to step S120; details are not repeated here.
Step S330: if a release operation is detected when the drag object moves into the opening based on the drag operation, sharing the drag object to the second electronic device.
In this embodiment, the first electronic device is pointed towards the second electronic device. Alternatively, the first electronic device may include an Ultra Wide Band (UWB) module, and the first electronic device may determine its pointing direction through the included UWB module, that is, may determine whether it points to the second electronic device through the included UWB module. Alternatively, the first electronic device may include a camera (e.g., front camera, rear camera, etc.), and the first electronic device may determine its orientation through the included camera, i.e., may determine whether it is oriented toward the second electronic device through the included camera.
As an implementation manner, in a case that the first electronic device points to the second electronic device, it may be determined whether the drag object moves into the opening based on the drag operation, and whether a release operation is detected when the drag object moves into the opening based on the drag operation. If it is determined that the drag object moves into the opening based on the drag operation, and it is determined that the release operation is detected when the drag object moves into the opening based on the drag operation, it may be considered that information sharing is triggered, and the drag object may be shared to the second electronic device.
As yet another implementation, it may be determined whether the drag object moves into the aperture based on the drag operation, and whether a release operation is detected when the drag object moves into the aperture based on the drag operation. If it is determined that the drag object moves into the opening based on the drag operation, and if it is determined that the release operation is detected when the drag object moves into the opening based on the drag operation, whether the first electronic device points to the second electronic device may be detected, and if it is detected that the first electronic device points to the second electronic device, information sharing may be considered to be triggered, and the drag object may be shared to the second electronic device.
Referring to fig. 16, fig. 16 shows a schematic view of a scenario application of an electronic device according to an embodiment of the present application. As shown in fig. 16, if the first electronic device 100 points to the second electronic device (notebook computer) 400, the first electronic device 100 may share the drag object a determined by the user to the notebook computer 400; if the first electronic device 100 points to the second electronic device (smart tv) 500, the first electronic device 100 may share the drag object a determined by the user to the smart tv 500.
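The selection in fig. 16 (sharing to the notebook computer or the smart television depending on where the first electronic device points) can be sketched as follows, assuming the UWB module reports an azimuth angle for each nearby device. The data format, field names, and angular tolerance are hypothetical; real UWB ranging APIs expose this differently.

```python
def pointed_device(nearby_devices, tolerance_deg=15.0):
    """Return the device whose reported azimuth is closest to straight ahead
    (0 degrees) and within the angular tolerance, else None."""
    best = None
    for device in nearby_devices:
        azimuth = abs(device["azimuth_deg"])
        if azimuth <= tolerance_deg and (best is None
                                         or azimuth < abs(best["azimuth_deg"])):
            best = device
    return best

# Pointing roughly at the notebook computer, as in the first case of fig. 16.
nearby = [
    {"name": "notebook_computer", "azimuth_deg": 4.0},
    {"name": "smart_tv",          "azimuth_deg": 62.0},
]
print(pointed_device(nearby)["name"])  # notebook_computer
```

Combined with the release check of step S330, sharing would fire only when the drag object is in the opening, a release is detected, and this function returns a device.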
According to the interaction method provided by the embodiment of the present application, the current interface is displayed; if a drag operation acting on the current interface is detected, the drag object corresponding to the drag operation is determined from the current interface and controlled to move based on the drag operation; and, with the first electronic device pointing to the second electronic device, if a release operation is detected when the drag object moves into the opening based on the drag operation, the drag object is shared to the second electronic device. Compared with the interaction method shown in fig. 6, this embodiment further determines the sharing device according to the drag operation and the pointing direction of the electronic device, thereby improving the efficiency of information sharing.
Referring to fig. 17, fig. 17 is a schematic flow chart of an interaction method according to an embodiment of the present application. The interaction method is applied to the first electronic device, and the first electronic device comprises a display screen, wherein the display screen is provided with an opening. The flow shown in fig. 17 is described in detail below; the interaction method may specifically include the following steps:
Step S410: displaying the current interface.
Step S420: if a drag operation acting on the current interface is detected, determining a drag object corresponding to the drag operation from the current interface, and controlling the drag object to move based on the drag operation.
Step S430: if the drag object moves into the opening based on the drag operation, sharing the drag object to a second electronic device.
For the specific description of step S410 to step S430, refer to step S110 to step S130; details are not repeated here.
Step S440: if the drag object moves into the opening based on the drag operation, displaying a preset animation effect at a second target position, wherein the distance between the second target position and the position of the opening is within a second preset distance range.
Optionally, the electronic device may preset and store the second target position, where the distance between the second target position and the position of the opening is within the second preset distance range. That is, because this distance is within the second preset distance range, the second target position is located in the vicinity of the opening, i.e., relatively close to it. The first target position and the second target position may be the same or different, which is not limited herein.
In some embodiments, the electronic device may preset and store a preset animation effect, which may include a light halo, light particles, and the like, which is not limited herein. Based on this, in this embodiment, if the drag object moves into the opening based on the drag operation, the preset animation effect may be displayed at the second target position, so as to improve the display effect during information sharing.
According to the interaction method provided by the embodiment of the present application, the current interface is displayed; if a drag operation acting on the current interface is detected, the drag object corresponding to the drag operation is determined from the current interface and controlled to move based on the drag operation; if the drag object moves into the opening based on the drag operation, the drag object is shared to the second electronic device; and if the drag object moves into the opening based on the drag operation, a preset animation effect is displayed at the second target position, where the distance between the second target position and the position of the opening is within a second preset distance range. Compared with the interaction method shown in fig. 6, displaying the preset animation effect around the opening can improve the display effect during information sharing.
Referring to fig. 18, fig. 18 is a flow chart illustrating an interaction method according to an embodiment of the present application. The interaction method is applied to the first electronic device, and the first electronic device comprises a display screen, wherein the display screen is provided with an opening. The flow shown in fig. 18 is described in detail below; the interaction method may specifically include the following steps:
Step S510: displaying the current interface.
Step S520: if a drag operation acting on the current interface is detected, determining a drag object corresponding to the drag operation from the current interface, and controlling the drag object to move based on the drag operation.
The specific description of step S510 to step S520 refers to step S110 to step S120, and is not repeated here.
Step S530: if the drag object moves into the opening based on the drag operation, determining a sharing mode corresponding to the drag object, wherein the sharing mode includes a file transmission mode or a screen projection mode.
In this embodiment, if the drag object moves into the opening based on the drag operation, a sharing mode corresponding to the drag object may be determined, where the sharing mode may include a file transmission mode or a screen projection mode. Optionally, if the drag object moves into the opening based on the drag operation, the sharing mode corresponding to the drag object may be determined based on the drag object itself.
In some embodiments, the first electronic device may preset and store a plurality of objects, a plurality of sharing modes, and correspondences between the plurality of objects and the plurality of sharing modes. A correspondence may map one object to one sharing mode, a plurality of objects to one sharing mode, and the like, which is not limited herein. In this case, when the drag object moves into the opening based on the drag operation, the sharing mode corresponding to the drag object may be determined based on the correspondences between the plurality of objects and the plurality of sharing modes.
For example, if the drag object is a picture, it may be determined that the sharing mode corresponding to the drag object is the file transmission mode; if the drag object is a file, the sharing mode may likewise be determined to be the file transmission mode; if the drag object is an application window, the sharing mode may be determined to be the screen projection mode; if the drag object is a picture-in-picture window, the sharing mode may be determined to be the screen projection mode; and if the drag object is the current interface window, the sharing mode may be determined to be the screen projection mode, and so on, which will not be enumerated further here.
Referring to fig. 19, fig. 19 is a flowchart illustrating step S530 of the interaction method shown in fig. 18 of the present application.
The flow shown in fig. 19 is described in detail below; the method may specifically include the following steps:
Step S531: if the drag object moves into the opening based on the drag operation, acquiring the type corresponding to the drag object.
In some embodiments, if the drag object moves into the opening based on the drag operation, the type corresponding to the drag object may be obtained.
As an implementation manner, the type corresponding to the drag object may include a window type or a non-window type. Therefore, in this embodiment, when the drag object moves into the opening based on the drag operation, it is possible to detect whether the type corresponding to the drag object is a window type or a non-window type.
Optionally, if the drag object is an application window, a picture-in-picture window, or a current interface window, it may be determined that the type corresponding to the drag object is a window type; if the drag object is a picture or a file, it may be determined that the type corresponding to the drag object is a non-window type.
Step S532: determining a sharing mode corresponding to the drag object based on the type corresponding to the drag object.
In this embodiment, when the type corresponding to the drag object is obtained, the sharing manner corresponding to the drag object may be determined based on the type corresponding to the drag object.
In some embodiments, if the type corresponding to the drag object is the window type, it may be determined that the sharing mode corresponding to the drag object is the screen projection mode; and if the type corresponding to the drag object is the non-window type, it may be determined that the sharing mode corresponding to the drag object is the file transmission mode.
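The two-step decision above (classify the drag object as window type or non-window type, then choose the sharing mode from the type) can be sketched as follows. The object names and mode strings are illustrative placeholders.

```python
# Window-type objects (application window, picture-in-picture window, current
# interface window) are shared by screen projection; non-window objects such
# as pictures and files are shared by file transmission.
WINDOW_OBJECTS = {"application_window", "picture_in_picture_window",
                  "current_interface_window"}

def object_type(drag_object: str) -> str:
    """Step S531: acquire the type corresponding to the drag object."""
    return "window" if drag_object in WINDOW_OBJECTS else "non_window"

def sharing_mode(drag_object: str) -> str:
    """Step S532: determine the sharing mode from the object's type."""
    if object_type(drag_object) == "window":
        return "screen_projection"
    return "file_transmission"

print(sharing_mode("picture_in_picture_window"))  # screen_projection
print(sharing_mode("picture"))                    # file_transmission
```

Step S540 would then dispatch to the file-transfer or screen-projection path according to the returned mode.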
Step S540: sharing the drag object to the second electronic device based on the sharing mode.
In this embodiment, once the sharing mode corresponding to the drag object is determined, the drag object may be shared to the second electronic device based on that sharing mode.
In some embodiments, if the sharing mode corresponding to the drag object is determined to be the file transmission mode, the drag object may be shared to the second electronic device through file transmission; if the sharing mode corresponding to the drag object is determined to be the screen projection mode, the drag object may be shared to the second electronic device through screen projection.
According to the interaction method provided by the embodiment of the present application, the current interface is displayed; if a drag operation acting on the current interface is detected, the drag object corresponding to the drag operation is determined from the current interface and controlled to move based on the drag operation; if the drag object moves into the opening based on the drag operation, a sharing mode corresponding to the drag object is determined, wherein the sharing mode includes a file transmission mode or a screen projection mode; and the drag object is shared to the second electronic device based on the sharing mode. Compared with the interaction method shown in fig. 6, this embodiment can further determine the sharing mode corresponding to the drag object and share the drag object based on that mode, thereby improving the diversity of information sharing.
Referring to fig. 20, fig. 20 is a block diagram of an interaction device according to an embodiment of the present application. The interaction device 200 is applied to the first electronic device, and the first electronic device includes a display screen, where the display screen has an opening. The block diagram shown in fig. 20 is described below; the interaction device 200 includes: a current interface display module 210, a drag object moving module 220, and a drag object sharing module 230, wherein:
The current interface display module 210 is configured to display a current interface.
The drag object moving module 220 is configured to, if a drag operation acting on the current interface is detected, determine a drag object corresponding to the drag operation from the current interface and control the drag object to move based on the drag operation.
The drag object sharing module 230 is configured to share the drag object to a second electronic device if the drag object moves into the opening based on the drag operation.
Further, the drag object sharing module 230 includes: a stationary duration detection sub-module and a first drag object sharing sub-module, wherein:
And the static duration detection sub-module is used for detecting the static duration of the drag object at a first target position if the drag object moves to the first target position based on the drag operation, wherein the distance between the first target position and the position of the opening is within a first preset distance range.
And the first drag object sharing sub-module is used for moving the drag object from the first target position into the opening and sharing the drag object to the second electronic device if the static duration reaches a duration threshold.
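The dwell behaviour described above could be sketched as follows (a hypothetical illustration; the threshold value, class name, and method names are assumptions — the patent only requires comparing a static duration against a duration threshold):

```python
import time

DURATION_THRESHOLD = 0.8  # seconds; illustrative value only

class DwellDetector:
    """Tracks how long the drag object stays still at the first target
    position near the opening."""

    def __init__(self, now=time.monotonic):
        self.now = now           # injectable clock, useful for testing
        self.still_since = None

    def update(self, moved, near_opening):
        """Call on every drag event. Returns True once the static
        duration at the first target position reaches the threshold."""
        if moved or not near_opening:
            self.still_since = None  # any movement resets the timer
            return False
        if self.still_since is None:
            self.still_since = self.now()
        return self.now() - self.still_since >= DURATION_THRESHOLD
```

Injecting the clock (`now`) keeps the logic deterministic and easy to test.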
Further, the first drag object sharing sub-module includes: the electronic device comprises a target interface display unit, a second electronic device determining unit and a dragging object sharing unit, wherein:
and the target interface display unit is used for moving the dragging object from the first target position to the opening and displaying a target interface if the static duration reaches a duration threshold, wherein the target interface comprises a plurality of identifiers corresponding to the electronic equipment to be selected, and the electronic equipment to be selected has an association relationship with the first electronic equipment.
And the second electronic equipment determining unit is used for determining the second electronic equipment from the plurality of electronic equipment to be selected.
Further, the second electronic device determining unit includes: a second electronic device determining subunit, wherein:
and the second electronic equipment determining subunit is used for determining the electronic equipment to be selected corresponding to the target identifier as the second electronic equipment if the drag object moves to the target identifier in the identifiers corresponding to the plurality of electronic equipment to be selected based on the drag operation at the target interface.
And the dragging object sharing unit is used for sharing the dragging object to the second electronic equipment.
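The hit test performed by the second electronic device determining subunit above could be sketched as follows (a minimal illustration; the rectangular identifier layout and all names are assumptions, not part of the patent):

```python
def pick_target_device(drag_pos, device_icons):
    """Return the id of the candidate electronic device whose identifier
    the drag object currently overlaps, or None if it overlaps none.

    device_icons: mapping of device_id -> (x, y, width, height) bounding
    box of that device's identifier on the target interface.
    """
    x, y = drag_pos
    for device_id, (bx, by, bw, bh) in device_icons.items():
        if bx <= x <= bx + bw and by <= y <= by + bh:
            return device_id
    return None
```

In the flow described above, a non-None result would make that candidate the second electronic device to share the drag object with.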
Further, the target interface further includes an application page of an application program running on the first electronic device, and the first drag object sharing submodule includes: a drag object adding unit in which:
and the drag object adding unit is used for adding the drag object to the application program if the drag object moves to the application page on the target interface based on the drag operation.
Further, the drag object sharing module 230 includes: the sharing mode determining sub-module and the second dragging object sharing sub-module, wherein:
And the sharing mode determining sub-module is used for determining a sharing mode corresponding to the drag object if the drag object moves into the opening based on the drag operation, wherein the sharing mode comprises a file transmission mode or a screen casting mode.
Further, the sharing mode determining submodule includes: a type acquisition unit and a sharing manner determination unit, wherein:
and the type acquisition unit is used for acquiring the type corresponding to the dragging object if the dragging object moves into the opening based on the dragging operation.
And the sharing mode determining unit is used for determining the sharing mode corresponding to the dragging object based on the type corresponding to the dragging object.
And the second dragging object sharing sub-module is used for sharing the dragging object to the second electronic equipment based on the sharing mode.
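A minimal sketch of how the type-based determination above might work (the type names, the mapping, and the default are illustrative assumptions; the patent only states that the sharing mode is determined from the drag object's type):

```python
FILE_TRANSMISSION = "file_transmission"
SCREEN_CASTING = "screen_casting"

# Hypothetical type-to-mode mapping: static files are sent via file
# transmission, while playable media is shared by screen casting.
_TYPE_TO_MODE = {
    "document": FILE_TRANSMISSION,
    "image": FILE_TRANSMISSION,
    "audio": SCREEN_CASTING,
    "video": SCREEN_CASTING,
}

def sharing_mode_for(drag_object_type):
    """Determine the sharing mode for a drag object from its type,
    defaulting to file transmission for unrecognized types."""
    return _TYPE_TO_MODE.get(drag_object_type, FILE_TRANSMISSION)
```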
Further, the interaction device 200 further includes: dynamic effect display module, wherein:
and the dynamic effect display module is used for displaying a preset dynamic effect at a second target position if the dragging object moves into the opening based on the dragging operation, wherein the distance between the second target position and the position of the opening is within a second preset distance range.
It will be clearly understood by those skilled in the art that, for convenience and brevity of description, the specific working process of the apparatus and modules described above may refer to the corresponding process in the foregoing method embodiment, which is not repeated herein.
In the several embodiments provided herein, the coupling of the modules to each other may be electrical, mechanical, or in other forms.
In addition, each functional module in each embodiment of the present application may be integrated into one processing module, or each module may exist alone physically, or two or more modules may be integrated into one module. The integrated modules may be implemented in hardware or in software functional modules.
Referring to fig. 21, a block diagram of an electronic device 100 according to an embodiment of the present application is shown. The electronic device 100 may be a smart phone, a tablet computer, an e-book reader, or another device capable of running application programs. The electronic device 100 in this application may include one or more of the following components: a processor 110, a memory 120, a display 130, and one or more application programs, where the one or more application programs may be stored in the memory 120 and configured to be executed by the one or more processors 110, and the one or more programs are configured to perform the methods described in the foregoing method embodiments.
The processor 110 may include one or more processing cores. The processor 110 connects various parts of the electronic device 100 through various interfaces and lines, and performs the functions of the electronic device 100 and processes data by running or executing instructions, programs, code sets, or instruction sets stored in the memory 120 and invoking data stored in the memory 120. Optionally, the processor 110 may be implemented in at least one hardware form of digital signal processing (Digital Signal Processing, DSP), field-programmable gate array (Field-Programmable Gate Array, FPGA), or programmable logic array (Programmable Logic Array, PLA). The processor 110 may integrate one or a combination of a central processing unit (Central Processing Unit, CPU), a graphics processing unit (Graphics Processing Unit, GPU), a modem, and the like. The CPU mainly handles the operating system, user interface, application programs, and the like; the GPU is responsible for rendering and drawing the content to be displayed; and the modem is used to handle wireless communication. It will be appreciated that the modem may also not be integrated into the processor 110 and may instead be implemented by a separate communication chip.
The memory 120 may include random access memory (Random Access Memory, RAM) or read-only memory (Read-Only Memory, ROM). The memory 120 may be used to store instructions, programs, code, code sets, or instruction sets. The memory 120 may include a program storage area and a data storage area, where the program storage area may store instructions for implementing an operating system, instructions for implementing at least one function (such as a touch function, a sound playing function, or an image playing function), instructions for implementing the foregoing method embodiments, and the like. The data storage area may store data created by the electronic device 100 in use (such as a phonebook, audio and video data, and chat log data), and the like.
The display 130 is used to display information input by the user, information provided to the user, and various graphical user interfaces of the electronic device 100. These graphical user interfaces may be formed by graphics, text, icons, numbers, video, and any combination thereof. In one example, the display 130 may be a liquid crystal display (Liquid Crystal Display, LCD) or an organic light-emitting diode (Organic Light-Emitting Diode, OLED) display, which is not limited herein.
Referring to fig. 22, a block diagram of a computer-readable storage medium according to an embodiment of the present application is shown. The computer-readable storage medium 300 stores program code that can be invoked by a processor to perform the methods described in the foregoing method embodiments.
The computer-readable storage medium 300 may be an electronic memory such as a flash memory, an EEPROM (electrically erasable programmable read-only memory), an EPROM, a hard disk, or a ROM. Optionally, the computer-readable storage medium 300 includes a non-transitory computer-readable storage medium. The computer-readable storage medium 300 has storage space for program code 310 that performs any of the method steps described above. The program code may be read from or written to one or more computer program products. The program code 310 may, for example, be compressed in a suitable form.
In summary, according to the interaction method, apparatus, electronic device, and storage medium provided by the embodiments of the present application, the current interface is displayed; if a drag operation acting on the current interface is detected, a drag object corresponding to the drag operation is determined from the current interface and controlled to move based on the drag operation; and if the drag object moves into the opening of the display screen of the electronic device based on the drag operation, the drag object is shared to a second electronic device. By associating the opening in the display screen with information sharing, the opening is given a new purpose, which improves both the convenience of information sharing and the user's visual experience.
Finally, it should be noted that the above embodiments are only intended to illustrate the technical solutions of the present application, not to limit them. Although the present application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art will appreciate that the technical solutions described in the foregoing embodiments may still be modified, or some of their technical features may be replaced by equivalents, and such modifications and substitutions do not cause the essence of the corresponding technical solutions to depart from the spirit and scope of the technical solutions of the embodiments of the present application.

Claims (14)

1. An interaction method, characterized by being applied to a first electronic device, the first electronic device comprising a display screen, the display screen having an opening, the method comprising:
displaying a current interface;
if the drag operation acting on the current interface is detected, determining a drag object corresponding to the drag operation from the current interface, and controlling the drag object to move based on the drag operation;
and if the drag object moves into the opening based on the drag operation, sharing the drag object to a second electronic device.
2. The method of claim 1, wherein the sharing the drag object to a second electronic device if the drag object moves into the opening based on the drag operation comprises:
If the drag object moves to a first target position based on the drag operation, detecting the static time length of the drag object at the first target position, wherein the distance between the first target position and the position of the opening is in a first preset distance range;
and if the static duration reaches a duration threshold, moving the dragging object from the first target position to the opening, and sharing the dragging object to a second electronic device.
3. The method of claim 2, wherein moving the drag object from the first target position into the opening and sharing the drag object to a second electronic device if the static duration reaches a duration threshold comprises:
if the static duration reaches a duration threshold, moving the dragging object from the first target position to the opening, and displaying a target interface, wherein the target interface comprises a plurality of identifiers corresponding to the electronic equipment to be selected, and the electronic equipment to be selected has an association relationship with the first electronic equipment;
determining the second electronic equipment from the plurality of electronic equipment to be selected;
And sharing the drag object to the second electronic equipment.
4. The method of claim 3, wherein the determining the second electronic device from the plurality of electronic devices to be selected comprises:
and if the drag object moves to a target identifier in identifiers corresponding to the plurality of electronic devices to be selected based on the drag operation on the target interface, determining the electronic device to be selected corresponding to the target identifier as the second electronic device.
5. The method of claim 3, wherein the association of the electronic device to be selected with the first electronic device comprises: the electronic equipment to be selected is connected with the first electronic equipment, and/or the electronic equipment to be selected and the first electronic equipment are/is logged in the same account.
6. The method of claim 3, wherein the target interface further comprises an application page of an application running on the first electronic device, the method further comprising:
and if the drag object moves to the application page on the target interface based on the drag operation, adding the drag object to the application program.
7. The method of claim 1, wherein, when the first electronic device points at the second electronic device, the sharing the drag object to the second electronic device if the drag object moves into the opening based on the drag operation comprises:
and if a release operation is detected when the drag object moves into the opening based on the drag operation, sharing the drag object to the second electronic device.
8. The method according to any one of claims 1-7, further comprising:
and if the drag object moves into the opening based on the drag operation, displaying a preset dynamic effect at a second target position, wherein the distance between the second target position and the position of the opening is within a second preset distance range.
9. The method of any of claims 1-7, wherein the sharing the drag object to a second electronic device if the drag object moves into the opening based on the drag operation comprises:
if the drag object moves into the opening based on the drag operation, determining a sharing mode corresponding to the drag object, wherein the sharing mode comprises a file transmission mode or a screen casting mode;
And sharing the dragging object to the second electronic equipment based on the sharing mode.
10. The method of claim 9, wherein determining the sharing manner corresponding to the drag object if the drag object moves into the opening based on the drag operation comprises:
if the dragging object moves into the opening based on the dragging operation, acquiring a type corresponding to the dragging object;
and determining a sharing mode corresponding to the dragging object based on the type corresponding to the dragging object.
11. The method of any of claims 1-7, wherein the first electronic device further comprises a photosensitive device, and the opening corresponds in position to the photosensitive device.
12. An interaction apparatus, characterized by being applied to a first electronic device, the first electronic device comprising a display screen, the display screen having an opening, the apparatus comprising:
the current interface display module is used for displaying a current interface;
the drag object moving module is used for determining a drag object corresponding to the drag operation from the current interface and controlling the drag object to move based on the drag operation if the drag operation acting on the current interface is detected;
And the drag object sharing module is used for sharing the drag object to the second electronic equipment if the drag object moves into the opening based on the drag operation.
13. An electronic device comprising a display screen, a memory, and a processor, the display screen and the memory coupled to the processor, the memory storing instructions that when executed by the processor perform the method of any of claims 1-11.
14. A computer readable storage medium, characterized in that the computer readable storage medium has stored therein a program code, which is callable by a processor for executing the method according to any one of claims 1-11.
CN202211721131.XA 2022-12-30 2022-12-30 Interaction method, device, electronic equipment and storage medium Pending CN116301538A (en)


Publications (1)

Publication Number Publication Date
CN116301538A 2023-06-23


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination