CN115033142B - Application interaction method and electronic equipment - Google Patents


Info

Publication number
CN115033142B
CN115033142B (application CN202111340669.1A)
Authority
CN
China
Prior art keywords
application
interface
target text
displaying
text
Prior art date
Legal status (assumed; not a legal conclusion)
Active
Application number
CN202111340669.1A
Other languages
Chinese (zh)
Other versions
CN115033142A (en)
Inventor
王晨博
卫渊
周元甲
刘秋冶
伍国林
Current Assignee (listed assignees may be inaccurate)
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date (assumed; not a legal conclusion)
Filing date
Publication date
Application filed by Honor Device Co Ltd
Priority to CN202111340669.1A
Publication of CN115033142A
Application granted
Publication of CN115033142B
Status: Active

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04817Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0486Drag-and-drop
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures

Abstract

An embodiment of the present application provides an application interaction method and an electronic device. The application interaction method includes: displaying an interface of a first application, and receiving a selection operation on a first target text on the interface of the first application; in response to the first target text being dragged to a first area, displaying an icon of a second application and an icon of a third application in a second area, where the second area is located at an edge portion of the screen of the electronic device; and in response to the first target text being dragged onto the icon of the second application, starting the second application and displaying the interface of the second application in the area where the interface of the first application was displayed. In this way, by dragging text on the interface of the first application (the text source application) onto the icon of the second application (the destination application), the user can conveniently start the destination application and display its interface, which simplifies operations and improves user experience.

Description

Application interaction method and electronic equipment
Technical Field
Embodiments of the present application relate to the field of terminal electronic devices, and in particular to an application interaction method and an electronic device.
Background
Currently, a large number of applications are typically installed on smart terminal devices such as mobile phones and tablets. When using such devices, users often need to copy text from one application and paste it into another. To do so, after copying the text, the user must exit the interface of the source application, find the target application on the home screen, open its interface, locate the paste position, and finally paste. This interaction process is rather cumbersome. In particular, when copying and pasting repeatedly, the user has to switch frequently between different applications, which makes the operations complex and time-consuming and degrades the user experience.
Disclosure of Invention
To solve the above technical problem, the application interaction method and electronic device provided by the present application allow a user to conveniently start a destination application and display its interface by operating on an application icon shown over the text source application's interface, which simplifies operations and improves user experience.
In a first aspect, the present application provides an application interaction method, including: displaying an interface of a first application, and receiving a selection operation on a first target text on the interface of the first application; in response to the first target text being dragged to a first area, displaying an icon of a second application and an icon of a third application in a second area, where the second area is located at an edge portion of the screen of the electronic device; and in response to the first target text being dragged onto the icon of the second application, starting the second application and displaying the interface of the second application in the area where the interface of the first application was displayed. In this way, by dragging text on the interface of the first application (the text source application) onto the icon of the second application (the destination application), the user can conveniently start the destination application and display its interface, which simplifies operations and improves user experience.
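The drag-driven flow of the first aspect can be pictured as a small state machine: entering the first area reveals the recommended icons, and dropping on one of them launches that destination application. The following Python sketch is illustrative only; all zone coordinates and identifiers (e.g. `TRIGGER_ZONE`, `"second_app"`) are assumptions, not names from the patent.

```python
# Illustrative sketch of the first-aspect interaction: dragging selected text
# into a trigger zone ("first area") reveals recommended app icons in an edge
# zone ("second area"); dropping onto an icon launches that app.
# All names and coordinates are assumptions for illustration.

TRIGGER_ZONE = {"x_min": 1000, "x_max": 1080, "y_min": 0, "y_max": 2340}  # right-edge strip
ICON_IDS = ["second_app", "third_app"]  # recommended application icons

def in_zone(pos, zone):
    """Return True if the (x, y) position falls inside an axis-aligned zone."""
    x, y = pos
    return zone["x_min"] <= x <= zone["x_max"] and zone["y_min"] <= y <= zone["y_max"]

class DragSession:
    """Tracks one drag of a selected target text across the screen."""

    def __init__(self, target_text):
        self.target_text = target_text
        self.icons_visible = False
        self.launched_app = None

    def on_drag_move(self, pos):
        # Entering the first area reveals the recommended-app icons.
        if in_zone(pos, TRIGGER_ZONE):
            self.icons_visible = True

    def on_drop(self, icon_id):
        # Dropping on a recommended icon launches that destination app.
        if self.icons_visible and icon_id in ICON_IDS:
            self.launched_app = icon_id

session = DragSession("No. 7 Happiness Road")
session.on_drag_move((1040, 800))  # drag into the right-edge trigger zone
session.on_drop("second_app")      # drop on the second application's icon
```

In a real system the move and drop callbacks would be driven by the platform's touch and drag events; here they are invoked directly to show the state transitions.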
According to the first aspect, after the second application is started and its interface is displayed in the area where the interface of the first application was displayed (in response to the first target text being dragged onto the icon of the second application), the method may further include: when it is detected that the interface of the second application includes a first text input box matching the first target text, displaying the first target text in an overlaid manner within the first text input box; in response to the first target text being dragged outside the first area, stopping the overlaid display within the first text input box; in response to the first target text being dragged to the first area, displaying the icon of the second application and the icon of the third application in the second area; and in response to the first target text being dragged onto the icon of the third application, starting the third application and displaying the interface of the third application in the area where the interface of the second application was displayed. In this way, the user can conveniently switch the destination application by dragging the text from one application icon to another on the source application interface, which simplifies operations and improves user experience.
According to the first aspect, the application interaction method further includes: copying the first target text in response to the first target text being dragged to the first area. Then, after the second application is started and its interface is displayed in the area where the interface of the first application was displayed (in response to the first target text being dragged onto the icon of the second application), when it is detected that the interface of the second application includes a first text input box matching the first target text, the first target text is displayed in an overlaid manner within the first text input box. Next, in response to a release operation, the first target text is pasted into the first text input box and displayed within it. In this way, the user can paste the text to the corresponding position automatically by releasing the touch after dragging, which simplifies operations and improves user experience.
Here, "matching" means that the content of the first target text meets the requirement that the first text input box places on its input. For example, a text input box named "detailed address" requires its input to be address information. If the content of a given text is an address, the "detailed address" text input box is a text input box that matches that text.
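The matching rule above can be sketched as classifying the dragged text's content type and comparing it with the input box's requirement. The classifier below is a crude stand-in assumption (a real implementation might use entity recognition); the requirement labels are illustrative.

```python
# Illustrative sketch of the "matching" rule: a text input box matches the
# dragged text when the text's content type satisfies the box's requirement.
# The regex-based classifier is an assumption, not the patent's method.
import re

def classify(text):
    """Very rough content-type guess for the dragged text."""
    if re.fullmatch(r"\+?\d[\d\- ]{6,}", text):
        return "phone"
    if re.search(r"(Road|Street|District|Building)", text):
        return "address"
    return "generic"

def matches(text, box_requirement):
    """A box matches when it accepts the text's type, or accepts any text."""
    return box_requirement in (classify(text), "generic")

# The "detailed address" box from the example requires address content:
address_ok = matches("No. 7 Happiness Road, Haidian District", "address")
phone_in_address_box = matches("13800001111", "address")
```

With this rule, only a box whose requirement the text satisfies would show the overlaid preview described above.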
According to a first aspect, the application interaction method may further comprise: the first target text is copied in response to dragging the first target text to the first region. In response to dragging the first target text onto the icon of the second application, launching the second application, after displaying the interface of the second application in the area where the interface of the first application is displayed, further comprising: the interface of the second application comprises a second text input box and a third text input box, and the first target text is pasted into the second text input box and displayed in the second text input box in response to the operation of loosening hands after dragging the first target text to the second text input box. Therefore, when a plurality of text input boxes exist on the application interface, the text input box for pasting the text can be selected according to the position where the text is dragged, so that the operation is simplified, and the use experience of a user is improved.
According to the first aspect, in response to dragging the first target text to the icon of the second application, starting the second application, after displaying the interface of the second application in the area where the interface of the first application is displayed, the method further includes: and stopping displaying the icon of the second application and the icon of the third application. And then, the interface of the second application comprises a session area of the first object and a session area of the second object, and the session interface with the first object is displayed in the area of the interface of the second application in response to dragging the first target text to the session area of the first object, and belongs to the second application. Next, the conversation interface with the first object includes a fourth text entry box that matches the first target text, the first target text is pasted into the fourth text entry box, and the first target text is displayed within the fourth text entry box. Then, in response to receiving the transmission instruction, the content in the fourth text input box is transmitted to the first object. Therefore, by continuing dragging the text in the forward application interface, the display of the next interface of the current interface of the forward application can be conveniently started, the operation is simplified, and the use experience of a user is improved.
According to the first aspect, in response to dragging the first target text to the icon of the second application, starting the second application, after displaying the interface of the second application in the area where the interface of the first application is displayed, the method further includes: and stopping displaying the icon of the second application and the icon of the third application. The interface of the second application comprises a session area of the first object and a session area of the second object, and the session interface with the first object is displayed in the area of the interface of the second application in response to dragging the first target text to the session area of the first object, and belongs to the second application. The conversation interface with the first object includes a fourth text entry box that matches the first target text, pastes the first target text into the fourth text entry box, and displays the first target text within the fourth text entry box. And receiving an editing instruction for the text in the fourth text input box, and editing the text in the fourth text input box according to the editing instruction. After the editing is completed, the content in the fourth text input box is sent to the first object in response to receiving the sending instruction. Therefore, the text pasted into the text input box is edited in the forward application interface, the text can be conveniently modified according to the needs of the user, the needs of the user in a specific scene are met, and the use experience of the user is improved.
According to the first aspect, in response to dragging the first target text to the icon of the second application, starting the second application, after displaying the interface of the second application in the area where the interface of the first application is displayed, the method further includes: stopping displaying the icon of the second application and the icon of the third application; the interface of the second application comprises a session area of the first object and a session area of the second object, the first target text is sent to the first object in response to dragging the first target text to the session area of the first object, and the session interface with the first object after the first target text is sent is displayed in the area of the interface of the second application. Therefore, the text can be automatically sent to the target object by dragging the text to the conversation area of the target object, so that the operation is simplified, and the use experience of a user is improved.
According to the first aspect, after the second application is started and its interface is displayed in the area where the interface of the first application was displayed (in response to the first target text being dragged onto the icon of the second application), the method further includes: stopping the display of the icon of the second application and the icon of the third application.
According to a first aspect, the application interaction method may further comprise: and displaying an interface of the fourth application. And receiving a selection operation of a second target text on the interface of the fourth application, and responding to dragging the second target text to the first area, and displaying an icon of the fifth application and an icon of the sixth application in the second area. And then, in response to dragging the second target text to the icon of the fifth application, starting the fifth application, displaying a floating window in a superimposed manner on the interface of the fourth application, and displaying the interface of the fifth application in the floating window. Therefore, the source application interface and the destination application interface of the text can be displayed on the screen of the electronic equipment simultaneously by using the floating window, so that the electronic equipment is convenient to copy and paste for multiple times, the operation is simplified, and the use experience of a user is improved.
According to a first aspect, the application interaction method may further comprise: displaying an interface of the fourth application, receiving a selection operation of a second target text on the interface of the fourth application, and responding to dragging the second target text to the first area, and displaying an icon of the fifth application and an icon of the sixth application in the second area. And then, in response to dragging the second target text to the icon of the fifth application, starting the fifth application, displaying a first split screen and a second split screen in a region displaying the fourth interface, displaying the interface of the fourth application in the first split screen, and displaying the interface of the fifth application in the second split screen. Therefore, the source application interface and the destination application interface of the text can be displayed on the screen of the electronic equipment simultaneously by utilizing the split screen, so that the copy and paste can be conveniently carried out for a plurality of times, the operation is simplified, and the use experience of a user is improved.
According to a first aspect, the first split screen is located in an upper half of the electronic device screen and the second split screen is located in a lower half of the electronic device screen.
According to a first aspect, the first split screen is located in a left half of the electronic device screen and the second split screen is located in a right half of the electronic device screen.
According to a first aspect, the application interaction method may further comprise: and receiving a selection operation of a third target text on the interface of the fifth application, and responding to dragging the third target text to the first area, and displaying an icon of the seventh application and an icon of the eighth application in the second area. Then, in response to dragging the third target text to the icon of the seventh application, the seventh application is started, and an interface of the seventh application is displayed on the first split screen. Therefore, the split screen where the text source application interface is located is kept unchanged, the destination application interface is displayed on the other split screen, the source application interface and the destination application interface of the text can be simultaneously displayed on the screen of the electronic equipment, multiple copying and pasting are convenient, operation is simplified, and user experience is improved.
According to a first aspect, the second application and the third application are predicted from the first target text.
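The statement above that the recommended applications are predicted from the target text can be sketched as mapping the text's content type to a ranked candidate list. The type-to-app table below is an illustrative assumption; a real system might also rank installed applications by usage.

```python
# Sketch of predicting recommended apps from the dragged text's content
# type. The regex classifier and the candidate table are assumptions for
# illustration, not the patent's actual prediction method.
import re

def predict_apps(text, top_n=2):
    """Return up to top_n candidate apps for the text's content type."""
    candidates = {
        "address": ["map_app", "ride_hailing_app", "delivery_app"],
        "phone": ["dialer_app", "contacts_app"],
        "generic": ["messaging_app", "notes_app"],
    }
    if re.fullmatch(r"\+?\d[\d\- ]{6,}", text):
        kind = "phone"
    elif re.search(r"(Road|Street|District)", text):
        kind = "address"
    else:
        kind = "generic"
    return candidates[kind][:top_n]
```

With `top_n=2`, the two returned apps would correspond to the "second application" and "third application" icons shown in the second area.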
According to a first aspect, the first area is a screen side area. For example, in one example, the first region may be a right side region of the screen. In another example, the first region may be a left side region of the screen.
In one example, the first region is not coincident with the second region.
In a second aspect, the present application provides an application interaction method applied to an electronic device, the method including: receiving a selection operation on a first target text on an interface of a first application; predicting a second application and a third application from the first target text; in response to the first target text being dragged to a first area, displaying an icon of the second application and an icon of the third application in a second area, where the second area is located at an edge portion of the screen of the electronic device; and in response to the first target text being dragged onto the icon of the second application, starting the second application and displaying its interface in the area of the interface of the first application. In this way, by dragging text on the interface of the first application (the text source application) onto the icon of the second application (the destination application), the user can conveniently start the destination application and display its interface, which simplifies operations and improves user experience.
According to a second aspect, the application interaction method may further comprise: the first target text is copied in response to dragging the first target text to the first region. When the interface of the second application is detected to comprise a first text input box matched with the first target text, the first target text is displayed in an overlapping mode in the area in the first text input box. Then, in response to the release operation, the first target text is pasted into the first text input box and displayed within the first text input box. Therefore, pasting can be conveniently and rapidly completed, and the user experience is improved.
According to a second aspect, the application interaction method may further comprise: and receiving a selection operation of a second target text on the interface of the fourth application, and predicting a fifth application and a sixth application according to the second target text. Then, in response to dragging the second target text to the first area, an icon of the fifth application and an icon of the sixth application are displayed in the second area. And then, in response to dragging the second target text to the icon of the fifth application, starting the fifth application, displaying a floating window in a superposition manner on the interface of the fourth application, and displaying the interface of the fifth application in the floating window. Therefore, the source application interface and the destination application interface of the text can be displayed on the screen of the electronic equipment simultaneously by using the floating window, so that the electronic equipment is convenient to copy and paste for multiple times, the operation is simplified, and the use experience of a user is improved.
According to a second aspect, the application interaction method may further comprise: and receiving a selection operation of a second target text on the interface of the fourth application, and predicting a fifth application and a sixth application according to the second target text. Then, in response to dragging the second target text to the first area, an icon of the fifth application and an icon of the sixth application are displayed in the second area. And then, in response to dragging the second target text to the icon of the fifth application, starting the fifth application, displaying a first split screen and a second split screen in a region displaying the fourth interface, displaying the interface of the fourth application in the first split screen, and displaying the interface of the fifth application in the second split screen. Therefore, the source application interface and the destination application interface of the text can be displayed on the screen of the electronic equipment simultaneously by utilizing the split screen, so that the copy and paste can be conveniently carried out for a plurality of times, the operation is simplified, and the use experience of a user is improved.
According to a second aspect, the application interaction method may further comprise: and receiving a selection operation of a third target text on the interface of the fifth application, and predicting a seventh application and an eighth application according to the third target text. Then, in response to dragging the third target text to the first area, an icon of the seventh application and an icon of the eighth application are displayed in the second area. Then, in response to dragging the third target text to the icon of the seventh application, the seventh application is started, and an interface of the seventh application is displayed on the first split screen. Therefore, the split screen where the text source application interface is located is kept unchanged, the destination application interface is displayed on the other split screen, the source application interface and the destination application interface of the text can be simultaneously displayed on the screen of the electronic equipment, multiple copying and pasting are convenient, operation is simplified, and user experience is improved.
According to the second aspect, displaying the first split screen and the second split screen in the area where the interface of the fourth application is displayed includes: when the electronic device is a single-screen device, displaying the first split screen in the upper half of the screen and the second split screen in the lower half. Adopting a split-screen style suited to the device type improves user experience.
According to the second aspect, displaying the first split screen and the second split screen in the area where the interface of the fourth application is displayed includes: when the electronic device is a foldable-screen device, displaying the first split screen in the left half of the screen and the second split screen in the right half. Adopting a split-screen style suited to the device type improves user experience.
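The device-dependent split described in the two paragraphs above reduces to a simple branch: single-screen devices split top/bottom, foldable (wide) screens split left/right. A minimal sketch, with device detection stubbed as a flag:

```python
# Sketch of choosing the split-screen orientation by device type, as
# described above. Region names are illustrative assumptions.

def split_layout(is_foldable):
    """Return (first_split_region, second_split_region) for the device."""
    if is_foldable:
        return ("left_half", "right_half")   # foldable / wide screen
    return ("top_half", "bottom_half")       # ordinary single screen

phone_layout = split_layout(is_foldable=False)
fold_layout = split_layout(is_foldable=True)
```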
In a third aspect, the present application provides an electronic device comprising: the system comprises a memory and a processor, wherein the memory is coupled with the processor. The memory stores program instructions that, when executed by the processor, cause the electronic device to perform any of the application interaction methods of the first aspect.
In a fourth aspect, the application provides a computer readable storage medium comprising a computer program which, when run on an electronic device, causes the electronic device to execute any one of the application interaction methods of the first aspect.
In a fifth aspect, the present application provides a chip comprising one or more interface circuits and one or more processors. The interface circuit is configured to receive signals from a memory of the electronic device and to send signals to the processor, the signals including computer instructions stored in the memory. The computer instructions, when executed by a processor, cause the electronic device to perform any of the application interaction methods of the first aspect.
Drawings
FIG. 1 is a schematic structural diagram of an exemplary electronic device 100;
FIG. 2 is a software architecture block diagram of the electronic device 100 according to an exemplary embodiment of the present application;
FIG. 3 is a timing diagram illustrating various operations in an application interaction process according to an embodiment of the present application;
FIG. 4 is an interface schematic diagram illustrating the process before the OCR recognition result of a picture is displayed;
FIG. 5 is a schematic diagram illustrating a process of selecting and dragging a target text in the OCR recognition result of picture A;
FIG. 6 is a schematic diagram illustrating an exemplary process for initiating the display of a list of recommended applications;
FIG. 7A1 is a schematic diagram illustrating a process of displaying a target application interface directly in full screen without releasing the touch after dragging the target text;
FIG. 7A2 is a schematic diagram illustrating another process of displaying a target application interface directly in full screen without releasing the touch after dragging the target text;
FIG. 7A3 is a schematic diagram illustrating yet another process of displaying a target application interface directly in full screen without releasing the touch after dragging the target text;
FIG. 7B is a schematic diagram illustrating a process of displaying a target application interface directly in full screen after releasing the touch that dragged the target text;
FIG. 8 is a schematic diagram illustrating interactions between a user and an electronic device in an exemplary illustrated application interaction method;
FIG. 9A is a schematic diagram illustrating a process of switching the recommended application without releasing the touch after dragging the target text;
FIG. 9B is a schematic diagram illustrating a process of switching the recommended application after releasing the touch that dragged the target text;
FIG. 10A1 is a schematic diagram illustrating an operation procedure after entering the WeChat application without releasing the touch after dragging the target text;
FIG. 10A2 is a schematic diagram illustrating another operation procedure after entering the WeChat application without releasing the touch after dragging the target text;
FIG. 10B1 is a schematic diagram illustrating an operation procedure after entering the WeChat application when the touch is released after dragging the target text;
FIG. 10B2 is a schematic diagram illustrating another operation procedure after entering the WeChat application when the touch is released after dragging the target text;
FIG. 11A is a schematic diagram illustrating a process of displaying a map application interface in a floating window without releasing the touch after dragging the target text;
FIG. 11B is a schematic diagram illustrating a process of displaying a map application interface in a floating window after releasing the touch that dragged the target text;
FIG. 12A is a schematic diagram illustrating a process of displaying a map application interface in a split screen without releasing the touch after dragging the target text;
FIG. 12B is a schematic diagram illustrating a process of displaying a map application interface in a split screen after releasing the touch that dragged the target text;
FIG. 13A is a schematic diagram illustrating a process of displaying a Jingdong (JD) application interface in a split screen without releasing the touch after dragging the target text;
FIG. 13B is a schematic diagram illustrating a process of displaying a Jingdong (JD) application interface in a split screen after releasing the touch that dragged the target text;
FIG. 14A is a schematic diagram illustrating a process for switching to an application in a folding screen electronic device or a wide screen electronic device without loosening hands after dragging a target text;
FIG. 14B is a schematic diagram illustrating a process for switching to an application in a folding screen electronic device or a wide screen electronic device in the case of a loose hand after dragging a target text;
fig. 15 is a schematic diagram exemplarily showing a process of displaying one destination application based on target text in the other destination application.
Detailed Description
The following description of the embodiments of the present application is made clearly and completely with reference to the accompanying drawings. It is evident that the described embodiments are some, but not all, of the embodiments of the application. All other embodiments obtained by those skilled in the art based on the embodiments of the application without inventive effort fall within the scope of the application.
The term "and/or" herein merely describes an association relationship between associated objects, indicating that three relationships may exist. For example, "A and/or B" may represent three cases: A exists alone, A and B exist together, or B exists alone.
The terms "first", "second", and the like in the description and claims of embodiments of the application are used to distinguish between different objects, not necessarily to describe a particular order of the objects. For example, a first target object and a second target object are used to distinguish between different target objects, not to describe a particular order of the target objects.
In embodiments of the application, words such as "exemplary" or "such as" are used to mean serving as an example, instance, or illustration. Any embodiment or design described herein as "exemplary" or "such as" should not be construed as preferred or advantageous over other embodiments or designs. Rather, the use of such words is intended to present related concepts in a concrete fashion.
In the description of the embodiments of the present application, unless otherwise indicated, the meaning of "a plurality" means two or more. For example, the plurality of processing units refers to two or more processing units; the plurality of systems means two or more systems.
The application interaction method in the embodiment of the application can be applied to a scenario in which text in one application of an electronic device is copied and pasted into another application of the same electronic device.
In one example, the scenario may be: text in an application such as a browser, memo, mail, or WeChat is copied and pasted into another application different from the application from which the text originated. In this example, the text in the application can be directly selected.
For example, the user sees an important knowledge point in the browser interface and wants to record it in the memo. At this time, the application interaction method in the embodiment of the present application may be applied to paste the knowledge point in the browser interface into the memo. For another example, a WeChat chat interface contains a phone number, and the user wants to add the phone number to the address book. At this time, the application interaction method in the embodiment of the application can be applied to paste the phone number in the WeChat chat interface into the address book.
In another example, the scenario may be: text in the optical character recognition (Optical Character Recognition, OCR) result of a picture is copied and pasted into an application. The picture may come from an application such as the gallery, a real-time screen capture, the browser, WeChat, or QQ; the embodiment of the application does not limit the source of the picture. In this example, the text originates from the OCR recognition result of the picture, and the text in the picture itself cannot be directly selected.
For example, an express photo has address information that the user wants to add to the shipping address of a shopping application. At this time, the application interaction method in the embodiment of the application can be applied to paste the address information in the photo to the receiving address of the shopping application.
It should be noted that the above examples are only illustrative of application scenarios of embodiments of the present application. The embodiments of the present application do not limit the source application of the text (i.e., the application that contains the text) and the destination application (i.e., the application that needs to paste the text).
Herein, an application program may be referred to simply as an application (app).
The application interaction method in the embodiment of the application can be applied to an electronic device. The electronic device may be, for example, a mobile phone, a tablet, or the like. The hardware structure and software structure of an electronic device to which the application interaction method of the embodiment of the present application is applied are described below, taking the electronic device 100 as an example.
Fig. 1 is a schematic diagram of an exemplary illustrated electronic device 100. It should be understood that the electronic device 100 shown in fig. 1 is only one example of an electronic device, and that the electronic device 100 may have more or fewer components than shown in the figures, may combine two or more components, or may have a different configuration of components. The various components shown in fig. 1 may be implemented in hardware, software, or a combination of hardware and software, including one or more signal processing and/or application specific integrated circuits.
Referring to fig. 1, an electronic device 100 may include: processor 110, external memory interface 120, internal memory 121, universal serial bus (universal serial bus, USB) interface 130, charge management module 140, power management module 141, battery 142, antenna 1, antenna 2, mobile communication module 150, wireless communication module 160, audio module 170, speaker 170A, receiver 170B, microphone 170C, headset interface 170D, sensor module 180, keys 190, motor 191, indicator 192, camera 193, display 194, and subscriber identity module (subscriber identification module, SIM) card interface 195, etc. The sensor module 180 may include a pressure sensor 180A, a gyro sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
The software system of the electronic device 100 may employ a layered architecture, an event driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture. The embodiment of the application takes an Android (Android) system with a layered architecture as an example, and illustrates a software structure of the electronic device 100.
Fig. 2 is a software structural block diagram of the electronic device 100 of the exemplary embodiment of the present application.
The layered architecture of the electronic device 100 divides the software into several layers, each with a distinct role and division of labor. The layers communicate with each other through software interfaces. In some embodiments, the Android system is divided into four layers, from top to bottom: the application layer, the application framework layer, the Android runtime and system libraries, and the kernel layer.
The application layer may include a series of application packages.
As shown in FIG. 2, the application packages of the application layer may include applications such as sensor, camera, gallery, OCR engine, application recommendation, application interaction module, third-party applications, screenshot, and the like.
The OCR engine is used to perform OCR recognition on a picture to obtain the OCR recognition result of the picture, where the OCR recognition result includes selectable, copyable text information from the picture. The picture may be a screenshot obtained by the screenshot application, a picture obtained by the camera application, a picture stored in the gallery, or the like; the embodiment of the application does not limit the source of the picture.
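The OCR recognition result described here can be modeled as a set of recognized text lines, each carrying its bounding box within the picture; this is what later makes the picture text selectable and copyable. A minimal, platform-neutral sketch (the class and field names are illustrative assumptions, not the actual OCR engine interface):

```python
from dataclasses import dataclass, field

@dataclass
class OcrLine:
    """One recognized text line and its bounding box within the picture."""
    text: str
    box: tuple  # (left, top, right, bottom) in picture pixels (assumed layout)

@dataclass
class OcrResult:
    """OCR recognition result of one picture: selectable, copyable text."""
    lines: list = field(default_factory=list)

    @property
    def full_text(self) -> str:
        # Concatenate recognized lines into the copyable text of the picture.
        return "\n".join(line.text for line in self.lines)

    def has_text(self) -> bool:
        # True if the picture contains any non-blank recognized text.
        return any(line.text.strip() for line in self.lines)

# Example: recognized lines from an express-delivery photo.
result = OcrResult(lines=[
    OcrLine("Recipient: Zhang San", (40, 60, 400, 90)),
    OcrLine("Beijing Haidian District x road x building", (40, 100, 480, 130)),
])
```

The bounding boxes allow a selection gesture on the picture to be mapped back to the underlying text, which the OCR engine of fig. 2 would populate from its recognition output.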
The application recommendation is used to predict applications related to given text content and recommend the predicted applications to the user. The application interaction module is used to execute the application interaction method of the embodiment of the application; for details of the method, please refer to the following description. The third-party applications are used to provide pictures or to provide selectable, copyable text information.
The application framework layer provides an application programming interface (application programming interface, API) and programming framework for application programs of the application layer. The application framework layer includes a number of predefined functions.
As shown in FIG. 2, the application framework layer may include a view system, resource manager, window manager, activity manager, and the like.
The view system comprises visual controls, such as a control for displaying text, a control for displaying pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, a display interface including a text notification icon may include a view displaying text and a view displaying a picture.
The resource manager provides various resources for the application program, such as localized text strings, icons, pictures, layout files, video files, and the like.
The window manager is used for managing window programs. The window manager may obtain the display screen size, determine if there is a status bar, lock the screen, intercept the screen, etc.
The activity manager manages the life cycle of each application and the navigation back function, and is responsible for creating the Android main thread and maintaining the life cycle of each application.
The Android runtime includes core libraries and a virtual machine. The Android runtime is responsible for scheduling and management of the Android system.
The core library consists of two parts: one part comprises the functions to be called by the Java language, and the other part is the core library of Android.
The application layer and the application framework layer run in a virtual machine. The virtual machine executes java files of the application program layer and the application program framework layer as binary files. The virtual machine is used for executing the functions of object life cycle management, stack management, thread management, security and exception management, garbage collection and the like.
The system library may include a plurality of functional modules, such as surface managers (surface managers) or the like.
The surface manager is used to manage the display subsystem and provides a fusion of 2D and 3D layers for multiple applications.
The kernel layer is a layer between hardware and software. The kernel layer may contain modules such as display drivers, sensor drivers, etc.
It will be appreciated that the layers and components contained in the layers in the software structure shown in fig. 2 do not constitute a specific limitation on the electronic device 100. In other embodiments of the application, electronic device 100 may include more or fewer layers than shown and may include more or fewer components per layer, as the application is not limited.
For convenience of description, the electronic device 100 will hereinafter be simply referred to as the electronic device. It should be appreciated that the electronic devices mentioned herein may each have the same hardware structure (e.g., the hardware structure shown in fig. 1) and the same software structure (e.g., the software structure shown in fig. 2) as the electronic device 100 in the foregoing embodiment.
The following describes an application interaction method according to an embodiment of the present application, taking a scenario in which text in an OCR recognition result of a picture is copied and pasted to a certain application as an example. In the following embodiments, it is assumed that the user wants to paste text information in a picture into one text entry box in a map application.
Fig. 3 is a timing diagram illustrating operations in an application interaction process according to an embodiment of the present application. Fig. 8 is a schematic diagram illustrating interaction between a user and an electronic device in an exemplary method of application interaction. The following describes in detail the procedure of applying the interaction method according to the embodiment of the present application with reference to fig. 3 and 8.
Referring to fig. 3, the process of the application interaction method according to the embodiment of the present application includes a stage of "open picture, OCR recognition result display button not displayed". At this stage, the electronic device performs the processing of "displaying the picture a on the electronic device screen" and "detecting whether text exists in the picture a, and if so, acquiring the OCR recognition result of the picture a".
Referring to fig. 8, in the stage of "open picture, OCR recognition result display button not displayed", the process of the application interaction method according to the embodiment of the present application may include the following steps:
step S1, the user clicks on the picture a.
In step S2, the electronic device displays the picture a on the screen of the electronic device in response to clicking the picture a.
And S3, the electronic equipment detects whether the text exists in the picture a, and if the text exists, the electronic equipment acquires an OCR recognition result of the picture a.
The user may perform the operation of clicking a picture in various cases, thereby displaying the picture on the screen of the electronic device. The electronic device may be, for example, a mobile phone, a tablet, or the like. A mobile phone is taken herein as an example of the electronic device. It should be understood that the description of the mobile phone applies equally to other electronic devices, such as a tablet.
For example, fig. 4 is an interface diagram exemplarily showing the process before displaying the OCR recognition result of a picture. Referring to fig. 4, in an exemplary implementation, the user may click the "gallery" icon in the main interface of the mobile phone (see fig. 4 (a)) and then enter the internal interface of the "gallery" application, which is displayed as shown in fig. 4 (b). Referring to fig. 4 (b), the "gallery" includes pictures a, 1, 2, 3, n, and so on. The user may continue to click a picture on the interface shown in fig. 4 (b); for example, assume that the user clicks picture a.
At this time, the mobile phone displays the picture a on the screen of the mobile phone in response to clicking the picture a, as shown in fig. 4 (c). Fig. 4 (a) to (c) illustrate examples of entering the picture display interface from the gallery, and the embodiment of the present application is not limited to the manner of entering the picture display interface. For example, in other embodiments of the present application, the image display interface may be accessed by clicking on the screen shot image after the screen shot, or may be accessed by clicking on the image in the web page during the web page browsing process, etc., which are not listed here.
Referring to fig. 4, after displaying the picture a, the application interaction method according to the embodiment of the present application further detects whether there is text in the picture a. And if the text exists in the picture a, acquiring an OCR recognition result of the picture a. If the text does not exist in the picture a, the application interaction method of the embodiment of the application ends the application interaction process. Thus, the process of the "open picture, OCR recognition result display button not displayed" stage shown in fig. 3 is completed.
In the embodiment of the application, the electronic equipment can detect whether the text exists in the picture a in a plurality of modes, and acquire the OCR recognition result of the picture a under the condition that the text exists in the picture a.
In an exemplary implementation, if the stored data of the picture a includes an OCR recognition result of the picture a, the electronic device may determine whether text exists in the picture a according to the stored data of the picture a. And, in the case that the text exists in the picture a, the electronic device may directly read the OCR recognition result of the picture a from the stored data of the picture a.
In another exemplary implementation, if the stored data of the picture a does not include the OCR recognition result of the picture a, the electronic device may send the picture a to the OCR engine of the application layer shown in fig. 2, where the OCR engine performs real-time OCR recognition on the picture a and outputs the OCR recognition result of the picture a. Then, the electronic device acquires the OCR recognition result of the picture a from the output data of the OCR engine.
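Taken together, the two implementations form a cache-aside lookup: read the stored OCR result when present, otherwise invoke the OCR engine and store its output. A hedged sketch of that control flow (the "ocr_result" key and the engine callable are hypothetical stand-ins for the stored picture data and the OCR engine of fig. 2):

```python
def get_ocr_result(stored_data: dict, run_ocr_engine):
    """Return the OCR recognition result of a picture, or None if no text.

    stored_data: metadata saved with the picture; may already contain the
    key "ocr_result" (an assumed field name, for illustration only).
    run_ocr_engine: callable performing real-time OCR on the picture.
    """
    # Case 1: the stored data already includes the OCR recognition result,
    # so it can be read directly without invoking the engine.
    if "ocr_result" in stored_data:
        cached = stored_data["ocr_result"]
        return cached if cached else None  # empty result means no text

    # Case 2: no stored result; run the OCR engine in real time and keep
    # its output with the picture for subsequent lookups.
    fresh = run_ocr_engine()
    stored_data["ocr_result"] = fresh
    return fresh if fresh else None

# Usage: a picture without a stored result triggers the engine exactly once.
calls = []
def engine():
    calls.append(1)
    return "Beijing Haidian District x road x building"

data = {}
first = get_ocr_result(data, engine)   # engine invoked
second = get_ocr_result(data, engine)  # served from stored data
```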
The OCR engine is used for carrying out OCR recognition on the picture and outputting an OCR recognition result of the picture.
With continued reference to fig. 3, after the OCR recognition result of the picture a is obtained, the application interaction method according to the embodiment of the present application enters the stage of "displaying the OCR recognition result display button but not turning on the button". At this stage, the electronic device performs the processing of "popping up the OCR recognition result display button on the interface of the picture a".
With continued reference to fig. 8, in the stage of "displaying OCR recognition result display button but not on" the process of applying the interaction method according to an embodiment of the present application may include the following steps:
step S4, if the OCR recognition result of the picture a is obtained, the electronic device pops up an OCR recognition result display button on the screen of the electronic device.
Here, the popped-up OCR recognition result display button is in an unopened state. At this time, please refer to fig. 4 (d) for the electronic device interface. In the case where the user does not click the OCR recognition result display button in the (d) diagram of fig. 4, the text in the picture a is not selectable.
At this time, the user may click the OCR recognition result display button so that the operations of selecting a text from the OCR recognition result and opening the map application can be performed later.
If the electronic device does not detect the operation of clicking the OCR recognition result display button within a preset period of time after the OCR recognition result display button is popped up, it may be confirmed that the user has no need to select text from the OCR recognition result and paste it to other applications. At this time, the electronic device may actively stop displaying the OCR recognition result display button on the electronic device screen, i.e., the OCR recognition result display button disappears from the interface of the picture a.
In one exemplary implementation, if the user does not have the need to select text from the OCR recognition results and paste the selected text to other applications, the user may click anywhere outside of the OCR recognition result display button. At this time, the electronic device detects a click operation by the user on any place other than the OCR recognition result display button, and can stop displaying the OCR recognition result display button on the electronic device screen. In this way, the electronic device passively stops displaying the OCR recognition result display button based on the user operation.
Of course, in another exemplary implementation, if the user does not have the need to select text from the OCR recognition result and paste the selected text to the target application, the electronic device may also keep the OCR recognition result display button in an unopened state at all times.
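The dismissal behaviors above (active dismissal after a preset period without a click, and passive dismissal when the user taps outside the button) can be captured as a small piece of state logic around a deadline. A sketch under assumed names, using an injectable clock; the concrete timeout value is an illustrative assumption:

```python
import time

class OcrButton:
    """Dismissal logic for the OCR recognition result display button.

    timeout_s and the event names are illustrative assumptions; the
    embodiment only specifies "a preset period of time".
    """
    def __init__(self, timeout_s: float, now=time.monotonic):
        self.timeout_s = timeout_s
        self.now = now            # injectable clock, eases testing
        self.visible = False
        self.deadline = None

    def pop_up(self):
        self.visible = True
        self.deadline = self.now() + self.timeout_s

    def on_tick(self):
        # Active dismissal: no click within the preset period.
        if self.visible and self.now() >= self.deadline:
            self.visible = False

    def on_tap_outside(self):
        # Passive dismissal: user taps anywhere outside the button.
        self.visible = False

# Simulate with a fake clock instead of waiting in real time.
t = [0.0]
btn = OcrButton(timeout_s=5.0, now=lambda: t[0])
btn.pop_up()
t[0] = 4.9; btn.on_tick()   # before the deadline: still visible
t[0] = 5.1; btn.on_tick()   # deadline passed: button disappears
```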
Step S5, the user clicks the OCR recognition result display button.
The electronic device detects the clicking operation of the user on the OCR recognition result display button, and can confirm that the user has a need of selecting a text from the OCR recognition results and pasting the selected text to the destination application, so as to enter a flow of subsequently displaying the OCR recognition results of the picture a.
With continued reference to fig. 3, after the user clicks the OCR recognition result display button, the application interaction method according to the embodiment of the present application enters the stage of "displaying the OCR recognition result display button and the button is turned on". At this stage, the electronic device displays the OCR recognition result of the picture a on the screen in response to the click operation on the OCR recognition result display button; in response to an operation of selecting a target text in the OCR recognition result of the picture a and dragging the target text to the side of the screen, the electronic device displays a recommended application list on the side of the screen; and in response to a selection operation on the map application in the recommended application list, the electronic device automatically starts the map application and displays an interface of the map application on the screen.
With continued reference to fig. 8, in the stage of "displaying the OCR recognition result display button and the button is turned on", the process of applying the interaction method according to the embodiment of the present application may include the following steps:
In step S6, the electronic device displays the OCR recognition result of the picture a on the electronic device screen in response to the click operation of the OCR recognition result display button.
The user clicks the OCR recognition result display button to thereby display an interface, see fig. 4 (e). At this time, the text in the OCR recognition result is in a selectable state.
In fig. 4 (e), a border is added to the text in the OCR recognition result in order to contrast with the unselectable text in the picture a. However, the present application does not limit the display mode of the OCR recognition result. For example, in one example, no border or other content may be added to the text in the OCR recognition result. In another example, the text in the OCR recognition result may be given a background color different from the background. In yet another example, the electronic device may highlight the text in the OCR recognition result, and may additionally emphasize text of a preset content type within it.
The preset content type may include, for example, a name, an address, a phone number, and the like. For text belonging to a preset content type, the electronic device may display it in a preset emphasis display mode. For example, the electronic device may underline the text belonging to the preset content type, highlight it, or give it a background color different from that of the other text in the OCR recognition result.
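Picking out text of a preset content type can be approximated with simple pattern rules; the patterns below are deliberately simplified illustrations (a production implementation would use locale-aware rules or a named-entity recognition model, which the embodiment does not specify):

```python
import re

# Deliberately simplified, illustrative patterns; a real system would use
# locale-aware rules or a named-entity recognition model.
PATTERNS = {
    "phone": re.compile(r"\b1\d{10}\b|\b\d{3,4}-\d{7,8}\b"),
    "address": re.compile(r"\S*(?:District|Road|Street|Building)\S*",
                          re.IGNORECASE),
}

def classify_spans(text: str):
    """Return (content_type, matched_text) pairs for emphasis display."""
    spans = []
    for kind, pattern in PATTERNS.items():
        for match in pattern.finditer(text):
            spans.append((kind, match.group()))
    return spans

spans = classify_spans(
    "Recipient Zhang San 13800138000, Haidian District x Road x Building"
)
```

Each returned span could then be underlined, highlighted, or given a distinct background color by the view layer.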
Step S7, the user selects the target text from the OCR recognition results and drags the target text to the side of the screen of the electronic device.
For example, the user may select the target text to be copied by sliding over the target text.
The side of the screen may be the left side of the screen or the right side of the screen. The following description of the embodiments of the present application will take dragging the target text to the right side of the screen as an example.
In the embodiment of the application, the side edge of the screen is a preset dragging destination area. The present application does not limit the drag destination area as long as it is set in advance. In other embodiments of the present application, the drag destination area may be set as another area, for example, a lower area of the screen, or the like. The subsequent display of the recommended application list may be triggered as long as the user drags the target text to a preset drag destination area.
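Deciding whether the dragged target text has reached the preset drag destination area reduces to a hit-test on the touch position. A sketch assuming a strip along one screen edge (the strip width and the parameter names are illustrative assumptions):

```python
def in_drag_destination(touch_x: int, touch_y: int,
                        screen_w: int, screen_h: int,
                        side: str = "right", strip_px: int = 60) -> bool:
    """True if the finger dragging the target text is in the destination area.

    side and strip_px are configuration assumptions; the embodiment only
    requires the area to be preset (e.g., a screen side or the lower area).
    """
    if side == "right":
        return touch_x >= screen_w - strip_px
    if side == "left":
        return touch_x <= strip_px
    if side == "bottom":
        return touch_y >= screen_h - strip_px
    raise ValueError(f"unknown destination area: {side}")

# On an assumed 1080x2400 screen, dragging to x=1050 lands in the
# right-side strip and would trigger the recommended application list.
```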
In step S8, the electronic device copies the target text in response to the operation of selecting the target text and dragging the target text to the side of the screen.
Fig. 5 is a schematic diagram illustrating a process of selecting and dragging the target text in the OCR recognition result of the picture a. Referring to fig. 5 (a), the user selects the target text "Beijing Haidian District x road x building" by sliding over it, and the electronic device pops up a text operation menu on the screen in response to this sliding operation. The text operation menu may include operation options such as select all, cut, copy, paste, and translate. Next, in fig. 5 (b), the user does not need to click any operation option in the text operation menu, but directly drags the target text "Beijing Haidian District x road x building" to the right side of the screen, as shown in fig. 5 (c), where the path and direction of the drag are shown by the dotted line and arrow in fig. 5 (b), respectively. In response to the target text "Beijing Haidian District x road x building" being selected and dragged to the right side of the screen, the electronic device copies the target text, which is then present in the clipboard. When the user performs the drag operation, the electronic device stops displaying the text operation menu on the screen.
It should be noted that, in the application interaction process of the embodiment of the present application, the user does not need to click any operation option in the text operation menu. If the user's intention is not to copy the selected text from one application into another, but something else, such as translating the selected text into a foreign language, the user may click the relevant operation option in the text operation menu.
And S9, the electronic equipment determines a recommended application list according to the target text content, and displays the recommended application list on the side of the screen.
In the application interaction method of the embodiment of the application, the process of starting and displaying the recommended application list is shown in fig. 6. Fig. 6 is a schematic diagram illustrating a process of starting and displaying the recommended application list. Referring to fig. 6, fig. 6 (a) is the same as fig. 5 (c) and is not repeated here. In fig. 6 (b), the electronic device detects that the target text "Beijing Haidian District x road x building" has been dragged to the side of the screen and transmits it to the application recommendation app (as shown in fig. 2); the application recommendation app determines at least two recommended applications according to the target text content and notifies the application interaction module of the recommended applications. After receiving the recommended applications, the application interaction module may display them in the form of a recommended application list.
In the embodiment of the present application, the method by which the application recommendation app determines the recommended applications according to the target text content may be: calculating the association degree between the target text content and each installed application, and adding at least two applications with the highest association degree to the recommended application list as recommended applications. For example, the target text "Beijing Haidian District x road x building" in the embodiment of the present application represents an address, and map-class, shopping-class, and social-class applications are highly associated with addresses. Accordingly, in the (b) and (c) diagrams of fig. 6, the map application is a map-class application, the Jingdong application is a shopping-class application, and the WeChat application is a social-class application.
It should be understood that the above manner of determining the recommended application list according to the target text content is merely an illustrative example, and the embodiment of the present application is not limited to a specific manner of determining the recommended application list according to the target text content.
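As one concrete illustration of the association-degree approach, recommended applications could be ranked by keyword overlap per application category, taking the top entries; the categories, keywords, and app names below are illustrative assumptions, not the actual recommendation model:

```python
# Illustrative category keywords; the real application recommendation
# component could use a learned association model instead.
CATEGORY_KEYWORDS = {
    "map":      ["road", "building", "district", "street", "city"],
    "shopping": ["road", "building", "district", "order", "price"],
    "social":   ["road", "city", "hello", "meet"],
    "music":    ["song", "album", "artist"],
}
CATEGORY_APPS = {"map": "Map", "shopping": "Jingdong",
                 "social": "WeChat", "music": "Music"}

def recommend(target_text: str, k: int = 3):
    """Return up to k app names with the highest association degree."""
    words = target_text.lower().split()
    scores = {
        cat: sum(words.count(w) for w in kws)
        for cat, kws in CATEGORY_KEYWORDS.items()
    }
    ranked = sorted(scores, key=scores.get, reverse=True)
    return [CATEGORY_APPS[c] for c in ranked[:k] if scores[c] > 0]

apps = recommend("Beijing Haidian District x road x building")
```

With the address-like target text of the running example, the map, shopping, and social categories score highest, matching the recommendations shown in fig. 6.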
It should be noted that, in the embodiment of the present application, whether the target text is located on the side of the screen is based on the touch position of the finger dragging the target text. If the touch position of the finger dragging the target text is located at the screen side, it is determined that the target text is already located at the screen side.
It should be noted that, in the embodiment of the present application, the application displaying the recommendation in the manner of the list of recommended applications is only one schematic example of displaying the recommended application, and the display manner of the recommended application of the present application is not limited. In other embodiments of the present application, the recommended applications may be displayed in other ways. For example, icons of each recommended application may be displayed on the electronic device screen separately.
In one example, after the recommended application list has been displayed on the electronic device, the user may keep holding the target text without releasing the hand and perform subsequent operations as shown in fig. 7A1, 7A2, and 7A3.
In another example, after the recommended application list has been displayed on the electronic device, the user may loosen his hand, at which time the electronic device may stop displaying the target text "beijing city seashore area×× building" dragged to the side of the screen, and display the hover ball, as shown in fig. 6 (c). The suspension ball may include key information of the target character, for example, "x road" may be displayed in the suspension ball corresponding to the target character "x road x building in the seashore area of beijing". In this case, after the user releases his hand, the subsequent operations may be performed as shown in fig. 7B.
In yet another example, after the recommended application list has been displayed on the electronic device, the user may release the finger, at which point the electronic device may stop displaying the target text "Beijing Haidian District × Road × Building" dragged to the side of the screen and not display a floating ball. In this case, after the user releases the finger, the subsequent operations may be performed as shown in fig. 7B.
In yet another example, after the recommended application list has been displayed on the electronic device, the user may release the finger, at which point the electronic device may display the target text "Beijing Haidian District × Road × Building" dragged to the side of the screen in a hovering manner; that is, the target text "Beijing Haidian District × Road × Building" hovers on the screen of the electronic device without the user having to hold it with a finger. In this case, after the user releases the finger, the subsequent operations may be performed as shown in fig. 7B.
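The three example behaviors after the finger is released can be summarized as a small dispatch on a configured mode. The mode names and return values below are illustrative assumptions used only to contrast the three examples.

```python
def after_release_display(mode, target_text, key_info=None):
    """Return what remains on screen after the finger is released, for the
    three example behaviors described above (assumed mode names)."""
    if mode == "floating_ball":
        # Show a floating ball carrying key info of the target text.
        return ("floating_ball", key_info or target_text)
    if mode == "hide":
        # Stop displaying the target text, no floating ball.
        return ("nothing", None)
    if mode == "hover_text":
        # Keep the target text itself hovering on the screen.
        return ("hovering_text", target_text)
    raise ValueError(mode)
```

In all three modes, the subsequent operations then proceed as shown in fig. 7B.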
In the embodiments of the present application, the recommended application list may be displayed at the side of the screen. For example, the recommended application list may be on the same side of the screen as the destination area of the drag. The embodiments of the present application take the recommended application list displayed on the right side of the screen as an example.
In the embodiments of the present application, the control corresponding to the recommended application list is located at the uppermost layer of the screen, which prevents the recommended application list from being covered by other interfaces on the screen of the electronic device and from interfering with the subsequent operation of selecting an application from the list.
The recommended application list may include application icons of at least two applications determined according to the target text content. When the recommended application list contains more applications than can be shown, a preset number (e.g., 3) of application icons may be displayed together with an add button, as shown in fig. 6 (b) and (c). In fig. 6 (b) and (c), the recommended application list displays the application icons of three applications, namely a map application, the Jingdong application, and the WeChat application, together with an add button.
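The truncation rule just described, showing at most a preset number of icons and an add button only when more applications remain, can be sketched as follows; the function name and return convention are illustrative assumptions.

```python
def build_list_display(recommended, preset=3):
    """Return (icons_to_show, show_add_button) for the recommended
    application list, showing at most `preset` icons."""
    if len(recommended) > preset:
        # More applications than can be shown: truncate and add the button.
        return recommended[:preset], True
    # All icons fit, so no add button is needed.
    return recommended, False
```

With the four candidates of fig. 6, this yields the three visible icons plus the add button.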
When it is desired to display in the recommended application list the application icon of an application other than the map application, the Jingdong application, and the WeChat application (e.g., a contacts application), the user may, in fig. 6 (b), drag the target text "Beijing Haidian District × Road × Building" to the position of the add button. The recommended application list may then display, from top to bottom, the application icons of the Jingdong application, the WeChat application, and the contacts application, while the application icon of the map application is associated with the add button.
Accordingly, when it is desired to display in the recommended application list the application icon of an application other than the map application, the Jingdong application, and the WeChat application (e.g., the contacts application), the user may, in fig. 6 (c), click the add button. The recommended application list may then display, from top to bottom, the application icons of the Jingdong application, the WeChat application, and the contacts application, while the application icon of the map application is associated with the add button.
If the application icons of all the applications included in the recommended application list are already displayed, the add button may be omitted from the display content of the recommended application list.
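The add-button behavior described for fig. 6 (b) and (c) amounts to advancing a visible window over the full candidate list by one position, with the icon scrolled out becoming associated with the add button. The sketch below models only that window movement; the names are illustrative assumptions.

```python
def advance_recommended_window(all_apps, start, preset=3):
    """Advance the visible window of the recommended application list by one.
    Returns (new_visible_icons, icon_associated_with_add_button)."""
    visible = all_apps[start:start + preset]
    new_start = start + 1
    new_visible = all_apps[new_start:new_start + preset]
    # The first previously visible icon is collapsed into the add button.
    collapsed = visible[0]
    return new_visible, collapsed
```

Starting from the list of fig. 6, one activation of the add button replaces the map icon with the contacts icon.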
In step S10, the user drags the target text onto the map application icon in the recommended application list and stays there, or, after the target text hovers at the side of the screen, clicks the map application icon.
In step S12, in response to the target text being dragged onto and staying on the map application icon in the recommended application list, or in response to the map application icon being clicked after the target text hovers at the side of the screen, the electronic device automatically starts the map application and displays the interface of the map application on the screen of the electronic device.
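Steps S10 and S12 describe two alternative trigger gestures leading to the same launch action. A minimal sketch of that dispatch, with assumed event names, is:

```python
def open_destination_app(trigger, app):
    """Sketch of steps S10-S12: either trigger gesture launches the
    destination app; any other gesture does nothing. Event names are
    illustrative assumptions."""
    valid = {"drag_stay_on_icon", "hover_then_click_icon"}
    if trigger in valid:
        return "launched %s; displaying its interface" % app
    return None
```

Both gestures thus converge on the same step-S12 behavior of starting the destination application.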
Fig. 7A1 is a schematic diagram illustrating a procedure for directly displaying the destination application interface in full screen without releasing the finger after dragging the target text. For the case where the user does not release the finger after dragging the target text, refer to the situation shown in fig. 6 (b). Referring to fig. 7A1, in (a) of fig. 7A1, the user drags the target text "Beijing Haidian District × Road × Building" onto the map application icon and stays there.
In (b) of fig. 7A1, in response to the target text being dragged onto and staying on the map application icon in the recommended application list, the electronic device opens the map application and displays the interface of the map application on the screen. Here, the electronic device displays the interface of the map application in a full-screen manner, and the target text "Beijing Haidian District × Road × Building" is still displayed on the screen. In (b) of fig. 7A1, when the user drags the target text onto the map application icon of the recommended application list, an effect may be displayed indicating that the target text has been pasted to the target location where it needs to be pasted (here, a text input box).
In the case of fig. 7A1 (b), if the user releases the finger, the electronic device stops displaying the target text "Beijing Haidian District × Road × Building" and the recommended application list, and may automatically add the target text "Beijing Haidian District × Road × Building" to the designated text input box on the map application interface, at which point the interface of the electronic device is as shown in fig. 7A1 (c).
Fig. 7A2 is a schematic diagram illustrating another procedure for directly displaying the destination application interface in full screen without releasing the finger after dragging the target text. Fig. 7A2 (a) is the same as fig. 7A1 (a); for the related process, refer to the above description, which is not repeated here. In (b) of fig. 7A2, with the target text already dragged onto the map application icon of the recommended application list, the user keeps the finger pressed and continues to drag the target text along the path and direction indicated by the dashed arrow in (b) of fig. 7A2. In response to the target text being dragged onward to the text input box, the electronic device stops displaying the recommended application list on the screen. When the user drags the target text to the text input box pointed to by the arrow and releases the finger, the electronic device adds the target text "Beijing Haidian District × Road × Building" to that text input box on the map application interface, at which point the interface of the electronic device is as shown in fig. 7A2 (c).
Fig. 7A3 is a schematic diagram illustrating yet another procedure for directly displaying the destination application interface in full screen without releasing the finger after dragging the target text. Fig. 7A3 (a) is the same as fig. 7A1 (a); for the related process, refer to the above description, which is not repeated here. In fig. 7A3 (b), in response to the target text being dragged onto the map application icon of the recommended application list, the electronic device opens the map application and displays the interface of the map application in a full-screen manner on the screen of the electronic device.
In fig. 7A3 (c), with the map application interface already displayed on the screen of the electronic device, the user releases the finger. After detecting the release operation, the electronic device displays the target text hovering over the screen and stops displaying the recommended application list.
In fig. 7A3 (d), the user drags the target text along the path and direction indicated by the dashed arrow in the figure until it reaches the text input box. Upon detecting the operations of dragging the target text to the text input box and then releasing the finger, the electronic device adds the target text "Beijing Haidian District × Road × Building" to the text input box pointed to by the arrow on the map application interface, at which point the interface of the electronic device is as shown in (e) of fig. 7A3.
Fig. 7B is a schematic diagram illustrating a process for directly displaying the destination application interface in full screen in the case of releasing the finger after dragging the target text. For the case of releasing the finger after dragging the target text, refer to the situation shown in fig. 6 (c). Referring to fig. 7B, in (a) of fig. 7B, the floating ball corresponding to the target text "Beijing Haidian District × Road × Building" hovers on the right side of the screen of the electronic device, and at this point the user clicks the map application icon in the recommended application list.
In (b) of fig. 7B, in response to the map application icon being clicked after the floating ball corresponding to the target text hovers at the side of the screen, the electronic device opens the map application, displays the interface of the map application on the screen of the electronic device, and stops displaying the target text on the screen.
In (c) of fig. 7B, the user clicks the text input box on the map application interface. In response to the text input box on the map application interface being clicked, the electronic device pops up a list containing a paste operation option on the screen and stops displaying the recommended application list.
In (d) of fig. 7B, the user clicks the paste operation option in the list containing the paste operation option. In (e) of fig. 7B, in response to the paste operation option being clicked, the electronic device adds the target text "Beijing Haidian District × Road × Building" to the text input box on the map application interface.
In the case where, after the recommended application list has been displayed on the electronic device, the user releases the finger and the electronic device stops displaying the target text dragged to the side of the screen without displaying a floating ball, the subsequent operations may be performed as shown in (b) to (e) of fig. 7B.
Similarly, after the recommended application list has been displayed on the electronic device, if the user releases the finger and the electronic device displays the target text in a hovering manner, the subsequent operations may be performed as shown in (b) to (e) of fig. 7B.
In one exemplary implementation, in the case where the user releases the finger after the recommended application list has been displayed on the electronic device, the application interaction process may include (a), (b), and (e) of fig. 7B. That is, after the interface of the map application is displayed (as shown in (b) of fig. 7B), if the user has released the finger, then after a preset period of time the electronic device adds the target text to the text input box on the map application interface and stops displaying the recommended application list, without requiring the user to perform the operation of clicking the text input box shown in (c) of fig. 7B or the operation of clicking the paste operation option shown in (d) of fig. 7B.
According to the exemplary application interaction processes listed above, the embodiments of the present application can trigger the display of multiple recommended application icons associated with the target text on the source application interface, based on the operation of selecting the target text in the source application and then dragging it, and can intelligently and automatically start the destination application of the target text according to the user's selection of a recommended application icon (by dragging the target text onto the icon or clicking it), thereby simplifying user operations and improving the user experience.
In practical applications, there are scenarios in which, after selecting one application in the recommended application list, the user wants to switch to another application in the list. The embodiments of the present application provide an application interaction mode for this scenario.
Fig. 9A is a schematic diagram illustrating a process of switching the recommended application without releasing the finger after dragging the target text. Referring to fig. 9A, for the case of not releasing the finger after dragging the target text shown in fig. 6 (b): when the interface of the map application from the recommended application list has been displayed on the screen of the electronic device, the recommended application list is still displayed, and the user keeps holding the target text without releasing the finger (see (a) of fig. 9A), the user may continue to drag the target text to an area outside the recommended application list, at which point the electronic device stops displaying the recommended application list (see (b) of fig. 9A).
Next, keeping the finger pressed, the user may continue to drag the target text to the right side of the screen; in response, the electronic device displays the recommended application list on the screen (see (c) of fig. 9A). The user then drags the target text onto another application in the recommended application list, for example the WeChat application, and stays there (see (c) of fig. 9A). In response to the target text being dragged onto the WeChat application icon in the recommended application list, the electronic device opens the WeChat application and displays its interface on the screen of the electronic device (see (d) of fig. 9A).
Next, the user may continue to drag the target text onto a contact, for example contact A in (e) of fig. 9A, on the interface of the WeChat application. In response to the target text being dragged onto contact A on the WeChat application interface, the electronic device then performs the operations after entering the WeChat application as shown in fig. 10A1 or as shown in fig. 10A2.
Fig. 9B is a schematic diagram illustrating a process of switching the recommended application in the case of releasing the finger after dragging the target text. Referring to fig. 9B, for the case of releasing the finger after dragging the target text shown in fig. 6 (c): when the interface of the map application from the recommended application list is displayed on the screen of the electronic device and the recommended application list is still displayed (see fig. 9B (a)), the user may click another application icon in the recommended application list, for example the WeChat application icon (see fig. 9B (a)). In response to the operation of clicking the WeChat application icon in the recommended application list, the electronic device opens the WeChat application and displays its interface on the screen of the electronic device (see fig. 9B (b)). In response to the duration for which the WeChat application interface has been displayed reaching a certain duration, the electronic device stops displaying the recommended application list, see fig. 9B (c).
The user may then click a contact, for example contact A, on the WeChat application interface shown in fig. 9B (c), after which, in response to contact A being clicked, the electronic device performs the operations after entering the WeChat application as shown in fig. 10B1 or as shown in fig. 10B2.
In another example, the user may click contact A on the interface shown in fig. 9B (b), that is, click contact A while the recommended application list is still being displayed. In response to contact A being clicked, the electronic device stops displaying the recommended application list and performs the operations after entering the WeChat application as shown in fig. 10B1 or as shown in fig. 10B2.
In the case where the interface of the destination application is displayed on the screen of the electronic device, if the current interface of the destination application has no paste position for the target text, the embodiments of the present application can continue to start and display the next-level interface, or multiple further levels of interfaces, of the current interface of the destination application.
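Descending interface levels until a paste position appears can be sketched as a search over the interface hierarchy. The data representation below (each level as a list of widget names) is an illustrative assumption.

```python
def find_paste_level(interfaces):
    """interfaces: interface descriptors from the current level downward,
    each a list of widget names. Return the depth of the first level that
    contains a text input box (a valid paste position), or None."""
    for depth, widgets in enumerate(interfaces):
        if "text_input_box" in widgets:
            return depth
    return None
```

For the WeChat example, the contact-list level has no paste position, so the device descends one level to the chat interface, which does.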
Fig. 10A1 is a schematic diagram illustrating an operation procedure after entering the WeChat application without releasing the finger after dragging the target text. Referring to fig. 10A1, in the case where the user does not release the finger after dragging the target text shown in fig. 6 (b), when the interface of the WeChat application from the recommended application list is displayed on the screen of the electronic device (see fig. 10A1 (a)), the user may continue to drag the target text to a clickable location on the WeChat application interface, i.e., a location which, when clicked, triggers display of the next-level interface of the current WeChat interface. For example, the clickable location may be contact A, as shown in fig. 10A1 (a).
Next, in response to contact A being clicked, the electronic device displays the next-level interface of the current WeChat interface, i.e., the chat interface with contact A, as shown in fig. 10A1 (b). The chat interface includes a text input box for sending a message to contact A.
In fig. 10A1 (b), the user continues to drag the target text to the text input box. In response, the electronic device pastes the target text "Beijing Haidian District × Road × Building" into the text input box, after which the WeChat application displays a "send" button on the interface, as shown in fig. 10A1 (c). If the user wants to adjust the content to be sent, the user can edit the content in the text input box (i.e., the target text) before clicking the "send" button.
Next, the user may click the "send" button. In response, the electronic device sends the content of the text input box to contact A and displays the post-send interface, as shown in fig. 10A1 (d).
In one example, the process of pasting the target text into the text input box shown in fig. 10A1 (b) and (c) may be replaced with a process similar to fig. 7A1 (b) and (c), or similar to fig. 7A3 (c) to (e), which is not repeated here.
Fig. 10A2 is a schematic diagram illustrating another operation procedure after entering the WeChat application without releasing the finger after dragging the target text. In this example, fig. 10A2 (a) is the same as fig. 10A1 (a); for its description, refer to that of fig. 10A1 (a), which is not repeated here. Unlike the example shown in fig. 10A1, in the example shown in fig. 10A2, when the user clicks contact A, the electronic device automatically sends the target text to contact A and displays the interface showing that the target text has been sent to contact A, as shown in fig. 10A2 (b).
As can be seen by comparison with fig. 10A1, the example shown in fig. 10A2 omits the operation steps shown in (b) and (c) of fig. 10A1 and turns the process of manually sending the target text to contact A into a process in which the electronic device sends it automatically, thereby reducing manual operations, improving the degree of automation and intelligence, and improving the user experience.
Fig. 10B1 is a schematic diagram illustrating an operation procedure after entering the WeChat application in the case of releasing the finger after dragging the target text. Referring to fig. 10B1, in the case of releasing the finger after dragging the target text shown in fig. 6 (c), when the interface of the WeChat application from the recommended application list is displayed on the screen of the electronic device (see fig. 10B1 (a)), the user may click contact A on the WeChat application interface. In response to contact A being clicked, the electronic device displays the chat interface with contact A, as shown in fig. 10B1 (b).
The chat interface of fig. 10B1 (b) includes a text input box for sending a message to contact A. The electronic device automatically pastes the target text "Beijing Haidian District × Road × Building" into the text input box, after which the WeChat application displays a "send" button on the interface. If the user wants to adjust the content to be sent, the user may edit the content in the text input box (i.e., the target text) before clicking the "send" button.
The user may then click the "send" button on the interface shown in fig. 10B1 (b). In response, the electronic device sends the content of the text input box to contact A and displays the post-send interface, as shown in fig. 10B1 (c).
Fig. 10B2 is a schematic diagram illustrating another operation procedure after entering the WeChat application in the case of releasing the finger after dragging the target text. Referring to fig. 10B2, in the case of releasing the finger after dragging the target text shown in fig. 6 (c), when the user clicks contact A, the electronic device, in response, sends the target text to contact A and displays the post-send interface, as shown in fig. 10B2 (b). As can be seen by comparison with fig. 10B1, the example shown in fig. 10B2 omits the operation steps shown in fig. 10B1 (b) and turns the process of manually sending the target text to the contact into an automatic one, further reducing manual operations, improving the degree of automation and intelligence, and improving the user experience.
The foregoing embodiments describe the application interaction process in the mode of directly displaying the destination application interface in full screen. This mode allows rapid switching to the destination application interface and improves the user experience.
However, in some application scenarios the user does not want the original interface (e.g., the interface displaying the OCR recognition result of picture a) to be completely occluded, for example when the user wants to be able to switch back to it quickly. To meet the user requirements in such scenarios, the embodiments of the present application provide another display mode for the destination application interface.
Fig. 11A is a schematic diagram illustrating a process of displaying the map application interface in a floating window without releasing the finger after dragging the target text. The process of opening the map application and displaying its interface in the example shown in fig. 11A is the same as in the example shown in fig. 7A2 and is not repeated here. The example shown in fig. 11A differs from that in fig. 7A2 in that the map application interface is displayed in a floating window, whereas in fig. 7A2 it is displayed directly in full screen.
It should be noted that, in the case where the map application interface is displayed in a floating window, the process of opening the map application and displaying its interface may also be the same as in the example shown in fig. 7A1 or fig. 7A3, the only difference being the display manner of the map application interface, which is not repeated here.
Fig. 11B is a schematic diagram illustrating a process of displaying the map application interface in a floating window in the case of releasing the finger after dragging the target text. The process of opening the map application and displaying its interface in the example shown in fig. 11B may be the same as in the example shown in fig. 7B, or the same as the process (a)→(b)→(e) of fig. 7B (i.e., after the map application interface is displayed, the electronic device adds the target text to the text input box on it), which is not repeated here. The example shown in fig. 11B differs from that in fig. 7B in that the map application interface is displayed in a floating window, whereas in fig. 7B it is displayed directly in full screen.
By displaying the destination application in a floating window, the embodiments of the present application allow the user to conveniently see the source interface of the target text, meeting the user requirement and improving the user experience.
Fig. 12A is a schematic diagram illustrating a process of displaying the map application interface in a split screen without releasing the finger after dragging the target text. The process of opening the map application and displaying its interface in the example shown in fig. 12A is the same as in the example shown in fig. 7A2 and is not repeated here. The example shown in fig. 12A differs from that in fig. 7A2 in that the map application interface is displayed in a split screen, with a top-bottom split, whereas in fig. 7A2 it is displayed directly in full screen.
It should be noted that, in the case where the map application interface is displayed in a split screen, the process of opening the map application and displaying its interface may also be the same as in the example shown in fig. 7A1 or fig. 7A3, the only difference being the display manner of the map application interface, which is not repeated here.
Fig. 12B is a schematic diagram illustrating a process of displaying the map application interface in a split screen in the case of releasing the finger after dragging the target text. The process of opening the map application and displaying its interface in the example shown in fig. 12B may be the same as in the example shown in fig. 7B, or the same as the process (a)→(b)→(e) of fig. 7B (i.e., after switching to the map application interface, the electronic device adds the target text to the text input box on it), which is not repeated here. The example shown in fig. 12B differs from that in fig. 7B in that the map application interface is displayed in a split screen, with a top-bottom split, whereas in fig. 7B it is displayed directly in full screen.
By displaying the destination application in a split screen, the embodiments of the present application allow the user to see the source interface of the target text without interface switching, meeting the user requirement and improving the user experience. Moreover, since the user can see the source interface content and the destination application interface content simultaneously, the user can copy content from the source interface to the destination application interface and paste it multiple times, avoiding the trouble of frequent interface switching and saving time.
The foregoing description takes a single-screen electronic device (for example, a single-screen mobile phone, a tablet, etc.) as an example, and applies equally to a folding-screen electronic device (for example, a folding-screen mobile phone) and to the single screen formed when such a folding-screen electronic device is folded.
For electronic devices with a narrower screen, the top-bottom split-screen mode shown in figs. 12A and 12B can be used to display the destination application interface. For electronic devices with a wide screen, such as a tablet or a folding-screen mobile phone unfolded to the full-screen state, the embodiments of the present application provide a left-right split-screen display mode.
Fig. 13A is a schematic diagram illustrating a process of displaying the Jingdong application interface in a split screen without releasing the finger after dragging the target text. The process of opening the Jingdong application and displaying its interface in the example shown in fig. 13A is the same as in the example shown in fig. 7A1 and is not repeated here. The example shown in fig. 13A differs from that in fig. 7A1 in that the Jingdong application interface is displayed in a split screen, with a left-right split, whereas in fig. 7A1 it is displayed directly in full screen.
It should be noted that, in the case where the Jingdong application interface is displayed in a split screen, the process of opening the Jingdong application and displaying its interface may also be the same as in the example shown in fig. 7A2 or fig. 7A3, the only difference being the display manner of the Jingdong application interface, which is not repeated here.
Fig. 13B is a schematic diagram illustrating a process of displaying the Jingdong application interface in a split screen in the case of releasing the finger after dragging the target text. The process of opening the Jingdong application and displaying its interface in the example shown in fig. 13B may be the same as in the example shown in fig. 7B, or the same as the process (a)→(b)→(e) of fig. 7B (i.e., after the Jingdong application interface is displayed, the electronic device automatically adds the target text to the text input box on it), which is not repeated here. The example shown in fig. 13B differs from that in fig. 7B in that the Jingdong application interface is displayed in a split screen, with a left-right split, whereas in fig. 7B it is displayed directly in full screen.
It should be noted that the foregoing modes of displaying the destination application interface on the electronic device, namely direct full-screen display, floating-window display, and split-screen display, may all be provided at the same time, with an enable switch set for each display mode. The user can then select the corresponding mode of displaying the destination application interface by turning on the corresponding enable switch according to personal usage habits. When the user selects split-screen display, the electronic device can determine, before splitting the screen, whether to use a top-bottom or a left-right split according to the device type. For example, when the electronic device is a single-screen, narrow-screen device (such as a single-screen mobile phone), it may use a top-bottom split; when the electronic device is a single-screen, wide-screen device (such as a tablet), it may use a left-right split; when the electronic device is a folding-screen device, it may further decide according to whether the screen is folded: a top-bottom split if the screen is folded, and a left-right split if the screen is unfolded.
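The split-orientation decision just described can be sketched as a small decision function. The device-type strings and function name are illustrative assumptions.

```python
def choose_split_orientation(device_type, folded=False):
    """Pick a split-screen orientation per the rules described above.
    device_type: "narrow" (e.g. single-screen phone), "wide" (e.g. tablet),
    or "foldable"; `folded` applies only to foldable devices."""
    if device_type == "foldable":
        # Folded foldables behave like narrow phones; unfolded like tablets.
        return "top-bottom" if folded else "left-right"
    if device_type == "narrow":
        return "top-bottom"
    if device_type == "wide":
        return "left-right"
    raise ValueError("unknown device type: %s" % device_type)
```

The electronic device would consult this decision only when the user has enabled the split-screen display mode.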
Fig. 14A is a schematic diagram exemplarily showing the process of switching the destination application on a folding-screen or wide-screen electronic device when the user does not release the drag after dragging the target text. Fig. 14A illustrates switching the destination application from the Jingdong application to the WeChat application without releasing the drag. The process in the example shown in fig. 14A follows the same principle as the process in the example shown in fig. 9A and will not be repeated here. Between (b) and (c) of fig. 14A, the step in which the user drags the target text to an area outside the recommended application list, and the electronic device displays the recommended application list in response, is omitted.
Fig. 14B is a schematic diagram exemplarily showing the process of switching the destination application on a folding-screen or wide-screen electronic device when the user releases the drag after dragging the target text. Fig. 14B illustrates switching the destination application from the Jingdong application to the WeChat application in this case. The process in the example shown in fig. 14B follows the same principle as the process in the example shown in fig. 9B and will not be repeated here.
Fig. 15 is a schematic diagram exemplarily showing the process of displaying one destination application based on target text selected in another destination application. Referring to (a) of fig. 15, the electronic device has displayed the WeChat application interface on the right side of the screen based on the dragged target text "Beijing city sea area x road x building" selected in the picture a interface on the left side of the screen. The user then clicks picture b in the WeChat application interface on the right side of the screen, and in response the electronic device displays picture b on the right side of the screen, as shown in (b) of fig. 15.
Next, the electronic device acquires the OCR recognition result of picture b on the same principle as the OCR recognition result of the aforementioned picture a. After acquiring the OCR recognition result of picture b, the electronic device displays an OCR-recognition-result display button on the interface of picture b, as shown in (c) of fig. 15.
Then, when the user clicks the OCR-recognition-result display button on the interface of picture b, the electronic device displays the OCR recognition result of picture b on the right side of the screen in response. Thereafter, following the same procedure as in fig. 7B, the user selects the contact application as the destination application for the new target texts "177" and "1991" through the recommended application list triggered by picture b, as shown in (d) of fig. 15.
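The step of deriving new target texts such as "177" and "1991" from an OCR recognition result could be approximated with a simple pattern match; the regex and function name below are assumptions, not the patent's actual recognition logic:

```python
import re

def extract_candidates(ocr_text: str):
    """Pull candidate target texts (phone-number fragments here)
    out of an OCR recognition result. Illustrative only."""
    # Treat runs of 3+ digits as phone-number fragments.
    return re.findall(r"\d{3,}", ocr_text)

print(extract_candidates("Contact: 177****1991"))  # ['177', '1991']
```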
The electronic device then displays the contact application interface on the left side of the screen in response to the user selecting the contact application as the destination application in the recommended application list triggered by picture b. Next, in response to the user clicking the text input box corresponding to "mobile phone" on the contact application interface, the electronic device pops up a text operation menu including a paste option on the contact application interface, and stops displaying the recommended application list and the hover ball corresponding to the target text on the right side of the screen, as shown in (e) of fig. 15.
Finally, in response to the user clicking the paste option in the text operation menu popped up on the contact application interface, the electronic device adds the new target texts "177" and "1991" to the text input box corresponding to "mobile phone", as shown in (f) of fig. 15.
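Detecting which text input box "matches" a dragged target text (e.g., the "mobile phone" box for digit strings) could be modeled as checking the text against each box's input-content requirement. The box labels and rules below are illustrative assumptions, not the patent's actual matching logic:

```python
import re

# Each input box's content requirement, keyed by the box's label (assumed).
FIELD_RULES = {
    "mobile phone": lambda s: re.fullmatch(r"\d{7,11}", s) is not None,
    "address":      lambda s: len(s) > 5 and not s.isdigit(),
}

def matching_boxes(target_text: str, rules=FIELD_RULES):
    """Return the labels of the text input boxes whose input-content
    requirement the dragged target text satisfies."""
    return [label for label, ok in rules.items() if ok(target_text)]

print(matching_boxes("17712341991"))                # ['mobile phone']
print(matching_boxes("Beijing x road x building"))  # ['address']
```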
An embodiment of the present application further provides an electronic device, comprising:
a memory and a processor, the memory coupled to the processor;
the memory stores program instructions that, when executed by the processor, cause the electronic device to perform any of the aforementioned application interaction methods.
An embodiment of the present application further provides a computer-readable storage medium comprising a computer program which, when run on an electronic device, causes the electronic device to execute any of the application interaction methods described above.
An embodiment of the present application further provides a chip comprising one or more interface circuits and one or more processors; the interface circuit is configured to receive signals from a memory of an electronic device and send the signals to the processor, the signals comprising computer instructions stored in the memory; when the computer instructions are executed by the processor, the electronic device performs any of the application interaction methods described above.
The electronic device, computer storage medium, computer program product, or chip provided in this embodiment is used to execute the corresponding method provided above; for its beneficial effects, reference may be made to the beneficial effects of the corresponding method, which are not repeated here.
Those skilled in the art will appreciate that, for convenience and brevity of description, only the above division of functional modules is illustrated; in practical applications, the functions may be allocated to different functional modules as needed, that is, the internal structure of the apparatus may be divided into different functional modules to perform all or part of the functions described above.
In the several embodiments provided by the present application, it should be understood that the disclosed apparatus and method may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative: the division into modules or units is only a logical functional division, and other divisions are possible in actual implementation; for example, multiple units or components may be combined or integrated into another apparatus, or some features may be omitted or not performed. In addition, the couplings, direct couplings, or communication connections shown or discussed may be indirect couplings or communication connections through interfaces, apparatuses, or units, and may be electrical, mechanical, or in other forms.
The units described as separate parts may or may not be physically separate, and the parts shown as units may be one physical unit or a plurality of physical units, may be located in one place, or may be distributed in a plurality of different places. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
Any of the various embodiments of the present application, as well as any features within the same embodiment, may be freely combined with one another. Any such combination falls within the scope of the present application.
If the integrated units are implemented in the form of software functional units and sold or used as stand-alone products, they may be stored in a readable storage medium. Based on such understanding, the technical solution of the embodiments of the present application, in essence, or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product stored in a storage medium, including several instructions for causing an electronic device (which may be a single-chip microcomputer, a chip, or the like) or a processor to execute all or part of the steps of the methods of the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
The embodiments of the present application have been described above with reference to the accompanying drawings, but the present application is not limited to the above-described embodiments, which are merely illustrative and not restrictive. Those of ordinary skill in the art may make many variations without departing from the spirit of the present application and the scope of the claims, and all such variations fall within the protection of the present application.

Claims (23)

1. An application interaction method applied to an electronic device, the method comprising:
displaying an interface of a first application;
receiving a selection operation of a first target text on an interface of the first application;
responsive to dragging the first target text to a first area, displaying icons of a second application and icons of a third application in a second area, wherein the second area is positioned at the edge part of the screen of the electronic equipment;
in response to dragging the first target text to the icon of the second application, starting the second application, and displaying the interface of the second application in the area of the interface of the first application;
further comprises:
copying the first target text in response to dragging the first target text to a first area;
In response to dragging the first target text to the icon of the second application, starting the second application, and after displaying the interface of the second application in the area of displaying the interface of the first application, further comprising:
when the interface of the second application is detected to comprise a first text input box matched with the first target text, displaying the first target text in a superposition mode in an area in the first text input box;
responsive to a release operation in the first area, pasting the first target text into the first text input box, and displaying the first target text in the first text input box, wherein the content of the first target text is text meeting the requirement of the first text input box on input content; or in response to a release operation outside the first area, not displaying the first target text within the first text input box.
2. The method of claim 1, wherein in response to dragging the first target text onto an icon of the second application, launching the second application, after displaying an interface of the second application in a region where the first application interface is displayed, further comprising:
When the interface of the second application is detected to comprise a first text input box matched with the first target text, displaying the first target text in a superposition mode in an area in the first text input box;
stopping the area overlapping display of the first target text in the first text input box in response to dragging the first target text outside the first area;
responsive to dragging the first target text to the first region, displaying an icon of the second application and an icon of the third application in the second region;
and responding to dragging the first target text to the icon of the third application, starting the third application, and displaying the interface of the third application in the area of the interface of the second application.
3. The method as recited in claim 1, further comprising:
copying the first target text in response to dragging the first target text to a first area;
in response to dragging the first target text to the icon of the second application, starting the second application, and after displaying the interface of the second application in the area of displaying the interface of the first application, further comprising:
And a second text input box and a third text input box are provided on the interface of the second application; in response to a release operation after the first target text is dragged to the second text input box, pasting the first target text into the second text input box, and displaying the first target text in the second text input box.
4. The method of claim 1, wherein in response to dragging the first target text onto an icon of the second application, launching the second application, after displaying an interface of the second application in a region where the first application interface is displayed, further comprising:
stopping displaying the icon of the second application and the icon of the third application;
the interface of the second application comprises a session area of a first object and a session area of a second object, and the session interface with the first object is displayed in the area of the interface of the second application and belongs to the second application in response to dragging the first target text to the session area of the first object;
a conversation interface with the first object comprises a fourth text input box matched with the first target text, the first target text is pasted into the fourth text input box, and the first target text is displayed in the fourth text input box;
And transmitting the content in the fourth text input box to the first object in response to receiving the transmission instruction.
5. The method of claim 1, wherein in response to dragging the first target text onto an icon of the second application, launching the second application, after displaying an interface of the second application in a region where the first application interface is displayed, further comprising:
stopping displaying the icon of the second application and the icon of the third application;
the interface of the second application comprises a session area of a first object and a session area of a second object, and the session interface with the first object is displayed in the area of the interface of the second application and belongs to the second application in response to dragging the first target text to the session area of the first object;
a conversation interface with the first object comprises a fourth text input box matched with the first target text, the first target text is pasted into the fourth text input box, and the first target text is displayed in the fourth text input box;
receiving an editing instruction for the text in the fourth text input box, and editing the text in the fourth text input box according to the editing instruction;
And after the editing is completed, transmitting the content in the fourth text input box to the first object in response to receiving a transmitting instruction.
6. The method of claim 1, wherein in response to dragging the first target text onto an icon of the second application, launching the second application, after displaying an interface of the second application in a region where the first application interface is displayed, further comprising:
stopping displaying the icon of the second application and the icon of the third application;
the interface of the second application comprises a session area of a first object and a session area of a second object, the first target text is sent to the first object in response to dragging the first target text to the session area of the first object, and the session interface with the first object is displayed after the first target text is sent to the area of the interface of the second application.
7. The method of claim 1, wherein launching the second application in response to dragging the first target text onto an icon of the second application, after displaying an interface of the second application in a region of the interface where the first application is displayed, further comprises:
And stopping displaying the icon of the second application and the icon of the third application.
8. The method as recited in claim 1, further comprising:
displaying an interface of a fourth application;
receiving a selection operation of a second target text on an interface of the fourth application;
responsive to dragging the second target text to the first region, displaying an icon of a fifth application and an icon of a sixth application in the second region;
and responding to dragging the second target text to the icon of the fifth application, starting the fifth application, displaying a floating window in a superposition manner on the interface of the fourth application, and displaying the interface of the fifth application in the floating window.
9. The method as recited in claim 1, further comprising:
displaying an interface of a fourth application;
receiving a selection operation of a second target text on an interface of the fourth application;
responsive to dragging the second target text to the first region, displaying an icon of a fifth application and an icon of a sixth application in the second region;
and in response to dragging the second target text to the icon of the fifth application, starting the fifth application, displaying a first split screen and a second split screen in a region displaying the interface of the fourth application, displaying the interface of the fourth application in the first split screen, and displaying the interface of the fifth application in the second split screen.
10. The method of claim 9, wherein the first split screen is located in an upper half of the electronic device screen and the second split screen is located in a lower half of the electronic device screen.
11. The method of claim 9, wherein the first split screen is located in a left half of the electronic device screen and the second split screen is located in a right half of the electronic device screen.
12. The method as recited in claim 9, further comprising:
receiving a selection operation of a third target text on an interface of the fifth application;
displaying an icon of a seventh application and an icon of an eighth application in the second area in response to dragging the third target text to the first area;
and responding to dragging the third target text to the icon of the seventh application, starting the seventh application, and displaying an interface of the seventh application on the first split screen.
13. The method of claim 1, wherein the second application and the third application are predicted from the first target text.
14. The method of claim 1, wherein the first region is a screen side region.
15. An application interaction method applied to an electronic device, the method comprising:
receiving a selection operation of a first target text on an interface of a first application;
predicting a second application and a third application according to the first target text;
responsive to dragging the first target text to a first area, displaying an icon of the second application and an icon of the third application in a second area, wherein the second area is positioned at the edge part of the screen of the electronic equipment;
in response to dragging the first target text to the icon of the second application, starting the second application, and displaying the interface of the second application in the area of the interface of the first application;
further comprises:
copying the first target text in response to dragging the first target text to a first area;
when the interface of the second application is detected to comprise a first text input box matched with the first target text, displaying the first target text in a superposition mode in an area in the first text input box;
responsive to a release operation in the first area, pasting the first target text into the first text input box, and displaying the first target text in the first text input box, wherein the content of the first target text is text meeting the requirement of the first text input box on input content; or in response to a release operation outside the first area, not displaying the first target text within the first text input box.
16. The method as recited in claim 15, further comprising:
receiving a selection operation of a second target text on an interface of a fourth application;
predicting a fifth application and a sixth application according to the second target text;
responsive to dragging the second target text to the first region, displaying an icon of a fifth application and an icon of a sixth application in the second region;
and responding to dragging the second target text to the icon of the fifth application, starting the fifth application, displaying a floating window in a superposition manner on the interface of the fourth application, and displaying the interface of the fifth application in the floating window.
17. The method as recited in claim 15, further comprising:
receiving a selection operation of a second target text on an interface of a fourth application;
predicting a fifth application and a sixth application according to the second target text;
responsive to dragging the second target text to the first region, displaying an icon of a fifth application and an icon of a sixth application in the second region;
and in response to dragging the second target text to the icon of the fifth application, starting the fifth application, displaying a first split screen and a second split screen in a region displaying the interface of the fourth application, displaying the interface of the fourth application in the first split screen, and displaying the interface of the fifth application in the second split screen.
18. The method as recited in claim 17, further comprising:
receiving a selection operation of a third target text on an interface of the fifth application;
predicting a seventh application and an eighth application according to the third target text;
displaying an icon of a seventh application and an icon of an eighth application in the second area in response to dragging the third target text to the first area;
and responding to dragging the third target text to the icon of the seventh application, starting the seventh application, and displaying an interface of the seventh application on the first split screen.
19. The method of claim 17, wherein displaying the first and second split screens in the area of the interface displaying the fourth application comprises:
when the electronic device is a single-screen device, displaying the first split screen on the upper half of the screen of the electronic device, and displaying the second split screen on the lower half of the screen of the electronic device.
20. The method of claim 17, wherein displaying the first and second split screens in the area of the interface displaying the fourth application comprises:
when the electronic device is a folding-screen device, displaying the first split screen on the left half of the screen of the electronic device, and displaying the second split screen on the right half of the screen of the electronic device.
21. An electronic device, comprising:
a memory and a processor, the memory coupled with the processor;
the memory stores program instructions that, when executed by the processor, cause the electronic device to perform the application interaction method of any of claims 1-20.
22. A computer readable storage medium comprising a computer program which, when run on an electronic device, causes the electronic device to perform the application interaction method of any of claims 1-20.
23. A chip comprising one or more interface circuits and one or more processors; the interface circuit is configured to receive a signal from a memory of an electronic device and to send the signal to the processor, the signal including computer instructions stored in the memory; the computer instructions, when executed by the processor, cause the electronic device to perform the application interaction method of any of claims 1-20.
CN202111340669.1A 2021-11-12 2021-11-12 Application interaction method and electronic equipment Active CN115033142B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111340669.1A CN115033142B (en) 2021-11-12 2021-11-12 Application interaction method and electronic equipment


Publications (2)

Publication Number Publication Date
CN115033142A CN115033142A (en) 2022-09-09
CN115033142B true CN115033142B (en) 2023-09-12

Family

ID=83117948

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111340669.1A Active CN115033142B (en) 2021-11-12 2021-11-12 Application interaction method and electronic equipment

Country Status (1)

Country Link
CN (1) CN115033142B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104050153A (en) * 2013-03-11 2014-09-17 三星电子株式会社 Method And Apparatus For Copying And Pasting Of Data
CN106575195A (en) * 2014-10-24 2017-04-19 谷歌公司 Improved drag-and-drop operation on a mobile device
CN109960446A (en) * 2017-12-25 2019-07-02 华为终端有限公司 It is a kind of to control the method and terminal device that selected object is shown in application interface
CN111124709A (en) * 2019-12-13 2020-05-08 维沃移动通信有限公司 Text processing method and electronic equipment
CN112882623A (en) * 2021-02-09 2021-06-01 维沃移动通信有限公司 Text processing method and device, electronic equipment and storage medium

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6986105B2 (en) * 2003-01-30 2006-01-10 Vista Print Limited Methods employing multiple clipboards for storing and pasting textbook components
US8370762B2 (en) * 2009-04-10 2013-02-05 Cellco Partnership Mobile functional icon use in operational area in touch panel devices


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Research on the Functional Extension Design of Interactive Interfaces: Taking the Smartphone as an Example; Cui Tianjian, Dong Tiantian; Journal of Nanjing University of the Arts (Fine Arts & Design); pp. 206-210 *

Also Published As

Publication number Publication date
CN115033142A (en) 2022-09-09

Similar Documents

Publication Publication Date Title
US20220377128A1 (en) File transfer display control method and apparatus, and corresponding terminal
US20220342850A1 (en) Data transmission method and related device
WO2021063074A1 (en) Method for split-screen display and electronic apparatus
CN106775334B (en) File calling method and device on mobile terminal and mobile terminal
US20240053879A1 (en) Object Drag Method and Device
CN111666055B (en) Data transmission method and device
CN110519461B (en) File transmission method, device, computer equipment and storage medium
WO2019128923A1 (en) Method for controlling displaying selected object in application interface, and terminal device
CN113407086B (en) Object dragging method, device and storage medium
CN112306325B (en) Interaction control method and device
WO2021179904A1 (en) Labeled data processing method, device, and storage medium
CN111597000A (en) Small window management method and terminal
WO2020259669A1 (en) View display method and electronic device
CN113467660A (en) Information sharing method and electronic equipment
WO2016173307A1 (en) Message copying method and device, and smart terminal
WO2022135476A1 (en) Screenshot method and apparatus, and electronic device
CN113849092A (en) Content sharing method and device and electronic equipment
WO2023221946A1 (en) Information transfer method and electronic device
WO2023241563A1 (en) Data processing method and electronic device
CN115033142B (en) Application interaction method and electronic equipment
CN112399010A (en) Page display method and device and electronic equipment
WO2023082817A1 (en) Application program recommendation method
CN115033153B (en) Application program recommendation method and electronic device
CN111796733B (en) Image display method, image display device and electronic equipment
CN115756269A (en) Translation method and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant