CN115033142A - Application interaction method and electronic equipment - Google Patents


Info

Publication number
CN115033142A
CN115033142A
Authority
CN
China
Prior art keywords
application
interface
target text
displaying
area
Prior art date
Legal status
Granted
Application number
CN202111340669.1A
Other languages
Chinese (zh)
Other versions
CN115033142B (en)
Inventor
王晨博
卫渊
周元甲
刘秋冶
伍国林
Current Assignee
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date
Filing date
Publication date
Application filed by Honor Device Co Ltd filed Critical Honor Device Co Ltd
Priority to CN202111340669.1A
Publication of CN115033142A
Application granted
Publication of CN115033142B
Active legal status
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04817 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0486 Drag-and-drop
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The embodiments of this application provide an application interaction method and an electronic device. The application interaction method includes: displaying an interface of a first application, and receiving an operation selecting a first target text on the interface of the first application. Then, in response to the first target text being dragged to a first area, an icon of a second application and an icon of a third application are displayed in a second area, where the second area is located at an edge of the screen of the electronic device. Then, in response to the first target text being dragged onto the icon of the second application, the second application is launched, and the interface of the second application is displayed in the area where the interface of the first application was displayed. In this way, by dragging text from the first application interface (the text's source application interface) onto the icon of the second application (the destination application), the user can conveniently launch the destination application and display its interface, which simplifies the operation and improves the user experience.

Description

Application interaction method and electronic equipment
Technical Field
The embodiment of the application relates to the field of terminal electronic equipment, in particular to an application interaction method and electronic equipment.
Background
Currently, a large number of applications are typically installed on intelligent terminal electronic devices such as mobile phones and tablets. When using such devices, users often need to copy text from one application and paste it into another. In that case, after copying the text, the user must exit the interface of the source application, find the application into which the text is to be pasted on the home screen, open that application's interface, locate the paste position, and finally paste the text. As can be seen, this interaction between applications is quite cumbersome. In particular, when copying and pasting multiple times, the user has to switch frequently between applications; the operation is complicated and time-consuming, and the user experience is poor.
Disclosure of Invention
To solve this technical problem, embodiments of this application provide an application interaction method and an electronic device, with which a user can conveniently launch a destination application and display its interface by operating an application icon from the interface of the text's source application, thereby simplifying the operation and improving the user experience.
In a first aspect, the present application provides an application interaction method, including: displaying an interface of a first application, and receiving an operation selecting a first target text on the interface of the first application. In response to the first target text being dragged to a first area, an icon of a second application and an icon of a third application are displayed in a second area, where the second area is located at an edge of the screen of the electronic device. Then, in response to the first target text being dragged onto the icon of the second application, the second application is launched, and the interface of the second application is displayed in the area where the interface of the first application was displayed. In this way, by dragging text from the first application interface (the source application interface) onto the icon of the second application (the destination application), the user can conveniently launch the destination application and display its interface, which simplifies the operation and improves the user experience.
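As an illustrative aside, the first-aspect flow above can be modeled as a small state machine. This is a hypothetical sketch only; all names (EDGE_REGION, InteractionFlow, the app labels) are assumptions for illustration and do not come from the patent.

```python
# Hypothetical sketch of the first-aspect interaction flow described above.
EDGE_REGION = "first_area"   # side region of the screen that triggers recommendations

class InteractionFlow:
    def __init__(self, predicted_apps):
        self.predicted_apps = predicted_apps   # e.g. ["second_app", "third_app"]
        self.visible_icons = []                # icons currently shown in the second area
        self.foreground_app = "first_app"      # source application is in the foreground

    def drag_to(self, region):
        # Dragging the selected text into the first area reveals the icon bar.
        if region == EDGE_REGION:
            self.visible_icons = list(self.predicted_apps)

    def drop_on_icon(self, app):
        # Dropping the text on an icon launches that app in place of the source app.
        if app in self.visible_icons:
            self.foreground_app = app
            self.visible_icons = []

flow = InteractionFlow(["second_app", "third_app"])
flow.drag_to(EDGE_REGION)
flow.drop_on_icon("second_app")
print(flow.foreground_app)  # second_app
```

The same skeleton extends naturally to the later aspects (switching icons, floating windows, split screens) by adding states for where the destination interface is displayed.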
According to the first aspect, after the second application is launched in response to the first target text being dragged onto its icon, and the interface of the second application is displayed in the area where the interface of the first application was displayed, the method may further include: when it is detected that the interface of the second application includes a first text input box matching the first target text, displaying the first target text overlaid on the area inside the first text input box. Then, in response to the first target text being dragged outside the first area, the overlaid display of the first target text in the first text input box is stopped. Next, in response to the first target text being dragged to the first area, the icon of the second application and the icon of the third application are displayed in the second area. Then, in response to the first target text being dragged onto the icon of the third application, the third application is launched, and the interface of the third application is displayed in the area where the interface of the second application was displayed. In this way, by dragging the text from one application icon to another on the source application interface, the user can conveniently switch the destination application, which simplifies the operation and improves the user experience.
According to the first aspect, the application interaction method further includes: in response to the first target text being dragged to the first area, copying the first target text. Then, after the second application is launched in response to the first target text being dragged onto its icon, and the interface of the second application is displayed in the area where the interface of the first application was displayed, when it is detected that the interface of the second application includes a first text input box matching the first target text, the first target text is displayed overlaid on the area inside the first text input box. Next, in response to a release operation (the user lifting the finger), the first target text is pasted into the first text input box and displayed within it. In this way, after dragging the text, the user can have it pasted at the corresponding position automatically simply by releasing the touch, which simplifies the operation and improves the user experience.
Here, matching means that the content of the first target text meets the input requirement of the first text input box. For example, a text input box named "detailed address" requires that the input content be an address. If the content of a given text is an address, then the text input box named "detailed address" is a text input box matching that text.
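One way such a matching check could work is a content-type heuristic per box name. This is a minimal sketch under stated assumptions: the regexes, box names, and function names are invented for illustration and are not from the patent.

```python
# Illustrative only: one way a text input box's content requirement could be checked.
import re

def looks_like_address(text):
    # Crude heuristic: contains a road/street keyword or a house-number pattern.
    return bool(re.search(r"(road|street|avenue|district|no\.\s*\d+)", text, re.I))

def box_matches(box_name, text):
    # A "detailed address" box matches only texts whose content is an address.
    if box_name == "detailed address":
        return looks_like_address(text)
    return True  # other boxes accept any text in this sketch

print(box_matches("detailed address", "No. 3 Xinxi Road, Haidian District"))  # True
print(box_matches("detailed address", "hello world"))                         # False
```

A production implementation would more plausibly rely on the input box's declared input type plus an NLP entity classifier, but the match decision it feeds into is the same.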
According to the first aspect, the application interaction method may further include: in response to the first target text being dragged to the first area, copying the first target text. After the second application is launched in response to the first target text being dragged onto its icon, and the interface of the second application is displayed in the area where the interface of the first application was displayed, the method further includes: the interface of the second application includes a second text input box and a third text input box; in response to a release operation after the first target text is dragged to the second text input box, the first target text is pasted into the second text input box and displayed within it. In this way, when the destination application interface contains multiple text input boxes, the box into which the text is pasted can be selected according to where the text is dragged, which simplifies the operation and improves the user experience.
According to the first aspect, after the second application is launched in response to the first target text being dragged onto its icon, and the interface of the second application is displayed in the area where the interface of the first application was displayed, the method further includes: stopping displaying the icon of the second application and the icon of the third application. The interface of the second application includes a conversation area of a first object and a conversation area of a second object; in response to the first target text being dragged to the conversation area of the first object, a conversation interface with the first object is displayed in the area where the interface of the second application was displayed, the conversation interface with the first object belonging to the second application. Next, the conversation interface with the first object includes a fourth text input box matching the first target text; the first target text is pasted into the fourth text input box and displayed within it. Then, in response to receiving a sending instruction, the content in the fourth text input box is sent to the first object. In this way, by continuing to drag the text within the destination application interface, display of the next-level interface below the destination application's current interface can be conveniently triggered, which simplifies the operation and improves the user experience.
According to the first aspect, after the second application is launched in response to the first target text being dragged onto its icon, and the interface of the second application is displayed in the area where the interface of the first application was displayed, the method further includes: stopping displaying the icon of the second application and the icon of the third application. In response to the first target text being dragged to the conversation area of the first object, a conversation interface with the first object is displayed in the area where the interface of the second application was displayed, the conversation interface with the first object belonging to the second application. The conversation interface with the first object includes a fourth text input box matching the first target text; the first target text is pasted into the fourth text input box and displayed within it. An editing instruction for the text in the fourth text input box is received, and the text in the fourth text input box is edited according to the editing instruction. After editing is completed, in response to receiving a sending instruction, the content in the fourth text input box is sent to the first object. In this way, the text pasted into the text input box can be edited within the destination application interface and conveniently modified to suit the user's needs, meeting the requirements of specific scenarios and improving the user experience.
According to the first aspect, after the second application is launched in response to the first target text being dragged onto its icon, and the interface of the second application is displayed in the area where the interface of the first application was displayed, the method further includes: stopping displaying the icon of the second application and the icon of the third application. The interface of the second application includes a conversation area of the first object and a conversation area of the second object; in response to the first target text being dragged to the conversation area of the first object, the first target text is sent to the first object, and the conversation interface with the first object after sending is displayed in the area where the interface of the second application was displayed. In this way, text can be sent to a target object automatically by dragging it to that object's conversation area, which simplifies the operation and improves the user experience.
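The drag-to-conversation-area behavior described above can be sketched as follows. This is a hypothetical model, not the patented implementation; the class and object names are illustrative.

```python
# Hypothetical sketch: dropping dragged text on an object's conversation area
# sends the text directly and then shows the conversation interface.
class ChatApp:
    def __init__(self, objects):
        self.conversation_areas = objects   # e.g. ["first_object", "second_object"]
        self.sent = []                      # (recipient, text) pairs already sent

    def drop_text_on(self, obj, text):
        # Dropping the dragged text on a conversation area sends it immediately
        # and returns the identifier of the interface displayed afterwards.
        if obj in self.conversation_areas:
            self.sent.append((obj, text))
            return f"conversation_with_{obj}"
        return None

app = ChatApp(["first_object", "second_object"])
shown = app.drop_text_on("first_object", "No. 5 Chaoyang Road")
print(shown)  # conversation_with_first_object
```

The editable variant in the preceding paragraph differs only in staging the text in an input box instead of appending it to `sent` right away.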
According to the first aspect, in response to dragging the first target text onto the icon of the second application, the second application is launched, and after displaying the interface of the second application in the area where the interface of the first application is displayed, the method further includes: and stopping displaying the icon of the second application and the icon of the third application.
According to the first aspect, the application interaction method may further include: displaying an interface of a fourth application. An operation selecting a second target text is received on the interface of the fourth application, and, in response to the second target text being dragged to the first area, an icon of a fifth application and an icon of a sixth application are displayed in the second area. Then, in response to the second target text being dragged onto the icon of the fifth application, the fifth application is launched, a floating window is displayed overlaid on the interface of the fourth application, and the interface of the fifth application is displayed in the floating window. In this way, by using the floating window, the source application interface and the destination application interface of the text can be displayed on the screen of the electronic device at the same time, which facilitates copying and pasting multiple times, simplifies the operation, and improves the user experience.
According to the first aspect, the application interaction method may further include: displaying an interface of a fourth application, receiving an operation selecting a second target text on the interface of the fourth application, and, in response to the second target text being dragged to the first area, displaying an icon of a fifth application and an icon of a sixth application in the second area. Then, in response to the second target text being dragged onto the icon of the fifth application, the fifth application is launched, a first split screen and a second split screen are displayed in the area where the interface of the fourth application was displayed, the interface of the fourth application is displayed in the first split screen, and the interface of the fifth application is displayed in the second split screen. In this way, by means of split screen, the source application interface and the destination application interface of the text can be displayed on the screen of the electronic device at the same time, which facilitates repeated copying and pasting, simplifies the operation, and improves the user experience.
According to a first aspect, the first screen division is located in an upper half of a screen of the electronic device and the second screen division is located in a lower half of the screen of the electronic device.
According to a first aspect, the first screen division is located in a left half of a screen of the electronic device and the second screen division is located in a right half of the screen of the electronic device.
According to the first aspect, the application interaction method may further include: receiving an operation selecting a third target text on the interface of the fifth application, and, in response to the third target text being dragged to the first area, displaying an icon of a seventh application and an icon of an eighth application in the second area. Then, in response to the third target text being dragged onto the icon of the seventh application, the seventh application is launched, and the interface of the seventh application is displayed in the first split screen. In this way, the split screen containing the text's source application interface remains unchanged while the destination application interface is displayed in the other split screen, so that the source and destination application interfaces can be displayed on the screen of the electronic device at the same time, which facilitates repeated copying and pasting, simplifies the operation, and improves the user experience.
According to a first aspect, the second application and the third application are predicted from the first target text.
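The patent states only that the second and third applications are predicted from the first target text, not how. As one illustrative possibility, a rule-based predictor could key on the content type of the text; the rules, app names, and regexes below are assumptions for the sketch.

```python
# Hypothetical rule-based app predictor; not the patent's actual mechanism.
import re

def predict_apps(target_text):
    # Return two recommended destination apps based on the text's content type.
    if re.search(r"https?://", target_text):
        return ["Browser", "Notes"]
    if re.search(r"\d{11}", target_text):            # looks like a phone number
        return ["Contacts", "Messaging"]
    if re.search(r"(road|street|district)", target_text, re.I):
        return ["Maps", "Shopping"]
    return ["Messaging", "Notes"]                    # generic fallback

print(predict_apps("No. 5 Chaoyang Road"))  # ['Maps', 'Shopping']
```

A real device would more likely use an on-device classifier plus the user's app-usage history, but the output contract is the same: a short ranked list of candidate destination applications for the second area.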
According to the first aspect, the first region is a region at a side of the screen. For example, the first region may be a region on the right side of the screen; in another example, it may be a region on the left side of the screen.
In one example, the first region is not coincident with the second region.
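Deciding whether a drag has reached such a side region can be as simple as a band hit test on the pointer's x-coordinate. This is a sketch under stated assumptions: the 48-pixel band width and the function name are invented for illustration.

```python
# Sketch of a side-region hit test for the first region.
BAND = 48  # assumed band width in pixels

def in_first_region(x, screen_width, side="right"):
    # The first region is a strip along one side of the screen.
    if side == "right":
        return x >= screen_width - BAND
    return x <= BAND  # left-side variant

print(in_first_region(1070, 1080))  # True: within the right-side band
print(in_first_region(500, 1080))   # False: middle of the screen
```

Keeping the first region (the trigger strip) disjoint from the second region (where the icons appear), as the paragraph above notes, avoids the icons flickering while the drag is still inside the trigger strip.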
In a second aspect, the present application provides an application interaction method applied to an electronic device, including: receiving an operation selecting a first target text on an interface of a first application; predicting a second application and a third application according to the first target text; in response to the first target text being dragged to a first area, displaying an icon of the second application and an icon of the third application in a second area, where the second area is located at an edge of the screen of the electronic device; and in response to the first target text being dragged onto the icon of the second application, launching the second application and displaying the interface of the second application in the area where the interface of the first application was displayed. In this way, by dragging text from the first application interface (the source application interface) onto the icon of the second application (the destination application), the user can conveniently launch the destination application and display its interface, which simplifies the operation and improves the user experience.
According to the second aspect, the application interaction method may further include: in response to the first target text being dragged to the first area, copying the first target text. When it is detected that the interface of the second application includes a first text input box matching the first target text, the first target text is displayed overlaid on the area inside the first text input box. Then, in response to a release operation, the first target text is pasted into the first text input box and displayed within it. In this way, pasting can be completed conveniently and quickly, improving the user experience.
According to the second aspect, the application interaction method may further include: receiving an operation selecting a second target text on the interface of a fourth application, and predicting a fifth application and a sixth application according to the second target text. Then, in response to the second target text being dragged to the first area, an icon of the fifth application and an icon of the sixth application are displayed in the second area. Then, in response to the second target text being dragged onto the icon of the fifth application, the fifth application is launched, a floating window is displayed overlaid on the interface of the fourth application, and the interface of the fifth application is displayed in the floating window. In this way, by using the floating window, the source application interface and the destination application interface of the text can be displayed on the screen of the electronic device at the same time, which facilitates copying and pasting multiple times, simplifies the operation, and improves the user experience.
According to the second aspect, the application interaction method may further include: receiving an operation selecting a second target text on the interface of a fourth application, and predicting a fifth application and a sixth application according to the second target text. Then, in response to the second target text being dragged to the first area, an icon of the fifth application and an icon of the sixth application are displayed in the second area. Then, in response to the second target text being dragged onto the icon of the fifth application, the fifth application is launched, a first split screen and a second split screen are displayed in the area where the interface of the fourth application was displayed, the interface of the fourth application is displayed in the first split screen, and the interface of the fifth application is displayed in the second split screen. In this way, by means of split screen, the source application interface and the destination application interface of the text can be displayed on the screen of the electronic device at the same time, which facilitates repeated copying and pasting, simplifies the operation, and improves the user experience.
According to the second aspect, the application interaction method may further include: receiving an operation selecting a third target text on the interface of the fifth application, and predicting a seventh application and an eighth application according to the third target text. Next, in response to the third target text being dragged to the first area, an icon of the seventh application and an icon of the eighth application are displayed in the second area. Then, in response to the third target text being dragged onto the icon of the seventh application, the seventh application is launched, and the interface of the seventh application is displayed in the first split screen. In this way, the split screen containing the text's source application interface remains unchanged while the destination application interface is displayed in the other split screen, so that the source and destination application interfaces can be displayed on the screen of the electronic device at the same time, which facilitates repeated copying and pasting, simplifies the operation, and improves the user experience.
According to a second aspect, displaying a first split screen and a second split screen in an area where a fourth interface is displayed includes: when the electronic equipment is single-screen equipment, a first split screen is displayed on the upper half part of the screen of the electronic equipment, and a second split screen is displayed on the lower half part of the screen of the electronic equipment. Therefore, according to the type of the equipment, the use experience of a user can be improved by adopting a proper screen splitting mode.
According to a second aspect, displaying a first split screen and a second split screen in an area where a fourth interface is displayed includes: when the electronic equipment is the folding screen equipment, a first split screen is displayed on the left half part of the screen of the electronic equipment, and a second split screen is displayed on the right half part of the screen of the electronic equipment. Therefore, according to the type of the equipment, the use experience of a user can be improved by adopting a proper screen splitting mode.
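The device-dependent split choice in the two paragraphs above can be condensed into one decision. This is a minimal sketch; the device-type strings and function name are illustrative assumptions.

```python
# Sketch of the device-dependent split-screen layout choice described above.
def split_layout(device_type):
    # Single-screen (tall) devices split top/bottom;
    # folding-screen (wide) devices split left/right.
    if device_type == "single_screen":
        return ("top", "bottom")     # (first split screen, second split screen)
    if device_type == "folding_screen":
        return ("left", "right")
    raise ValueError(f"unknown device type: {device_type}")

print(split_layout("single_screen"))   # ('top', 'bottom')
print(split_layout("folding_screen"))  # ('left', 'right')
```

The design rationale is simply to split along the screen's longer axis so each half keeps a usable aspect ratio.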
In a third aspect, the present application provides an electronic device comprising: a memory and a processor, the memory coupled with the processor. The memory stores program instructions that, when executed by the processor, cause the electronic device to perform any of the application interaction methods of the first aspect.
In a fourth aspect, the present application provides a computer-readable storage medium comprising a computer program which, when run on an electronic device, causes the electronic device to perform the method of application interaction of any of the first aspects.
In a fifth aspect, the present application provides a chip comprising one or more interface circuits and one or more processors. Wherein the interface circuit is configured to receive signals from a memory of the electronic device and to send signals to the processor, the signals comprising computer instructions stored in the memory. The computer instructions, when executed by the processor, cause the electronic device to perform any of the application interaction methods of the first aspect.
Drawings
Fig. 1 is a schematic structural diagram of an exemplary electronic device 100;
fig. 2 is a block diagram illustrating a software structure of the electronic device 100 according to the embodiment of the present application;
FIG. 3 is a timing diagram illustrating operations in an application interaction process according to an embodiment of the present application;
FIG. 4 is an exemplary illustrative interface diagram showing a process prior to displaying OCR recognition results for a picture;
FIG. 5 is a schematic diagram illustrating an exemplary process of selecting and dragging a target text in the OCR recognition result of picture a;
FIG. 6 is a diagram illustrating an exemplary process for initiating a display of a list of recommended applications;
FIG. 7A1 is a diagram illustrating an exemplary process for displaying a target application interface directly in full screen when the user does not release the touch after dragging the target text;
FIG. 7A2 is a schematic diagram illustrating another process for displaying a target application interface directly in full screen when the user does not release the touch after dragging the target text;
FIG. 7A3 is a diagram illustrating yet another process for displaying a target application interface directly in full screen when the user does not release the touch after dragging the target text;
FIG. 7B is a diagram illustrating an exemplary process for displaying a target application interface directly in full screen when the user releases the touch after dragging the target text;
FIG. 8 is a schematic diagram illustrating interaction between a user and an electronic device in an application interaction method;
FIG. 9A is a diagram illustrating an exemplary process of switching recommended applications when the user does not release the touch after dragging the target text;
FIG. 9B is a diagram illustrating an exemplary process of switching recommended applications when the user releases the touch after dragging the target text;
FIG. 10A1 is a diagram illustrating an exemplary operation process after entering the WeChat application when the user does not release the touch after dragging the target text;
FIG. 10A2 is a diagram illustrating another operation process after entering the WeChat application when the user does not release the touch after dragging the target text;
FIG. 10B1 illustrates an exemplary operation process after entering the WeChat application when the user releases the touch after dragging the target text;
FIG. 10B2 is a diagram illustrating another operation process after entering the WeChat application when the user releases the touch after dragging the target text;
FIG. 11A is a schematic diagram illustrating a process for displaying a map application interface in a floating window when the user does not release the touch after dragging the target text;
FIG. 11B is a schematic diagram illustrating a process for displaying a map application interface in a floating window when the user releases the touch after dragging the target text;
FIG. 12A is a schematic diagram illustrating a process for displaying a map application interface in a split screen when the user does not release the touch after dragging the target text;
FIG. 12B is a diagram illustrating a process for displaying a map application interface in a split screen when the user releases the touch after dragging the target text;
FIG. 13A is a schematic diagram illustrating a process of displaying the Jingdong application interface in a split screen when the user does not release the touch after dragging the target text;
FIG. 13B is a diagram illustrating a process of displaying the Jingdong application interface in a split screen when the user releases the touch after dragging the target text;
FIG. 14A is a schematic diagram illustrating an exemplary process of switching to an application on a folding-screen or wide-screen electronic device when the user does not release the touch after dragging the target text;
FIG. 14B is a schematic diagram illustrating a process of switching to an application on a folding-screen or wide-screen electronic device when the user releases the touch after dragging the target text;
FIG. 15 is a diagram illustrating a process for displaying one going application based on target text in another going application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some, but not all, of the embodiments of the present application. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments in the present application without making any creative effort belong to the protection scope of the present application.
The term "and/or" herein merely describes an association relationship between associated objects, indicating that three relationships may exist. For example, A and/or B may mean: A exists alone, A and B exist simultaneously, or B exists alone.
The terms "first" and "second," and the like, in the description and in the claims of the embodiments of the present application are used for distinguishing between different objects and not for describing a particular order of the objects. For example, a first target object and a second target object are used to distinguish different target objects, rather than to describe a particular order of the target objects.
In the embodiments of the present application, words such as "exemplary" or "for example" are used to mean serving as an example, instance, or illustration. Any embodiment or design described herein as "exemplary" or "for example" is not necessarily to be construed as preferred or advantageous over other embodiments or designs. Rather, use of the words "exemplary" or "for example" is intended to present related concepts in a concrete fashion.
In the description of the embodiments of the present application, the meaning of "a plurality" means two or more unless otherwise specified. For example, a plurality of processing units refers to two or more processing units; the plurality of systems refers to two or more systems.
The application interaction method in the embodiment of the application can be applied to a scene in which text in one application is copied and pasted into another application of the same electronic equipment.
In one example, the scenario may be: copying and pasting text from applications such as a browser, memo, mail, or WeChat into other applications different from the source application of the text. In this example, the text in the application can be selected directly.
For example, the user sees an important knowledge point in the browser interface and wants to record the knowledge point in the memo. At this time, the knowledge point in the browser interface may be pasted into the memo by applying the application interaction method in the embodiment of the present application. For another example, there is a phone number in the WeChat chat interface, and the user wants to add the phone number to the address book. At this time, the application interaction method in the embodiment of the present application may be applied to paste the phone number in the WeChat interface into the address book.
In another example, the scenario may be: copying and pasting text from the optical character recognition (OCR) result of a picture into an application. The picture may be derived from applications such as the gallery, a real-time screen capture, a browser, WeChat, QQ, and the like. The embodiment of the present application does not limit the source of the picture. In this example, the text in the application is derived from the OCR recognition result of the picture, and the text in the picture cannot be directly selected.
For example, a photo of a courier has address information that the user wants to add to the shipping address of a shopping application. At this time, the application interaction method in the embodiment of the present application may be applied to paste the address information in the photo into the shipping address of the shopping application.
It should be noted that the above examples are only schematic illustrations of application scenarios of the embodiments of the present application. The embodiment of the present application does not limit the source application (i.e. the application containing the text) and the destination application (i.e. the application needing to paste the text) of the text.
Herein, an application program may be simply referred to as an application.
The application interaction method in the embodiment of the application can be applied to electronic equipment. The electronic device may be, for example, a cell phone, tablet, etc. The hardware structure and the software structure of the electronic device to which the application interaction method in the embodiment of the present application is applied will be described below by taking the electronic device 100 as an example.
Fig. 1 is a schematic structural diagram of an exemplary electronic device 100. It should be understood that the electronic device 100 shown in fig. 1 is only one example of an electronic device, and that the electronic device 100 may have more or fewer components than shown in the figures, may combine two or more components, or may have a different configuration of components. The various components shown in fig. 1 may be implemented in hardware, software, or a combination of hardware and software, including one or more signal processing and/or application specific integrated circuits.
Referring to fig. 1, the electronic device 100 may include: a processor 110, an external memory interface 120, an internal memory 121, a Universal Serial Bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a button 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a Subscriber Identity Module (SIM) card interface 195, and the like. The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
The software system of the electronic device 100 may employ a layered architecture, an event-driven architecture, a micro-core architecture, a micro-service architecture, or a cloud architecture. In the embodiment of the present application, an Android (Android) system with a layered architecture is taken as an example to exemplarily illustrate a software structure of the electronic device 100.
Fig. 2 is a block diagram illustrating a software structure of the electronic device 100 according to the embodiment of the present application.
The layered architecture of the electronic device 100 divides the software into several layers, each layer having a clear role and division of labor. The layers communicate with each other through a software interface. In some embodiments, the Android system is divided into four layers, an application layer, an application framework layer, an Android runtime (Android runtime) and system library, and a kernel layer from top to bottom, respectively.
The application layer may include a series of application packages.
As shown in fig. 2, the application packages of the application layer may include applications such as sensor, camera, gallery, OCR engine, application recommendation, application interaction module, third-party applications, screenshot, and the like.
The OCR engine is used for performing OCR recognition on a picture to obtain an OCR recognition result of the picture, and the OCR recognition result comprises selectable and copyable text information from the picture. The picture may be a screenshot picture obtained by a screenshot application, a photo obtained by the camera application, a picture stored in the gallery, and the like.
The application recommendation is used for predicting applications related to text content according to the text content and recommending the predicted applications to the user. The application interaction module is used for executing the application interaction method of the embodiment of the present application. For details of the application interaction method in the embodiment of the present application, please refer to the description below. Third-party applications are used to provide pictures or selectable, copyable text information.
The application framework layer provides an Application Programming Interface (API) and a programming framework for the application program of the application layer. The application framework layer includes a number of predefined functions.
As shown in FIG. 2, the application framework layers may include a view system, an explorer, a window manager, an activity manager, and the like.
The view system includes visual controls such as controls for displaying text, controls for displaying pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, the display interface including the short message notification icon may include a view for displaying text and a view for displaying a picture.
The resource manager provides various resources for the application, such as localized text strings, icons, pictures, layout files, video files, and so forth.
The window manager is used for managing window programs. The window manager can obtain the size of the display screen, determine whether there is a status bar, lock the screen, capture the screen, and the like.
The activity manager is used for managing the life cycle and the navigation back function of each application program, and is responsible for creating the main thread of Android and maintaining the life cycle of each application program.
The Android runtime comprises a core library and a virtual machine. The Android runtime is responsible for scheduling and managing an Android system.
The core library comprises two parts: one part comprises the functions that the Java language needs to call, and the other part is the core library of Android.
The application layer and the application framework layer run in the virtual machine. The virtual machine executes the java files of the application layer and the application framework layer as binary files. The virtual machine is used for performing functions such as object life cycle management, stack management, thread management, security and exception management, garbage collection, and the like.
The system library may include a plurality of functional modules, such as a surface manager (surface manager) and the like.
The surface manager is used to manage the display subsystem and provide a fusion of the 2D and 3D layers for multiple applications.
The kernel layer is a layer between hardware and software. The kernel layer may contain modules such as display drivers, sensor drivers, and the like.
It is to be understood that the layers in the software structure shown in fig. 2 and the components included in each layer do not constitute a specific limitation of the electronic device 100. In other embodiments of the present application, electronic device 100 may include more or fewer layers than those shown, and may include more or fewer components in each layer, which is not limited in this application.
For convenience of description, the electronic device 100 is hereinafter referred to simply as the electronic device. It should be understood that the electronic devices mentioned herein may each have the same hardware structure (e.g., the hardware structure shown in fig. 1) and the same software structure (e.g., the software structure shown in fig. 2) as the electronic device 100 in the foregoing embodiment.
The following describes an application interaction method according to an embodiment of the present application, taking a scene in which a text in an OCR recognition result of a picture is copied and pasted into an application as an example. In the following embodiment, it is assumed that the user wants to paste the text information in the picture into one text input box in the map application.
Fig. 3 is a schematic timing diagram illustrating operations in an application interaction process according to an exemplary embodiment of the present application. Fig. 8 is a schematic diagram illustrating interaction between a user and an electronic device in an application interaction method. The following describes in detail a process of applying the interaction method according to the embodiment of the present application with reference to fig. 3 and 8.
Referring to fig. 3, the process of applying the interactive method according to the embodiment of the present application includes a stage of "open picture, OCR recognition result display button is not displayed". At this stage, the electronic device performs processing of "displaying the picture a on the screen of the electronic device" and "detecting whether there is text in the picture a, and if there is text, acquiring an OCR recognition result of the picture a".
Referring to fig. 8, in the stage of "open picture, OCR recognition result display button not displayed", the process of applying the interaction method of the embodiment of the present application may include the following steps:
in step S1, the user clicks on picture a.
In step S2, the electronic device displays the picture a on the screen of the electronic device in response to clicking the picture a.
Step S3, the electronic device detects whether there is text in the picture a, and if so, obtains an OCR recognition result of the picture a.
The user can perform an operation of clicking a picture in various cases, thereby displaying the picture on the screen of the electronic device. The electronic device may be, for example, a mobile phone, a tablet, or the like. The electronic device is taken as a mobile phone as an example for explanation. It should be understood that the description herein of the example of a cell phone is equally applicable to other electronic devices other than cell phones, such as tablets and the like.
For example, fig. 4 is an interface diagram illustrating a process before displaying an OCR recognition result of a picture. Referring to fig. 4, in an exemplary implementation manner, a user may click a "gallery" icon (see fig. 4 (a)) in a main interface of a mobile phone, and enter an internal interface of a "gallery" application after clicking the "gallery" icon, where the interface display at this time is as shown in fig. 4 (b). Referring to fig. 4 (b), the "gallery" includes pictures a, 1, 2, 3, n, etc. The user may continue to click on the picture on the interface shown in fig. 4 (b), for example, assuming that the user has clicked on the picture a.
At this time, the mobile phone displays the picture a on the mobile phone screen in response to clicking the picture a, as shown in fig. 4 (c). Fig. 4 (a) to (c) illustrate the example of entering the picture display interface from the gallery, and it should be noted that the embodiment of the present application is not limited to the mode of entering the picture display interface. For example, in other embodiments of the present application, the picture display interface may be entered by clicking a screenshot picture after the screenshot is performed, or may be entered by clicking a picture in a webpage during browsing the webpage, and so on, which are not listed here.
With continued reference to fig. 4, after the picture a is displayed, the application interaction method of the embodiment of the application further detects whether a text exists in the picture a. And if the text exists in the picture a, acquiring an OCR recognition result of the picture a. If the picture a does not have a text, the application interaction method in the embodiment of the application ends the application interaction process. At this point, the process of the "open picture, OCR recognition result display button not displayed" stage shown in fig. 3 is completed.
In the embodiment of the application, the electronic device may detect whether a text exists in the picture a in multiple ways, and acquire an OCR recognition result of the picture a when the text exists in the picture a.
In an exemplary implementation process, if the storage data of the picture a includes the OCR recognition result of the picture a, the electronic device may determine whether text exists in the picture a according to the storage data of the picture a. And, in case that there is text in the picture a, the electronic device may directly read the OCR recognition result of the picture a from the stored data of the picture a.
In another exemplary implementation process, if the stored data of the picture a does not include the OCR recognition result of the picture a, the electronic device may send the picture a to an OCR engine of an application layer shown in fig. 2, where the OCR engine performs real-time OCR recognition on the picture a and outputs the OCR recognition result of the picture a. Then, the electronic device acquires an OCR recognition result of the picture a from output data of the OCR engine.
The OCR engine is used for performing OCR recognition on the picture and outputting an OCR recognition result of the picture.
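The two acquisition paths above (reading a stored OCR result versus invoking the OCR engine in real time) can be sketched as follows. This is a minimal illustration in plain Java; the class and method names (`OcrResultProvider`, `getOcrResult`) are hypothetical stand-ins, not actual APIs of the embodiment.

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.Function;

// Minimal sketch of the OCR-result acquisition logic described above.
// All names here are hypothetical illustrations, not real APIs.
class OcrResultProvider {
    // OCR results saved in the pictures' stored data, keyed by picture id.
    private final Map<String, String> storedData = new HashMap<>();
    // Stand-in for the application-layer OCR engine of fig. 2.
    private final Function<String, String> ocrEngine;

    OcrResultProvider(Function<String, String> ocrEngine) {
        this.ocrEngine = ocrEngine;
    }

    void saveWithPicture(String pictureId, String ocrResult) {
        storedData.put(pictureId, ocrResult);
    }

    // If the stored data already includes the OCR result, read it directly;
    // otherwise send the picture to the OCR engine for real-time recognition.
    String getOcrResult(String pictureId) {
        String cached = storedData.get(pictureId);
        return (cached != null) ? cached : ocrEngine.apply(pictureId);
    }
}
```

The caching decision keeps the common case (a previously recognized picture) fast while preserving the real-time fallback.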
With reference to fig. 3, after the OCR recognition result of the picture a is obtained, the application interaction method according to the embodiment of the present application enters a stage of displaying an OCR recognition result display button but not turning on the button. At this stage, the electronic apparatus performs a process of "pop-up OCR recognition result display button on the interface of the picture a".
Referring to fig. 8, in the stage of displaying the OCR recognition result display button but not turning on the button, the process of applying the interaction method according to the embodiment of the present application may include the following steps:
in step S4, if the OCR recognition result of the picture a is acquired, the electronic device pops up an OCR recognition result display button on the screen of the electronic device.
Here, the popped-up OCR recognition result display button is in an unopened state. For the electronic device interface at this time, please refer to fig. 4 (d). In the case where the user does not click the OCR recognition result display button in fig. 4 (d), the text in the picture a is not selectable.
At this time, the user may click an OCR recognition result display button so that an operation of selecting a text from the OCR recognition result and opening the map application can be performed later.
If the electronic device does not detect an operation of clicking the OCR recognition result display button within a preset time period after the OCR recognition result display button pops up, it can be confirmed that the user does not have the need to select text from the OCR recognition result and paste the text into other applications. At this time, the electronic device may actively stop displaying the OCR recognition result display button on the electronic device screen, that is, the OCR recognition result display button disappears from the interface of the picture a.
In one exemplary implementation, if the user does not have the need to select text from the OCR recognition results and paste the selected text to other applications, the user may also click anywhere outside the OCR recognition result display button. At this time, the electronic device may stop displaying the OCR recognition result display button on the screen of the electronic device by detecting a user's clicking operation on any position other than the OCR recognition result display button. In this manner, the electronic apparatus passively stops displaying the OCR recognition result display button based on the user operation.
Of course, in another exemplary implementation, the electronic device may also maintain the OCR recognition result display button in an unopened state at all times if the user does not have a need to select text from the OCR recognition results and paste the selected text to the target application.
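The dismissal behaviors described above (an active timeout without a click, and a passive tap anywhere outside the button) can be modeled as a small state machine. A minimal sketch, assuming a hypothetical `OcrButtonState` class and a millisecond clock supplied by the caller:

```java
// Hypothetical sketch: the OCR recognition result display button is
// dismissed if no click arrives within a preset period after it pops
// up, or if the user taps anywhere outside the button.
class OcrButtonState {
    private final long timeoutMs; // preset time period
    private long shownAtMs = -1;
    private boolean visible = false;

    OcrButtonState(long timeoutMs) {
        this.timeoutMs = timeoutMs;
    }

    void popUp(long nowMs) {
        visible = true;
        shownAtMs = nowMs;
    }

    // Called periodically (or from a delayed message): hide the button
    // once the preset period elapses without a click.
    void tick(long nowMs) {
        if (visible && nowMs - shownAtMs >= timeoutMs) {
            visible = false; // active dismissal by timeout
        }
    }

    // A tap anywhere outside the button also dismisses it.
    void tapOutside() {
        visible = false; // passive dismissal based on user operation
    }

    boolean isVisible() {
        return visible;
    }
}
```

In a real Android implementation the `tick` call would likely come from a delayed message rather than polling; the timeout value itself is an assumption.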
In step S5, the user clicks the OCR recognition result display button.
The electronic device detects the clicking operation of the user on the display button of the OCR recognition result, and can confirm that the user has a requirement for selecting a text from the OCR recognition result and pasting the selected text to an application, so that the flow of the subsequent OCR recognition result of the displayed picture a is entered.
With continued reference to fig. 3, after the user clicks the OCR recognition result display button, the application interaction method of the embodiment of the present application enters a stage of "displaying the OCR recognition result display button and the button is turned on". At this stage, the electronic device responds to the click operation of the OCR recognition result display button, and displays the OCR recognition result of the picture a on the screen of the electronic device; the electronic equipment responds to the operation of selecting a target text on the OCR recognition result of the picture a and dragging the target text to the side of the screen, and a recommended application list is displayed on the side of the screen of the electronic equipment; and the electronic equipment responds to the selection operation of the map application in the recommended application list, automatically starts the map application and displays the interface of the map application on the screen of the electronic equipment.
Referring to fig. 8, in the stage of displaying the OCR recognition result display button and turning on the button, the process of the application interaction method according to the embodiment of the present application may include the following steps:
in step S6, the electronic device displays the OCR recognition result of the picture a on the screen of the electronic device in response to the click operation of the OCR recognition result display button.
The interface after the user clicks the OCR recognition result display button is shown in fig. 4 (e). At this time, the text in the OCR recognition result is in a selectable state.
Note that in order to contrast with the text that is not selectable on the picture a, in the diagram (e) of fig. 4, a border is added to the text in the OCR recognition result. However, the present application is not limited to the display mode of the OCR recognition result. For example, in one example, no borders or other content may be added to the text in the OCR recognition results. In another example, text in the OCR recognition results may be set with a different background color than the background. In yet another example, the electronic device may highlight text in the OCR recognition result and may also highlight text belonging to a preset content type in the text.
For example, the predetermined content type may include, for example, a name, an address, a phone number, and the like. For the text belonging to the preset content type, the electronic equipment can display the text in a preset highlighting manner. For example, the electronic device may underline text belonging to a preset content type, highlight text belonging to the preset content type, set a different background color for text belonging to the preset content type than for other text in the OCR recognition result, and the like.
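The highlighting of preset content types could rely on a simple classifier over the recognized text runs. A rough sketch with hypothetical patterns (real name and address detection would need far richer rules than the keyword and digit heuristics below):

```java
import java.util.regex.Pattern;

// Hypothetical sketch: classify a recognized text run so that
// addresses and phone numbers can be highlighted differently.
class ContentTypeDetector {
    // Toy heuristics, illustrative only.
    private static final Pattern PHONE =
            Pattern.compile("\\+?\\d[\\d -]{6,}\\d");
    private static final Pattern ADDRESS =
            Pattern.compile(".*(District|Road|Street|Building|Mansion).*");

    static String detect(String text) {
        if (PHONE.matcher(text).matches()) return "phone";
        if (ADDRESS.matcher(text).matches()) return "address";
        return "plain";
    }
}
```

The returned tag would then drive the highlight style (underline, background color, and so on) described above.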
In step S7, the user selects a target text from the OCR recognition results and drags the target text to the side of the screen of the electronic device.
Illustratively, the user may select the target text to be copied by sliding over the target text.
The side of the screen may be the left side of the screen or the right side of the screen. The following description of the embodiment of the present application takes the example of dragging the target text to the right side of the screen.
In the embodiment of the present application, the side of the screen is a preset destination area for dragging. The present application does not limit the drag destination area, as long as it is set in advance. In other embodiments of the present application, the drag destination area may also be set to other areas, for example, a lower area of the screen. As long as the user drags the target text to a preset dragging destination area, subsequent display of a recommended application list can be triggered.
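The check for the preset drag destination area reduces to comparing the touch coordinate against an edge band of the screen. A minimal sketch; the class name and the band width are illustrative assumptions:

```java
// Hypothetical sketch: whether the current touch point of the drag
// has entered the preset destination area (here, a band along the
// right side of the screen).
class DragDestination {
    private final int screenWidth;
    private final int bandWidth; // width of the edge band in pixels

    DragDestination(int screenWidth, int bandWidth) {
        this.screenWidth = screenWidth;
        this.bandWidth = bandWidth;
    }

    // The target text counts as "at the side of the screen" when the
    // finger's touch x-coordinate falls inside the edge band.
    boolean isInDestinationArea(int touchX) {
        return touchX >= screenWidth - bandWidth;
    }
}
```

A lower-area destination, as mentioned above, would compare the y-coordinate against the bottom edge instead.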
In step S8, the electronic device copies the target text in response to an operation of selecting the target text and dragging the target text to the side of the screen.
Fig. 5 is a schematic diagram illustrating an exemplary process of selecting and dragging a target text in the OCR recognition result of the picture a. Referring to fig. 5 (a), the user selects the target text "Beijing Haidian District xx Road xx Building" by sliding over it, and the electronic device pops up a text operation menu on the screen in response to the user's sliding operation on "Beijing Haidian District xx Road xx Building" in the OCR recognition result. The text operation menu may include operation options such as select all, cut, copy, paste, and translate. Next, in fig. 5 (b), the user directly drags the target text "Beijing Haidian District xx Road xx Building" to the right side of the screen without clicking any operation option in the text operation menu, as shown in fig. 5 (c); the path and direction of the drag are indicated by the dotted line and arrow in fig. 5 (b), respectively. The electronic device copies the target text "Beijing Haidian District xx Road xx Building" in response to the operations of selecting the target text and dragging it to the right side of the screen. At this time, the target text "Beijing Haidian District xx Road xx Building" is in the clipboard. When the user performs the drag operation, the electronic device stops displaying the text operation menu on the screen.
It should be noted that, in the application interaction process of the embodiment of the present application, the user does not need to click any operation option in the text operation menu. For the user, if the user's intention is not to select text from one application to paste into another application, but other intentions, such as the user wants to know the foreign language translation of the selected text, etc., at this time, the user may click on the relevant operation option in the text operation menu.
In step S9, the electronic device determines a recommended application list according to the target text content, and displays the recommended application list at the side of the screen.
In the application interaction method according to the embodiment of the present application, a process of starting and displaying a recommended application list is shown in fig. 6. Fig. 6 is a schematic diagram illustrating an exemplary process of initiating display of a list of recommended applications. Referring to fig. 6, fig. 6 (a) is the same as fig. 5 (c), and the description thereof is omitted. In fig. 6 (b), the electronic device detects that the target text "Beijing Haidian District xx Road xx Building" has been dragged to the side of the screen, and transmits the target text to the application recommendation app (as shown in fig. 2), which determines at least two recommended applications according to the target text content and notifies the application interaction module of the recommended applications. After receiving the recommended applications, the application interaction module may display the recommended applications in a recommended application list.
In this embodiment of the application, the manner in which the application recommendation app determines the recommended applications according to the target text content may be: calculating the association degree between the target text content and each existing application, and adding at least two applications with the highest association degree into the recommended application list as recommended applications. For example, the target text "Beijing Haidian District xx Road xx Building" in the embodiment of the present application indicates an address, and the applications most relevant to an address include map-type applications, shopping-type applications, social-type applications, and the like. Accordingly, it can be determined that the map application in fig. 6 (b) and (c) is a map-type application, the Jingdong application is a shopping-type application, and the WeChat application is a social-type application.
It should be understood that the above manner of determining the recommended application list according to the target text content is only an illustrative example, and the embodiment of the present application is not limited to a specific manner of determining the recommended application list according to the target text content.
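The association-degree approach can be sketched as scoring each installed application against the target text and keeping the highest-scoring entries. In the sketch below the relevance function is a toy stand-in that merely mirrors the address example above; it is not the actual recommendation model, and all names are illustrative:

```java
import java.util.Arrays;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

// Sketch of the recommendation step: score the relevance of each
// application to the target text, then keep the top N applications.
class AppRecommender {
    // Toy association degree: address-like text favors map, then
    // shopping, then social applications, mirroring the example.
    static int relevance(String text, String category) {
        boolean isAddress =
                text.contains("Road") || text.contains("District");
        if (!isAddress) return 0;
        switch (category) {
            case "map":      return 3;
            case "shopping": return 2;
            case "social":   return 1;
            default:         return 0;
        }
    }

    // appsByName maps an application name to its category.
    static List<String> recommend(String text,
                                  Map<String, String> appsByName,
                                  int topN) {
        return appsByName.entrySet().stream()
                .sorted((a, b) -> relevance(text, b.getValue())
                                - relevance(text, a.getValue()))
                .limit(topN)
                .map(Map.Entry::getKey)
                .collect(Collectors.toList());
    }
}
```

With an address as input, the top three entries would be the map, shopping, and social applications, matching the recommended application list in fig. 6.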
It should be noted that, in the embodiment of the present application, whether the target text is located on the side of the screen is based on the touch position of the finger dragging the target text. If the touch position of the finger dragging the target text is located at the side of the screen, it is determined that the target text is already located at the side of the screen.
It should be noted that, in the embodiment of the present application, displaying recommended applications in a recommended application list is only one illustrative example of displaying recommended applications, and a recommended application display manner of the present application is not limited. In other embodiments of the present application, the recommended applications may be displayed in other manners. For example, icons for each recommended application may be individually and independently displayed on the screen of the electronic device.
In one example, after the list of recommended applications has been displayed on the electronic device, the user may continue to remain in a hands-free state for subsequent operations as in fig. 7a1, 7a2, 7 A3.
In another example, after the list of recommended applications has been displayed on the electronic device, the user may release his/her hands, at which point the electronic device may stop displaying the target text "beijing city haihui district xx road x building" dragged to the side of the screen and display the hover, as shown in fig. 6 (c). The levitation ball may include key information of the target text, for example, the levitation ball corresponding to the target text "beijing haihui district xx road x mansion" may display "xx road". In this case, after the user releases his/her hand, the subsequent operation can be performed as shown in fig. 7B.
In yet another example, after the recommended application list has been displayed on the electronic device, the user may release the finger, at which point the electronic device may stop displaying the target text "Beijing Haihui District xx Road x Mansion" dragged to the side of the screen without displaying a floating ball. In this case, after the user releases the finger, the subsequent operations may be performed as shown in fig. 7B.
In yet another example, after the recommended application list has been displayed on the electronic device, the user may release the finger, at which point the electronic device may display the target text "Beijing Haihui District xx Road x Mansion" dragged to the side of the screen in a hovering manner; that is, the target text "Beijing Haihui District xx Road x Mansion" hovers on the screen of the electronic device without the user having to hold it. In this case, after the user releases the finger, the subsequent operations may be performed as shown in fig. 7B.
In the embodiment of the present application, the recommended application list may be displayed at the side of the screen. For example, the recommended application list may be on the same side of the screen as the drag destination area. In the embodiment of the present application, displaying the recommended application list on the right side of the screen is taken as an example for explanation.
In the embodiment of the present application, the control corresponding to the recommended application list is located on the uppermost layer of the screen, which prevents the recommended application list from being covered by other interfaces on the screen of the electronic device and ensures that the subsequent operation of selecting an application from the recommended application list is not affected.
The recommended application list may include application icons of at least two applications determined according to the target text content. When there are more applications in the recommended application list than can be shown, a preset number (e.g., 3) of application icons may be displayed together with an add button, as shown in fig. 6 (b) and (c). In fig. 6 (b) and (c), application icons of three applications, i.e., a map application, a Jingdong application, and a WeChat application, are displayed in the recommended application list, together with an add button.
When it is desired to display, in the recommended application list, an application icon of an application other than the map application, the Jingdong application, and the WeChat application (e.g., a contacts application), in fig. 6 (b), the user may drag the target text "Beijing Haihui District xx Road x Mansion" to the position of the add button. At this time, the recommended application list may sequentially display, from top to bottom, the application icons of the Jingdong application, the WeChat application, and the contacts application, with the application icon of the map application collapsed into the add button.
Accordingly, when it is desired to display, in the recommended application list, an application icon of an application other than the map application, the Jingdong application, and the WeChat application (e.g., a contacts application), the user may click the add button in fig. 6 (c). At this time, the recommended application list may sequentially display, from top to bottom, the application icons of the Jingdong application, the WeChat application, and the contacts application, with the application icon of the map application collapsed into the add button.
If the application icons of all applications included in the recommended application list have already been displayed in the recommended application list, the add button may be omitted from the display content of the recommended application list.
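The behavior of the preset icon count, the add button, and the collapsing of the top icon into the add button (as in fig. 6 (b) and (c)) can be sketched as follows; the class and method names are illustrative assumptions:

```python
from collections import deque

class RecommendedAppList:
    """Sketch of the side list: show at most `visible` icons, plus an
    add button when more applications exist.  Dragging onto or clicking
    the add button rotates the next hidden icon in while the top icon
    collapses into the add button."""

    def __init__(self, apps, visible=3):
        self.apps = deque(apps)
        self.visible = visible

    def shown_icons(self):
        return list(self.apps)[: self.visible]

    def has_add_button(self):
        return len(self.apps) > self.visible

    def press_add(self):
        if self.has_add_button():
            self.apps.rotate(-1)   # first icon folds into the add button
        return self.shown_icons()
```

With the example applications of fig. 6, pressing the add button replaces the map icon with the contacts icon, as described above.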
In step S10, the user drags the target text to stay on the map application icon of the recommended application list, or, after the target text has hovered at the side of the screen, clicks the map application icon.
In step S12, in response to the target text being dragged to stay on the map application icon of the recommended application list, or in response to the map application icon being clicked after the target text has hovered at the side of the screen, the electronic device automatically starts the map application and displays an interface of the map application on the screen of the electronic device.
FIG. 7A1 is an exemplary illustration of a process of directly displaying the destination application interface in full screen in the case where the user does not release the finger after dragging the target text. For this case, please refer to the situation shown in fig. 6 (b). Referring to fig. 7A1, in fig. 7A1 (a), the user drags the target text "Beijing Haihui District xx Road x Mansion" to stay on the map application icon.
In fig. 7A1 (b), in response to the target text being dragged to stay on the map application icon of the recommended application list, the electronic device opens the map application and displays an interface of the map application on the screen of the electronic device. Here, the electronic device displays the interface of the map application in a full-screen manner, and the target text "Beijing Haihui District xx Road x Mansion" is still displayed on the screen. In fig. 7A1 (b), when the user drags the target text over the map application icon of the recommended application list, an effect indicating that the target text has been pasted to the target location where it needs to be pasted (here, the text input box) may be displayed.
In the case of fig. 7A1 (b), if the user releases the finger, the electronic device stops displaying the target text "Beijing Haihui District xx Road x Mansion" and the recommended application list, and the electronic device may automatically add the target text "Beijing Haihui District xx Road x Mansion" to the designated text input box on the map application interface. The interface of the electronic device is then as shown in fig. 7A1 (c).
FIG. 7A2 is a diagram illustrating another process of directly displaying the destination application interface in full screen in the case where the user does not release the finger after dragging the target text. Fig. 7A2 (a) is the same as fig. 7A1 (a); for the related process, please refer to the foregoing description, which is not repeated here. In fig. 7A2 (b), after dragging the target text onto the map application icon of the recommended application list, the user keeps holding the target text without releasing the finger and continues dragging it along the path and direction indicated by the dotted arrow in fig. 7A2 (b). The electronic device stops displaying the recommended application list on the screen in response to the operation of continuing to drag the target text toward the text input box. When the user drags the target text to the text input box pointed to by the arrow and releases the finger, the electronic device adds the target text "Beijing Haihui District xx Road x Mansion" to that text input box on the map application interface, and the interface of the electronic device is as shown in fig. 7A2 (c).
FIG. 7A3 is a schematic diagram illustrating yet another process of directly displaying the destination application interface in full screen in the case where the user does not release the finger after dragging the target text. Fig. 7A3 (a) is the same as fig. 7A1 (a); for the related process, please refer to the foregoing description, which is not repeated here. In fig. 7A3 (b), in response to the target text being dragged to stay on the map application icon of the recommended application list, the electronic device opens the map application and displays an interface of the map application in full screen on the screen of the electronic device.
In fig. 7A3 (c), the user releases the finger after the map application interface has been displayed on the screen of the electronic device. After detecting the release operation, the electronic device displays the target text in a hovering manner on the screen and stops displaying the recommended application list.
In fig. 7A3 (d), the user drags the target text along the path and direction indicated by the dotted arrow in the figure until the target text reaches the text input box. The electronic device detects the operation of dragging the target text to the text input box and adds the target text "Beijing Haihui District xx Road x Mansion" to the text input box pointed to by the arrow on the map application interface. The interface of the electronic device is then as shown in fig. 7A3 (e).
FIG. 7B is a diagram illustrating a process of directly displaying the destination application interface in full screen in the case where the user releases the finger after dragging the target text. For this case, please refer to the situation shown in fig. 6 (c). Referring to fig. 7B, in fig. 7B (a), the floating ball corresponding to the target text "Beijing Haihui District xx Road x Mansion" hovers on the right side of the screen of the electronic device, and at this time the user clicks the map application icon in the recommended application list.
In fig. 7B (b), in response to the map application icon being clicked after the floating ball corresponding to the target text has hovered at the side of the screen, the electronic device opens the map application, displays an interface of the map application on the screen of the electronic device, and stops displaying the floating ball corresponding to the target text on the screen.
In fig. 7B (c), the user clicks the text input box on the map application interface. In response to the text input box on the map application interface being clicked, the electronic device pops up a list containing a paste operation option on the screen and stops displaying the recommended application list.
In fig. 7B (d), the user clicks the paste operation option in the list containing the paste operation option. In fig. 7B (e), in response to the paste operation option being clicked, the electronic device adds the target text "Beijing Haihui District xx Road x Mansion" to the text input box on the map application interface.
In the case where, after the recommended application list has been displayed on the electronic device, the user releases the finger and the electronic device stops displaying the target text dragged to the side of the screen without displaying a floating ball, the subsequent operations may be performed as shown in fig. 7B (b) to (e).
Similarly, in the case where the user releases the finger after the recommended application list has been displayed on the electronic device and the electronic device displays the target text in a hovering manner, the subsequent operations may be performed as shown in fig. 7B (b) to (e).
In an exemplary implementation, in the case where the user releases the finger after the recommended application list has been displayed on the electronic device, the application interaction process may include fig. 7B (a), (b), and (e). That is, after the interface of the map application is displayed (as shown in fig. 7B (b)), if the user has released the finger and a preset time period elapses, the electronic device adds the target text to the text input box on the interface of the map application and stops displaying the recommended application list, without the user performing the operation of clicking the text input box shown in fig. 7B (c) or the operation of clicking the paste operation option shown in fig. 7B (d).
According to the exemplary application interaction processes listed above, the embodiment of the present application can trigger, based on the operation of selecting the target text in the source application and then dragging it, the display of a plurality of recommended application icons associated with the target text on the source application interface, and can intelligently and automatically open the destination application of the target text according to the user's selection of a recommended application icon (by dragging the target text onto the icon or clicking the icon), thereby simplifying user operations and improving the user experience.
In actual applications, there is a scenario in which, after selecting one application in the recommended application list, the user wants to switch to another application in the list. The embodiment of the present application provides an application interaction manner for this scenario.
FIG. 9A is an exemplary process diagram illustrating switching of the recommended application in the case where the user does not release the finger after dragging the target text. Referring to fig. 9A, for the case shown in fig. 6 (b) where the user does not release the finger after dragging the target text, when the interface of the map application in the recommended application list has been displayed on the screen of the electronic device, the recommended application list is still displayed, and the user keeps holding the target text without releasing the finger (see fig. 9A (a)), the user may continue dragging the target text to an area outside the recommended application list, at which time the electronic device stops displaying the recommended application list (see fig. 9A (b)).
Next, the user may continue to drag the target text to the right side of the screen without releasing the finger, and in response to the target text being dragged to the right side of the screen, the electronic device displays the recommended application list on the screen (see fig. 9A (c)). Then, the user drags the target text to stay on another application in the recommended application list, for example, the WeChat application (see fig. 9A (c)). In response to the target text being dragged to stay on the WeChat application icon in the recommended application list, the electronic device starts the WeChat application and displays an interface of the WeChat application on the screen of the electronic device (see fig. 9A (d)).
Next, the user may continue to drag the target text on the interface of the WeChat application to a certain contact, for example, contact A in fig. 9A (e). Thereafter, in response to the target text being dragged onto contact A on the interface of the WeChat application, the electronic device may perform the operations after entering the WeChat application as shown in fig. 10A1 or as shown in fig. 10A2.
FIG. 9B is an exemplary process diagram illustrating switching of the recommended application in the case where the user releases the finger after dragging the target text. Referring to fig. 9B, for the case shown in fig. 6 (c) where the user releases the finger after dragging the target text, when the interface of the map application in the recommended application list has been displayed on the screen of the electronic device and the recommended application list is still displayed (see fig. 9B (a)), the user may continue to click another application icon in the recommended application list, such as the WeChat application icon (see fig. 9B (a)). In response to the operation of clicking the WeChat application icon in the recommended application list, the electronic device opens the WeChat application and displays an interface of the WeChat application on the screen of the electronic device (see fig. 9B (b)). When the duration for which the interface of the WeChat application has been displayed reaches a certain length, the electronic device stops displaying the recommended application list; please refer to fig. 9B (c).
Then, the user may continue to click a certain contact, for example, contact A on the interface of the WeChat application shown in fig. 9B (c). Thereafter, in response to contact A being clicked, the electronic device may perform the operations after entering the WeChat application as shown in fig. 10B1 or as shown in fig. 10B2.
In another example, the user may also click contact A on the interface shown in fig. 9B (b), that is, click contact A while the recommended application list is still being displayed. At this time, in response to contact A being clicked, the electronic device stops displaying the recommended application list and performs the operations after entering the WeChat application as shown in fig. 10B1 or as shown in fig. 10B2.
When the destination application interface is displayed on the screen of the electronic device, if no paste position for the target text exists in the current interface of the destination application, a next-level interface or a multi-level interface below the current interface of the destination application may be opened and displayed.
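The search for a paste position across interface levels can be sketched as a bounded descent through a toy view hierarchy; the dict-based structure below stands in for the real interface tree and is an assumption of this sketch:

```python
def find_paste_interface(interface, depth=0, max_depth=3):
    """Descend through nested interface levels until one offers a text
    input box to paste into.  `interface` is a toy dict tree:
    {'name': ..., 'has_input': bool, 'children': [...]} -- an assumed
    stand-in for the real view hierarchy.  Returns (name, depth) of the
    first interface with a paste position, or None."""
    if interface.get("has_input"):
        return interface["name"], depth
    if depth < max_depth:
        for child in interface.get("children", []):
            found = find_paste_interface(child, depth + 1, max_depth)
            if found:
                return found
    return None
```

For example, a contact list has no input box, but the chat interface one level below it does, matching the fig. 10A1 flow.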
FIG. 10A1 is an exemplary diagram illustrating an operation process after entering the WeChat application in the case where the user does not release the finger after dragging the target text. Referring to fig. 10A1, for the case shown in fig. 6 (b) where the user does not release the finger after dragging the target text, when the interface of the WeChat application in the recommended application list has been displayed on the screen of the electronic device (see fig. 10A1 (a)), the user may continue to drag the target text to a clickable location on the interface of the WeChat application, i.e., a location that, when clicked, can trigger display of the next-level interface of the current interface of the WeChat application. For example, the clickable location may be contact A, as shown in fig. 10A1 (a).
Next, in response to the operation on contact A, the electronic device displays the next-level interface of the current interface of the WeChat application, that is, a chat interface with contact A, as shown in fig. 10A1 (b). The chat interface includes a text input box for sending a message to contact A.
In fig. 10A1 (b), the user continues to drag the target text into the text input box. In response to the target text being dragged into the text input box, the electronic device pastes the target text "Beijing Haihui District xx Road x Mansion" into the text input box, and after pasting, the WeChat application displays a "send" button on the interface, as shown in fig. 10A1 (c). If the user wants to adjust the content to be sent, the user can edit the content in the text input box (i.e., the target text) before clicking the "send" button.
Next, the user may click the "send" button. In response to the "send" button being clicked, the electronic device sends the content in the text input box to contact A and displays the interface after sending, as shown in fig. 10A1 (d).
In one example, the process of pasting the target text into the text input box shown in fig. 10A1 (b) and (c) may be replaced with a process similar to fig. 7A1 (b) and (c), or with a process similar to fig. 7A3 (c) to (e), which is not described again here.
FIG. 10A2 is an exemplary diagram illustrating another operation process after entering the WeChat application in the case where the user does not release the finger after dragging the target text. In this example, fig. 10A2 (a) is the same as fig. 10A1 (a); for the description of fig. 10A2 (a), please refer to the description of fig. 10A1 (a), which is not repeated here. Unlike the example shown in fig. 10A1, in the example shown in fig. 10A2, when the user clicks contact A, the electronic device automatically sends the target text to contact A and displays the interface after the target text is sent to contact A, as shown in fig. 10A2 (b).
Compared with fig. 10A1, it can be seen that in the example shown in fig. 10A2, the operation steps shown in fig. 10A1 (b) and (c) are omitted, and the process of manually sending the target text to contact A is replaced by a process in which the electronic device automatically sends the target text to contact A, which reduces manual operations, improves the degree of automation and intelligence, and improves the user experience.
FIG. 10B1 is an exemplary diagram illustrating an operation process after entering the WeChat application in the case where the user releases the finger after dragging the target text. Referring to fig. 10B1, for the case shown in fig. 6 (c) where the user releases the finger after dragging the target text, when the interface of the WeChat application in the recommended application list has been displayed on the screen of the electronic device (see fig. 10B1 (a)), the user may click contact A on the interface of the WeChat application. In response to contact A being clicked, the electronic device displays a chat interface with contact A, as shown in fig. 10B1 (b).
The chat interface of fig. 10B1 (b) includes a text input box for sending a message to contact A. The electronic device automatically pastes the target text "Beijing Haihui District xx Road x Mansion" into the text input box, and after pasting, the WeChat application displays a "send" button on the interface. If the user wants to adjust the content to be sent, the user may edit the content in the text input box (i.e., the target text) before clicking the "send" button.
Then, the user may click the "send" button in the interface illustrated in fig. 10B1 (b). In response to the "send" button being clicked, the electronic device sends the content in the text input box to contact A and displays the interface after sending, as illustrated in fig. 10B1 (c).
FIG. 10B2 is an exemplary diagram illustrating another operation process after entering the WeChat application in the case where the user releases the finger after dragging the target text. Referring to fig. 10B2, for the case shown in fig. 6 (c) where the user releases the finger after dragging the target text, when the user clicks contact A, the electronic device sends the target text to contact A in response to the click on contact A and displays the interface after sending, as shown in fig. 10B2 (b). Compared with fig. 10B1, it can be seen that in the example shown in fig. 10B2, the operation step shown in fig. 10B1 (b) is omitted, and the process of manually sending the target text to contact A is replaced by a process of automatically sending the target text to contact A, which further reduces manual operations, improves the degree of automation and intelligence, and improves the user experience.
The foregoing embodiments describe the application interaction process in a manner of directly displaying the destination application interface in full screen. This manner enables quick switching to the destination application interface and improves the user experience.
However, in some application scenarios, the user does not want the original interface (for example, the interface displaying the OCR recognition result of picture a) to be completely occluded; for example, the user wants to be able to quickly switch back to the original interface. To meet the user requirements in such scenarios, another display manner for the destination application interface is provided in the embodiments of the present application.
FIG. 11A is a diagram illustrating an exemplary process of displaying the map application interface in a floating window in the case where the user does not release the finger after dragging the target text. The process of starting the map application and displaying the map application interface in the example shown in fig. 11A is the same as the process in the example shown in fig. 7A2 and is not described again here. The example shown in fig. 11A differs from the example shown in fig. 7A2 in that the map application interface is displayed in a floating window in the example shown in fig. 11A, while the map application interface is displayed directly in full screen in the example shown in fig. 7A2.
It should be noted that, in the case where the floating window displays the map application interface, the process of starting the map application and displaying the map application interface may also be the same as the process in the example shown in fig. 7A1 or the same as the process in the example shown in fig. 7A3; the only difference lies in the display manner of the map application interface, and details are not described here.
FIG. 11B is an exemplary illustration of a process of displaying the map application interface in a floating window in the case where the user releases the finger after dragging the target text. The process of starting the map application and displaying the map application interface in the example shown in fig. 11B may be the same as the process in the example shown in fig. 7B, or the same as the process indicated by (a) → (b) → (e) of fig. 7B (i.e., after the map application interface is displayed, the electronic device adds the target text to the text input box on the map application interface), and details are not repeated here. The example shown in fig. 11B differs from the example shown in fig. 7B in that the map application interface is displayed in a floating window in the example shown in fig. 11B, whereas the map application interface is displayed directly in full screen in the example shown in fig. 7B.
In the embodiment of the present application, displaying the destination application in a floating window makes it convenient for the user to see the source interface of the target text, which meets the user requirements and improves the user experience.
FIG. 12A is a schematic diagram illustrating a process of displaying the map application interface in split screen in the case where the user does not release the finger after dragging the target text. The process of starting the map application and displaying the map application interface in the example shown in fig. 12A is the same as the process in the example shown in fig. 7A2, and the description thereof is omitted. The example shown in fig. 12A differs from the example shown in fig. 7A2 in that the map application interface is displayed in split screen in the example shown in fig. 12A, with the screen split top and bottom, whereas the map application interface is displayed directly in full screen in the example shown in fig. 7A2.
It should be noted that, in the case of displaying the map application interface in split screen, the process of starting the map application and displaying the map application interface may also be the same as the process in the example shown in fig. 7A1 or the same as the process in the example shown in fig. 7A3; the only difference lies in the display manner of the map application interface, and details are not described here.
FIG. 12B is a schematic diagram illustrating a process of displaying the map application interface in split screen in the case where the user releases the finger after dragging the target text. The process of starting the map application and displaying the map application interface in the example shown in fig. 12B may be the same as the process in the example shown in fig. 7B, or the same as the process indicated by (a) → (b) → (e) of fig. 7B (i.e., after switching to the map application interface, the electronic device adds the target text to the text input box on the map application interface), and is not repeated here. The example shown in fig. 12B differs from the example shown in fig. 7B in that the map application interface is displayed in split screen in the example shown in fig. 12B, with the screen split top and bottom, whereas the map application interface is displayed directly in full screen in the example shown in fig. 7B.
In the embodiment of the present application, displaying the destination application in split screen enables the user to see the source interface of the target text without switching interfaces, which meets the user requirements and improves the user experience. In addition, displaying the destination application in split screen allows the user to see the content of the source interface and the content of the destination application interface at the same time, making it convenient for the user to copy content from the source interface and paste it into the destination application interface, avoiding the trouble of frequently switching interfaces and saving the user time.
It should be noted that the foregoing description takes a single-screen electronic device (for example, a single-screen mobile phone or tablet) as an example; the same applies to a folding-screen electronic device (for example, a folding-screen mobile phone) in the unfolded, full-screen state and to the single screen of a folding-screen electronic device after it is folded.
For an electronic device with a smaller screen width, the destination application interface may be displayed in the top-bottom split-screen manner shown in fig. 12A and 12B. For a wide-screen electronic device, such as a tablet or a folding-screen mobile phone whose folding screen is unfolded to the full-screen state, the embodiment of the present application provides a left-right split-screen display manner.
FIG. 13A is an exemplary process diagram illustrating a process of displaying the Jingdong application interface in split screen in the case where the user does not release the finger after dragging the target text. The process of opening the Jingdong application and displaying the interface of the Jingdong application in the example shown in fig. 13A is the same as the process in the example shown in fig. 7A1, and the description thereof is omitted. The example shown in fig. 13A differs from the example shown in fig. 7A1 in that the Jingdong application interface is displayed in split screen in the example shown in fig. 13A, with the screen split left and right, whereas the Jingdong application interface is displayed directly in full screen in the example shown in fig. 7A1.
It should be noted that, in the case of displaying the destination application interface in split screen, the process of starting the destination application and displaying the destination application interface may also be the same as the process in the example shown in fig. 7A2 or the same as the process in the example shown in fig. 7A3; the only difference lies in the display manner of the destination application interface, and the description is omitted here.
FIG. 13B is a schematic diagram illustrating an exemplary process of displaying the Jingdong application interface in split screen in the case where the user releases the finger after dragging the target text. The process of opening the Jingdong application and displaying the Jingdong application interface in the example shown in fig. 13B may be the same as the process in the example shown in fig. 7B, or the same as the process indicated by (a) → (b) → (e) of fig. 7B (that is, after the Jingdong application interface is displayed, the electronic device automatically adds the target text to the text input box on the Jingdong application interface), and details are not repeated here. The example shown in fig. 13B differs from the example shown in fig. 7B in that the Jingdong application interface is displayed in split screen in the example shown in fig. 13B, with the screen split left and right, whereas the Jingdong application interface is displayed directly in full screen in the example shown in fig. 7B.
It should be noted that the electronic device may simultaneously support the aforementioned manners of displaying the destination application interface, such as direct full-screen display, floating-window display, and split-screen display, and may set an enabling switch for each display manner. The user can thus select the corresponding manner of displaying the destination application interface by turning on the corresponding switch according to personal usage habits. When the user selects the split-screen display manner, the electronic device may determine, before splitting the screen, whether to split the screen top-bottom or left-right according to the type of the electronic device. For example, when the electronic device is a single-screen, narrow-screen device (e.g., a single-screen mobile phone), the electronic device may determine to split the screen top and bottom; when the electronic device is a single-screen, wide-screen device (e.g., a tablet), the electronic device may determine to split the screen left and right; when the electronic device is a folding-screen device, the electronic device may further determine the split-screen manner according to whether the screen is folded, that is, the screen is split top and bottom if it is in the folded state, and left and right if it is in the unfolded state.
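The split-screen orientation decision described above can be sketched as follows; the device-type strings are illustrative assumptions:

```python
def choose_split_mode(device_type, folded=None):
    """Decide the split-screen orientation from the device type and,
    for folding-screen devices, from whether the screen is folded.
    Device-type strings ('narrow', 'wide', 'folding') are illustrative."""
    if device_type == "narrow":          # e.g. single-screen mobile phone
        return "top-bottom"
    if device_type == "wide":            # e.g. tablet
        return "left-right"
    if device_type == "folding":
        return "top-bottom" if folded else "left-right"
    raise ValueError("unknown device type")
```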
Fig. 14A is an exemplary illustration of the process of switching the destination application on a folding-screen or wide-screen electronic device without releasing the hand after dragging the target text. Fig. 14A shows the destination application being switched from the Jingdong application to the WeChat application without releasing the hand after dragging the target text. The process in the example shown in fig. 14A is the same in principle as the process in the example shown in fig. 9A and is not described here again. It should be noted that, between (b) and (c) of fig. 14A, the step in which the user drags the target text to the area outside the recommended application list is omitted; in response to the target text being dragged outside the recommended application list, the electronic device stops displaying the recommended application list.
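The show/hide behavior of the recommended application list during a drag, including the omitted step described above, can be sketched as a small controller. The rectangular trigger area and its coordinates are assumptions made for illustration:

```python
# Minimal sketch of the recommended-application-list visibility logic:
# dragging the target text into the trigger area shows the list, and
# dragging it back out stops displaying the list. The coordinate-based
# trigger area is a hypothetical stand-in for the "first area".

class RecommendedListController:
    def __init__(self, trigger_area):
        # trigger_area: (x_min, y_min, x_max, y_max) of the first area
        self.trigger_area = trigger_area
        self.list_visible = False

    def _in_area(self, x, y):
        x0, y0, x1, y1 = self.trigger_area
        return x0 <= x <= x1 and y0 <= y <= y1

    def on_drag_move(self, x, y):
        """Update list visibility as the dragged target text moves."""
        # Show the list while the drag is inside the area; hide it otherwise.
        self.list_visible = self._in_area(x, y)
        return self.list_visible
```
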
Fig. 14B is an exemplary illustration of the process of switching the destination application on a folding-screen or wide-screen electronic device in the case where the user releases the hand after dragging the target text. Fig. 14B shows the destination application being switched from the Jingdong application to the WeChat application after the hand is released following the drag of the target text. The process in the example shown in fig. 14B is the same in principle as the process in the example shown in fig. 9B and is not described here again.
FIG. 15 is a diagram illustrating a process of displaying one destination application based on target text in another destination application. Referring to fig. 15 (a), the electronic device has displayed a WeChat application interface on the right side of the screen based on the target text "Beijing Haihu District X Road X Mansion" selected and dragged in the picture a interface on the left side of the screen. At this time, the user clicks picture b in the WeChat application interface on the right side of the screen, and in response, the electronic device displays picture b on the right side of the screen, as shown in fig. 15 (b).
Next, the electronic device obtains the OCR recognition result of picture b according to the same principle as the aforementioned OCR recognition of picture a. After obtaining the OCR recognition result of picture b, the electronic device displays an OCR recognition result display button on the interface of picture b, as shown in fig. 15 (c).
Then, when the user clicks the OCR recognition result display button on the interface of picture b, the electronic device displays the OCR recognition result of picture b on the right side of the screen in response to the click. Thereafter, according to the same process principle as that of the aforementioned fig. 7B, the user selects the contact application as the destination application of the new target text "177****1991" through the recommended application list triggered by picture b, as shown in fig. 15 (d).
Next, in response to the user selecting the contact application as the destination application in the recommended application list triggered by picture b, the electronic device displays a contact application interface on the left side of the screen. Then, in response to the user clicking the text input box corresponding to "mobile phone" on the contact application interface, the electronic device pops up a text operation menu including a paste operation option on the contact application interface, and stops displaying the recommended application list and the hover ball corresponding to the target text on the right side of the screen, as shown in fig. 15 (e).
Finally, in response to the user clicking the paste operation option in the text operation menu popped up on the contact application interface, the electronic device adds the new target text "177****1991" to the text input box corresponding to "mobile phone", as shown in fig. 15 (f).
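The examples above depend on recognizing what kind of content the target text is (an address in fig. 15 (a), a masked phone number in fig. 15 (d)) so that a matching destination application and text input box can be offered. A rough sketch of such a matcher follows; the regular-expression patterns, category names, and application lists are illustrative assumptions, not the recognition actually used by the electronic device:

```python
import re

# Rough sketch of classifying a target text so that a matching text input
# box or a recommended destination application can be chosen. The categories,
# patterns, and application names below are illustrative assumptions.

PATTERNS = {
    # Masked mobile number such as "177****1991"
    "phone": re.compile(r"^\d{3}\*{2,}\d{4}$"),
    # Very loose address heuristic: the text contains a location keyword
    "address": re.compile(r"(District|Road|Mansion|区|路|大厦)"),
}

def classify_target_text(text: str) -> str:
    """Return the first matching category, or "generic" if none match."""
    for category, pattern in PATTERNS.items():
        if pattern.search(text):
            return category
    return "generic"

# A category can then be mapped to candidate destination applications
# for the recommended application list.
RECOMMENDED = {
    "phone": ["Contacts", "Phone"],
    "address": ["Maps", "WeChat"],
    "generic": ["Notes"],
}
```
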
An embodiment of the present application further provides an electronic device, including:
a memory and a processor, the memory coupled with the processor;
the memory stores program instructions that, when executed by the processor, cause the electronic device to perform any of the application interaction methods described above.
An embodiment of the present application further provides a computer-readable storage medium, including a computer program which, when run on an electronic device, causes the electronic device to execute any one of the foregoing application interaction methods.
An embodiment of the present application further provides a chip, including one or more interface circuits and one or more processors; the interface circuit is configured to receive signals from a memory of the electronic device and send the signals to the processor, the signals including computer instructions stored in the memory; the computer instructions, when executed by the processor, cause the electronic device to perform any one of the application interaction methods described above.
The electronic device, the computer storage medium, the computer program product, and the chip provided in this embodiment are all configured to execute the corresponding method provided above; therefore, for the beneficial effects they can achieve, reference may be made to the beneficial effects of the corresponding method provided above, which are not repeated here.
Through the description of the foregoing embodiments, those skilled in the art will understand that the division into the above functional modules is used for illustration only, for convenience and simplicity of description. In practical applications, the above functions may be distributed among different functional modules as needed; that is, the internal structure of the device may be divided into different functional modules to complete all or part of the functions described above.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative: the division into modules or units is only a division by logical function, and other divisions are possible in actual implementation; for example, a plurality of units or components may be combined or integrated into another apparatus, or some features may be omitted or not executed. In addition, the shown or discussed mutual coupling, direct coupling, or communication connection may be an indirect coupling or communication connection through some interfaces, devices, or units, and may be in electrical, mechanical, or other form.
Units described as separate parts may or may not be physically separate, and parts displayed as units may be one physical unit or a plurality of physical units, may be located in one place, or may be distributed to a plurality of different places. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit may be implemented in the form of hardware, or may also be implemented in the form of a software functional unit.
The various embodiments of the present application, as well as the features within any one embodiment, may be freely combined. Any such combination is within the scope of the present application.
The integrated unit, if implemented as a software functional unit and sold or used as an independent product, may be stored in a readable storage medium. Based on such an understanding, the technical solutions of the embodiments of the present application, in essence, or the part contributing to the prior art, or all or part of the technical solutions, may be embodied in the form of a software product. The software product is stored in a storage medium and includes several instructions for enabling an electronic device (which may be a single-chip microcomputer, a chip, or the like) or a processor to execute all or part of the steps of the methods of the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
While the present embodiments have been described with reference to the accompanying drawings, it is to be understood that the invention is not limited to the precise embodiments described above, which are meant to be illustrative and not restrictive, and that various changes may be made therein by those skilled in the art without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (25)

1. An application interaction method applied to an electronic device, the method comprising:
displaying an interface of a first application;
receiving a selection operation on a first target text on an interface of the first application;
in response to dragging the first target text to a first area, displaying an icon of a second application and an icon of a third application in a second area, wherein the second area is located at the edge part of the screen of the electronic equipment;
in response to the first target text being dragged onto the icon of the second application, launching the second application and displaying the interface of the second application in the area where the interface of the first application is displayed.
2. The method of claim 1, wherein, after launching the second application and displaying the interface of the second application in the area where the interface of the first application is displayed in response to the first target text being dragged onto the icon of the second application, the method further comprises:
when it is detected that the interface of the second application includes a first text input box matching the first target text, displaying the first target text in an overlaid manner in the first text input box;
in response to the first target text being dragged outside the first area, stopping displaying the first target text in the overlaid manner in the first text input box;
in response to dragging the first target text to the first area, displaying an icon of the second application and an icon of the third application in the second area;
in response to the first target text being dragged onto the icon of the third application, launching the third application and displaying the interface of the third application in the area where the interface of the second application is displayed.
3. The method of claim 1, further comprising:
in response to dragging the first target text to a first area, copying the first target text;
after launching the second application and displaying the interface of the second application in the area where the interface of the first application is displayed in response to the first target text being dragged onto the icon of the second application, the method further comprises:
when it is detected that the interface of the second application includes a first text input box matching the first target text, displaying the first target text in an overlaid manner in the first text input box;
in response to a hand-release operation, pasting the first target text into the first text input box and displaying the first target text within the first text input box.
4. The method of claim 1, further comprising:
in response to dragging the first target text to a first area, copying the first target text;
after launching the second application and displaying the interface of the second application in the area where the interface of the first application is displayed in response to the first target text being dragged onto the icon of the second application, the method further comprises:
when the interface of the second application includes a second text input box and a third text input box, in response to a hand-release operation after the first target text is dragged onto the second text input box, pasting the first target text into the second text input box and displaying the first target text within the second text input box.
5. The method of claim 1, wherein, after launching the second application and displaying the interface of the second application in the area where the interface of the first application is displayed in response to the first target text being dragged onto the icon of the second application, the method further comprises:
stopping displaying the icon of the second application and the icon of the third application;
the interface of the second application comprises a session area of a first object and a session area of a second object, and in response to dragging the first target text to the session area of the first object, the session interface of the first object is displayed in the area where the interface of the second application is displayed, and the session interface of the first object belongs to the second application;
wherein the session interface of the first object includes a fourth text input box matching the first target text, pasting the first target text into the fourth text input box, and displaying the first target text within the fourth text input box;
and responding to the received sending instruction, and sending the content in the fourth text input box to the first object.
6. The method of claim 1, wherein, after launching the second application and displaying the interface of the second application in the area where the interface of the first application is displayed in response to the first target text being dragged onto the icon of the second application, the method further comprises:
stopping displaying the icon of the second application and the icon of the third application;
the interface of the second application comprises a session area of a first object and a session area of a second object, and in response to dragging the first target text to the session area of the first object, the session interface of the first object is displayed in the area where the interface of the second application is displayed, and the session interface of the first object belongs to the second application;
wherein the session interface of the first object includes a fourth text input box matching the first target text, pasting the first target text into the fourth text input box, and displaying the first target text within the fourth text input box;
receiving an editing instruction of the text in the fourth text input box, and editing the text in the fourth text input box according to the editing instruction;
and after the editing is finished, responding to the received sending instruction, and sending the content in the fourth text input box to the first object.
7. The method of claim 1, wherein, after launching the second application and displaying the interface of the second application in the area where the interface of the first application is displayed in response to the first target text being dragged onto the icon of the second application, the method further comprises:
stopping displaying the icon of the second application and the icon of the third application;
wherein the interface of the second application comprises a session area of a first object and a session area of a second object, sending the first target text to the first object in response to the first target text being dragged to the session area of the first object, and displaying the session interface of the first object after the sending in the area displaying the interface of the second application.
8. The method of claim 1, wherein, after launching the second application and displaying the interface of the second application in the area where the interface of the first application is displayed in response to the first target text being dragged onto the icon of the second application, the method further comprises:
stopping displaying the icon of the second application and the icon of the third application.
9. The method of claim 1, further comprising:
displaying an interface of a fourth application;
receiving a selection operation on a second target text on the interface of the fourth application;
in response to dragging the second target text to the first area, displaying an icon of a fifth application and an icon of a sixth application in the second area;
in response to the second target text being dragged onto the icon of the fifth application, launching the fifth application, displaying a floating window overlaid on the interface of the fourth application, and displaying the interface of the fifth application in the floating window.
10. The method of claim 1, further comprising:
displaying an interface of a fourth application;
receiving a selection operation on a second target text on the interface of the fourth application;
in response to dragging the second target text to the first area, displaying an icon of a fifth application and an icon of a sixth application in the second area;
in response to the second target text being dragged onto the icon of the fifth application, launching the fifth application, displaying a first split screen and a second split screen in the area displaying the interface of the fourth application, displaying the interface of the fourth application in the first split screen, and displaying the interface of the fifth application in the second split screen.
11. The method of claim 10, wherein the first split screen is located on the upper half of the screen of the electronic device and the second split screen is located on the lower half of the screen of the electronic device.
12. The method of claim 10, wherein the first split screen is located on the left half of the screen of the electronic device and the second split screen is located on the right half of the screen of the electronic device.
13. The method of claim 10, further comprising:
receiving a selection operation on a third target text on the interface of the fifth application;
in response to dragging the third target text to the first area, displaying an icon of a seventh application and an icon of an eighth application in the second area;
in response to the third target text being dragged onto the icon of the seventh application, launching the seventh application and displaying the interface of the seventh application in the first split screen.
14. The method of claim 1, wherein the second application and the third application are predicted from the first target text.
15. The method of claim 1, wherein the first area is a screen side area.
16. An application interaction method applied to an electronic device, the method comprising:
receiving a selection operation of a first target text on an interface of a first application;
predicting a second application and a third application according to the first target text;
in response to dragging the first target text to a first area, displaying an icon of the second application and an icon of the third application in a second area, wherein the second area is located at an edge portion of a screen of the electronic device;
in response to the first target text being dragged onto the icon of the second application, launching the second application and displaying the interface of the second application in the area where the interface of the first application is displayed.
17. The method of claim 16, further comprising:
in response to dragging the first target text to a first area, copying the first target text;
when it is detected that the interface of the second application includes a first text input box matching the first target text, displaying the first target text in an overlaid manner in the first text input box;
in response to a hand-release operation, pasting the first target text into the first text input box and displaying the first target text within the first text input box.
18. The method of claim 16, further comprising:
receiving a selection operation of a second target text on an interface of a fourth application;
predicting a fifth application and a sixth application according to the second target text;
in response to dragging the second target text to the first area, displaying an icon of a fifth application and an icon of a sixth application in the second area;
in response to the second target text being dragged onto the icon of the fifth application, launching the fifth application, displaying a floating window overlaid on the interface of the fourth application, and displaying the interface of the fifth application in the floating window.
19. The method of claim 16, further comprising:
receiving a selection operation on a second target text on an interface of a fourth application;
predicting a fifth application and a sixth application according to the second target text;
in response to dragging the second target text to the first area, displaying an icon of a fifth application and an icon of a sixth application in the second area;
in response to the second target text being dragged onto the icon of the fifth application, launching the fifth application, displaying a first split screen and a second split screen in the area displaying the interface of the fourth application, displaying the interface of the fourth application in the first split screen, and displaying the interface of the fifth application in the second split screen.
20. The method of claim 19, further comprising:
receiving a selection operation on a third target text on the interface of the fifth application;
predicting a seventh application and an eighth application according to the third target text;
in response to dragging the third target text to the first area, displaying an icon of a seventh application and an icon of an eighth application in the second area;
in response to the third target text being dragged onto the icon of the seventh application, launching the seventh application and displaying the interface of the seventh application in the first split screen.
21. The method of claim 19, wherein displaying the first split screen and the second split screen in the area where the fourth interface is displayed comprises:
when the electronic device is a single-screen device, displaying the first split screen on the upper half of the screen of the electronic device and the second split screen on the lower half of the screen of the electronic device.
22. The method of claim 19, wherein displaying the first split screen and the second split screen in the area where the fourth interface is displayed comprises:
when the electronic device is a folding-screen device, displaying the first split screen on the left half of the screen of the electronic device and the second split screen on the right half of the screen of the electronic device.
23. An electronic device, comprising:
a memory and a processor, the memory coupled with the processor;
the memory stores program instructions that, when executed by the processor, cause the electronic device to perform the application interaction method of any of claims 1-22.
24. A computer-readable storage medium, comprising a computer program which, when run on an electronic device, causes the electronic device to perform an application interaction method as claimed in any one of claims 1-22.
25. A chip comprising one or more interface circuits and one or more processors; the interface circuit is configured to receive signals from a memory of an electronic device and to transmit the signals to the processor, the signals including computer instructions stored in the memory; the computer instructions, when executed by the processor, cause the electronic device to perform the application interaction method of any of claims 1-22.
CN202111340669.1A 2021-11-12 2021-11-12 Application interaction method and electronic equipment Active CN115033142B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111340669.1A CN115033142B (en) 2021-11-12 2021-11-12 Application interaction method and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111340669.1A CN115033142B (en) 2021-11-12 2021-11-12 Application interaction method and electronic equipment

Publications (2)

Publication Number Publication Date
CN115033142A true CN115033142A (en) 2022-09-09
CN115033142B CN115033142B (en) 2023-09-12

Family

ID=83117948

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111340669.1A Active CN115033142B (en) 2021-11-12 2021-11-12 Application interaction method and electronic equipment

Country Status (1)

Country Link
CN (1) CN115033142B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2024131288A1 (en) * 2022-12-20 2024-06-27 Oppo广东移动通信有限公司 Image text sharing method, apparatus and device, and computer-readable storage medium

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040153974A1 (en) * 2003-01-30 2004-08-05 Walker Kenneth A. Markup language store-and-paste
US20100262928A1 (en) * 2009-04-10 2010-10-14 Cellco Partnership D/B/A Verizon Wireless Smart object based gui for touch input devices
CN104050153A (en) * 2013-03-11 2014-09-17 三星电子株式会社 Method And Apparatus For Copying And Pasting Of Data
CN106575195A (en) * 2014-10-24 2017-04-19 谷歌公司 Improved drag-and-drop operation on a mobile device
CN109960446A (en) * 2017-12-25 2019-07-02 华为终端有限公司 It is a kind of to control the method and terminal device that selected object is shown in application interface
CN111124709A (en) * 2019-12-13 2020-05-08 维沃移动通信有限公司 Text processing method and electronic equipment
CN112882623A (en) * 2021-02-09 2021-06-01 维沃移动通信有限公司 Text processing method and device, electronic equipment and storage medium


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
ANKIT KUMAR; V AJITH KUMAR; JOY BOSE: "Multiple copy and paste operation in mobile web browsers", 2017 International Conference on Advances in Computing, Communications and Informatics (ICACCI), pages 2307 - 2312 *
CUI Tianjian; DONG Tiantian: "Research on the functional extension design of interactive interfaces: taking the smartphone as an example", Journal of Nanjing Arts Institute (Fine Arts & Design), pages 206 - 210 *


Also Published As

Publication number Publication date
CN115033142B (en) 2023-09-12

Similar Documents

Publication Publication Date Title
JP7013466B2 (en) Application data processing methods, equipment, and computer programs
EP2854428B1 (en) Terminal for providing instant messaging service
EP2615535B1 (en) Mobile terminal and method of controlling the same
CN110519461B (en) File transmission method, device, computer equipment and storage medium
CN109710909B (en) Content acquisition method, device, terminal and storage medium
US20100281409A1 (en) Apparatus and method for handling notifications within a communications device
CN112269508B (en) Display method and device and electronic equipment
EP2690848A1 (en) Mobile terminal and controlling method thereof
US20130024818A1 (en) Apparatus and Method for Handling Tasks Within a Computing Device
CN114356198A (en) Data transmission method and device
CN112306325B (en) Interaction control method and device
US20240086231A1 (en) Task migration system and method
CN113407086B (en) Object dragging method, device and storage medium
CN108304234B (en) Page display method and device
US20240248582A1 (en) Content Sharing Method and Apparatus, and Electronic Device
CN112948844B (en) Control method and device and electronic equipment
CN111787493A (en) Message sending method, message sending device and electronic equipment
CN112163432A (en) Translation method, translation device and electronic equipment
CN115033142B (en) Application interaction method and electronic equipment
CN112181351A (en) Voice input method and device and electronic equipment
WO2023082817A1 (en) Application program recommendation method
CN115033153B (en) Application program recommendation method and electronic device
EP3538981B1 (en) Layered content selection
CN115016710B (en) Application program recommendation method
US11256855B2 (en) Systems and methods for collation of digital content

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant