CN108228020B - Information processing method and terminal - Google Patents


Info

Publication number
CN108228020B
CN108228020B (application CN201611138111.4A)
Authority
CN
China
Prior art keywords
floating window
picture object
application interface
unit
current application
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201611138111.4A
Other languages
Chinese (zh)
Other versions
CN108228020A (en)
Inventor
赵娜 (Zhao Na)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority to CN202110771359.9A priority Critical patent/CN113434065B/en
Priority to CN201611138111.4A priority patent/CN108228020B/en
Publication of CN108228020A publication Critical patent/CN108228020A/en
Application granted granted Critical
Publication of CN108228020B publication Critical patent/CN108228020B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04812 Interaction techniques based on cursor appearance or behaviour, e.g. being affected by the presence of displayed objects
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04842 Selection of displayed objects or displayed text elements
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00 Arrangements for program control, e.g. control units
    • G06F9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44 Arrangements for executing specific programs
    • G06F9/451 Execution arrangements for user interfaces
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048 Indexing scheme relating to G06F3/048
    • G06F2203/04803 Split screen, i.e. subdividing the display area or the window area into separate subareas
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048 Indexing scheme relating to G06F3/048
    • G06F2203/04808 Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously; e.g. using several fingers or a combination of fingers and pen

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The embodiment of the invention discloses an information processing method, which comprises the following steps: receiving a first operation of a user in a current application interface, the first operation being used to trigger display of a first selection interface for selecting a picture object; displaying the first selection interface in response to the first operation, and receiving a second operation of the user on the first selection interface, the second operation being used to select a picture object; acquiring the picture object in response to the second operation; acquiring operation information obtained from the first operation and the second operation, and creating, according to the operation information, a floating window for displaying the picture object, the floating window being arranged in a target area on the current application interface, where the target area is an area outside the input box and character entry area arranged on the current application interface; and displaying the picture object in the floating window. The embodiment of the invention also discloses a terminal.

Description

Information processing method and terminal
Technical Field
The present invention relates to information processing technologies in the field of communications, and in particular, to an information processing method and a terminal.
Background
With the continuous development of information technology, the variety of application functions on terminals keeps increasing. When a user uses a terminal to implement one function, the user may need to implement another function at the same time. For example, while chatting with a chat application on a mobile phone, the user may also want to view pictures or documents, or watch videos.
In the prior art, when a user who is using a terminal to implement one function wants to implement another, the terminal, based on the user's behavior, switches the function currently being implemented to the background and implements the other function in the foreground. When the user wants to use the first function again, the terminal must, again based on the user's behavior, switch it from the background back to the foreground.
However, when multiple applications or functions are implemented simultaneously on a terminal with the existing technology, the constant switching between foreground and background makes implementing them cumbersome and complicated.
Disclosure of Invention
In order to solve the above technical problems, embodiments of the present invention provide an information processing method and a terminal that can implement different application functions simultaneously, reduce the complexity of implementing the application functions, and improve the flexibility of implementing them.
The technical scheme of the invention is realized as follows:
the embodiment of the invention provides an information processing method, which comprises the following steps:
receiving a first operation of a user in a current application interface, wherein the first operation is used for triggering display of a first selection interface for selecting a picture object;
responding to the first operation to display the first selection interface, and receiving a second operation of the user on the first selection interface, wherein the second operation is used for selecting the picture object;
responding to the second operation, and acquiring the picture object;
acquiring operation information obtained according to the first operation and the second operation, and creating a floating window for displaying the picture object according to the operation information, wherein the floating window is arranged in a target area on the current application interface, and the target area is an area outside the input box and character entry area arranged on the current application interface;
and displaying the picture object in the floating window.
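Purely as an illustration (not part of the claimed invention), the flow of the steps above can be sketched as a minimal Python simulation; all class, method, and file names below are hypothetical, and the target-area policy (taking the top half of the screen) is an assumption for demonstration only.

```python
# Hypothetical sketch of the claimed processing flow.
# All names are illustrative; a real terminal would use its
# platform's window manager instead of these plain objects.

class Terminal:
    def __init__(self, screen_w, screen_h, input_box, entry_area):
        self.screen = (screen_w, screen_h)
        self.input_box = input_box      # (x, y, w, h) rectangle to avoid
        self.entry_area = entry_area    # (x, y, w, h) rectangle to avoid
        self.floating_windows = []

    def on_first_operation(self, gallery):
        """First operation: trigger display of the first selection interface."""
        return list(gallery)            # objects the user may pick from

    def on_second_operation(self, selection_interface, index):
        """Second operation: select one picture object."""
        return selection_interface[index]

    def create_floating_window(self, picture, op_info):
        """Create a floating window in a target area that avoids the
        input box and character-entry area."""
        target = self._target_area()
        win = {"picture": picture, "area": target, "info": op_info}
        self.floating_windows.append(win)
        return win

    def _target_area(self):
        # Simplest policy: the top half of the screen, assuming the
        # input box and entry area sit at the bottom.
        w, h = self.screen
        return (0, 0, w, h // 2)

t = Terminal(1080, 1920, input_box=(0, 1800, 1080, 120),
             entry_area=(0, 1600, 1080, 200))
interface = t.on_first_operation(["A.png", "B.png", "C.png"])
picture = t.on_second_operation(interface, 0)
window = t.create_floating_window(picture, op_info={"source": "gallery"})
```

The sketch only mirrors the order of the claimed steps; it does not model any real windowing API.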
An embodiment of the present invention provides a terminal, including:
a receiving unit, configured to receive a first operation of a user in a current application interface, wherein the first operation is used for triggering display of a first selection interface for selecting a picture object;
a display unit, configured to display the first selection interface in response to the first operation received by the receiving unit;
the receiving unit is further configured to receive a second operation of the user at the first selection interface displayed by the display unit, where the second operation is used to select the picture object;
an acquiring unit, configured to acquire the picture object in response to the second operation received by the receiving unit, and to acquire operation information obtained according to the first operation and the second operation;
a creating unit, configured to create a floating window for displaying the picture object according to the operation information, wherein the floating window is arranged in a target area on the current application interface, and the target area is an area outside the input box and character entry area arranged on the current application interface;
the display unit is further configured to display, in a floating manner, the picture object acquired by the acquiring unit in the floating window created by the creating unit.
The embodiment of the invention provides an information processing method and a terminal. In a current application interface, a first operation of a user is received, the first operation being used to trigger display of a first selection interface for selecting a picture object; the first selection interface is displayed in response to the first operation, and a second operation of the user on the first selection interface is received, the second operation being used to select a picture object; the picture object is acquired in response to the second operation; operation information obtained from the first operation and the second operation is acquired, and a floating window for displaying the picture object is created according to the operation information, the floating window being arranged in a target area on the current application interface, where the target area is an area outside the input box and character entry area arranged on the current application interface; and the picture object is displayed in the floating window. With this technical solution, while the current application function is implemented, the display function for the picture object is carried out simultaneously in the form of a floating window, without switching back and forth between foreground and background, and the operation is simple and concise. Different application functions can therefore be implemented simultaneously, the complexity of implementing the application functions is reduced, and the flexibility of implementing them is improved.
Drawings
Fig. 1 is a system architecture diagram of an information processing method according to an embodiment of the present invention;
fig. 2 is a first flowchart of an information processing method according to an embodiment of the present invention;
fig. 3 is a first schematic interface diagram of an exemplary terminal according to an embodiment of the present invention;
fig. 4 is a second schematic interface diagram of an exemplary terminal according to an embodiment of the present invention;
fig. 5 is a third schematic interface diagram of an exemplary terminal according to an embodiment of the present invention;
fig. 6 is a fourth schematic interface diagram of an exemplary terminal according to an embodiment of the present invention;
fig. 7 is a second flowchart of an information processing method according to an embodiment of the present invention;
fig. 8 is a fifth schematic interface diagram of an exemplary terminal according to an embodiment of the present invention;
fig. 9 is a third flowchart of an information processing method according to an embodiment of the present invention;
fig. 10 is a fourth flowchart of an information processing method according to an embodiment of the present invention;
fig. 11 is a sixth schematic interface diagram of an exemplary terminal according to an embodiment of the present invention;
fig. 12 is a seventh schematic interface diagram of an exemplary terminal according to an embodiment of the present invention;
fig. 13 is a first schematic structural diagram of a terminal according to an embodiment of the present invention;
fig. 14 is a second schematic structural diagram of a terminal according to an embodiment of the present invention;
fig. 15 is a third schematic structural diagram of a terminal according to an embodiment of the present invention;
fig. 16 is a fourth schematic structural diagram of a terminal according to an embodiment of the present invention;
fig. 17 is a fifth schematic structural diagram of a terminal according to an embodiment of the present invention.
Detailed Description
The technical solution in the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present invention.
Fig. 1 is an architecture diagram of an information processing system according to an embodiment of the present invention. Fig. 1 includes one or more servers 2, terminal devices 1, and a network 3, where the network 3 includes network entities such as routers and gateways, which are not shown in the figure. The terminal 1 exchanges information with the server 2 through a wired or wireless network, so that relevant data information collected from the terminal 1 is transmitted to the server 2. The types of terminals shown in fig. 1 include a mobile phone, a tablet or PDA, a desktop computer, a PC, a smart TV, and the like. Various applications required by the user are installed on the terminal, such as applications with entertainment functions (e.g., video applications, audio playing applications, game applications, reading software, chat applications and live broadcast applications) and applications with service functions (e.g., map navigation applications, group buying applications, shooting applications, etc.).
Based on the above-described architecture, the following embodiments are implemented.
Example one
An embodiment of the present invention provides an information processing method, as shown in fig. 2, the method may include:
s101, receiving a first operation of a user in a current application interface, wherein the first operation is used for triggering and displaying a first selection interface for selecting a picture object.
An application scenario of the information processing method provided by the embodiment of the present invention may be as follows: when a user is using a first application on a terminal, the user can simultaneously use a second application in a certain area of the current application interface.
Optionally, the first application may be a game, a browser, a chat application, a picture library, or another application, and the second application may be a picture library application. For example, when the first application is a chat application and the current application interface is a chat interface, the user views a certain picture while chatting. The embodiment of the present invention does not limit the type of the second application; for example, a document application is also possible.
Optionally, the terminal in the embodiment of the present invention may be a smart phone, a tablet computer, a notebook computer, a desktop computer, a smart watch, or the like, and certainly, the present invention may also be other terminals without specific limitation. The user can perform applications such as games, browsers, chat applications, and photo libraries through the above-described terminal.
Here, as shown in fig. 3, assume that a user is chatting on a terminal (mobile phone) using an instant chat application and wants to view a certain picture (one of A, B and C) on the terminal. First, the terminal implements the picture library application in the current application interface. The process in which the user selects the picture library application from the multiple applications displayed in the current application interface is the process in which the terminal receives the first operation of the user; the terminal then displays, in the current application interface, the objects to be selected in the picture library application based on the first operation. There may be multiple objects to be selected, as determined by the files actually stored in the application, which is not limited in the embodiment of the present invention.
Optionally, the picture object is an object selected from the objects to be selected of the second application. The first operation in the embodiment of the present invention may be an operation in which the user clicks or otherwise selects the picture library application among the multiple applications of the terminal; the specific implementation manner of the first operation is not limited in the embodiment of the present invention.
It should be noted that an entry to other applications (for example, an add button; the specific form is not limited) is set on the current application interface of the terminal, and through this entry the user can reach a selection interface of multiple applications. That is, before the terminal receives the first operation, the user opens the display interface of the multiple applications by clicking, double-clicking, or the like on the entry (for example, an add button) set on the current application interface. At this time, as shown in fig. 4, the mobile phone (terminal) may display the interfaces of multiple applications, for example, icons of pictures, documents, videos, and the like, in a predetermined area of the current application interface. The user can then select the application to be implemented (e.g., the picture library application) among these interfaces; that is, while the mobile phone displays the interfaces of the selectable applications in the current application interface, it receives the first operation of the user, through which the picture library application is selected. The first operation is thus an operation of selecting the second application among the multiple applications, and it triggers display of the first selection interface for selecting a picture object in the second application.
Optionally, the predetermined area in the embodiment of the present invention may be a preset area on the terminal display screen, or an area on the terminal display screen that the user can determine by setting screen coordinates; the specific determination manner of the predetermined area is not limited. For example, the predetermined area may be the lowermost display area of the current application interface.
And S102, responding to the first operation, displaying a first selection interface, and receiving a second operation of the user on the first selection interface, wherein the second operation is used for selecting the picture object.
After the terminal receives the first operation of the user, through which the second application (for example, the picture library application) whose function is to be implemented is selected, the terminal displays, in response to the first operation, a first selection interface of selectable picture objects in the second application; the first selection interface may be displayed in the predetermined area of the current application interface.
It can be understood that there may be multiple objects to be selected in the second application, that is, multiple objects may be displayed in the first selection interface displayed by the terminal. The user therefore needs to select, among these objects, the picture object to be displayed or used.
Specifically, when the user selects the second application (for example, the picture library application icon) through the first operation on the terminal, the terminal is triggered to display the first selection interface for selecting a picture object; the first selection interface may be a list interface or a display interface of the multiple selectable objects in the application.
For example, as shown in fig. 5, the user selects the picture library application or picture library icon by clicking or touching the mobile phone; based on this first operation, the mobile phone is triggered to display a picture selection interface (the first selection interface) of 3 selectable pictures. On the picture selection interface, the user clicks or touches the first picture (the second operation), that is, the first picture is selected as the picture object.
And S103, responding to the second operation and acquiring the picture object.
And S104, acquiring operation information obtained according to the first operation and the second operation, and creating a floating window for displaying the picture object according to the operation information, wherein the floating window is arranged in a target area on the current application interface, and the target area is an area outside the input box and character entry area arranged on the current application interface.
And S105, displaying the picture object in the floating window in a floating manner.
After the terminal receives the second operation of the user on the first selection interface, the terminal acquires the picture object in response to the second operation, since the second operation is the operation for selecting the picture object. After acquiring the picture object, the terminal acquires operation information obtained from the first operation and the second operation; the operation information represents the function to be displayed in a floating manner, and the terminal then creates a floating window for displaying the picture object according to the operation information. Since the first selection interface no longer needs to be displayed on the current application interface, the terminal cancels the interface display in the predetermined area based on the second operation and restores the current application interface; at the same time, the terminal displays the picture object in the target area of the current application interface, thereby simultaneously realizing the functional applications of the second application and the first application.
Specifically, when the terminal has received the first operation and the second operation, the terminal calls the window manager to create a top-level window in the target area, and this top-level window serves as the floating window; the terminal then configures the floating window according to preset window parameters to complete the creation of the floating window.
Optionally, in the embodiment of the present invention, the preset window parameters are settings of the attribute information of the floating window, for example, its screen coordinates, display width, display size, and the correspondence between its movement track and movement size.
Optionally, the target area in the embodiment of the present invention may be a preset area on the terminal display screen, or an area on the terminal display screen that the user can determine by setting screen coordinates; the specific determination manner of the target area is not limited in the embodiment of the present invention. The display size of the picture object in the floating window may be preset or default, which is also not limited.
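As a hedged illustration of such preset window parameters, the following Python sketch configures a floating window from a default parameter set; the parameter names and values are hypothetical, not taken from the patent, and a real terminal would pass equivalent attributes to its platform window manager.

```python
# Hypothetical sketch: configuring a floating window from preset
# window parameters (screen coordinates, display size, and a mapping
# between movement track and movement size). Names are illustrative.

DEFAULT_WINDOW_PARAMS = {
    "x": 100, "y": 200,           # screen coordinates of the window
    "width": 400, "height": 300,  # display size
    "alpha": 0.8,                 # semi-transparent, per the embodiment
    "move_scale": 1.0,            # track distance -> window displacement
}

def create_floating_window(picture, params=None):
    """Create a top-level window record configured from preset parameters,
    optionally overridden by caller-supplied values."""
    cfg = dict(DEFAULT_WINDOW_PARAMS)
    if params:
        cfg.update(params)
    return {"picture": picture, "top_level": True, **cfg}

win = create_floating_window("A.png", {"alpha": 0.5})
```

Keeping the defaults in one table mirrors the idea that the floating window is "configured according to preset window parameters" before display.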
It should be noted that, in the embodiment of the present invention, the picture object may be displayed in a floating manner in a floating window set in the current application interface; the picture object is not limited to this and may have other display modes.
Specifically, in the embodiment of the present invention, the terminal sets the floating window in a target area on the current application interface, and the target area is an area outside the input box and character entry area set on the current application interface.
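A minimal Python sketch of one possible target-area computation follows; the policy of taking the region above the topmost reserved rectangle is an illustrative assumption, since the patent does not prescribe a concrete algorithm.

```python
# Hypothetical sketch: choosing a target area that avoids the input
# box and character-entry area. Rectangles are (x, y, w, h).

def overlaps(a, b):
    """Axis-aligned rectangle intersection test."""
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    return ax < bx + bw and bx < ax + aw and ay < by + bh and by < ay + ah

def target_area(screen, reserved):
    """Return the screen region above every reserved rectangle."""
    w, h = screen
    top = min(y for (_, y, _, _) in reserved) if reserved else h
    area = (0, 0, w, top)
    # Sanity check: the chosen area must not touch any reserved region.
    assert not any(overlaps(area, r) for r in reserved)
    return area

# Input box and character-entry area at the bottom of a 1080x1920 screen:
area = target_area((1080, 1920), [(0, 1800, 1080, 120), (0, 1600, 1080, 200)])
```

This assumes the input box and entry area sit at the bottom of the screen, which matches the chat-interface example but is not required by the claims.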
Optionally, the floating window displays the picture object on the upper layer of the current application interface, so that the application interface of the running first application is not blocked from operating.
It should be noted that, while the information processing method of the embodiment of the present invention is implemented, the function of the first application running on the terminal is not turned off but keeps running. The display and processing in the floating window and the receiving and sending of information or use of functions in the current application interface do not affect each other.
In the embodiment of the present invention, the predetermined area and the target area may refer to the same area or to two different areas; the specific relationship between them is not limited.
It can be understood that, since the current application interface is the application interface of the first application in use on the terminal, when the terminal displays the picture object of the floating window over the current application interface, the picture object of the second application is displayed, or the application is implemented, in the floating window without switching the first application to the background. No back-and-forth switching between foreground and background is needed, so the operation is very simple and convenient. The information processing method provided by the embodiment of the present invention can therefore implement different application functions simultaneously, reduce the complexity of implementing the application functions, and improve the flexibility of implementing them.
It should be noted that, to minimize the influence on the first application that the terminal is running, the floating window may be displayed semi-transparently; the specific display manner is not limited in the embodiment of the present invention.
Illustratively, as shown in fig. 6, assume the user is chatting on the mobile phone, i.e., the first application is a chat application, and wants to look at photos between chat messages, i.e., to use the picture library application. With the information processing method provided by the embodiment of the present invention, the chat interface on the mobile phone still receives and sends chat messages normally, while the content of the selected first picture (the picture object, a portrait picture) is displayed in the floating window set on the upper layer of the chat interface.
Further, the embodiment of the present invention can realize the simultaneous use of multiple applications: while a floating window is displayed on the current application interface, S101 to S105 are performed again, and each new target area is set in an area not yet covered. The number of simultaneously implemented applications is not limited in the embodiment of the present invention.
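One way the "area not covered each time" rule might be realized is sketched below in Python; the grid-scan placement policy and the step size are illustrative assumptions, not the patent's method.

```python
# Hypothetical sketch: placing each additional floating window in an
# area not covered by existing windows, by scanning candidate slots.

def overlaps(a, b):
    """Axis-aligned rectangle intersection test; rects are (x, y, w, h)."""
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    return ax < bx + bw and bx < ax + aw and ay < by + bh and by < ay + ah

def place_window(screen, existing, size, step=50):
    """Find the first (x, y, w, h) slot where a window of the given
    size fits without overlapping any existing floating window."""
    sw, sh = screen
    w, h = size
    for y in range(0, sh - h + 1, step):
        for x in range(0, sw - w + 1, step):
            cand = (x, y, w, h)
            if not any(overlaps(cand, e) for e in existing):
                return cand
    return None  # no free area left on the screen

first = place_window((1080, 1920), [], (400, 300))
second = place_window((1080, 1920), [first], (400, 300))
```

The scan guarantees each new window lands in uncovered space, matching the idea that S101 to S105 can repeat with a fresh target area each time.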
Example two
Based on the implementation of the first embodiment, after the terminal performs the floating display of the picture object in the floating window, the terminal may implement some corresponding operations and functions performed on the floating window. Therefore, an embodiment of the present invention provides an information processing method, as shown in fig. 7, after S105, the method may include:
s201, receiving first gesture information of a user, wherein the first gesture information is used for operating a floating window according to a first motion track.
It should be noted that the first embodiment describes the operation implemented when a plurality of applications run simultaneously. When the terminal runs the first application and the second application simultaneously, the embodiment of the present invention may also perform corresponding functional operations on the floating window and implement different functional processes on the floating window.
Here, the user operates the floating window on the current application interface of the terminal on which the first application and the second application are being executed.
It should be noted that the terminal in the embodiment of the present invention may be an intelligent terminal.
Specifically, the user performs a gesture operation on the floating window, and the terminal thereby receives the first gesture information. The terminal in the embodiment of the invention may be an intelligent terminal with a touch display screen; the user performs a first gesture operation on the floating window on the touch display screen of the terminal, which yields the first gesture information. The track along which the first gesture moves on the touch display screen of the terminal is the first motion track; that is, the first gesture information is used for operating the floating window according to the first motion track.
Optionally, the first gesture information in the embodiment of the present invention may be: two-finger relative sliding, single-finger drag sliding, and the like; the embodiment of the present invention is not limited thereto.
It should be noted that, in the embodiment of the present invention, a correspondence between gestures and function operations is preset in the terminal, and the function operation represented by a gesture is set for the context of different applications. For example, when two fingers slide apart within the floating window, the floating window is correspondingly enlarged.
It can be understood that, from the first motion track produced under the action of the first gesture information, the terminal can acquire the corresponding start position and end position within its touch display area.
S202, determining a first function operation instruction corresponding to the first gesture information according to a preset rule and the first motion track.
After the terminal receives the first gesture information of the user, the first gesture information has a correspondence with a function operation; furthermore, the position, within the display area of the terminal, of the end point of the first motion track corresponding to the first gesture information also bears a certain relation to the type of the function operation.
It should be noted that the first application on the terminal in the embodiment of the present invention may implement, in its own application interface, a function of sending information or an object from another application. Therefore, the user may also want to display the content of the picture object in the floating window on the current application interface. The current application interface of the first application is provided with a preset entry, through which the picture object can be applied or displayed on the current application interface of the first application. Accordingly, the first gesture information may be a single move function operation acting only on the floating window, or may be a gesture operation (an input operation or a sending operation) for inputting the picture object of the floating window into the preset entry and displaying it on the current application interface. Specifically, the embodiment of the invention adopts a preset rule to judge whether the operation of the first gesture information on the floating window is a move function operation or a gesture operation for display on the current application interface.
Optionally, the preset entry in the embodiment of the present invention may be an input box or a preset area, which is not limited in the embodiment of the present invention. For example, when the first application is a chat application, if the user drags the floating window on the upper layer of the current application interface to the display position of the chat input box through a first gesture operation, the terminal triggers prompt information for sending the picture object in the floating window, so that the picture object can be sent within the current chat application interface.
It should be noted that, in the embodiment of the present invention, the terminal determines the first function operation instruction corresponding to the first gesture information according to the preset rule and the first motion track. The preset rule may be a rule related to a display position or area. Of course, this is only one implementation form provided by the embodiment of the present invention, which does not limit the manner of determining the first function operation instruction corresponding to the first gesture information.
Specifically, in the embodiment of the present invention, the terminal may determine the destination position of the floating window in the current application interface according to the first motion track; when the destination position is not within the range of the input box, determine that the first function operation instruction is a move function instruction, wherein the move function instruction is used for indicating a move operation on the floating window; and when the destination position is within the range of the input box, determine that the first function operation instruction is a sending instruction, wherein the sending instruction is used for indicating to send the picture object displayed by the floating window. The embodiment of the invention is not limited to the input box range; it may adaptively be a preset position range, and when an input box is arranged on the current application interface, as in a chat application, the preset position range is the input box range.
Optionally, the preset position range is the position of a preset entry that can connect the floating window in the terminal with the current application interface.
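The preset rule above can be sketched as a simple hit test: classify the gesture by whether the end of the first motion track falls within the input box range. The following Python sketch is illustrative only (the `Rect` type and instruction names are assumptions, not from the patent), under the assumption that screen positions are pixel coordinates.

```python
from dataclasses import dataclass


@dataclass
class Rect:
    """An axis-aligned rectangle in screen coordinates."""
    left: int
    top: int
    right: int
    bottom: int

    def contains(self, x: int, y: int) -> bool:
        return self.left <= x <= self.right and self.top <= y <= self.bottom


def classify_gesture(end_pos: tuple, input_box: Rect) -> str:
    """Return the first function operation instruction for a gesture:
    'SEND' if the first motion track ends inside the input box range,
    otherwise 'MOVE' (a move operation on the floating window)."""
    x, y = end_pos
    return "SEND" if input_box.contains(x, y) else "MOVE"
```

For example, with an input box occupying the bottom strip of a 1080x1920 screen, a drag ending at y = 1850 would be classified as a sending instruction, while a drag ending mid-screen would be a move function instruction.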
And S203, realizing the functional operation on the floating window according to the first functional operation instruction.
After the terminal determines the first function operation instruction corresponding to the first gesture information according to the preset rule and the first motion track, the terminal has found the function operation instruction corresponding to the first gesture information; at this time, the terminal only needs to perform the corresponding function operation on the floating window according to the determined first function operation instruction.
Specifically, when the first function operation instruction is a move function instruction, the terminal determines the first move function operation, corresponding to the first gesture information, that the move function instruction indicates; the terminal then responds to the first move function operation to realize the move function of the floating window. When the first function operation instruction is a sending instruction, the terminal generates prompt information for the sending operation, and the terminal sends the picture object displayed by the floating window to the current application interface according to the prompt information.
It should be noted that, in the embodiment of the present invention, when the first function operation instruction is a move function instruction, there are many kinds of move function operations for the floating window, such as magnification, positional shift, and the like. Therefore, when the first function operation instruction is a move function instruction, the terminal needs to further determine the specific first move function operation corresponding to the first gesture information. That is, the terminal is preset with a correspondence between gestures and function operations, and the function operation represented by a gesture is set for different application contexts. For example, when two fingers slide apart within the floating window, the floating window is correspondingly enlarged.
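The preset correspondence between gestures and move function operations can be sketched as a lookup table. The gesture names and operation names below are assumptions for illustration; the patent only requires that such a correspondence be preset in the terminal.

```python
# Hypothetical preset correspondence between recognized gesture types
# and the concrete move function operation applied to the floating window.
GESTURE_TO_OPERATION = {
    "two_finger_spread": "enlarge",    # two fingers slide apart -> magnify
    "two_finger_pinch": "shrink",      # two fingers slide together
    "single_finger_drag": "translate", # positional shift of the window
}


def resolve_move_operation(gesture_type: str) -> str:
    """Map a gesture to its move function operation; unknown gestures
    fall back to a plain positional shift."""
    return GESTURE_TO_OPERATION.get(gesture_type, "translate")
```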
It should be noted that, in the embodiment of the present invention, when the first function operation instruction is a sending instruction, this indicates that the terminal is to display the content of the floating window in the current application interface through the preset entry. The terminal therefore generates prompt information to notify the user that the operation of displaying the floating window's content in the current application interface through the preset entry can be performed, and the terminal then sends the picture object displayed by the floating window to the current application interface according to the prompt information (the user may perform a confirmation operation according to the prompt information, and the terminal responds to that operation).
Illustratively, taking a chat application as an example, the terminal moves the floating window as a top-level window through a gesture and detects the control positions and content of the lower-level chat interface (or chat window). When the destination position or end position of the gesture movement is a control other than the input box, the terminal realizes smooth top-level movement (i.e., movement of the floating window); when, in the process of moving the top-level window, the destination position or end position of the gesture movement reaches a position above the input box, the sending operation is prompted, and after the user confirms sending of the picture object, the picture object is sent to the message list of the chat interface.
Illustratively, as shown in fig. 8, it is assumed that the first application is a chat application and the second application is a picture library application, and the floating window floating on the chat interface of the mobile phone is displaying a picture. At this time, the mobile phone determines, based on the user's sliding gesture, that the floating window has been dragged to the position of the input box of the chat interface; the mobile phone therefore displays a send button, and after the user clicks the send button, the mobile phone responds to the sending instruction and sends the picture object displayed by the floating window to the current chat interface.
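The two-step send flow of fig. 8 (prompt first, send only after user confirmation) can be sketched as follows. The function and parameter names are illustrative assumptions, not part of the patent.

```python
def handle_send(picture: str, confirmed: bool, message_list: list) -> str:
    """Sketch of the sending instruction: before the user confirms, the
    terminal only shows the prompt (send button); after confirmation,
    the picture object is appended to the chat's message list."""
    if not confirmed:
        return "prompt_shown"     # terminal displays the send prompt
    message_list.append(picture)  # picture object enters the chat interface
    return "sent"
```

Note that the picture object reaches the message list only on the confirmed path, which matches the description above: dragging into the input box triggers the prompt, and clicking the send button completes the send.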
It can be understood that the terminal can realize function operations on the floating window while realizing the functions of a plurality of applications, thereby improving the flexibility and diversity of application function realization.
Further, in the embodiment of the present invention, in addition to implementing the above operation functions, the terminal may also switch among different application interfaces of the current application. Specifically, the terminal receives an application interface switching operation, wherein the application interface switching operation is used for triggering the current application interface to switch among multiple application interfaces of the current application, or for triggering switching with application interfaces of other applications; the terminal then responds to the application interface switching operation, switches the current application interface, and adaptively displays the floating window at the front end of the target area of the switched current application interface.
It can be understood that, when the terminal switches the current application interface, the display of the floating window is not affected, that is, when the user switches the current application interface among multiple interfaces of the current application, or switches the current application interface with application interfaces of other applications, the floating window in the embodiment of the present invention may select the adaptive target region according to the switching of the corresponding current application interface and maintain the front-end display.
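Selecting an adaptive target region after an interface switch can be sketched as clamping the floating window into the screen and then moving it out of any region it must not cover (e.g. the switched interface's input box). This is a hedged sketch of one possible policy, with rectangles as `(left, top, right, bottom)` tuples; the patent does not prescribe a specific relocation rule.

```python
def overlaps(a: tuple, b: tuple) -> bool:
    """True if two (left, top, right, bottom) rectangles intersect."""
    return not (a[2] <= b[0] or b[2] <= a[0] or a[3] <= b[1] or b[3] <= a[1])


def adapt_target_area(window: tuple, screen: tuple, avoid: tuple) -> tuple:
    """Clamp `window` into `screen`; if it still overlaps `avoid`
    (e.g. the input box of the switched interface), move it directly above."""
    l, t, r, b = window
    w, h = r - l, b - t
    # Clamp the window into the screen bounds.
    l = min(max(l, screen[0]), screen[2] - w); r = l + w
    t = min(max(t, screen[1]), screen[3] - h); b = t + h
    if overlaps((l, t, r, b), avoid):
        # Relocate just above the avoided region.
        b = avoid[1]; t = b - h
    return (l, t, r, b)
```

For example, on a 1080x1920 screen whose new interface places an input box in the bottom strip, a floating window that previously overlapped that strip is shifted upward so that it remains fully visible and uncovered.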
Example Three
Based on the implementation of the first embodiment, regarding the process in which the terminal creates the floating window before performing the floating display of the picture object in the floating window, an embodiment of the present invention provides an information processing method. As shown in fig. 9, before S105, the process of S104 when creating the floating window may specifically include S301, as follows:
S301, obtaining operation information obtained according to the first operation and the second operation, creating a floating window for displaying the picture object according to the operation information, and arranging a scroll bar key or a closing key on the floating window.
Based on the description of S104 in the first embodiment, when the terminal creates the floating window for displaying the picture object according to the operation information, the terminal may further set function keys on the floating window to implement other functions, such as a scroll bar key or a closing key. Specifically, viewing of the displayed content is enabled by adding a scroll bar parameter in the floating window settings, and closing of the floating window is realized by adding a closing parameter in the floating window settings.
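The window settings described above can be sketched as a small configuration record in which the scroll bar and closing parameters are optional flags. All field names here are illustrative assumptions; the patent only states that such parameters may be added to the floating window settings.

```python
from dataclasses import dataclass


@dataclass
class FloatingWindowConfig:
    """Hypothetical creation parameters for the floating window (S301)."""
    x: int
    y: int
    width: int
    height: int
    translucent: bool = True       # display manner per the first embodiment
    show_scroll_bar: bool = False  # scroll bar parameter: enables viewing
    show_close_key: bool = False   # closing parameter: enables closing


def create_floating_window(op_info: dict) -> FloatingWindowConfig:
    """Build the window config from operation information derived from the
    first and second operations (target area coordinates assumed given)."""
    return FloatingWindowConfig(
        x=op_info["x"], y=op_info["y"],
        width=op_info["width"], height=op_info["height"],
        show_scroll_bar=True, show_close_key=True,
    )
```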
Further, since some function keys may be set on the established floating window, after the terminal displays the floating window, as shown in fig. 10, based on S301, after S105, the information processing method provided in the embodiment of the present invention further includes: S302-S305. The method comprises the following specific steps:
S302, receiving a third operation of the user, wherein the third operation is used for triggering a scroll bar key, and the scroll bar key is arranged on the floating window.
And S303, responding to the third operation, and realizing the viewing function of the picture object.
After the terminal performs the floating display of the picture object in the floating window, since the floating window is also provided with a scroll bar key, the user can view the picture object through the scroll bar key.
Optionally, the scroll bar key shown in fig. 11 may be a bar that slides up and down; the specific form is not limited. When the terminal receives the third operation of the user dragging the scroll bar, the terminal responds to the third operation to realize scrolling view of the picture object in the floating window.
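The scrolling view can be sketched as clamping the drag offset so the visible region never scrolls past the picture object's content. This is a minimal illustrative sketch, assuming pixel units; the patent does not specify the scrolling arithmetic.

```python
def clamp_scroll(offset: int, content_height: int, window_height: int) -> int:
    """Clamp a requested scroll offset so that the floating window's
    viewport stays within the picture object's content."""
    max_offset = max(0, content_height - window_height)
    return min(max(0, offset), max_offset)
```

For a 2000 px tall picture viewed through an 800 px tall floating window, valid offsets run from 0 to 1200; drags beyond either end are clamped to those bounds.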
S304, receiving a fourth operation of the user, wherein the fourth operation is used for triggering a closing key, and the closing key is arranged on the floating window.
And S305, responding to the fourth operation, and realizing the function of closing the display of the picture object.
After the terminal performs the floating display of the picture object in the floating window, since the floating window is also provided with a closing key, the user can stop the display of the picture object through the closing key, i.e., close the floating window.
Optionally, as shown in fig. 12, the closing key may be an "x" key or a "Close" key; when the terminal receives the fourth operation of the user clicking the closing key, the terminal responds to the fourth operation to close the floating window.
Further, the floating window in the embodiment of the invention is exited only through the closing key arranged on the floating window; exiting the currently running first application does not affect the display of the preset floating window.
Example Four
As shown in fig. 13, an embodiment of the present invention provides a terminal 1, where the terminal 1 may include:
the receiving unit 10 is configured to receive a first operation of a user in a current application interface, where the first operation is used to trigger display of a first selection interface for selecting a picture object.
A display unit 11, configured to display the first selection interface in response to the first operation received by the receiving unit 10.
The receiving unit 10 is further configured to receive, at the first selection interface displayed by the display unit 11, a second operation of the user, where the second operation is used to select the picture object.
An obtaining unit 12, configured to obtain the picture object in response to the second operation received by the receiving unit 10; and acquiring operation information obtained according to the first operation and the second operation.
The establishing unit 17 is configured to create a floating window for displaying the picture object according to the operation information, where the floating window is set in a target area on the current application interface, and the target area is an area outside an input frame and a character entry area that are set on the current application interface.
The display unit 11 is further configured to perform floating display on the picture object acquired by the acquisition unit 12 in a floating window created by the creation unit 17.
Optionally, based on fig. 13, as shown in fig. 14, the terminal 1 further includes: a determination unit 13 and an operation unit 14.
The receiving unit 10 is further configured to receive first gesture information of a user after the display unit 11 performs a floating display on the picture object in a floating window, where the first gesture information is used to operate the floating window according to a first motion trajectory.
The determining unit 13 is configured to determine a first function operating instruction corresponding to the first gesture information according to a preset rule and the first motion trajectory received by the receiving unit 10.
The operation unit 14 is configured to implement a functional operation on the floating window according to the first functional operation instruction determined by the determination unit 13.
Optionally, the determining unit 13 is specifically configured to determine, according to the first motion trajectory, a destination position of the floating window in the current application interface; when the target position is not within the range of the input frame, determining that the first function operation instruction is a moving function instruction, wherein the moving function instruction is used for indicating to move the floating window; and when the destination position is within the range of the input box, determining that the first function operation instruction is a sending instruction, wherein the sending instruction is used for indicating to send the picture object displayed by the floating window.
Optionally, the determining unit 13 is further configured to determine that the mobile function instruction is a first mobile function operation corresponding to the first gesture information when the first function operation instruction is the mobile function instruction.
The operation unit 14 is specifically configured to respond to the first moving function operation determined by the determination unit 13, and implement a moving function on the floating window.
Optionally, as shown in fig. 14, the terminal further includes: a generating unit 15 and a transmitting unit 16.
The generating unit 15 is configured to generate prompt information of a sending operation when the determining unit 13 determines that the first function operation instruction is the sending instruction.
The sending unit 16 is configured to send the picture object displayed in the floating window to the current application interface according to the prompt information generated by the generating unit 15.
Optionally, based on fig. 13, as shown in fig. 15, the terminal 1 further includes: a switching unit 18.
The receiving unit 10 is further configured to receive an application interface switching operation after the picture object is displayed in a floating window in a floating manner, where the application interface switching operation is used to trigger a current application interface to switch among multiple application interfaces of a current application, or to trigger switching with application interfaces of other applications.
The switching unit 18 is configured to perform switching of the current application interface in response to the application interface switching operation received by the receiving unit 10.
The display unit 11 is further configured to adaptively display the floating window at the front end of the target area of the current application interface switched by the switching unit 18.
Optionally, based on fig. 13, as shown in fig. 16, the terminal 1 further includes: an operation unit 14.
The receiving unit 10 is further configured to receive a third operation of the user after the display unit 11 performs the floating display on the picture object in the floating window, where the third operation is used to trigger a scroll bar key, and the floating window is provided with the scroll bar key.
The operation unit 14 is configured to implement a viewing function for the picture object in response to the third operation received by the receiving unit 10.
Optionally, as shown in fig. 16, the terminal 1 further includes: an operation unit 14.
The receiving unit 10 is further configured to receive a fourth operation of the user after the display unit 11 performs the floating display on the picture object in the floating window, where the fourth operation is used to trigger a closing key, and the floating window is provided with the closing key.
The operation unit 14 is configured to implement a function of closing display of the picture object in response to the fourth operation received by the receiving unit 10.
As shown in fig. 17, in practical applications, the obtaining unit 12, the determining unit 13, the operating unit 14, the generating unit 15, the establishing unit 17, and the switching unit 18 may be implemented by a processor 19 located on the terminal, specifically a Central Processing Unit (CPU), a Microprocessor (MPU), a Digital Signal Processor (DSP), a Field Programmable Gate Array (FPGA), or the like. The receiving unit 10 is implemented by a receiver 110, the sending unit 16 by a transmitter 111, and the display unit 11 by a display 112. The terminal further includes a storage medium 113, and the receiver 110, the transmitter 111, and the display 112 may be connected to the processor 19 via a system bus 114. The storage medium 113 is used to store executable program code comprising computer operating instructions; the storage medium 113 may comprise a high-speed RAM memory and may also comprise a non-volatile memory, such as at least one disk memory.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, system, or computer program product. Accordingly, the present invention may take the form of a hardware embodiment, a software embodiment, or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, optical storage, and the like) having computer-usable program code embodied therein.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
The above description is only a preferred embodiment of the present invention, and is not intended to limit the scope of the present invention.

Claims (16)

1. An information processing method characterized by comprising:
receiving a first operation of a user in a current application interface, wherein the first operation is used for triggering and displaying a first selection interface for selecting a picture object;
responding to the first operation to display the first selection interface, and receiving a second operation of the user on the first selection interface, wherein the second operation is used for selecting the picture object;
responding to the second operation, and acquiring the picture object;
acquiring operation information obtained according to the first operation and the second operation, and creating a floating window for displaying the picture object according to the operation information, wherein the floating window is arranged at a target area on the current application interface, the target area is an area outside an input frame and a character input area which are arranged on the current application interface, and the target area is determined by screen coordinates set by the user;
displaying the picture object through the floating window arranged on an upper layer of the current application interface, such that the floating window and the function use of the first application do not affect each other;
when the detected operation corresponding to the first gesture information of the user on the floating window is an input operation or a sending operation, recognizing characters from the picture object of the floating window, and inputting the characters of the picture object to a preset inlet of the current application interface.
2. The method of claim 1, wherein after the displaying the picture object through the floating window on the upper layer of the current application interface that does not affect the functional usage of the first application, the method further comprises:
receiving first gesture information of a user, wherein the first gesture information is used for operating the floating window according to a first motion track;
determining a first function operation instruction corresponding to the first gesture information according to a preset rule and the first motion track;
and realizing the functional operation on the floating window according to the first functional operation instruction.
3. The method according to claim 2, wherein the determining, according to a preset rule and the first motion trajectory, a first function operation instruction corresponding to the first gesture information includes:
determining the target position of the floating window in the current application interface according to the first motion track;
when the destination position is not within the range of the input box, determining that the first function operation instruction is a moving function instruction, wherein the moving function instruction is used for indicating moving operation on the floating window;
when the destination position is within the range of the input box, determining that the first function operation instruction is a sending instruction, wherein the sending instruction is used for indicating to send the picture object displayed by the floating window.
4. The method according to claim 3, wherein the implementing the functional operation on the floating window according to the first functional operation instruction comprises:
when the first function operation instruction is the mobile function instruction, determining that the mobile function instruction is a first mobile function operation corresponding to the first gesture information;
and responding to the first moving function operation to realize the moving function of the floating window.
5. The method according to claim 3, wherein the implementing the functional operation on the floating window according to the first functional operation instruction comprises:
when the first function operation instruction is the sending instruction, generating prompt information of sending operation;
and sending the picture object displayed by the floating window to the current application interface according to the prompt message.
6. The method of claim 1, wherein after the displaying the picture object through the floating window on the upper layer of the current application interface that does not affect the functional usage of the first application, the method further comprises:
receiving application interface switching operation, wherein the application interface switching operation is used for triggering the current application interface to switch among a plurality of application interfaces of the current application or triggering the current application interface to switch with application interfaces of other applications;
responding to the switching operation of the application interface, switching the current application interface, and adaptively displaying the floating window at the front end of the target area of the switched current application interface.
7. The method of claim 1, wherein after the displaying the picture object through the floating window on the upper layer of the current application interface that does not affect the functional usage of the first application, the method further comprises:
receiving a third operation of a user, wherein the third operation is used for triggering a scroll bar key, and the scroll bar key is arranged on the floating window;
and responding to the third operation to realize the viewing function of the picture object.
8. The method of claim 1, wherein after the displaying the picture object through the floating window on the upper layer of the current application interface that does not affect the functional usage of the first application, the method further comprises:
receiving a fourth operation of a user, wherein the fourth operation is used for triggering a closing key, and the closing key is arranged on the floating window;
and responding to the fourth operation to realize the function of closing the display of the picture object.
9. A terminal, comprising:
the receiving unit is used for receiving a first operation of a user in a current application interface, wherein the first operation is used for triggering and displaying a first selection interface for selecting a picture object;
a display unit, configured to display the first selection interface in response to the first operation received by the receiving unit;
the receiving unit is further configured to receive a second operation of the user at the first selection interface displayed by the display unit, where the second operation is used to select the picture object;
an obtaining unit, configured to obtain the picture object in response to the second operation received by the receiving unit, and to obtain operation information derived from the first operation and the second operation;
an establishing unit, configured to establish a floating window for displaying the picture object according to the operation information, wherein the floating window is arranged at a target area of the current application interface, the target area is an area outside an input box and a character input area provided on the current application interface, and the target area is determined by screen coordinates set by the user;
the display unit is further configured to display the picture object through the floating window, the floating window being set on an upper layer of the current application interface such that it does not affect the functional usage of the first application;
and an operation unit, configured to recognize characters from the picture object of the floating window and input the recognized characters to a preset entry of the current application interface when a detected operation, corresponding to the user's first gesture information on the floating window, is an input operation or a sending operation.
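Claim 9's establishing unit places the floating window only in the area outside the input box and the character input area, at screen coordinates set by the user. A hedged Python sketch of that placement check (the function name, rectangle layout, and fallback position are assumptions for illustration):

```python
def _contains(rect, x, y):
    # rect is a (left, top, right, bottom) screen rectangle.
    left, top, right, bottom = rect
    return left <= x <= right and top <= y <= bottom

def place_floating_window(user_xy, input_box, char_input_area, fallback_xy):
    """Accept the user-set coordinates only when they fall outside both the
    input box and the character input area; otherwise fall back to a safe
    default position in the permitted target area."""
    x, y = user_xy
    if _contains(input_box, x, y) or _contains(char_input_area, x, y):
        return fallback_xy
    return user_xy

# Hypothetical layout: input box along the bottom edge, keyboard above it.
pos = place_floating_window((100, 200),
                            input_box=(0, 900, 720, 1000),
                            char_input_area=(0, 600, 720, 900),
                            fallback_xy=(10, 10))
```

The actual terminal would derive these rectangles from the current application interface's view hierarchy rather than hard-coded values.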
10. The terminal of claim 9, wherein the terminal further comprises: a determination unit and an operation unit;
the receiving unit is further configured to receive first gesture information of a user after the display unit performs floating display on the picture object in a floating window, where the first gesture information is used to operate the floating window according to a first motion trajectory;
the determining unit is configured to determine a first function operating instruction corresponding to the first gesture information according to a preset rule and the first motion trajectory received by the receiving unit;
the operation unit is configured to implement a functional operation on the floating window according to the first functional operation instruction determined by the determination unit.
11. The terminal of claim 10,
the determining unit is specifically configured to determine a destination position of the floating window in the current application interface according to the first motion trajectory; when the destination position is not within the range of the input box, determine that the first function operation instruction is a moving function instruction, wherein the moving function instruction is used for indicating to move the floating window; and when the destination position is within the range of the input box, determine that the first function operation instruction is a sending instruction, wherein the sending instruction is used for indicating to send the picture object displayed by the floating window.
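The move-versus-send decision in claim 11 reduces to a point-in-rectangle test on the trajectory's end point. A small Python sketch under that reading (the instruction strings and rectangle values are illustrative assumptions):

```python
def interpret_first_gesture(trajectory, input_box):
    """Map the first motion trajectory to a function operation instruction:
    a destination position inside the input box yields a sending instruction,
    any other destination yields a moving function instruction."""
    x, y = trajectory[-1]                     # destination position of the drag
    left, top, right, bottom = input_box
    inside = left <= x <= right and top <= y <= bottom
    return "send" if inside else "move"

box = (0, 900, 720, 1000)                     # hypothetical input-box rectangle
drag_to_box = [(100, 100), (300, 700), (360, 950)]   # ends inside the box
drag_elsewhere = [(100, 100), (400, 400)]            # ends outside the box
```

Only the final point matters for the instruction; the intermediate points would drive the window's on-screen movement while the drag is in progress.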
12. The terminal of claim 11,
the determining unit is further configured to determine a first moving function operation corresponding to the first gesture information when the first function operation instruction is the moving function instruction;
the operation unit is specifically configured to respond to the first moving function operation determined by the determination unit, and implement a moving function on the floating window.
13. The terminal of claim 11, wherein the terminal further comprises: a generating unit and a transmitting unit;
the generating unit is used for generating prompt information of the sending operation when the determining unit determines that the first function operation instruction is the sending instruction;
and the sending unit is used for sending the picture object displayed by the floating window to the current application interface according to the prompt information generated by the generating unit.
14. The terminal of claim 9, wherein the terminal further comprises: a switching unit;
the receiving unit is further configured to receive an application interface switching operation after the picture object is displayed through the floating window, where the application interface switching operation is used to trigger the current application interface to switch among multiple application interfaces of the current application, or to switch with application interfaces of other applications;
the switching unit is used for responding to the application interface switching operation received by the receiving unit and switching the current application interface;
the display unit is further configured to adaptively display the floating window at the front end of the target area of the current application interface switched by the switching unit.
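Claim 14's switching behavior keeps the floating window alive across interface switches and re-draws it at the front end of the same target area of the newly current interface. A framework-free Python sketch of that state transition (all names are illustrative, not from the patent):

```python
class Terminal:
    """Model of claim 14: switching the current application interface does
    not dismiss the floating window; the window is re-asserted on top of
    the switched interface at its existing target-area position."""

    def __init__(self, target_area_xy):
        self.current_interface = "app_a_chat"
        self.floating_window = {"pos": target_area_xy, "on_top": True}

    def switch_interface(self, new_interface):
        self.current_interface = new_interface
        # Re-assert the window's z-order above the switched interface so it
        # remains adaptively displayed at the front end of the target area.
        self.floating_window["on_top"] = True
        return self.floating_window

t = Terminal((100, 200))
t.switch_interface("app_b_chat")
```

On Android this would roughly correspond to an overlay window that outlives individual activities, but the patent text does not commit to any platform API.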
15. The terminal of claim 9, wherein the terminal further comprises: an operation unit;
the receiving unit is further configured to receive a third operation of the user after the display unit performs the floating display on the picture object in the floating window, where the third operation is used to trigger a scroll bar key, and the floating window is provided with the scroll bar key;
and the operation unit is used for responding to the third operation received by the receiving unit and realizing the viewing function of the picture object.
16. The terminal of claim 9, wherein the terminal further comprises: an operation unit;
the receiving unit is further configured to receive a fourth operation of the user after the display unit performs the floating display on the picture object in the floating window, where the fourth operation is used to trigger a closing key, and the floating window is provided with the closing key;
the operation unit is configured to respond to the fourth operation received by the receiving unit, and implement a function of closing display of the picture object.
CN201611138111.4A 2016-12-12 2016-12-12 Information processing method and terminal Active CN108228020B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202110771359.9A CN113434065B (en) 2016-12-12 2016-12-12 Information processing method and terminal
CN201611138111.4A CN108228020B (en) 2016-12-12 2016-12-12 Information processing method and terminal

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201611138111.4A CN108228020B (en) 2016-12-12 2016-12-12 Information processing method and terminal

Related Child Applications (1)

Application Number Title Priority Date Filing Date
CN202110771359.9A Division CN113434065B (en) 2016-12-12 2016-12-12 Information processing method and terminal

Publications (2)

Publication Number Publication Date
CN108228020A CN108228020A (en) 2018-06-29
CN108228020B true CN108228020B (en) 2021-09-07

Family

ID=62637856

Family Applications (2)

Application Number Title Priority Date Filing Date
CN201611138111.4A Active CN108228020B (en) 2016-12-12 2016-12-12 Information processing method and terminal
CN202110771359.9A Active CN113434065B (en) 2016-12-12 2016-12-12 Information processing method and terminal


Country Status (1)

Country Link
CN (2) CN108228020B (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109521923B (en) * 2018-11-21 2020-10-27 北京小米智能科技有限公司 Floating window control method and device and storage medium
CN109933767B (en) * 2019-03-15 2023-11-03 Oppo(重庆)智能科技有限公司 Picture editing method, electronic device and device with storage function
CN110377378B (en) * 2019-06-18 2022-11-25 平安科技(深圳)有限公司 Picture suspension display method, device, terminal and storage medium
CN110263191B (en) * 2019-06-24 2022-02-22 广州市托奥智能科技有限公司 Stacking display method and system for multimedia resources
CN114281225A (en) * 2020-09-28 2022-04-05 华为技术有限公司 Window display method and device
CN111601012B (en) * 2020-05-28 2022-10-25 维沃移动通信(杭州)有限公司 Image processing method and device and electronic equipment

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103412711A (en) * 2013-08-27 2013-11-27 宇龙计算机通信科技(深圳)有限公司 Document comparison reference method and device
WO2014071624A1 (en) * 2012-11-12 2014-05-15 东莞宇龙通信科技有限公司 Terminal and application program interaction method
CN105320693A (en) * 2014-07-31 2016-02-10 腾讯科技(深圳)有限公司 Information querying method and terminal
CN105430168A (en) * 2015-10-30 2016-03-23 努比亚技术有限公司 Mobile terminal and file sharing method

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5623681A (en) * 1993-11-19 1997-04-22 Waverley Holdings, Inc. Method and apparatus for synchronizing, displaying and manipulating text and image documents
US10740117B2 (en) * 2010-10-19 2020-08-11 Apple Inc. Grouping windows into clusters in one or more workspaces in a user interface
US20140115446A1 (en) * 2012-10-22 2014-04-24 Apple Inc. Content Control Tools for a Document Authoring Application
CN104123078B (en) * 2014-08-12 2017-05-31 广州三星通信技术研究有限公司 The method and apparatus of input information
CN104166458A (en) * 2014-08-12 2014-11-26 广州华多网络科技有限公司 Method and device for controlling multimedia player
CN105468612A (en) * 2014-09-01 2016-04-06 深圳富泰宏精密工业有限公司 Auxiliary browsing system and method
CN104317508A (en) * 2014-09-28 2015-01-28 宇龙计算机通信科技(深圳)有限公司 Method, system and terminal for information sharing
US10528207B2 (en) * 2015-01-12 2020-01-07 Facebook, Inc. Content-based interactive elements on online social networks


Also Published As

Publication number Publication date
CN113434065B (en) 2022-09-30
CN108228020A (en) 2018-06-29
CN113434065A (en) 2021-09-24


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant