CN111813285B - Floating window management method and device, electronic equipment and readable storage medium - Google Patents


Info

Publication number
CN111813285B
CN111813285B (application number CN202010582023.3A)
Authority
CN
China
Prior art keywords
target
floating window
user
target area
objects
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010582023.3A
Other languages
Chinese (zh)
Other versions
CN111813285A (en)
Inventor
王文杰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanjing Weiwo Software Technology Co ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd filed Critical Vivo Mobile Communication Co Ltd
Priority to CN202010582023.3A priority Critical patent/CN111813285B/en
Publication of CN111813285A publication Critical patent/CN111813285A/en
Application granted granted Critical
Publication of CN111813285B publication Critical patent/CN111813285B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/04817 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
    • G06F 3/0482 Interaction with lists of selectable items, e.g. menus
    • G06F 3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04842 Selection of displayed objects or displayed text elements
    • G06F 3/04847 Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • G06F 3/0485 Scrolling or panning
    • G06F 9/00 Arrangements for program control, e.g. control units
    • G06F 9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44 Arrangements for executing specific programs
    • G06F 9/451 Execution arrangements for user interfaces

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application discloses a floating window management method and device, an electronic device, and a readable storage medium, belonging to the technical field of communication. The electronic device receives a first input from a user on its display screen, determines a target area in the display screen in response to the first input, acquires object information of the objects displayed in the target area, and displays a floating window corresponding to the target area according to the object information. The menu items in the floating window are associated with the objects in the area the user is focusing on, and they change as those objects change, so the user can conveniently perform operations related to the objects of interest, improving the flexibility of user operation.

Description

Floating window management method and device, electronic equipment and readable storage medium
Technical Field
The application belongs to the technical field of communication, and particularly relates to a floating window management method and device, an electronic device, and a readable storage medium.
Background
At present, most electronic devices provide a floating window through which a certain number of menu items can be displayed on the display screen, making the device easier to operate. While using the electronic device, the user can operate the menu items in the floating window and thereby operate the device flexibly.
In the process of implementing the present application, the inventor found at least the following problem in the prior art: different areas of the display screen show different information, so the area the user focuses on varies, as do the operations the user needs to perform. At present, however, wherever the floating window stays on the display screen, the menu items it displays are fixed. Fixed menu items can only provide their own corresponding function operations, which makes it inconvenient for the user to perform any other function operation.
Content of application
An object of the embodiments of the present application is to provide a floating window management method and device, an electronic device, and a readable storage medium, so as to solve the problem that, because the menu items displayed in a floating window are fixed, it is inconvenient for the user to perform function operations other than those offered by the fixed menu items.
In order to solve the technical problem, the present application is implemented as follows:
in a first aspect, an embodiment of the present application provides a floating window management method, where the method includes:
receiving a first input of a user to a display screen of an electronic device;
determining a target area in a display screen in response to the first input;
acquiring object information of an object displayed in the target area;
displaying a floating window corresponding to the target area according to the object information; wherein the floating window includes a menu item corresponding to the object.
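The four steps listed above can be sketched as a minimal, illustrative pipeline; all function and field names here are hypothetical and not part of the claims:

```python
def determine_target_area(trajectory):
    """Hypothetical step 2: accept the trajectory only if it is closed."""
    if trajectory and trajectory[0] == trajectory[-1]:
        return {"bounds": trajectory}
    return None  # e.g. the sliding trajectory was not closed

def acquire_object_info(screen_objects, target_area):
    """Hypothetical step 3: collect info for every object inside the area."""
    return [obj["label"] for obj in screen_objects if obj["in_area"]]

def build_floating_window(object_info):
    """Hypothetical step 4: one menu item per recognized object."""
    return {"menu_items": [f"open {label}" for label in object_info]}

def manage_floating_window(trajectory, screen_objects):
    """Steps 1-4 chained: first input -> target area -> object info -> window."""
    area = determine_target_area(trajectory)
    if area is None:
        return None  # mis-operation: no target area, no floating window
    return build_floating_window(acquire_object_info(screen_objects, area))
```

The early return when no target area is determined mirrors the method's goal of not reacting to a mis-operation.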
In a second aspect, an embodiment of the present application provides a floating window management device, including:
the receiving module is used for receiving a first input of a user to a display screen of an electronic device;
a response module for determining a target area in a display screen in response to the first input;
an acquisition module configured to acquire object information of an object displayed in the target area;
the display module is used for displaying the floating window corresponding to the target area according to the object information; wherein the floating window includes a menu item corresponding to the object.
In a third aspect, an embodiment of the present application provides an electronic device, which includes a processor, a memory, and a program or instructions stored on the memory and executable on the processor, where the program or instructions, when executed by the processor, implement the steps of the method according to the first aspect.
In a fourth aspect, embodiments of the present application provide a readable storage medium, on which a program or instructions are stored, which when executed by a processor implement the steps of the method according to the first aspect.
In a fifth aspect, an embodiment of the present application provides a chip, where the chip includes a processor and a communication interface, where the communication interface is coupled to the processor, and the processor is configured to execute a program or instructions to implement the method according to the first aspect.
In the embodiments of the present application, the electronic device receives a first input from a user on its display screen, determines a target area in the display screen in response to the first input, acquires object information of the objects displayed in the target area, and displays a floating window corresponding to the target area according to the object information, where the floating window includes menu items corresponding to the objects. The menu items in the floating window are associated with the objects in the area the user is focusing on, and they change as those objects change, so the user can conveniently perform operations related to the objects of interest, and the flexibility of user operation is improved.
Drawings
Fig. 1 is a flowchart illustrating steps of a floating window management method according to an embodiment of the present application;
FIG. 2 is a schematic diagram of a desktop of a display screen provided in an embodiment of the present application;
FIG. 3 is a flowchart illustrating steps of another floating window management method according to an embodiment of the present disclosure;
FIG. 4 is a schematic diagram of an application interface provided by an embodiment of the present application;
FIG. 5 is a schematic diagram of another application interface provided by an embodiment of the present application;
FIG. 6 is a schematic diagram of yet another application interface provided by an embodiment of the present application;
FIG. 7 is a schematic diagram of yet another application interface provided by an embodiment of the present application;
fig. 8 is a block diagram illustrating a floating window management apparatus according to an embodiment of the present disclosure;
fig. 9 is a block diagram illustrating a structure of another floating window management apparatus according to an embodiment of the present disclosure;
fig. 10 is a block diagram of an electronic device according to an embodiment of the present disclosure;
fig. 11 is a schematic diagram of a hardware structure of an electronic device implementing an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some, but not all, embodiments of the present application. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The terms "first", "second", and the like in the description and in the claims of the present application are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used are interchangeable under appropriate circumstances, such that the embodiments of the application are capable of operating in sequences other than those illustrated or described herein. In addition, "and/or" in the specification and claims means at least one of the connected objects, and the character "/" generally indicates an "or" relationship between the preceding and succeeding objects.
The floating window management method provided by the embodiment of the present application is described in detail below with reference to the accompanying drawings through specific embodiments and application scenarios thereof.
Referring to fig. 1, fig. 1 is a flowchart illustrating steps of a floating window management method provided in an embodiment of the present application, where the method is applied to an electronic device, and may include:
step 101, receiving a first input of a user to a display screen of an electronic device.
The electronic device may be, for example, any device with a display screen, such as a mobile phone, a tablet computer, or a wearable device.
In this embodiment, the electronic device may receive a first input performed by the user on the display screen. For example, the first input may be a sliding operation on the display screen in a clockwise or counterclockwise direction. As shown in fig. 2, a desktop schematic view of a display screen provided in an embodiment of the present application, the user may slide clockwise from position A to position B on the display screen; the electronic device receives the sliding operation and determines the operation trajectory corresponding to it. The specific form of the first input may be set as needed.
Step 102, in response to a first input, determining a target area in a display screen.
In this embodiment, the electronic device may determine the target area in the display screen in response to the first input.
In conjunction with step 101, in response to the user's sliding operation starting from position A on the display screen and sliding clockwise to position B, the electronic device may determine the target area 201 corresponding to the sliding operation, i.e. the circular area, according to the trajectory of the sliding operation. The trajectory may be a regular or irregular circle or ellipse, or a polygon such as a triangle, quadrangle, or pentagon, which is not limited in this embodiment.
In this embodiment, the electronic device may determine the target area corresponding to the sliding operation when its trajectory is closed, for example when position A and position B coincide. Alternatively, the electronic device may determine the target area when the distance between position A and position B is less than or equal to a preset value (e.g., 2 mm), or when the trajectory of the sliding operation forms a preset shape (e.g., a rectangle). Determining the target area only when the trajectory is closed or forms a preset shape prevents a target area from being created by a mis-operation of the user, reducing the power consumption of the electronic device.
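The closure conditions just described (coinciding endpoints, or endpoints within a preset distance such as 2 mm) can be illustrated with a small sketch; the pixel threshold and the function name are assumptions for illustration only:

```python
import math

def trajectory_is_closed(points, threshold_px=20):
    """Hypothetical check: a sliding trajectory counts as closed when its
    start point A and end point B coincide, or lie within `threshold_px`
    pixels of each other (the pixel analogue of the 2 mm example).
    `points` is a list of (x, y) touch coordinates."""
    if len(points) < 3:
        return False  # too short to enclose any area
    a, b = points[0], points[-1]
    return math.dist(a, b) <= threshold_px
```

A closed loop (endpoints 5 px apart) would pass the check, while an open straight stroke would not, so no target area would be determined for it.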
And 103, acquiring object information of the object displayed in the target area.
In this embodiment, the objects are items such as images, Application (APP) icons, controls, and text displayed in the target area. After determining the target area in the display screen, the electronic device may identify the objects in the target area to determine their object information.
For example, after determining a target area in the display screen, the electronic device may perform a screen capture operation to obtain an image corresponding to the target area, and then input the image into a pre-trained image recognition model to obtain the object information it outputs. As shown in fig. 2, the target region 201 includes a first icon 202 of a dialing APP and a second icon 203 of a friend-making APP (the objects in the target region). After obtaining the image corresponding to the target region 201, the electronic device may recognize it through the image recognition model to obtain the output object information, such as "dialing APP icon" and "friend-making APP icon". For the process of acquiring the image corresponding to the target area and the training and use of the image recognition model, reference may be made to the prior art, which is not described in detail in this embodiment.
It should be noted that the above is merely an illustrative example; the process of acquiring object information may include, but is not limited to, the method using an image recognition model.
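As one hedged illustration of step 103, a screenshot-then-recognize flow might look as follows, with `recognize` standing in for the pre-trained image recognition model, whose internals are outside the scope of this sketch:

```python
def crop(screenshot, bbox):
    """Crop the part of the screenshot covered by the target area.
    `screenshot` is a 2-D list of pixel values; `bbox` is the bounding
    box of the target area as (top, left, bottom, right)."""
    top, left, bottom, right = bbox
    return [row[left:right] for row in screenshot[top:bottom]]

def acquire_object_info(screenshot, bbox, recognize):
    """Step 103: capture the image of the target area and hand it to a
    recognition model. `recognize` is a placeholder callable for the
    pre-trained image recognition model; it returns object-info labels."""
    return recognize(crop(screenshot, bbox))
```

In practice `recognize` would be a trained model; here any callable that maps an image to labels such as "dialing APP icon" fits the interface.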
And 104, displaying the floating window corresponding to the target area according to the object information.
Wherein the floating window includes menu items corresponding to the object.
In this embodiment, after obtaining the object information of the object in the target area, the electronic device may display a floating window according to the object information, where the floating window includes a menu item corresponding to the object in the target area.
For example, as shown in fig. 2, after determining through the image recognition model that the object information of the first icon 202 of the dialing APP is "dialing APP icon", the electronic device may infer from this object information that the user may need to open the dialing APP. If the dialing APP includes a dialing interface and an address-list interface, the electronic device may identify the dialing APP among all applications installed on the device according to the object information "dialing APP icon", determine the dialing interface within the dialing APP, generate a menu item 301 corresponding to it (the menu item is, for example, a button), and establish an invocation relationship between the menu item 301 and the dialing interface. When the user clicks the menu item 301, the electronic device opens the dialing interface in response. Similarly, a menu item 302 corresponding to the friend-making APP can be generated, which is used to open a chat interface in the friend-making APP.
After generating the menu items, the electronic device can display a floating window containing them. As shown in fig. 2, after generating menu items 301 and 302, the electronic device may display a floating window 300 including menu items 301 and 302 at a position near the target area 201. The user can tap menu item 301 in the floating window 300 to quickly access the dialing interface, and menu item 302 to quickly access the chat interface. The floating window may be displayed near the target area or directly over it; its display position can be set as required, which is not limited in this embodiment.
It should be noted that one or more objects may exist in the target area, and the electronic device may generate one or more menu items for each object, or may generate one or more menu items for only some of the objects.
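The mapping from recognized object information to menu items in step 104 can be sketched as a simple lookup table; the rule entries and labels below are illustrative assumptions, not part of the claimed method:

```python
# Hypothetical rules: object information -> (menu label, bound interface).
MENU_RULES = {
    "dialing APP icon": [("Dial", "dialing_interface")],
    "friend-making APP icon": [("Chat", "chat_interface")],
}

def build_menu_items(object_info_list):
    """One or more menu items per recognized object; objects without a
    matching rule contribute no item (some objects may yield none)."""
    items = []
    for info in object_info_list:
        items.extend(MENU_RULES.get(info, []))
    return items
```

This also shows the note above in action: an object such as a wallpaper image, for which no rule exists, simply produces no menu item.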
In summary, in this embodiment, the electronic device receives a first input from the user on its display screen, determines a target area in the display screen in response, acquires object information of the objects displayed in the target area, and displays a floating window corresponding to the target area according to that information. The menu items in the floating window are associated with the objects in the area the user is focusing on, and they change as those objects change, so the user can conveniently perform operations related to those objects, improving the flexibility of user operation.
Referring to fig. 3, fig. 3 is a flowchart illustrating steps of another floating window management method provided in an embodiment of the present application, where the method is applied to an electronic device, and may include:
step 301, receiving a first input of a user to an application program interface displayed on a display screen.
Step 302, in response to a first input, determining a target area in a display screen.
The application program interface is, for example, a dialing interface of a dialing APP or a chatting interface of a friend making APP.
In this embodiment, the first input acts on the application program interface. The electronic device receives the user's first input on the application program interface and, in response, determines a target area in the display screen, that is, a target area within the application program interface. Different areas of the application program interface contain different objects, so after receiving a first input on different areas, the electronic device obtains different object information in response and then generates the corresponding menu items.
Alternatively, there may be multiple target areas; that is, the electronic device may determine a plurality of target areas in response to the user's first input.
For example, as shown in fig. 4, a schematic diagram of an application interface provided in an embodiment of the present application, and in connection with step 101, the first input may be a sliding operation performed several times in succession within the application interface. The electronic device may determine, in response to each of these sliding operations, the corresponding target area: the first target area 401, the second target area 402, and the third target area 403 shown in fig. 4. In practice, the electronic device may determine all target areas at once after the user finishes the multiple sliding operations, or determine each target area as its sliding operation is performed.
Optionally, after determining the target area in the display screen, the electronic device may display a contour line corresponding to the target area, so as to facilitate a user to identify the target area. As shown in fig. 2, the electronic device may determine a trajectory of the sliding operation in response to the sliding operation by the user, and display a contour line 2011 corresponding to the trajectory of the sliding operation, so as to facilitate the user to identify the target area 201 according to the contour line 2011. The contour line may be, for example, a thick solid line or a thin solid line, and the embodiment does not limit the specific form of the contour line.
Step 303, object information of the object displayed in the target area is acquired.
In this embodiment, when the target area is multiple, the electronic device may respectively obtain object information of an object in each target area.
For example, as shown in fig. 4, after determining the first target area 401, the second target area 402, and the third target area 403 in response to the first input, the electronic device may perform a screen capture operation on each of them to obtain their corresponding images, and then input these images in turn into the image recognition model to obtain the object information of the objects in each of the three target areas.
And step 304, determining the target operation of the user according to the object information, and determining a target application program related to the target operation from the electronic equipment.
The target operation is a user operation that the user is likely to perform: for example, clicking a control in an application program interface to open an interface of the application; clicking an application icon on the desktop of the electronic device to open an application; or entering a keyword in an application program interface to perform a retrieval operation. The electronic device may predict the user's operation according to the object information to obtain the target operation corresponding to the objects in the target area.
It should be noted that, when there are multiple target areas, the electronic device may determine, according to the object information of the objects in each target area, the target operation corresponding to that area. As shown in fig. 4, the electronic device may determine the target operation corresponding to the first target area 401 from the object information of the objects in it, and likewise for the second target area 402 and the third target area 403. The electronic device may determine the target operations for all target areas at the same time, or area by area, which is not limited in this embodiment.
In this embodiment, the electronic device may determine the target operation and the target application program related to the target operation directly according to the object information, or may determine the target application program related to the target operation from the electronic device according to the target operation after determining the target operation.
For example, as shown in fig. 4, the application program interface currently displayed on the display screen is an interface of a shopping APP. The objects included in the first target area 401 are a search-box control 4011 and a keyword 4012 in that interface. After determining the first target region 401, the electronic device may determine, through the image recognition model, that the object information of the object (the search box) in the first target region 401 is "search box". From the object information "search box", the electronic device may determine that the target operation the user is likely to perform is a search-related operation, such as opening the search history of the current application or pasting the clipboard contents into the search box.
As another example, the object included in the second target area 402 is a takeaway APP icon 4021. After determining the second target region 402, the electronic device may determine through the image recognition model that its object information is "takeaway APP icon". From this object information, the electronic device may determine, through a preset recommendation algorithm or recommendation model, that the target operation the user is likely to perform is opening the takeaway APP and placing an order. Meanwhile, according to the takeaway APP icon in the second target area 402, it can determine that the target application program corresponding to the target operation is the takeaway APP installed on the electronic device.
It should be noted that, for ease of understanding, the object information in this embodiment is expressed in the same terms as the object itself; in practical applications, the object information may also be other types of information representing characteristics of the object, which is not limited in this embodiment.
Optionally, step 304 may be implemented as follows:
determining a target operation according to the object information and at least one of the type of the application program corresponding to the application program interface and a user portrait of the user, and determining a target application program related to the target operation from the electronic device.
The type of an application program may be, for example, determined by classifying the application programs installed in the electronic device in advance according to their functions. For example, the type of APP1 is "friend-making APP", the type of APP2 is "shopping APP", and the type of APP3 is "image processing APP". The electronic device may determine and store the type information of each installed application program in advance. In practical applications, the application programs may also be classified by other characteristics, which is not limited in this embodiment.
The user portrait is data that describes various characteristics, preferences, and habits of the user. For example, a user portrait may include data describing the user's age, gender, income level, hobbies, job category, fitness habits, and the like. The electronic device may obtain the user portrait from a server in advance and store it; any method for obtaining a user portrait known in the art, or appearing in the future, may be applied to the present application, and the specific manner of obtaining the user portrait is not limited herein.
For example, the electronic device may determine the target operation according to the object information and the application program type. As shown in fig. 4, the objects included in the third target area 403 are a picture of a cup 4031 and its price. After determining the third target area 403, the electronic device may determine, through the image recognition model, that the object information of the object (the cup) in the third target area 403 is "cup" and "price". Meanwhile, the electronic device can determine from pre-stored type information that the current application program is a shopping APP. After obtaining the object information "cup" and "price" and the application program type "shopping APP", the electronic device may determine, through a preset recommendation algorithm or recommendation model, that the target operation the user may perform is searching for the cup in a shopping APP. At this time, according to the target operation "search for the cup in a shopping APP", it can be determined whether another shopping APPX (the target application program) is installed in the electronic device, and the installed shopping APPX is taken as the target application program.
For example, the electronic device may determine the target operation according to the object information and the user portrait of the user. As shown in fig. 5, which is a schematic view of another application program interface provided in an embodiment of the present application, the application program interface currently displayed on the display screen of the electronic device is an interface in an image processing APP (e.g., an album). The electronic device determines a target area 501 in response to a first input by the user; the object included in the target area 501 is the lipstick 5012 on the user's face. After determining the target area 501, the electronic device may determine, through the image recognition model, that the object information of the object (the lipstick 5012) in the target area 501 is "lipstick". Meanwhile, the electronic device may determine that the pre-acquired user portrait is "female". After determining the object information "lipstick" and the user portrait "female", the electronic device may determine, through a preset recommendation algorithm or recommendation model, that the target operations the user may perform are searching for lipstick in a shopping APP and opening a sharing interface in a friend-making APP to share the picture. At this time, a target application program (the shopping APP) corresponding to searching for lipstick and a target application program (the friend-making APP) corresponding to sharing the picture may be determined from the application programs installed in the electronic device.
For example, the electronic device may determine the target operation according to the object information, the application program type, and the user portrait of the user. As shown in fig. 6, which is a schematic view of another application program interface provided in an embodiment of the present application, the application program interface currently displayed on the display screen of the electronic device is an interface in a friend-making APP. The electronic device determines a target area 601 in response to a first input of the user; the object included in the target area 601 is an emoticon 6012. After determining the target area 601, the electronic device may determine, through the image recognition model, that the object information of the object (the emoticon 6012) in the target area 601 is "emoticon". Meanwhile, the electronic device may determine that the pre-acquired user portrait is "optimistic" and "expressive", and that the type information of the application program is "friend-making APP". After determining the object information "emoticon", the user portrait "optimistic" and "expressive", and the application program type "friend-making APP", the electronic device may determine, through a preset recommendation algorithm or recommendation model, that the target operation the user may perform is clicking the emoticon to add it to favorites, or selecting and sending the emoticon in the sending window of the chat interface of the friend-making APP. After the target operation is determined to be sending the emoticon, the friend-making APP can be determined from the application programs installed in the electronic device and taken as the target application program related to sending the emoticon.
It should be noted that the above is only an exemplary example; the method for determining the target operation according to the object information is not limited to a recommendation model or recommendation algorithm, and any such method known in the art, or appearing in the future, may be applied to the present application, which does not limit the specific way of determining the target operation. The recommendation algorithm may be any one of a content-based recommendation algorithm (Content-based Recommendation), a collaborative filtering recommendation algorithm (Collaborative Filtering), and an association-rule-based recommendation algorithm (Association Rule-based Recommendation). For the training and use of the recommendation model, reference may be made to the prior art, which is not described in detail herein.
In practical applications, predicting the target operation according to the object information together with at least one of the application program type and the user portrait of the user yields a target operation that better matches the user, so that more accurate menu items are displayed and the accuracy of the electronic device is improved.
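The combination of signals described above can be illustrated with a minimal rule-based sketch. The function name, the rule table, and the returned (operation, target application) pairs below are illustrative assumptions mirroring the examples in figs. 4-6; they are not the patent's actual recommendation algorithm or model.

```python
# Hypothetical sketch: predict candidate target operations from object
# information plus optional app-type and user-portrait signals.
# Rule table and names are illustrative, not the patent's implementation.

def predict_target_operations(object_info, app_type=None, user_portrait=None):
    """Return a list of (target_operation, target_application) pairs."""
    suggestions = []
    if object_info == "search box":
        # search-related operations in the current application
        suggestions.append(("open search history", "current app"))
        suggestions.append(("paste from clipboard", "current app"))
    elif object_info == "takeaway APP icon":
        suggestions.append(("open takeaway APP and order", "takeaway APP"))
    elif object_info in ("cup", "price") and app_type == "shopping APP":
        # object info combined with the application program type
        suggestions.append(("search for cup", "another shopping APP"))
    elif object_info == "lipstick" and user_portrait == "female":
        # object info combined with the user portrait
        suggestions.append(("search for lipstick", "shopping APP"))
        suggestions.append(("share picture", "friend-making APP"))
    return suggestions
```

A real implementation would replace the rule table with a trained recommendation model; the sketch only shows how the three signal types can jointly narrow the predicted operation.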
And 305, displaying the floating window corresponding to the target area according to the target operation and the target application program.
In this embodiment, after the target operation and the target application related to the target operation are determined, a menu item corresponding to the target operation may be generated according to the target operation and the target application, so as to display the floating window corresponding to the target area.
For example, if it is determined that the target operation is opening the search history in the currently running application program (the target application program) and using the clipboard, the electronic device may determine the component corresponding to the search history and the component corresponding to the clipboard from the currently running application program. It may then generate the menu item 4013 shown in fig. 4 according to the keyword "search history" in the target operation and establish a call relationship between the menu item 4013 and the component corresponding to the search history, and generate the menu item 4014 shown in fig. 4 according to the keyword "clipboard" in the target operation and establish a call relationship between the menu item 4014 and the component corresponding to the clipboard. The electronic device can open the search history in response to the user clicking the menu item 4013, and open the clipboard in response to the user clicking the menu item 4014.
With reference to step 303, if it is determined that the target operation is opening a takeaway APP and placing an order to purchase takeaway, and the electronic device determines that a takeaway APPW (the target application program) is already installed, the menu item 4022 shown in fig. 4 may be generated according to the keyword "open a takeaway APP" in the target operation and the name "APPW" of the target application program, and a call relationship between the menu item 4022 and the takeaway APPW is established. The electronic device may open the takeaway APPW in response to the user clicking the menu item 4022. Conversely, if the takeaway APPZ is not installed in the electronic device, the electronic device may determine that the target application program related to the target operation is a download APP providing an APP download function; it may generate the menu item 4023 shown in fig. 4 according to the keyword "download" and the takeaway APPZ corresponding to the target operation, establish a call relationship between the menu item 4023 and the download APP, and, in response to the user clicking the menu item 4023, open the download APP and download the takeaway APPZ.
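The installed-or-download branching above can be sketched as a small binding function. The function name and the returned (label, call target) pair are illustrative assumptions, not the patent's actual menu-item mechanism.

```python
# Hypothetical sketch: bind a menu item either to an installed target
# application, or to a download APP when the target app is absent.

def bind_menu_item(app_name, installed_apps):
    """Return (menu label, call target) for the given target app."""
    if app_name in installed_apps:
        # call relationship goes directly to the installed app
        return ("Open " + app_name, app_name)
    # fall back: route the call relationship through a download APP
    return ("Download " + app_name, "download APP")
```

For the fig. 4 example, an installed takeaway APPW yields an "Open" item calling APPW, while a missing APPZ yields a "Download" item calling the download APP.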
With reference to step 303, if the target operation is searching for the cup in a shopping APP and it is determined that another shopping APPX (i.e., the target application program) is installed in the electronic device, the menu item 4032 shown in fig. 4 may be generated according to the keyword "search" of the target operation and the name "APPX", and a call relationship between the menu item 4032 and the shopping APPX may be established. The electronic device may open the shopping APPX in response to the user clicking the menu item 4032.
With reference to step 303, if the target operations are searching for lipstick in a shopping APP and opening a sharing interface in a friend-making APP to share the picture, and the target application program (the shopping APP) corresponding to searching for lipstick and the target application program (the friend-making APP) corresponding to sharing the picture have been determined, the menu items 5013 and 5014 shown in fig. 5 may be generated according to the keywords "search" and "lipstick" of the target operations and the names of the target application programs, and the corresponding call relationships may be established.
In practical applications, if the target operation is opening an application program interface in an application program, the electronic device can directly establish a call relationship between the menu item and that application program interface when generating the menu item. As shown in fig. 6, if the target operation is selecting and sending an emoticon in the sending window of the chat interface of the friend-making APP, the electronic device may generate the menu item 6013 shown in fig. 6, and may open the sending window of the chat interface to send the emoticon in response to the user clicking the menu item 6013. When the target operation is clicking the emoticon to add it to favorites, the electronic device may generate the menu item 6014 shown in fig. 6, and may open the favorites interface to save the emoticon in response to the user clicking the menu item 6014.
It should be noted that, the above is only an exemplary example, and in practical applications, the electronic device may determine a corresponding target application program according to different target operations, and generate different menu items according to different target operations and the target application program, so as to display a floating window including different menu items.
In this embodiment, after generating one or more menu items corresponding to the target area, the electronic device may display the floating window according to the menu items. As shown in figs. 4, 5, and 6, the electronic device may display a floating window 300 corresponding to each target area. When a target area corresponds to multiple menu items, the electronic device may display a floating window including the plurality of menu items, with the menu items displayed in sequence in the floating window.
Optionally, when a plurality of objects are included in the target area, step 305 may be implemented as follows:
determining a priority for each of a plurality of objects;
displaying a floating window corresponding to the target area according to the target operation, the target application program and the priority level of each object; the menu items corresponding to each object are sequentially displayed in the floating window according to the priority level of each object; the priority of the object corresponding to the first menu item in the floating window is the highest priority among all priorities.
In this embodiment, when the target area includes a plurality of objects, the electronic device may first determine the priority of each object and determine the display position of each object's menu item in the floating window according to that priority. When the floating window is displayed, the menu items are arranged from top to bottom by priority: the higher an object's priority, the higher the position of its corresponding menu item in the floating window.
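The top-to-bottom ordering described above amounts to a descending sort by priority. A minimal sketch, assuming each menu item is paired with its object's priority score (the function name and data shape are illustrative):

```python
# Hypothetical sketch: order menu items in the floating window so that
# the item for the highest-priority object appears first (topmost).

def order_menu_items(items_with_priority):
    """items_with_priority: list of (menu_item, priority) pairs.
    Returns menu items top-to-bottom, highest priority first."""
    ranked = sorted(items_with_priority, key=lambda pair: pair[1],
                    reverse=True)
    return [item for item, _ in ranked]
```

With this ordering, the first menu item in the floating window always corresponds to the object with the highest priority, as required by the embodiment.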
Optionally, determining the priority of each object in the plurality of objects may include:
the priority of each of the plurality of objects is determined according to at least one of a ratio of an area corresponding to each of the plurality of objects to an area of the target region, color information of each of the plurality of objects, a weight corresponding to each of the plurality of objects, and heat information of each of the plurality of objects.
For example, as shown in fig. 4, the objects in the first target area include the search box control 4011 and the keyword 4012. When determining the objects through the image recognition model, the electronic device may compute the ratio of the area of the search box control 4011 to the area of the first target area 401 and the ratio of the area of the keyword 4012 to the area of the first target area 401; if the former ratio is greater than the latter, the priority of the search box control 4011 is determined to be higher than that of the keyword 4012. When the floating window is displayed, the menu item corresponding to the search box control 4011 may then be placed above the menu item corresponding to the keyword 4012.
For another example, after determining the search box control 4011 and the keyword 4012, the electronic device may obtain color information (such as hue, brightness, and saturation) of each, and determine that the higher-brightness search box control 4011 has a higher priority than the lower-brightness keyword 4012.
For another example, when determining the search box control 4011 and the keyword 4012 through the image recognition model, the electronic device may simultaneously determine their weights, and decide that the higher-weight search box control 4011 has a higher priority than the lower-weight keyword 4012. Here, the weight represents the confidence (probability) with which the image recognition model determined the object information; it can be understood with reference to the prior art and is not limited in this embodiment.
For another example, as shown in fig. 4, the first target area may further include another keyword 4015. According to the heat information of the keyword 4012 and the keyword 4015, the electronic device may determine that the keyword 4015 with higher heat information has a higher priority than the keyword 4012 with lower heat information. The heat information represents the occurrence frequency (popularity) of a keyword; it can be understood with reference to the prior art and is not limited in this embodiment.
In this embodiment, when displaying the floating window, the priority of each object in the target area is first determined, and the position of each object's menu item in the floating window is determined according to that priority. Menu items that attract more of the user's attention can thus be placed in prominent positions in the floating window, which improves the practicality of the floating window and offers the user more convenient menu-item selection. Meanwhile, determining each object's priority according to at least one of its area size, color information, weight, and heat information makes the priority determination more accurate.
It should be noted that the above is only an exemplary example; the priority of each object may be determined through one or more of the ratio of the area corresponding to each object to the area of the target area, the color information of each object, the weight corresponding to each object, and the heat information of each object, but is not limited thereto.
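One simple way to combine the four signals just listed is a weighted sum over normalized values. The coefficients, signal normalization, and function name below are illustrative assumptions; the patent does not prescribe a specific combination formula.

```python
# Hypothetical sketch: score an object's priority from up to four
# normalized signals (each assumed to lie in [0, 1]).
# The equal coefficients are an illustrative choice, not the patent's.

def object_priority(area_ratio=0.0, brightness=0.0, weight=0.0, heat=0.0,
                    coeffs=(0.25, 0.25, 0.25, 0.25)):
    """Weighted combination of area ratio, color brightness,
    recognition weight, and heat (popularity) information."""
    signals = (area_ratio, brightness, weight, heat)
    return sum(c * s for c, s in zip(coeffs, signals))
```

For the fig. 4 example, a search box control with a larger area ratio and higher recognition weight would score above a small, low-weight keyword, and its menu item would therefore be placed higher in the floating window.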
In summary, in this embodiment, the electronic device may determine a target area in the application program interface in response to a first input of the user to the application program interface, and display a floating window including menu items according to the object information of the objects in the target area. The menu items in the floating window are related to the objects in the area the user is paying attention to, and the user's operation can be predicted from those objects so that corresponding menu items are generated. Because an application program interface contains many types of objects, when the area the user is paying attention to lies in the application program interface, multiple menu items can be generated according to the objects in it, more operations can be offered to the user through the floating window, and the flexibility of user operation is further improved. Meanwhile, in generating the menu items, the user's operation is first predicted from the object information and the menu items are then generated from the predicted target operation, so the electronic device can provide menu items that meet the user's current needs, improving both the flexibility and the real-time responsiveness of the electronic device.
Optionally, the number of the target regions is N, where N is an integer greater than 1;
correspondingly, displaying the floating window corresponding to the target area according to the object information may include: displaying one floating window corresponding to the N target areas according to the object information of the objects in the N target areas;
or displaying N floating windows according to the object information of the object in each target area in the N target areas; one target area in the N target areas corresponds to one floating window in the N floating windows, and the floating window corresponding to each target area in the N target areas is different.
In this embodiment, when there are multiple target areas, the electronic device may acquire the object information of the objects in all the target areas and display one floating window including the menu items corresponding to each object.
For example, as shown in fig. 7, which is a schematic diagram of another application program interface provided in an embodiment of the present application, after determining the first target area 401, the second target area 402, and the third target area 403, the electronic device may display a floating window 300. With reference to the above examples, the floating window 300 includes the menu items corresponding to the objects in the first target area 401, the second target area 402, and the third target area 403. In practical applications, the electronic device can display one overall floating window according to the object information in all the target areas, so that one floating window is associated with multiple target areas; this makes it convenient for the user to select a larger range and provides more operation options through a single floating window.
In this embodiment, when the number of the target areas is multiple, the electronic device may display a floating window corresponding to each target area according to the object information of the object in each target area. For a specific process of displaying the floating window corresponding to each target region, refer to step 301 to step 305, which is not described herein again.
Optionally, when the first input is a first preset input, the electronic device may display one floating window corresponding to all of the plurality of target areas according to the object information of the objects in the plurality of target areas; when the first input is a second preset input, it may display a floating window corresponding to each target area according to the object information in that target area.
For example, when the first input is a clockwise slide, the electronic device may display one floating window corresponding to all of the plurality of target areas according to the object information of the objects in the plurality of target areas; when the first input is a counterclockwise slide, it may display a floating window corresponding to each target area according to the object information in that target area. The specific forms of the first preset input and the second preset input may be set as required, which is not limited in this embodiment.
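The gesture-based dispatch above can be sketched as follows. The gesture names stand in for the first and second preset inputs, and each floating window is modeled simply as the list of objects it covers; all names here are illustrative assumptions.

```python
# Hypothetical sketch: map the first input's gesture to either one
# combined floating window (clockwise) or one window per target area
# (counterclockwise). Gesture names model the preset inputs only.

def windows_for_gesture(gesture, target_areas):
    """target_areas: list of lists of objects.
    Returns a list of floating windows (each a list of objects)."""
    if gesture == "clockwise":
        # one combined window covering the objects of every target area
        return [[obj for area in target_areas for obj in area]]
    if gesture == "counterclockwise":
        # one separate window per target area
        return [list(area) for area in target_areas]
    raise ValueError("unrecognized gesture")
```

The two branches correspond to the two display modes of this embodiment; any other gesture pair could be substituted, since the patent leaves the concrete preset inputs open.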
It should be noted that, in the floating window management method provided in the embodiments of the present application, the execution subject may be a floating window management apparatus, or a control module in the floating window management apparatus for executing the floating window management method. In the embodiments of the present application, the floating window management method is described by taking a floating window management apparatus executing the method as an example.
Referring to fig. 8, fig. 8 is a block diagram of a floating window management apparatus provided in an electronic device according to an embodiment of the present application, where the apparatus 800 may include: a receiving module 801, a responding module 802, an obtaining module 803 and a displaying module 804.
The receiving module 801 is configured to receive a first input of a display screen of an electronic device from a user.
The response module 802 is configured to determine a target area in the display screen in response to a first input.
The obtaining module 803 is configured to obtain object information of an object displayed in the target area.
The display module 804 is configured to display a floating window corresponding to the target area according to the object information; wherein the floating window includes menu items corresponding to the object.
Optionally, the receiving module 801 is specifically configured to receive a first input of the application program interface displayed on the display screen by the user.
Optionally, referring to fig. 9, fig. 9 is a block diagram of another floating window management apparatus provided in this embodiment of the application, and the display module 804 may include: a determination unit 8041 and a display unit 8042.
The determining unit 8041 is configured to determine a target operation of the user according to the object information, and determine a target application program related to the target operation from the electronic device.
The display unit 8042 is configured to display a floating window corresponding to the target area according to the target operation and the target application.
Optionally, the determining unit 8041 is specifically configured to determine a target operation according to the object information and at least one of the type of the application program corresponding to the application program interface and a user portrait of the user, and to determine a target application program related to the target operation from the electronic device.
Optionally, the target area comprises a plurality of objects; the display unit 8042 includes: a determination subunit 80421 and a display subunit 80422.
The determining subunit 80421 is operable to determine a priority for each of the plurality of objects.
The display subunit 80422 is configured to display a floating window corresponding to the target area according to the target operation, the target application, and the priority level of each object; the menu items corresponding to each object are sequentially displayed in the floating window according to the priority level of each object; the priority of the object corresponding to the first menu item in the floating window is the highest priority among all priorities.
Optionally, the determining subunit 80421 is specifically configured to determine the priority of each of the plurality of objects according to at least one of a ratio of an area corresponding to each of the plurality of objects to an area of the target region, color information of each of the plurality of objects, a weight corresponding to each of the plurality of objects, and heat information of each of the plurality of objects.
Optionally, the number of target regions is N, where N is an integer greater than 1.
The display module 804 is specifically configured to display a floating window corresponding to each of the N target areas according to the object information of the objects in the N target areas; or displaying N floating windows according to the object information of the object in each target area in the N target areas; one target area in the N target areas corresponds to one floating window in the N floating windows, and the floating window corresponding to each target area in the N target areas is different.
The floating window management device in the embodiment of the present application may be a device, or may be a component, an integrated circuit, or a chip in a terminal. The device can be mobile electronic equipment or non-mobile electronic equipment. By way of example, the mobile electronic device may be a mobile phone, a tablet computer, a notebook computer, a palm top computer, a vehicle-mounted electronic device, a wearable device, an ultra-mobile personal computer (UMPC), a netbook or a Personal Digital Assistant (PDA), and the like, and the non-mobile electronic device may be a server, a Network Attached Storage (NAS), a Personal Computer (PC), a Television (TV), a teller machine or a self-service machine, and the like, and the embodiments of the present application are not particularly limited.
The floating window management apparatus in the embodiments of the present application may be an apparatus having an operating system. The operating system may be an Android operating system, an iOS operating system, or another possible operating system, which is not specifically limited in the embodiments of the present application.
The floating window management device provided in the embodiment of the present application can implement each process implemented by the floating window management device in the method embodiments of fig. 1 and fig. 3, and is not described here again to avoid repetition.
In this embodiment, the electronic device receives a first input of a user to a display screen of the electronic device, determines a target area in the display screen in response to the first input, acquires object information of an object displayed in the target area, and displays a floating window corresponding to the target area according to the object information. The menu items in the floating window are associated with the objects in the area concerned by the user, and the menu items in the floating window can be changed according to the change of the objects in the area concerned by the user, so that the user can conveniently execute the operation related to the objects in the area concerned by the user, and the flexibility of the user operation is improved.
Optionally, an embodiment of the present application further provides an electronic device. As shown in fig. 10, which is a block diagram of the structure of the electronic device provided in an embodiment of the present application, the electronic device 1000 includes a processor 1001, a memory 1002, and a program or instructions stored in the memory 1002 and runnable on the processor 1001. When the program or instructions are executed by the processor 1001, the processes of the floating window management method embodiments are implemented, and the same technical effects can be achieved; to avoid repetition, details are not repeated here.
It should be noted that the electronic devices in the embodiments of the present application include the mobile electronic devices and the non-mobile electronic devices described above.
Fig. 11 is a schematic diagram of a hardware structure of an electronic device implementing an embodiment of the present application.
The electronic device 1100 includes, but is not limited to: a radio frequency unit 1101, a network module 1102, an audio output unit 1103, an input unit 1104, a sensor 1105, a display unit 1106, a user input unit 1107, an interface unit 1108, a memory 1109, a processor 1110, and the like.
Those skilled in the art will appreciate that the electronic device 1100 may further include a power source (e.g., a battery) for supplying power to the various components; the power source may be logically connected to the processor 1110 via a power management system, so that charging, discharging, and power consumption management functions are handled by the power management system. The electronic device structure shown in fig. 11 does not constitute a limitation of the electronic device; the electronic device may include more or fewer components than those shown, combine some components, or arrange the components differently, which is not described again here.
A processor 1110 configured to control the user input unit 1107 to receive a first input of a user to a display screen of the electronic device;
determining a target area in the display screen in response to the first input;
acquiring object information of an object displayed in a target area;
according to the object information, the display unit 1106 is controlled to display a floating window corresponding to the target area; wherein the floating window includes menu items corresponding to the object.
In this embodiment, the electronic device receives a first input applied by a user to the display screen of the electronic device, determines a target area in the display screen in response to the first input, acquires object information of the objects displayed in the target area, and displays a floating window corresponding to the target area according to the object information. The menu items in the floating window are associated with the objects in the region the user is focusing on, and they can change as those objects change, so the user can conveniently perform operations related to the objects of interest, which improves the flexibility of user operation.
The processor 1110 is specifically configured to control the user input unit 1107 to receive a first input of the user to the application program interface displayed on the display screen.
In this embodiment, the electronic device may determine, in response to a first input of the user to the application program interface, a target area located within the application program interface on the display screen, and display a floating window including menu items according to the object information of the objects in the target area. The menu items in the floating window are related to the objects in the region the user is focusing on, and the user's operation can be predicted from those objects to generate the corresponding menu items. Because an application program interface contains many types of objects, when the region of interest lies within the application program interface, multiple menu items can be generated according to the objects in it, more operations can be offered to the user through the floating window, and the flexibility of user operation can be further improved.
The processor 1110 is further configured to determine a target operation of the user according to the object information, and determine a target application program related to the target operation from the electronic device;
according to the target operation and the target application program, the display unit 1106 is controlled to display the floating window corresponding to the target area.
In this embodiment, in the process of generating the menu items, the user operation is first predicted according to the object information, and the menu items are then generated according to the predicted target operation. The electronic device can therefore provide menu items that meet the user's current needs, which improves the flexibility of the electronic device and its real-time responsiveness, meeting the user's demand for immediacy.
The processor 1110 is specifically configured to determine a target operation according to the object information and at least one of the type of the application program corresponding to the application program interface and a user portrait of the user, and to determine a target application program related to the target operation from the electronic device.
In this embodiment, the target operation is predicted according to the object information together with at least one of the type of the application program and the user portrait of the user. A target operation that better matches the user's intent can thus be obtained, so more accurate menu items can be displayed, which improves the accuracy of the electronic device.
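A minimal sketch of such a prediction step, assuming a simple score table keyed by object kind, an application-type boost, and a user portrait expressed as operation-preference weights (all names and values are illustrative assumptions, not taken from the patent):

```python
def predict_target_operation(object_kinds, app_type=None, user_profile=None):
    """Pick the most likely operation for the objects in the target area,
    optionally refined by the host app's type and a user portrait
    (a dict mapping operation -> preference weight)."""
    # Base scores per operation, keyed by object kind (illustrative values).
    base = {
        "text":  {"copy": 2, "search": 1, "translate": 1},
        "image": {"save": 2, "share": 1},
    }
    scores = {}
    for kind in object_kinds:
        for op, s in base.get(kind, {}).items():
            scores[op] = scores.get(op, 0) + s
    # An app type can boost operations typical for that app category.
    if app_type == "shopping":
        scores["search"] = scores.get("search", 0) + 2
    # The user portrait boosts operations this user performs often.
    for op, w in (user_profile or {}).items():
        if op in scores:
            scores[op] += w
    return max(scores, key=scores.get) if scores else None
```

For example, a text object in a shopping app would favor "search" over "copy", while a user portrait that strongly weights "translate" shifts the prediction toward translation.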
A processor 1110 further configured to determine a priority for each of the plurality of objects;
and to control, according to the target operation, the target application program, and the priority of each object, the display unit 1106 to display a floating window corresponding to the target area, wherein the menu items corresponding to the objects are displayed in the floating window in order of the priority of each object, and the object corresponding to the first menu item in the floating window has the highest priority among all the priorities.
In this embodiment, in the process of displaying the floating window, the priority of each object in the target area is determined first, and the position of each object's menu item in the floating window is determined according to that priority. Menu items that attract high user attention can thus be arranged in prominent positions in the floating window, which improves the practicability of the floating window and offers the user more convenient menu-item selection.
The processor 1110 is further configured to determine a priority of each of the plurality of objects according to at least one of a ratio of an area corresponding to each of the plurality of objects to an area of the target region, color information of each of the plurality of objects, a weight corresponding to each of the plurality of objects, and heat information of each of the plurality of objects.
In this embodiment, the priority of each object is determined according to at least one of the area size, the color information, the weight, and the heat information of each object in the target region, so that the priority of each object can be determined more accurately.
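The combination of area ratio, color information, weight, and heat information into a single priority could be sketched as a weighted score. The coefficients and the four-value tuple layout below are illustrative assumptions; the patent does not fix a concrete formula:

```python
def object_priority(area_ratio, color_saliency, weight, heat,
                    coeffs=(0.4, 0.2, 0.2, 0.2)):
    """Combine the four signals named in the embodiment into one
    priority score; larger means higher priority. Coefficients are
    illustrative, not taken from the patent."""
    a, b, c, d = coeffs
    return a * area_ratio + b * color_saliency + c * weight + d * heat

def order_menu_items(objects):
    """Sort (name, (area_ratio, color_saliency, weight, heat)) pairs so
    that the highest-priority object's menu item comes first in the
    floating window."""
    return sorted(objects, key=lambda o: object_priority(*o[1]), reverse=True)
```

With this sketch, an object occupying most of the target area outranks a small one, matching the rule that the first menu item belongs to the highest-priority object.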
The number of target regions is N, where N is an integer greater than 1.
A processor 1110, further configured to obtain object information of objects in the N target regions;
and further configured to control the display unit 1106 to display one floating window corresponding to all of the N target areas according to the object information of the objects in the N target areas; or to display N floating windows according to the object information of the objects in each of the N target areas, where each of the N target areas corresponds to one of the N floating windows and the floating windows corresponding to the N target areas are different from one another.
In this embodiment, when there are multiple target areas, the electronic device may acquire the object information of the objects in all the target areas and display a floating window including the menu items corresponding to each object according to the object information of all the objects. Multiple target areas can thus be associated through one floating window, making it convenient for the user to select a larger range, so that a single floating window can provide more operation options.
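The two display strategies for N target areas (one merged floating window, or N separate windows) can be sketched as follows; the helper below is a hypothetical illustration, not the patent's implementation:

```python
def windows_for_regions(regions, merge=True):
    """regions: a list of lists of object descriptors, one inner list
    per target area. With merge=True, return a single floating window
    covering all N areas; otherwise return one window per area."""
    if merge:
        combined = [obj for region in regions for obj in region]
        return [combined]                            # one window for all N areas
    return [list(region) for region in regions]      # N distinct windows
```

In the merged case the caller builds one menu from the combined object list; in the separate case each target area keeps its own floating window, as the claims require each area's window to be different.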
It should be understood that, in the embodiment of the present application, the input unit 1104 may include a graphics processing unit (GPU) 11041 and a microphone 11042; the graphics processing unit 11041 processes image data of still pictures or video obtained by an image capture device (such as a camera) in a video capture mode or an image capture mode. The display unit 1106 may include a display panel 11061, which may be configured in the form of a liquid crystal display, an organic light-emitting diode, or the like. The user input unit 1107 includes a touch panel 11071, also called a touch screen, and other input devices 11072. The touch panel 11071 may include two parts: a touch detection device and a touch controller. Other input devices 11072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, and a joystick, which are not described in detail here. The memory 1109 may be used to store software programs and various data, including, but not limited to, application programs and an operating system. The processor 1110 may integrate an application processor, which mainly handles the operating system, user interfaces, applications, and the like, and a modem processor, which mainly handles wireless communications. It can be appreciated that the modem processor may not be integrated into the processor 1110.
The embodiment of the present application further provides a readable storage medium, where a program or an instruction is stored on the readable storage medium, and when the program or the instruction is executed by a processor, the program or the instruction implements each process of the above-mentioned floating window management method embodiment, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here.
The processor is the processor in the electronic device described in the above embodiment. The readable storage medium includes a computer readable storage medium, such as a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and so on.
The embodiment of the present application further provides a chip, where the chip includes a processor and a communication interface, the communication interface is coupled to the processor, and the processor is configured to run a program or an instruction to implement each process of the above-mentioned floating window management method embodiment, and can achieve the same technical effect, and in order to avoid repetition, the details are not repeated here.
It should be understood that the chips mentioned in the embodiments of the present application may also be referred to as system-on-chip, system-on-chip or system-on-chip, etc.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such a process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element. Further, it should be noted that the scope of the methods and apparatus of the embodiments of the present application is not limited to performing the functions in the order illustrated or discussed; the functions may also be performed in a substantially simultaneous manner or in a reverse order depending on the functions involved. For example, the described methods may be performed in an order different from that described, and various steps may be added, omitted, or combined. In addition, features described with reference to certain examples may be combined in other examples.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present application may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present application.
While the present embodiments have been described with reference to the accompanying drawings, it is to be understood that the invention is not limited to the precise embodiments described above, which are meant to be illustrative and not restrictive, and that various changes may be made therein by those skilled in the art without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (6)

1. A floating window management method, comprising:
receiving a first input of a user to a display screen of the electronic equipment;
determining a target area in a display screen in response to the first input;
acquiring object information of an object displayed in the target area;
displaying a floating window corresponding to the target area according to the object information; wherein the floating window includes a menu item corresponding to the object;
wherein the receiving a first input of a user to a display screen of an electronic device comprises: receiving a first input of the user to an application program interface displayed on the display screen;
the displaying the floating window corresponding to the target area according to the object information includes: determining target operation according to the type of an application program corresponding to the application program interface, the user portrait of the user and the object information, and determining a target application program related to the target operation from the electronic equipment; displaying a floating window corresponding to the target area according to the target operation and the target application program;
wherein the target region comprises a plurality of objects; the displaying the floating window corresponding to the target area according to the target operation and the target application program comprises:
determining a priority for each of the plurality of objects;
displaying a floating window corresponding to the target area according to the target operation, the target application program and the priority level of each object; the menu items corresponding to each object are sequentially displayed in the floating window according to the priority level of each object; and the priority of the object corresponding to the first menu item in the floating window is the highest priority in all priorities.
2. The method of claim 1, wherein the determining the priority of each of the plurality of objects comprises:
determining a priority of each of the plurality of objects according to at least one of a ratio of an area corresponding to each of the plurality of objects to an area of the target region, color information of each of the plurality of objects, a weight corresponding to each of the plurality of objects, and heat information of each of the plurality of objects.
3. The method of claim 1, wherein the number of target regions is N, wherein N is an integer greater than 1;
the displaying the floating window corresponding to the target area according to the object information includes:
displaying a floating window corresponding to the N target areas according to the object information of the objects in the N target areas; or,
displaying N floating windows according to the object information of the object in each target area in the N target areas; and one of the N target areas corresponds to one of the N floating windows, and the floating window corresponding to each of the N target areas is different.
4. A floating window management device, comprising:
the receiving module is used for receiving a first input of a user to a display screen of the electronic equipment;
a response module for determining a target area in a display screen in response to the first input;
an acquisition module configured to acquire object information of an object displayed in the target area;
the display module is used for displaying the floating window corresponding to the target area according to the object information; wherein the floating window includes a menu item corresponding to the object;
the receiving module is specifically configured to receive a first input of the user to an application program interface displayed on the display screen;
the display module includes:
the determining unit is used for determining target operation according to the type of the application program corresponding to the application program interface, the user portrait of the user and the object information, and determining a target application program related to the target operation from the electronic equipment;
the display unit is used for displaying a floating window corresponding to the target area according to the target operation and the target application program;
wherein the target region comprises a plurality of objects; the display unit includes:
a determining subunit for determining a priority of each of the plurality of objects;
the display subunit is configured to display a floating window corresponding to the target area according to the target operation, the target application, and the priority level of each object; the menu items corresponding to each object are sequentially displayed in the floating window according to the priority level of each object; and the priority of the object corresponding to the first menu item in the floating window is the highest priority in all priorities.
5. An electronic device comprising a processor, a memory, and a program or instructions stored on the memory and executable on the processor, the program or instructions when executed by the processor implementing the steps of the floating window management method according to any one of claims 1-3.
6. A readable storage medium, on which a program or instructions are stored, which when executed by a processor, carry out the steps of the floating window management method according to any one of claims 1-3.
CN202010582023.3A 2020-06-23 2020-06-23 Floating window management method and device, electronic equipment and readable storage medium Active CN111813285B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010582023.3A CN111813285B (en) 2020-06-23 2020-06-23 Floating window management method and device, electronic equipment and readable storage medium

Publications (2)

Publication Number Publication Date
CN111813285A CN111813285A (en) 2020-10-23
CN111813285B true CN111813285B (en) 2022-02-22

Family

ID=72845951

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010582023.3A Active CN111813285B (en) 2020-06-23 2020-06-23 Floating window management method and device, electronic equipment and readable storage medium

Country Status (1)

Country Link
CN (1) CN111813285B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104090762A (en) * 2014-07-10 2014-10-08 福州瑞芯微电子有限公司 Screenshot processing device and method
WO2017177592A1 (en) * 2016-04-13 2017-10-19 北京小米移动软件有限公司 Operation processing method and device
CN108829319A (en) * 2018-06-15 2018-11-16 驭势科技(北京)有限公司 A kind of exchange method of touch screen, device, electronic equipment and storage medium
CN111182205A (en) * 2019-12-30 2020-05-19 维沃移动通信有限公司 Photographing method, electronic device, and medium
CN111221599A (en) * 2018-11-23 2020-06-02 奇酷互联网络科技(深圳)有限公司 Method for displaying floating window, mobile terminal and storage medium

Also Published As

Publication number Publication date
CN111813285A (en) 2020-10-23

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20230712

Address after: 5 / F, building B, No. 25, Andemen street, Yuhuatai District, Nanjing City, Jiangsu Province, 210012

Patentee after: NANJING WEIWO SOFTWARE TECHNOLOGY CO.,LTD.

Address before: 523860 No. 283 BBK Avenue, Changan Town, Changan, Guangdong.

Patentee before: VIVO MOBILE COMMUNICATION Co.,Ltd.