CN116627312A - Display method, device, equipment and storage medium - Google Patents


Info

Publication number
CN116627312A
Authority
CN
China
Prior art keywords
interface
display
controls
control
target
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310626828.7A
Other languages
Chinese (zh)
Inventor
周雨晴
Current Assignee
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd
Priority to CN202310626828.7A
Publication of CN116627312A
Pending legal status

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04886 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04842 Selection of displayed objects or displayed text elements

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application discloses a display method, apparatus, device, and storage medium in the field of computer technology. The display method may include the following steps: receiving a first input from a user while a first interface is displayed in full screen, where the first input is used to trigger display of a second interface in a floating window; in response to the first input, determining a target control in the second interface; and displaying the target control in a floating manner on the first interface.

Description

Display method, device, equipment and storage medium
Technical Field
The application belongs to the field of computer technology and relates in particular to a display method, a display apparatus, a display device, and a storage medium.
Background
As the screens of electronic devices grow larger and system functions become richer, an electronic device can present the content of different applications on one screen. For example, the electronic device may play a video in a floating window while displaying the chat interface of a social application, so that the user can chat with others while the video plays.
However, the display area of a floating window is small, and the controls inside it are correspondingly small. This makes the controls in the floating window harder for the user to operate; at the same time, the electronic device may fail to identify the user's touch position accurately and therefore cannot determine which control the user tapped, which affects use of the device.
Disclosure of Invention
Embodiments of the present application aim to provide a display method, apparatus, device, and storage medium that solve the problem that, when an electronic device displays an application interface in a floating window, the controls in that interface are difficult for the user to touch.
In a first aspect, an embodiment of the present application provides a display method, including:
receiving a first input from a user while a first interface is displayed in full screen, where the first input is used to trigger display of a second interface in a floating window;
in response to the first input, determining a target control in the second interface;
and displaying the target control in a floating manner on the first interface.
In a second aspect, an embodiment of the present application provides a display apparatus, including:
a receiving module, configured to receive a first input from a user while a first interface is displayed in full screen, where the first input is used to trigger display of a second interface in a floating window;
a determining module, configured to determine a target control in the second interface in response to the first input;
and a display module, configured to display the target control in a floating manner on the first interface.
In a third aspect, an embodiment of the present application provides an electronic device, including a processor, a memory, and a program or instructions stored in the memory and executable on the processor, where the program or instructions, when executed by the processor, implement the steps of the display method of the first aspect.
In a fourth aspect, an embodiment of the present application provides a readable storage medium having stored thereon a program or instructions which, when executed by a processor, implement the steps of the display method of the first aspect.
In a fifth aspect, an embodiment of the present application provides a chip, including a processor and a display interface coupled to the processor, where the processor is configured to execute a program or instructions to implement the steps of the display method of the first aspect.
In a sixth aspect, an embodiment of the present application provides a computer program product stored in a storage medium, the program product being executable by at least one processor to implement the steps of the display method of the first aspect.
In the embodiments of the present application, while a first interface is displayed in full screen, a first input for triggering display of a second interface in a floating window can be received; in response to the first input, a target control in the second interface is determined and displayed in a floating manner on the first interface. In this way, the target controls in the second interface can be split out and each displayed independently over the first interface in the form of a floating window, so the second interface does not need to be displayed in full. The user can still use the target controls of the second interface normally, and because each target control is displayed independently with its display size preserved, the controls become easier to operate, the display space of the electronic device is used effectively, the target controls are presented to the user more flexibly, and the electronic device can accurately identify which target control the user touches, improving the usability of the device.
Drawings
FIG. 1 is a flowchart of a display method according to an embodiment of the present application;
FIG. 2 is a first schematic interface diagram of a display method according to an embodiment of the present application;
FIG. 3 is a second schematic interface diagram of a display method according to an embodiment of the present application;
FIG. 4 is a third schematic interface diagram of a display method according to an embodiment of the present application;
FIG. 5 is a fourth schematic interface diagram of a display method according to an embodiment of the present application;
FIG. 6 is a fifth schematic interface diagram of a display method according to an embodiment of the present application;
FIG. 7 is a sixth schematic interface diagram of a display method according to an embodiment of the present application;
FIG. 8 is a seventh schematic interface diagram of a display method according to an embodiment of the present application;
FIG. 9 is a schematic structural diagram of a display device according to an embodiment of the present application;
FIG. 10 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
FIG. 11 is a schematic diagram of a hardware structure of an electronic device according to an embodiment of the present application.
Detailed Description
The technical solutions of the embodiments of the present application will be described clearly below with reference to the accompanying drawings. It is apparent that the described embodiments are only some, not all, of the embodiments of the present application. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present application fall within the scope of protection of the present application.
The terms "first", "second", and the like in the description and claims are used to distinguish between similar elements and do not necessarily describe a particular sequence or chronological order. It is to be understood that such terms are interchangeable under appropriate circumstances, so that the embodiments of the application can operate in sequences other than those illustrated or described herein. The objects identified by "first", "second", etc. are generally of one type, and the number of objects is not limited; for example, the first object may be one object or more than one. Furthermore, in the description and claims, "and/or" denotes at least one of the connected objects, and the character "/" generally indicates an "or" relationship between the associated objects.
In the related art, because the operating system of an electronic device can allow multiple application programs (hereinafter referred to as applications) to share the same screen at the same time, a floating window mode has emerged. In floating window mode, an application interface is scaled down and displayed in the form of a floating window, so that multiple applications can be shown on the same screen simultaneously; for example, during a video conference the conference record can be translated through a translation application, or during a game the user can chat through an instant messaging application. However, because floating window mode shrinks the whole application interface in equal proportion, the controls in the floating window's display area shrink as well. This makes the controls harder for the user to operate; at the same time, the electronic device may fail to identify the user's touch position accurately and therefore cannot determine which control in the display area the user tapped, which affects use of the device.
To solve the above problems in the related art, the display method provided by the embodiments of the present application is described in detail below through specific embodiments and application scenarios, with reference to FIG. 1 to FIG. 8.
First, a display method provided in an embodiment of the present application will be described in detail with reference to fig. 1.
Fig. 1 is a flowchart of a display method according to an embodiment of the present application.
As shown in fig. 1, the display method provided by the embodiment of the present application may be applied to an electronic device, and based on this, the display method may include the following steps:
step 110, receiving a first input of a user under the condition of displaying a first interface in a full screen, wherein the first input is used for triggering a second interface to be displayed in a floating window; step 120, in response to the first input, determining a target control in the second interface; and 130, floating and displaying the target control on the first interface.
In this way, the target controls in the second interface can be split out and each displayed independently in the first interface as a floating window, so the second interface does not need to be displayed in full. The user can still use the target controls of the second interface normally; because each target control is displayed independently with its display size preserved, the controls are easier to operate, the display space of the electronic device is used effectively, the target controls are presented to the user more flexibly, and the electronic device can accurately identify which target control the user touches, improving the usability of the device.
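As a rough illustration, the three steps above can be modeled in a few lines. All names here (`Control`, `handle_first_input`, the chooser predicate) are hypothetical stand-ins, not the patent's implementation:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Control:
    """A control belonging to the second interface (illustrative)."""
    name: str

def handle_first_input(second_interface, chooser):
    """Step 120: determine the target controls via a chooser predicate;
    step 130: float each chosen control individually over the first interface."""
    targets = [c for c in second_interface if chooser(c)]
    return [("floating", c.name) for c in targets]

second_interface = [Control("input reply"), Control("display chat record")]
# The first input triggers the split; here the user picks one control.
print(handle_first_input(second_interface, lambda c: c.name == "input reply"))
# [('floating', 'input reply')]
```

The chooser predicate is a placeholder for whichever of the determination modes below (user selection, operating-system identification, habit-based push) is actually in effect.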
The above steps are described in detail below.
First, referring to step 110, the actual requirement behind the user's triggering of the first input is identified, and different interface types are determined according to that requirement.
The actual requirements that may lead the user to trigger the first input include the following.
1) The first interface and the second interface may be interfaces of different applications; for example, the first interface is an application interface of a first application, and the second interface is an application interface of a second application.
Illustratively, the first application is a game application and the second application is an instant messaging application; then, while the game interface of the game application is displayed in full screen, the first input is used to trigger display of the chat interface with user A of the instant messaging application in a floating window.
2) The first interface is the desktop of the electronic device, and the second interface is an application interface of a first application on the electronic device.
Illustratively, the first application is a game application; then, while the desktop is displayed in full screen, the first input is used to trigger an interface of the game application to be displayed in a floating window. As shown in FIG. 2, the user may long-press the application icon 21 of the game application on the desktop 20 and select "floating window display" to have an interface of the game application displayed in a floating window.
3) The first interface and the second interface may be interfaces of different functions of one application. For example, as shown in FIG. 3, the first application is a video application, the first interface is a video selection list interface 30, and the video selection list interface 30 includes a video selection list and a video frame of video 1. Accordingly, while the video selection list interface 30 is displayed in full screen, the user may tap a "pop-up" control 31 in the video selection list interface to trigger a second interface containing the video frame of video 1 to be displayed in a floating window.
Next, referring to step 120, embodiments of the present application may determine the target control in the following ways.
In one or more possible embodiments, the target control is selected by the user, and step 120 may specifically include:
displaying, in response to the first input, N controls in the second interface, where the N controls are all of the controls displayed in the second interface and N is a positive integer;
and, upon receiving the user's selection input on at least one of the N controls, determining the at least one control as the target control.
For example, as shown in FIG. 4, if the second interface is a chat interface with user A in an instant messaging application and the controls currently displayed in the chat interface are an "input reply information" control 20 and a "display chat record" control 21, then if the user taps the "input reply information" control 20, that control may be determined as the target control.
In another possible embodiment, the target control is identified by the operating system, and step 120 may specifically include:
obtaining, in response to the first input, N controls in the second interface, where the N controls are all of the controls displayed in the second interface and N is a positive integer;
and determining the N controls as the target controls.
For example, still referring to FIG. 4, if the second interface is a chat interface with user A in an instant messaging application and the controls displayed in the chat interface are the "input reply information" control 20 and the "display chat record" control 21, then both controls may be determined directly as the target controls.
In yet another possible embodiment, since the controls corresponding to the second interface may include not only the N controls currently in the second interface but also controls hidden in the application corresponding to the second interface and associated with the second interface, step 120 may specifically include: obtaining, in response to the first input, P controls corresponding to the second interface, where the P controls include at least the N controls;
and determining the P controls as the target controls.
Here, the P controls may be the N controls in the second interface; or the N controls in the second interface plus at least one control hidden in the application corresponding to the second interface and associated with the second interface.
For example, if the second interface is a video playing interface that includes a "pause/play" control and a "return to full screen" control, then the N controls are the "pause/play" control and the "return to full screen" control, and the controls hidden in the corresponding application and associated with the second interface may be a "select video" control, a "fast forward" control, an "enable bullet screen" control, and so on.
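Under the assumption that an application can enumerate both its visible controls and its hidden interface-associated controls, collecting the P controls amounts to an order-preserving, de-duplicated union. The function below is an illustrative sketch, not the patent's implementation:

```python
def collect_target_controls(visible_controls, hidden_associated_controls):
    """Return the P controls: the N controls currently displayed in the
    second interface plus the hidden controls associated with it,
    preserving order and skipping duplicates."""
    seen, p_controls = set(), []
    for name in list(visible_controls) + list(hidden_associated_controls):
        if name not in seen:
            seen.add(name)
            p_controls.append(name)
    return p_controls

n_controls = ["pause/play", "return to full screen"]               # shown in the interface
hidden = ["select video", "fast forward", "enable bullet screen"]  # app-associated, hidden
print(collect_target_controls(n_controls, hidden))
```

When the application reports no hidden associated controls, the P controls reduce to the N visible controls, matching the first case described above.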
It should be noted that the user-selection manner and the operating-system-identification manner described above may be used in combination. That is, as shown in FIG. 5, if the second interface is a chat interface with user A in an instant messaging application, the two controls displayed in the chat interface are the "input reply information" control 20 and the "display chat record" control 21, and the controls hidden in the corresponding application and associated with the second interface are a "transfer" control 22 and a "photo" control 23; at this point, controls 20 to 23 may all be displayed, and in response to the user's selection input, the "display chat record" control 21 is determined as the target control.
In yet another possible embodiment, the target control is pushed based on the user's behavioral habits or by the application, and step 120 may specifically include:
obtaining, in response to the first input, a behavior control list of the user and/or an application control set associated with the application corresponding to the second interface, where the behavior control list includes at least one historical target control, and the application control set includes at least one control of that application that can be displayed independently;
and determining the controls in the user's behavior control list and/or the controls in the application control set as the target controls.
For example, if the user previously set a control, such as the "display chat record" control, as a target control, then that control may be taken as the current target control. If the application corresponding to the second interface is an instant messaging application whose only independently displayable control is the "display chat record" control, that control is determined as the target control.
This embodiment may likewise be combined with the embodiments above to determine the target control, and details are not repeated here.
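A minimal sketch of the behavior-habit variant, assuming the behavior control list records past target-control choices and the application registers its independently displayable controls (all names here are hypothetical):

```python
from collections import Counter

def determine_by_habit(behavior_list, app_control_set):
    """Rank historical target controls by how often the user chose them,
    then append the application's independently displayable controls."""
    targets = [name for name, _ in Counter(behavior_list).most_common()]
    for name in app_control_set:
        if name not in targets:
            targets.append(name)
    return targets

history = ["display chat record", "input reply", "display chat record"]
print(determine_by_habit(history, ["transfer"]))
# ['display chat record', 'input reply', 'transfer']
```

Frequency ranking is one plausible reading of "behavioral habit"; the patent itself only requires that historical target controls be reusable as current target controls.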
Then, referring to step 130, in order to ensure that the target control does not block the content of the first interface, step 130 may specifically include:
obtaining the display content of the first interface;
and displaying the target control in a floating manner at a target position, where the target position is the position at which the least content is displayed.
As shown in FIG. 6, the first interface is a game interface of a game application, the second interface is a chat interface with user A in an instant messaging application, and the target control is a control for inputting reply information, i.e. the input box in FIG. 6. The input box may be displayed in the upper-left corner of the first interface to avoid blocking the gesture-operation area of the first interface. Alternatively, the input box may be displayed with increased transparency so as not to obscure the content of the first interface.
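The least-displayed-content placement of step 130 can be sketched as choosing the candidate region with the lowest content measure. How content is measured (element count, opaque pixels, etc.) is an assumption here, and the region names are illustrative:

```python
def target_position(content_by_region):
    """Pick the region of the first interface displaying the least content;
    the floating target control is anchored there (step 130)."""
    return min(content_by_region, key=content_by_region.get)

# Hypothetical content measure for four corners of a game interface:
regions = {"top-left": 2, "top-right": 7, "bottom-left": 9, "bottom-right": 5}
print(target_position(regions))
# top-left
```

In the FIG. 6 scenario this selects the upper-left corner, the region least likely to overlap the game's gesture-operation area.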
Based on this, in one or more possible embodiments, the target controls include a first target control and a second target control, and step 130 may specifically include:
displaying the first target control at a first position of the first interface and the second target control at a second position of the first interface, where the first position and the second position are different.
As shown in FIG. 7, the first interface is a game interface of a game application, the second interface is a chat interface with user A in an instant messaging application, the first target control is a control for inputting reply information (the input box in FIG. 7), and the second target control is a control for displaying chat records (the dialog box in FIG. 7). To ensure that the contents of the two target controls do not block each other, that the user can touch them conveniently, and that the electronic device accurately identifies which target control the user touches, the first position and the second position may differ: for example, the input box is displayed in the upper-left corner of the game interface and the dialog box in the upper-right corner.
It should be noted that, as shown in FIG. 7, the first and second target controls displayed in floating windows can be used normally. For example, the user may enter target content through the input box; once the user confirms sending, the content just entered is shown in the dialog box, through which the user can also browse user A's replies. In addition, the user can browse the chat record with user A by sliding the dialog box up and down.
Based on this, since the target control in the embodiments of the present application may be provided by the operating system of the electronic device, in an example where the first interface is an interface of a first application and the second interface is an interface of a second application, the display method may further include, after step 130:
receiving a third input on the target control in the first interface;
obtaining, in response to the third input, the content entered by the user in the target control;
and transmitting the content to the second application.
For example, as shown in FIG. 8, if the user enters target content such as "hello" through the input box and then taps "send", the operating system of the electronic device may relay the target content "hello" to the second application, achieving the effect of replying to user A through the second application's reply function. The chat content "hello" can then be displayed in the dialog box. Likewise, if the user slides the "display chat record" control downward to view the chat record with user A, the operating system can copy the sliding operation to the second application, obtain the corresponding chat record through the second application, and display it in the "display chat record" control in the first interface.
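The relay of the third input back to the second application can be sketched as follows. `SecondApplication` and `relay_input` are illustrative stand-ins for the operating system's actual inter-application mechanism, which the patent does not specify:

```python
class SecondApplication:
    """Stand-in for the application that owns the second interface."""
    def __init__(self):
        self.chat_log = []

    def send_message(self, text):
        self.chat_log.append(text)

def relay_input(second_app, content):
    """The operating system copies the content entered in the floating
    input box to the second application, which performs the real send."""
    second_app.send_message(content)
    return second_app.chat_log[-1]

app = SecondApplication()
print(relay_input(app, "hello"))
# hello
```

Scroll gestures on the "display chat record" control would be forwarded the same way, with the second application returning the matching portion of the chat record for display.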
In addition, the user may move a target control displayed in the first interface as needed; for example, still referring to FIG. 7, the user may move the position of the input box and/or the dialog box according to the state of the game interface. In this way, the target controls in the second interface can be split out so that the first and second target controls are each displayed independently in the first interface as floating windows. The second interface does not need to be displayed in full, yet its controls remain usable; because each control is displayed independently without shrinking its display size, the display space of the electronic device is used effectively, the independently displayable controls of the second interface are presented to the user more flexibly, the user can browse and touch them accurately, and the electronic device can accurately identify which control the user touches.
In addition, in another possible embodiment, if the first interface is switched to another interface, the target control displayed in the first interface may change: its position may change in the manner shown in FIG. 6, and its type may change through the following steps. Accordingly, after step 130, the display method may further include:
Step 1401: when the first interface is switched to a third interface, displaying a selection list of the N controls;
Step 1402: upon receiving the user's selection input on a first control among the N controls, displaying the first control in a floating manner on the third interface.
Illustratively, the first interface is a teammate-selection interface of a game application and the third interface is a character-control interface of the same application. Because switching from the first interface to the third interface takes a certain amount of time, the user may want to adjust the target controls during that process; therefore, when the first interface switches to the third interface, a selection list of the N controls may be displayed so that the user can select the target controls to be floated on the third interface.
In this embodiment, the selection list of the N controls may be displayed, the selection list of the P controls described above may be displayed, or target controls not currently displayed in the first interface may be presented for the user to select.
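Steps 1401 and 1402 can be sketched as a simple selection hand-off (the function name, tuple tag, and candidate names are illustrative assumptions):

```python
def on_interface_switch(candidate_controls, user_choice):
    """Step 1401: show a selection list of candidate controls when the
    first interface switches to a third interface; step 1402: float the
    control the user selects on the third interface."""
    if user_choice not in candidate_controls:
        raise ValueError("selection must come from the displayed list")
    return ("float-on-third-interface", user_choice)

candidates = ["input reply", "display chat record", "transfer"]
print(on_interface_switch(candidates, "transfer"))
# ('float-on-third-interface', 'transfer')
```

The candidate list could equally be the P controls or previously undisplayed target controls, per the paragraph above.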
In summary, embodiments of the present application provide a display method that avoids the inconvenience of the floating window mode. Target controls in the second interface can be split out and each displayed independently in the first interface as a floating window, so the second interface does not need to be displayed in full and the user can still use its target controls normally. Because the target controls are displayed independently with their display size preserved, the controls are easier for the user to operate, the display space of the electronic device is used effectively, the target controls are presented to the user more flexibly, and the electronic device can accurately identify which target control the user touches, improving the usability of the device.
According to the display method provided by the embodiment of the application, the execution main body can be a display device. In the embodiment of the present application, a display device executes a display method as an example, and a device of the display method provided in the embodiment of the present application is described.
Based on the same inventive concept, the present application further provides a display device, which is described in detail below with reference to fig. 9.
Fig. 9 is a schematic structural diagram of a display device according to an embodiment of the present application.
As shown in fig. 9, the display device 90 may be applied to an electronic apparatus, and the display device 90 may specifically include:
the receiving module 901 is configured to receive a first input of a user in a case of displaying a first interface in a full screen, where the first input is used to trigger displaying a second interface in a floating window;
a determining module 902 for determining a target control in the second interface in response to the first input;
and the display module 903 is configured to hover and display the target control on the first interface.
The display device 90 in the embodiment of the present application will be described in detail as follows.
In one or more possible embodiments, the display device 90 provided in the embodiments of the present application may further include an acquisition module; wherein,
the acquisition module is used for acquiring the display content of the first interface;
the display module 903 may also be configured to display the target control in suspension at a target position, where the target position is the position with the least display content.
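The target-position choice can be modeled as picking the candidate region of the first interface whose display content is smallest. The region partition and the per-region content counts below are assumptions made for the sketch; the patent does not fix a particular counting scheme:

```java
import java.util.List;

public class TargetPosition {
    // Pick the candidate region whose display-content count is smallest;
    // contentPerRegion.get(i) is how much content the first interface
    // currently shows in region i (counting scheme is an assumption).
    static int leastContentRegion(List<Integer> contentPerRegion) {
        int best = 0;
        for (int i = 1; i < contentPerRegion.size(); i++) {
            if (contentPerRegion.get(i) < contentPerRegion.get(best)) {
                best = i;
            }
        }
        return best;
    }

    public static void main(String[] args) {
        // E.g. four quadrants of the first interface holding 5, 2, 7, 3 items:
        // the second quadrant (index 1) is the least occupied.
        System.out.println(leastContentRegion(List.of(5, 2, 7, 3))); // 1
    }
}
```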
In another or more possible embodiments, the display module 903 may also be used to,
in the case that the target controls include a first target control and a second target control, displaying the first target control at a first position of the first interface and displaying the second target control at a second position of the first interface;
wherein the first position and the second position are different.
In yet another or more possible embodiments, the display module 903 is further configured to display N controls in the second interface in response to the first input, where the N controls are all controls displayed in the second interface;
the determining module 902 is further configured to determine, when a user input for selecting at least one of the N controls is received, the at least one control as a target control.
In still another or more possible embodiments, the display module 903 is further configured to display a selection list of the N controls when the first interface is switched to be displayed as the third interface;
the display module 903 is further configured to hover display a first control of the N controls on the third interface when receiving a selection input of the first control by the user.
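The switch-and-reselect behavior can be sketched as follows: when the interface switches, the full list of N controls is offered again, and a selection input causes the chosen control to be displayed in suspension on the third interface. The `SwitchSelection` class and its member names are illustrative assumptions:

```java
import java.util.ArrayList;
import java.util.List;

public class SwitchSelection {
    final List<String> allControls;          // the N controls of the second interface
    final List<String> hoveredOnThird = new ArrayList<>();

    SwitchSelection(List<String> allControls) { this.allControls = allControls; }

    // When the first interface is switched to the third interface,
    // offer the full selection list of N controls.
    List<String> selectionListOnSwitch() {
        return allControls;
    }

    // On a selection input for one control, hover it on the third interface.
    boolean select(String control) {
        if (allControls.contains(control)) {
            hoveredOnThird.add(control);
            return true;
        }
        return false;
    }

    public static void main(String[] args) {
        SwitchSelection s = new SwitchSelection(List.of("move", "attack", "chat"));
        System.out.println(s.selectionListOnSwitch().size()); // 3
        s.select("chat");
        System.out.println(s.hoveredOnThird); // [chat]
    }
}
```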
The display device in the embodiment of the present application may be an electronic device or a component in an electronic device, such as an integrated circuit or a chip. The electronic device may be a terminal, or may be a device other than a terminal. By way of example, the electronic device may be a mobile phone, tablet computer, notebook computer, palm computer, vehicle-mounted electronic device, mobile internet device (Mobile Internet Device, MID), augmented reality (augmented reality, AR)/virtual reality (virtual reality, VR) device, robot, wearable device, ultra-mobile personal computer (ultra-mobile personal computer, UMPC), netbook, or personal digital assistant (personal digital assistant, PDA), etc., and may also be a server, network attached storage (Network Attached Storage, NAS), personal computer (personal computer, PC), television (television, TV), teller machine, or self-service machine, etc.; the embodiments of the present application are not specifically limited thereto.
The display device in the embodiment of the application may be a device having an operating system. The operating system may be an Android operating system, an IOS operating system, or other possible operating systems, and the embodiment of the present application is not limited specifically.
The display device provided by the embodiment of the present application can implement each process implemented by the embodiments of the display method shown in fig. 1 to 8 and achieve the same technical effects; to avoid repetition, a detailed description is omitted here.
Based on the above, the display device provided by the embodiment of the present application can receive, while displaying a first interface in full screen, a first input for triggering a second interface to be displayed in a floating window; in response to the first input, it determines a target control in the second interface and displays the target control in a floating manner on the first interface. In this way, the target controls in the second interface can be split out and each displayed independently in the first interface in the form of a floating window, so that the second interface does not need to be displayed in its entirety while the user can still use its target controls normally. Because the target controls of the second interface are displayed independently in the first interface, the difficulty of operating the controls is reduced while their display size is guaranteed, the display space of the electronic device is used effectively, the target controls are presented to the user more flexibly, the electronic device can accurately identify the target control touched by the user, and the use efficiency of the electronic device is improved.
Optionally, as shown in fig. 10, the embodiment of the present application further provides an electronic device 100, which includes a processor 1001 and a memory 1002, where the memory 1002 stores a program or an instruction that can be executed on the processor 1001, and the program or the instruction implements each step of the embodiment of the display method when executed by the processor 1001, and the steps can achieve the same technical effect, so that repetition is avoided, and no further description is given here.
The electronic device in the embodiment of the application includes the mobile electronic device and the non-mobile electronic device.
Fig. 11 is a schematic diagram of a hardware structure of an electronic device according to an embodiment of the present application.
The electronic device 1100 includes, but is not limited to: a radio frequency unit 1101, a network module 1102, an audio output unit 1103, an input unit 1104, a sensor 1105, a display unit 1106, a user input unit 1107, an interface unit 1108, a memory 1109, a processor 1110, and the like.
Those skilled in the art will appreciate that the electronic device 1100 may further include a power source (e.g., a battery) for powering the various components, and the power source may be logically connected to the processor 1110 through a power management system, so that functions such as managing charging, discharging, and power consumption are performed through the power management system. The electronic device structure shown in fig. 11 does not constitute a limitation of the electronic device; the electronic device may include more or fewer components than illustrated, combine some components, or arrange the components differently, which is not described in detail herein.
Wherein, in the embodiment of the present application, the user input unit 1107 is configured to receive a first input of a user in a case of displaying the first interface in a full screen, where the first input is used to trigger the second interface to be displayed in a floating window; a processor 1110 for determining a target control in the second interface in response to the first input; and the display unit 1106 is used for suspending and displaying the target control on the first interface.
The electronic device 1100 is described in detail below:
in one or more possible embodiments, the processor 1110 may also be configured to obtain display content of the first interface;
the display unit 1106 may also be configured to display the target control in suspension at a target position, where the target position is the position with the least display content.
In another or more possible embodiments, the display unit 1106 may also be configured to display the first target control in a first position of the first interface and display the second target control in a second position of the first interface, where the target controls include the first target control and the second target control;
wherein the first position and the second position are different.
In yet another or more possible embodiments, the display unit 1106 is further configured to display N controls in the second interface in response to the first input, where the N controls are all controls displayed in the second interface;
the processor 1110 may also be configured to determine at least one control as a target control upon receiving user selection input for at least one control of the N controls.
In still another or more possible embodiments, the display unit 1106 may be further configured to display a selection list of the N controls when the first interface is switched to be displayed as a third interface;
the display unit 1106 may also be configured to hover display a first control of the N controls over the third interface upon receiving a user selection input of the first control.
It should be appreciated that the input unit 1104 may include a graphics processing unit (Graphics Processing Unit, GPU) 11041 and a microphone 11042, and the graphics processor 11041 processes image data of still images or video obtained by an image capturing device (e.g., a camera) in a video capturing mode or an image capturing mode. The display unit 1106 may include a display panel, which may be configured in the form of a liquid crystal display, an organic light-emitting diode, or the like. The user input unit 1107 includes at least one of a touch panel 11071 and other input devices 11072. The touch panel 11071, also referred to as a touch screen, may include two parts: a touch detection device and a touch controller. Other input devices 11072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, and a joystick, which are not described in detail herein.
The memory 1109 may be used to store software programs and various data. The memory 1109 may mainly include a first storage area storing programs or instructions and a second storage area storing data, where the first storage area may store an operating system, an application program or instructions required for at least one function (such as a sound playing function, an image playing function, etc.), and the like. Further, the memory 1109 may include volatile memory or nonvolatile memory, or the memory 1109 may include both volatile and nonvolatile memory. The nonvolatile memory may be a read-only memory (Read-Only Memory, ROM), a programmable ROM (Programmable ROM, PROM), an erasable PROM (Erasable PROM, EPROM), an electrically erasable PROM (Electrically EPROM, EEPROM), or a flash memory. The volatile memory may be a random access memory (Random Access Memory, RAM), a static RAM (Static RAM, SRAM), a dynamic RAM (Dynamic RAM, DRAM), a synchronous DRAM (Synchronous DRAM, SDRAM), a double data rate SDRAM (Double Data Rate SDRAM, DDR SDRAM), an enhanced SDRAM (Enhanced SDRAM, ESDRAM), a synchronous link DRAM (Synch Link DRAM, SLDRAM), or a direct rambus RAM (Direct Rambus RAM, DRRAM). The memory 1109 in the embodiments of the present application includes, but is not limited to, these and any other suitable types of memory.
The processor 1110 may include one or more processing units; optionally, the processor 1110 integrates an application processor and a modem processor, where the application processor mainly processes operations involving the operating system, user interface, application programs, and the like, and the modem processor, such as a baseband processor, mainly processes wireless communication signals. It will be appreciated that the modem processor may alternatively not be integrated into the processor 1110.
The embodiment of the application also provides a readable storage medium, and the readable storage medium stores a program or an instruction, which when executed by a processor, implements each process of the above-mentioned display method embodiment, and can achieve the same technical effects, so that repetition is avoided, and no further description is provided here.
The processor is the processor in the electronic device of the above embodiment. The readable storage medium includes a computer readable storage medium, such as a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
In addition, an embodiment of the present application further provides a chip, where the chip includes a processor and a communication interface, the communication interface is coupled with the processor, and the processor is configured to run programs or instructions to implement the processes of the embodiment of the display method described above and achieve the same technical effects; to avoid repetition, details are not described here again.
It should be understood that the chip referred to in the embodiments of the present application may also be referred to as a system-on-chip, a system chip, a chip system, or a system-on-a-chip, etc.
Embodiments of the present application provide a computer program product stored in a storage medium, where the program product is executed by at least one processor to implement the respective processes of the embodiments of the display method described above, and achieve the same technical effects, and for avoiding repetition, a detailed description is omitted herein.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element preceded by the phrase "comprising a …" does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
Furthermore, it should be noted that the scope of the methods and apparatus in the embodiments of the present application is not limited to performing the functions in the order shown or discussed, but may also include performing the functions in a substantially simultaneous manner or in an opposite order depending on the functions involved, e.g., the described methods may be performed in an order different from that described, and various steps may be added, omitted, or combined. Additionally, features described with reference to certain examples may be combined in other examples.
From the above description of the embodiments, it will be clear to those skilled in the art that the methods of the above embodiments may be implemented by means of software plus a necessary general hardware platform, or by means of hardware, though in many cases the former is the preferred implementation. Based on such understanding, the technical solution of the present application may be embodied essentially in the form of a computer software product stored on a storage medium (e.g., ROM/RAM, magnetic disk, optical disk), including instructions for causing a terminal (which may be a mobile phone, a computer, a server, a network device, or the like) to perform the methods of the embodiments of the present application.
The embodiments of the present application have been described above with reference to the accompanying drawings, but the present application is not limited to the above-described embodiments, which are merely illustrative and not restrictive. In light of the present application, those of ordinary skill in the art may make many further forms without departing from the spirit of the present application and the scope of the claims, all of which fall within the protection of the present application.

Claims (12)

1. A display method, comprising:
receiving a first input of a user under the condition of displaying a first interface in a full screen mode, wherein the first input is used for triggering a second interface to be displayed in a floating window;
responsive to the first input, determining a target control in the second interface;
and displaying the target control in a floating manner on the first interface.
2. The method of claim 1, wherein the hovering the target control over the first interface comprises:
acquiring display content of the first interface;
and displaying the target control in a suspension manner at a target position, wherein the target position is the position with the minimum display content.
3. The method of claim 1 or 2, wherein the target controls comprise a first target control and a second target control;
the floating display of the target control on the first interface includes:
displaying the first target control at a first position of the first interface and displaying the second target control at a second position of the first interface; wherein the first position and the second position are different.
4. The method of claim 1, wherein the determining, in response to the first input, a target control in the second interface comprises:
responding to the first input, and displaying N controls in the second interface, wherein the N controls are all controls displayed in the second interface;
and under the condition that the selection input of the user to at least one control in the N controls is received, determining the at least one control as the target control.
5. The method of claim 4, wherein after hovering the target control over the first interface, the method further comprises:
displaying a selection list of the N controls under the condition that the first interface is switched to be displayed as a third interface;
and under the condition that selection input of a user on a first control in the N controls is received, the first control is displayed on the third interface in a floating mode.
6. A display device, comprising:
the receiving module is used for receiving a first input of a user under the condition of displaying the first interface in a full screen mode, wherein the first input is used for triggering the second interface to be displayed in a floating window;
a determining module for determining a target control in the second interface in response to the first input;
and the display module is used for displaying the target control on the first interface in a floating manner.
7. The apparatus of claim 6, wherein the display apparatus further comprises an acquisition module; wherein,,
the acquisition module is used for acquiring the display content of the first interface;
the display module is further used for displaying the target control in a suspended mode at a target position, wherein the target position is the position with the minimum display content.
8. The apparatus of claim 6 or 7, wherein the display module is further configured to display a first target control in a first position of the first interface and to display a second target control in a second position of the first interface if the target controls include the first target control and the second target control;
wherein the first position and the second position are different.
9. The apparatus of claim 6, wherein the display module is further configured to display N controls in the second interface in response to the first input, wherein the N controls are all controls displayed in the second interface;
the determining module is further configured to determine, when receiving a selection input from a user to at least one control of the N controls, the at least one control as the target control.
10. The apparatus of claim 9, wherein the display module is further configured to display a selection list of the N controls if the first interface is displayed as a third interface; the display module is further configured to, when receiving a selection input of a user to a first control of the N controls, hover-display the first control on the third interface.
11. An electronic device, comprising: a processor, a memory and a program or instructions stored on the memory and executable on the processor, which when executed by the processor, implement the steps of the display method of any one of claims 1-5.
12. A readable storage medium, characterized in that the readable storage medium has stored thereon a program or instructions which, when executed by a processor, implement the steps of the display method according to any of claims 1-5.
CN202310626828.7A 2023-05-30 2023-05-30 Display method, device, equipment and storage medium Pending CN116627312A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310626828.7A CN116627312A (en) 2023-05-30 2023-05-30 Display method, device, equipment and storage medium


Publications (1)

Publication Number Publication Date
CN116627312A true CN116627312A (en) 2023-08-22

Family

ID=87637925

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310626828.7A Pending CN116627312A (en) 2023-05-30 2023-05-30 Display method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN116627312A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination