WO2019085810A1 - Processing method, device and apparatus, and machine-readable medium - Google Patents

Info

Publication number: WO2019085810A1
Authority: WIPO (PCT)
Prior art keywords: control, sliding, interface, sliding direction, present application
Application number: PCT/CN2018/111814
Other languages: English (en), Chinese (zh)
Inventors: 康琳, 许侃
Original Assignee: 阿里巴巴集团控股有限公司
Application filed by: 阿里巴巴集团控股有限公司

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484: Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, using a touch-screen or digitiser, e.g. input of commands through traced gestures

Definitions

  • the present application relates to the field of terminal technologies, and in particular, to a processing method, a processing device, an apparatus, and a machine readable medium.
  • terminals with touch screens have become popular in daily life, such as mobile phones and tablet computers.
  • physical keys such as a return key, a home (back to desktop) key, and a menu key, are usually disposed under the touch screen to implement the navigation function of the terminal through the above physical keys.
  • the existing terminal needs to occupy certain bezel space to place the physical keys, which makes the screen-to-body ratio of the terminal (the ratio of the touch-screen area to the area of the terminal's front panel) low.
  • the technical problem to be solved by the embodiments of the present application is to provide a processing method, which can improve the screen ratio of the terminal.
  • the embodiment of the present application further provides a processing device, a device, and a machine readable medium to ensure implementation and application of the foregoing method.
  • a processing method, including: displaying a control; receiving a sliding operation for the control; and performing an operation corresponding to the sliding direction according to the sliding direction of the sliding operation.
  • the embodiment of the present application further discloses a processing apparatus, including:
  • a display module configured to display a control;
  • a receiving module configured to receive a sliding operation for the control
  • An operation execution module is configured to perform an operation corresponding to the sliding direction according to a sliding direction of the sliding operation.
  • the embodiment of the present application also discloses an apparatus, including: one or more processors; and one or more machine-readable media on which instructions are stored that, when executed by the one or more processors, cause the apparatus to perform one or more of the methods described above.
  • Embodiments of the present application also disclose one or more machine-readable media having stored thereon instructions that, when executed by one or more processors, cause the device to perform one or more of the methods described above.
  • the embodiment of the present application further discloses an operating system for a device, including:
  • a display unit for displaying a control
  • a receiving unit configured to receive a sliding operation for the control
  • An operation unit configured to perform an operation corresponding to the sliding direction according to a sliding direction of the sliding operation.
  • the embodiment of the present application includes the following advantages:
  • the embodiment of the present application implements the navigation function of the terminal through a control displayed in the interface, which avoids the excessive wear of physical keys caused by frequent use, thereby prolonging the service life of the terminal; it also saves the bezel space used for placing physical keys, which can further increase the screen-to-body ratio of the terminal.
  • the display state may include: the number of controls, the shape of the control, the size of the control, and the like.
  • the embodiment of the present application can implement the navigation of the terminal by using one control, so that the user does not need to memorize the locations of multiple physical keys, which reduces the difficulty of operation; furthermore, a control with a smaller area can be realized by a smaller control size, so that the control occupies less touch-screen space and more space is released to the touch-screen interface, thereby increasing the space for information presentation.
  • FIG. 1 is a schematic diagram of a display control on a terminal according to an embodiment of the present application.
  • FIG. 2 is a schematic diagram of another display control on a terminal according to an embodiment of the present application.
  • FIG. 3 is a flow chart showing the steps of an embodiment of a processing method of the present application.
  • FIG. 4 is a schematic diagram of multiple view layers of an embodiment of the present application.
  • FIG. 5 is a schematic diagram of navigation through a control according to an embodiment of the present application.
  • FIG. 6 is a flow chart showing the steps of an embodiment of a processing method of the present application.
  • FIG. 7 is a schematic diagram of a navigation process according to an embodiment of the present application.
  • FIG. 8 is a schematic diagram of an interface of an embodiment of the present application.
  • FIG. 9 is a schematic diagram of an interface of an embodiment of the present application.
  • FIG. 11 is a flow chart showing the steps of an embodiment of a processing method of the present application.
  • FIG. 12(a) and FIG. 12(b) are schematic views respectively showing a display control of an embodiment of the present application.
  • FIG. 13(a) and FIG. 13(b) are schematic views respectively showing a display control of an embodiment of the present application.
  • FIG. 15(a) and FIG. 15(b) are schematic views respectively showing a display control of an embodiment of the present application.
  • FIG. 16(a) and FIG. 16(b) are schematic views respectively showing a display control of an embodiment of the present application.
  • FIG. 17(a) and FIG. 17(b) are schematic views respectively showing a display control of an embodiment of the present application.
  • FIG. 19 is a schematic diagram of a plurality of objects in an embodiment of the present application.
  • FIG. 21 is a schematic diagram of a switching object according to an embodiment of the present application.
  • FIG. 22 is a schematic diagram of another switching object according to an embodiment of the present application.
  • FIG. 23 is a schematic diagram of still another switching object in the embodiment of the present application.
  • FIG. 24 is a flow chart showing the steps of an embodiment of a processing method of the present application.
  • FIG. 25 is a schematic diagram of still another switching object in the embodiment of the present application.
  • FIG. 26 is a block diagram showing the structure of an embodiment of a processing apparatus of the present application.
  • FIG. 27 is a schematic structural diagram of hardware of a device according to an embodiment of the present disclosure.
  • FIG. 29 is a schematic diagram of an operating system according to an embodiment of the present application.
  • the embodiment of the present application provides a processing scheme, which can display a control, receive a sliding operation for a control, and perform an operation corresponding to the sliding direction according to a sliding direction of the sliding operation.
  • a control may refer to a component that provides or implements a user interface function;
  • a control is an encapsulation of data and methods;
  • a control may have its own attributes and methods.
  • the control can be displayed at the lower, upper, left, or right edge of the screen.
  • the above control may be displayed at any position in the middle of the screen, and the specific position of the above control in the screen is not limited in the embodiment of the present application.
  • the embodiment of the present application implements the navigation function of the terminal through a control displayed in the interface, which avoids the excessive wear of physical keys caused by frequent use, thereby prolonging the service life of the terminal; it also saves the bezel space used for placing physical keys, which can further increase the screen-to-body ratio of the terminal.
  • the display state may include: the number of controls, the shape of the control, the size of the control, and the like.
  • the control may be in the shape of a rectangle, a racetrack, a bar, or the like; the size of the control may include: a control width, a control height, and the like.
  • the embodiment of the present application can implement the navigation of the terminal by using one control, so that the user does not need to memorize the locations of multiple physical keys, which reduces the difficulty of operation; furthermore, a control with a smaller area can be realized by a smaller control size, so that the control occupies less touch-screen space and more space is released to the touch-screen interface, thereby increasing the space for information presentation.
  • the terminal 1 includes a touch screen 2, and a control 3 is displayed below the touch screen 2.
  • the shape of the control 3 is a racetrack shape.
  • the terminal 1 includes a touch screen 2, and a racetrack-shaped area is displayed at the bottom of the touch screen 2.
  • the racetrack-shaped area is provided with a control 4, a control 5, and a control 6.
  • the embodiments of the present application can be applied to a terminal having a touch screen, including but not limited to: a smart phone, a tablet computer, an e-book reader, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, a laptop portable computer, a car computer, a desktop computer, a set-top box, a smart TV, a wearable device, and the like.
  • the touch screen of the terminal may be a full screen.
  • the full screen adopts no border or ultra-narrow bezel design, and pursues a screen ratio of nearly 100%.
  • the screen ratio of the full screen is usually above 90%.
  • the touch screen of the terminal may also be a non-full screen.
  • the embodiment of the present application does not limit the specific touch screen of the terminal.
  • FIG. 3 a flow chart of steps of an embodiment of a processing method of the present application is shown. Specifically, the method may include the following steps:
  • Step 301: Display a control.
  • Step 302: Receive a sliding operation for the control.
  • Step 303: Perform an operation corresponding to the sliding direction according to a sliding direction of the sliding operation.
  • the embodiment of the present application can display a control in the case of a display interface.
  • the control may be displayed by an operating system or an application.
  • the embodiment of the present application does not limit the specific execution body corresponding to the display control.
  • the display control may specifically include: displaying the control at a lower edge, an upper edge, a left edge, or a right edge of the interface.
  • the foregoing control may be displayed at any position in the middle of the interface, and the specific position of the above control in the interface is not limited in the embodiment of the present application.
  • the embodiment of the present application can support the operation corresponding to the sliding direction by the sliding operation for the control, thereby implementing the navigation function of the terminal.
  • the sliding operation may be a sliding operation for the control; it may be an operation of the user's finger or a stylus on the control, and the sliding operation may cause a property of the control to change. The change may include at least one of the following:
  • the generating of the displacement may include: according to the sliding direction of the sliding operation, the control generates a displacement toward the sliding direction or the opposite direction of the sliding direction, wherein the speed of the displacement may match the speed of the sliding operation.
  • the speed of the sliding operation may be obtained according to the sliding distance of the sliding operation during the monitoring period.
  • the above-mentioned monitoring period can be determined by a person skilled in the art according to the actual application requirements.
  • the monitoring period can be a period in which the event listener monitors the operation event, etc., and it can be understood that the specific monitoring period is not limited in the embodiment of the present application.
  • the sliding distance of the sliding operation in the monitoring period may be determined according to the difference between the operation coordinates corresponding to the end time and the start time of the monitoring period.
  • for example, the difference between the operation coordinates corresponding to the end time and the start time of the monitoring period (currentX - previousX) may be taken as the sliding distance of the sliding operation during the monitoring period.
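As a concrete illustration of the computation above, the following sketch derives the sliding speed from the coordinate difference over one monitoring period; the function name and the pixels-per-millisecond unit are assumptions for illustration, not part of the embodiment.

```python
def sliding_speed(previous_x: float, current_x: float, monitor_period_ms: float) -> float:
    """Estimate sliding speed (px/ms) over one monitoring period.

    The sliding distance is the difference between the operation coordinates
    at the end and the start of the period (currentX - previousX); the sign
    encodes the sliding direction along the x axis.
    """
    if monitor_period_ms <= 0:
        raise ValueError("monitoring period must be positive")
    return (current_x - previous_x) / monitor_period_ms

# The touch point moved from x=100 to x=180 within a 16 ms monitoring period.
speed = sliding_speed(100.0, 180.0, 16.0)  # 5.0 px/ms
```

An event listener would call such a function once per monitoring period and feed the result to the displacement animation, so the displacement speed can equal, trail, or exceed the sliding speed as described below.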
  • the speed of the displacement may be obtained according to the speed of the sliding operation, and matching the speed of the displacement with the speed of the sliding operation may specifically include: the speed of the displacement is equal to the speed of the sliding operation, the speed of the displacement is smaller than the speed of the sliding operation, or the speed of the displacement is greater than the speed of the sliding operation.
  • the above control may include a number of icons (computer graphics with explicit meanings), all or some of which may be displaced with the sliding operation.
  • the icon on the control generating a displacement may include: according to the sliding direction of the sliding operation, the icon on the control is displaced toward the sliding direction or the opposite direction of the sliding direction, wherein the speed of the displacement can match the speed of the sliding operation.
  • the above control may include: a dot icon or a rectangular icon, and the dot icon or the rectangular icon may be displaced with the sliding operation.
  • the control being compressed or stretched may include: the size of the control is reduced or increased according to the sliding direction of the sliding operation, wherein the reducing speed or the increasing speed may match the speed of the sliding operation.
  • the shape of the control may be a strip shape, and the width of the strip may gradually increase or decrease with the sliding operation, and the like.
  • the display appearance of the control is changed, and may include: changing a display appearance of a complete area of the control according to the sliding operation, or changing a display appearance of the sliding area of the control.
  • the display appearance parameter of the control may include at least one of a color parameter, a shape parameter, and a font parameter.
  • the display appearance of the control can be changed by changing the display appearance parameters of the control. For example, the transparency of the full area of the control may fade with the sliding operation; or the color of the slid-over area of the control may fade with the sliding operation, and the like.
  • the display appearance outside the control may be an interface appearance other than the control.
  • the display appearance outside the control is changed, and may include: an interface element other than the control changes in a sliding direction of the sliding operation.
  • the interface can create a shadow or the like in the sliding direction of the control or in the opposite direction of the sliding operation.
  • the attribute changes of the control corresponding to change 1 to change 5 are only examples; a person skilled in the art can change any attribute of the control by the sliding operation according to actual application requirements, and the specific process of changing any attribute is not limited in the embodiment of the present application.
  • the operation corresponding to the sliding direction may be used to jump between multiple view layers or jump within a view layer, thereby implementing the navigation function of the terminal.
  • the view layer is a functional layer provided by the operating system, and one view layer can be used to implement the corresponding function.
  • the plurality of view layers may include one or more of the following layers: a global notification control layer, a lock screen layer, an application layer, a desktop layer, and a global voice assistant layer.
  • the global notification control layer can be used to provide a global notification and control function.
  • the global notification function can include: a control panel function.
  • the control panel function can include: a notification function of a message
  • the global control function can include: setting functions of the terminal such as network, display, and application launching.
  • the lock screen layer can be used to provide operational functions in a lock-screen state.
  • the application layer is the view layer corresponding to the application and is used to run the application.
  • the desktop layer is used to run the desktop.
  • the global voice assistant layer can be used to provide a global voice control function, which can include: controlling, through voice, the opening of an application, the return of an interface, and operations on interface content (such as dialing). It can be understood that the embodiments of the present application do not limit the specific view layers.
  • A page is an interface. FIG. 4 shows an illustration of multiple view layers in the embodiment of the present application.
  • the multiple view layers include, in order from top to bottom, a global notification control layer, a lock screen layer, an application layer, a desktop layer, and a global voice assistant layer. It can be understood that the order of the multiple view layers shown in FIG. 4 is only an optional embodiment; in fact, the specific order of the multiple view layers is not limited in the embodiment of the present application.
  • the performing the operation corresponding to the sliding direction according to the sliding direction of the sliding operation may specifically include:
  • if the sliding direction is a first direction, switching from the first view layer to a second view layer, wherein the second view layer is the adjacent view layer above the first view layer; or
  • if the sliding direction is a second direction, switching from the first view layer to a second view layer, wherein the second view layer is the adjacent view layer below the first view layer.
  • the first direction is a left-to-right direction
  • it is possible to switch from the first view layer to the second view layer above the first view layer, for example, switching from the lock screen layer to the global notification control layer.
  • correspondingly, the first view layer can be switched to the adjacent second view layer below it, such as switching from the application layer to the desktop layer.
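The layer switching described above amounts to moving one step up or down an ordered stack. A minimal sketch follows, assuming the FIG. 4 ordering; the layer names and the direction labels "first" and "second" are taken from the text, and the function itself is illustrative, not an operating-system API.

```python
# View layers ordered top to bottom, following FIG. 4.
VIEW_LAYERS = [
    "global_notification_control",
    "lock_screen",
    "application",
    "desktop",
    "global_voice_assistant",
]

def switch_layer(current: str, direction: str) -> str:
    """Switch to the adjacent layer above ("first") or below ("second").

    Stays on the current layer when no adjacent layer exists.
    """
    i = VIEW_LAYERS.index(current)
    if direction == "first" and i > 0:
        return VIEW_LAYERS[i - 1]      # adjacent layer above
    if direction == "second" and i < len(VIEW_LAYERS) - 1:
        return VIEW_LAYERS[i + 1]      # adjacent layer below
    return current

# As in the example above: lock screen layer -> global notification control layer.
layer = switch_layer("lock_screen", "first")
```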
  • the performing of a jump within a view layer may specifically include: jumping between multiple interfaces inside the view layer. Taking the application layer as an example, jumps can be made between different interfaces of one application and between interfaces of different applications.
  • the plurality of interfaces may be horizontally ordered in a set order.
  • the setting sequence may be a loading order of the interface, and the loading sequence is specifically a sorting of the loading time of the interface.
  • the interface may sequentially include: interface 1, interface 2, ... interface i, interface n, etc., where i and n are positive integers.
  • the performing the operation corresponding to the sliding direction according to the sliding direction of the sliding operation may specifically include:
  • if the sliding direction is a first direction, switching from a first interface of the first view layer to a second interface of the first view layer, wherein the second interface is the adjacent interface on the left side of the first interface; or
  • if the sliding direction is a second direction, switching from the first interface of the first view layer to the second interface of the first view layer, wherein the second interface is the adjacent interface on the right side of the first interface.
  • the first direction is the direction from left to right
  • the second direction is the right-to-left direction
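Within a view layer the same one-step pattern applies to the horizontally ordered interfaces. The sketch below assumes interfaces sorted left to right by loading time; the names and signature are illustrative only.

```python
def switch_interface(interfaces, current, direction):
    """Jump between horizontally ordered interfaces inside one view layer.

    `interfaces` are ordered left to right (e.g. by loading time); the
    "first" direction (left-to-right slide) moves to the adjacent interface
    on the left, the "second" direction (right-to-left slide) to the one on
    the right.
    """
    i = interfaces.index(current)
    if direction == "first" and i > 0:
        return interfaces[i - 1]
    if direction == "second" and i < len(interfaces) - 1:
        return interfaces[i + 1]
    return current

pages = ["interface_1", "interface_2", "interface_3"]
left_neighbor = switch_interface(pages, "interface_2", "first")  # "interface_1"
```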
  • Referring to FIG. 5, navigation through a control is illustrated in an embodiment of the present application, in which a sliding operation for the control 501 can be received, and an operation corresponding to the sliding direction is performed according to the sliding direction of the sliding operation.
  • the sliding direction is a first direction, and the operation corresponding to the sliding direction is an operation of displaying a recent task list;
  • the sliding direction is a second direction, and the operation corresponding to the sliding direction is a return operation;
  • the sliding direction is a third direction, and the operation corresponding to the sliding direction is an operation of calling up a global notification control layer;
  • the sliding direction is a fourth direction, and the operation corresponding to the sliding direction is an operation of calling up the desktop layer.
  • the first direction may be a right-to-left direction
  • the second direction may be a left-to-right direction
  • the third direction may be a bottom-up direction
  • the fourth direction may be a top-to-bottom direction.
  • the returning operation may include: returning to the previous interface, returning to the previous application, or returning to the preset view layer.
  • a person skilled in the art can determine a corresponding preset view layer for a view layer according to actual application requirements, and return a preset view layer corresponding to the view layer according to the return operation.
  • the preset view layer corresponding to the application layer can be the desktop layer.
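The four-direction mapping of FIG. 5 can be sketched as a classification of the slide vector followed by a table lookup; the dominant-axis rule and the action names below are assumptions for illustration.

```python
def classify_direction(dx: float, dy: float) -> str:
    """Classify a slide by its dominant axis (screen y grows downward)."""
    if abs(dx) >= abs(dy):
        return "first" if dx < 0 else "second"   # right-to-left / left-to-right
    return "third" if dy < 0 else "fourth"       # bottom-up / top-down

# Operations corresponding to the four sliding directions, per FIG. 5.
ACTIONS = {
    "first": "show_recent_task_list",
    "second": "go_back",
    "third": "call_up_global_notification_control_layer",
    "fourth": "call_up_desktop_layer",
}

# A mostly upward slide (negative dy) calls up the global notification control layer.
action = ACTIONS[classify_direction(dx=3.0, dy=-40.0)]
```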
  • the processing method of the embodiment of the present application implements the navigation function of the terminal through the control displayed in the interface, which avoids the excessive wear of physical keys caused by frequent use, thereby prolonging the service life of the terminal; it also saves the bezel space used for placing physical keys, which can further increase the screen-to-body ratio of the terminal.
  • the display state may include: the number of controls, the shape of the control, the size of the control, and the like.
  • the embodiment of the present application can implement the navigation of the terminal by using one control, so that the user does not need to memorize the locations of multiple physical keys, which reduces the difficulty of operation; furthermore, a control with a smaller area can be realized by a smaller control size, so that the control occupies less touch-screen space and more space is released to the touch-screen interface, thereby increasing the space for information presentation.
  • FIG. 6 a flow chart of steps of an embodiment of a processing method of the present application is shown, which may specifically include the following steps:
  • Step 601: Display a control.
  • Step 602: Receive a sliding operation for the control.
  • Step 603: Perform an operation corresponding to the sliding direction according to a sliding direction of the sliding operation.
  • Step 604: Enter the desktop layer in response to a first operation for the control.
  • Step 605: Enter the global voice assistant layer in response to a second operation for the control.
  • Step 606: Enter the global notification control layer in response to a third operation.
  • the embodiment may also perform jumps between the plurality of view layers according to the first operation or the second operation for the control; specifically, it may switch to the desktop layer according to the first operation for the control, or switch to the global voice assistant layer according to the second operation for the control, thereby enabling quick jumps between multiple view layers.
  • the embodiment may further perform a jump between the plurality of view layers according to the third operation; specifically, in response to the third operation, the global notification control layer is entered, whereby a quick jump can be made between the plurality of view layers.
  • the first operation and the second operation may be non-sliding operations, for example, the first operation may be a click operation, and the second operation may be a long press operation or the like.
  • the third operation may be an operation that is not directed at the control.
  • for example, the third operation may be an overall sliding operation at the bottom of the touch screen. It can be understood that the embodiments of the present application do not limit the specific first operation, second operation, and third operation.
  • FIG. 7 a schematic diagram of a navigation process according to an embodiment of the present application is illustrated, in which a corresponding navigation operation may be performed according to a user operation:
  • A long-press operation evokes the global voice assistant layer when it is detected that the user has long-pressed the control without effective displacement.
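The dispatch in FIG. 7 distinguishes a click, a long press without effective displacement, and a slide. A sketch of that classification follows; the thresholds `long_press_ms` and `slop_px` are invented here to model "no effective displacement" and are not values from the embodiment.

```python
def classify_gesture(duration_ms: float, displacement_px: float,
                     long_press_ms: float = 500.0, slop_px: float = 10.0) -> str:
    """Distinguish the operations on the control described above.

    "No effective displacement" is modeled as total movement below
    `slop_px`; both thresholds are illustrative assumptions.
    """
    if displacement_px >= slop_px:
        return "slide"        # handled further by the sliding direction
    if duration_ms >= long_press_ms:
        return "long_press"   # evokes the global voice assistant layer
    return "click"            # e.g. enters the desktop layer

gesture = classify_gesture(duration_ms=600.0, displacement_px=2.0)  # long press
```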
  • the present embodiment details the display process of the control: by controlling the display state of the control, interference of the control with the interface content is weakened, so that the interface obtains a good overall visual impression.
  • the display state of the control may include:
  • Display state 1: the control is suspended above the interface;
  • Display state 2: the screen area where the control is located is different from the screen area where the interface is located;
  • Display state 3: the color value corresponding to the control matches the color value corresponding to the adjacent interface part;
  • a corresponding display state parameter may be set for the display state, and the setting interface corresponding to the display state parameter may be opened, so that the application or the operating system changes the display state of the control by setting the display state parameter.
  • the display state 1 may correspond to the first display state parameter
  • the display state 2 may correspond to the second display state parameter
  • the display state 3 may correspond to the third display state parameter; thus, if display state 1 is required, the display state parameter corresponding to the control may be set to the first display state parameter; if display state 2 is required, the display state parameter corresponding to the control may be set to the second display state parameter; and if display state 3 is required, the display state parameter corresponding to the control may be set to the third display state parameter.
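The parameter plumbing above can be sketched as follows; the parameter values and the setter name are hypothetical. The embodiment only requires that a display state parameter exists and that a setting interface for it is exposed to the application or operating system.

```python
# Hypothetical display-state parameter values (the actual encoding is not
# specified by the embodiment).
DISPLAY_STATE_1 = 1   # control suspended above the interface
DISPLAY_STATE_2 = 2   # control in its own screen area, separate from the interface
DISPLAY_STATE_3 = 3   # control color matched to the adjacent interface part

class Control:
    def __init__(self) -> None:
        self.display_state_parameter = DISPLAY_STATE_1

    def set_display_state_parameter(self, value: int) -> None:
        """Setting interface through which an application or the operating
        system changes the display state of the control."""
        if value not in (DISPLAY_STATE_1, DISPLAY_STATE_2, DISPLAY_STATE_3):
            raise ValueError("unknown display state parameter")
        self.display_state_parameter = value

control = Control()
control.set_display_state_parameter(DISPLAY_STATE_2)  # request display state 2
```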
  • Display state 1 can be applied to a content-type interface (such as a news interface): the interface can be maximized, and the control can be suspended above the interface without occupying extra display space of the touch screen, thereby giving the user a better field of view.
  • a content-type interface such as a news interface
  • the control 801 can be suspended above the interface.
  • the control may be suspended on the application interface by setting the size of the application interface to the screen size and displaying the navigation control on the application interface.
  • when the size of the application interface is set to the screen size, the interface is maximized.
  • the transparency value of the backplane where the control is located may conform to a preset condition to implement display of interface content of the area where the backplane is located.
  • the backplane can be used to carry the control.
  • the display layer of the interface includes a control layer, a backplane layer and an interface content layer from top to bottom.
  • the control layer is used to display the control and is located at the uppermost layer; the backplane layer is located at the middle layer; and the interface content layer is located at the bottom layer.
  • the transparency of the backplane layer may be fully transparent or translucent to achieve display of the interface content of the area where the backplane is located.
  • the transparency value of the backplane of the control may be in the range of [0.5, 1]. It can be understood that the embodiment of the present application does not limit the transparency value of the backplane of the control.
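One way to read the [0.5, 1] backplane transparency range is as simple alpha compositing of the backplane layer over the interface content layer. The sketch below assumes that convention (1 meaning fully transparent), which is an interpretation for illustration, not something the embodiment specifies.

```python
def composite(content_rgb, backplane_rgb, transparency):
    """Blend a translucent backplane over interface content, per channel.

    `transparency` follows the text's convention that 1 is fully
    transparent, so the backplane contributes with weight (1 - transparency).
    """
    if not 0.0 <= transparency <= 1.0:
        raise ValueError("transparency must be in [0, 1]")
    a = 1.0 - transparency
    return tuple(a * b + (1.0 - a) * c
                 for b, c in zip(backplane_rgb, content_rgb))

# A fully transparent backplane (value 1.0) leaves the interface content visible.
shown = composite((200, 100, 50), (0, 0, 0), 1.0)
```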
  • display state 2 can be applied to an operation-type interface: operation controls are already present at the bottom of an operation-type interface, so the control of the embodiment of the present application cannot be suspended there. Therefore, in the embodiment of the present application, the screen area where the control is located may be different from the screen area where the interface is located; in this case, the interface may be in a non-maximized state.
  • Referring to FIG. 9, a schematic diagram of an interface of the embodiment of the present application is shown, where the screen area where the control 901 is located is A, the screen area where the interface is located is B, A and B are both parts of the touch screen, and A is located at the bottom of the touch screen.
• The color value corresponding to the control may match the color value corresponding to the adjacent interface portion.
• The adjacent interface portion may be an interface portion adjacent to the control. Since the color value corresponding to the control matches the color value corresponding to the adjacent interface portion, an effect of the control being immersed in the interface can be presented.
• The color value corresponding to the control may be determined by: segmenting the control and the adjacent interface portion, and determining, according to the color value corresponding to each segment of the adjacent interface portion, the color value corresponding to the corresponding segment of the control.
• The process of determining the color value corresponding to a segment of the control may include: acquiring, for the corresponding segment of the adjacent interface portion, the color values of a plurality of pixel points included in that segment; if the color values of the plurality of pixels match, using the color value of one of those pixels as the color value of the segment of the control; or, if the color values of the plurality of pixels do not match, using the average of the color values of the plurality of pixel points as the color value of the segment of the control.
• In other words, the color value at the bottom of the interface can be obtained and applied to the background strip of the control (e.g., the backplane).
• Specifically, a picture of the bottom of the interface may be captured in real time and sampled at multiple points per segment. If the sampled values are consistent (a solid-color background), the color value is applied directly to the control; if the sampled values differ (a multi-color background), the per-segment mean is passed to the control for gradient processing. This makes the interface visually complete and immersive.
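The sampling-and-averaging procedure above can be sketched as follows; this is a minimal illustration, assuming RGB tuples and an exact-match test for the "consistent" case (the patent does not specify a tolerance):

```python
def segment_color(pixel_colors, tolerance=0):
    """Determine the color a control segment should take from the pixels
    sampled in the corresponding segment of the adjacent interface portion.

    If all sampled pixel colors match (a solid-color background), one of
    them is used directly; otherwise (a multi-color background) the
    per-channel average is used, which the control can then render as a
    gradient across its segments.
    """
    first = pixel_colors[0]
    uniform = all(
        all(abs(c - f) <= tolerance for c, f in zip(color, first))
        for color in pixel_colors
    )
    if uniform:
        return first
    n = len(pixel_colors)
    return tuple(sum(color[i] for color in pixel_colors) // n for i in range(3))
```

Running this once per segment yields the list of colors applied to the control's background strip.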
• Display state 3 can be applied to a full-screen interface, such as a game interface, a video interface, or a full-screen reading interface, and can reduce the interference of the control with the full-screen interface.
• Referring to FIG. 10, a comparison of a control of the embodiment of the present application in hidden and displayed states is shown, wherein (b) shows the effect when the control 1001 is displayed, and (a) shows the effect when the control 1001 is hidden.
• The process of displaying the control may include: displaying the control if a preset operation is received while the control is hidden.
  • the embodiment of the present application can trigger the display of the control by a preset operation.
• The preset operation may be an operation of sliding inward from the edge of the screen, or a press operation with a preset pressing force, or a double-click operation, etc.; the embodiment of the present application does not limit the specific preset operation.
• The control may be located at any position such as the upper edge, left edge, or right edge of the interface. The control can hover over the interface and be moved to a desired position. Different navigation functions can be implemented by clicking, double-clicking, long-pressing, pressing, or sliding.
• Alternatively, the control can always occupy a fixed space at the bottom and remain unchanged regardless of the interface content, or the display state parameters of the control can be set according to the interface content to achieve different display states of the control.
• The control can display multiple buttons in response to clicking, long-pressing, double-clicking, or sliding.
• Referring to FIG. 11, a flow chart of steps of an embodiment of a processing method of the present application is shown, which may specifically include the following steps:
• Step 1101: displaying a control;
• Step 1102: if a setting condition is satisfied, changing the display state of the control, and changing at least part of the function that responds when the control is operated.
• The embodiment of the present application can add a new function to the control by changing its display state and changing at least part of the function that responds when the control is operated. Since the space occupied by the control can be multiplexed, the interface of the application can be simplified and the space for information presentation can be increased.
• The embodiment of the present application can prompt the user about an event by displaying the state-changed control, so that the user can choose whether to view the event, improving the user experience. Moreover, the embodiment of the present application may not occupy additional screen space, and thus may increase the space for information presentation.
• An event may refer to information generated in response to some action.
  • the event can include: system events and/or application events, and the like.
  • the event can include: events from the user, events from the hardware, or events from the software, and the like.
  • system events may include: powering on, powering off, opening an application, switching applications, entering screen savers, exiting screen savers, power, network connections, system notifications, etc.
• Application events may include: current interface content of the application, events received by the application, messages generated by the application, etc.
• The event may also be a message from an IOT (Internet of Things) device.
• The IOT device may include a home device or the like. Taking an air conditioner as an example of a home device: if the air-conditioner temperature is lower than a set value, a message indicating that the temperature is below the set value can be pushed to the mobile phone, and in this case, the control of the mobile phone can be changed to a display state corresponding to temperature adjustment of the air conditioner.
  • the message may include: a recommendation message and/or a notification message, and the like.
  • the recommendation message may be from an application, such as a current application or a non-current application; the notification message may be from an operating system or an application or a cloud.
  • a person skilled in the art may change the display state of the control for the recommendation message according to the actual application requirement, or change the display state of the control for the notification message, wherein the display state of the control corresponding to the recommendation message and the display state of the control corresponding to the notification message may be the same. Or different.
  • the changing the display state of the control may include: changing a style attribute of the control.
  • the appearance of the control can be defined by a style attribute, so that the display state of the control can be changed by configuring the style attribute.
  • the foregoing style attribute may specifically include at least one of the following parameters: an icon parameter, a color parameter, a size parameter, and a text parameter.
  • the changing the style attribute of the control may include: replacing the original icon of the control with a new icon, and/or adding a new icon to the control.
• Referring to FIG. 12(a) and FIG. 12(b), schematic diagrams of a display control of an embodiment of the present application are shown, wherein FIG. 12(a) shows the control 1201 before the display state change, and FIG. 12(b) shows the control 1202 after the display state change, wherein the icon of the control 1202 has changed relative to the control 1201.
  • the function of the control can be configured to remain unchanged, or can be configured to change the function.
• For example, the changed control displays a "recommended" icon, its function may be a recommendation function, and when the user clicks the control, display of the recommended information is triggered.
• The icon displayed on the control can have a certain reminder function; as shown in FIG. 12(b), the control indicates the recommendation function through the "recommended" icon. In another example, the changed control can display a "search" icon, the corresponding function can be a search function, and when the user clicks the control, the search function is triggered.
• The icon displayed on the control can be text, an image, a shortcut icon, or the like, which are not listed here one by one; the specific display mode and the corresponding function can be flexibly configured according to actual needs.
  • Color parameters can be used to define the background color or foreground color of the control.
  • the control after the display state is changed may be marked and displayed by changing the color parameter corresponding to the background, so as to improve the recognition degree of the control after the display state is changed.
  • the background color of the control 1202 in FIG. 12(b) can be set to a highly recognizable color such as yellow or red.
  • Text parameters can be used to define the font of the control.
  • the display state of the control can be changed by changing the text parameters of the control (such as font, font size, text).
  • the text corresponding to the control before the status change is “return to the desktop”, and, for example, the text corresponding to the control after the status change is “notification message”.
• The size parameter can be used to define the display size of the control. It can be understood that the above icon parameters, color parameters, size parameters, and text parameters are only examples; in fact, those skilled in the art can determine the required style attributes according to actual application requirements, and then change the display state of the control through the required style attributes.
• The changing manner used to change at least part of the function that responds when the control is operated may specifically include:
• Change mode 1: retaining the original function of the control, and adding a new function to the control.
• Change mode 2: deleting all or part of the original function of the control, and adding a new function to the control.
• Referring to FIG. 13(a) and FIG. 13(b), schematic diagrams of a display control of the embodiment of the present application are shown, wherein FIG. 13(a) shows the control 1301 before the display state change, and FIG. 13(b) shows the control 1302 after the display state change. The original function of the control 1301 is a return function, and the functions of the control 1302 may include a return function and a search function. The icons of the control 1302 may include an icon 1321 and an icon 1322, which are respectively used to implement the return function and the search function.
• An example of change mode 2 is shown in FIG. 12(a) and FIG. 12(b): the original function of the control 1201 in FIG. 12(a) is a navigation function; all of the original functions of the control 1201 can be deleted and a new function, the recommendation function, can be added, so that the recommendation function can be implemented through the control 1202.
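The two change modes can be sketched with a toy control model; the class and method names below are hypothetical, not from the patent:

```python
class Control:
    """Minimal model of a control whose responding functions can change
    together with its display state."""

    def __init__(self, icon, functions):
        self.icon = icon
        self.functions = set(functions)

    def change_keep_and_add(self, new_icon, added):
        # Change mode 1: retain the original functions, add new ones
        # (cf. FIG. 13: "return" is kept, "search" is added).
        self.icon = new_icon
        self.functions |= set(added)

    def change_replace(self, new_icon, added, removed=None):
        # Change mode 2: delete all (removed=None) or part (removed=subset)
        # of the original functions, then add the new ones
        # (cf. FIG. 12: navigation -> recommendation).
        self.functions -= set(removed) if removed else set(self.functions)
        self.functions |= set(added)
        self.icon = new_icon
```

For instance, applying `change_replace` to a navigation-only control leaves it with only the newly added recommendation function.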
  • one display state of the control may correspond to one or more functions of the control, and those skilled in the art may determine the association between the display state and the function according to actual application requirements.
  • the recommended function and the notification function of the control may correspond to the same or different display states and the like.
• For example, a red background color of the control corresponds to the recommendation function of the control, and a green background color corresponds to the notification function of the control; or, for example, a red background color corresponds to both the recommendation function and the notification function of the control.
  • the embodiment of the present application does not limit the specific association between the display state and the function.
  • the prompt of the event may be sent to the user by displaying the control after the state change.
• The message may include: a recommendation message and/or a notification message. Accordingly, after at least part of the function is changed, the function that responds when the control is operated may include: a recommendation function and/or a notification function and/or a control function. Examples of the control function may include: controlling the brightness of the screen when the system generates a low-battery event, or controlling the temperature of the air conditioner upon receiving an event indicating that the air-conditioner temperature is lower than the set value.
• The method of the embodiment of the present application may further include: in response to a trigger operation of the user on the control after the display state change, executing the changed function corresponding to that control.
• For example, if a trigger operation of the user for the control 1202 is received, the recommendation function may be performed.
  • a recommendation interface may be displayed, and the recommendation interface may include a message requiring recommendation.
  • a notification function may be performed.
  • a notification interface may be displayed, and the notification interface may include a message requiring notification.
• In summary, the processing method of the embodiment of the present application can add a new function to the control by changing the display state of the control and changing at least part of the function that responds when the control is operated. Since the space occupied by the control can be reused, the interface of the application can be simplified and the space for information presentation can be increased.
• The embodiment of the present application may, upon receiving an event, change the display state of the control and change at least part of the function that responds when the control is operated, wherein the function that responds when the control is operated may include: a recommendation function and/or a notification function and/or a control function for a message.
• The embodiment of the present application may judge the received event and, when the received event satisfies the set condition, change the display state of the control and change at least part of the function that responds when the control is operated.
  • At least one of the steps included in the method of the embodiments of the present application may be performed by an operating system or an application.
• In the case of being executed by an application, the application may be the current application located in the foreground. After receiving a message, the current application may determine whether the message satisfies the set condition, and if so, change the control: change the display state of the control, and change at least part of the function that responds when the control is operated.
• In the case of being executed by the operating system, the operating system may determine whether the message satisfies the set condition, and if so, change the control in the interface: change the display state of the control, and change at least part of the function that responds when the control is operated.
  • the received event is associated with the current interface or the application service of the current interface;
  • the received event corresponds to the type of event registered in advance.
• The setting condition that the received event is associated with the current interface or the application service of the current interface can exclude messages (such as advertisement messages) that are unrelated to the current interface or its application service, so that the user is not disturbed by irrelevant messages.
  • An application service can refer to a service provided by an application.
  • Examples of application services may include: downloading novels, watching videos, inquiring about surrounding restaurants, booking air tickets, obtaining coupons, downloading ringtones, etc., to meet the daily life and work needs of the user.
• The received event being associated with the current interface or the application service of the current interface may specifically include:
• the received event corresponds to a first keyword, the current interface or the application service of the current interface corresponds to a second keyword, and the first keyword matches the second keyword; or
• the received event corresponds to a first application, the current interface corresponds to a current application, and the first application is associated with the current application; or
• the application service corresponding to the received event is associated with the application service of the current interface.
• The first keyword may be extracted from the received event; the current interface or the application service of the current interface corresponds to a second keyword, which may be provided by a service provider corresponding to the application service; and the first keyword is matched against the second keyword.
• The matching between the first keyword and the second keyword may include: the first keyword being the same as, similar to, or related to the second keyword.
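A minimal sketch of the keyword-matching test, assuming a plain synonym table stands in for whatever similarity/relatedness resource an implementation would actually use:

```python
def keywords_match(first, second, synonyms=None):
    """The first keyword matches the second if they are the same, or if
    they are listed as similar/related in the synonym table."""
    if first == second:
        return True
    synonyms = synonyms or {}
    return second in synonyms.get(first, ()) or first in synonyms.get(second, ())
```

In practice the table could be replaced by an embedding-based similarity score; the decision logic stays the same.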
  • the correlation between the applications may be determined in advance, and according to the correlation, whether the first application is associated with the current application is determined.
  • the degree of relevance between applications can be determined by the user. For example, the user can determine its associated application for an application, such as "WeChat” associated applications including "QQ” and the like.
  • the relevance between the applications may be determined according to the category to which the application belongs, for example, if the two applications belong to the same category, the two applications are associated.
• The correlation between applications can also be obtained according to the co-occurrence probability of the user's input content in two application environments. Assume that the number of the user's input contents in the two application environments is N, and the number of input contents that co-occur in both application environments is M; then M/N is taken as the co-occurrence probability, and the co-occurrence probability can be used directly as the correlation degree.
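The M/N computation can be sketched as follows; treating N as the number of distinct input contents across the two application environments is one reading of the definition above:

```python
def cooccurrence_correlation(inputs_app_a, inputs_app_b):
    """Correlation between two applications from the user's input contents:
    M/N, where N counts the distinct inputs entered across the two
    application environments and M counts those appearing in both."""
    all_inputs = set(inputs_app_a) | set(inputs_app_b)
    shared = set(inputs_app_a) & set(inputs_app_b)
    n, m = len(all_inputs), len(shared)
    return m / n if n else 0.0
```

For example, if the user entered {a, b, c} in one application and {b, c, d} in the other, then M = 2, N = 4, and the correlation degree is 0.5.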
• Through matching between the keyword of the application service corresponding to the received event and the keyword of the application service of the current interface, it can be determined whether the application service corresponding to the received event is associated with the application service of the current interface, which is not described in detail herein.
  • the application corresponding to the received event meets the preset priority condition, and the message corresponding to the application that meets the preset priority condition may be prompted.
• The preset priority condition may include: the priority of the application corresponding to the received event is higher than the priority of the current application; or the received event originates from the current application; or the application corresponding to the received event is a user-preset application, or the like. It can be understood that the embodiment of the present application does not limit the specific preset priority condition.
  • the user's personalized features can be used to represent the user's unique characteristics or user tags, thereby providing the current user with an information service that meets their individual needs.
  • the personalized feature may specifically include at least one of the following: a user attribute feature, a preference feature.
  • the user attribute feature may include: a relatively stable static attribute feature, such as a user's age, gender, region, education, learning stage, grade stage, business circle, occupation, marriage, consumption level, and the like.
• Relative to the stability of the user attribute features described above, preference features are typically dynamic and can vary with changing user behavior.
  • the preference feature may specifically include: a user's preference characteristics for content such as games, books, entertainment, sports items, and the like. It can be understood that a person skilled in the art can also adopt a corresponding preference feature according to actual application requirements, and the embodiment of the present application does not limit the specific preference feature.
  • the preference feature may be obtained based on historical behavior data of the user. For example, if the user frequently searches for and views a certain star A related webpage in the most recent time period, the personalized feature of the user may include the star A, and thus the star A related message may be provided to the user.
• For another example, if the user frequently watches a TV series B, the personalized feature of the user may include the TV series B, so news related to the TV series B, for example, episode update news of the TV series B, may be provided to the user.
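A minimal sketch of deriving preference features from historical behavior data; the record format and the frequency threshold are assumptions for illustration, since the document only states that features are obtained from historical behavior:

```python
from collections import Counter

def preference_features(history, min_count=2):
    """Keep the topics that appear at least `min_count` times in the
    user's historical behavior records as preference features."""
    counts = Counter(record["topic"] for record in history)
    return {topic for topic, c in counts.items() if c >= min_count}
```

With a history dominated by searches for "star A", the derived preference set contains "star A", so star-A-related messages can be pushed.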
• For example, the pre-registered event type may be a sensor event, an IOT device event, a power event, a network connection event, a preset application event, or the like; the embodiment of the present application does not limit the pre-registered event types.
• The setting conditions are described above by way of setting condition 1 to setting condition 4. It can be understood that the above setting conditions 1 to 4 are only examples; in fact, those skilled in the art can adopt other setting conditions according to actual application requirements, and the embodiment of the present application does not limit the specific setting conditions.
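As a sketch, two of the setting conditions above (association with the current interface, approximated here by keyword overlap, and a pre-registered event type) can be checked as follows; the event field names are assumptions for illustration:

```python
def event_triggers_change(event, current_keywords, registered_types):
    """Decide whether a received event should change the control:
    True if the event's type was registered in advance, or if the event's
    keywords overlap those of the current interface/application service."""
    if event.get("type") in registered_types:
        return True
    event_keywords = set(event.get("keywords", []))
    return bool(event_keywords & set(current_keywords))
```

Only when this check passes would the display state and the responding functions of the control be changed, which is how irrelevant messages (e.g. advertisements) are filtered out.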
• Alternatively, the setting condition may be that the received event is a passive message. The passive message is defined relative to the active message: the active message may be used to represent a message actively requested by the user, such as requested commodity information, while the passive message is used to represent a message not actively requested by the user, such as unsolicited but recommended commodity information.
• The process of changing the display state of the control in a case where the setting condition is satisfied may include: changing the display state of the control when a setting message is received.
  • the setting message may be determined by a person skilled in the art according to the actual application requirement.
• For example, the setting message may be that the power is lower than a power threshold, or that the air-conditioning temperature is lower than a set value, or that the received message is associated with the current interface or the application service of the current interface, and so on. It can be understood that the setting message may be determined according to setting condition 1 to setting condition 4, and the embodiment of the present application does not limit the specific setting message.
• The method may further include: displaying the setting message in response to the state-changed control being triggered.
  • a recommendation function may be performed.
  • a recommendation interface may be displayed, and the recommendation interface may include a message requiring recommendation.
  • a notification function may be performed.
  • a notification interface may be displayed, and the notification interface may include a message requiring notification.
  • the setting message may include: a message generated by an application corresponding to the current interface, or a service message provided by the current interface, or a message pushed by the server corresponding to the current interface.
• For example, the message generated by the application corresponding to the current interface may be information corresponding to a commodity B similar to a commodity A.
  • the service message provided by the current interface may be a message related to a recent contact, or a search message of a contact, or the like. It can be understood that the information corresponding to the commodity B similar to the commodity A may be generated by an application corresponding to the current interface, or may be pushed by a server corresponding to the current interface.
• The control in step 1101 to step 1102 may be the control in step 301 to step 303. That is, a processing method in the embodiment of the present application may include: displaying a control; receiving a sliding operation for the control; performing an operation corresponding to the sliding direction according to the sliding direction of the sliding operation; and, in a case where the setting condition is satisfied, changing the display state of the control and changing at least part of the function that responds when the control is operated.
• In summary, the processing method of the embodiment of the present application may judge the received event and, when the received event satisfies the set condition, change the display state of the control and change at least part of the function that responds when the control is operated. This can prevent messages from disturbing the user and improve the intelligence of message prompting through the control.
• Referring to FIG. 14, a flow chart of steps of an embodiment of a processing method of the present application is shown, which may specifically include the following steps:
• Step 1401: displaying a first control having a navigation function;
• Step 1402: replacing the first control with a second control having an operation function of an application.
  • the navigation function between the applications can be realized through the navigation bar, and the operation inside the application can be realized through the operation interface set in the interface of the application.
• For example, a navigation bar at the bottom of the screen can accommodate virtual keys such as a back button, a home button, and a menu button, to implement the functions corresponding to these three virtual buttons.
  • the operation interface may include: a search interface or the like.
• However, the operation interfaces set in the interface of an application are often scattered, and the user incurs a high cost in finding and memorizing these operation interfaces, which increases the user's operation difficulty and lowers operation efficiency.
• By replacing controls, the embodiment of the present application can implement either the navigation function or the operation function of the application through a control occupying the same screen space. Since the space occupied by the control can be multiplexed, the interface of the application can be simplified, the user's operation difficulty can be reduced, operation efficiency can be improved, and the space for information presentation can be increased.
  • the navigation function may include: performing a jump between multiple view layers, and/or a function of jumping within a view layer.
  • the second control may be displayed by a mark to improve the recognition of the second control.
• Referring to FIG. 15(a) and FIG. 15(b), schematic diagrams of a display control of an embodiment of the present application are shown, wherein FIG. 15(a) shows the first control 1501 and FIG. 15(b) shows the second control 1502, wherein the function corresponding to the second control 1502 is a search function, and the second control 1502 is highlighted.
  • the first control and the second control may be different controls.
• The first control and the second control may correspond to a visible attribute. Replacing the first control with the second control having the operation function of the application may specifically include: making the visible attribute of the second control visible, and making the visible attribute of the first control hidden. Before the first control is replaced with the second control, the visible attribute of the first control may be visible and the visible attribute of the second control may be hidden.
• Alternatively, replacing the first control with the second control having the operation function of the application may include: after setting the display position of the second control to the position where the first control is located, setting the display position of the first control to be outside the screen display range.
• Before the replacement, the display position of the first control may be within the screen display range and the display position of the second control may be outside the screen display range; the process of replacing the first control with the second control having the operation function of the application may thus be a process of exchanging the display positions of the first control and the second control, that is, setting the display position of the second control to be within the screen display range and the display position of the first control to be outside it.
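The two replacement strategies described above (toggling the visible attribute, and exchanging display positions) can be sketched as follows; the attribute names are illustrative, with None standing for a position outside the screen display range:

```python
class UiControl:
    """Toy control with the two attributes the replacement strategies use:
    a visible flag and a display position (None = off-screen)."""
    def __init__(self, name, visible, position):
        self.name, self.visible, self.position = name, visible, position

def replace_by_visibility(first, second):
    # Strategy 1: hide the first control's visible attribute and make the
    # second control's visible attribute visible.
    first.visible, second.visible = False, True

def replace_by_position(first, second):
    # Strategy 2: move the second control to the first control's position,
    # then move the first control outside the screen display range.
    second.position, first.position = first.position, None
```

Either way, exactly one of the two controls occupies the shared screen space after the replacement.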
• Alternatively, the first control and the second control may be the same control, with the first control and the second control corresponding to different operation attributes and style attributes.
  • the style attribute can be used to define the style of the control.
  • the appearance of the control can be defined by the style attribute, so that the display state of the control can be changed by configuring the style attribute.
• The operation attribute can be used to represent the control's response to a user operation, that is, the processing performed after a user operation is received, for example, jumping to a specified interface after receiving the user operation; the operation attribute can thus reflect the function of the control.
• There may be a plurality of second controls, and the replacing the first control with the second control having the operation function of the application may include:
• Alternative 1: displaying all of the second controls;
• Alternative 2: displaying part of the second controls, and displaying a first entry corresponding to the second controls that are not displayed;
• Alternative 3: displaying none of the second controls, and displaying a second entry corresponding to the second controls.
  • the alternative 1 can display all of the second controls.
  • Alternative 2 may display a portion of the second control and display a first entry corresponding to the second control that is not displayed, such that the user triggers display of the second control that is not displayed through the first entry.
• Referring to FIG. 16(a) and FIG. 16(b), schematic diagrams of a display control of an embodiment of the present application are shown, wherein FIG. 16(a) shows part of the second controls (a second control 1601) and the first entry 1602.
• Upon receiving a trigger operation on the first entry 1602, FIG. 16(b) may display the previously hidden second control 1603 and second control 1604, wherein the second control 1603 and the second control 1604 are respectively used to implement a message function and a return function.
  • Alternative 3 may display a second entry corresponding to the second control.
• Referring to FIG. 17(a) and FIG. 17(b), schematic diagrams of a display control of an embodiment of the present application are shown, wherein FIG. 17(a) shows the second entry 1701. Upon receiving a trigger operation on the second entry 1701, FIG. 17(b) may display the second control 1702, the second control 1703, and the second control 1704, wherein the second control 1702, the second control 1703, and the second control 1704 are respectively used to implement a search function, a message function, and a return function.
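Alternatives 2 and 3 can be sketched together as follows; showing up to a fixed number of second controls inline and representing the rest by an entry is an assumption about how the split is chosen, not something the patent specifies:

```python
def visible_controls(second_controls, max_inline):
    """Show up to max_inline second controls; if any remain hidden,
    represent them by an entry (max_inline=0 models alternative 3,
    0 < max_inline < len models alternative 2)."""
    shown = second_controls[:max_inline]
    hidden = second_controls[max_inline:]
    entry = "entry" if hidden else None
    return shown, hidden, entry

def trigger_entry(shown, hidden):
    """Triggering the entry displays the second controls that were hidden."""
    return shown + hidden, []
```

With `max_inline=1` this reproduces the FIG. 16 behavior: one second control plus an entry, and triggering the entry reveals the rest.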
• It can be understood that alternatives 1 to 3 are merely examples of the manner in which the first control is replaced with the second control having the operation function of the application; in fact, those skilled in the art can adopt other alternatives according to actual application requirements to replace the first control with the second control, and the embodiment of the present application does not limit the specific alternative.
• The control described in step 1401 to step 1402 may be the control in step 301 to step 303; that is, a processing method in the embodiment of the present application may include: displaying a control; receiving a sliding operation for the control; performing an operation corresponding to the sliding direction according to the sliding direction of the sliding operation; and, in a case where a setting condition is satisfied, changing a display state of the control and changing at least part of the functionality with which the control responds when operated.
• The processing method of the embodiment of the present application may implement a navigation function or an operation function of the application through controls occupying the same screen space; since the space occupied by the control may be reused, the interface of the application can be simplified, the user's operation difficulty can be reduced, the operation efficiency can be improved, and the space for information presentation can be increased.
• In other words, a control occupying one screen space may provide either a navigation function or an operation function of the application, and may switch between the navigation function of the control and the operation function of the application according to actual application requirements.
  • this embodiment illustrates the determination process of the second control.
  • the second control may be obtained according to at least one of the following information:
• the location of the interface where the first control is located, the operation habits of the user, the content attribute of the interface where the first control is located, and the business attribute of the application where the first control is located.
• The operation habits may include: a previously used operation function, where the operation function may include: an operation function used by the user at the interface where the first control is located or at a related interface of the interface where the first control is located.
• For example, if the operation function that the user is accustomed to using is a dialing function, the function of the second control can be a dialing function, that is, the second control can be a control with a dialing function; similarly, if the operation function the user is accustomed to using is a search function, the function of the second control can be a search function, that is, the second control can be a control with a search function.
• A person skilled in the art can, according to actual application requirements, use an operation function that the user has used at the interface where the first control is located, or at a related interface thereof, as the function of the second control. It should be noted that, in the case that the user has used multiple operation functions at the interface where the first control is located or at a related interface thereof, information such as the frequency of use and the latest usage time may be used to sort the multiple operation functions used by the user, and an optimal operation function may be selected according to the sorting result as the function of the second control.
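The ranking step described above (sort by frequency of use, break ties by latest usage time, take the top result) can be sketched as follows. This is an illustrative Python sketch; the record format is an assumption, not specified by the application.

```python
from datetime import datetime

def pick_recommended_function(usage_records):
    """usage_records: {function_name: (use_count, last_used_datetime)}.
    Sort by frequency first, then by most recent use, and return the
    top function as the candidate function for the second control."""
    ranked = sorted(
        usage_records.items(),
        key=lambda kv: (kv[1][0], kv[1][1]),  # (use_count, last_used)
        reverse=True,
    )
    return ranked[0][0] if ranked else None
```

For example, if "search" and "share" have the same use count, the one used more recently would be selected.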
• For example, if the content attribute of the interface where the first control is located includes a multimedia attribute, the function of the second control may include a download function; if the content attribute of the interface where the first control is located includes a download completion attribute, the function of the second control may include a function of viewing the downloaded content.
• For example, when the user browses an episode of a video, the user can be provided with a second control having a function of downloading the episode; after the download is completed, the content attribute changes, and the user can be provided with a second control having a function of viewing the downloaded episode.
• If the location of the interface where the first control is located is a preset location, the functions of the second control may include at least one of: a search function, a history traceback function, a return-to-initial-position function, and a reach-termination-position function.
• For example, the user can be provided with a second control having at least one of a search function, a history traceback function, and a return-to-initial-position function.
• For example, in a picture-browsing scenario, the search function can be used to search for pictures, the history traceback function can be used to revisit recently viewed pictures, and the return-to-initial-position function can be used to return to the first picture in the picture collection, thereby saving the user the cost of finding a desired picture.
• For example, in a webpage-browsing scenario, the user can be provided with a second control having the return-to-initial-position function or the reach-termination-position function, so that the user can return to the top of the webpage through the second control having the return-to-initial-position function, or reach the bottom of the webpage through the second control having the reach-termination-position function, thereby saving the user the cost of scrolling through webpage content.
• According to user data, the user usually performs a search in this case, and thus the second control having the search function can be highlighted for the user to click.
• For example, if the business attribute includes an update attribute, the function of the second control may include an update function; if the business attribute includes an advertisement attribute, the function of the second control may include an advertisement function. For instance, the user may be provided with a second control having an update function.
• The function of the second control can be determined by a person skilled in the art according to the actual business attribute, and the embodiment of the present application does not limit the specific business attribute or the specific function of the second control.
• The embodiment of the present application may determine, based on at least one of the location of the interface where the first control is located, the operation habits of the user, the content attribute of the interface where the first control is located, and the business attribute of the application where the first control is located, whether there is an operation that needs to be recommended, and if so, replace the first control with a second control having an operation function of the application.
• For example, for the location of the interface where the first control is located, the distance between the current position of the interface and its initial position may be determined; if the distance meets a preset condition, it can be considered that the user has a content-location requirement, and thus that there is an operation that needs to be recommended, such as at least one of a search function, a history traceback function, and a return-to-initial-position function.
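The distance-based decision above can be sketched as a simple predicate. This is an illustrative Python sketch; the pixel threshold and the list of recommended operations are assumed values for demonstration only.

```python
def recommend_operations(scrolled_px, threshold_px=2000):
    """If the distance between the interface's current position and its
    initial position meets the preset condition, recommend locating
    operations for the second control; otherwise recommend nothing."""
    if scrolled_px >= threshold_px:
        return ["search", "history_traceback", "return_to_initial_position"]
    return []
```

A caller could then replace the first control with a second control when the returned list is non-empty, and restore it when the list becomes empty again.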
• For the operation habits of the user, it may be determined whether there is an operation function that the user has used at the interface where the first control is located or at a related interface thereof; if so, it may be considered that there is an operation that needs to be recommended, namely the operation corresponding to the operation habits.
• For the content attribute of the interface where the first control is located, it may be determined whether the content attribute is, for example, a multimedia attribute or a download completion attribute; if so, it may be considered that there is an operation that needs to be recommended, such as a download operation or an operation of viewing the downloaded content.
• Certainly, the location of the interface where the first control is located, the operation habits of the user, the content attribute of the interface where the first control is located, and the business attribute of the application where the first control is located may also be combined. For example, in combination with the location of the interface where the first control is located and the operation habits of the user, the recommended operation is an operation that conforms to the user's operation habits; in combination with the content attribute of the interface where the first control is located and the business attribute of the application where the first control is located, the recommended operation is an operation that conforms to the business attribute and the content attribute, and the like.
• In summary, the processing method of the embodiment of the present application may determine the function of the second control based on at least one of the location of the interface where the first control is located, the operation habits of the user, the content attribute of the interface where the first control is located, and the business attribute of the application where the first control is located; in particular, the function of the second control may be matched with the environment in which the first control is located (the interface environment, the user environment, or the business environment), so that the function of the second control meets the needs of the user.
• The embodiment of the present application can switch between the navigation function of the control and the operation function of the application according to actual application requirements. For example, it is first determined, according to at least one of the location of the interface where the first control is located, the user's operation habits, the content attribute of the interface where the first control is located, and the business attribute of the application where the first control is located, whether there is an operation that needs to be recommended; if so, the first control is replaced with a second control having an operation function of the application. Subsequently, the same determination may continue to be made according to the above information, and if there is no longer an operation that needs to be recommended, the second control may be replaced with the first control, and so on.
• Referring to FIG. 18, a flow chart of steps of an embodiment of a processing method of the present application is shown. Specifically, the method may include the following steps:
• Step 1801: displaying a plurality of objects;
• Step 1802: determining a target object of the plurality of objects in response to a sliding operation of the user, and outputting prompt information corresponding to the target object;
• Step 1803: when the sliding operation ends, entering an interface corresponding to the object determined to be the target object when the sliding operation ends.
  • the target object can be used to represent the object of interest.
• The embodiment of the present application can support the user in selecting a target object from multiple objects by a sliding operation and entering the interface corresponding to the target object when the sliding operation ends; since the user can reach the required target object with a single sliding operation, the convenience of object switching can be improved.
• The sliding operation of the embodiment of the present application can be completed in a limited area, so that the operating range of the user can be reduced, which makes it applicable to the case of holding the terminal with one hand.
  • the object may be an entity having an attribute in the computer system.
  • the above objects may include, but are not limited to, an application, a page, a card, a message, or a contact, and the like.
  • the application can refer to an application in the terminal.
  • a page can refer to a page of a terminal (such as a page of an application).
  • a card may refer to a business object owned by a user in a platform corresponding to an application, and the card may give the user some rights, thereby allowing the user to enjoy the corresponding authority by using the card.
• The types of the card may include: a card type, a ticket type, or a coupon type, and the like.
• The attribute information corresponding to the card type may include: a card number, merchant information, a membership level, platform information, or a discount strength; the attribute information corresponding to the ticket type may include: a departure time (or a start time), a train number (or a cinema hall number), a boarding gate, or a platform number, and the like; the attribute information corresponding to the coupon type may include: a number, merchant information, platform information, an expiration date, or applicable conditions.
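The card attribute structures above can be modeled directly. This is an illustrative Python sketch; the field names are derived from the attribute lists in the text, but the class shapes are assumptions.

```python
from dataclasses import dataclass

@dataclass
class Card:
    """Card type: e.g. a membership card held by the user."""
    card_number: str
    merchant: str
    membership_level: str

@dataclass
class Ticket:
    """Ticket type: e.g. a train or cinema ticket."""
    departure_time: str
    number: str          # train number or cinema hall number
    boarding_gate: str

@dataclass
class Coupon:
    """Coupon type: grants the user a right under conditions."""
    number: str
    merchant: str
    expiration_date: str
    applicable_conditions: str
```

An object list shown by the processing method could then hold any mix of these card objects.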
  • the message may refer to a message of an operating system or an application.
  • the message may include a notification message of an operating system, or the message may include a message of an application, such as a short message of a short message application, a message of an instant messaging application, or the like.
  • Contacts can be used to indicate contacts for communication, such as contacts in the address book, contacts in the buddy list, and so on. It can be understood that a person skilled in the art can adopt a required object according to actual application requirements, and the embodiment of the present application does not limit a specific object.
• Referring to FIG. 19, there is shown a schematic diagram of a plurality of objects in an embodiment of the present application, where the plurality of objects may include: object 1, object 2, ... object i ... object n, where i and n are positive integers.
• The plurality of objects in FIG. 19 are arranged vertically; it can be understood that the plurality of objects in the embodiments of the present application may also be arranged horizontally, and the specific arrangement of the multiple objects is not limited in the embodiment of the present application.
  • multiple objects can be displayed through an object list or a multi-object interface.
  • the user can trigger the display of multiple objects by any trigger operation.
  • the above triggering operation may include: a sliding operation, or a click operation, or a long press operation or the like.
• The area responding to the sliding operation may be a certain control, or may be an area specified in the screen or at the edge; the sliding direction may be left to right, right to left, top to bottom, bottom to top, or a 45-degree direction, etc.
  • the controls may be displayed at the lower edge, the upper edge, the left edge, or the right edge of the interface.
  • the foregoing control may be displayed at any position in the middle of the interface, and the specific position of the above control in the interface is not limited in the embodiment of the present application.
• The click operation can be a single-click operation or a double-click operation.
  • the area that responds to a click can be a control, or it can be an area specified on the screen or at the edge.
• The long press operation may refer to an operation in which the touch time on the screen exceeds a preset duration; an example of the preset duration may be 5 s or the like.
  • the area that responds to a long press can be a control, or it can be an area specified in the screen or at the edge.
  • the sliding operation in step 1801 may be an operation of sliding from the edge of the screen to the inner side.
• Referring to FIG. 20, the sliding operation may be an operation triggered by the user's finger 2001 along the sliding direction 2002, specifically an operation of sliding down from the upper edge of the screen, so that the target object can move along with the sliding operation, and switching of the target object is realized by the sliding operation.
• Certainly, the sliding operation shown in FIG. 20 is only an example.
• In practice, the sliding operation may also be an operation of sliding upward from the lower edge of the screen, an operation of sliding from the left edge of the screen to the right, or an operation of sliding from the right edge of the screen to the left, etc.
• At different stages of the sliding operation, the determined target objects are different; the stage may be determined according to at least one of a time, a position, and a sliding distance corresponding to the sliding operation.
  • the process of determining the target object of the plurality of objects in response to the sliding operation of the user may include:
  • the matching the location of the first object with the first location may include: the area where the first object is located matches the area corresponding to the first location.
  • the matching the location of the second object with the second location may include: the area where the second object is located matches the area corresponding to the second location.
• The target object can thus be switched. Specifically, if the time corresponding to the sliding operation is located in the first period, the first object is determined as the target object; or, if the time corresponding to the sliding operation is located in the second period, the second object is determined as the target object, and so on.
  • the target object may change according to the sliding operation.
  • the initial target object can be obtained by preset.
  • the initial target object can be the first object in the set order, or the last object, or the intermediate object.
• Referring to FIG. 21, a schematic diagram of an interface of an embodiment of the present application is shown. After a plurality of objects are displayed, the initial target object may be the first object at the upper edge of the screen, namely object 1; with the user's sliding operation, the target object can be switched from object 1 to object 2, and then from object 2 to object 3. In this case, if the user's sliding operation ends, object 3 can be entered, that is, the interface corresponding to object 3 can be displayed.
  • the target object may be obtained according to the sliding distance corresponding to the sliding operation, or the target object may be obtained according to the sliding distance and the sliding direction corresponding to the sliding operation.
  • the sliding distance can be used as a basis for switching timing
  • the sliding direction can be used as a basis for switching directions.
  • the sliding distance corresponding to the sliding operation may be the total distance of the sliding operation, such as the distance between the current position of the user's finger and the initial position; or the sliding distance corresponding to the sliding operation may be the sliding distance within the monitoring period.
• For example, assuming the sliding direction is horizontal, the user's finger coordinate corresponding to the start time of the monitoring period is previousX, and the user's finger coordinate corresponding to the end time of the monitoring period is currentX; the difference between the coordinates corresponding to the end time and the start time of the monitoring period (currentX - previousX) may be taken as the sliding distance of the sliding operation during the monitoring period. A person skilled in the art can establish a mapping rule between the sliding distance corresponding to the sliding operation and the switching of the target object according to actual application requirements, and determine the target object in real time according to the mapping rule.
  • the target object may be obtained according to a sliding distance corresponding to the sliding operation and a reference sliding distance corresponding to an object. Specifically, if the sliding distance of the sliding operation in the monitoring period reaches the reference sliding distance, the switching of the target object is performed.
• The reference sliding distance can be used to characterize the sliding distance corresponding to one switch, that is, how much distance the sliding operation requires to switch the target object once. Specifically, if the sliding distance of the sliding operation within a monitoring period reaches the reference sliding distance, the target object may be switched from the j-th object of the plurality of objects to the (j+1)-th object, where j and (j+1) are the numbers of the objects, j is a positive integer, and the objects may be numbered sequentially in the display order of the multiple objects (such as top to bottom, bottom to top, left to right, or right to left). Taking FIG. 21 as an example, if the sliding distance of the sliding operation within a monitoring period reaches the reference sliding distance, the target object is switched, that is, the target object may be switched from object 1 to object 2; then, if the sliding distance of the sliding operation within the next monitoring period reaches the reference sliding distance, the target object is switched again, that is, from object 2 to object 3.
• The reference sliding distance may be obtained according to the number of the plurality of objects; for example, the reference sliding distance may be the ratio of a distance threshold to the number of objects, and the distance threshold may not exceed the screen size corresponding to the sliding direction. For example, if the sliding direction is from top to bottom or from bottom to top, the distance threshold may not exceed the vertical screen size; if the sliding direction is from left to right or from right to left, the distance threshold may not exceed the horizontal screen size.
  • the reference sliding distance may be a preset value.
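The distance-based switching scheme above (reference distance = threshold / number of objects, with the threshold capped at the screen size, and one switch per monitoring period whose delta reaches that distance) can be sketched as follows. This is an illustrative Python sketch; the concrete numbers are assumptions.

```python
def reference_sliding_distance(num_objects, distance_threshold, screen_size):
    """Reference distance = threshold / number of objects, with the
    threshold capped at the screen size along the sliding direction."""
    return min(distance_threshold, screen_size) / num_objects

def next_target(j, delta, ref_distance, num_objects):
    """Switch from the j-th object to the (j+1)-th object whenever the
    sliding distance within a monitoring period (delta, e.g.
    currentX - previousX) reaches the reference distance."""
    if abs(delta) >= ref_distance and j < num_objects:
        return j + 1
    return j
```

For five objects, a 1000 px threshold, and an 800 px screen, the reference distance would be 160 px, so each 160 px of sliding within a period advances the target by one object.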
  • the target object may be obtained according to a touch position corresponding to the sliding operation.
  • the object that matches the touch position corresponding to the sliding operation may be the target object, that is, the area where the target object is located may match the area corresponding to the touch position.
  • the matching of the region where the target object is located and the region corresponding to the touch location may include: the region where the target object is located is the same as the region corresponding to the touch location, or the region corresponding to the touch location is closest to the region where the target object is located.
• The positions of the plurality of objects may be relatively fixed. Taking FIG. 22 as an example, assuming the initial target object is object 1, if the touch position corresponding to the sliding operation enters the area where object 2 is located, the target object is switched, that is, the target object can be switched from object 1 to object 2; next, if the touch position corresponding to the sliding operation enters the area where object 3 is located, the target object is switched again, that is, from object 2 to object 3.
  • the positions of the plurality of objects may be varied in response to the sliding operation described above, that is, the plurality of objects may be moved in response to the sliding operation described above.
• For example, a plurality of objects may move upward with a top-to-bottom sliding operation. Assuming the initial target object is object 1, if the touch position corresponding to the sliding operation enters the area where object 2 is located, the target object is switched, that is, the target object can be switched from object 1 to object 2; then, if the touch position corresponding to the sliding operation enters the area where object 3 is located, the target object is switched again, that is, from object 2 to object 3.
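The touch-position matching of determination solution 2 ("same region, or nearest region") can be sketched as follows. This is an illustrative Python sketch assuming vertically stacked objects of equal height; the coordinates are not from the application.

```python
def target_by_touch(touch_y, object_tops, object_height):
    """Pick the object whose region contains the touch position; if the
    touch falls between regions, pick the object whose region center is
    nearest to the touch. Objects are numbered from 1."""
    best, best_dist = None, None
    for idx, top in enumerate(object_tops, start=1):
        if top <= touch_y < top + object_height:
            return idx                      # region contains the touch
        center = top + object_height / 2
        dist = abs(touch_y - center)
        if best_dist is None or dist < best_dist:
            best, best_dist = idx, dist     # remember the nearest region
    return best
```

With three objects starting at y = 0, 100, and 200 and a height of 80, a touch at y = 120 lands inside object 2, while a touch at y = 195 (between regions) resolves to the nearest object, object 3.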
• The process of determining the target object among multiple objects is described in detail above through determination solution 1 and determination solution 2. It can be understood that determination solutions 1 and 2 are merely examples of the embodiment of the present application; in practice, a person skilled in the art may determine the target object among the plurality of objects according to actual application requirements, and the specific process of determining the target object among multiple objects is not limited in the embodiment of the present application.
  • outputting the prompt information corresponding to the target object may enable the user to determine which object is the target object.
• In an optional embodiment of the present application, the process of outputting the prompt information corresponding to the target object may include: displaying a mark corresponding to the target object; and/or playing the prompt information corresponding to the target object by audio.
• The mark display may be a visual display mode.
• The mark display may specifically include: highlighting at least part of the interface elements of the target object (such as a background, an icon, a font, etc.), adding a corresponding focus mark to the target object, or enlarging the target object, and the like.
  • the prompt information corresponding to the target object played by the audio may be an audible display mode, so that the user obtains the prompt information audibly. It can be understood that the specific process of outputting the prompt information corresponding to the target object is not limited in the embodiment of the present application.
  • the processing method of the embodiment of the present application can enter the target object required by the user by one sliding operation, so that the convenience of object switching can be improved.
• The sliding operation of the embodiment of the present application can be completed in a limited area (such as an edge area of the screen, or even a short stretch of the screen edge), so that the operating range of the user can be reduced, which makes it applicable to the case of holding the terminal with one hand.
  • the method may include the following steps:
• Step 2401: displaying a plurality of objects in response to a sliding operation for the control;
• Step 2402: after displaying the plurality of objects, if the sliding operation has not ended, determining a target object of the plurality of objects in response to the sliding operation, and outputting prompt information corresponding to the target object;
• Step 2403: when the sliding operation ends, entering an interface corresponding to the object determined to be the target object when the sliding operation ends.
• The control in step 2401 to step 2403 may be the control in step 301 to step 303; that is, a processing method in the embodiment of the present application may include: displaying a control; receiving a sliding operation for the control; performing an operation corresponding to the sliding direction according to the sliding direction of the sliding operation; displaying a plurality of objects in response to the sliding operation for the control; after displaying the plurality of objects, if the sliding operation has not yet ended, determining a target object of the plurality of objects in response to the sliding operation and outputting prompt information corresponding to the target object; and, in the case that the sliding operation ends, entering the interface corresponding to the object determined to be the target object when the sliding operation ends.
  • the process of displaying a plurality of objects may specifically include: displaying a plurality of objects in response to a sliding operation for the control.
  • the method may further include: after displaying the plurality of objects, setting the size of the control to a preset size.
  • the preset size may be a size that is convenient for the user to operate.
• For example, the preset size may be larger than the size of the control before the plurality of objects are displayed, so that the user may more easily continue the sliding operation on it.
• Referring to FIG. 25, a schematic diagram of switching objects of an embodiment of the present application is shown, wherein a plurality of APPs (Applications) can be displayed in response to the user's right-to-left sliding operation on the control 2501; APP A can serve as the initial target application, the target application can be highlighted, and the horizontal size of the control 2501 can be extended so that the user can continue sliding on it. In this case, sliding left and right on the control 2501 can switch the target application among the multiple applications in left-right order; when the application that the user wants to enter (such as APP B) is highlighted, the user can release the finger to end the sliding operation, and thus enter the target application (such as APP B) at the end of the sliding operation.
  • the target object in the object list is determined in response to the sliding operation, and the prompt information corresponding to the target object is output.
• In the embodiment of the present application, the user can trigger the display of multiple objects by a sliding operation on the control; if the sliding operation has not ended (for example, the user's finger has not left the screen and continues to press), the target object in the multi-object interface can be highlighted, and continuing to slide on the control can keep switching the target object among the multiple objects; when the sliding operation ends on a target object, the corresponding application can be entered.
• In summary, the user can quickly trigger the display of multiple objects and quickly enter the required object through one smooth sliding operation; therefore, the embodiment of the present application has the advantages of convenient operation and high switching efficiency.
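The single-gesture flow of steps 2401 to 2403 (slide on the control to reveal the objects, keep sliding to switch the highlighted target, release to enter it) can be sketched as a small state machine. This is an illustrative Python sketch; the class and method names are assumptions.

```python
class SlideSelector:
    """Sketch of the one-gesture flow: begin_slide reveals the objects,
    slide_step switches the highlighted target while the finger holds,
    and release ends the sliding operation and enters the target."""

    def __init__(self, objects):
        self.objects = objects
        self.visible = False
        self.target = 0          # initial target: the first object

    def begin_slide(self):
        self.visible = True      # display the plurality of objects

    def slide_step(self, steps=1):
        if self.visible:         # switch the target while sliding continues
            self.target = (self.target + steps) % len(self.objects)

    def release(self):
        self.visible = False     # the sliding operation ends
        return self.objects[self.target]   # enter the target object
```

For example, with applications ["APP A", "APP B", "APP C"], beginning the slide, advancing one step, and releasing would enter APP B, matching the FIG. 25 description.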
• The sliding operation of the embodiment of the present application can be completed in a limited area (such as an edge area of the screen, or even a short stretch of the screen edge), so that the operating range of the user can be reduced, which makes it applicable to the case of holding the terminal with one hand.
  • the embodiment of the present application also provides a processing device.
• Referring to FIG. 26, a structural block diagram of an embodiment of a processing apparatus of the present application is shown, which may specifically include the following modules:
• a display module 2601, configured to display a control;
• a receiving module 2602, configured to receive a sliding operation for the control; and
• an operation execution module 2603, configured to perform an operation corresponding to the sliding direction according to the sliding direction of the sliding operation.
• In an optional embodiment of the present application, the foregoing apparatus may further include at least one of the following:
• a first control change module, configured to cause the control to generate a displacement in response to the sliding operation;
• a second control change module, configured to cause the icon on the control to generate a displacement in response to the sliding operation;
• a third control change module, configured to compress or stretch the control in response to the sliding operation;
• a fourth control change module, configured to change a display appearance of the control in response to the sliding operation;
• a fifth control change module, configured to change a display appearance outside the control in response to the sliding operation.
• In an optional embodiment of the present application, the first control change module may be specifically configured to cause the control to generate a displacement in the sliding direction or in the direction opposite to the sliding direction according to the sliding direction of the sliding operation, wherein the speed of the displacement matches the speed of the sliding operation.
• In an optional embodiment of the present application, the second control change module may be specifically configured to cause the icon on the control to generate a displacement in the sliding direction or in the direction opposite to the sliding direction according to the sliding direction of the sliding operation, wherein the speed of the displacement matches the speed of the sliding operation.
• In an optional embodiment of the present application, the third control change module may be specifically configured to reduce or increase the size of the control according to the sliding direction of the sliding operation, wherein the speed of reduction or increase matches the speed of the sliding operation.
  • the fourth control change module may be specifically configured to change the display appearance of the entire area of the control according to the sliding operation, or to change the display appearance of the area of the control swept by the sliding operation.
  • the fifth control change module may be specifically configured to change an interface element other than the control according to the sliding direction of the sliding operation.
  • the operation corresponding to the sliding direction is used to jump between multiple view layers or to jump within a view layer.
  • the multiple view layers include: a global notification control layer, a lock screen layer, an application layer, a desktop layer, and a global voice assistant layer.
  • the plurality of view layers are sequentially sorted from top to bottom.
  • the operation execution module 2603 may specifically include:
  • a first switching submodule configured to switch from a first view layer to a second view layer, where the second view layer is the view layer above and adjacent to the first view layer;
  • a second switching submodule configured to switch from the first view layer to the second view layer, where the second view layer is the view layer below and adjacent to the first view layer.
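The five vertically sorted view layers and the two switching submodules can be sketched as a simple layer stack. The layer names follow the list above; the function itself is illustrative.

```python
# Top-to-bottom ordering taken from the list above.
LAYERS = [
    "global_notification",    # global notification control layer
    "lock_screen",            # lock screen layer
    "application",            # application layer
    "desktop",                # desktop layer
    "global_voice_assistant"  # global voice assistant layer
]

def switch_layer(current, direction):
    """First submodule: step to the adjacent layer above ('up');
    second submodule: step to the adjacent layer below ('down')."""
    i = LAYERS.index(current)
    if direction == "up" and i > 0:
        return LAYERS[i - 1]
    if direction == "down" and i < len(LAYERS) - 1:
        return LAYERS[i + 1]
    return current  # no adjacent layer in that direction
```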
  • when a jump is performed inside a view layer, the plurality of interfaces of the view layer are horizontally sorted in a set order.
  • the operation execution module 2603 may specifically include:
  • a third switching submodule configured to switch from a first interface of the first view layer to a second interface of the first view layer if the sliding direction is a first direction, where the second interface is the interface adjacent to the left side of the first interface;
  • a fourth switching submodule configured to switch from the first interface of the first view layer to the second interface of the first view layer if the sliding direction is a second direction, where the second interface is the interface adjacent to the right side of the first interface.
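An in-layer jump over horizontally sorted interfaces reduces to index arithmetic. The following sketch uses hypothetical names and an assumed stay-in-place behavior at either end:

```python
def switch_interface(interfaces, current_idx, direction):
    """Third submodule: a first-direction slide moves to the left
    neighbor; fourth submodule: a second-direction slide moves to
    the right neighbor. At either end, stay on the current interface."""
    if direction == "first" and current_idx > 0:
        return current_idx - 1
    if direction == "second" and current_idx < len(interfaces) - 1:
        return current_idx + 1
    return current_idx
```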
  • the sliding direction is a first direction, and the operation corresponding to the sliding direction is an operation of displaying a recent task list;
  • the sliding direction is a second direction, and the operation corresponding to the sliding direction is a return operation;
  • the sliding direction is a third direction, and the operation corresponding to the sliding direction is an operation of calling up a global notification control layer;
  • the sliding direction is a fourth direction, and the operation corresponding to the sliding direction is an operation of calling up the desktop layer.
  • the returning operation includes: returning to the previous interface, returning to the previous application, or returning to the preset view layer.
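The four direction-to-operation bindings and the three possible return targets above can be summarized as a table plus a small resolution policy. The dictionary keys and the fallback order are assumptions for illustration, not a mapping fixed by this application.

```python
# Direction-to-operation table from the four bindings above.
DIRECTION_OPS = {
    "first": "show_recent_task_list",
    "second": "return",
    "third": "call_up_global_notification_layer",
    "fourth": "call_up_desktop_layer",
}

def resolve_return(history, preset_layer="desktop"):
    """Illustrative policy: return to the previous interface if one
    exists, else to the previous application, else to a preset view
    layer. 'history' is a dict of navigation stacks (assumed shape)."""
    if history.get("interfaces"):
        return history["interfaces"][-1]
    if history.get("applications"):
        return history["applications"][-1]
    return preset_layer
```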
  • the display module 2601 may specifically include:
  • the first display sub-module is configured to display the control on a lower edge, an upper edge, a left edge, or a right edge of the interface.
  • the control is suspended above the interface; or
  • the screen area where the control is located is different from the screen area where the interface is located.
  • the color value corresponding to the control matches the color value corresponding to the adjacent interface portion; the adjacent interface portion is the interface portion adjacent to the control.
  • the device may further include:
  • a segmentation module configured to segment the control and the adjacent interface portion
  • the color value determining module is configured to determine a color value corresponding to the segment of the control according to the color value corresponding to the segment of the adjacent interface portion.
  • the color value determining module includes:
  • a color value acquisition sub-module configured to acquire the color values corresponding to a plurality of pixel points included in a segment of the adjacent interface portion
  • a first color value determining submodule configured to, if the color values corresponding to the plurality of pixel points included in the segment match, use the color value corresponding to one pixel point included in the segment as the color value corresponding to the segment of the control;
  • a second color value determining sub-module configured to, if the color values corresponding to the plurality of pixel points included in the segment do not match, use the average of the color values corresponding to the plurality of pixel points included in the segment as the color value corresponding to the segment of the control.
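The per-segment color matching above can be sketched as follows, assuming RGB tuples sampled from one segment of the adjacent interface portion; the `tolerance` parameter is an added assumption about what "matching" means.

```python
def segment_color(pixels, tolerance=0):
    """pixels: list of (r, g, b) values sampled from one segment of
    the adjacent interface portion. If all colors match (within an
    assumed per-channel tolerance), reuse one of them; otherwise use
    the per-channel average as the control segment's color."""
    first = pixels[0]
    if all(max(abs(a - b) for a, b in zip(p, first)) <= tolerance
           for p in pixels):
        return first  # colors match: take any one pixel's color
    n = len(pixels)
    return tuple(sum(p[i] for p in pixels) // n for i in range(3))
```

For example, `segment_color([(10, 10, 10), (20, 20, 20)])` averages to `(15, 15, 15)`.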
  • the display module 2601 may specifically include:
  • the second display sub-module is configured to display the control when a preset operation is received while the control is hidden.
  • the preset operation is an operation of sliding inward from the edge of the screen.
  • the device may further include:
  • a control change module configured to, if a set condition is satisfied, change the display state of the control while retaining at least part of the functions that respond when the control is operated.
  • the control is a first control having a navigation function, and the device may further include:
  • a control replacement module for replacing the first control with a second control having an operational function of the application.
  • the device may further include:
  • An object display module for displaying a plurality of objects
  • a first target object processing module configured to determine a target object of the plurality of objects in response to a sliding operation of the user, and output prompt information corresponding to the target object;
  • the interface entry module is configured to, when the sliding operation ends, enter the interface corresponding to the object determined to be the target object at the time the sliding operation ends.
  • the object display module is specifically configured to display a plurality of objects in response to a sliding operation for the control.
  • the device may further include:
  • the second target object processing module is configured to, if the sliding operation has not ended after the plurality of objects are displayed, determine the target object among the displayed objects and output the prompt information corresponding to the target object.
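The object display, target object processing, and interface entry modules can be sketched as a small state machine; all names here are hypothetical.

```python
class ObjectSelector:
    """While the slide is in progress, the object under the gesture
    becomes the target and a prompt is output; when the slide ends,
    the interface of the then-current target is entered."""

    def __init__(self, objects):
        self.objects = objects   # objects shown by the display module
        self.target = None

    def on_slide(self, index):   # target object processing modules
        self.target = self.objects[index]
        return f"prompt: {self.target}"

    def on_slide_end(self):      # interface entry module
        if self.target is None:
            return None          # slide ended without selecting anything
        return f"enter interface of {self.target}"
```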
  • the embodiment of the present application further provides an apparatus, which may include: one or more processors; and one or more machine readable media on which instructions are stored which, when executed by the one or more processors, cause the apparatus to perform the methods described in FIGS. 1 to 25.
  • the device can serve as a terminal device or as a server. Examples of terminal devices include: smart phones, tablets, e-book readers, MP3 (Moving Picture Experts Group Audio Layer III) players, MP4 (Moving Picture Experts Group Audio Layer IV) players, laptop computers, in-vehicle computers, desktop computers, set-top boxes, smart TVs, wearable devices, etc.
  • the embodiments of the present application do not limit the specific device.
  • the embodiment of the present application further provides a non-volatile readable storage medium storing one or more programs (modules) which, when applied to a device, cause the device to execute the instructions of the steps involved in the methods illustrated in FIGS. 1 to 25 of the embodiments of the present application.
  • FIG. 27 is a schematic structural diagram of hardware of a device according to an embodiment of the present disclosure.
  • the device may include an input device 2700, a processor 2701, an output device 2702, a memory 2703, and at least one communication bus 2704.
  • Communication bus 2704 is used to implement a communication connection between components.
  • the memory 2703 may include a high-speed RAM memory, and may also include a non-volatile memory (NVM), such as at least one disk memory, in which various programs may be stored for performing various processing functions and implementing the method steps of the present embodiment.
  • the processor 2701 may be, for example, a central processing unit (CPU), an application-specific integrated circuit (ASIC), a digital signal processor (DSP), a digital signal processing device (DSPD), a programmable logic device (PLD), a field programmable gate array (FPGA), a controller, a microcontroller, a microprocessor, or another electronic component.
  • the input device 2700 may include multiple input devices, for example, at least one of a user-oriented user interface, a device-oriented device interface, a software programmable interface, a camera, and a sensor.
  • the device-oriented device interface may be a wired interface for data transmission between devices, or may be a hardware insertion interface (for example, a USB interface, a serial port, etc.) for data transmission between devices.
  • the user-oriented user interface may be, for example, a user-oriented control button, a voice input device for receiving voice input, or a touch-sensing device for receiving a user's touch input (for example, a touch screen or a touchpad with a touch-sensing function);
  • the software-programmable interface may be, for example, an entry for the user to edit or modify a program, such as an input pin interface or an input interface of a chip; optionally, the device may further include a transceiver with a radio frequency transceiver chip, a baseband processing chip, and a transceiver antenna for communication functions.
  • An audio input device such as a microphone can receive voice data.
  • the output device 2702 may include output devices such as a display, a speaker, and the like.
  • the processor of the device includes the functions for executing the modules of the data processing apparatus described above; for the specific functions and technical effects, reference may be made to the foregoing embodiments, and details are not described here again.
  • FIG. 28 is a schematic structural diagram of hardware of a device according to an embodiment of the present application.
  • Figure 28 is a specific embodiment of the implementation of Figure 27.
  • the apparatus of this embodiment may include a processor 2801 and a memory 2802.
  • the processor 2801 executes the computer program code stored in the memory 2802 to implement the method shown in FIGS. 1 to 25 in the above embodiment.
  • the memory 2802 is configured to store various types of data to support operation at the device. Examples of such data include instructions for any application or method operating on the device, such as messages, pictures, videos, and the like.
  • the memory 2802 may include a random access memory (RAM), and may also include a non-volatile memory, such as at least one disk storage.
  • processor 2801 is disposed in processing component 2800.
  • the device may also include a communication component 2803, a power component 2804, a multimedia component 2805, an audio component 2806, an input/output interface 2807, and/or a sensor component 2808.
  • the components and the like included in the device are set according to actual requirements, which is not limited in this embodiment.
  • Processing component 2800 typically controls the overall operation of the device.
  • Processing component 2800 can include one or more processors 2801 to execute instructions to perform all or part of the steps of the methods illustrated in Figures 1 through 25 above.
  • processing component 2800 can include one or more modules to facilitate interaction between processing component 2800 and other components.
  • the processing component 2800 can include a multimedia module to facilitate interaction between the multimedia component 2805 and the processing component 2800.
  • Power component 2804 provides power to various components of the device.
  • Power component 2804 can include a power management system, one or more power sources, and other components associated with generating, managing, and distributing power for the devices.
  • the multimedia component 2805 includes a display screen that provides an output interface between the device and the user.
  • the display screen can include a liquid crystal display (LCD) and a touch panel (TP). If the display includes a touch panel, the display can be implemented as a touch screen to receive input signals from the user.
  • the touch panel includes one or more touch sensors to sense touches, slides, and gestures on the touch panel. The touch sensor may sense not only the boundary of the touch or sliding action, but also the duration and pressure associated with the touch or slide operation.
  • the audio component 2806 is configured to output and/or input an audio signal.
  • audio component 2806 includes a microphone (MIC) that is configured to receive an external audio signal when the device is in an operational mode, such as a voice recognition mode.
  • the received audio signal may be further stored in memory 2802 or transmitted via communication component 2803.
  • audio component 2806 also includes a speaker for outputting an audio signal.
  • the input/output interface 2807 provides an interface between the processing component 2800 and the peripheral interface module, which may be a click wheel, a button, or the like. These buttons may include, but are not limited to, a volume button, a start button, and a lock button.
  • Sensor assembly 2808 includes one or more sensors for providing a status assessment of various aspects of the device.
  • sensor component 2808 can detect the on/off state of the device, the relative positioning of the components, and the presence or absence of user contact with the device.
  • Sensor assembly 2808 can include a proximity sensor configured to detect the presence of nearby objects without any physical contact, including detecting the distance between the user and the device.
  • the sensor assembly 2808 can also include a camera or the like.
  • Communication component 2803 is configured to facilitate wired or wireless communication between the device and other devices.
  • the device can access a wireless network based on communication standards such as WiFi, 2G or 3G, or a combination thereof.
  • the device may include a SIM card slot for inserting the SIM card so that the device can log into the GPRS network to establish communication with the server via the Internet.
  • the communication component 2803, the audio component 2806, the input/output interface 2807, and the sensor component 2808 involved in the embodiment of FIG. 28 can be implemented as the input device in the embodiment of FIG. 27.
  • the embodiment of the present application further provides an operating system for a device.
  • the operating system may include:
  • a display unit 2901 configured to display a control
  • a receiving unit 2902 configured to receive a sliding operation for the control
  • the operation unit 2903 is configured to perform an operation corresponding to the sliding direction according to the sliding direction of the sliding operation.
  • the description here is relatively brief; for the relevant parts, refer to the description of the method embodiments.
  • the embodiments of the present application can be provided as a method, an apparatus, or a computer program product. Therefore, the embodiments of the present application may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware. Moreover, the embodiments of the present application can take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, etc.) containing computer-usable program code.
  • the computer device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
  • the memory may include non-persistent memory, random access memory (RAM), and/or non-volatile memory in a computer readable medium, such as read only memory (ROM) or flash memory.
  • Memory is an example of a computer readable medium.
  • Computer readable media includes persistent and non-persistent, removable and non-removable media.
  • Information storage can be implemented by any method or technology. The information can be computer readable instructions, data structures, modules of programs, or other data.
  • Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile disc (DVD) or other optical storage, magnetic cassettes, magnetic tape or disk storage or other magnetic storage devices, or any other non-transmission media that can be used to store information accessible by a computing device.
  • as defined herein, computer readable media does not include transitory computer readable media, such as modulated data signals and carrier waves.
  • Embodiments of the present application are described with reference to flowcharts and/or block diagrams of methods, devices (systems), and computer program products according to the embodiments of the present application. It will be understood that each flow and/or block of the flowcharts and/or block diagrams, and combinations of flows and/or blocks therein, can be implemented by computer program instructions.
  • These computer program instructions can be provided to a processor of a general-purpose computer, a special-purpose computer, an embedded processor, or another programmable data processing device to produce a machine, such that the instructions executed by the processor of the computer or other programmable data processing device produce an apparatus for implementing the functions specified in one or more flows of a flowchart and/or one or more blocks of a block diagram.
  • the computer program instructions can also be stored in a computer readable memory that can direct a computer or other programmable data processing device to operate in a particular manner, such that the instructions stored in the computer readable memory produce an article of manufacture including an instruction apparatus that implements the functions specified in one or more flows of a flowchart and/or one or more blocks of a block diagram.
  • These computer program instructions can also be loaded onto a computer or other programmable data processing device, such that a series of operational steps are performed on the computer or other programmable device to produce computer-implemented processing, whereby the instructions executed on the computer or other programmable device provide steps for implementing the functions specified in one or more flows of a flowchart and/or one or more blocks of a block diagram.

Abstract

Embodiments of the present invention relate to a processing method, device, and apparatus, and a machine-readable medium. The method specifically comprises: displaying a control; receiving a sliding operation on the control; and performing, according to the sliding direction of the sliding operation, an operation corresponding to that sliding direction. The embodiments of the invention increase the screen-to-body ratio of a terminal.
PCT/CN2018/111814 2017-11-06 2018-10-25 Procédé, dispositif et appareil de traitement, et support lisible par machine WO2019085810A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201711080504.9A CN109753214A (zh) 2017-11-06 2017-11-06 处理方法、装置、设备和机器可读介质
CN201711080504.9 2017-11-06

Publications (1)

Publication Number Publication Date
WO2019085810A1 true WO2019085810A1 (fr) 2019-05-09

Family

ID=66332436

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/111814 WO2019085810A1 (fr) 2017-11-06 2018-10-25 Procédé, dispositif et appareil de traitement, et support lisible par machine

Country Status (3)

Country Link
CN (1) CN109753214A (fr)
TW (1) TW201918858A (fr)
WO (1) WO2019085810A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111638836A (zh) * 2020-04-30 2020-09-08 维沃移动通信有限公司 信息的显示方法及电子设备

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113407144B (zh) * 2021-07-15 2022-10-25 维沃移动通信(杭州)有限公司 显示控制方法、装置
CN113760150B (zh) * 2021-09-22 2023-05-30 北京字跳网络技术有限公司 页面处理方法、装置、设备及存储介质

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100194677A1 (en) * 2009-02-03 2010-08-05 Microsoft Corporation Mapping of physical controls for surface computing
CN106339149A (zh) * 2016-08-09 2017-01-18 北京三快在线科技有限公司 一种显示控制方法、装置及电子设备
CN106855796A (zh) * 2015-12-09 2017-06-16 阿里巴巴集团控股有限公司 一种数据处理方法、装置和智能终端

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102866916B (zh) * 2012-09-07 2014-12-24 华为终端有限公司 一种终端及动态加载应用程序界面的方法
CN104142789B (zh) * 2013-05-07 2018-07-20 腾讯科技(深圳)有限公司 内容选择方法、装置及终端
CN106155454B (zh) * 2015-03-30 2020-06-16 阿里巴巴集团控股有限公司 一种界面显示方法、装置及电子设备
CN107133022B (zh) * 2016-02-26 2021-05-25 百度在线网络技术(北京)有限公司 终端设备中的控件显示方法和装置
CN107315747A (zh) * 2016-04-26 2017-11-03 斑马网络技术有限公司 服务显示方法、装置、设备和系统
CN106020678A (zh) * 2016-04-29 2016-10-12 青岛海信移动通信技术股份有限公司 一种在移动设备进行触控操作的方法和装置
CN106055243A (zh) * 2016-05-23 2016-10-26 北京金山安全软件有限公司 一种应用程序图标处理方法、装置及电子设备
CN106095269B (zh) * 2016-06-02 2020-12-25 腾讯科技(深圳)有限公司 信息显示方法、装置及系统
CN106227422A (zh) * 2016-07-15 2016-12-14 乐视控股(北京)有限公司 视频播放器控制方法及相关装置
CN106250190A (zh) * 2016-08-01 2016-12-21 深圳市金立通信设备有限公司 一种应用启动方法及终端
CN107122181A (zh) * 2017-04-01 2017-09-01 珠海市魅族科技有限公司 一种控件展示方法及装置



Also Published As

Publication number Publication date
CN109753214A (zh) 2019-05-14
TW201918858A (zh) 2019-05-16

Similar Documents

Publication Publication Date Title
WO2019085821A1 (fr) Procédé, dispositif et appareil de traitement, et support lisible par machine
WO2019085820A1 (fr) Procédé de traitement, dispositif, appareil, et support lisible par machine
CN108089786B (zh) 用户界面显示方法、装置、设备及存储介质
EP3454197B1 (fr) Procédé, dispositif, et support de stockage lisible par ordinateur non transitoire pour commuter des pages d'applications dans un dispositif terminal
CN109164964B (zh) 内容分享方法、装置、终端及存储介质
US10367765B2 (en) User terminal and method of displaying lock screen thereof
US11500513B2 (en) Method for icon display, terminal, and storage medium
US9213467B2 (en) Interaction method and interaction device
CN107153541B (zh) 浏览交互处理方法及装置
US10761688B2 (en) Method and apparatus for editing object
CN103391469B (zh) 移动终端及其控制方法
KR101810468B1 (ko) 이동 단말기 및 그것의 제어 방법
WO2018112925A1 (fr) Procédé, dispositif et appareil terminal d'affichage d'informations
WO2019233307A1 (fr) Procédé et appareil d'affichage d'interface utilisateur, terminal et support d'informations
WO2018112928A1 (fr) Procédé d'affichage d'informations, appareil et dispositif terminal
US11537265B2 (en) Method and apparatus for displaying object
WO2019085810A1 (fr) Procédé, dispositif et appareil de traitement, et support lisible par machine
CN114281225A (zh) 一种窗口显示方法及设备
EP2947567A1 (fr) Customisation de l´apparence d´un texte d´une interface utilisateur
CN108475172B (zh) 一种信息展示方法、装置及终端设备
CN109683760B (zh) 最近内容的显示方法、装置、终端及存储介质
CN109388309B (zh) 菜单的显示方法、装置、终端及存储介质
WO2016155518A1 (fr) Procédé et dispositif d'interaction d'interface
CN107562324B (zh) 数据显示控制的方法和终端
CN107862728B (zh) 图片标签的添加方法、装置及计算机可读存储介质

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18872352

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18872352

Country of ref document: EP

Kind code of ref document: A1