WO2020233323A1 - Display control method, terminal device, and computer-readable storage medium - Google Patents

Display control method, terminal device, and computer-readable storage medium

Info

Publication number
WO2020233323A1
Authority
WO
WIPO (PCT)
Prior art keywords
input
target
display
target control
terminal device
Prior art date
Application number
PCT/CN2020/086113
Other languages
English (en)
Chinese (zh)
Inventor
陈艺锦
彭义军
Original Assignee
维沃移动通信有限公司 (Vivo Mobile Communication Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 维沃移动通信有限公司
Publication of WO2020233323A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04886Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus

Definitions

  • the embodiments of the present disclosure relate to the field of display control technology, and in particular, to a display control method, a terminal device, and a computer-readable storage medium.
  • the embodiments of the present disclosure provide a display control method, a terminal device, and a computer-readable storage medium, so as to solve the problems in the related art that, when the user needs to adjust the display interface, the continuity of the overall operation is broken and the user interface is blocked.
  • the embodiments of the present disclosure provide a display control method, which is applied to a terminal device.
  • the terminal device includes a first touch screen and a touch area provided opposite to the first touch screen.
  • the method includes: receiving a first input in the touch area when the first touch screen has a display interface; and adjusting, in response to the first input, at least one target control on the display interface.
  • an embodiment of the present disclosure provides a terminal device, the terminal device including a first touch screen and a touch area provided opposite to the first touch screen, the terminal device further including:
  • the first receiving module is configured to receive a first input in the touch area when the first touch screen has a display interface;
  • the first adjustment module is configured to adjust at least one target control on the display interface in response to the first input.
  • the embodiments of the present disclosure provide another terminal device, including a processor, a memory, and a program stored on the memory and runnable on the processor; when the program is executed by the processor, the steps of the above display control method are implemented.
  • embodiments of the present disclosure provide a computer-readable storage medium with a program stored on the computer-readable storage medium, and when the program is executed by a processor, the steps of the above display control method are implemented.
  • the display control method receives a first input in the touch area when the first touch screen has a display interface, and in response to the first input, adjusts at least one target control on the display interface.
  • the display control method provided by the embodiments of the present disclosure supports quickly adjusting the display interface of the first touch screen through the touch area set opposite to the first touch screen, without performing complex gesture operations on the first touch screen and without blocking the user interface, which can improve the continuity of the overall operation.
  • FIG. 1 is one of the flowcharts of the display control method provided by an embodiment of the present disclosure
  • FIG. 3 is a schematic diagram of a first user interface provided by an embodiment of the present disclosure.
  • FIG. 4 is a schematic diagram of the first input provided by an embodiment of the present disclosure.
  • FIG. 5 is a schematic diagram of a second input provided by an embodiment of the present disclosure.
  • FIG. 6 is one of the structural diagrams of a terminal device provided by an embodiment of the present disclosure.
  • FIG. 7 is the second structural diagram of a terminal device provided by an embodiment of the present disclosure.
  • FIG. 8 is the third structural diagram of a terminal device provided by an embodiment of the present disclosure.
  • FIG. 9 is the fourth structural diagram of a terminal device provided by an embodiment of the present disclosure.
  • FIG. 10 is the fifth structural diagram of a terminal device provided by an embodiment of the present disclosure.
  • FIG. 11 is a sixth structural diagram of a terminal device provided by an embodiment of the present disclosure.
  • FIG. 12 is the seventh structural diagram of a terminal device provided by an embodiment of the present disclosure.
  • FIG. 13 is a schematic diagram of the hardware structure of a terminal device provided by an embodiment of the present disclosure.
  • FIG. 1 is one of the flowcharts of the display control method provided by an embodiment of the present disclosure.
  • the display control method is applied to a terminal device.
  • the terminal device includes a first touch screen and a touch area set opposite to the first touch screen.
  • as shown in FIG. 1, the method includes the following steps:
  • Step 101 When the first touch screen has a display interface, receive a first input in the touch area.
  • the terminal device receives the first input in the touch area.
  • the touch area may be an area with touch function, for example, it may be a fingerprint recognition area with touch function, or it may be a second touch screen.
  • the first input may be a preset input for triggering switching of the display content; for example, it may be a pressing input in which the number of pressing areas is greater than a preset number, a pressing input whose pressing duration is longer than a preset duration, a pressing input whose pressing strength is greater than a preset strength, or a sliding input that slides along a first preset track.
  • the user may perform an operation corresponding to the first input in the touch area when the first user interface is displayed on the first touch screen.
  • the first user interface may be an interface of a preset application program, or may be a preset user interface, or may be another user interface, which is not specifically limited in the embodiment of the present disclosure.
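The preset trigger conditions above can be sketched in code. The following is an illustrative sketch only, not from the patent: the event fields, the threshold values, and the track-recognition hook are all assumptions.

```python
# Hypothetical sketch: deciding whether a raw touch event in the rear touch
# area qualifies as the "first input". All thresholds are assumed values.
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class TouchEvent:
    press_points: List[Tuple[float, float]]  # simultaneous pressing areas
    duration_ms: float                       # how long the press lasted
    force: float                             # normalized pressing strength
    track: List[Tuple[float, float]] = field(default_factory=list)  # slide path

PRESET_COUNT = 2         # e.g. a two-finger pressing input
PRESET_DURATION = 500.0  # preset duration in milliseconds (assumed)
PRESET_FORCE = 0.8       # preset strength on a 0..1 scale (assumed)
PRESET_TRACK = "L"       # label of the first preset slide track (assumed)

def is_first_input(ev: TouchEvent, recognize_track=lambda t: None) -> bool:
    """Return True if the event matches any preset trigger condition."""
    if len(ev.press_points) >= PRESET_COUNT:      # pressing-area count
        return True
    if ev.duration_ms > PRESET_DURATION:          # pressing duration
        return True
    if ev.force > PRESET_FORCE:                   # pressing strength
        return True
    return recognize_track(ev.track) == PRESET_TRACK  # preset slide track
```

Any one of the four conditions suffices, mirroring the "or" list in the text.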
  • Step 102 In response to the first input, adjust at least one target control on the display interface.
  • the terminal device adjusts at least one target control on the display interface in response to the first input.
  • adjusting the at least one target control of the display interface may include at least one of the following:
  • the terminal device may determine the at least one target control according to the first input, and/or determine the target content and/or target area according to the first input.
  • the aforementioned terminal device may be a mobile terminal device, such as a mobile phone, a tablet personal computer (Tablet Personal Computer), a laptop computer (Laptop Computer), a personal digital assistant (Personal Digital Assistant, PDA), a mobile Internet device (Mobile Internet Device, MID), a wearable device (Wearable Device), or a digital camera; it may also be a fixed terminal device, such as a computer.
  • the display control method receives a first input in the touch area when the first touch screen has a display interface, and in response to the first input, adjusts at least one target control on the display interface.
  • the display control method provided by this embodiment supports quickly adjusting the display interface of the first touch screen through a touch area set opposite to the first touch screen, without performing complex gesture operations on the first touch screen and without blocking the user interface, which can improve the continuity of the overall operation.
  • the adjusting at least one target control of the display interface includes at least one of the following:
  • optionally, before adjusting at least one target control of the display interface in response to the first input, the method further includes:
  • the terminal device determines at least one target control of the display interface according to the first input, and/or determines a target area and/or target content according to the first input.
  • the terminal device may determine at least one target control of the display interface only according to the first input, may determine the target area and/or the target content only according to the first input, or may both determine at least one target control of the display interface according to the first input and determine the target area and/or target content according to the first input.
  • the terminal device may determine that a preset control is the at least one target control when receiving the first input, or may determine that the control corresponding to the first input is the at least one target control, where different inputs correspond to different controls. Specifically, the terminal device may determine the at least one target control according to the input type of the first input, according to the input area of the first input, or according to the fingerprint information of the first input.
  • the terminal device determines a target area and/or target content according to the first input.
  • the terminal device may determine that the preset content is the target content and/or determine that the preset area is the target area when receiving the first input; it may also determine that the content corresponding to the first input is the target content, where Different inputs correspond to different content, and/or it is determined that the area corresponding to the first input is the target area, where different inputs correspond to different areas.
  • the terminal device may determine the target area and/or target content according to the input type of the first input, according to the input area of the first input, or according to the fingerprint information of the first input.
  • the terminal device may determine that the area corresponding to the first input is the first area, and then determine that the content displayed in the first area on the first user interface is the target content, where different inputs correspond to different areas.
  • the target content may be preset as the central area content of the minimap.
  • the terminal device may determine the target area and/or target content according to the fingerprint information of the first input. Specifically, the terminal device may obtain the fingerprint information of the first input, and then determine the target area and/or the target content according to the fingerprint information.
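One plausible way to realize the fingerprint-based selection described above is a lookup table from enrolled fingerprints to registered targets. This is a hypothetical sketch; the fingerprint IDs, target names, and fallback presets are invented for illustration.

```python
# Hypothetical registry: each enrolled fingerprint selects a target area
# and/or target content. Names here are placeholders, not from the patent.
FINGERPRINT_TARGETS = {
    "index_finger": {"area": "center", "content": "minimap_center"},
    "middle_finger": {"area": "top_right", "content": "skill_panel"},
}

def resolve_target(fingerprint_id, default_area="center", default_content=None):
    """Look up the target area/content registered for a fingerprint,
    falling back to preset defaults when the fingerprint is unknown."""
    entry = FINGERPRINT_TARGETS.get(fingerprint_id)
    if entry is None:
        return default_area, default_content
    return entry["area"], entry["content"]
```

The fallback branch corresponds to the "preset content/area" alternative mentioned earlier in the text.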
  • the determining at least one target control of the display interface according to the first input includes:
  • the terminal device obtains the fingerprint information of the first input, and determines at least one target control of the display interface according to the fingerprint information.
  • the fingerprint information of the first input may include one fingerprint image or multiple fingerprint images.
  • the fingerprint image corresponds to at least one control.
  • different fingerprint image combinations correspond to different controls, or each fingerprint image corresponds to at least one control.
  • the fingerprint information includes at least two fingerprint images
  • the determining at least one target control of the display interface according to the fingerprint information includes:
  • each fingerprint image corresponds to at least one control, or different combinations of fingerprint images correspond to different controls.
  • the fingerprint information includes at least two fingerprint images
  • the terminal device determines at least one target control on the display interface according to the at least two fingerprint images, wherein each fingerprint image corresponds to at least one control, or different fingerprint image combinations correspond to different controls.
  • the determining the target area and/or target content according to the first input includes:
  • the target area and/or the target content are determined according to the fingerprint information.
  • the terminal device obtains the fingerprint information of the first input, and determines the target area and/or the target content according to the fingerprint information.
  • the fingerprint information of the first input may include one fingerprint image or multiple fingerprint images.
  • the fingerprint image corresponds to at least one area or at least one content.
  • the fingerprint information includes multiple fingerprint images, different combinations of fingerprint images correspond to different regions or contents, or each fingerprint image corresponds to at least one region or at least one item of content.
  • the fingerprint information includes at least two fingerprint images, and the determining the target area and/or the target content according to the fingerprint information includes:
  • each fingerprint image corresponds to at least one area or at least one item of content, or different fingerprint image combinations correspond to different areas or different content.
  • the fingerprint information includes at least two fingerprint images
  • the terminal device determines the target area and/or target content according to the at least two fingerprint images, wherein each fingerprint image corresponds to at least one area or at least one item of content, or different fingerprint image combinations correspond to different areas or different content.
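The combination rule above could be modeled as a mapping from the set of recognized fingerprint images to target controls. A minimal sketch, with placeholder finger and control names that are not from the patent:

```python
# Hypothetical mapping: the combination of fingers pressing together selects
# the target control(s). frozenset keys make the combination order-independent.
COMBO_CONTROLS = {
    frozenset({"index"}): ["minimap"],
    frozenset({"index", "middle"}): ["minimap", "skill_bar"],
    frozenset({"middle"}): ["chat_panel"],
}

def controls_for(fingerprints):
    """Return the target controls registered for this fingerprint combination,
    or an empty list when the combination is not registered."""
    return COMBO_CONTROLS.get(frozenset(fingerprints), [])
```

Using a set of fingerprint labels as the key captures "different fingerprint image combinations correspond to different controls"; a per-image mapping would instead union the controls of each image.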
  • the first input includes at least one of the following:
  • a pressing input in which the number of pressing areas is greater than a preset number (for example, a two-finger pressing input);
  • FIG. 2 is the second flowchart of the display control method provided by an embodiment of the present disclosure. The method is applied to a terminal device.
  • the main difference between this embodiment and the embodiment shown in FIG. 1 is that this embodiment further defines that the at least one target control is a map, and further defines adjusting the visual field content in the at least one target control according to a second input. As shown in FIG. 2, the method includes the following steps:
  • Step 201 When the first touch screen has a display interface, receive a first input in the touch area.
  • Step 202 In response to the first input, adjust at least one target control on the display interface.
  • step 201 and step 202 are the same as step 101 and step 102 in the embodiment shown in FIG. 1 of the present disclosure, and will not be repeated here.
  • Step 203 Receive a second input in the touch area.
  • after the terminal device adjusts at least one target control of the display interface in response to the first input, the user can adjust the display content of the enlarged image by performing a second input in the touch area.
  • the terminal device receives a second input in the touch area.
  • the second input may include a sliding input.
  • Step 204 Adjust the visual field content in the at least one target control according to the second input.
  • the terminal device adjusts the visual field content in the at least one target control according to the second input.
  • the terminal device may adjust the visual field content in the at least one target control according to the sliding direction and/or sliding distance of the second input. For example, the field of view displayed in the at least one target control may be controlled to move in the sliding direction of the second input, with a movement distance corresponding to the sliding distance of the sliding input. In this way, the view of the map can be adjusted in the touch area without requiring the user to perform operations on the first touch screen, which can improve the continuity of the overall operation.
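The view adjustment described above can be sketched as a clamped viewport shift: the slide vector gives the direction, the slide distance (times a scale factor) gives the movement, and the result is clamped to the map bounds. The scale factor and coordinate model are assumptions.

```python
# Illustrative sketch: move the field of view shown in the target control by
# the slide vector (dx, dy), scaled into map units and clamped so the
# viewport never leaves the map. The scale factor is an assumed value.
def move_viewport(view_x, view_y, dx, dy, map_w, map_h, view_w, view_h, scale=2.0):
    """Return the new top-left corner of the viewport after a slide input."""
    nx = min(max(view_x + dx * scale, 0), map_w - view_w)  # clamp horizontally
    ny = min(max(view_y + dy * scale, 0), map_h - view_h)  # clamp vertically
    return nx, ny
```

Repeated slide inputs simply call this again from the current viewport position, which matches the "slide again to continue adjusting" behavior described later in the text.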
  • the at least one target control may be a minimap
  • the target content may be the content of the central area of the minimap
  • the target area may be the central area of the game interface.
  • a second input can be performed in the touch area to adjust the visual field content displayed in the at least one target control. In this way, it is possible to adjust the field of view of the minimap without performing operations on the first touch screen, and view the content of different fields of view in the minimap, thereby improving the continuity of the overall operation of the game.
  • the display control method receives a first input in the touch area when the first touch screen has a display interface; determines a target area or target content according to the first input; in response to the first input, adjusts at least one target control of the display interface; receives a second input in the touch area; and adjusts the field of view content in the at least one target control according to the second input.
  • the display control method provided in this embodiment supports quickly adjusting the content of the field of view in the at least one target control through the touch area set opposite to the first touch screen, without performing complex gesture operations on the first touch screen and without blocking the user interface, which can improve the continuity of the overall operation.
  • in some embodiments, adjusting at least one target control of the display interface specifically means adjusting the display position and/or display ratio of the target content in the at least one target control.
  • the terminal device receives the first input, and may, in response to the first input, display an enlarged view of the at least one target control at the target position of the user interface.
  • the at least one target control may be a control determined according to the first input.
  • the terminal device may, in response to the first input, determine at least one target control and the target area according to the first input, and then display the enlarged view of the at least one target control in the target area of the display interface.
  • the first user interface 300 includes a first display area 301, a second display area 302, a third display area 303, and a fourth display area. Display area 304.
  • the first user interface 300 is an interface of a MOBA (Multiplayer Online Battle Arena) game, and the first user interface 300 includes a virtual character game scene, a small map, a virtual character direction control interface, and a virtual character skill control interface.
  • the minimap can be a thumbnail of the entire game scene or a thumbnail of part of the game scene; the content displayed on the minimap differs in different types of games.
  • the first display area 301 is used to display a small map
  • the second display area 302 is used to display a virtual character game scene
  • the third display area 303 is used to display a virtual character movement control interface.
  • the fourth display area 304 is used to display the virtual character skill control interface.
  • the user can view the minimap in the first display area 301, view the virtual character's game situation in the second display area 302, perform movement control operations on the virtual character in the third display area 303, and perform skill control operations on the virtual character in the fourth display area 304.
  • the detailed content of the minimap can be displayed in the second display area 302.
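For illustration only, the four display areas of FIG. 3 could be modeled as rectangles with a hit test. The coordinates below are invented; only the region roles (301 minimap, 302 game scene, 303 movement control, 304 skill control) come from the text.

```python
# Hypothetical layout of the four display areas as (x0, y0, x1, y1) rectangles
# on a 1920x1080 screen. The game scene (area 302) is the full-screen backdrop.
REGIONS = {
    "minimap": (0, 0, 300, 300),               # first display area 301
    "game_scene": (0, 0, 1920, 1080),          # second display area 302
    "move_control": (50, 700, 400, 1050),      # third display area 303
    "skill_control": (1500, 700, 1870, 1050),  # fourth display area 304
}

def hit_region(x, y):
    """Return the foreground region containing (x, y); the backdrop game
    scene is the fallback when no overlay region is hit."""
    for name, (x0, y0, x1, y1) in REGIONS.items():
        if name != "game_scene" and x0 <= x <= x1 and y0 <= y <= y1:
            return name
    return "game_scene"
```

A hit test like this is what forces the trade-off the next paragraphs describe: touching the minimap or the controls on the front screen occupies a hand and covers part of the scene.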
  • in the process of playing a game, the user usually operates the third display area 303 with the left hand and the fourth display area 304 with the right hand.
  • if the user needs to view the detailed content of the minimap at the same time, either the left hand must perform the viewing operation while controlling movement, which is difficult; or the left hand must give up the operation on the third display area 303 (that is, give up moving the virtual character), which easily affects the user's game experience; or the user must use the right hand to perform a viewing operation on the small map in the first display area 301, which easily blocks the user's line of sight.
  • in this case, the first input can be performed on the touch area of the terminal device; the terminal device receives the first input and displays an enlarged view of the small map in the second display area 302.
  • the first input can be performed on the touch area by a finger of the hand holding the side of the terminal device opposite to the first touch screen; that is, the user can view the detailed content of the minimap without affecting the normal operation of the game, which can effectively improve the continuity of the overall operation.
  • the adjusting the visual field content in the at least one target control according to the second input includes:
  • the terminal device may determine the view movement direction and the view movement distance according to the second input, and then move the view in the at least one target control according to the view movement direction and the view movement distance, thereby adjusting the visual field content in the at least one target control.
  • the user may perform a second input in the touch area, and the terminal device responds to the second input and adjusts the visual field content of the at least one target control according to the second input.
  • the second input may include a sliding input.
  • the terminal device may adjust the visual field content in the at least one target control according to the sliding direction and/or sliding distance of the second input. For example, the field of view displayed in the at least one target control may be controlled to move in the sliding direction of the second input, with a movement distance corresponding to the sliding distance of the sliding input. In this way, the view of the map can be adjusted in the touch area without requiring the user to perform an operation on the first touch screen, which can improve the continuity of the overall operation.
  • the at least one target control may be a minimap
  • the target content may be the content of the central area of the minimap
  • the target area may be the central area of the game interface.
  • a second input can be performed in the touch area to adjust the visual field content of the minimap displayed in the at least one target control. In this way, it is possible to adjust the field of view of the minimap without performing operations on the first touch screen, and view the content of different fields of view in the minimap, thereby improving the continuity of the overall operation of the game.
  • when the adjusting of at least one target control of the display interface is displaying, on the display interface, an enlarged view of the target content in the at least one target control, after adjusting at least one target control of the display interface in response to the first input, the method further includes:
  • closing, in response to a third input received in the touch area, the enlarged view of the target content displayed on the display interface.
  • specifically, a third input may be performed in the touch area, and the terminal device may close, in response to the third input, the enlarged view of the target content displayed on the display interface.
  • in this way, the enlarged view of the user interface can be closed in the touch area without the user performing an operation on the first touch screen, which improves the continuity of the overall operation.
  • for example, a third input can be performed in the touch area, and the terminal device, in response to the third input, closes the enlarged view displayed in the second display area 302; that is, the virtual character game scene is redisplayed in the second display area 302.
  • the third input may be a preset input, and may be the same input as the first input or an input different from the first input. For example, assume that the first input is a two-finger pressing input and the third input is also a two-finger pressing input. If the terminal device receives a two-finger pressing input on the touch area while the display interface does not display an enlarged view of the target content in the at least one target control, the terminal device responds to the two-finger pressing input and displays an enlarged view of the target content in the at least one target control on the display interface. Conversely, if the terminal device receives a two-finger pressing input in the touch area while the display interface displays an enlarged view of the target content in the at least one target control, the terminal device closes the enlarged view on the display interface.
  • for another example, if the terminal device receives a sliding input that slides along a first preset trajectory, it displays an enlarged view of the target content in the at least one target control on the display interface; if the terminal device receives a sliding input along a second preset trajectory, it closes the enlarged view displayed on the display interface.
  • the third input may also be a touch withdrawal input in the touch area.
  • when the terminal device displays an enlarged view of the target content in the at least one target control on the display interface, if all touch inputs performed on the touch area are withdrawn, that is, there is no longer any touch input on the touch area, the terminal device closes the enlarged view displayed on the display interface.
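When the first and third inputs are the same gesture (e.g. both a two-finger press), the behavior described above amounts to a simple toggle. A minimal sketch, assuming the gesture has already been recognized:

```python
# Illustrative toggle: the same recognized gesture opens the enlarged view
# when it is closed and closes it when it is open.
class EnlargedViewToggle:
    def __init__(self):
        self.showing = False  # no enlarged view at start

    def on_two_finger_press(self):
        """Handle the recognized gesture and return the resulting action."""
        self.showing = not self.showing
        return "show" if self.showing else "close"
```

The touch-withdrawal variant would instead close the view from a pointer-up handler once no fingers remain on the touch area, as sketched later for the game walkthrough.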
  • the first input includes a two-finger pressing input
  • the second input includes a single-finger pressing input and a single-finger sliding input.
  • FIG. 4 is a schematic diagram of the first input provided by an embodiment of the present disclosure.
  • when the corresponding pressing inputs (for example, a two-finger pressing input) are received in the touch area, the terminal device may determine that the first input in the touch area is received.
  • the second input includes a pressing input of a first finger on the touch area and a sliding input of a second finger on the touch area, and the pressing input and the sliding input are performed simultaneously.
  • FIG. 5 is a schematic diagram of the second input provided by an embodiment of the present disclosure. As shown in FIG. 5, when the area 401 of the touch area 400 receives a pressing input and a sliding input along the track 403 is received, the terminal device may determine that the second input in the touch area is received.
  • the third input may include a touch withdrawal input in the touch area.
  • for example, the first user interface is a game interface. When the user's first finger (for example, the left middle finger) and second finger (for example, the right middle finger) both press the touch area, the terminal device determines that it receives a two-finger pressing input in the touch area, and displays an enlarged view of the content of the center area of the small map on the first user interface.
  • then, while keeping the pressing input of the first finger, the user slides the second finger in the touch area; the terminal device determines that the second input in the touch area is received, and adjusts the field of view of the small map in the enlarged image according to the second input. If the field of view that the user needs to view is far from the center area of the minimap and cannot be reached with one sliding input, the user can lift the second finger away from the touch area while keeping the pressing input of the first finger unchanged, and then slide in the touch area again; the terminal device determines that the second input in the touch area is received again, and continues to adjust the field of view of the small map in the enlarged image according to the received second input.
  • after viewing the detailed content of the minimap, the user can lift both the first finger and the second finger away from the touch area; when both fingers have been lifted away, the terminal device closes the enlarged image displayed in the first user interface. It should be noted that, in this case, after receiving the first input, the terminal device keeps displaying the enlarged image on the first user interface until the last finger is lifted away from the touch area.
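  • The gesture flow above (two-finger press opens the enlarged minimap, a held press plus a slide pans it, lifting the last finger closes it) can be sketched as a small state machine. This is an illustrative sketch only; the class, method, and finger-identifier names are assumptions, not part of the disclosure.

```python
# Hypothetical sketch of the rear-touch-area gesture flow described above.
class MinimapGestureController:
    def __init__(self):
        self.enlarged_visible = False   # whether the enlarged image is shown
        self.view_offset = (0.0, 0.0)   # current field-of-view offset
        self.pressed_fingers = set()    # finger ids currently pressing

    def on_press(self, finger_id):
        self.pressed_fingers.add(finger_id)
        # First input: two fingers pressing simultaneously opens the enlarged map.
        if len(self.pressed_fingers) == 2 and not self.enlarged_visible:
            self.enlarged_visible = True

    def on_slide(self, finger_id, dx, dy):
        # Second input: a finger held down slides while another keeps pressing;
        # the slide pans the field of view of the enlarged image.
        if self.enlarged_visible and finger_id in self.pressed_fingers:
            ox, oy = self.view_offset
            self.view_offset = (ox + dx, oy + dy)

    def on_release(self, finger_id):
        self.pressed_fingers.discard(finger_id)
        # Third input: when the last finger leaves the area, close the image.
        if not self.pressed_fingers:
            self.enlarged_visible = False
```

  Note that, as in the description, the enlarged image stays visible across repeated slide inputs as long as at least one finger keeps pressing.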
  • the determining the visual field movement direction and visual field movement distance according to the second input includes:
  • the moving distance of the visual field is determined according to the sliding distance of the single-finger sliding input of the second input.
  • the terminal device may determine the movement direction of the field of view according to the sliding direction of the single-finger sliding input of the second input, and determine the movement distance of the field of view according to the sliding distance of the single-finger sliding input of the second input.
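  • The mapping above can be sketched as a small function: the movement direction follows the slide direction, and the movement distance scales with the slide distance. The `scale` factor is an assumption for illustration; the disclosure does not fix a particular mapping.

```python
import math

# Illustrative sketch: derive the field-of-view movement from a slide
# from `start` to `end` (touch-area coordinates). The direction follows
# the slide direction; the distance scales with the slide distance.
def view_movement(start, end, scale=2.0):
    dx, dy = end[0] - start[0], end[1] - start[1]
    distance = math.hypot(dx, dy) * scale          # field-of-view movement distance
    direction = math.degrees(math.atan2(dy, dx))   # movement direction, in degrees
    return direction, distance
```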
  • FIG. 6 is one of the structural diagrams of the terminal device provided by the embodiment of the present disclosure, which can realize the details of the display control method in the above-mentioned embodiment and achieve the same effect.
  • the terminal device 600 includes a first receiving module 601 and a first adjusting module 602 that are connected to each other, wherein:
  • the first receiving module 601 is configured to receive a first input in the touch area when the first touch screen has a display interface
  • the first adjustment module 602 is configured to adjust at least one target control on the display interface in response to the first input.
  • the first adjustment module 602 is specifically configured to perform at least one of the following:
  • FIG. 7 is a second structural diagram of a terminal device provided by an embodiment of the present disclosure.
  • the terminal device 600 further includes:
  • the first determining module 603 is configured to determine at least one target control of the display interface according to the first input;
  • the second determining module 604 is configured to determine the target area and/or target content according to the first input.
  • FIG. 8 is the third structural diagram of a terminal device provided by an embodiment of the present disclosure.
  • the first determining module 603 includes:
  • the first obtaining unit 6031 is configured to obtain fingerprint information of the first input
  • the first determining unit 6032 is configured to determine at least one target control of the display interface according to the fingerprint information.
  • the fingerprint information includes at least two fingerprint images, and the first determining unit 6032 is specifically configured to:
  • each fingerprint image corresponds to at least one control, or different combinations of fingerprint images correspond to different controls.
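  • The correspondence above (each fingerprint image maps to at least one control, and different combinations of fingerprint images map to different controls) can be sketched as a lookup table. The finger names and control names below are placeholders for illustration, not values from the disclosure.

```python
# Hypothetical mapping from recognized fingerprint images (and combinations
# of them) to target controls on the display interface.
FINGERPRINTS_TO_CONTROLS = {
    frozenset({"left_middle"}): ["minimap"],
    frozenset({"right_middle"}): ["skill_bar"],
    frozenset({"left_middle", "right_middle"}): ["minimap", "skill_bar"],
}

def target_controls(fingerprint_images):
    """Return the target controls for a set of recognized fingerprint images."""
    return FINGERPRINTS_TO_CONTROLS.get(frozenset(fingerprint_images), [])
```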
  • FIG. 9 is a fourth structural diagram of a terminal device provided in an embodiment of the present disclosure.
  • the second determining module 604 includes:
  • the second obtaining unit 6041 is configured to obtain fingerprint information of the first input
  • the second determining unit 6042 is configured to determine the target area and/or the target content according to the fingerprint information.
  • the fingerprint information includes at least two fingerprint images
  • the second determining unit 6042 is specifically configured to:
  • each fingerprint image corresponds to at least one area or at least one item of content, or different fingerprint image combinations correspond to different areas or different content.
  • FIG. 10 is the fifth structural diagram of a terminal device provided by an embodiment of the present disclosure.
  • the terminal device 600 further includes:
  • the second receiving module 605 is configured to receive a second input in the touch area
  • the second adjustment module 606 is configured to adjust the visual field content in the at least one target control according to the second input.
  • FIG. 11 is a sixth structural diagram of a terminal device according to an embodiment of the present disclosure.
  • the second adjustment module 606 includes:
  • the determining unit 6061 is configured to determine the moving direction and moving distance of the visual field according to the second input;
  • the adjusting unit 6062 is configured to adjust the visual field content in the at least one target control according to the visual field movement direction and the visual field movement distance.
  • the first input includes a two-finger pressing input
  • the second input includes a single-finger pressing input and a single-finger sliding input.
  • FIG. 12 is a seventh structural diagram of a terminal device provided by an embodiment of the present disclosure.
  • the determining unit 6061 includes:
  • the first determining subunit 60611 is configured to determine the direction of movement of the field of view according to the sliding direction of the single-finger sliding input of the second input;
  • the second determining sub-unit 60612 is configured to determine the visual field moving distance according to the sliding distance of the single-finger sliding input of the second input.
  • the first input includes at least one of the following:
  • a pressing input in which the number of pressing areas is greater than a preset number
  • the terminal device includes a first touch screen and a touch area provided opposite to the first touch screen.
  • the terminal device includes a first touch screen with a display interface. A first input is received in the touch area; in response to the first input, at least one target control of the display interface is adjusted. In this way, the touch area set opposite to the first touch screen supports quickly adjusting the display interface of the first touch screen, without performing complex gesture operations on the first touch screen and without blocking the user interface, which can improve the continuity of the overall operation.
  • FIG. 13 is a schematic diagram of the hardware structure of a terminal device implementing various embodiments of the present disclosure.
  • the terminal device 1300 includes, but is not limited to: a radio frequency unit 1301, a network module 1302, an audio output unit 1303, an input unit 1304, a sensor 1305, a display unit 1306, a user input unit 1307, an interface unit 1308, a memory 1309, a processor 1310, and a power supply 1311.
  • the terminal device may include more or fewer components than those shown in the figure, or combine certain components, or use a different component layout.
  • terminal devices include, but are not limited to, mobile phones, tablet computers, notebook computers, palmtop computers, vehicle-mounted terminal devices, wearable devices, and the like
  • the processor 1310 is used for:
  • the adjustment of at least one target control of the display interface executed by the processor 1310 includes at least one of the following:
  • the following steps may be further implemented:
  • the determining at least one target control of the display interface according to the first input executed by the processor 1310 includes:
  • the fingerprint information includes at least two fingerprint images
  • determining at least one target control of the display interface according to the fingerprint information executed by the processor 1310 includes:
  • each fingerprint image corresponds to at least one control, or different combinations of fingerprint images correspond to different controls.
  • the determination of the target area and/or target content according to the first input performed by the processor 1310 includes:
  • the target area and/or the target content are determined according to the fingerprint information.
  • the fingerprint information includes at least two fingerprint images, and determining the target area and/or the target content according to the fingerprint information executed by the processor 1310 includes:
  • each fingerprint image corresponds to at least one area or at least one item of content, or different fingerprint image combinations correspond to different areas or different content.
  • the at least one target control includes a map
  • the processor 1310 adjusts the at least one target control on the display interface in response to the first input, the following steps may be further implemented:
  • adjusting the visual field content in the at least one target control according to the second input performed by the processor 1310 includes:
  • the first input includes a two-finger pressing input
  • the second input includes a single-finger pressing input and a single-finger sliding input.
  • the determination by the processor 1310 according to the second input to determine the moving direction and moving distance of the visual field includes:
  • the moving distance of the visual field is determined according to the sliding distance of the single-finger sliding input of the second input.
  • the first input includes at least one of the following:
  • a pressing input in which the number of pressing areas is greater than a preset number
  • the terminal device includes a first touch screen and a touch area provided opposite to the first touch screen.
  • the terminal device includes a first touch screen with a display interface. A first input is received in the touch area; in response to the first input, at least one target control of the display interface is adjusted. In this way, the touch area set opposite to the first touch screen supports quickly adjusting the display interface of the first touch screen, without performing complex gesture operations on the first touch screen and without blocking the user interface, which can improve the continuity of the overall operation.
  • the radio frequency unit 1301 can be used for receiving and sending signals in the process of sending and receiving information or during a call. Specifically, after receiving downlink data from the base station, it delivers the data to the processor 1310 for processing; in addition, it sends uplink data to the base station.
  • the radio frequency unit 1301 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like.
  • the radio frequency unit 1301 can also communicate with the network and other devices through a wireless communication system.
  • the terminal device provides users with wireless broadband Internet access through the network module 1302, such as helping users to send and receive emails, browse web pages, and access streaming media.
  • the audio output unit 1303 may convert the audio data received by the radio frequency unit 1301 or the network module 1302 or stored in the memory 1309 into audio signals and output them as sounds. Moreover, the audio output unit 1303 may also provide audio output related to a specific function performed by the terminal device 1300 (for example, call signal reception sound, message reception sound, etc.).
  • the audio output unit 1303 includes a speaker, a buzzer, a receiver, and the like.
  • the input unit 1304 is used to receive audio or video signals.
  • the input unit 1304 may include a graphics processing unit (GPU) 13041 and a microphone 13042. The graphics processor 13041 processes image data of still pictures or video obtained by an image capture device (such as a camera) in video capture mode or image capture mode.
  • the processed image frame can be displayed on the display unit 1306.
  • the image frame processed by the graphics processor 13041 may be stored in the memory 1309 (or other storage medium) or sent via the radio frequency unit 1301 or the network module 1302.
  • the microphone 13042 can receive sound, and can process such sound into audio data.
  • in the case of a telephone call mode, the processed audio data can be converted into a format that can be sent to a mobile communication base station via the radio frequency unit 1301 for output.
  • the terminal device 1300 further includes at least one sensor 1305, such as a light sensor, a motion sensor, and other sensors.
  • the light sensor includes an ambient light sensor and a proximity sensor.
  • the ambient light sensor can adjust the brightness of the display panel 13061 according to the brightness of the ambient light
  • the proximity sensor can turn off the display panel 13061 and/or the backlight when the terminal device 1300 is moved to the ear.
  • the accelerometer sensor can detect the magnitude of acceleration in various directions (usually on three axes), and can detect the magnitude and direction of gravity when stationary; it can be used to identify the posture of the terminal device (such as horizontal/vertical screen switching, related games, and magnetometer attitude calibration) and for vibration-recognition-related functions (such as a pedometer and tapping). The sensors 1305 may also include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, etc., which will not be repeated here.
  • the display unit 1306 is used to display information input by the user or information provided to the user.
  • the display unit 1306 may include a display panel 13061, and the display panel 13061 may be configured in the form of a liquid crystal display (Liquid Crystal Display, LCD), an organic light-emitting diode (Organic Light-Emitting Diode, OLED), etc.
  • the user input unit 1307 may be used to receive inputted numeric or character information, and generate key signal input related to user settings and function control of the terminal device.
  • the user input unit 1307 includes a touch panel 13071 and other input devices 13072.
  • the touch panel 13071, also called a touch screen, can collect the user's touch operations on or near it (for example, operations performed by the user on or near the touch panel 13071 with a finger, a stylus, or any other suitable object or accessory).
  • the touch panel 13071 may include two parts, a touch detection device and a touch controller.
  • the touch detection device detects the user's touch position, detects the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into contact coordinates, sends them to the processor 1310, and receives and executes the commands sent by the processor 1310.
  • the touch panel 13071 can be implemented in multiple types such as resistive, capacitive, infrared, and surface acoustic wave.
  • the user input unit 1307 may also include other input devices 13072.
  • other input devices 13072 may include, but are not limited to, a physical keyboard, function keys (such as volume control buttons, switch buttons, etc.), trackball, mouse, and joystick, which will not be repeated here.
  • the touch panel 13071 can be overlaid on the display panel 13061.
  • after the touch panel 13071 detects a touch operation on or near it, it transmits the operation to the processor 1310 to determine the type of the touch event, and the processor 1310 then provides corresponding visual output on the display panel 13061 according to the type of the touch event.
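  • The pipeline above (detection device reports a raw signal, the touch controller converts it to contact coordinates, the processor maps the event type to a visual response) can be sketched as follows. All class names, the coordinate conversion, and the event types are illustrative assumptions.

```python
# Simplified sketch of the touch-detection pipeline described above.
class TouchController:
    def to_coordinates(self, raw_signal):
        # Convert a raw (row, col, kind) sensor reading into contact coordinates.
        row, col, kind = raw_signal
        return {"x": col * 10, "y": row * 10, "type": kind}

class Processor:
    def handle(self, event):
        # Decide the visual output from the type of the touch event.
        if event["type"] == "tap":
            return f"highlight at ({event['x']}, {event['y']})"
        if event["type"] == "slide":
            return "scroll content"
        return "ignore"

def dispatch(raw_signal):
    event = TouchController().to_coordinates(raw_signal)
    return Processor().handle(event)
```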
  • the touch panel 13071 and the display panel 13061 are used as two independent components to realize the input and output functions of the terminal device, but in some embodiments, the touch panel 13071 and the display panel 13061 can be integrated to realize the input and output functions of the terminal device; this is not specifically limited here.
  • the interface unit 1308 is an interface for connecting an external device and the terminal device 1300.
  • the external device may include a wired or wireless headset port, an external power source (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device with an identification module, audio input/output (I/O) port, video I/O port, headphone port, etc.
  • the interface unit 1308 can be used to receive input (for example, data information, power, etc.) from an external device and transmit the received input to one or more elements in the terminal device 1300, or can be used to transfer data between the terminal device 1300 and external devices.
  • the memory 1309 can be used to store software programs and various data.
  • the memory 1309 may mainly include a storage program area and a storage data area.
  • the storage program area may store an operating system, an application program required by at least one function (such as a sound playback function, an image playback function, etc.), and the like; the storage data area may store data created according to the use of the mobile phone (such as audio data, a phone book, etc.).
  • the memory 1309 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
  • the processor 1310 is the control center of the terminal device. It uses various interfaces and lines to connect the various parts of the entire terminal device, and by running or executing the software programs and/or modules stored in the memory 1309 and calling the data stored in the memory 1309, it performs the various functions of the terminal device and processes data, thereby monitoring the terminal device as a whole.
  • the processor 1310 may include one or more processing units; optionally, the processor 1310 may integrate an application processor and a modem processor, where the application processor mainly processes the operating system, user interface, and application programs, etc.
  • the modem processor mainly handles wireless communication. It can be understood that the foregoing modem processor may alternatively not be integrated into the processor 1310.
  • the terminal device 1300 may also include a power source 1311 (such as a battery) for supplying power to various components.
  • the power source 1311 may be logically connected to the processor 1310 through a power management system, so as to implement functions such as managing charging, discharging, and power consumption through the power management system.
  • the terminal device 1300 may further include some functional modules that are not shown, which will not be repeated here.
  • an embodiment of the present disclosure further provides a terminal device, including a processor 1310, a memory 1309, and a computer program stored in the memory 1309 and capable of running on the processor 1310. When the computer program is executed by the processor 1310, the various processes performed by the terminal device in the embodiments of the display control method are realized, and the same technical effect can be achieved; to avoid repetition, details are not repeated here.
  • the embodiments of the present disclosure also provide a computer-readable storage medium on which a computer program is stored. When the computer program is executed by a processor, each process of the above-mentioned display control method embodiments is realized, and the same technical effect can be achieved; to avoid repetition, details are not repeated here.
  • the computer-readable storage medium is, for example, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
  • the technical solution of the present disclosure, in essence, or the part that contributes to the related art, can be embodied in the form of a software product. The computer software product is stored in a storage medium (such as ROM/RAM, a magnetic disk, or an optical disc) and includes several instructions to make a terminal device (which can be a mobile phone, a computer, a server, an air conditioner, a network device, etc.) execute the methods described in the embodiments of the present disclosure.

Abstract

Disclosed are a display control method, a terminal device, and a computer-readable storage medium. The terminal device includes a first touch screen and a touch area disposed opposite to the first touch screen. The method includes: when the first touch screen has a display interface, receiving a first input in the touch area disposed opposite to the first touch screen (101); and, in response to the first input, adjusting at least one target control of the display interface (102).
PCT/CN2020/086113 2019-05-20 2020-04-22 Procédé de commande d'affichage, dispositif terminal, et support de stockage lisible par ordinateur WO2020233323A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910417831.1A CN110174993B (zh) 2019-05-20 2019-05-20 一种显示控制方法、终端设备及计算机可读存储介质
CN201910417831.1 2019-05-20

Publications (1)

Publication Number Publication Date
WO2020233323A1 true WO2020233323A1 (fr) 2020-11-26

Family

ID=67691677

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/086113 WO2020233323A1 (fr) 2019-05-20 2020-04-22 Procédé de commande d'affichage, dispositif terminal, et support de stockage lisible par ordinateur

Country Status (2)

Country Link
CN (1) CN110174993B (fr)
WO (1) WO2020233323A1 (fr)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114415922A (zh) * 2022-01-19 2022-04-29 网易(杭州)网络有限公司 操作控件调整方法、装置、电子设备及可读介质
CN114691259A (zh) * 2020-12-31 2022-07-01 北京奇艺世纪科技有限公司 一种控件显示方法及装置、电子设备和可读存储介质
CN115328371A (zh) * 2022-06-23 2022-11-11 网易(杭州)网络有限公司 对象调节的方法、装置和电子设备

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110174993B (zh) * 2019-05-20 2021-05-07 维沃移动通信有限公司 一种显示控制方法、终端设备及计算机可读存储介质
CN111050109B (zh) * 2019-12-24 2021-09-17 维沃移动通信有限公司 电子设备控制方法及电子设备
CN111957041A (zh) * 2020-09-07 2020-11-20 网易(杭州)网络有限公司 一种游戏中的地图查看方法、终端、电子设备及存储介质
CN114139403B (zh) * 2021-12-13 2023-08-25 中国核动力研究设计院 一种基于概率论的事故规程整定值优化方法、装置和设备

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080288795A1 (en) * 2007-05-17 2008-11-20 Tetsuya Maruyama Method and system for reducing power consumption of storage system serving as target during remote copying employing journal
CN103513879A (zh) * 2013-04-26 2014-01-15 展讯通信(上海)有限公司 触控设备及其显示控制方法及装置
CN106375545A (zh) * 2015-07-24 2017-02-01 中兴通讯股份有限公司 终端的控制方法和装置
CN107613077A (zh) * 2017-10-16 2018-01-19 白海燕 一种控制手机屏幕的方法
CN110174993A (zh) * 2019-05-20 2019-08-27 维沃移动通信有限公司 一种显示控制方法、终端设备及计算机可读存储介质

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20160139481A (ko) * 2015-05-27 2016-12-07 삼성전자주식회사 사용자 단말 장치 및 그 제어 방법
CN105204729A (zh) * 2015-08-26 2015-12-30 广东欧珀移动通信有限公司 一种指纹触控方法及系统
CN108920073B (zh) * 2018-06-22 2020-11-20 维沃移动通信有限公司 一种显示控制方法及终端设备
CN108804015A (zh) * 2018-06-28 2018-11-13 维沃移动通信有限公司 一种功能触发方法及移动终端
CN109157832A (zh) * 2018-07-12 2019-01-08 努比亚技术有限公司 一种终端游戏控制方法、终端及计算机可读存储介质

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080288795A1 (en) * 2007-05-17 2008-11-20 Tetsuya Maruyama Method and system for reducing power consumption of storage system serving as target during remote copying employing journal
CN103513879A (zh) * 2013-04-26 2014-01-15 展讯通信(上海)有限公司 触控设备及其显示控制方法及装置
CN106375545A (zh) * 2015-07-24 2017-02-01 中兴通讯股份有限公司 终端的控制方法和装置
CN107613077A (zh) * 2017-10-16 2018-01-19 白海燕 一种控制手机屏幕的方法
CN110174993A (zh) * 2019-05-20 2019-08-27 维沃移动通信有限公司 一种显示控制方法、终端设备及计算机可读存储介质

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114691259A (zh) * 2020-12-31 2022-07-01 北京奇艺世纪科技有限公司 一种控件显示方法及装置、电子设备和可读存储介质
CN114691259B (zh) * 2020-12-31 2023-12-12 北京奇艺世纪科技有限公司 一种控件显示方法及装置、电子设备和可读存储介质
CN114415922A (zh) * 2022-01-19 2022-04-29 网易(杭州)网络有限公司 操作控件调整方法、装置、电子设备及可读介质
CN114415922B (zh) * 2022-01-19 2024-03-15 网易(杭州)网络有限公司 操作控件调整方法、装置、电子设备及可读介质
CN115328371A (zh) * 2022-06-23 2022-11-11 网易(杭州)网络有限公司 对象调节的方法、装置和电子设备
CN115328371B (zh) * 2022-06-23 2023-09-15 网易(杭州)网络有限公司 对象调节的方法、装置和电子设备

Also Published As

Publication number Publication date
CN110174993B (zh) 2021-05-07
CN110174993A (zh) 2019-08-27

Similar Documents

Publication Publication Date Title
WO2021098678A1 (fr) Procédé de commande de vidéocapture d'écran et dispositif électronique
WO2020233323A1 (fr) Procédé de commande d'affichage, dispositif terminal, et support de stockage lisible par ordinateur
US11675442B2 (en) Image processing method and flexible-screen terminal
US11604567B2 (en) Information processing method and terminal
WO2021104321A1 (fr) Procédé d'affichage d'image et dispositif électronique
US20200257433A1 (en) Display method and mobile terminal
WO2020233285A1 (fr) Procédé d'affichage de message et dispositif terminal
WO2019179332A1 (fr) Procédé de fermeture d'application et terminal mobile
WO2019196929A1 (fr) Terminal mobile et procédé de traitement de données vidéo
WO2019196864A1 (fr) Procédé de commande de bouton virtuel et terminal mobile
WO2020238497A1 (fr) Procédé de déplacement d'icône et dispositif terminal
WO2020238449A1 (fr) Procédé de traitement de messages de notification et terminal
WO2021098705A1 (fr) Procédé d'affichage et dispositif électronique
WO2021068885A1 (fr) Procédé de commande et dispositif électronique
WO2020259091A1 (fr) Procédé d'affichage de contenu d'écran et terminal
WO2020001604A1 (fr) Procédé d'affichage et dispositif terminal
WO2019114522A1 (fr) Procédé de commande d'écran, appareil de commande d'écran et terminal mobile
WO2019228296A1 (fr) Procédé de traitement d'affichage et dispositif terminal
CN108196753B (zh) 一种界面切换方法及移动终端
WO2021129732A1 (fr) Procédé de traitement d'affichage et dispositif électronique
WO2019154360A1 (fr) Procédé de commutation d'interface et terminal mobile
WO2020155980A1 (fr) Procédé de commande et dispositif terminal
WO2020199986A1 (fr) Procédé d'appel vidéo et dispositif terminal
WO2021136330A1 (fr) Procédé de commande d'affichage d'écran à texte défilant et dispositif électronique
WO2020220893A1 (fr) Procédé de capture d'écran et terminal mobile

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20808699

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20808699

Country of ref document: EP

Kind code of ref document: A1