WO2021129732A1 - Display processing method and electronic device - Google Patents

Display processing method and electronic device

Info

Publication number
WO2021129732A1
Authority
WO
WIPO (PCT)
Prior art keywords
keyboard interface
input
virtual keyboard
target control
area
Prior art date
Application number
PCT/CN2020/138970
Other languages
English (en)
Chinese (zh)
Inventor
周甲黎
Original Assignee
维沃移动通信有限公司 (Vivo Mobile Communication Co., Ltd.)
Priority date
Filing date
Publication date
Application filed by 维沃移动通信有限公司 (Vivo Mobile Communication Co., Ltd.)
Publication of WO2021129732A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/02 Input arrangements using manually operated switches, e.g. using keyboards or dials
    • G06F 3/0202 Constructional details or processes of manufacture of the input device
    • G06F 3/0219 Special purpose keyboards
    • G06F 3/023 Arrangements for converting discrete items of information into a coded form, e.g. arrangements for interpreting keyboard generated codes as alphanumeric codes, operand codes or instruction codes
    • G06F 3/0233 Character input methods
    • G06F 3/0237 Character input methods using prediction or retrieval techniques
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/0485 Scrolling or panning
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04886 Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus

Definitions

  • the present invention relates to the field of communication technology, in particular to a display processing method and electronic equipment.
  • when the screen is too large, the user's one-handed operation is often restricted by the fixed layout of the operation objects. Consider, for example, a scenario in which text is entered on an electronic device: a text input triggers a pop-up keyboard to facilitate user input, and the keyboard display modes include a full keyboard and a nine-square grid. Because the screen is too large, a user operating with one hand cannot reach keys in certain positions, which affects the user's input speed.
  • the embodiments of the present invention provide a display processing method and an electronic device, so as to solve the operating inconvenience caused when the screen of an existing electronic device is too large.
  • the present invention is implemented as follows:
  • an embodiment of the present invention provides a display processing method applied to an electronic device, including: receiving a first input while a virtual keyboard interface is displayed; and, in response to the first input, updating the display position of a target control in the virtual keyboard interface.
  • an embodiment of the present invention also provides an electronic device, including:
  • the receiving module is configured to receive a first input while the virtual keyboard interface is displayed;
  • the first processing module is configured to update the display position of the target control in the virtual keyboard interface in response to the first input.
  • an embodiment of the present invention also provides an electronic device, including a processor, a memory, and a computer program stored on the memory and capable of running on the processor; when the computer program is executed by the processor, the steps of the display processing method described above are implemented.
  • an embodiment of the present invention also provides a computer-readable storage medium on which a computer program is stored; when the computer program is executed by a processor, the steps of the display processing method described above are implemented.
  • the display position of the target control in the virtual keyboard interface can be updated according to the user's needs by receiving the first input, and the target control can be displayed in a better display area, which makes it convenient for the user to continue operating the target control and avoids the operating inconvenience caused by the large display screen of the electronic device.
  • FIG. 1 is a flowchart of steps of a display processing method according to an embodiment of the present invention
  • FIG. 2 is a first schematic diagram of an application of the method according to an embodiment of the present invention;
  • FIG. 3 is a second schematic diagram of an application of the method according to an embodiment of the present invention;
  • FIG. 4 is a third schematic diagram of an application of the method according to an embodiment of the present invention;
  • FIG. 5 is a schematic diagram of the structure of an electronic device according to an embodiment of the present invention.
  • FIG. 6 is a schematic structural diagram of an electronic device according to another embodiment of the present invention.
  • a display processing method applied to an electronic device, includes:
  • Step 101: Receive a first input while the virtual keyboard interface is displayed.
  • the first input may be an input based on physical keys or virtual keys, or an input based on biometric technology, such as voice, touch, infrared, gesture, etc.
  • a single physical key can be used, or a combination of multiple physical keys can be used.
  • Step 102: In response to the first input, update the display position of the target control in the virtual keyboard interface.
  • the target controls are virtual keys on the virtual keyboard interface, such as letter keys and symbol keys.
  • the display position of the target control in the virtual keyboard interface can be updated in response to the first input.
  • the method of the embodiment of the present invention can, while the virtual keyboard interface is displayed, update the display position of the target control in the virtual keyboard interface according to the user's needs by receiving the first input, and display the target control in a better display area so that the user can continue to operate the target control, avoiding the operating inconvenience caused by the large display screen of the electronic device.
  • updating the display position of the target control in the virtual keyboard interface in step 102 includes:
  • the first input is an operation of adjusting the position of the target control in the virtual keyboard interface displayed by the user when the virtual keyboard interface is displayed.
  • the first candidate object, which includes the target control, is used as the adjustment object and is adjusted to a first display area in the virtual keyboard interface.
  • the first display area is a target area required by the user. In the first display area, a more convenient operation for the target control can be realized.
  • the first input is a sliding operation, so that when the user holds the electronic device with one hand, the position of the first candidate object can be adjusted.
  • in the input method keyboard 201 (i.e., the virtual keyboard interface), in response to the first input, each row of the keyboard scrolls cyclically until the row containing the key to be input (i.e., the first candidate object), such as key row 202, scrolls into the first display area, such as the penultimate row.
  • the first input can also be the user's sliding left or right, so that each column of the keyboard is scrolled until the column of the key to be input is scrolled to the first display area.
  • optionally, after the first candidate object is adjusted to the first display area in the virtual keyboard interface, the method further includes:
  • for example, after key row 202 is adjusted to the penultimate row, as shown in FIG. 3, the electronic device can also respond to the user's left or right slide (a first input) by shifting each key in key row 202 left or right along the sliding direction until the key to be input reaches the position the user requires, that is, close to the user's finger. In this way, the user's finger can conveniently tap the key to complete the input.
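The row and column scrolling described above can be sketched as a cyclic rotation of the keyboard layout. The following is a minimal illustration assuming a simple QWERTY grid and zero-based row/column indices; the layout and function names are illustrative assumptions, not the patent's implementation.

```python
# Sketch of cyclic row/column scrolling of a virtual keyboard (illustrative).
QWERTY = [
    list("qwertyuiop"),
    list("asdfghjkl"),
    list("zxcvbnm"),
]

def scroll_rows(layout, target_row, display_row):
    """Rotate whole rows until the row containing the key to input
    (the first candidate object) lands at the first display area."""
    shift = (display_row - target_row) % len(layout)
    return layout[-shift:] + layout[:-shift] if shift else list(layout)

def scroll_row_keys(row, target_col, display_col):
    """Shift keys within one row left/right until the key to input
    sits under the user's finger."""
    shift = (display_col - target_col) % len(row)
    return row[-shift:] + row[:-shift] if shift else list(row)

# Bring the row containing 'z' (row 2) to the penultimate row (row 1),
# then shift that row so 'z' (column 0) sits at column 4.
adjusted = scroll_rows(QWERTY, target_row=2, display_row=1)
adjusted[1] = scroll_row_keys(adjusted[1], target_col=0, display_col=4)
```

A real implementation would additionally map the touch-slide distance of the first input to the shift amount and animate the rotation.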
  • updating the display position of the target control in the virtual keyboard interface in step 102 includes:
  • the first input is an input based on a virtual keyboard.
  • the target content corresponding to the first input is the character, word or letter obtained by completing the input.
  • the input history usage record may be input content within a preset time period, and may also include a record of the number of times each input content is used within the preset time, a record of use time, and the like. In this way, after acquiring the target content and the input history usage record, the electronic device can intelligently predict the target control based on the target content and the input history usage record, thereby adjusting the display position of the target control in the virtual keyboard interface.
  • for example, associated content whose number of uses within the preset period is greater than the preset threshold can be selected and its corresponding control used as the target control; alternatively, the control corresponding to the most recently used associated content can be used as the target control, and so on, which are not listed exhaustively here.
  • in this way, the electronic device can predict the key to be input by referring to the user's input habits reflected in the input history usage record, and use it as the target control.
  • for example, the target content "learning" and the input history usage record can be obtained, so that, among the input content associated with "learning", the associated content whose number of uses within the preset period is greater than the preset threshold is determined to be "Xi"; the target control "X" is then obtained, and the display position of "X" in the virtual keyboard interface is adjusted.
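The history-based prediction described above can be sketched as a frequency count over recent input records. The record shape `(timestamp, previous_content, next_content)`, the pinyin strings, and the threshold rule below are illustrative assumptions:

```python
from collections import Counter

def predict_target_control(target_content, history, now, window, threshold):
    """Among contents historically typed after `target_content`, pick the
    associated content whose use count within the recent window exceeds the
    threshold, and return its first letter as the predicted target control."""
    recent = [nxt for ts, prev, nxt in history
              if prev == target_content and now - ts <= window]
    counts = Counter(recent)
    for nxt, n in counts.most_common():
        if n > threshold:
            return nxt[0].upper()   # e.g. associated content "xi" -> key "X"
    return None  # no association strong enough; keep the default layout

# Hypothetical history: the user typed "xi" after "xue" three times recently.
history = [
    (100.0, "xue", "xi"), (101.0, "xue", "xi"),
    (102.0, "xue", "xiao"), (103.0, "xue", "xi"),
]
key = predict_target_control("xue", history, now=110.0, window=60.0, threshold=2)
```

With the sample history, "xi" has been used three times within the window, exceeding the threshold of 2, so the predicted key is "X".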
  • in addition, the display position of the target control can also be adjusted according to the holding mode. Therefore, optionally, before adjusting the display position of the target control in the virtual keyboard interface, the method further includes:
  • acquiring the user's current holding mode, where the holding mode includes a left-hand holding mode and a right-hand holding mode, the virtual keyboard interface includes at least a first area and a second area, the left-hand holding mode corresponds to the first area, and the right-hand holding mode corresponds to the second area;
  • the adjusting the display position of the target control in the virtual keyboard interface includes:
  • moving a second candidate object to the area corresponding to the current holding mode, where the second candidate object is the target control, or the second candidate object is at least one control that is located in the same area of the virtual keyboard interface as the target control.
  • the virtual keyboard interface is divided into different areas corresponding to the holding modes, that is, the left-hand holding mode corresponds to the first area, and the right-hand holding mode corresponds to the second area.
  • the second candidate object can be moved to the area corresponding to the current holding mode.
  • the second candidate object is the previously predicted target control, or the second candidate object is at least one control located in the same area of the virtual keyboard interface as the target control.
  • the area used to determine the second candidate object may be an area divided according to the corresponding holding mode, or an area divided by other rules.
  • for example, the electronic device divides the display area of the input method keyboard 400 into four areas: A area 401, B area 402, C area 403, and D area 404, where C area 403 is the first area corresponding to the left-hand holding mode and D area 404 is the second area corresponding to the right-hand holding mode. Therefore, when the predicted key is located in B area 402 but the current holding mode is the left-hand holding mode, the electronic device can automatically move the key to C area 403 to facilitate user input.
  • specifically, the system adjusts the area where the key is located through automatic horizontal and vertical movement; the adjustment can be performed on the key alone, or on multiple keys that include the key and lie within a preset range.
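The area adjustment described above, moving a predicted key into the area matching the current holding mode, can be sketched as follows. The four-quadrant split, the area names A to D, and the coordinate convention are illustrative assumptions based on the keyboard-400 description:

```python
# Sketch of moving the predicted key into the grip-matched area (illustrative).
AREAS = {
    "A": (0, 0), "B": (0, 1),   # (row band, column band) of each quadrant
    "C": (1, 0), "D": (1, 1),
}
GRIP_AREA = {"left": "C", "right": "D"}  # left hand -> C area, right hand -> D area

def area_for_key(key_pos, mid_row, mid_col):
    """Return the quadrant name containing a (row, col) key position."""
    band = (int(key_pos[0] >= mid_row), int(key_pos[1] >= mid_col))
    return next(name for name, b in AREAS.items() if b == band)

def adjust_for_grip(key_pos, grip, mid_row, mid_col):
    """If the predicted key lies outside the area matching the current
    holding mode, translate it horizontally/vertically into that area."""
    want = GRIP_AREA[grip]
    if area_for_key(key_pos, mid_row, mid_col) == want:
        return key_pos
    want_band = AREAS[want]
    row = key_pos[0] % mid_row + want_band[0] * mid_row
    col = key_pos[1] % mid_col + want_band[1] * mid_col
    return (row, col)

# A key predicted in B area (top right) under a left-hand grip moves to C area.
new_pos = adjust_for_grip((0, 3), grip="left", mid_row=2, mid_col=2)
```

On a 4x4 grid the key at (0, 3) lies in B area; under the left-hand holding mode it is translated to (2, 1), inside C area.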
  • the acquisition of the user's current holding mode can be achieved through the angular motion detection device or the touch sensor of the electronic device.
  • the angular motion detection device may be a gyroscope.
  • for example, the electronic device reads the built-in gyroscope parameters: if the left side of the electronic device is detected to be lower than the right side, the left-hand holding mode is determined; if the right side is detected to be lower than the left side, the right-hand holding mode is determined.
  • the touch sensor can be arranged on the back area of the electronic device, and the holding mode can be determined by the user's touch position information.
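The tilt-based judgment above can be sketched as a simple threshold on the device's roll angle; the sign convention (negative roll meaning the left edge is lower) and the threshold value are assumptions:

```python
def detect_grip(roll_deg, threshold=5.0):
    """Infer the holding mode from the device's roll angle (illustrative).
    Assumed convention: negative roll = left side lower than the right."""
    if roll_deg < -threshold:
        return "left"    # left side lower -> left-hand holding mode
    if roll_deg > threshold:
        return "right"   # right side lower -> right-hand holding mode
    return "unknown"     # nearly level: fall back to rear touch-sensor data
```

A device held nearly level would return "unknown", which is where the rear touch sensor described above could decide instead.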
  • the adjustment of the target object in step 102 only serves the user's current needs. Therefore, for convenience of subsequent use, optionally, after step 102, the method further includes:
  • the second input is an input for triggering initialization.
  • the second input may be an input based on physical keys or virtual keys, or an input based on biometric technology, such as voice, touch, infrared, gesture, etc.
  • a single physical key can be used, or a combination of multiple physical keys can be used.
  • in response to the second input, the virtual keyboard interface automatically restores its initial display state, that is, the original layout.
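Restoring the initial display state on a second input can be sketched by snapshotting the original layout before any adjustment; the class and method names below are illustrative assumptions:

```python
import copy

class VirtualKeyboard:
    """Minimal sketch of restoring the original layout on a second input."""

    def __init__(self, layout):
        self._initial = copy.deepcopy(layout)  # snapshot of the original layout
        self.layout = layout

    def apply_adjustment(self, new_layout):
        """First input: reposition keys (row/column scroll, grip move, etc.)."""
        self.layout = new_layout

    def reset(self):
        """Second input: restore the initial display state."""
        self.layout = copy.deepcopy(self._initial)

kb = VirtualKeyboard([["a", "b"], ["c", "d"]])
kb.apply_adjustment([["c", "d"], ["a", "b"]])  # user-triggered adjustment
kb.reset()                                     # second input: original layout
```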
  • the implementation of the above embodiments is described with the keys in the virtual keyboard interface as the operated objects.
  • the method of the embodiments of the present invention can also take applications as the objects and be applied to an application-search scenario on the electronic device: through horizontal and/or vertical sliding, or through a display area preset for the holding mode, the corresponding application is displayed in the target display area so that the user can use it. The implementation principle is the same as that for the keys in the input method keyboard above and is not repeated here.
  • the method of the embodiment of the present invention can update the display position of the target control in the virtual keyboard interface according to the user's needs by receiving the first input while the virtual keyboard interface is displayed, and display the target control in a better display area, which makes it convenient for the user to continue operating the target control and avoids the operating inconvenience caused by the large display screen of the electronic device.
  • Fig. 5 is a block diagram of an electronic device according to an embodiment of the present invention.
  • the electronic device 500 shown in FIG. 5 includes a receiving module 510 and a first processing module 520.
  • the receiving module is used to receive the first input when the virtual keyboard interface is displayed
  • the first processing module is configured to update the display position of the target control in the virtual keyboard interface in response to the first input.
  • the first processing module includes:
  • the first adjustment sub-module is configured to adjust the first display area of the first candidate object in the virtual keyboard interface, and the first candidate object is a plurality of controls including the target control.
  • the first processing module further includes:
  • the second adjustment sub-module is used to adjust the display position of the target control in the first display area.
  • the first processing module includes:
  • the third adjustment submodule is configured to adjust the display position of the target control in the virtual keyboard interface according to the target content and the input history usage record.
  • the electronic device further includes:
  • the acquiring module is used to acquire the user's current holding mode, where the holding mode includes a left-hand holding mode and a right-hand holding mode, the virtual keyboard interface includes at least a first area and a second area, the left-hand holding mode corresponds to the first area, and the right-hand holding mode corresponds to the second area;
  • the third adjustment submodule is also used for:
  • moving the second candidate object to the area corresponding to the current holding mode, where the second candidate object is the target control, or the second candidate object is at least one control located in the same area of the virtual keyboard interface as the target control.
  • the electronic device further includes:
  • the second processing module is used for restoring the initial display state of the current interface in response to the second input of the user.
  • the electronic device 500 can implement each process implemented by the electronic device in the method embodiments of FIGS. 1 to 4, and to avoid repetition, details are not described herein again.
  • the electronic device of the embodiment of the present invention can update the display position of the target control in the virtual keyboard interface according to the user's needs by receiving the first input while displaying the virtual keyboard interface, and display the target control in a better display area, which makes it convenient for the user to continue operating the target control and avoids the operating inconvenience caused by the large display screen of the electronic device.
  • the electronic device 600 includes but is not limited to: a radio frequency unit 601, a network module 602, an audio output unit 603, an input unit 604, a sensor 605, and a display unit 606, a user input unit 607, an interface unit 608, a memory 609, a processor 610, a power supply 611 and other components.
  • the electronic device may include more or fewer components than shown in the figure, or combine certain components, or use a different arrangement of components.
  • electronic devices include, but are not limited to, mobile phones, tablet computers, notebook computers, palmtop computers, vehicle-mounted
  • the processor 610 is configured to control the user input unit 607 to receive the first input when the virtual keyboard interface is displayed, and to update, in response to the first input, the display position of the target control in the virtual keyboard interface.
  • the electronic device can update the display position of the target control in the virtual keyboard interface according to the user's needs by receiving the first input while displaying the virtual keyboard interface, and display the target control in a better display area, making it convenient for the user to continue operating the target control and avoiding the operating inconvenience caused by the large display screen of the electronic device.
  • the radio frequency unit 601 can be used to receive and send signals during information transmission or a call. Specifically, downlink data from the base station is received and sent to the processor 610 for processing; in addition, uplink data is sent to the base station.
  • the radio frequency unit 601 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like.
  • the radio frequency unit 601 can also communicate with the network and other devices through a wireless communication system.
  • the electronic device provides users with wireless broadband Internet access through the network module 602, such as helping users to send and receive emails, browse web pages, and access streaming media.
  • the audio output unit 603 can convert the audio data received by the radio frequency unit 601 or the network module 602 or stored in the memory 609 into audio signals and output them as sounds. Moreover, the audio output unit 603 may also provide audio output related to a specific function performed by the electronic device 600 (for example, call signal reception sound, message reception sound, etc.).
  • the audio output unit 603 includes a speaker, a buzzer, a receiver, and the like.
  • the input unit 604 is used to receive audio or video signals.
  • the input unit 604 may include a graphics processing unit (GPU) 6041 and a microphone 6042. The graphics processor 6041 processes image data of still pictures or videos obtained by an image capture device (such as a camera) in a video capture mode or an image capture mode.
  • the processed image frame may be displayed on the display unit 606.
  • the image frame processed by the graphics processor 6041 may be stored in the memory 609 (or other storage medium) or sent via the radio frequency unit 601 or the network module 602.
  • the microphone 6042 can receive sound, and can process such sound into audio data.
  • in a telephone call mode, the processed audio data can be converted into a format that can be sent to a mobile communication base station via the radio frequency unit 601 and output.
  • the electronic device 600 further includes at least one sensor 605, such as a light sensor, a motion sensor, and other sensors.
  • the light sensor includes an ambient light sensor and a proximity sensor.
  • the ambient light sensor can adjust the brightness of the display panel 6061 according to the brightness of the ambient light.
  • the proximity sensor can turn off the display panel 6061 and/or the backlight when the electronic device 600 is moved close to the ear.
  • as a kind of motion sensor, the accelerometer can detect the magnitude of acceleration in various directions (usually three axes) and the magnitude and direction of gravity when stationary, and can be used to identify the posture of the electronic device (such as horizontal/vertical screen switching, related games, and magnetometer attitude calibration) and for vibration-recognition-related functions (such as a pedometer and tap detection). The sensor 605 may also include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, etc., which are not repeated here.
  • the display unit 606 is used to display information input by the user or information provided to the user.
  • the display unit 606 may include a display panel 6061, and the display panel 6061 may be configured in the form of a liquid crystal display (LCD), an organic light-emitting diode (OLED), etc.
  • the user input unit 607 may be used to receive inputted numeric or character information, and generate key signal input related to user settings and function control of the electronic device.
  • the user input unit 607 includes a touch panel 6071 and other input devices 6072.
  • the touch panel 6071, also called a touch screen, can collect the user's touch operations on or near it (for example, operations performed by the user on or near the touch panel 6071 with a finger, a stylus, or any other suitable object or accessory).
  • the touch panel 6071 may include two parts: a touch detection device and a touch controller.
  • the touch detection device detects the user's touch position and the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into contact coordinates, sends the coordinates to the processor 610, and receives and executes the commands sent by the processor 610.
  • the touch panel 6071 can be implemented in multiple types such as resistive, capacitive, infrared, and surface acoustic wave.
  • the user input unit 607 may also include other input devices 6072.
  • other input devices 6072 may include, but are not limited to, a physical keyboard, function keys (such as volume control buttons, switch buttons, etc.), trackball, mouse, and joystick, which will not be repeated here.
  • the touch panel 6071 can cover the display panel 6061.
  • when the touch panel 6071 detects a touch operation on or near it, the operation is transmitted to the processor 610 to determine the type of touch event, and the processor 610 then provides corresponding visual output on the display panel 6061 according to the type of touch event.
  • although the touch panel 6071 and the display panel 6061 are described above as two independent components implementing the input and output functions of the electronic device, in some embodiments the touch panel 6071 and the display panel 6061 can be integrated to implement the input and output functions; this is not specifically limited here.
  • the interface unit 608 is an interface for connecting an external device and the electronic device 600.
  • the external device may include a wired or wireless headset port, an external power source (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device with an identification module, audio input/output (I/O) port, video I/O port, headphone port, etc.
  • the interface unit 608 can be used to receive input (for example, data information, power, etc.) from an external device and transmit the received input to one or more elements in the electronic device 600, or can be used to transfer data between the electronic device 600 and an external device.
  • the memory 609 can be used to store software programs and various data.
  • the memory 609 may mainly include a storage program area and a storage data area.
  • the storage program area may store the operating system and the application programs required by at least one function (such as a sound playback function and an image playback function); the storage data area may store data created according to the use of the mobile phone (such as audio data and a phone book).
  • the memory 609 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
  • the processor 610 is the control center of the electronic device. It connects the various parts of the entire electronic device through various interfaces and lines, and performs the various functions of the electronic device and processes data by running or executing the software programs and/or modules stored in the memory 609 and calling data stored in the memory 609, thereby monitoring the electronic device as a whole.
  • the processor 610 may include one or more processing units; preferably, the processor 610 may integrate an application processor and a modem processor, where the application processor mainly handles the operating system, user interface, application programs, etc., and the modem processor mainly handles wireless communication. It can be understood that the modem processor may alternatively not be integrated into the processor 610.
  • the electronic device 600 may also include a power source 611 (such as a battery) for supplying power to various components.
  • the power source 611 may be logically connected to the processor 610 through a power management system, so as to implement functions such as managing charging, discharging, and power consumption through the power management system.
  • the electronic device 600 includes some functional modules not shown, which will not be repeated here.
  • an embodiment of the present invention further provides an electronic device including a processor, a memory, and a computer program stored on the memory and capable of running on the processor, where the computer program, when executed by the processor, implements each process of the foregoing display processing method embodiment and can achieve the same technical effect; to avoid repetition, details are not described here again.
  • an embodiment of the present invention further provides a computer-readable storage medium on which a computer program is stored, where the computer program, when executed by a processor, implements each process of the foregoing display processing method embodiment and can achieve the same technical effect; to avoid repetition, details are not described here again.
  • the computer-readable storage medium is, for example, a read-only memory (Read-Only Memory, ROM), a random access memory (Random Access Memory, RAM), a magnetic disk, or an optical disc.
  • the technical solution of the present invention, in essence or in the part contributing to the existing technology, can be embodied in the form of a software product; the computer software product is stored in a storage medium (such as a ROM/RAM, a magnetic disk, or an optical disc) and includes a number of instructions to enable a terminal (which may be a mobile phone, a computer, a server, an air conditioner, a network device, or the like) to execute the method described in each embodiment of the present invention.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Disclosed are a display processing method and an electronic device, which relate to the field of communications technology. The method is applied to an electronic device and comprises: receiving a first input in a case where a virtual keyboard interface is displayed (101); and in response to the first input, updating a display position of a target control in the virtual keyboard interface (102).
PCT/CN2020/138970 2019-12-25 2020-12-24 Procédé de traitement d'affichage et dispositif électronique WO2021129732A1 (fr)
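The two abstract steps can be sketched in code. This is an illustrative sketch only, not an implementation from the patent: the class, method, and control names (`VirtualKeyboardInterface`, `on_first_input`, `"enter"`) and the coordinate scheme are all hypothetical.

```python
# Hypothetical sketch of the claimed flow: step 101 receives a first
# input while a virtual keyboard interface is displayed; step 102
# updates the display position of a target control in that interface.

class VirtualKeyboardInterface:
    def __init__(self):
        self.displayed = True
        # target control name -> (x, y) display position in the keyboard area
        self.controls = {"space": (0, 0), "enter": (4, 0)}

    def on_first_input(self, target_control, new_position):
        # Step 101: the first input is only handled while the
        # virtual keyboard interface is displayed.
        if not self.displayed:
            return None
        # Step 102: in response, update the target control's position.
        self.controls[target_control] = new_position
        return self.controls[target_control]

kb = VirtualKeyboardInterface()
kb.on_first_input("enter", (0, 1))
print(kb.controls["enter"])  # → (0, 1)
```

The sketch assumes the keyboard's controls are addressed by name and positioned on a simple grid; the patent itself leaves the nature of the first input (drag, long-press, etc.) and the coordinate system unspecified.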

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201911354477.9 2019-12-25
CN201911354477.9A CN111142679A (zh) 2019-12-25 2019-12-25 Display processing method and electronic device

Publications (1)

Publication Number Publication Date
WO2021129732A1 true WO2021129732A1 (fr) 2021-07-01

Family

ID=70519817

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/138970 WO2021129732A1 (fr) 2019-12-25 2020-12-24 Procédé de traitement d'affichage et dispositif électronique

Country Status (2)

Country Link
CN (1) CN111142679A (fr)
WO (1) WO2021129732A1 (fr)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111142679A (zh) * 2019-12-25 2020-05-12 维沃移动通信有限公司 Display processing method and electronic device
CN112527968A (zh) * 2020-12-22 2021-03-19 大唐融合通信股份有限公司 Neural-network-based composition review method and system
CN115695646A (zh) * 2021-07-28 2023-02-03 Oppo广东移动通信有限公司 Display method and apparatus for a dialing interface, terminal, and storage medium
CN113703592A (zh) * 2021-08-31 2021-11-26 维沃移动通信有限公司 Secure input method and apparatus

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102362254A (zh) * 2009-03-23 2012-02-22 韩国科亚电子股份有限公司 Virtual keyboard providing apparatus and method
CN103329086A (zh) * 2011-01-25 2013-09-25 索尼电脑娱乐公司 Input device, input method, and computer program
CN103543913A (zh) * 2013-10-25 2014-01-29 小米科技有限责任公司 Terminal device operation method and apparatus, and terminal device
US20140359473A1 (en) * 2013-05-29 2014-12-04 Huawei Technologies Co., Ltd. Method for switching and presenting terminal operation mode and terminal
CN104793880A (zh) * 2014-01-16 2015-07-22 华为终端有限公司 Interface operation method and terminal
CN111142679A (zh) * 2019-12-25 2020-05-12 维沃移动通信有限公司 Display processing method and electronic device

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102750108B (zh) * 2012-08-03 2016-09-21 东莞宇龙通信科技有限公司 Terminal and terminal control method
CN103425394B (zh) * 2013-08-13 2016-05-25 广东欧珀移动通信有限公司 Method and apparatus for changing icon positions on a touch screen
CN103488420A (zh) * 2013-09-02 2014-01-01 宇龙计算机通信科技(深圳)有限公司 Virtual keyboard adjustment method and apparatus
CN104991735B (zh) * 2015-07-23 2018-08-03 宁波萨瑞通讯有限公司 Virtual keyboard input method and mobile terminal
CN106610779A (zh) * 2015-10-26 2017-05-03 阿里巴巴集团控股有限公司 Keyboard switching method and apparatus for a touch-screen mobile device, and mobile device
CN108322603A (zh) * 2018-01-29 2018-07-24 上海萌王智能科技有限公司 Mobile phone capable of controlling application program icons in groups
CN109814974A (zh) * 2019-01-31 2019-05-28 维沃移动通信有限公司 Application program interface adjustment method and mobile terminal

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102362254A (zh) * 2009-03-23 2012-02-22 韩国科亚电子股份有限公司 Virtual keyboard providing apparatus and method
CN103329086A (zh) * 2011-01-25 2013-09-25 索尼电脑娱乐公司 Input device, input method, and computer program
US20140359473A1 (en) * 2013-05-29 2014-12-04 Huawei Technologies Co., Ltd. Method for switching and presenting terminal operation mode and terminal
CN103543913A (zh) * 2013-10-25 2014-01-29 小米科技有限责任公司 Terminal device operation method and apparatus, and terminal device
CN104793880A (zh) * 2014-01-16 2015-07-22 华为终端有限公司 Interface operation method and terminal
CN111142679A (zh) * 2019-12-25 2020-05-12 维沃移动通信有限公司 Display processing method and electronic device

Also Published As

Publication number Publication date
CN111142679A (zh) 2020-05-12

Similar Documents

Publication Publication Date Title
WO2020147674A1 (fr) Procédé d'invite de message non lu et terminal mobile
US20220365641A1 (en) Method for displaying background application and mobile terminal
US11675442B2 (en) Image processing method and flexible-screen terminal
WO2020258929A1 (fr) Procédé de commutation d'interface de dossier et dispositif terminal
WO2021129732A1 (fr) Procédé de traitement d'affichage et dispositif électronique
WO2021129762A1 (fr) Procédé de partage d'application, dispositif électronique et support de stockage lisible par ordinateur
EP4071606A1 (fr) Procédé de partage d'application, premier dispositif électronique et support de stockage lisible par ordinateur
WO2021017776A1 (fr) Procédé de traitement d'informations et terminal
US11604567B2 (en) Information processing method and terminal
US20200257433A1 (en) Display method and mobile terminal
WO2021136159A1 (fr) Procédé de capture d'écran et dispositif électronique
WO2020233323A1 (fr) Procédé de commande d'affichage, dispositif terminal, et support de stockage lisible par ordinateur
US11354017B2 (en) Display method and mobile terminal
US20220053082A1 (en) Application interface display method and mobile terminal
WO2020238497A1 (fr) Procédé de déplacement d'icône et dispositif terminal
WO2020001604A1 (fr) Procédé d'affichage et dispositif terminal
WO2021068885A1 (fr) Procédé de commande et dispositif électronique
WO2021036553A1 (fr) Procédé d'affichage d'icônes et dispositif électronique
WO2019228296A1 (fr) Procédé de traitement d'affichage et dispositif terminal
JP7229365B2 (ja) 権限管理方法及び端末機器
WO2020220893A1 (fr) Procédé de capture d'écran et terminal mobile
WO2020024770A1 (fr) Procédé pour déterminer un objet de communication, et terminal mobile
WO2020042921A1 (fr) Procédé de commande d'écran et dispositif électronique
WO2021104232A1 (fr) Procédé d'affichage et dispositif électronique
WO2020156119A1 (fr) Procédé de réglage d'interface de programme d'application et terminal mobile

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20905052

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20905052

Country of ref document: EP

Kind code of ref document: A1

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 12.01.2023)

122 Ep: pct application non-entry in european phase

Ref document number: 20905052

Country of ref document: EP

Kind code of ref document: A1