WO2020156115A1 - Object display method and terminal device - Google Patents


Info

Publication number
WO2020156115A1
Authority
WO
WIPO (PCT)
Prior art keywords
input
display area
screen
display
response
Prior art date
Application number
PCT/CN2020/071709
Other languages
English (en)
Chinese (zh)
Inventor
刘善国
Original Assignee
维沃移动通信有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 维沃移动通信有限公司 filed Critical 维沃移动通信有限公司
Publication of WO2020156115A1

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484: Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/0485: Scrolling or panning
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. tap gestures based on pressure sensed by a digitiser, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06Q: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 30/00: Commerce
    • G06Q 30/06: Buying, selling or leasing transactions

Definitions

  • the embodiments of the present disclosure relate to the field of communication technology, and in particular, to an object display method and terminal device.
  • Smart terminals, such as mobile phones, have become indispensable electronic products in people's lives. Their functions keep growing, integrating communication, photography, audio and video, and Internet e-commerce; among these, taking pictures and online shopping are scenarios that users engage in most frequently.
  • the embodiments of the present disclosure provide an object display method and a terminal device, so as to solve the problem that comparing different objects on a terminal requires cumbersome operations and does not display the objects intuitively.
  • an embodiment of the present disclosure provides an object display method, applied to a terminal device, including: receiving a first input to a screen; in response to the first input, displaying a first object currently displayed on the screen in a first display area on the screen; receiving a second input to a second display area on the screen; and, in response to the second input, displaying a second object in the second display area, where the first object and the second object are objects in the same application.
  • the embodiments of the present disclosure also provide a terminal device, including: a first receiving module, configured to receive a first input to the screen; a first display module, configured to display, in response to the first input, the first object currently displayed on the screen in a first display area on the screen; a second receiving module, configured to receive a second input to a second display area on the screen; and a second display module, configured to display a second object in the second display area in response to the second input, where the first object and the second object are objects in the same application.
  • the embodiments of the present disclosure also provide a terminal device, including a processor, a memory, and a computer program stored on the memory and executable on the processor, where the computer program, when executed by the processor, implements the steps of the object display method described in the first aspect.
  • the embodiments of the present disclosure also provide a computer-readable storage medium having a computer program stored thereon, where the computer program, when executed by a processor, implements the steps of the object display method described in the first aspect.
  • in the embodiments of the present disclosure, the first input to the screen is received; in response to the first input, the first object currently displayed on the screen is displayed in the first display area of the screen; a second input to the second display area of the screen is received; and, in response to the second input, a second object is displayed in the second display area.
  • FIG. 1 is a schematic flowchart of an object display method provided by an embodiment of the disclosure
  • FIG. 2 is a schematic diagram of an implementation process of an object display method provided by an embodiment of the disclosure.
  • FIG. 3 is a schematic diagram of an implementation process of an object display method provided by another embodiment of the present disclosure.
  • FIG. 4 is a schematic structural diagram of a terminal device provided by an embodiment of the disclosure.
  • FIG. 5 is a schematic diagram of the hardware structure of a terminal device provided by an embodiment of the disclosure.
  • FIG. 1 is a schematic flowchart of an object display method provided by an embodiment of the present disclosure. The following describes the implementation process of this method in detail with reference to the figure.
  • Step 101 Receive a first input to the screen.
  • the first input is a preset input.
  • the first input may include, but is not limited to, at least one of click input, press input, long press input, pinch input, drag input, slide input, and swipe input.
  • the first input can be one of the above-mentioned inputs, or can also be a combined input of two or more of them.
  • Step 102 In response to the first input, display a first object currently displayed on the screen in a first display area on the screen.
  • in response to the first input received in step 101, the terminal device displays the first object currently displayed on the screen in the first display area on the screen.
  • Step 103 Receive a second input to the second display area on the screen.
  • the second input is a preset input.
  • the second input may include, but is not limited to, at least one of click input, press input, long press input, pinch input, drag input, slide input, and swipe input.
  • the second input may be one of the foregoing inputs, or may also be a combined input of two or more of the inputs.
  • the second display area and the first display area are different display areas on the screen.
  • Step 104 In response to the second input, display a second object in the second display area, where the first object and the second object are objects in the same application.
  • in response to the second input received in step 103, the terminal device displays the second object corresponding to the second input in the second display area.
  • the first object and the second object under the same application can be displayed on the screen intuitively, which is convenient for users to compare and meets the needs of users for different objects.
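The four steps above (101 to 104) can be sketched as a small state model. This is an illustrative sketch only; the class and method names are invented, since the patent describes behavior rather than an implementation:

```python
class DualAreaDisplay:
    """Minimal sketch of steps 101-104 (all names are illustrative)."""

    def __init__(self, current_object, app_id):
        self.current_object = current_object   # object currently shown full-screen
        self.app_id = app_id                   # application the objects belong to
        self.first_area = None                 # content of the first display area
        self.second_area = None                # content of the second display area

    def on_first_input(self):
        # Step 102: display the currently shown object in the first display area.
        self.first_area = self.current_object

    def on_second_input(self, obj, obj_app_id):
        # Step 104: display the second object in the second display area.
        # The first and second objects must be objects in the same application.
        if obj_app_id != self.app_id:
            raise ValueError("objects to compare must belong to the same application")
        self.second_area = obj


d = DualAreaDisplay(current_object="picture A", app_id="gallery")
d.on_first_input()                          # first input to the screen
d.on_second_input("picture B", "gallery")   # second input to the second display area
print(d.first_area, d.second_area)          # picture A picture B
```

After both inputs, the two objects sit side by side in their display areas, which is the comparison view the method aims at.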
  • the first object and the second object are objects of the same information type; for example, they may include information content such as text, pictures, audio, video, and network media information.
  • in the present disclosure, a first input to the screen is received; in response to the first input, the first object currently displayed on the screen is displayed in the first display area of the screen; a second input to the second display area of the screen is received; and, in response to the second input, the second object is displayed in the second display area, where the first object and the second object are objects under the same application. In this way, through simple input operations, both the first object and the second object are visually displayed on the screen, meeting users' need to compare different objects and improving user experience.
  • after step 104, the method further includes:
  • the first object displayed on the first display area is controlled to remain unchanged.
  • the above steps indicate that the first object has been determined by the user as one of the objects to be compared.
  • the purpose is to prevent the user's input operations on the second display area, performed while browsing and viewing objects, from affecting the display of the first object in the first display area.
  • after step 102 and before step 103, the method further includes: receiving a third input to the first display area.
  • the third input is a preset input.
  • the third input may include but is not limited to at least one of click input, press input, long press input, pinch input, drag input, slide input, and swipe input.
  • the third input can be one of the above-mentioned inputs, or can also be a combined input of two or more of them.
  • the first object displayed on the first display area is controlled to remain unchanged.
  • the terminal device controls the first object displayed on the first display area to remain unchanged.
  • the first object displayed in the first display area is controlled to remain unchanged so that, while the user performs input operations on the second display area, those operations are prevented from affecting the display of the first object in the first display area.
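As an illustrative sketch (all names invented), the "remain unchanged" behavior can be modeled as a lock that the third input sets, so that later browsing inputs leave the first display area's content untouched:

```python
class LockableArea:
    """Display area whose content can be pinned (hypothetical sketch)."""

    def __init__(self, content=None):
        self.content = content
        self.locked = False

    def on_third_input(self):
        # In response to the third input, control the displayed object
        # to remain unchanged.
        self.locked = True

    def show(self, new_content):
        # Browsing inputs replace the content only while the area is unlocked.
        if not self.locked:
            self.content = new_content
        return self.content


area = LockableArea("picture A")
area.on_third_input()          # third input pins the first object
area.show("picture C")         # ignored: the first object remains unchanged
print(area.content)            # picture A
```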
  • the terminal device may be a terminal with a folding screen, where the folding screen has at least two sub-screens. It should be noted that the at least two sub-screens may be physically separate sub-screens, or may be sub-screens defined by the system or the user.
  • FIG. 2 shows a scene in which a user compares pictures.
  • the first picture A in the gallery has been displayed on the first sub-screen 1 through a user operation (ie, the first input).
  • the terminal device receives the user's continuous pressing input on the first sub-screen 1 and, in response to the continuous pressing input, controls the first picture A displayed on the first sub-screen 1 to remain unchanged; it then receives the user's sliding input on the second sub-screen 2 (in the direction shown by the arrow in the figure, that is, from right to left), and the second picture B corresponding to the sliding input is displayed on the second sub-screen 2.
  • the sliding input direction shown in FIG. 2 is only an example. Those skilled in the art can understand that the sliding input may also be in other directions, causing different pictures to be displayed on the corresponding sub-screen (the first sub-screen 1, the second sub-screen 2, or another sub-screen); this is not repeated here.
  • the first picture A and the second picture B are different pictures in the gallery.
  • in order to prevent accidental touch operations on the first display area, where the first object that the user has determined as an object to be compared is located, the method further includes:
  • the fourth input is a preset input.
  • the fourth input may include, but is not limited to, at least one of click input, press input, long press input, pinch input, drag input, slide input, and swipe input.
  • the fourth input may be one of the above-mentioned inputs, or may also be a combined input of two or more of them.
  • the fourth input is a multi-click input, for example, a double-click input or a triple-click input.
  • in response to the fourth input, the first display area is controlled to shield its response to input operations; specifically, the first display area is controlled to shield its response to input operations other than a preset input, where the preset input is used to re-enable the first display area's response to input operations.
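A minimal sketch of this shielding behavior, assuming a single preset input that re-enables responses (the function and event names are hypothetical):

```python
def make_input_filter(preset_input):
    """Return a handler that shields (ignores) all inputs except the preset
    input, which re-enables responses. Names are illustrative only."""
    state = {"shielded": True}

    def handle(event):
        if state["shielded"]:
            if event == preset_input:
                state["shielded"] = False   # preset input re-enables responses
                return "enabled"
            return None                     # shielded: the input is ignored
        return f"handled {event}"

    return handle


handle = make_input_filter(preset_input="long-press")
print(handle("tap"))         # None: an accidental touch is shielded
print(handle("long-press"))  # enabled
print(handle("tap"))         # handled tap
```

The point of the design is that an accidental tap on the pinned area changes nothing until the user deliberately performs the preset input.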
  • the first object is network media information.
  • the first object currently displayed on the screen is displayed in the first display area on the screen. Specifically:
  • network media information may include web browsing interfaces, such as news web browsing interfaces, e-commerce (such as online shopping) web browsing interfaces, and so on.
  • a screenshot of the network media information currently displayed on the screen is taken and displayed in the first display area on the screen.
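This screenshot-and-pin step might be sketched as follows, with a placeholder capture function standing in for a real screen-capture API (the patent names no specific API):

```python
def pin_screenshot(render_current_page, first_area):
    """Capture the network media information currently displayed and place
    the screenshot in the first display area (illustrative sketch)."""
    screenshot = render_current_page()   # stand-in for a real screen capture
    first_area["content"] = screenshot
    first_area["is_screenshot"] = True
    return first_area


area = pin_screenshot(lambda: "pixels of product page", {})
print(area)   # {'content': 'pixels of product page', 'is_screenshot': True}
```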
  • the first input includes a pressing input on the first display area and a sliding input on the second display area on the screen.
  • the terminal device may be a terminal with a folding screen, where the folding screen has at least two sub-screens. It should be noted that the at least two sub-screens may be physically separate sub-screens, or may be sub-screens defined by the system or the user.
  • FIG. 3 shows a scene in which a user compares similar shopping products.
  • the second sub-screen 2 of the terminal device displays the related content of a product; the terminal device receives the user's continuous pressing input on the first sub-screen 1 and receives the user's sliding input on the second sub-screen 2 (in the direction indicated by the arrow in the figure); then, in response to the continuous pressing input and the sliding input, it takes a screenshot of the product content displayed on the second sub-screen 2 (to be used for subsequent product comparison) and displays the screenshot on the first sub-screen 1.
  • in the present disclosure, a first input to the screen is received; in response to the first input, the first object currently displayed on the screen is displayed in the first display area of the screen; a second input to the second display area of the screen is received; and, in response to the second input, the second object is displayed in the second display area, where the first object and the second object are objects under the same application. In this way, through simple input operations, both the first object and the second object are visually displayed on the screen, meeting users' need to compare different objects and improving user experience.
  • embodiments of the present disclosure provide a terminal device for implementing the above method.
  • FIG. 4 it is a schematic structural diagram of a terminal device provided by an embodiment of the present disclosure.
  • An embodiment of the present disclosure provides a terminal device 400, which may include: a first receiving module 401, a first display module 402, a second receiving module 403, and a second display module 404.
  • the first receiving module 401 is configured to receive a first input to the screen
  • the first display module 402 is configured to display a first object currently displayed on the screen in a first display area on the screen in response to the first input;
  • the second receiving module 403 is configured to receive a second input to the second display area on the screen
  • the second display module 404 is configured to display a second object in the second display area in response to the second input, and the first object and the second object are objects in the same application.
  • the terminal device 400 may further include: a first control module.
  • the first control module is configured to control the first object displayed on the first display area to remain unchanged while displaying a second object in the second display area in response to the second input.
  • the terminal device 400 may further include: a third receiving module and a second control module.
  • the third receiving module is configured to receive a third input to the first display area
  • the second control module is configured to control the first object displayed on the first display area to remain unchanged in response to the third input.
  • the terminal device 400 may further include: a fourth receiving module and a third control module.
  • a fourth receiving module configured to receive a fourth input to the first display area after controlling the first object displayed on the first display area to remain unchanged
  • the third control module is configured to control the first display area to shield the response to the input operation in response to the fourth input.
  • the first object is network media information
  • the first display module 402 includes:
  • the display unit is configured to take a screenshot of the network media information currently displayed on the screen and display it in the first display area on the screen.
  • the mobile terminal provided in the embodiment of the present disclosure can implement each process implemented by the mobile terminal in the method embodiments of FIG. 1 to FIG. 3, and in order to avoid repetition, details are not repeated here.
  • the terminal device receives the first input to the screen through the first receiving module; the first display module displays the first object currently displayed on the screen in the first display area of the screen in response to the first input; the second receiving module receives the second input to the second display area of the screen; and the second display module displays the second object in the second display area in response to the second input, where the first object and the second object are objects in the same application.
  • FIG. 5 is a schematic diagram of the hardware structure of a terminal device 500 that implements various embodiments of the present disclosure.
  • the terminal device 500 includes but is not limited to: a radio frequency unit 501, a network module 502, an audio output unit 503, an input unit 504, a sensor 505, a display unit 506, a user input unit 507, an interface unit 508, a memory 509, a processor 510, and Power 511 and other components.
  • terminal devices include but are not limited to mobile phones, tablet computers, notebook computers, palmtop computers, vehicle-mounted terminal devices, wearable devices, and pedometers.
  • the user input unit 507 is configured to receive a first input to the screen; the display unit 506 is configured to display, in response to the first input, the first object currently displayed on the screen in the first display area on the screen; the user input unit 507 is further configured to receive a second input to the second display area on the screen; and the display unit 506 is further configured to display a second object in the second display area in response to the second input, where the first object and the second object are objects in the same application.
  • in the present disclosure, a first input to the screen is received; in response to the first input, the first object currently displayed on the screen is displayed in the first display area of the screen; a second input to the second display area of the screen is received; and, in response to the second input, the second object is displayed in the second display area, where the first object and the second object are objects under the same application. In this way, through simple input operations, both the first object and the second object are visually displayed on the screen, meeting users' need to compare different objects and improving user experience.
  • the radio frequency unit 501 can be used for receiving and sending signals in the process of sending and receiving information or during a call. Specifically, downlink data from the base station is received and handed to the processor 510 for processing, and uplink data is sent to the base station.
  • the radio frequency unit 501 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like.
  • the radio frequency unit 501 can also communicate with the network and other devices through a wireless communication system.
  • the terminal device provides users with wireless broadband Internet access through the network module 502, such as helping users to send and receive emails, browse web pages, and access streaming media.
  • the audio output unit 503 may convert the audio data received by the radio frequency unit 501 or the network module 502 or stored in the memory 509 into an audio signal and output it as sound. Moreover, the audio output unit 503 may also provide audio output related to a specific function performed by the terminal device 500 (for example, call signal reception sound, message reception sound, etc.).
  • the audio output unit 503 includes a speaker, a buzzer, a receiver, and the like.
  • the input unit 504 is used to receive audio or video signals.
  • the input unit 504 may include a graphics processing unit (GPU) 5041 and a microphone 5042.
  • the graphics processor 5041 is configured to process image data of still pictures or videos obtained by an image capture device (such as a camera) in video capture mode or image capture mode.
  • the processed image frame may be displayed on the display unit 506.
  • the image frame processed by the graphics processor 5041 may be stored in the memory 509 (or other storage medium) or sent via the radio frequency unit 501 or the network module 502.
  • the microphone 5042 can receive sound and can process such sound into audio data.
  • in the case of a telephone call mode, the processed audio data can be converted into a format that can be sent to a mobile communication base station via the radio frequency unit 501 and then output.
  • the terminal device 500 also includes at least one sensor 505, such as a light sensor, a motion sensor, and other sensors.
  • the light sensor includes an ambient light sensor and a proximity sensor.
  • the ambient light sensor can adjust the brightness of the display panel 5061 according to the brightness of the ambient light.
  • the proximity sensor can turn off the display panel 5061 and/or the backlight when the terminal device 500 is moved to the ear.
  • the accelerometer sensor can detect the magnitude of acceleration in various directions (usually three axes) and can detect the magnitude and direction of gravity when stationary; it can be used to identify the posture of the mobile terminal device (such as switching between horizontal and vertical screens, related games, and magnetometer posture calibration) and for vibration-recognition-related functions (such as a pedometer or tapping). The sensor 505 may also include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, and the like, which are not repeated here.
  • the display unit 506 is used to display information input by the user or information provided to the user.
  • the display unit 506 may include a display panel 5061, and the display panel 5061 may be configured in the form of a liquid crystal display (LCD), an organic light-emitting diode (OLED), etc.
  • the user input unit 507 can be used to receive inputted numeric or character information, and generate key signal input related to user settings and function control of the mobile terminal device.
  • the user input unit 507 includes a touch panel 5071 and other input devices 5072.
  • the touch panel 5071, also called a touch screen, can collect the user's touch operations on or near it (for example, operations performed by the user on or near the touch panel 5071 with a finger, a stylus, or any other suitable object or accessory).
  • the touch panel 5071 may include two parts: a touch detection device and a touch controller.
  • the touch detection device detects the user's touch position, detects the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into contact coordinates, and sends them to the processor 510, and it also receives and executes commands sent by the processor 510.
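The pipeline described above, from touch detection device to touch controller to processor, can be sketched as follows (the coordinate scaling and command text are invented for illustration):

```python
def touch_controller(raw_signal):
    # Convert the raw touch signal into contact coordinates.
    return {"x": raw_signal["col"] * 0.1, "y": raw_signal["row"] * 0.1}

def processor(coords):
    # Decide on a command for the display based on the contact coordinates.
    return f"draw cursor at ({coords['x']:.1f}, {coords['y']:.1f})"

# Touch detection device -> touch controller -> processor 510 -> command
raw = {"col": 120, "row": 45}   # signal from the touch detection device
command = processor(touch_controller(raw))
print(command)                  # draw cursor at (12.0, 4.5)
```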
  • the touch panel 5071 can be realized in multiple types such as resistive, capacitive, infrared, and surface acoustic wave.
  • the user input unit 507 may also include other input devices 5072.
  • other input devices 5072 may include, but are not limited to, a physical keyboard, function keys (such as volume control buttons, switch buttons, etc.), trackball, mouse, and joystick, which will not be repeated here.
  • the touch panel 5071 can be overlaid on the display panel 5061.
  • when the touch panel 5071 detects a touch operation on or near it, it transmits the operation to the processor 510 to determine the type of the touch event, and the processor 510 then provides corresponding visual output on the display panel 5061 according to the type of the touch event.
  • although the touch panel 5071 and the display panel 5061 are used as two independent components to implement the input and output functions of the mobile terminal device, in some embodiments the touch panel 5071 and the display panel 5061 may be integrated to implement the input and output functions of the mobile terminal device, which is not specifically limited here.
  • the interface unit 508 is an interface for connecting an external device and the terminal device 500.
  • the external device may include a wired or wireless headset port, an external power source (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device with an identification module, audio input/output (I/O) port, video I/O port, headphone port, etc.
  • the interface unit 508 can be used to receive input (for example, data information or power) from an external device and transmit the received input to one or more elements within the terminal device 500, or it can be used to transfer data between the terminal device 500 and an external device.
  • the memory 509 can be used to store software programs and various data.
  • the memory 509 may mainly include a program storage area and a data storage area.
  • the program storage area may store an operating system and an application program required by at least one function (such as a sound playback function or an image playback function); the data storage area may store data created through the use of the mobile phone (such as audio data or a phone book).
  • the memory 509 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
  • the processor 510 is the control center of the mobile terminal device. It uses various interfaces and lines to connect the various parts of the entire mobile terminal device and, by running or executing the software programs and/or modules stored in the memory 509 and calling the data stored in the memory 509, it executes the various functions of the mobile terminal device and processes data, thereby monitoring the mobile terminal device as a whole.
  • the processor 510 may include one or more processing units; optionally, the processor 510 may integrate an application processor and a modem processor, where the application processor mainly handles the operating system, user interface, and application programs, and the modem processor mainly handles wireless communication. It can be understood that the foregoing modem processor may also not be integrated into the processor 510.
  • the terminal device 500 may also include a power source 511 (such as a battery) for supplying power to various components.
  • the power source 511 may be logically connected to the processor 510 through a power management system, so as to implement functions such as managing charging, discharging, and power consumption through the power management system.
  • the terminal device 500 includes some functional modules not shown, which will not be repeated here.
  • an embodiment of the present disclosure further provides a terminal device, including a processor 510, a memory 509, and a computer program stored in the memory 509 and executable on the processor 510, where the computer program, when executed by the processor 510, implements each process of the foregoing object display method embodiments.
  • the embodiments of the present disclosure also provide a computer-readable storage medium on which a computer program is stored; when the computer program is executed by a processor, each process of the foregoing object display method embodiments is implemented and the same technical effects can be achieved; to avoid repetition, details are not repeated here.
  • the computer-readable storage medium such as read-only memory (Read-Only Memory, ROM for short), random access memory (Random Access Memory, RAM for short), magnetic disk or optical disk, etc.
  • the technical solution of the present disclosure, in essence, or the part that contributes to the related technology, can be embodied in the form of a software product. The computer software product is stored in a storage medium (such as a ROM/RAM, a magnetic disk, or an optical disk) and includes several instructions to enable a terminal device (which can be a mobile phone, a computer, a server, an air conditioner, a network device, or the like) to execute the method described in each embodiment of the present disclosure.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Business, Economics & Management (AREA)
  • Physics & Mathematics (AREA)
  • Accounting & Taxation (AREA)
  • Finance (AREA)
  • Human Computer Interaction (AREA)
  • Development Economics (AREA)
  • General Business, Economics & Management (AREA)
  • Strategic Management (AREA)
  • Marketing (AREA)
  • Economics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Disclosed are an object display method and a terminal apparatus. The method comprises the steps of: receiving a first input on a screen; in response to the first input, displaying a first object currently shown on the screen in a first display area of the screen; receiving a second input on a second display area of the screen; and, in response to the second input, displaying a second object in the second display area, the first object and the second object being objects in the same application.
PCT/CN2020/071709 2019-01-31 2020-01-13 Object display method and terminal apparatus WO2020156115A1 (fr)
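As a rough illustration of the flow described in the abstract, the following Python sketch models the two-input, two-display-area behavior. All names here (DisplayObject, Screen, first_input, second_input) are hypothetical stand-ins for illustration only and do not come from the patent itself.

```python
# Hypothetical model of the claimed display flow: a first input moves the
# currently displayed object into a first display area, and a second input
# places a second object (from the same application) into a second area.

class DisplayObject:
    def __init__(self, name, app):
        self.name = name
        self.app = app  # application the object belongs to


class Screen:
    """Models a screen that can be split into two display areas."""

    def __init__(self, current_object):
        self.current_object = current_object  # object shown before the split
        self.first_area = None
        self.second_area = None

    def first_input(self):
        # In response to the first input, the object currently displayed
        # on the screen is shown in the first display area.
        self.first_area = self.current_object

    def second_input(self, obj):
        # The abstract requires both objects to be in the same application.
        if self.first_area is None or obj.app != self.first_area.app:
            raise ValueError("second object must be from the same application")
        self.second_area = obj


photo_a = DisplayObject("photo_a", app="gallery")
photo_b = DisplayObject("photo_b", app="gallery")

screen = Screen(current_object=photo_a)
screen.first_input()           # photo_a now occupies the first display area
screen.second_input(photo_b)   # photo_b appears in the second display area
print(screen.first_area.name, screen.second_area.name)  # photo_a photo_b
```

The same-application check mirrors the abstract's final clause; a real terminal implementation would of course route actual touch events and render the areas rather than track them in plain attributes.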

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910100624.3A CN109871176A (zh) 2019-01-31 2019-01-31 Object display method and terminal device
CN201910100624.3 2019-01-31

Publications (1)

Publication Number Publication Date
WO2020156115A1 true WO2020156115A1 (fr) 2020-08-06

Family

ID=66918501

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/071709 WO2020156115A1 (fr) 2019-01-31 2020-01-13 Object display method and terminal apparatus

Country Status (2)

Country Link
CN (1) CN109871176A (fr)
WO (1) WO2020156115A1 (fr)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109871176A (zh) * 2019-01-31 2019-06-11 维沃移动通信有限公司 Object display method and terminal device
CN111159983B (zh) * 2019-12-31 2023-07-04 维沃移动通信有限公司 Editing method and electronic device
CN111243200A (zh) * 2019-12-31 2020-06-05 维沃移动通信有限公司 Shopping method, wearable device, and medium

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120066621A1 (en) * 2010-09-14 2012-03-15 Nintendo Co., Ltd. Computer-readable storage medium having stored thereon display control program, display control system, display control apparatus, and display control method
CN106293772A (zh) * 2016-08-25 2017-01-04 维沃移动通信有限公司 Split-screen display processing method and mobile terminal
CN106940621A (zh) * 2016-01-05 2017-07-11 腾讯科技(深圳)有限公司 Picture processing method and apparatus
CN107707752A (zh) * 2017-09-28 2018-02-16 维沃移动通信有限公司 Picture processing method and mobile terminal
CN108108418A (zh) * 2017-12-14 2018-06-01 北京小米移动软件有限公司 Picture management method, apparatus, and storage medium
CN109871176A (zh) * 2019-01-31 2019-06-11 维沃移动通信有限公司 Object display method and terminal device

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4929414B1 (ja) * 2011-08-31 2012-05-09 楽天株式会社 Information processing device, control method for information processing device, program, and information storage medium
CN106681606A (zh) * 2016-12-06 2017-05-17 宇龙计算机通信科技(深圳)有限公司 Picture processing method and terminal
CN106990908B (zh) * 2017-04-06 2020-06-16 广州视源电子科技股份有限公司 Local touch masking method, apparatus, system, device, and storage medium
CN108446062A (zh) * 2018-02-13 2018-08-24 广州视源电子科技股份有限公司 Object fixing method, apparatus, terminal device, and storage medium
CN108763317B (zh) * 2018-04-27 2021-06-29 维沃移动通信有限公司 Method for assisting picture selection and terminal device
CN109284065B (zh) * 2018-10-23 2021-05-25 南昌努比亚技术有限公司 Key false-touch prevention method, apparatus, mobile terminal, and readable storage medium

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120066621A1 (en) * 2010-09-14 2012-03-15 Nintendo Co., Ltd. Computer-readable storage medium having stored thereon display control program, display control system, display control apparatus, and display control method
CN106940621A (zh) * 2016-01-05 2017-07-11 腾讯科技(深圳)有限公司 Picture processing method and apparatus
CN106293772A (zh) * 2016-08-25 2017-01-04 维沃移动通信有限公司 Split-screen display processing method and mobile terminal
CN107707752A (zh) * 2017-09-28 2018-02-16 维沃移动通信有限公司 Picture processing method and mobile terminal
CN108108418A (zh) * 2017-12-14 2018-06-01 北京小米移动软件有限公司 Picture management method, apparatus, and storage medium
CN109871176A (zh) * 2019-01-31 2019-06-11 维沃移动通信有限公司 Object display method and terminal device

Also Published As

Publication number Publication date
CN109871176A (zh) 2019-06-11

Similar Documents

Publication Publication Date Title
WO2021098678A1 (fr) Screen recording control method and electronic device
WO2019228294A1 (fr) Object sharing method and mobile terminal
US20220365641A1 (en) Method for displaying background application and mobile terminal
US11675442B2 (en) Image processing method and flexible-screen terminal
WO2020156169A1 (fr) Display control method and terminal device
WO2021017776A1 (fr) Information processing method and terminal
WO2020238449A1 (fr) Notification message processing method and terminal
WO2021098677A1 (fr) Display method and electronic device
WO2020151513A1 (fr) Information processing method and terminal device
WO2020259091A1 (fr) Screen content display method and terminal
CN109525710B (zh) Method and apparatus for accessing an application program
WO2020238497A1 (fr) Icon moving method and terminal device
WO2019223492A1 (fr) Information display method, mobile terminal, and computer-readable storage medium
US11354017B2 (en) Display method and mobile terminal
WO2020238463A1 (fr) Message processing method and terminal
WO2019223569A1 (fr) Information processing method and mobile terminal
WO2020156115A1 (fr) Object display method and terminal apparatus
CN108646960B (zh) File processing method and flexible-screen terminal
WO2020199783A1 (fr) Interface display method and terminal device
WO2021197265A1 (fr) Information presentation method, electronic device, and storage medium
WO2021077908A1 (fr) Parameter adjustment method and electronic device
WO2020220893A1 (fr) Screenshot method and mobile terminal
WO2020199986A1 (fr) Video call method and terminal device
WO2021104232A1 (fr) Display method and electronic device
WO2020173316A1 (fr) Image display method, terminal, and mobile terminal

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20747654

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20747654

Country of ref document: EP

Kind code of ref document: A1