WO2020151460A1 - Object processing method and terminal device - Google Patents

Object processing method and terminal device

Info

Publication number
WO2020151460A1
WO2020151460A1 PCT/CN2019/129861 CN2019129861W WO2020151460A1 WO 2020151460 A1 WO2020151460 A1 WO 2020151460A1 CN 2019129861 W CN2019129861 W CN 2019129861W WO 2020151460 A1 WO2020151460 A1 WO 2020151460A1
Authority
WO
WIPO (PCT)
Prior art keywords
target
terminal device
screen
input
content indicated
Prior art date
Application number
PCT/CN2019/129861
Other languages
English (en)
French (fr)
Inventor
李昊晨
Original Assignee
维沃移动通信有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 维沃移动通信有限公司 (Vivo Mobile Communication Co., Ltd.)
Publication of WO2020151460A1
Priority to US17/383,434 (published as US20210349591A1)

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481: Interaction techniques based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/0482: Interaction with lists of selectable items, e.g. menus
    • G06F 3/0484: Interaction techniques for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04842: Selection of displayed objects or displayed text elements
    • G06F 3/0486: Drag-and-drop
    • G06F 3/0487: Interaction techniques using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488: Interaction techniques using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 2203/00: Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F 2203/048: Indexing scheme relating to G06F3/048
    • G06F 2203/04803: Split screen, i.e. subdividing the display area or the window area into separate subareas

Definitions

  • the embodiments of the present disclosure relate to the field of communication technologies, and in particular, to an object processing method and terminal equipment.
  • taking photo files as an example: if there are many photos in the album of the terminal device, the screen of the terminal device may not be able to display all the photos in the album at the same time. Therefore, the user can swipe on the screen to trigger the terminal device to scroll the photos in the album, so that the user can select multiple photos from these photos and perform management operations on them, such as deleting the multiple photos.
  • if the user wants to change the selected photos, and the screen cannot display all of the photos that the user has selected at the same time, the user must slide on the screen again to trigger the terminal device to scroll back to the selected photos. Only then can the user change the selected photos and perform management operations on the changed selection, resulting in a cumbersome and time-consuming process of viewing and operating files.
  • the embodiments of the present disclosure provide an object processing method and terminal equipment to solve the problem of cumbersome and time-consuming processes of viewing and operating files.
  • in a first aspect, embodiments of the present disclosure provide an object processing method, which is applied to a terminal device including a first screen and a second screen.
  • the method includes: receiving a first input of a user, the first input being a selection input of a target object among at least one first object displayed on the first screen; in response to the first input, displaying the target object on the second screen; receiving a second input of the user for at least one second object displayed on the second screen, the at least one second object including the target object; and in response to the second input, performing target processing on the at least one second object.
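The four steps above can be sketched as a minimal model of the claimed flow: select on the first screen, mirror the selection on the second screen, then batch-process there. The class and method names below are illustrative assumptions for exposition, not part of the patent:

```python
# Minimal sketch of the claimed flow (all names are illustrative assumptions).
class DualScreenDevice:
    def __init__(self, first_screen_objects):
        self.first_screen = list(first_screen_objects)  # at least one first object
        self.second_screen = []                         # selected objects shown here

    def first_input(self, target):
        """Steps 101/102: selection input -> display the target object on the second screen."""
        if target in self.first_screen and target not in self.second_screen:
            self.second_screen.append(target)

    def second_input(self, action):
        """Steps 103/104: input on second-screen objects -> perform target processing."""
        return action(self.second_screen)

device = DualScreenDevice(["Image 1", "Image 2", "Image 3"])
device.first_input("Image 2")            # user selects a target object
result = device.second_input(sorted)     # stand-in for a batch operation
```

The point of the sketch is only the data flow: the second screen holds the running selection, and processing is applied to that list rather than to the scrolling first screen.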
  • in a second aspect, embodiments of the present disclosure provide a terminal device; the terminal device includes a first screen and a second screen, and further includes a receiving module, a display module, and a processing module.
  • the receiving module is configured to receive a user's first input, the first input being a selection input of a target object among at least one first object displayed on the first screen; the display module is configured to display the target object on the second screen in response to the first input received by the receiving module;
  • the receiving module is further configured to receive a second input of the user for at least one second object displayed on the second screen, the at least one second object including the target object;
  • the processing module is configured to perform target processing on the at least one second object in response to the second input received by the receiving module.
  • in a third aspect, the embodiments of the present disclosure provide a terminal device, including a processor, a memory, and a computer program stored in the memory and runnable on the processor, where the computer program, when executed by the processor, implements the steps of the object processing method provided in the first aspect.
  • in a fourth aspect, embodiments of the present disclosure provide a computer-readable storage medium on which a computer program is stored; when the computer program is executed by a processor, the steps of the object processing method provided in the first aspect are implemented.
  • in the embodiments of the present disclosure, a user's first input may be received, the first input being a selection input of a target object among at least one first object displayed on the first screen; in response to the first input, the target object is displayed on the second screen; a second input of the user for at least one second object displayed on the second screen is received, the at least one second object including the target object; and in response to the second input, target processing is performed on the at least one second object.
  • since the terminal device can display, on the second screen, the objects selected by the user from the multiple objects on the first screen, the user can change and manage one or more second objects on the second screen without sliding up and down on the first screen to trigger the terminal device to scroll back to the objects that the user has already selected, which can simplify the process of viewing and operating files and save the user time.
  • FIG. 1 is a schematic diagram of the architecture of an Android operating system provided by an embodiment of the disclosure
  • FIG. 2 is the first schematic diagram of an object processing method provided by an embodiment of the disclosure.
  • FIG. 3 is a schematic diagram of an operation on a target object provided by an embodiment of the disclosure.
  • FIG. 4 is a schematic diagram of an operation for a second object provided by an embodiment of the disclosure.
  • FIG. 5 is a second schematic diagram of an object processing method provided by an embodiment of the disclosure.
  • FIG. 6 is the third schematic diagram of an object processing method provided by an embodiment of the disclosure.
  • FIG. 7 is a schematic diagram of a terminal device displaying a third object according to an embodiment of the disclosure.
  • FIG. 8 is a fourth schematic diagram of an object processing method provided by an embodiment of the disclosure.
  • FIG. 9 is a schematic diagram of a user's operation of a target control provided by an embodiment of the disclosure.
  • FIG. 10 is a schematic structural diagram of a terminal device provided by an embodiment of the disclosure.
  • FIG. 11 is a schematic diagram of hardware of a terminal device provided by an embodiment of the disclosure.
  • the terms "first" and "second" in the specification and claims of the present disclosure are used to distinguish different objects, rather than to describe a specific order of objects.
  • first input and the second input are used to distinguish different inputs, rather than to describe a specific order of input.
  • words such as "exemplary" or "for example" are used as examples, illustrations, or explanations. Any embodiment or design solution described as "exemplary" or "for example" in the embodiments of the present disclosure should not be construed as being more advantageous than other embodiments or design solutions. Rather, words such as "exemplary" or "for example" are used to present related concepts in a concrete manner.
  • plural means two or more than two, for example, a plurality of elements means two or more elements, and so on.
  • Embodiments of the present disclosure provide an object processing method and terminal device, which can receive a user's first input, the first input being a selection input of a target object among at least one first object displayed on a first screen; display the target object on the second screen in response to the first input; receive a second input of the user for at least one second object displayed on the second screen, the at least one second object including the target object; and perform target processing on the at least one second object in response to the second input.
  • since the terminal device can display, on the second screen, the objects selected by the user from the multiple objects on the first screen, the user can change and manage one or more second objects on the second screen without needing to slide up and down on the first screen to trigger the terminal device to scroll back to the objects that the user has already selected, which can simplify the process of viewing and operating files and save the user time.
  • the terminal device in the embodiment of the present disclosure may be a terminal device with an operating system.
  • the operating system may be an Android operating system, an iOS operating system, or other possible operating systems, which are not specifically limited in the embodiments of the present disclosure.
  • FIG. 1 is a schematic diagram of the architecture of an Android operating system provided by an embodiment of the present disclosure.
  • the architecture of the Android operating system includes 4 layers, which are: application layer, application framework layer, system runtime library layer, and kernel layer (specifically, it may be the Linux kernel layer).
  • the application layer includes various applications (including system applications and third-party applications) in the Android operating system.
  • the application framework layer is the framework of the application. Developers can develop some applications based on the application framework layer while complying with the development principles of the application framework.
  • the system runtime library layer includes a library (also called a system library) and an Android operating system runtime environment.
  • the library mainly provides various resources needed by the Android operating system.
  • the Android operating system runtime environment is used to provide a software operating environment for applications of the Android operating system.
  • the kernel layer is the operating system layer of the Android operating system and belongs to the lowest level of the Android operating system software.
  • the kernel layer is based on the Linux kernel to provide core system services and hardware-related drivers for the Android operating system.
  • developers can develop software programs that implement the object processing method provided by the embodiments of the present disclosure based on the system architecture of the Android operating system shown in FIG. 1, that is, the object processing method can run on the Android operating system shown in FIG. 1. In other words, the processor of the terminal device can implement the object processing method provided by the embodiments of the present disclosure by running the software program in the Android operating system.
  • the terminal device in the embodiment of the present disclosure may be a mobile terminal device or a non-mobile terminal device.
  • the mobile terminal device may be a mobile phone, a tablet computer, a notebook computer, a handheld computer, a vehicle-mounted terminal device, a wearable device, an ultra-mobile personal computer (UMPC), a netbook, or a personal digital assistant (PDA).
  • the non-mobile terminal device may be a personal computer (PC), a television (TV), a teller machine, or a self-service machine, etc., which is not specifically limited in the embodiments of the present disclosure.
  • the execution subject of the object processing method provided by the embodiments of the present disclosure may be the above-mentioned terminal device, or a functional module and/or functional entity in the terminal device that can implement the object processing method, which may be determined according to actual usage requirements and is not limited by the embodiments of the present disclosure.
  • the following uses a terminal device as an example to illustrate the object processing method provided by the embodiment of the present disclosure.
  • an embodiment of the present disclosure provides an object processing method.
  • the method can be applied to a terminal device including a first screen and a second screen.
  • the method may include the following steps 101 to 104.
  • Step 101 The terminal device receives a user's first input.
  • the first input may be a selection input of a target object among at least one first object displayed on the first screen.
  • the user can trigger the terminal device to display at least one first object on the first screen, where each of the at least one first object may be used to indicate a file; the user then selects the target object from the at least one first object, so that the terminal device receives the user's input of selecting the target object, that is, the first input.
  • the content indicated by each first object in the at least one first object may be any one of the following: pictures, videos, audios, documents, and applications.
  • if the content indicated by a first object is a picture, the first object may be a thumbnail of the picture; if the content indicated by the first object is a video, the first object may be a thumbnail of any frame image of the video; if the content indicated by the first object is audio, the first object may be a picture, text, a logo, etc.; if the content indicated by the first object is a document, the first object may be a picture, text, a logo, etc.; and if the content indicated by the first object is an application, the first object may be an icon and text of the application.
  • first objects indicating pictures, videos, audios, documents, applications, etc. are displayed on the first screen of the terminal device, which makes it convenient for the user to perform unified management operations on pictures, videos, audios, documents, applications, and other files.
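The correspondence between a file's content type and the first object that represents it, as enumerated above, amounts to a simple lookup. A hypothetical sketch (the type names and fallback are assumptions, not from the patent):

```python
# Illustrative mapping from indicated content to a first-object representation,
# following the examples in the text above (names are assumptions).
REPRESENTATION = {
    "picture": "thumbnail of the picture",
    "video": "thumbnail of a frame image",
    "audio": "picture, text, or logo",
    "document": "picture, text, or logo",
    "application": "icon and text of the application",
}

def first_object_for(content_type):
    # fall back to a generic icon for content types the text does not list
    return REPRESENTATION.get(content_type, "generic file icon")
```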
  • the first input may be at least one of touch input, gravity input, and key input.
  • the touch input may be a long-press input, a sliding input, or a click input of the user on the touch screen of the terminal device;
  • the gravity input may be the user shaking the terminal device in a specific direction or shaking the terminal device a specific number of times, etc.;
  • the key input can be a single-click input, a double-click input, a long-press input, or a combination key input by the user on the terminal device key.
  • the first screen and the second screen of the terminal device may be two independent screens, and the first screen and the second screen may be connected by a shaft or a hinge; alternatively, the screen of the terminal device may be a flexible screen that can be folded into at least two parts, for example, into a first screen and a second screen.
  • the specific form can be determined according to actual usage requirements, and is not limited by the embodiments of the present disclosure.
  • the embodiments of the present disclosure are exemplified by taking a terminal device including two screens as an example, which does not impose any limitation on the embodiments of the present disclosure. It can be understood that, in actual implementation, the terminal device may include three or more screens, and the details may be determined according to actual usage requirements.
  • Step 102 In response to the first input, the terminal device displays the target object on the second screen.
  • the second screen of the terminal device may include a first area and a second area.
  • the first area may be used to display an object selected by the user, and the second area may be used to display at least one management operation control.
  • take, as an example, a case where the at least one first object is a plurality of photos in an album and the first input is the user's touch input on one of the plurality of photos.
  • the user can trigger the terminal device to open the album and display thumbnails of the photos in the album on the first screen 01, so that the user can make selection inputs on "Image 1", "Image 2", "Image 3", "Image 4", "Image 5", and "Image 6" in the album. The terminal device can receive the user's selection input on "Image 6" 03, that is, the first input, and in response to the first input, display "Image 6" in the first area 04 of the second screen 02; that is, "Image 6" 03 is displayed on the second screen.
  • the terminal device may display the target object on the second screen according to a preset display ratio, while the target object is displayed on the first screen according to a first display ratio. The preset display ratio may be smaller than the first display ratio, that is, the display size of the target object on the second screen is smaller than its display size on the first screen; or the preset display ratio may be equal to the first display ratio, that is, the display size on the second screen equals the display size on the first screen; or the preset display ratio may be greater than the first display ratio, that is, the display size of the target object on the second screen is larger than its display size on the first screen.
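The three preset-ratio cases reduce to a single scaling rule: the second-screen size is the first-screen size scaled by the ratio of the two display ratios. A sketch (the function and parameter names are assumptions, not from the patent):

```python
# Illustrative scaling rule for the target object's display size.
def second_screen_size(first_screen_size, first_ratio, preset_ratio):
    width, height = first_screen_size
    scale = preset_ratio / first_ratio
    return (width * scale, height * scale)

# preset ratio <  first ratio -> smaller on the second screen
# preset ratio == first ratio -> same size on both screens
# preset ratio >  first ratio -> larger on the second screen
```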
  • Step 103 The terminal device receives a second input of the user for at least one second object displayed on the second screen.
  • the aforementioned at least one second object may include a target object.
  • in an optional implementation manner, the at least one second object may be the objects selected by the user from the objects displayed on the first screen. In another optional implementation manner, if the number of the at least one second object is multiple, the target object may be an object selected by the user from the objects displayed on the first screen, and the second objects other than the target object may be objects selected from the objects displayed on the second screen.
  • if the second screen of the terminal device displays M second objects and the number of the at least one second object is N, then the at least one second object may be N objects among the M second objects, where M and N are both positive integers.
  • the second screen 02 of the terminal device may include the 6 images "Image 1", "Image 2", "Image 3", "Image 4", "Image 5", and "Image 6" selected by the user from the first screen, and these 6 images can be in the selected state.
  • if the user wants to change the selected "Image 4" to the non-selected state, that is, the user wants to change the selected pictures, the user can click on "Image 4" so that the terminal device changes "Image 4" to the non-selected state. If the user then clicks the share-image control 06 in the second area 05 of the second screen 02, the terminal device can send "Image 1", "Image 2", "Image 3", "Image 5", and "Image 6" to the target device.
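The deselect-then-share interaction in this example can be sketched as set operations (an illustrative model only; the function names are assumptions, and "share" stands in for sending to the target device):

```python
# Six images start selected on the second screen; clicking a selected image
# toggles it to the non-selected state; sharing sends what remains selected.
selected = {f"Image {i}" for i in range(1, 7)}

def toggle(image):
    # a click flips the image between the selected and non-selected states
    if image in selected:
        selected.discard(image)
    else:
        selected.add(image)

def share():
    return sorted(selected)  # stand-in for sending to the target device

toggle("Image 4")            # user taps the selected "Image 4"
shared = share()             # the remaining five images are sent
```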
  • the content indicated by the at least one second object may be any one of the following: pictures, videos, audios, documents, applications, and application installation packages.
  • if the content indicated by a second object is a picture, the second object may be a thumbnail of the picture; if the content indicated by the second object is a video, the second object may be a thumbnail of any frame image of the video; if the content indicated by the second object is audio, the second object may be a thumbnail, text, a logo, etc.; if the content indicated by the second object is a document, the second object may be a thumbnail, text, a logo, etc.; and if the content indicated by the second object is an application or an installation package of an application, the second object may be an icon, text, etc. of the application.
  • the second input may be at least one of touch input, gravity input, and key input.
  • the touch input may be a long-press input, a sliding input, or a click input of the user on the touch screen of the terminal device;
  • the gravity input may be the user shaking the terminal device in a specific direction or shaking the terminal device a specific number of times, etc.;
  • the key input can be a single-click input, a double-click input, a long-press input, or a combination key input by the user on the terminal device key.
  • Step 104 In response to the second input, the terminal device performs target processing on at least one second object.
  • performing target processing on the at least one second object may include any one of the following: sending the at least one second object to the target device; sending the content indicated by the at least one second object to the target device; deleting the at least one second object from the terminal device; deleting the content indicated by the at least one second object from the terminal device; changing the file format of the at least one second object; changing the file format of the content indicated by the at least one second object; changing the storage area of the at least one second object to a target storage area; changing the storage area of the content indicated by the at least one second object to the target storage area; merging the at least one second object into one object; and merging the content indicated by the at least one second object into one content.
  • the aforementioned target device may be a server or other terminal devices.
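The alternative processing operations listed above amount to a dispatch over an operation name. An illustrative model (the operation names and data structures are assumptions, not the patent's implementation):

```python
# "send", "delete", and "merge" are simulated with plain data structures.
def make_processor(storage):
    def process(operation, objects):
        if operation == "send":
            return list(objects)                 # would go to the target device
        if operation == "delete":
            return [o for o in storage if o not in objects]
        if operation == "merge":
            return ["+".join(objects)]           # merge into one object
        raise ValueError(f"unknown operation: {operation}")
    return process

storage = ["a.jpg", "b.jpg", "c.jpg"]
process = make_processor(storage)
```

Dispatching on an operation name keeps the second-input handling uniform: every management control on the second screen maps to one entry in the dispatch.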
  • in a case where the at least one second object is S thumbnails, performing target processing on the at least one second object may include any one of the following (1) to (10).
  • (1) the terminal device may send the S thumbnails to the target device, where the content indicated by the S thumbnails may be S pictures, S videos, S audios, or S documents, and S is a positive integer.
  • (2) taking the case where the number of the at least one second object is S as an example: if the content indicated by the at least one second object is S pictures, the terminal device may send the S pictures to the target device; if the content is S videos, it may send the S videos; if the content is S audios, it may send the S audios; and if the content is S documents, it may send the S documents to the target device.
  • (3) the terminal device may delete the S thumbnails from the terminal device, where the content indicated by the S thumbnails may be S pictures, S videos, S audios, or S documents.
  • (4) taking the case where the number of the at least one second object is S as an example: if the content indicated by the at least one second object is S pictures, the terminal device may delete the S pictures; if the content is S videos, it may delete the S videos; if the content is S audios, it may delete the S audios; and if the content is S documents, it may delete the S documents.
  • (5) the terminal device may change the file format of the S thumbnails, where the content indicated by the S thumbnails may be S pictures, S videos, S audios, or S documents.
  • (6) taking the case where the number of the at least one second object is S as an example: if the content indicated by the at least one second object is S pictures, the terminal device can change the file format of the S pictures; if the content is S videos, it can change the file format of the S videos; if the content is S audios, it can change the file format of the S audios; and if the content is S documents, it can change the file format of the S documents.
  • (7) the terminal device may change the storage area of the S thumbnails to the target storage area, where the content indicated by the S thumbnails may be S pictures, S videos, S audios, or S documents.
  • (8) taking the case where the number of the at least one second object is S as an example: if the content indicated by the at least one second object is S pictures, the terminal device may change the storage area of the S pictures to the target storage area; if the content is S videos, it may change the storage area of the S videos; if the content is S audios, it may change the storage area of the S audios; and if the content is S documents, it may change the storage area of the S documents to the target storage area.
  • (9) the terminal device may merge the S thumbnails into one object, where the content indicated by the S thumbnails may be S pictures, S videos, S audios, or S documents.
  • (10) taking the case where the number of the at least one second object is S as an example: if the content indicated by the at least one second object is S pictures, the terminal device may merge the S pictures into one picture; if the content is S videos, it may merge the S videos into one video; if the content is S audios, it may merge the S audios into one audio; and if the content is S documents, it may merge the S documents into one document.
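For case (10), merging the indicated contents into one could look like the following for S text documents (an illustrative sketch under the assumption that the documents are plain text; the separator choice is arbitrary):

```python
def merge_documents(documents):
    # Concatenate S text documents into a single document, separated by
    # blank lines: one possible meaning of "merge into one content".
    return "\n\n".join(documents)

merged = merge_documents(["first doc", "second doc", "third doc"])
```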
  • in a case where the at least one second object is S application icons and the content indicated by the S application icons is S application programs, performing target processing on the at least one second object may include any one of the following (1) to (4).
  • (1) the terminal device may delete the S application icons from the terminal device.
  • (2) the terminal device may delete the S application programs indicated by the icons from the terminal device.
  • (3) the terminal device may change the storage area of the S application icons to the target storage area.
  • (4) the terminal device may change the storage area of the S application programs indicated by the icons to the target storage area.
  • in a case where the at least one second object is S application icons and the content indicated by the S application icons is the installation packages of S application programs, performing target processing on the at least one second object may include any one of the following (1) to (7).
  • (1) the terminal device may send the S application icons to the target device.
  • (2) the terminal device may delete the S application icons from the terminal device.
  • (3) the terminal device may delete the installation packages of the S application programs from the terminal device.
  • (4) the terminal device may change the file format of the installation packages of the S application programs.
  • (5) the terminal device may change the storage area of the S application icons to the target storage area.
  • (6) the terminal device may change the storage area of the installation packages of the S application programs to the target storage area.
  • (7) the terminal device may merge the installation packages of the S application programs into one installation package.
  • The embodiment of the present disclosure provides an object processing method. Since the terminal device can display, on the second screen, the object selected by the user from the multiple objects on the first screen, the user can perform change and management operations on the selected object on the second screen without sliding up and down on the first screen to trigger the terminal device to scroll through the objects the user has selected, which simplifies the process of viewing and operating files and saves the user time.
  • the terminal device may update the display effect of the target object on the first screen to the target display effect before displaying the target object on the second screen.
  • the above step 102 may be implemented through the following step 102A.
  • Step 102A In response to the first input, the terminal device updates the display effect of the target object on the first screen to the target display effect, and displays the target object on the second screen.
  • The above-mentioned target display effect may be: displaying the target object in an enlarged manner, displaying the target object in a preset color, displaying the target object with transparency, displaying the target object in a blinking manner, displaying the target object in a floating manner, or displaying a preset mark, such as a dashed frame, on the target object.
  • the target display effect may also include other possible display effects, which are not specifically limited in the embodiment of the present disclosure.
  • the first input may include a first sub-input and a second sub-input.
  • the first sub-input may be a user's press input on the target object, and the second sub-input may be a sliding input on the target object.
  • FIG. 3 is still taken as an example for exemplary description.
  • the user can perform a press input on "Image 6" 03 on the first screen 01, so that the terminal device can receive the user's press input on "Image 6" 03, that is, the first sub-input, and in response to the first sub-input, display "Image 6" 03 in an enlarged manner.
  • Then, the terminal device can receive the sliding input on "Image 6" 03, that is, the second sub-input, and as shown in (b) of FIG. 3, the terminal device can display "Image 6" 03 on the second screen 04 in response to the second sub-input, that is, display the target object on the second screen.
  • In the object processing method provided by the embodiments of the present disclosure, displaying the target object with the target display effect lets the user know that the target object has been selected, thereby facilitating the user's subsequent operations.
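The press-then-slide interaction described above can be modeled as a small state machine: the press sub-input applies the target display effect, and the slide sub-input then places the object on the second screen. This is a schematic sketch under those assumptions, not the patented implementation; the class and method names are invented for illustration.

```python
class DualScreenSelector:
    """Tracks the two sub-inputs of the first input described above."""

    def __init__(self, first_screen_objects):
        self.first_screen = list(first_screen_objects)
        self.second_screen = []  # objects shown on the second screen
        self.effects = {}        # object -> current display effect
        self._pressed = None     # object awaiting the second sub-input

    def press(self, obj):
        # First sub-input: apply the target display effect (e.g. enlarge).
        if obj not in self.first_screen:
            raise ValueError("object is not displayed on the first screen")
        self.effects[obj] = "enlarged"
        self._pressed = obj

    def slide(self, obj):
        # Second sub-input: display the pressed object on the second screen.
        if self._pressed != obj:
            return False
        self.second_screen.append(obj)
        self._pressed = None
        return True
```

A press on "Image 6" followed by a slide on the same object would thus enlarge it and then copy it to the second screen, matching the FIG. 3 walkthrough above.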
  • the object processing method provided by the embodiment of the present disclosure may further include the following step 105 and step 106.
  • Step 105 The terminal device receives the user's third input on the first screen.
  • Step 106 In response to the third input, the terminal device updates at least one first object displayed on the first screen to at least one third object.
  • the above-mentioned at least one first object and at least one third object may be completely different or partially different. Specifically, it can be determined according to actual usage requirements, and the embodiment of the present disclosure does not limit it.
  • FIG. 6 takes, as an example, the case where the terminal device first performs step 101 and step 102 and then performs step 105 and step 106, which does not limit the embodiments of the present disclosure in any way. It can be understood that, in actual implementation, the terminal device may also first perform step 105 and step 106 and then perform step 101 to step 104; or the terminal device may first perform step 101 to step 104 and then perform step 105 and step 106, which can be determined according to actual usage requirements.
  • the third input may be a long press input, a slide input, or a click input. Specifically, it can be determined according to actual usage requirements, and the embodiment of the present disclosure does not limit it.
  • FIG. 7 is a schematic diagram of displaying a third object on a terminal device according to an embodiment of the present disclosure.
  • The above-mentioned FIG. 3 is taken as an example for illustrative description.
  • As shown in FIG. 7, the terminal device can receive the user's sliding input, that is, the third input, and in response to the third input, update the first objects displayed on the first screen as shown in FIG. 3 to the third objects displayed on the first screen as shown in FIG. 7; that is, the terminal device may update at least one first object displayed on the first screen to at least one third object.
  • Since the user can trigger the terminal device to display at least one third object according to actual use requirements, the user can select, from the at least one third object, other objects different from the target object, to trigger the terminal device to display the selected objects on the second screen.
  • the target control may also be displayed on the first screen.
  • the object processing method provided by the embodiment of the present disclosure may further include the following step 107 and step 108.
  • Step 107 The terminal device receives the fourth input of the user to the target control.
  • Step 108 In response to the fourth input, the terminal device controls at least one first object to be in a selectable state.
  • the fourth input may be a long-press input, a sliding input, or a click input to the target control.
  • it can be determined according to actual usage requirements, and the embodiment of the present disclosure does not limit it.
  • FIG. 9 is a schematic diagram of a user's operation of a target control provided by an embodiment of the disclosure.
  • Assuming the target control is "Edit Photo" 07 as shown in FIG. 9, before moving an object on the first screen to the second screen, the user can first click "Edit Photo" 07.
  • the terminal device receives the user's input to "Edit Photo” 07, that is, the fourth input, and in response to the fourth input, controls at least one first object to be in a selectable state.
  • In this way, the user can select the six images "Image 1", "Image 2", "Image 3", "Image 4", "Image 5", and "Image 6" shown in FIG. 9.
  • the user can select one or more objects from the first object.
  • It should be noted that FIG. 5, FIG. 6, and FIG. 8 in the embodiments of the present disclosure are all illustrated in conjunction with FIG. 2, which does not form any limitation on the embodiments of the present disclosure. It can be understood that, in actual implementation, FIG. 5, FIG. 6, and FIG. 8 can also be implemented in combination with any other drawings that can be combined.
  • an embodiment of the present disclosure provides a terminal device 1000.
  • the terminal device includes a first screen and a second screen.
  • the terminal device may include a receiving module 1001, a display module 1002, and a processing module 1003.
  • The receiving module 1001 can be used to receive a user's first input, the first input being a selection input for a target object among at least one first object displayed on the first screen; the display module 1002 can be used to display the target object on the second screen in response to the first input received by the receiving module 1001; the receiving module 1001 may also be used to receive a second input from the user for at least one second object displayed on the second screen, where the at least one second object may include the target object; and the processing module 1003 is configured to perform target processing on the at least one second object in response to the second input received by the receiving module 1001.
  • the content indicated by each first object may be any one of the following: pictures, videos, audios, documents, and applications.
  • the content indicated by the at least one second object is any one of the following: pictures, videos, audios, and documents.
  • The processing module 1003 may be specifically configured to: send the at least one first object to the target device; or send the content indicated by the at least one first object to the target device; or delete the at least one first object from the terminal device; or delete the content indicated by the at least one first object from the terminal device; or change the file format of the at least one first object; or change the file format of the content indicated by the at least one first object; or change the storage area of the at least one first object to the target storage area; or change the storage area of the content indicated by the at least one first object to the target storage area; or merge the at least one first object into one object; or merge the content indicated by the at least one first object into one content.
  • the content indicated by the at least one second object is an application program.
  • The processing module 1003 may be specifically configured to: delete the at least one second object from the terminal device; or delete the content indicated by the at least one second object from the terminal device; or change the storage area of the at least one second object to the target storage area; or change the storage area of the content indicated by the at least one second object to the target storage area.
  • the content indicated by the at least one second object is an installation package of an application program.
  • The processing module 1003 may be specifically configured to: send the content indicated by the at least one second object to the target device; or delete the at least one second object from the terminal device; or delete the content indicated by the at least one second object from the terminal device; or change the file format of the content indicated by the at least one second object; or change the storage area of the at least one second object to the target storage area; or change the storage area of the content indicated by the at least one second object to the target storage area; or merge the content indicated by the at least one second object into one content.
  • the display module 1002 may also be used to update the display effect of the target object on the first screen to the target display effect before displaying the target object on the second screen.
  • the receiving module 1001 can also be used to receive a third input from the user on the first screen; the display module 1002 can also be used to respond to the third input received by the receiving module 1001 To update at least one first object displayed on the first screen to at least one third object.
  • the target control is also displayed on the first screen.
  • The receiving module 1001 can also be used to receive a user's fourth input to the target control before receiving the first input; the processing module 1003 can also be used to control the at least one first object to be in a selectable state in response to the fourth input received by the receiving module 1001.
  • the terminal device provided in the embodiment of the present disclosure can implement each process implemented by the terminal device in the foregoing method embodiment, and to avoid repetition, details are not described herein again.
  • The embodiments of the present disclosure provide a terminal device. Since the terminal device can display, on the second screen, an object selected by the user from the multiple objects on the first screen, the user can perform change and management operations on one or more second objects on the second screen without sliding up and down on the first screen to trigger the terminal device to scroll through the objects the user has selected, so that the terminal device provided by the embodiments of the present disclosure can simplify the process of viewing and operating files and save the user time.
  • FIG. 11 is a schematic diagram of the hardware structure of a terminal device provided by an embodiment of the disclosure.
  • the terminal device 200 includes but is not limited to: a radio frequency unit 201, a network module 202, an audio output unit 203, an input unit 204, a sensor 205, a display unit 206, a user input unit 207, an interface unit 208, a memory 209, a processor 210, and a power supply 211.
  • Those skilled in the art can understand that the structure of the terminal device shown in FIG. 11 does not constitute a limitation on the terminal device.
  • the terminal device may include more or fewer components than shown in the figure, or combine certain components, or have a different arrangement of components.
  • terminal devices include but are not limited to mobile phones, tablet computers, notebook computers, palmtop computers, vehicle-mounted terminal devices, wearable devices, and pedometers.
  • The user input unit 207 may be used to receive a user's first input, the first input being a selection input for a target object among the at least one first object displayed on the first screen; the display unit 206 may be used to display the target object on the second screen in response to the first input received by the user input unit 207; the user input unit 207 may also be used to receive a second input of the user for at least one second object displayed on the second screen, where the at least one second object may include the target object; and the processor 210 may be configured to perform target processing on the at least one second object in response to the second input received by the user input unit 207.
  • The embodiments of the present disclosure provide a terminal device. Since the terminal device can display, on the second screen, an object selected by the user from the multiple objects on the first screen, the user can perform change and management operations on one or more second objects on the second screen without sliding up and down on the first screen to trigger the terminal device to scroll through the objects the user has selected, so that the terminal device provided by the embodiments of the present disclosure can simplify the process of viewing and operating files and save the user time.
  • The radio frequency unit 201 can be used for receiving and sending signals in the process of sending and receiving information or during a call. Specifically, downlink data from a base station is received and then processed by the processor 210; in addition, uplink data is sent to the base station.
  • the radio frequency unit 201 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like.
  • the radio frequency unit 201 can also communicate with the network and other devices through a wireless communication system.
  • the terminal device provides users with wireless broadband Internet access through the network module 202, such as helping users to send and receive emails, browse web pages, and access streaming media.
  • the audio output unit 203 can convert the audio data received by the radio frequency unit 201 or the network module 202 or stored in the memory 209 into audio signals and output them as sounds. Moreover, the audio output unit 203 may also provide audio output related to a specific function performed by the terminal device 200 (for example, call signal reception sound, message reception sound, etc.).
  • the audio output unit 203 includes a speaker, a buzzer, a receiver, and the like.
  • the input unit 204 is used to receive audio or video signals.
  • the input unit 204 may include a graphics processing unit (GPU) 2041 and a microphone 2042.
  • The graphics processing unit 2041 is configured to process image data of still pictures or videos obtained by an image capture device (such as a camera) in a video capture mode or an image capture mode.
  • the processed image frame may be displayed on the display unit 206.
  • the image frame processed by the graphics processor 2041 may be stored in the memory 209 (or other storage medium) or sent via the radio frequency unit 201 or the network module 202.
  • the microphone 2042 can receive sound, and can process such sound into audio data.
  • In the case of a telephone call mode, the processed audio data can be converted into a format that can be sent to a mobile communication base station via the radio frequency unit 201 for output.
  • the terminal device 200 also includes at least one sensor 205, such as a light sensor, a motion sensor, and other sensors.
  • the light sensor includes an ambient light sensor and a proximity sensor.
  • the ambient light sensor can adjust the brightness of the display panel 2061 according to the brightness of the ambient light.
  • The proximity sensor can turn off the display panel 2061 and/or the backlight when the terminal device 200 is moved close to the ear.
  • As a kind of motion sensor, the accelerometer sensor can detect the magnitude of acceleration in various directions (usually three axes), and can detect the magnitude and direction of gravity when stationary, which can be used to identify the posture of the terminal device (such as horizontal/vertical screen switching, related games, and magnetometer posture calibration) and for vibration-recognition-related functions (such as pedometer and tapping); the sensor 205 may also include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, etc., which will not be repeated here.
  • the display unit 206 is used to display information input by the user or information provided to the user.
  • the display unit 206 may include a display panel 2061, and the display panel 2061 may be configured in the form of a liquid crystal display (LCD), an organic light-emitting diode (OLED), etc.
  • the user input unit 207 can be used to receive inputted numeric or character information, and generate key signal input related to user settings and function control of the terminal device.
  • the user input unit 207 includes a touch panel 2071 and other input devices 2072.
  • The touch panel 2071, also called a touch screen, can collect the user's touch operations on or near it (for example, operations performed by the user on or near the touch panel 2071 using a finger, a stylus, or any other suitable object or accessory).
  • the touch panel 2071 may include two parts: a touch detection device and a touch controller.
  • The touch detection device detects the user's touch position, detects the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into contact coordinates, sends them to the processor 210, and receives and executes commands sent by the processor 210.
  • the touch panel 2071 can be implemented in multiple types such as resistive, capacitive, infrared, and surface acoustic wave.
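The division of labor described above — a touch detection device reporting raw signals, and a touch controller converting them into contact coordinates for the processor — can be sketched as follows. The normalized 0.0–1.0 raw-signal format and the scaling scheme are assumptions for illustration only; real controllers work with panel-specific sensor readings.

```python
def touch_controller(raw_signal, panel_width, panel_height):
    """Convert a raw touch signal into contact coordinates for the processor.

    `raw_signal` is assumed to be a pair of readings in the range 0.0-1.0
    reported by the touch detection device; the result is a pixel coordinate
    on a panel of the given width and height.
    """
    rx, ry = raw_signal
    if not (0.0 <= rx <= 1.0 and 0.0 <= ry <= 1.0):
        raise ValueError("reading outside the panel")
    # Scale the normalized readings to pixel coordinates on the panel.
    return (round(rx * (panel_width - 1)), round(ry * (panel_height - 1)))
```

The processor would then classify the resulting coordinate stream into touch-event types (press, slide, release) and drive the corresponding visual output, as described for the display panel 2061.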
  • the user input unit 207 may also include other input devices 2072.
  • other input devices 2072 may include, but are not limited to, a physical keyboard, function keys (such as volume control buttons, switch buttons, etc.), trackball, mouse, and joystick, which will not be repeated here.
  • the touch panel 2071 can be overlaid on the display panel 2061.
  • After the touch panel 2071 detects a touch operation on or near it, the touch operation is transmitted to the processor 210 to determine the type of the touch event, and then the processor 210 provides a corresponding visual output on the display panel 2061 according to the type of the touch event.
  • Although in FIG. 11 the touch panel 2071 and the display panel 2061 are used as two independent components to implement the input and output functions of the terminal device, in some embodiments, the touch panel 2071 and the display panel 2061 can be integrated to implement the input and output functions of the terminal device, which is not specifically limited here.
  • the interface unit 208 is an interface for connecting an external device with the terminal device 200.
  • the external device may include a wired or wireless headset port, an external power source (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device with an identification module, audio input/output (I/O) port, video I/O port, headphone port, etc.
  • The interface unit 208 can be used to receive input (for example, data information, power, etc.) from an external device and transmit the received input to one or more elements in the terminal device 200, or can be used to transfer data between the terminal device 200 and an external device.
  • the memory 209 can be used to store software programs and various data.
  • the memory 209 may mainly include a storage program area and a storage data area.
  • The storage program area may store an operating system, an application program required by at least one function (such as a sound playback function and an image playback function), etc.; the storage data area may store data created according to the use of the mobile phone (such as audio data and a phone book).
  • the memory 209 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
  • The processor 210 is the control center of the terminal device. It uses various interfaces and lines to connect the various parts of the entire terminal device, and performs various functions of the terminal device and processes data by running or executing software programs and/or modules stored in the memory 209 and calling data stored in the memory 209, so as to monitor the terminal device as a whole.
  • The processor 210 may include one or more processing units; optionally, the processor 210 may integrate an application processor and a modem processor, where the application processor mainly processes the operating system, user interface, application programs, etc., and the modem processor mainly handles wireless communication. It can be understood that the foregoing modem processor may alternatively not be integrated into the processor 210.
  • the terminal device 200 may also include a power source 211 (such as a battery) for supplying power to various components.
  • The power source 211 may be logically connected to the processor 210 through a power management system, so as to implement functions such as managing charging, discharging, and power consumption through the power management system.
  • the terminal device 200 includes some functional modules not shown, which will not be repeated here.
  • An embodiment of the present disclosure also provides a terminal device, including a processor 210 as shown in FIG. 11, a memory 209, and a computer program stored in the memory 209 and runnable on the processor 210. When the computer program is executed by the processor 210, the various processes of the foregoing method embodiments are implemented, and the same technical effect can be achieved; to avoid repetition, details are not described herein again.
  • the embodiments of the present disclosure also provide a computer-readable storage medium on which a computer program is stored.
  • When the computer program is executed by a processor, each process of the foregoing method embodiments is implemented, and the same technical effect can be achieved; to avoid repetition, details are not repeated here.
  • computer-readable storage media such as read-only memory (ROM), random access memory (RAM), magnetic disks or optical disks, etc.
  • The technical solution of the present disclosure, in essence or in the part contributing to the related art, can be embodied in the form of a software product. The computer software product is stored in a storage medium (such as a ROM/RAM, a magnetic disk, or an optical disk) and causes a terminal device (which may be a mobile phone, a computer, a server, an air conditioner, or a network device, etc.) to execute the methods described in the embodiments of the present disclosure.

Abstract

Embodiments of the present disclosure disclose an object processing method and a terminal device. The method includes: receiving a first input from a user, the first input being a selection input for a target object among at least one first object displayed on a first screen; in response to the first input, displaying the target object on a second screen; receiving a second input from the user for at least one second object displayed on the second screen, the at least one second object including the target object; and in response to the second input, performing target processing on the at least one second object.

Description

Object processing method and terminal device

Cross-reference to related applications

This application claims priority to Chinese patent application No. 201910074692.7, filed with the China National Intellectual Property Administration on January 25, 2019 and entitled "Object processing method and terminal device", the entire contents of which are incorporated herein by reference.

Technical field

Embodiments of the present disclosure relate to the field of communication technologies, and in particular to an object processing method and a terminal device.

Background

With the development of communication technologies, the memory capacity of terminal devices is increasing, so users can store various types of files, such as photos, documents, and videos, in a terminal device.

At present, a user can perform management operations on multiple files stored in a terminal device. Taking photos as an example: if an album of the terminal device includes many photos, the screen of the terminal device may not be able to display all the photos in the album at the same time, so the user can perform a sliding operation on the screen to trigger the terminal device to scroll through the photos in the album, and the user can then select multiple photos from these photos to perform management operations such as deletion on them.

However, in the above management process, if the user wants to change the selected photos, then when the screen cannot display all the selected photos at the same time, the user has to perform the sliding operation on the screen again to trigger the terminal device to scroll through the photos the user has selected. Only then can the user change the selection and perform management operations on the changed selection, which makes the process of viewing and operating files cumbersome and time-consuming.
Summary

Embodiments of the present disclosure provide an object processing method and a terminal device, to solve the problem that the process of viewing and operating files is cumbersome and time-consuming.

To solve the above technical problem, the embodiments of the present disclosure are implemented as follows:

In a first aspect, an embodiment of the present disclosure provides an object processing method, applied to a terminal device including a first screen and a second screen. The method includes: receiving a first input from a user, the first input being a selection input for a target object among at least one first object displayed on the first screen; in response to the first input, displaying the target object on the second screen; receiving a second input from the user for at least one second object displayed on the second screen, the at least one second object including the target object; and in response to the second input, performing target processing on the at least one second object.

In a second aspect, an embodiment of the present disclosure provides a terminal device. The terminal device includes a first screen and a second screen, and includes a receiving module, a display module, and a processing module. The receiving module is configured to receive a first input from a user, the first input being a selection input for a target object among at least one first object displayed on the first screen; the display module is configured to display the target object on the second screen in response to the first input received by the receiving module; the receiving module is further configured to receive a second input from the user for at least one second object displayed on the second screen, the at least one second object including the target object; and the processing module is configured to perform target processing on the at least one second object in response to the second input received by the receiving module.

In a third aspect, an embodiment of the present disclosure provides a terminal device, including a processor, a memory, and a computer program stored in the memory and runnable on the processor. When the computer program is executed by the processor, the steps of the object processing method provided in the first aspect are implemented.

In a fourth aspect, an embodiment of the present disclosure provides a computer-readable storage medium storing a computer program. When the computer program is executed by a processor, the steps of the object processing method provided in the first aspect are implemented.

In the embodiments of the present disclosure, a first input from a user may be received, the first input being a selection input for a target object among at least one first object displayed on the first screen; in response to the first input, the target object is displayed on the second screen; a second input from the user for at least one second object displayed on the second screen is received, the at least one second object including the target object; and in response to the second input, target processing is performed on the at least one second object. With this solution, since the terminal device can display, on the second screen, the object selected by the user from the multiple objects on the first screen, the user can perform change and management operations on one or more second objects on the second screen without sliding up and down on the first screen to trigger the terminal device to scroll through the objects the user has selected, which simplifies the process of viewing and operating files and saves the user time.
Brief description of the drawings

FIG. 1 is a schematic architectural diagram of an Android operating system according to an embodiment of the present disclosure;

FIG. 2 is a first schematic diagram of an object processing method according to an embodiment of the present disclosure;

FIG. 3 is a schematic diagram of an operation on a target object according to an embodiment of the present disclosure;

FIG. 4 is a schematic diagram of an operation on a second object according to an embodiment of the present disclosure;

FIG. 5 is a second schematic diagram of an object processing method according to an embodiment of the present disclosure;

FIG. 6 is a third schematic diagram of an object processing method according to an embodiment of the present disclosure;

FIG. 7 is a schematic diagram of a terminal device displaying a third object according to an embodiment of the present disclosure;

FIG. 8 is a fourth schematic diagram of an object processing method according to an embodiment of the present disclosure;

FIG. 9 is a schematic diagram of a user's operation on a target control according to an embodiment of the present disclosure;

FIG. 10 is a schematic structural diagram of a terminal device according to an embodiment of the present disclosure;

FIG. 11 is a schematic diagram of the hardware of a terminal device according to an embodiment of the present disclosure.
Detailed description

The technical solutions in the embodiments of the present disclosure will be described clearly and completely below with reference to the accompanying drawings in the embodiments of the present disclosure. Obviously, the described embodiments are part of the embodiments of the present disclosure, rather than all of them. Based on the embodiments in the present disclosure, all other embodiments obtained by a person of ordinary skill in the art without creative work fall within the protection scope of the present disclosure.
The term "and/or" herein describes an association relationship between associated objects and indicates that three relationships may exist. For example, "A and/or B" may indicate three cases: A alone, both A and B, and B alone. The symbol "/" herein indicates an "or" relationship between associated objects; for example, "A/B" means A or B.

The terms "first" and "second" in the specification and claims of the present disclosure are used to distinguish different objects, rather than to describe a specific order of objects. For example, the first input and the second input are used to distinguish different inputs, rather than to describe a specific order of inputs.

In the embodiments of the present disclosure, words such as "exemplary" or "for example" are used to indicate an example, illustration, or description. Any embodiment or design described as "exemplary" or "for example" in the embodiments of the present disclosure should not be construed as being more advantageous than other embodiments or designs. Rather, the use of words such as "exemplary" or "for example" is intended to present related concepts in a concrete manner.

In the description of the embodiments of the present disclosure, unless otherwise specified, "multiple" means two or more; for example, multiple elements means two or more elements.

Embodiments of the present disclosure provide an object processing method and a terminal device, which may receive a first input from a user, the first input being a selection input for a target object among at least one first object displayed on a first screen; in response to the first input, display the target object on a second screen; receive a second input from the user for at least one second object displayed on the second screen, the at least one second object including the target object; and in response to the second input, perform target processing on the at least one second object. With this solution, since the terminal device can display, on the second screen, the object selected by the user from the multiple objects on the first screen, the user can perform change and management operations on one or more second objects on the second screen without sliding up and down on the first screen to trigger the terminal device to scroll through the objects the user has selected, which simplifies the process of viewing and operating files and saves the user time.
The terminal device in the embodiments of the present disclosure may be a terminal device with an operating system. The operating system may be an Android operating system, an iOS operating system, or another possible operating system, which is not specifically limited in the embodiments of the present disclosure.

Taking the Android operating system as an example, the following introduces the software environment to which the object processing method provided by the embodiments of the present disclosure is applied.

As shown in FIG. 1, it is a schematic architectural diagram of an Android operating system according to an embodiment of the present disclosure. In FIG. 1, the architecture of the Android operating system includes four layers: an application layer, an application framework layer, a system runtime library layer, and a kernel layer (which may specifically be a Linux kernel layer).

The application layer includes the applications in the Android operating system (including system applications and third-party applications).

The application framework layer is the framework of applications; developers can develop applications based on the application framework layer while following the development principles of the application framework.

The system runtime library layer includes libraries (also called system libraries) and the Android operating system runtime environment. The libraries mainly provide the Android operating system with the various resources it needs. The Android operating system runtime environment provides a software environment for the Android operating system.

The kernel layer is the operating system layer of the Android operating system and is the lowest layer of the Android operating system software hierarchy. Based on the Linux kernel, the kernel layer provides core system services and hardware-related drivers for the Android operating system.

Taking the Android operating system as an example, in the embodiments of the present disclosure, developers can develop, based on the system architecture of the Android operating system shown in FIG. 1, a software program that implements the object processing method provided by the embodiments of the present disclosure, so that the object processing method can run based on the Android operating system shown in FIG. 1. That is, a processor or a terminal device can implement the object processing method provided by the embodiments of the present disclosure by running the software program in the Android operating system.

The terminal device in the embodiments of the present disclosure may be a mobile terminal device or a non-mobile terminal device. Exemplarily, the mobile terminal device may be a mobile phone, a tablet computer, a notebook computer, a palmtop computer, a vehicle-mounted terminal device, a wearable device, an ultra-mobile personal computer (UMPC), a netbook, or a personal digital assistant (PDA), and the non-mobile terminal device may be a personal computer (PC), a television (TV), a teller machine, or a self-service machine, which is not specifically limited in the embodiments of the present disclosure.

The execution subject of the object processing method provided by the embodiments of the present disclosure may be the above-mentioned terminal device, or a functional module and/or functional entity in the terminal device capable of implementing the object processing method, which can be determined according to actual usage requirements and is not limited in the embodiments of the present disclosure. The following takes a terminal device as an example to describe the object processing method provided by the embodiments of the present disclosure.
As shown in FIG. 2, an embodiment of the present disclosure provides an object processing method. The method may be applied to a terminal device including a first screen and a second screen, and may include the following step 101 to step 104.

Step 101: The terminal device receives a first input from a user.

The first input may be a selection input for a target object among at least one first object displayed on the first screen.

In the embodiments of the present disclosure, if the user wants to perform management operations on multiple files stored in the terminal device, the user can trigger the terminal device to display at least one first object on the first screen, where each first object in the at least one first object may be used to indicate one file, and select the target object from the at least one first object, so that the terminal device can receive the user's input of selecting the target object, that is, the first input.

Optionally, in the embodiments of the present disclosure, the content indicated by each first object in the at least one first object may be any one of the following: a picture, a video, an audio, a document, or an application program.

Exemplarily, if the content indicated by a first object is a picture, the first object may be a thumbnail of the picture; if the content indicated by the first object is a video, the first object may be a thumbnail of any frame of the video; if the content indicated by the first object is an audio, the first object may be a picture, text, a logo, etc.; if the content indicated by the first object is a document, the first object may be a picture, text, a logo, etc.; if the content indicated by the first object is an application program, the first object may be the icon and text of the application program, etc.

It can be understood that, in the embodiments of the present disclosure, displaying on the first screen of the terminal device multiple first objects indicating pictures, videos, audios, documents, application programs, etc., can facilitate the user in performing unified management operations on the pictures, videos, audios, documents, application programs, etc.

Optionally, in the embodiments of the present disclosure, the first input may be at least one of a touch input, a gravity input, a key input, etc. Specifically, the touch input may be the user's long-press input, slide input, or click input on the touch screen of the terminal device; the gravity input may be the user shaking the terminal device in a specific direction or shaking the terminal device a specific number of times; the key input may be the user's single-click input, double-click input, long-press input, or combined key input on a key of the terminal device.

Optionally, in the embodiments of the present disclosure, the first screen and the second screen of the terminal device may be two independent screens, and the first screen and the second screen may be connected by a shaft, a hinge, or the like; alternatively, the screen of the terminal device may be a flexible screen that can be folded into at least two parts, for example, folded into the first screen and the second screen. This can be determined according to actual usage requirements and is not limited in the embodiments of the present disclosure.

It should be noted that the embodiments of the present disclosure are exemplarily described by taking a terminal device including two screens as an example, which does not constitute any limitation on the embodiments of the present disclosure. It can be understood that, in actual implementation, the terminal device may include three or more screens, which can be determined according to actual usage requirements.
Step 102: In response to the first input, the terminal device displays the target object on the second screen.

Optionally, in the embodiments of the present disclosure, the second screen of the terminal device may include a first area and a second area, where the first area may be used to display the objects selected by the user, and the second area may be used to display at least one management operation control.

Exemplarily, the following description takes the at least one first object being multiple photos in an album and the first input being the user's touch input on one of the multiple photos as an example. As shown in (a) of FIG. 3, if the user wants to perform management operations on the photos in the album, the user can trigger the terminal device to open the album and display thumbnails of the photos in the album on the first screen 01, so that the user can perform selection input on "Image 1", "Image 2", "Image 3", "Image 4", "Image 5", "Image 6", etc. in the album. For example, if the user performs a press operation on "Image 6" 03 on the first screen 01, then as shown in (b) of FIG. 3, the terminal device can receive the user's selection input on "Image 6" 03, that is, the first input, and in response to the first input, display "Image 6" in the first area 04 of the second screen 02, that is, display "Image 6" 03 on the second screen.

Optionally, in the embodiments of the present disclosure, the terminal device may display the target object on the second screen at a preset display scale.

Exemplarily, assuming that the terminal device displays the target object on the first screen at a first display scale, the preset display scale may be smaller than the first display scale, that is, the display size of the target object on the second screen is smaller than that on the first screen; or the preset display scale may be equal to the first display scale, that is, the display size of the target object on the second screen is equal to that on the first screen; or the preset display scale may be larger than the first display scale, that is, the display size of the target object on the second screen is larger than that on the first screen. This can be determined according to actual usage requirements and is not limited in the embodiments of the present disclosure.
Step 103: The terminal device receives a second input from the user for at least one second object displayed on the second screen.

The at least one second object may include the target object.

Optionally, in the embodiments of the present disclosure, in one optional implementation, the at least one second object may be objects selected by the user from the objects displayed on the first screen; in another optional implementation, if there are multiple second objects, the target object may be an object selected by the user from the objects displayed on the first screen, and the objects in the at least one second object other than the target object may be objects selected from the objects displayed on the second screen.

In the embodiments of the present disclosure, assuming that the second screen of the terminal device includes M second objects and the number of the at least one second object is N, the at least one second object may be objects among the M second objects. Specifically, when N = M, the at least one second object is the M second objects; when N < M, the at least one second object is some of the M second objects. Both M and N are positive integers.

Exemplarily, as shown in FIG. 3, the second screen 02 of the terminal device may include the six images "Image 1", "Image 2", "Image 3", "Image 4", "Image 5", and "Image 6" selected by the user from the first screen, and these six images may be in a selected state. As shown in FIG. 4, if the user wants to change "Image 4" from the selected state to an unselected state, that is, the user wants to change the selected pictures, the user can click "Image 4", so that the terminal device changes "Image 4" to the unselected state. If the user clicks the share-selected-images control 06 in the second area 05 of the second screen 02, the terminal device can send "Image 1", "Image 2", "Image 3", "Image 4", and "Image 5" to the target device.
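The selection logic in the example above — N selected objects out of the M objects shown on the second screen, with a click toggling an object between the selected and unselected states — can be sketched as follows. The class and method names are illustrative assumptions, not part of the patented design.

```python
class SecondScreen:
    """Holds the M second objects and tracks which N of them are selected."""

    def __init__(self, objects):
        self.objects = list(objects)       # the M second objects
        self.selected = set(self.objects)  # initially all are selected

    def toggle(self, obj):
        # A click on a selected object deselects it, and vice versa.
        if obj in self.selected:
            self.selected.remove(obj)
        else:
            self.selected.add(obj)

    def share_targets(self):
        # The objects that would be sent to the target device when the
        # share control is clicked, in display order.
        return [o for o in self.objects if o in self.selected]
```

After toggling "Image 4" off, a share action would operate only on the remaining selected images, which is the N < M case described above.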
Optionally, in the embodiments of the present disclosure, the content indicated by the at least one second object may be any one of the following: a picture, a video, an audio, a document, an application program, or an installation package of an application program.

Exemplarily, if the content indicated by a second object is a picture, the second object may be a thumbnail of the picture; if the content indicated by the second object is a video, the second object may be a thumbnail of any frame of the video; if the content indicated by the second object is an audio, the second object may be a thumbnail, text, a logo, etc.; if the content indicated by the second object is a document, the second object may be a thumbnail, text, a logo, etc.; if the content indicated by the second object is an application program or an installation package of an application program, the second object may be the icon and text of the application program, etc.

Optionally, in the embodiments of the present disclosure, the second input may be at least one of a touch input, a gravity input, a key input, etc. Specifically, the touch input may be the user's long-press input, slide input, or click input on the touch screen of the terminal device; the gravity input may be the user shaking the terminal device in a specific direction or shaking the terminal device a specific number of times; the key input may be the user's single-click input, double-click input, long-press input, or combined key input on a key of the terminal device.

Step 104: In response to the second input, the terminal device performs target processing on the at least one second object.

Optionally, in the embodiments of the present disclosure, performing target processing on the at least one second object may include any one of the following: sending the at least one first object to the target device; sending the content indicated by the at least one first object to the target device; deleting the at least one first object from the terminal device; deleting the content indicated by the at least one first object from the terminal device; changing the file format of the at least one first object; changing the file format of the content indicated by the at least one first object; changing the storage area of the at least one first object to the target storage area; changing the storage area of the content indicated by the at least one first object to the target storage area; merging the at least one first object into one object; or merging the content indicated by the at least one first object into one content.
可选地,本公开实施例中,上述目标设备可以为服务器或其他终端设备。
可选地,若至少一个第二对象指示的内容为下述任意一项:图片、视频、音频、文档,则对至少一个第二对象进行目标处理可以包括下述(1)至(10)中任意一项。
(1)向目标设备发送至少一个第二对象。
示例性的,在至少一个第二对象为S个缩略图的情况下,终端设备可以向目标设备发送该S个缩略图。其中,该S个缩略图指示的内容可以为S个图片、S个视频、S个音频、S个文档。S为正整数。
(2)向目标设备发送至少一个第二对象指示的内容。
示例性的,以至少一个第二对象的数量为S个为例进行示例性说明。若至少一个第二对象指示的内容为S个图片,则终端设备可以向目标设备发送该S个图片。若至少一个第二对象指示的内容为S个视频,则终端设备可以向目标设备发送该S个视频。若至少一个第二对象指示的内容为S个音频,则终端设备可以向目标设备发送该S个音频。若至少一个第二对象指示的内容为S个文档,则终端设备可以向目标设备发送该S个文档。
(3)从终端设备中删除至少一个第二对象。
示例性的,在至少一个第二对象为S个缩略图的情况下,终端设备可以从终端设备中删除该S个缩略图。其中,该S个缩略图指示的内容可以为S个图片、S个视频、S个音频、S个文档。
(4)从终端设备中删除至少一个第二对象指示的内容。
示例性的,以至少一个第二对象的数量为S个为例进行示例性说明。若至少一个第二对象指示的内容为S个图片,则终端设备可以删除该S个图片。若至少一个第二对象指示的内容为S个视频,则终端设备可以删除该S个视频。若至少一个第二对象指示的内容为S个音频,则终端设备可以删除该S个音频。若至少一个第二对象指示的内容为S个文档,则终端设备可以删除该S个文档。
(5)更改至少一个第二对象的文件格式。
示例性的,在至少一个第二对象为S个缩略图的情况下,终端设备可以更改该S个缩略图的文件格式。其中,该S个缩略图指示的内容可以为S个图片、S个视频、S个音频、S个文档。
(6)更改至少一个第二对象指示的内容的文件格式。
示例性的,以至少一个第二对象的数量为S个为例进行示例性说明。若至少一个第二对象指示的内容为S个图片,则终端设备可以更改该S个图片的文件格式。若至少一个第二对象指示的内容为S个视频,则终端设备可以更改该S个视频的文件格式。若至少一个第二对象指示的内容为S个音频,则终端设备可以更改该S个音频的文件格式。若至少一个第二对象指示的内容为S个文档,则终端设备可以更改该S个文档的文件格式。
(7)将至少一个第二对象的存储区域更改为目标存储区域。
示例性的,在至少一个第二对象为S个缩略图的情况下,终端设备可以将该S个缩略图的存储区域更改为目标存储区域。其中,该S个缩略图指示的内容可以为S个图片、S个视频、S个音频、S个文档。
(8)将至少一个第二对象指示的内容的存储区域更改为目标存储区域。
示例性的,以至少一个第二对象的数量为S个为例进行示例性说明。若至少一个第二对象指示的内容为S个图片,则终端设备可以将该S个图片的存储区域更改为目标存储区域。若至少一个第二对象指示的内容为S个视频,则终端设备可以将该S个视频的存储区域更改为目标存储区域。若至少一个第二对象指示的内容为S个音频,则终端设备可以将该S个音频的存储区域更改为目标存储区域。若至少一个第二对象指示的内容为S个文档,则终端设备可以将该S个文档的存储区域更改为目标存储区域。
(9)将至少一个第二对象合并为一个对象。
示例性的,在至少一个第二对象为S个缩略图的情况下,终端设备可以将该S个缩略图合并为一个对象。其中,该S个缩略图指示的内容可以为S个图片、S个视频、S个音频、S个文档。
(10)将至少一个第二对象指示的内容合并为一个内容。
示例性的,以至少一个第二对象的数量为S个为例进行示例性说明。若至少一个第二对象指示的内容为S个图片,则终端设备可以将该S个图片合并为一个图片。若至少一个第二对象指示的内容为S个视频,则终端设备可以将该S个视频合并为一个视频。若至少一个第二对象指示的内容为S个音频,则终端设备可以将该S个音频合并为一个音频。若至少一个第二对象指示的内容为S个文档,则终端设备可以将该S个文档合并为一个文档。
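上述“发送”“删除”“合并”等目标处理,可以抽象为一个根据第二输入选择处理方式的调度函数。如下 Python 片段仅为示意性草图,操作名、函数与数据结构均为本文为说明而假设,并非本公开限定的实现:

```python
def target_process(action, objs, storage, device):
    """对至少一个第二对象执行目标处理(示意)。

    storage 模拟终端设备中的存储列表,device 模拟目标设备的接收列表。
    """
    if action == "发送":
        device.extend(objs)            # 向目标设备发送至少一个第二对象
    elif action == "删除":
        for o in objs:
            storage.remove(o)          # 从终端设备中删除至少一个第二对象
    elif action == "合并":
        merged = "+".join(objs)        # 将至少一个第二对象合并为一个对象
        for o in objs:
            storage.remove(o)
        storage.append(merged)
    else:
        raise ValueError("未知的目标处理: " + action)
    return storage
```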
可选地,若至少一个第二对象指示的内容为应用程序,则对至少一个第二对象进行目标处理可以包括下述(1)至(4)任意一项。
(1)从终端设备中删除至少一个第二对象。
示例性的,在至少一个第二对象为S个应用图标的情况下,终端设备可以从终端设备中删除该S个应用图标。其中,该S个应用图标指示的内容可以为S个应用程序。
(2)从终端设备中删除至少一个第二对象指示的内容。
示例性的,在至少一个第二对象指示的内容为S个应用程序的情况下,终端设备可以从终端设备中删除该S个应用程序。
(3)将至少一个第二对象的存储区域更改为目标存储区域。
示例性的,在至少一个第二对象为S个应用图标的情况下,终端设备可以将该S个应用图标的存储区域更改为目标存储区域。其中,该S个应用图标指示的内容可以为S个应用程序。
(4)将至少一个第二对象指示的内容的存储区域更改为目标存储区域。
示例性的,在至少一个第二对象指示的内容为S个应用程序的情况下,终端设备可以将该S个应用程序的存储区域更改为目标存储区域。
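“将内容的存储区域更改为目标存储区域”在文件层面相当于把对应文件移动到目标目录,可以用如下片段示意(示意性草图,函数名与路径组织均为本文假设):

```python
import pathlib
import shutil

def move_to_target_area(paths, target_dir):
    """将若干内容(文件)的存储区域更改为目标存储区域(目录),返回移动后的路径列表。"""
    target = pathlib.Path(target_dir)
    target.mkdir(parents=True, exist_ok=True)  # 目标存储区域不存在时先创建
    moved = []
    for p in paths:
        src = pathlib.Path(p)
        dst = target / src.name
        shutil.move(str(src), str(dst))        # 移动文件,即更改其存储区域
        moved.append(str(dst))
    return moved
```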
可选地,若至少一个第二对象指示的内容为应用程序的安装包,则对至少一个第二对象进行目标处理可以包括下述(1)至(7)任意一项。
(1)向目标设备发送至少一个第二对象指示的内容。
示例性的,在至少一个第二对象为S个应用图标、且该S个应用图标指示的内容为S个应用程序的安装包的情况下,终端设备可以向目标设备发送该S个应用程序的安装包。
(2)从终端设备中删除至少一个第二对象。
示例性的,在至少一个第二对象为S个应用图标的情况下,终端设备可以从终端设备中删除该S个应用图标。
(3)从终端设备中删除至少一个第二对象指示的内容。
示例性的,在至少一个第二对象指示的内容为S个应用程序的安装包的情况下,终端设备可以从终端设备中删除该S个应用程序的安装包。
(4)更改至少一个第二对象指示的内容的文件格式。
示例性的,在至少一个第二对象指示的内容为S个应用程序的安装包的情况下,终端设备可以更改该S个应用程序的安装包的文件格式。
(5)将至少一个第二对象的存储区域更改为目标存储区域。
示例性的,在至少一个第二对象为S个应用图标的情况下,终端设备可以将该S个应用图标的存储区域更改为目标存储区域。其中,该S个应用图标指示的内容可以为S个应用程序的安装包。
(6)将至少一个第二对象指示的内容的存储区域更改为目标存储区域。
示例性的,在至少一个第二对象指示的内容为S个应用程序的安装包的情况下,终端设备可以将该S个应用程序的安装包的存储区域更改为目标存储区域。
(7)将至少一个第二对象指示的内容合并为一个内容。
示例性的,在至少一个第二对象指示的内容为S个应用程序的安装包的情况下,终端设备可以将该S个应用程序的安装包合并为一个安装包。
本公开实施例提供一种对象处理方法,由于终端设备可以将用户从第一屏上的多个对象中选择的对象在第二屏幕上进行显示,因此用户可以在第二屏上对已选择的对象进行更改和管理操作等,而无需在第一屏上进行上下滑动操作,以触发终端设备滚动显示用户已选择的对象,从而可以简化查看和操作文件的过程,并节省用户时间。
可选地,结合图2,如图5所示,本公开实施例中,终端设备可以在第二屏幕上显示目标对象之前,将第一屏幕中的目标对象的显示效果更新为目标显示效果。具体的,上述步骤102可以通过下述的步骤102A实现。
步骤102A、终端设备响应于第一输入,将第一屏幕中的目标对象的显示效果更新为目标显示效果,并在第二屏幕上显示目标对象。
需要说明的是,对于目标对象的具体描述,可以参见上述实施例的步骤101中对目标对象的相关描述,此处不再赘述。
可选地,本公开实施例中,上述目标显示效果可以为放大显示目标对象、以预设颜色显示目标对象、以预设透明度显示目标对象、闪烁显示目标对象、悬浮显示目标对象、在目标对象上显示预设标识(例如虚线框)等。当然,该目标显示效果还可以包括其他可能的显示效果,本公开实施例不作具体限定。
可选地,本公开实施例中,第一输入可以包括第一子输入和第二子输入。其中,第一子输入可以为用户对目标对象的按压输入,第二子输入可以为对目标对象的滑动输入。
示例性的,仍以上述图3为例进行示例性说明。如图3中的(a)所示,用户可以对第一屏幕01上的“图像6”03进行按压输入,从而终端设备可以接收到用户对“图像6”03的按压输入,即第一子输入,并响应于该第一子输入,放大显示“图像6”03。可选地,如果用户按住“图像6”03向第二屏幕的方向滑动,那么终端设备可以接收到对“图像6”03的滑动输入,即第二子输入,并且如图3中的(b)所示,终端设备可以响应于该第二子输入,在第二屏幕02的第一区域04显示“图像6”03,即在第二屏幕上显示目标对象。
本公开实施例提供的对象处理方法,通过以目标显示效果显示目标对象,可以使用户知道目标对象已被选中,从而有利于用户进行其他操作。
可选地,结合图2,如图6所示,本公开实施例提供的对象处理方法还可以包括下述的步骤105和步骤106。
步骤105、终端设备接收用户在第一屏幕上的第三输入。
步骤106、终端设备响应于该第三输入,将第一屏幕上显示的至少一个第一对象更新为至少一个第三对象。
需要说明的是,本公开实施例中,上述至少一个第一对象和至少一个第三对象可以完全不同,也可以部分不同。具体可以根据实际使用需求确定,本公开实施例不作限定。
此外,上述图6是以终端设备先执行步骤101和步骤102,再执行步骤105和步骤106为例进行示例性说明的,其并不对本公开实施例形成任何限定。可以理解,实际实现时,终端设备也可以先执行步骤105和步骤106,再执行步骤101至步骤104;或者,终端设备可以先执行步骤101至步骤104,再执行步骤105和步骤106,具体可以根据实际使用需求确定。
可选地,本公开实施例中,第三输入可以为长按输入、滑动输入或者点击输入等。具体可以根据实际使用需求确定,本公开实施例不作限定。
示例性的,图7为本公开实施例提供的一种终端设备显示第三对象的示意图。以上述图3为例进行示例性说明,假设图3的第一屏幕上显示的图像为第一对象,在用户从第一屏幕选择“图像6”之后,用户可以在第一屏幕上向下滑动或向上滑动。如此,终端设备可以接收到用户的滑动输入,即第三输入,并响应于该第三输入,将如图3所示的第一屏幕上显示的第一对象,更新为如图7所示的第一屏幕上显示的第三对象,即终端设备可以将第一屏幕上显示的至少一个第一对象更新为至少一个第三对象。
本公开实施例提供的对象处理方法,由于用户可以根据实际使用需求触发终端设备显示至少一个第三对象,因此用户可以从第三对象中选择与目标对象不同的其他的对象,以触发终端设备在第二屏幕上显示选中的对象。
可选地,结合图2,如图8所示,第一屏幕上还可以显示目标控件。在上述步骤101之前,本公开实施例提供的对象处理方法还可以包括下述的步骤107和步骤108。
步骤107、终端设备接收用户对目标控件的第四输入。
步骤108、终端设备响应于该第四输入,控制至少一个第一对象处于可选择状态。
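步骤107、步骤108中“控制至少一个第一对象处于可选择状态”的状态变化,可以用如下片段示意(示意性模型,类名与字段均为本文为说明而假设):

```python
class FirstObject:
    """第一屏幕上显示的第一对象的选择状态模型(示意)。"""

    def __init__(self, name):
        self.name = name
        self.selectable = False  # 是否处于可选择状态
        self.selected = False    # 是否已被选中

def enable_selection(objs):
    """响应于对目标控件的第四输入,控制至少一个第一对象处于可选择状态。"""
    for o in objs:
        o.selectable = True
    return objs

def toggle_selected(obj):
    """仅当对象处于可选择状态时,点击才能在选中/非选中之间切换,并返回当前选中状态。"""
    if obj.selectable:
        obj.selected = not obj.selected
    return obj.selected
```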
可选地,本公开实施例中,第四输入可以为对目标控件的长按输入、滑动输入或者点击输入等。具体可以根据实际使用需求确定,本公开实施例不作限定。
示例性的,图9为本公开实施例提供的一种用户对目标控件的操作示意图。假设目标控件为如图9所示的“编辑照片”07,在用户将第一屏幕的对象移动到第二屏幕上之前,用户可先点击“编辑照片”07。如此,终端设备接收用户对“编辑照片”07的输入,即第四输入,并响应于该第四输入,控制至少一个第一对象处于可选择状态。可选地,在至少一个第一对象处于可选择状态的情况下,用户可以选择如图3所示的“图像1”、“图像2”、“图像3”、“图像4”、“图像5”和“图像6”这6个图像。
本公开实施例提供的对象处理方法,由于可以通过对控件的输入,使第一屏幕的第一对象处于可选择状态,因此用户可以从第一对象中选择一个或多个对象。
需要说明的是,本公开实施例中的上述附图5、附图6、附图8均是结合附图2进行示例说明的,其并不对本公开实施例形成任何限定。可以理解,实际实现时,附图5、附图6、附图8还可以结合其它任意可以结合的附图实现。
如图10所示,本公开实施例提供一种终端设备1000。该终端设备包括第一屏幕和第二屏幕。该终端设备可以包括接收模块1001、显示模块1002和处理模块1003。其中,接收模块1001,可以用于接收用户的第一输入,该第一输入可以为对第一屏幕上显示的至少一个第一对象中的目标对象的选择输入;显示模块1002,可以用于响应于接收模块1001接收的该第一输入,在第二屏幕上显示该目标对象;接收模块1001,还可以用于接收用户针对第二屏幕上显示的至少一个第二对象的第二输入,该至少一个第二对象可以包括目标对象;处理模块1003,用于响应于接收模块1001接收的该第二输入,对该至少一个第二对象进行目标处理。
可选地,本公开实施例中,每个第一对象指示的内容可以为下述任意一项:图片、视频、音频、文档、应用程序。
可选地,本公开实施例中,至少一个第二对象指示的内容为下述任意一项:图片、视频、音频、文档。处理模块1003,具体可以用于:向目标设备发送至少一个第二对象;或者,向目标设备发送至少一个第二对象指示的内容;或者,从终端设备中删除至少一个第二对象;或者,从终端设备中删除至少一个第二对象指示的内容;或者,更改至少一个第二对象的文件格式;或者,更改至少一个第二对象指示的内容的文件格式;或者,将至少一个第二对象的存储区域更改为目标存储区域;或者,将至少一个第二对象指示的内容的存储区域更改为目标存储区域;或者,将至少一个第二对象合并为一个对象;或者,将至少一个第二对象指示的内容合并为一个内容。
可选地,本公开实施例中,至少一个第二对象指示的内容为应用程序。处理模块1003,具体可以用于:从终端设备中删除至少一个第二对象;或者,从终端设备中删除至少一个第二对象指示的内容;或者,将至少一个第二对象的存储区域更改为目标存储区域;或者,将至少一个第二对象指示的内容的存储区域更改为目标存储区域。
可选地,本公开实施例中,至少一个第二对象指示的内容为应用程序的安装包。处理模块1003,具体可以用于:向目标设备发送至少一个第二对象指示的内容;或者,从终端设备中删除至少一个第二对象;或者,从终端设备中删除至少一个第二对象指示的内容;或者,更改至少一个第二对象指示的内容的文件格式;或者,将至少一个第二对象的存储区域更改为目标存储区域;或者,将至少一个第二对象指示的内容的存储区域更改为目标存储区域;或者,将至少一个第二对象指示的内容合并为一个内容。
可选地,本公开实施例中,显示模块1002,还可以用于在第二屏幕上显示目标对象之前,将第一屏幕中的目标对象的显示效果更新为目标显示效果。
可选地,本公开实施例中,接收模块1001,还可以用于接收用户在第一屏幕上的第三输入;显示模块1002,还可以用于响应于该接收模块1001接收的该第三输入,将第一屏幕上显示的至少一个第一对象更新为至少一个第三对象。
可选地,本公开实施例中,第一屏幕上还显示目标控件。接收模块1001,还可以用于在接收第一输入之前,接收用户对目标控件的第四输入;处理模块1003,还可以用于响应于接收模块1001接收的第四输入,控制至少一个第一对象处于可选择状态。
本公开实施例提供的终端设备能够实现上述方法实施例中终端设备实现的各个过程,为避免重复,这里不再赘述。
本公开实施例提供一种终端设备,由于终端设备可以将用户从第一屏幕上的多个对象中选择的对象在第二屏幕上进行显示,因此用户可以在第二屏幕上对一个或多个第二对象进行更改和管理操作等,而无需在第一屏幕上进行上下滑动操作,以触发终端设备滚动显示用户已选择的对象,从而本公开实施例提供的终端设备可以简化查看和操作文件的过程,并节省用户时间。
图11为本公开实施例提供的一种终端设备的硬件结构示意图。如图11所示,该终端设备200包括但不限于:射频单元201、网络模块202、音频输出单元203、输入单元204、传感器205、显示单元206、用户输入单元207、接口单元208、存储器209、处理器210、以及电源211等部件。本领域技术人员可以理解,图11中示出的终端设备结构并不构成对终端设备的限定,终端设备可以包括比图示更多或更少的部件,或者组合某些部件,或者不同的部件布置。在本公开实施例中,终端设备包括但不限于手机、平板电脑、笔记本电脑、掌上电脑、车载终端设备、可穿戴设备、以及计步器等。
其中,用户输入单元207,可以用于接收用户的第一输入,该第一输入可以为对第一屏幕上显示的至少一个第一对象中的目标对象的选择输入;显示单元206,可以用于响应于用户输入单元207接收的该第一输入,在第二屏幕上显示该目标对象;用户输入单元207,还可以用于接收用户针对第二屏幕上显示的至少一个第二对象的第二输入,该至少一个第二对象可以包括目标对象;处理器210,可以用于响应于用户输入单元207接收的该第二输入,对该至少一个第二对象进行目标处理。
本公开实施例提供一种终端设备,由于终端设备可以将用户从第一屏幕上的多个对象中选择的对象在第二屏幕上进行显示,因此用户可以在第二屏幕上对一个或多个第二对象进行更改和管理操作等,而无需在第一屏幕上进行上下滑动操作,以触发终端设备滚动显示用户已选择的对象,从而本公开实施例提供的终端设备可以简化查看和操作文件的过程,并节省用户时间。
应理解的是,本公开实施例中,射频单元201可用于在收发信息或通话过程中接收和发送信号。具体的,射频单元201将来自基站的下行数据接收后,交由处理器210处理;另外,将上行数据发送给基站。通常,射频单元201包括但不限于天线、至少一个放大器、收发信机、耦合器、低噪声放大器、双工器等。此外,射频单元201还可以通过无线通信系统与网络和其他设备通信。
终端设备通过网络模块202为用户提供了无线的宽带互联网访问,如帮助用户收发电子邮件、浏览网页和访问流式媒体等。
音频输出单元203可以将射频单元201或网络模块202接收的或者在存储器209中存储的音频数据转换成音频信号并且输出为声音。而且,音频输出单元203还可以提供与终端设备200执行的特定功能相关的音频输出(例如,呼叫信号接收声音、消息接收声音等等)。音频输出单元203包括扬声器、蜂鸣器以及受话器等。
输入单元204用于接收音频或视频信号。输入单元204可以包括图形处理器(graphics processing unit,GPU)2041和麦克风2042,图形处理器2041对在视频捕获模式或图像捕获模式中由图像捕获装置(如摄像头)获得的静态图片或视频的图像数据进行处理。处理后的图像帧可以显示在显示单元206上。经图形处理器2041处理后的图像帧可以存储在存储器209(或其它存储介质)中或者经由射频单元201或网络模块202进行发送。麦克风2042可以接收声音,并且能够将这样的声音处理为音频数据。处理后的音频数据可以在电话通话模式的情况下转换为可经由射频单元201发送到移动通信基站的格式输出。
终端设备200还包括至少一种传感器205,比如光传感器、运动传感器以及其他传感器。具体地,光传感器包括环境光传感器及接近传感器,其中,环境光传感器可根据环境光线的明暗来调节显示面板2061的亮度,接近传感器可在终端设备200移动到耳边时,关闭显示面板2061和/或背光。作为运动传感器的一种,加速计传感器可检测各个方向上(一般为三轴)加速度的大小,静止时可检测出重力的大小及方向,可用于识别终端设备姿态(比如横竖屏切换、相关游戏、磁力计姿态校准)、振动识别相关功能(比如计步器、敲击)等;传感器205还可以包括指纹传感器、压力传感器、虹膜传感器、分子传感器、陀螺仪、气压计、湿度计、温度计、红外线传感器等,在此不再赘述。
显示单元206用于显示由用户输入的信息或提供给用户的信息。显示单元206可包括显示面板2061,可以采用液晶显示器(liquid crystal display,LCD)、有机发光二极管(organic light-emitting diode,OLED)等形式来配置显示面板2061。
用户输入单元207可用于接收输入的数字或字符信息,以及产生与终端设备的用户设置以及功能控制有关的键信号输入。具体地,用户输入单元207包括触控面板2071以及其他输入设备2072。触控面板2071,也称为触摸屏,可收集用户在其上或附近的触摸操作(比如用户使用手指、触笔等任何适合的物体或附件在触控面板2071上或在触控面板2071附近的操作)。触控面板2071可包括触摸检测装置和触摸控制器两个部分。其中,触摸检测装置检测用户的触摸方位,并检测触摸操作带来的信号,将信号传送给触摸控制器;触摸控制器从触摸检测装置上接收触摸信息,并将它转换成触点坐标,再送给处理器210,接收处理器210发来的命令并加以执行。此外,可以采用电阻式、电容式、红外线以及表面声波等多种类型实现触控面板2071。除了触控面板2071,用户输入单元207还可以包括其他输入设备2072。具体地,其他输入设备2072可以包括但不限于物理键盘、功能键(比如音量控制按键、开关按键等)、轨迹球、鼠标、操作杆,在此不再赘述。
可选地,触控面板2071可覆盖在显示面板2061上,当触控面板2071检测到在其上或附近的触摸操作后,传送给处理器210以确定触摸事件的类型,随后处理器210根据触摸事件的类型在显示面板2061上提供相应的视觉输出。虽然在图11中,触控面板2071与显示面板2061是作为两个独立的部件来实现终端设备的输入和输出功能,但是在某些实施例中,可以将触控面板2071与显示面板2061集成而实现终端设备的输入和输出功能,具体此处不做限定。
接口单元208为外部装置与终端设备200连接的接口。例如,外部装置可以包括有线或无线头戴式耳机端口、外部电源(或电池充电器)端口、有线或无线数据端口、存储卡端口、用于连接具有识别模块的装置的端口、音频输入/输出(I/O)端口、视频I/O端口、耳机端口等等。接口单元208可以用于接收来自外部装置的输入(例如,数据信息、电力等等)并且将接收到的输入传输到终端设备200内的一个或多个元件或者可以用于在终端设备200和外部装置之间传输数据。
存储器209可用于存储软件程序以及各种数据。存储器209可主要包括存储程序区和存储数据区,其中,存储程序区可存储操作系统、至少一个功能所需的应用程序(比如声音播放功能、图像播放功能等)等;存储数据区可存储根据手机的使用所创建的数据(比如音频数据、电话本等)等。此外,存储器209可以包括高速随机存取存储器,还可以包括非易失性存储器,例如至少一个磁盘存储器件、闪存器件、或其他非易失性固态存储器件。
处理器210是终端设备的控制中心,利用各种接口和线路连接整个终端设备的各个部分,通过运行或执行存储在存储器209内的软件程序和/或模块,以及调用存储在存储器209内的数据,执行终端设备的各种功能和处理数据,从而对终端设备进行整体监控。处理器210可包括一个或多个处理单元;可选地,处理器210可集成应用处理器和调制解调处理器,其中,应用处理器主要处理操作系统、用户界面和应用程序等,调制解调处理器主要处理无线通信。可以理解的是,上述调制解调处理器也可以不集成到处理器210中。
终端设备200还可以包括给各个部件供电的电源211(比如电池),可选地,电源211可以通过电源管理系统与处理器210逻辑相连,从而通过电源管理系统实现管理充电、放电、以及功耗管理等功能。
另外,终端设备200包括一些未示出的功能模块,在此不再赘述。
可选地,本公开实施例还提供一种终端设备,包括如图11所示的处理器210,存储器209,存储在存储器209上并可在处理器210上运行的计算机程序,该计算机程序被处理器210执行时实现上述方法实施例的各个过程,且能达到相同的技术效果,为避免重复,这里不再赘述。
本公开实施例还提供一种计算机可读存储介质,计算机可读存储介质上存储有计算机程序,该计算机程序被处理器执行时实现上述方法实施例的各个过程,且能达到相同的技术效果,为避免重复,这里不再赘述。其中,计算机可读存储介质,如只读存储器(read-only memory,ROM)、随机存取存储器(random access memory,RAM)、磁碟或者光盘等。
需要说明的是,在本文中,术语“包括”、“包含”或者其任何其他变体意在涵盖非排他性的包含,从而使得包括一系列要素的过程、方法、物品或者装置不仅包括那些要素,而且还包括没有明确列出的其他要素,或者是还包括为这种过程、方法、物品或者装置所固有的要素。在没有更多限制的情况下,由语句“包括一个……”限定的要素,并不排除在包括该要素的过程、方法、物品或者装置中还存在另外的相同要素。
通过以上的实施方式的描述,本领域的技术人员可以清楚地了解到上述实施例方法可借助软件加必需的通用硬件平台的方式来实现,当然也可以通过硬件,但很多情况下前者是更佳的实施方式。基于这样的理解,本公开的技术方案本质上或者说对传统技术做出贡献的部分可以以软件产品的形式体现出来,该计算机软件产品存储在一个存储介质(如ROM/RAM、磁碟、光盘)中,包括若干指令用以使得一台终端设备(可以是手机,计算机,服务器,空调器,或者网络设备等)执行本公开各个实施例描述的方法。
上面结合附图对本公开的实施例进行了描述,但是本公开并不局限于上述的具体实施方式,上述的具体实施方式仅仅是示意性的,而不是限制性的,本领域的普通技术人员在本公开的启示下,在不脱离本公开宗旨和权利要求所保护的范围情况下,还可做出很多形式,均属于本公开的保护之内。

Claims (12)

  1. 一种对象处理方法,应用于包括第一屏幕和第二屏幕的终端设备,所述方法包括:
    接收用户的第一输入,所述第一输入为对所述第一屏幕上显示的至少一个第一对象中的目标对象的选择输入;
    响应于所述第一输入,在所述第二屏幕上显示所述目标对象;
    接收用户针对所述第二屏幕上显示的至少一个第二对象的第二输入,所述至少一个第二对象包括所述目标对象;
    响应于所述第二输入,对所述至少一个第二对象进行目标处理。
  2. 根据权利要求1所述的方法,其中,所述至少一个第二对象指示的内容为下述任意一项:图片、视频、音频、文档;
    所述对所述至少一个第二对象进行目标处理,包括下述任意一项:
    向目标设备发送所述至少一个第二对象,向目标设备发送所述至少一个第二对象指示的内容,从所述终端设备中删除所述至少一个第二对象,从所述终端设备中删除所述至少一个第二对象指示的内容,更改所述至少一个第二对象的文件格式,更改所述至少一个第二对象指示的内容的文件格式,将所述至少一个第二对象的存储区域更改为目标存储区域,将所述至少一个第二对象指示的内容的存储区域更改为目标存储区域,将所述至少一个第二对象合并为一个对象,将所述至少一个第二对象指示的内容合并为一个内容。
  3. 根据权利要求1所述的方法,其中,所述至少一个第二对象指示的内容为应用程序;所述对所述至少一个第二对象进行目标处理,包括下述任意一项:从所述终端设备中删除所述至少一个第二对象,从所述终端设备中删除所述至少一个第二对象指示的内容,将所述至少一个第二对象的存储区域更改为目标存储区域,将所述至少一个第二对象指示的内容的存储区域更改为目标存储区域;
    或者,
    所述至少一个第二对象指示的内容为应用程序的安装包;所述对所述至少一个第二对象进行目标处理,包括下述任意一项:向目标设备发送所述至少一个第二对象指示的内容,从所述终端设备中删除所述至少一个第二对象,从所述终端设备中删除所述至少一个第二对象指示的内容,更改所述至少一个第二对象指示的内容的文件格式,将所述至少一个第二对象的存储区域更改为目标存储区域,将所述至少一个第二对象指示的内容的存储区域更改为目标存储区域,将所述至少一个第二对象指示的内容合并为一个内容。
  4. 根据权利要求1至3中任一项所述的方法,其中,所述在所述第二屏幕上显示所述目标对象之前,所述方法还包括:
    将所述第一屏幕中的目标对象的显示效果更新为目标显示效果。
  5. 根据权利要求1至3中任一项所述的方法,其中,所述方法还包括:
    接收用户在所述第一屏幕上的第三输入;
    响应于所述第三输入,将所述第一屏幕上显示的所述至少一个第一对象更新为至少一个第三对象。
  6. 一种终端设备,所述终端设备包括第一屏幕和第二屏幕,所述终端设备包括接收模块、显示模块和处理模块;
    所述接收模块,用于接收用户的第一输入,所述第一输入为对所述第一屏幕上显示的至少一个第一对象中的目标对象的选择输入;
    所述显示模块,用于响应于所述接收模块接收的所述第一输入,在所述第二屏幕上显示所述目标对象;
    所述接收模块,还用于接收用户针对所述第二屏幕上显示的至少一个第二对象的第二输入,所述至少一个第二对象包括所述目标对象;
    所述处理模块,用于响应于所述接收模块接收的所述第二输入,对所述至少一个第二对象进行目标处理。
  7. 根据权利要求6所述的终端设备,其中,所述至少一个第二对象指示的内容为下述任意一项:图片、视频、音频、文档;
    所述处理模块,具体用于:向目标设备发送所述至少一个第二对象;或者,向目标设备发送所述至少一个第二对象指示的内容;或者,从所述终端设备中删除所述至少一个第二对象;或者,从所述终端设备中删除所述至少一个第二对象指示的内容;或者,更改所述至少一个第二对象的文件格式;或者,更改所述至少一个第二对象指示的内容的文件格式;或者,将所述至少一个第二对象的存储区域更改为目标存储区域;或者,将所述至少一个第二对象指示的内容的存储区域更改为目标存储区域;或者,将所述至少一个第二对象合并为一个对象;或者,将所述至少一个第二对象指示的内容合并为一个内容。
  8. 根据权利要求7所述的终端设备,其中,所述至少一个第二对象指示的内容为应用程序;所述处理模块,具体用于:从所述终端设备中删除所述至少一个第二对象;或者,从所述终端设备中删除所述至少一个第二对象指示的内容;或者,将所述至少一个第二对象的存储区域更改为目标存储区域;或者,将所述至少一个第二对象指示的内容的存储区域更改为目标存储区域;
    或者,
    所述至少一个第二对象指示的内容为应用程序的安装包;所述处理模块,具体用于:向目标设备发送所述至少一个第二对象指示的内容;或者,从所述终端设备中删除所述至少一个第二对象;或者,从所述终端设备中删除所述至少一个第二对象指示的内容;或者,更改所述至少一个第二对象指示的内容的文件格式;或者,将所述至少一个第二对象的存储区域更改为目标存储区域;或者,将所述至少一个第二对象指示的内容的存储区域更改为目标存储区域;或者,将所述至少一个第二对象指示的内容合并为一个内容。
  9. 根据权利要求6至8中任一项所述的终端设备,其中,所述显示模块,还用于在所述第二屏幕上显示所述目标对象之前,将所述第一屏幕中的目标对象的显示效果更新为目标显示效果。
  10. 根据权利要求6至8中任一项所述的终端设备,其中,所述接收模块,还用于接收用户在第一屏幕上的第三输入;
    所述显示模块,还用于响应于所述接收模块接收的所述第三输入,将所述第一屏幕上显示的所述至少一个第一对象更新为至少一个第三对象。
  11. 一种终端设备,包括处理器、存储器及存储在所述存储器上并可在所述处理器上运行的计算机程序,所述计算机程序被所述处理器执行时实现如权利要求1至5中任一项所述的对象处理方法的步骤。
  12. 一种计算机可读存储介质,所述计算机可读存储介质上存储计算机程序,所述计算机程序被处理器执行时实现如权利要求1至5中任一项所述的对象处理方法的步骤。
PCT/CN2019/129861 2019-01-25 2019-12-30 对象处理方法及终端设备 WO2020151460A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US17/383,434 US20210349591A1 (en) 2019-01-25 2021-07-23 Object processing method and terminal device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910074692.7A CN109917995B (zh) 2019-01-25 2019-01-25 一种对象处理方法及终端设备
CN201910074692.7 2019-01-25

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/383,434 Continuation US20210349591A1 (en) 2019-01-25 2021-07-23 Object processing method and terminal device

Publications (1)

Publication Number Publication Date
WO2020151460A1 true WO2020151460A1 (zh) 2020-07-30

Family

ID=66960871

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/129861 WO2020151460A1 (zh) 2019-01-25 2019-12-30 对象处理方法及终端设备

Country Status (3)

Country Link
US (1) US20210349591A1 (zh)
CN (1) CN109917995B (zh)
WO (1) WO2020151460A1 (zh)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109917995B (zh) * 2019-01-25 2021-01-08 维沃移动通信有限公司 一种对象处理方法及终端设备
CN110022445B (zh) * 2019-02-26 2022-01-28 维沃软件技术有限公司 一种内容输出方法及终端设备
CN110308839B (zh) * 2019-06-28 2020-11-03 维沃移动通信有限公司 一种文件管理方法及终端设备
CN110609724A (zh) * 2019-08-30 2019-12-24 维沃移动通信有限公司 显示处理方法及终端设备
CN110908552B (zh) * 2019-10-11 2021-08-10 广州视源电子科技股份有限公司 多窗口操作控制方法、装置、设备及存储介质
CN111143300B (zh) * 2019-12-25 2024-01-12 维沃移动通信有限公司 文件压缩方法及电子设备
CN112187626B (zh) * 2020-09-30 2023-04-07 维沃移动通信(杭州)有限公司 文件处理方法、装置及电子设备
CN112558851B (zh) * 2020-12-22 2023-05-23 维沃移动通信有限公司 对象处理方法、装置、设备和可读存储介质

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107861670A (zh) * 2017-11-30 2018-03-30 努比亚技术有限公司 双屏终端的交互显示方法、双屏终端和计算机存储介质
CN108280136A (zh) * 2017-12-27 2018-07-13 努比亚技术有限公司 一种多媒体对象预览方法、设备及计算机可读存储介质
CN108319412A (zh) * 2018-01-16 2018-07-24 努比亚技术有限公司 一种照片删除方法、移动终端和计算机可读存储介质
CN108459803A (zh) * 2018-02-28 2018-08-28 努比亚技术有限公司 基于双面屏的图片发送方法、移动终端及可读存储介质
CN109190388A (zh) * 2018-08-01 2019-01-11 维沃移动通信有限公司 一种加密方法、解密方法及终端设备
CN109213396A (zh) * 2018-07-12 2019-01-15 维沃移动通信有限公司 一种对象控制方法及终端
CN109917995A (zh) * 2019-01-25 2019-06-21 维沃移动通信有限公司 一种对象处理方法及终端设备

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120036466A1 (en) * 2010-08-04 2012-02-09 General Electric Company Systems and methods for large data set navigation on a mobile device
US9588668B2 (en) * 2011-07-21 2017-03-07 Imerj, Llc Methods of displaying a second view
US9591181B2 (en) * 2012-03-06 2017-03-07 Apple Inc. Sharing images from image viewing and editing application
CN104133629A (zh) * 2014-07-10 2014-11-05 深圳市中兴移动通信有限公司 双屏互动的方法及移动终端
CN106603823A (zh) * 2016-11-28 2017-04-26 努比亚技术有限公司 一种内容分享方法、装置及终端
CN107977152A (zh) * 2017-11-30 2018-05-01 努比亚技术有限公司 一种基于双屏移动终端的图片分享方法、终端和存储介质
CN109981878B (zh) * 2017-12-28 2021-09-14 华为终端有限公司 一种图标管理的方法及装置
US10552136B2 (en) * 2018-06-29 2020-02-04 Alibaba Group Holding Limited One click application asset distribution
CN117093301A (zh) * 2018-08-15 2023-11-21 华为技术有限公司 显示方法及装置
WO2020051968A1 (zh) * 2018-09-11 2020-03-19 华为技术有限公司 数据分享的方法、图形用户界面、电子设备及系统


Also Published As

Publication number Publication date
US20210349591A1 (en) 2021-11-11
CN109917995B (zh) 2021-01-08
CN109917995A (zh) 2019-06-21


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19912124

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19912124

Country of ref document: EP

Kind code of ref document: A1