CN112558851A - Object processing method, device, equipment and readable storage medium - Google Patents

Object processing method, device, equipment and readable storage medium

Info

Publication number
CN112558851A
CN112558851A CN202011527128.5A
Authority
CN
China
Prior art keywords
input
target
screen area
screen
objects
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202011527128.5A
Other languages
Chinese (zh)
Other versions
CN112558851B (en)
Inventor
郭敏旋
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd filed Critical Vivo Mobile Communication Co Ltd
Priority to CN202011527128.5A priority Critical patent/CN112558851B/en
Publication of CN112558851A publication Critical patent/CN112558851A/en
Application granted granted Critical
Publication of CN112558851B publication Critical patent/CN112558851B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/10File systems; File servers
    • G06F16/16File or folder operations, e.g. details of user interfaces specifically adapted to file systems
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/10File systems; File servers
    • G06F16/16File or folder operations, e.g. details of user interfaces specifically adapted to file systems
    • G06F16/162Delete operations
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application discloses an object processing method, apparatus, device, and readable storage medium, which belong to the field of communications technology and can solve the prior-art problem of low efficiency when operating on objects in batches. The method may include the following steps: receiving a first input to a first screen area; in response to the first input, adding at least two target objects, from among the objects displayed in the first screen area, to a second screen area for display; receiving a second input to the second screen area; and in response to the second input, performing batch processing on the at least two target objects. The method and the device can improve the efficiency of batch operations on objects.

Description

Object processing method, device, equipment and readable storage medium
Technical Field
The present application belongs to the field of communications, and in particular, relates to an object processing method, apparatus, device, and readable storage medium.
Background
With the development of mobile devices, their functions are becoming increasingly rich, they contain more and more files, and users often need to operate on objects such as files in batches, for example deleting multiple files, sharing multiple files, or moving multiple files to a new location.
In the process of implementing the embodiments of the present application, the inventors found that the prior-art methods for batch operations on objects have low processing efficiency.
Disclosure of Invention
An object of the embodiments of the present application is to provide an object processing method, an object processing apparatus, an object processing device, and a readable storage medium, which can solve the prior-art problem of low efficiency when operating on objects in batches.
In order to solve the technical problem, the present application is implemented as follows:
In a first aspect, an embodiment of the present application provides an object processing method applied to an electronic device having at least a first screen area and a second screen area, where the method includes:
receiving a first input to the first screen area;
in response to the first input, adding at least two target objects, from among the objects displayed in the first screen area, to the second screen area for display;
receiving a second input to the second screen area;
and in response to the second input, performing batch processing on the at least two target objects.
In a second aspect, an embodiment of the present application provides an apparatus for object processing, where the apparatus includes:
a first receiving module, configured to receive a first input to a first screen area;
a first processing module, configured to add at least two target objects, from among the objects displayed in the first screen area, to a second screen area for display in response to the first input;
a second receiving module, configured to receive a second input to the second screen area;
and a second processing module, configured to perform batch processing on the at least two target objects in response to the second input.
In a third aspect, an embodiment of the present application provides an electronic device, which includes a processor, a memory, and a program or instructions stored on the memory and executable on the processor, and when executed by the processor, the program or instructions implement the steps of the method according to the first aspect.
In a fourth aspect, embodiments of the present application provide a readable storage medium, on which a program or instructions are stored, which when executed by a processor implement the steps of the method according to the first aspect.
In a fifth aspect, an embodiment of the present application provides a chip, where the chip includes a processor and a communication interface, where the communication interface is coupled to the processor, and the processor is configured to execute a program or instructions to implement the method according to the first aspect.
In the embodiments of the application, when a first input to a first screen area is received, at least two target objects among the objects displayed in the first screen area are added to a second screen area in response to the first input, and the at least two target objects can then be processed in batch according to a second input. Therefore, with the scheme of the embodiments of the application, multiple objects can be selected at the same time according to an input on the first screen area and processed in batch according to an input on the second screen area, which improves the efficiency of batch operations on objects.
Drawings
Fig. 1 is a flowchart of an object processing method according to an embodiment of the present application;
Figs. 2 to 19 are schematic diagrams illustrating operations of embodiments of the present application;
Fig. 20 is a schematic structural diagram of an object processing apparatus according to an embodiment of the present application;
Fig. 21 is one of the schematic structural diagrams of an electronic device according to an embodiment of the present application;
Fig. 22 is a second schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some, but not all, embodiments of the present application. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The terms "first", "second" and the like in the description and claims of the present application are used to distinguish between similar elements and not necessarily to describe a particular sequence or chronological order. It should be understood that the data so used are interchangeable under appropriate circumstances, so that the embodiments of the application can be practiced in sequences other than those illustrated or described herein. Moreover, the terms "first", "second" and the like are generally used generically and do not limit the number of the items they qualify; for example, a first item can be one item or more than one. In addition, "and/or" in the description and claims denotes at least one of the connected objects, and the character "/" generally indicates that the associated objects before and after it are in an "or" relationship.
The object processing method provided by the embodiment of the present application is described in detail below with reference to the accompanying drawings through specific embodiments and application scenarios thereof.
Referring to fig. 1, fig. 1 is a flowchart of an object processing method according to an embodiment of the present application. As shown in fig. 1, an object processing method according to an embodiment of the present application includes:
Step 101: receiving a first input to a first screen area.
The method is applicable to an electronic device having at least a first screen region and a second screen region, or an electronic device in which a screen is foldable to form the first screen region and the second screen region. The first screen area may refer to any display area of the electronic device, and correspondingly, the second screen area may be any display area except the first screen area. The first input may be, for example, a click input, a touch input, a slide input, or the like.
Step 102: in response to the first input, adding at least two target objects, from among the objects displayed in the first screen area, to the second screen area for display.
In the embodiments of the present application, the objects include, but are not limited to, files, images, and icons.
Specifically, in this step, an input trajectory of the first input on the first screen area may be acquired, and at least two objects whose display positions lie within the area enclosed by the input trajectory may then be taken as the target objects. Alternatively, the objects corresponding to the line connecting the input starting point and the input end point of the input trajectory may be taken as the target objects, where the number of target objects is at least two. The at least two target objects are then displayed in the second screen area.
The "at least two objects corresponding to the line between the input starting point and the input end point of the input trajectory" may be objects whose display positions lie on that connecting line, or objects whose positions in the arrangement order of the objects fall between the input starting point and the input end point. In this way, multiple target objects can be determined simply and quickly.
As shown in fig. 2, an input trajectory is formed in response to the user's input on the first screen area, and the three files whose display positions lie within the area enclosed by the input trajectory are taken as the target objects. As shown in fig. 3, the user's finger slides from file a to file b, so the input starting point is on file a and the input end point is on file b. According to the left-to-right, top-to-bottom arrangement order of the objects, the files whose positions in that order fall after file a and before file b are taken as the target files. With this operation, multiple objects can be selected at the same time, which improves processing efficiency. Moreover, multiple objects can be selected by operating on a single screen area, without having to make selections across multiple screens, which reduces the number of operation steps.
As shown in fig. 2 and fig. 3, all of the selected files are moved to the second screen area in this step.
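By way of illustration only, the two selection rules described above can be sketched in Kotlin as follows. The types and names (DisplayObject, selectByLasso, selectByRange) are assumptions introduced for this sketch and are not part of the embodiments; the point-in-polygon test is a standard ray-casting check against the trajectory points.

```kotlin
// Illustrative sketch only; the data model is an assumption, not part of the embodiments.
data class DisplayObject(val id: String, val x: Float, val y: Float, val order: Int)

// Rule 1: target objects are those whose display positions lie inside the closed
// area formed by the input trajectory (ray-casting point-in-polygon test).
fun selectByLasso(
    objects: List<DisplayObject>,
    trajectory: List<Pair<Float, Float>>
): List<DisplayObject> = objects.filter { obj ->
    var inside = false
    var j = trajectory.lastIndex
    for (i in trajectory.indices) {
        val (xi, yi) = trajectory[i]
        val (xj, yj) = trajectory[j]
        if ((yi > obj.y) != (yj > obj.y) &&
            obj.x < (xj - xi) * (obj.y - yi) / (yj - yi) + xi
        ) inside = !inside
        j = i
    }
    inside
}

// Rule 2: target objects are those whose position in the left-to-right, top-to-bottom
// arrangement order lies between the object under the input starting point and the
// object under the input end point (endpoints included here by assumption).
fun selectByRange(
    objects: List<DisplayObject>,
    start: DisplayObject,
    end: DisplayObject
): List<DisplayObject> {
    val range = minOf(start.order, end.order)..maxOf(start.order, end.order)
    return objects.filter { it.order in range }
}
```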
In practical applications, after this step, if the operation of moving the target objects to the second screen area for display is to be cancelled, a third input to at least one of the first screen area and the second screen area may further be received, and at least some of the at least two target objects in the second screen area are moved back to the first screen area in response to the third input. The third input may be, for example, a click input, a touch input, a slide input, or the like. In this way, operations on multiple objects are simplified and become more convenient.
As shown in fig. 4, the first screen area and the second screen area are folded together and then unfolded, and at least some of the objects in the second screen area are moved back to the first screen area in response to this input. As shown in fig. 5, three of the user's fingers slide from the second screen area toward the first screen area (e.g., to the boundary between the first screen area and the second screen area), and at least some of the objects in the second screen area are moved back to the first screen area in response to this input.
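A minimal sketch of the corresponding bookkeeping is given below, assuming a hypothetical two-list model of the screen areas (the class and method names are illustrative only): the first input moves the selected objects into the second area, and the third input moves some or all of them back.

```kotlin
// Illustrative two-list model of the screen areas; names are assumptions.
class ScreenAreas(
    val firstArea: MutableList<String> = mutableListOf(),
    val secondArea: MutableList<String> = mutableListOf()
) {
    // First input: the selected target objects leave the first area and are
    // displayed in the second area.
    fun addToSecondArea(targets: List<String>) {
        firstArea.removeAll(targets)
        secondArea.addAll(targets)
    }

    // Third input: move at least some of the target objects back to the first area,
    // e.g. on a fold-and-unfold or a three-finger slide toward the first area.
    fun moveBackToFirstArea(subset: List<String> = secondArea.toList()) {
        secondArea.removeAll(subset)
        firstArea.addAll(subset)
    }
}
```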
Step 103: receiving a second input to the second screen area.
The second input may be, for example, a click input, a touch input, a slide input, or the like.
Step 104: in response to the second input, performing batch processing on the at least two target objects.
The batch processing includes at least one of the following: deleting, sharing to a target application, and moving to a target folder.
(1) The batch processing includes deletion.
Deleting here means deleting two or more objects at the same time. As shown in fig. 6, the side of the user's hand is placed on the second screen area and slides off the edge of the screen; in response to this input, the at least two target objects can be deleted. As shown in fig. 7, the user's knuckles slide outward on the second screen area, and in response to this input the at least two target objects can be deleted.
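Once the deletion gesture has been recognized, the batch deletion itself can be as simple as the following Kotlin sketch, under the assumption (made only for illustration) that the selected objects are files identified by their paths:

```kotlin
import java.io.File

// Illustrative sketch: delete every selected object in one pass and report
// anything that could not be removed.
fun batchDelete(targetPaths: List<String>): List<String> {
    val failed = mutableListOf<String>()
    for (path in targetPaths) {
        if (!File(path).delete()) {
            failed += path
        }
    }
    return failed
}
```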
(2) The batch processing includes sharing to a target application.
The target application may be any application. Specifically, at least one application icon may be displayed on the second screen area in response to the second input, and a fourth input to a target application icon among the at least one application icon may then be received. In response to the fourth input, the at least two target objects are shared to the application corresponding to the target application icon. The fourth input may be, for example, a click, a slide, or the like.
As shown in fig. 8, the user's thumb presses a blank part of the second screen area while the index finger slides, which triggers the display of at least one application; accordingly, the selectable applications are shown at the bottom of the second screen area. After the user taps one of the applications, the at least two target objects are shared to that target application at the same time in response to the tap. As shown in fig. 9, drawing a heart-shaped trajectory with two fingers triggers the display of at least one application, and the selectable applications are likewise shown at the bottom of the second screen area. After the user taps one of the applications, the at least two target objects are shared to that target application at the same time in response to the tap.
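On Android, one conventional way to hand several selected items to the application the user picked is a multi-item share intent, sketched below. This is a general illustration under the assumption that the objects are exposed as content URIs and that targetPackage identifies the tapped application; it is not asserted to be the implementation of the embodiments.

```kotlin
import android.content.Context
import android.content.Intent
import android.net.Uri

// Illustrative sketch: share the selected objects to the application whose icon was tapped.
// Call this from an Activity context so the share target can be started directly.
fun shareToTargetApp(context: Context, uris: List<Uri>, targetPackage: String) {
    val intent = Intent(Intent.ACTION_SEND_MULTIPLE).apply {
        type = "*/*"                                      // mixed content types
        putParcelableArrayListExtra(Intent.EXTRA_STREAM, ArrayList(uris))
        addFlags(Intent.FLAG_GRANT_READ_URI_PERMISSION)   // let the target app read the items
        setPackage(targetPackage)                         // route directly to the tapped app
    }
    context.startActivity(intent)
}
```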
(3) The batch processing includes moving to a target folder.
The target folder may refer to an existing folder or a new folder.
Specifically, in this step, at least one folder is displayed on the second screen area in response to the second input, and the at least two target objects are moved to a target folder among the at least one folder at the same time. Further, if the second input acts on an identifier for creating a new folder, the method may also include creating the new folder in response to the second input before the at least one folder is displayed on the second screen area.
As shown in fig. 10 and fig. 11, the user pinches the thumb and the knuckle of the index finger together (or the thumb and several knuckles) on one screen area. In response to this input, the at least two target objects are displayed stacked together, and several folders are displayed at the same time along with an identifier for creating a new folder (e.g., a "+" icon). When the knuckles drag the files into a folder, the files are moved there; if the files are dragged onto the plus-sign icon at the head of the folder list, a new folder is created and the objects are moved into it. As shown in fig. 12, the knuckle of a single finger rotates on the screen; in response to this input, the at least two target objects are displayed stacked together, and several folders are displayed along with the new-folder identifier (e.g., a "+" icon). When the knuckles drag the files into a folder, the files are moved there; if the files are dragged onto the plus-sign icon at the head of the folder list, a new folder is created and the objects are moved into it.
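A minimal Kotlin sketch of the move step follows, assuming (for illustration only) that the objects are files on local storage and that dropping on the "+" identifier simply means the destination folder has to be created first; the function and parameter names are not from the embodiments.

```kotlin
import java.io.File

// Illustrative sketch: move every selected file into the chosen folder; when the drop
// target is the "+" (new-folder) identifier, create the destination folder first.
fun moveToTargetFolder(targetPaths: List<String>, destination: File, createNew: Boolean = false) {
    if (createNew) destination.mkdirs()             // "+" icon: create the new folder
    for (path in targetPaths) {
        val source = File(path)
        val target = File(destination, source.name)
        if (!source.renameTo(target)) {             // fast path: rename within the same volume
            source.copyTo(target, overwrite = true) // otherwise copy, then remove the original
            source.delete()
        }
    }
}
```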
On the basis of the above embodiments, to keep user operations consistent and improve the user experience, the method may further include: receiving an input to the target folder, which may be any folder, and expanding and displaying the target folder on the second screen area in response to that input. An input to an object in the target folder may then be received, and the object is processed in response to that input. The input may be a touch input, a click input, or the like, and the processing may be deletion, sharing, editing, or the like.
In the embodiments of the application, when a first input to a first screen area is received, at least two target objects among the objects displayed in the first screen area are added to a second screen area in response to the first input, and the at least two target objects can then be processed in batch according to a second input. Therefore, with the scheme of the embodiments of the application, multiple objects can be selected at the same time according to an input on the first screen area and processed in batch according to an input on the second screen area, which simplifies the operation of processing multiple objects in batch and improves its efficiency.
On the basis of the above embodiments, the processing of the target objects may also be undone. Specifically, a fifth input to at least one of the first screen area and the second screen area is received, and in response to the fifth input, the processing of the at least two target objects is undone. The fifth input may be, for example, a click input, a touch input, a slide input, or the like.
As shown in fig. 13, a trajectory is formed by a finger circling on the second screen area, and the processing of the target objects is undone in response to this user input; if several circles are drawn, several processing steps can be undone. As shown in fig. 14, the screen is partially folded and opened again (the fold angle of the screen is between 0 and 180 degrees); in response to this input, the previous operation can be withdrawn, and by folding repeatedly, multiple processing steps can be withdrawn. In this way, the operation is more flexible and convenient, and operation efficiency is improved.
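One common way to support this kind of multi-step undo in code is an operation journal: each batch operation records how to reverse itself, and each circling gesture or partial fold pops one entry. The sketch below is only an assumed illustration, not the mechanism defined by the embodiments; a move-to-folder step, for instance, could record as its undo closure the action of moving the same files back to where they came from.

```kotlin
// Illustrative sketch: an undo journal for batch operations.
class BatchUndoJournal {
    private val undoActions = ArrayDeque<() -> Unit>()

    fun perform(action: () -> Unit, undo: () -> Unit) {
        action()                      // run the batch operation
        undoActions.addLast(undo)     // remember how to reverse it
    }

    fun undoLast(steps: Int = 1) {    // one gesture = one step; repeated gestures undo more
        repeat(steps) {
            undoActions.removeLastOrNull()?.invoke()
        }
    }
}
```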
On the basis of the above embodiments, before the method of the embodiments of the present application is executed, the electronic device may be triggered to enter an object management state, in which the objects can be further managed.
Specifically, the electronic device may be triggered to enter the object management state by an input to the first screen area and the second screen area. The input may be, for example, a click, a touch, a rotation operation, or the like.
As shown in fig. 15 and fig. 16, the screen is folded back and then unfolded, which triggers entry into the object management state, in which one screen area displays all the files and the other screen area may display no content. As shown in fig. 17, two fingers rotate counterclockwise on the two screen areas respectively, which triggers entry into the object management state, where again one screen area displays all the files and the other screen area may display no content.
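By way of illustration, the fold-then-unfold trigger could be driven by a hinge-angle signal as in the sketch below; the angle thresholds and the callback are assumptions made for this sketch, since the embodiments only state that folding back and then unfolding the screen triggers the object management state.

```kotlin
// Illustrative sketch: enter the object management state after the screen has been
// folded back and then reopened. The thresholds are arbitrary illustrative values.
class ManagementStateTrigger(private val onEnterManagementState: () -> Unit) {
    private var sawFoldedPosture = false

    // Called with the current fold angle in degrees, e.g. from a hinge-angle sensor.
    fun onFoldAngleChanged(angleDegrees: Float) {
        if (angleDegrees < 60f) {
            sawFoldedPosture = true                 // screen folded back far enough
        } else if (sawFoldedPosture && angleDegrees > 150f) {
            sawFoldedPosture = false
            onEnterManagementState()                // folded and then reopened
        }
    }
}
```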
Of course, in the embodiments of the present application, a single target object may also be selected. As shown in fig. 18 and fig. 19, one of the objects is tapped on one screen area, and that object moves to the other screen area in response to the input.
It can be seen from the description of the above embodiments that displaying objects across two screen areas is more intuitive, and when there are many objects, it avoids the inconvenience of having to search through objects displayed in a single screen area. When the selection of a certain object needs to be cancelled, this can be done through an input in the other screen area, which reduces the cost of searching.
It should be noted that, in the object processing method provided in the embodiments of the present application, the execution subject may be an object processing apparatus, or a control module in the object processing apparatus for executing the object processing method. In the embodiments of the present application, an object processing apparatus executing the object processing method is taken as an example to describe the object processing apparatus provided herein.
Referring to fig. 20, fig. 20 is a schematic view of an object processing apparatus according to an embodiment of the present application. As shown in fig. 20, the apparatus may include:
a first receiving module 2001, configured to receive a first input to the first screen area; a first processing module 2002, configured to add at least two target objects, from among the objects displayed in the first screen area, to a second screen area for display in response to the first input; a second receiving module 2003, configured to receive a second input to the second screen area; and a second processing module 2004, configured to perform batch processing on the at least two target objects in response to the second input.
Wherein the first processing module 2002 comprises:
the first obtaining submodule is used for obtaining an input track of the first input on the first screen area; the first selection sub-module is used for taking an object of which the display position is located in an area formed by the input track on the first screen area as a target object, or taking an object corresponding to a connecting line between an input starting point and an input end point of the input track as a target object; the number of the target objects is at least two; and the first display sub-module is used for displaying the at least two target objects in the second screen area.
Optionally, the apparatus further comprises:
a second receiving module for receiving a third input to at least one of the first screen region and the second screen region and moving at least a part of the at least two target objects of the second screen region to the first screen region in response to the third input.
Wherein the batch processing comprises sharing to a target application; the second processing module 2004 includes:
a first display sub-module for displaying at least one application icon on the second screen area in response to the second input; the first receiving submodule is used for receiving a fourth input of a target application icon in the at least one application icon; and the first processing submodule is used for responding to the fourth input and sharing the at least two target objects to the application corresponding to the target application icon.
Wherein the batch processing comprises moving to a target folder; the second processing module 2004 includes:
a first display sub-module for displaying at least one folder on the second screen area in response to the second input; a first processing submodule, configured to move the at least two target objects to a target folder in the at least one folder at the same time.
Optionally, the apparatus further comprises:
a third receiving module for receiving a fifth input to at least one of the first screen region and the second screen region;
a third processing module for undoing processing of the at least two target objects in response to the fifth input.
In the embodiments of the application, when a first input to a first screen area is received, at least two target objects among the objects displayed in the first screen area are added to a second screen area in response to the first input, and the at least two target objects can then be processed in batch according to a second input. Therefore, with the scheme of the embodiments of the application, multiple objects can be selected at the same time according to an input on the first screen area and processed in batch according to an input on the second screen area, which simplifies the operation of processing multiple objects in batch and improves its efficiency.
The object processing apparatus in the embodiments of the present application may be a device, or may be a component, an integrated circuit, or a chip in a terminal. The device may be a mobile electronic device or a non-mobile electronic device. By way of example, the mobile electronic device may be a mobile phone, a tablet computer, a notebook computer, a palmtop computer, a vehicle-mounted electronic device, a wearable device, an ultra-mobile personal computer (UMPC), a netbook, or a personal digital assistant (PDA), and the non-mobile electronic device may be a server, a network attached storage (NAS), a personal computer (PC), a television (TV), a teller machine, or a self-service machine, which is not specifically limited in the embodiments of the present application.
The object processing apparatus in the embodiments of the present application may be a device having an operating system. The operating system may be an Android operating system, an iOS operating system, or another possible operating system, which is not specifically limited in the embodiments of the present application.
The object processing apparatus provided in the embodiment of the present application can implement each process implemented by the method embodiments in fig. 1 to fig. 19, and is not described herein again to avoid repetition.
Optionally, as shown in fig. 21, an embodiment of the present application further provides an electronic device 2100, which includes a processor 2101, a memory 2102, and a program or instruction stored in the memory 2102 and executable on the processor 2101. When executed by the processor 2101, the program or instruction implements each process of the above object processing method embodiments and can achieve the same technical effects; to avoid repetition, details are not repeated here.
It should be noted that the electronic device in the embodiment of the present application includes the mobile electronic device and the non-mobile electronic device described above.
Fig. 22 is a schematic diagram of the hardware structure of an electronic device implementing an embodiment of the present application. The electronic device 2200 includes, but is not limited to: a radio frequency unit 2201, a network module 2202, an audio output unit 2203, an input unit 2204, a sensor 2205, a display unit 2206, a user input unit 2207, an interface unit 2208, a memory 2209, a processor 2210, and the like.
Those skilled in the art will appreciate that the electronic device 2200 may further include a power source (e.g., a battery) for powering the various components, which may be logically coupled to the processor 2210 through a power management system so that charging, discharging, and power-consumption management functions are handled by the power management system. The electronic device structure shown in fig. 22 does not constitute a limitation of the electronic device; the electronic device may include more or fewer components than those shown, combine some of the components, or arrange the components differently, which is not described in detail here.
The user input unit 2207 is configured to receive a first input to the first screen area;
the processor 2210 is configured to add at least two target objects, from among the objects displayed in the first screen area, to a second screen area for display in response to the first input;
the user input unit 2207 is further configured to receive a second input to the second screen area;
and the processor 2210 is further configured to perform batch processing on the at least two target objects in response to the second input.
In the embodiments of the application, when a first input to a first screen area is received, at least two target objects among the objects displayed in the first screen area are added to a second screen area in response to the first input, and the at least two target objects can then be processed in batch according to a second input. Therefore, with the scheme of the embodiments of the application, multiple objects can be selected at the same time according to an input on the first screen area and processed in batch according to an input on the second screen area, which simplifies the operation of processing multiple objects in batch and improves its efficiency.
Optionally, the processor 2210 is further configured to obtain an input track of the first input on the first screen region; taking an object with a display position in an area formed by the input track on the first screen area as a target object, or taking an object corresponding to a connecting line between an input starting point and an input end point of the input track as a target object; wherein the number of the target objects is at least two.
A display unit 2206 for displaying the at least two target objects in the second screen area.
Optionally, the batch processing includes sharing to the target application; a display unit 2206 for displaying at least one application icon on the second screen region in response to the second input;
a user input unit 2207 for receiving a fourth input to a target application icon among the at least one application icon;
processor 2210, further configured to, in response to the fourth input, share the at least two target objects to an application corresponding to the target application icon.
Optionally, the batch processing includes moving to a target folder; a display unit 2206 for displaying at least one folder on the second screen region in response to the second input; processor 2210, further for simultaneously moving the at least two target objects to a target folder of the at least one folder.
Optionally, a user input unit 2207, configured to receive a fifth input to at least one of the first screen region and the second screen region; processor 2210, further for undoing processing of the at least two target objects in response to the fifth input.
It should be understood that, in the embodiments of the present application, the input unit 2204 may include a graphics processing unit (GPU) 22041 and a microphone 22042, where the graphics processing unit 22041 processes image data of still pictures or video obtained by an image capture device (such as a camera) in a video capture mode or an image capture mode. The display unit 2206 may include a display panel 22061, which may be configured in the form of a liquid crystal display, an organic light-emitting diode, or the like. The user input unit 2207 includes a touch panel 22071, also referred to as a touch screen, and other input devices 22072. The touch panel 22071 may include two parts: a touch detection device and a touch controller. The other input devices 22072 may include, but are not limited to, a physical keyboard, function keys (such as volume control keys and switch keys), a trackball, a mouse, and a joystick, which are not described in detail here. The memory 2209 may be used to store software programs as well as various data, including but not limited to application programs and an operating system. The processor 2210 may integrate an application processor, which mainly handles the operating system, user interfaces, application programs, and the like, and a modem processor, which mainly handles wireless communication. It can be appreciated that the modem processor may also not be integrated into the processor 2210.
The embodiments of the present application further provide a readable storage medium, where a program or an instruction is stored on the readable storage medium, and when the program or the instruction is executed by a processor, the program or the instruction implements each process of the above object processing method embodiments, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here.
The processor is the processor in the electronic device described in the above embodiments. The readable storage medium includes a computer-readable storage medium, such as a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
The embodiment of the present application further provides a chip, where the chip includes a processor and a communication interface, the communication interface is coupled to the processor, and the processor is configured to execute a program or an instruction to implement each process of the embodiment of the object processing method, and can achieve the same technical effect, and the details are not repeated here to avoid repetition.
It should be understood that the chips mentioned in the embodiments of the present application may also be referred to as system-on-chip, system-on-chip or system-on-chip, etc.
It should be noted that, in this document, the terms "comprises", "comprising", or any other variation thereof are intended to cover a non-exclusive inclusion, so that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element. Further, it should be noted that the scope of the methods and apparatuses of the embodiments of the present application is not limited to performing the functions in the order illustrated or discussed; the functions may also be performed in a substantially simultaneous manner or in a reverse order depending on the functions involved. For example, the described methods may be performed in an order different from that described, and various steps may be added, omitted, or combined. In addition, features described with reference to certain examples may be combined in other examples.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present application may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present application.
While the present embodiments have been described with reference to the accompanying drawings, it is to be understood that the invention is not limited to the precise embodiments described above, which are meant to be illustrative and not restrictive, and that various changes may be made therein by those skilled in the art without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (14)

1. An object processing method, comprising:
receiving a first input to a first screen region;
in response to the first input, adding at least two target objects, from among the objects displayed in the first screen area, to a second screen area for display;
receiving a second input to the second screen region;
in response to the second input, batch processing is performed on the at least two target objects.
2. The method of claim 1, wherein adding at least two target objects, from among the objects displayed in the first screen area, to a second screen area for display in response to the first input comprises:
acquiring an input track of the first input on the first screen area;
taking an object with a display position in an area formed by the input track on the first screen area as a target object, or taking an object corresponding to a connecting line between an input starting point and an input end point of the input track as a target object; the number of the target objects is at least two;
displaying the at least two target objects in the second screen area.
3. The method of claim 1, wherein after adding at least two target objects, from among the objects displayed in the first screen region, to a second screen region for display, the method further comprises:
receiving a third input to at least one of the first screen region and the second screen region, and moving at least some of the at least two target objects of the second screen region to the first screen region in response to the third input.
4. The method of claim 1, wherein the batch processing comprises sharing to a target application, and the batch processing of the at least two target objects in response to the second input comprises:
displaying at least one application icon on the second screen area in response to the second input;
receiving a fourth input of a target application icon of the at least one application icon;
responding to the fourth input, and sharing the at least two target objects to the application corresponding to the target application icon.
5. The method of claim 1, wherein the batch processing comprises moving to a target folder, and the batch processing of the at least two target objects in response to the second input comprises:
displaying at least one folder on the second screen area in response to the second input;
and simultaneously moving the at least two target objects to a target folder in the at least one folder.
6. The method of claim 1, further comprising:
receiving a fifth input to at least one of the first screen region and the second screen region;
in response to the fifth input, undoing processing of the at least two target objects.
7. An apparatus for object processing, the apparatus comprising:
a first receiving module, configured to receive a first input to a first screen area;
a first processing module, configured to add at least two target objects, from among the objects displayed in the first screen area, to a second screen area for display in response to the first input;
a second receiving module, configured to receive a second input to the second screen area;
and a second processing module, configured to perform batch processing on the at least two target objects in response to the second input.
8. The apparatus of claim 7, wherein the first processing module comprises:
the first obtaining submodule is used for obtaining an input track of the first input on the first screen area;
the first selection sub-module is used for taking an object of which the display position is located in an area formed by the input track on the first screen area as a target object, or taking an object corresponding to a connecting line between an input starting point and an input end point of the input track as a target object; the number of the target objects is at least two;
and the first display sub-module is used for displaying the at least two target objects in the second screen area.
9. The apparatus of claim 7, further comprising:
a second receiving module for receiving a third input to at least one of the first screen region and the second screen region and moving at least a part of the at least two target objects of the second screen region to the first screen region in response to the third input.
10. The apparatus of claim 7, wherein the batch processing comprises sharing to a target application; the second processing module comprises:
a first display sub-module for displaying at least one application icon on the second screen area in response to the second input;
the first receiving submodule is used for receiving a fourth input of a target application icon in the at least one application icon;
and the first processing submodule is used for responding to the fourth input and sharing the at least two target objects to the application corresponding to the target application icon.
11. The apparatus of claim 7, wherein the batch processing comprises moving to a target folder; the second processing module comprises:
a first display sub-module for displaying at least one folder on the second screen area in response to the second input;
a first processing submodule, configured to move the at least two target objects to a target folder in the at least one folder at the same time.
12. The apparatus of claim 7, further comprising:
a third receiving module for receiving a fifth input to at least one of the first screen region and the second screen region;
a third processing module for undoing processing of the at least two target objects in response to the fifth input.
13. An electronic device comprising a processor, a memory and a program or instructions stored on the memory and executable on the processor, the program or instructions, when executed by the processor, implementing the steps of the object processing method according to any one of claims 1 to 6.
14. A readable storage medium, characterized in that it stores thereon a program or instructions which, when executed by a processor, implement the steps of the object processing method according to any one of claims 1 to 6.
CN202011527128.5A 2020-12-22 2020-12-22 Object processing method, device, equipment and readable storage medium Active CN112558851B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011527128.5A CN112558851B (en) 2020-12-22 2020-12-22 Object processing method, device, equipment and readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011527128.5A CN112558851B (en) 2020-12-22 2020-12-22 Object processing method, device, equipment and readable storage medium

Publications (2)

Publication Number Publication Date
CN112558851A true CN112558851A (en) 2021-03-26
CN112558851B CN112558851B (en) 2023-05-23

Family

ID=75031272

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011527128.5A Active CN112558851B (en) 2020-12-22 2020-12-22 Object processing method, device, equipment and readable storage medium

Country Status (1)

Country Link
CN (1) CN112558851B (en)

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018072463A1 (en) * 2016-10-17 2018-04-26 深圳市中兴微电子技术有限公司 Information sharing method, apparatus, and storage medium
CN109426422A (en) * 2017-08-30 2019-03-05 天津三星通信技术研究有限公司 The method and device of selecting object
CN107623763A (en) * 2017-10-19 2018-01-23 广东欧珀移动通信有限公司 The method and apparatus for editing image
CN109522278A (en) * 2018-10-18 2019-03-26 维沃移动通信有限公司 A kind of file memory method and terminal device
CN109857292A (en) * 2018-12-27 2019-06-07 维沃移动通信有限公司 A kind of object displaying method and terminal device
CN109917995A (en) * 2019-01-25 2019-06-21 维沃移动通信有限公司 A kind of object processing method and terminal device
CN111061574A (en) * 2019-11-27 2020-04-24 维沃移动通信有限公司 Object sharing method and electronic equipment

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
本站: "Two operation methods for multi-selecting photos on the iPhone 6s" (iPhone6s让照片多选的两种操作方法), 《河东软件园》 *
殷继彬: "Design and Research of 'Contact + Non-contact' Interactive Interfaces" (《"接触+非接触"式交互界面的设计与研究》), 31 July 2018 *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113326233A (en) * 2021-06-08 2021-08-31 维沃移动通信有限公司 Method and device for arranging folders
CN114518819A (en) * 2022-02-18 2022-05-20 维沃移动通信有限公司 Icon management method and device and electronic equipment

Also Published As

Publication number Publication date
CN112558851B (en) 2023-05-23

Similar Documents

Publication Publication Date Title
CN113126838A (en) Application icon sorting method and device and electronic equipment
CN112433693B (en) Split screen display method and device and electronic equipment
CN113703624A (en) Screen splitting method and device and electronic equipment
CN112558851B (en) Object processing method, device, equipment and readable storage medium
CN113918260A (en) Application program display method and device and electronic equipment
CN112783408A (en) Gesture navigation method and device of electronic equipment, equipment and readable storage medium
CN114518820A (en) Icon sorting method and device and electronic equipment
CN112269506A (en) Screen splitting method and device and electronic equipment
CN114415886A (en) Application icon management method and electronic equipment
CN114327726A (en) Display control method, display control device, electronic equipment and storage medium
CN112399010B (en) Page display method and device and electronic equipment
CN112698762B (en) Icon display method and device and electronic equipment
CN113485625A (en) Electronic equipment response method and device and electronic equipment
CN114090110A (en) Application program starting method and device and electronic equipment
CN111796733B (en) Image display method, image display device and electronic equipment
CN113342222B (en) Application classification method and device and electronic equipment
CN111796736B (en) Application sharing method and device and electronic equipment
CN115291778A (en) Display control method and device, electronic equipment and readable storage medium
CN114116087A (en) Interface operation method and device between two systems, electronic equipment and medium
CN114995713A (en) Display control method and device, electronic equipment and readable storage medium
CN113835578A (en) Display method and device and electronic equipment
CN113885981A (en) Desktop editing method and device and electronic equipment
CN112765500A (en) Information searching method and device
CN113778279A (en) Screenshot method and device and electronic equipment
CN113672136A (en) Information display method, device, equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant