CN111338539A - Target object selection method and device - Google Patents

Target object selection method and device

Info

Publication number
CN111338539A
Authority
CN
China
Prior art keywords
target object
sliding operation
area range
detected
selection
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010124256.9A
Other languages
Chinese (zh)
Other versions
CN111338539B (en)
Inventor
朱中波
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Jjworld Beijing Network Technology Co ltd
Original Assignee
Jjworld Beijing Network Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Jjworld Beijing Network Technology Co ltd filed Critical Jjworld Beijing Network Technology Co ltd
Priority to CN202010124256.9A priority Critical patent/CN111338539B/en
Publication of CN111338539A publication Critical patent/CN111338539A/en
Application granted granted Critical
Publication of CN111338539B publication Critical patent/CN111338539B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482Interaction with lists of selectable items, e.g. menus
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text

Abstract

Embodiments of the present application disclose a method and an apparatus for selecting a target object. By acquiring the area range of a target area on a screen, detecting sliding operations within that area range, and performing the corresponding marking or selection of target objects, a target object can be selected entirely within the area range. The user does not need to perform selection operations across the whole screen; operating within the area range is sufficient, which makes selecting a target object convenient.

Description

Target object selection method and device
Technical Field
The application relates to the technical field of internet, in particular to a target object selection method and device.
Background
With the widespread use of mobile devices, more and more mobile devices adopt touch screens, and users select displayed objects by touching the screen. To accommodate multiple functions within the constraints of the hardware, the screens of mobile devices have become large. For the sake of the display effect and user experience, the objects to be selected are generally laid out in a relatively dispersed manner across the display, and the user selects a target object by tapping the corresponding position on the screen. Because the screen is large, the user often needs both hands to make the selection, which makes selecting a target object inconvenient.
Disclosure of Invention
In view of this, embodiments of the present application provide a method and an apparatus for selecting a target object, which change the way a target object is selected so as to solve the problem that selection operations in the prior art are inconvenient for the user.
The technical solutions provided by the embodiments of the present application are as follows:
in a first aspect, an embodiment of the present application provides a method for selecting a target object, where the method includes:
acquiring the area range of a target area on a screen;
when the sliding operation along the preset path is detected in the area range, sequentially moving the selection mark from the current position to each target object;
when the sliding operation along the first direction is detected in the area range, marking a starting label on a target object where the selection mark is located;
when the sliding operation along the second direction is detected in the area range, if a target object marking a start label exists, marking an end label on the target object where the selection mark is currently located, and acquiring the target object marking the start label, the target object marking the end label and a target object between the target object marking the start label and the target object marking the end label as selected target objects;
and when the sliding operation along the preset path is detected to be finished in the area range, and the sliding operation along the first direction and the sliding operation along the second direction are not detected in the area range, acquiring the target object where the current selection mark is positioned as the selected target object.
Optionally, when the sliding operation along the preset path is detected within the area range, sequentially moving the selection marker from the current position to each target object includes:
when the sliding operation along a first preset direction of a preset path is detected in the area range, sequentially moving the selection mark from the current position to the left to each target object;
and when the sliding operation along the second preset direction of the preset path is detected in the area range, sequentially moving the selection mark from the current position to the right to each target object.
Optionally, the method further includes:
when a sliding operation in a second direction is detected within the area range, if there is no target object marking a start tag and there is the selected target object, the selection of the selected target object is cancelled.
Optionally, the method further includes:
when a first predetermined operation is detected within the range of the area, a first processing logic is performed on the selected target object.
Optionally, the method further includes:
and when a second preset operation is detected in the area range, executing second processing logic corresponding to the second preset operation.
Optionally, the area range of the target area on the screen is pre-configured according to the display parameters of the screen.
In a second aspect, an embodiment of the present application provides an apparatus for selecting a target object, where the apparatus includes:
the acquisition unit is used for acquiring the area range of the target area on the screen;
a moving unit configured to sequentially move the selection marker from the current position to each target object when a sliding operation along a preset path is detected within the area range;
a marking unit configured to mark a start tag on a target object on which the selection marker is located when a sliding operation in a first direction is detected within the area range;
a first selection unit, configured to, when a sliding operation in a second direction is detected within the area range, if there is a target object marking a start tag, mark an end tag on the target object where the selection marker is currently located, and acquire the target object marking the start tag, the target object marking the end tag, and a target object between the target object marking the start tag and the target object marking the end tag as selected target objects;
and the second selection unit is used for acquiring the target object where the current selection mark is located as the selected target object when the sliding operation along the preset path is detected to be finished in the area range and the sliding operation along the first direction and the sliding operation along the second direction are not detected in the area range.
Optionally, the mobile unit includes:
the first moving subunit is used for sequentially moving the selection mark from the current position to the left to each target object when the sliding operation along the first preset direction of the preset path is detected in the area range;
and the second moving subunit is used for sequentially moving the selection mark from the current position to the right to each target object when the sliding operation along the second preset direction of the preset path is detected in the area range.
Optionally, the apparatus further comprises:
a canceling unit configured to cancel, when a sliding operation in a second direction is detected within the area range, selection of the selected target object if there is no target object marking a start tag and the selected target object exists.
Optionally, the apparatus further comprises:
and the first processing unit is used for executing first processing logic on the selected target object when a first preset operation is detected in the area range.
Optionally, the apparatus further comprises:
and the second processing unit is used for executing second processing logic corresponding to a second preset operation when the second preset operation is detected in the area range.
Optionally, the area range of the target area on the screen is pre-configured according to the display parameters of the screen.
Therefore, the embodiment of the application has the following beneficial effects:
the method includes the steps that the area range of a target area on a screen is obtained; when the sliding operation along the preset path is detected in the area range, sequentially moving the selection mark from the current position to each target object; when the sliding operation along the first direction is detected in the area range, marking a start label on a target object where the selection mark is positioned; when the sliding operation along the second direction is detected in the area range, if a target object marking a start label exists, marking an end label on the target object where the current selection label is located, and determining the selected target object; and when the sliding operation along the preset path is detected to be finished in the area range, and the sliding operation along the first direction and the sliding operation along the second direction are not detected in the area range, acquiring the target object where the current selection mark is positioned as the selected target object. Therefore, the embodiment of the application can complete the selection of the target object within the area range by detecting the sliding operation within the area range and executing the corresponding marking or selection of the target object, and the user can operate within the area range without performing the selection operation in the whole screen, thereby conveniently selecting the target object.
Drawings
Fig. 1 is a flowchart of a method for selecting a target object according to an embodiment of the present disclosure;
fig. 2 is a flowchart of another target object selection method provided in an embodiment of the present application;
fig. 3 is a schematic diagram of an area range of a target area on a screen according to an embodiment of the present application;
fig. 4 is a flowchart of another target object selection method provided in an embodiment of the present application;
fig. 5 is a schematic diagram of an area range of a target area on a screen according to an embodiment of the present application;
fig. 6 is a schematic structural diagram of a target object selection apparatus according to an embodiment of the present disclosure.
Detailed Description
In order to make the aforementioned objects, features and advantages of the present application more comprehensible, embodiments accompanying the drawings are described in detail below.
In order to facilitate understanding of the technical solutions provided in the present application, the following description will first be made on the background of the present application.
The inventor has found that, when displaying objects to be selected to a user, existing mobile devices generally distribute the objects in a scattered manner across the displayed interface for the sake of an attractive layout. To make a selection, the user either taps the target object directly or taps a fixed key on the screen to adjust which object is currently selected. However, existing mobile terminals are limited by their hardware and need to support multiple everyday functions, so the display screen is large relative to the range a finger can cover. A typical mobile terminal therefore has to be held with both hands while a thumb taps the screen to complete the selection, or held with one hand while the other hand performs the selection. Because the range of the required operation exceeds the reach of the fingers of one hand, it is difficult for the user to hold the device and make the selection with the same hand, and the operation is inconvenient.
Further research by the inventor showed that shrinking the display interface or adjusting the display area degrades the existing display and reduces the user experience. It is therefore necessary to improve the way a target object is selected in the prior art so that the user does not have to perform selection operations across a large display screen. The area range of a target area can be acquired, the user's operations can be detected within that area range, and when the user performs a preset operation within the area range, the target object selection process is adjusted accordingly. In this way the user can select a target object by operating only within the area range, which makes selection convenient.
In order to facilitate understanding of the technical solutions provided in the embodiments of the present application, the target object selection method provided in the embodiments of the present application is described below with reference to the accompanying drawings.
Referring to fig. 1, which is a flowchart of a method for selecting a target object according to an embodiment of the present application, the method may include:
s101: and acquiring the area range of the target area on the screen.
It should be noted that the target area may be an area operated by a user, and the area range of the target area on the screen may refer to a position and a size of an area occupied by the target area on the screen. The area range of the target area on the screen can be set according to the size of the screen and the range of one-hand operation of the user, and the target area can be the range of the thumb which can rotate on the screen when the user holds the mobile device with one hand. The area range of the target area on the screen may be pre-configured according to the display parameters of the screen, and the specific manner of acquiring the area range of the target area on the screen is not limited in the embodiment of the present application.
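To make S101 concrete, the following Kotlin sketch (not taken from the patent; the rectangle model, the names, and the thumb-reach placement are illustrative assumptions) represents the area range as a rectangle and checks whether a touch point falls inside it.

```kotlin
// Hypothetical model of "the area range of the target area on the screen".
// The bounds and the lower-right heuristic are assumptions, not values
// specified by the patent.
data class TargetArea(
    val left: Float,
    val top: Float,
    val right: Float,
    val bottom: Float
) {
    // True if a touch point lies inside the target area, i.e. the operation
    // should be handled by the selection logic described below.
    fun contains(x: Float, y: Float): Boolean = x in left..right && y in top..bottom
}

// One way the range could be pre-configured from display parameters: a region
// near the lower-right corner roughly matching one-handed thumb reach.
fun defaultTargetArea(screenWidth: Float, screenHeight: Float): TargetArea =
    TargetArea(
        left = screenWidth * 0.55f,
        top = screenHeight * 0.70f,
        right = screenWidth,
        bottom = screenHeight
    )
```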
S102: when a sliding operation along a preset path is detected within the area range, the selection marker is sequentially moved from the current position to each target object.
In the embodiment of the application, whether the trigger operation of the user exists or not and the specific type of the trigger operation can be detected in real time within the area range. When a sliding operation along a preset path is detected, the selection markers may be sequentially moved from the current position onto the respective target objects. For example, the selection marker is currently on the first target object, the selection marker may be moved from the first target object to the second target object, then to the third target object, and so on, along with the sliding operation, until the sliding operation is finished.
It is understood that the sliding operation may be performed by a finger touching and sliding on the screen.
The embodiment of the present application does not limit the trigger condition for the selection marker to appear. In one possible implementation, a selection marker may already be placed on one target object before the sliding operation along the preset path is performed. The current position of the selection marker may be the first target object or the last target object, or any other target object, for example a target object automatically recommended for the user to select. The current position of the selection marker may also be the target object onto which the selection marker was last moved.
It should be noted that, in the embodiment of the present application, a detection method for detecting a sliding operation of a preset path in an area range is not limited, and in a possible implementation manner, a path of the sliding operation may be obtained by detecting a contact point of a display screen.
The preset path may be a track set by a user or a developer during a sliding operation, the sliding operation is performed along the preset path within an area range, and the selection marker may be triggered to move to each target object from the current position in sequence. The specific form of the preset path is not limited in the embodiment of the present application, and for example, the preset path may be an arc-shaped path formed by rotating a finger with a root of the finger as a center, or may be a path formed by sliding the finger from a certain specific position to another specific position.
It should be noted that the form of the selection marker is not limited: it may be any marker displayed to the user, for example a thickened boundary of the target object, a changed display color of the target object, or an added arrow or other symbol.
It is to be understood that the number of target objects is not limited in the embodiment of the present application, and one or more target objects may be used. If there is one target object, when the sliding operation along the preset path is detected within the area range, the selection mark may be repeatedly moved to the current target object. If there are a plurality of target objects, when the sliding operation along the preset path is detected within the area range, the selection mark may be moved from the current position to another target object.
In the embodiment of the present application, a moving method for sequentially moving the selection marker from the current position to each target object is not limited. In one possible implementation manner, when a sliding operation in a first preset direction along a preset path is detected within an area range, the selection marks are sequentially moved to the left from the current position to each target object; when a sliding operation in a second preset direction along the preset path is detected within the area range, the selection mark is sequentially moved from the current position to the right onto each target object. It is understood that the first predetermined direction and the second predetermined direction are not the same, and may be different directions on the same predetermined path. For example, the first preset direction may be a sliding operation from right to left along a preset path, and the second preset direction may be a sliding operation from left to right along the preset path; for another example, the first preset direction may be counterclockwise moving along the preset path, and the second preset direction may be clockwise moving along the preset path. It is understood that the speed of the movement of the selection mark may be determined by the speed of the sliding operation, and when the speed of the sliding operation is fast, the selection mark may be moved fast; when the speed of the sliding operation is slow, the selection mark may be moved slowly.
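As a minimal sketch of this movement rule (an assumption of this note, not code from the patent): if the target objects are kept in an ordered list and the selection marker is an index into that list, the two preset directions simply step the index left or right.

```kotlin
// Directions mirror the patent's "first preset direction" / "second preset
// direction"; wrapping around at the ends of the list is an assumption.
enum class SlideDirection { FIRST_PRESET, SECOND_PRESET }

fun moveSelectionMarker(currentIndex: Int, direction: SlideDirection, targetCount: Int): Int {
    if (targetCount <= 1) return currentIndex            // single target: marker stays put
    return when (direction) {
        SlideDirection.FIRST_PRESET -> (currentIndex - 1 + targetCount) % targetCount // move left
        SlideDirection.SECOND_PRESET -> (currentIndex + 1) % targetCount              // move right
    }
}
```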
S103: when a sliding operation in a first direction is detected within the area range, a start tag is marked on the target object where the selection marker is located.
In the process of detecting the user's trigger operations in real time within the area range, if a sliding operation along the first direction is detected, this indicates that the user may intend to select a plurality of consecutive target objects, and a start tag is marked on the target object where the selection marker is located; the start tag indicates that this target object is the first of the plurality of consecutive target objects the user is going to select.
It is to be understood that the sliding operation in the first direction may be a sliding operation in the first direction starting from any position within the area, and the first direction in the embodiment of the present application may be an upward or downward direction, a leftward or rightward direction, or a direction perpendicular to the preset path direction.
The start tag may be the same as or different from the selection marker, and the start tag may be used to highlight the target object where the selection marker is located, for example, to enlarge a boundary of the target object where the selection marker is located, or to move the target object where the selection marker is located out of an area where another target object is located.
In addition, the order of executing S103 and S102 is not limited in the embodiment of the present application. In a possible implementation manner, S102 may be executed first to move the selection marker, and then S103 may be executed to mark the start tag on the target object where the selection marker is located. The sliding operation along the preset path when S102 is executed and the sliding operation along the first direction when S103 is executed may be continuously performed, for example, the sliding operation along the preset path may be executed first to trigger the movement of the selection mark, and the sliding operation may continue to be performed along the preset path without stopping the current sliding operation when the selection mark is moved; when the selection marker is moved to the target object to be selected, the sliding operation in the first direction triggers the marking of the start tag. The sliding operation along the preset path when S102 is executed and the sliding operation along the first direction when S103 is executed may be respectively performed, for example, the sliding operation along the preset path may be executed first, the movement of the selection mark is triggered, the sliding operation along the preset path is stopped when the selection mark starts to move, and the sliding operation may be stopped at any position on the preset path, or the finger leaves the screen; when the selection mark moves to the target object to be selected, the sliding operation along the first direction is executed in the area range, and the mark of the start label is triggered. In another possible implementation manner, S103 may be directly performed without performing S102, and the target object where the current position of the selection marker is located is marked with the start tag.
S104: when a sliding operation in the second direction is detected within the area range, if a target object marking the start tag exists, marking the end tag on the target object where the current selection tag is located, and acquiring the target object marking the start tag, the target object marking the end tag, and a target object between the target object marking the start tag and the target object marking the end tag as selected target objects.
It is understood that the second direction is not the same as the first direction, and the second direction may be the opposite direction to the first direction or may be independent of the first direction.
In one possible implementation manner, when the sliding operation in the second direction is detected within the area range, if there is a target object marking the start tag, the end tag may be marked on the target object where the selection marker is currently located. The end tag may be the same as or different from the start tag.
It should be noted that, when there is a target object marking a start tag, the execution order and the number of executions of S102, S103, and S104 are not limited in this embodiment. In one possible implementation, after a target object has been marked with the start tag by executing S102 and S103, S102 may be executed again to move the selection marker, and then S104 may be executed to mark the end tag on the target object where the moved selection marker is located; S103 and S104 may also be performed during the movement of the selection marker in S102. In another possible implementation, S104 may be executed directly after S103, in which case the start tag and the end tag may be marked on the same target object.
It can be understood that, if a target object marked with a start tag exists, then after the end tag has been marked on the target object where the selection marker is currently located, the target object marked with the start tag, the target object marked with the end tag, and the target objects between them may be obtained as the selected target objects. The target object marked with the start tag is the beginning of the selection, the target object marked with the end tag is the end of the selection, and every target object between them is also selected. In other words, the selected target objects may include three kinds: the target object marking the start tag, the target object marking the end tag, and the target objects between them; in one possible implementation, these may all be the same single target object. The way the selected target objects are displayed is not limited; for example, they may be highlighted.
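A short sketch of how S103 and S104 combine (indices into an ordered target list are an assumption of this sketch): once a start tag and an end tag exist, the selected targets are the two tagged objects plus every target between them.

```kotlin
// Returns the indices of all selected targets given the positions of the
// target marked with the start tag and the target marked with the end tag.
// When both tags sit on the same target, the range contains a single index.
fun selectedIndices(startTagIndex: Int, endTagIndex: Int): IntRange =
    minOf(startTagIndex, endTagIndex)..maxOf(startTagIndex, endTagIndex)
```

For example, a start tag on the 2nd target and an end tag on the 5th target would yield targets 2 through 5 as the selected target objects.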
It should be noted that, when the sliding operation in the second direction is detected within the area range, if there is no target object marking the start tag and there is a selected target object, the selection of the selected target object may be cancelled. That is, after the selected target object is determined, the sliding operation in the second direction may be performed again to cancel the selection of the selected target object this time.
S105: and when the sliding operation along the preset path is detected to be finished in the area range, and the sliding operation along the first direction and the sliding operation along the second direction are not detected in the area range, acquiring the target object where the current selection mark is positioned as the selected target object.
When the sliding operation along the preset path is detected to have ended within the area range, and no sliding operation along the first direction or the second direction was detected while the sliding operation along the preset path was in progress, this indicates that the user only needs to select a single target object; in that case, when the sliding operation along the preset path ends, the target object where the selection marker is currently located is obtained as the selected target object.
The embodiment of the present application does not limit a method for determining the end of the sliding operation along the preset path. In a possible implementation manner, the sliding operation along the preset path and the sliding operation along the first direction and the sliding operation along the second direction are required to be operated continuously, that is, there is no intermediate pause between the sliding operation along the preset path and the sliding operation along the first direction and the sliding operation along the second direction, and the finger is always in contact with the screen. At this time, if the sliding operation along the preset path is completed, the presence of another operation is not detected, and it may be considered that the sliding operation along the preset path is completed. In another possible implementation manner, there may be a pause between the sliding operation along the preset path and the sliding operation along the first direction and the sliding operation along the second direction, for example, after the sliding operation along the preset path is performed, the finger may be disengaged from the screen until the sliding operation along the first direction or the second direction is performed when a next operation is required. In this case, whether the sliding operation along the preset path is finished or not can be determined by determining whether the duration of the finger being separated from the screen after the sliding operation along the preset path is greater than or equal to a preset threshold.
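The second end-detection variant above (treating the slide as finished once the finger has been off the screen for at least a preset threshold) could look like the following sketch; the millisecond timestamps and the class name are assumptions, not part of the patent.

```kotlin
// Tracks when the finger last left the screen and reports whether the slide
// along the preset path should be considered finished.
class SlideEndDetector(private val thresholdMs: Long) {
    private var liftTimeMs: Long? = null

    fun onFingerDown() { liftTimeMs = null }            // finger back on screen: slide continues
    fun onFingerUp(nowMs: Long) { liftTimeMs = nowMs }  // remember when the finger left the screen

    fun slideEnded(nowMs: Long): Boolean =
        liftTimeMs?.let { nowMs - it >= thresholdMs } ?: false
}
```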
In the embodiment of the present application, once there is a selected target object, the first processing logic may also be executed on the selected target object when a first predetermined operation is detected within the area range. The type of the first predetermined operation is not limited in the embodiment of the application; it may be set according to operational needs and may be, for example, a double tap or a sliding operation along a preset path. It is to be understood that the first predetermined operation may be different from the sliding operation along the preset path, the sliding operation in the first direction, and the sliding operation in the second direction described above. The first processing logic is likewise not limited in this embodiment; it may be any processing applied to the selected target object and may be set as needed.
In the embodiment of the present application, when the second predetermined operation is detected within the area range, the second processing logic corresponding to the second predetermined operation may be executed. It is to be understood that the second predetermined operation may be different from the first predetermined operation, the sliding operation along the preset path, the sliding operation in the first direction, and the sliding operation in the second direction described above, and the second predetermined operation may also be set as needed. It should be noted that the second processing logic may perform processing for the selection operation itself, and may not be limited to the selected target object.
In the embodiment of the application, by acquiring the area range of the target area on the screen, it is detected whether a sliding operation along the preset path, a sliding operation along the first direction, or a sliding operation along the second direction occurs within the area range, and the selected target objects are determined by correspondingly moving the selection marker and marking the start tag and the end tag; if only the sliding operation along the preset path is detected, the target object where the selection marker is currently located is taken as the selected target object. Therefore, by detecting sliding operations within the area range and performing the corresponding marking or selection, target objects can be selected within the area range, the user does not need to perform selection operations across the whole screen, and selecting target objects is convenient.
The method for selecting a target object provided by the embodiment of the present application is introduced above, and the method provided by the embodiment of the present application is introduced below with reference to a specific scenario.
In the embodiment of the present application, a method for selecting a target object is applied to selecting cards in a board game, and referring to fig. 2, fig. 2 is a flowchart of another method for selecting a target object provided in the embodiment of the present application.
S201: and starting the card-playing device, acquiring the area range of the target area on the screen, and detecting the operation of the user in the area range.
It can be understood that starting card playing initiates the card selection process; the game sets the area range of the target area on the screen based on the positions of the user's cards on the screen and the positions the user can conveniently reach. Fig. 3 is a schematic diagram of the area range of a target area on a screen according to an embodiment of the present application, where the dotted portion indicates the area range of the target area on the screen. The area range of the target area on the screen is acquired, and the user's operations are detected within that area range. Detecting the user's operations may be performed as touch detection within the area range on the screen.
S202: when a sliding operation along the circular arc of the sector is detected within the range of the area, the selection mark is sequentially moved from the current position to each card.
In this embodiment, the sliding operation along the preset path may be a sliding operation along the circular arc of the sector in which the cards are fanned out. When the user's sliding operation along this arc is detected, the selection mark is moved. The selection mark may be, for example, an arrow displayed above a card; before the movement it may be located above the first card, or above the card the game automatically recommends playing. The selection mark may also be a color treatment of a card, so that the card bearing the selection mark is shown in a different color from the other cards.
It should be noted that, when the user performs a sliding operation clockwise along the circular arc of the sector, the selection marker may move from left to right, and when the user performs a sliding operation counterclockwise along the circular arc of the sector, the selection marker may move from right to left.
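One way to tell a clockwise arc slide from a counter-clockwise one (a sketch under the assumption that the fan's centre point and consecutive touch points are available) is the sign of the 2D cross product between successive touch vectors.

```kotlin
// Returns true when the motion from the previous touch point to the current
// one is clockwise around the fan centre. In screen coordinates (y grows
// downward) a positive cross product corresponds to clockwise motion.
fun isClockwiseArc(
    centerX: Float, centerY: Float,
    prevX: Float, prevY: Float,
    currX: Float, currY: Float
): Boolean {
    val v1x = prevX - centerX
    val v1y = prevY - centerY
    val v2x = currX - centerX
    val v2y = currY - centerY
    return v1x * v2y - v1y * v2x > 0f
}
```

A clockwise result would then step the selection mark to the right and a counter-clockwise result to the left, matching the behaviour described above.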
S203: if the extrapolation sliding operation is detected within the area range, the label is marked for the card where the selection mark is located.
It should be noted that the sliding operation in the first direction may be an outward-push sliding operation, that is, a movement along a radius of the sector area away from its centre (toward a larger radius). When an outward-push sliding operation is detected within the area range, the start tag is marked on the card where the selection mark is located; the marking may take the form of pushing that card out of the card row.
S204: if the push-in sliding operation is detected in the area range, whether a card marking a start label exists is judged.
S205: if the cards marked with the start labels exist, the cards marked with the end labels are marked on the cards where the current selection labels are located, and the cards marked with the start labels, the cards marked with the end labels and the cards between the cards marked with the start labels and the cards marked with the end labels are obtained as the selected cards.
The sliding operation in the second direction may be an inward-push sliding operation, that is, a movement along a radius of the sector area toward its centre (toward a smaller radius). When an inward-push sliding operation is detected within the area range, the card where the selection mark is located is marked with an end tag; the marking may likewise take the form of pushing that card out of the card row.
It is understood that the cards marked with the start label, the cards marked with the end label, and the cards between the cards marked with the start label and the cards marked with the end label are selected cards, and the selected cards may be multiple cards at this time. After the selected card is determined, the selected card may be pushed out of the sequence of all cards.
S206: if there is no card marking the start tag, the selection of the card is cancelled.
S207: and if the sliding operation along the fan-shaped circular arc is detected to be finished in the area range, and the outward pushing sliding operation and the inward pushing sliding operation are not detected in the area range, acquiring the card where the current selection mark is located as the selected card.
It can be understood that, in the process of selecting a card, the user may continuously perform the touch operation, and if the user stops touching the screen in the area range after the user performs the sliding operation along the sector arc, it may be considered that the sliding operation along the sector arc is ended this time.
It should be noted that the selected card at this time is the single card where the current selection mark is located.
S208: when a double-click by the user is detected within the area, a discard is performed on the selected card.
S209: and when the user is detected to execute the custom operation in the area range, executing a second processing logic corresponding to the custom operation.
It is understood that custom operations may be configured according to the needs of playing and selecting cards, and different custom operations correspond to different second processing logic; the second processing logic may act on the current round of card playing and is not limited to processing the selected cards. For example, when the user slides the thumb vertically upward, the corresponding second processing logic may display or switch a card-playing hint; when the user slides the thumb vertically downward, the corresponding second processing logic may cancel the card-playing hint; when the user draws a circle with the thumb, the corresponding second processing logic may indicate that no card is played this turn.
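Such a custom-operation dispatch could be modelled as a simple gesture-to-handler map; the gesture names and the println placeholders are illustrative assumptions rather than anything defined by the patent.

```kotlin
// Each recognised custom gesture maps to its "second processing logic".
enum class CustomGesture { SWIPE_UP, SWIPE_DOWN, DRAW_CIRCLE }

val secondProcessingLogic: Map<CustomGesture, () -> Unit> = mapOf(
    CustomGesture.SWIPE_UP to { println("show or switch the card-playing hint") },
    CustomGesture.SWIPE_DOWN to { println("cancel the card-playing hint") },
    CustomGesture.DRAW_CIRCLE to { println("pass this turn without playing") }
)

fun onCustomGesture(gesture: CustomGesture) {
    secondProcessingLogic[gesture]?.invoke()
}
```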
S210: and (5) finishing card playing.
In this embodiment of the application, the user's operations are detected within the area range, and the corresponding card selection or card playing is performed according to the different operations. Selection and processing of the cards can thus be accomplished entirely by operations within the area range, which makes one-handed operation convenient for the user.
In the embodiment of the present application, a method for selecting a target object is applied to selecting a file when processing a file, referring to fig. 4, where fig. 4 is a flowchart of another method for selecting a target object provided in the embodiment of the present application.
S301: and responding to the operation of triggering the selection file by the user, acquiring the area range of the target area on the screen, and detecting the operation of the user in the area range.
Referring to fig. 5, fig. 5 is a schematic diagram of an area range of a target area on a screen according to an embodiment of the present application, and a dotted line portion is the area range of the target area on the screen.
It can be understood that, when the user triggers the file selection operation, the file selection system sets the area range of the target area on the screen according to how the files are displayed on the user's current interface and the positions the user can conveniently reach. The area range of the target area on the screen is acquired, and the user's operations are detected within that area range. Detecting the user's operations may mean detecting touches on the screen, specifically touch detection within the area range of the target area.
S302: and if the sliding operation along the sector arc is detected in the area range, sequentially moving the selection mark from the current position to each file.
In this embodiment, the sliding operation along the preset path may be a sliding operation along the circular arc of a sector. When the user's sliding operation along the arc is detected, the selection mark is moved. The selection mark may thicken the border of a file, or change the transparency of the file's icon so that the icon of the file bearing the selection mark is displayed with a different transparency from the other files; before the movement, the selection mark may be located on the first file.
It should be noted that, when the user performs a sliding operation clockwise along the circular arc, the selection markers may move from left to right sequentially, and when the user performs a sliding operation counterclockwise along the circular arc, the selection markers may move from right to left sequentially.
S303: and if the upward sliding operation is detected in the area range, marking a starting label on the file where the selection mark is located.
It should be noted that the sliding operation along the first direction may be an upward sliding operation. When an upward sliding operation is detected within the area range, a start label is marked on the file where the selection mark is located; the start label may, for example, deepen the transparency effect applied to the file's icon.
S304: and if the sliding down operation is detected in the area range, judging whether a file for marking the start label exists or not.
S305: if the file with the start label exists, marking the end label of the file where the current selection label is located, and acquiring the file with the start label, the file with the end label and the file between the file with the start label and the file with the end label as the selected files.
It should be noted that the sliding operation in the second direction may be a downward sliding operation. When a downward sliding operation is detected within the area range, an end tag may be marked on the file where the selection mark is located; the end tag may be the same as the start tag.
It is to be understood that the file marking the start tag, the file marking the end tag, and the file between the file marking the start tag and the file marking the end tag are selected files, and the selected file may be multiple files. When the file marking the start tag and the file marking the end tag are one file, the selected file is one.
S306: and if the file marking the starting tag does not exist, canceling the selection of the file.
S307: and if the sliding operation along the fan-shaped circular arc is detected to be finished in the area range and the upward sliding operation and the downward sliding operation are not detected in the area range, acquiring the file where the current selection mark is located as the selected file.
It can be understood that, in the process of selecting a file, the user may interrupt the touch operation, and if the user does not perform an operation within the area range any more within a certain time after performing a sliding operation along the sector arc, it may be considered that the sliding operation along the sector arc is ended this time.
It should be noted that the selected file at this time is the single file where the current selection marker is located.
S308: when the user is detected to execute the first preset operation in the area range, executing a first processing logic corresponding to the first preset operation on the selected file.
It is to be understood that first predetermined operations may be configured according to the needs of file processing, and different first predetermined operations correspond to different first processing logic, where the first processing logic is a processing method applied to the selected files. For example, when the user slides the thumb horizontally to the left, the corresponding first processing logic may copy the selected files; when the user slides the thumb horizontally to the right, the corresponding first processing logic may cut the selected files; when the user draws a circle with the thumb, the corresponding first processing logic may delete the selected files.
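The first-processing-logic dispatch for files can be sketched in the same way as the card example above, as a handler that receives the selected files; the gesture names and the String-based file list are assumptions made for illustration.

```kotlin
// First predetermined operations for the file-selection scenario and the
// "first processing logic" each one triggers on the selected files.
enum class FileGesture { SWIPE_LEFT, SWIPE_RIGHT, DRAW_CIRCLE }

fun firstProcessingLogic(gesture: FileGesture, selectedFiles: List<String>) {
    when (gesture) {
        FileGesture.SWIPE_LEFT  -> println("copy ${selectedFiles.size} selected file(s)")
        FileGesture.SWIPE_RIGHT -> println("cut ${selectedFiles.size} selected file(s)")
        FileGesture.DRAW_CIRCLE -> println("delete ${selectedFiles.size} selected file(s)")
    }
}
```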
S308: when a user double-click is detected within the area scope, the corresponding processing logic to eliminate the file selection may be executed.
S310: the file processing is ended.
In this embodiment, the user's operations are detected within the area range, and the corresponding file selection or processing is carried out according to the different operations. Because the area range is small, the user can operate within it with one hand, which makes selecting files convenient.
The embodiment of the application introduces a target object selection device. Referring to fig. 6, the drawing is a schematic structural diagram of a target object selection apparatus according to an embodiment of the present application.
An obtaining unit 401, configured to obtain an area range of a target area on a screen;
a moving unit 402 configured to sequentially move the selection marker from the current position to each target object when a sliding operation along a preset path is detected within the area;
a marking unit 403, configured to mark a start tag on a target object where the selection marker is located when a sliding operation in a first direction is detected within the area range;
a first selecting unit 404, configured to, when a sliding operation in the second direction is detected within the area range, if a target object marking the start tag exists, mark an end tag on the target object where the selection marker is currently located, and acquire, as selected target objects, the target object marking the start tag, the target object marking the end tag, and a target object between the target object marking the start tag and the target object marking the end tag;
a second selecting unit 405, configured to, when it is detected that the sliding operation along the preset path is ended within the area range, and the sliding operation along the first direction and the sliding operation along the second direction are not detected within the area range, acquire the target object where the current selection mark is located as the selected target object.
Optionally, the mobile unit includes:
the first moving subunit is used for sequentially moving the selection mark from the current position to the left to each target object when the sliding operation along the first preset direction of the preset path is detected in the area range;
and the second moving subunit is used for sequentially moving the selection mark from the current position to the right to each target object when the sliding operation along the second preset direction of the preset path is detected in the area range.
Optionally, the apparatus further comprises:
a canceling unit configured to cancel, when a sliding operation in a second direction is detected within the area range, selection of the selected target object if there is no target object marking a start tag and the selected target object exists.
Optionally, the apparatus further comprises:
and the first processing unit is used for executing first processing logic on the selected target object when a first preset operation is detected in the area range.
Optionally, the apparatus further comprises:
and the second processing unit is used for executing second processing logic corresponding to a second preset operation when the second preset operation is detected in the area range.
Optionally, the area range of the target area on the screen is pre-configured according to the display parameters of the screen.
In the apparatus, the area range of a target area on a screen is acquired; when a sliding operation along a preset path is detected within the area range, the selection mark is sequentially moved from its current position to each target object; when a sliding operation along a first direction is detected within the area range, a start label is marked on the target object where the selection mark is located; when a sliding operation along a second direction is detected within the area range, if a target object marking a start label exists, an end label is marked on the target object where the selection mark is currently located, and the selected target objects are determined; and when the sliding operation along the preset path is detected to have ended within the area range, with no sliding operation along the first direction or the second direction detected within the area range, the target object where the selection mark is currently located is acquired as the selected target object. Therefore, by detecting sliding operations within the area range and performing the corresponding marking or selection of target objects, the selection of a target object can be completed within the area range; the user can operate within the area range without performing selection operations across the whole screen, which makes selecting a target object convenient.
It should be noted that, in the present specification, the embodiments are described in a progressive manner, each embodiment focuses on differences from other embodiments, and the same and similar parts among the embodiments may be referred to each other. For the system or the device disclosed by the embodiment, the description is simple because the system or the device corresponds to the method disclosed by the embodiment, and the relevant points can be referred to the method part for description.
It should be understood that in the present application, "at least one" means one or more, "a plurality" means two or more. "and/or" for describing an association relationship of associated objects, indicating that there may be three relationships, e.g., "a and/or B" may indicate: only A, only B and both A and B are present, wherein A and B may be singular or plural. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship. "at least one of the following" or similar expressions refer to any combination of these items, including any combination of single item(s) or plural items. For example, at least one (one) of a, b, or c, may represent: a, b, c, "a and b", "a and c", "b and c", or "a and b and c", wherein a, b, c may be single or plural.
It is further noted that, herein, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other identical elements in a process, method, article, or apparatus that comprises the element.
The steps of a method or algorithm described in connection with the embodiments disclosed herein may be embodied directly in hardware, in a software module executed by a processor, or in a combination of the two. A software module may reside in Random Access Memory (RAM), memory, Read Only Memory (ROM), electrically programmable ROM, electrically erasable programmable ROM, registers, hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art.
The previous description of the disclosed embodiments is provided to enable any person skilled in the art to make or use the present application. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the application. Thus, the present application is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (10)

1. A method for selecting a target object, the method comprising:
acquiring the area range of a target area on a screen;
when the sliding operation along the preset path is detected in the area range, sequentially moving the selection mark from the current position to each target object;
when the sliding operation along the first direction is detected in the area range, marking a starting label on a target object where the selection mark is located;
when the sliding operation along the second direction is detected in the area range, if a target object marking a start label exists, marking an end label on the target object where the selection mark is currently located, and acquiring the target object marking the start label, the target object marking the end label and a target object between the target object marking the start label and the target object marking the end label as selected target objects;
and when the sliding operation along the preset path is detected to be finished in the area range, and the sliding operation along the first direction and the sliding operation along the second direction are not detected in the area range, acquiring the target object where the current selection mark is positioned as the selected target object.
2. The method according to claim 1, wherein the moving the selection marker from the current position to each target object in sequence when the sliding operation along the preset path is detected within the area range comprises:
when the sliding operation along a first preset direction of a preset path is detected in the area range, sequentially moving the selection mark from the current position to the left to each target object;
and when the sliding operation along the second preset direction of the preset path is detected in the area range, sequentially moving the selection mark from the current position to the right to each target object.
3. The method of claim 1, further comprising:
when a sliding operation along the second direction is detected within the area range, if no target object is marked with the start tag and a selected target object exists, cancelling the selection of the selected target object.
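Claim 3 adds a cancellation branch to the second-direction handling. Building on the hypothetical sketch after claim 1 (and assuming the caller has already verified that the touch point lies inside the area range), it could read roughly as follows:

// Hypothetical second-direction handler extended with the cancel branch of claim 3.
fun SelectionController.onSecondDirectionSlide(x: Int, y: Int) {
    when {
        // a start tag exists: mark the end tag and select the range (claim 1)
        hasStartTag() -> onGesture(Gesture.SLIDE_SECOND_DIRECTION, x, y)
        // no start tag, but a selection already exists: cancel it (claim 3)
        selected.isNotEmpty() -> selected.clear()
    }
}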
4. The method of claim 1, further comprising:
when a first predetermined operation is detected within the area range, executing first processing logic on the selected target object.
5. The method of claim 1, further comprising:
and when a second predetermined operation is detected within the area range, executing second processing logic corresponding to the second predetermined operation.
6. The method of claim 1, wherein the area range of the target area on the screen is pre-configured according to the display parameters of the screen.
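Claim 6 leaves open how the display parameters determine the area range. One plausible, purely assumed configuration is a strip along the bottom edge of the screen sized relative to the screen resolution, reusing the hypothetical AreaRange type from the sketch after claim 1:

// Hypothetical pre-configuration of the area range from display parameters (claim 6).
// The bottom strip covering 20% of the screen height is an assumed choice for illustration only.
fun configureAreaRange(screenWidthPx: Int, screenHeightPx: Int): AreaRange {
    val stripHeight = (screenHeightPx * 0.2).toInt()
    return AreaRange(
        left = 0,
        top = screenHeightPx - stripHeight,
        right = screenWidthPx,
        bottom = screenHeightPx
    )
}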
7. An apparatus for selecting a target object, the apparatus comprising:
an acquisition unit, configured to acquire the area range of a target area on a screen;
a moving unit, configured to sequentially move a selection marker from its current position to each target object when a sliding operation along a preset path is detected within the area range;
a marking unit, configured to mark a start tag on the target object at which the selection marker is located when a sliding operation along a first direction is detected within the area range;
a first selection unit, configured to, when a sliding operation along a second direction is detected within the area range, if a target object marked with the start tag exists, mark an end tag on the target object at which the selection marker is currently located, and acquire, as selected target objects, the target object marked with the start tag, the target object marked with the end tag, and the target objects between them;
and a second selection unit, configured to acquire the target object at which the selection marker is currently located as the selected target object when the sliding operation along the preset path is detected to have ended within the area range and no sliding operation along the first direction or the second direction has been detected within the area range.
8. The apparatus of claim 7, wherein the moving unit comprises:
a first moving subunit, configured to sequentially move the selection marker leftward from its current position to each target object when a sliding operation along the first preset direction of the preset path is detected within the area range;
and a second moving subunit, configured to sequentially move the selection marker rightward from its current position to each target object when a sliding operation along the second preset direction of the preset path is detected within the area range.
9. The apparatus of claim 7, further comprising:
a cancelling unit, configured to, when a sliding operation along the second direction is detected within the area range, cancel the selection of the selected target object if no target object is marked with the start tag and a selected target object exists.
10. The apparatus of claim 7, further comprising:
and a first processing unit, configured to execute first processing logic on the selected target object when a first predetermined operation is detected within the area range.
CN202010124256.9A 2020-02-27 2020-02-27 Target object selection method and device Active CN111338539B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010124256.9A CN111338539B (en) 2020-02-27 2020-02-27 Target object selection method and device

Publications (2)

Publication Number Publication Date
CN111338539A true CN111338539A (en) 2020-06-26
CN111338539B (en) 2021-05-18

Family

ID=71183795

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010124256.9A Active CN111338539B (en) 2020-02-27 2020-02-27 Target object selection method and device

Country Status (1)

Country Link
CN (1) CN111338539B (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102681786A (en) * 2012-05-14 2012-09-19 华为技术有限公司 Method and device for text selection
CN103533411A (en) * 2013-09-04 2014-01-22 小米科技有限责任公司 Method and device for controlling motion of selection cursor
CN103870156A (en) * 2014-03-05 2014-06-18 美卓软件设计(北京)有限公司 Method and device for processing object
CN104571871A (en) * 2015-01-26 2015-04-29 深圳市中兴移动通信有限公司 Method and system for selecting files
CN104598023A (en) * 2014-12-22 2015-05-06 广东欧珀移动通信有限公司 Method and device for selecting files through gesture recognition
CN106020627A (en) * 2016-05-20 2016-10-12 乐视控股(北京)有限公司 Method and device for carrying out multi-selection on list items
US10101734B2 (en) * 2011-11-15 2018-10-16 Rockwell Automation Technologies, Inc. Routing of enterprise resource planning messages

Also Published As

Publication number Publication date
CN111338539B (en) 2021-05-18

Similar Documents

Publication Publication Date Title
CN109513208B (en) Object display method and device, storage medium and electronic device
CN107126698B (en) Control method and device of game virtual object, electronic equipment and readable medium
US9389718B1 (en) Thumb touch interface
US9335925B2 (en) Method of performing keypad input in a portable terminal and apparatus
US20100020033A1 (en) System, method and computer program product for a virtual keyboard
WO2013042394A1 (en) Touch pad input device, program, data processing method, and data processing device
CN106708645B (en) Misoperation processing method, misoperation processing device and terminal
WO2013167028A2 (en) Method and system for implementing suspended global button on touch screen terminal interface
CN111701226A (en) Control method, device and equipment for control in graphical user interface and storage medium
JP5374564B2 (en) Drawing apparatus, drawing control method, and drawing control program
CN103631821A (en) Application program search method and electronic terminal
CN109284063A (en) Method, apparatus and touch apparatus are determined based on the instruction of single-button
CN106648330A (en) Human-machine interaction method and device
CN111158553B (en) Processing method and device and electronic equipment
CN112783408A (en) Gesture navigation method and device of electronic equipment, equipment and readable storage medium
CN111338539B (en) Target object selection method and device
CN105808080B (en) Method and device for quickly copying object by utilizing gesture
JP6970712B2 (en) Information processing program, information processing device, information processing system, and information processing method
CN112057848A (en) Information processing method, device, equipment and storage medium in game
CN113926186A (en) Method and device for selecting virtual object in game and touch terminal
CN108124064B (en) Key response method of mobile terminal and mobile terminal
CN113769404A (en) Game movement control method and device and electronic equipment
CN113332703A (en) Game role moving state switching method, device, equipment and storage medium
CN112274915A (en) Game control method and device and electronic equipment
CN111388992A (en) Interactive control method, device and equipment in game

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant