CN110333803B - Multimedia object selection method and terminal equipment - Google Patents

Multimedia object selection method and terminal equipment

Info

Publication number
CN110333803B
CN110333803B
Authority
CN
China
Prior art keywords
cursor
button
control
input
moving
Prior art date
Legal status
Active
Application number
CN201910330197.8A
Other languages
Chinese (zh)
Other versions
CN110333803A (en)
Inventor
李敬
Current Assignee
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd
Priority to CN201910330197.8A
Publication of CN110333803A
Application granted
Publication of CN110333803B


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04812Interaction techniques based on cursor appearance or behaviour, e.g. being affected by the presence of displayed objects

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention provides a multimedia object selection method and terminal equipment, wherein the method comprises the following steps: displaying a first display interface, wherein the first display interface comprises an object display area and a control area, the object display area displays a multimedia object and a cursor, and the control area displays a cursor control; in the case that a first input of a user for the cursor control is received, identifying movement information of the first input; controlling the cursor to move according to the movement information; and selecting the multimedia object corresponding to the position of the cursor after the cursor is moved. The embodiment of the invention can improve the accuracy of the terminal equipment in selecting the multimedia object.

Description

Multimedia object selection method and terminal equipment
Technical Field
The present invention relates to the field of communications technologies, and in particular, to a multimedia object selection method and a terminal device.
Background
With the development of terminal technology, much of a user's work can be completed on a terminal, which makes working more convenient. In practical applications, when a multimedia object is displayed on the display screen of a terminal, the user needs to press the cursor on the display screen and drag it to a corresponding position in order to select part of the multimedia object. However, because the cursor is small, the user often cannot move it accurately to the intended position, so unwanted parts of the multimedia object are selected. As a result, the accuracy with which current terminals select multimedia objects is low.
Disclosure of Invention
The embodiment of the invention provides a multimedia object selection method and terminal equipment, and aims to solve the problem that the accuracy of selecting a multimedia object by a terminal is low.
In order to solve the technical problem, the invention is realized as follows:
in a first aspect, an embodiment of the present invention provides a multimedia object selection method, including:
displaying a first display interface, wherein the first display interface comprises an object display area and a control area, the object display area displays a multimedia object and a cursor, and the control area displays a cursor control;
in the case that a first input of a user for the cursor control is received, identifying movement information of the first input;
controlling the cursor to move according to the movement information;
and selecting the multimedia object corresponding to the position of the cursor after the cursor is moved.
In a second aspect, an embodiment of the present invention further provides a terminal device, including:
the first display module is used for displaying a first display interface, wherein the first display interface comprises an object display area and a control area, the object display area displays a multimedia object and a cursor, and the control area displays a cursor control;
the first identification module is used for identifying the movement information of a first input when the first input of a user for the cursor control is received;
the control module is used for controlling the cursor to move according to the movement information;
and the selecting module is used for selecting the multimedia object corresponding to the position of the cursor after the cursor moves.
In a third aspect, an embodiment of the present invention further provides a terminal device, including: a memory, a processor and a computer program stored on the memory and executable on the processor, the processor implementing the steps of the above multimedia object selection method when executing the computer program.
In a fourth aspect, an embodiment of the present invention further provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the computer program implements the steps in the multimedia object selection method.
In the embodiment of the present invention, because the terminal device receives the user's first input on the cursor control rather than on the cursor itself, the cursor can be controlled to move according to the movement information of the first input, which improves the accuracy with which the terminal device selects the multimedia object.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings needed to be used in the description of the embodiments of the present invention will be briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and it is obvious for those skilled in the art that other drawings can be obtained according to these drawings without inventive exercise.
Fig. 1 is a flowchart of a multimedia object selection method according to an embodiment of the present invention;
FIG. 2 is a flow chart of another method for selecting a multimedia object according to an embodiment of the present invention;
FIG. 3 is a diagram illustrating an example of an embodiment of the present invention;
FIG. 4 is a second example diagram provided by the embodiment of the present invention;
FIG. 5 is a third exemplary diagram according to an embodiment of the present invention;
fig. 6 is a schematic structural diagram of a terminal device according to an embodiment of the present invention;
fig. 7 is a schematic structural diagram of another terminal device provided in the embodiment of the present invention;
fig. 8 is a schematic structural diagram of another terminal device provided in the embodiment of the present invention;
fig. 9 is a schematic structural diagram of another terminal device according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Referring to fig. 1, fig. 1 is a flowchart of a multimedia object selection method according to an embodiment of the present invention, as shown in fig. 1, including the following steps:
step 101, displaying a first display interface, wherein the first display interface comprises an object display area and a control area, the object display area displays a multimedia object and a cursor, and the control area displays a cursor control.
This embodiment can be applied to a terminal device. The terminal device may display the first display interface after receiving an instruction from the user for displaying the first display interface. The instruction may be, for example, a press on the display screen of the terminal device whose duration exceeds a preset time value.
The positional relationship between the object display area and the control area is not limited herein. For example: the upper half of the first display interface may be the object display area and the lower half may be the control area; alternatively, the left half of the first display interface may be the object display area and the right half may be the control area. In addition, an area located at a corner of the first display interface may be the control area; for example, the control area may be the lower left corner, the lower right corner, the upper left corner, or the upper right corner of the first display interface, and correspondingly, the area of the first display interface other than the control area may be the object display area.
The multimedia object may include at least one of text, picture, video, and the like, and of course, the multimedia object may further include thumbnails of respective files displayed in a list form. The specific content of the multimedia object is not limited herein.
The cursor control can be circular or square.
Step 102, under the condition that a first input of a user for the cursor control is received, identifying the movement information of the first input.
The first input may be a slide input or a press input.
When the first input is a sliding input, the cursor control can move synchronously with the first input, and then the movement information of the first input at this time can also be understood as the movement information of the cursor control. Of course, the cursor control may also not move in synchronization with the first input, i.e.: when the first input moves, the cursor control may remain stationary at the current position.
Additionally, when the cursor control moves in synchronization with the first input, the cursor control may move a corresponding second distance as the first input slides the first distance, wherein the second distance and the first distance may be proportional.
Step 103, controlling the cursor to move according to the movement information.
The movement information of the first input may include a moving direction, and the specific operation may be as follows: first, the moving direction of the first input is determined, and the moving direction of the cursor is determined according to it. For example: if the first input moves to the left, the cursor also moves to the left; if the first input moves to the right, the cursor also moves to the right.
Of course, the movement information may also include a moving speed. If the first input moves to the left at a first speed value, then correspondingly the cursor moves to the left, and its moving speed may also be the first speed value.
Step 104, selecting the multimedia object corresponding to the position of the cursor after the cursor is moved.
The selected multimedia object may be the multimedia object lying between the position of the cursor before it is moved and its position after it is moved. Of course, the selected multimedia object may also be the multimedia object lying between the position of the cursor after it is moved and the start or end position of the multimedia object in the object display area.
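Purely as an illustration of this step (not part of the disclosed method), the following Kotlin sketch models a text-type multimedia object as a string and selects the range between the cursor's position before and after the move; the names Selection and selectRange are hypothetical.

```kotlin
// Hypothetical sketch: select the part of a text multimedia object lying between
// the cursor position before the move and the cursor position after the move.
data class Selection(val start: Int, val end: Int, val content: String)

fun selectRange(text: String, cursorBefore: Int, cursorAfter: Int): Selection {
    // The selection is direction-independent: swap the bounds if the cursor moved backwards.
    val start = minOf(cursorBefore, cursorAfter).coerceIn(0, text.length)
    val end = maxOf(cursorBefore, cursorAfter).coerceIn(0, text.length)
    return Selection(start, end, text.substring(start, end))
}

fun main() {
    val text = "multimedia object selection"
    // The cursor was at index 11 before the first input and ends at index 17 after it.
    println(selectRange(text, cursorBefore = 11, cursorAfter = 17))
    // Selection(start=11, end=17, content=object)
}
```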
In the embodiment of the present invention, the terminal device may be a mobile phone, a tablet personal computer, a laptop computer, a personal digital assistant (PDA), a mobile Internet device (MID), a wearable device, or the like.
In the embodiment of the present invention, because the terminal device receives the user's first input on the cursor control rather than on the cursor itself, the cursor can be controlled to move according to the movement information of the first input, which improves the accuracy with which the terminal device selects the multimedia object.
Referring to fig. 2, fig. 2 is a flowchart of another multimedia object selection method according to an embodiment of the present invention. The main difference between this embodiment and the previous embodiment is that after the multimedia object is selected, the target operation can be performed on the selected multimedia object. As shown in fig. 2, the method comprises the following steps:
step 201, displaying a first display interface, wherein the first display interface includes an object display area and a control area, the object display area displays a multimedia object and a cursor, and the control area displays a cursor control.
The multimedia object may include at least one of text, picture, video, and the like, and of course, the multimedia object may further include thumbnails of respective files displayed in a list form. The specific content of the multimedia object is not limited herein.
Optionally, referring to fig. 3, a first cursor 3011 and a second cursor 3012 are displayed in the object display area 301, and a first cursor control 3021 and a second cursor control 3022 are displayed in the control area 302, where the first input is an input for at least one of the first cursor control 3021 and the second cursor control 3022, where the first cursor control 3021 is used to control the first cursor 3011, and the second cursor control 3022 is used to control the second cursor 3012.
When the first input moves, the first cursor control 3021 or the second cursor control 3022 may move in synchronization with the first input, i.e., each time the first input moves a first distance, the first cursor control 3021 or the second cursor control 3022 moves a second distance, and the first distance may be proportional to the second distance, for example in a ratio of 2:1. In this way, movement of the first cursor 3011 can be controlled better through the first cursor control 3021, and movement of the second cursor 3012 can be controlled better through the second cursor control 3022.
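A rough sketch of this proportional movement follows; it is an assumption-laden illustration rather than the patent's implementation, and Point, moveCursorControl, and the 2:1 ratio are only examples.

```kotlin
// Hypothetical sketch: when the finger slides a first distance, the cursor control
// moves a second distance in the same direction, here with an assumed 2:1 ratio
// of first distance to second distance.
data class Point(val x: Float, val y: Float)

fun moveCursorControl(controlPos: Point, fingerDelta: Point, ratio: Float = 2f): Point =
    Point(controlPos.x + fingerDelta.x / ratio,
          controlPos.y + fingerDelta.y / ratio)

fun main() {
    val start = Point(100f, 400f)
    // The finger slides 40 px to the right, so the control moves 20 px to the right.
    println(moveCursorControl(start, Point(40f, 0f))) // Point(x=120.0, y=400.0)
}
```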
The specific form of the first cursor and the second cursor is not limited herein, for example: the first cursor and the second cursor may be in the form of a vertical line, and of course, both may be a circle.
In this embodiment, the first cursor control controls the first cursor and the second cursor control controls the second cursor, so when the first cursor and/or the second cursor is controlled to move, the user clearly knows which cursor control corresponds to which cursor and can therefore control the corresponding cursor more accurately. This reduces cases in which the user intends to move the first cursor but mistakenly moves the second cursor.
Step 202, in the case that a first input of the user for the cursor control is received, identifying the movement information of the first input.
The first input may be a slide input or a press input.
When the first input is a sliding input, the cursor control can move synchronously with the first input, and then the movement information of the first input at this time can also be understood as the movement information of the cursor control. Of course, the cursor control may also not move in synchronization with the first input, i.e.: when the first input moves, the cursor control may remain stationary at the current position.
Additionally, when the cursor control moves in synchronization with the first input, the cursor control may move a corresponding second distance as the first input slides the first distance, wherein the second distance and the first distance may be proportional.
Step 203, controlling the cursor to move according to the movement information.
A feedback signal may also be emitted while the cursor is moving, for example: a vibration feedback signal and/or a voice signal may be emitted each time the cursor moves by one character. In this way, the user can adjust the movement of the cursor according to the feedback signal, which makes the movement more accurate.
Optionally, at least one cursor is displayed in the object display area, a control component is displayed in the control area, and the control component includes the cursor control and at least one object selection button;
the controlling the cursor to move according to the movement information comprises:
controlling a target cursor to move to a target position according to first moving button information of the first input;
wherein the first moving button information is used to indicate that the end position of the first input is located at a target object selection button in the control component; the target cursor is a cursor corresponding to the target object selection button, and the target position is a moving position of the cursor corresponding to the target object selection button.
When a control component comprising a cursor control is displayed on the object display area, a pressing input of a user for the control component can be received, and at least one object selection button is displayed according to the pressing input. Of course, control components including a cursor control and at least one object selection button may also be displayed directly on the object display area.
The object selection button may be set according to an input of a user, or may be preset before the terminal device leaves a factory.
For example: the object selection button may be a first position selection button. When the end position of the first input is located at the first position selection button, the cursor corresponding to the first position selection button may be moved to a first position, and the first position may be, for example, the second text position, the third text position, or the penultimate text position in the multimedia object. Of course, if only one cursor is displayed in the object display area, that cursor is moved directly to the first position. In addition, if two cursors are displayed in the object display area and the first position is the second text position in the multimedia object, the cursor corresponding to the first position selection button may be the cursor whose current display position is the earlier of the two; if two cursors are displayed and the first position is the penultimate text position in the multimedia object, the cursor corresponding to the first position selection button may be the cursor whose current display position is the later of the two.
Optionally, a first cursor and a second cursor are displayed in the object display area;
the at least one object selection button comprises at least one of: a head of line button, a line end button, a head button, and a tail button;
the cursor corresponding to the head of line button is the first cursor, and the cursor moving position corresponding to the head of line button is the head of line position of the line where the first cursor is located currently;
the cursor corresponding to the line end button is the second cursor, and the cursor moving position corresponding to the line end button is the line end position of the line where the second cursor is located currently;
the cursor corresponding to the head button is the first cursor, and the cursor moving position corresponding to the head button is the multimedia object initial position of the object display area;
the cursor corresponding to the tail button is the second cursor, and the cursor moving position corresponding to the tail button is the multimedia object ending position of the object display area.
It should be noted that, within the multimedia object in the object display area, the first cursor is displayed at a position ahead of the second cursor. When the first cursor and the second cursor meet, the two cursors can be merged into one cursor; at that moment, however, the first cursor control can no longer move the first cursor onward in its previous moving direction and can only move it from the meeting position in the opposite direction. Similarly, the second cursor control can no longer move the second cursor onward in its previous moving direction and can only move it from the meeting position in the opposite direction.
Thus, because the at least one object selection button includes at least one of the head of line button, the line end button, the head button, and the tail button, the cursor can be moved to the line head position, the line end position, the start position of the multimedia object, or the end position of the multimedia object more quickly through the object selection buttons. Likewise, when many multimedia objects need to be selected and page turning would otherwise be required, the multimedia objects can be selected quickly and accurately through the object selection buttons.
In this embodiment, when the end position of the first input is located at the target object selection button, the cursor corresponding to that button can be moved directly to the corresponding target position, which speeds up cursor positioning.
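A minimal sketch of this button-to-position mapping follows, assuming a text object modeled as a string, cursor positions as character indices, and the hypothetical names SelectionButton, CursorPair, and applySelectionButton.

```kotlin
// Hypothetical sketch of the four object selection buttons. The first cursor is the
// earlier of the two cursors in the text, the second cursor the later one.
enum class SelectionButton { LINE_HEAD, LINE_END, HEAD, TAIL }

data class CursorPair(val first: Int, val second: Int)

fun applySelectionButton(text: String, cursors: CursorPair, button: SelectionButton): CursorPair =
    when (button) {
        // Head of line button: move the first cursor to the start of its current line.
        SelectionButton.LINE_HEAD ->
            cursors.copy(first = text.lastIndexOf('\n', cursors.first - 1) + 1)
        // Line end button: move the second cursor to the end of its current line.
        SelectionButton.LINE_END -> {
            val nl = text.indexOf('\n', cursors.second)
            cursors.copy(second = if (nl == -1) text.length else nl)
        }
        // Head button: move the first cursor to the start position of the multimedia object.
        SelectionButton.HEAD -> cursors.copy(first = 0)
        // Tail button: move the second cursor to the end position of the multimedia object.
        SelectionButton.TAIL -> cursors.copy(second = text.length)
    }

fun main() {
    val text = "first line\nsecond line\nthird line"
    val cursors = CursorPair(first = 14, second = 17)   // both inside "second line"
    println(applySelectionButton(text, cursors, SelectionButton.LINE_HEAD)) // CursorPair(first=11, second=17)
    println(applySelectionButton(text, cursors, SelectionButton.TAIL))      // CursorPair(first=14, second=33)
}
```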
Step 204, selecting the multimedia object corresponding to the position of the cursor after the cursor is moved.
The selected multimedia object may be the multimedia object lying between the position of the cursor before it is moved and its position after it is moved. Of course, the selected multimedia object may also be the multimedia object lying between the position of the cursor after it is moved and the start or end position of the multimedia object in the object display area.
Optionally, the movement information includes a movement direction and a movement distance; the controlling the cursor to move according to the movement information comprises:
controlling the cursor to move according to the moving direction and the moving distance of the first input;
wherein the moving direction of the cursor is matched with the moving direction of the cursor control;
the moving speed of the cursor is positively correlated with the moving distance, or the moving distance of the cursor is positively correlated with the moving distance.
Wherein the moving direction of the cursor matches the moving direction of the first input, for example: when the first input moves upwards, the cursor moves upwards; when the first input moves towards the left, the cursor moves towards the left; when the first input moves to the upper left, the cursor moves to the upper left.
The moving speed of the cursor may be in direct proportion to the moving distance of the first input; preferably, the moving speed of the cursor may be in direct proportion to the square of the moving distance, i.e., v = C × d², where v denotes the moving speed of the cursor, C denotes a constant, and d denotes the moving distance of the first input. It can be seen that the larger the moving distance of the first input, the faster the cursor moves, and the smaller the moving distance, the slower the cursor moves. Therefore, when the cursor movement is first controlled coarsely, the moving distance of the first input can be made larger so that the cursor moves quickly toward the target position; when the cursor movement is controlled finely, for example when the cursor is near the target position and needs to be moved between a few adjacent characters, the moving distance of the first input can be made smaller to reduce the moving speed of the cursor. In this way the movement of the cursor can be controlled accurately.
In addition, the moving distance of the cursor may be in a direct proportional relationship with the moving distance of the first input, and of course, the moving distance of the cursor may be in a direct proportional relationship with the square of the moving distance of the first input.
In the present embodiment, when the movement of the cursor is controlled by the movement direction and the movement distance of the first input, the accuracy of the movement of the cursor can be made high.
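Below is a small sketch of the v = C × d² relationship described above; the constant C, the units, and the function name cursorSpeed are illustrative assumptions rather than values given in the patent.

```kotlin
import kotlin.math.sign

// Assumed tuning constant; the text only states that v is proportional to d squared.
const val C = 0.02f

// d is the signed sliding distance of the first input from its start position (pixels).
// Returns a signed cursor speed in characters per second: v = C * d^2, with the
// cursor moving in the same direction as the slide.
fun cursorSpeed(d: Float): Float = sign(d) * C * d * d

fun main() {
    println(cursorSpeed(5f))    // 0.5   -> small slide, slow and fine positioning
    println(cursorSpeed(-60f))  // -72.0 -> large slide to the left, fast coarse movement
}
```

Squaring the distance makes the same control widget serve both coarse, fast repositioning and fine, character-by-character adjustment, which is the behavior the paragraph above describes.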
It should be noted that steps 205, 206, and 207 are optional.
Step 205, displaying at least one function button on a control component, wherein the control component is displayed in the control area and comprises the cursor control.
Referring to fig. 4, a control component 4011 is displayed in the control area 401, and the control component 4011 includes a cursor control 40111. When a plurality of function buttons are displayed on the control component 4011, the function buttons may be of different types; for example, the plurality of function buttons may include copy, cut, and paste buttons, etc., and may further include a select-all button. In addition, the object display area 402 displays the selected multimedia object 4021.
Step 206, in the case that a second input of the user for the cursor control is received, identifying second moving button information of the second input.
The second input may also be a slide input or a press input.
Step 207, executing a target operation on the selected multimedia object according to the second moving button information of the second input, where the second moving button information is used to indicate that the end position of the second input is located at a target function button of the control component, and the target operation is an operation corresponding to the target function button.
Wherein when the end position of the second input is located at a different function button, different operations can be performed with respect to the selected multimedia object.
For example: referring to fig. 4, when the cursor control 4011 moves synchronously with the second input, when the end position of the second input is located in the copy button, that is, when the cursor control 4011 moves to the copy button, the selected multimedia object 4021 may be copied; when the cursor control 4011 moves to the cut button, the selected multimedia object 4021 can be cut; when the cursor control 4011 moves to the paste button, previously copied or cut content may be pasted in the selected multimedia object 4021.
In addition, the function buttons may further include a full selection button, and when the cursor control 4011 moves to the full selection button, all multimedia objects displayed in the object display area 402 may be selected.
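The dispatch from function button to operation could look roughly like the sketch below; the names Editor, FunctionButton, and perform are hypothetical, and an in-memory clipboard stands in for the system clipboard.

```kotlin
// Hypothetical sketch: the function button at which the second input ends determines
// the operation performed on the currently selected multimedia object.
enum class FunctionButton { COPY, CUT, PASTE, SELECT_ALL }

class Editor(var text: String, var selStart: Int, var selEnd: Int) {
    var clipboard: String = ""   // stands in for the system clipboard

    fun perform(button: FunctionButton) {
        when (button) {
            FunctionButton.COPY -> clipboard = text.substring(selStart, selEnd)
            FunctionButton.CUT -> {
                clipboard = text.substring(selStart, selEnd)
                text = text.removeRange(selStart, selEnd)
                selEnd = selStart
            }
            // Paste replaces the current selection with the previously copied or cut content.
            FunctionButton.PASTE -> {
                text = text.replaceRange(selStart, selEnd, clipboard)
                selEnd = selStart + clipboard.length
            }
            FunctionButton.SELECT_ALL -> { selStart = 0; selEnd = text.length }
        }
    }
}

fun main() {
    val editor = Editor("hello world", selStart = 0, selEnd = 5)
    editor.perform(FunctionButton.CUT)    // clipboard = "hello", text = " world"
    editor.perform(FunctionButton.PASTE)  // text = "hello world" again
    println(editor.text)
}
```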
Optionally, the target operation includes copying or cutting, and after the target operation is performed on the selected multimedia object, the method further includes:
displaying a second display interface, wherein the second display interface comprises an object display area and an application program identification area, and the application program identification area comprises an identification of at least one application program;
and under the condition that third input of the identification of the target application program by the user is received, inputting the copied or cut multimedia object into the target application program according to the third input.
When the first display interface is displayed and it is detected that a selected multimedia object exists in the copy frame or the cut frame, the second display interface can be displayed directly. Of course, the second display interface may also be displayed according to an input instruction of the user; the input instruction is not specifically limited herein. For example: the input instruction may be a press on the object display area whose duration exceeds a target time value, or it may be a voice instruction.
Referring to fig. 5, the second display interface includes an object display area 501 and an application program identification area 502; the object display area 501 may display the multimedia objects or may display only the selected multimedia object.
In addition, the application identification area 502 includes an identification of at least one application program. For example, the identifications may include an identification of a background application program and/or an identification of an application program whose usage frequency exceeds a threshold, where the usage frequency may be the frequency of use within a period; the length of the period may be one week, one month, or half a year, and the specific value of the threshold is not limited herein.
In addition, optionally, the identification of the background application program may be highlighted among the identifications of the at least one application program. For example: the identification of the background application program may be displayed first, displayed in bold, or displayed in an enlarged manner.
Alternatively, the identification of an application program matching the type of the selected multimedia object may be highlighted among the identifications of the at least one application program. For example: if the type of the selected multimedia object is a picture, the identification of a picture-type application program may be highlighted; if the type of the selected multimedia object is a web address, the identification of a browser-type application program may be highlighted. The details of the highlighting may be the same as in the previous alternative embodiment.
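One way the identifications in the application identification area might be filtered and ordered is sketched below; the types AppId and MediaType, the usage threshold, and the ranking rules are assumptions for illustration only.

```kotlin
// Hypothetical sketch: list background applications and applications whose type matches
// the selected multimedia object first, so that they can be highlighted in the
// application identification area.
enum class MediaType { TEXT, PICTURE, WEB_ADDRESS, VIDEO }

data class AppId(
    val name: String,
    val isBackground: Boolean,
    val handledTypes: Set<MediaType>,
    val usesPerMonth: Int
)

fun rankAppIds(apps: List<AppId>, selectedType: MediaType, usageThreshold: Int = 10): List<AppId> =
    apps.filter { it.isBackground || it.usesPerMonth > usageThreshold }    // background or frequently used
        .sortedWith(
            compareByDescending<AppId> { selectedType in it.handledTypes } // type match first
                .thenByDescending { it.isBackground }                      // then background applications
                .thenByDescending { it.usesPerMonth }                      // then by usage frequency
        )

fun main() {
    val apps = listOf(
        AppId("Notes", isBackground = false, handledTypes = setOf(MediaType.TEXT), usesPerMonth = 40),
        AppId("Gallery", isBackground = true, handledTypes = setOf(MediaType.PICTURE, MediaType.VIDEO), usesPerMonth = 25),
        AppId("Browser", isBackground = false, handledTypes = setOf(MediaType.WEB_ADDRESS), usesPerMonth = 30)
    )
    println(rankAppIds(apps, MediaType.PICTURE).map { it.name })  // [Gallery, Notes, Browser]
}
```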
In this embodiment, the copied or cut multimedia object can be input into the target application program directly. Compared with the prior-art approach, in which the corresponding application program has to be opened separately and the copied or cut multimedia object then entered into its input box, this embodiment offers higher input efficiency and is more convenient for the user.
In the embodiment of the present invention, through steps 201 to 207, after the multimedia object is selected, a variety of corresponding operations can be performed on it, so the selected multimedia object can be used more flexibly.
Referring to fig. 6, fig. 6 is a schematic structural diagram of a terminal device according to an embodiment of the present invention, which can implement details of a multimedia object selection method in the foregoing embodiment and achieve the same effect. As shown in fig. 6, the terminal apparatus 600 includes:
a first display module 601, configured to display a first display interface, where the first display interface includes an object display area and a control area, the object display area displays a multimedia object and a cursor, and the control area displays a cursor control;
a first identification module 602, configured to, in a case that a first input for the cursor control is received by a user, identify movement information of the first input;
a control module 603, configured to control the cursor to move according to the movement information;
a selecting module 604, configured to select a multimedia object corresponding to a position where the cursor is moved.
Optionally, the movement information includes a movement direction and a movement distance;
the control module 603 is further configured to control the cursor to move according to the moving direction and the moving distance of the first input;
wherein the direction of movement of the cursor matches the direction of movement of the first input;
the moving speed of the cursor is positively correlated with the moving distance, or the moving distance of the cursor is positively correlated with the moving distance.
Optionally, a first cursor and a second cursor are displayed in the object display area, a first cursor control and a second cursor control are displayed in the control area, the first input is an input for at least one of the first cursor control and the second cursor control, where the first cursor control is used to control the first cursor, and the second cursor control is used to control the second cursor.
Optionally, at least one cursor is displayed in the object display area, a control component is displayed in the control area, and the control component includes the cursor control and at least one object selection button;
the control module 603 is further configured to control a target cursor to move to a target position according to first moving button information of the first input;
wherein the first moving button information is used to indicate that the end position of the first input is located at a target object selection button in the control component; the target cursor is a cursor corresponding to the target object selection button, and the target position is a moving position of the cursor corresponding to the target object selection button.
Optionally, a first cursor and a second cursor are displayed in the object display area;
the at least one object selection button comprises at least one of: a head of line button, a line end button, a head button, and a tail button;
the cursor corresponding to the head of line button is the first cursor, and the cursor moving position corresponding to the head of line button is the head of line position of the line where the first cursor is located currently;
the cursor corresponding to the line end button is the second cursor, and the cursor moving position corresponding to the line end button is the line end position of the line where the second cursor is located currently;
the cursor corresponding to the head button is the first cursor, and the cursor moving position corresponding to the head button is the multimedia object initial position of the object display area;
the cursor corresponding to the tail button is the second cursor, and the cursor moving position corresponding to the tail button is the multimedia object ending position of the object display area.
Optionally, referring to fig. 7, the terminal device 600 further includes:
a second display module 605 for displaying at least one function button on the control component;
a second identifying module 606, configured to, in a case that a second input of the cursor control is received, identify second moving button information of the second input;
an executing module 607, configured to execute a target operation on the selected multimedia object according to second moving button information of the second input, where the second moving button information is used to indicate that the end position of the second input is located at a target function button of the control component, and the target operation is an operation corresponding to the target function button.
Optionally, referring to fig. 8, the terminal device 600 further includes:
a third display module 608, configured to display a second display interface, where the second display interface includes the object display area and an application identification area, and the application identification area includes an identification of at least one application;
an input module 609, configured to, in a case that a third input of the user for the identification of the target application program is received, input the copied or cut multimedia object into the target application program according to the third input.
The terminal device 600 provided in the embodiment of the present invention can implement each process implemented by the terminal device in the method embodiments of fig. 1 to fig. 2, and is not described here again to avoid repetition. The embodiment of the invention can also improve the accuracy of the terminal in selecting the multimedia objects.
Fig. 9 is a schematic diagram of a hardware structure of a terminal device for implementing various embodiments of the present invention.
The terminal device 900 includes but is not limited to: a radio frequency unit 901, a network module 902, an audio output unit 903, an input unit 904, a sensor 905, a display unit 906, a user input unit 907, an interface unit 908, a memory 909, a processor 910, and a power supply 911. Those skilled in the art will appreciate that the terminal device configuration shown in fig. 9 does not constitute a limitation of the terminal device, and that the terminal device may include more or fewer components than shown, or combine certain components, or a different arrangement of components. In the embodiment of the present invention, the terminal device includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted terminal, a wearable device, a pedometer, and the like.
The display unit 906 is configured to display a first display interface, where the first display interface includes an object display area and a control area, the object display area displays a multimedia object and a cursor, and the control area displays a cursor control;
a processor 910 configured to: in the case that a first input of a user for the cursor control is received, identifying movement information of the first input; controlling the cursor to move according to the movement information; and selecting the multimedia object corresponding to the position of the cursor after the cursor is moved.
Optionally, the movement information includes a movement direction and a movement distance;
the controlling the cursor movement according to the movement information performed by processor 910 includes:
controlling the cursor to move according to the moving direction and the moving distance of the first input;
wherein the direction of movement of the cursor matches the direction of movement of the first input;
the moving speed of the cursor is positively correlated with the moving distance, or the moving distance of the cursor is positively correlated with the moving distance.
Optionally, a first cursor and a second cursor are displayed in the object display area, a first cursor control and a second cursor control are displayed in the control area, the first input is an input for at least one of the first cursor control and the second cursor control, where the first cursor control is used to control the first cursor, and the second cursor control is used to control the second cursor.
Optionally, at least one cursor is displayed in the object display area, a control component is displayed in the control area, and the control component includes the cursor control and at least one object selection button;
the controlling the cursor movement according to the movement information performed by processor 910 includes:
controlling a target cursor to move to a target position according to first moving button information of the first input;
wherein the first moving button information is used to indicate that the end position of the first input is located at a target object selection button in the control component; the target cursor is a cursor corresponding to the target object selection button, and the target position is a moving position of the cursor corresponding to the target object selection button.
Optionally, a first cursor and a second cursor are displayed in the object display area;
the at least one object selection button comprises at least one of: a head of line button, a line end button, a head button, and a tail button;
the cursor corresponding to the head of line button is the first cursor, and the cursor moving position corresponding to the head of line button is the head of line position of the line where the first cursor is located currently;
the cursor corresponding to the line end button is the second cursor, and the cursor moving position corresponding to the line end button is the line end position of the line where the second cursor is located currently;
the cursor corresponding to the head button is the first cursor, and the cursor moving position corresponding to the head button is the multimedia object initial position of the object display area;
the cursor corresponding to the tail button is the second cursor, and the cursor moving position corresponding to the tail button is the multimedia object ending position of the object display area.
Optionally, a control component is displayed in the control area, where the control component includes the cursor control, and the display unit 906 is further configured to: displaying at least one function button on the control assembly;
processor 910, further configured to:
in the case that a second input of the user for the cursor control is received, identifying second moving button information of the second input; and executing a target operation on the selected multimedia object according to the second moving button information of the second input, wherein the second moving button information is used for indicating that the end position of the second input is located at a target function button of the control component, and the target operation is the operation corresponding to the target function button.
Optionally, the target operation includes copying or cutting;
the display unit 906 is further configured to display a second display interface, where the second display interface includes the object display area and an application identification area, and the application identification area includes an identification of at least one application;
the processor 910 is further configured to: and under the condition that third input of the identification of the target application program by the user is received, inputting the copied or cut multimedia object into the target application program according to the third input.
The embodiment of the invention can also improve the accuracy of the terminal in selecting the multimedia objects.
It should be understood that, in the embodiment of the present invention, the radio frequency unit 901 may be used for receiving and sending signals during a message transmission and reception process or a call process, and specifically, after receiving downlink data from a base station, the downlink data is processed by the processor 910; in addition, the uplink data is transmitted to the base station. Generally, the radio frequency unit 901 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 901 can also communicate with a network and other devices through a wireless communication system.
The terminal device provides wireless broadband internet access to the user through the network module 902, such as helping the user send and receive e-mails, browse webpages, access streaming media, and the like.
The audio output unit 903 may convert audio data received by the radio frequency unit 901 or the network module 902 or stored in the memory 909 into an audio signal and output as sound. Also, the audio output unit 903 may also provide audio output related to a specific function performed by the terminal apparatus 900 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output unit 903 includes a speaker, a buzzer, a receiver, and the like.
The input unit 904 is used to receive audio or video signals. The input unit 904 may include a Graphics Processing Unit (GPU) 9041 and a microphone 9042, and the graphics processor 9041 processes image data of still pictures or video obtained by an image capturing device (such as a camera) in a video capture mode or an image capture mode. The processed image frames may be displayed on the display unit 906. The image frames processed by the graphics processor 9041 may be stored in the memory 909 (or other storage medium) or transmitted via the radio frequency unit 901 or the network module 902. The microphone 9042 can receive sounds and can process such sounds into audio data. In the case of the phone call mode, the processed audio data may be converted into a format that can be transmitted to a mobile communication base station via the radio frequency unit 901 and then output.
The terminal device 900 also includes at least one sensor 905, such as light sensors, motion sensors, and other sensors. Specifically, the light sensor includes an ambient light sensor that can adjust the brightness of the display panel 9061 according to the brightness of ambient light, and a proximity sensor that can turn off the display panel 9061 and/or backlight when the terminal device 900 is moved to the ear. As one of the motion sensors, the accelerometer sensor can detect the magnitude of acceleration in each direction (generally three axes), detect the magnitude and direction of gravity when stationary, and can be used to identify the terminal device posture (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), vibration identification related functions (such as pedometer, tapping), and the like; the sensors 905 may also include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, etc., which are not described in detail herein.
The display unit 906 is used to display information input by the user or information provided to the user. The Display unit 906 may include a Display panel 9061, and the Display panel 9061 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like.
The user input unit 907 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the terminal device. Specifically, the user input unit 907 includes a touch panel 9071 and other input devices 9072. The touch panel 9071, also referred to as a touch screen, may collect touch operations by a user on or near the touch panel 9071 (e.g., operations by a user on or near the touch panel 9071 using a finger, a stylus, or any other suitable object or accessory). The touch panel 9071 may include two parts, a touch detection device and a touch controller. The touch detection device detects the touch direction of a user, detects a signal brought by touch operation and transmits the signal to the touch controller; the touch controller receives touch information from the touch sensing device, converts the touch information into touch point coordinates, sends the touch point coordinates to the processor 910, receives a command from the processor 910, and executes the command. In addition, the touch panel 9071 may be implemented by using various types such as a resistive type, a capacitive type, an infrared ray, and a surface acoustic wave. The user input unit 907 may include other input devices 9072 in addition to the touch panel 9071. Specifically, the other input devices 9072 may include, but are not limited to, a physical keyboard, function keys (such as a volume control key, a switch key, and the like), a track ball, a mouse, and a joystick, which are not described herein again.
Further, the touch panel 9071 may be overlaid on the display panel 9061, and when the touch panel 9071 detects a touch operation on or near the touch panel 9071, the touch panel is transmitted to the processor 910 to determine the type of the touch event, and then the processor 910 provides a corresponding visual output on the display panel 9061 according to the type of the touch event. Although in fig. 9, the touch panel 9071 and the display panel 9061 are implemented as two independent components to implement the input and output functions of the terminal device, in some embodiments, the touch panel 9071 and the display panel 9061 may be integrated to implement the input and output functions of the terminal device, which is not limited herein.
The interface unit 908 is an interface for connecting an external device to the terminal apparatus 900. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 908 may be used to receive input (e.g., data information, power, etc.) from external devices and transmit the received input to one or more elements within the terminal apparatus 900 or may be used to transmit data between the terminal apparatus 900 and external devices.
The memory 909 may be used to store software programs as well as various data. The memory 909 may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required for at least one function (such as a sound playing function, an image playing function, etc.), and the like; the storage data area may store data (such as audio data, a phonebook, etc.) created according to the use of the cellular phone, and the like. Further, the memory 909 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid-state storage device.
The processor 910 is a control center of the terminal device, connects various parts of the entire terminal device with various interfaces and lines, and performs various functions of the terminal device and processes data by running or executing software programs and/or modules stored in the memory 909 and calling data stored in the memory 909, thereby performing overall monitoring of the terminal device. Processor 910 may include one or more processing units; preferably, the processor 910 may integrate an application processor, which mainly handles operating systems, user interfaces, application programs, etc., and a modem processor, which mainly handles wireless communications. It is to be appreciated that the modem processor described above may not be integrated into processor 910.
The terminal device 900 may further include a power supply 911 (e.g., a battery) for supplying power to various components, and preferably, the power supply 911 may be logically connected to the processor 910 through a power management system, so as to manage charging, discharging, and power consumption management functions through the power management system.
In addition, the terminal device 900 includes some functional modules that are not shown, and are not described in detail here.
Preferably, an embodiment of the present invention further provides a terminal device, which includes a processor 910, a memory 909, and a computer program that is stored in the memory 909 and can be run on the processor 910, and when the computer program is executed by the processor 910, the computer program implements each process of the embodiment of the multimedia object selection method, and can achieve the same technical effect, and in order to avoid repetition, details are not described here again.
The embodiment of the present invention further provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the computer program implements each process of the embodiment of the method for selecting a multimedia object, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here. The computer-readable storage medium may be a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present invention.
While the present invention has been described with reference to the embodiments shown in the drawings, the present invention is not limited to the embodiments, which are illustrative and not restrictive, and it will be apparent to those skilled in the art that various changes and modifications can be made therein without departing from the spirit and scope of the invention as defined in the appended claims.

Claims (10)

1. A method for multimedia object selection, comprising:
displaying a first display interface, wherein the first display interface comprises an object display area and a control area, the object display area displays a multimedia object and a cursor, and the control area displays a cursor control;
in the case that a first input of a user for the cursor control is received, identifying movement information of the first input;
controlling the cursor to move according to the movement information;
selecting a multimedia object corresponding to the position of the cursor after the cursor moves;
the object display area displays a first cursor and a second cursor, the control area displays a control component, and the control component comprises a cursor control and at least one object selection button; the cursor control comprises a first cursor control and a second cursor control, the first input is an input for at least one of the first cursor control and the second cursor control, wherein the first cursor control is used for controlling the first cursor, and the second cursor control is used for controlling the second cursor;
the controlling the cursor to move according to the movement information comprises:
controlling a target cursor to move to a target position according to first movement button information of the first input;
wherein the first movement button information is used to indicate that the end position of the first input is located at a target object selection button in the control component; the target cursor is the cursor corresponding to the target object selection button, and the target position is the cursor movement position corresponding to the target object selection button;
the at least one object selection button comprises at least one of: a line head button, a line end button, a head button, and a tail button;
the cursor corresponding to the line head button is the first cursor, and the cursor movement position corresponding to the line head button is the line head position of the line where the first cursor is currently located;
the cursor corresponding to the line end button is the second cursor, and the cursor movement position corresponding to the line end button is the line end position of the line where the second cursor is currently located;
the cursor corresponding to the head button is the first cursor, and the cursor movement position corresponding to the head button is the start position of the multimedia objects in the object display area;
the cursor corresponding to the tail button is the second cursor, and the cursor movement position corresponding to the tail button is the end position of the multimedia objects in the object display area.
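The button-to-cursor mapping recited in claim 1 can be pictured with a short sketch. The following Kotlin is illustrative only and is not part of the claims; the names SelectionController, ObjectButton, and Cursor, and the row/column representation of cursor positions, are assumptions made for the example.

```kotlin
// Illustrative sketch; not part of the claims. Names and data layout are assumed.
enum class ObjectButton { LINE_HEAD, LINE_END, HEAD, TAIL }

data class Cursor(var row: Int, var column: Int)

class SelectionController(
    private val firstCursor: Cursor,        // marks the start of the selection
    private val secondCursor: Cursor,       // marks the end of the selection
    private val lineLengths: List<Int>      // number of multimedia objects per displayed line
) {
    // The first input ends on a target object selection button; the cursor
    // corresponding to that button jumps to the movement position defined for it.
    fun onSelectionButton(button: ObjectButton) {
        when (button) {
            // Line head button: first cursor to the head of its current line.
            ObjectButton.LINE_HEAD -> firstCursor.column = 0
            // Line end button: second cursor to the end of its current line.
            ObjectButton.LINE_END -> secondCursor.column = lineLengths[secondCursor.row]
            // Head button: first cursor to the start of all objects in the display area.
            ObjectButton.HEAD -> { firstCursor.row = 0; firstCursor.column = 0 }
            // Tail button: second cursor to the end of all objects in the display area.
            ObjectButton.TAIL -> {
                secondCursor.row = lineLengths.lastIndex
                secondCursor.column = lineLengths.last()
            }
        }
    }
}
```

Everything lying between the first cursor and the second cursor after such a jump would then form the selected range of multimedia objects.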
2. The method of claim 1, wherein the movement information includes a movement direction and a movement distance;
the controlling the cursor to move according to the movement information comprises:
controlling the cursor to move according to the moving direction and the moving distance of the first input;
wherein the direction of movement of the cursor matches the direction of movement of the first input;
the moving speed of the cursor is positively correlated with the moving distance of the first input, or the moving distance of the cursor is positively correlated with the moving distance of the first input.
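Claim 2 leaves open whether the drag on the cursor control scales the cursor's speed or its distance. Below is a minimal sketch of both options, assuming an index-based cursor position, a signed horizontal drag component, and arbitrary gain constants; none of these names or values come from the patent.

```kotlin
// Illustrative sketch; the class name and gain constants are assumptions.
class CursorDragMapper(private var cursorIndex: Int, private val objectCount: Int) {

    // Option 1: the cursor's moving distance is positively correlated with the
    // moving distance of the first input (the sign of inputDx carries the direction).
    fun moveByDistance(inputDx: Float): Int {
        val step = (inputDx * DISTANCE_GAIN).toInt()
        cursorIndex = (cursorIndex + step).coerceIn(0, objectCount - 1)
        return cursorIndex
    }

    // Option 2: the cursor's moving speed (objects per second) is positively
    // correlated with the moving distance of the first input while the drag is held.
    fun speedFor(inputDx: Float): Float = inputDx * SPEED_GAIN

    companion object {
        const val DISTANCE_GAIN = 0.2f  // cursor steps per pixel dragged (assumed)
        const val SPEED_GAIN = 0.05f    // objects per second per pixel dragged (assumed)
    }
}
```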
3. The method of claim 1, wherein the control area displays a control component including the cursor control, and wherein, after the selecting of the multimedia object corresponding to the position of the cursor after the cursor moves, the method further comprises:
displaying at least one function button on the control component;
in the case that a second input of the user for the cursor control is received, identifying second movement button information of the second input;
and performing a target operation on the selected multimedia object according to the second movement button information, wherein the second movement button information is used to indicate that the end position of the second input is located at a target function button of the control component, and the target operation is the operation corresponding to the target function button.
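One way to read claim 3 is that releasing the second input over a function button dispatches the operation bound to that button on the current selection. The sketch below assumes a FunctionButton enum and operation callbacks supplied by the host view; these names are illustrative and are not defined by the patent.

```kotlin
// Illustrative sketch; FunctionButton and the operation callbacks are assumptions.
enum class FunctionButton { COPY, CUT }

class FunctionButtonDispatcher(
    private val copy: (IntRange) -> Unit,   // copies the selected multimedia objects
    private val cut: (IntRange) -> Unit     // cuts the selected multimedia objects
) {
    // The second movement button information tells us which function button the
    // second input ended on; the matching operation runs on the selected range.
    fun onSecondInputEnd(endButton: FunctionButton, selection: IntRange) {
        when (endButton) {
            FunctionButton.COPY -> copy(selection)
            FunctionButton.CUT -> cut(selection)
        }
    }
}
```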
4. The method of claim 3, wherein the target operation comprises a copy operation or a cut operation, and wherein, after the target operation is performed on the selected multimedia object, the method further comprises:
displaying a second display interface, wherein the second display interface comprises an object display area and an application program identification area, and the application program identification area comprises an identification of at least one application program;
and in the case that a third input of the user for an identification of a target application program is received, inputting the copied or cut multimedia object into the target application program according to the third input.
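If the terminal device is an Android handset, handing the copied or cut content to the application whose identification was tapped could be done with a share intent, roughly as below. The Android API calls shown are real, but the overall flow is only an assumption for illustration; the patent does not tie the method to any particular platform, and targetPackage stands in for the application the user selected.

```kotlin
import android.content.Context
import android.content.Intent

// Illustrative sketch assuming an Android terminal device; not mandated by the patent.
fun sendCopiedContentToApp(context: Context, copiedText: String, targetPackage: String) {
    val intent = Intent(Intent.ACTION_SEND).apply {
        type = "text/plain"                      // other MIME types for non-text multimedia objects
        putExtra(Intent.EXTRA_TEXT, copiedText)  // the copied or cut content
        setPackage(targetPackage)                // route directly to the chosen application
        addFlags(Intent.FLAG_ACTIVITY_NEW_TASK)  // required when starting outside an Activity
    }
    context.startActivity(intent)
}
```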
5. A terminal device, comprising:
the first display module is used for displaying a first display interface, wherein the first display interface comprises an object display area and a control area, the object display area displays a multimedia object and a cursor, and the control area displays a cursor control;
the first identification module is used for identifying the movement information of a first input when the first input of a user for the cursor control is received;
the control module is used for controlling the cursor to move according to the movement information;
the selection module is used for selecting the multimedia object corresponding to the position of the cursor after the cursor moves;
the object display area displays a first cursor and a second cursor, the control area displays a control component, and the control component comprises the cursor control and at least one object selection button; the cursor control comprises a first cursor control and a second cursor control, and the first input is an input for at least one of the first cursor control and the second cursor control, wherein the first cursor control is used for controlling the first cursor and the second cursor control is used for controlling the second cursor;
the control module is further used for controlling a target cursor to move to a target position according to first movement button information of the first input;
wherein the first movement button information is used to indicate that the end position of the first input is located at a target object selection button in the control component; the target cursor is the cursor corresponding to the target object selection button, and the target position is the cursor movement position corresponding to the target object selection button;
the at least one object selection button comprises at least one of: a line head button, a line end button, a head button, and a tail button;
the cursor corresponding to the line head button is the first cursor, and the cursor movement position corresponding to the line head button is the line head position of the line where the first cursor is currently located;
the cursor corresponding to the line end button is the second cursor, and the cursor movement position corresponding to the line end button is the line end position of the line where the second cursor is currently located;
the cursor corresponding to the head button is the first cursor, and the cursor movement position corresponding to the head button is the start position of the multimedia objects in the object display area;
the cursor corresponding to the tail button is the second cursor, and the cursor movement position corresponding to the tail button is the end position of the multimedia objects in the object display area.
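Viewed as software modules, the terminal device of claim 5 decomposes into the four units sketched below. The interfaces are only a schematic restatement of the claim; the names and simplified types are assumptions, not an API defined by the patent.

```kotlin
// Illustrative sketch; interface names and types are assumed, not defined by the patent.
data class MovementInfo(val dx: Float, val dy: Float)

interface FirstDisplayModule { fun showFirstInterface() }            // object display area + control area
interface FirstIdentificationModule { fun identify(input: Any): MovementInfo }
interface ControlModule { fun moveCursor(movement: MovementInfo) }   // moves the cursor per the movement info
interface SelectionModule { fun selectAtCursors() }                  // selects objects at the cursor position(s)

class TerminalDevice(
    private val display: FirstDisplayModule,
    private val identifier: FirstIdentificationModule,
    private val control: ControlModule,
    private val selector: SelectionModule
) {
    // First input on the cursor control: identify its movement, move the cursor,
    // then select the multimedia object at the cursor's new position.
    fun onFirstInput(input: Any) {
        val movement = identifier.identify(input)
        control.moveCursor(movement)
        selector.selectAtCursors()
    }
}
```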
6. The terminal device according to claim 5, wherein the movement information includes a movement direction and a movement distance;
the control module is further used for controlling the cursor to move according to the moving direction and the moving distance of the first input;
wherein the direction of movement of the cursor matches the direction of movement of the first input;
the moving speed of the cursor is positively correlated with the moving distance of the first input, or the moving distance of the cursor is positively correlated with the moving distance of the first input.
7. The terminal device of claim 5, wherein the terminal device further comprises:
a second display module for displaying at least one function button on the control component;
the second identification module is used for identifying second movement button information of a second input when the second input of the user for the cursor control is received;
and the execution module is used for performing a target operation on the selected multimedia object according to the second movement button information, wherein the second movement button information is used to indicate that the end position of the second input is located at a target function button of the control component, and the target operation is the operation corresponding to the target function button.
8. The terminal device of claim 7, wherein the terminal device further comprises:
the third display module is used for displaying a second display interface, wherein the second display interface comprises an object display area and an application program identification area, and the application program identification area comprises an identification of at least one application program;
and the input module is used for inputting the copied or cut multimedia object into the target application program according to a third input when the third input of the user for an identification of the target application program is received.
9. A terminal device, comprising: a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor, when executing the computer program, implements the steps of the multimedia object selection method according to any one of claims 1 to 4.
10. A computer-readable storage medium, characterized in that a computer program is stored thereon, and the computer program, when executed by a processor, implements the steps of the multimedia object selection method according to any one of claims 1 to 4.
CN201910330197.8A 2019-04-23 2019-04-23 Multimedia object selection method and terminal equipment Active CN110333803B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910330197.8A CN110333803B (en) 2019-04-23 2019-04-23 Multimedia object selection method and terminal equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910330197.8A CN110333803B (en) 2019-04-23 2019-04-23 Multimedia object selection method and terminal equipment

Publications (2)

Publication Number Publication Date
CN110333803A CN110333803A (en) 2019-10-15
CN110333803B true CN110333803B (en) 2021-08-13

Family

ID=68139729

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910330197.8A Active CN110333803B (en) 2019-04-23 2019-04-23 Multimedia object selection method and terminal equipment

Country Status (1)

Country Link
CN (1) CN110333803B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112650437B (en) * 2020-12-23 2022-06-03 北京小米移动软件有限公司 Cursor control method and device, electronic equipment and storage medium
CN114035714A (en) * 2021-09-24 2022-02-11 武汉联影医疗科技有限公司 Cursor control method and device, ultrasonic equipment and storage medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102880418A (en) * 2012-09-14 2013-01-16 广州市动景计算机科技有限公司 Method and device for selecting text based on touch screen type mobile terminal
CN104536665A (en) * 2014-12-29 2015-04-22 小米科技有限责任公司 Cursor moving method and device
CN105912258A (en) * 2016-04-13 2016-08-31 北京小米移动软件有限公司 Method and device for operation processing
CN106547421A (en) * 2015-09-23 2017-03-29 浙江格林蓝德信息技术有限公司 Instruction input method and device based on cursor event
CN108334267A (en) * 2018-02-05 2018-07-27 广东欧珀移动通信有限公司 A kind of cursor-moving method, system and terminal device

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20150024247A (en) * 2013-08-26 2015-03-06 삼성전자주식회사 Method and apparatus for executing application using multiple input tools on touchscreen device
CN103914260B (en) * 2014-03-31 2021-03-23 苏州浩辰软件股份有限公司 Control method and device for operation object based on touch screen

Also Published As

Publication number Publication date
CN110333803A (en) 2019-10-15

Similar Documents

Publication Publication Date Title
CN108762954B (en) Object sharing method and mobile terminal
CN109388304B (en) Screen capturing method and terminal equipment
CN108737904B (en) Video data processing method and mobile terminal
CN110851040B (en) Information processing method and electronic equipment
CN107943390B (en) Character copying method and mobile terminal
CN109213416B (en) Display information processing method and mobile terminal
CN108132752B (en) Text editing method and mobile terminal
CN110007835B (en) Object management method and mobile terminal
WO2021004426A1 (en) Content selection method, and terminal
CN111142723B (en) Icon moving method and electronic equipment
CN110196668B (en) Information processing method and terminal equipment
CN110673770B (en) Message display method and terminal equipment
CN110531915B (en) Screen operation method and terminal equipment
CN109683802B (en) Icon moving method and terminal
CN110909524B (en) Editing method and electronic equipment
CN108228902B (en) File display method and mobile terminal
CN108646960B (en) File processing method and flexible screen terminal
CN109189303B (en) Text editing method and mobile terminal
CN108196753B (en) Interface switching method and mobile terminal
US11895069B2 (en) Message sending method and mobile terminal
CN110096203B (en) Screenshot method and mobile terminal
CN111610904B (en) Icon arrangement method, electronic device and storage medium
CN109271262B (en) Display method and terminal
CN109710130B (en) Display method and terminal
CN109683768B (en) Application operation method and mobile terminal

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant