CN112015270A - Terminal control method, terminal and computer storage medium - Google Patents


Info

Publication number
CN112015270A
CN112015270A (application CN202010846916.4A)
Authority
CN
China
Prior art keywords
terminal
gesture
target object
display interface
control method
Prior art date
Legal status
Pending
Application number
CN202010846916.4A
Other languages
Chinese (zh)
Inventor
姜顺豹
Current Assignee
Shanghai Qinggan Intelligent Technology Co Ltd
Original Assignee
Shanghai Qinggan Intelligent Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Shanghai Qinggan Intelligent Technology Co Ltd filed Critical Shanghai Qinggan Intelligent Technology Co Ltd

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/017: Gesture-based interaction, e.g. based on a set of recognized hand gestures
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484: Interaction techniques based on GUIs for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range

Abstract

The invention discloses a terminal control method, a terminal, and a computer storage medium. The terminal control method comprises: displaying a selection indicator on the current display interface of the terminal; acquiring, via the terminal's video monitoring device, an air gesture performed by a user without touching the terminal; and, according to the air gesture, controlling the selection indicator to select a target object or performing a preset operation on the target object currently selected by the indicator. By displaying the selection indicator on the current display interface in real time and driving it with the user's air gestures, the method gives the user immediate visual feedback, ensures accurate gesture control of the terminal, and thereby effectively improves the user experience.

Description

Terminal control method, terminal and computer storage medium
Technical Field
The present invention relates to the field of terminals, and in particular, to a terminal control method, a terminal, and a computer storage medium.
Background
With the rapid development of mobile communication technology and the popularization of terminals, terminals such as mobile phones have become an indispensable part of people's daily life and work, for example for chatting with friends or reading e-mail. In the related art, a user operates a terminal such as a mobile phone by touching the screen with a finger or by making gestures. Existing contactless (air) gesture recognition on terminals, however, uses a sensor to map abstract coordinates onto the current interface. Because this approach suffers from a high false-recognition rate and provides little visible gesture feedback, it degrades the accuracy of gesture control of the terminal and, in turn, the user experience.
Disclosure of Invention
The invention aims to provide a terminal control method, a terminal, and a computer storage medium that can effectively improve the user experience.
To this end, the technical solution of the invention is realized as follows:
In a first aspect, an embodiment of the present invention provides a terminal control method, comprising:
displaying a selection indicator on the current display interface of the terminal;
acquiring, via the terminal's video monitoring device, an air gesture performed by a user without touching the terminal; and
controlling, according to the air gesture, the selection indicator to select a target object, or performing a preset operation on the target object currently selected by the selection indicator.
In one embodiment, displaying a selection indicator on the current display interface of the terminal includes:
after receiving an air gesture mode activation instruction, selecting on the current display interface an initial target object that satisfies a preset selection rule, and displaying that target object enlarged with a set animation effect.
In one embodiment, the air gesture includes one or more of: a palm swiping left in mid-air, a palm swiping right in mid-air, a finger moving left in mid-air, a finger moving right in mid-air, a finger moving up in mid-air, a finger moving down in mid-air, and a finger clicking in mid-air.
In one embodiment, controlling the selection indicator to select the target object, or performing a preset operation on the currently selected target object, according to the air gesture includes:
obtaining the operation instruction corresponding to the air gesture from a preset correspondence between air gestures and control instructions; and
controlling the selection indicator to select the target object, or performing the preset operation on the currently selected target object, according to the obtained operation instruction.
In one implementation, when the air gesture is a palm swiping left or right in mid-air, the terminal is controlled to switch the display interface in the corresponding direction and, on the updated interface, to display the updated target object enlarged with the set animation effect; alternatively, when the air gesture is a palm swiping left or right in mid-air, the terminal is controlled to pan the current display interface in the corresponding direction.
When the air gesture is a finger moving left, right, up, or down in mid-air, the selection indicator is controlled to move according to a preset movement rule to select the target object.
In one embodiment, when the air gesture is a finger clicking in mid-air, the currently selected target object is opened or operated.
In one embodiment, where the terminal is an in-vehicle head unit, acquiring the air gesture via the terminal's video monitoring device includes:
after receiving an air gesture mode activation instruction, switching on a vehicle-mounted camera of the vehicle in which the head unit is installed, so as to capture video frames containing the user's hand motion in real time; and
obtaining, from those video frames, the air gesture performed by the user without touching the terminal.
In a second aspect, an embodiment of the present invention provides a terminal, comprising a processor and a memory storing a computer program runnable on the processor,
wherein the processor implements the steps of the terminal control method of the first aspect when running the computer program.
In a third aspect, an embodiment of the present invention provides a computer storage medium storing a computer program which, when executed by a processor, implements the steps of the terminal control method of the first aspect.
The embodiments of the invention provide a terminal control method, a terminal, and a computer storage medium. The method comprises: displaying a selection indicator on the current display interface of the terminal; acquiring, via the terminal's video monitoring device, an air gesture performed by a user without touching the terminal; and, according to the air gesture, controlling the selection indicator to select a target object or performing a preset operation on the currently selected target object. Displaying the selection indicator on the current display interface in real time and driving it with the user's air gestures thus ensures accurate gesture control of the terminal, is simple to operate, and effectively improves the user experience.
Drawings
Fig. 1 is a schematic flowchart of a control method of a terminal according to an embodiment of the present invention;
fig. 2 is a schematic flowchart illustrating a specific control method of a terminal according to an embodiment of the present invention;
FIG. 3 is a first diagram illustrating a rule of focus shifting according to an embodiment of the present invention;
FIG. 4 is a second diagram illustrating a rule of focus shifting according to an embodiment of the present invention;
fig. 5 is a schematic structural diagram of a terminal according to an embodiment of the present invention.
Detailed Description
The technical solution of the invention is elaborated below with reference to the drawings and specific embodiments. Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. The terminology used herein is for describing particular embodiments only and is not intended to limit the invention. As used herein, the term "and/or" includes any and all combinations of one or more of the associated listed items.
Referring to fig. 1, the terminal control method provided by the embodiment of the present invention is applied to a terminal, which may be a mobile device such as a smartphone, a personal digital assistant, a tablet computer, or an in-vehicle head unit, or a fixed device such as a desktop computer. The method comprises the following steps:
step S101: displaying a selected identifier on a current display interface of the terminal;
it should be noted that, in the terminal control method of the present application, a selected identifier is displayed on a current display interface of a terminal, and then, based on an air-break gesture of a user, which is shot by a monitoring video device of the terminal and does not contact the terminal, the selected identifier is controlled to select a target object or preset operation is performed on the target object currently selected by the selected identifier, on one hand, the above function may be named as an air-break gesture mode, or may be named as another function mode; on the other hand, the function mode may be always in an operating state as a system function, or may be to operate the function after receiving an activation instruction. Therefore, the descriptive description of the spaced gesture mode does not limit the control method of the terminal of the present application, but only describes the functional effects of the control method of the terminal of the present application in a general way. In this embodiment, an air gesture mode is set in the terminal, and in the air gesture mode, the terminal may operate a target object in the terminal according to an air gesture in which a user does not contact the terminal. The selected identifier is used to distinguish the selected target object from other objects, and may be a selection frame displayed around the target object, a circle, a small box, or the like displayed on the target object, or a target object displayed in a protruding or flashing manner. The selected indicia may be moved to select a target object on the display interface, which may be an application icon, a virtual key, etc. 
In addition, the selection indicator may be displayed at a set position of the current display interface, such as the upper-left, upper-right, lower-left, or lower-right corner.
In one embodiment, displaying the selection indicator on the current display interface includes: after receiving an air gesture mode activation instruction, selecting on the current display interface an initial target object that satisfies a preset selection rule, and displaying that object enlarged with a set animation effect. The terminal may provide a virtual key for starting the air gesture mode; when the user taps it, the terminal receives the activation instruction. The preset selection rule can be set as needed: for example, the first application icon at the upper left of the current display interface, or the icon at the center of the interface, may serve as the initial target object, which is then displayed enlarged and highlighted with the set animation effect. Enlarging the target object in this way lets the user know promptly that the air gesture mode has started, and makes it easy to issue the appropriate gesture from the indicator's current position to select the object to be controlled, further improving both the user experience and the accuracy of control.
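As a rough illustration, the "preset selection rule" above could be modeled as picking a cell in the icon grid. The rule names ("top_left", "center") and the grid representation are assumptions made for this sketch, not details from the patent:

```python
def initial_target(grid_rows: int, grid_cols: int, rule: str = "top_left"):
    """Return the (row, col) of the initial target object under a preset rule."""
    if rule == "top_left":
        # first application icon at the upper left of the interface
        return (0, 0)
    if rule == "center":
        # icon in the middle of the current display interface
        return (grid_rows // 2, grid_cols // 2)
    raise ValueError(f"unknown selection rule: {rule}")
```

Either rule returns a deterministic starting position, which is what lets the enlargement animation point the user at a predictable place on screen.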
In addition, to suit users of different handedness, the air gesture mode may include a left-hand mode and a right-hand mode: in the left-hand mode, the initial target object satisfying the preset selection rule is chosen on the left side of the current display interface, and in the right-hand mode, on the right side.
Step S102: acquire, via the terminal's video monitoring device, an air gesture performed by a user without touching the terminal;
here, the air gesture is used as a basis for the selection by the movement of the selected identifier, and the air gesture of the user not contacting the terminal can be acquired through a monitoring video device of the terminal. It is to be understood that the monitoring video device may be a camera or other device having a video shooting function. When the terminal is a smart phone, a tablet computer or other equipment, the monitoring video device of the terminal may refer to a camera or other device with a video shooting function integrated on the terminal. It should be noted that, if the terminal is a car machine, the acquiring of the air separation gesture of the user not contacting the terminal, which is shot by the monitoring video device based on the terminal, includes: after receiving an open gesture mode opening instruction, controlling a vehicle-mounted camera of a vehicle where the vehicle machine is located to be opened so as to shoot video frames containing user hand actions in real time; and acquiring the air separating gesture of the user not contacting the terminal according to the video frame. The vehicle-mounted camera can be integrated on the vehicle-mounted device or not, if the vehicle-mounted camera is not integrated on the vehicle-mounted device, the vehicle-mounted device is connected with the vehicle-mounted camera so as to acquire the video frame shot by the vehicle-mounted camera in real time. Here, after the terminal obtains the video frame including the hand motion of the user shot by the monitoring video device in real time, the video frame may be preprocessed by filtering, denoising, and the like, so as to improve the quality of the video frame.
It should be noted that, since a gesture usually takes a certain time to perform, acquiring the air gesture may mean acquiring a plurality of consecutive video frames containing the user's hand motion and analyzing them together to recognize the gesture. The air gestures can be defined as needed, for example one or more of: a palm swiping left or right in mid-air, a finger moving left, right, up, or down in mid-air, and a finger clicking in mid-air. Whether a gesture is a palm swipe or a finger movement may be decided by the distance moved: if the distance exceeds a preset threshold, the gesture is treated as a palm swipe; otherwise, as a finger movement. Alternatively, the decision may be based on the number of recognized fingers: a single-finger action is treated as a finger movement, and an action involving two or more fingers as a palm swipe.
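The two disambiguation heuristics above (movement distance and finger count) can be sketched as follows; the threshold value and the function and label names are assumptions for illustration, not taken from the patent:

```python
# Hypothetical constant; the patent only speaks of "a preset distance threshold".
PALM_DISTANCE_THRESHOLD = 0.3   # normalized to the video frame width (assumed)

def classify_by_distance(move_distance: float) -> str:
    """Palm swipes travel farther than finger movements."""
    return "palm_swipe" if move_distance > PALM_DISTANCE_THRESHOLD else "finger_move"

def classify_by_finger_count(num_fingers: int) -> str:
    """A single recognized finger means a finger movement; two or more, a palm swipe."""
    return "finger_move" if num_fingers == 1 else "palm_swipe"
```

Either heuristic alone suffices; a real recognizer might combine both for robustness.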
Step S103: control, according to the air gesture, the selection indicator to select a target object, or perform a preset operation on the target object currently selected by the indicator.
Depending on the air gesture, the selection indicator is moved to select a target object, or a preset operation is performed on the currently selected target object. In one embodiment, this includes:
obtaining the operation instruction corresponding to the air gesture from a preset correspondence between air gestures and control instructions; and
controlling the selection indicator to select the target object, or performing the preset operation on the currently selected target object, according to the obtained instruction.
It can be understood that the terminal may be preconfigured with a correspondence between air gestures and control instructions. For example, a finger moving right in mid-air may map to moving the selection indicator right; a finger moving down in mid-air may map to moving the indicator down; and a finger clicking in mid-air may map to operating the currently selected target object. In practice, the correspondence between air gestures and control instructions can be set as the actual situation requires. Driving the indicator from the instruction that corresponds to the gesture enables fast, accurate control of the terminal and further improves the user experience.
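The preset correspondence described above amounts to a lookup table. A minimal sketch follows; the gesture and instruction names are invented for this illustration and are not defined in the patent text:

```python
# Illustrative correspondence between air gestures and control instructions.
GESTURE_TO_INSTRUCTION = {
    "finger_move_left":  "move_selection_left",
    "finger_move_right": "move_selection_right",
    "finger_move_up":    "move_selection_up",
    "finger_move_down":  "move_selection_down",
    "finger_click":      "activate_selected",
    "palm_swipe_left":   "next_page",
    "palm_swipe_right":  "previous_page",
}

def instruction_for(gesture: str):
    """Look up the control instruction for a recognized air gesture."""
    return GESTURE_TO_INSTRUCTION.get(gesture)   # None if the gesture is unmapped
```

Returning `None` for unmapped gestures lets the caller simply ignore unrecognized hand motion instead of raising an error.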
In one embodiment, when the air gesture is a palm swiping left or right in mid-air, the terminal switches the display interface in the corresponding direction and displays the updated target object enlarged with the set animation effect on the new interface. Specifically, a palm swiping left switches the terminal to the next display interface, i.e. the interface that follows the current one in a fixed order, and a palm swiping right switches to the previous interface, i.e. the one that precedes it. After the interface is updated, the target object selected by the indicator is updated accordingly and displayed enlarged with the set animation effect. The display interface of the terminal can thus be switched quickly with a simple, convenient operation.
In another embodiment, when the air gesture is a palm swiping left or right in mid-air, the terminal pans the current display interface in the corresponding direction: a left swipe moves it left, and a right swipe moves it right. The panning distance may be a preset threshold, so that each palm swipe moves the interface by that fixed amount, or it may be determined by how far the palm moved. Because of the size of a displayed file, its content, or its font, the user may not be able to see everything on the current interface at once; panning the interface with an air gesture makes all of its content easy to reach. The display interface can thus be moved quickly with a simple, convenient operation.
In one embodiment, when the air gesture is a finger moving left, right, up, or down in mid-air, the selection indicator moves according to a preset movement rule to select a target object. The rule can be set as needed: for example, a finger moving left moves the indicator left, a finger moving up moves it up, and when a finger moves right while the currently selected object sits at the edge of the display interface, the indicator moves to the next row. Target objects can thus be selected quickly with a simple operation, further improving the user experience.
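One possible "preset movement rule" is sketched below: the indicator moves one cell per finger gesture, and moving right at the edge of a row drops to the start of the next row, as the embodiment above suggests. The grid dimensions and the exact clamping behavior at the other edges are assumptions:

```python
def move_focus(pos, gesture, rows, cols):
    """Move the selection indicator one cell; wrap to the next row at the right edge."""
    r, c = pos
    if gesture == "left":
        c = max(c - 1, 0)                # clamp at the left edge
    elif gesture == "right":
        if c + 1 < cols:
            c += 1
        elif r + 1 < rows:
            r, c = r + 1, 0              # at the right edge: drop to the next row
    elif gesture == "up":
        r = max(r - 1, 0)
    elif gesture == "down":
        r = min(r + 1, rows - 1)
    return (r, c)
```

Clamping (rather than wrapping) at the top, bottom, and left edges keeps repeated gestures from jumping unexpectedly; only the right-edge behavior follows the next-row rule mentioned in the text.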
In one embodiment, when the air gesture is a finger clicking in mid-air, the currently selected target object is opened or operated. Specifically, if the target object is an application icon, the corresponding application is opened; if it is a virtual key, the working state bound to that key is toggled. The finger click may be a single click, a double click, or a multi-click in mid-air. Opening or running the selected object in this way is simple and convenient and further improves the user experience.
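The click dispatch above can be sketched as a small function; the dictionary representation of a target object and the return values are assumptions made for this illustration:

```python
def on_air_click(target: dict):
    """Dispatch a finger air-click on the currently selected target object."""
    if target["type"] == "app_icon":
        return f"open:{target['name']}"           # open the corresponding application
    if target["type"] == "virtual_key":
        target["state"] = not target["state"]     # toggle the working state
        return f"toggle:{target['name']}"
    return "ignore"                               # unknown object types are ignored
```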
In summary, in the terminal control method provided by the above embodiments, a selection indicator is displayed on the current display interface in real time, and it is driven by the user's air gestures to select a target object or to trigger a preset operation on the currently selected object. This guarantees accurate gesture control of the terminal, is simple to operate, and effectively improves the user experience.
Based on the same inventive concept as the foregoing embodiments, this embodiment illustrates the technical solution with a specific example, taking the terminal to be an in-vehicle head unit, the video monitoring device to be a camera, and the air gesture mode to be named the mid-air mode. Fig. 2 is a flowchart of this specific control method, which includes the following steps:
step S201: starting a high-altitude mode, and controlling a focus to select a control at a default position;
here, when the user opens the high-altitude mode, the focus of the current page is defaulted on the first control, and as the focus is focused on the control, the control presents a certain animation amplification effect, and meanwhile, the camera is started to perform gesture recognition. It will be appreciated that the focus may be understood as the selected indicia mentioned in the above embodiments.
Step S202: recognize the user's gesture, and either move the focus or operate the control selected by the focus according to that gesture.
Here, the user's gesture is recognized through the camera and falls into three classes: swipe, move, and click. A swipe is performed with the palm (left swipe or right swipe), a move with a finger (left, right, up, or down), and a downward finger press is defined as a click event. Concretely, the gestures recognized by the camera are: left swipe, right swipe, move left, move right, move up, move down, and click. A swipe gesture triggers a page-level slide event on the current page as a whole, whereas a move gesture acts on the focus among the controls of the current page. Referring to fig. 3, the first focus-movement diagram, when the gesture is move up, down, left, or right, the focus moves in the order shown to select a control. Referring to fig. 4, the second diagram, when the focus reaches a boundary it continues in a zigzag pattern. When the focus rests on a control and a mid-air click gesture is recognized, the control currently selected by the focus is operated, completing the mid-air gesture interaction.
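The zigzag traversal suggested by Figs. 3 and 4 can be read as a boustrophedon walk over the control grid: even rows are visited left to right, odd rows right to left, turning at each boundary. This interpretation of the figures, and the grid representation, are assumptions:

```python
def zigzag_order(rows: int, cols: int):
    """Enumerate grid cells row by row, reversing direction on each row."""
    order = []
    for r in range(rows):
        cells = [(r, c) for c in range(cols)]
        order.extend(cells if r % 2 == 0 else reversed(cells))
    return order
```

Under this reading, repeatedly issuing the "move right" gesture walks the focus through every control without ever getting stuck at a boundary.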
In conclusion, the camera recognizes the gesture, and the gesture moves the focus across the current page to select a control or operates the control the focus has selected, realizing the mid-air gesture function. Because the recognized gestures are only simple actions such as moving and swiping, the recognition rate is high; at the same time, the on-page focus feeds back the effect of the current gesture in real time, making the whole mid-air interaction more accurate and controllable and greatly improving the user experience.
Based on the same inventive concept as the foregoing embodiments, an embodiment of the present invention provides a terminal, as shown in fig. 5, including a processor 310 and a memory 311 for storing computer programs capable of running on the processor 310. The single processor 310 drawn in fig. 5 indicates only its position relative to the other components, not that there is exactly one; in practice there may be one or more processors 310. The same applies to the memory 311: fig. 5 shows only its position, and in practice there may be one or more memories 311. The processor 310 is configured to implement the terminal control method described above when running the computer program.
The terminal may further include: at least one network interface 312. The various components in the terminal are coupled together by a bus system 313. It will be appreciated that the bus system 313 is used to enable communications among these connected components. In addition to a data bus, the bus system 313 includes a power bus, a control bus, and a status signal bus. For clarity of illustration, however, the various buses are all labeled as the bus system 313 in fig. 5.
The memory 311 may be a volatile memory or a nonvolatile memory, or may include both volatile and nonvolatile memories. The nonvolatile memory may be a Read-Only Memory (ROM), a Programmable Read-Only Memory (PROM), an Erasable Programmable Read-Only Memory (EPROM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), a Ferroelectric Random Access Memory (FRAM), a Flash Memory, a magnetic surface memory, an optical disc, or a Compact Disc Read-Only Memory (CD-ROM); the magnetic surface memory may be a disk memory or a tape memory. The volatile memory may be a Random Access Memory (RAM), which acts as an external cache. By way of illustration and not limitation, many forms of RAM are available, such as Static Random Access Memory (SRAM), Synchronous Static Random Access Memory (SSRAM), Dynamic Random Access Memory (DRAM), Synchronous Dynamic Random Access Memory (SDRAM), Double Data Rate Synchronous Dynamic Random Access Memory (DDR SDRAM), Enhanced Synchronous Dynamic Random Access Memory (ESDRAM), SyncLink Dynamic Random Access Memory (SLDRAM), and Direct Rambus Random Access Memory (DRRAM). The memory 311 described in the embodiments of the invention is intended to comprise, without being limited to, these and any other suitable types of memory.
The memory 311 in the embodiment of the present invention is used to store various types of data to support the operation of the terminal. Examples of such data include: any computer program for running on the terminal, such as an operating system and application programs; contact data; phonebook data; messages; pictures; videos, and the like. The operating system includes various system programs, such as a framework layer, a core library layer, and a driver layer, used for implementing various basic services and processing hardware-based tasks. The application programs may include various applications, such as a Media Player and a Browser, for implementing various application services. A program implementing the method of the embodiments of the present invention may be included in an application program.
Based on the same inventive concept as the foregoing embodiments, this embodiment further provides a computer storage medium in which a computer program is stored. The computer storage medium may be a memory such as a Ferroelectric Random Access Memory (FRAM), a Read-Only Memory (ROM), a Programmable Read-Only Memory (PROM), an Erasable Programmable Read-Only Memory (EPROM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), a Flash Memory, a magnetic surface memory, an optical disc, or a Compact Disc Read-Only Memory (CD-ROM); it may also be any of various devices including one or any combination of the above memories, such as a mobile phone, a computer, a tablet device, or a personal digital assistant. The computer program stored in the computer storage medium, when executed by a processor, implements the control method of the terminal applied to the above terminal. For the specific step flow implemented when the computer program is executed by the processor, please refer to the description of the embodiment shown in fig. 1, which is not repeated here.
The technical features of the embodiments described above may be combined arbitrarily. For brevity, not all possible combinations of the technical features in the above embodiments are described; however, as long as a combination of these technical features involves no contradiction, it should be considered to be within the scope of this specification.
As used herein, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, covering not only those elements listed but also other elements not expressly listed.
The above description covers only specific embodiments of the present invention, but the scope of the present invention is not limited thereto. Any change or substitution that a person skilled in the art could readily conceive of within the technical scope disclosed by the present invention shall be covered by the scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the appended claims.

Claims (10)

1. A control method of a terminal, the control method of the terminal comprising:
displaying a selected identifier on a current display interface of the terminal;
acquiring a mid-air gesture, made by a user without contacting the terminal, captured by a video monitoring device of the terminal; and
controlling the selected identifier to select a target object, or performing a preset operation on the target object currently selected by the selected identifier, according to the mid-air gesture.
2. The control method of the terminal according to claim 1, wherein the displaying a selected identifier on a current display interface of the terminal comprises:
after receiving an instruction to start a mid-air gesture mode, selecting an initial target object meeting a preset selection rule on the current display interface of the terminal, and displaying the target object in an enlarged manner according to a set animation effect.
3. The control method of the terminal according to claim 1 or 2, wherein the mid-air gesture comprises: a palm sliding leftwards in mid-air, a palm sliding rightwards in mid-air, a finger moving leftwards in mid-air, a finger moving rightwards in mid-air, a finger moving upwards in mid-air, a finger moving downwards in mid-air, and a finger clicking in mid-air.
4. The control method of the terminal according to claim 3, wherein the controlling the selected identifier to select the target object or performing a preset operation on the target object currently selected by the selected identifier according to the mid-air gesture comprises:
acquiring an operation instruction corresponding to the mid-air gesture according to a preset correspondence between mid-air gestures and control instructions; and
controlling the selected identifier to select the target object, or performing the preset operation on the target object currently selected by the selected identifier, according to the operation instruction corresponding to the mid-air gesture.
5. The control method of the terminal according to claim 4, wherein when the mid-air gesture is the palm sliding leftwards in mid-air or the palm sliding rightwards in mid-air, the terminal is controlled to switch the display interface in the corresponding direction, and an updated target object is displayed in an enlarged manner according to the set animation effect on the updated current display interface; or, when the mid-air gesture is the palm sliding leftwards in mid-air or the palm sliding rightwards in mid-air, the terminal is controlled to move the current display interface in the corresponding direction.
6. The control method of the terminal according to claim 4, wherein when the mid-air gesture is the finger moving leftwards, rightwards, upwards, or downwards in mid-air, the selected identifier is controlled to move according to a preset movement rule to select the target object.
7. The control method of the terminal according to claim 1, wherein when the mid-air gesture is the finger clicking in mid-air, the target object currently selected by the selected identifier is opened or operated.
8. The control method of the terminal according to claim 2, wherein the terminal is an in-vehicle head unit, and the acquiring a mid-air gesture, made by the user without contacting the terminal, captured by the video monitoring device of the terminal comprises:
after receiving the instruction to start the mid-air gesture mode, controlling a vehicle-mounted camera of the vehicle in which the head unit is located to turn on, so as to capture video frames containing the user's hand actions in real time; and
acquiring the mid-air gesture of the user not contacting the terminal according to the video frames.
9. A terminal, comprising: a processor and a memory for storing a computer program capable of running on the processor,
wherein the processor, when running the computer program, implements the control method of the terminal according to any one of claims 1 to 8.
10. A computer storage medium, characterized in that a computer program is stored thereon which, when executed by a processor, implements the control method of the terminal according to any one of claims 1 to 8.
CN202010846916.4A 2020-08-21 2020-08-21 Terminal control method, terminal and computer storage medium Pending CN112015270A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010846916.4A CN112015270A (en) 2020-08-21 2020-08-21 Terminal control method, terminal and computer storage medium


Publications (1)

Publication Number Publication Date
CN112015270A (en) 2020-12-01

Family

ID=73505372

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010846916.4A Pending CN112015270A (en) 2020-08-21 2020-08-21 Terminal control method, terminal and computer storage medium

Country Status (1)

Country Link
CN (1) CN112015270A (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112698739A (en) * 2020-12-28 2021-04-23 联想(北京)有限公司 Positioning method and device
CN112748805A (en) * 2021-01-12 2021-05-04 深圳佑驾创新科技有限公司 Gesture control method and device, computer equipment and storage medium
CN112817443A (en) * 2021-01-22 2021-05-18 歌尔科技有限公司 Display interface control method, device and equipment based on gestures and storage medium
CN113553135A (en) * 2021-07-29 2021-10-26 深圳康佳电子科技有限公司 Split screen display method based on gesture recognition, display terminal and storage medium
CN113844262A (en) * 2021-09-27 2021-12-28 东风电子科技股份有限公司 PCIE-based system, method and device for realizing double-screen interaction of vehicle-mounted system, processor and computer storage medium thereof

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105278668A (en) * 2014-12-16 2016-01-27 维沃移动通信有限公司 Mobile terminal control method and mobile terminal
CN106055098A (en) * 2016-05-24 2016-10-26 北京小米移动软件有限公司 Air gesture operation method and apparatus
CN111142655A (en) * 2019-12-10 2020-05-12 上海博泰悦臻电子设备制造有限公司 Interaction method, terminal and computer readable storage medium
CN111338461A (en) * 2018-12-18 2020-06-26 鸿合科技股份有限公司 Gesture operation method and device and electronic equipment


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Dick Oliver (author), Sun Baocheng et al. (translators): "Step-by-Step HTML and XHTML Tutorial (Fifth Edition)", Tsinghua University Press, page 135 *



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination