Disclosure of Invention
In view of the above, the invention provides a terminal mode switching method, a terminal and a vehicle, which enable a user to switch the mode of a terminal remotely, without touching the physical device, thereby improving the reliability and accuracy of user interface interaction and satisfying the user's need for a lively, emotionally engaging interaction experience.
The invention provides a terminal mode switching method, applied to a terminal, which comprises the following steps: acquiring a dynamic gesture instruction; judging whether the dynamic gesture instruction exists in a gesture list; when the dynamic gesture instruction exists in the gesture list, searching the gesture list for mode information corresponding to the dynamic gesture instruction; and controlling the terminal to switch from a current first display mode to a second display mode according to the mode information.
Specifically, the step of acquiring a dynamic gesture instruction includes: collecting hand image information of a user; and performing recognition processing on the hand image information to obtain the dynamic gesture instruction.
Specifically, before the step of acquiring a dynamic gesture instruction, the method further includes: receiving a gesture setting instruction; acquiring hand image information of a user according to the gesture setting instruction; performing recognition processing on the hand image information to obtain a recognition result; and storing the recognition result and the mode information corresponding to the recognition result in a gesture list in an associated manner.
Specifically, the step of controlling the terminal to switch from the current first display mode to the second display mode according to the mode information includes: acquiring a first display mode currently displayed by the terminal; processing the mode information and the first display mode to obtain a mode switching signal; and controlling the terminal to switch from the first display mode to the second display mode according to the mode switching signal.
The present invention also provides a terminal, including: a memory for storing executable program code; and a processor for calling the executable program code in the memory to implement the steps of the terminal mode switching method: acquiring a dynamic gesture instruction; judging whether the dynamic gesture instruction exists in a gesture list; when the dynamic gesture instruction exists in the gesture list, searching the gesture list for mode information corresponding to the dynamic gesture instruction; and controlling the terminal to switch from the current first display mode to the second display mode according to the mode information.
Specifically, the step, executed by the processor, of acquiring a dynamic gesture instruction includes: collecting hand image information of a user; and performing recognition processing on the hand image information to obtain the dynamic gesture instruction.
Specifically, before executing the step of acquiring a dynamic gesture instruction, the processor further executes the following steps: receiving a gesture setting instruction; acquiring hand image information of a user according to the gesture setting instruction; performing recognition processing on the hand image information to obtain a recognition result; and storing the recognition result and the mode information corresponding to the recognition result in a gesture list in an associated manner.
Specifically, the step, executed by the processor, of controlling the terminal to switch from the current first display mode to the second display mode according to the mode information includes: acquiring a first display mode currently displayed by the terminal; processing the mode information and the first display mode to obtain a mode switching signal; and controlling the terminal to switch from the first display mode to the second display mode according to the mode switching signal.
The invention also provides a vehicle comprising the terminal.
Specifically, according to the terminal mode switching method, the terminal and the vehicle provided by the embodiments, a dynamic gesture instruction is acquired and it is judged whether the dynamic gesture instruction exists in the gesture list; when it does, the mode information corresponding to the dynamic gesture instruction is retrieved from the gesture list, and the terminal is controlled to switch from the current first display mode to the second display mode according to that mode information. A user can thus switch the mode of the terminal remotely, without touching the physical device, which improves the reliability and accuracy of user interface interaction and satisfies the user's need for a lively, emotionally engaging interaction experience.
The foregoing description is only an overview of the technical solutions of the present invention. In order that the technical means of the present invention may be more clearly understood and implemented in accordance with the content of this description, and in order to make the above and other objects, features and advantages of the present invention more apparent, preferred embodiments are described in detail below with reference to the accompanying drawings.
Detailed Description
To further explain the technical means and effects of the present invention adopted to achieve the predetermined objects, the present invention will be described in detail below with reference to the accompanying drawings and preferred embodiments.
Fig. 1 is a flowchart illustrating a terminal mode switching method according to a first embodiment of the present invention. This embodiment describes a terminal mode switching method executed by a terminal. As shown in Fig. 1, the terminal mode switching method of the present embodiment may include the following steps:
Step S11: acquiring a dynamic gesture instruction.
Specifically, in an embodiment, the terminal may be, but is not limited to, a smart phone, a smart wearable device, a tablet computer (PAD), an in-vehicle infotainment unit, and the like.
Specifically, in an embodiment, the terminal tracks dynamic information of the user's hand in real time; for example, but not limited to, the terminal may collect the user's dynamic gesture instructions through an infrared camera or a visible light camera.
Step S12: judging whether the dynamic gesture instruction exists in the gesture list.
Specifically, in an embodiment, the terminal compares the acquired dynamic gesture instruction with a pre-stored gesture list to determine whether the dynamic gesture instruction exists in the gesture list.
Step S13: when the dynamic gesture instruction exists in the gesture list, searching the gesture list for the mode information corresponding to the dynamic gesture instruction.
Specifically, in an embodiment, when the dynamic gesture instruction exists in the gesture list, the terminal searches the gesture list for the mode information corresponding to the dynamic gesture instruction and acquires that mode information. For example, when the user makes a fist-clenching gesture, the terminal acquires day/night mode information and switches the day mode currently displayed by the terminal to the night mode, or the night mode currently displayed by the terminal to the day mode.
Step S14: controlling the terminal to switch from the current first display mode to the second display mode according to the mode information.
Specifically, in one embodiment, each of the first display mode and the second display mode may be, but is not limited to, a day mode, a night mode, a mute mode, a vibration mode, a sound mode, a flight mode, or the like.
Specifically, in the present embodiment, the first display mode is a day mode, the second display mode is a night mode, and the display mode currently shown by the terminal is the day mode. The terminal is controlled to switch from the current day mode to the night mode according to the mode information, so that the user can switch the mode of the terminal remotely without touching the physical device; this improves the reliability and accuracy of user interface interaction and satisfies the user's need for a lively, emotionally engaging interaction experience.
Specifically, according to the terminal mode switching method provided by this embodiment, a dynamic gesture instruction is acquired and it is judged whether the dynamic gesture instruction exists in the gesture list; when it does, the mode information corresponding to the dynamic gesture instruction is retrieved from the gesture list, and the terminal is controlled to switch from the current first display mode to the second display mode according to that mode information. A user can thus switch the mode of the terminal remotely, without touching the physical device, which improves the reliability and accuracy of user interface interaction and satisfies the user's need for a lively, emotionally engaging interaction experience.
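For illustration only, the flow of steps S11 to S14 can be sketched as follows. This is a minimal sketch, not the claimed implementation: the gesture names, the mode encoding, and the dictionary-based gesture list are all hypothetical assumptions.

```python
# Minimal sketch of steps S11-S14: look up a dynamic gesture in a
# pre-stored gesture list and switch the display mode accordingly.
# All names here are illustrative, not part of the invention's API.

# Gesture list: maps a recognized dynamic gesture to mode information.
GESTURE_LIST = {
    "clench_fist": "toggle_day_night",
}

def switch_terminal_mode(dynamic_gesture, current_mode):
    """Return the new display mode, or the current one unchanged
    when the gesture is not in the gesture list (steps S12-S14)."""
    mode_info = GESTURE_LIST.get(dynamic_gesture)   # S12/S13: list lookup
    if mode_info is None:
        return current_mode                          # gesture not registered
    if mode_info == "toggle_day_night":              # S14: switch display mode
        return "night" if current_mode == "day" else "day"
    return current_mode
```

For example, a fist gesture while the terminal shows the day mode would yield `switch_terminal_mode("clench_fist", "day") == "night"`, while an unregistered gesture leaves the current mode unchanged.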
Referring to Fig. 2, Fig. 2 is a flowchart illustrating a terminal mode switching method according to a second embodiment of the present invention. As shown in Fig. 1 and Fig. 2, in the terminal mode switching method provided in this embodiment, the step of acquiring the dynamic gesture instruction specifically includes the following steps:
Step S21: collecting hand image information of the user.
Specifically, in an embodiment, the terminal may track the hand image information of the user in real time through an infrared camera or a visible light camera.
Step S22: performing recognition processing on the hand image information to obtain a dynamic gesture instruction.
Specifically, in an embodiment, the terminal performs recognition processing on the hand image information to obtain a dynamic gesture instruction: using a gesture recognition technique, the terminal performs gesture segmentation on the hand image information to extract hand features, preprocesses the extracted hand features, and then performs gesture recognition on them to obtain the dynamic gesture instruction.
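For illustration only, the pipeline described above (gesture segmentation, preprocessing of the extracted hand features, and gesture recognition) can be sketched as follows. The intensity threshold, the centroid feature, and the swipe classification rule are simplifying assumptions standing in for whatever gesture recognition technique an implementation actually uses.

```python
import numpy as np

def recognize_gesture(frames):
    """Toy sketch of the recognition pipeline: segment the hand,
    preprocess the features, then classify the motion. The threshold
    and feature choices are illustrative assumptions only."""
    features = []
    for frame in frames:                  # frame: 2-D grayscale array
        mask = frame > 128                # crude segmentation by intensity
        if not mask.any():
            continue                      # no hand found in this frame
        ys, xs = np.nonzero(mask)
        # preprocessing: normalize the hand centroid into [0, 1]
        cx = xs.mean() / frame.shape[1]
        cy = ys.mean() / frame.shape[0]
        features.append((cx, cy))
    if len(features) < 2:
        return None                       # not enough frames for a dynamic gesture
    # "recognition": a hand moving clearly rightward counts as a swipe
    dx = features[-1][0] - features[0][0]
    return "swipe_right" if dx > 0.2 else "unknown"
```

A real implementation would replace the threshold mask with proper hand segmentation and the centroid rule with a trained classifier; the sketch only shows where each stage of the pipeline sits.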
Referring to Fig. 3, Fig. 3 is a flowchart illustrating a terminal mode switching method according to a third embodiment of the present invention. As shown in Fig. 1 and Fig. 3, the terminal mode switching method provided in this embodiment further includes the following steps before the step of acquiring the dynamic gesture instruction:
Step S31: receiving a gesture setting instruction.
Step S32: acquiring hand image information of the user according to the gesture setting instruction.
Specifically, in an embodiment, after receiving a gesture setting instruction triggered by a user, the terminal acquires hand image information of the user according to the gesture setting instruction.
Step S33: performing recognition processing on the hand image information to obtain a recognition result.
Specifically, in an embodiment, the terminal performs recognition processing on the hand image information by using a gesture recognition technique to obtain a recognition result: the terminal performs gesture segmentation on the hand image information to extract hand features, preprocesses the extracted hand features, and then performs gesture recognition on them to obtain the recognition result.
Step S34: storing the recognition result and the mode information corresponding to the recognition result in a gesture list in an associated manner.
Specifically, in one embodiment, the terminal acquires the user-triggered mode information corresponding to the recognition result, for example day/night mode information. The terminal then associates the recognition result with the day/night mode information to form the corresponding dynamic gesture instruction, and stores the recognition result and the day/night mode information in the gesture list in a one-to-one correspondence.
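For illustration only, the registration flow of steps S31 to S34 can be sketched as follows, with the gesture list modeled as an in-memory dictionary; all names are hypothetical.

```python
# Sketch of steps S31-S34: register a user-defined gesture and store it
# with its mode information in the gesture list. Names are illustrative.

gesture_list = {}  # recognition result -> mode information

def register_gesture(recognition_result, mode_info):
    """Step S34: store the recognition result and its mode information
    in the gesture list in an associated (one-to-one) manner."""
    gesture_list[recognition_result] = mode_info
    return gesture_list

# Steps S31-S33 would produce a recognition result from the user's hand
# images after a gesture setting instruction; here we assume it is
# already available.
register_gesture("clench_fist", "day_night_mode")
```

After registration, the gesture list maps `"clench_fist"` to `"day_night_mode"`, so a later lookup in step S13 succeeds.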
Referring to Fig. 4, Fig. 4 is a flowchart illustrating a terminal mode switching method according to a fourth embodiment of the present invention. As shown in Fig. 1 and Fig. 4, the step of controlling the terminal to switch from the current first display mode to the second display mode according to the mode information in the terminal mode switching method provided in this embodiment specifically includes the following steps:
Step S41: acquiring a first display mode currently displayed by the terminal.
Specifically, in one embodiment, the first display mode may be, but is not limited to, a day mode, a night mode, a mute mode, a vibration mode, a sound mode, a flight mode, or the like. The terminal acquires, in real time, the first display mode currently shown on its display interface, for example the day mode or the night mode.
Step S42: processing the mode information and the first display mode to obtain a mode switching signal.
Specifically, in an embodiment, the terminal processes the mode information and the first display mode to obtain the mode switching signal.
Step S43: controlling the terminal to switch from the first display mode to the second display mode according to the mode switching signal.
Specifically, in an embodiment, the terminal switches from the current day mode to the night mode according to the mode switching signal, or switches its user interface from the current night mode to the day mode according to the mode switching signal. The user can thus switch the terminal's mode remotely, without contacting the terminal's display interface; this frees the user from the physical device, improves the reliability and accuracy of user interface interaction, and satisfies the user's need for a lively, emotionally engaging interaction experience.
Specifically, according to the terminal mode switching method provided by this embodiment, a dynamic gesture instruction is acquired and it is judged whether the dynamic gesture instruction exists in the gesture list; when it does, the mode information corresponding to the dynamic gesture instruction is retrieved from the gesture list, and the terminal is controlled to switch from the current first display mode to the second display mode according to that mode information. A user can thus switch the mode of the terminal remotely, without touching the physical device, which improves the reliability and accuracy of user interface interaction and satisfies the user's need for a lively, emotionally engaging interaction experience.
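For illustration only, steps S41 to S43 can be sketched as follows. Modeling the mode switching signal as the name of the target mode is an assumption, one plausible reading of "processing the mode information and the first display mode"; the `"toggle_day_night"` encoding is likewise hypothetical.

```python
def compute_switch_signal(mode_info, first_display_mode):
    """Step S42 sketch: derive a mode switching signal (here simply the
    target mode name) from the mode information and the first display
    mode currently shown by the terminal (step S41)."""
    if mode_info == "toggle_day_night":
        return "night" if first_display_mode == "day" else "day"
    return mode_info  # otherwise assume mode_info names the target mode

def apply_switch_signal(signal, first_display_mode):
    """Step S43 sketch: switch to the mode named by the signal; a signal
    equal to the current mode leaves the display unchanged."""
    return signal
```

For example, with day/night mode information and a terminal currently in the day mode, `compute_switch_signal("toggle_day_night", "day")` yields the signal `"night"`, and applying it switches the display to the night mode.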
Referring to Fig. 5, Fig. 5 is a block diagram of a terminal 100 according to a fifth embodiment of the present invention. As shown in Fig. 5, the terminal 100 provided in this embodiment is configured to execute the terminal mode switching method described above and includes a memory 110 and a processor 120.
Specifically, in the present embodiment, the memory 110 is used to store executable program code. The processor 120 is configured to call the executable program code in the memory 110 to implement the terminal mode switching method, and the steps executed by the processor include: acquiring a dynamic gesture instruction; judging whether the dynamic gesture instruction exists in a gesture list; when the dynamic gesture instruction exists in the gesture list, searching the gesture list for mode information corresponding to the dynamic gesture instruction; and controlling the terminal to switch from the current first display mode to the second display mode according to the mode information.
Specifically, in an embodiment, the terminal may be, but is not limited to, a smart phone, a smart wearable device, a tablet computer (PAD), an in-vehicle infotainment unit, and the like.
Specifically, in an embodiment, the step, executed by the processor 120, of acquiring a dynamic gesture instruction specifically includes: collecting hand image information of a user; and performing recognition processing on the hand image information to obtain the dynamic gesture instruction.
Specifically, in an embodiment, before executing the step of acquiring a dynamic gesture instruction, the processor 120 further executes the following steps: receiving a gesture setting instruction; acquiring hand image information of a user according to the gesture setting instruction; performing recognition processing on the hand image information to obtain a recognition result; and storing the recognition result and the mode information corresponding to the recognition result in a gesture list in an associated manner.
Specifically, in an embodiment, the step, executed by the processor 120, of controlling the terminal to switch from the current first display mode to the second display mode according to the mode information includes: acquiring a first display mode currently displayed by the terminal; processing the mode information and the first display mode to obtain a mode switching signal; and controlling the terminal to switch from the first display mode to the second display mode according to the mode switching signal.
For the specific process of implementing each function of each functional unit of the terminal 100 in this embodiment, please refer to the specific contents described in the embodiments shown in Fig. 1 to Fig. 4, which are not described herein again.
Specifically, the terminal provided in this embodiment acquires a dynamic gesture instruction and judges whether it exists in the gesture list; when the dynamic gesture instruction exists in the gesture list, the terminal searches the gesture list for the corresponding mode information and switches from the current first display mode to the second display mode according to that mode information. A user can thus switch the mode of the terminal remotely, without touching the physical device, which improves the reliability and accuracy of user interface interaction and satisfies the user's need for a lively, emotionally engaging interaction experience.
Referring to Fig. 6, Fig. 6 is a block diagram of a vehicle 200 according to a sixth embodiment of the invention. As shown in Fig. 6, the present embodiment provides a vehicle 200 including a terminal 210. Specifically, for the specific structure of the terminal 210, please refer to the description of the terminal 100 in the embodiment shown in Fig. 5, which is not described herein again.
Specifically, in the vehicle provided in this embodiment, the terminal acquires a dynamic gesture instruction and judges whether it exists in the gesture list; when the dynamic gesture instruction exists in the gesture list, the terminal searches the gesture list for the corresponding mode information and switches from the current first display mode to the second display mode according to that mode information. A user can thus switch the mode of the terminal remotely, without touching the physical device, which improves the reliability and accuracy of user interface interaction and satisfies the user's need for a lively, emotionally engaging interaction experience.
In addition, an embodiment of the present invention further provides a computer-readable storage medium in which computer-executable instructions are stored; the computer-readable storage medium is, for example, a non-volatile memory such as an optical disc, a hard disk, or a flash memory. The computer-executable instructions cause a computer or similar computing device to perform the various operations of the terminal mode switching method described above.
It should be noted that the embodiments in this specification are described in a progressive manner: each embodiment focuses on its differences from the other embodiments, and for the same or similar parts among the embodiments, reference may be made to one another. Since the terminal embodiments are basically similar to the method embodiments, their description is relatively brief; for relevant details, reference may be made to the description of the method embodiments.