CN107562205B - Projection keyboard of intelligent terminal and operation method of projection keyboard - Google Patents


Info

Publication number
CN107562205B
CN107562205B
Authority
CN
China
Prior art keywords
keyboard
projection
wearable device
instruction
position information
Prior art date
Legal status
Active
Application number
CN201710833350.XA
Other languages
Chinese (zh)
Other versions
CN107562205A (en)
Inventor
郑天赐
Current Assignee
Shanghai Chuanying Information Technology Co Ltd
Original Assignee
Shanghai Spreadrise Technologies Co Ltd
Priority date
Filing date
Publication date
Application filed by Shanghai Spreadrise Technologies Co Ltd filed Critical Shanghai Spreadrise Technologies Co Ltd
Priority to CN201710833350.XA
Publication of CN107562205A
Application granted
Publication of CN107562205B

Abstract

The invention provides an operation method of a projection keyboard of an intelligent terminal, which comprises the following steps: establishing a communication connection between the projection keyboard and an intelligent wearable device; projecting, by the projection keyboard, a keyboard image at a target position and horizontally projecting an infrared horizontal optical network at the position of the keyboard image; when a first human finger falls into a first letter range, capturing, by the projection keyboard, a reflected light spot image of the first human finger on the infrared horizontal optical network; acquiring position information of the first human finger and mapping the position information to a keyboard layout; capturing, by the intelligent wearable device, the tapping action of the first human finger and feeding a key instruction back to the projection keyboard; receiving, by the projection keyboard, the key instruction, generating a keyboard key event and feeding an electric signal back to the intelligent wearable device; and receiving, by the intelligent wearable device, the electric signal and feeding a rebound stimulus back to the first human finger. The intelligent wearable device thus further monitors the tapping action of the human finger and ensures the validity of the information input.

Description

Projection keyboard of intelligent terminal and operation method of projection keyboard
Technical Field
The invention relates to the technical field of accessories of intelligent terminals, in particular to a projection keyboard of an intelligent terminal and an operation method of the projection keyboard.
Background
With the development and progress of science and technology, intelligent terminals are continuously being innovated. Owing to their versatility and portability, intelligent terminals are now widely used in people's work, study, and daily life. The keyboard, as the instruction and data input device of the intelligent terminal, allows English letters, numbers, punctuation marks and the like to be entered into the computer, and thus commands to be issued and data to be input. Since its appearance, the keyboard has kept advancing with the market and with users' requirements. For example, driven by the demand for ever more portable intelligent terminals, the virtual projection keyboard has emerged: adopting VKB keyboard technology as a new input mode, it uses a built-in red laser projector to project the outline of a standard keyboard onto any surface, tracks the movement of the fingers with infrared technology, and finally completes the acquisition of the input information.
However, projection keyboards currently on the market still have many shortcomings, such as slow input speed, a high input error rate, and a lack of feedback: the user cannot confirm that information has been entered accurately, cannot verify the input content by eye, and cannot touch-type. These shortcomings greatly limit the market prospects of existing projection keyboards.
Disclosure of Invention
In order to solve the above problems, the invention provides a projection keyboard of an intelligent terminal and an operation method of the projection keyboard. The projection keyboard is in communication connection with an intelligent wearable device; the wearable device further monitors the tapping action of the human finger, ensuring the accuracy of the input information. Meanwhile, the wearable device feeds a rebound stimulus back to the human finger, so that the user can confirm that the input is valid, which increases the information input speed.
Specifically, the invention provides an operation method of a projection keyboard of an intelligent terminal, which comprises the following steps:
establishing a communication connection between the projection keyboard and an intelligent wearable device; the projection keyboard projects a keyboard image at a target position and horizontally projects an infrared horizontal optical network at the position of the keyboard image; when a first human finger falls into a first letter range in the keyboard image, the projection keyboard captures a reflected light spot image of the first human finger on the infrared horizontal optical network; acquiring position information of the first human finger, and mapping the position information to a keyboard layout; the intelligent wearable device captures the tapping action of the first human finger and feeds a key instruction back to the projection keyboard; the projection keyboard receives the key instruction, generates a keyboard key event and feeds an electric signal back to the intelligent wearable device; the intelligent wearable device receives the electric signal and feeds a rebound stimulus back to the first human finger.
Preferably, in the above operation method, the step of acquiring the position information of the first human finger and mapping the position information to the keyboard layout includes: after the projection keyboard captures the reflected light spot image of the first human finger on the infrared horizontal optical network, calculating actual position information of the reflected light spot in the reflected light spot image; mapping the actual position information to an orthogonal coordinate system to obtain virtual position information of the reflected light spot in the orthogonal coordinate system; and comparing the virtual position information with prestored keyboard position information and mapping the virtual position information to the keyboard layout.
Preferably, in the above operation method, the step of the intelligent wearable device capturing the tapping action of the first human finger and feeding a key instruction back to the projection keyboard includes: the intelligent wearable device measures the speed and the pressure of the motion of the first human finger, and compares them with a preset speed and a preset pressure respectively; when the speed is greater than the preset speed and the pressure is greater than the preset pressure, judging that the first human finger has made a tapping action; and generating a key instruction and sending the key instruction to the projection keyboard.
Preferably, in the above operation method, the step of the intelligent wearable device receiving the electric signal and feeding a rebound stimulus back to the first human finger includes: after the intelligent wearable device receives the electric signal, parsing the electric signal to obtain the instruction content of the electric signal; the intelligent wearable device identifying, according to the instruction content, the first human finger that made the tapping action; and the intelligent wearable device sending a rebound stimulus to the first human finger.
Preferably, in the above operation method, the intelligent wearable device includes a smart finger cot, a smart glove, a smart bracelet, and a smart wristband.
Another aspect of the present invention provides a projection keyboard of an intelligent terminal, which includes a projection keyboard body and a wearable device in communication connection with the projection keyboard body.
The projection keyboard body comprises: a keyboard projection module, which projects a keyboard image at a target position; an infrared horizontal optical network module, which horizontally projects an infrared horizontal optical network at the position of the keyboard image; a camera module, which captures a reflected light spot image of a human finger on the infrared horizontal optical network; an image processing module, which acquires the position information of the human finger from the reflected light spot image; a data processing module, which maps the position information to the keyboard layout of the projection keyboard; a key module, which receives a key instruction and generates a keyboard key event; and a first feedback module, which receives the key instruction and feeds an electric signal back to the wearable device.
The wearable device comprises: an instruction module, which is in communication connection with the projection keyboard body, captures the tapping action of the human finger and feeds a key instruction back to the projection keyboard; and a second feedback module, which receives the electric signal fed back by the projection keyboard and feeds a rebound stimulus back to the human finger.
Preferably, in the above projection keyboard, the image processing module further includes a first analyzing unit configured to analyze the reflected light spot image and calculate actual position information of the reflected light spot in the reflected light spot image; the data processing module includes a first mapping unit, which maps the actual position information to an orthogonal coordinate system to obtain virtual position information of the reflected light spot in the orthogonal coordinate system, and an analysis unit, which compares the virtual position information with prestored keyboard position information and maps the virtual position information to the keyboard layout.
Preferably, in the above projection keyboard, the instruction module includes: a pressure sensing unit, which measures the pressure of the motion of the human finger, compares the pressure with a preset pressure, and sends a first operation instruction outwards when the pressure is greater than the preset pressure; a speed sensing unit, which measures the speed of the motion of the human finger, compares the speed with a preset speed, and sends a second operation instruction outwards when the speed is greater than the preset speed; and an instruction unit, which is in communication connection with the pressure sensing unit and the speed sensing unit respectively, judges that the human finger has made a tapping action when it receives both the first operation instruction and the second operation instruction, generates a key instruction and sends the key instruction to the projection keyboard.
Preferably, in the above projection keyboard, the second feedback module includes: a second analyzing unit, which receives the electric signal, parses the electric signal, and obtains the instruction content of the electric signal; an information identification unit, which identifies, according to the instruction content, the human finger that made the tapping action and sends the identification information of the human finger outwards; and a feedback unit, which receives the identification information and sends a rebound stimulus to the human finger.
Preferably, in the above projection keyboard, the intelligent wearable device includes an intelligent finger stall, an intelligent glove, an intelligent bracelet, and an intelligent wristband.
Compared with the prior art, the invention has the following technical advantages:
1) the projection keyboard is in communication connection with an intelligent wearable device, and the intelligent wearable device further monitors the tapping action of the human finger, ensuring the validity of the information input;
2) the intelligent wearable device feeds a rebound stimulus back, in a timely and accurate manner, to the human finger that made the tapping action, so that the user can confirm that the input is valid, which increases the information input speed;
3) the intelligent wearable device monitors both the pressure and the speed of the motion of the human finger to confirm the tapping action, which effectively avoids erroneous input caused by unintentional finger movements and improves the validity of the information input.
Drawings
Fig. 1 is a schematic flowchart of an operation method of a projection keyboard of an intelligent terminal according to a preferred embodiment of the present invention;
Fig. 2 is a schematic flowchart of the step of acquiring the position information of the first human finger and mapping the position information to a keyboard layout in the operation method of Fig. 1;
Fig. 3 is a schematic flowchart of the step of the intelligent wearable device capturing the tapping action of the first human finger and feeding a key instruction back to the projection keyboard in the operation method of Fig. 1;
Fig. 4 is a schematic flowchart of the step of the intelligent wearable device receiving the electric signal and feeding a rebound stimulus back to the first human finger in the operation method of Fig. 1;
Fig. 5 is a structural diagram of a projection keyboard of an intelligent terminal according to a preferred embodiment of the present invention;
Fig. 6 is a block diagram of the image processing module in the projection keyboard of Fig. 5;
Fig. 7 is a block diagram of the instruction module in the projection keyboard of Fig. 5;
Fig. 8 is a block diagram of the second feedback module in the projection keyboard of Fig. 5.
Detailed Description
The advantages of the invention are explained in detail below with reference to the drawings and the embodiments.
When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The implementations described in the exemplary embodiments below are not intended to represent all implementations consistent with the present disclosure. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present disclosure, as detailed in the appended claims.
First, it should be understood that although the terms first, second, third, etc. may be used in this disclosure to describe various information, these information should not be limited to these terms. These terms are only used to distinguish one type of information from another. For example, a first intelligent terminal may also be referred to as a second intelligent terminal, and similarly, a second intelligent terminal may also be referred to as a first intelligent terminal, without departing from the scope of the present disclosure.
Referring to fig. 1, a method for operating a projection keyboard of an intelligent terminal according to a preferred embodiment of the present invention is shown. As can be seen from the figure, the operating method provided in this embodiment mainly includes the following steps:
-connecting the projection keyboard to an intelligent wearable device for communication
In this embodiment, in order to achieve more effective information input with the projection keyboard, an intelligent wearable device is provided for the projection keyboard, and a communication connection is established between the intelligent wearable device and the projection keyboard via, for example, Bluetooth, a wireless network, a mobile network, or a data cable.
Preferably, in this embodiment, the intelligent wearable device may be an intelligent finger cot, an intelligent glove, an intelligent bracelet, or an intelligent wristband.
-the projection keyboard projects a keyboard image at a target position
In this embodiment, the projection keyboard includes a keyboard projection module. Preferably, the keyboard projection module may be an infrared laser projector or another image projection device. When the projection keyboard is started, the keyboard projection module projects a keyboard image at the target position selected by the user. Preferably, according to a preset setting, the keyboard image may be a standard 26-letter keyboard or a nine-grid input-method keyboard. In this embodiment, preferably, keyboard templates may be preset in the projection keyboard according to the preferences and requirements of the user, and when the projection keyboard is started, the keyboard projection module projects the corresponding keyboard image at the target position selected by the user according to the keyboard template.
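By way of illustration only, such a keyboard template could be represented as a mapping from key labels to the rectangular regions they occupy on the projection surface. The following Python sketch is a minimal example under that assumption; the coordinate values, names, and the select_template helper are illustrative and not taken from the patent:

```python
# Hypothetical keyboard template: each key label maps to the rectangle it
# occupies on the projection surface (x_min, y_min, x_max, y_max), given
# here in millimetres; the numeric values are purely illustrative.
STANDARD_TEMPLATE = {
    "Q": (10.0, 10.0, 28.0, 28.0),
    "W": (30.0, 10.0, 48.0, 28.0),
    "E": (50.0, 10.0, 68.0, 28.0),
    # ... remaining keys of the standard 26-letter layout
}

def select_template(layout_name: str) -> dict:
    """Return the preset key-to-region template for the layout chosen by the user."""
    templates = {"standard": STANDARD_TEMPLATE}  # one entry per supported layout
    if layout_name not in templates:
        raise ValueError(f"unknown keyboard layout: {layout_name}")
    return templates[layout_name]
```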
-projecting an infrared horizontal optical network horizontally at the keyboard image position
In this embodiment, the projection keyboard further includes an infrared horizontal optical network module. Preferably, the infrared horizontal optical network module may include an infrared line laser, which projects an infrared horizontal optical network parallel to and close to the horizontal surface. The infrared horizontal optical network is invisible to the naked eye of the user; it coincides with the position of the keyboard image and is aligned with the positions of the letters, numbers, and other function keys in the keyboard image.
-capturing a reflected spot image of a human finger on infrared horizontal light
In this embodiment, the projection keyboard further includes a camera module. Preferably, the camera module comprises an image sensor for capturing an image of a human finger placed within the keyboard image. Therefore, when a finger of the user is placed on the keyboard image and falls into the range of the infrared horizontal optical network, the infrared horizontal light irradiates the finger and is reflected back as a reflection light spot. At this time, the camera module captures a reflected light spot image containing the reflection light spot.
Preferably, in this embodiment, the infrared horizontal optical network is woven from the infrared horizontal light projected onto each key in the keyboard image, so that only when a human finger falls within the range of a key can the infrared horizontal light be projected onto the finger and reflect a reflection light spot.
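As a rough illustration of how the camera module's frames might be searched for such reflection spots, the sketch below assumes an OpenCV-style grayscale infrared frame; the threshold value and function names are assumptions for illustration rather than the patented implementation:

```python
import cv2
import numpy as np

def detect_reflection_spots(ir_frame: np.ndarray, brightness_threshold: int = 200):
    """Return pixel centroids of bright reflection spots in one grayscale IR frame."""
    # Keep only pixels bright enough to be laser reflections off a fingertip.
    _, binary = cv2.threshold(ir_frame, brightness_threshold, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    centroids = []
    for contour in contours:
        m = cv2.moments(contour)
        if m["m00"] > 0:  # skip degenerate contours
            centroids.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))
    return centroids
```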
-obtaining position information of the human finger from the reflected light spot image
In this embodiment, the projection keyboard further includes an image processing module, and the image processing module can read actual specific position information of the human finger in the reflection light spot image from the reflection light spot image.
Specifically, in this embodiment, the image processing module includes a first parsing unit, which converts the captured image into coordinate information representing position and then, for example using the principle of laser triangulation, extracts and calculates the actual position information of the reflection light spot in the reflected light spot image from the coordinate information.
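By way of illustration only, the following sketch shows one common way laser triangulation can recover a spot's distance from the pixel row at which it appears; the camera geometry parameters are hypothetical and not specified by the patent:

```python
def spot_distance_mm(pixel_row: float, principal_row: float,
                     focal_length_px: float, camera_height_mm: float) -> float:
    """Estimate the distance from the sensor to a reflection spot by similar triangles.

    Assumes the camera sits a known height above the infrared plane and looks
    along the projection surface; the farther the spot, the closer its image
    falls to the principal row, so distance = f * h / row_offset.
    """
    row_offset = pixel_row - principal_row
    if row_offset <= 0:
        raise ValueError("spot image at or above the principal row; outside the keyboard area")
    return focal_length_px * camera_height_mm / row_offset
```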
-mapping position information of human fingers into a keyboard layout
In this embodiment, the projection keyboard further includes a data processing module, and the data processing module includes a microcontroller. A mapping mechanism is established in advance; after the actual position information of the reflection light spot is obtained, the data processing module reads it and maps it into an orthogonal coordinate system according to the mapping mechanism, so as to establish a one-to-one correspondence between the position information of the reflection light spot and the position information of the keyboard image.
The data processing module comprises a first mapping unit, which, according to the preset mapping mechanism, maps the actual position information of the reflection light spot into an orthogonal coordinate system by converting from the world coordinate system to a screen coordinate system, thereby obtaining the virtual position information of the reflection light spot in the orthogonal coordinate system.
Further, the data processing module comprises an analysis unit, which receives and reads the virtual position information of the reflection light spot in the orthogonal coordinate system and compares it with the keyboard position information prestored in the projection keyboard; when the virtual position information coincides with the position information of one or more keys of the projection keyboard, the analysis unit obtains the key information triggered by the human finger, that is, the position information of the human finger is mapped to the keyboard layout.
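A compact sketch of this final comparison step, reusing the hypothetical key-to-region template introduced earlier, could look as follows (all names are illustrative assumptions):

```python
from typing import Dict, Optional, Tuple

Region = Tuple[float, float, float, float]  # x_min, y_min, x_max, y_max

def map_spot_to_key(x: float, y: float, template: Dict[str, Region]) -> Optional[str]:
    """Compare a spot's virtual position with the prestored key regions.

    Returns the triggered key label, or None when the spot does not coincide
    with any key of the projected keyboard layout.
    """
    for key, (x_min, y_min, x_max, y_max) in template.items():
        if x_min <= x <= x_max and y_min <= y <= y_max:
            return key
    return None
```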
-the intelligent wearable device captures the tapping action of a human finger and feeds a key instruction back to the projection keyboard
In this embodiment, in order to further confirm the triggering action of the human finger on the projection keyboard and to prevent erroneous input when a finger enters the keyboard image without actually performing an operation, a dedicated intelligent wearable device is provided and placed in communication connection with the projection keyboard. The motion state of the human finger is detected by the intelligent wearable device, which further guarantees the accuracy of the information input.
Specifically, the intelligent wearable device includes an instruction module, and the instruction module is in communication connection with the projection keyboard body. The instruction module detects and captures the tapping action of the human finger and feeds a key instruction back to the projection keyboard, where the tapping action is the action of the human finger striking the keyboard image area with a certain pressure and speed.
In this embodiment, preferably, the instruction module of the intelligent wearable device includes a pressure sensing unit, which may be a pressure sensor. The pressure sensing unit senses the pressure of the motion of the human finger and compares the measured value with a preset pressure value; when the comparison shows that the pressure is greater than the preset pressure, the pressure sensing unit sends a first operation instruction outwards; when the comparison shows that the pressure is less than or equal to the preset pressure, no action is taken.
Preferably, in this embodiment, the instruction module further includes a speed sensing unit, which may be a speed sensor. The speed sensing unit senses the speed of the motion of the human finger and compares it with a preset speed; when the comparison shows that the speed is greater than the preset speed, the speed sensing unit generates a second operation instruction and sends it outwards; when the comparison shows that the speed is less than or equal to the preset speed, the speed sensing unit takes no action.
preferably, in this embodiment, the instruction module further includes an instruction unit, the instruction unit is respectively connected to the pressure sensing unit and the speed sensing unit in a communication manner, and when the instruction unit captures a first operation instruction sent by the pressure sensing unit and a second operation instruction sent by the speed sensing unit, it is determined that the human finger sends a tapping action, so as to generate a key instruction, and send the key instruction to the projection keyboard.
-the projected keyboard receives key commands and generates keyboard key events
In this embodiment, after the projection keyboard has confirmed, through the light spot reflected by the human finger on the infrared horizontal optical network, the specific key touched by the finger, the key module of the projection keyboard generates a keyboard key event and inputs the corresponding information into the intelligent terminal only after it has also confirmed the key instruction sent by the intelligent wearable device. This double confirmation guarantees the accuracy of the information input.
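A minimal sketch of this double-confirmation rule follows; the event format and argument names are illustrative assumptions rather than the patented implementation:

```python
from typing import Optional

def generate_key_event(optical_key: Optional[str], wearable_confirmed: bool) -> Optional[dict]:
    """Emit a keyboard key event only when the optically detected key exists
    AND the wearable device has sent a confirming key instruction."""
    if optical_key is not None and wearable_confirmed:
        return {"type": "key_down", "key": optical_key}
    return None  # either no finger over a key, or no confirmed tap
```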
-feeding back an electrical signal to the smart wearable device
In this embodiment, the projection keyboard further includes a first feedback module, which also captures the key instruction and feeds a corresponding electric signal back to the wearable device according to the key instruction. In this way, the information identifying the specific human finger that made the tapping action is fed back to the wearable device.
-the smart wearable device receives the electrical signal and feeds back a resilient stimulus to the first human finger
In this embodiment, the intelligent wearable device includes a second feedback module, which is in communication connection with the projection keyboard, so that the second feedback module can receive the electric signal fed back by the projection keyboard and, according to the electric signal, feed a rebound stimulus back to the human finger that made the tapping action. The user is thereby assured that his or her tap on one or more keys of the projection keyboard has been registered accurately.
Particularly preferably, in this embodiment, the second feedback module includes a second parsing unit, which is in communication connection with the projection keyboard, so that it can receive the electric signal, parse it, obtain the instruction content of the electric signal, and send the instruction content outwards.
Preferably, in this embodiment, the second feedback module further includes an information identification unit, which is in communication connection with the second parsing unit, so that it can receive the instruction content, identify according to it the human finger that made the tapping action, and send the identification information of that finger outwards.
Preferably, in this embodiment, the second feedback module further includes a feedback unit, which is in communication connection with the information identification unit; the feedback unit receives the identification information and sends a rebound stimulus to the human finger according to it. Preferably, the rebound stimulus may be an electrical stimulus, a pressure stimulus, or the like.
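For illustration, assuming the electric signal reaches the wearable device as a short message whose first byte identifies the finger (an assumption, not specified by the patent), the parse-identify-stimulate path could be sketched as:

```python
def handle_feedback_signal(message: bytes, finger_actuators: dict) -> None:
    """Parse the feedback signal, identify the tapping finger, and stimulate it.

    `finger_actuators` maps a finger identifier (0 = thumb ... 4 = little
    finger) to a hypothetical actuator object exposing a `pulse()` method.
    """
    finger_id = message[0]                       # second parsing unit: instruction content
    actuator = finger_actuators.get(finger_id)   # information identification unit
    if actuator is not None:
        actuator.pulse(duration_ms=30)           # feedback unit: rebound stimulus
```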
Referring to fig. 5, another aspect of the present invention provides a projection keyboard of an intelligent terminal, which mainly includes a projection keyboard body and a wearable device in communication connection with the projection keyboard body.
In this embodiment, the connection between the projection keyboard and the intelligent wearable device may be established via Bluetooth, a wireless network, a mobile network, a data cable, or the like. Preferably, the intelligent wearable device may be an intelligent finger stall, an intelligent glove, an intelligent bracelet, or an intelligent wristband.
In this embodiment, the projection keyboard and the intelligent wearable device specifically include the following modules:
-a keyboard projection module
The keyboard projection module is arranged in the projection keyboard body and is used for projecting a keyboard image at a target position. Specifically, the keyboard projection module may be an infrared laser projector or another image projection device. When the projection keyboard is started, the keyboard projection module projects a keyboard image at the target position selected by the user. Preferably, according to a preset setting, the keyboard image may be a standard 26-letter keyboard or a nine-grid input-method keyboard. In this embodiment, preferably, keyboard templates may be preset in the projection keyboard according to the preferences and requirements of the user, and when the projection keyboard is started, the keyboard projection module projects the corresponding keyboard image at the target position selected by the user according to the keyboard template.
-infrared horizontal optical network module
The infrared horizontal optical network module is arranged in the projection keyboard body and is used for horizontally projecting an infrared horizontal optical network at the position of the keyboard image. Specifically, the infrared horizontal optical network module may include an infrared line laser, which projects an infrared horizontal optical network parallel to and close to the horizontal surface. The infrared horizontal optical network is invisible to the naked eye of the user; it coincides with the position of the keyboard image and is aligned with the positions of the letters, numbers, and other function keys in the keyboard image.
-a camera module
The camera module is arranged in the projection keyboard body and is used for capturing the reflected light spot image of the human finger on the infrared horizontal light. Specifically, the camera module comprises an image sensor that captures images of human fingers placed within the keyboard image. Therefore, when a finger of the user is placed on the keyboard image and falls into the range of the infrared horizontal optical network, the infrared horizontal light irradiates the finger and is reflected back as a reflection light spot. At this time, the camera module captures a reflected light spot image containing the reflection light spot.
Preferably, in this embodiment, the infrared horizontal optical network is woven from the infrared horizontal light projected onto each key in the keyboard image, so that only when a human finger falls within the range of a key can the infrared horizontal light be projected onto the finger and reflect a reflection light spot.
-an image processing module
The image processing module is arranged in the projection keyboard body and is used for acquiring the position information of the human finger from the reflected light spot image. Specifically, the image processing module reads the actual position information of the human finger from the reflected light spot image.
Referring to fig. 6, in this embodiment, the image processing module preferably includes a first parsing unit, which converts the captured image into coordinate information representing position and then, for example using the principle of laser triangulation, extracts and calculates the actual position information of the reflection light spot in the reflected light spot image from the coordinate information.
-a data processing module
The data processing module is arranged in the projection keyboard body and is used for mapping the position information to the keyboard layout of the projection keyboard. Specifically, the data processing module comprises a microcontroller. A mapping mechanism is established in advance; after the actual position information of the reflection light spot is obtained, the data processing module reads it and maps it into an orthogonal coordinate system according to the mapping mechanism, so as to establish a one-to-one correspondence between the position information of the reflection light spot and the position information of the keyboard image.
With continued reference to fig. 6, preferably, the data processing module includes a first mapping unit, which, according to the preset mapping mechanism, maps the actual position information of the reflection light spot into an orthogonal coordinate system by converting from the world coordinate system to a screen coordinate system, thereby obtaining the virtual position information of the reflection light spot in the orthogonal coordinate system.
Further, the data processing module further comprises an analysis unit, which receives and reads the virtual position information of the reflection light spot in the orthogonal coordinate system and compares it with the keyboard position information prestored in the projection keyboard; when the virtual position information coincides with the position information of one or more keys of the projection keyboard, the analysis unit obtains the key information triggered by the human finger, that is, the position information of the human finger is mapped to the keyboard layout.
-a key module
The key module is arranged in the projection keyboard body and is used for receiving a key instruction and generating a keyboard key event. Specifically, in this embodiment, after the projection keyboard has confirmed, through the light spot reflected by the human finger on the infrared horizontal optical network, the specific key touched by the finger, the key module generates a keyboard key event and inputs the corresponding information into the intelligent terminal only after it has also confirmed the key instruction sent by the intelligent wearable device. This double confirmation guarantees the accuracy of the information input.
-a first feedback module
The first feedback module is arranged in the projection keyboard body and is used for receiving the key instruction and feeding an electric signal back to the wearable device. Specifically, the first feedback module also captures the key instruction and feeds a corresponding electric signal back to the wearable device according to the key instruction. In this way, the information identifying the specific human finger that made the tapping action is fed back to the wearable device.
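As a counterpart to the decoding sketch given earlier for the wearable device, the first feedback module's encoding step might look like the following; the one-byte message format and the 'finger_id' field are the same illustrative assumptions, not details given in the patent:

```python
def build_feedback_signal(key_instruction: dict) -> bytes:
    """Encode which finger made the tapping action into the electric signal.

    Assumes the key instruction received from the wearable device carries a
    'finger_id' field (0 = thumb ... 4 = little finger); only that identifier
    needs to travel back so the wearable device can stimulate the right finger.
    """
    return bytes([key_instruction["finger_id"]])
```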
-an instruction module
The instruction module is arranged in the intelligent wearable device, is in communication connection with the projection keyboard body, and is used for capturing the tapping action of the human finger and feeding a key instruction back to the projection keyboard. Specifically, the instruction module detects and captures the tapping action of the human finger and feeds a key instruction back to the projection keyboard, where the tapping action is the action of the human finger striking the keyboard image area with a certain pressure and speed.
Referring to fig. 7, in this embodiment, preferably, the instruction module of the intelligent wearable device includes a pressure sensing unit, which may be a pressure sensor. The pressure sensing unit senses the pressure of the motion of the human finger and compares the measured value with a preset pressure value; when the comparison shows that the pressure is greater than the preset pressure, the pressure sensing unit sends a first operation instruction outwards; when the comparison shows that the pressure is less than or equal to the preset pressure, no action is taken.
With continued reference to fig. 7, in this embodiment, the instruction module further preferably includes a speed sensing unit, which may be a speed sensor. The speed sensing unit senses the speed of the motion of the human finger and compares it with a preset speed; when the comparison shows that the speed is greater than the preset speed, the speed sensing unit generates a second operation instruction and sends it outwards; when the comparison shows that the speed is less than or equal to the preset speed, the speed sensing unit takes no action.
With continued reference to fig. 7, preferably, in this embodiment, the instruction module further includes an instruction unit, which is in communication connection with the pressure sensing unit and the speed sensing unit respectively. When the instruction unit captures both the first operation instruction sent by the pressure sensing unit and the second operation instruction sent by the speed sensing unit, it determines that the human finger has made a tapping action, generates a key instruction accordingly, and sends the key instruction to the projection keyboard.
-a second feedback module
The second feedback module is arranged in the intelligent wearable device, and can receive the electric signal fed back by the projection keyboard and feed a rebound stimulus back to the human finger. Specifically, the second feedback module is in communication connection with the projection keyboard, so that it can receive the electric signal fed back by the projection keyboard and, according to the electric signal, feed a rebound stimulus back to the human finger that made the tapping action. The user is thereby assured that his or her tap on one or more keys of the projection keyboard has been registered accurately.
Referring to fig. 8, preferably, in this embodiment, the second feedback module includes a second parsing unit, which is in communication connection with the projection keyboard, so that it can receive the electric signal, parse it, obtain the instruction content of the electric signal, and send the instruction content outwards.
With continued reference to fig. 8, preferably, in this embodiment, the second feedback module further includes an information identification unit, which is in communication connection with the second parsing unit, so that it can receive the instruction content, identify according to it the human finger that made the tapping action, and send the identification information of that finger outwards.
With continued reference to fig. 8, preferably, in this embodiment, the second feedback module further includes a feedback unit, which is in communication connection with the information identification unit; the feedback unit receives the identification information and sends a rebound stimulus to the human finger according to it. Preferably, the rebound stimulus may be an electrical stimulus, a pressure stimulus, or the like.
In conclusion, the projection keyboard is in communication connection with the intelligent wearable device, and the intelligent wearable device further monitors the tapping action of the human finger, ensuring the validity of the information input; the intelligent wearable device feeds a rebound stimulus back, in a timely and accurate manner, to the human finger that made the tapping action, so that the user can confirm that the input is valid, which increases the information input speed; and the intelligent wearable device monitors both the pressure and the speed of the motion of the human finger to confirm the tapping action, which effectively avoids erroneous input caused by unintentional finger movements and improves the validity of the information input.
It should be noted that the invention has been described above by way of preferred embodiments and not by way of limitation, and those skilled in the art may make modifications and variations to the embodiments described above without departing from the spirit of the invention.

Claims (8)

1. An operation method of a projection keyboard of an intelligent terminal is characterized by comprising the following steps:
establishing communication connection between the projection keyboard and an intelligent wearable device,
the projection keyboard projects a keyboard image at a target position and horizontally projects an infrared horizontal optical network at the position of the keyboard image,
when a first human finger falls within a first letter range in the keyboard image, the projection keyboard captures a reflection spot image of the first human finger on an infrared horizontal optical network,
acquiring position information of the first human finger, mapping to a keyboard layout,
the intelligent wearable device captures the knocking action of the first human body finger and feeds a key instruction back to the projection keyboard,
the projection keyboard receives the key command and generates a keyboard key event,
and feeds back an electric signal to the intelligent wearable device,
the intelligent wearable device receives the electric signal and feeds back a rebound stimulus to the first human finger;
the intelligent wearable device captures the knocking action of the first human body finger and feeds a key instruction back to the projection keyboard, and the step comprises the following steps,
the intelligent wearable device checks the speed and pressure of the first human finger to send out the action, and compares the speed and the pressure with a preset speed and a preset pressure respectively,
when the speed is higher than the preset speed and the pressure is higher than the preset pressure, judging that the first human finger sends out a knocking action,
and generating a key instruction and sending the key instruction to the projection keyboard.
2. The method of operation of claim 1,
a step of obtaining position information of the first human finger, mapping to a keyboard layout, comprising,
after the projection keyboard captures the reflected light spot image of the first human finger to the infrared horizontal light, calculating the actual position information of the reflected light spot in the reflected light spot image,
mapping the actual position information to an orthogonal coordinate system to obtain the virtual position information of the reflection light spot in the orthogonal coordinate system,
and comparing the virtual position information with a prestored keyboard position information, and mapping the virtual position information to the keyboard layout.
3. The method of operation of claim 1,
the step of receiving the electric signal and feeding back a rebounding stimulus to the first human finger position by the intelligent wearable device comprises the steps of,
after receiving the electric signal, the intelligent wearable equipment analyzes the electric signal to obtain the instruction content of the electric signal,
the intelligent wearable device identifies the first human body finger which sends the knocking action according to the instruction content,
the intelligent wearable device sends a rebound stimulus to the first human finger.
4. The method of operation of any of claims 1-3,
the intelligent wearable device comprises an intelligent finger stall, an intelligent glove and an intelligent bracelet.
5. A projection keyboard of an intelligent terminal, characterized by comprising:
a projection keyboard body and an intelligent wearable device in communication connection with the projection keyboard body,
wherein,
the projection keyboard body comprises:
the keyboard projection module projects a keyboard image at a target position,
an infrared horizontal optical network module for horizontally projecting an infrared horizontal optical network at the keyboard image position,
the camera module captures a light spot image reflected by a human finger to the infrared horizontal optical network,
an image processing module for acquiring the position information of the human finger from the reflected light spot image,
a data processing module to map the location information to a keyboard layout of the projected keyboard,
the key module receives a key command and generates a keyboard key event,
the first feedback module receives the key instruction and feeds back an electric signal to the intelligent wearable device,
the intelligent wearable device comprises:
the instruction module is in communication connection with the projection keyboard body, captures the knocking action of the human fingers and feeds a key instruction back to the projection keyboard,
the second feedback module is used for receiving the electric signal fed back by the projection keyboard and feeding back a rebound stimulus to the human finger;
the instruction module comprises:
the pressure sensing unit is used for checking the pressure of the human body finger for sending action, comparing the pressure with a preset pressure, and sending a first operation instruction outwards when the pressure is greater than the preset pressure,
the speed sensing unit is used for checking the speed of the human body finger for sending the action, comparing the speed with a preset speed, and sending a second operation instruction outwards when the speed is higher than the preset speed,
and the instruction unit is respectively in communication connection with the pressure sensing unit and the speed sensing unit, judges that the human finger sends a knocking action when receiving the first operation instruction and the second operation instruction, generates a key instruction and sends the key instruction to the projection keyboard.
6. The projection keyboard of claim 5,
the image processing module further comprises:
a first analyzing unit for analyzing the reflected light spot image and calculating the actual position information of the reflected light spot in the reflected light spot image,
the data processing module comprises:
a first mapping unit, which maps the actual position information to an orthogonal coordinate system to obtain the virtual position information of the reflection light spot in the orthogonal coordinate system,
and the analysis unit is used for comparing the virtual position information with a prestored keyboard position information and mapping the virtual position information to the keyboard layout.
7. The projection keyboard of claim 5,
the second feedback module may include a second feedback module,
a second analysis unit for receiving the electric signal, analyzing the electric signal and obtaining the instruction content of the electric signal,
the information identification unit identifies the human body finger which sends the knocking action according to the instruction content and sends the identification information of the human body finger outwards,
and the feedback unit is used for receiving the identification information and sending a rebound stimulus to the human body finger.
8. The projection keyboard of any of claims 5-7,
the intelligent wearable device comprises an intelligent finger stall, an intelligent glove and an intelligent bracelet.
CN201710833350.XA 2017-09-15 2017-09-15 Projection keyboard of intelligent terminal and operation method of projection keyboard Active CN107562205B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710833350.XA CN107562205B (en) 2017-09-15 2017-09-15 Projection keyboard of intelligent terminal and operation method of projection keyboard

Publications (2)

Publication Number Publication Date
CN107562205A CN107562205A (en) 2018-01-09
CN107562205B (en) 2021-08-13

Family

ID=60981127

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710833350.XA Active CN107562205B (en) 2017-09-15 2017-09-15 Projection keyboard of intelligent terminal and operation method of projection keyboard

Country Status (1)

Country Link
CN (1) CN107562205B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110654236A (en) * 2018-06-29 2020-01-07 比亚迪股份有限公司 Vehicle key system, control method thereof and vehicle
CN110058743A (en) * 2019-06-06 2019-07-26 广东好太太科技集团股份有限公司 A kind of the clothes airing machine system and its control method of projected keyboard control
CN111563459A (en) * 2020-05-09 2020-08-21 胡团伟 Finger motion acquisition method and finger motion acquisition equipment

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105339870A (en) * 2014-03-21 2016-02-17 三星电子株式会社 Method and wearable device for providing a virtual input interface
CN105739672A (en) * 2014-12-09 2016-07-06 深圳富泰宏精密工业有限公司 Projection input system and method

Family Cites Families (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106484125B (en) * 2006-12-19 2020-06-05 浙江讯翔信息科技有限公司 Man-machine interaction device, electronic device and method
CN101685342B (en) * 2008-09-26 2012-01-25 联想(北京)有限公司 Method and device for realizing dynamic virtual keyboard
CN201638148U (en) * 2009-09-10 2010-11-17 深圳市亿思达显示科技有限公司 Glove-type virtual input device
CN102096470A (en) * 2011-02-14 2011-06-15 厦门大学 Acceleration sensing-based virtual air keyboard
KR101896947B1 (en) * 2011-02-23 2018-10-31 엘지이노텍 주식회사 An apparatus and method for inputting command using gesture
CN102799317B (en) * 2012-07-11 2015-07-01 联动天下科技(大连)有限公司 Smart interactive projection system
US20140188606A1 (en) * 2013-01-03 2014-07-03 Brian Moore Systems and methods for advertising on virtual keyboards
KR20160045269A (en) * 2014-10-17 2016-04-27 엘지전자 주식회사 Wearable device and mobile terminal for supporting communication with the device
US20160261147A1 (en) * 2015-03-04 2016-09-08 PogoTec, Inc. Wireless power base unit and a system and method for body-worn repeater charging of wearable electronic devices
US9746921B2 (en) * 2014-12-31 2017-08-29 Sony Interactive Entertainment Inc. Signal generation and detector systems and methods for determining positions of fingers of a user
GB2534386A (en) * 2015-01-21 2016-07-27 Kong Liang Smart wearable input apparatus
CN104978142B (en) * 2015-06-17 2018-07-31 华为技术有限公司 A kind of control method of intelligent wearable device and intelligent wearable device
US9898809B2 (en) * 2015-11-10 2018-02-20 Nanjing University Systems, methods and techniques for inputting text into mobile devices using a camera-based keyboard
CN106095136A (en) * 2016-06-10 2016-11-09 北京行云时空科技有限公司 A kind of wearable device controls the method for intelligent terminal
CN105975091A (en) * 2016-07-05 2016-09-28 南京理工大学 Virtual keyboard human-computer interaction technology based on inertial sensor
CN106933376B (en) * 2017-03-23 2018-03-13 哈尔滨拓博科技有限公司 A kind of scaling method of smooth projected keyboard

Also Published As

Publication number Publication date
CN107562205A (en) 2018-01-09

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20221214

Address after: 201203 1st floor, building 1, Lane 36, Xuelin Road, Pudong New Area Free Trade Zone, Shanghai

Patentee after: SHANGHAI TRANSSION INFORMATION TECHNOLOGY Ltd.

Address before: Room 922 / 926, block a, No.1 Lane 399, shengxia Road, Pudong New Area pilot Free Trade Zone, Shanghai 201203

Patentee before: SHANGHAI SPREADRISE COMMUNICATION TECHNOLOGY Ltd.