Summary of the Invention
In view of the above problems, the present invention provides a virtual keyboard operation device, system, and method that overcome, or at least partly solve, the problems described above.
The present invention provides a virtual keyboard operation device. The device includes a display control unit, a display unit, and the following units:
A view recognition unit, configured to perform recognition on the image data of the user's fingers collected by the cameras, so as to determine the positions of the finger end points in the images;
A horizontal-plane two-dimensional coordinate establishing unit, configured to convert the finger end-point pixel positions into two-dimensional coordinate values on the XZ coordinate plane, according to the in-image finger end-point positions identified by the view recognition unit and the pixel resolution of the camera;
A vertical-plane two-dimensional coordinate establishing unit, configured to convert the finger end-point pixel positions into two-dimensional coordinate values on the YZ coordinate plane, according to the in-image finger end-point positions identified by the view recognition unit and the pixel resolution of the camera;
A three-dimensional coordinate computing unit, configured to compute the coordinate values of the finger end points in the XYZ three-dimensional coordinate system from the two-dimensional coordinate values, on the XZ and YZ coordinate planes, of the finger end-point pixel positions determined by the horizontal-plane two-dimensional coordinate establishing unit and the vertical-plane two-dimensional coordinate establishing unit respectively;
An action judging unit, configured to judge the user's operations on the keys of the virtual keyboard, according to the three-dimensional coordinate values of the finger end points computed by the three-dimensional coordinate computing unit and the position regions corresponding to the keys of the virtual keyboard; and
A chart drawing unit, configured to draw, according to the judgment result of the action judging unit, an operation image that simulates the user's fingers on the corresponding keys of the virtual keyboard, and to call the display control unit to control the display unit to display the operation image.
The present invention also provides a virtual keyboard operating system, including the virtual keyboard operation device of any of the above embodiments and two image pickup devices communicatively connected to the device.
The present invention also provides a virtual keyboard operating method, including:
The user suspends his or her fingers within an image capture area; recognition is performed on the image data of the user's fingers collected by the cameras, so as to determine the positions of the finger end points in the images.
According to the identified in-image positions of the finger end points and the pixel resolution of the cameras, the finger end-point pixel positions are converted into two-dimensional coordinate values on the XZ coordinate plane and on the YZ coordinate plane respectively.
The coordinate values of the finger end points in the XYZ three-dimensional coordinate system are computed from the two-dimensional coordinate values of the finger end-point pixel positions on the XZ and YZ coordinate planes.
The user's operations on the keys of the virtual keyboard are judged according to the computed three-dimensional coordinate values of the finger end points and the position regions corresponding to the keys of the virtual keyboard.
An operation image simulating the user's fingers on the corresponding keys of the virtual keyboard is drawn according to the judgment result, and the drawn image is displayed for the user to view.
With the virtual keyboard operation device, system, and method provided by the present invention, images are captured by cameras and the positions and postures of the finger end points are recognized from them; the obtained finger end-point coordinates are mapped directly into operation actions on a virtual keyboard, which are displayed on a screen as feedback to the user. A virtual keyboard input environment can thus be built quickly with the cameras of smart glasses and a smart bracelet, or of a smart portable mobile device, without any physical device, allowing the user to perform human-computer interaction through the virtual keyboard anytime and anywhere.
Detailed Description of the Embodiments
To describe the technical content, structural features, objects, and effects of the present invention in detail, embodiments are explained below with reference to the accompanying drawings.
Referring to Fig. 1, which is a schematic diagram of the hardware architecture of the virtual keyboard operating system in an embodiment of the present invention, the system 100 includes a virtual keyboard operation device 10, two image pickup devices 20, and a display device 21, and implements touch input by detecting the user's gestures.
Referring also to Fig. 2, which is a functional block diagram of the virtual keyboard operation device in an embodiment of the present invention, the device 10 includes an ambient brightness sensing unit 101, a view recognition unit 102, a horizontal-plane two-dimensional coordinate establishing unit 105, a vertical-plane two-dimensional coordinate establishing unit 106, a three-dimensional coordinate computing unit 107, an action judging unit 108, a chart drawing unit 109, a display control unit 110, and a display unit 111. The device 10 may be arranged in an electronic device such as a camera, a mobile phone, or a tablet computer. The image pickup devices 20 are communicatively connected to the device 10 through a network whose transmission medium may be a wireless medium such as Bluetooth, ZigBee, or WiFi.
Each image pickup device 20 includes a first camera 201 and a second camera 202, and the two image pickup devices 20 serve as a longitudinal image pickup device and a transverse image pickup device respectively. The longitudinal image pickup device may be a mobile portable electronic device positioned above the user's hands, such as smart glasses; the transverse image pickup device may be a mobile portable electronic device that can be placed in front of the user, such as a smart bracelet. Further, the first camera 201 and the second camera 202 of each image pickup device 20 are an ordinary camera and an infrared camera respectively. The ordinary camera collects images of the user's operation actions under good lighting conditions and sends them to the device 10 for analysis; the infrared camera collects images of the user's operation actions under poor lighting conditions and sends them to the device 10 for analysis. The view recognition unit 102 includes a longitudinal view recognition subunit 103 and a transverse view recognition subunit 104, arranged to correspond to the longitudinal image pickup device and the transverse image pickup device respectively and configured to recognize and process the images they collect.
In the initial state, the two pairs of cameras (a pair of ordinary cameras and a pair of infrared cameras) are used together, with their shooting directions set perpendicular to each other, so that hand actions can be captured in the vertical and horizontal directions simultaneously. Typically, the two cameras in the smart glasses (one ordinary camera and one infrared camera) face downward, while the two cameras in the smart bracelet or smartphone (one ordinary camera and one infrared camera) are placed horizontally. The rectangular region covered by both pairs of cameras together forms the image capture area.
The ambient brightness sensing unit 101 senses the brightness of the environment and sends the ambient brightness value to the view recognition unit 102. The view recognition unit 102 decides whether to use the ordinary cameras or the infrared cameras according to a preset luminance threshold. For example, if the brightness sensing range is 1 to 100 and the threshold is 50, the ordinary cameras are used when the ambient brightness value is greater than 50, and the infrared camera images are used when the ambient brightness value is 50 or below.
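The threshold rule above can be sketched as follows. This is a minimal illustration only; the function and constant names are ours, not from the specification, and the behaviour at exactly the threshold is an assumption (the text leaves it unspecified, so the sketch falls back to infrared there).

```python
AMBIENT_MIN, AMBIENT_MAX = 1, 100  # brightness sensing range from the example
THRESHOLD = 50                     # preset luminance threshold from the example

def select_camera(ambient_brightness: int) -> str:
    """Decide which camera type the view recognition unit should use."""
    if not AMBIENT_MIN <= ambient_brightness <= AMBIENT_MAX:
        raise ValueError("brightness outside sensing range")
    # Above the threshold the lighting is good enough for the ordinary
    # camera; otherwise the infrared camera is used.
    return "ordinary" if ambient_brightness > THRESHOLD else "infrared"
```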
After the camera type to use has been determined from the ambient brightness value, an initial positioning operation starts, as follows. During initial positioning, the user suspends the fingers of both hands to be used for operation at a position that both selected camera groups can capture, i.e. within the image capture area, and keeps them still for a certain time, so as to complete the initialization of the user's hand position; this allows the device 10 to recognize and locate the initial positions of the finger end points for subsequent operation. The principle by which the device 10 recognizes and locates the finger end-point positions is described in detail below.
During interactive operation, the user suspends the fingers within the image capture area. The longitudinal view recognition subunit 103 decides whether to use the ordinary camera or the infrared camera according to the ambient brightness value detected by the ambient brightness sensing unit 101, and, once the camera to use has been determined, performs recognition on the image data collected by the ordinary camera or infrared camera of the longitudinal image pickup device above the fingers, so as to determine the positions of the finger end points in the image. The transverse view recognition subunit 104 likewise decides whether to use the ordinary camera or the infrared camera according to the ambient brightness value detected by the ambient brightness sensing unit 101, and, once the camera to use has been determined, performs recognition on the image data collected by the ordinary camera or infrared camera of the transverse image pickup device in front of the fingers, so as to determine the positions of the finger end points in the image.
The in-image finger end-point positions determined by the longitudinal view recognition subunit 103 are the pixel positions of the finger end points in the XZ-plane image. For example, the longitudinal view recognition subunit 103 may identify that the pixel of finger end point 1 lies at row a, column b of the XZ-plane image, the pixel of end point 2 at row c, column d, ..., and the pixel of end point 10 at row e, column f. The in-image finger end-point positions determined by the transverse view recognition subunit 104 are the pixel positions of the finger end points in the YZ-plane image. For example, the transverse view recognition subunit 104 may identify that the pixel of end point 1 lies at row g, column h of the YZ-plane image, the pixel of end point 2 at row i, column j, ..., and the pixel of end point 10 at row k, column l.
Further, the methods for identifying the finger end points with an ordinary camera include a colour-background method and a coloured-glove auxiliary method. In the colour-background method, the environmental background of the two-hand operation must be relatively simple and uniform in colour, so the hand image can be extracted directly through the colour interval range of human skin; the cut-off positions of the strip-shaped extensions of the hand are then computed with a figure end-point algorithm and taken as the finger end-point positions. In the coloured-glove auxiliary method, the user wears special pure-red gloves; since an ordinary camera samples in RGB (red-green-blue), the pure-red region positions can be extracted directly. Green or blue may also be used as the glove finger-end colour.
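The colour-interval extraction underlying both methods can be sketched as follows, here for the pure-red glove case. This is an illustrative sketch only: the threshold values and function name are ours, and a real implementation would operate on camera frames rather than a nested list of RGB tuples.

```python
def extract_red_regions(image, r_min=200, gb_max=60):
    """Return (row, col) positions whose colour falls in a 'pure red' interval.

    `image` is a row-major list of rows of (R, G, B) tuples; the thresholds
    stand in for the colour interval range mentioned in the text.
    """
    hits = []
    for row_idx, row in enumerate(image):
        for col_idx, (r, g, b) in enumerate(row):
            # A pixel counts as glove material when red dominates strongly.
            if r >= r_min and g <= gb_max and b <= gb_max:
                hits.append((row_idx, col_idx))
    return hits
```

An end-point algorithm would then scan the extracted region for the tips of its strip-shaped extensions.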
The methods for identifying the finger end points with an infrared camera include a temperature filtering method and a coloured-glove auxiliary method. In the temperature filtering method, since the surface temperature of the human body is higher than the ambient temperature during two-hand operation, the warmer hand image can be extracted directly; the cut-off positions of the strip-shaped extensions of the hand are then computed with the figure end-point algorithm and taken as the finger end-point positions. In the coloured-glove auxiliary method, the user wears special gloves whose surfaces generate heat, so the hot-spot regions in the image can be extracted directly.
The horizontal-plane two-dimensional coordinate establishing unit 105 converts the pixel positions of the ten finger end points into two-dimensional coordinate values on the XZ coordinate plane, according to the in-image finger end-point positions identified by the longitudinal view recognition subunit 103 and the pixel resolution of the camera. The vertical-plane two-dimensional coordinate establishing unit 106 converts the pixel positions of the ten finger end points into two-dimensional coordinate values on the YZ coordinate plane, according to the in-image finger end-point positions identified by the transverse view recognition subunit 104 and the pixel resolution of the camera.
The principle of converting a finger end-point pixel position into a two-dimensional coordinate value on the XZ coordinate plane is as follows. The pixel at the lower-left corner of the image is taken as the origin 0 of the two-dimensional coordinate system, and the ratios of the coordinate value ranges to the row and column counts of the image are computed from the image resolution and the target coordinate value ranges. For example, if the XZ-plane image resolution is 2000 pixels wide by 1000 pixels high, and the coordinate value ranges of the two-dimensional XZ coordinate system are 1 to 150 on the X axis and 1 to 100 on the Z axis, then the ratio of the Z-axis coordinate range to the image row count is 100/1000, and the ratio of the X-axis coordinate range to the image column count is 150/2000. Multiplying the pixel position of a finger end point by these ratios yields its two-dimensional coordinate value. For example, if the pixel position of a finger end point is row 300, column 200, then the Z coordinate of that end point is 300*100/1000 = 30 and its X coordinate is 200*150/2000 = 15. The principle of converting finger end-point pixel positions into two-dimensional coordinate values on the YZ coordinate plane is the same and is not repeated here.
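The conversion above can be written out directly; the sketch below reproduces the worked example (a 2000x1000 image mapped to X 1..150, Z 1..100). The function name and argument order are ours, not from the specification.

```python
def pixel_to_plane_coords(row, col, img_width, img_height, x_range, z_range):
    """Convert an image pixel position (row, col) to (X, Z) plane coordinates.

    The coordinate ranges are scaled against the image row/column counts,
    as described in the text (rows map to Z, columns map to X).
    """
    z = row * z_range / img_height   # e.g. 300 * 100 / 1000 = 30
    x = col * x_range / img_width    # e.g. 200 * 150 / 2000 = 15
    return x, z
```

The YZ-plane conversion is identical in form, with the transverse image's rows and columns mapped to Z and Y.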
The three-dimensional coordinate computing unit 107 computes the coordinate values of the finger end points in the XYZ three-dimensional coordinate system from the two-dimensional coordinate values, on the XZ and YZ coordinate planes, of the ten finger end-point pixel positions determined by the horizontal-plane two-dimensional coordinate establishing unit 105 and the vertical-plane two-dimensional coordinate establishing unit 106 respectively.
The principle of computing the coordinate values of the finger end points in the XYZ three-dimensional coordinate system is as follows. Since the XZ and YZ coordinate planes share a common Z axis, the Z values of the coordinate end points on the XZ plane and the Z values of the coordinate end points on the YZ plane are all extracted and compared; end points whose Z coordinate values are identical or closest are regarded as the same end point, and the XZ-plane coordinate value and YZ-plane coordinate value judged to belong to the same end point are merged into one coordinate end point, whose value is taken as the coordinate in the XYZ three-dimensional coordinate system. Because the two Z values may differ, the Z value of the resulting three-dimensional coordinate is the XZ-plane Z value plus the YZ-plane Z value, divided by 2; the X and Y values of the three-dimensional coordinate equal the X coordinate value from the XZ plane and the Y coordinate value from the YZ plane respectively.
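The merging step above can be sketched as follows. This is a simplified illustration under our own assumptions: points are plain (x, z) and (y, z) tuples, each YZ point is paired greedily with the XZ point whose Z is closest, and the function name is ours.

```python
def merge_endpoints(xz_points, yz_points):
    """Merge (x, z) and (y, z) observations into (x, y, z) coordinates.

    For each XZ-plane end point, the YZ-plane end point with the closest
    Z value is treated as the same physical fingertip; the two Z values
    are averaged, as described in the text.
    """
    merged = []
    remaining = list(yz_points)
    for x, z_xz in xz_points:
        # Pick the unmatched YZ end point whose Z is closest to this one.
        y, z_yz = min(remaining, key=lambda p: abs(p[1] - z_xz))
        remaining.remove((y, z_yz))
        merged.append((x, y, (z_xz + z_yz) / 2))
    return merged
```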
The action judging unit 108 judges whether the user has pressed a key of the keyboard, according to the three-dimensional coordinate values of the finger end points computed by the three-dimensional coordinate computing unit 107 and the position regions corresponding to the keys of the virtual keyboard.
In this embodiment, during the hand-position initialization phase, the action judging unit 108 takes the lowest end point in the vertical direction among the three-dimensional coordinate values of all finger end points (that is, the minimum Y-axis coordinate value) as the Y-axis value of the click judgment plane, maps the three-dimensional coordinate values of the ten finger end points onto the key regions of the virtual keyboard, and determines the selected keys from the key regions into which the finger end points fall, thereby determining the key information.
Because this is the hand-position initialization phase, the action judging unit 108 uses this Y value to set the initial value of the click judgment plane, so the Y values of the hand coordinates are all greater than or equal to the judgment value of the click judgment plane. Afterwards, when the user moves the hands in normal operating mode, the action judging unit 108 no longer resets the Y-axis value of the click judgment plane each time it receives the three-dimensional coordinates of the finger end points; instead, it judges directly from that Y-axis value whether an effective click action has occurred on the virtual touch surface.
The mapping of the three-dimensional coordinate values of the finger end points onto the key regions of the keyboard is as follows. Each key of the keyboard has a corresponding two-dimensional coordinate region in the XZ plane; for example, the coordinate region of key A in the XZ plane may be the rectangular region formed by 3 to 5 on the X axis and 7 to 9 on the Z axis, and as soon as the XZ coordinate value of any end point falls within that rectangular region, it is judged that the user's finger end point is above key A.
The judgment of a click action from the three-dimensional coordinate values of the finger end points is as follows. Once the Y-axis value of the click judgment plane has been selected, whenever the Y value in a finger end point's three-dimensional coordinate is lower than that value, it is judged that the end point has passed through the click judgment plane, i.e. that the finger has performed a click; combined with the key region in which the finger end point lies, this finally determines which key the user has clicked.
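The two judgments above, key lookup in the XZ plane and click detection against the Y-axis plane, can be sketched together as follows. The key table and all names are illustrative; only key "A" uses the ranges given in the text, and the second key is a made-up neighbour.

```python
# Hypothetical key regions: key -> ((x_min, x_max), (z_min, z_max)).
# "A" matches the example in the text (X 3..5, Z 7..9).
KEY_REGIONS = {
    "A": ((3, 5), (7, 9)),
    "S": ((5, 7), (7, 9)),  # invented neighbour for illustration
}

def key_under_finger(x, z):
    """Return the key whose XZ region contains the fingertip, if any."""
    for key, ((x0, x1), (z0, z1)) in KEY_REGIONS.items():
        if x0 <= x <= x1 and z0 <= z <= z1:
            return key
    return None

def detect_click(finger_xyz, click_plane_y):
    """Return the clicked key, or None if the finger stays above the plane."""
    x, y, z = finger_xyz
    if y < click_plane_y:  # fingertip has passed through the click plane
        return key_under_finger(x, z)
    return None
```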
The chart drawing unit 109 draws, according to the judgment result of the action judging unit 108, i.e. on which key regions the ten finger end points respectively fall and which keys are selected, each finger of the simulated user's hand on the corresponding key of the virtual keyboard; a key that is selected and on which a click has occurred is drawn as highlighted to indicate that it has been clicked.
The display control unit 110 converts the image drawn by the chart drawing unit 109 into a timing sequence that the display device 21 can display, and calls the display unit 111 to show the image of the operation on the virtual keyboard on the display device 21 for the user to view; from this feedback the user can learn which key each finger currently corresponds to and continue operating the virtual keys.
Referring to Fig. 3, which is a schematic flowchart of the virtual keyboard operating method in an embodiment of the present invention, the method includes the following steps.
Step S30: the ambient brightness sensing unit 101 senses the brightness of the environment, and the view recognition unit 102 decides whether to use the ordinary cameras or the infrared cameras according to the preset luminance threshold and the ambient brightness value sensed by the ambient brightness sensing unit 101.
In the initial state, the two pairs of cameras (a pair of ordinary cameras and a pair of infrared cameras) are used together, with their shooting directions set perpendicular to each other, so that hand actions can be captured in the vertical and horizontal directions simultaneously. Typically, the two cameras in the smart glasses (one ordinary camera and one infrared camera) face downward, while the two cameras in the smart bracelet or smartphone (one ordinary camera and one infrared camera) are placed horizontally. The rectangular region covered by both pairs of cameras together forms the image capture area.
Step S31: the user suspends the fingers of both hands to be used for operation within the image capture area and keeps them still for a certain time, so that the device 10 recognizes and locates the initial positions of the fingers, completing the initialization of the user's finger positions. The principle by which the device 10 recognizes and locates the finger positions is described in detail below.
Step S32: the user suspends the fingers of both hands to be used for operation within the image capture area. The longitudinal view recognition subunit 103 performs recognition on the image data collected by the ordinary camera or infrared camera of the longitudinal image pickup device above the fingers, so as to determine the positions of the finger end points in the image. The transverse view recognition subunit 104 performs recognition on the image data collected by the ordinary camera or infrared camera of the transverse image pickup device in front of the fingers, so as to determine the positions of the finger end points in the image.
Specifically, the in-image finger end-point positions determined by the longitudinal view recognition subunit 103 are the pixel positions of the finger end points in the XZ-plane image. For example, the longitudinal view recognition subunit 103 may identify that the pixel of finger end point 1 lies at row a, column b of the XZ-plane image, the pixel of end point 2 at row c, column d, ..., and the pixel of end point 10 at row e, column f. The in-image finger end-point positions determined by the transverse view recognition subunit 104 are the pixel positions of the finger end points in the YZ-plane image. For example, the transverse view recognition subunit 104 may identify that the pixel of end point 1 lies at row g, column h of the YZ-plane image, the pixel of end point 2 at row i, column j, ..., and the pixel of end point 10 at row k, column l.
Further, the methods for identifying the finger end points with an ordinary camera include a colour-background method and a coloured-glove auxiliary method. In the colour-background method, the environmental background of the two-hand operation must be relatively simple and uniform in colour, so the hand image can be extracted directly through the colour interval range of human skin; the cut-off positions of the strip-shaped extensions of the hand are then computed with a figure end-point algorithm and taken as the finger end-point positions. In the coloured-glove auxiliary method, the user wears special pure-red gloves; since an ordinary camera samples in RGB (red-green-blue), the pure-red region positions can be extracted directly. Green or blue may also be used as the glove finger-end colour.
The methods for identifying the finger end points with an infrared camera include a temperature filtering method and a coloured-glove auxiliary method. In the temperature filtering method, since the surface temperature of the human body is higher than the ambient temperature during two-hand operation, the warmer hand image can be extracted directly; the cut-off positions of the strip-shaped extensions of the hand are then computed with the figure end-point algorithm and taken as the finger end-point positions. In the coloured-glove auxiliary method, the user wears special gloves whose surfaces generate heat, so the hot-spot regions in the image can be extracted directly.
Step S33: the horizontal-plane two-dimensional coordinate establishing unit 105 converts the pixel positions of the ten finger end points into two-dimensional coordinate values on the XZ coordinate plane, according to the in-image finger end-point positions identified by the longitudinal view recognition subunit 103 and the pixel resolution of the camera. The vertical-plane two-dimensional coordinate establishing unit 106 converts the pixel positions of the ten finger end points into two-dimensional coordinate values on the YZ coordinate plane, according to the in-image finger end-point positions identified by the transverse view recognition subunit 104 and the pixel resolution of the camera.
The principle of converting a finger end-point pixel position into a two-dimensional coordinate value on the XZ coordinate plane is as follows. The pixel at the lower-left corner of the image is taken as the origin 0 of the two-dimensional coordinate system, and the ratios of the coordinate value ranges to the row and column counts of the image are computed from the image resolution and the target coordinate value ranges. For example, if the XZ-plane image resolution is 2000 pixels wide by 1000 pixels high, and the coordinate value ranges of the two-dimensional XZ coordinate system are 1 to 150 on the X axis and 1 to 100 on the Z axis, then the ratio of the Z-axis coordinate range to the image row count is 100/1000, and the ratio of the X-axis coordinate range to the image column count is 150/2000. Multiplying the pixel position of a finger end point by these ratios yields its two-dimensional coordinate value. For example, if the pixel position of a finger end point is row 300, column 200, then the Z coordinate of that end point is 300*100/1000 = 30 and its X coordinate is 200*150/2000 = 15. The principle of converting finger end-point pixel positions into two-dimensional coordinate values on the YZ coordinate plane is the same and is not repeated here.
Step S34: the three-dimensional coordinate computing unit 107 computes the coordinate values of the finger end points in the XYZ three-dimensional coordinate system from the two-dimensional coordinate values, on the XZ and YZ coordinate planes, of the ten finger end-point pixel positions determined by the horizontal-plane two-dimensional coordinate establishing unit 105 and the vertical-plane two-dimensional coordinate establishing unit 106 respectively.
The method of computing the coordinate values of the finger end points in the XYZ three-dimensional coordinate system is as follows. Since the XZ and YZ coordinate planes share a common Z axis, the Z values of the coordinate end points on the XZ plane and the Z values of the coordinate end points on the YZ plane are all extracted and compared; end points whose Z coordinate values are identical or closest are regarded as the same end point, and the XZ-plane coordinate value and YZ-plane coordinate value judged to belong to the same end point are merged into one coordinate end point, whose value is taken as the coordinate in the XYZ three-dimensional coordinate system. Because the two Z values may differ, the Z value of the resulting three-dimensional coordinate is the XZ-plane Z value plus the YZ-plane Z value, divided by 2; the X and Y values of the three-dimensional coordinate equal the X coordinate value from the XZ plane and the Y coordinate value from the YZ plane respectively.
Step S35: the action judging unit 108 judges whether the user has pressed a key of the keyboard, according to the three-dimensional coordinate values of the finger end points computed by the three-dimensional coordinate computing unit 107 and the position regions corresponding to the keys of the virtual keyboard.
In this embodiment, during the hand-position initialization phase, the action judging unit 108 takes the lowest end point in the vertical direction among the three-dimensional coordinate values of all finger end points (that is, the minimum Y-axis coordinate value) as the Y-axis value of the click judgment plane, maps the three-dimensional coordinate values of the ten finger end points onto the key regions of the virtual keyboard, and determines the selected keys from the key regions into which the finger end points fall, thereby determining the key information.
Because this is the hand-position initialization phase, the action judging unit 108 uses this Y value to set the initial value of the click judgment plane, so the Y values of the hand coordinates are all greater than or equal to the judgment value of the click judgment plane. Afterwards, when the user moves the hands in normal operating mode, the action judging unit 108 no longer resets the Y-axis value of the click judgment plane each time it receives the three-dimensional coordinates of the finger end points; instead, it judges directly from that Y-axis value whether an effective click action has occurred on the virtual touch surface.
The mapping of the three-dimensional coordinate values of the finger end points onto the key regions of the keyboard is as follows. Each key of the keyboard has a corresponding two-dimensional coordinate region in the XZ plane; for example, the coordinate region of key A in the XZ plane may be the rectangular region formed by 3 to 5 on the X axis and 7 to 9 on the Z axis, and as soon as the XZ coordinate value of any end point falls within that rectangular region, it is judged that the user's finger end point is above key A.
The judgment of a click action from the three-dimensional coordinate values of the finger end points is as follows. Once the Y-axis value of the click judgment plane has been selected, whenever the Y value in a finger end point's three-dimensional coordinate is lower than that value, it is judged that the end point has passed through the click judgment plane, i.e. that the finger has performed a click; combined with the key region in which the finger end point lies, this finally determines which key the user has clicked.
Step S36: the chart drawing unit 109 draws, according to the judgment result of the action judging unit 108, i.e. on which key regions the ten finger end points respectively fall and which keys are selected, each finger of the simulated user's hand on the corresponding key of the virtual keyboard; a key that is selected and on which a click has occurred is drawn as highlighted to indicate that it has been clicked.
Step S37: the display control unit 110 converts the image drawn by the chart drawing unit 109 into a timing sequence that the display device 21 can display, and calls the display unit 111 to show the image of the operation on the virtual keyboard on the display device 21 for the user to view; from this feedback the user can learn which key each finger currently corresponds to and continue operating the virtual keys.
With the virtual keyboard operation device, system, and method provided by the present invention, images are captured by cameras and the positions and postures of the finger end points are recognized from them; the obtained finger end-point coordinates are mapped directly into operation actions on a virtual keyboard, which are displayed on a screen as feedback to the user. A virtual keyboard input environment can thus be built quickly with the cameras of smart glasses and a smart bracelet, or of a smart portable mobile device, without any physical device, allowing the user to perform human-computer interaction through the virtual keyboard anytime and anywhere.
The foregoing describes only embodiments of the present invention and does not thereby limit the scope of the claims of the present invention. Any equivalent structural or process transformation made using the contents of the specification and accompanying drawings of the present invention, or any direct or indirect application in other related technical fields, is likewise included within the scope of patent protection of the present invention.