CN104199550B - Virtual keyboard operation device, system and method - Google Patents


Info

Publication number
CN104199550B
CN104199550B CN201410436989.0A
Authority
CN
China
Prior art keywords
coordinate
end points
value
finger
camera
Prior art date
Legal status
Active
Application number
CN201410436989.0A
Other languages
Chinese (zh)
Other versions
CN104199550A (en)
Inventor
廖裕民
Current Assignee
Rockchip Electronics Co Ltd
Original Assignee
Fuzhou Rockchip Electronics Co Ltd
Priority date
Filing date
Publication date
Application filed by Fuzhou Rockchip Electronics Co Ltd
Priority to CN201410436989.0A
Publication of CN104199550A
Application granted
Publication of CN104199550B
Legal status: Active


Abstract

The invention provides a virtual keyboard operation device, system and method. In the method, cameras collect image data of the user's fingers to determine the positions of the finger end points in the images. Combined with the pixel resolution of the cameras, the pixel positions of the finger end points are converted into two-dimensional coordinate values on the XZ coordinate plane and on the YZ coordinate plane respectively. From the two-dimensional coordinate values of the finger end-point pixel positions on the XZ and YZ coordinate planes, the coordinate values of the finger end points in an XYZ three-dimensional coordinate system are calculated. Combined with the position area corresponding to each key of the virtual keyboard, the user's operation on the keys of the virtual keyboard is judged; an image simulating the user's fingers on the corresponding keys of the virtual keyboard is drawn according to the judgment result, and the drawn image is displayed for the user to watch. With the device, system and method, the user can conveniently perform free human-computer interaction through the virtual keyboard anytime and anywhere.

Description

Virtual keyboard operation device, system and method
Technical field
The present invention relates to the field of virtual touch control technology, and more particularly to a virtual keyboard operation device, system and method.
Background technology
The keyboard is an important component of human-computer interaction in the prior art, and existing keyboard input relies entirely on physical keyboard devices. In other words, a physically existing keyboard is necessary to complete human-computer interaction, which greatly limits the places and conditions under which human-computer interaction can be carried out.
The content of the invention
In view of the above problems, the present invention provides a virtual keyboard operation device, system and method that overcome, or at least partly solve, the problems described above.
The present invention provides a virtual keyboard operation device including a display control unit and a display unit. The device comprises:
A view recognition unit, configured to perform shape recognition on the image data of the user's fingers collected by the cameras, so as to determine the positions of the finger end points in the images.
A horizontal-plane two-dimensional coordinate establishing unit, configured to convert the finger end-point pixel positions into two-dimensional coordinate values on the XZ coordinate plane, according to the positions of the finger end points in the image identified by the view recognition unit and the pixel resolution of the camera.
A vertical-plane two-dimensional coordinate establishing unit, configured to convert the finger end-point pixel positions into two-dimensional coordinate values on the YZ coordinate plane, according to the positions of the finger end points in the image identified by the view recognition unit and the pixel resolution of the camera.
A three-dimensional coordinate computing unit, configured to calculate the coordinate values of the finger end points in an XYZ three-dimensional coordinate system from the two-dimensional coordinate values on the XZ and YZ coordinate planes determined respectively by the horizontal-plane and vertical-plane two-dimensional coordinate establishing units.
An action judging unit, configured to judge the user's operation on the key positions of the virtual keyboard, according to the three-dimensional coordinate values of the finger end points calculated by the three-dimensional coordinate computing unit and the position area corresponding to each key position of the virtual keyboard; and
A chart drawing unit, configured to draw an image simulating the user's fingers on the corresponding key positions of the virtual keyboard according to the judgment result of the action judging unit, and to call the display control unit to control the display unit to display the operation image.
The present invention also provides a virtual keyboard operating system, including the virtual keyboard operation device described in any of the above and two camera devices communicatively connected with the device.
The present invention also provides a virtual keyboard operating method, the method including:
The user holds the fingers suspended in the image capture area; shape recognition is performed on the image data of the user's fingers collected by the cameras, so as to determine the positions of the finger end points in the images.
According to the identified positions of the finger end points in the images and the pixel resolution of the cameras, the finger end-point pixel positions are converted into two-dimensional coordinate values on the XZ coordinate plane and on the YZ coordinate plane respectively.
The coordinate values of the finger end points in the XYZ three-dimensional coordinate system are calculated from the two-dimensional coordinate values of the finger end-point pixel positions on the XZ and YZ coordinate planes.
According to the calculated three-dimensional coordinate values of the finger end points and the position area corresponding to each key position of the virtual keyboard, the user's operation on the key positions of the virtual keyboard is judged.
An image simulating the user's fingers on the corresponding key positions of the virtual keyboard is drawn according to the judgment result; and
The drawn image is displayed for the user to watch.
With the virtual keyboard operation device, system and method provided by the present invention, the cameras capture images and recognize the positions and postures of the finger end points; the obtained finger end-point coordinates are mapped directly into operation actions on the virtual keyboard, which are shown on a display and fed back to the user. A virtual keyboard input environment can be quickly built with the camera devices in smart glasses and a smart bracelet, or in a smart portable mobile device, so that no physical device is needed any more, and the user can conveniently perform human-computer interaction through the virtual keyboard anytime and anywhere.
Description of the drawings
Fig. 1 is a schematic diagram of the hardware architecture of the virtual keyboard operating system in an embodiment of the present invention;
Fig. 2 is a functional block diagram of the virtual keyboard operation device in an embodiment of the present invention;
Fig. 3 is a flow diagram of the virtual keyboard operating method in an embodiment of the present invention.
Label declaration:
System 100
Device 10
Ambient brightness sensing unit 101
View recognition unit 102
Longitudinal view recognition subunit 103
Transverse view recognition subunit 104
Horizontal-plane two-dimensional coordinate establishing unit 105
Vertical-plane two-dimensional coordinate establishing unit 106
Three-dimensional coordinate computing unit 107
Action judging unit 108
Chart drawing unit 109
Display control unit 110
Display unit 111
Camera device 20
First camera 201
Second camera 202
Display device 21
Specific embodiment
To describe the technical content, structural features, objects and effects of the present invention in detail, embodiments are explained below in conjunction with the accompanying drawings.
Referring to Fig. 1, which is a schematic diagram of the hardware architecture of the virtual keyboard operating system in an embodiment of the present invention, the system 100 includes a virtual keyboard operation device 10, two camera devices 20 and a display device 21, and realizes touch input by detecting the user's gestures.
Please also refer to Fig. 2, which is a functional block diagram of the virtual keyboard operation device in an embodiment of the present invention. The device 10 includes an ambient brightness sensing unit 101, a view recognition unit 102, a horizontal-plane two-dimensional coordinate establishing unit 105, a vertical-plane two-dimensional coordinate establishing unit 106, a three-dimensional coordinate computing unit 107, an action judging unit 108, a chart drawing unit 109, a display control unit 110 and a display unit 111. The device 10 can be applied in electronic equipment such as a camera, a mobile phone or a tablet computer. The camera devices 20 communicate with the device 10 through a network whose transmission medium may be a wireless medium such as Bluetooth, ZigBee or WIFI.
Each camera device 20 includes a first camera 201 and a second camera 202, and the two camera devices 20 serve respectively as the longitudinal camera device and the transverse camera device. The longitudinal camera device may be a mobile portable electronic device, such as smart glasses, positioned above the user's hands; the transverse camera device may be a mobile portable electronic device, such as a smart bracelet, that can be placed in front of the user. Further, the first camera 201 and the second camera 202 of each camera device 20 are respectively an ordinary camera and an infrared camera. The ordinary camera collects images of the user's operation actions under good lighting conditions and sends them to the device 10 for analysis; the infrared camera collects images of the user's operation actions under poor lighting conditions and sends them to the device 10 for analysis. The view recognition unit 102 includes a longitudinal view recognition subunit 103 and a transverse view recognition subunit 104, arranged to correspond to the cameras of the longitudinal and transverse camera devices respectively, for recognizing and processing the images they collect.
In the initial state, the two pairs of cameras (a pair of ordinary cameras and a pair of infrared cameras) are used together, with their shooting directions set perpendicular to each other, so that hand actions can be captured in the vertical and horizontal directions at the same time. Typically, the two cameras in the smart glasses (one ordinary camera and one infrared camera) face downward, while the two cameras in the smart bracelet or smartphone (one ordinary camera and one infrared camera) are placed horizontally. The rectangular region covered by both pairs of cameras together forms the image capture area.
The ambient brightness sensing unit 101 senses the brightness value of the environment and sends the ambient brightness value to the view recognition unit 102. The view recognition unit 102 judges whether to use the ordinary cameras or the infrared cameras according to a preset brightness threshold. For example, if the brightness sensing range is 1 to 100 and the threshold is 50, the ordinary cameras are used when the ambient brightness value is greater than 50, and the infrared cameras are used when the ambient brightness value is less than 50.
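The brightness-based selection rule above can be sketched as follows. This is an illustrative sketch, not code from the patent; the function name is hypothetical, while the 1-to-100 scale and the threshold of 50 follow the example given in the text.

```python
def select_camera(ambient_brightness: int, threshold: int = 50) -> str:
    """Pick the camera type from the sensed ambient brightness value.

    Follows the text's example: brightness is sensed on a 1-100 scale
    and compared against a preset threshold of 50.
    """
    if ambient_brightness > threshold:
        return "ordinary"   # good lighting: use the ordinary RGB camera
    return "infrared"       # poor lighting: use the infrared camera

# A bright scene (80) selects the ordinary camera; a dim one (30) the infrared.
assert select_camera(80) == "ordinary"
assert select_camera(30) == "infrared"
```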
After the camera type to be used has been determined according to the ambient brightness value, an initial positioning operation is started, as follows. When the device 10 performs the initial positioning operation, the user holds the fingers to be used suspended at a position that the two selected groups of cameras can capture, i.e. the image capture area, and keeps them still for a certain time, so as to complete the initialization of the hand position. This allows the device 10 to recognize and locate the initial positions of the finger end points for subsequent operation. The principle by which the device 10 recognizes and locates the finger end-point positions is described in detail below.
During interactive operation, the user holds the fingers suspended in the image capture area. The longitudinal view recognition subunit 103 judges whether to use the ordinary camera or the infrared camera according to the ambient brightness value detected by the ambient brightness sensing unit 101, and after the camera to be used is determined, performs shape recognition on the image data collected by the ordinary or infrared camera of the longitudinal camera device above the fingers, so as to determine the positions of the finger end points in the image. The transverse view recognition subunit 104 likewise judges whether to use the ordinary camera or the infrared camera according to the ambient brightness value detected by the ambient brightness sensing unit 101, and after the camera to be used is determined, performs shape recognition on the image data collected by the ordinary or infrared camera of the transverse camera device in front of the fingers, so as to determine the positions of the finger end points in the image.
The finger end-point positions determined by the longitudinal view recognition subunit 103 are the pixel positions of the finger end points in the XZ-plane image. For example, the longitudinal view recognition subunit 103 identifies that the pixel of finger end point 1 is located at row a, column b of the XZ-plane image, the pixel of end point 2 at row c, column d, ..., and the pixel of end point 10 at row e, column f. The finger end-point positions determined by the transverse view recognition subunit 104 are the pixel positions of the finger end points in the YZ-plane image. For example, the transverse view recognition subunit 104 identifies that the pixel of end point 1 is located at row g, column h of the YZ-plane image, the pixel of end point 2 at row i, column j, ..., and the pixel of end point 10 at row k, column l.
Further, the methods for determining the finger end points with the ordinary camera include the color background method and the color glove method. The color background method works as follows: the environmental background of the two-hand operation needs to be relatively simple and uniform in color, so the hand image can be extracted directly through the color interval range of human skin color, and the cut-off position of each strip extending from the hand is then calculated according to a figure endpoint algorithm; these cut-off positions are the finger end-point positions. The color glove method works as follows: the user wears special pure-red gloves; since ordinary cameras all sample in RGB (red-green-blue), the pure-red region can be extracted directly. Green or blue may also be used as the glove end-point color.
The methods for determining the finger end points with the infrared camera include the temperature filtering method and the color glove method. The temperature filtering method works as follows: since the human body surface temperature is higher than the ambient temperature, the higher-temperature hand image can be extracted directly, and the cut-off position of each strip extending from the hand is then calculated according to the figure endpoint algorithm; these cut-off positions are the finger end-point positions. The color glove method works as follows: the user wears special gloves whose surface generates heat, so the hot regions in the image can be extracted directly.
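Both the color background method and the temperature filtering method reduce to the same two steps: segment the hand from the background (by skin color or by temperature) to obtain a binary mask, then find the cut-off point of each strip. A minimal sketch of the second step, assuming segmentation has already produced the mask and that the fingers point upward in the image (so each strip's end point is its topmost pixel), might look like this; `finger_endpoints` is a hypothetical name, not the patent's algorithm verbatim.

```python
import numpy as np

def finger_endpoints(mask: np.ndarray) -> list:
    """Given a binary hand mask (rows x cols, True = hand), return one
    (row, col) end point per finger strip: the topmost hand pixel of
    each contiguous run of columns that contains hand pixels."""
    cols_with_hand = np.where(mask.any(axis=0))[0]
    if cols_with_hand.size == 0:
        return []
    # split the contiguous column runs into separate strips (fingers)
    splits = np.where(np.diff(cols_with_hand) > 1)[0] + 1
    endpoints = []
    for strip in np.split(cols_with_hand, splits):
        # first True per column = topmost hand pixel in that column
        rows = [int(np.argmax(mask[:, c])) for c in strip]
        best = int(np.argmin(rows))
        endpoints.append((rows[best], int(strip[best])))
    return endpoints
```

A real implementation would also filter out strips too narrow or too short to be fingers, and handle the hand's palm region.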
The horizontal-plane two-dimensional coordinate establishing unit 105 converts the pixel positions of the 10 finger end points into two-dimensional coordinate values on the XZ coordinate plane, according to the finger end-point positions in the image identified by the longitudinal view recognition subunit 103 and the pixel resolution of the camera. The vertical-plane two-dimensional coordinate establishing unit 106 converts the pixel positions of the 10 finger end points into two-dimensional coordinate values on the YZ coordinate plane, according to the finger end-point positions in the image identified by the transverse view recognition subunit 104 and the pixel resolution of the camera.
The principle for converting a finger end-point pixel position into a two-dimensional coordinate value on the XZ coordinate plane is as follows. The pixel at the lower-left corner of the image is taken as the origin 0 of the two-dimensional coordinate system, and the ratio of the coordinate value range to the number of rows and columns of the image is calculated from the image resolution and the coordinate value range after conversion. For example, if the resolution of the XZ-plane image is 2000 wide by 1000 high, and the coordinate value range of the two-dimensional XZ plane coordinate system is 1 to 150 on the X axis and 1 to 100 on the Z axis, then the ratio of the Z-axis coordinate range to the image row count is 100/1000 and the ratio of the X-axis coordinate range to the image column count is 150/2000. The pixel position of a finger end point is multiplied by these ratios to obtain the end point's two-dimensional coordinate value after conversion. For example, if the pixel position of a finger end point is row 300, column 200, then the Z-axis coordinate of that end point is 300*100/1000 = 30 and its X-axis coordinate is 200*150/2000 = 15. The principle for converting a finger end-point pixel position into a two-dimensional coordinate value on the YZ coordinate plane is the same and is not repeated here.
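The conversion can be expressed compactly. The sketch below follows the worked example in the text (a 2000x1000 XZ-plane image, X range 150, Z range 100); the function name is illustrative.

```python
def pixel_to_plane_coord(row: int, col: int, img_h: int, img_w: int,
                         z_range: int, x_range: int) -> tuple:
    """Convert a pixel (row, col) in the XZ-plane image into (x, z)
    coordinates by scaling with the ratio of coordinate range to the
    image's row and column counts."""
    z = row * z_range / img_h  # rows map onto the Z axis
    x = col * x_range / img_w  # columns map onto the X axis
    return x, z

# The text's example: pixel at row 300, column 200 of a 2000x1000 image
# gives Z = 300*100/1000 = 30 and X = 200*150/2000 = 15.
assert pixel_to_plane_coord(300, 200, 1000, 2000, 100, 150) == (15.0, 30.0)
```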
The three-dimensional coordinate computing unit 107 calculates the coordinate values of the finger end points in the XYZ three-dimensional coordinate system from the two-dimensional coordinate values on the XZ and YZ coordinate planes of the pixel positions of the 10 finger end points determined respectively by the horizontal-plane two-dimensional coordinate establishing unit 105 and the vertical-plane two-dimensional coordinate establishing unit 106.
The principle for calculating the coordinate values of the finger end points in the XYZ three-dimensional coordinate system is as follows. Since the XZ coordinate plane and the YZ coordinate plane share a common Z axis, the Z values of all coordinate end points on the XZ plane and all coordinate end points on the YZ plane are extracted and compared; coordinate end points whose Z-axis coordinate values are identical or closest are regarded as the same end point. The coordinate value on the XZ plane and the coordinate value on the YZ plane judged to belong to the same end point are then merged into one coordinate end point, whose coordinate value is taken as the value in the XYZ three-dimensional coordinate system. Because the two Z values may differ, the Z value of the newly produced three-dimensional coordinate is the Z value from the XZ plane plus the Z value from the YZ plane, divided by 2; the X and Y values of the three-dimensional coordinate are equal to the X coordinate value from the XZ plane and the Y coordinate value from the YZ plane respectively.
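The merging step can be sketched as follows: each XZ point is paired with the not-yet-used YZ point whose Z value is closest, and the two Z values are averaged. This is an illustrative reading of the principle above, with a hypothetical function name, not the patent's exact pairing procedure.

```python
def merge_coordinates(xz_points, yz_points):
    """Merge (x, z) points from the XZ plane with (y, z) points from the
    YZ plane into (x, y, z) triples. Points are paired by closest Z
    value; the merged Z is the average of the two Z values."""
    merged, used = [], set()
    for x, z_xz in xz_points:
        # the closest unused YZ point by Z value is treated as the same end point
        j = min((k for k in range(len(yz_points)) if k not in used),
                key=lambda k: abs(yz_points[k][1] - z_xz))
        used.add(j)
        y, z_yz = yz_points[j]
        merged.append((x, y, (z_xz + z_yz) / 2))
    return merged

# Two end points seen from both planes; Z values differ slightly per view.
pts = merge_coordinates([(15.0, 30.0), (40.0, 80.0)],
                        [(22.0, 31.0), (55.0, 79.0)])
assert pts == [(15.0, 22.0, 30.5), (40.0, 55.0, 79.5)]
```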
The action judging unit 108 judges whether the user has pressed a key, according to the three-dimensional coordinate values of the finger end points calculated by the three-dimensional coordinate computing unit 107 and the position area corresponding to each key position of the virtual keyboard.
In the present embodiment, in the hand-position initialization phase, the action judging unit 108 takes the lowest end point in the vertical direction among the three-dimensional coordinate values of all finger end points (that is, the minimum Y-axis coordinate value) as the Y-axis value of the click judgment plane, maps the three-dimensional coordinate values of the 10 finger end points onto the key areas of the virtual keyboard, and determines the selected keys according to the key area into which each finger end point falls, thereby determining the key position information.
Since this is the hand-position initialization phase, the action judging unit 108 sets the initial value of the click judgment plane with this Y value, so the Y values of the hand coordinates are all greater than or equal to the decision value of the click judgment plane. When the user moves the hands in the normal operating mode, the action judging unit 108 no longer resets the Y-axis value of the click judgment plane each time it receives the three-dimensional coordinates of the finger end points, but directly uses that Y-axis value to judge whether an effective click action has occurred on the virtual touch screen.
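The initialization of the click judgment plane reduces to taking the minimum Y value over all end points; a one-line sketch under that reading (the function name is hypothetical):

```python
def init_click_plane(endpoints_xyz) -> float:
    """During hand-position initialization, the Y-axis value of the click
    judgment plane is the lowest end point, i.e. the minimum Y value."""
    return min(y for _x, y, _z in endpoints_xyz)

# With fingertips resting at Y values 5, 2 and 7, the plane sits at Y = 2.
assert init_click_plane([(1.0, 5.0, 3.0), (2.0, 2.0, 4.0), (3.0, 7.0, 1.0)]) == 2.0
```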
Mapping the three-dimensional coordinate values of the finger end points onto the key areas of the keyboard works as follows. Each key of the keyboard has a two-dimensional coordinate region on the XZ plane corresponding to its key position. For example, the coordinate range of key A on the XZ plane may be the rectangular region formed by 3 to 5 on the X axis and 7 to 9 on the Z axis; whenever the XZ coordinate value of any end point falls into that rectangular region, it is judged that the user's finger end point is above key position A.
Judging a click action from the three-dimensional coordinate values of a finger end point works as follows. Once the Y-axis value of the click judgment plane has been selected, whenever the Y value in a finger end point's three-dimensional coordinate is less than the Y-axis value of the click judgment plane, it is judged that the end point has passed through the click judgment plane, i.e. a click behavior has occurred for that finger. Combined with the key region in which the finger end point lies, it is finally judged which key the user has clicked.
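Combining the two judgments above — the XZ rectangle test for the key position and the Y test against the click judgment plane — might look like the following sketch. The rectangle for key A follows the example in the text; the second key and all names are hypothetical.

```python
# Key regions as XZ rectangles: ((x_min, x_max), (z_min, z_max)).
KEY_REGIONS = {
    "A": ((3, 5), (7, 9)),    # the example region given in the text
    "B": ((5.5, 8), (7, 9)),  # an additional illustrative key
}

def key_at(x: float, z: float):
    """Return the key whose XZ rectangle contains (x, z), if any."""
    for key, ((x0, x1), (z0, z1)) in KEY_REGIONS.items():
        if x0 <= x <= x1 and z0 <= z <= z1:
            return key
    return None

def detect_press(endpoint_xyz, click_plane_y: float):
    """Report a key press when an end point dips below the click plane."""
    x, y, z = endpoint_xyz
    return key_at(x, z) if y < click_plane_y else None

# A fingertip over key A dipping below the click plane registers "A";
# the same fingertip held above the plane registers nothing.
assert detect_press((4.0, 1.5, 8.0), click_plane_y=2.0) == "A"
assert detect_press((4.0, 3.0, 8.0), click_plane_y=2.0) is None
```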
According to the judgment result of the action judging unit 108 — that is, which key areas the 10 finger end points fall into and which key positions are selected — the chart drawing unit 109 draws each finger of the simulated user hand on the corresponding key positions of the virtual keyboard; keys that are selected and on which a click behavior occurs are drawn highlighted to indicate that they have been clicked.
The display control unit 110 converts the image drawn by the chart drawing unit 109 into a timing sequence that the display device 21 can display, and calls the display unit 111 to show the image of the operation on the virtual keyboard on the display device 21 for the user to watch. From this feedback the user can learn which key position each finger currently corresponds to and continue operating the virtual keys.
Referring to Fig. 3, which is a flow diagram of the virtual keyboard operating method in an embodiment of the present invention, the method includes:
Step S30: the ambient brightness sensing unit 101 senses the brightness value of the environment, and the view recognition unit 102 judges whether to use the ordinary cameras or the infrared cameras according to the preset brightness threshold and the ambient brightness value sensed by the ambient brightness sensing unit 101.
In the initial state, the two pairs of cameras (a pair of ordinary cameras and a pair of infrared cameras) are used together, with their shooting directions set perpendicular to each other, so that hand actions can be captured in the vertical and horizontal directions at the same time. Typically, the two cameras in the smart glasses (one ordinary camera and one infrared camera) face downward, while the two cameras in the smart bracelet or smartphone (one ordinary camera and one infrared camera) are placed horizontally. The rectangular region covered by both pairs of cameras together forms the image capture area.
Step S31: the user holds the fingers to be used suspended in the image capture area and keeps them still for a certain time; the device 10 recognizes and locates the initial positions of the fingers, completing the initialization of the user's finger positions.
The principle by which the device 10 recognizes and locates the finger positions is described in detail below.
Step S32: the user holds the fingers to be used suspended in the image capture area. The longitudinal view recognition subunit 103 performs shape recognition on the image data collected by the ordinary or infrared camera of the longitudinal camera device above the fingers, so as to determine the positions of the finger end points in the image. The transverse view recognition subunit 104 performs shape recognition on the image data collected by the ordinary or infrared camera of the transverse camera device in front of the fingers, so as to determine the positions of the finger end points in the image.
Specifically, the finger end-point positions determined by the longitudinal view recognition subunit 103 are the pixel positions of the finger end points in the XZ-plane image. For example, the longitudinal view recognition subunit 103 identifies that the pixel of finger end point 1 is located at row a, column b of the XZ-plane image, the pixel of end point 2 at row c, column d, ..., and the pixel of end point 10 at row e, column f. The finger end-point positions determined by the transverse view recognition subunit 104 are the pixel positions of the finger end points in the YZ-plane image. For example, the transverse view recognition subunit 104 identifies that the pixel of end point 1 is located at row g, column h of the YZ-plane image, the pixel of end point 2 at row i, column j, ..., and the pixel of end point 10 at row k, column l.
Further, the methods for determining the finger end points with the ordinary camera include the color background method and the color glove method. The color background method works as follows: the environmental background of the two-hand operation needs to be relatively simple and uniform in color, so the hand image can be extracted directly through the color interval range of human skin color, and the cut-off position of each strip extending from the hand is then calculated according to a figure endpoint algorithm; these cut-off positions are the finger end-point positions. The color glove method works as follows: the user wears special pure-red gloves; since ordinary cameras all sample in RGB (red-green-blue), the pure-red region can be extracted directly. Green or blue may also be used as the glove end-point color.
The methods for determining the finger end points with the infrared camera include the temperature filtering method and the color glove method. The temperature filtering method works as follows: since the human body surface temperature is higher than the ambient temperature, the higher-temperature hand image can be extracted directly, and the cut-off position of each strip extending from the hand is then calculated according to the figure endpoint algorithm; these cut-off positions are the finger end-point positions. The color glove method works as follows: the user wears special gloves whose surface generates heat, so the hot regions in the image can be extracted directly.
Step S33: the horizontal-plane two-dimensional coordinate establishing unit 105 converts the pixel positions of the 10 finger end points into two-dimensional coordinate values on the XZ coordinate plane, according to the finger end-point positions in the image identified by the longitudinal view recognition subunit 103 and the pixel resolution of the camera. The vertical-plane two-dimensional coordinate establishing unit 106 converts the pixel positions of the 10 finger end points into two-dimensional coordinate values on the YZ coordinate plane, according to the finger end-point positions in the image identified by the transverse view recognition subunit 104 and the pixel resolution of the camera.
The principle for converting a finger end-point pixel position into a two-dimensional coordinate value on the XZ coordinate plane is as follows. The pixel at the lower-left corner of the image is taken as the origin 0 of the two-dimensional coordinate system, and the ratio of the coordinate value range to the number of rows and columns of the image is calculated from the image resolution and the coordinate value range after conversion. For example, if the resolution of the XZ-plane image is 2000 wide by 1000 high, and the coordinate value range of the two-dimensional XZ plane coordinate system is 1 to 150 on the X axis and 1 to 100 on the Z axis, then the ratio of the Z-axis coordinate range to the image row count is 100/1000 and the ratio of the X-axis coordinate range to the image column count is 150/2000. The pixel position of a finger end point is multiplied by these ratios to obtain the end point's two-dimensional coordinate value after conversion. For example, if the pixel position of a finger end point is row 300, column 200, then the Z-axis coordinate of that end point is 300*100/1000 = 30 and its X-axis coordinate is 200*150/2000 = 15. The principle for converting a finger end-point pixel position into a two-dimensional coordinate value on the YZ coordinate plane is the same and is not repeated here.
Step S34: the three-dimensional coordinate computing unit 107 calculates the coordinate values of the finger end points in the XYZ three-dimensional coordinate system from the two-dimensional coordinate values, on the XZ and YZ coordinate planes, of the ten finger end-point pixel positions determined respectively by the horizontal-plane two-dimensional coordinate establishing unit 105 and the vertical-plane two-dimensional coordinate establishing unit 106.
The method for calculating the coordinate values of the finger end points in the XYZ three-dimensional coordinate system is as follows: since the XZ and YZ coordinate planes share a common Z axis, the Z values of all coordinate end points on the XZ plane and on the YZ plane are extracted and compared, and end points whose Z-axis coordinate values are identical or closest are regarded as the same end point. The coordinate value on the XZ plane and the coordinate value on the YZ plane judged to belong to the same end point are then merged into one coordinate end point, whose value serves as the coordinate in the XYZ three-dimensional coordinate system. Because the two Z values may differ, the Z value of the resulting three-dimensional coordinate is the sum of the Z value from the XZ plane and the Z value from the YZ plane divided by 2, while the X and Y coordinate values in the three-dimensional coordinate system equal the X coordinate value from the XZ plane and the Y coordinate value from the YZ plane respectively.
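The merging rule above can be sketched as follows; the function name and point representation are illustrative assumptions, not the patent's own notation:

```python
def merge_endpoints(xz_points, yz_points):
    """Merge XZ-plane and YZ-plane end points into XYZ coordinates.

    Sketch of the matching rule described above: points whose Z values
    are identical or closest are treated as the same end point; the
    merged Z is the average of the two Z readings, X comes from the
    XZ plane and Y from the YZ plane.
    """
    merged = []
    remaining = list(yz_points)
    for x, z1 in xz_points:
        # find the YZ point whose Z value is closest to this XZ point's Z
        y, z2 = min(remaining, key=lambda p: abs(p[1] - z1))
        remaining.remove((y, z2))          # each YZ point matches once
        merged.append((x, y, (z1 + z2) / 2))  # average the two Z readings
    return merged

# One fingertip seen at (X=15, Z=30) in the XZ view and (Y=42, Z=32)
# in the YZ view merges into (15, 42, 31.0).
points = merge_endpoints([(15, 30)], [(42, 32)])
print(points)  # [(15, 42, 31.0)]
```

A greedy nearest-Z match like this is one simple way to pair the two views; the patent only requires that identical or closest Z values be treated as the same end point.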
Step S35: the action judging unit 108 judges whether the user has pressed a key on the keyboard according to the three-dimensional coordinate values of the finger end points calculated by the three-dimensional coordinate computing unit 107 and the position area corresponding to each key position in the virtual keyboard.
In this embodiment, during the hand-position initialization phase, the action judging unit 108 takes the lowest end point in the vertical direction among the three-dimensional coordinate values of all finger end points (that is, the minimum Y-axis coordinate value) as the Y-axis value of the click-judgment plane, maps the three-dimensional coordinate values of the ten finger end points onto the key areas of the virtual keyboard, and then determines the selected keys from the key area into which each finger end point falls, thereby determining the key-position information.
Because this is the hand-position initialization phase, the action judging unit 108 uses this Y value as the initial value of the click-judgment plane, so the Y values of the hand coordinates are all greater than or equal to the decision value of the click-judgment plane. Furthermore, when the user moves a hand during normal operation, each time the action judging unit 108 receives the three-dimensional coordinates of the finger end points it no longer resets the Y-axis value of the click-judgment plane, but directly uses that value to judge whether an effective click action has occurred on the virtual touch screen.
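A minimal sketch of the click-plane initialization and the click test, assuming fingertips are (x, y, z) tuples (the helper names are not from the patent):

```python
def init_click_plane(endpoints_3d):
    """Return the Y value of the click-judgment plane.

    During initialization the plane is set at the lowest fingertip,
    i.e. the minimum Y among all finger end points, so every fingertip
    starts at or above the plane.
    """
    return min(y for _, y, _ in endpoints_3d)

def is_click(endpoint_3d, click_plane_y):
    """A fingertip whose Y drops below the plane has pressed through it."""
    return endpoint_3d[1] < click_plane_y

# Three fingertips at rest; the plane sits at the lowest one (Y = 10.5).
tips = [(3, 12.0, 8), (5, 10.5, 7), (7, 11.2, 9)]
plane_y = init_click_plane(tips)
print(is_click((5, 9.8, 7), plane_y))   # True: finger moved below the plane
print(is_click((5, 10.5, 7), plane_y))  # False: finger resting on the plane
```

Note that the plane is fixed once at initialization and reused for every subsequent frame, matching the "no reset during normal operation" rule above.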
The mapping of the three-dimensional coordinate values of the finger end points onto the key areas of the keyboard is as follows: each key of the keyboard has a corresponding two-dimensional coordinate region on the XZ plane. For example, the coordinate range of key A on the XZ plane may be the rectangular region formed by 7 to 9 on the X axis and 3 to 5 on the Z axis; as long as the XZ coordinate value of any end point falls into this rectangular region, that finger end point of the user is judged to be above key position A.
A click action is judged from the three-dimensional coordinate values of the finger end points as follows: once the Y-axis value of the click-judgment plane has been selected, whenever the Y value in a finger end point's three-dimensional coordinate is smaller than that Y-axis value, the end point is judged to have passed through the click-judgment plane, i.e. the finger has performed a click; combined with the key-position region in which the finger end point lies, it is then determined which key the user has finally clicked.
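Combining the key-region lookup with the click test might look like the following sketch; the key map uses the example ranges for key "A" from the text, while the other entry and all names are made up for illustration:

```python
# Illustrative key map: each key owns a rectangle on the XZ plane,
# given as (x_min, x_max, z_min, z_max). The "A" ranges follow the
# example in the text; "S" is invented for the sketch.
KEY_REGIONS = {
    "A": (7, 9, 3, 5),
    "S": (10, 12, 3, 5),
}

def key_under_fingertip(x, z):
    """Return the key whose XZ rectangle contains the fingertip, if any."""
    for key, (x0, x1, z0, z1) in KEY_REGIONS.items():
        if x0 <= x <= x1 and z0 <= z <= z1:
            return key
    return None

def clicked_key(endpoint_3d, click_plane_y):
    """A key is clicked only when the fingertip passes below the
    click-judgment plane while positioned above that key's region."""
    x, y, z = endpoint_3d
    if y < click_plane_y:
        return key_under_fingertip(x, z)
    return None

print(clicked_key((8, 9.0, 4), 10.5))   # A
print(clicked_key((8, 11.0, 4), 10.5))  # None: finger still above the plane
```

Hovering (selection) and clicking thus use the same XZ lookup; only the Y test distinguishes them.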
Step S36: according to the judgment result of the action judging unit 108, i.e. the key areas into which the ten finger end points respectively fall and the selected key positions, the chart drawing unit 109 draws each finger of the simulated user hand on the corresponding key position of the virtual keyboard; keys that are selected and on which a click has occurred are drawn highlighted to indicate that they have been clicked.
Step S37: the display control unit 110 converts the image drawn by the chart drawing unit 109 into a timing sequence that the display device 21 can display, and calls the display unit 111 to show the image of the operation on the virtual keyboard on the display device 21 for the user to view; from this feedback the user can learn which key position each finger currently corresponds to and continue the virtual key operation.
The virtual keyboard operation device, system and method provided by the present invention capture images with cameras to recognize the positions and postures of the finger end points, map the obtained finger end-point coordinates directly into operation actions on the virtual keyboard, and display them on a screen as feedback to the user. With the camera equipment in smart glasses and smart bracelets, or in intelligent portable mobile devices, a virtual keyboard input environment can be built quickly; no physical device is needed any longer, allowing the user to perform human-computer interaction through the virtual keyboard anytime and anywhere.
The foregoing are merely embodiments of the present invention and do not thereby limit the scope of the claims of the present invention; any equivalent structure or equivalent process transformation made using the contents of the description and drawings of the present invention, or any direct or indirect application in other related technical fields, is likewise included within the scope of patent protection of the present invention.

Claims (13)

1. A human-computer interactive virtual touch control device, comprising a display control unit and a display unit, characterized in that the device comprises:
a view recognition unit, for performing hand recognition on the image data of the user's fingers collected by the cameras, to determine the positions of the finger end points in the images;
a horizontal-plane two-dimensional coordinate establishing unit, for converting the finger end-point pixel positions into two-dimensional coordinate values on the XZ coordinate plane according to the positions in the image of the finger end points identified by the view recognition unit and the pixel resolution of the camera;
a vertical-plane two-dimensional coordinate establishing unit, for converting the finger end-point pixel positions into two-dimensional coordinate values on the YZ coordinate plane according to the positions in the image of the finger end points identified by the view recognition unit and the pixel resolution of the camera;
a three-dimensional coordinate computing unit, for calculating the coordinate values of the finger end points in the XYZ three-dimensional coordinate system from the two-dimensional coordinate values, on the XZ and YZ coordinate planes, of the finger end-point pixel positions determined respectively by the horizontal-plane two-dimensional coordinate establishing unit and the vertical-plane two-dimensional coordinate establishing unit; the Z coordinate value of the three-dimensional coordinate system being the mean of the Z coordinate value of the XZ coordinate plane and the Z coordinate value of the YZ coordinate plane, the X coordinate value of the three-dimensional coordinate system being the X coordinate value of the XZ coordinate plane, and the Y coordinate value of the three-dimensional coordinate system being the Y coordinate value of the YZ coordinate plane;
an action judging unit, for judging the user's operation on the key positions of the virtual keyboard according to the three-dimensional coordinate values of the finger end points calculated by the three-dimensional coordinate computing unit and the position area corresponding to each key position of the virtual keyboard; and further for, during the hand-position initialization phase, taking the lowest end point in the vertical direction among the three-dimensional coordinate values of the finger end points as the Y-axis value of the click-judgment plane, mapping the three-dimensional coordinate values of the finger end points onto the key areas of the virtual keyboard, and determining the selected keys from the key areas into which the finger end points fall, thereby determining the key-position information; and
a chart drawing unit, for drawing an operation image of the simulated user fingers on the corresponding key positions of the virtual keyboard according to the judgment result of the action judging unit, and calling the display control unit to control the display unit to display the operation image.
2. The human-computer interactive virtual touch control device according to claim 1, characterized by further comprising an ambient brightness sensing unit for sensing the brightness value of the environment;
the view recognition unit comprising:
a longitudinal view recognition subunit, for deciding, according to the ambient brightness value detected by the ambient brightness sensing unit, whether to use an ordinary camera or an infrared camera, and, after the camera to be used is determined, performing hand recognition on the image data collected by the longitudinal view recognition subunit, to determine the pixel positions in the image of the finger end points on the XZ coordinate plane; and
a transverse view recognition subunit, for deciding, according to the ambient brightness value detected by the ambient brightness sensing unit, whether to use an ordinary camera or an infrared camera, and, after the camera to be used is determined, performing hand recognition on the image data collected by the transverse view recognition subunit, to determine the pixel positions in the image of the finger end points on the YZ coordinate plane.
3. The human-computer interactive virtual touch control device according to claim 1, characterized in that the horizontal-plane two-dimensional coordinate establishing unit converts the finger end-point pixel positions into two-dimensional coordinate values on the XZ coordinate plane, and the vertical-plane two-dimensional coordinate establishing unit converts the finger end-point pixel positions into two-dimensional coordinate values on the YZ coordinate plane, specifically by: setting the pixel at the lower-left corner of the image as the origin 0 of the two-dimensional coordinate system, and computing, from the image resolution and the coordinate-value ranges after conversion to two-dimensional coordinates, the ratios of the coordinate-value ranges to the row and column counts of each image.
4. The human-computer interactive virtual touch control device according to claim 1, characterized in that the action judging unit maps the three-dimensional coordinate values of the finger end points onto the key areas of the virtual keyboard, specifically: each key of the virtual keyboard has a corresponding two-dimensional coordinate region on the XZ plane, and when it is determined that the XZ coordinate value of a finger end point falls into that coordinate region, the user's finger end point is judged to be above the corresponding key position.
5. The human-computer interactive virtual touch control device according to claim 4, characterized in that the action judging unit judges a click action according to the three-dimensional coordinate values of the finger end points, specifically: after the Y-axis value of the click-judgment plane has been selected, when the Y value in a finger end point's three-dimensional coordinate is smaller than that Y-axis value, the finger end point is judged to have passed through the click-judgment plane, and the key on which the user finally performs the click operation is determined from the key-position coordinate region into which the finger end point falls.
6. A human-computer interactive virtual touch control system, characterized by comprising the human-computer interactive virtual touch control device according to any one of claims 1-5 and two camera devices in communication connection with the device.
7. The human-computer interactive virtual touch control system according to claim 6, characterized in that the camera devices comprise a first camera and a second camera, the first camera serving as the longitudinal camera device and the second camera serving as the transverse camera device, the shooting directions of the first camera and the second camera being set orthogonal to each other.
8. The human-computer interactive virtual touch control system according to claim 7, characterized in that the first camera is an ordinary camera and the second camera is an infrared camera.
9. A human-computer interactive virtual touch control method, characterized in that the method comprises:
the user holding the fingers suspended flat in the image capture area, and performing hand recognition on the image data of the user's fingers collected by the cameras, to determine the positions of the finger end points in the images;
converting the finger end-point pixel positions into two-dimensional coordinate values on the XZ and YZ coordinate planes respectively, according to the identified positions of the finger end points in the image and the pixel resolution of the camera;
calculating the coordinate values of the finger end points in the XYZ three-dimensional coordinate system from the two-dimensional coordinate values of the finger end-point pixel positions on the XZ and YZ coordinate planes; the Z coordinate value of the three-dimensional coordinate system being the mean of the Z coordinate value of the XZ coordinate plane and the Z coordinate value of the YZ coordinate plane, the X coordinate value of the three-dimensional coordinate system being the X coordinate value of the XZ coordinate plane, and the Y coordinate value of the three-dimensional coordinate system being the Y coordinate value of the YZ coordinate plane;
judging the user's operation on the key positions of the virtual keyboard according to the calculated three-dimensional coordinate values of the finger end points and the position area corresponding to each key position of the virtual keyboard; taking the lowest end point in the vertical direction among the three-dimensional coordinate values of the finger end points as the Y-axis value of the click-judgment plane, mapping the three-dimensional coordinate values of the finger end points onto the key areas of the virtual keyboard, and determining the selected keys from the key areas into which the finger end points fall, thereby determining the key-position information;
drawing an operation image of the simulated user fingers on the corresponding key positions of the virtual keyboard according to the judgment result; and
displaying the drawn image for the user to view.
10. The human-computer interactive virtual touch control method according to claim 9, characterized in that before the step of the user holding the fingers suspended flat in the image capture area and performing hand recognition on the image data of the user's fingers collected by the cameras to determine the positions of the finger end points in the images, the method comprises:
deciding whether to use an ordinary camera or an infrared camera according to the sensed ambient brightness value and a preset brightness threshold value;
the user holding the fingers to be operated of both hands suspended flat in the image capture area and keeping them still for a certain period of time, and the selected camera recognizing and locating the initial position of the hand.
11. The human-computer interactive virtual touch control method according to claim 9, characterized in that the step of mapping the three-dimensional coordinate values of the finger end points onto the key areas of the virtual keyboard is specifically: each key of the virtual keyboard has a corresponding two-dimensional coordinate region on the XZ plane, and when it is determined that the XZ coordinate value of a finger end point falls into that coordinate region, the user's finger end point is judged to be above the corresponding key position.
12. The human-computer interactive virtual touch control method according to claim 9, characterized in that the step of the user holding the fingers suspended flat in the image capture area and performing hand recognition on the image data of the user's fingers collected by the cameras to determine the positions of the finger end points in the images is specifically:
determining respectively the pixel positions in the image of the finger end points on the XZ coordinate plane and on the YZ coordinate plane.
13. The human-computer interactive virtual touch control method according to claim 12, characterized in that the step of converting the finger end-point pixel positions into two-dimensional coordinate values on the XZ and YZ coordinate planes respectively, according to the identified positions of the finger end points in the image and the pixel resolution of the camera, is specifically:
setting the pixel at the lower-left corner of the image as the origin 0 of the two-dimensional coordinate system, and computing, from the image resolution and the coordinate-value ranges after conversion to two-dimensional coordinates, the ratios of the coordinate-value ranges to the row and column counts of each image.
CN201410436989.0A 2014-08-29 2014-08-29 Virtual keyboard operation device, system and method Active CN104199550B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410436989.0A CN104199550B (en) 2014-08-29 2014-08-29 Virtual keyboard operation device, system and method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201410436989.0A CN104199550B (en) 2014-08-29 2014-08-29 Virtual keyboard operation device, system and method

Publications (2)

Publication Number Publication Date
CN104199550A CN104199550A (en) 2014-12-10
CN104199550B true CN104199550B (en) 2017-05-17

Family

ID=52084851

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410436989.0A Active CN104199550B (en) 2014-08-29 2014-08-29 Virtual keyboard operation device, system and method

Country Status (1)

Country Link
CN (1) CN104199550B (en)

Families Citing this family (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2016115976A1 (en) * 2015-01-21 2016-07-28 Kong Liang Smart wearable input apparatus
CN106155535A (en) * 2015-03-23 2016-11-23 联想(北京)有限公司 Touch screen soft-keyboard input method and device
CN105425964B (en) * 2015-11-30 2018-07-13 青岛海信电器股份有限公司 A kind of gesture identification method and system
CN105867806B (en) * 2016-03-25 2020-05-26 联想(北京)有限公司 Input method and electronic equipment
TWI695307B (en) * 2016-04-29 2020-06-01 姚秉洋 Method for displaying an on-screen keyboard, computer program product thereof and non-transitory computer-readable medium thereof
CN106155533B (en) * 2016-06-30 2019-05-31 联想(北京)有限公司 A kind of information processing method and projection device
CN107168541A (en) * 2017-04-07 2017-09-15 北京小鸟看看科技有限公司 The implementation method and device of a kind of input
CN111007977A (en) * 2018-10-04 2020-04-14 邱波 Intelligent virtual interaction method and device
CN109828672B (en) * 2019-02-14 2022-05-27 亮风台(上海)信息科技有限公司 Method and equipment for determining man-machine interaction information of intelligent equipment
CN110780743A (en) * 2019-11-05 2020-02-11 聚好看科技股份有限公司 VR (virtual reality) interaction method and VR equipment
CN111796672B (en) * 2020-05-22 2023-04-28 福建天晴数码有限公司 Gesture recognition method based on head-mounted equipment and storage medium
CN111796671B (en) * 2020-05-22 2023-04-28 福建天晴数码有限公司 Gesture recognition and control method of head-mounted equipment and storage medium
CN111796673B (en) * 2020-05-22 2023-04-28 福建天晴数码有限公司 Multi-finger gesture recognition method of head-mounted equipment and storage medium
CN111796674B (en) * 2020-05-22 2023-04-28 福建天晴数码有限公司 Gesture touch sensitivity adjusting method based on head-mounted equipment and storage medium
CN111796675B (en) * 2020-05-22 2023-04-28 福建天晴数码有限公司 Gesture recognition control method of head-mounted equipment and storage medium
CN111860239A (en) * 2020-07-07 2020-10-30 佛山长光智能制造研究院有限公司 Key identification method and device, terminal equipment and computer readable storage medium
CN114949663B (en) * 2022-05-13 2023-03-21 成都软智科技有限公司 Many unmanned aerial vehicle fire extinguishing system

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH04271423A (en) * 1991-02-27 1992-09-28 Nippon Telegr & Teleph Corp <Ntt> Information input method
CN101589425A (en) * 2006-02-16 2009-11-25 Ftk技术有限公司 A system and method of inputting data into a computing system
CN102880304A (en) * 2012-09-06 2013-01-16 天津大学 Character inputting method and device for portable device
CN103105930A (en) * 2013-01-16 2013-05-15 中国科学院自动化研究所 Non-contact type intelligent inputting method based on video images and device using the same

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4286556B2 (en) * 2003-02-24 2009-07-01 株式会社東芝 Image display device

Also Published As

Publication number Publication date
CN104199550A (en) 2014-12-10

Similar Documents

Publication Publication Date Title
CN104199550B (en) Virtual keyboard operation device, system and method
CN104199547B (en) Virtual touch screen operation device, system and method
CN106598227B (en) Gesture identification method based on Leap Motion and Kinect
Murugappan et al. Extended multitouch: recovering touch posture and differentiating users using a depth camera
CN104199548B (en) A kind of three-dimensional man-machine interactive operation device, system and method
US8749502B2 (en) System and method for virtual touch sensing
CN102508574B (en) Projection-screen-based multi-touch detection method and multi-touch system
TWI471815B (en) Gesture recognition device and method
CN103472916A (en) Man-machine interaction method based on human body gesture recognition
CN103207709A (en) Multi-touch system and method
CN104081307A (en) Image processing apparatus, image processing method, and program
CN105242776A (en) Control method for intelligent glasses and intelligent glasses
CN109839827B (en) Gesture recognition intelligent household control system based on full-space position information
CN104267802A (en) Human-computer interactive virtual touch device, system and method
CN103761011B (en) A kind of method of virtual touch screen, system and the equipment of calculating
TWI486815B (en) Display device, system and method for controlling the display device
CN104199549B (en) A kind of virtual mouse action device, system and method
CN106325726A (en) A touch control interaction method
CN107239222A (en) The control method and terminal device of a kind of touch-screen
Hartanto et al. Real time hand gesture movements tracking and recognizing system
CN103543825A (en) Camera cursor system
CN108227923A (en) A kind of virtual touch-control system and method based on body-sensing technology
CN103869941B (en) Have electronic installation and the instant bearing calibration of virtual touch-control of virtual touch-control service
CN103389793B (en) Man-machine interaction method and system
CN207557895U (en) A kind of equipment positioning device applied to large display screen curtain or projection screen

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: 350003 Fuzhou Gulou District, Fujian, software Avenue, building 89, No. 18

Applicant after: FUZHOU ROCKCHIP ELECTRONICS CO., LTD.

Address before: 350003 Fuzhou Gulou District, Fujian, software Avenue, building 89, No. 18

Applicant before: Fuzhou Rockchip Semiconductor Co., Ltd.

COR Change of bibliographic data
GR01 Patent grant
GR01 Patent grant
CP01 Change in the name or title of a patent holder
CP01 Change in the name or title of a patent holder

Address after: 350003 building, No. 89, software Avenue, Gulou District, Fujian, Fuzhou 18, China

Patentee after: Ruixin Microelectronics Co., Ltd

Address before: 350003 building, No. 89, software Avenue, Gulou District, Fujian, Fuzhou 18, China

Patentee before: Fuzhou Rockchips Electronics Co.,Ltd.