CN104199550A - Man-machine interactive type virtual touch device, system and method - Google Patents

Man-machine interactive type virtual touch device, system and method

Info

Publication number
CN104199550A
Authority
CN
China
Prior art keywords
end points
coordinate
finger
finger end
camera
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201410436989.0A
Other languages
Chinese (zh)
Other versions
CN104199550B (en)
Inventor
廖裕民
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Rockchip Electronics Co Ltd
Original Assignee
Fuzhou Rockchip Electronics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Fuzhou Rockchip Electronics Co Ltd filed Critical Fuzhou Rockchip Electronics Co Ltd
Priority to CN201410436989.0A priority Critical patent/CN104199550B/en
Publication of CN104199550A publication Critical patent/CN104199550A/en
Application granted granted Critical
Publication of CN104199550B publication Critical patent/CN104199550B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Abstract

The invention provides a man-machine interactive virtual touch device, system and method. In the method, cameras collect image data of the user's fingers to determine the positions of the finger end points in the images, and, using the pixel resolution of the cameras, the pixel positions of the finger end points are converted into two-dimensional coordinate values on an XZ coordinate plane and on a YZ coordinate plane. From the two-dimensional coordinate values of the finger end point pixel positions on the XZ and YZ coordinate planes, the coordinates of the finger end points in an XYZ three-dimensional coordinate system are calculated; the user's operation on the keys of a virtual keyboard is judged by combining these coordinates with the position region corresponding to each key of the virtual keyboard; an operation image simulating the user's fingers on the corresponding keys of the virtual keyboard is drawn according to the judgment result; and the drawn image is displayed for the user to view. With the device, system and method, the user can conveniently perform free man-machine interaction through the virtual keyboard anytime and anywhere.

Description

Man-machine interactive virtual touch device, system and method
Technical field
The present invention relates to the field of virtual touch technology, and in particular to a man-machine interactive virtual touch device, system and method.
Background art
The keyboard is an important component of man-machine interaction in the prior art. Existing keyboard input relies entirely on physical keyboard devices; that is, man-machine interaction can be completed only when a physical keyboard actually exists, which greatly limits the places and conditions under which man-machine interaction can be carried out.
Summary of the invention
In view of the above problems, the present invention provides a man-machine interactive virtual touch device, system and method that overcome the above problems or at least partly solve them.
The present invention provides a man-machine interactive virtual touch device comprising a display control unit and a display unit, the device further comprising:
a view recognition unit, configured to perform hand recognition on the image data of the user's fingers collected by the cameras, so as to determine the positions of the finger end points in the images;
a horizontal-plane two-dimensional coordinate establishing unit, configured to convert the finger end point pixel positions into two-dimensional coordinate values on the XZ coordinate plane, according to the positions in the image of the finger end points identified by the view recognition unit and the pixel resolution of the camera;
a vertical-plane two-dimensional coordinate establishing unit, configured to convert the finger end point pixel positions into two-dimensional coordinate values on the YZ coordinate plane, according to the positions in the image of the finger end points identified by the view recognition unit and the pixel resolution of the camera;
a three-dimensional coordinate calculating unit, configured to calculate the coordinates of the finger end points in the XYZ three-dimensional coordinate system from the two-dimensional coordinate values on the XZ and YZ coordinate planes of the finger end point pixel positions determined by the horizontal-plane and vertical-plane two-dimensional coordinate establishing units respectively;
an action judging unit, configured to judge the user's operation on the key positions of the virtual keyboard, according to the three-dimensional coordinates of the finger end points calculated by the three-dimensional coordinate calculating unit and the position region corresponding to each key position of the virtual keyboard; and
a graphic drawing unit, configured to draw, according to the judgment result of the action judging unit, an operation image simulating the user's fingers on the corresponding key positions of the virtual keyboard, and to call the display control unit to control the display unit to display the operation image.
The present invention also provides a man-machine interactive virtual touch system comprising the man-machine interactive virtual touch device described in any of the above and two camera devices communicatively connected to the device.
The present invention also provides a man-machine interactive virtual touch method, the method comprising:
the user suspends the fingers flat within an image capture region, and hand recognition is performed on the image data of the user's fingers collected by the cameras, so as to determine the positions of the finger end points in the images;
according to the identified positions of the finger end points in the images and the pixel resolution of the cameras, the finger end point pixel positions are converted into two-dimensional coordinate values on the XZ coordinate plane and on the YZ coordinate plane respectively;
from the two-dimensional coordinate values of the finger end point pixel positions on the XZ and YZ coordinate planes, the coordinates of the finger end points in the XYZ three-dimensional coordinate system are calculated;
according to the calculated three-dimensional coordinates of the finger end points and the position region corresponding to each key position of the virtual keyboard, the user's operation on the key positions of the virtual keyboard is judged;
according to the judgment result, an operation image simulating the user's fingers on the corresponding key positions of the virtual keyboard is drawn; and
the drawn image is displayed for the user to view.
With the man-machine interactive virtual touch device, system and method provided by the present invention, the positions and postures of the finger end points are identified from images captured by the cameras, the obtained finger end point coordinates are mapped directly to operation actions on the virtual keyboard, and the result is displayed on a display and fed back to the user. A virtual keyboard input environment can be built quickly with smart glasses and a smart wristband, or with the cameras on a smart portable mobile device, so that no physical device is needed and the user can conveniently perform man-machine interaction through the virtual keyboard anytime and anywhere.
Brief description of the drawings
Fig. 1 is a schematic diagram of the hardware architecture of the man-machine interactive virtual touch system in an embodiment of the present invention;
Fig. 2 is a functional block diagram of the man-machine interactive virtual touch device in an embodiment of the present invention;
Fig. 3 is a schematic flowchart of the man-machine interactive virtual touch method in an embodiment of the present invention.
Description of reference numerals:
System 100
Device 10
Ambient brightness sensing unit 101
View recognition unit 102
Longitudinal view recognition subunit 103
Transverse view recognition subunit 104
Horizontal-plane two-dimensional coordinate establishing unit 105
Vertical-plane two-dimensional coordinate establishing unit 106
Three-dimensional coordinate calculating unit 107
Action judging unit 108
Graphic drawing unit 109
Display control unit 110
Display unit 111
Camera device 20
First camera 201
Second camera 202
Display device 21
Detailed description of the embodiments
To describe in detail the technical content, structural features, objects and effects of the present invention, a detailed explanation is given below in conjunction with the embodiments and the accompanying drawings.
Referring to Fig. 1, a schematic diagram of the hardware architecture of the man-machine interactive virtual touch system in an embodiment of the present invention, the system 100 comprises a man-machine interactive virtual touch device 10, two camera devices 20 and a display device 21, and realizes touch input by detecting the user's gestures.
Referring to Fig. 2, a functional block diagram of the man-machine interactive virtual touch device in an embodiment of the present invention, the device 10 comprises an ambient brightness sensing unit 101, a view recognition unit 102, a horizontal-plane two-dimensional coordinate establishing unit 105, a vertical-plane two-dimensional coordinate establishing unit 106, a three-dimensional coordinate calculating unit 107, an action judging unit 108, a graphic drawing unit 109, a display control unit 110 and a display unit 111. The device 10 can be applied in electronic equipment such as cameras, mobile phones and tablet computers. The camera devices 20 are connected to the device 10 through a network, whose transmission medium may be a wireless medium such as Bluetooth, ZigBee or WiFi.
Each camera device 20 includes a first camera 201 and a second camera 202, serving as a longitudinal camera device and a transverse camera device respectively. The first camera 201 serving as the longitudinal camera device may belong to a portable mobile electronic device, such as smart glasses, that can be positioned above the user's hands, while the second camera 202 serving as the transverse camera device may belong to a portable mobile electronic device, such as a smart wristband, that can be placed in front of the user. Further, the first camera 201 and the second camera 202 of each camera device 20 are an ordinary camera and an infrared camera respectively. The ordinary camera captures images under good lighting conditions and sends them to the device 10 for analysis of the user's operation actions; the infrared camera captures images under poor lighting conditions and sends them to the device 10 for the same analysis. The view recognition unit 102 comprises a longitudinal view recognition subunit 103 and a transverse view recognition subunit 104, set up to correspond to the first camera 201 of the longitudinal camera device and the second camera 202 of the transverse camera device, each performing recognition processing on the images collected by its camera.
In the initial state, two pairs of cameras (a pair of ordinary cameras and a pair of infrared cameras) are used together, with their shooting directions set mutually perpendicular so that hand movements in both the vertical and horizontal directions can be captured simultaneously. Typically, the two cameras in the smart glasses (one ordinary camera and one infrared camera) face downward, while the two cameras on the smart wristband or smartphone (one ordinary camera and one infrared camera) are placed horizontally. The rectangular areas photographed by the two pairs of cameras jointly form the image capture region.
The ambient brightness sensing unit 101 senses the brightness of the environment and sends the ambient brightness value to the view recognition unit 102. The view recognition unit 102 decides whether the ordinary cameras or the infrared cameras are used according to a preset brightness threshold. For example, with a brightness sensing range of 1 to 100 and a threshold of 50, the ordinary cameras are used when the ambient brightness value exceeds 50, and the infrared cameras are used when the ambient brightness value is below 50.
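A minimal sketch of this brightness-based selection, assuming the 1-100 scale and the example threshold of 50 above (the function name and return values are illustrative):

    BRIGHTNESS_THRESHOLD = 50  # example threshold from the text, on a 1-100 scale

    def select_camera_type(ambient_brightness: int) -> str:
        # Ordinary cameras above the threshold, infrared cameras below it.
        return "ordinary" if ambient_brightness > BRIGHTNESS_THRESHOLD else "infrared"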
After the camera type to be used has been determined from the ambient brightness value, an initial positioning operation starts, as follows. During the initial positioning operation of the device 10, the user suspends the fingers of both hands to be used for operation flat at a position that the two selected groups of cameras can photograph, i.e. within the image capture region, and keeps them still for a certain time, completing the user hand position initialization procedure. This lets the device 10 identify and locate the initial positions of the finger end points for subsequent operations. The principle by which the device 10 identifies and locates the finger end point positions is described in detail below.
During interactive operation, the user suspends the fingers flat within the image capture region. The longitudinal view recognition subunit 103 decides, according to the ambient brightness value detected by the ambient brightness sensing unit 101, whether the ordinary camera or the infrared camera is used; after the camera in use has been determined, it performs hand recognition on the image data collected by the ordinary or infrared camera of the longitudinal camera device above the fingers, so as to determine the positions of the finger end points in the image. Likewise, the transverse view recognition subunit 104 decides which camera is used according to the ambient brightness value detected by the ambient brightness sensing unit 101, and then performs hand recognition on the image data collected by the ordinary or infrared camera of the transverse camera device in front of the fingers, so as to determine the positions of the finger end points in the image.
The position of a finger end point in the image determined by the longitudinal view recognition subunit 103 is the pixel position of the finger end point in the XZ coordinate plane image. For example, the longitudinal view recognition subunit 103 identifies that the pixel of finger end point 1 lies at row a, column b of the XZ plane image, the pixel of end point 2 at row c, column d, ..., and the pixel of end point 10 at row e, column f. The position of a finger end point in the image determined by the transverse view recognition subunit 104 is the pixel position of the finger end point in the YZ coordinate plane image. For example, the transverse view recognition subunit 104 identifies that the pixel of end point 1 lies at row g, column h of the YZ plane image, the pixel of end point 2 at row i, column j, ..., and the pixel of end point 10 at row k, column l.
Further, the methods for determining finger end points with the ordinary camera include a color background method and a color glove method. The color background method is as follows: the background against which the two hands operate must be relatively simple and uniform in color, so the hand can be extracted from the image directly by the color interval of human skin; a figure end point algorithm then computes the end position of each strip-shaped extension of the hand, which is taken as a finger end point position. The color glove method is as follows: the user wears special pure-red gloves; since an ordinary camera samples in RGB (red-green-blue), the pure-red regions can be extracted directly. Green or blue may also be used as the glove fingertip color.
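As an illustration of the color glove method, the sketch below marks the pixels of an RGB image in which red strongly dominates; the thresholds and the function name are assumptions for illustration, not values from the patent:

    import numpy as np

    def extract_red_glove_mask(rgb_image: np.ndarray) -> np.ndarray:
        # Stand-in for the 'pure red' color interval described above: keep
        # pixels whose red channel clearly dominates green and blue.
        r = rgb_image[..., 0].astype(int)
        g = rgb_image[..., 1].astype(int)
        b = rgb_image[..., 2].astype(int)
        return (r > 150) & (r - g > 60) & (r - b > 60)

A figure end point algorithm would then scan the strip-shaped regions of the resulting mask for their tips, which are taken as the finger end point positions.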
The methods for determining finger end points with the infrared camera include a temperature filtering method and a color glove auxiliary method. The temperature filtering method is as follows: because the surface temperature of the human body is higher than that of the environment, the higher-temperature hand can be extracted from the image directly; a figure end point algorithm then computes the end positions of the strip-shaped extensions of the hand, which are taken as the finger end point positions. The color glove auxiliary method is as follows: the user wears special gloves whose surface generates heat, so the hot regions in the image can be extracted directly.
The horizontal-plane two-dimensional coordinate establishing unit 105 converts the pixel positions of the ten finger end points into two-dimensional coordinate values on the XZ coordinate plane, according to the positions in the image of the finger end points identified by the longitudinal view recognition subunit 103 and the pixel resolution of the camera. The vertical-plane two-dimensional coordinate establishing unit 106 converts the pixel positions of the ten finger end points into two-dimensional coordinate values on the YZ coordinate plane, according to the positions in the image of the finger end points identified by the transverse view recognition subunit 104 and the pixel resolution of the camera.
The conversion from finger end point pixel positions to two-dimensional coordinate values on the XZ coordinate plane works as follows: the pixel at the lower-left corner of the image is set as the origin 0 of the two-dimensional coordinate system, and the ratios of the target coordinate range to the number of image rows and columns are computed from the image resolution. For example, if the resolution of the XZ plane image is 2000*1000 (width*height) and the coordinate range of the two-dimensional XZ coordinate system is 1 to 150 on the X axis and 1 to 100 on the Z axis, then the ratio of the Z coordinate range to the image rows is 100/1000 and the ratio of the X coordinate range to the image columns is 150/2000. The pixel position of a finger end point is multiplied by these row and column ratios to obtain the end point's two-dimensional coordinates. For example, if the pixel position of a finger end point is row 300, column 200, its Z coordinate is 300*100/1000=30 and its X coordinate is 200*150/2000=15. The conversion of finger end point pixel positions to two-dimensional coordinates on the YZ coordinate plane follows the same principle and is not repeated here.
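The conversion is plain scaling, as the worked example shows; a sketch using the example numbers above (names are illustrative):

    def pixel_to_plane_coords(row: int, col: int,
                              img_rows: int = 1000, img_cols: int = 2000,
                              coord_range_x: float = 150.0,
                              coord_range_z: float = 100.0) -> tuple:
        # The lower-left pixel is the origin; each axis scales the pixel index
        # by (coordinate range / image size). Worked example from the text:
        # row 300, col 200 -> Z = 300*100/1000 = 30, X = 200*150/2000 = 15.
        z = row * coord_range_z / img_rows
        x = col * coord_range_x / img_cols
        return (x, z)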
The three-dimensional coordinate calculating unit 107 calculates the coordinates of the finger end points in the XYZ three-dimensional coordinate system from the two-dimensional coordinate values on the XZ and YZ coordinate planes of the ten finger end point pixel positions determined by the horizontal-plane two-dimensional coordinate establishing unit 105 and the vertical-plane two-dimensional coordinate establishing unit 106 respectively.
The calculation works as follows: since the XZ and YZ coordinate planes share the Z axis, the Z values of the coordinate end points on the XZ plane and on the YZ plane are extracted and compared, and end points whose Z coordinates are equal or closest are considered the same end point. The XZ-plane coordinates and YZ-plane coordinates judged to belong to the same end point are then merged into a single coordinate end point, whose values are used as the XYZ three-dimensional coordinates. Because the two Z values may differ, the Z value of the resulting three-dimensional coordinate is the XZ-plane Z value plus the YZ-plane Z value, divided by 2; the X and Y values of the three-dimensional coordinate equal the X coordinate from the XZ plane and the Y coordinate from the YZ plane respectively.
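A sketch of this Z-axis matching and merging, assuming each XZ end point has exactly one YZ counterpart (function and variable names are illustrative):

    def merge_planes(xz_points, yz_points):
        # xz_points: [(x, z), ...] from the longitudinal view;
        # yz_points: [(y, z), ...] from the transverse view.
        # Pair each XZ point with the YZ point whose Z value is closest,
        # then merge into (x, y, z) with the two Z values averaged.
        merged = []
        remaining = list(yz_points)
        for x, z_xz in xz_points:
            y, z_yz = min(remaining, key=lambda p: abs(p[1] - z_xz))
            remaining.remove((y, z_yz))
            merged.append((x, y, (z_xz + z_yz) / 2.0))
        return merged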
The action judging unit 108 judges whether the user has pressed a key, according to the three-dimensional coordinates of the finger end points calculated by the three-dimensional coordinate calculating unit 107 and the position region corresponding to each key position of the virtual keyboard.
In this embodiment, during the hand position initialization phase, the action judging unit 108 takes the vertically lowest end point among the three-dimensional coordinates of all finger end points (i.e. the minimum Y coordinate) as the Y-axis value of the click judgment plane, maps the three-dimensional coordinates of the ten finger end points onto the key regions of the virtual keyboard, and determines the selected key from the key region on which each finger end point falls, thereby determining the key position information.
Since this is the hand position initialization phase, the action judging unit 108 uses the Y value to set the initial value of the click judgment plane, so the Y values of all hand coordinates are greater than or equal to the decision value of the click judgment plane. When the user then moves the hands in normal operating mode, the action judging unit 108, each time it receives the three-dimensional coordinates of the finger end points, no longer resets the Y-axis value of the click judgment plane but directly uses it to judge whether a valid click action on the virtual keyboard has occurred.
Mapping the three-dimensional coordinates of the finger end points onto the key regions of the keyboard works as follows: each key of the keyboard has a two-dimensional coordinate region for its key position in the XZ plane. For example, key A might occupy the rectangular region from 3 to 5 on the X axis and from 7 to 9 on the Z axis of the XZ plane; as soon as the XZ coordinates of any end point fall within this rectangular region, that finger end point of the user is judged to be above key position A.
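A sketch of this key-region lookup; the rectangle for key A uses the example ranges above, and all names are illustrative:

    from typing import Optional

    # ((x_min, x_max), (z_min, z_max)) per key; key "A" from the example above.
    KEY_REGIONS = {
        "A": ((3.0, 5.0), (7.0, 9.0)),
        # ... one entry per key of the virtual keyboard
    }

    def key_under_point(x: float, z: float) -> Optional[str]:
        # Return the key whose XZ rectangle contains the end point, if any.
        for key, ((x_min, x_max), (z_min, z_max)) in KEY_REGIONS.items():
            if x_min <= x <= x_max and z_min <= z <= z_max:
                return key
        return None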
Judging a click action from the three-dimensional coordinates of a finger end point works as follows: once the Y-axis value of the click judgment plane has been selected, whenever the Y value in a finger end point's three-dimensional coordinates drops below the click judgment plane's Y-axis value, that end point is judged to have passed through the click judgment plane and the finger is judged to have clicked; the key region in which the finger end point lies then determines which key the user has clicked.
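Combining the click judgment plane with the key-region lookup sketched above gives the full click test; the plane's Y value is assumed to come from the initialization phase as the minimum Y of all end points:

    def detect_click(point_xyz, click_plane_y):
        # A click occurs when an end point's Y drops below the click judgment
        # plane; the clicked key is the region its (x, z) falls in (see above).
        x, y, z = point_xyz
        if y < click_plane_y:
            return key_under_point(x, z)
        return None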
The graphic drawing unit 109, according to the judgment result of the action judging unit 108, i.e. which key regions the ten finger end points respectively fall on and which key positions are selected, draws each finger of the simulated user hand on the corresponding key position of the virtual keyboard, and then draws any selected key on which a click occurred as highlighted, to indicate that it has been clicked.
The display control unit 110 converts the image drawn by the graphic drawing unit 109 into a timing sequence that the display device 21 can display, and calls the display unit 111 to present the image of the operation on the virtual keyboard on the display device 21 for the user to view; from this feedback the user can see which key position each finger currently corresponds to and continue the virtual key operation.
Referring to Fig. 3, a schematic flowchart of the man-machine interactive virtual touch method in an embodiment of the present invention, the method comprises:
Step S30: the ambient brightness sensing unit 101 senses the brightness of the environment, and the view recognition unit 102 decides whether the ordinary cameras or the infrared cameras are used, according to the preset brightness threshold and the ambient brightness value sensed by the ambient brightness sensing unit 101.
In the initial state, two pairs of cameras (a pair of ordinary cameras and a pair of infrared cameras) are used together, with their shooting directions set mutually perpendicular so that hand movements in both the vertical and horizontal directions can be captured simultaneously. Typically, the two cameras in the smart glasses (one ordinary camera and one infrared camera) face downward, while the two cameras on the smart wristband or smartphone (one ordinary camera and one infrared camera) are placed horizontally. The rectangular areas photographed by the two pairs of cameras jointly form the image capture region.
Step S31: the user suspends the fingers of both hands to be used for operation flat within the image capture region and keeps them still for a certain time; the device 10 identifies and locates the initial positions of the fingers, completing the user finger position initialization.
The principle by which the device 10 identifies and locates the finger positions is described in detail below.
Step S32: the user suspends the fingers of both hands to be used for operation flat within the image capture region. The longitudinal view recognition subunit 103 performs hand recognition on the image data collected by the ordinary or infrared camera of the longitudinal camera device above the fingers, so as to determine the positions of the finger end points in the image. The transverse view recognition subunit 104 performs hand recognition on the image data collected by the ordinary or infrared camera of the transverse camera device in front of the fingers, so as to determine the positions of the finger end points in the image.
Specifically, the position of a finger end point in the image determined by the longitudinal view recognition subunit 103 is the pixel position of the finger end point in the XZ coordinate plane image. For example, the longitudinal view recognition subunit 103 identifies that the pixel of finger end point 1 lies at row a, column b of the XZ plane image, the pixel of end point 2 at row c, column d, ..., and the pixel of end point 10 at row e, column f. The position of a finger end point in the image determined by the transverse view recognition subunit 104 is the pixel position of the finger end point in the YZ coordinate plane image. For example, the transverse view recognition subunit 104 identifies that the pixel of end point 1 lies at row g, column h of the YZ plane image, the pixel of end point 2 at row i, column j, ..., and the pixel of end point 10 at row k, column l.
Further, the methods for determining finger end points with the ordinary camera include a color background method and a color glove method. The color background method is as follows: the background against which the two hands operate must be relatively simple and uniform in color, so the hand can be extracted from the image directly by the color interval of human skin; a figure end point algorithm then computes the end position of each strip-shaped extension of the hand, which is taken as a finger end point position. The color glove method is as follows: the user wears special pure-red gloves; since an ordinary camera samples in RGB (red-green-blue), the pure-red regions can be extracted directly. Green or blue may also be used as the glove fingertip color.
The methods for determining finger end points with the infrared camera include a temperature filtering method and a color glove auxiliary method. The temperature filtering method is as follows: because the surface temperature of the human body is higher than that of the environment, the higher-temperature hand can be extracted from the image directly; a figure end point algorithm then computes the end positions of the strip-shaped extensions of the hand, which are taken as the finger end point positions. The color glove auxiliary method is as follows: the user wears special gloves whose surface generates heat, so the hot regions in the image can be extracted directly.
Step S33: the horizontal-plane two-dimensional coordinate establishing unit 105 converts the pixel positions of the ten finger end points into two-dimensional coordinate values on the XZ coordinate plane, according to the positions in the image of the finger end points identified by the longitudinal view recognition subunit 103 and the pixel resolution of the camera. The vertical-plane two-dimensional coordinate establishing unit 106 converts the pixel positions of the ten finger end points into two-dimensional coordinate values on the YZ coordinate plane, according to the positions in the image of the finger end points identified by the transverse view recognition subunit 104 and the pixel resolution of the camera.
The conversion from finger end point pixel positions to two-dimensional coordinate values on the XZ coordinate plane works as follows: the pixel at the lower-left corner of the image is set as the origin 0 of the two-dimensional coordinate system, and the ratios of the target coordinate range to the number of image rows and columns are computed from the image resolution. For example, if the resolution of the XZ plane image is 2000*1000 (width*height) and the coordinate range of the two-dimensional XZ coordinate system is 1 to 150 on the X axis and 1 to 100 on the Z axis, then the ratio of the Z coordinate range to the image rows is 100/1000 and the ratio of the X coordinate range to the image columns is 150/2000. The pixel position of a finger end point is multiplied by these row and column ratios to obtain the end point's two-dimensional coordinates. For example, if the pixel position of a finger end point is row 300, column 200, its Z coordinate is 300*100/1000=30 and its X coordinate is 200*150/2000=15. The conversion of finger end point pixel positions to two-dimensional coordinates on the YZ coordinate plane follows the same principle and is not repeated here.
Step S34: the three-dimensional coordinate calculating unit 107 calculates the coordinates of the finger end points in the XYZ three-dimensional coordinate system from the two-dimensional coordinate values on the XZ and YZ coordinate planes of the ten finger end point pixel positions determined by the horizontal-plane two-dimensional coordinate establishing unit 105 and the vertical-plane two-dimensional coordinate establishing unit 106 respectively.
The calculation method is as follows: since the XZ and YZ coordinate planes share the Z axis, the Z values of the coordinate end points on the XZ plane and on the YZ plane are extracted and compared, and end points whose Z coordinates are equal or closest are considered the same end point. The XZ-plane coordinates and YZ-plane coordinates judged to belong to the same end point are then merged into a single coordinate end point, whose values are used as the XYZ three-dimensional coordinates. Because the two Z values may differ, the Z value of the resulting three-dimensional coordinate is the XZ-plane Z value plus the YZ-plane Z value, divided by 2; the X and Y values of the three-dimensional coordinate equal the X coordinate from the XZ plane and the Y coordinate from the YZ plane respectively.
Step S35: the action judging unit 108 judges whether the user has pressed a key, according to the three-dimensional coordinates of the finger end points calculated by the three-dimensional coordinate calculating unit 107 and the position region corresponding to each key position of the virtual keyboard.
In this embodiment, during the hand position initialization phase, the action judging unit 108 takes the vertically lowest end point among the three-dimensional coordinates of all finger end points (i.e. the minimum Y coordinate) as the Y-axis value of the click judgment plane, maps the three-dimensional coordinates of the ten finger end points onto the key regions of the virtual keyboard, and determines the selected key from the key region on which each finger end point falls, thereby determining the key position information.
Since this is the hand position initialization phase, the action judging unit 108 uses the Y value to set the initial value of the click judgment plane, so the Y values of all hand coordinates are greater than or equal to the decision value of the click judgment plane. When the user then moves the hands in normal operating mode, the action judging unit 108, each time it receives the three-dimensional coordinates of the finger end points, no longer resets the Y-axis value of the click judgment plane but directly uses it to judge whether a valid click action on the virtual keyboard has occurred.
Mapping the three-dimensional coordinates of the finger end points onto the key regions of the keyboard works as follows: each key of the keyboard has a two-dimensional coordinate region for its key position in the XZ plane. For example, key A might occupy the rectangular region from 3 to 5 on the X axis and from 7 to 9 on the Z axis of the XZ plane; as soon as the XZ coordinates of any end point fall within this rectangular region, that finger end point of the user is judged to be above key position A.
Judging a click action from the three-dimensional coordinates of a finger end point works as follows: once the Y-axis value of the click judgment plane has been selected, whenever the Y value in a finger end point's three-dimensional coordinates drops below the click judgment plane's Y-axis value, that end point is judged to have passed through the click judgment plane and the finger is judged to have clicked; the key region in which the finger end point lies then determines which key the user has clicked.
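Putting the pieces together, a brief end-to-end sketch over an initialization frame and a later frame, reusing the illustrative helpers sketched earlier (all coordinate values are assumed for illustration):

    # Initialization frame: the click judgment plane is the lowest Y value.
    init_points = [(15.0, 12.0, 30.0), (30.0, 11.5, 40.0)]   # (x, y, z)
    click_plane_y = min(y for _, y, _ in init_points)        # -> 11.5

    # Later frame: one finger has dipped below the judgment plane inside
    # key A's rectangle (X 3-5, Z 7-9), so a click on "A" is reported.
    frame_points = [(4.2, 10.9, 8.1), (30.0, 12.0, 40.0)]
    for p in frame_points:
        key = detect_click(p, click_plane_y)
        if key:
            print("clicked:", key)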
Step S36: the graphic drawing unit 109, according to the judgment result of the action judging unit 108, i.e. which key regions the ten finger end points respectively fall on and which key positions are selected, draws each finger of the simulated user hand on the corresponding key position of the virtual keyboard, and then draws any selected key on which a click occurred as highlighted, to indicate that it has been clicked.
Step S37: the display control unit 110 converts the image drawn by the graphic drawing unit 109 into a timing sequence that the display device 21 can display, and calls the display unit 111 to present the image of the operation on the virtual keyboard on the display device 21 for the user to view; from this feedback the user can see which key position each finger currently corresponds to and continue the virtual key operation.
With the man-machine interactive virtual touch device, system and method provided by the present invention, the positions and postures of the finger end points are identified from images captured by the cameras, the obtained finger end point coordinates are mapped directly to operation actions on the virtual keyboard, and the result is displayed on a display and fed back to the user. A virtual keyboard input environment can be built quickly with smart glasses and a smart wristband, or with the cameras on a smart portable mobile device, so that no physical device is needed and the user can conveniently perform man-machine interaction through the virtual keyboard anytime and anywhere.
The foregoing are merely embodiments of the present invention and do not thereby limit the scope of the claims of the present invention. Any equivalent structure or equivalent process transformation made using the contents of the description and drawings of the present invention, or any direct or indirect application in other related technical fields, shall likewise be included within the patent protection scope of the present invention.

Claims (15)

1. A man-machine interactive virtual touch device comprising a display control unit and a display unit, characterized in that the device comprises:
a view recognition unit, configured to perform hand recognition on the image data of the user's fingers collected by the cameras, so as to determine the positions of the finger end points in the images;
a horizontal-plane two-dimensional coordinate establishing unit, configured to convert the finger end point pixel positions into two-dimensional coordinate values on the XZ coordinate plane, according to the positions in the image of the finger end points identified by said view recognition unit and the pixel resolution of the camera;
a vertical-plane two-dimensional coordinate establishing unit, configured to convert the finger end point pixel positions into two-dimensional coordinate values on the YZ coordinate plane, according to the positions in the image of the finger end points identified by said view recognition unit and the pixel resolution of the camera;
a three-dimensional coordinate calculating unit, configured to calculate the coordinates of the finger end points in the XYZ three-dimensional coordinate system from the two-dimensional coordinate values on the XZ and YZ coordinate planes of the finger end point pixel positions determined by said horizontal-plane and vertical-plane two-dimensional coordinate establishing units respectively;
an action judging unit, configured to judge the user's operation on the key positions of the virtual keyboard, according to the three-dimensional coordinates of the finger end points calculated by said three-dimensional coordinate calculating unit and the position region corresponding to each key position of the virtual keyboard; and
a graphic drawing unit, configured to draw, according to the judgment result of said action judging unit, an operation image simulating the user's fingers on the corresponding key positions of the virtual keyboard, and to call said display control unit to control said display unit to display said operation image.
2. The man-machine interactive virtual touch device of claim 1, characterized in that it further comprises an ambient brightness sensing unit for sensing the brightness of the environment;
said view recognition unit comprises:
a longitudinal view recognition subunit, configured to decide whether an ordinary camera or an infrared camera is used according to the ambient brightness value detected by said ambient brightness sensing unit and, after the camera in use has been determined, to perform hand recognition on said collected image data so as to determine the pixel positions of the finger end points in the XZ coordinate plane image; and
a transverse view recognition subunit, configured to decide whether an ordinary camera or an infrared camera is used according to the ambient brightness value detected by said ambient brightness sensing unit and, after the camera in use has been determined, to perform hand recognition on said collected image data so as to determine the pixel positions of the finger end points in the YZ coordinate plane image.
3. The man-machine interactive virtual touch device of claim 1, characterized in that said horizontal-plane two-dimensional coordinate establishing unit converts the finger end point pixel positions into two-dimensional coordinate values on the XZ coordinate plane, and said vertical-plane two-dimensional coordinate establishing unit converts the finger end point pixel positions into two-dimensional coordinate values on the YZ coordinate plane, specifically as follows: the pixel at the lower-left corner of the image is set as the origin 0 of the two-dimensional coordinate system, and the ratios of the converted coordinate range to the number of rows and columns of the image are computed from the image resolution.
4. The man-machine interactive virtual touch device of claim 1, characterized in that said action judging unit is further configured, during the hand position initialization phase, to take the vertically lowest end point among the three-dimensional coordinates of the finger end points as the Y-axis value of the click judgment plane, to map the three-dimensional coordinates of the finger end points onto the key regions of the virtual keyboard, and to determine the selected keys from the key regions on which the finger end points fall, thereby determining the key position information.
5. The man-machine interactive virtual touch device of claim 4, characterized in that said action judging unit maps the three-dimensional coordinates of the finger end points onto the key regions of the virtual keyboard specifically as follows: each key of the virtual keyboard has a two-dimensional coordinate region for its corresponding key position in the XZ plane, and a finger end point of the user is judged to be above the corresponding key position when the XZ coordinates of the finger end point are determined to fall within said coordinate region.
6. The man-machine interactive virtual touch device of claim 5, characterized in that said action judging unit judges a click action from the three-dimensional coordinates of a finger end point specifically as follows: after the Y-axis value of the click judgment plane has been selected, when the Y value in the finger end point's three-dimensional coordinates is lower than said click judgment plane Y-axis value, the finger end point is judged to have passed through said click judgment plane, and the key position coordinate region within which the finger end point falls determines the key on which the user finally performs the click operation.
7. A man-machine interactive virtual touch system, characterized by comprising the man-machine interactive virtual touch device of any one of claims 1-6 and two camera devices communicatively connected to said device.
8. The man-machine interactive virtual touch system of claim 7, characterized in that each camera device comprises a first camera and a second camera, serving as a longitudinal camera device and a transverse camera device respectively, with their shooting directions set mutually perpendicular.
9. The man-machine interactive virtual touch system of claim 8, characterized in that said first camera and second camera are an ordinary camera and an infrared camera respectively.
10. A man-machine interactive virtual touch method, characterized in that the method comprises:
the user suspends the fingers flat within an image capture region, and hand recognition is performed on the image data of the user's fingers collected by the cameras, so as to determine the positions of the finger end points in the images;
according to the identified positions of the finger end points in the images and the pixel resolution of the cameras, the finger end point pixel positions are converted into two-dimensional coordinate values on the XZ coordinate plane and on the YZ coordinate plane respectively;
from the two-dimensional coordinate values of the finger end point pixel positions on the XZ and YZ coordinate planes, the coordinates of the finger end points in the XYZ three-dimensional coordinate system are calculated;
according to said calculated three-dimensional coordinates of the finger end points and the position region corresponding to each key position of the virtual keyboard, the user's operation on the key positions of the virtual keyboard is judged;
according to said judgment result, an operation image simulating the user's fingers on the corresponding key positions of the virtual keyboard is drawn; and
said drawn image is displayed for the user to view.
11. The man-machine interactive virtual touch method of claim 10, characterized in that, before the step in which the user suspends the fingers flat within the image capture region and hand recognition is performed on the image data of the user's fingers collected by the cameras to determine the positions of the finger end points in the images, the method comprises:
deciding whether an ordinary camera or an infrared camera is used, according to the sensed ambient brightness value and a preset brightness threshold;
the user suspending the fingers of both hands to be used for operation flat within the image capture region and keeping them still for a certain time, so that the selected cameras identify and locate the initial position of the hands.
12. The man-machine interactive virtual touch method of claim 11, characterized in that, after the step of identifying and locating the initial position of the hands by the selected cameras, the method further comprises:
taking the vertically lowest end point among the three-dimensional coordinates of the finger end points as the Y-axis value of the click judgment plane, mapping the three-dimensional coordinates of the finger end points onto the key regions of the virtual keyboard, and determining the selected keys from the key regions on which the finger end points fall, thereby determining the key position information.
13. The man-machine interactive virtual touch method of claim 12, characterized in that the step of mapping the three-dimensional coordinates of the finger end points onto the key regions of the virtual keyboard is specifically: each key of the virtual keyboard has a two-dimensional coordinate region for its corresponding key position in the XZ plane, and a finger end point of the user is judged to be above the corresponding key position when the XZ coordinates of the finger end point are determined to fall within said coordinate region.
14. The man-machine interactive virtual touch method of claim 10, characterized in that the step in which the user suspends the fingers flat within the image capture region and hand recognition is performed on the image data of the user's fingers collected by the cameras to determine the positions of the finger end points in the images is specifically:
determining the pixel positions of the finger end points in the XZ coordinate plane image and in the YZ coordinate plane image respectively.
15. The man-machine interactive virtual touch method of claim 14, characterized in that the step of converting the finger end point pixel positions into two-dimensional coordinate values on the XZ coordinate plane and the YZ coordinate plane respectively, according to the identified positions of the finger end points in the images and the pixel resolution of the cameras, is specifically:
setting the pixel at the lower-left corner of the image as the origin 0 of the two-dimensional coordinate system, and computing, from the image resolution, the ratios of the converted coordinate range to the number of rows and columns of the image.
CN201410436989.0A 2014-08-29 2014-08-29 Virtual keyboard operation device, system and method Active CN104199550B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410436989.0A CN104199550B (en) 2014-08-29 2014-08-29 Virtual keyboard operation device, system and method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201410436989.0A CN104199550B (en) 2014-08-29 2014-08-29 Virtual keyboard operation device, system and method

Publications (2)

Publication Number Publication Date
CN104199550A 2014-12-10
CN104199550B CN104199550B (en) 2017-05-17

Family

ID=52084851

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410436989.0A Active CN104199550B (en) 2014-08-29 2014-08-29 Virtual keyboard operation device, system and method

Country Status (1)

Country Link
CN (1) CN104199550B (en)



Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH04271423A (en) * 1991-02-27 1992-09-28 Nippon Telegr & Teleph Corp <Ntt> Information input method
US20040242988A1 (en) * 2003-02-24 2004-12-02 Kabushiki Kaisha Toshiba Operation recognition system enabling operator to give instruction without device operation
CN101589425A (en) * 2006-02-16 2009-11-25 Ftk技术有限公司 A system and method of inputting data into a computing system
CN102880304A (en) * 2012-09-06 2013-01-16 天津大学 Character inputting method and device for portable device
CN103105930A (en) * 2013-01-16 2013-05-15 中国科学院自动化研究所 Non-contact type intelligent inputting method based on video images and device using the same

Cited By (27)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2016115976A1 (en) * 2015-01-21 2016-07-28 Kong Liang Smart wearable input apparatus
CN106155535A (en) * 2015-03-23 2016-11-23 联想(北京)有限公司 Touch screen soft-keyboard input method and device
CN105425964A (en) * 2015-11-30 2016-03-23 青岛海信电器股份有限公司 Gesture identification method and system
CN105425964B (en) * 2015-11-30 2018-07-13 青岛海信电器股份有限公司 A kind of gesture identification method and system
CN105867806A (en) * 2016-03-25 2016-08-17 联想(北京)有限公司 Input method and electronic equipment
CN105867806B (en) * 2016-03-25 2020-05-26 联想(北京)有限公司 Input method and electronic equipment
CN107479717A (en) * 2016-04-29 2017-12-15 姚秉洋 Display method of on-screen keyboard and computer program product thereof
CN107479717B (en) * 2016-04-29 2020-06-12 姚秉洋 Display method of on-screen keyboard and computer program product thereof
CN106155533B (en) * 2016-06-30 2019-05-31 联想(北京)有限公司 A kind of information processing method and projection device
CN106155533A (en) * 2016-06-30 2016-11-23 联想(北京)有限公司 A kind of information processing method and projector equipment
CN107168541A (en) * 2017-04-07 2017-09-15 北京小鸟看看科技有限公司 The implementation method and device of a kind of input
CN111007977A (en) * 2018-10-04 2020-04-14 邱波 Intelligent virtual interaction method and device
CN109828672A (en) * 2019-02-14 2019-05-31 亮风台(上海)信息科技有限公司 It is a kind of for determining the method and apparatus of the human-machine interactive information of smart machine
CN109828672B (en) * 2019-02-14 2022-05-27 亮风台(上海)信息科技有限公司 Method and equipment for determining man-machine interaction information of intelligent equipment
CN110780743A (en) * 2019-11-05 2020-02-11 聚好看科技股份有限公司 VR (virtual reality) interaction method and VR equipment
CN111796671A (en) * 2020-05-22 2020-10-20 福建天晴数码有限公司 Gesture recognition and control method for head-mounted device and storage medium
CN111796675A (en) * 2020-05-22 2020-10-20 福建天晴数码有限公司 Gesture recognition control method of head-mounted device and storage medium
CN111796674A (en) * 2020-05-22 2020-10-20 福建天晴数码有限公司 Gesture touch sensitivity adjusting method based on head-mounted device and storage medium
CN111796673A (en) * 2020-05-22 2020-10-20 福建天晴数码有限公司 Multi-finger gesture recognition method and storage medium for head-mounted device
CN111796672A (en) * 2020-05-22 2020-10-20 福建天晴数码有限公司 Gesture recognition method based on head-mounted device and storage medium
CN111796671B (en) * 2020-05-22 2023-04-28 福建天晴数码有限公司 Gesture recognition and control method of head-mounted equipment and storage medium
CN111796672B (en) * 2020-05-22 2023-04-28 福建天晴数码有限公司 Gesture recognition method based on head-mounted equipment and storage medium
CN111796673B (en) * 2020-05-22 2023-04-28 福建天晴数码有限公司 Multi-finger gesture recognition method of head-mounted equipment and storage medium
CN111796675B (en) * 2020-05-22 2023-04-28 福建天晴数码有限公司 Gesture recognition control method of head-mounted equipment and storage medium
CN111796674B (en) * 2020-05-22 2023-04-28 福建天晴数码有限公司 Gesture touch sensitivity adjusting method based on head-mounted equipment and storage medium
CN111860239A (en) * 2020-07-07 2020-10-30 佛山长光智能制造研究院有限公司 Key identification method and device, terminal equipment and computer readable storage medium
CN114949663A (en) * 2022-05-13 2022-08-30 成都软智科技有限公司 Many unmanned aerial vehicle fire extinguishing system

Also Published As

Publication number Publication date
CN104199550B (en) 2017-05-17

Similar Documents

Publication Publication Date Title
CN104199550A (en) Man-machine interactive type virtual touch device, system and method
CN104199547B (en) Virtual touch screen operation device, system and method
US9329691B2 (en) Operation input apparatus and method using distinct determination and control areas
CN104199548B (en) A kind of three-dimensional man-machine interactive operation device, system and method
CN106598227A (en) Hand gesture identification method based on Leap Motion and Kinect
TWI471815B (en) Gesture recognition device and method
US20150192990A1 (en) Display control method, apparatus, and terminal
CN102508574A (en) Projection-screen-based multi-touch detection method and multi-touch system
CN103207709A (en) Multi-touch system and method
Reddy et al. Virtual mouse control using colored finger tips and hand gesture recognition
CN103092334B (en) Virtual mouse driving device and virtual mouse simulation method
CN105242776A (en) Control method for intelligent glasses and intelligent glasses
CN104267802A (en) Human-computer interactive virtual touch device, system and method
CN109839827B (en) Gesture recognition intelligent household control system based on full-space position information
CN114138121B (en) User gesture recognition method, device and system, storage medium and computing equipment
CN112486394A (en) Information processing method and device, electronic equipment and readable storage medium
CN104199549A (en) Man-machine interactive type virtual touch device, system and method
TWI486815B (en) Display device, system and method for controlling the display device
CN107239222A (en) The control method and terminal device of a kind of touch-screen
Hartanto et al. Real time hand gesture movements tracking and recognizing system
CN108227923A (en) A kind of virtual touch-control system and method based on body-sensing technology
CN103389793B (en) Man-machine interaction method and system
CN114581535A (en) Method, device, storage medium and equipment for marking key points of user bones in image
JP2019087136A (en) Screen display control method and screen display control system
CN102902468A (en) Map browsing method and device of mobile terminal

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: 350003 Fuzhou Gulou District, Fujian, software Avenue, building 89, No. 18

Applicant after: FUZHOU ROCKCHIP ELECTRONICS CO., LTD.

Address before: 350003 Fuzhou Gulou District, Fujian, software Avenue, building 89, No. 18

Applicant before: Fuzhou Rockchip Semiconductor Co., Ltd.

COR Change of bibliographic data
GR01 Patent grant
CP01 Change in the name or title of a patent holder

Address after: 350003 building, No. 89, software Avenue, Gulou District, Fujian, Fuzhou 18, China

Patentee after: Ruixin Microelectronics Co., Ltd

Address before: 350003 building, No. 89, software Avenue, Gulou District, Fujian, Fuzhou 18, China

Patentee before: Fuzhou Rockchips Electronics Co.,Ltd.