CN104637080B - A three-dimensional drawing system and method based on human-computer interaction - Google Patents

A three-dimensional drawing system and method based on human-computer interaction

Info

Publication number
CN104637080B
CN104637080B (application CN201310549711.XA)
Authority
CN
China
Prior art keywords
intelligent glasses
wearer
infrared
dimensional drawing
man
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201310549711.XA
Other languages
Chinese (zh)
Other versions
CN104637080A (en)
Inventor
费树培
樊建平
谢耀钦
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Institute of Advanced Technology of CAS
Original Assignee
Shenzhen Institute of Advanced Technology of CAS
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Institute of Advanced Technology of CAS filed Critical Shenzhen Institute of Advanced Technology of CAS
Priority to CN201310549711.XA priority Critical patent/CN104637080B/en
Publication of CN104637080A publication Critical patent/CN104637080A/en
Application granted granted Critical
Publication of CN104637080B publication Critical patent/CN104637080B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 - Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/012 - Head tracking input arrangements
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 - Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033 - Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346 - Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04845 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00 - Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048 - Indexing scheme relating to G06F3/048
    • G06F2203/04802 - 3D-info-object: information is displayed on the internal or external surface of a three dimensional manipulable object, e.g. on the faces of a cube that can be rotated by the user

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention provides a three-dimensional drawing system and method based on human-computer interaction. A plane recognition unit on a pair of smart glasses determines the working plane of the glasses, and a positioning and tracking unit tracks the position of the wearer's head, adjusting in real time the image that the glasses superimpose on the working plane so that the image remains stationary relative to the wearer. A two-dimensional figure is then drawn with an interaction pen on the interface of the three-dimensional drawing software and processed to form a three-dimensional figure; the interaction pen moves the three-dimensional figure off the software interface into three-dimensional space, where the glasses display its information. The system and method allow the three-dimensional drawing software to be placed on any plane, so that 3D modeling and rendering can be performed anytime, anywhere. This increases the freedom of drawing work: the user no longer has to work in front of a two-dimensional display, which is convenient and reliable.

Description

A three-dimensional drawing system and method based on human-computer interaction
Technical field
The present invention relates to the field of image technology, and in particular to a three-dimensional drawing system and method based on human-computer interaction.
Background technology
Although three-dimensional drawing software such as SolidWorks, UG and ProE can help designers produce the 3D models they need, this approach has an essential shortcoming: all drawing is done on a two-dimensional plane even though the model being drawn is three-dimensional, which makes the drawing process unintuitive for the user. Moreover, current three-dimensional drawing software runs entirely on computers, so the user is confined to working in front of a two-dimensional display, and designers cannot complete the design process anytime, anywhere.
With the development of technology, smart glasses have gradually entered the public eye. With see-through smart glasses, ambient visible light passes through the lenses once the glasses are put on, so the wearer can see the surrounding environment normally; at the same time, the glasses can superimpose image information onto the wearer's real field of view. Because of this property, see-through smart glasses will play a major role in many industries and fields.
At present, all 3D modeling work is done on two-dimensional displays; no technology for modeling directly in three-dimensional space has yet appeared.
Summary of the invention
An object of the present invention is to provide a three-dimensional drawing system based on human-computer interaction that enables drawing in three-dimensional space.
To achieve the above object, the three-dimensional drawing system based on human-computer interaction provided by the invention comprises:
Smart glasses, for displaying image information in three-dimensional space, comprising:
A plane recognition unit, for determining the working plane of the smart glasses;
A positioning and tracking unit, for tracking the head position of the wearer of the smart glasses and adjusting, in real time according to that head position, the image the glasses superimpose on the working plane, so that the image remains stationary relative to the wearer;
An interaction pen, signal-connected to the smart glasses, for drawing two-dimensional figures;
A host, signal-connected to the smart glasses and provided with three-dimensional drawing software. The host transmits the interface of the three-dimensional drawing software to the smart glasses by wireless signal, and the glasses display that interface on the working plane. The interaction pen draws a two-dimensional figure on the interface of the drawing software and is further used to process the two-dimensional figure into a three-dimensional figure;
The interaction pen is further used to move the three-dimensional figure off the interface of the drawing software into the three-dimensional space, where the smart glasses then display the information of the three-dimensional figure.
Preferably, the plane recognition unit comprises a first camera, a second camera, an infrared matrix-grid LED and a first processor. The first and second cameras are arranged symmetrically at the left and right ends of the smart glasses; the infrared matrix-grid LED is arranged at the center of the glasses and produces an infrared matrix grid; the first processor is built into the glasses. The two cameras detect the infrared light scattered by the plane in front of the wearer onto which the matrix grid is projected; the first processor analyzes the scattered infrared light and derives the distance and angle of that plane relative to the glasses, thereby determining the working plane of the smart glasses.
Preferably, the infrared matrix-grid LED comprises a baffle in which grid-shaped slots are cut and an infrared LED placed directly against the baffle; the infrared light emitted by the LED forms an infrared matrix grid after passing through the baffle.
Preferably, the positioning and tracking unit comprises a three-axis acceleration sensor and a three-axis gyroscope. The acceleration sensor tracks the movement of the wearer's head along the X, Y and Z coordinate axes; the gyroscope tracks the rotation of the wearer's head about the X, Y and Z coordinate axes.
Preferably, the smart glasses further comprise a first infrared LED arranged at the center of the glasses, above the infrared matrix-grid LED. The first camera, the second camera and the first infrared LED form a finger recognition unit, which identifies the position of the wearer's fingers in the three-dimensional space.
Preferably, the interaction pen comprises a nib, a first infrared emission point, a second infrared emission point and a processor. The two infrared emission points are located at the two ends of the pen, and from them the coordinate of the nib can be determined. The processor is arranged inside the pen and controls its overall operation.
Preferably, the host is a computer or a cloud service on a network.
In addition, the invention also provides a three-dimensional drawing method based on human-computer interaction, comprising the following steps:
Step A: determining the working plane of the smart glasses with the plane recognition unit;
Step B: superimposing the displayed image on the working plane with the smart glasses;
Step C: tracking the head position of the wearer of the smart glasses with the positioning and tracking unit and adjusting, in real time according to that head position, the image the glasses superimpose on the working plane, so that the image remains stationary relative to the wearer;
Step D: opening the three-dimensional drawing software on the host; the host transmits the interface of the software to the smart glasses by wireless signal, the glasses display the interface on the working plane, and a two-dimensional figure is drawn on the interface with the interaction pen;
Step E: processing the two-dimensional figure with the interaction pen to form a three-dimensional figure;
Step F: moving the three-dimensional figure off the interface of the drawing software and into the three-dimensional space with the interaction pen;
Step G: displaying the information of the three-dimensional figure with the smart glasses.
Preferably, the method further comprises step H: the user moves and rotates the three-dimensional model in three-dimensional space with the fingers, so as to fit and assemble the three-dimensional figure with other drawn 3D models.
Preferably, before step H, a binocular stereo vision algorithm is used by the finger recognition unit to identify the position of the user's fingers in the three-dimensional space of the smart glasses.
Preferably, in step A, determining the working plane of the smart glasses with the plane recognition unit comprises the following steps:
Step A1: the infrared matrix-grid LED produces an infrared matrix grid;
Step A2: the first and second cameras detect the infrared light scattered by the plane in front of the wearer onto which the matrix grid is projected;
Step A3: the first processor analyzes the scattered infrared light and derives the distance and angle of the plane relative to the smart glasses, thereby determining the working plane.
Preferably, in step C, tracking the head position of the wearer with the positioning and tracking unit and adjusting in real time the image the glasses superimpose on the working plane, so that the image remains stationary relative to the wearer, specifically comprises the following steps:
Step C1: tracking the head position of the wearer, recorded as a_x, a_y, a_z and ω_x, ω_y, ω_z, where a_x, a_y, a_z are the accelerations of the head along the X, Y, Z coordinate axes and ω_x, ω_y, ω_z are the angular velocities of the head about the X, Y, Z coordinate axes;
Step C2: integrating a_x, a_y, a_z twice over time to obtain the displacement of the head along the three coordinate axes over the time window, recorded as Δx = ∬a_x dt², Δy = ∬a_y dt² and Δz = ∬a_z dt²;
Step C3: integrating ω_x, ω_y, ω_z over time to obtain the angle through which the head rotates about the three coordinate axes over the time window, recorded as Δθ_x = ∫ω_x dt, Δθ_y = ∫ω_y dt and Δθ_z = ∫ω_z dt;
Step C4: obtaining the transformation matrix M of the camera coordinate system from (Δx, Δy, Δz) and (Δθ_x, Δθ_y, Δθ_z);
Step C5: obtaining the corresponding model transformation matrix M′ from the duality of model transformation and view transformation;
Step C6: obtaining, from M and M′, the new coordinate X′ of the three-dimensional model after the head movement;
Step C7: adjusting, based on X′, the image the smart glasses superimpose on the working plane in real time, so that the image remains stationary relative to the wearer.
Preferably, before step D, the coordinate of the nib is determined using a binocular stereo vision algorithm.
Preferably, in step E, processing the two-dimensional figure with the interaction pen to form a three-dimensional figure consists of stretching and rotating the two-dimensional figure with the pen.
In the three-dimensional drawing system and method based on human-computer interaction provided by the invention, the plane recognition unit on the smart glasses determines the working plane of the glasses; the positioning and tracking unit tracks the head position of the wearer and adjusts the superimposed image in real time so that it remains stationary relative to the wearer; a two-dimensional figure is drawn on the interface of the drawing software with the interaction pen and processed into a three-dimensional figure; the pen then moves the three-dimensional figure off the interface into three-dimensional space, where the glasses display its information. The system and method allow the drawing software to be placed on any plane, so 3D modeling and rendering can be performed anytime, anywhere, which greatly increases the freedom of drawing work: the user no longer has to work in front of a two-dimensional display, which is convenient and reliable. Because smart glasses serve as the display terminal, the drawn three-dimensional model is shown in the user's field of view with a 3D effect, with good visual quality. In addition, the system and method use only the interaction pen for two-dimensional drawing, completely abandoning the current mouse-based way of drawing; this better matches human intuition, and the designer's drawing intent is better expressed through the pen.
Furthermore, with the system and method of the invention the user can move and rotate a three-dimensional model in three-dimensional space with the fingers, so as to fit and assemble the drawn figure with other 3D models simply and easily.
Brief description of the drawings
Fig. 1 is a structural diagram of the smart glasses provided by the invention;
Fig. 2 is a diagram of how the infrared matrix grid is produced;
Fig. 3 is a diagram of the infrared matrix grid projected by the smart glasses;
Fig. 4 is a structural diagram of the interaction pen provided by the invention;
Fig. 5 is a flow chart of the steps of a three-dimensional drawing method based on human-computer interaction provided by the invention;
Fig. 6 is a flow chart of tracking the head position of the wearer with the positioning and tracking unit and adjusting the superimposed image in real time so that it remains stationary relative to the wearer;
Fig. 7 is a diagram of the coordinate system of the positioning and tracking unit;
Fig. 8 is a model diagram of the interaction pen used on a plane;
Fig. 9 shows a two-dimensional rectangle drawn with the interaction pen;
Fig. 10 shows the drawn two-dimensional rectangle stretched into a cuboid with the interaction pen;
Fig. 11 shows a three-dimensional figure obtained by moving and rotating a three-dimensional model in three-dimensional space with the fingers;
Fig. 12 illustrates the principle of identifying the position of the user's fingers in the three-dimensional space of the smart glasses with the finger recognition unit, using a binocular stereo vision algorithm;
Fig. 13 is a diagram of the recognition range of the finger recognition unit in the smart glasses.
Detailed description of the embodiments
To make the objects, technical solutions and advantages of the present invention clearer, the invention is further elaborated below with reference to the drawings and embodiments. It should be understood that the specific embodiments described here merely illustrate the invention and are not intended to limit it.
A three-dimensional drawing system based on human-computer interaction provided by the invention comprises smart glasses 110, an interaction pen 120 and a host 130.
Referring to Fig. 1, a structural diagram of the smart glasses: the smart glasses 110 display image information in three-dimensional space. They are see-through smart glasses: once they are put on, ambient visible light passes through the lenses and the wearer's eyes see the surrounding environment normally; at the same time, the glasses can superimpose image information onto the wearer's real field of view. With the glasses on, the wearer sees, in addition to the surroundings, the image displayed by the glasses, which is equivalent to watching a screen of a certain size at a certain distance from the eyes. The smart glasses 110 comprise a plane recognition unit and a positioning and tracking unit.
The plane recognition unit determines the working plane of the smart glasses. It comprises a first camera 111, a second camera 112, an infrared matrix-grid LED 113 and a first processor (not shown).
The first camera 111 and second camera 112 are arranged symmetrically at the left and right ends of the smart glasses 110.
The infrared matrix-grid LED 113 is arranged at the center of the smart glasses 110 and produces an infrared matrix grid. Preferably, it comprises a baffle 115 in which grid-shaped slots are cut and an infrared LED 116 placed directly against the baffle; the infrared light emitted by the LED 116 forms an infrared matrix grid after passing through the baffle 115. Refer to Fig. 2, which shows how the infrared matrix grid is produced, and Fig. 3, which shows the grid projected by the glasses.
The first processor is built into the smart glasses 110. The cameras 111 and 112 detect the infrared light scattered by the plane in front of the wearer onto which the matrix grid is projected; the first processor analyzes the scattered light and derives the distance and angle of that plane relative to the glasses 110, thereby determining the working plane.
It will be appreciated that, in the glasses coordinate system OXYZ, the infrared matrix-grid LED 113 produces the infrared matrix grid; when the grid is projected onto a plane in front of the wearer, perspective deforms the grid lines. Meanwhile, the infrared light scattered by the plane is received by the cameras 111 and 112 at the two ends of the glasses; by analyzing the scattered light, the angle and distance of the glasses relative to the projection plane can be computed, which determines the working plane of the glasses 110.
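The patent states only that the first processor "analyzes the scattered infrared light" to obtain the plane's distance and angle. One way to make that concrete: once the grid intersection points have been recovered in the glasses coordinate system OXYZ by the two cameras, a plane can be fitted to them and its distance and tilt read off. The following Python sketch illustrates that idea under these assumptions; the Newell-style normal fit and all names are illustrative, not the patented algorithm.

```python
import math

def fit_plane(points):
    """Fit a plane to ordered boundary points of the projected grid
    (Newell's method) and return (unit_normal, centroid)."""
    n = len(points)
    cx = sum(p[0] for p in points) / n
    cy = sum(p[1] for p in points) / n
    cz = sum(p[2] for p in points) / n
    nx = ny = nz = 0.0
    for i in range(n):
        x1, y1, z1 = points[i]
        x2, y2, z2 = points[(i + 1) % n]
        nx += (y1 - y2) * (z1 + z2)
        ny += (z1 - z2) * (x1 + x2)
        nz += (x1 - x2) * (y1 + y2)
    norm = math.sqrt(nx * nx + ny * ny + nz * nz)
    return (nx / norm, ny / norm, nz / norm), (cx, cy, cz)

def plane_distance_and_tilt(points):
    """Distance from the glasses origin O to the fitted plane, and the
    tilt angle (degrees) between the viewing axis (0, 0, 1) and the
    plane normal."""
    (nx, ny, nz), (cx, cy, cz) = fit_plane(points)
    distance = abs(nx * cx + ny * cy + nz * cz)  # |n . centroid|, n is unit
    tilt = math.degrees(math.acos(abs(nz)))      # angle versus the Z axis
    return distance, tilt
```

For a plane facing the wearer squarely at depth 2, this returns distance 2 and tilt 0.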
The positioning and tracking unit tracks the head position of the wearer of the smart glasses 110 and adjusts, in real time according to that position, the image the glasses superimpose on the working plane, so that the image remains stationary relative to the wearer. Preferably, the unit comprises a three-axis acceleration sensor (not shown) and a three-axis gyroscope (not shown): the acceleration sensor tracks the movement of the head along the X, Y, Z coordinate axes, and the gyroscope tracks its rotation about the X, Y, Z coordinate axes.
The smart glasses 110 may also comprise a finger recognition unit formed by a first infrared LED 114 together with the first camera 111 and second camera 112. The first infrared LED 114 is arranged at the center of the glasses 110, above the infrared matrix-grid LED 113. The finger recognition unit identifies the position of the wearer's fingers in three-dimensional space.
Referring to Fig. 4, a structural diagram of the interaction pen: the interaction pen 120 is signal-connected to the smart glasses 110 and is used to draw two-dimensional figures. A Bluetooth module (not shown) may be provided on the pen 120 to communicate with a Bluetooth module on the glasses 110.
The interaction pen 120 comprises a nib 121, a first infrared emission point 122, a second infrared emission point 123 and a processor (not shown). The two emission points 122 and 123 are located at the two ends of the pen 120, and from them the coordinate of the nib 121 can be determined. The processor is arranged inside the pen 120 and controls its overall operation.
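The patent says only that the nib coordinate "can be determined" from the two emission points. One straightforward reading: the two emitters fix the pen's axis in space, so the tip lies a known, fixed distance beyond the front emitter along that axis. A minimal sketch under that assumption; the function name and the calibration constant `nib_offset` are invented for illustration.

```python
import math

def nib_position(front, back, nib_offset):
    """Extrapolate the pen-tip coordinate from the two tracked
    infrared emission points. `front` is the 3D position of the
    emitter nearer the nib, `back` the one at the tail, and
    `nib_offset` the calibrated distance from the front emitter
    to the physical tip, measured along the pen axis."""
    axis = tuple(f - b for f, b in zip(front, back))    # tail -> tip direction
    length = math.sqrt(sum(c * c for c in axis))
    unit = tuple(c / length for c in axis)              # unit axis vector
    return tuple(f + nib_offset * u for f, u in zip(front, unit))
```

With the emitters at (0, 0, 0) and (0, 0, 10) and a 2-unit offset, the tip comes out at (0, 0, 12).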
The host 130 is signal-connected to the smart glasses 110 and is a computer or a cloud service on a network. The connection can be made through a WiFi module on the glasses 110, realizing the signal connection between the host 130 and the glasses 110.
The host 130 is provided with three-dimensional drawing software and transmits its interface to the smart glasses 110 by wireless signal; the glasses 110 display the interface on the working plane. The interaction pen 120 can draw a two-dimensional figure on the interface and then process it into a three-dimensional figure, whose information the glasses 110 then display.
Referring to Fig. 5, a three-dimensional drawing method 200 based on human-computer interaction provided by the invention comprises the following steps:
Step A: determining the working plane of the smart glasses 110 with the plane recognition unit.
Preferably, this comprises the following steps:
Step A1: the infrared matrix-grid LED 113 produces an infrared matrix grid;
Step A2: the first camera 111 and second camera 112 detect the infrared light scattered by the plane in front of the wearer onto which the matrix grid is projected;
Step A3: the first processor analyzes the scattered infrared light and derives the distance and angle of the plane relative to the glasses 110, thereby determining the working plane.
It will be appreciated that, in the glasses coordinate system OXYZ, when the infrared matrix grid is projected onto a plane in front of the wearer, perspective deforms the grid lines; meanwhile the scattered infrared light is received by the cameras 111 and 112 at the two ends of the glasses, and by analyzing it the angle and distance of the glasses relative to the projection plane, and hence the working plane of the glasses 110, can be computed.
Step B: superimposing the displayed image on the working plane with the smart glasses 110.
Step C:By the wearer's head position of locating and tracking element keeps track intelligent glasses 110, and according to head position Adjustment intelligent glasses 110 are superimposed upon the image of working face in real time, the wearer is kept geo-stationary with image;
Referring to Fig. 6, in step C, by the wearer's head position of intelligent glasses described in locating and tracking element keeps track, And the image that the intelligent glasses are superimposed upon the working face is adjusted according to the head position in real time, make the wearer with Described image keeps geo-stationary, specifically includes following step:
Step C1:The wearer's head position of intelligent glasses 110 is tracked, is designated as ax,ay,azAnd ωxyz, it is described ax,ay,azThe acceleration moved for the head of the wearer on X, Y, Z coordinate direction of principal axis, the ωxyzTo be described The angular speed that the head of wearer rotates on X, Y, Z coordinate direction of principal axis;
It is appreciated that the head that 3-axis acceleration sensor is used to track the wearer is in X, Y, Z coordinate direction of principal axis Movement, be designated as ax,ay,az;The head that the three-axis gyroscope is used to track the wearer is in X, Y, Z coordinate direction of principal axis Rotation, be designated as ωxyz
Step C2:Respectively to ax,ay,azThe double integral for carrying out the time obtains the wearer's head in the time In the range of the displacement that occurs on three change in coordinate axis direction, and be designated as:△x=∫∫axdt、△y=∫∫ayDt and △ z=∫ ∫ azdt;
Step C3:Respectively to the ωxyzThe multiple integral for carrying out the time obtains the wearer's head at this Between in the range of the angle that is rotated around three change in coordinate axis direction, and be designated as:△θx=∫ωxdt、△θy=∫ωyDt and △ θz=∫ ωzdt;
Step C4:According to (△ x, the △ y, △ z), (△ θx,△θy,△θz) obtain the transformation matrix of camera coordinates system M;
Step C5:According to model conversion and the duality of view transformation, corresponding model transformation matrix M ' is obtained;
Step C6:According to the matrix M and matrix M ', the wearer's head post exercise threedimensional model is obtained New coordinate X ';
Step C7:The image that the intelligent glasses are superimposed upon the working face is adjusted based on the X ' in real time, made described Wearer keeps geo-stationary with described image.
Referring to Fig. 7, Fig. 7 is a schematic diagram of the coordinate system of the locating and tracking unit provided by the present invention. It can be understood that the coordinate system of the locating and tracking unit is at the same time the camera coordinate system. Integrating the values output in the locating and tracking coordinate system over time yields the displacement (Δx, Δy, Δz) of the wearer's head and the angles (Δθ_x, Δθ_y, Δθ_z) through which it has rotated. From these six values the transformation matrix M of the camera coordinate system can be obtained, and because of the duality of the model transformation and the view transformation, the corresponding model transformation matrix M′ can be solved. Multiplying the homogeneous coordinates of the current three-dimensional model by the model transformation matrix M′ then gives the new coordinates X′ of the three-dimensional model after the wearer's head has moved. After the three-dimensional model is re-projected in perspective, the view of the three-dimensional model that the intelligent glasses should display at the current moment is obtained. In this way, after the head of the wearer of the intelligent glasses moves, the angle at which the three-dimensional model is displayed in the intelligent glasses is adjusted in real time, so that the three-dimensional model appears to remain stationary within the wearer's field of view.
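As an illustrative sketch (not part of the patent), the time integration of steps C2 and C3 and the view/model duality of step C5 might look as follows; the sampling period `DT`, the array shapes and the function names are assumptions introduced here for illustration, and the rectangle-rule integration is a deliberate simplification:

```python
import numpy as np

DT = 0.01  # assumed sensor sampling period in seconds (hypothetical)

def integrate_motion(accel, gyro, dt=DT):
    """Steps C2 and C3: double-integrate the accelerations to get the
    head displacement (Dx, Dy, Dz) and integrate the angular velocities
    once to get the rotation angles (Dtheta_x, Dtheta_y, Dtheta_z).

    accel: (N, 3) array of (a_x, a_y, a_z) samples
    gyro:  (N, 3) array of (w_x, w_y, w_z) samples
    """
    accel = np.asarray(accel, dtype=float)
    gyro = np.asarray(gyro, dtype=float)
    velocity = np.cumsum(accel, axis=0) * dt      # v = integral of a dt
    displacement = np.sum(velocity, axis=0) * dt  # Dx = integral of v dt
    angles = np.sum(gyro, axis=0) * dt            # Dtheta = integral of w dt
    return displacement, angles

def view_to_model_matrix(view_matrix):
    """Step C5: by the duality of the view and model transformations,
    the model transformation M' is the inverse of the camera (view)
    transformation M."""
    return np.linalg.inv(np.asarray(view_matrix, dtype=float))
```

A real implementation would also need gravity compensation and drift correction before integrating raw accelerometer data; this sketch only mirrors the integrals named in the text.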
Step D: Open the three-dimensional drawing software on the host 130; the host 130 transmits the interface of the three-dimensional drawing software to the intelligent glasses 110 by wireless signal, the intelligent glasses 110 display the interface of the three-dimensional drawing software on the working plane, and a two-dimensional figure is drawn on the interface of the three-dimensional drawing software with the interaction pen 120;
Preferably, before step D is completed, the method also includes the step of determining the coordinates of the pen tip using a binocular stereo vision algorithm.
Referring to Fig. 8, Fig. 8 is the model diagram used when the interaction pen is on the plane. In the camera of the intelligent glasses 110, the interaction pen 120 appears as its two infrared emitting points. The two infrared emitting points are therefore represented by the points P1 and P2, so that the straight line P1P2 can directly represent the interaction pen.
Because the distance and angle of the intelligent glasses relative to the plane ABCD are known, the equation of the plane ABCD can be found. In Fig. 8, in the intelligent-glasses coordinate system OXYZ, given the normal vector n = (n0, n1, n2) of the plane ABCD and a point R(x_r, y_r, z_r) on the plane, the equation of the plane ABCD is:
n0(x − x_r) + n1(y − y_r) + n2(z − z_r) = 0
Meanwhile using the principle of binocular stereo vision, can be in the hope of two infrared emission points in intelligent glasses coordinate system Coordinate be respectively P1(x1,y1,z1)、P2(x2,y2,z2), then represent the straight line P of interaction pen1P2Equation be:
Wherein, (l, m, n) is straight line P1P2Direction vector;Have simultaneously:
Because plane ABCD and straight line P1P2Intersection point P0I.e. in plane ABCD, also in straight line P1P2On, therefore only need to join Vertical plane equation and linear equation, can obtain P0Coordinate P0(x0,y0,z0):
P0 is at the same time the intersection point of the interaction pen and the plane ABCD; therefore, once the coordinates of P0 are known, the intelligent glasses can draw the corresponding track according to the trajectory of the point P0, thereby realizing the interaction between the interaction pen and the intelligent glasses.
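The simultaneous solution of the plane and line equations above can be sketched as follows; this is an illustration only, and the function name and argument conventions are assumptions, not part of the patent:

```python
import numpy as np

def pen_plane_intersection(p1, p2, normal, r):
    """Intersect the pen axis (the line through the infrared emitters
    P1 and P2) with the working plane ABCD given by its normal vector n
    and a point R on the plane, yielding the contact point P0."""
    p1, p2 = np.asarray(p1, float), np.asarray(p2, float)
    n, r = np.asarray(normal, float), np.asarray(r, float)
    direction = p2 - p1                  # (l, m, n) direction vector
    denom = np.dot(n, direction)
    if abs(denom) < 1e-9:
        return None                      # pen is parallel to the plane
    t = np.dot(n, r - p1) / denom        # from n . (P1 + t*dir - R) = 0
    return p1 + t * direction            # P0
```

For example, a pen pointing straight down at the plane z = 0 intersects it directly below the lower emitter.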
When the interaction pen needs to be used to stretch the drawn two-dimensional sketch, the interaction pen is held in three-dimensional space, and the three-dimensional coordinates of the pen tip must now be found; this is similar to finding the pen-tip coordinates on the plane. Referring again to Fig. 8, suppose that P0(x0, y0, z0) is now a point on the tip of the interaction pen, and that the distances P0P1 and P1P2 are also known, say l1 and l2 respectively. From the above, the equation of the straight line P1P2 is:
(x − x1)/l = (y − y1)/m = (z − z1)/n
where (l, m, n) is the direction vector of the straight line P1P2.
The coordinates of the point P0(x0, y0, z0) can then be found from the spatial analytic-geometry relations: with P1 taken as the emitter nearer the tip, P0 lies on the extension of P2P1 at the known distance l1 from P1, so that:
x0 = x1 + (l1/l2)(x1 − x2), y0 = y1 + (l1/l2)(y1 − y2), z0 = z1 + (l1/l2)(z1 − z2)
Solving these relations simultaneously yields the coordinates of the point P0(x0, y0, z0).
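A minimal sketch of this extrapolation, under the assumption stated above that P1 is the emitter nearer the tip (the function name and argument order are illustrative, not from the patent):

```python
import numpy as np

def pen_tip_in_space(p1, p2, l1):
    """Locate the pen tip P0 in free space from the two emitter
    coordinates P1 and P2: the tip lies on the extension of P2->P1
    at the known distance l1 from P1."""
    p1, p2 = np.asarray(p1, float), np.asarray(p2, float)
    axis = p1 - p2
    unit = axis / np.linalg.norm(axis)   # unit vector along the pen, towards the tip
    return p1 + l1 * unit                # P0
```

Normalizing by the measured emitter separation (rather than the nominal l2) makes the result robust to small stereo errors in |P1P2|.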
Step E: Process the two-dimensional figure with the interaction pen 120 to form a three-dimensional figure;
Preferably, the two-dimensional figure is stretched, rotated and otherwise operated on with the interaction pen 120 to form a three-dimensional figure.
Specifically, suppose a cuboid first needs to be drawn. First select the draw-rectangle function on the interface of the three-dimensional drawing software of the host 130, and the user draws a rectangle diagonally with the interaction pen 120 in the drawing area of the three-dimensional drawing software. After the rectangle is drawn, select the rectangle with the tip 121 of the interaction pen 120, press the button on the interaction pen 120 and at the same time lift the interaction pen 120; a cuboid is then drawn, as shown in Fig. 9 and Fig. 10. Fig. 9 shows a two-dimensional rectangle drawn with the interaction pen, and Fig. 10 shows the drawn two-dimensional rectangle stretched into a cuboid with the interaction pen 120.
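The stretch operation described above amounts to extruding the rectangle along the normal of the working plane. A toy sketch (not the patent's implementation; the plane normal is taken as +Z purely for simplicity):

```python
def extrude_rectangle(corners2d, height):
    """Lift a 2-D rectangle drawn on the working plane into a cuboid by
    extruding its four corners along the plane normal (+Z here).

    corners2d: four (x, y) corner tuples
    height:    extrusion distance, e.g. how far the pen was lifted
    Returns the eight cuboid vertices, bottom face first.
    """
    bottom = [(x, y, 0.0) for x, y in corners2d]
    top = [(x, y, height) for x, y in corners2d]
    return bottom + top
```

In the system described here, `height` would come from the tracked pen-tip coordinate P0 as the pen is lifted off the plane.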
Step F: Control, with the interaction pen 120, the three-dimensional figure to leave the interface of the three-dimensional drawing software and enter three-dimensional space;
It can be understood that even when the drawn cuboid is in three-dimensional space, the interaction pen 120 can still be used to draw on the surface of the cuboid.
Step G: The intelligent glasses 110 display the information of the three-dimensional figure.
Preferably, the above three-dimensional drawing method 200 based on man-machine interaction also includes step H: the user moves, rotates and otherwise operates on the three-dimensional graphical model in three-dimensional space with the fingers, realizing cooperation and assembly between this three-dimensional figure and the 3D models drawn by others. Fig. 11 shows the three-dimensional figure obtained in the present invention by moving and rotating the three-dimensional graphical model in three-dimensional space with the fingers.
Preferably, before step H is completed, the method also includes the step of identifying, with the finger recognition unit and using a binocular stereo vision algorithm, the position of the user's fingers in the three-dimensional space of the intelligent glasses.
Figure 12 is referred to, to identify user's finger in intelligence by finger recognition unit using binocular stereo vision algorithm Position general principle figure in glasses three dimensions.In figure, l1、l2Respectively two cameras being placed in parallel, in the present invention In, l1、l2The first camera, second camera are represented respectively.B represents the distance between two camera photocentres, and f is camera Focal length.P (x, y, z) is the certain point in space, and (x, y, z) is its three-dimensional coordinate, P1(x1,y1)、P2(x2,y2) it is respectively P Picpointed coordinate o'clock in two camera image planes, the parallax of wherein P points is d=x1-x2, can be calculated by geometrical relationship The depth of P points is:
Thus the Z coordinate of the point P in the intelligent-glasses coordinate system can be calculated with the stereo vision algorithm, while the X and Y coordinates of the point P can be read directly from the camera coordinate system. After the binocular stereo vision technique is applied, the coordinates of a spatial point in the three-dimensional coordinate system of the cameras can therefore be found.
In actual use, the first infrared LED 114 on the intelligent glasses 110 emits infrared light; when the infrared light meets the user's fingers it is scattered by them, and part of the scattered infrared light enters the cameras at the two ends of the intelligent glasses 110. The cameras output two scatter images, from which the X and Y coordinates of the fingertip in the image frame can be read directly; meanwhile, after the binocular stereo vision algorithm is applied, the Z coordinate of the fingertip can be calculated.
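The parallax-to-depth relation above is a one-liner; the sketch below (illustrative only, with assumed units where the baseline is in metres and the focal length and image coordinates are in pixels) shows it together with the guard for zero parallax:

```python
def stereo_depth(x1, x2, baseline, focal_length):
    """Depth of a point from its parallax in a parallel binocular rig:
    d = x1 - x2 (left minus right image x-coordinate), z = b * f / d."""
    d = x1 - x2
    if d == 0:
        raise ValueError("zero parallax: point at infinity or mismatched features")
    return baseline * focal_length / d
```

With a 0.1 m baseline and f = 500 px, a 1 px parallax corresponds to a depth of about 50 m, which is why small baselines limit useful range to near-field interaction such as fingertips.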
Referring to Fig. 13, Fig. 13 is a schematic diagram of the identification range of the finger recognition unit in the intelligent glasses 110; the three-dimensional region indicated by the dotted line in the figure is the identification range.
In the three-dimensional drawing system and method based on man-machine interaction provided by the present invention, the plane recognition unit on the intelligent glasses determines the working plane of the intelligent glasses; the locating and tracking unit on the glasses tracks the head position of the wearer of the intelligent glasses and adjusts in real time, according to that head position, the image that the intelligent glasses superimpose on the working plane, so that the wearer and the image remain relatively stationary; a two-dimensional figure is then drawn on the interface of the three-dimensional drawing software with the interaction pen and processed to form a three-dimensional figure; the interaction pen controls the three-dimensional figure to leave the interface of the three-dimensional drawing software and enter the three-dimensional space; and the information of the three-dimensional figure is then displayed through the intelligent glasses. With the drawing system and method provided by the present invention, the three-dimensional drawing software can be placed on any plane, so that 3D modelling and rendering can be carried out at any time and in any place, greatly enhancing the freedom of drawing work: the user no longer needs to work in front of a two-dimensional display all the time, which is convenient and reliable. At the same time, because the present invention uses intelligent glasses as the display terminal, the three-dimensional model drawn with the present invention can be shown in the user's field of view with a 3D effect, giving good visual quality. In addition, the system and method provided by the present invention use only the interaction pen for two-dimensional drawing, completely abandoning the current practice of drawing with a mouse; this better matches people's direct intuition, and the designer's drawing intent is better expressed through the interaction pen.
In addition, with the three-dimensional drawing system and method based on man-machine interaction provided by the present invention, the user can move, rotate and otherwise operate on the three-dimensional graphical model in three-dimensional space with the fingers, realizing cooperation and assembly between this three-dimensional figure and the 3D models drawn by others, which is simple and easy.
It can be understood that a person of ordinary skill in the art can make various other corresponding changes and variations according to the technical concept of the present invention, and all such changes and variations shall fall within the protection scope of the claims of the present invention.

Claims (14)

  1. A three-dimensional drawing system based on man-machine interaction, characterized in that it comprises:
    intelligent glasses, for displaying image information in three-dimensional space, including:
    a plane recognition unit, for determining the working plane of the intelligent glasses;
    a locating and tracking unit, for tracking the head position of the wearer of the intelligent glasses and, according to that head position, adjusting in real time the image that the intelligent glasses superimpose on the working plane, so that the wearer and the image remain relatively stationary;
    an interaction pen, signal-connected to the intelligent glasses, for drawing two-dimensional figures;
    a host, signal-connected to the intelligent glasses, the host being provided with three-dimensional drawing software; the host transmits the interface of the three-dimensional drawing software to the intelligent glasses by wireless signal, the intelligent glasses display the interface of the three-dimensional drawing software on the working plane, the interaction pen draws a two-dimensional figure on the interface of the three-dimensional drawing software, and the interaction pen is also used to process the two-dimensional figure to form a three-dimensional figure;
    the interaction pen is also used to control the three-dimensional figure to leave the interface of the three-dimensional drawing software and enter the three-dimensional space, and the information of the three-dimensional figure is then displayed through the intelligent glasses.
  2. The three-dimensional drawing system based on man-machine interaction according to claim 1, characterized in that: the plane recognition unit includes a first camera, a second camera, an infrared matrix-grid LED and a first processor; the first camera and the second camera are symmetrically arranged at the left and right ends of the intelligent glasses; the infrared matrix-grid LED is arranged at the centre of the intelligent glasses and is used to produce an infrared matrix grid; the first processor is built into the interior of the intelligent glasses; the first camera and the second camera are used to identify the infrared light scattered on the plane in front of the wearer of the intelligent glasses onto which the infrared matrix-grid LED projects; and the first processor is used to analyse the infrared light and derive the distance and angle of the plane relative to the intelligent glasses, thereby determining the working plane of the intelligent glasses.
  3. The three-dimensional drawing system based on man-machine interaction according to claim 2, characterized in that: the infrared matrix-grid LED includes a baffle provided with grid grooves and an infrared LED directly in front of the baffle; the infrared light produced by the infrared LED forms an infrared matrix grid after passing through the baffle.
  4. The three-dimensional drawing system based on man-machine interaction according to claim 1, characterized in that: the locating and tracking unit includes a three-axis acceleration sensor and a three-axis gyroscope; the three-axis acceleration sensor is used to track the movement of the wearer's head along the X, Y and Z coordinate axes, and the three-axis gyroscope is used to track the rotation of the wearer's head about the X, Y and Z coordinate axes.
  5. The three-dimensional drawing system based on man-machine interaction according to claim 2, characterized in that: the intelligent glasses also include a first infrared LED, arranged at the centre of the intelligent glasses above the infrared matrix-grid LED; the first camera, the second camera and the first infrared LED form a finger recognition unit, which is used to identify the position of the wearer's fingers in the three-dimensional space.
  6. The three-dimensional drawing system based on man-machine interaction according to claim 1, characterized in that: the interaction pen includes a pen tip, a first infrared emitting point, a second infrared emitting point and a processor; the first infrared emitting point and the second infrared emitting point are located at the two ends of the interaction pen respectively, and the coordinates of the pen tip can be determined from the first infrared emitting point and the second infrared emitting point; the processor is arranged inside the interaction pen and is used to control the overall operation of the interaction pen.
  7. The three-dimensional drawing system based on man-machine interaction according to claim 1, characterized in that: the host is a computer or a cloud on a network.
  8. A three-dimensional drawing method based on man-machine interaction, characterized in that it comprises the following steps:
    step A: determining the working plane of intelligent glasses with a plane recognition unit;
    step B: superimposing, by the intelligent glasses, the displayed image frame on the working plane;
    step C: tracking the head position of the wearer of the intelligent glasses with a locating and tracking unit and, according to that head position, adjusting in real time the image that the intelligent glasses superimpose on the working plane, so that the wearer and the image remain relatively stationary;
    step D: opening the three-dimensional drawing software on a host, the host transmitting the interface of the three-dimensional drawing software to the intelligent glasses by wireless signal, the intelligent glasses displaying the interface of the three-dimensional drawing software on the working plane, and drawing a two-dimensional figure on the interface of the three-dimensional drawing software with an interaction pen;
    step E: processing the two-dimensional figure with the interaction pen to form a three-dimensional figure;
    step F: controlling, with the interaction pen, the three-dimensional figure to leave the interface of the three-dimensional drawing software and enter three-dimensional space;
    step G: displaying, by the intelligent glasses, the information of the three-dimensional figure.
  9. The three-dimensional drawing method based on man-machine interaction according to claim 8, characterized in that it also includes step H: the user moves, rotates and otherwise operates on the three-dimensional graphical model in three-dimensional space with the fingers, realizing cooperation and assembly between this three-dimensional figure and the 3D models drawn by others.
  10. The three-dimensional drawing method based on man-machine interaction according to claim 9, characterized in that before step H is completed it also includes the step of identifying, with a finger recognition unit and using a binocular stereo vision algorithm, the position of the user's fingers in the three-dimensional space of the intelligent glasses.
  11. The three-dimensional drawing method based on man-machine interaction according to claim 8, characterized in that in step A, determining the working plane of the intelligent glasses with the plane recognition unit comprises the following steps:
    step A1: producing an infrared matrix grid with an infrared matrix-grid LED;
    step A2: identifying, with a first camera and a second camera, the infrared light scattered on the plane in front of the wearer of the intelligent glasses onto which the infrared matrix-grid LED projects;
    step A3: analysing the scattered infrared light with a first processor and deriving the distance and angle of the plane relative to the intelligent glasses, thereby determining the working plane of the intelligent glasses.
  12. The three-dimensional drawing method based on man-machine interaction according to claim 8, characterized in that in step C, tracking the head position of the wearer of the intelligent glasses with the locating and tracking unit and adjusting in real time, according to that head position, the image that the intelligent glasses superimpose on the working plane, so that the wearer and the image remain relatively stationary, specifically comprises the following steps:
    step C1: tracking the head position of the wearer of the intelligent glasses, recorded as a_x, a_y, a_z and ω_x, ω_y, ω_z, where a_x, a_y, a_z are the accelerations of the wearer's head along the X, Y and Z coordinate axes and ω_x, ω_y, ω_z are the angular velocities of the wearer's head about the X, Y and Z coordinate axes;
    step C2: integrating a_x, a_y, a_z twice with respect to time to obtain the displacements of the wearer's head along the three coordinate axes within the time interval, recorded as Δx = ∬a_x dt², Δy = ∬a_y dt² and Δz = ∬a_z dt²;
    step C3: integrating ω_x, ω_y, ω_z with respect to time to obtain the angles through which the wearer's head rotates about the three coordinate axes within the time interval, recorded as Δθ_x = ∫ω_x dt, Δθ_y = ∫ω_y dt and Δθ_z = ∫ω_z dt;
    step C4: obtaining the transformation matrix M of the camera coordinate system from Δx, Δy, Δz, Δθ_x, Δθ_y and Δθ_z;
    step C5: obtaining the corresponding model transformation matrix M′ from the duality of the model transformation and the view transformation;
    step C6: obtaining, from the matrices M and M′, the new coordinates X′ of the three-dimensional model after the motion of the wearer's head;
    step C7: adjusting in real time, on the basis of X′, the image that the intelligent glasses superimpose on the working plane, so that the wearer and the image remain relatively stationary.
  13. The three-dimensional drawing method based on man-machine interaction according to claim 8, characterized in that before step D is completed it also includes the step of determining the coordinates of the pen tip using a binocular stereo vision algorithm.
  14. The three-dimensional drawing method based on man-machine interaction according to claim 8, characterized in that in step E, processing the two-dimensional figure with the interaction pen to form a three-dimensional figure specifically means: stretching and rotating the two-dimensional figure with the interaction pen to form a three-dimensional figure.
CN201310549711.XA 2013-11-07 2013-11-07 A kind of three-dimensional drawing system and method based on man-machine interaction Active CN104637080B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201310549711.XA CN104637080B (en) 2013-11-07 2013-11-07 A kind of three-dimensional drawing system and method based on man-machine interaction


Publications (2)

Publication Number Publication Date
CN104637080A CN104637080A (en) 2015-05-20
CN104637080B true CN104637080B (en) 2017-12-19

Family

ID=53215785



Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105892869A (en) * 2016-04-28 2016-08-24 北京小米移动软件有限公司 Image position adjusting method and device
CN106096912B (en) * 2016-06-03 2020-07-28 广州视源电子科技股份有限公司 Face recognition method of intelligent glasses and intelligent glasses
CN106327582A (en) * 2016-08-22 2017-01-11 深圳马顿科技有限公司 Pattern drawing system of 3D model
CN106774992A (en) * 2016-12-16 2017-05-31 深圳市虚拟现实技术有限公司 The point recognition methods of virtual reality space location feature
CN109186549A (en) * 2018-10-26 2019-01-11 国网黑龙江省电力有限公司电力科学研究院 A kind of Iron tower incline angle measurement method of view-based access control model
KR102051889B1 (en) * 2018-12-05 2019-12-06 주식회사 증강지능 Method and system for implementing 3d augmented reality based on 2d data in smart glass
CN116958332B (en) * 2023-09-20 2023-12-22 南京竹影数字科技有限公司 Method and system for mapping 3D model in real time of paper drawing based on image recognition

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101833385A (en) * 2009-08-21 2010-09-15 深圳先进技术研究院 Remote control interactive pen and receiver thereof
CN102945564A (en) * 2012-10-16 2013-02-27 上海大学 True 3D modeling system and method based on video perspective type augmented reality

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130215132A1 (en) * 2012-02-22 2013-08-22 Ming Fong System for reproducing virtual objects


Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
DAB: Interactive Haptic Painting with 3D Virtual Brushes; Bill Baxter et al.; ACM SIGGRAPH 2001 Video Review, Animation Theater Program; 2001-12-31; pp. 1-8 *
Three-dimensional interactive modeling based on augmented reality technology; Hu Qingxi et al.; Computer Engineering; July 2010; Vol. 36, No. 13; pp. 236-238 *
A virtual painting studio based on gesture control; Wang Jiangchun et al.; Journal of System Simulation; January 2006; Vol. 18, No. 1; pp. 243-247 *


