CN102169364B - Interaction module applied to stereoscopic interaction system and method of interaction module - Google Patents

Interaction module applied to stereoscopic interaction system and method of interaction module

Info

Publication number
CN102169364B
CN102169364B, CN102169364A, CN201010122713A, CN201010122713
Authority
CN
China
Prior art keywords
coordinate
dimensional
eyes
interactive
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN 201010122713
Other languages
Chinese (zh)
Other versions
CN102169364A (en)
Inventor
赵子毅
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Pixart Imaging Inc
Original Assignee
Pixart Imaging Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Pixart Imaging Inc
Priority to CN201010122713A
Publication of CN102169364A
Application granted
Publication of CN102169364B
Expired - Fee Related
Anticipated expiration

Landscapes

  • Processing Or Creating Images (AREA)

Abstract

The invention relates to an interaction module applied to a stereoscopic interaction system, and to a method for the interaction module. The module and method correct the position of an interaction component according to the position of the user, or correct the position of a virtual object in a stereoscopic image together with its interaction judgment condition. Even when a change in the user's position shifts where the user perceives the virtual object in the stereoscopic image, the stereoscopic interaction system can still obtain an accurate interaction result, based either on the corrected position of the interaction component or on the corrected position of the virtual object and the corrected interaction judgment condition.

Description

Interaction module applied to a stereoscopic interaction system and method thereof
Technical field
The present invention relates to a stereoscopic interaction system, and more particularly to a stereoscopic interaction system that uses a three-dimensional display system to provide interaction.
Background
In the prior art, a three-dimensional display system is used to provide a stereoscopic image. As shown in Fig. 1, three-dimensional display systems can be divided into naked-eye three-dimensional display systems and glasses-type stereoscopic display systems. For example, the naked-eye three-dimensional display system 110 on the left side of Fig. 1 uses light splitting to provide different images at different angles (such as the images DIM_θ1 to DIM_θ8 in Fig. 1). Because the user's two eyes are located at different angles, the user receives the left image DIM_L (image DIM_θ4) and the right image DIM_R (image DIM_θ5) respectively, and thereby perceives the stereoscopic image provided by the naked-eye three-dimensional display system 110. The glasses-type stereoscopic display system 120 on the right side of Fig. 1 comprises a display screen 121 and auxiliary glasses 122. The display screen 121 provides the left image DIM_L and the right image DIM_R; the auxiliary glasses 122 help the user's eyes receive the left image DIM_L and the right image DIM_R respectively, so that the user can perceive the stereoscopic image.
However, the stereoscopic image the user perceives from a three-dimensional display system changes with the user's position. Taking the glasses-type stereoscopic display system 120 as an example, as shown in Fig. 2 (the auxiliary glasses 122 are not shown in Fig. 2), the stereoscopic image provided by the display system 120 contains a virtual object VO (for example, a tennis ball). The position of the virtual object VO in the left image DIM_L is LOC_ILVO, and its position in the right image DIM_R is LOC_IRVO. Suppose that at this moment the user's left eye is at LOC_1LE and the user's right eye is at LOC_1RE. The left-eye position LOC_1LE and the position LOC_ILVO of the virtual object VO define a straight line L_1L; the right-eye position LOC_1RE and the position LOC_IRVO define a straight line L_1R. The position at which the user perceives the virtual object VO is then determined by the straight lines L_1L and L_1R. For example, when the lines L_1L and L_1R intersect at LOC_1CP, the user perceives the virtual object VO at LOC_1CP. Likewise, when the user's eyes are at LOC_2LE and LOC_2RE, the eye positions and the positions LOC_ILVO and LOC_IRVO of the virtual object VO define straight lines L_2L and L_2R, and the position at which the user perceives the virtual object VO is determined by these lines; that is, the user perceives the virtual object VO at LOC_2CP, the intersection of L_2L and L_2R.
Because the stereoscopic image the user perceives changes with the user's position, an incorrect interaction result may be produced when the user interacts with the three-dimensional display system through an interaction module (such as a game console). For example, suppose the user plays a stereoscopic tennis game through an interaction module (game console) and the display system 120, holding an interaction component (such as a game controller) to control the swing of the racket of the in-game character. The interaction module assumes that the user is located directly in front of the display system, with eye positions LOC_1LE and LOC_1RE. The interaction module controls the display system 120 to show the tennis ball at LOC_ILVO in the left image DIM_L and at LOC_IRVO in the right image DIM_R, and therefore assumes that the user perceives the three-dimensional tennis ball at LOC_1CP (as shown in Fig. 2). When the distance between the position of the swing and LOC_1CP is smaller than an interaction threshold distance D_TH, the interaction module judges that the user has hit the ball. However, if the user's eyes are actually at LOC_2LE and LOC_2RE, the user actually perceives the tennis ball at LOC_2CP. Suppose the distance between LOC_2CP and LOC_1CP is greater than the interaction threshold distance D_TH. Then, when the user swings the interaction component (game controller) toward LOC_2CP, the interaction module judges that the user missed the ball. In other words, even though the user actually perceives the ball at LOC_2CP and swings the controller toward LOC_2CP, the interaction module judges a miss. That is, because the change in the user's eye position distorts the perceived stereoscopic image, the interaction module misjudges the interaction between the user and the virtual object, producing an incorrect interaction result and causing the user great inconvenience.
Summary of the invention
The invention provides an interaction module applied to a stereoscopic interaction system. The stereoscopic interaction system has a three-dimensional display system used to provide a stereoscopic image. The stereoscopic image contains a virtual object, which has a virtual coordinate and an interaction judgment condition. The interaction module comprises a locating module, an interaction component, an interaction-component locating module, and an interaction decision circuit. The locating module detects the position of the user in a scene to produce a three-dimensional reference coordinate. The interaction-component locating module detects the position of the interaction component to produce a three-dimensional interaction coordinate. The interaction decision circuit converts the virtual coordinate into a corrected virtual coordinate according to the three-dimensional reference coordinate, and determines the interaction result between the interaction component and the stereoscopic image according to the three-dimensional interaction coordinate, the corrected virtual coordinate, and the interaction judgment condition.
The invention also provides an interaction module applied to a stereoscopic interaction system. The stereoscopic interaction system has a three-dimensional display system used to provide a stereoscopic image. The stereoscopic image contains a virtual object, which has a virtual coordinate and an interaction judgment condition. The interaction module comprises a locating module, an interaction component, an interaction-component locating module, and an interaction decision circuit. The locating module detects the position of the user in a scene to produce a three-dimensional reference coordinate. The interaction-component locating module detects the position of the interaction component to produce a three-dimensional interaction coordinate. The interaction decision circuit converts the three-dimensional interaction coordinate into a three-dimensional corrected interaction coordinate according to the three-dimensional reference coordinate, and determines the interaction result between the interaction component and the stereoscopic image according to the three-dimensional corrected interaction coordinate, the virtual coordinate, and the interaction judgment condition.
The invention also provides a method for determining an interaction result of a stereoscopic interaction system. The stereoscopic interaction system has a three-dimensional display system and an interaction component. The three-dimensional display system provides a stereoscopic image that contains a virtual object, which has a virtual coordinate and an interaction judgment condition. The method comprises detecting the position of the user in a scene to produce a three-dimensional reference coordinate, detecting the position of the interaction component to produce a three-dimensional interaction coordinate, and determining the interaction result between the interaction component and the stereoscopic image according to the three-dimensional reference coordinate, the three-dimensional interaction coordinate, the virtual coordinate, and the interaction judgment condition.
Description of drawings
Fig. 1 is a schematic diagram of three-dimensional display systems of the prior art.
Fig. 2 is a schematic diagram illustrating how the stereoscopic image provided by a prior-art three-dimensional display system changes with the user's position.
Fig. 3 and Fig. 4 are schematic diagrams of the stereoscopic interaction system of the present invention.
Fig. 5 is a schematic diagram of the first embodiment of the correction method of the present invention.
Fig. 6, Fig. 7 and Fig. 8 are schematic diagrams illustrating how, in the first embodiment of the correction method, the interaction decision circuit can reduce the number of search points it needs to process.
Fig. 9 and Fig. 10 are schematic diagrams of the second embodiment of the correction method of the present invention.
Fig. 11 and Fig. 12 are schematic diagrams of the third embodiment of the correction method of the present invention.
Fig. 13 is a schematic diagram illustrating how the stereoscopic interaction system of the present invention controls audio and video effects.
Fig. 14 is a schematic diagram of a first embodiment of the eye locating module of the present invention.
Fig. 15 is a schematic diagram of a first embodiment of the eye positioning circuit of the present invention.
Fig. 16 is a schematic diagram of another embodiment of the eye locating module of the present invention.
Fig. 17 is a schematic diagram of another embodiment of the eye positioning circuit of the present invention.
Fig. 18 is a schematic diagram of another embodiment of the eye positioning circuit of the present invention.
Fig. 19 and Fig. 20 are schematic diagrams of another embodiment of the eye positioning circuit of the present invention.
Fig. 21 and Fig. 22 are schematic diagrams of another embodiment of the eye positioning circuit of the present invention.
Fig. 23 is a schematic diagram of another embodiment of the eye locating module of the present invention.
Fig. 24 is a schematic diagram of a first embodiment of the three-dimensional scene sensor of the present invention.
Fig. 25 is a schematic diagram of a first embodiment of the eye-coordinate generating circuit of the present invention.
Fig. 26 is a schematic diagram of another embodiment of the eye-coordinate generating circuit of the present invention.
Fig. 27 is a schematic diagram of another embodiment of the eye-coordinate generating circuit of the present invention.
Fig. 28 is a schematic diagram of another embodiment of the eye-coordinate generating circuit of the present invention.
Wherein, the reference numerals are described as follows:
110, 120, 310: three-dimensional display systems
121: display screen
122: auxiliary glasses
300: stereoscopic interaction system
320: interaction module
321: locating module
322: interaction component
323: interaction-component locating module
324: interaction decision circuit
330: display control circuit
340: loudspeaker
350: sound control circuit
1100, 1300, 1700: eye locating modules
1110, 1120, 1810: image sensors
1130, 1200, 1400, 1500, 1600, 2300: eye positioning circuits
1140, 1920: three-dimensional coordinate conversion circuits
1210, 1910: eye detecting circuits
1350, 2030: face detecting circuits
1410, 2110, 2310: glasses detecting circuits
1420, 2120, 2320: glasses coordinate conversion circuits
1530, 2230: tilt detectors
1640, 1820, 2340: infrared light-emitting components
1650: infrared light-reflecting component
1660, 2360: infrared light-sensing circuits
1710, 1800: three-dimensional scene sensors
1720, 1900, 2000, 2100, 2200: eye-coordinate generating circuits
1830: light-sensing distance measuring device
COND_PVO, COND_CVO: interaction judgment conditions
D_S: error distance
D_TH: interaction threshold distance
D_MPR, D_MPL: distances
DIM_3D: stereoscopic image
DIM_θ1~DIM_θ8, DIM_L, DIM_R: images
INFO_D: distance information
INFO_TILT: tilt information
L_D: detecting light
L_R: reflected light
L_1L, L_1R, L_2L, L_2R, L_PL, L_PR, L_AL, L_AR, L_REFL, L_REFR, L_PJL, L_PJR: straight lines
LOC_3D_PIO, LOC_3D_CIO, LOC_3D_PVO, LOC_3D_CVO, LOC_3D_EYE, LOC_IRVO, LOC_ILVO, LOC_1CP, LOC_2CP, LOC_1LE, LOC_1RE, LOC_2LE, LOC_2RE, LOC_3D_LE, LOC_3D_RE, LOC_LE_PRE, LOC_RE_PRE, LOC_PTH, LOC_CTH, LOC_3D_IPJR, LOC_3D_IPJL, LOC_3D_SPJR, LOC_3D_SPJL, LOC_SEN1~LOC_SEN3, LOC_2D_EYE1~LOC_2D_EYE3, LOC_GLASS1, LOC_GLASS2, LOC_GLASS3, LOC_IR, LOC_MD: coordinates
MP: reference midpoint
P_A, P_X: search points
P_B: endpoint
P_C: center point
RA: search area
RT: interaction result
SC: scene
SIM_2D1~SIM_2D3: two-dimensional sensing images
SL_GLASS1~SL_GLASS3: glasses slopes
SL_IR: infrared light slope
SUF_PTH, SUF_CTH: threshold surfaces
Embodiment
The invention provides a stereoscopic interaction system that corrects, according to the user's position, either the position of the interaction component or the position and interaction judgment condition of the virtual object in the stereoscopic image. In this way, the stereoscopic interaction system can obtain a correct interaction result from the corrected position of the interaction component, or from the corrected position of the virtual object and the corrected interaction judgment condition.
Please refer to Fig. 3 and Fig. 4, which are schematic diagrams of the stereoscopic interaction system 300 of the present invention. The stereoscopic interaction system 300 comprises a three-dimensional display system 310 and an interaction module 320. The three-dimensional display system 310 provides a stereoscopic image DIM_3D, and may be implemented as a naked-eye three-dimensional display system 110 or a glasses-type stereoscopic display system 120. The interaction module 320 comprises a locating module 321, an interaction component 322, an interaction-component locating module 323, and an interaction decision circuit 324. The locating module 321 detects the position of the user in the scene SC to produce a three-dimensional reference coordinate. The interaction-component locating module 323 detects the position of the interaction component 322 to produce a three-dimensional interaction coordinate LOC_3D_PIO. The interaction decision circuit 324 determines the interaction result RT between the interaction component 322 and the stereoscopic image DIM_3D according to the three-dimensional reference coordinate, the three-dimensional interaction coordinate LOC_3D_PIO, and the stereoscopic image DIM_3D.
For convenience of description, the locating module 321 is illustrated below as an eye locating module. The eye locating module 321 detects the position of the user's eyes in the scene SC to produce a three-dimensional eye coordinate LOC_3D_EYE, which serves as the three-dimensional reference coordinate. The three-dimensional eye coordinate LOC_3D_EYE comprises a three-dimensional left-eye coordinate LOC_3D_LE and a three-dimensional right-eye coordinate LOC_3D_RE. In this case, the interaction decision circuit 324 determines the interaction result RT between the interaction component 322 and the stereoscopic image DIM_3D according to the three-dimensional eye coordinate LOC_3D_EYE, the three-dimensional interaction coordinate LOC_3D_PIO, and the stereoscopic image DIM_3D. The locating module 321 of the present invention is not limited to an eye detecting module; for example, it may locate the user's position by detecting other features of the user (such as the user's ears or face).
The operating principle of the stereoscopic interaction system 300 of the present invention is further described below.
The stereoscopic image DIM_3D is composed of a left image DIM_L and a right image DIM_R, and contains a virtual object VO. For example, the user plays a tennis game through the stereoscopic interaction system 300: the virtual object VO is the tennis ball, and the user controls another virtual object in the stereoscopic image DIM_3D (such as the racket) through the interaction component 322. The virtual object VO has a virtual coordinate LOC_3D_PVO and an interaction judgment condition COND_PVO. More specifically, the position of the virtual object VO in the left image DIM_L provided by the display system 310 is LOC_ILVO, and its position in the right image DIM_R is LOC_IRVO. The interaction module 320 initially assumes that the user is at a reference position (such as directly in front of the display system 310) and that the user's eyes are at a predetermined eye coordinate LOC_EYE_PRE, which comprises a predetermined left-eye coordinate LOC_LE_PRE and a predetermined right-eye coordinate LOC_RE_PRE. According to the straight line L_PL (between the predetermined left-eye coordinate LOC_LE_PRE and the position LOC_ILVO of the virtual object VO in the left image DIM_L) and the straight line L_PR (between the predetermined right-eye coordinate LOC_RE_PRE and the position LOC_IRVO of the virtual object VO in the right image DIM_R), the stereoscopic interaction system 300 obtains the position LOC_3D_PVO at which the user would perceive the virtual object VO from the predetermined eye coordinate LOC_EYE_PRE, and sets the virtual coordinate of the virtual object VO to LOC_3D_PVO. More specifically, the user perceives object positions according to a three-dimensional imaging position model MODEL_LOC: having received the left image DIM_L and the right image DIM_R, the user locates the three-dimensional imaging position of the virtual object VO from its position LOC_ILVO in the left image and its position LOC_IRVO in the right image. In the present invention, the three-dimensional imaging position model MODEL_LOC is assumed to determine the imaging position of the virtual object VO from a first line (such as the straight line L_PL) connecting the position of VO in the left image DIM_L with the position of the user's left eye, and a second line (such as the straight line L_PR) connecting the position of VO in the right image DIM_R with the position of the user's right eye. When the first and second lines intersect, the model MODEL_LOC sets the three-dimensional imaging position of the virtual object VO to the coordinate of the intersection; when they do not intersect, the model first determines a reference midpoint having the minimum total distance to the first and second lines, and sets the three-dimensional imaging position of the virtual object VO to the coordinate of that reference midpoint. The interaction judgment condition COND_PVO of the virtual object VO is provided to the interaction decision circuit 324 for determining the interaction result RT. For example, COND_PVO may specify that when the distance between the position of the interaction component 322 and the virtual coordinate LOC_3D_PVO is smaller than the interaction threshold distance D_TH, the interaction result RT indicates "contact", meaning that the interaction decision circuit 324 judges that the racket controlled by the interaction component 322 touches the virtual object VO in the stereoscopic image DIM_3D (for example, hits the tennis ball); and that when this distance is greater than the interaction threshold distance D_TH, the interaction result RT indicates "no contact", meaning that the interaction component 322 does not touch the virtual object VO (for example, misses the tennis ball).
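The three-dimensional imaging position model MODEL_LOC described above is purely geometric: it intersects the two eye-to-image lines, or falls back to the midpoint of the shortest segment connecting them. A minimal Python sketch of that construction is given below; it is an illustration only, not part of the claimed embodiments, and the function name and tolerance are assumptions:

```python
import numpy as np

def imaging_position(eye_l, img_l, eye_r, img_r, tol=1e-6):
    """Sketch of MODEL_LOC: the perceived position of a virtual object is where the
    line (left eye -> object in left image) meets the line (right eye -> object in
    right image); if the two lines do not intersect, the midpoint of their shortest
    connecting segment (the reference midpoint) is used. Arguments are 3-vectors."""
    p1, d1 = np.asarray(eye_l, float), np.asarray(img_l, float) - np.asarray(eye_l, float)
    p2, d2 = np.asarray(eye_r, float), np.asarray(img_r, float) - np.asarray(eye_r, float)
    # Solve for the parameters of the closest points on the two lines.
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    w = p1 - p2
    denom = a * c - b * b                  # zero when the lines are parallel
    if abs(denom) < tol:
        return None                        # degenerate case: no unique position
    t1 = (b * (d2 @ w) - c * (d1 @ w)) / denom
    t2 = (a * (d2 @ w) - b * (d1 @ w)) / denom
    q1, q2 = p1 + t1 * d1, p2 + t2 * d2    # closest points on each line
    # Intersection if the closest points coincide; otherwise the reference midpoint.
    return q1 if np.linalg.norm(q1 - q2) < tol else (q1 + q2) / 2.0
```

For example, evaluating this construction with the predetermined eye coordinates and the image positions LOC_ILVO and LOC_IRVO would yield the virtual coordinate LOC_3D_PVO, while substituting the user's actual eye coordinates yields the position at which the user really perceives the virtual object.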
In the present invention, the interaction decision circuit 324 determines the interaction result RT according to the three-dimensional eye coordinate (three-dimensional reference coordinate) LOC_3D_EYE, the three-dimensional interaction coordinate LOC_3D_PIO, and the stereoscopic image DIM_3D. More specifically, when the user does not watch the stereoscopic image DIM_3D from the predetermined eye coordinate LOC_EYE_PRE assumed by the stereoscopic interaction system 300, the position at which the user perceives the virtual object VO shifts and the virtual object VO may appear somewhat deformed, which leads to an incorrect interaction result RT. The present invention therefore provides three embodiments of a correction method, described below.
In the first embodiment of the correction method, the interaction decision circuit 324 corrects, according to the position from which the user actually watches the stereoscopic image DIM_3D (the three-dimensional eye coordinate LOC_3D_EYE), the position at which the user actually intends to interact through the interaction component 322, and thereby obtains a correct interaction result RT. More specifically, the interaction decision circuit 324 uses the three-dimensional imaging position model MODEL_LOC to compute where the virtual object controlled by the interaction component 322 (such as the racket) would be seen if the user's eyes were at the predetermined eye coordinate LOC_EYE_PRE; this position is the three-dimensional corrected interaction coordinate LOC_3D_CIO. The interaction decision circuit 324 then determines, from the three-dimensional corrected interaction coordinate LOC_3D_CIO, the virtual coordinate LOC_3D_PVO of the virtual object VO, and the interaction judgment condition COND_PVO, the interaction result RT that would be observed if the user's eyes were at the predetermined eye coordinate LOC_EYE_PRE. Since the interaction result RT itself does not change with the user's position, the interaction result obtained in this way is the interaction result observed by the user whose eyes are actually at the three-dimensional eye coordinate LOC_3D_EYE.
Please refer to Fig. 5, which is a schematic diagram of the first embodiment of the correction method of the present invention. The interaction decision circuit 324 converts the three-dimensional interaction coordinate LOC_3D_PIO into a three-dimensional corrected interaction coordinate LOC_3D_CIO according to the three-dimensional eye coordinate (three-dimensional reference coordinate) LOC_3D_EYE. More specifically, from the three-dimensional eye coordinate LOC_3D_EYE and the three-dimensional interaction coordinate LOC_3D_PIO, the interaction decision circuit 324 computes where the user would see the interaction component 322 if the user's eyes were at the predetermined eye coordinate LOC_EYE_PRE (this position is the three-dimensional corrected interaction coordinate LOC_3D_CIO). For example, the coordinate system of the predetermined eye coordinate LOC_EYE_PRE contains a plurality of search points P (such as the search point P_A shown in Fig. 5). From the search point P_A and the predetermined eye coordinates LOC_LE_PRE and LOC_RE_PRE, the interaction decision circuit 324 obtains the left search projection coordinate LOC_3D_SPJL, at which the search point P_A projects onto the left image DIM_L, and the right search projection coordinate LOC_3D_SPJR, at which the search point P_A projects onto the right image DIM_R. Using the three-dimensional imaging position model MODEL_LOC assumed by the present invention, the interaction decision circuit 324 then obtains, from the search projection coordinates LOC_3D_SPJL and LOC_3D_SPJR and the three-dimensional eye coordinate LOC_3D_EYE, the endpoint P_B that corresponds to the search point P_A in the coordinate system of the three-dimensional eye coordinate LOC_3D_EYE, and further computes the error distance D_S between the endpoint P_B and the three-dimensional interaction coordinate LOC_3D_PIO. In this manner, the interaction decision circuit 324 can compute the error distance D_S corresponding to every search point P in the coordinate system of the predetermined eye coordinate LOC_EYE_PRE. When a search point (for example, P_X) has the smallest corresponding error distance D_S, the interaction decision circuit 324 determines the three-dimensional corrected interaction coordinate LOC_3D_CIO from the position of the search point P_X. Because, when the user's eyes are at the three-dimensional eye coordinate LOC_3D_EYE, the position of every virtual object in the stereoscopic image DIM_3D seen by the user is transformed from the coordinate system of the predetermined eye coordinate LOC_EYE_PRE to the coordinate system of the three-dimensional eye coordinate LOC_3D_EYE, computing the three-dimensional corrected interaction coordinate LOC_3D_CIO by the method illustrated in Fig. 5 transforms the coordinate system in the same direction as the virtual objects seen by the user. This reduces the error introduced by the nonlinear coordinate transformation and yields a more accurate three-dimensional corrected interaction coordinate LOC_3D_CIO.
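A sketch of this search is given below, reusing the imaging_position helper from the earlier sketch. The display is assumed to lie in the plane z = 0, and the generation of candidate search points is left to the caller; both are assumptions for illustration only:

```python
import numpy as np

def project_to_screen(eye, point, screen_z=0.0):
    """Project 'point' onto the display plane z = screen_z along the ray from
    'eye' through 'point' (the location of the display plane is an assumption)."""
    eye, point = np.asarray(eye, float), np.asarray(point, float)
    t = (screen_z - eye[2]) / (point[2] - eye[2])
    return eye + t * (point - eye)

def corrected_interaction_coord(loc_3d_pio, eye_l, eye_r, eye_l_pre, eye_r_pre, candidates):
    """First correction embodiment (sketch): among candidate search points P in the
    predetermined-eye coordinate system, pick the one whose re-imaged endpoint P_B
    (as seen from the actual eye positions) is closest to LOC_3D_PIO."""
    best, best_err = None, np.inf
    for p in candidates:
        spjl = project_to_screen(eye_l_pre, p)     # left search projection coordinate
        spjr = project_to_screen(eye_r_pre, p)     # right search projection coordinate
        p_b = imaging_position(eye_l, spjl, eye_r, spjr)  # endpoint P_B under actual eyes
        if p_b is None:
            continue
        err = np.linalg.norm(p_b - np.asarray(loc_3d_pio, float))  # error distance D_S
        if err < best_err:
            best, best_err = p, err
    return best   # serves as the corrected interaction coordinate LOC_3D_CIO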
To reduce the computation the interaction decision circuit 324 needs when calculating the error distance D_S of each search point P in the coordinate system of the predetermined eye coordinate LOC_EYE_PRE, the present invention further provides a simplification that reduces the number of search points P the interaction decision circuit 324 has to process. Please refer to Fig. 6, Fig. 7, and Fig. 8, which illustrate how the interaction decision circuit 324 can reduce the number of search points it processes in the first embodiment of the correction method. The interaction decision circuit 324 converts the three-dimensional interaction coordinate LOC_3D_PIO in the coordinate system of the three-dimensional eye coordinate LOC_3D_EYE into a center point P_C in the coordinate system of the predetermined eye coordinate LOC_EYE_PRE, according to the three-dimensional eye coordinate LOC_3D_EYE. Because the center point P_C corresponds to the three-dimensional interaction coordinate LOC_3D_PIO, in general the search point P_X with the smallest error distance D_S lies close to the center point P_C. In other words, the interaction decision circuit 324 only has to compute the error distance D_S of the search points P adjacent to the center point P_C to find the search point P_X with the smallest error distance D_S, and can determine the three-dimensional corrected interaction coordinate LOC_3D_CIO accordingly.
More specifically, as shown in Fig. 6, the three-dimensional interaction coordinate LOC_3D_PIO of the interaction component 322 and the user's three-dimensional left-eye coordinate LOC_3D_LE define a projection line L_PJL, which intersects the three-dimensional display system 310 at the position LOC_3D_IPJL. The position LOC_3D_IPJL is the three-dimensional left interaction coordinate at which the user sees the interaction component 322 projected onto the left image DIM_L provided by the display system 310. Likewise, the three-dimensional interaction coordinate LOC_3D_PIO and the user's three-dimensional right-eye coordinate LOC_3D_RE define a projection line L_PJR, which intersects the display system 310 at the position LOC_3D_IPJR; this is the three-dimensional right interaction coordinate at which the user sees the interaction component 322 projected onto the right image DIM_R. That is, from the three-dimensional eye coordinate LOC_3D_EYE and the three-dimensional interaction coordinate LOC_3D_PIO, the interaction decision circuit 324 obtains the three-dimensional left interaction coordinate LOC_3D_IPJL and the three-dimensional right interaction coordinate LOC_3D_IPJR of the interaction component 322 projected onto the display system 310. The interaction decision circuit 324 then determines a left reference line L_REFL from the three-dimensional left interaction coordinate LOC_3D_IPJL and the predetermined left-eye coordinate LOC_LE_PRE, and a right reference line L_REFR from the three-dimensional right interaction coordinate LOC_3D_IPJR and the predetermined right-eye coordinate LOC_RE_PRE. From the left reference line L_REFL and the right reference line L_REFR, the interaction decision circuit 324 obtains the center point P_C in the coordinate system of the predetermined eye coordinate LOC_EYE_PRE. For example, when the left reference line L_REFL and the right reference line L_REFR intersect at an intersection point CP (as shown in Fig. 6), the interaction decision circuit 324 determines the center point P_C from the position of the intersection point CP. When the two reference lines do not intersect (as shown in Fig. 7), the interaction decision circuit 324 obtains the reference midpoint MP having the minimum total distance to the two reference lines, where the distance D_MPL between the reference midpoint MP and the left reference line L_REFL equals the distance D_MPR between the reference midpoint MP and the right reference line L_REFR; in this case the reference midpoint MP is the center point P_C. After the interaction decision circuit 324 obtains the center point P_C, as shown in Fig. 8, it determines a search area RA around the center point P_C and computes the error distance D_S only for the search points P within the search area RA. Compared with the exhaustive search illustrated in Fig. 5, the approach of Fig. 6, Fig. 7, and Fig. 8 further saves the computation the interaction decision circuit 324 needs to calculate the three-dimensional corrected interaction coordinate LOC_3D_CIO.
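Under the same assumptions as the earlier sketches (display plane at z = 0, reuse of the project_to_screen and imaging_position helpers), the center point P_C and a local search area around it can be sketched as follows. The local_candidates helper and its radius are hypothetical illustrations of the search area RA, not values from the patent:

```python
import numpy as np

def center_point(loc_3d_pio, eye_l, eye_r, eye_l_pre, eye_r_pre):
    """Sketch of Figs. 6-7: project the actual interaction coordinate onto the
    screen through the actual eyes, then re-image those screen points through the
    predetermined eyes to obtain the center point P_C of the search area RA."""
    ipjl = project_to_screen(eye_l, loc_3d_pio)    # three-dimensional left interaction coordinate
    ipjr = project_to_screen(eye_r, loc_3d_pio)    # three-dimensional right interaction coordinate
    # P_C is the intersection (or reference midpoint) of the reference lines
    # L_REFL (eye_l_pre -> ipjl) and L_REFR (eye_r_pre -> ipjr).
    return imaging_position(eye_l_pre, ipjl, eye_r_pre, ipjr)

def local_candidates(p_c, radius=0.05, steps=5):
    """Hypothetical helper: a small grid of search points around P_C, so that the
    error distance D_S only has to be evaluated inside the search area RA."""
    offsets = np.linspace(-radius, radius, steps)
    return [np.asarray(p_c, float) + np.array([dx, dy, dz])
            for dx in offsets for dy in offsets for dz in offsets]
```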
Please refer to Fig. 9 and Fig. 10, which are schematic diagrams of the second embodiment of the correction method of the present invention. The interaction decision circuit 324 converts the three-dimensional interaction coordinate LOC_3D_PIO into a three-dimensional corrected interaction coordinate LOC_3D_CIO according to the three-dimensional eye coordinate (three-dimensional reference coordinate) LOC_3D_EYE. More specifically, from the three-dimensional eye coordinate LOC_3D_EYE and the three-dimensional interaction coordinate LOC_3D_PIO, the interaction decision circuit 324 computes where the user would see the interaction component 322 if the user's eyes were at the predetermined eye coordinate LOC_EYE_PRE (this position is the three-dimensional corrected interaction coordinate LOC_3D_CIO). For example, as shown in Fig. 9, the three-dimensional interaction coordinate LOC_3D_PIO of the interaction component 322 and the user's three-dimensional left-eye coordinate LOC_3D_LE define a projection line L_PJL, which intersects the three-dimensional display system 310 at the position LOC_3D_IPJL; this is the three-dimensional left interaction coordinate at which the user sees the interaction component 322 projected onto the left image DIM_L. Likewise, the three-dimensional interaction coordinate LOC_3D_PIO and the user's three-dimensional right-eye coordinate LOC_3D_RE define a projection line L_PJR, which intersects the display system 310 at the position LOC_3D_IPJR; this is the three-dimensional right interaction coordinate at which the user sees the interaction component 322 projected onto the right image DIM_R. That is, from the three-dimensional eye coordinate LOC_3D_EYE and the three-dimensional interaction coordinate LOC_3D_PIO, the interaction decision circuit 324 obtains the three-dimensional left interaction coordinate LOC_3D_IPJL and the three-dimensional right interaction coordinate LOC_3D_IPJR of the interaction component 322 projected onto the display system 310. The interaction decision circuit 324 then determines a left reference line L_REFL from the three-dimensional left interaction coordinate LOC_3D_IPJL and the predetermined left-eye coordinate LOC_LE_PRE, and a right reference line L_REFR from the three-dimensional right interaction coordinate LOC_3D_IPJR and the predetermined right-eye coordinate LOC_RE_PRE. From the left reference line L_REFL and the right reference line L_REFR, the interaction decision circuit 324 obtains the position at which the user would see the interaction component 322 if the user's eyes were at the predetermined eye coordinate LOC_EYE_PRE (the three-dimensional corrected interaction coordinate LOC_3D_CIO). Specifically, when the left reference line L_REFL and the right reference line L_REFR intersect at an intersection point CP, the coordinate of the intersection point CP is the three-dimensional corrected interaction coordinate LOC_3D_CIO. When they do not intersect (as shown in Fig. 10), the interaction decision circuit 324 obtains the reference midpoint MP having the minimum total distance to the two reference lines, where the distance D_MPL between the reference midpoint MP and the left reference line L_REFL equals the distance D_MPR between the reference midpoint MP and the right reference line L_REFR; the coordinate of the reference midpoint MP is then taken as the three-dimensional corrected interaction coordinate LOC_3D_CIO. The interaction decision circuit 324 can therefore determine the interaction result RT from the three-dimensional corrected interaction coordinate LOC_3D_CIO, the virtual coordinate LOC_3D_PVO of the virtual object VO, and the interaction judgment condition COND_PVO. Compared with the first embodiment of the correction method, in the second embodiment the interaction decision circuit 324 obtains the three-dimensional left interaction coordinate LOC_3D_IPJL and the three-dimensional right interaction coordinate LOC_3D_IPJR from the three-dimensional interaction coordinate LOC_3D_PIO and the three-dimensional eye coordinate LOC_3D_EYE, and then obtains the three-dimensional corrected interaction coordinate LOC_3D_CIO from LOC_3D_IPJL, LOC_3D_IPJR, and the predetermined eye coordinate LOC_EYE_PRE. In other words, the second embodiment converts the three-dimensional interaction coordinate LOC_3D_PIO, defined in the coordinate system of the three-dimensional eye coordinate LOC_3D_EYE, directly into a position in the coordinate system of the predetermined eye coordinate LOC_EYE_PRE, and uses that position as the three-dimensional corrected interaction coordinate LOC_3D_CIO. Because the transformation between the coordinate system of the predetermined eye coordinate LOC_EYE_PRE and the coordinate system of the three-dimensional eye coordinate LOC_3D_EYE is nonlinear (that is, converting the three-dimensional corrected interaction coordinate LOC_3D_CIO back in the analogous way does not recover the three-dimensional interaction coordinate LOC_3D_PIO), the three-dimensional corrected interaction coordinate LOC_3D_CIO obtained by the second embodiment is an approximation compared with the first embodiment. However, with the second embodiment, the interaction decision circuit 324 does not need to compute the error distance D_S of the search points P, which greatly reduces the computation it requires.
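The second embodiment can be sketched by reusing the project_to_screen and imaging_position helpers above: the re-imaged point is used directly as LOC_3D_CIO, with no error-distance search. This is an illustrative sketch, not the claimed implementation:

```python
import numpy as np

def corrected_interaction_coord_v2(loc_3d_pio, eye_l, eye_r, eye_l_pre, eye_r_pre):
    """Second correction embodiment (sketch): project the interaction coordinate onto
    the screen through the actual eyes and re-image through the predetermined eyes;
    the resulting point is used directly as LOC_3D_CIO (an approximation, but no
    search over candidate points is required)."""
    ipjl = project_to_screen(eye_l, loc_3d_pio)   # left interaction coordinate on the screen
    ipjr = project_to_screen(eye_r, loc_3d_pio)   # right interaction coordinate on the screen
    return imaging_position(eye_l_pre, ipjl, eye_r_pre, ipjr)

# Hypothetical use together with the original judgment condition COND_PVO:
# loc_cio = corrected_interaction_coord_v2(loc_pio, eye_l, eye_r, eye_l_pre, eye_r_pre)
# rt = "contact" if np.linalg.norm(loc_cio - loc_pvo) < d_th else "no contact"
```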
In the third embodiment of the correction method, the interaction decision circuit 324 corrects the stereoscopic image DIM_3D (that is, the virtual coordinate LOC_3D_PVO and the interaction judgment condition COND_PVO of the virtual object VO) according to the position from which the user actually views it (the three-dimensional left-eye coordinate LOC_3D_LE and the three-dimensional right-eye coordinate LOC_3D_RE shown in Fig. 4), and thereby obtains a correct interaction result RT. More specifically, from the three-dimensional eye coordinate LOC_3D_EYE (the coordinates LOC_3D_LE and LOC_3D_RE), the virtual coordinate LOC_3D_PVO, and the interaction judgment condition COND_PVO, the interaction decision circuit 324 computes the position at which the user actually sees the virtual object VO and the interaction judgment condition the user should actually experience when viewing from the three-dimensional eye coordinate LOC_3D_EYE. The interaction decision circuit 324 can then determine the correct interaction result from the position of the interaction component 322 (the three-dimensional interaction coordinate LOC_3D_PIO), the position at which the user actually sees the virtual object VO (the corrected coordinate shown in Fig. 4), and the interaction judgment condition the user should actually experience (the corrected interaction judgment condition shown in Fig. 4).
Please refer to Fig. 11 and Fig. 12, which are schematic diagrams of the third embodiment of the correction method of the present invention. In the third embodiment, the interaction decision circuit 324 corrects the stereoscopic image DIM_3D according to the three-dimensional eye coordinate (three-dimensional reference coordinate) LOC_3D_EYE to obtain a correct interaction result RT. More specifically, the interaction decision circuit 324 converts the virtual coordinate LOC_3D_PVO of the virtual object VO into a corrected virtual coordinate LOC_3D_CVO according to the three-dimensional eye coordinate LOC_3D_EYE, and converts the interaction judgment condition COND_PVO into a corrected interaction judgment condition COND_CVO according to the three-dimensional eye coordinate LOC_3D_EYE. The interaction decision circuit 324 then determines the interaction result RT from the three-dimensional interaction coordinate LOC_3D_PIO, the corrected virtual coordinate LOC_3D_CVO, and the corrected interaction judgment condition COND_CVO. For example, as shown in Fig. 11, the user views the stereoscopic image DIM_3D from the three-dimensional left-eye coordinate LOC_3D_LE and the three-dimensional right-eye coordinate LOC_3D_RE. The interaction decision circuit 324 can therefore obtain, from the straight line L_AL (between the three-dimensional left-eye coordinate LOC_3D_LE and the position LOC_ILVO of the virtual object VO in the left image DIM_L) and the straight line L_AR (between the three-dimensional right-eye coordinate LOC_3D_RE and the position LOC_IRVO of the virtual object VO in the right image DIM_R), the position LOC_3D_CVO at which the user perceives the virtual object VO from the three-dimensional eye coordinate LOC_3D_EYE. In this way, the interaction decision circuit 324 corrects the virtual coordinate LOC_3D_PVO according to the three-dimensional eye coordinate LOC_3D_EYE and obtains the position at which the user actually sees the virtual object VO (the corrected virtual coordinate LOC_3D_CVO). As shown in Fig. 12, the interaction judgment condition COND_PVO is determined by the interaction threshold distance D_TH and the position of the virtual object VO; it can therefore be regarded as a threshold surface SUF_PTH, a sphere of radius D_TH centered at the position of the virtual object VO. When the interaction component 322 enters the threshold surface SUF_PTH, the interaction decision circuit 324 determines that the interaction result RT indicates "contact"; when it does not, the interaction result RT indicates "no contact". The threshold surface SUF_PTH can be regarded as being formed by many threshold points P_TH, each located at a virtual coordinate LOC_PTH. Using a method similar to that illustrated in Fig. 11, the interaction decision circuit 324 can therefore obtain, according to the three-dimensional eye coordinate LOC_3D_EYE, the corrected virtual coordinate LOC_CTH of each threshold point P_TH that the user actually experiences. The corrected virtual coordinates LOC_CTH of all the threshold points P_TH form the corrected threshold surface SUF_CTH, which serves as the corrected interaction judgment condition COND_CVO. That is, when the three-dimensional interaction coordinate LOC_3D_PIO of the interaction component 322 enters the corrected threshold surface SUF_CTH, the interaction decision circuit 324 determines that the interaction result RT indicates "contact" (as shown in Fig. 12). In this way, the interaction decision circuit 324 can correct the stereoscopic image DIM_3D (the virtual coordinate LOC_3D_PVO and the interaction judgment condition COND_PVO of the virtual object VO) according to the three-dimensional eye coordinate LOC_3D_EYE, to obtain the position at which the user actually sees the virtual object VO (the corrected virtual coordinate LOC_3D_CVO) and the interaction judgment condition the user should actually experience (the corrected interaction judgment condition COND_CVO), and can then correctly determine the interaction result RT from the three-dimensional interaction coordinate LOC_3D_PIO of the interaction component 322, the corrected virtual coordinate LOC_3D_CVO, and the corrected interaction judgment condition COND_CVO. In addition, in the general case, the interaction judgment condition COND_PVO and the corrected interaction judgment condition COND_CVO differ only slightly; for example, when the threshold surface SUF_PTH is a sphere of radius D_TH, the corrected threshold surface SUF_CTH is also approximately a sphere with a radius of approximately D_TH. In the third embodiment, the interaction decision circuit 324 may therefore correct only the virtual coordinate LOC_3D_PVO of the virtual object VO without correcting the interaction judgment condition COND_PVO, to save computation; in other words, the interaction decision circuit 324 can compute the interaction result RT from the corrected virtual coordinate LOC_3D_CVO and the original interaction judgment condition COND_PVO.
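A sketch of the simplified variant, reusing the imaging_position helper above, re-images the object's screen positions through the actual eyes and then applies the original threshold distance D_TH. It is illustrative only:

```python
import numpy as np

def corrected_virtual_coord(loc_ilvo, loc_irvo, eye_l, eye_r):
    """Third correction embodiment (sketch): re-image the object's screen positions
    LOC_ILVO / LOC_IRVO through the user's actual eye positions to obtain the
    corrected virtual coordinate LOC_3D_CVO that the user actually perceives."""
    return imaging_position(eye_l, loc_ilvo, eye_r, loc_irvo)

def interaction_result(loc_3d_pio, loc_3d_cvo, d_th):
    """Simplified variant that keeps the original judgment condition COND_PVO:
    'contact' when the interaction component is within D_TH of the corrected
    virtual coordinate, 'no contact' otherwise."""
    dist = np.linalg.norm(np.asarray(loc_3d_pio, float) - np.asarray(loc_3d_cvo, float))
    return "contact" if dist < d_th else "no contact"
```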
Furthermore, in the third embodiment of the correction method, the interaction decision circuit 324 corrects the stereoscopic image DIM_3D (the virtual coordinate LOC_3D_PVO and the interaction judgment condition COND_PVO) according to the position from which the user actually views it (the three-dimensional eye coordinate LOC_3D_EYE). Therefore, if the stereoscopic image DIM_3D contains multiple virtual objects (for example, VO_1 to VO_M), the interaction decision circuit 324 has to compute the corrected virtual coordinate and the corrected interaction judgment condition of every virtual object VO_1 to VO_M; in other words, the amount of data the interaction decision circuit 324 has to process grows with the number of virtual objects. In the first and second embodiments, by contrast, the interaction decision circuit 324 corrects the position of the interaction component 322 (the three-dimensional interaction coordinate LOC_3D_PIO) according to the position from which the user views the stereoscopic image DIM_3D (the three-dimensional eye coordinate LOC_3D_EYE), so it only has to compute the three-dimensional corrected interaction coordinate LOC_3D_CIO of the interaction component 322. In other words, compared with the third embodiment, the amount of data the interaction decision circuit 324 has to process in the first and second embodiments does not change even when the number of virtual objects increases.
Please refer to Fig. 13, which illustrates how the stereoscopic interaction system 300 of the present invention controls audio and video effects. The stereoscopic interaction system 300 further comprises a display control circuit 330, a loudspeaker 340, and a sound control circuit 350. The display control circuit 330 adjusts the stereoscopic image DIM_3D provided by the three-dimensional display system 310 according to the interaction result RT. For example, when the interaction decision circuit 324 determines that the interaction result RT indicates "contact", the display control circuit 330 controls the display system 310 to show the stereoscopic image DIM_3D of the virtual object VO (such as the tennis ball) being hit by the interaction component 322 (corresponding to the racket). The sound control circuit 350 adjusts the sound output by the loudspeaker 340 according to the interaction result RT. For example, when the interaction decision circuit 324 determines that the interaction result RT indicates "contact", the sound control circuit 350 controls the loudspeaker 340 to output the sound of the virtual object VO (the tennis ball) being hit by the interaction component 322 (the racket).
Please refer to Fig. 14, which is a schematic diagram of the embodiment 1100 of the eye locating module of the present invention. The eye locating module 1100 comprises image sensors 1110 and 1120, an eye positioning circuit 1130, and a three-dimensional coordinate conversion circuit 1140. The image sensors 1110 and 1120 sense the scene SC containing the user's position to produce two-dimensional sensing images SIM_2D1 and SIM_2D2 respectively; the image sensor 1110 is placed at the sensing position LOC_SEN1 and the image sensor 1120 at the sensing position LOC_SEN2. The eye positioning circuit 1130 obtains, from the two-dimensional sensing images SIM_2D1 and SIM_2D2, the two-dimensional eye coordinate LOC_2D_EYE1 of the user's eyes in the image SIM_2D1 and the two-dimensional eye coordinate LOC_2D_EYE2 of the user's eyes in the image SIM_2D2. The three-dimensional coordinate conversion circuit 1140 computes the three-dimensional eye coordinate LOC_3D_EYE of the user's eyes from the two-dimensional eye coordinates LOC_2D_EYE1 and LOC_2D_EYE2 and the positions LOC_SEN1 and LOC_SEN2 of the image sensors 1110 and 1120. Its operating principle is well known in the art and is not repeated here.
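As a rough illustration of the kind of computation such a conversion circuit performs, the following sketch assumes two rectified, horizontally separated sensors with a known baseline and focal length; all parameter names and the rectification assumption are illustrative, not taken from the patent:

```python
import numpy as np

def triangulate_eye(x_left_px, x_right_px, y_px, focal_px, baseline_m, cx, cy):
    """Minimal stereo-triangulation sketch: given the eye's horizontal position in
    the left and right sensing images, its vertical position, the focal length in
    pixels, the sensor baseline in meters, and the principal point (cx, cy),
    return an (X, Y, Z) coordinate in the left sensor's frame."""
    disparity = x_left_px - x_right_px
    if disparity <= 0:
        return None                          # point not triangulable
    z = focal_px * baseline_m / disparity    # depth from disparity
    x = (x_left_px - cx) * z / focal_px
    y = (y_px - cy) * z / focal_px
    return np.array([x, y, z])
```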
Please refer to Fig. 15, which is a schematic diagram of the embodiment 1200 of the eye positioning circuit of the present invention. The eye positioning circuit 1200 comprises an eye detecting circuit 1210. The eye detecting circuit 1210 detects the user's eyes in the two-dimensional sensing image SIM_2D1 to obtain the two-dimensional eye coordinate LOC_2D_EYE1, and detects the user's eyes in the two-dimensional sensing image SIM_2D2 to obtain the two-dimensional eye coordinate LOC_2D_EYE2. Since eye detection is well known in the art, it is not described further here.
Please refer to Fig. 16, which is a schematic diagram of the embodiment 1300 of the eye locating module of the present invention. Compared with the eye locating module 1100, the eye locating module 1300 further comprises a face detecting circuit 1350. The face detecting circuit 1350 identifies the range of the user's face HM_1 in the two-dimensional sensing image SIM_2D1 and the range of the user's face HM_2 in the two-dimensional sensing image SIM_2D2; face detection is well known in the art and is not described further. With the face detecting circuit 1350, the eye positioning circuit 1130 only needs to process the data within the ranges of the faces HM_1 and HM_2 to obtain the two-dimensional eye coordinates LOC_2D_EYE1 and LOC_2D_EYE2. Therefore, compared with the eye locating module 1100, the eye locating module 1300 reduces the portion of the two-dimensional sensing images SIM_2D1 and SIM_2D2 that the eye positioning circuit has to process, and increases the processing speed of the eye locating module.
When the three-dimensional display system 310 is implemented as a glasses-type stereoscopic display system, the user's eyes may be covered by the auxiliary glasses. The present invention therefore provides, in Fig. 17, another embodiment 1400 of the eye positioning circuit. Here the three-dimensional display system 310 comprises a display screen 311 and auxiliary glasses 312; the user wears the auxiliary glasses 312 to receive the left image DIM_L and the right image DIM_R provided by the display screen 311. The eye positioning circuit 1400 comprises a glasses detecting circuit 1410 and a glasses coordinate conversion circuit 1420. The glasses detecting circuit 1410 detects the auxiliary glasses 312 in the two-dimensional sensing image SIM_2D1 to obtain a two-dimensional glasses coordinate LOC_GLASS1 and a glasses slope SL_GLASS1, and detects the auxiliary glasses 312 in the two-dimensional sensing image SIM_2D2 to obtain a two-dimensional glasses coordinate LOC_GLASS2 and a glasses slope SL_GLASS2. The glasses coordinate conversion circuit 1420 computes the user's two-dimensional eye coordinates LOC_2D_EYE1 and LOC_2D_EYE2 from the two-dimensional glasses coordinates LOC_GLASS1 and LOC_GLASS2, the glasses slopes SL_GLASS1 and SL_GLASS2, and a known interpupillary distance D_EYE that the user enters into the stereoscopic interaction system 300 in advance or that the system 300 predefines. In this way, even when the user's eyes are covered by the glasses, the eye locating module of the present invention can still obtain the user's two-dimensional eye coordinates LOC_2D_EYE1 and LOC_2D_EYE2 through the eye positioning circuit 1400.
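The conversion from a glasses coordinate and slope to two eye coordinates can be sketched as follows, under the assumption that the detected glasses coordinate is the midpoint between the eyes and that the interpupillary distance D_EYE has already been scaled into image pixels; both assumptions are for illustration only:

```python
import math

def eyes_from_glasses(loc_glass_px, slope, ipd_px):
    """Sketch of the kind of conversion circuit 1420 performs: place the two eyes
    half an interpupillary distance from the glasses midpoint, along the direction
    given by the glasses slope. Which returned point is the left eye depends on
    the camera's viewing direction, so they are labeled a and b here."""
    cx, cy = loc_glass_px
    theta = math.atan(slope)                       # tilt of the line joining the eyes
    dx, dy = 0.5 * ipd_px * math.cos(theta), 0.5 * ipd_px * math.sin(theta)
    eye_a = (cx - dx, cy - dy)
    eye_b = (cx + dx, cy + dy)
    return eye_a, eye_b
```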
Please refer to Fig. 18, which is a schematic diagram of another embodiment 1500 of the eye positioning circuit of the present invention. Compared with the eye positioning circuit 1400, the eye positioning circuit 1500 further comprises a tilt detector 1530. The tilt detector 1530 may be mounted on the auxiliary glasses 312 and produces tilt information INFO_TILT according to the tilt angle of the auxiliary glasses 312; for example, the tilt detector 1530 is a gyroscope. When only a few pixels in the two-dimensional sensing images SIM_2D1 and SIM_2D2 correspond to the auxiliary glasses 312, the glasses slopes SL_GLASS1 and SL_GLASS2 computed by the glasses detecting circuit 1410 are prone to error. Using the tilt information INFO_TILT provided by the tilt detector 1530, the glasses coordinate conversion circuit 1420 can correct the glasses slopes SL_GLASS1 and SL_GLASS2 computed by the glasses detecting circuit 1410 and produce corrected glasses slopes SL_GLASS1_C and SL_GLASS2_C. The glasses coordinate conversion circuit 1420 then computes the two-dimensional eye coordinates LOC_2D_EYE1 and LOC_2D_EYE2 from the two-dimensional glasses coordinates LOC_GLASS1 and LOC_GLASS2, the corrected glasses slopes SL_GLASS1_C and SL_GLASS2_C, and the known interpupillary distance D_EYE. That is, compared with the eye positioning circuit 1400, the eye positioning circuit 1500 allows the glasses coordinate conversion circuit 1420 to correct the error the glasses detecting circuit 1410 makes when computing the glasses slopes SL_GLASS1 and SL_GLASS2, so that the user's two-dimensional eye coordinates LOC_2D_EYE1 and LOC_2D_EYE2 are computed more accurately.
Please refer to Figure 19. Figure 19 is a schematic diagram of another embodiment 1600 of the eyes positioning circuit. Compared with the eyes positioning circuit 1400, the eyes positioning circuit 1600 further comprises an infrared light emitting component 1640, an infrared light reflecting component 1650, and an infrared light sensing circuit 1660. The infrared light emitting component 1640 emits detecting light L_D toward the scene SC. The infrared light reflecting component 1650 is arranged on the auxiliary eyeglasses 312 and reflects the detecting light L_D to produce reflected light L_R. According to the reflected light L_R, the infrared light sensing circuit 1660 produces a two-dimensional infrared light coordinate LOC_IR corresponding to the position of the auxiliary eyeglasses 312 and an infrared light slope SL_IR corresponding to the tilt angle of the auxiliary eyeglasses 312. Similar to the explanation of Figure 18, the glasses coordinate conversion circuit 1420 can use the information provided by the infrared light sensing circuit 1660 (the two-dimensional infrared light coordinate LOC_IR and the infrared light slope SL_IR) to correct the glasses slopes SL_GLASS1 and SL_GLASS2 calculated by the glasses detecting circuit 1410, producing corrected glasses slopes SL_GLASS1_C and SL_GLASS2_C. Thus, compared with the eyes positioning circuit 1400, the eyes positioning circuit 1600 allows the glasses coordinate conversion circuit 1420 to correct the error produced when the glasses detecting circuit 1410 calculates the glasses slopes SL_GLASS1 and SL_GLASS2, so that the user's two-dimensional eye coordinates LOC_2D_EYE1 and LOC_2D_EYE2 are calculated more correctly. In addition, the eyes positioning circuit 1600 may have a plurality of infrared light reflecting components 1650. For example, in Figure 20 the eyes positioning circuit 1600 has two infrared light reflecting components 1650, arranged to correspond to the positions of the user's two eyes; in Figure 20 they are shown above the user's eyes as an example. The eyes positioning circuit 1600 of Figure 19 has only one infrared light reflecting component 1650, so the infrared light sensing circuit 1660 must detect the orientation of that single reflecting component to calculate the infrared light slope SL_IR. In Figure 20, however, when the infrared light sensing circuit 1660 detects the reflected light L_R produced by the two infrared light reflecting components 1650, it can locate the two reflecting components and calculate the infrared light slope SL_IR from their positions. Therefore, the eyes positioning circuit 1600 implemented in the manner of Figure 20 can obtain the infrared light slope SL_IR more simply and more accurately, and hence calculate the user's two-dimensional eye coordinates LOC_2D_EYE1 and LOC_2D_EYE2 more correctly.
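With two reflecting components the slope reduces to the slope of the line through the two detected marker positions, which is presumably why the Figure 20 arrangement is described as simpler. A sketch of that two-marker case follows; the variable names are illustrative only.

```python
def slope_from_two_markers(p1, p2):
    """Infrared light slope from two detected marker positions (x, y)
    in the sensing image."""
    (x1, y1), (x2, y2) = p1, p2
    if x1 == x2:
        raise ValueError("markers are vertically aligned; slope is undefined")
    return (y2 - y1) / (x2 - x1)

# Example: markers detected above each eye in the sensing image.
print(slope_from_two_markers((300, 238), (360, 242)))  # ~0.067
```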
In addition, in the eyes positioning circuit 1600 illustrated in Figure 19 and Figure 20, when the rotation of the user's head is large, the infrared light reflecting component 1650 may be deflected too far, so that the infrared light sensing circuit 1660 cannot sense enough energy of the reflected light L_R and therefore cannot correctly calculate the infrared light slope SL_IR. The present invention therefore further provides another embodiment 2300 of the eyes positioning circuit. Figure 21 and Figure 22 are schematic diagrams illustrating the eyes positioning circuit 2300. Compared with the eyes positioning circuit 1400, the eyes positioning circuit 2300 further comprises one or more infrared light emitting components 2340 and an infrared light sensing circuit 2360. The structure and operating principle of the infrared light emitting component 2340 and the infrared light sensing circuit 2360 are similar to those of the infrared light emitting component 1640 and the infrared light sensing circuit 1660, respectively. In the eyes positioning circuit 2300, the infrared light emitting component 2340 is arranged directly at a position corresponding to the user's eyes. In this way, even when the rotation of the user's head is large, the infrared light sensing circuit 2360 can still sense enough energy of the detecting light L_D to detect the infrared light emitting component 2340 and calculate the infrared light slope SL_IR accordingly. In Figure 21, the eyes positioning circuit 2300 has one infrared light emitting component 2340, arranged approximately midway between the user's eyes. In Figure 22, the eyes positioning circuit 2300 has two infrared light emitting components 2340, arranged above the user's two eyes respectively. Whereas Figure 21 has only a single infrared light emitting component 2340, in Figure 22 the infrared light sensing circuit 2360 can calculate the infrared light slope SL_IR directly from the positions of the two infrared light emitting components 2340 once it detects them, without having to detect the orientation of a single emitting component. Therefore, the eyes positioning circuit 2300 implemented in the manner of Figure 22 can obtain the infrared light slope SL_IR more simply and more accurately, and hence calculate the user's two-dimensional eye coordinates LOC_2D_EYE1 and LOC_2D_EYE2 more correctly.
Please refer to Figure 23. Figure 23 is a schematic diagram of another embodiment 1700 of the eyes locating module of the present invention. The eyes locating module 1700 comprises a three-dimensional scene sensor 1710 and an eye coordinate generating circuit 1720. The three-dimensional scene sensor 1710 senses the scene SC containing the user to produce a two-dimensional sensing image SIM_2D3 and distance information INFO_D corresponding to the two-dimensional sensing image SIM_2D3. The distance information INFO_D records, for every point of the two-dimensional sensing image SIM_2D3, the distance between that point and the three-dimensional scene sensor 1710. The eye coordinate generating circuit 1720 produces a three-dimensional eye coordinate LOC_3D_EYE according to the two-dimensional sensing image SIM_2D3 and the distance information INFO_D. For example, the eye coordinate generating circuit 1720 first identifies the pixels of the two-dimensional sensing image SIM_2D3 that correspond to the user's eyes; it then looks up, in the distance information INFO_D, the distance between the three-dimensional scene sensor 1710 and the part of the scene SC sensed by those pixels. From the positions of those eye pixels in the two-dimensional sensing image SIM_2D3 and the corresponding distance data in INFO_D, the eye coordinate generating circuit 1720 produces the three-dimensional eye coordinate LOC_3D_EYE.
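One standard way to realize this step is pinhole back-projection of a pixel with a known depth. The sketch below assumes a simple pinhole model with known focal lengths and principal point, which the patent does not state explicitly; it is an illustration, not the claimed circuit.

```python
import numpy as np

def pixel_to_3d(u, v, depth, fx, fy, cx, cy):
    """Back-project an eye pixel (u, v) with measured depth into a 3-D
    coordinate in the sensor frame, assuming a pinhole camera model.

    fx, fy : focal lengths in pixels      cx, cy : principal point
    depth  : distance for that pixel taken from INFO_D
    """
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return np.array([x, y, depth])

# Example: eye pixel at (350, 230), 0.8 m from the scene sensor.
print(pixel_to_3d(350, 230, 0.8, fx=600, fy=600, cx=320, cy=240))
```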
Please refer to Figure 24. Figure 24 is a schematic diagram of an embodiment 1800 of the three-dimensional scene sensor of the present invention. The three-dimensional scene sensor 1800 comprises an image sensor 1810, an infrared light emitting component 1820, and a light-sensing distance measuring device 1830. The image sensor 1810 senses the scene SC to produce the two-dimensional sensing image SIM_2D3. The infrared light emitting component 1820 emits detecting light L_D toward the scene SC so that the scene SC produces reflected light L_R. The light-sensing distance measuring device 1830 senses the reflected light L_R to produce the distance information INFO_D. For example, the light-sensing distance measuring device 1830 is a Z-sensor; since the Z-sensor is a known technique in the industry, it is not described further here.
Please refer to Figure 25. Figure 25 is a schematic diagram of an embodiment 1900 of the eye coordinate generating circuit of the present invention. The eye coordinate generating circuit 1900 comprises an eye detecting circuit 1910 and a three-dimensional coordinate conversion circuit 1920. The eye detecting circuit 1910 detects the user's eyes in the two-dimensional sensing image SIM_2D3 to obtain a two-dimensional eye coordinate LOC_2D_EYE3. The three-dimensional coordinate conversion circuit 1920 calculates the three-dimensional eye coordinate LOC_3D_EYE according to the two-dimensional eye coordinate LOC_2D_EYE3, the distance information INFO_D, the predetermined measuring position LOC_MD of the light-sensing distance measuring device 1830 (as shown in Figure 24), and the predetermined sensing position LOC_SEN3 of the image sensor 1810 (as shown in Figure 24).
Please refer to Figure 26. Figure 26 is a schematic diagram of an embodiment 2000 of the eye coordinate generating circuit of the present invention. Compared with the eye coordinate generating circuit 1900, the eye coordinate generating circuit 2000 further comprises a human face detecting circuit 2030. The human face detecting circuit 2030 identifies the range of the user's face HM_3 in the two-dimensional sensing image SIM_2D3. With the human face detecting circuit 2030, the eye detecting circuit 1910 only needs to process the data within the range of the face HM_3 to obtain the two-dimensional eye coordinate LOC_2D_EYE3. Therefore, compared with the eye coordinate generating circuit 1900, the eye coordinate generating circuit 2000 reduces the portion of the two-dimensional sensing image SIM_2D3 that the eye detecting circuit 1910 must process, which increases the processing speed of the eye coordinate generating circuit 2000.
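A common realization of this two-stage search is to run a face detector first and restrict the eye search to the returned face rectangle. The OpenCV Haar-cascade sketch below is one way to do it and is not taken from the patent; the cascade file names are the stock ones shipped with OpenCV.

```python
import cv2

def find_eye_coordinates(gray_image):
    """Restrict the eye search to a detected face region, as in the
    two-stage circuit: face detection first, then eye detection inside it."""
    face_cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    eye_cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_eye.xml")

    eyes_found = []
    for (x, y, w, h) in face_cascade.detectMultiScale(gray_image, 1.1, 5):
        roi = gray_image[y:y + h, x:x + w]          # face region HM_3
        for (ex, ey, ew, eh) in eye_cascade.detectMultiScale(roi, 1.1, 5):
            # Report eye centers in full-image coordinates.
            eyes_found.append((x + ex + ew // 2, y + ey + eh // 2))
    return eyes_found
```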
In addition, considering that the user's eyes may be covered by the auxiliary eyeglasses 312 when the three-dimensional display system 310 is implemented as an eyeglass stereoscopic display system, the present invention provides another embodiment 2100 of the eye coordinate generating circuit in Figure 27. The eye coordinate generating circuit 2100 comprises a glasses detecting circuit 2110 and a glasses coordinate conversion circuit 2120. The glasses detecting circuit 2110 detects the auxiliary eyeglasses 312 in the two-dimensional sensing image SIM_2D3 to obtain a two-dimensional glasses coordinate LOC_GLASS3 and a glasses slope SL_GLASS3. The glasses coordinate conversion circuit 2120 calculates the user's three-dimensional eye coordinate LOC_3D_EYE according to the two-dimensional glasses coordinate LOC_GLASS3, the glasses slope SL_GLASS3, the known binocular interval D_EYE that the user enters into the stereoscopic interaction system 300 in advance or that the stereoscopic interaction system 300 predefines, and the distance information INFO_D. In this way, even when the user's eyes are covered by the glasses, the eye coordinate generating circuit 2100 of the present invention can still calculate the user's three-dimensional eye coordinate LOC_3D_EYE.
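Combining the ideas above, one possible data flow is to place the eyes along the glasses line first and then back-project them with the depth read from INFO_D at those pixels. The sketch below chains the two hypothetical helpers from the earlier sketches (eyes_from_glasses and pixel_to_3d) and is again only an assumed illustration, not the claimed circuit.

```python
def eyes_3d_from_glasses(loc_glass, sl_glass, d_eye_px, depth_map,
                         fx, fy, cx, cy):
    """Illustrative pipeline: glasses coordinate + slope + known binocular
    interval -> 2-D eye pixels -> 3-D eye coordinates via the depth map.
    Reuses eyes_from_glasses() and pixel_to_3d() defined in earlier sketches."""
    left_2d, right_2d = eyes_from_glasses(loc_glass, sl_glass, d_eye_px)
    eyes_3d = []
    for (u, v) in (left_2d, right_2d):
        depth = depth_map[int(round(v)), int(round(u))]   # INFO_D lookup
        eyes_3d.append(pixel_to_3d(u, v, depth, fx, fy, cx, cy))
    return eyes_3d
```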
Please refer to Figure 28. Figure 28 is a schematic diagram of another embodiment 2200 of the eye coordinate generating circuit provided by the present invention. Compared with the eye coordinate generating circuit 2100, the eye coordinate generating circuit 2200 further comprises an inclination detector 2230. The inclination detector 2230 can be arranged on the auxiliary eyeglasses 312; its structure and operating principle are similar to those of the inclination detector 1530 and are not described further here. Using the inclination information INFO_TILT provided by the inclination detector 2230, the glasses coordinate conversion circuit 2120 can correct the glasses slope SL_GLASS3 calculated by the glasses detecting circuit 2110. For example, the glasses coordinate conversion circuit 2120 corrects the glasses slope SL_GLASS3 according to the inclination information INFO_TILT and produces a corrected glasses slope SL_GLASS3_C. The glasses coordinate conversion circuit 2120 then calculates the three-dimensional eye coordinate LOC_3D_EYE according to the two-dimensional glasses coordinate LOC_GLASS3, the corrected glasses slope SL_GLASS3_C, the known binocular interval D_EYE, and the distance information INFO_D. Thus, compared with the eye coordinate generating circuit 2100, the eye coordinate generating circuit 2200 allows the glasses coordinate conversion circuit 2120 to correct the error produced when the glasses detecting circuit 2110 calculates the glasses slope SL_GLASS3, so that the user's three-dimensional eye coordinate LOC_3D_EYE is calculated more correctly.
In summary, the stereoscopic interaction system 300 provided by the present invention can correct the position of the interactive component, or the position of the virtual object in the stereoscopic image and the interactive judgment condition, according to the user's position. Thus, even if a change in the user's position changes where the user perceives the virtual object in the stereoscopic image, the stereoscopic interaction system of the present invention can still obtain a correct interaction result according to the corrected position of the interactive component, or the corrected position of the virtual object and the corrected interactive judgment condition. Moreover, when the locating module of the present invention is an eyes locating module, even if the user wears the auxiliary eyeglasses of an eyeglass stereoscopic display system and the user's eyes are therefore covered, the eyes locating module provided by the present invention can still calculate the position of the user's eyes from the known binocular interval entered by the user in advance, which brings the user greater convenience.
The above are only preferred embodiments of the present invention; all equivalent changes and modifications made according to the claims of the present invention shall fall within the scope of the present invention.

Claims (25)

1. An interactive module applied to a stereoscopic interaction system, the stereoscopic interaction system having a three-dimensional display system, the three-dimensional display system being used to provide a stereoscopic image, the stereoscopic image having a virtual object, the virtual object having a virtual coordinate and a judgment condition, the interactive module being characterized by comprising:
a locating module, used for detecting the position of a user in a scene to produce a three-dimensional reference coordinate;
an interactive component;
an interactive component locating module, used for detecting the position of the interactive component to produce a three-dimensional interactive coordinate; and
an interactive decision circuit, used for converting the virtual coordinate into a corrected virtual coordinate according to the three-dimensional reference coordinate, and for controlling the stereoscopic image according to the three-dimensional interactive coordinate, the corrected virtual coordinate and the judgment condition.
2. The interactive module of claim 1, characterized in that the interactive decision circuit converts the judgment condition into a corrected interactive judgment condition according to the three-dimensional reference coordinate; the interactive decision circuit controls the stereoscopic image according to the three-dimensional interactive coordinate, the corrected virtual coordinate and the corrected interactive judgment condition; the interactive decision circuit calculates a critical surface according to an interactive critical distance and the virtual coordinate; the interactive decision circuit converts the critical surface into a corrected critical surface according to the three-dimensional reference coordinate; and the corrected interactive judgment condition indicates contact when the three-dimensional interactive coordinate enters the corrected critical surface.
3. The interactive module of claim 1, characterized in that the locating module is an eyes locating module, the eyes locating module being used for detecting the position of the user's eyes in the scene to produce a three-dimensional eye coordinate as the three-dimensional reference coordinate;
wherein the three-dimensional display system comprises a display screen and an auxiliary eyeglasses, the display screen being used to provide a left image and a right image, and the auxiliary eyeglasses being used to assist in receiving the left image and the right image so as to obtain the stereoscopic image;
wherein the eyes locating module comprises:
a first image sensor, used for sensing the scene to produce a first two-dimensional sensing image;
a second image sensor, used for sensing the scene to produce a second two-dimensional sensing image;
an eyes positioning circuit, comprising:
a glasses detecting circuit, used for detecting the auxiliary eyeglasses in the first two-dimensional sensing image to obtain a first two-dimensional glasses coordinate and a first glasses slope, and for detecting the auxiliary eyeglasses in the second two-dimensional sensing image to obtain a second two-dimensional glasses coordinate and a second glasses slope; and
a glasses coordinate conversion circuit, used for calculating a first two-dimensional eye coordinate and a second two-dimensional eye coordinate according to the first two-dimensional glasses coordinate, the first glasses slope, the second two-dimensional glasses coordinate, the second glasses slope and a known binocular interval; and
a three-dimensional coordinate conversion circuit, used for calculating the three-dimensional eye coordinate according to the first two-dimensional eye coordinate, the second two-dimensional eye coordinate, a first sensing position of the first image sensor and a second sensing position of the second image sensor.
4. The interactive module of claim 3, characterized in that the eyes positioning circuit further comprises an inclination detector; the inclination detector is arranged on the auxiliary eyeglasses; the inclination detector is used for producing inclination information according to the tilt angle of the auxiliary eyeglasses; and the glasses coordinate conversion circuit calculates the first two-dimensional eye coordinate and the second two-dimensional eye coordinate according to the inclination information, the first two-dimensional glasses coordinate, the first glasses slope, the second two-dimensional glasses coordinate, the second glasses slope and the known binocular interval.
5. The interactive module of claim 3, characterized in that the eyes positioning circuit further comprises:
a first infrared light emitting component, used for emitting a first detecting light; and
an infrared light sensing circuit, used for producing a two-dimensional infrared light coordinate and an infrared light slope according to the first detecting light;
wherein the glasses coordinate conversion circuit calculates the first two-dimensional eye coordinate and the second two-dimensional eye coordinate according to the infrared light slope, the two-dimensional infrared light coordinate, the first two-dimensional glasses coordinate, the first glasses slope, the second two-dimensional glasses coordinate, the second glasses slope and the known binocular interval.
6. The interactive module of claim 1, characterized in that the locating module is an eyes locating module, the eyes locating module being used for detecting the position of the user's eyes in the scene to produce a three-dimensional eye coordinate as the three-dimensional reference coordinate;
wherein the three-dimensional display system comprises a display screen and an auxiliary eyeglasses, the display screen being used to provide a left image and a right image, and the auxiliary eyeglasses being used to assist in receiving the left image and the right image so as to obtain the stereoscopic image;
wherein the eyes locating module comprises:
a three-dimensional scene sensor, comprising:
a third image sensor, used for sensing the scene to produce a third two-dimensional sensing image;
an infrared light emitting component, used for emitting a detecting light toward the scene so that the scene produces a reflected light; and
a light-sensing distance measuring device, used for sensing the reflected light to produce distance information;
wherein the distance information contains the distance between every point of the third two-dimensional sensing image and the three-dimensional scene sensor; and
an eye coordinate generating circuit, comprising:
a glasses detecting circuit, used for detecting the auxiliary eyeglasses in the third two-dimensional sensing image to obtain a third two-dimensional glasses coordinate and a third glasses slope; and
a glasses coordinate conversion circuit, used for calculating the three-dimensional eye coordinate according to the third two-dimensional glasses coordinate, the third glasses slope, a known binocular interval and the distance information.
7. The interactive module of claim 1, characterized in that the locating module is an eyes locating module, the eyes locating module being used for detecting the position of the user's eyes in the scene to produce a three-dimensional eye coordinate as the three-dimensional reference coordinate;
wherein the eyes locating module comprises:
a three-dimensional scene sensor, used for sensing the scene to produce a third two-dimensional sensing image and distance information corresponding to the third two-dimensional sensing image;
wherein the distance information contains the distance between every point of the third two-dimensional sensing image and the three-dimensional scene sensor; and
an eye coordinate generating circuit, comprising:
an eye detecting circuit, used for detecting the eyes in the third two-dimensional sensing image to obtain a third two-dimensional eye coordinate; and
a three-dimensional coordinate conversion circuit, used for calculating the three-dimensional eye coordinate according to the third two-dimensional eye coordinate, the distance information, a measuring position of a light-sensing distance measuring device, and a third sensing position of a third image sensor.
8. An interactive module applied to a stereoscopic interaction system, the stereoscopic interaction system having a three-dimensional display system, the three-dimensional display system being used to provide a stereoscopic image, the stereoscopic image having a virtual object, the virtual object having a virtual coordinate and a judgment condition, the interactive module being characterized by comprising:
a locating module, used for detecting the position of a user in a scene to produce a three-dimensional reference coordinate;
an interactive component;
an interactive component locating module, used for detecting the position of the interactive component to produce a three-dimensional interactive coordinate; and
an interactive decision circuit, used for converting the three-dimensional interactive coordinate into a three-dimensional corrected interactive coordinate according to the three-dimensional reference coordinate, and for controlling the stereoscopic image according to the three-dimensional corrected interactive coordinate, the virtual coordinate and the judgment condition.
9. The interactive module of claim 8, characterized in that the locating module is an eyes locating module, the eyes locating module being used for detecting the position of the user's eyes in the scene to produce a three-dimensional eye coordinate as the three-dimensional reference coordinate; the interactive decision circuit derives, according to the three-dimensional eye coordinate and the three-dimensional interactive coordinate, a three-dimensional left interactive coordinate and a three-dimensional right interactive coordinate at which the interactive component is projected onto the three-dimensional display system; the interactive decision circuit determines a left reference straight line according to the three-dimensional left interactive coordinate and a known left-eye coordinate, and determines a right reference straight line according to the three-dimensional right interactive coordinate and a known right-eye coordinate; and the interactive decision circuit obtains the three-dimensional corrected interactive coordinate according to the left reference straight line and the right reference straight line.
10. The interactive module of claim 9, characterized in that when the left reference straight line and the right reference straight line intersect, the interactive decision circuit obtains the three-dimensional corrected interactive coordinate according to the coordinate of the intersection point of the left reference straight line and the right reference straight line; and when the left reference straight line and the right reference straight line do not intersect, the interactive decision circuit obtains, according to the left reference straight line and the right reference straight line, a reference midpoint having the minimum distance to the left reference straight line and the right reference straight line, the distance between the reference midpoint and the left reference straight line being equal to the distance between the reference midpoint and the right reference straight line, and the interactive decision circuit obtains the three-dimensional corrected interactive coordinate according to the coordinate of the reference midpoint.
11. The interactive module of claim 9, characterized in that the interactive decision circuit obtains a central point according to the left reference straight line and the right reference straight line; the interactive decision circuit determines a search area according to the central point, the search area containing M search points; the interactive decision circuit determines, according to the known eye coordinates, the M search points and the three-dimensional eye coordinate, M end points corresponding to the M search points in the coordinate system corresponding to the three-dimensional eye coordinate; the interactive decision circuit determines M error distances corresponding to the M end points according to the positions of the M end points and the three-dimensional interactive coordinate, respectively; and the interactive decision circuit determines the three-dimensional corrected interactive coordinate according to a K-th end point of the M end points having the minimum error distance; wherein M and K each represent a positive integer, and K ≤ M;
wherein the interactive decision circuit determines a left search projection coordinate and a right search projection coordinate according to a K-th search point of the M search points and the known eye coordinates; and the interactive decision circuit obtains, according to the left search projection coordinate, the right search projection coordinate and the three-dimensional eye coordinate, the K-th end point of the M end points corresponding to the K-th search point.
12. The interactive module of claim 8, characterized in that the locating module is an eyes locating module, the eyes locating module being used for detecting the position of the user's eyes in the scene to produce a three-dimensional eye coordinate as the three-dimensional reference coordinate;
wherein there are M search points in the coordinate system corresponding to a known eye coordinate; the interactive decision circuit determines, according to the known eye coordinate, the M search points and the three-dimensional eye coordinate, M end points corresponding to the M search points in the coordinate system corresponding to the three-dimensional eye coordinate; the interactive decision circuit determines M error distances corresponding to the M end points according to the positions of the M end points and the three-dimensional interactive coordinate, respectively; and the interactive decision circuit determines the three-dimensional corrected interactive coordinate according to a K-th end point of the M end points having the minimum error distance; wherein M and K each represent a positive integer, and K ≤ M;
wherein the interactive decision circuit determines a left search projection coordinate and a right search projection coordinate according to a K-th search point of the M search points and the known eye coordinate; and the interactive decision circuit obtains, according to the left search projection coordinate, the right search projection coordinate and the three-dimensional eye coordinate, the K-th end point of the M end points corresponding to the K-th search point.
13. The interactive module of claim 8, characterized in that the locating module is an eyes locating module, the eyes locating module being used for detecting the position of the user's eyes in the scene to produce a three-dimensional eye coordinate as the three-dimensional reference coordinate;
wherein the three-dimensional display system comprises a display screen and an auxiliary eyeglasses, the display screen being used to provide a left image and a right image, and the auxiliary eyeglasses being used to assist in receiving the left image and the right image so as to obtain the stereoscopic image;
wherein the eyes locating module comprises:
a first image sensor, used for sensing the scene to produce a first two-dimensional sensing image;
a second image sensor, used for sensing the scene to produce a second two-dimensional sensing image;
an eyes positioning circuit, comprising:
a glasses detecting circuit, used for detecting the auxiliary eyeglasses in the first two-dimensional sensing image to obtain a first two-dimensional glasses coordinate and a first glasses slope, and for detecting the auxiliary eyeglasses in the second two-dimensional sensing image to obtain a second two-dimensional glasses coordinate and a second glasses slope; and
a glasses coordinate conversion circuit, used for calculating a first two-dimensional eye coordinate and a second two-dimensional eye coordinate according to the first two-dimensional glasses coordinate, the first glasses slope, the second two-dimensional glasses coordinate, the second glasses slope and a known binocular interval; and
a three-dimensional coordinate conversion circuit, used for calculating the three-dimensional eye coordinate according to the first two-dimensional eye coordinate, the second two-dimensional eye coordinate, a first sensing position of the first image sensor and a second sensing position of the second image sensor.
14. The interactive module of claim 13, characterized in that the eyes positioning circuit further comprises an inclination detector; the inclination detector is arranged on the auxiliary eyeglasses; the inclination detector is used for producing inclination information according to the tilt angle of the auxiliary eyeglasses; and the glasses coordinate conversion circuit calculates the first two-dimensional eye coordinate and the second two-dimensional eye coordinate according to the inclination information, the first two-dimensional glasses coordinate, the first glasses slope, the second two-dimensional glasses coordinate, the second glasses slope and the known binocular interval.
15. The interactive module of claim 13, characterized in that the eyes positioning circuit further comprises:
a first infrared light emitting component, used for emitting a first detecting light; and
an infrared light sensing circuit, used for producing a two-dimensional infrared light coordinate and an infrared light slope according to the first detecting light;
wherein the glasses coordinate conversion circuit calculates the first two-dimensional eye coordinate and the second two-dimensional eye coordinate according to the infrared light slope, the two-dimensional infrared light coordinate, the first two-dimensional glasses coordinate, the first glasses slope, the second two-dimensional glasses coordinate, the second glasses slope and the known binocular interval.
16. The interactive module of claim 8, characterized in that the locating module is an eyes locating module, the eyes locating module being used for detecting the position of the user's eyes in the scene to produce a three-dimensional eye coordinate as the three-dimensional reference coordinate;
wherein the three-dimensional display system comprises a display screen and an auxiliary eyeglasses, the display screen being used to provide a left image and a right image, and the auxiliary eyeglasses being used to assist in receiving the left image and the right image so as to obtain the stereoscopic image;
wherein the eyes locating module comprises:
a three-dimensional scene sensor, comprising:
a third image sensor, used for sensing the scene to produce a third two-dimensional sensing image;
an infrared light emitting component, used for emitting a detecting light toward the scene so that the scene produces a reflected light; and
a light-sensing distance measuring device, used for sensing the reflected light to produce distance information corresponding to the third two-dimensional sensing image;
wherein the distance information contains the distance between every point of the third two-dimensional sensing image and the three-dimensional scene sensor; and
an eye coordinate generating circuit, comprising:
a glasses detecting circuit, used for detecting the auxiliary eyeglasses in the third two-dimensional sensing image to obtain a third two-dimensional glasses coordinate and a third glasses slope; and
a glasses coordinate conversion circuit, used for calculating the three-dimensional eye coordinate according to the third two-dimensional glasses coordinate, the third glasses slope, a known binocular interval and the distance information.
17. The interactive module of claim 8, characterized in that the locating module is an eyes locating module, the eyes locating module being used for detecting the position of the user's eyes in the scene to produce a three-dimensional eye coordinate as the three-dimensional reference coordinate;
wherein the eyes locating module comprises:
a three-dimensional scene sensor, used for sensing the scene to produce a third two-dimensional sensing image and distance information corresponding to the third two-dimensional sensing image;
wherein the distance information contains the distance between every point of the third two-dimensional sensing image and the three-dimensional scene sensor; and
an eye coordinate generating circuit, comprising:
an eye detecting circuit, used for detecting the eyes in the third two-dimensional sensing image to obtain a third two-dimensional eye coordinate; and
a three-dimensional coordinate conversion circuit, used for calculating the three-dimensional eye coordinate according to the third two-dimensional eye coordinate, the distance information, a measuring position of a light-sensing distance measuring device, and a third sensing position of a third image sensor.
18. A method for controlling a stereoscopic image in a stereoscopic interaction system, the stereoscopic interaction system having a three-dimensional display system and an interactive component, the three-dimensional display system being used to provide the stereoscopic image, the stereoscopic image having a virtual object, the virtual object having a virtual coordinate and a judgment condition, the method being characterized by comprising:
detecting the position of a user in a scene to produce a three-dimensional reference coordinate;
detecting the position of the interactive component to produce a three-dimensional interactive coordinate; and
controlling the stereoscopic image according to the three-dimensional reference coordinate, the three-dimensional interactive coordinate, the virtual coordinate and the judgment condition.
19. The method of claim 18, characterized in that detecting the position of the user in the scene to produce the three-dimensional reference coordinate comprises detecting the position of the user's eyes in the scene to produce a three-dimensional eye coordinate as the three-dimensional reference coordinate;
wherein controlling the stereoscopic image according to the three-dimensional reference coordinate, the three-dimensional interactive coordinate, the virtual coordinate and the judgment condition comprises:
converting the virtual coordinate into a corrected virtual coordinate according to the three-dimensional eye coordinate; and
controlling the stereoscopic image according to the three-dimensional interactive coordinate, the corrected virtual coordinate and the judgment condition.
20. The method of claim 18, characterized in that detecting the position of the user in the scene to produce the three-dimensional reference coordinate comprises detecting the position of the user's eyes in the scene to produce a three-dimensional eye coordinate as the three-dimensional reference coordinate;
wherein controlling the stereoscopic image according to the three-dimensional reference coordinate, the three-dimensional interactive coordinate, the virtual coordinate and the judgment condition comprises:
converting the virtual coordinate into a corrected virtual coordinate according to the three-dimensional eye coordinate;
converting the judgment condition into a corrected interactive judgment condition according to the three-dimensional eye coordinate; and
controlling the stereoscopic image according to the three-dimensional interactive coordinate, the corrected virtual coordinate and the corrected interactive judgment condition;
wherein converting the judgment condition into the corrected interactive judgment condition according to the three-dimensional eye coordinate comprises:
calculating a critical surface according to an interactive critical distance and the virtual coordinate; and
converting the critical surface into a corrected critical surface according to the three-dimensional eye coordinate;
wherein the corrected interactive judgment condition indicates contact when the three-dimensional interactive coordinate enters the corrected critical surface.
21. The method of claim 18, characterized in that detecting the position of the user in the scene to produce the three-dimensional reference coordinate comprises detecting the position of the user's eyes in the scene to produce a three-dimensional eye coordinate as the three-dimensional reference coordinate;
wherein controlling the stereoscopic image according to the three-dimensional eye coordinate, the three-dimensional interactive coordinate, the virtual coordinate and the judgment condition comprises:
converting the three-dimensional interactive coordinate into a three-dimensional corrected interactive coordinate according to the three-dimensional eye coordinate; and
controlling the stereoscopic image according to the three-dimensional corrected interactive coordinate, the virtual coordinate and the judgment condition;
wherein the judgment condition indicates contact when the distance between the three-dimensional corrected interactive coordinate and the virtual coordinate is less than an interactive critical distance.
22. The method of claim 21, characterized in that converting the three-dimensional interactive coordinate into the three-dimensional corrected interactive coordinate according to the three-dimensional eye coordinate comprises:
deriving, according to the three-dimensional eye coordinate and the three-dimensional interactive coordinate, a three-dimensional left interactive coordinate and a three-dimensional right interactive coordinate at which the interactive component is projected onto the three-dimensional display system;
determining a left reference straight line according to the three-dimensional left interactive coordinate and a known left-eye coordinate, and determining a right reference straight line according to the three-dimensional right interactive coordinate and a known right-eye coordinate; and
obtaining the three-dimensional corrected interactive coordinate according to the left reference straight line and the right reference straight line.
23. The method of claim 22, characterized in that obtaining the three-dimensional corrected interactive coordinate according to the left reference straight line and the right reference straight line comprises:
when the left reference straight line and the right reference straight line intersect, obtaining the three-dimensional corrected interactive coordinate according to the coordinate of the intersection point of the left reference straight line and the right reference straight line; and
when the left reference straight line and the right reference straight line do not intersect, obtaining, according to the left reference straight line and the right reference straight line, a reference midpoint having the minimum distance to the left reference straight line and the right reference straight line, and obtaining the three-dimensional corrected interactive coordinate according to the coordinate of the reference midpoint;
wherein the distance between the reference midpoint and the left reference straight line is equal to the distance between the reference midpoint and the right reference straight line.
24. The method of claim 23, characterized in that obtaining the three-dimensional corrected interactive coordinate according to the left reference straight line and the right reference straight line comprises:
obtaining a central point according to the left reference straight line and the right reference straight line;
determining a search area according to the central point;
wherein there are M search points in the search area;
determining, according to the known eye coordinates, the M search points and the three-dimensional eye coordinate, M end points corresponding to the M search points;
determining M error distances corresponding to the M end points according to the positions of the M end points and the three-dimensional interactive coordinate, respectively; and
determining the three-dimensional corrected interactive coordinate according to a K-th end point of the M end points having the minimum error distance;
wherein M and K each represent a positive integer, and K ≤ M;
wherein determining, according to the known eye coordinates, the M search points and the three-dimensional eye coordinate, the M end points corresponding to the M search points comprises:
determining a left search projection coordinate and a right search projection coordinate according to a K-th search point of the M search points and the known eye coordinates; and
obtaining, according to the left search projection coordinate, the right search projection coordinate and the three-dimensional eye coordinate, the K-th end point of the M end points corresponding to the K-th search point.
25. The method of claim 21, characterized in that converting the three-dimensional interactive coordinate into the three-dimensional corrected interactive coordinate according to the three-dimensional eye coordinate comprises:
determining, according to a known eye coordinate, M search points in the coordinate system corresponding to the known eye coordinate, and the three-dimensional eye coordinate, M end points corresponding to the M search points in the coordinate system corresponding to the three-dimensional eye coordinate;
determining M error distances corresponding to the M end points according to the positions of the M end points and the three-dimensional interactive coordinate, respectively; and
determining the three-dimensional corrected interactive coordinate according to a K-th end point of the M end points having the minimum error distance;
wherein M and K each represent a positive integer, and K ≤ M;
wherein determining, according to the known eye coordinate, the M search points in the coordinate system corresponding to the known eye coordinate, and the three-dimensional eye coordinate, the M end points corresponding to the M search points in the coordinate system corresponding to the three-dimensional eye coordinate comprises:
determining a left search projection coordinate and a right search projection coordinate according to a K-th search point of the M search points and the known eye coordinate; and
obtaining, according to the left search projection coordinate, the right search projection coordinate and the three-dimensional eye coordinate, the K-th end point of the M end points corresponding to the K-th search point.
CN 201010122713 2010-02-26 2010-02-26 Interaction module applied to stereoscopic interaction system and method of interaction module Expired - Fee Related CN102169364B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN 201010122713 CN102169364B (en) 2010-02-26 2010-02-26 Interaction module applied to stereoscopic interaction system and method of interaction module

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN 201010122713 CN102169364B (en) 2010-02-26 2010-02-26 Interaction module applied to stereoscopic interaction system and method of interaction module

Publications (2)

Publication Number Publication Date
CN102169364A CN102169364A (en) 2011-08-31
CN102169364B true CN102169364B (en) 2013-03-27

Family

ID=44490550

Family Applications (1)

Application Number Title Priority Date Filing Date
CN 201010122713 Expired - Fee Related CN102169364B (en) 2010-02-26 2010-02-26 Interaction module applied to stereoscopic interaction system and method of interaction module

Country Status (1)

Country Link
CN (1) CN102169364B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101382772B1 (en) * 2012-12-11 2014-04-08 현대자동차주식회사 Display system and method
KR102310994B1 (en) * 2014-11-25 2021-10-08 삼성전자주식회사 Computing apparatus and method for providing 3-dimensional interaction
WO2018209043A1 (en) * 2017-05-10 2018-11-15 Microsoft Technology Licensing, Llc Presenting applications within virtual environments

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1687970A (en) * 2005-04-27 2005-10-26 蔡涛 Interactive controlling method for selecting 3-D image body reconstructive partial body
CN101281422A (en) * 2007-04-02 2008-10-08 原相科技股份有限公司 Apparatus and method for generating three-dimensional information based on object as well as using interactive system

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1817651A1 (en) * 2004-10-15 2007-08-15 Philips Intellectual Property & Standards GmbH System for 3d rendering applications using hands

Also Published As

Publication number Publication date
CN102169364A (en) 2011-08-31

Similar Documents

Publication Publication Date Title
US10846864B2 (en) Method and apparatus for detecting gesture in user-based spatial coordinate system
CN105975109B (en) Active capacitance pen and its attitude detecting method, capacitance type touch control screen and touch-control system
US11044402B1 (en) Power management for optical position tracking devices
CN106383596B (en) Virtual reality anti-dizzy system and method based on space positioning
US8477099B2 (en) Portable data processing appartatus
TWI406694B (en) Interactive module applied in a 3d interactive system and method thereof
CN114127669A (en) Trackability enhancement for passive stylus
WO2014093946A1 (en) Calibration and registration of camera arrays using a single optical calibration target
US20140200080A1 (en) 3d device and 3d game device using a virtual touch
US8555205B2 (en) System and method utilized for human and machine interface
JP2013506209A (en) Method and apparatus for detecting a gazing point based on face detection and image measurement
US20150116204A1 (en) Transparent display virtual touch apparatus not displaying pointer
WO2016008265A1 (en) Method and apparatus for locating position
EP2492873B1 (en) Image processing program, image processing apparatus, image processing system, and image processing method
CN102169364B (en) Interaction module applied to stereoscopic interaction system and method of interaction module
TW201830050A (en) Tracking system, tracking device and tracking method
US20200116481A1 (en) Golf voice broadcast rangefinder
US20230256297A1 (en) Virtual evaluation tools for augmented reality exercise experiences
US10466046B1 (en) External display rangefinder
US10582186B1 (en) Approaches for identifying misaligned cameras
Petrič et al. Real-time 3D marker tracking with a WIIMOTE stereo vision system: Application to robotic throwing
US20200159339A1 (en) Desktop spatial stereoscopic interaction system
KR101594740B1 (en) Display System for calculating a Coordinate of Impact Object and Drive Method of the Same
TWI668492B (en) Transparent display device and control method using therefore
CN206863205U (en) The device in autoalign unit course

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20130327

Termination date: 20170226