CN103439794B - Calibration method for head-mounted device and head-mounted device - Google Patents
Calibration method for head-mounted device and head-mounted device
- Publication number
- CN103439794B CN103439794B CN201310412265.8A CN201310412265A CN103439794B CN 103439794 B CN103439794 B CN 103439794B CN 201310412265 A CN201310412265 A CN 201310412265A CN 103439794 B CN103439794 B CN 103439794B
- Authority
- CN
- China
- Prior art keywords
- user
- eyeball
- coordinate
- dimensional
- headset equipment
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Links
- 210000003128 Head Anatomy 0.000 claims description 43
- 230000000875 corresponding Effects 0.000 claims description 11
- 238000001514 detection method Methods 0.000 claims description 9
- 239000011521 glass Substances 0.000 claims description 7
- 239000011159 matrix material Substances 0.000 description 18
- 239000000463 material Substances 0.000 description 8
- 238000000034 method Methods 0.000 description 8
- 230000000694 effects Effects 0.000 description 5
- 210000001747 Pupil Anatomy 0.000 description 4
- 238000010586 diagram Methods 0.000 description 4
- 230000004069 differentiation Effects 0.000 description 4
- 230000004048 modification Effects 0.000 description 3
- 238000006011 modification reaction Methods 0.000 description 3
- 230000000007 visual effect Effects 0.000 description 3
- 229920000049 Carbon (fiber) Polymers 0.000 description 2
- 239000004917 carbon fiber Substances 0.000 description 2
- 238000003379 elimination reaction Methods 0.000 description 2
- 238000005070 sampling Methods 0.000 description 2
- OKTJSMMVPCPJKN-UHFFFAOYSA-N carbon Chemical compound [C] 0.000 description 1
- 229910052799 carbon Inorganic materials 0.000 description 1
- 230000001276 controlling effect Effects 0.000 description 1
- 238000006073 displacement reaction Methods 0.000 description 1
- 230000002708 enhancing Effects 0.000 description 1
- 239000000835 fiber Substances 0.000 description 1
- 230000003993 interaction Effects 0.000 description 1
- 239000003562 lightweight material Substances 0.000 description 1
- 230000003287 optical Effects 0.000 description 1
Abstract
The invention provides a calibration method for a head-mounted device, and a head-mounted device. The calibration method comprises the following steps: detecting eyeball position information of a user; and adjusting the position of at least one display screen according to the eyeball position information of the user. According to the calibration method provided by the embodiment of the invention, the eyeball position information of the user is obtained and the deviation between the user's eyeball position and the display screen position is compared, so that the position of the display screen is adjusted. The adjustment result adapts to the user's eyeball position and is highly targeted, better meeting the personalized needs of the user and enhancing the user experience.
Description
Technical field
The present invention relates to the technical field of intelligent devices, and more particularly, to a calibration method for a head-mounted device and to a head-mounted device.
Background technology
A head-mounted device integrating a camera, a control chip, and the like can form an intelligent head-mounted device (such as smart glasses) that provides the user with rich personalized functions and information. With the development of science and technology, the practicality and entertainment value of head-mounted devices keep increasing, and they are gradually being applied in every field of people's work and life. However, the image displayed by a head-mounted device is often not precisely aligned, and phenomena such as aliasing and information displacement frequently occur. Current automatic calibration techniques for head-mounted devices usually use laser sensors and calibrate the screen automatically according to the position and torsional deformation of the head-mounted device itself.
In the course of realizing the present invention, the inventors found that the prior art has at least the following problems: laser sensors are costly and the related devices are heavy, making miniaturization difficult; moreover, the existing calibration method relies solely on the position and torsional deformation information of the head-mounted device itself, so it is difficult to meet the personalized needs of the user, and the user experience is poor.
Summary of the invention
The present invention aims to solve at least one of the above technical problems.
To this end, a first object of the present invention is to propose a calibration method for a head-mounted device. The method can accurately adjust the position of the corresponding display screen according to the position of the user's eyeballs, so that the viewing experience when the device is worn is more comfortable, the personalized needs of the user are better met, and the user experience is improved.
A second object of the present invention is to propose a head-mounted device.
To achieve these goals, the calibration method for a head-mounted device according to an embodiment of the first aspect of the present invention includes the following steps: detecting the position information of the user's eyeballs; and adjusting the position of at least one display screen according to the position information of the user's eyeballs.
With the calibration method for a head-mounted device according to the embodiment of the present invention, the position information of the user's eyeballs is obtained and the deviation between the user's eyeball position and the display screen position is compared, so that the display screen position is adjusted accurately. The adjustment result adapts to the user's eyeball position and is highly targeted, making the viewing experience more comfortable when the device is worn, better meeting the personalized needs of the user, and improving the user experience.
To achieve these goals, the head-mounted device according to an embodiment of the second aspect of the present invention comprises: at least one display screen; an adjusting module for adjusting the position of the at least one display screen; at least one camera for detecting the position information of the user's eyeballs; and a control module for adjusting, through the adjusting module, the position of the at least one display screen according to the position information of the user's eyeballs.
The head-mounted device according to the embodiment of the present invention can obtain the deviation between the user's eyeball position and the display screen position by acquiring the position information of the user's eyeballs, so that the display screen is adjusted accordingly and accurately. The adjustment result adapts to the user's eyeball position and is highly targeted, making the viewing experience more comfortable when the device is worn, meeting the personalized needs of the user, and improving the user experience.
Additional aspects and advantages of the present invention will be set forth in part in the following description, will in part become apparent from the description, or will be learned through practice of the present invention.
Brief description of the drawings
The above and/or additional aspects and advantages of the present invention will become apparent and easy to understand from the following description of the embodiments taken in conjunction with the accompanying drawings, in which:
Fig. 1 is a flow chart of a calibration method for a head-mounted device according to an embodiment of the present invention;
Fig. 2 is a flow chart of a calibration method for a head-mounted device according to another embodiment of the present invention;
Fig. 3 is a schematic diagram of smart glasses according to an embodiment of the present invention;
Fig. 4 is a schematic diagram of a first three-dimensional coordinate system according to an embodiment of the present invention;
Fig. 5 is a schematic diagram of the relative positions of the first three-dimensional coordinate system and the second three-dimensional coordinate system according to an embodiment of the present invention;
Fig. 6 is a schematic diagram (side view) of position adjustment of smart glasses according to an embodiment of the present invention;
Fig. 7 is a structural block diagram of a head-mounted device according to an embodiment of the present invention;
Fig. 8 is a structural block diagram of a head-mounted device according to another embodiment of the present invention.
Detailed description of the embodiments
Embodiments of the present invention are described in detail below, and examples of the embodiments are shown in the accompanying drawings, where the same or similar reference numerals throughout denote the same or similar elements or elements with the same or similar functions. The embodiments described below with reference to the drawings are exemplary and are only intended to explain the present invention; they shall not be construed as limiting the present invention. On the contrary, the embodiments of the present invention cover all changes, modifications, and equivalents falling within the spirit and scope of the appended claims.
In the description of the present invention, it should be understood that the terms "first", "second", and the like are used only for descriptive purposes and shall not be construed as indicating or implying relative importance. In the description of the present invention, it should also be noted that, unless otherwise explicitly specified and limited, the terms "connected" and "connection" shall be understood broadly; for example, a connection may be a fixed connection, a detachable connection, or an integral connection; it may be a mechanical connection or an electrical connection; and it may be a direct connection or an indirect connection through an intermediary. For a person of ordinary skill in the art, the specific meanings of the above terms in the present invention can be understood according to the specific circumstances. In addition, in the description of the present invention, unless otherwise stated, "multiple" means two or more.
Any process or method described in a flow chart or otherwise described herein may be understood as representing a module, segment, or portion of code that includes one or more executable instructions for implementing specific logical functions or steps of the process, and the scope of the preferred embodiments of the present invention includes additional implementations in which functions may be executed out of the order shown or discussed, including in a substantially concurrent manner or in the reverse order, depending on the functions involved, as should be understood by those skilled in the art to which the embodiments of the present invention belong.
The calibration method for a head-mounted device and the head-mounted device according to embodiments of the present invention are described below with reference to the accompanying drawings.
Current calibration methods for head-mounted devices mainly use laser sensors and calibrate the screen according to the position and torsional deformation of the head-mounted device itself. However, such methods ignore the position of the user's eyeballs, are not well targeted, and their calibration results can hardly meet the personalized needs of the user, resulting in a poor user experience. For this reason, the present invention proposes a calibration method for a head-mounted device, which comprises the following steps: detecting the position information of the user's eyeballs; and adjusting the position of at least one display screen according to the position information of the user's eyeballs.
Fig. 1 is a flow chart of a calibration method for a head-mounted device according to an embodiment of the present invention.
As shown in Fig. 1, the calibration method for a head-mounted device includes:
S101: detecting the position information of the user's eyeballs.
Here, the head-mounted device used by the method includes at least one display screen.
Specifically, in one embodiment of the present invention, the display screen is mounted in a frame made of a certain material (for example, a lightweight material such as carbon fiber), and a touch control interface, or a microphone cooperating with a voice recognition module, and the like, is provided to accept control information such as the user's touch and/or voice, thereby enabling interaction.
It should be appreciated that, when wearing the head-mounted device, the user can choose whether to automatically detect the position information of the eyeballs in order to calibrate the at least one display screen. In one embodiment of the present invention, this function can be turned on by control information (such as voice control, gesture control, touch control, and so on). After the function is turned on, the head-mounted device can detect the position information of the user's eyeballs by means of the relevant components.
S102: adjusting the position of the at least one display screen according to the position information of the user's eyeballs.
Specifically, according to the detected position information of the user's eyeballs, the head-mounted device adjusts the position of the at least one display screen according to the deviation between the position of the at least one display screen and the position of the user's eyeballs. It should be appreciated that position adjustment is a broad concept that covers several factors, such as the position and the angle in each direction.
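Where it helps to see these two steps concretely, the following is a minimal sketch in Python of the detect-then-adjust flow of S101/S102. It is only an illustration under assumptions: the callables passed in are hypothetical stand-ins for the device's own eye-detection, screen-position, and actuation routines, which the patent does not specify.

```python
# Minimal sketch of S101/S102 (illustrative only; the three callables are
# hypothetical stand-ins for the device's own detection and actuation routines).
def calibrate(detect_eye_position, get_screen_position, move_screen):
    eye = detect_eye_position()                   # S101: eyeball position, e.g. (x, y, z)
    screen = get_screen_position()                # current display-screen position
    deviation = [e - s for e, s in zip(eye, screen)]
    move_screen(deviation)                        # S102: adjust the screen by the deviation
```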
With the calibration method for a head-mounted device according to the embodiment of the present invention, the position information of the user's eyeballs is obtained and the deviation between the user's eyeball position and the display screen position is compared, so that the display screen is adjusted accurately. The adjustment result adapts to the user's eyeball position and is highly targeted, making the viewing experience more comfortable when the device is worn, meeting the personalized needs of the user, and improving the user experience.
Fig. 2 is a flow chart of a calibration method for a head-mounted device according to another embodiment of the present invention.
To further improve the user experience, the head-mounted device detects the positions of the eyeballs of the user's two eyes by means of pinhole cameras, establishes a three-dimensional coordinate system for the display screens and another for the positions of the two eyeballs, and calibrates the head-mounted device according to the relative positions of the two three-dimensional coordinate systems. Specifically, as shown in Fig. 2, the calibration method for a head-mounted device includes:
S201: detecting the position information of the user's eyeballs.
Specifically, as shown in Fig. 3, the head-mounted device used by the user is a pair of smart glasses including two display screens, and the two display screens correspond respectively to the user's two eyes. The head-mounted device also includes a first camera and a second camera. The first camera and the second camera are pinhole cameras fixed in the head-mounted device and are used to detect the position information of the eyeballs of the user's two eyes, respectively.
It should be appreciated that, when wearing the head-mounted device, the user can choose whether to automatically detect the position information of the eyeballs in order to calibrate the left and right display screens. In one embodiment of the present invention, this function can be turned on by control information (such as voice control, gesture control, touch control, and so on). After the function is turned on, the head-mounted device can detect the position information of the user's eyeballs by means of the first camera and the second camera.
S202: establishing a first three-dimensional coordinate system according to the positions of the first camera and the second camera and the plane in which the first camera and the second camera are located.
Specifically, as shown in Fig. 4, the line connecting the first camera and the second camera is taken as the x-axis; the y-axis is perpendicular to the x-axis and perpendicular to the ground; and the z-axis is perpendicular to both the x-axis and the y-axis and points toward the user. The three coordinate axes x, y, and z are mutually perpendicular and intersect at a point o, forming the first three-dimensional coordinate system.
S203: establishing a second three-dimensional coordinate system according to the eyeball positions of the user's two eyes and the plane in which the two eyeball positions are located.
Specifically, as shown in Fig. 5, according to the eyeball positions of the user's two eyes found by the two cameras, the line connecting the pupil positions of the two eyeballs is taken as the x'-axis; the y'-axis is perpendicular to the x'-axis and perpendicular to the ground; and the z'-axis is perpendicular to both the x'-axis and the y'-axis and points toward the user. The three coordinate axes x', y', and z' are mutually perpendicular and intersect at the midpoint o' of the line connecting the two pupil positions, forming the second three-dimensional coordinate system.
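As a concrete illustration of the two constructions in S202 and S203, the sketch below builds an orthonormal frame from two reference points (the two cameras for the first coordinate system, or the two pupils for the second) and a vertical "up" direction. It is a sketch under stated assumptions, not the patent's implementation; NumPy and the function name are chosen for convenience, and the sign of the z-axis is fixed only by convention.

```python
# Minimal sketch of the frame construction in S202/S203 (illustrative; not the
# patent's code). The x-axis follows the line through the two reference points,
# the y-axis is the vertical direction made orthogonal to x, and the z-axis is
# perpendicular to both (pointing toward the user, up to a sign convention).
import numpy as np

def build_frame(p_left, p_right, up=(0.0, 1.0, 0.0)):
    """Return (origin, axes); the rows of axes are the x, y, z unit vectors."""
    p_left = np.asarray(p_left, dtype=float)
    p_right = np.asarray(p_right, dtype=float)
    up = np.asarray(up, dtype=float)
    origin = 0.5 * (p_left + p_right)            # o' is the pupil midpoint; o is taken analogously
    x_axis = p_right - p_left                    # line through the two cameras / two pupils
    x_axis /= np.linalg.norm(x_axis)
    y_axis = up - np.dot(up, x_axis) * x_axis    # vertical direction, made orthogonal to x
    y_axis /= np.linalg.norm(y_axis)
    z_axis = np.cross(x_axis, y_axis)            # perpendicular to x and y
    return origin, np.vstack([x_axis, y_axis, z_axis])

# Example: first frame from two camera positions 6 cm apart.
# origin, axes = build_frame((-0.03, 0.0, 0.0), (0.03, 0.0, 0.0))
```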
S204: adjusting the position of the at least one display screen according to the first three-dimensional coordinate system and the second three-dimensional coordinate system.
Specifically, quantitative indicators of the difference between the two calculated three-dimensional coordinate systems in the three directions are obtained, and the center points and projection angles of the two display screens change automatically and correspondingly, so that automatic adjustment of the display screens and the frame in the three directions is achieved. For example, Fig. 6(a) is a schematic diagram of the y'- and z'-axes as seen from the side of the head-mounted device; on the basis of Fig. 5, it shows the viewing direction from the left eyeball looking toward the right. The specific adjustment algorithm is as follows: assume that a point (x, y, z) in the first coordinate system is mapped by a mapping matrix A to a point (x', y', z') in the second coordinate system:
(x, y, z) * A = (x', y', z').
The head-mounted device stores this mapping matrix (defined as mapping matrix A) as the mapping scale for further calculating the difference data. The point (x, y, z) in the first coordinate system and the point (x', y', z') in the second coordinate system are obtained in real time by the two cameras corresponding respectively to the user's two eyes, and the mapping matrix A directly determines the position information of the projected point after calibration. Therefore, the calibration process involved in the present invention is a process of dynamically calculating the mapping matrix A in real time. For example, in Fig. 6(b) there are nine sampled points in the first coordinate system, namely:
(x1, y1, z1) ... (x9, y9, z9),
In the new coordinate system, that is, the second coordinate system, their coordinates become
(x'1, y'1, z'1), ..., (x'9, y'9, z'9).
Then the nine equations are combined into a system of equations:
(x1, y1, z1) * A = (x'1, y'1, z'1);
(x2, y2, z2) * A = (x'2, y'2, z'2);
(x3, y3, z3) * A = (x'3, y'3, z'3);
……
(x9, y9, z9) * A = (x'9, y'9, z'9).
Since the mapping matrix A contains nine unknown parameters, the values of the nine parameters of A can be obtained simply by Gaussian elimination from the system formed by the above nine equations, so that every point displayed in the rectangular display surface can be mapped to the new point system according to the mapping matrix A:
(x, y, z) * A = (x', y', z').
Thus, by solving for the mapping matrix A, dynamic calibration and adjustment of the display area according to the position of the user's eyeballs is achieved.
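As a concrete illustration of this step, the following sketch recovers a 3x3 mapping matrix A from sampled point correspondences using the relation (x, y, z) * A = (x', y', z') given above. It is an assumption-laden sketch, not the patent's code: NumPy's least-squares solver is used in place of hand-written Gaussian elimination, and it gives the same A when the samples determine it exactly.

```python
# Minimal sketch (illustrative) of solving for the mapping matrix A from the
# sampled correspondences (xi, yi, zi) -> (x'i, y'i, z'i) described above.
import numpy as np

def solve_mapping_matrix(points_first, points_second):
    """points_first, points_second: (N, 3) arrays of corresponding points."""
    P = np.asarray(points_first, dtype=float)    # rows are (x, y, z)
    Q = np.asarray(points_second, dtype=float)   # rows are (x', y', z')
    A, *_ = np.linalg.lstsq(P, Q, rcond=None)    # least-squares solution of P @ A = Q
    return A                                     # 3x3 mapping matrix

# Once A is known, every displayed point maps as: mapped = points_first @ A
```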
To increase the stability of the calibration, the calibration is completed when the deviation between the two coordinate systems is smaller than a predetermined threshold; otherwise, readjustment is performed according to the current positions of the screens, the two eyeballs, and the cameras. If after n attempts (n being a preset value) the deviation still cannot be brought below the predetermined threshold, calibration failure information is displayed to the user. It should be appreciated that position adjustment is a broad concept that covers several factors, such as the position and the angle in each direction.
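A minimal sketch of this bounded retry loop follows; the helper callables measure_deviation and readjust are hypothetical stand-ins for the device's own measurement and adjustment routines, which the patent does not name.

```python
# Minimal sketch of the retry logic: adjust, re-measure the deviation between
# the two coordinate systems, and report failure after n unsuccessful attempts.
def calibrate_with_retries(measure_deviation, readjust, threshold, n):
    for _ in range(n):
        if measure_deviation() < threshold:   # deviation of the two coordinate systems
            return True                       # calibration complete
        readjust()                            # readjust screens from current eye/camera positions
    return False                              # caller displays "calibration failed" to the user
```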
With the calibration method for a head-mounted device according to the embodiment of the present invention, the eyeball positions of the user's two eyes are detected, three-dimensional coordinate systems are established for the display screens and for the two eyeball positions respectively, and the head-mounted device is calibrated according to the relative position deviation of the two three-dimensional coordinate systems. The calibration result is thus better targeted, the viewing experience is more comfortable, the personalized requirements of the user are met, and the user experience is improved.
To realize the above embodiments, the present invention further proposes a head-mounted device.
A head-mounted device comprises: at least one display screen; an adjusting module for adjusting the position of the at least one display screen; at least one camera for detecting the position information of the user's eyeballs; and a control module for adjusting, through the adjusting module, the position of the at least one display screen according to the position information of the user's eyeballs.
Fig. 7 is a structural block diagram of a head-mounted device according to an embodiment of the present invention.
As shown in Fig. 7, the head-mounted device includes a display screen 100, an adjusting module 200, a camera 300, and a control module 400, where there is at least one display screen 100 and at least one camera 300.
Specifically, the at least one display screen 100 is used to display the real world that the user observes around himself or herself and the information that the head-mounted device provides to the user. In one embodiment of the present invention, a control system may be embedded in the frame of the head-mounted device, which is made of a certain material (for example, a lightweight material such as carbon fiber); the control system provides the user with the required information according to the specific situation and shows it to the user through the display screen 100.
The adjusting module 200 is used to adjust the position of the at least one display screen 100. More specifically, according to the detected position information of the user's eyeballs, the adjusting module 200 adjusts the position of the at least one display screen 100 according to the deviation between the position of the at least one display screen 100 and the position of the user's eyeballs. It should be appreciated that position adjustment is a broad concept that covers several factors, such as the position and the angle in each direction.
The at least one camera 300 is used to detect the position information of the user's eyeballs. It should be appreciated that, when wearing the head-mounted device, the user can choose whether to automatically detect the position information of the eyeballs in order to calibrate the at least one display screen 100. In one embodiment of the present invention, this function can be turned on by control information (such as voice control, gesture control, touch control, and so on). After the function is turned on, the at least one camera 300 can detect the position information of the user's eyeballs.
The control module 400 is used to adjust, through the adjusting module 200, the position of the at least one display screen 100 according to the position information of the user's eyeballs. More specifically, in one embodiment of the present invention, the control module 400 calculates the deviation between the position of the at least one display screen 100 and the position of the user's eyeballs, and controls the adjusting module 200 to adjust factors such as the position and the angle of the at least one display screen 100.
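The wiring of these modules can be sketched as below; class and method names are illustrative assumptions rather than the patent's API, and the actuation is reduced to a print statement.

```python
# Minimal sketch of the Fig. 7 structure: the control module 400 reads the
# eyeball position from the camera 300, computes its deviation from the screen
# position, and drives the adjusting module 200 (names are illustrative).
class Camera:
    def eye_position(self):
        return (0.0, 0.0, 0.0)                # placeholder for the detected eyeball (x, y, z)

class AdjustingModule:
    def move_screen(self, delta):
        print(f"moving display screen by {delta}")

class ControlModule:
    def __init__(self, camera, adjuster, screen_position=(0.0, 0.0, 0.0)):
        self.camera = camera
        self.adjuster = adjuster
        self.screen_position = screen_position

    def calibrate_once(self):
        eye = self.camera.eye_position()
        # deviation between the eyeball position and the screen position, per axis
        delta = tuple(e - s for e, s in zip(eye, self.screen_position))
        self.adjuster.move_screen(delta)

# Usage: ControlModule(Camera(), AdjustingModule()).calibrate_once()
```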
The head-mounted device according to the embodiment of the present invention can obtain the deviation between the user's eyeball position and the display screen position by acquiring the position information of the user's eyeballs, so that the display screen is adjusted accordingly and accurately. The adjustment result adapts to the user's eyeball position and is highly targeted, meeting the personalized needs of the user and improving the user experience.
Fig. 8 is a structural block diagram of a head-mounted device according to another embodiment of the present invention.
As shown in Fig. 8, the head-mounted device includes a first display screen 110, a second display screen 120, an adjusting module 200, a first camera 310, a second camera 320, and a control module 400, where the first display screen 110 corresponds to the second camera 320, and the second display screen 120 corresponds to the first camera 310.
Specifically, the head-mounted device is a pair of smart glasses; the first display screen 110 and the second display screen 120 correspond respectively to the user's two eyes and are used to display the real world that the user observes around himself or herself and the information that the head-mounted device provides to the user. In one embodiment of the present invention, a control system may be embedded in the frame of the head-mounted device, which is made of a certain material (for example, a lightweight material such as carbon fiber); the control system provides the user with the required information according to the specific situation and shows it to the user through the first display screen 110 and the second display screen 120.
The adjusting module 200 is used to adjust the positions of the display screens. More specifically, according to the detected position information of the user's eyeballs, the adjusting module 200 adjusts the positions of the first display screen 110 and the second display screen 120 according to the deviation between the positions of the first display screen 110 and the second display screen 120 and the position of the user's eyeballs. It should be appreciated that position adjustment is a broad concept that covers several factors, such as the position and the angle in each direction.
The first camera 310 and the second camera 320 are used to detect the position information of the eyeballs of the user's two eyes, respectively. The first camera 310 and the second camera 320 are pinhole cameras fixed in the head-mounted device. It should be appreciated that, when wearing the head-mounted device, the user can choose whether to automatically detect the position information of the eyeballs in order to calibrate the left and right display screens. In one embodiment of the present invention, this function can be turned on by control information (such as voice control, gesture control, touch control, and so on). After the function is turned on, the first camera 310 and the second camera 320 can detect the position information of the eyeballs of the user's two eyes, respectively.
The control module 400 is used to adjust, through the adjusting module 200, the positions of the first display screen 110 and the second display screen 120 according to the position information of the user's eyeballs.
More specifically, the control module 400 establishes a first three-dimensional coordinate system according to the positions of the first camera 310 and the second camera 320 and the plane in which the first camera 310 and the second camera 320 are located, establishes a second three-dimensional coordinate system according to the eyeball positions of the user's two eyes and the plane in which the two eyeball positions are located, and adjusts the positions of the first display screen 110 and the second display screen 120 according to the first three-dimensional coordinate system and the second three-dimensional coordinate system. In one embodiment of the present invention, the control module 400 takes the line connecting the first camera and the second camera as the x-axis; the y-axis is perpendicular to the x-axis and perpendicular to the ground; the z-axis is perpendicular to both the x-axis and the y-axis and points toward the user; and the three coordinate axes x, y, and z are mutually perpendicular and intersect at a point o, forming the first three-dimensional coordinate system. Then, according to the eyeball positions of the user's two eyes found by the two cameras, the line connecting the pupil positions of the two eyeballs is taken as the x'-axis; the y'-axis is perpendicular to the x'-axis and perpendicular to the ground; the z'-axis is perpendicular to both the x'-axis and the y'-axis and points toward the user; and the three coordinate axes x', y', and z' are mutually perpendicular and intersect at the midpoint o' of the line connecting the two pupil positions, forming the second three-dimensional coordinate system. The control module 400 then calculates quantitative indicators of the difference between the two three-dimensional coordinate systems in the three directions and, through the adjusting module 200, controls factors such as the center points and projection angles of the two display screens to change correspondingly, so that automatic adjustment of the display screens and the frame in the three directions is achieved. For example, Fig. 6(a) is a schematic diagram of the y'- and z'-axes as seen from the side of the head-mounted device; on the basis of Fig. 5, it shows the viewing direction from the left eyeball looking toward the right. The specific adjustment algorithm is as follows: assume that a point (x, y, z) in the first coordinate system is mapped by a mapping matrix A to a point (x', y', z') in the second coordinate system:
(x, y, z) * A = (x', y', z').
The head-mounted device stores this mapping matrix (defined as mapping matrix A) as the mapping scale for further calculating the difference data. The point (x, y, z) in the first coordinate system and the point (x', y', z') in the second coordinate system are obtained in real time by the two cameras corresponding respectively to the user's two eyes, and the mapping matrix A directly determines the position information of the projected point after calibration. Therefore, the calibration process involved in the present invention is a process of dynamically calculating the mapping matrix A in real time. For example, in Fig. 6(b) there are nine sampled points in the first coordinate system, namely:
(x1, y1, z1) ... (x9, y9, z9),
In the new coordinate system, that is, the second coordinate system, their coordinates become
(x'1, y'1, z'1), ..., (x'9, y'9, z'9).
Then the nine equations are combined into a system of equations:
(x1, y1, z1) * A = (x'1, y'1, z'1);
(x2, y2, z2) * A = (x'2, y'2, z'2);
(x3, y3, z3) * A = (x'3, y'3, z'3);
……
(x9, y9, z9) * A = (x'9, y'9, z'9).
Since the mapping matrix A contains nine unknown parameters, the values of the nine parameters of A can be obtained simply by Gaussian elimination from the system formed by the above nine equations, so that every point displayed in the rectangular display surface can be mapped to the new point system according to the mapping matrix A:
(x, y, z) * A = (x', y', z').
Thus, by solving for the mapping matrix A, dynamic calibration and adjustment of the display area according to the position of the user's eyeballs is achieved.
To increase the stability of the calibration, the calibration is completed when the deviation between the two coordinate systems is smaller than a predetermined threshold; otherwise, readjustment is performed according to the current positions of the first display screen 110 and the second display screen 120, the two eyeballs, and the cameras. If after n attempts (n being a preset value) the deviation still cannot be brought below the predetermined threshold, calibration failure information is displayed to the user.
The head-mounted device according to the embodiment of the present invention detects the eyeball positions of the user's two eyes by using pinhole cameras, establishes three-dimensional coordinate systems for the display screens and for the two eyeball positions respectively, and calculates the relative position deviation of the two three-dimensional coordinate systems by the control module, thereby calibrating the display screens. The calibration result is better targeted and more intelligent, viewing is more comfortable for the user, the personalized needs of the user are better met, and the user experience is improved.
In the description of this specification, reference to the terms "an embodiment", "some embodiments", "an example", "a specific example", "some examples", and the like means that a specific feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the present invention. In this specification, schematic expressions of the above terms do not necessarily refer to the same embodiment or example. Furthermore, the specific features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
Although embodiments of the present invention have been shown and described, it will be understood by those skilled in the art that various changes, modifications, substitutions, and alterations can be made to these embodiments without departing from the principles and purposes of the present invention, the scope of which is defined by the claims and their equivalents.
Claims (4)
1. A calibration method for a head-mounted device, characterized by comprising the following steps:
detecting position information of a user's eyeballs, wherein the head-mounted device is a pair of smart glasses, the head-mounted device includes two display screens corresponding respectively to the user's two eyes, the head-mounted device includes a first camera and a second camera, and the first camera and the second camera are fixed in the head-mounted device and are used to detect the position information of the eyeballs of the user's two eyes, respectively;
establishing a first three-dimensional coordinate system according to positions of the first camera and the second camera and a plane in which the first camera and the second camera are located;
establishing a second three-dimensional coordinate system according to eyeball positions of the user's two eyes and a plane in which the eyeball positions of the two eyes are located; and
adjusting a position of at least one display screen according to the first three-dimensional coordinate system and the second three-dimensional coordinate system, wherein, when a deviation between the first three-dimensional coordinate system and the second three-dimensional coordinate system is smaller than a predetermined threshold, the calibration is completed; otherwise, readjustment is performed according to current positions of the screens, the two eyeballs, and the cameras; and if after n adjustments the deviation between the first three-dimensional coordinate system and the second three-dimensional coordinate system still cannot be brought below the predetermined threshold, calibration failure information is displayed to the user, where n is a preset value.
2. The method of claim 1, characterized in that the first camera and the second camera are pinhole cameras.
3. A head-mounted device, characterized in that the head-mounted device is a pair of smart glasses and comprises:
two display screens corresponding respectively to a user's two eyes;
at least one camera, including a first camera and a second camera, the first camera and the second camera being fixed in the head-mounted device and being used to detect position information of eyeballs of the user's two eyes, respectively; and
a control module for establishing a first three-dimensional coordinate system according to positions of the first camera and the second camera and a plane in which the first camera and the second camera are located, establishing a second three-dimensional coordinate system according to eyeball positions of the user's two eyes and a plane in which the eyeball positions of the two eyes are located, and adjusting a position of at least one display screen according to the first three-dimensional coordinate system and the second three-dimensional coordinate system, wherein, when a deviation between the first three-dimensional coordinate system and the second three-dimensional coordinate system is smaller than a predetermined threshold, the calibration is completed; otherwise, readjustment is performed according to current positions of the screens, the two eyeballs, and the cameras; and if after n adjustments the deviation between the first three-dimensional coordinate system and the second three-dimensional coordinate system still cannot be brought below the predetermined threshold, calibration failure information is displayed to the user, where n is a preset value.
4. The head-mounted device of claim 3, characterized in that the first camera and the second camera are pinhole cameras.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201310412265.8A CN103439794B (en) | 2013-09-11 | 2013-09-11 | Calibration method for head-mounted device and head-mounted device |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201310412265.8A CN103439794B (en) | 2013-09-11 | 2013-09-11 | Calibration method for head-mounted device and head-mounted device |
Publications (2)
Publication Number | Publication Date |
---|---|
CN103439794A CN103439794A (en) | 2013-12-11 |
CN103439794B true CN103439794B (en) | 2017-01-25 |
Family
ID=49693497
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201310412265.8A Active CN103439794B (en) | 2013-09-11 | 2013-09-11 | Calibration method for head-mounted device and head-mounted device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN103439794B (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2018214751A1 (en) * | 2017-05-25 | 2018-11-29 | 京东方科技集团股份有限公司 | Eye-protection display device and method |
Families Citing this family (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103901622B (en) * | 2014-04-23 | 2016-05-25 | 成都理想境界科技有限公司 | 3D wears viewing equipment and corresponding video player |
CN105204604B (en) * | 2014-05-30 | 2019-03-01 | 华为技术有限公司 | A kind of eyeball interactive control equipment |
CN106199963B (en) * | 2014-09-01 | 2019-09-27 | 精工爱普生株式会社 | Display device and its control method and computer program |
CN104765163B (en) * | 2015-04-27 | 2017-07-21 | 小米科技有限责任公司 | Display methods, device and the intelligent glasses of framing information |
CN105975083B (en) | 2016-05-27 | 2019-01-18 | 北京小鸟看看科技有限公司 | A kind of vision correction methods under reality environment |
CN106123916B (en) * | 2016-06-13 | 2019-11-15 | 上海临奇智能科技有限公司 | It is a kind of for calibrating the method and apparatus of Inertial Measurement Unit in VR equipment |
CN107561701B (en) * | 2016-07-01 | 2020-01-31 | 成都理想境界科技有限公司 | Near-to-eye display system, virtual reality equipment and augmented reality equipment |
TWI619967B (en) * | 2016-09-26 | 2018-04-01 | 緯創資通股份有限公司 | Adjustable virtual reality device capable of adjusting display modules |
WO2018076202A1 (en) * | 2016-10-26 | 2018-05-03 | 中国科学院深圳先进技术研究院 | Head-mounted display device that can perform eye tracking, and eye tracking method |
US20190333468A1 (en) * | 2016-12-22 | 2019-10-31 | Shenzhen Royole Technologies Co. Ltd. | Head mounted display device and visual aiding method |
CN108604015B (en) * | 2016-12-26 | 2020-07-14 | 华为技术有限公司 | Image display method and head-mounted display device |
WO2018119854A1 (en) * | 2016-12-29 | 2018-07-05 | 深圳市柔宇科技有限公司 | Display module and near-eye display device |
CN106708270B (en) * | 2016-12-29 | 2020-09-11 | 宇龙计算机通信科技(深圳)有限公司 | Virtual reality equipment display method and device and virtual reality equipment |
WO2018165906A1 (en) * | 2017-03-15 | 2018-09-20 | 廖建强 | Head-mounted display apparatus and display method therefor |
TWI653469B (en) | 2017-12-19 | 2019-03-11 | 宏碁股份有限公司 | Method of adjusting panel and head mounted display |
CN108153417B (en) * | 2017-12-25 | 2022-03-04 | 北京凌宇智控科技有限公司 | Picture compensation method and head-mounted display device adopting same |
CN109709679A (en) * | 2019-03-14 | 2019-05-03 | 大连交通大学 | A kind of virtual reality enhanced smart glasses device |
CN110308560A (en) * | 2019-07-03 | 2019-10-08 | 南京玛克威信息科技有限公司 | The control method of VR equipment |
CN111265392B (en) * | 2020-02-27 | 2022-05-03 | 深圳市视界智造科技有限公司 | Amblyopia treatment system |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN201732214U (en) * | 2010-06-07 | 2011-02-02 | 贾怀昌 | Eyeglasses display capable of freely adjusting position and visual angles |
CN102053367A (en) * | 2010-09-28 | 2011-05-11 | 中航华东光电有限公司 | Image calibration method for binocular helmet display |
CN102928979A (en) * | 2011-08-30 | 2013-02-13 | 微软公司 | Adjustment of a mixed reality display for inter-pupillary distance alignment |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5290092B2 (en) * | 2009-08-31 | 2013-09-18 | オリンパス株式会社 | Eyeglass-type image display device |
JP2013050558A (en) * | 2011-08-30 | 2013-03-14 | Sony Corp | Head-mounted display, and display control method |
- 2013-09-11 CN CN201310412265.8A patent/CN103439794B/en active Active
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN201732214U (en) * | 2010-06-07 | 2011-02-02 | 贾怀昌 | Eyeglasses display capable of freely adjusting position and visual angles |
CN102053367A (en) * | 2010-09-28 | 2011-05-11 | 中航华东光电有限公司 | Image calibration method for binocular helmet display |
CN102928979A (en) * | 2011-08-30 | 2013-02-13 | 微软公司 | Adjustment of a mixed reality display for inter-pupillary distance alignment |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2018214751A1 (en) * | 2017-05-25 | 2018-11-29 | 京东方科技集团股份有限公司 | Eye-protection display device and method |
Also Published As
Publication number | Publication date |
---|---|
CN103439794A (en) | 2013-12-11 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN103439794B (en) | Calibration method for head-mounted device and head-mounted device | |
KR101260287B1 (en) | Method for simulating spectacle lens image using augmented reality | |
KR101902957B1 (en) | Compressible eyecup assembly in a virtual reality headset | |
US9395543B2 (en) | Wearable behavior-based vision system | |
WO2017071458A1 (en) | Diopter self-adaptive head-mounted display device | |
US10627620B2 (en) | Head-mounted display device, method of controlling head-mounted display device, and computer program | |
JP5078616B2 (en) | Method for determining a pair of progressive ophthalmic lenses | |
CN110636414B (en) | Audio system for dynamic determination of personalized acoustic transfer functions | |
WO2016115873A1 (en) | Binocular ar head-mounted display device and information display method therefor | |
US10032313B2 (en) | Head-mounted device and method of enabling non-stationary user to perform 3D drawing interaction in mixed-reality space | |
US9256069B2 (en) | Image processing apparatus image processing method and program using electrodes contacting a face to detect eye gaze direction | |
JP6364936B2 (en) | Image display device | |
US10321820B1 (en) | Measuring optical properties of an eyewear device | |
US20170092004A1 (en) | Head-mounted display device, control method for head-mounted display device, and computer program | |
WO2018008232A1 (en) | Information processing device, information processing method, and program | |
US20160170482A1 (en) | Display apparatus, and control method for display apparatus | |
US20140225994A1 (en) | Three-dimensional image adjusting device and method thereof | |
CN110770636A (en) | Wearable image processing and control system with functions of correcting visual defects, enhancing vision and sensing ability | |
JP4811683B2 (en) | Visual field detector | |
US10185159B2 (en) | Progressive power lens and method of designing and manufacturing the same | |
US20160063909A1 (en) | Method For Adjusting Display Effect And Electronic Apparatus | |
CN109791294A (en) | Method and apparatus for running the display system with data glasses | |
US20220075187A1 (en) | Method for generating and displaying a virtual object by an optical system | |
EP3047883B1 (en) | Compressible eyecup assemblies in a virtual reality headset | |
US11048100B2 (en) | System and method for obtaining ophthalmic measurements for progressive lenses that accurately account for body stature and posture |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
C14 | Grant of patent or utility model | ||
GR01 | Patent grant |