CN113641009A - Passive type head real-time tracking stereo glasses and positioning method - Google Patents
- Publication number
- CN113641009A (application CN202111192504.4A)
- Authority
- CN
- China
- Prior art keywords
- block
- frame body
- head
- real
- glasses
- Prior art date
- Legal status: Granted
Classifications
-
- G—PHYSICS
- G02—OPTICS
- G02C—SPECTACLES; SUNGLASSES OR GOGGLES INSOFAR AS THEY HAVE THE SAME FEATURES AS SPECTACLES; CONTACT LENSES
- G02C11/00—Non-optical adjuncts; Attachment thereof
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/332—Displays for viewing with the aid of special glasses or head-mounted displays [HMD]
- H04N13/344—Displays for viewing with the aid of special glasses or head-mounted displays [HMD] with head-mounted left-right displays
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/366—Image reproducers using viewer tracking
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Physics & Mathematics (AREA)
- Health & Medical Sciences (AREA)
- General Physics & Mathematics (AREA)
- Ophthalmology & Optometry (AREA)
- Optics & Photonics (AREA)
- Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
Abstract
The invention discloses passive head real-time tracking stereo glasses and a positioning method. The stereo glasses comprise: a frame body provided with a sliding groove; and a movable marking assembly, one end of which extends into the sliding groove and abuts against the inner wall of the frame body, the movable marking assembly being movable along the sliding groove. The movable marking assembly comprises a limiting block arranged in the sliding groove, whose upper and lower end faces abut against the inner wall of the frame body; a connecting column fixedly connected with the limiting block; a placing block sleeved on the end of the connecting column away from the limiting block and in threaded connection with the connecting column, so that rotating the placing block about the central axis of the connecting column adjusts the distance between the placing block and the limiting block; and a marking block arranged in the middle of the placing block. The invention solves the problem that marker points are difficult to capture when the glasses are too close to the detection sensor, or cannot be fully captured because they are blocked by an obstruction.
Description
Technical Field
The invention belongs to the technical field of positioning, and particularly relates to a pair of passive head real-time tracking stereoscopic glasses and a positioning method.
Background
AR (augmented reality) technology is a modern information technology developed on the basis of VR (virtual reality) technology. By means of photoelectric display, various sensing technologies, computer graphics and multimedia, it integrates a computer-generated virtual environment with the real environment around the user, so that the user perceives the virtual environment as part of the real surroundings.
The AR technology has the characteristics of virtual-real combination, real-time interaction and three-dimensional registration, has the advantages of strong sense of reality, small modeling workload and the like compared with the VR technology, and can be widely applied to the fields of engineering design, medical treatment, military, education, entertainment, tourism and the like.
In the prior art, the angular rotation of a person's head is usually determined by tracking glasses. During tracking, however, the wearer's habits can bring the glasses too close to the detection sensor, making the marker points difficult to capture, or an obstruction can block the marker points so that they cannot be captured at all, which reduces the practicability of this head-tracking approach.
Disclosure of Invention
In view of the above, the main objective of the present invention is to solve the problem that marker points are difficult to capture when the glasses are too close to the detection sensor, or cannot be captured completely when they are blocked by an obstruction.
The invention provides a pair of passive head real-time tracking stereoscopic glasses, comprising: a frame body provided with a sliding groove; and a movable marking assembly arranged on the frame body, one end of which extends into the sliding groove and abuts against the inner wall of the frame body, the movable marking assembly being movable along the sliding groove. The movable marking assembly comprises a limiting block, a connecting column, a placing block and a marking block. The limiting block is arranged in the sliding groove, with its upper and lower end faces abutting against the inner wall of the frame body. The connecting column is fixedly connected with the limiting block; the end of the connecting column that extends into the placing block extends along the radial direction of the placing block to form a stop block, and the end face of the stop block close to the connecting column abuts against the inner wall of the placing block. The placing block is sleeved on the end of the connecting column away from the limiting block and is in threaded connection with the connecting column; rotating the placing block about the central axis of the connecting column adjusts the distance between the placing block and the limiting block. The marking block is arranged in the middle of the placing block.
In some embodiments of the present invention, a temple is hinged to each end of the frame body, and a human head tracking module is arranged inside the temple. The human head tracking module comprises a motion data acquisition unit for collecting head motion data of the stereoscopic glasses wearer, an angle conversion unit for converting the head motion data into angle information, and a transmitting unit for transmitting the angle information; the output end of the motion data acquisition unit is connected with the input end of the angle conversion unit, and the output end of the angle conversion unit is connected with the input end of the transmitting unit.
In some embodiments of the invention, the motion data acquisition unit is a three-axis gyroscope sensor.
In some embodiments of the invention, a buffer layer is arranged on one end face of the placing block close to the connecting column.
In some embodiments of the present invention, the buffer layer is made of an elastic material.
In some embodiments of the invention, the end of the marking block away from the placing block protrudes beyond the placing block.
In some embodiments of the present invention, the frame body has a lens disposed thereon.
In some embodiments of the present invention, the frame body is provided with three marking blocks that can move along the sliding groove and two marking blocks fixed at the two ends of the frame body.
The invention also provides a positioning method for passive head real-time tracking, which comprises the following steps:
Step 1: the wearer aligns the head, and real-time position information of at least three marking blocks on the frame body is acquired by at least two detection sensors, so that coordinate information of the marker points is obtained;
Step 2: the coordinate information of each marker point acquired by the at least two detection sensors is compared based on a coordinate conversion relation, and whether the coordinate information of the marker points is correct is judged (see the sketch after these steps);
Step 3: if the coordinate information of the marker points is correct, a 3D stereoscopic picture is generated;
Step 4: the head of the wearer is spatially positioned in real time based on real-time acquisition of the angle information of the wearer's head.
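As an illustration of Step 2, the consistency check can be implemented by expressing both sensors' measurements in a common frame and comparing them point by point. The Python sketch below assumes the rigid transform between the two sensor frames is known from a prior calibration; the function name, units and tolerance are illustrative and not part of the disclosed method.

```python
import numpy as np

def markers_consistent(points_a, points_b, R_ab, t_ab, tol=5.0):
    """Check whether marker coordinates reported by two detection sensors agree.

    points_a, points_b : (N, 3) arrays of the same N markers in sensor A's and
                         sensor B's coordinate frames (e.g. millimetres).
    R_ab, t_ab         : rotation (3x3) and translation (3,) mapping sensor B's
                         frame into sensor A's frame, obtained from calibration.
    tol                : maximum allowed disagreement per marker, same units.
    """
    points_b_in_a = points_b @ R_ab.T + t_ab        # express B's measurements in A's frame
    errors = np.linalg.norm(points_a - points_b_in_a, axis=1)
    return bool(np.all(errors < tol)), errors

# Hypothetical usage: only generate the 3D picture when the check passes.
# ok, errs = markers_consistent(pts_sensor_1, pts_sensor_2, R_calib, t_calib)
# if ok:
#     render_3d_frame()   # placeholder for the 3D picture generation step
```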
With the stereoscopic glasses provided by the present application, the marking block can be moved along the sliding groove, and rotating the placing block presses it tightly against the frame body, which reduces the possibility of the marking block shifting. The position of the marking block on the frame body can therefore be adjusted, which solves the problem that marker points are difficult to capture when the glasses are too close to the detection sensor, or cannot be fully captured because of an obstruction, and thus improves the practicability of the stereoscopic glasses.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present invention, and those skilled in the art can obtain other drawings from them without creative effort.
Fig. 1 is an overall structural diagram of a pair of passive head real-time tracking stereoscopic glasses according to an embodiment of the present invention;
fig. 2 is an overall structural diagram of the movable marking assembly of a pair of passive head real-time tracking stereoscopic glasses according to an embodiment of the present invention;
fig. 3 is a cross-sectional view of the movable marking assembly of a pair of passive head real-time tracking stereoscopic glasses according to an embodiment of the present invention;
fig. 4 is a flowchart of a positioning method for passive head real-time tracking according to an embodiment of the present invention.
Wherein the figures include the following reference numerals:
10. frame body; 11. sliding groove; 12. temple; 13. human head tracking module; 14. lens; 20. movable marking assembly; 21. limiting block; 22. connecting column; 2201. stop block; 23. placing block; 2301. buffer layer; 24. marking block.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, but not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Referring to figs. 1-3, the passive head real-time tracking stereoscopic glasses include: a frame body 10 provided with a sliding groove 11; and a movable marking assembly 20 arranged on the frame body 10, one end of which extends into the sliding groove 11 and abuts against the inner wall of the frame body 10, the movable marking assembly 20 being movable along the sliding groove 11. The movable marking assembly 20 comprises a limiting block 21, a connecting column 22, a placing block 23 and a marking block 24. The limiting block 21 is arranged in the sliding groove 11, with its upper and lower end faces abutting against the inner wall of the frame body 10. The connecting column 22 is fixedly connected with the limiting block 21; the end of the connecting column 22 that extends into the placing block 23 extends along the radial direction of the placing block 23 to form a stop block 2201, and the end face of the stop block 2201 close to the connecting column 22 abuts against the inner wall of the placing block 23. The placing block 23 is sleeved on the end of the connecting column 22 away from the limiting block 21 and is in threaded connection with the connecting column 22; rotating the placing block 23 about the central axis of the connecting column 22 adjusts the distance between the placing block 23 and the limiting block 21. The marking block 24 is arranged in the middle of the placing block 23.
In this embodiment, the marking block 24 may include a light source that is received by the detection sensors. After the wearer puts on the stereoscopic glasses and aligns the head, real-time position information of the marking blocks 24 on the frame body 10 is collected by at least two detection sensors arranged at the detection end, so that coordinate information of the marker points is obtained. The coordinate information of each marker point obtained by the at least two detection sensors is then compared based on a coordinate conversion relation to judge whether it is correct; if so, a 3D stereoscopic picture is generated. When the position of a marking block 24 needs to be adjusted, the marking block 24 is moved along the sliding groove 11 and the placing block 23 is rotated so that the placing block 23 and the limiting block 21 squeeze the frame body 10 between them and the placing block 23 fits tightly against the frame body 10. This reduces the possibility of the marking block 24 shifting and allows its position on the frame body 10 to be adjusted, which solves the problem that marker points are difficult to capture when the stereoscopic glasses are too close to the detection sensor, or cannot be fully captured because of an obstruction, and thus improves the practicability of the stereoscopic glasses.
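The embodiment does not specify how each detection sensor locates the light-emitting marking blocks 24 in its image before the coordinate computation. As one illustration, assuming a grayscale sensor image with bright markers, the image can be thresholded and each bright blob reduced to its centroid to obtain 2D marker positions; the threshold value and function names below are assumptions for this sketch.

```python
import numpy as np
from scipy import ndimage

def marker_centroids(image, threshold=200):
    """Extract 2D positions of bright marker blocks from one sensor image.

    image     : 2D grayscale array captured by a detection sensor.
    threshold : intensity above which a pixel is treated as part of a marker.
    Returns an (M, 2) array of (row, col) centroids, one per detected blob.
    """
    mask = image > threshold                        # bright spots from the light sources
    labels, n = ndimage.label(mask)                 # group adjacent bright pixels into blobs
    if n == 0:
        return np.empty((0, 2))
    return np.array(ndimage.center_of_mass(mask, labels, list(range(1, n + 1))))
```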
In some optional embodiments, a temple 12 is hinged to each end of the frame body 10, and a human head tracking module 13 is disposed inside the temple 12. The human head tracking module 13 includes a motion data acquisition unit for collecting head motion data of the stereoscopic glasses wearer, an angle conversion unit for converting the head motion data into angle information, and a transmitting unit for transmitting the angle information; the output end of the motion data acquisition unit is connected with the input end of the angle conversion unit, and the output end of the angle conversion unit is connected with the input end of the transmitting unit.
With this design, the motion data acquisition unit collects the wearer's head motion information, the angle conversion unit converts it into angle information, and the transmitting unit sends the angle information to the positioning end, so that the angle information of the wearer's head is obtained in real time and the wearer's head can be spatially positioned in real time.
Specifically, the motion data acquisition unit is a three-axis gyroscope sensor.
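As a rough illustration of what the angle conversion unit might do with three-axis gyroscope readings, the sketch below integrates angular rates into yaw, pitch and roll angles. This is a minimal example under the assumption of rates in degrees per second; a real module would also compensate gyroscope drift (for instance by fusing accelerometer data), and the class and variable names are illustrative.

```python
class AngleConverter:
    """Integrates three-axis gyroscope rates (deg/s) into head angles (deg).

    Illustrative only: drift correction (e.g. a complementary or Kalman
    filter) is omitted for brevity.
    """
    def __init__(self):
        self.yaw = self.pitch = self.roll = 0.0

    def update(self, gx, gy, gz, dt):
        # gx, gy, gz: angular rates about the sensor's x/y/z axes; dt: sample period in seconds
        self.roll  += gx * dt
        self.pitch += gy * dt
        self.yaw   += gz * dt
        return self.yaw, self.pitch, self.roll

# Hypothetical usage inside the head tracking module's sampling loop:
# converter = AngleConverter()
# yaw, pitch, roll = converter.update(gx, gy, gz, dt=0.01)  # then hand to the transmitting unit
```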
Referring to fig. 3, the end of the connecting column 22 that extends into the placing block 23 extends along the radial direction of the placing block 23 to form a stop block 2201, and the end face of the stop block 2201 close to the connecting column 22 abuts against the inner wall of the placing block 23. This prevents the placing block 23 from separating from the connecting column 22 while the placing block 23 is being rotated.
With further reference to fig. 3, a buffer layer 2301 is disposed on the end face of the placing block 23 close to the connecting column 22. This avoids the placing block 23 wearing against the frame body 10 while pressing it, which could otherwise happen if the placing block 23 contacted the frame body 10 directly.
Specifically, the buffer layer 2301 is made of an elastic material.
In some alternative embodiments, the end of the marking block 24 away from the placing block 23 protrudes beyond the placing block 23. This reduces the possibility that the detection sensor cannot detect the marking block 24 because it is blocked by the placing block 23. For example, when the sensor, the placing block 23 and the marking block 24 are on the same horizontal line, the sensor can still detect the marking block 24 because the marking block 24 is raised relative to the placing block 23.
In some alternative embodiments, the frame body 10 is provided with a lens 14.
In a specific embodiment, the frame body 10 is provided with three marking blocks 24 that can move along the sliding groove 11 and two marking blocks 24 fixed at the two ends of the frame body 10. The two detection sensors thus detect the five marking blocks 24 on the frame body 10; when three marking blocks 24 are detected simultaneously by the detection sensors from different viewing angles, their three-dimensional coordinates are compared to judge whether they are correct, and if so a 3D stereoscopic picture is generated for real-time head tracking.
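For reference, once a marking block 24 has been located in the images of two calibrated detection sensors, its three-dimensional coordinate can be recovered by standard linear (DLT) triangulation, as sketched below. The projection matrices are assumed to come from a prior sensor calibration; the patent does not prescribe this particular method.

```python
import numpy as np

def triangulate(P1, P2, uv1, uv2):
    """Linear (DLT) triangulation of one marker seen by two calibrated sensors.

    P1, P2   : 3x4 projection matrices of detection sensors 1 and 2.
    uv1, uv2 : (u, v) image coordinates of the same marking block in each sensor.
    Returns the marker's 3D coordinates in the common world frame.
    """
    u1, v1 = uv1
    u2, v2 = uv2
    A = np.stack([
        u1 * P1[2] - P1[0],
        v1 * P1[2] - P1[1],
        u2 * P2[2] - P2[0],
        v2 * P2[2] - P2[1],
    ])
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]          # de-homogenise
```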
Please refer to fig. 4, which shows a flowchart of a positioning method for passive head real-time tracking according to the present application.
As shown in fig. 4, the positioning method of passive head real-time tracking includes the following steps:
Step 1: the wearer aligns the head, and real-time position information of at least three marking blocks on the frame body is acquired by at least two detection sensors, so that coordinate information of the marker points is obtained;
Step 2: the coordinate information of each marker point acquired by the at least two detection sensors is compared based on the coordinate conversion relation, and whether the coordinate information of the marker points is correct is judged;
Step 3: if the coordinate information of the marker points is correct, a 3D stereoscopic picture is generated;
Step 4: the head of the wearer is spatially positioned in real time based on real-time acquisition of the angle information of the wearer's head.
In this embodiment, in step 1 the wearer corrects the head position, and real-time position information of at least three marking blocks on the frame body is collected by at least two detection sensors to obtain the coordinate information of the marker points. In step 2, the positioning and tracking device compares the coordinate information of each marker point acquired by the at least two detection sensors based on the coordinate conversion relation and judges whether it is correct. In step 3, if the coordinate information of the marker points is correct, the positioning and tracking device generates a 3D stereoscopic picture. Finally, in step 4, the positioning and tracking device spatially positions the wearer's head in real time based on the angle information of the wearer's head acquired in real time.
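To make the spatial-positioning idea concrete, the sketch below shows one standard way to recover the head's orientation and position once the 3D coordinates of at least three marking blocks are known: a rigid (Kabsch) fit of the measured marker positions to their known layout on the frame body. The reference layout and function names are assumptions for illustration, not taken from the patent.

```python
import numpy as np

def head_pose(measured, reference):
    """Rigid (Kabsch) fit of measured marker positions to their known layout.

    measured  : (N >= 3, 3) marker coordinates triangulated by the sensors.
    reference : (N, 3) the same markers' coordinates in the frame body's own
                coordinate system (known from the glasses' geometry).
    Returns (R, t) such that measured ~= reference @ R.T + t, i.e. the head's
    orientation and position in the sensors' world frame.
    """
    mc, rc = measured.mean(axis=0), reference.mean(axis=0)
    H = (reference - rc).T @ (measured - mc)        # cross-covariance of centred point sets
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))          # guard against a reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = mc - R @ rc
    return R, t
```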
Finally, it should be noted that: the above examples are only intended to illustrate the technical solution of the present invention, but not to limit it; although the present invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions of the embodiments of the present invention.
Claims (9)
1. Passive head real-time tracking stereoscopic glasses, characterized in that they comprise:
a frame body (10), wherein a sliding groove (11) is formed in the frame body (10); and
a movable marking assembly (20) arranged on the frame body (10), wherein one end of the movable marking assembly (20) extends into the sliding groove (11) and abuts against the inner wall of the frame body (10), and the movable marking assembly (20) can move along the sliding groove (11);
wherein the movable marking assembly (20) comprises a limiting block (21), a connecting column (22), a placing block (23) and a marking block (24);
the limiting block (21) is arranged in the sliding groove (11), and the upper end face and the lower end face of the limiting block (21) abut against the inner wall of the frame body (10);
the connecting column (22) is fixedly connected with the limiting block (21), the end of the connecting column (22) that extends into the placing block (23) extends along the radial direction of the placing block (23) to form a stop block (2201), and the end face of the stop block (2201) close to the connecting column (22) abuts against the inner wall of the placing block (23);
the placing block (23) is sleeved on the end of the connecting column (22) away from the limiting block (21), the placing block (23) is in threaded connection with the connecting column (22), and the placing block (23) is rotated about the central axis of the connecting column (22) to adjust the distance between the placing block (23) and the limiting block (21); and
the marking block (24) is arranged in the middle of the placing block (23).
2. The passive head real-time tracking stereoscopic glasses according to claim 1, wherein a temple (12) is hinged to each end of the frame body (10), a human head tracking module (13) is disposed inside the temple (12), and the human head tracking module (13) comprises a motion data acquisition unit for acquiring head motion data of the stereoscopic glasses wearer, an angle conversion unit for converting the head motion data into angle information, and a transmitting unit for transmitting the angle information;
The output end of the motion data acquisition unit is connected with the input end of the angle conversion unit, and the output end of the angle conversion unit is connected with the input end of the transmitting unit.
3. The passive head real-time tracking stereoscopic glasses according to claim 2, wherein the motion data acquisition unit is a three-axis gyroscope sensor.
4. The passive head real-time tracking stereoscopic glasses according to claim 1, wherein a buffer layer (2301) is disposed on the end face of the placing block (23) close to the connecting column (22).
5. The passive head real-time tracking stereoscopic glasses according to claim 4, wherein the buffer layer (2301) is made of an elastic material.
6. The passive head real-time tracking stereoscopic glasses according to claim 1, wherein the end of the marking block (24) away from the placing block (23) protrudes beyond the placing block (23).
7. The passive head real-time tracking stereoscopic glasses according to claim 1, wherein the frame body (10) is provided with a lens (14).
8. The passive head real-time tracking stereoscopic glasses according to claim 1, wherein the frame body (10) is provided with three marking blocks (24) capable of moving along the sliding groove (11) and two marking blocks (24) fixed at the two ends of the frame body (10).
9. A positioning method for passive head real-time tracking using the stereoscopic glasses according to claim 1, characterized in that it comprises the following steps:
Step 1: the wearer aligns the head, and real-time position information of at least three marking blocks on the frame body is collected to obtain coordinate information of the marker points;
Step 2: the respectively obtained coordinate information of each marker point is compared based on a coordinate conversion relation, and whether the coordinate information of the marker points is correct is judged;
Step 3: if the coordinate information of the marker points is correct, a 3D stereoscopic picture is generated;
Step 4: the head of the wearer is spatially positioned in real time based on real-time acquisition of the angle information of the wearer's head.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111192504.4A CN113641009B (en) | 2021-10-13 | 2021-10-13 | Passive type head real-time tracking stereo glasses and positioning method |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111192504.4A CN113641009B (en) | 2021-10-13 | 2021-10-13 | Passive type head real-time tracking stereo glasses and positioning method |
Publications (2)
Publication Number | Publication Date |
---|---|
CN113641009A (en) | 2021-11-12
CN113641009B (en) | 2022-01-28
Family
ID=78426711
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202111192504.4A Active CN113641009B (en) | 2021-10-13 | 2021-10-13 | Passive type head real-time tracking stereo glasses and positioning method |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113641009B (en) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20090132456A (en) * | 2008-06-19 | 2009-12-30 | 엄순재 | Sleepiness break glasses |
CN107077013A (en) * | 2014-10-21 | 2017-08-18 | 飞利浦灯具控股公司 | The system configured without hand, method and computer program product for luminous distribution |
CN207164367U (en) * | 2017-08-21 | 2018-03-30 | 刘洋 | AR glasses and its tracing system |
US10264964B1 (en) * | 2017-06-28 | 2019-04-23 | Bertec Corporation | Eye movement measurement device |
CN212490199U (en) * | 2020-04-03 | 2021-02-09 | 中国人民解放军陆军特色医学中心 | Cleaning and maintaining device for operating forceps |
Also Published As
Publication number | Publication date |
---|---|
CN113641009B (en) | 2022-01-28 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |
GR01 | Patent grant | |