CN202159302U - Augment reality system with user interaction and input functions - Google Patents
- Publication number
- CN202159302U CN202159302U CN2011202698202U CN201120269820U CN202159302U CN 202159302 U CN202159302 U CN 202159302U CN 2011202698202 U CN2011202698202 U CN 2011202698202U CN 201120269820 U CN201120269820 U CN 201120269820U CN 202159302 U CN202159302 U CN 202159302U
- Authority
- CN
- China
- Prior art keywords
- user
- reality system
- augmented reality
- augment reality
- infrared
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Expired - Fee Related
Landscapes
- User Interface Of Digital Computer (AREA)
Abstract
The utility model discloses an augmented reality system with user interaction and input functions. The system employs an infrared video tracking device and position marks such as small disks; by presetting the positions of the marks, the system can identify each mark and treat its position as a virtual button. The user operates these marks to interact with the augmented reality system: when the user places a hand on a button, the system knows that the user is interacting with it, and the infrared video tracking system then identifies the user's current posture, thereby determining the content the user wishes to convey to the system. The system can also augment the user's view so that the user obtains feedback information.
Description
Technical field
The utility model belongs to the technical field of augmented reality, and specifically relates to an augmented reality system; more exactly, to an augmented reality system that realizes user interaction and input functions by means of augmented reality technology. Through this system the real world can be augmented, and within the augmented virtual scene the user can interact with the virtual system.
Background technology
Augmented reality (Augmented Reality, abbreviated AR), also referred to as mixed reality, applies virtual information to the real world through computer technology, so that the real environment and virtual objects are superimposed in the same picture or space in real time and exist simultaneously. From the user's point of view, the real world is enhanced by prepared computer models (virtual models); the enhancement may include labels, three-dimensional rendered models, or changes of shadow and illumination. On the basis of trigger signals obtained from the real world, augmented reality enables the user to communicate interactively between the virtual world and the real world. Applications of augmented reality include computer-aided diagnosis, computer-aided repair or maintenance, computer-aided product upgrading, and internal structure design.
In a typical augmented reality system, the real-world scene is enhanced by computer-generated graphics prepared in advance. In this process the visual effects of the real world and the virtual world must be kept synchronous; in other words, the coordinate systems of the real world and the virtual world must be consistent. The position of a computer-generated virtual graphic in the augmented reality view is determined by the geometric model of the real world. For the virtual graphic to blend properly with real-world imagery, the user's pose and the virtual camera must first be made consistent with the real-world viewpoint and lighting. This consistency requires continuously tracking recognizable objects in the real world, and using the tracking information together with the recognized objects' positions to update in real time the positions of the virtual models in the augmented reality scene. Once these two capabilities are combined, real-world objects and computer-generated graphics can be merged seamlessly in the augmented reality scene, achieving the purpose of enhancing the real-world scene with stored and pre-processed information.
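The consistency requirement above amounts to expressing virtual content in the tracked camera frame and projecting it with the camera model. A minimal pinhole-projection sketch (the pose and intrinsic values here are illustrative assumptions, not taken from the patent):

```python
import numpy as np

def project_point(p_world, R, t, fx, fy, cx, cy):
    """Project a 3-D world point into pixel coordinates.

    R, t is the estimated camera pose (world-to-camera rotation and
    translation), and fx, fy, cx, cy are the camera intrinsics; the
    tracker's job is to keep R and t up to date every frame so that
    virtual and real content stay aligned.
    """
    p_cam = R @ p_world + t                # world frame -> camera frame
    u = fx * p_cam[0] / p_cam[2] + cx      # perspective projection
    v = fy * p_cam[1] / p_cam[2] + cy
    return u, v

# With an identity pose, a point 2 m straight ahead of the camera
# projects onto the principal point (cx, cy).
u, v = project_point(np.array([0.0, 0.0, 2.0]),
                     np.eye(3), np.zeros(3),
                     800.0, 800.0, 320.0, 240.0)
print(u, v)  # -> 320.0 240.0
```

Re-estimating R and t on every frame and re-projecting all virtual content is what keeps the two coordinate systems consistent.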
Owing to progress in tracking speed and accuracy, augmented reality can now run in real time, and user interaction has become more intuitive. To exploit the potential of augmented reality technology more fully, the user needs to interact with the augmented reality system as with a conventional system, for instance with a keyboard and mouse. However, incorporating such devices into an augmented reality system makes the system cumbersome. Some more advanced interaction methods are, because of their inherent problems, also extremely difficult to integrate into an augmented reality system, such as "teaching" the augmented reality system. Most augmented reality systems currently in use or under development generally lack an easy-to-use, intuitive, and effective method of user interaction.
The utility model content
In view of the shortcomings and deficiencies of existing augmented reality systems in user interaction and input, the applicant has made improvements through research, and provides an augmented reality system with reliable, high-speed user interaction and input functions, so as to improve the ease of use and the user's sense of participation when using the augmented reality system, while requiring minimal additional hardware and minimal additional processing power.
The technical scheme of the utility model is as follows:
An augmented reality system with user interaction and input functions, the system comprising:
a display device, used to show the result of the augmentation to the user;
a video image tracking device, used to determine the positions of objects in the real world;
a processor, used to calculate position transformations;
one or more marks, used to trigger user input functions.
A further technical scheme is: the said video image tracking device is a video image tracking device based on infrared image processing, comprising:
an infrared camera;
an infrared filtering lens;
one or more infrared LED lamps arranged around the infrared camera.
A further technical scheme is: through processing by the processor, the said mark is enhanced into a button and presented in the user's view.
A further technical scheme is: through processing by the processor, the said mark is enhanced into a menu and presented in the user's view.
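As an illustration of enhancing a mark into a button, the following sketch composites a filled circle into an RGB view at the mark's projected pixel position. The function `draw_button`, its parameters, and the flat 2-D rendering are assumptions for illustration; a real system would render a shaded widget synchronized with the tracked pose.

```python
import numpy as np

def draw_button(view, u, v, radius=4, color=(0, 255, 0)):
    """Overlay a filled circular 'button' on the user's view at the
    mark's projected pixel position (u, v)."""
    h, w, _ = view.shape
    ys, xs = np.ogrid[:h, :w]                      # coordinate grids
    mask = (xs - u) ** 2 + (ys - v) ** 2 <= radius ** 2
    view[mask] = color                             # paint the disk
    return view

# A blank 64x48 view with one green button at its centre.
view = np.zeros((48, 64, 3), dtype=np.uint8)
draw_button(view, 32, 24)
print(view[24, 32])  # centre pixel is now green
```

The same overlay routine, driven by a list of projected mark positions, would serve equally for a row of menu entries.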
The beneficial technical effects of the utility model are:
1. The utility model combines the real world and virtual objects through an augmented reality system, providing the user with a way to augment reality.
2. The utility model realizes conventional interaction between the user and the augmented reality system without additional equipment, and this interaction is reliable and fast.
3. The utility model uses video tracking technology, so it can be deployed with minimal hardware and minimal processing resources of the augmented reality system.
Description of drawings
Fig. 1 is a schematic diagram of the structure of the utility model.
Fig. 2 shows an infrared tracking camera based on infrared LED illumination.
Fig. 3 is a flowchart of the interaction between the user and the augmented reality system of the utility model, from the user's perspective.
Embodiment
The utility model is further described below in conjunction with the accompanying drawings and embodiments.
Fig. 1 shows the augmented reality system of the utility model. As shown in Fig. 1, the augmented reality system comprises a display device 1 that shows the real world and its augmentation; a video image tracking device 2 that determines the positions of real-world objects; and a processor 3 that computes the user's view, projects virtual three-dimensional objects into the real world, and presents them within the user's view. In this example the display device 1 is a head-mounted display (HMD), but in practical applications it is not limited to a head-mounted display. The processor 3 in this example is a desktop computer, but in practical applications it is not limited to a desktop computer. For ease of explanation, the augmented reality system is confined to a designated work space 4, which contains a marker plate 5 and at least one mark 6 that triggers a loading routine. The video image tracking device 2 and the marker plate 5 together determine the position and orientation of the user's head, and from this information the scene seen by the user is determined.
As shown in Fig. 2, the video image tracking device 2 comprises an infrared camera 7, an infrared filtering lens 8, and several infrared LED illumination lamps 9 mounted around the infrared camera 7. As shown in Fig. 1, the video image tracking device 2 is connected to the processor 3; video is captured by the video image tracking device 2 and passed through the image capture system to the processor 3 for processing, which identifies the reflective markers on the marker plate 5 in the image. Because the captured video has been filtered, the only visible content is the infrared light returned by the reflective markers. Since the video image tracking device 2 is confined to the pre-designed work space 4, the positions of the reflective markers on the marker plate 5 are known, and the processor 3 can unambiguously determine the user's posture and position.
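Because the infrared filtering leaves the retro-reflective markers as essentially the only bright content in the frame, recovering their image positions reduces to thresholding followed by blob labelling. A minimal sketch of that idea (the threshold value and the synthetic frame are illustrative assumptions, not the patent's algorithm):

```python
import numpy as np

def find_marker_centroids(ir_frame, threshold=200):
    """Locate bright retro-reflective blobs in an IR-filtered frame and
    return their (x, y) centroids, via a toy flood-fill labelling pass."""
    bright = ir_frame > threshold
    visited = np.zeros_like(bright)
    centroids = []
    for y, x in zip(*np.nonzero(bright)):
        if visited[y, x]:
            continue
        stack, pixels = [(y, x)], []
        while stack:                       # flood-fill one blob
            cy, cx = stack.pop()
            if (0 <= cy < bright.shape[0] and 0 <= cx < bright.shape[1]
                    and bright[cy, cx] and not visited[cy, cx]):
                visited[cy, cx] = True
                pixels.append((cy, cx))
                stack += [(cy + 1, cx), (cy - 1, cx),
                          (cy, cx + 1), (cy, cx - 1)]
        ys, xs = zip(*pixels)
        centroids.append((float(sum(xs)) / len(xs),
                          float(sum(ys)) / len(ys)))
    return centroids

# Synthetic IR frame with one 2x2 bright blob centred at (10.5, 5.5).
frame = np.zeros((20, 20), dtype=np.uint8)
frame[5:7, 10:12] = 255
print(find_marker_centroids(frame))  # -> [(10.5, 5.5)]
```

In a production system a library connected-components routine would replace the hand-rolled flood fill, but the filtering step is what makes so simple a detector viable.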
As shown in Fig. 1, in the augmented reality system the marker plate 5 is used to determine the user's posture. The reflective markers on the marker plate 5 are a number of small reflective disks 10 arranged around four thin reflectors 11. In practical application, the position information of the reflectors 11 is stored in the processor 3, so when the infrared camera 7 photographs the reflectors 11, the user's posture can be determined, and this estimate of the user's posture is returned to the augmented reality system.
Once the marker plate 5 is identified in the video, the position of the mark 6 that triggers the loading routine can also be estimated in the video sequence. The mark 6 is located by its actual position in the real world: from the tracker and the estimated posture, the position within the user's view of any object in the real world is determined; that is to say, the projection of the trigger mark 6 into the user's view can be calculated. Once the trigger mark 6 becomes visible, the input function corresponding to the mark 6 is loaded into the augmented reality system, and the system enters input mode and waits for the user's action.
The augmented reality system judges whether the user is interacting with it. If the recognized object does not appear in the user's view, the system does not react to the user's behavior even if the user touches the trigger input mark; otherwise, the user is regarded as interacting with the augmented reality system.
It should be pointed out that in the utility model, the type and function of the mark 6 that triggers the loading routine are handled by the processor 3 on the basis of the positions of the corresponding markers. Once the position of the marker plate 5 has been estimated, any number of input devices can be implemented by placing markers at agreed locations. For example, a 4x3 matrix of markers can be placed at a given location to emulate a numeric keypad input device, similar to the digit keys on a telephone. Doing so also gives the augmented reality system greater scalability.
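The 4x3 marker matrix could be mapped to keys as follows; the layout, the function name, and the grid-indexing convention are assumptions for illustration:

```python
# Hypothetical layout for the 4x3 marker matrix described above,
# arranged like the digit keys of a telephone keypad.
KEYPAD = [["1", "2", "3"],
          ["4", "5", "6"],
          ["7", "8", "9"],
          ["*", "0", "#"]]

def key_for_marker(row, col):
    """Translate the grid position of a touched marker into its key."""
    return KEYPAD[row][col]

print(key_for_marker(0, 0), key_for_marker(3, 1))  # -> 1 0
```

Since each marker's real-world position is known, mapping a touched marker back to its (row, col) grid cell is a lookup, and the same scheme extends to any other grid of virtual keys.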
The utility model uses infrared-based image tracking over a video sequence, but is not limited to infrared image tracking of video sequences. Position marks are adopted; one instance of such a mark is a small disk. In the processing device the image is processed so that, for example, the disk is enhanced in the user's visual scene into a round button, though the enhancement is not limited to this. The user can place a hand on these buttons, i.e. on the disks, to trigger an interactive function. By a similar principle, the augmented reality system can also imitate functions such as computer menus and give the user the necessary feedback and interaction. Through the infrared video tracking device, the augmented reality system can then identify the user's current posture and thereby determine what the user wishes to convey to the system. The augmented reality system can also augment the user's view so that the user obtains feedback information.
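The description does not spell out how a hand on a disk is detected. One plausible realization, given the infrared tracking described above, is to treat the disappearance of the disk's reflection at its known image position as a press: the hand covers the retro-reflective disk, so no bright pixels remain in its window. The following is a sketch under that assumption (window size and threshold are illustrative, and the function is hypothetical, not the patent's stated method):

```python
import numpy as np

def button_pressed(ir_frame, mark_uv, radius=2, threshold=200):
    """Return True if the reflective disk at pixel position mark_uv is
    occluded, i.e. no bright reflection remains in a small window
    around its known projected location."""
    u, v = mark_uv
    window = ir_frame[max(v - radius, 0): v + radius + 1,
                      max(u - radius, 0): u + radius + 1]
    return not (window > threshold).any()

frame = np.zeros((20, 20), dtype=np.uint8)
frame[10, 10] = 255                     # disk reflection visible
print(button_pressed(frame, (10, 10)))  # -> False
frame[10, 10] = 0                       # hand covers the disk
print(button_pressed(frame, (10, 10)))  # -> True
```

A robust system would debounce this test over several frames so that a momentary tracking dropout is not mistaken for a press.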
Based on the above description, Fig. 3 shows how the user interacts with the augmented reality system of the utility model. The flow is as follows:
1) Step 101: place a predefined visible sign object at a specific location in the work space, then go to step 102;
2) Step 102: use the camera device to capture a sequence of video images of the real world, then go to step 103;
3) Step 103: search the image sequence for the predefined sign object; if it is found, go to step 104, otherwise return to step 102;
4) Step 104: calculate the exact position of the sign object in the real world, then go to step 105;
5) Step 105: load and open the input function module, then go to step 106;
6) Step 106: enter input mode, then go to step 107;
7) Step 107: judge whether the sign object is visible; if so, go to step 108, otherwise return to step 106;
8) Step 108: execute the corresponding loading routine; the flow ends.
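The steps above can be sketched as a loop over a scripted frame sequence. Here `find_marker`, `marker_visible`, and `load_program` are hypothetical callbacks standing in for the patent's camera head, image-sequence search, and input-function module:

```python
def run_flow(frames, find_marker, marker_visible, load_program):
    """Sketch of the Fig. 3 interaction flow (steps 101-108)."""
    it = iter(frames)
    for frame in it:                        # step 102: capture an image
        loc = find_marker(frame)            # step 103: search for the sign object
        if loc is None:
            continue                        # not found -> back to step 102
        # steps 104-105: position known; input module loaded (implicit here)
        for frame in it:                    # step 106: input mode
            if marker_visible(frame, loc):  # step 107: sign object visible?
                return load_program(loc)    # step 108: run the loaded routine
            # not visible -> remain in input mode (back to step 106)
    return None

# Scripted frames: the sign object is found in the second frame and
# becomes visible to the user in the fourth.
frames = [{"loc": None, "vis": False},
          {"loc": (3, 4), "vis": False},
          {"loc": None, "vis": False},
          {"loc": None, "vis": True}]
result = run_flow(frames,
                  find_marker=lambda f: f["loc"],
                  marker_visible=lambda f, loc: f["vis"],
                  load_program=lambda loc: ("loaded", loc))
print(result)  # -> ('loaded', (3, 4))
```

Sharing one iterator between the outer and inner loops mirrors the flowchart: the system keeps consuming frames whichever state it is in.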
The above describes only the preferred embodiment of the utility model, and the utility model is not limited to this embodiment. It should be understood that other improvements and variations directly derived or conceived by those skilled in the art, without departing from the spirit and design of the utility model, shall all be considered to fall within the protection scope of the utility model.
Claims (2)
1. An augmented reality system with user interaction and input functions, characterized in that the system comprises:
a display device, used to show the result of the augmentation to the user;
a video image tracking device, used to determine the positions of objects in the real world;
a processor, used to calculate position transformations;
one or more marks, used to trigger user input functions.
2. The augmented reality system with user interaction and input functions according to claim 1, characterized in that: the said video image tracking device is a video image tracking device based on infrared image processing, comprising:
an infrared camera;
an infrared filtering lens;
one or more infrared LED lamps arranged around the infrared camera.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN2011202698202U CN202159302U (en) | 2011-07-28 | 2011-07-28 | Augment reality system with user interaction and input functions |
Publications (1)
Publication Number | Publication Date |
---|---|
CN202159302U true CN202159302U (en) | 2012-03-07 |
Family
ID=45766932
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN2011202698202U Expired - Fee Related CN202159302U (en) | 2011-07-28 | 2011-07-28 | Augment reality system with user interaction and input functions |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN202159302U (en) |
Cited By (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102778998A (en) * | 2012-03-19 | 2012-11-14 | 联想(北京)有限公司 | Interaction method, device and system |
US9323364B2 (en) | 2012-03-19 | 2016-04-26 | Beijing Lenovo Software Ltd | Interactive method, apparatus and system |
CN105814876A (en) * | 2013-12-19 | 2016-07-27 | 索尼公司 | Image processing device and method, and program |
CN105814876B (en) * | 2013-12-19 | 2019-07-02 | 索尼公司 | Image processing equipment and method |
WO2016165548A1 (en) * | 2015-04-16 | 2016-10-20 | 北京蚁视科技有限公司 | Vision localization system and method based on high reflective infrared identification |
CN108076674A (en) * | 2015-06-16 | 2018-05-25 | 利勃海尔比伯拉赫零部件有限公司 | The method for assembling electrical switching system and the auxiliary assembling device for the assembling for simplifying the switching system |
US10566771B2 (en) | 2015-06-16 | 2020-02-18 | Leibherr-Components Biberach GmbH | Method for mounting electric switching systems and assembly support device for simplifying the assembly of such switching systems |
WO2017088187A1 (en) * | 2015-11-27 | 2017-06-01 | 深圳市欢创科技有限公司 | System and method for implementing position tracking of virtual reality device |
CN105528081A (en) * | 2015-12-31 | 2016-04-27 | 广州创幻数码科技有限公司 | Mixed reality display method, device and system |
CN105528081B (en) * | 2015-12-31 | 2019-02-19 | 广州创幻数码科技有限公司 | Mixed reality display method, device and system |
WO2019153970A1 (en) * | 2018-02-06 | 2019-08-15 | 广东虚拟现实科技有限公司 | Head-mounted display apparatus |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN202159302U (en) | Augment reality system with user interaction and input functions | |
US11887312B2 (en) | Fiducial marker patterns, their automatic detection in images, and applications thereof | |
US10521021B2 (en) | Detecting, representing, and interpreting three-space input: gestural continuum subsuming freespace, proximal, and surface-contact modes | |
US11978243B2 (en) | System and method using augmented reality for efficient collection of training data for machine learning | |
US9495013B2 (en) | Multi-modal gestural interface | |
EP2427857B1 (en) | Gesture-based control systems including the representation, manipulation, and exchange of data | |
EP3095074B1 (en) | 3d silhouette sensing system | |
CN102854983B (en) | A kind of man-machine interaction method based on gesture identification | |
US8681098B2 (en) | Detecting, representing, and interpreting three-space input: gestural continuum subsuming freespace, proximal, and surface-contact modes | |
JP2018077882A (en) | Method and system for operation environment having multiple client devices and displays | |
CN102460373A (en) | Surface computer user interaction | |
CN108304075A (en) | A kind of method and apparatus carrying out human-computer interaction in augmented reality equipment | |
KR20140068855A (en) | Adaptive tracking system for spatial input devices | |
CN102306065A (en) | Realizing method of interactive light sensitive touch miniature projection system | |
WO2018204070A1 (en) | Real time object surface identification for augmented reality environments | |
CN109656363A (en) | It is a kind of for be arranged enhancing interaction content method and apparatus | |
CN108089713A (en) | A kind of interior decoration method based on virtual reality technology | |
Margetis et al. | Augmenting physical books towards education enhancement | |
Raees et al. | Thumb inclination-based manipulation and exploration, a machine learning based interaction technique for virtual environments | |
Liu et al. | A cross-platform framework for physics-based collaborative augmented reality | |
RE | Low cost augmented reality for industrial problems | |
Xu et al. | A New Architecture of Augmented Reality Engine | |
Konrad et al. | Building a Portable Low Cost Tangible User Interface Based on a Tablet Computer | |
CN115525151A (en) | Immersive interactive large screen implementation method | |
Gupta | The Universal Media Book |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C14 | Grant of patent or utility model | ||
GR01 | Patent grant | ||
C17 | Cessation of patent right | ||
CF01 | Termination of patent right due to non-payment of annual fee |
Granted publication date: 20120307 | Termination date: 20130728