CN208722146U - Wearables for Assisted Location Tracking - Google Patents

Wearables for Assisted Location Tracking

Info

Publication number
CN208722146U
CN208722146U
Authority
CN
China
Prior art keywords
marker
sub
wearable device
image
frame
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201821616682.9U
Other languages
Chinese (zh)
Inventor
胡永涛
戴景文
贺杰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Virtual Reality Technology Co Ltd
Original Assignee
Guangdong Virtual Reality Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Virtual Reality Technology Co Ltd
Priority to CN201821616682.9U
Application granted
Publication of CN208722146U
Legal status: Active

Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

The present application provides a wearable device for assisted location tracking, including a frame, temples connected to the frame, and a marker disposed on the frame. After the marker is recognized by a terminal device, it is used to determine the relative positional relationship between the user's eyes and the terminal device. With this wearable device, an image display device can acquire the position information of the marker by capturing an image containing the marker integrated on the wearable device, so as to track the wearable device.

Description

Wearable device for assisted location tracking
Technical field
This application relates to the field of image display, and in particular to a wearable device for assisted location tracking.
Background
With the development of science and technology, machine intelligence and intelligent information processing have become increasingly popular, and technologies that recognize user images through image acquisition devices such as machine vision or virtual vision to realize human-computer interaction are more and more important. Augmented Reality (AR) uses computer graphics and visualization techniques to construct virtual objects that do not exist in the real environment, accurately fuses the virtual objects into the real environment by means of image recognition and localization, and, through a display device, merges the virtual objects with the real environment to present a sensorially real experience to the user. The primary technical problem that augmented reality must solve is how to fuse virtual objects into the real world accurately, that is, to make a virtual object appear at the correct position in the real scene with the correct angular pose, thereby producing a strong sense of visual realism. In conventional techniques, the angle at which a virtual object is presented is usually fixed, and the user has to change the display viewing angle of the virtual object by manipulating a controller, which is inconvenient for the user.
Utility model content
The embodiments of the present application aim to provide a wearable device for assisted location tracking.
An embodiment of the present application provides a wearable device for assisted location tracking, including a frame, temples connected to the frame, and a marker disposed on the frame. After the marker is recognized by a terminal device, it is used to determine the relative positional relationship between the user's eyes and the terminal device.
In some embodiments, the frame includes a left rim and a right rim arranged side by side, and the marker includes a first group and a second group; the first group is disposed on the left rim, and the second group is disposed on the right rim.
In some embodiments, the first group includes a first sub-marker and a second sub-marker, which are disposed on two sides of the left rim respectively.
In some embodiments, the second group includes a third sub-marker and a fourth sub-marker, which are disposed on two sides of the right rim respectively.
In some embodiments, the sub-markers of the first group are identical to the sub-markers of the second group.
In some embodiments, the first and second sub-markers of the first group and the third and fourth sub-markers of the second group each include a background and one feature point distinguishable from the background.
In some embodiments, the device further includes a nose bridge connected between the left rim and the right rim, and the marker further includes a fifth sub-marker disposed on the nose bridge.
In some embodiments, the fifth sub-marker is different from the sub-markers of the first group and from the sub-markers of the second group.
In some embodiments, the fifth sub-marker includes a background and two feature points distinguishable from the background.
In some embodiments, the first group is disposed at the end of the left rim away from the nose bridge.
In some embodiments, the second group is disposed at the end of the right rim away from the nose bridge.
In some embodiments, the device further includes a filter layer covering the marker.
With the wearable device provided by the embodiments of the present application, a non-wearable image display device can acquire the position information of the marker by capturing images containing the marker integrated on the wearable device, so as to track the wearable device. The image display device can determine the relative spatial position relationship between the image display device and the wearable device according to the image containing the marker, and display the constructed virtual object at the corresponding viewing angle according to that relationship. The display viewing angle of the virtual object thus follows the positional relationship between the wearable device and the image display device, allowing the user to conveniently observe the virtual object from multiple angles and improving the interactivity between the user and the virtual content.
Brief description of the drawings
To illustrate the technical solutions of the application more clearly, the drawings needed in the embodiments are briefly introduced below. Obviously, the drawings described below are only some embodiments of the application; for those of ordinary skill in the art, other drawings can be obtained from these drawings without creative effort.
Fig. 1 is a schematic diagram of an application scenario of the location tracking method provided by an embodiment of the present application;
Fig. 2 is a structural schematic diagram of a wearable device provided by an embodiment of the present application;
Fig. 3 is a structural schematic diagram of a wearable device provided by another embodiment of the present application;
Fig. 4 is a structural schematic diagram of a wearable device provided by yet another embodiment of the present application;
Fig. 5 is a schematic flowchart of the location tracking method provided by an embodiment of the present application;
Fig. 6 is a functional block diagram of the location tracking apparatus provided by an embodiment of the present application.
Specific embodiments
The technical solutions in the embodiments of the present application are described below clearly and completely with reference to the drawings in the embodiments. Obviously, the described embodiments are only some of the embodiments of the application, not all of them. Based on the embodiments in this application, all other embodiments obtained by those of ordinary skill in the art without creative effort shall fall within the protection scope of this application.
It should be noted that when a component is referred to as being "fixed to" another component, it can be directly on the other component, or intervening components may also be present. When a component is considered to be "connected to" another component, it can be directly connected to the other component, or intervening components may be present at the same time. When a component is considered to be "disposed on" another component, it can be directly disposed on the other component, or intervening components may be present at the same time.
Unless otherwise defined, all technical and scientific terms used herein have the same meanings as commonly understood by those skilled in the technical field to which this application belongs. The terms used in the description of this application are only for the purpose of describing specific embodiments and are not intended to limit the application. The term "and/or" as used herein includes any and all combinations of one or more of the associated listed items.
Referring to Fig. 1, an interactive system 10 for virtual content provided by an embodiment of the present application is shown. The interactive system 10 includes an image display device 300 and a wearable device 400. In this embodiment, the wearable device 400 is provided with a marker 450 (see Fig. 2). In use, the marker 450 on the wearable device 400 can be within the field of view of the image display device 300, so that the image display device 300 can capture an image of the marker 450 and recognize the marker 450.
In this embodiment, the image display device 300 can be a mobile device such as a mobile phone or a tablet, or a desktop virtual reality/augmented reality display device. When the image display device 300 is a desktop display device, it can be an integrated display device or a display device with an external connection; for example, the image display device 300 can be connected with an external smart terminal such as a mobile phone, i.e. an external device (such as a mobile phone or tablet computer) can be plugged into or connected to the image display device 300, and the virtual object is displayed on the image display device 300.
In this embodiment, the image display device 300 is a desktop display device comprising an image acquisition device 301 and a display device 303, which together form the desktop display device. Specifically, the display device 303 includes a control center 3031 and a display 3033; the display 3033 can be a transflective lens, and the control center 3031 is used to project the displayed virtual object onto the display 3033, enabling the user to observe the virtual object on the display 3033. While seeing the virtual object on the display 3033, the user can also observe the environment in front through the display 3033, so the image obtained by the user's eyes is a virtual-reality superimposed scene in which the virtual content is overlaid on the environment ahead. The image acquisition device 301 is electrically connected to the display device 303 and is also used to obtain environmental information within its field of view.
The wearable device 400 is worn by the user, enabling the image display device 300 to determine the display viewing angle of the virtual object according to the position information of the wearable device 400. The marker 450 is integrated on the wearable device 400 and can be located within the field of view of the image acquisition device 301 of the image display device 300, i.e. the image acquisition device 301 can capture an image of the marker 450. The image of the marker 450 is stored in the image display device 300, so that the image display device 300 can determine the relative position information with respect to the marker 450 according to the image of the marker 450, and then render and display the virtual object.
The marker 450 can be a tag image including at least one sub-marker having a definite shape. Of course, the specific marker 450 is not limited in the embodiments of this application; it only needs to be recognizable by the image display device 300.
As shown in Fig. 1, the user wears the wearable device 400. When the marker 450 on the wearable device 400 is within the field of view of the image acquisition device 301 of the image display device 300, the image acquisition device 301 can capture an image of the marker 450. According to the captured image containing at least one feature point distributed on the marker 450, information such as the relative positional relationship and rotational relationship between the marker 450 and the image display device 300 can be determined, so as to render and display a virtual object, such as the building model 600 shown in Fig. 1, which is the virtual object displayed in correspondence with the marker 450. The user can thus observe the virtual object on the basis of the marker and see different viewing angles of the virtual object at different positions. As shown in Fig. 1, the virtual object 600 presented by the image display device 300 is a simulated building model. When the user wearing the wearable device 400 stands at position A and observes the virtual object 600 on the image display device 300, the user can observe a view of the simulated building model at a first viewing angle (such as a preset axonometric view); after the user moves to position B, the user can observe the structural view of the left side of the simulated building model relative to the user.
For the above scenario, an embodiment of the present application provides a location tracking method, whose execution subject can be the above-mentioned image display device. In this location tracking method, the image display device can acquire the position information of the marker by capturing images containing the marker integrated on the wearable device, so as to track the wearable device. The image display device can determine the relative spatial position relationship between the image display device and the wearable device according to the image containing the marker, and render the virtual object at a corresponding viewing angle according to that relative spatial position relationship, so that the display viewing angle of the virtual object follows the positional relationship between the wearable device and the image display device, allowing the user to conveniently observe the virtual object from multiple angles.
Referring to Fig. 5, in one embodiment, the application provides a location tracking method applied to the above-mentioned image display device. The method comprises steps S101 to S105.
Step S101: obtaining an image containing the marker, wherein the marker is disposed on the wearable device.
Step S103: recognizing the marker in the image, and determining the relative spatial position information between the wearable device and the image display device according to the marker.
Step S105: rendering the virtual object according to the relative spatial position information and displaying the virtual object on the image display device.
In one embodiment, based on the above location tracking method, the application also provides a location tracking method applied to the above-mentioned image display device. The method comprises steps S101 to S105.
Step S101: obtaining an image containing the marker, wherein the marker is disposed on the wearable device.
Further, the image containing the marker is captured by the image acquisition device of the image display device. The marker can be integrated on the wearable device; for example, it can be a fixed pattern presented on the surface of the wearable device, or a pattern selectively presented on the surface of the wearable device (such as a pattern displayed after the wearable device is powered on).
Further, the wearable device can be a pair of glasses worn by the user. At least two markers can be disposed on the glasses, respectively at the left rim and the right rim of the glasses, for identifying the user's eye positions. In some embodiments, at least three markers can be disposed on the glasses, respectively at the left rim, the right rim and the nose bridge of the glasses, so as to roughly identify the plane where the frame of the glasses lies, which facilitates fitting the outline of the glasses and obtaining the spatial angle and rotational attitude of the glasses. It is to be understood that there can be one or more markers on the wearable device, and the wearable device can also be another device, such as a cap with a marker, a necklace with a marker, a watch, a shirt, and so on, but is not limited thereto.
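As an illustration of the plane-fitting idea above, the following is a minimal sketch, assuming the 3D positions of the three markers (left rim, right rim, nose bridge) have already been recovered by a pose-estimation step; the function name and the use of NumPy are illustrative assumptions, not part of this utility model.

```python
# A minimal sketch, assuming three marker positions in camera coordinates:
# two edges spanned by the three points define the plane of the glasses
# frame, and its unit normal gives the frame's spatial orientation.
import numpy as np

def frame_plane_normal(p_left, p_right, p_bridge):
    """Return the unit normal of the plane through the three marker positions."""
    p_left, p_right, p_bridge = (np.asarray(p, dtype=float)
                                 for p in (p_left, p_right, p_bridge))
    normal = np.cross(p_right - p_left, p_bridge - p_left)
    return normal / np.linalg.norm(normal)

# Hypothetical positions (meters): the normal indicates where the frame faces.
print(frame_plane_normal([-0.07, 0.0, 0.5], [0.07, 0.0, 0.5], [0.0, 0.01, 0.49]))
```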
Step S103: recognizing the marker in the image, and determining the relative spatial position information between the wearable device and the image display device according to the marker.
In some embodiments, the wearable device may include at least one marker. In this case, step S103 may include: recognizing the at least one marker contained in the acquired image, calculating the relative position and attitude relationship between the at least one marker and the image display device, and determining the relative spatial position relationship between the image display device and the wearable device.
Further, according to the display state, size and display angle of the marker in the image, the position information and orientation information of the marker relative to the image display device are calculated, thereby determining the relative spatial position relationship between the wearable device and the image display device. The image display device can directly take the position information and orientation information of the marker relative to the image display device as the relative spatial position relationship between the wearable device and the image display device.
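For concreteness, here is a hedged sketch of this calculation using the standard perspective-n-point approach; the feature-point layout, the camera intrinsics and the use of OpenCV's solvePnP are assumptions for illustration, since the utility model does not prescribe a specific algorithm.

```python
# A minimal sketch, assuming OpenCV and a known physical layout of the five
# sub-markers in the marker's own coordinate system (hypothetical values).
import cv2
import numpy as np

MARKER_POINTS_3D = np.array([
    [-0.07,  0.02, 0.0],   # first sub-marker, upper left rim
    [-0.07, -0.02, 0.0],   # second sub-marker, lower left rim
    [ 0.07,  0.02, 0.0],   # third sub-marker, upper right rim
    [ 0.07, -0.02, 0.0],   # fourth sub-marker, lower right rim
    [ 0.00,  0.02, 0.0],   # fifth sub-marker, nose bridge
], dtype=np.float64)

def estimate_relative_pose(points_2d, camera_matrix, dist_coeffs):
    """points_2d: (5, 2) pixel coordinates of the detected sub-markers,
    ordered as in MARKER_POINTS_3D; returns the marker's rotation matrix
    and translation vector relative to the image acquisition device."""
    ok, rvec, tvec = cv2.solvePnP(MARKER_POINTS_3D, points_2d,
                                  camera_matrix, dist_coeffs)
    if not ok:
        raise RuntimeError("pose estimation failed")
    rotation, _ = cv2.Rodrigues(rvec)  # 3x3 rotation (orientation information)
    return rotation, tvec              # tvec is the position information
```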
In one embodiment, the wearable device can be a pair of glasses. When the glasses are worn, step S103 may include: determining the relative spatial position information between the eyes of the user wearing the glasses and the image display device according to the marker.
In some embodiments, the marker disposed on the glasses may include multiple sub-markers, and a sub-marker can be a pattern with a definite shape. Each sub-marker may include one or more feature points, where the shape of a feature point is not limited; it can be a dot, a ring, a triangle, or another shape. The multiple sub-markers can be separately disposed at different positions on the frame of the glasses, and the multiple sub-markers can together constitute one marker. The image display device captures an image containing the multiple sub-markers, recognizes each sub-marker, and obtains the feature information of each sub-marker and the arrangement positional relationship among them, so as to obtain the relative spatial position relationship between the glasses and the image display device.
In this case, step S103 may include:
Step S1031: determining, according to the image of the marker, the sub-markers that the marker includes;
Step S1032: locating the eye region of the user wearing the glasses according to the sub-markers, and determining the relative spatial position information between the eyes of the user wearing the glasses and the image display device.
Thus, by extracting the sub-markers from the image of the marker and tracking the sub-markers to determine the user's eye region, the relative spatial position information between the user's eyes and the image display device can be determined more accurately. The above sub-marker can be understood as the image of the marker at a specific position, or a partial image of the marker. For example, the image or partial image of the marker located outside the user's eyes can be regarded as a sub-marker, which facilitates identifying the user's eye region; alternatively, the image or partial image of the marker located between the user's eyes can be regarded as a sub-marker, which facilitates locating the user's eye region.
Further, the multiple sub-markers disposed on the glasses worn by the user can be identical sub-markers or different sub-markers. Different sub-markers can have different feature information, which may include but is not limited to the shape and color of the sub-marker and the number of feature points it includes. The image display device recognizes the multiple sub-markers disposed on the glasses and can obtain the arrangement positional relationship among the sub-markers, where the arrangement positional relationship refers to the relative positions, distribution order, etc. of the sub-markers. The attitude and position of the marker can then be determined according to the arrangement positional relationship among the sub-markers, the size of the sub-markers and the like, thereby obtaining the relative spatial position relationship between the glasses and the image display device, or the relative spatial position information between the eyes of the user wearing the glasses and the image display device.
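One possible way to tell sub-markers apart by their number of feature points, as described above, is sketched below; the blob detector, its default parameters and the two-category mapping are assumptions chosen for illustration only.

```python
# A hedged sketch: classify a cropped sub-marker region by counting its
# feature points as blobs. Detector settings are assumed defaults.
import cv2

_detector = cv2.SimpleBlobDetector_create()

def classify_sub_marker(region_gray):
    """region_gray: 8-bit grayscale crop of one sub-marker.
    One feature point suggests a rim sub-marker; two suggest the
    nose-bridge (fifth) sub-marker, per the scheme described here."""
    count = len(_detector.detect(region_gray))
    if count == 2:
        return "nose_bridge"
    if count == 1:
        return "rim"
    return "unknown"
```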
In one embodiment, in addition to separately disposing the multiple sub-markers of one marker at different positions on the frame of the glasses, a complete marker including multiple sub-markers can also be disposed on the frame of the glasses, for example one marker disposed at the middle position of the glasses. Multiple markers can also be separately disposed at different positions on the frame of the glasses, so as to obtain the relative spatial position relationship between the multiple markers and the image display device, or the relative spatial position information between the eyes of the user wearing the glasses and the image display device.
In one embodiment, the relative spatial position information between the user's eyes and the image display device can be determined according to the user's eye image, in order to increase the speed of tracking the user's eyes and improve fluency. In this case, step S103 may also include:
Step S1035: acquiring an eye image of the user;
Step S1036: extracting eye features from the eye image, and determining the relative spatial position information between the eyes of the user wearing the glasses and the image display device according to the eye features.
Further, the user's eye region can be tracked in real time, so that the motion data of the user's eyes can be calculated according to the user's eye images and the movement of the user's eyes can be predicted, which helps anticipate the movement trend of the user's eyes and improves the acquisition efficiency of the relative spatial position information. In this case, step S1036 may include: acquiring eye images of the user wearing the glasses in real time, comparing the eye features between consecutive frames of the eye images to obtain the change in the eye position of the user wearing the glasses, and calculating the motion data of the eyes according to the change in eye position; and determining the relative spatial position information between the eyes of the user wearing the glasses and the image display device according to the motion data of the eyes.
Further, by comparing the eye features between consecutive frames of the eye images, the increment of the eye position of the user wearing the glasses is obtained, and the motion data of the eyes is estimated. For example, suppose the coordinate of the eye position at the current frame N has been determined as (X, Y, Z). By comparing the same eye features between the eye images of frame N and frame N+1, the eye position change increment is obtained as (x1, y1, z1); the coordinate of the eye position at frame N+1 can then be determined as (X+x1, Y+y1, Z+z1), and by analogy, the eye position coordinates at frames N, N+1, N+2, ... N+m can be calculated, so that the motion data of the eyes can be computed. Here, the coordinate (X, Y, Z) of the eye position at the current frame N can be determined by the above step S103, that is, the spatial position information between the wearable device and the image display device is determined according to the marker to obtain the eye position coordinate (X, Y, Z). This eye position coordinate serves as the baseline, on top of which the increments of the eye position coordinate are calculated by comparing the eye features between consecutive frames of the eye images, so as to calculate one by one the eye position coordinates at frames N+1, N+2, ... N+m. This simplifies the data processing steps and increases the speed of tracking the user's eyes, improving fluency.
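The incremental update described above can be summarized in a short sketch; the function and variable names are illustrative, assuming the frame-N anchor comes from the marker-based step S103.

```python
# A minimal sketch of the increment arithmetic above: anchor (X, Y, Z) at
# frame N from step S103, then add per-frame deltas from consecutive
# eye-image comparisons to get frames N+1 .. N+m.
import numpy as np

def track_eye_positions(anchor_xyz, increments):
    position = np.asarray(anchor_xyz, dtype=float)
    positions = []
    for delta in increments:            # (x1, y1, z1), (x2, y2, z2), ...
        position = position + np.asarray(delta, dtype=float)
        positions.append(tuple(position))
    return positions

# Matching the text: (X, Y, Z) plus (x1, y1, z1) gives frame N+1.
print(track_eye_positions((0.10, 0.05, 0.50), [(0.01, 0.00, -0.02)]))
```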
Step S1037: correcting the relative spatial position information determined from the eye image (steps S1035 to S1036) according to the relative spatial position information determined from the marker, to obtain accurate relative spatial position information between the eyes of the user wearing the glasses and the image display device. Further, the estimated position information is corrected according to the calibration position information, obtaining the relative spatial position information between the eyes of the user wearing the glasses and the image display device; here, the estimated position information is the relative spatial position information determined according to the eye image, and the calibration position information is the relative spatial position information determined according to the marker.
Specifically, in some embodiments, the above step S1037 can be executed by means of dual threads. In this case, step S1037 may include: obtaining, through a first thread, the eye image acquired in real time, and obtaining the estimated position information according to the eye image; obtaining, through a second thread, the image containing the marker, and obtaining the calibration position information according to the marker; and comparing the estimated position information with the calibration position information, and when the estimated position information is inconsistent with the calibration position information, correcting the estimated position information according to the calibration position information.
In one embodiment, the image display device can fuse the position information obtained by the two threads to obtain the relative spatial position information between the user's eyes and the image display device, where the fusion can be performed in various ways, which are not limited here. The image display device can obtain, through the first thread, the relative spatial position information of the latest frame (namely the above estimated position information), obtain, through the second thread, the relative spatial position information of the same frame as the latest frame (namely the above calibration position information), and fuse the two. For example, the average of the relative spatial position information obtained by the first thread and that obtained by the second thread can be taken, or a weighted calculation with different weights can be performed, to obtain the final relative spatial position information.
In one embodiment, since the frame rate at which the second thread obtains the relative spatial position information is lower and its speed is slower, the relative spatial position information of the same frame as the latest frame may not be directly available. The second thread can then extrapolate from the relative spatial position information obtained for the previous frame, to obtain the relative spatial position information of the same frame as the latest frame.
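A minimal sketch of the fusion step follows, assuming the two threads deliver an estimated and a calibrated position for the same frame; the fixed weight and the fallback when the slower marker thread has no sample yet are illustrative choices, not prescribed by the text.

```python
# A hedged sketch of blending the fast eye-image estimate with the slower
# marker-based calibration, per the average/weighted fusion described above.
import numpy as np

def fuse_positions(estimated, calibrated, weight=0.5):
    """weight=0.5 reproduces the plain average; other values give a
    weighted combination of calibration and estimate."""
    estimated = np.asarray(estimated, dtype=float)
    if calibrated is None:
        # The marker thread runs at a lower frame rate; without a sample
        # for this frame, fall back to the estimate (or an extrapolation).
        return estimated
    calibrated = np.asarray(calibrated, dtype=float)
    return weight * calibrated + (1.0 - weight) * estimated
```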
Thus, the relative spatial position information between the user's eyes and the image display device determined by capturing the image of the marker (denoted calibration position information) makes the relative spatial position information more accurate, while the relative spatial position information between the user's eyes and the image display device determined by acquiring the user's eye image (denoted estimated position information) increases the speed of obtaining the relative spatial position information. By further calibrating the estimated position information according to the calibration position information, more accurate relative spatial position information is obtained, taking both speed and precision into account and improving the fluency of the location tracking method.
It is to be understood that the execution order of the above steps is not limited. For example, in some specific embodiments, steps S1031 to S1033 can be executed first and then steps S1035 to S1037; alternatively, steps S1031 to S1033 and steps S1035 to S1036 can be executed simultaneously and then step S1037; alternatively, steps S1035 to S1036 can be executed first and then steps S1031 to S1033 and S1037.
Further, the relative spatial position information between the user's eyes and the image display device obtained in the above steps can be, but is not limited to: relative position information, relative orientation information, relative angle information, relative rotation information, attitude information, and so on.
Step S105: rendering the virtual object according to the relative spatial position information and displaying the virtual object on the image display device.
In some embodiments, after the relative spatial position information is obtained, model rendering data corresponding to the relative spatial position information can be obtained, and the virtual object is rendered according to the model rendering data. The model data for rendering may include rendering coordinates, color data, texture data, rendering viewing angle and so on. Specifically, after the image display device obtains the relative spatial position relationship with the marker, it can determine the rendering coordinates of the virtual object according to the relative spatial position relationship, and then render and display the virtual object according to the rendering coordinates; the rendering coordinates can be used to represent the relative spatial position relationship between the virtual object and the image display device in the virtual space. The relative spatial position relationship may include information such as relative position and relative orientation; the image display device can convert the relative spatial position relationship in the real space into relative coordinate data of the virtual space, and calculate the rendering coordinates of the virtual object in the virtual space according to the relative coordinate data, so that the virtual object can be displayed accurately.
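To make the conversion concrete, the sketch below composes the relative rotation and translation into a 4x4 model matrix usable as a rendering coordinate; the matrix convention and the scale parameter are assumptions, since the text does not fix a rendering API.

```python
# A minimal sketch: turn the relative spatial relation (rotation R and
# translation t of the marker with respect to the display) into a 4x4
# transform that places the virtual object in virtual space.
import numpy as np

def build_model_matrix(rotation, translation, scale=1.0):
    m = np.eye(4)
    m[:3, :3] = scale * np.asarray(rotation, dtype=float)
    m[:3, 3] = np.asarray(translation, dtype=float).ravel()
    return m

# Usage: feed the pose from the marker-recognition step, then hand the
# matrix to whatever renderer displays the virtual object.
print(build_model_matrix(np.eye(3), [0.0, 0.0, -0.5]))
```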
In some embodiments, the angular relationship between the wearable device and the image display device is determined according to the relative spatial position information, and the display viewing angle of the virtual object is determined according to the angular relationship. In this case, step S105 may include:
Step S1051: obtaining, according to the relative spatial position information, the relative spatial angle between the wearable device and the image display device;
Step S1052: determining the display viewing angle of the virtual object according to the relative spatial angle between the wearable device and the image display device and a preset correspondence rule, rendering the virtual object, and displaying the virtual object on the image display device.
The above preset correspondence rule is the correspondence between the spatial angle and the display viewing angle of the virtual object. When the virtual object is a three-dimensional virtual model, for example as shown in Fig. 1, the virtual object 600 presented by the image display device 300 is a simulated building model. When the user wearing the wearable device 400 stands at position A and observes the virtual object 600 on the image display device 300, the relative spatial angle between the wearable device 400 and the image display device 300 is a first angle; according to this spatial angle and the preset correspondence rule, the display viewing angle of the virtual object is determined to be a first viewing angle, so the user can observe a view of the simulated building model at the first viewing angle (such as a preset axonometric view). After the user moves to position B, the relative spatial angle between the wearable device 400 and the image display device 300 is a second angle; according to this relative spatial angle and the preset correspondence rule, the display viewing angle of the virtual object is determined to be a second viewing angle (such as a preset northwest-corner view), so the user can observe the structural view of the simulated building model on the user's left side (such as the preset northwest-corner view).
Further, when determining the display viewing angle of the virtual object according to the spatial angle and rendering the virtual object, the spatial angle can be calculated in real time and the virtual object rendered in real time, so that the user can observe the virtual object transitioning through the corresponding viewing angles while moving, which helps improve the viewing value of the image display.
Further, in some embodiments, a direct switch of the display viewing angle of the virtual object can be triggered by setting a threshold for the spatial angle. Specifically, for example, when the relative spatial angle between the wearable device and the image display device does not fall within the threshold range of the spatial angle (for example, when it is below the lower threshold or above the upper threshold), the display viewing angle of the virtual object is determined to be the opposite viewing angle, so that the user can observe the view of the opposite face of the virtual object. This is because the display of the image display device usually has a visible angle range, such as 5 to 175 degrees; if the user observes the display from outside the 5-175 degree range, it is difficult to see the image shown on the display clearly. As a result, when observing the virtual object, the user can at most observe the views of the virtual object on the user's left and right sides, and can hardly observe the rear view of the virtual object relative to the user. To address this defect, a threshold of the spatial angle is set to trigger a direct switch of the display viewing angle of the virtual object: when the relative spatial angle between the wearable device and the image display device is below the lower threshold or above the upper threshold, the display viewing angle of the virtual object is directly determined to be the back viewing angle relative to the user, and the virtual object is rendered so that it is presented in front of the user at a viewing angle rotated by 180 degrees, which can improve the user's observation experience. In brief, therefore, step S1052 of the above location tracking method may include:
Step S1053: judging whether the relative spatial angle between the wearable device and the image display device falls within the preset threshold range of the spatial angle; if so, executing step S1054; if not, executing step S1055.
Step S1054: determining that the display viewing angle of the virtual object is a front viewing angle, determining the specific display viewing angle of the virtual object according to the relative spatial angle between the wearable device and the image display device and the preset correspondence rule, rendering the virtual object, and displaying the virtual object on the image display device.
It should be understood that the front viewing angle is the viewing angle observable by a user wearing the wearable device within the maximum region in front of the display screen of the image display device, that is, the viewing angles observable while the user wearing the wearable device moves within the visible range of the display screen. For a specific virtual object that is a three-dimensional virtual model (such as the simulated building model shown in Fig. 1), the front viewing angle can be understood as the northeast, east, southeast, south, southwest, west and northwest viewing angles. Correspondingly, the virtual object can have a back viewing angle, which is any viewing angle of the virtual model other than the front viewing angles; for the same model, the back viewing angle can be understood as the north viewing angle.
Step S1055: determining that the display viewing angle of the virtual object is the back viewing angle, rendering the virtual object, and displaying the virtual object on the image display device.
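Steps S1053 to S1055 can be pictured with the short sketch below; the 5-175 degree visible range comes from the text, while the seven-way bucketing of the front viewing angles is an assumed, hypothetical correspondence rule.

```python
# A hedged sketch of steps S1053-S1055: inside the visible range, map the
# spatial angle to a front viewing angle via a preset rule (assumed 7
# equal buckets here); outside it, switch directly to the back view.
FRONT_VIEWS = ["northeast", "east", "southeast", "south",
               "southwest", "west", "northwest"]

def select_display_view(angle_deg, lower=5.0, upper=175.0):
    if angle_deg < lower or angle_deg > upper:
        return "north (back view, rotated 180 degrees)"   # step S1055
    width = (upper - lower) / len(FRONT_VIEWS)            # step S1054
    index = min(int((angle_deg - lower) / width), len(FRONT_VIEWS) - 1)
    return FRONT_VIEWS[index]

print(select_display_view(30.0))   # a front view
print(select_display_view(178.0))  # back view triggered by the threshold
```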
It is to be understood that in the above step S105, determining the angular relationship between the wearable device and the image display device according to the relative spatial position information and determining the display viewing angle of the virtual object according to the angular relationship are illustrated with a change of the horizontal angle between the wearable device and the image display device. In other implementations, when the angle between the wearable device and the image display device changes in the vertical direction, similarly, the display viewing angle of the virtual object is determined according to the relative spatial angle between the wearable device and the image display device and the preset correspondence rule, and the virtual object is rendered, enabling the user to conveniently observe the bottom view or top view of the virtual object.
In the location tracking method provided by the embodiments of the present application, the image display device can acquire the position information of the marker by capturing images containing the marker integrated on the wearable device, so as to track the wearable device. The image display device can determine the relative spatial position relationship between the image display device and the wearable device according to the image containing the marker, and display the constructed virtual object at the corresponding viewing angle according to that relationship, so that the display viewing angle of the virtual object follows the positional relationship between the wearable device and the image display device, allowing the user to conveniently observe the virtual object from multiple angles.
Referring to Fig. 6, in one embodiment, the application provides a location tracking apparatus 100 for virtual content, used to execute the above location tracking method. The location tracking apparatus 100 includes an image acquisition module 101, a position relationship determining module 103 and a display module 105. The image acquisition module 101 is used to acquire the image of the marker, the position relationship determining module 103 is used to determine the relative spatial position relationship between the image display device and the wearable device according to the image containing the marker, and the display module 105 is used to render and display the virtual object according to the relative spatial position relationship. It is to be understood that each of the above modules can be a program module running on a computer-readable storage medium. In this embodiment, the location tracking apparatus 100 is stored in the memory of the image display device 300 and is configured to be executed by one or more processors of the image display device 300. The work of each module is specifically as follows.
The image acquisition module 101 is used to obtain the image containing the marker, wherein the marker is disposed on the wearable device. Further, the image acquisition module 101 captures the image containing the marker through the image acquisition device of the image display device.
The position relationship determining module 103 is used to recognize the marker in the image and determine the relative spatial position information between the wearable device and the image display device according to the marker. The position relationship determining module 103 includes a first position information determining unit 1031, a second position information determining unit 1033 and a correcting unit 1035.
The first position information determining unit 1031 is used to determine the relative spatial position information between the eyes of the user wearing the glasses and the image display device according to the marker. Specifically, the first position information determining unit 1031 is used to recognize the at least one marker contained in the acquired image, calculate the relative position and attitude relationship between the at least one marker and the image display device, and determine the relative spatial position relationship between the image display device and the wearable device. The first position information determining unit 1031 is further used to determine, according to the image of the marker, the sub-markers that the marker includes; locate the eye region of the user wearing the glasses according to the sub-markers; and determine the relative spatial position information between the eyes of the user wearing the glasses and the image display device.
The second position information determining unit 1033 is used to determine the relative spatial position information between the user's eyes and the image display device according to the user's eye image, in order to increase the speed of tracking the user's eyes and improve fluency. Specifically, the second position information determining unit 1033 is used to extract the eye features of the eye image according to the user's eye image, and determine the relative spatial position information between the eyes of the user wearing the glasses and the image display device according to the eye features.
The correcting unit 1035 is used to correct the relative spatial position information determined by the second position information determining unit 1033 according to the relative spatial position information determined by the first position information determining unit 1031. Specifically, the correcting unit 1035 is used to: denote the relative spatial position information determined according to the image of the eye region as the estimated position information, and the relative spatial position information determined according to the marker as the calibration position information; and correct the estimated position information according to the calibration position information, to obtain the relative spatial position information between the eyes of the user wearing the glasses and the image display device. Further, the correcting unit 1035 is used to compare the estimated position information with the calibration position information, and when the estimated position information is inconsistent with the calibration position information, correct the estimated position information according to the calibration position information.
The display module 105 is used to render the virtual object according to the relative spatial position information and display the virtual object on the image display device. In some embodiments, the display module 105 is used to determine the angular relationship between the wearable device and the image display device according to the relative spatial position information, and determine the display viewing angle of the virtual object according to the angular relationship. In this case, the display module 105 can include an angle determining unit 1051, a viewing angle determining unit 1053, a rendering unit 1055 and a display unit 1057.
In some embodiments, after the position relationship determining module 103 obtains the relative spatial position information, the rendering unit 1055 is used to obtain model rendering data corresponding to the relative spatial position information and render the virtual object according to the model rendering data. Specifically, after the position relationship determining module 103 obtains the relative spatial position relationship with the marker, the rendering unit 1055 is used to determine the rendering coordinates of the virtual object according to the relative spatial position relationship, and then render and display the virtual object according to the rendering coordinates; the rendering coordinates can be used to represent the relative spatial position relationship between the virtual object and the image display device in the virtual space. The display unit 1057 is used to display the rendered virtual object on the image display device.
Further, when the rendering unit 1055 determines the display viewing angle of the virtual object according to the spatial angle and renders the virtual object, the spatial angle can be calculated in real time and the virtual object rendered in real time, and the display unit 1057 can display the rendered virtual object in real time, so that the user can observe the virtual object transitioning through the corresponding viewing angles while moving, which helps improve the viewing value of the image display.
The angle determining unit 1051 is used to obtain the relative spatial angle between the wearable device and the image display device according to the relative spatial position information.
The viewing angle determining unit 1053 is used to determine the display viewing angle of the virtual object according to the relative spatial angle between the wearable device and the image display device and the preset correspondence rule. Further, in some embodiments, the viewing angle determining unit 1053 is also used to trigger a direct switch of the display viewing angle of the virtual object by setting the threshold of the spatial angle.
Referring to Fig. 2, in one embodiment, the application provides a wearable device 400 for assisted location tracking. The wearable device 400 of the application is a pair of glasses worn by the user, comprising a spectacle frame 410 and a marker 450 disposed on the spectacle frame 410. After the marker 450 is recognized by a terminal device (such as the image display device 300), it is used to determine the relative positional relationship between the user's eyes and the terminal device.
In this embodiment, the spectacle frame 410 includes a left temple 411, a right temple 413 and a frame 415, the frame 415 being disposed between the left temple 411 and the right temple 413. In some other embodiments, the wearable device 400 can be clip-on glasses; in that case the wearable device 400 does not include temples but includes a clip connected to the frame 415, through which the wearable device 400 is directly clipped onto the user's own spectacles, which brings great convenience to near-sighted users.
The frame 415 includes a left rim 4151 and a right rim 4153 arranged side by side, and a nose bridge 4155. The left rim 4151 is connected to the left temple 411, the right rim 4153 is connected to the right temple 413, and the nose bridge 4155 is connected between the left rim 4151 and the right rim 4153.
The marker 450 is disposed on the frame 415. Specifically, the marker 450 is disposed on the outer side of the frame 415, where the outer side can be understood as the side of the frame 415 facing away from the user's eyes when the user wears the wearable device 400. In this embodiment, there are multiple markers 450, which can be divided into multiple marker groups according to their arrangement. For example, the multiple markers 450 include a first group 451, a second group 453 and a third group 455; the first group 451 is disposed on the left rim 4151, the second group 453 is disposed on the right rim 4153, and the third group 455 is disposed on the nose bridge 4155. The third group 455 is different from both the first group 451 and the second group 453, so that the terminal device (such as the image display device 300) can determine the spatial position of the wearable device 400 by recognizing the three groups of markers 450. Alternatively, in other implementations, the first group 451 and the second group 453 can be identical to each other while the third group 455 differs from both.
Further, to facilitate fitting the outline of the wearable device 400, the markers 450 can be distributed along the outline of the frame 415. The first group 451 may include one or more sub-markers, for example a first sub-marker 4511 and a second sub-marker 4513, which are disposed on two sides of the left rim 4151 respectively. Specifically, in the embodiments shown in Fig. 2 and Fig. 3, the first sub-marker 4511 and the second sub-marker 4513 are both disposed on the side of the left rim 4151 away from the nose bridge 4155, with the first sub-marker 4511 at the upper end of the left rim 4151 and the second sub-marker 4513 at the lower end of the left rim 4151. The above upper end should be understood as the end of the frame 415 close to the user's eyebrows when the user wears the wearable device 400; correspondingly, the above lower end should be understood as the end of the frame 415 away from the user's eyebrows when the user wears the wearable device 400.
Correspondingly, the second group 453 may include one or more sub-markers, for example a third sub-marker 4531 and a fourth sub-marker 4533, which are disposed on two sides of the right rim 4153 respectively. Specifically, in the embodiments shown in Fig. 2 and Fig. 3, the third sub-marker 4531 and the fourth sub-marker 4533 are both disposed on the side of the right rim 4153 away from the nose bridge 4155, with the third sub-marker 4531 at the upper end of the right rim 4153 and the fourth sub-marker 4533 at the lower end of the right rim 4153.
The third group 455 may include a fifth sub-marker 4551 disposed on the nose bridge 4155. The fifth sub-marker 4551, the first sub-marker 4511, the second sub-marker 4513, the third sub-marker 4531 and the fourth sub-marker 4533 can all be different from one another (as shown in Fig. 2), can all be identical (as shown in Fig. 3), or at least two of them can be identical (as shown in Fig. 4).
Referring to Fig. 4, in some specific embodiments, the first sub-marker 4511, the second sub-marker 4513, the third sub-marker 4531 and the fourth sub-marker 4533 are identical, and the fifth sub-marker 4551 is different from the other four sub-markers, so that the five sub-markers together form the outline of the wearable device 400. Thus, by setting the four surrounding sub-markers to the same marker and setting the sub-marker located roughly in the middle (the fifth sub-marker 4551) to a marker different from the other sub-markers, it is easier for the image display device 300 to recognize the outline of the wearable device 400 and to locate the approximate middle position of the wearable device 400, facilitating location tracking of the wearable device 400. Among the first sub-marker 4511, the second sub-marker 4513, the third sub-marker 4531 and the fourth sub-marker 4533, each sub-marker includes a background 457 and one feature point 459, the feature point 459 being distinguishable from the background to facilitate recognition by the image display device 300. The fifth sub-marker 4551 includes a background 457 and two feature points 459. In this embodiment, the feature points 459 are roughly circular. By providing a small number of feature points 459 on a sub-marker, the area of each feature point 459 can be relatively large (e.g. occupying one third or more of the area of the sub-marker), making the sub-marker easier to recognize by the image display device 300.
In this embodiment, the marker 450 is a planar tag integrated on the frame 415, which can be a predetermined symbol or pattern. After the marker 450 is recognized by the image display device 300, it is used to determine the relative spatial position relationship between the wearable device 400 and the image display device 300. In this embodiment, the marker 450 is a physical structure disposed on the frame 415. In some other embodiments, the marker 450 can also be a predetermined symbol or pattern displayed after power-on.
It is to be understood that the specific pattern displayed by the marker 450 is not limited; it can be any pattern that can be captured by the image acquisition device 301 of the image display device 300. For example, the specific pattern of the marker 450 can be one or a combination of the following patterns: circle, triangle, rectangle, ellipse, wavy line, straight line, curve, etc., and is not limited to what is described in this specification. It is to be understood that in some other embodiments, the marker 450 can be another kind of pattern, as long as the marker 450 can be recognized efficiently by the image acquisition device 301. For example, the specific pattern of the marker 450 can be a geometric figure recognizable by the image acquisition device 301 (such as a circle, triangle, rectangle, ellipse, wavy line, straight line or curve), a predetermined pattern (such as an animal head, or a common schematic symbol such as a traffic sign) or another pattern forming the marker, and is not limited to what is described in this specification. It can also be understood that in some other embodiments, the marker 450 can be an identification code such as a bar code or a two-dimensional code.
Further, in some embodiments, the wearable device 400 can also include a filter layer (not shown), which can be stacked on the side of the marker 450 facing away from the frame 415.
The filter layer is used to filter out light other than the light projected toward the marker 450 by the lighting device of the image display device 300, preventing the marker 450 from being affected by ambient light when reflecting light, so that the marker 450 is easier to recognize. In some embodiments, the filtering performance of the filter layer can be set according to actual needs. For example, when the marker 450 enters the field of view of the image acquisition device 301 and is recognized, the image acquisition device 301 usually captures images with the help of an auxiliary light source to improve recognition efficiency. For example, when an infrared light source is used for assistance, the filter layer is used to filter out light other than infrared light (such as visible light and ultraviolet light), so that light other than infrared light cannot pass through the filter layer while infrared light can pass through and reach the marker 450. When the auxiliary light source projects infrared light onto the marker 450, the filter layer filters out the ambient light other than infrared light, so that only infrared light reaches the marker 450 and is reflected by the marker 450 into the near-infrared image acquisition device, thereby reducing the influence of ambient light on the recognition process.
In one embodiment, the application also provides a computer-readable storage medium in which program code is stored; the program code can be called by a processor to execute the methods described in the above method embodiments. It is worth noting that, as long as there is no conflict, the embodiments provided in this specification can be combined with one another, and the features of the embodiments can also be combined with one another; the embodiments are not limiting.
With the above location tracking method and wearable device, the virtual object is displayed according to the marker, so that the virtual object can be visually presented before the user's eyes, and the virtual object can be controlled in real time using the wearable device. This helps realize the interaction between the user and the virtual object, makes the information carried by the virtual object easier to obtain, and can improve the user experience.
In the description of this specification, reference to the terms "one embodiment", "some embodiments", "example", "specific example" or "some examples" means that the specific features, structures, materials or characteristics described in connection with the embodiment or example are included in at least one embodiment or example of the application. In this specification, schematic expressions of the above terms do not necessarily refer to the same embodiment or example. Moreover, the described specific features, structures, materials or characteristics can be combined in any suitable manner in any one or more embodiments or examples. In addition, as long as they do not conflict with one another, those skilled in the art can combine the features of the different embodiments or examples described in this specification.
In addition, the terms "first" and "second" are used for descriptive purposes only and are not to be understood as indicating or implying relative importance, or as implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include at least one such feature. In the description of the present application, "plurality" means at least two, for example two or three, unless otherwise specifically defined.
Finally, it should be noted that the above embodiments are intended only to illustrate the technical solutions of the present application, not to limit them. Although the present application has been described in detail with reference to the foregoing embodiments, those skilled in the art should understand that they may still modify the technical solutions described in the foregoing embodiments, or make equivalent replacements for some of the technical features, and that such modifications or replacements do not cause the essence of the corresponding technical solutions to depart from the spirit and scope of the technical solutions of the embodiments of the present application.

Claims (10)

1. A wearable device for assisting positioning and tracking, comprising a spectacle frame and temples connected to the spectacle frame, characterized in that it further comprises a marker arranged on the spectacle frame; after the marker is identified by a terminal device, it is used to determine the relative positional relationship between the user's eyes and the terminal device.

2. The wearable device according to claim 1, wherein the spectacle frame comprises a left frame and a right frame arranged side by side, and the marker comprises a first group and a second group; the first group is arranged on the left frame, and the second group is arranged on the right frame.

3. The wearable device according to claim 2, wherein the first group comprises a first sub-marker and a second sub-marker, the first sub-marker and the second sub-marker being respectively arranged on the two sides of the left frame.

4. The wearable device according to claim 3, wherein the second group comprises a third sub-marker and a fourth sub-marker, the third sub-marker and the fourth sub-marker being respectively arranged on the two sides of the right frame.

5. The wearable device according to claim 4, wherein the sub-markers of the first group are identical to the sub-markers of the second group.

6. The wearable device according to claim 5, wherein the first sub-marker and the second sub-marker of the first group, and the third sub-marker and the fourth sub-marker of the second group, each comprise a background and one feature point distinct from the background.

7. The wearable device according to claim 2, further comprising a nose bridge portion connected between the left frame and the right frame, wherein the marker further comprises a fifth sub-marker arranged on the nose bridge portion.

8. The wearable device according to claim 7, wherein the fifth sub-marker is distinct from the sub-markers of the first group and from the sub-markers of the second group; or/and the fifth sub-marker comprises a background and two feature points distinct from the background.

9. The wearable device according to claim 7, wherein the first group is arranged at the end of the left frame away from the nose bridge portion; or/and the second group is arranged at the end of the right frame away from the nose bridge portion.

10. The wearable device according to any one of claims 1 to 9, further comprising a filter layer, the filter layer covering the marker.
CN201821616682.9U 2018-09-30 2018-09-30 Wearables for Assisted Location Tracking Active CN208722146U (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201821616682.9U CN208722146U (en) 2018-09-30 2018-09-30 Wearables for Assisted Location Tracking

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201821616682.9U CN208722146U (en) 2018-09-30 2018-09-30 Wearables for Assisted Location Tracking

Publications (1)

Publication Number Publication Date
CN208722146U true CN208722146U (en) 2019-04-09

Family

ID=65983031

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201821616682.9U Active CN208722146U (en) 2018-09-30 2018-09-30 Wearables for Assisted Location Tracking

Country Status (1)

Country Link
CN (1) CN208722146U (en)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112214100A (en) * 2019-07-12 2021-01-12 广东虚拟现实科技有限公司 Marker, interaction device and identification tracking method
CN110428468A (en) * 2019-08-12 2019-11-08 北京字节跳动网络技术有限公司 A kind of the position coordinates generation system and method for wearable display equipment
CN111176445A (en) * 2019-12-23 2020-05-19 广东虚拟现实科技有限公司 Interactive device identification method, terminal equipment and readable storage medium
CN111176445B (en) * 2019-12-23 2023-07-14 广东虚拟现实科技有限公司 Identification method of interactive device, terminal equipment and readable storage medium
CN111522441A (en) * 2020-04-09 2020-08-11 北京奇艺世纪科技有限公司 Space positioning method and device, electronic equipment and storage medium
CN111522441B (en) * 2020-04-09 2023-07-21 北京奇艺世纪科技有限公司 Space positioning method, device, electronic equipment and storage medium

Similar Documents

Publication Publication Date Title
CN208722146U (en) Wearables for Assisted Location Tracking
EP4272064B1 (en) Micro hand gestures for controlling virtual and graphical elements
JP6195893B2 (en) Shape recognition device, shape recognition program, and shape recognition method
JP2024159807A (en) Information processing device, information processing method, and information processing program
WO2022216784A1 (en) Bimanual interactions between mapped hand regions for controlling virtual and graphical elements
CN105393284B (en) Space engraving based on human body data
CN103649874B (en) Use the interface of eye tracking contact lens
CN204631355U (en) Transparent near-eye display device
JP6333801B2 (en) Display control device, display control program, and display control method
CN110968182A (en) Positioning tracking method and device and wearable equipment thereof
WO2014128747A1 (en) I/o device, i/o program, and i/o method
CN106095089A (en) A kind of method obtaining interesting target information
JP6250024B2 (en) Calibration apparatus, calibration program, and calibration method
KR20160082600A (en) Method for determining at least one optical design parameter for a progressive ophthalmic lens
CN109471533B (en) A student terminal system in VR/AR classroom and its use method
CN111399633A (en) Correction method for eyeball tracking application
WO2019125700A1 (en) System and method of obtaining fit and fabrication measurements for eyeglasses using simultaneous localization and mapping
WO2014128750A1 (en) Input/output device, input/output program, and input/output method
CN111491159A (en) Augmented reality display system and method
CN119576126A (en) Mixed reality eye-movement interaction system and method based on dense map semantic segmentation
WO2025145642A1 (en) Parameter determination method and apparatus, and electronic device and computer-readable storage medium
KR20180069013A (en) A method for determining a human visual behavior parameter, and an associated testing device
CN114758404A (en) Human eye region of interest positioning system
US20240337863A1 (en) Calculation module, system and method for determining production parameters of an optical element

Legal Events

Date Code Title Description
GR01 Patent grant