CN103761505A - Object tracking embodiments - Google Patents

Object tracking embodiments

Info

Publication number
CN103761505A
CN103761505A
Authority
CN
China
Prior art keywords
inanimate movable objects
object
tracked object
Prior art date
Application number
CN201310757227.6A
Other languages
Chinese (zh)
Inventor
M. Scavezze
J. Scott
J. Steed
I. McIntyre
A. Krauss
D. McCulloch
S. Latta
Original Assignee
Microsoft Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Microsoft Corporation
Priority to CN201310757227.6A priority Critical patent/CN103761505A/en
Publication of CN103761505A publication Critical patent/CN103761505A/en

Abstract

The invention discloses embodiments related to automatic object tracking. For example, one disclosed embodiment provides a method of operating a mobile computing device comprising an image sensor. The method includes acquiring image data, recognizing an inanimate movable object in the image data, determining whether the inanimate movable object is a tracked object, and, if the inanimate movable object is a tracked object, storing information regarding a state of the inanimate movable object. The method further includes detecting a trigger to provide a notification of the state of the inanimate movable object, and providing an output of the notification of the state of the inanimate movable object.

Description

Object Tracking

Background

Tracking the positions or other states of objects may take considerable effort in daily life. Further, trying to locate a misplaced object may consume much time. For example, searching for misplaced car keys, wallets, mobile devices, and the like may cause a person to lose productive time. Likewise, forgetting that the milk carton in the family refrigerator is almost empty may lead to an extra shopping trip that could have been avoided had the shopper remembered the state of the milk carton. In some cases, such objects may be moved, emptied, etc. by a person other than the owner, further complicating the task of tracking them.

Summary of the invention

Accordingly, various embodiments related to automatically tracking objects are disclosed herein. For example, one disclosed embodiment provides a method of operating a mobile computing device comprising an image sensor. The method includes acquiring image data, identifying an inanimate movable object in the image data, determining whether the inanimate movable object is a tracked object, and, if the inanimate movable object is a tracked object, storing information regarding a state of the inanimate movable object. The method further includes detecting a trigger to provide a notification of the state of the inanimate movable object, and providing an output of the notification.

This Summary is provided to introduce in simplified form a selection of concepts that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter. Furthermore, the claimed subject matter is not limited to implementations that solve any or all disadvantages noted in any part of this disclosure.

Brief Description of the Drawings

FIG. 1A depicts an embodiment of a see-through display device, and a user of the device viewing a use environment through the see-through display device.

FIG. 1B shows the use environment of FIG. 1A from the user's viewpoint, and also shows an embodiment of a tracked-object alert displayed on the see-through display device.

FIG. 2 shows another embodiment of a tracked-object alert displayed on a see-through display device.

FIG. 3 shows a block diagram of an embodiment of a use environment for a see-through display device configured to track objects.

FIG. 4 shows a flow diagram depicting an embodiment of a method of tracking objects via a see-through display device.

FIG. 5 shows a block diagram depicting an embodiment of a computing device.

Detailed Description

As mentioned above, tracking and remembering the states of objects (such as positions or other physical states) may take a considerable amount of effort in daily life. Further, failing to accurately track the state of an object may lead to lost time and productivity. Therefore, embodiments are disclosed herein that relate to automatically tracking the states of objects. Briefly, a mobile device, such as a see-through display device worn by a user, may include an image sensor that observes the user's environment. Video data from the image sensor may be processed to detect objects in the user's environment and to identify tracked objects in the video data. State information for the tracked objects, such as position and other characteristics, may then be stored. This may allow an alert regarding the state of a tracked object to be output upon detection of an alert trigger, such as a user request or a contextual cue. The state information may be stored locally and/or uploaded to a remote service. Where multiple people utilize the object tracking devices disclosed herein, object tracking information may be shared so that each person can learn of changes other users make to an object. In this manner, a user may find the most recent position of lost keys, be reminded to buy more milk when browsing the dairy section of a grocery store, and/or track and recall any other suitable object state information.

FIGS. 1A and 1B show a non-limiting example use scenario. First, FIG. 1A shows a user wearing a head-mounted see-through display device 100 in the form of a pair of glasses, and viewing an environment (a living room) through the see-through display device 100 at a first time t1. As described in more detail below, the see-through display device 100 comprises one or more outward-facing image sensors (RGB and/or depth sensors) that acquire video images as the user moves through the environment. As also described in more detail below, the acquired video images may be analyzed, in real time or at a later time, to determine the inanimate movable objects the user wishes to track. In FIG. 1A, an example embodiment of a tracked inanimate movable object is depicted as keys 102 located beneath a stack of magazines on a table, but it will be understood that any suitable object may be tracked.

The identified object data and/or image data may be stored to provide further information, such as alerts regarding the object, to the user. For example, referring to FIG. 1B, at a later time t2 the keys 102 are covered by a newspaper 104. Thus, in response to a user input requesting information on the location of the keys (e.g., a voice command detected via a microphone on the see-through display device 100), the keys are highlighted in region 106 on the see-through display device 100.

It will be understood that the see-through display device 100 may not track the state information of every object the user views, or even interacts with. For example, in FIG. 1A, the environment viewed by the user comprises, in addition to the keys 102, various other inanimate movable objects that the user may not wish to track, such as a magazine 108, a flowerpot 110, and a book 112 on a bookshelf.

Thus, the see-through display device 100 may track the states of objects considered sufficiently "important," where the term "important" refers to objects for which enough implicit and/or explicit information has been received to designate the object for tracking. As such, before storing any information regarding the state of an observed inanimate movable object, the see-through display device 100 may determine whether the object is a tracked object, and then store state information if the object is a tracked object. The see-through display device 100 also may detect information that changes an "importance score" assigned to an object. As described in more detail below, such importance scores may be compared to one or more threshold importance scores to determine whether to track the state of an object and/or what object state information to track.

An inanimate movable object may be defined as tracked in any suitable manner. For example, tracked object states may be predefined (e.g., by a developer), user-defined, and/or learned adaptively. In embodiments in which some objects are predefined as tracked, the predefinitions may apply to a group of objects considered important to most users, such as keys, wallets, financial transaction cards, and the like. Further, a user may have the ability to delete, from a list of tracked objects, an object matching a predefined important object definition where tracking of that object is not desired.

Tracked objects may be designated in any suitable manner. For example, the see-through display device 100 may comprise an object designation mode in which a user holds or otherwise places an object within the field of view of an image sensor on the see-through display device 100 so that an image of the object is scanned and the object is identified as tracked. As a more specific example, a user wearing the see-through display device 100 may hold an inanimate movable object (e.g., keys, a wallet, a credit/debit card, etc.) in front of the see-through display device 100 and use a voice command, such as "track this object," to cause the see-through display device to store the object as a tracked object.

Further, in some embodiments, a user may specify temporal and/or positional limits on the tracking of an object. For example, a user traveling abroad may designate a passport as a tracked object for a specified range of dates, or until a specified location (e.g., home) is reached, such that tracking ends automatically when the specified date, place, or other condition is met. Additionally, a user may have the option of changing the state of an inanimate movable object from tracked to not tracked at any time. It will be understood that these methods of user designation of tracked objects are presented for the purpose of example, and that a user may define tracked objects in any suitable manner.
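The expiry conditions described above can be sketched as a small rule check. The `TrackingLimit` type, its field names, and the crude distance approximation are invented for illustration and are not part of the disclosure:

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional, Tuple

@dataclass
class TrackingLimit:
    """Optional expiry conditions a user may attach to a tracked object."""
    end_date: Optional[date] = None                     # stop tracking after this date
    end_location: Optional[Tuple[float, float]] = None  # e.g. home (lat, lon)
    radius_km: float = 0.1                              # how close counts as "arrived"

def tracking_expired(limit: TrackingLimit, today: date,
                     position: Optional[Tuple[float, float]]) -> bool:
    """Return True once any user-specified limit condition is met."""
    if limit.end_date is not None and today > limit.end_date:
        return True
    if limit.end_location is not None and position is not None:
        lat0, lon0 = limit.end_location
        lat, lon = position
        # Crude flat-earth distance (~111 km per degree), fine for a proximity check.
        approx_km = ((lat - lat0) ** 2 + (lon - lon0) ** 2) ** 0.5 * 111.0
        if approx_km <= limit.radius_km:
            return True
    return False

# A passport tracked only until the trip ends or the user returns home:
passport_limit = TrackingLimit(end_date=date(2014, 6, 30),
                               end_location=(47.64, -122.13))
```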

As mentioned above, in some embodiments the see-through display device 100 may adaptively learn which objects to track. This may help a person locate an object that was never explicitly defined as a tracked object, but that is nevertheless potentially important to that person. The see-through display device 100 may adaptively designate an inanimate movable object as a tracked object based upon any suitable criteria. For example, the see-through display device 100 may designate an inanimate movable object as tracked based upon a number, pattern, and/or frequency of times the object is viewed and/or interacted with, such that objects viewed and/or interacted with more often are more likely to be considered important.

Further, the see-through display device 100 may identify particular locations (e.g., home, office, and other locations), times, and/or situations (e.g., getting ready for work), and determine whether to designate an inanimate movable object as tracked based upon the location and/or situation in which the user interacts with, or otherwise views, the object. For example, an object located on a bookshelf in the user's home is more likely to be an important object than an object located on a store shelf. As another example, an object placed near a tracked object may be more important than other objects. For instance, a user may place keys, a wallet, and a mobile device together on a table as a group. As still another example, an object moved by another person may be considered potentially important.

Additionally, in some embodiments a user may designate particular locations and/or times as important locations and/or times. In this manner, objects interacted with at the designated locations and/or times are more likely to be considered important than objects interacted with at times and/or places not designated as important. As one example, a user may designate a travel document carrier as an important object during a vacation. Likewise, an object that a user carries from home to another location may be a more important object than objects from other locations. For example, in a restaurant, it may be more desirable to track an object that the user takes out of a coat pocket and places on the table than to track the user's water glass.

As still another example, an object interacted with at a particular time of day (e.g., in the morning before work) may be more likely to be important than objects interacted with at other times. Further, objects interacted with at times close to a change in the user's context (e.g., before leaving work to return home) may be important, as a user may handle keys, a wallet, a briefcase, a smart phone, and the like before leaving home or the office. Objects interacted with at the end of a variable-length event (such as the time spent looking for keys each morning and/or before leaving the house) also may be considered potentially more important.

An inanimate movable object may be adaptively designated as tracked in any suitable manner. For example, as mentioned above, an importance score may be assigned to an object detected and identified in the video image data based upon location, time, behavior, and/or other factors related to the see-through display device user's interactions with the object. If the importance score meets or exceeds a threshold importance score, the object may then be designated as tracked, and more detailed information regarding the state of the object (such as a most recent position, historical positions, values of variable physical properties of the object, etc.) may be stored. Further, in some embodiments, different threshold importance scores may be used to define different levels of importance. This may allow different types of information to be stored, different types of alerts to be provided, etc., for objects at different importance levels.
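The threshold comparison and tiered importance levels described above can be sketched minimally as follows; the tier names and numeric thresholds are assumed values, not figures from the disclosure:

```python
# Hypothetical tier thresholds; a real device would tune these values.
IMPORTANCE_TIERS = [
    (0.8, "full"),    # store position history, variable-property values, alerts
    (0.5, "basic"),   # store only a most recent position
]

def tracking_tier(importance_score: float):
    """Map an importance score to a tracking level; objects below every
    threshold are not tracked, so no state information is stored for them."""
    for threshold, tier in IMPORTANCE_TIERS:
        if importance_score >= threshold:
            return tier
    return None
```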

Patterns of user interaction with objects may be tracked in any suitable manner. For example, user behaviors may be classified based upon time, location, type of activity performed, and/or any other suitable contextual information. Example user behaviors include eating a meal, taking a bath, searching for a lost item, driving to work, and the like. By tracking and storing information about such interactions over time, patterns of interaction with a detected object may be discovered. For example, the see-through display device 100 may observe that the user puts down keys at five o'clock every afternoon and searches for the keys at eight o'clock every morning.
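The pattern discovery described above (e.g., noticing the recurring eight o'clock key search) can be approximated with a simple frequency count over interaction timestamps. This is a toy stand-in for whatever learning a real device would use:

```python
from collections import Counter
from datetime import datetime

def recurring_interaction_hours(timestamps, min_days=3):
    """Return hours of day at which the user interacted with an object on at
    least min_days distinct days. Each (date, hour) pair is counted once, so
    a burst of interactions on a single day does not register as a pattern."""
    seen = {(t.date(), t.hour) for t in timestamps}
    by_hour = Counter(hour for _day, hour in seen)
    return sorted(h for h, days in by_hour.items() if days >= min_days)

# Keys picked up around eight o'clock on three consecutive mornings:
pickups = [datetime(2013, 11, 4, 8, 1), datetime(2013, 11, 5, 8, 3),
           datetime(2013, 11, 6, 8, 2), datetime(2013, 11, 6, 8, 45)]
```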

Alerts also may be generated based upon observed user patterns. For example, once a pattern of behavior involving user interaction with keys is identified, a reminder regarding the position of the keys may be displayed automatically at eight o'clock every morning. Such patterns also may be used by the see-through display device to suggest where a lost object may be when the object has been moved from the position at which it was last observed and stored. An example of a prompt suggesting where to look for a lost item is shown in FIG. 2 in the form of a displayed list 200 of suggested locations to search. FIG. 2 also shows a prompt 202 regarding the position at which the object was last seen. It will be understood that a prompt regarding where to search for a lost item may take any suitable form. For example, such a prompt may take the form of graphical indications guiding the user to possible positions, such as arrows and/or lines to follow, highlighting of suggested locations to check, etc.

In some embodiments, a unified view of a location may be generated based upon video data input from multiple users, for example, where each user uploads video data to a cloud-based service. This may allow an object seen in one user's video stream to be correlated with the object seen in another user's video stream, and thus may allow an object to be tracked even when it is moved by another user. Further, object metadata may be shared between users, such that multiple users may upload tracked object data rather than video data (e.g., such that multiple devices may perform local recognition of the same object).

As mentioned above, object states other than position may be learned and monitored. As a non-limiting example, the level of milk in a milk carton may be monitored. Likewise, an inventory of the contents of a refrigerator may be monitored over time, and the absence of a frequently present item may be noted. Alerts based upon such observed states may be generated via contextual triggers. For example, where an empty milk carton has been observed, a visual overlay of a shopping list and/or an automatically generated spoken reminder may be produced as the user passes through the dairy section of a grocery store. The context of the grocery store dairy section may be understood in any suitable manner, including but not limited to via annotations or tags applied to particular product types/brands by the grocery store or a milk producer, or by matching the appearance, size, and/or other physical and/or contextual characteristics of objects observed in the dairy section to those of the tracked milk object. Further, by comparing the observed fullness state of the milk carton to the user's shopping patterns over time (e.g., the user consistently buys more milk when the milk carton is empty or nearly empty), the fullness state of the milk carton may be adaptively learned over time as a tracked object state.
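The contextual shopping alert described above can be sketched as a check of depleted tracked items against the store sections currently in view. The dictionary schemas here are invented for the example:

```python
def shopping_reminders(tracked_states, nearby_sections):
    """Produce reminder strings for depleted tracked items whose product
    category matches a store section the user is currently passing.

    tracked_states: {item_name: {"level": float, "category": str}}
    nearby_sections: set of product categories detected in the current view
    """
    reminders = []
    for item, state in tracked_states.items():
        depleted = state["level"] <= state.get("empty_threshold", 0.1)
        if depleted and state.get("category") in nearby_sections:
            reminders.append("Buy more " + item)
    return reminders
```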

FIG. 3 shows a block diagram of a use environment 300 for the automatic tracking of inanimate movable objects. The use environment 300 shows an arbitrary number N of object tracking devices, illustrated as object tracking device 1 302 and object tracking device N 304, communicating with a remote object tracking service 306 via a network 307, such as a computer network. It will be understood that the object tracking devices may take any suitable form, including but not limited to the see-through display device 100 of FIG. 1. It will further be appreciated that some embodiments may omit the remote object tracking service.

Object tracking device 302 comprises one or more sensors 308. The sensors 308 may include image sensors, such as a two-dimensional image sensor 310 and/or a depth sensor 312, that collect video data of the user's local environment. Any suitable type and number of two-dimensional image sensors 310 and/or depth sensors 312 may be included. Examples of suitable two-dimensional image sensors include RGB and grayscale sensors. Examples of suitable depth sensors include time-of-flight sensors, structured light sensors, and stereo depth sensors. It will be understood that any other suitable sensors may be included. Examples include, but are not limited to, microphones, global positioning system (GPS) sensors, motion sensors, and inward-facing image sensors that detect eye motion.

Image data collected via the image sensors is provided to an object recognition and tracking module 314, which identifies objects imaged by the image sensors and detects state information regarding those objects. As described in more detail below, based upon object state, the object recognition and tracking module 314 may further provide alerts to one or more outputs 316, such as a display 318 (e.g., a see-through display or other suitable display) and/or one or more speakers 320. The object recognition and tracking module 314 may identify objects in any suitable manner, including but not limited to via classification functions pre-trained by a developer and/or classification functions trained by the user to recognize user-designated objects, wherein the classification functions compare and/or match observed objects to object models 315. The object recognition and tracking module 314 may then output the identities of objects identified in the image data to an object information store 322. Further, when a new object (an object not previously detected) is detected, the object recognition and tracking module 314 may assign an identity to the object.

The object information store 322 may be configured to store tracked object data 324, including but not limited to the identities and state information of tracked objects. In embodiments in which user patterns are adaptively learned to automatically designate objects as tracked, non-tracked object data 326 also may be stored in the object information store 322, where the term "non-tracked" indicates that state information is not stored for those objects. The non-tracked object data 326 is depicted as storing information for a plurality of non-tracked objects, represented as object 1 328 and object N 332. Any suitable non-tracked object data may be stored. For example, importance score information 330 may be stored for each non-tracked object, where the importance score information 330 is assigned to an identified object based upon user interactions with the object, and is then used to determine whether to track the state of the object. When the importance score exceeds a threshold importance score, the object may be designated as a tracked object, and state information may be stored for the object.
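Promotion from the non-tracked data 326 to the tracked data 324 can be sketched as a record that accumulates an importance score until a threshold is crossed. The threshold value and field names are illustrative assumptions:

```python
from dataclasses import dataclass, field

TRACKING_THRESHOLD = 0.5  # illustrative value, not from the disclosure

@dataclass
class ObjectRecord:
    """One entry in a store such as object information store 322."""
    object_id: str
    importance: float = 0.0
    tracked: bool = False
    state: dict = field(default_factory=dict)  # populated only once tracked

def record_interaction(rec: ObjectRecord, score_delta: float) -> ObjectRecord:
    """Accumulate importance from an observed interaction, promoting the
    object to tracked status once the threshold is crossed."""
    rec.importance = min(1.0, rec.importance + score_delta)
    if not rec.tracked and rec.importance >= TRACKING_THRESHOLD:
        rec.tracked = True
    return rec
```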

The tracked object data 324 may include any suitable data stored for any suitable number and type of tracked objects. In FIG. 3, a plurality of tracked objects are illustrated as tracked object 1 336 and tracked object N 337. Various state data 338 for tracked object 1 is also shown. For example, position information 340 may be stored, including but not limited to a most recent position and/or past positions. Such position information also may include information regarding past user interactions with the object. This may allow, for example, user patterns to be analyzed to help locate lost objects. Any suitable position information may be stored. Examples include information regarding physical location in the current use environment (e.g., GPS coordinates) and/or contextual location (e.g., position data relative to other tracked/identified objects).

Additionally, information regarding values of variable properties 342 of an object may be stored. For example, as described above, a particular food item missing from a refrigerator, the level of milk in a milk carton, and/or any other suitable values of variable object properties may be stored. It will be understood that these particular types of state information are described for the purpose of example, and that any other suitable state information may be stored.

Information other than state information also may be stored for an object. For example, in embodiments in which objects are adaptively learned and designated as tracked, importance score data 344 may be stored for a tracked object to determine whether to maintain the object's designation as tracked. For example, if a user stops interacting with an object that has not been lost, the object's score may decline over time, and tracking of the object may cease. Additionally, alert conditions 346 may be stored, where the alert conditions define when an alert is triggered for a particular object and specify the nature of the alert.
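The score decline described above can be sketched as exponential decay with a cutoff. The half-life and floor values are assumed tuning constants, not values from the disclosure:

```python
def decayed_importance(importance: float, days_idle: float,
                       half_life_days: float = 14.0) -> float:
    """Exponentially decay an importance score while an object goes
    without user interaction."""
    return importance * 0.5 ** (days_idle / half_life_days)

def should_stop_tracking(importance: float, days_idle: float,
                         floor: float = 0.25) -> bool:
    """Drop an object from tracking once its decayed score falls below a floor."""
    return decayed_importance(importance, days_idle) < floor
```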

Further, context data 350 may be stored as object information to help determine importance scores for objects. For example, the context data 350 may define locations 352, times 354, and/or other contextual information as "important" for assigning importance scores to objects. Examples of locations 352 may include actual physical locations, contextual rules regarding proximity to other tracked objects, and other such position data. Examples of times 354 may include clock/calendar times and/or contextual rules regarding time intervals between observed object interactions and other events such as location changes (e.g., the time between interacting with an object and then leaving the house). The context data 350 further may comprise contextual rules regarding how to apply location, time, and other contextual information in determining the importance scores of objects.

In some embodiments, the analysis of image data for objects may be performed at a later time, rather than in real time. Accordingly, the object information store 322 may comprise image data 355 not yet processed by the object recognition and tracking module 314. In such embodiments, when a user requests information regarding an object, image analysis may be performed to locate the object in the stored image data 355 by locating the relevant image data. To help locate relevant image data, various metadata may be stored along with the image data, such as the location (e.g., GPS data) and/or time at which the image data was acquired.

In some embodiments, object tracking and storage may be performed locally on the object tracking device 302. In other embodiments, as mentioned above, the object tracking device 302 may communicate with a remote object tracking service 306 via a network. This may allow object data and image data to be shared between users. The remote object tracking service 306 may perform any of the object recognition, tracking, and alert generation functions described above for the object tracking device 302. Further, information received from users may be stored in a user information store 356, which is shown as storing information for a plurality of users represented by user 1 358 and user N 359. Any suitable information may be stored, including but not limited to object data 360 (e.g., tracked and non-tracked object information), image data 362 (e.g., point cloud depth data and/or two-dimensional image data), and context data 364 (e.g., locations, times, and/or other contexts used in determining whether to track objects the user interacts with). Further, the user data also may comprise information regarding trusted other users 366 with whom object data, image data, and/or other information may be shared. For example, a user may wish to access the image data of other family members to help find a lost item. As a more specific example, if a user's spouse moved the user's keys from a table into a drawer, the spouse's object and/or image data may be searched along with the user's object and/or image data to help locate the keys.

FIG. 4 shows a flow diagram depicting an embodiment of a method 400 of tracking objects via a see-through display device. Method 400 comprises, at 402, receiving, from an image sensor, image data of a background scene viewable through the see-through display system, where the term "background scene" refers to the real-world scene located behind the see-through display relative to the user. Any suitable image data may be acquired, including but not limited to two-dimensional video data 404 (RGB and/or grayscale) and depth data 406.

Method 400 next comprises, at 408, identifying an inanimate movable object in the image data, either in real time or at a later time upon receiving a trigger (e.g., a user request to locate an object, a contextual trigger, etc.), and, at 410, determining whether the detected inanimate movable object is a tracked object. As mentioned above, inanimate movable objects may be identified in any suitable manner, including but not limited to via classification methods that compare the movable object to object models.

Likewise, as mentioned above, an inanimate movable object may be defined as tracked in any suitable manner. For example, as indicated at 412, a user may request (via voice, gesture, and/or other suitable user input) that an object be designated as tracked and that an image of the object be scanned. Additionally, a developer and/or manufacturer of the see-through display may designate certain objects as important.

Further, as indicated at 414, an inanimate movable object may be designated as tracked if a score (an "importance score") assigned based upon user interactions with the object reaches a threshold importance score. Any suitable factor or combination of factors may be used to determine such a score. For example, as indicated at 416, the importance score may be determined based at least in part upon a number, frequency, and/or pattern of user interactions with the object over time. Additionally, position information may be used in determining the importance score, as indicated at 418. Such position information may include, but is not limited to, whether the object is located in a position designated as important, as indicated at 420, and/or whether the object is sufficiently close to another tracked object, as indicated at 422.

Further, as indicated at 424, the importance score also may be assigned based at least in part upon a time at which a user interaction occurs. Such times may include designated important times 426, temporal proximity between a user interaction with the object and a change in user context or other significant event, and/or any other suitable temporal factors. It will be understood that positions and/or times may be designated as important in any suitable manner, including but not limited to designation by a user, developer, and/or manufacturer, via adaptive learning, etc. It will further be appreciated that importance scores may be assigned in any other suitable manner. For example, in some embodiments, a user may assign an importance score to an object. In such embodiments, the user-assigned score may be permanently assigned, or may be an initial value that subsequently changes based upon user interactions with the object. Likewise, in other embodiments, an application may select importance scores based upon any suitable factors other than position and/or time.
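The factors of steps 414-426 can be sketched as a weighted combination. The weights and the interaction-count saturation point are invented for illustration:

```python
def importance_score(interaction_count: int, at_important_place: bool,
                     near_tracked_object: bool, at_important_time: bool) -> float:
    """Combine interaction frequency (416), position (418-422), and timing
    (424-426) factors into a single score in [0, 1]."""
    frequency = min(1.0, interaction_count / 10.0)  # saturate at 10 interactions
    return (0.4 * frequency
            + 0.25 * at_important_place
            + 0.15 * near_tracked_object
            + 0.2 * at_important_time)
```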

If the inanimate movable object is determined to be a tracked object, then method 400 comprises, at 430, storing information regarding the state of the object. The stored state information may include, but is not limited to, position information and/or information regarding values of variable physical properties of the object. Further, the state information may be stored locally, as indicated at 432, and/or sent to a remote service for remote storage, as indicated at 434.

Method 400 next comprises, at 436, detecting a trigger to provide an output of a notification of the state of the tracked object. Any suitable trigger may be used. For example, as indicated at 438, the trigger may comprise a user input requesting information regarding the state of the object, such as a request to locate a lost item. Additionally, as indicated at 440, the trigger may comprise a detected value of a variable physical property of the object meeting a threshold condition. As still another example, as indicated at 442, the trigger may comprise detecting a time, location, and/or other object contextually related to the tracked object. For example, if the milk in the user's refrigerator was previously detected to be missing or low, detecting the user passing the dairy section of a grocery store may trigger output of a notification reminding the user to buy milk. Likewise, when the user is somewhere other than home, detecting the trigger may comprise detecting the single event of the user moving a preset distance. As a more specific example, a trigger may be detected when a user puts a wallet down in a restaurant and then moves away from the table or walks out of the building. It will be understood that the above-described triggers are presented for the purpose of example and are not intended to be limiting in any manner.
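The three trigger types of steps 438-442 can be sketched as a single dispatch function. The event and tracked-object schemas are invented for this example:

```python
def detect_trigger(event: dict, tracked: dict):
    """Return a notification string if the event matches one of the three
    trigger types described above, else None."""
    kind = event.get("kind")
    obj = event.get("object")
    if kind == "user_request" and obj in tracked:                       # step 438
        return "Showing last known location of " + obj
    if kind == "attribute" and event["value"] <= event["threshold"]:    # step 440
        return obj + " is running low"
    place = event.get("place")                                          # step 442
    if kind == "context" and place is not None \
            and place == tracked.get(obj, {}).get("alert_place"):
        return "Reminder: check " + obj
    return None
```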

Upon detecting the trigger, method 400 comprises, at 444, providing an output of the notification of the state of the object. Any suitable output may be provided, including audio 446 and/or visual 448 outputs. Examples of suitable visual outputs include, but are not limited to, augmentations of the real-world imagery viewed through the see-through display device, as described above with reference to FIG. 1. Further, the information regarding the state may be obtained from local storage or from a remote service. Where the information is obtained from a remote service, it may be obtained from the user's account, from the account of another trusted user as described above, or from any other suitable source.

The notification may comprise any suitable information. For example, as indicated at 450, the notification may inform the user of a most recently stored location of the object. As a more specific example, the notification may highlight the physical location of the object in a background scene, and/or may comprise directions (arrows, lines, text, etc.) guiding the user to the most recent location. The notification also may comprise a warning that the user has left the object in a public place. Further, the notification may comprise information relating to a detected value of a variable physical property of the object. For example, where a milk carton reaches a threshold empty state, the notification may comprise a reminder to purchase more milk, and/or an option to purchase more milk online.
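The notification contents described above (last stored location per 450, public-place warning, property-threshold reminder) can be composed as a simple message builder. The field names (`location`, `fill_level`, etc.) are illustrative assumptions, not part of the patent.

```python
def build_notification(obj_id, state):
    """Compose notification text for a tracked object's stored state."""
    lines = []
    if "location" in state:
        # Step 450: most recently stored location of the object
        lines.append(f"{obj_id} last seen at {state['location']}")
    if state.get("in_public_place"):
        lines.append(f"Warning: {obj_id} was left in a public place")
    fill = state.get("fill_level")
    threshold = state.get("empty_threshold")
    if fill is not None and threshold is not None and fill <= threshold:
        # Variable physical property at threshold, e.g. a nearly empty carton
        lines.append(f"{obj_id} is nearly empty - buy more?")
    return "\n".join(lines)

msg = build_notification("milk", {"location": "refrigerator, top shelf",
                                  "fill_level": 0.05, "empty_threshold": 0.1})
print(msg)
```

The same message text could feed either the audio (446) or the visual (448) output path.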

Further, in some embodiments, the notification may comprise an image of the object being sought. This may help to locate the object where similar-looking variants of the object exist. For example, if the user is looking for a set of keys, and the display device has information regarding four similar-looking sets of keys, the display device may display images of the four sets of keys to allow the user to select the set being sought. Additionally, an image of the current setting of the object may be displayed. For example, if the set of keys being sought is currently located on a nightstand, the display device may display an image of the keys on the nightstand to show the user the physical location of the keys.
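The disambiguation step above, presenting stored images of all similar-looking candidates together with their last-known settings, can be sketched as a lookup over a catalog of tracked objects. The catalog schema here is a hypothetical stand-in for the device's stored image data.

```python
def candidate_images(query, catalog):
    """Return stored images (and settings) of all tracked objects in a
    coarse category, so the user can pick the exact one being sought."""
    return [(obj_id, rec["image"], rec["setting"])
            for obj_id, rec in catalog.items()
            if rec["category"] == query]

catalog = {
    "keys-home":   {"category": "keys", "image": "keys_home.png",
                    "setting": "on the nightstand"},
    "keys-office": {"category": "keys", "image": "keys_office.png",
                    "setting": "in the desk drawer"},
    "wallet-01":   {"category": "wallet", "image": "wallet.png",
                    "setting": "on the kitchen counter"},
}
# Looking for "keys" returns both similar-looking candidates for user selection.
for obj_id, image, setting in candidate_images("keys", catalog):
    print(obj_id, image, setting)
```

Showing the stored setting alongside each candidate image mirrors the nightstand example in the text: the user both identifies the right object and learns where it was last seen.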

In this manner, the states of objects may be tracked automatically for a user, thereby facilitating the location of misplaced items and/or the tracking of any other suitable object state. In some embodiments, the above-described methods and processes may be tied to a computing system including one or more computers. In particular, the methods and processes described herein may be implemented as a computer application, computer service, computer application programming interface, computer library, and/or other computer program product.

Fig. 5 schematically shows a non-limiting computing system 500 that may perform one or more of the above-described methods and processes. Computing system 500 is shown in simplified form. It is to be understood that virtually any computer architecture may be used without departing from the scope of this disclosure. In different embodiments, computing system 500 may take the form of a mainframe computer, server computer, desktop computer, laptop computer, tablet computer, home entertainment computer, network computing device, mobile computing device, mobile communication device, see-through display device, near-eye display device, gaming device, etc., including but not limited to the see-through displays and other computing devices described herein with reference to Figs. 1-4.

Computing system 500 includes a logic subsystem 502 and a data-holding subsystem 504. Computing system 500 may optionally include a display subsystem 508, a communication subsystem 506, and/or other components not shown in Fig. 5. Computing system 500 may also optionally include user input devices such as, for example, keyboards, mice, game controllers, cameras, microphones, and/or touch screens.

Logic subsystem 502 may include one or more physical devices configured to execute one or more instructions. For example, the logic subsystem may be configured to execute one or more instructions that are part of one or more applications, services, programs, routines, libraries, objects, components, data structures, or other logical constructs. Such instructions may be implemented to perform a task, implement a data type, transform the state of one or more devices, or otherwise arrive at a desired result.

The logic subsystem may include one or more processors configured to execute software instructions. Additionally or alternatively, the logic subsystem may include one or more hardware or firmware logic machines configured to execute hardware or firmware instructions. Processors of the logic subsystem may be single-core or multi-core, and the programs executed thereon may be configured for parallel or distributed processing. The logic subsystem may optionally include individual components that are distributed throughout two or more devices, which may be remotely located and/or configured for coordinated processing. One or more aspects of the logic subsystem may be virtualized and executed by remotely accessible networked computing devices configured in a cloud computing configuration.

Data-holding subsystem 504 may include one or more physical, non-transitory devices configured to hold data and/or instructions executable by the logic subsystem to implement the herein-described methods and processes. When such methods and processes are implemented, the state of data-holding subsystem 504 may be transformed (e.g., to hold different data).

Data-holding subsystem 504 may include removable media and/or built-in devices. Data-holding subsystem 504 may include optical memory devices (e.g., CD, DVD, HD-DVD, Blu-ray Disc, etc.), semiconductor memory devices (e.g., RAM, EPROM, EEPROM, etc.), and/or magnetic memory devices (e.g., hard disk drive, floppy disk drive, tape drive, MRAM, etc.), among others. Data-holding subsystem 504 may include devices with one or more of the following characteristics: volatile, nonvolatile, dynamic, static, read/write, read-only, random access, sequential access, location addressable, file addressable, and content addressable. In some embodiments, logic subsystem 502 and data-holding subsystem 504 may be integrated into one or more common devices, such as an application-specific integrated circuit or a system on a chip.

Fig. 5 also shows an aspect of the data-holding subsystem in the form of removable computer-readable storage media 510, which may be used to store and/or transfer data and/or instructions executable to implement the herein-described methods and processes. Removable computer-readable storage media 510 may take the form of CDs, DVDs, HD-DVDs, Blu-ray Discs, EEPROMs, and/or floppy disks, among others.

It is to be appreciated that data-holding subsystem 504 includes one or more physical, non-transitory devices. In contrast, in some embodiments aspects of the instructions described herein may be propagated in a transitory fashion by a pure signal (e.g., an electromagnetic signal, an optical signal, etc.) that is not held by a physical device for at least a finite duration. Furthermore, data and/or other forms of information pertaining to the present disclosure may be propagated by a pure signal.

The terms "module" and "engine" may be used to describe an aspect of computing system 500 that is implemented to perform one or more particular functions. In some cases, such a module and/or program may be instantiated via logic subsystem 502 executing instructions held by data-holding subsystem 504. It is to be understood that different modules and/or programs may be instantiated from the same application, service, code block, object, library, routine, API, function, etc. Likewise, the same module and/or program may be instantiated by different applications, services, code blocks, objects, routines, APIs, functions, etc. The terms "module", "program", and "engine" are meant to encompass individual or groups of executable files, data files, libraries, drivers, scripts, database records, etc.

It is to be appreciated that a "service", as used herein, may be an application program executable across multiple user sessions and available to one or more system components, programs, and/or other services. In some implementations, a service may run on a server responsive to a request from a client.

Display subsystem 508 may be used to present a visual representation of data held by data-holding subsystem 504. As the herein-described methods and processes change the data held by the data-holding subsystem, and thus transform the state of the data-holding subsystem, the state of display subsystem 508 may likewise be transformed to visually represent changes in the underlying data. Display subsystem 508 may include one or more display devices utilizing virtually any type of technology, including but not limited to see-through display technology. Such display devices may be combined with logic subsystem 502 and/or data-holding subsystem 504 in a shared enclosure, or such display devices may be peripheral display devices.

When included, communication subsystem 506 may be configured to communicatively couple computing system 500 with one or more other computing devices. Communication subsystem 506 may include wired and/or wireless communication devices compatible with one or more different communication protocols. As non-limiting examples, the communication subsystem may be configured for communication via a wireless telephone network, a wireless local area network, a wired local area network, a wireless wide area network, a wired wide area network, etc. In some embodiments, the communication subsystem may allow computing system 500 to send messages to and/or receive messages from other devices via a network such as the Internet.

It is to be understood that the configurations and/or approaches described herein are exemplary in nature, and that these specific embodiments or examples are not to be considered in a limiting sense, because numerous variations are possible. The specific routines or methods described herein may represent one or more of any number of processing strategies. As such, the various acts illustrated may be performed in the sequence illustrated, in other sequences, in parallel, or in some cases omitted. Likewise, the order of the above-described processes may be changed.

The subject matter of the present disclosure includes all novel and nonobvious combinations and subcombinations of the various processes, systems and configurations, and other features, functions, acts, and/or properties disclosed herein, as well as any and all equivalents thereof.

Claims (10)

1. A method of operating a mobile computing device, the computing device comprising an image sensor, the method comprising:
obtaining (402) image data;
identifying (408) an inanimate movable object in the image data;
determining (410) whether the inanimate movable object is a tracked object;
if the inanimate movable object is a tracked object, storing (430) information regarding a state of the inanimate movable object;
detecting (436) a trigger to provide a notification of the state of the inanimate movable object; and providing (444) an output of the notification of the state of the inanimate movable object.
2. The method of claim 1, further comprising, before storing the information regarding the state of the inanimate movable object, determining whether the inanimate movable object is a tracked object.
3. The method of claim 2, further comprising receiving a user input designating the inanimate movable object as a tracked object and/or assigning a user-selected importance score to the inanimate movable object.
4. The method of claim 2, further comprising assigning an importance score to the inanimate movable object based upon user interactions with the inanimate movable object, and designating the inanimate movable object as a tracked object based upon the importance score meeting a threshold importance score.
5. The method of claim 4, wherein the importance score is assigned based at least in part upon a number of user interactions with the inanimate movable object.
6. The method of claim 4, wherein the importance score is assigned based at least in part upon one or more of a location of the inanimate movable object and a time at which a user interaction with the inanimate movable object occurs.
7. The method of claim 6, wherein the importance score is assigned based at least in part upon one or more of: the location of the inanimate movable object being identified as an important location, the inanimate movable object being near another tracked object, and a temporal adjacency between a user interaction with the inanimate movable object and a change in user context.
8. The method of claim 1, wherein the state of the inanimate movable object comprises a location of the inanimate movable object, and wherein the output of the notification comprises an output of a notification of the location of the object.
9. The method of claim 1, wherein the state of the inanimate movable object comprises a value of a variable physical property of the inanimate movable object, and wherein detecting the trigger comprises detecting that the value of the variable physical property meets a threshold condition.
10. A mobile computing device (302, 500), comprising:
a sensor subsystem (308) comprising an image sensor (310, 312);
a display subsystem (318);
a logic subsystem (502) configured to execute instructions; and
a data-holding subsystem (504) comprising instructions stored thereon that are executable by the logic subsystem to:
receive (402) image data from the image sensor;
detect, via the image data, a plurality of user interactions over time with an inanimate movable object;
assign (414) an importance score to the inanimate movable object based upon the plurality of user interactions (416) with the inanimate movable object;
if the importance score reaches a threshold importance score, store (430) information regarding a state of the inanimate movable object;
detect (436) a trigger to provide a notification of the state of the inanimate movable object; and provide (444) an output of the notification of the state of the inanimate movable object.
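The importance-score scheme recited in claims 4-7 and in claim 10 (accumulate a score from user interactions, then designate the object as tracked once a threshold is met) can be sketched as below. The score weights and the threshold value are arbitrary illustrative choices, not specified by the claims.

```python
def importance_score(interactions, important_locations=("home", "office")):
    """Accumulate an importance score for an inanimate movable object from
    observed user interactions, per the factors recited in claims 5-7."""
    score = 0.0
    for it in interactions:
        score += 1.0                                   # claim 5: number of interactions
        if it.get("location") in important_locations:  # claim 7: important location
            score += 0.5
        if it.get("near_tracked_object"):              # claim 7: near another tracked object
            score += 0.5
        if it.get("context_change_soon_after"):        # claim 7: temporal adjacency to a
            score += 0.5                               # change in user context
    return score

THRESHOLD = 3.0  # claim 4: threshold importance score (illustrative value)
interactions = [
    {"location": "home"},
    {"location": "home", "near_tracked_object": True},
    {"location": "cafe", "context_change_soon_after": True},
]
score = importance_score(interactions)
# If the score meets the threshold, the object would be designated as tracked.
print(score, score >= THRESHOLD)
```

Time-of-interaction weighting (claim 6) would slot in as another per-interaction term; the additive structure keeps each factor independently tunable.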

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201310757227.6A CN103761505A (en) 2013-12-18 2013-12-18 Object tracking embodiments

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201310757227.6A CN103761505A (en) 2013-12-18 2013-12-18 Object tracking embodiments

Publications (1)

Publication Number Publication Date
CN103761505A true CN103761505A (en) 2014-04-30

Family

ID=50528741

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201310757227.6A CN103761505A (en) 2013-12-18 2013-12-18 Object tracking embodiments

Country Status (1)

Country Link
CN (1) CN103761505A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105335498A (en) * 2015-10-23 2016-02-17 广东小天才科技有限公司 Method and system for carrying out information recommendation based on voice information
CN106663197A (en) * 2014-08-08 2017-05-10 三星电子株式会社 Method and apparatus for generating environmental profile
US10469826B2 (en) 2014-08-08 2019-11-05 Samsung Electronics Co., Ltd. Method and apparatus for environmental profile generation

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080309874A1 (en) * 2007-06-14 2008-12-18 Psychology Software Tools Inc. System and Apparatus for object based attention tracking in a virtual environment
US20110222729A1 (en) * 2006-11-14 2011-09-15 TrackThings LLC Apparatus and method for finding a misplaced object using a database and instructions generated by a portable device
US8027029B2 (en) * 2007-11-07 2011-09-27 Magna Electronics Inc. Object detection and tracking system




Legal Events

Date Code Title Description
PB01 Publication
C06 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C41 Transfer of patent application or patent right or utility model
TA01 Transfer of patent application right

Effective date of registration: 20150728

Address after: Washington State

Applicant after: Microsoft Technology Licensing, LLC

Address before: Washington State

Applicant before: Microsoft Corp.

ASS Succession or assignment of patent right

Owner name: MICROSOFT TECHNOLOGY LICENSING LLC

Free format text: FORMER OWNER: MICROSOFT CORP.

Effective date: 20150728