OA20558A - Computing device and method for tracking objects


Info

Publication number
OA20558A
OA20558A
Authority
OA
OAPI
Prior art keywords
user
computing device
operative
current position
Prior art date
2019-07-03
Application number
OA1202100564
Inventor
Tommy Arngren
Jonas Pettersson
Peter ÖKVIST
Original Assignee
Telefonaktiebolaget Lm Ericsson (Publ)
Filing date
Publication date
2022-10-27
Application filed by Telefonaktiebolaget Lm Ericsson (Publ)

Abstract

A computing device (110) for tracking objects is provided. The computing device comprises a positioning sensor (113), a wireless network interface (114), and a processing circuit (115) which causes the computing device to be operative to detect that an object (120, 130) is gripped by a user carrying the computing device, identify the object, and update information pertaining to the object in a database accessible by multiple computing devices. The information comprises an identifier associated with the object, an identifier associated with the user, and position information identifying the object as being co-located with the user. The computing device is further operative to detect that the object is released by the user, and in response thereto, update the position information in the database with a position of the computing device when the object was released by the user.

Description

COMPUTING DEVICE AND METHOD FOR TRACKING OBJECTS
Technical field
The invention relates to a computing device for tracking objects, a method of tracking objects performed by a computing device, a corresponding computer program, a corresponding computer-readable storage medium, and a corresponding data carrier signal.
Background
People may have difficulties in keeping track of objects such as electronic devices (e.g., mobile phones, tablet computers, or the like), watches, keys, wallets, remote controls, tools, or any other types of everyday items in general, which can be picked up, i.e., gripped with a hand of a person using the object (the user), carried by the user, and subsequently released by the user at a potentially different location.
There are different solutions to assist people in finding lost or displaced objects. For instance, battery-powered tracking devices are known which can be attached to objects such as wallets or keys, and which are based on short-range radio signals, e.g., Bluetooth. Further, Apple's 'Find My iPhone' app can be used for locating iOS devices by retrieving position information from iOS devices which are connected to the Internet.
Summary
It is an object of the invention to provide an improved alternative to the above techniques and prior art.
More specifically, it is an object of the invention to provide improved solutions for tracking objects such as electronic devices (e.g., mobile phones, tablet computers, or the like), watches, keys, wallets, remote controls, tools, or any other types of everyday items in general, which can be picked up, i.e., gripped with a hand of a person using the object (the user), carried by the user, and released by the user at a potentially different location.
These and other objects of the invention are achieved by means of different aspects of the invention, as defined by the independent claims. Embodiments of the invention are characterized by the dependent claims.
According to a first aspect of the invention, a computing device for tracking objects is provided. The computing device comprises a positioning sensor, a wireless network interface, and a processing circuit. The processing circuit causes the computing device to be operative to detect that an object is gripped by a user carrying the computing device, and to identify the object. The computing device is further operative to update information pertaining to the object in a database accessible by multiple computing devices. The information comprises an identifier associated with the object, an identifier associated with the user, and position information identifying the object as being co-located with the user. The computing device is further operative to detect that the object is released by the user, and in response thereto, update the position information in the database with a position of the computing device when the object was released by the user.
According to a second aspect of the invention, a method of tracking objects is provided. The method is performed by a computing device and comprises detecting that an object is gripped by a user carrying the computing device, and identifying the object. The method further comprises updating information pertaining to the object in a database accessible by multiple computing devices. The information comprises an identifier associated with the object, an identifier associated with the user, and position information identifying the object as being co-located with the user. The method further comprises detecting that the object is released by the user, and in response thereto, updating the position information in the database with a position of the computing device when the object was released by the user.
According to a third aspect of the invention, a computer program is provided. The computer program comprises instructions which, when the computer program is executed by a processor comprised in a computing device, cause the computing device to carry out the method according to the second aspect of the invention.
According to a fourth aspect of the invention, a computer-readable storage medium is provided. The computer-readable storage medium has stored thereon the computer program according to the third aspect of the invention.
According to a fifth aspect of the invention, a data carrier signal is provided. The data carrier signal carries the computer program according to the third aspect of the invention.
The invention makes use of an understanding that computing devices, in particular mobile communications devices which are carried by users, such as mobile phones, smartphones, tablet computers, Personal Digital Assistants (PDAs), Head-Mounted Displays (HMDs), or Augmented-Reality (AR) headsets, can be used for keeping track of objects which are picked up by their users at a location where the objects are currently located (by gripping the object with a hand), and subsequently released after the users have finished using them at a potentially different location. Information about the current location of an object, i.e., its position, is maintained in a database which is accessible by multiple computing devices, i.e., a shared database.
This is advantageous in that multiple computing devices which are carried by their users can share information about the current positions of one or more objects, allowing users to locate an object which they are interested in finding and which may be in use by another user or which has been placed at a position where it has been released by the user or another user.
Even though advantages of the invention have in some cases been described with reference to embodiments of the first aspect of the invention, corresponding reasoning applies to embodiments of other aspects of the invention.
Further objectives of, features of, and advantages with, the invention will become apparent when studying the following detailed disclosure, the drawings and the appended claims. Those skilled in the art realize that different features of the invention can be combined to create embodiments other than those described in the following.
Brief description of the drawings
The above, as well as additional objects, features and advantages of the invention, will be better understood through the following illustrative and non-limiting detailed description of embodiments of the invention, with reference to the appended drawings, in which:
Fig. 1 illustrates a user wearing an AR headset and gripping objects, in accordance with embodiments of the invention.
Fig. 2 illustrates an image captured by a camera worn by the user, in accordance with embodiments of the invention.
Fig. 3 shows a sequence diagram illustrating tracking of objects using one or more computing devices, in accordance with embodiments of the invention.
Fig. 4 shows an embodiment of the processing circuit comprised in the computing device for tracking objects, in accordance with embodiments of the invention.
Fig. 5 shows a flow chart illustrating a method of tracking objects, the method performed by a computing device, in accordance with embodiments of the invention.
All the figures are schematic, not necessarily to scale, and generally only show parts which are necessary in order to elucidate the invention, wherein other parts may be omitted or merely suggested.
Detailed description
The invention will now be described more fully hereinafter with reference to the accompanying drawings, in which certain embodiments of the invention are shown. This invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided by way of example so that this disclosure will be thorough and complete, and will fully convey the scope of the invention to those skilled in the art.
In the following, embodiments of the computing device 110 for tracking objects are described with reference to Fig. 1, in which the computing device 110 is illustrated as an AR headset, or HMD, which is worn by a user gripping one or more objects 120 and 130 with his/her hand or hands. In Fig. 1, the objects which are gripped by the user are illustrated as tools, a spirit level 120 which the user grips with the left hand, and a battery-powered drill 130 which the user is about to grip with the right hand. It will be appreciated that embodiments of the invention are not limited to the specific types of objects which are described throughout this disclosure, but may be envisaged to be used for tracking any kinds of objects such as electronic devices (e.g., mobile phones, tablet computers, or the like), watches, keys, wallets, remote controls, tools, or any other types of everyday items in general, which can be picked up, i.e., gripped with a hand of the user carrying the computing device 110, and subsequently released by the user. The location where an object is released, i.e., its position, may potentially be different than the position at which the user has picked up the object. Accordingly, there arises a need to assist users in locating objects which they, or others, have used and placed somewhere.
The computing device 110 comprises a positioning sensor 113, a wireless network interface 114, and a processing circuit 115. If the computing device is embodied as an optical AR headset or HMD 110 as is illustrated in Fig. 1, it may further comprise a see-through display 111 through which the user wearing the AR headset can view the real-world scene, i.e., the physical world, in front of the user, and a camera 112 which is operative to capture images of the real-world scene in front of the user. The captured images, which may be still images or video sequences, may be used for generating a 3D model of the physical world around the user. Alternatively, if the computing device 110 is a non-optical, i.e., video see-through, HMD, the images captured by the camera 112 may be displayed to the user on a display provided on the inside of the computing device 110 (instead of the see-through display 111). Even further, the computing device 110 may also be embodied as a mobile phone or smartphone which is fixated to the head of the user using an arrangement comprising additional optical components, such as a half-see-through mirror, which enables the user to view the real-world scene and to project images displayed by the smartphone towards the eyes of the user. An example of such an arrangement is the HoloKit cardboard headset.
The positioning sensor 113 is operative to determine a current position of the computing device 110, and accordingly that of its user carrying the computing device 110. It may either be based on the Global Positioning System (GPS), the Global Navigation Satellite System (GNSS), China's BeiDou Navigation Satellite System (BDS), GLONASS, or Galileo, or may receive position information via the wireless network interface 114, e.g., from a positioning server. The position information may, e.g., be based on radio triangulation, radio fingerprinting, or crowd-sourced identifiers which are associated with known positions of access points of wireless communications networks (e.g., cell-IDs or WLAN SSIDs). The current position of the computing device 110 may, e.g., be made available via an Application Programming Interface (API) provided by an operating system of the computing device 110.
The wireless network interface 114 is a circuit which is operative to access a wireless communications network and thereby enable the computing device 110 to communicate, i.e., exchange data in either direction (uplink or downlink). The computing device may, e.g., exchange data with other computing devices which are similar to the computing device 110, or a database 150 which is accessible by multiple computing devices 110 and which is operative to maintain information pertaining to one or more objects, as is described further below. As yet a further alternative, the wireless network interface 114 may be operative to exchange data with one or more other communications devices of the user, such as a smartwatch 140 which is shown in Fig. 1. The wireless network interface 114 may comprise one or more of a cellular modem (e.g., GSM, UMTS, LTE, 5G, NR/NX), a WLAN/Wi-Fi modem, a Bluetooth modem, a Near-Field Communication (NFC) modem, or the like.
Embodiments of the computing device 110 are now described with further reference to Fig. 3, which shows a sequence diagram 300 illustrating tracking of objects using one or more computing devices 110A and 110B (collectively referred to as computing device(s) 110).
The processing circuit 115 causes the computing device 110 to be operative to detect 311/331 that an object is gripped by a user carrying the computing device 110, such as the spirit level 120 or the drill 130. For instance, the computing device 110 may be operative to detect 311/331 that an object is gripped by a user carrying the computing device 110 by detecting flexion of the fingers of a hand of the user, wherein the flexion of the fingers is characteristic of the hand gripping an object. The computing device 110 may, e.g., be operative to detect flexion of the fingers based on sensor data which is received from a sensor device which is worn close to the hand gripping the object. The sensor device may, e.g., be a smartwatch 140 or any other wearable device which is preferably worn close to the wrist and which comprises haptic sensors, motion sensors, and/or ultrasound sensors, which are operative to detect flexion of the fingers. For instance, McIntosh et al. have demonstrated hand-gesture recognition using ultrasound imaging (J. McIntosh, A. Marzo, M. Fraser, and C. Phillips, "EchoFlex: Hand Gesture Recognition using Ultrasound Imaging", in Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems, pages 1923-1934, ACM, New York, 2017). Alternatively, the sensor device may be a haptic glove which is worn by the user. The sensor data is received by the computing device 110 via its wireless network interface 114, and is transmitted by the sensor device via a corresponding network interface comprised in the sensor device, e.g., a Bluetooth interface.
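By way of illustration only, the following Python sketch shows how such sensor data might be classified into grip events. It is a minimal sketch, assuming windowed feature vectors (e.g., IMU and ultrasound channels) arriving from the wrist-worn sensor device and a simple pre-trained classifier; the function names, feature layout, and choice of classifier are illustrative assumptions and not part of the disclosed embodiment.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    RELEASED, GRIPPED = 0, 1  # hypothetical class labels

    def train_flexion_classifier(feature_windows, labels):
        # Fit a simple classifier on labelled windows of wrist-sensor
        # features; stands in for the trained model suggested above.
        clf = LogisticRegression(max_iter=1000)
        clf.fit(np.asarray(feature_windows), np.asarray(labels))
        return clf

    def hand_is_gripping(clf, window):
        # Classify one feature window received from the sensor device
        # (e.g., over Bluetooth) as gripping or not gripping.
        return clf.predict(np.asarray(window).reshape(1, -1))[0] == GRIPPED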
The computing device 110 may alternatively be operative to detect 311/331 that an object is gripped by a user carrying the computing device 110 by performing image analysis on one or more images captured by a camera worn by the user. This may, e.g., be a camera 112 which is integrated into an AR headset, or HMD, embodying the computing device 110, as is illustrated in Fig. 1. Alternatively, the camera may be integrated into a helmet worn by the user, or be fixated to the user's body, e.g., a body camera. This type of image analysis for recognition of hand gestures is known in the art. In Fig. 2, an image 200 captured by the camera 112 integrated into the AR headset 110 illustrated in Fig. 1 is exemplified. This embodiment of the computing device 110 which relies on image analysis to detect that an object is gripped 311/331 by the user is based on an understanding that people typically gaze at objects they intend to grip. Accordingly, there is a high likelihood that an object which the user intends to grip, such as drill 130, is visible in the image 200 which is captured by the camera 112 worn by the user, as is illustrated in Fig. 2.
As a further alternative, the computing device 110 may be operative to detect 311/331 that an object is gripped by a user carrying the computing device 110 by evaluating a strength of a radio signal which is transmitted by the object. This is exemplified in Fig. 1, which illustrates the spirit level 120 as being provided with a Radio-Frequency Identification (RFID) sticker 121. The radio signal may alternatively be transmitted by any other type of radio transmitter which is comprised in the object or can be attached to an object. Preferably, the radio signal is a short-range radio signal, such as Bluetooth or NFC. For instance, car keys frequently incorporate radio transmitters used for unlocking a car. The radio signal which is transmitted by an object may, e.g., be received by the computing device 110 via the wireless network interface 114. Alternatively, the computing device 110 may be operative to evaluate a strength of a radio signal transmitted by the object based on data pertaining to the strength of the radio signal, which data is received from a receiver device worn close to the hand gripping the object and which has received the radio signal. For instance, this may be the smartwatch 140 or any other type of wearable communications device which is worn by the user, preferably close to the wrist. The detecting 311/331 that an object is gripped by the user may be achieved by concluding that an object is gripped if the signal strength of the received radio signal gradually increases (owing to the hand and the computing device 110, or the receiver device, approaching the object, such that the distance between them is gradually reduced and the received signal strength increases accordingly) and then becomes substantially constant (the hand has gripped the object, the distance between the object and the computing device 110, or the receiver device, is substantially constant, as is the received signal strength).
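The rise-then-plateau pattern just described can be captured in a few lines. The following is a minimal sketch, assuming an RSSI trace in dBm ordered oldest first; the rise threshold, plateau length, and tolerance are illustrative assumptions rather than values taken from the disclosure.

    def grip_detected_from_rssi(rssi, rise_db=6.0, plateau_len=5, plateau_tol=1.5):
        # True if the signal first rose by at least rise_db (hand approaching
        # the object) and then stayed substantially constant over the last
        # plateau_len samples (hand holding the object).
        if len(rssi) < plateau_len + 2:
            return False
        plateau = rssi[-plateau_len:]
        rose = max(plateau) - min(rssi[:-plateau_len]) >= rise_db
        settled = max(plateau) - min(plateau) <= plateau_tol
        return rose and settled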
The computing device 110 is further operative to identify 312/332 the object which is gripped 311/331 by the user. The computing device 110 may, e.g., be operative to identify 312/332 the object by performing object recognition on one or more images captured by a camera worn by the user, similar to what is described hereinbefore in relation to detecting that an object is gripped by the user. In particular, the camera may be the camera 112 which is integrated into an AR headset, or HMD, embodying the computing device 110, as is illustrated in Fig. 1. Alternatively, the camera may be integrated into a helmet worn by the user, or be fixated to the user's body, e.g., a body camera. Identifying 312/332 the object based on object recognition may be achieved using a machine-learning model which has been trained by classifying images of objects which are frequently used by, and/or are known to, the user. Alternatively, a generic machine-learning model or a generic database with images of objects, and which can be accessed or retrieved by the computing device 110, may be used. As yet a further alternative, identifying the object based on analyzing an image captured by a camera worn by the user may also rely on fiducial markers of objects, e.g., stickers or labels which are attached to objects. Such fiducial markers may optionally comprise optically readable codes, such as bar codes or QR codes.
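As an illustration of the fiducial-marker variant, the following sketch decodes a QR code from a camera frame using OpenCV and treats the decoded string as the object identifier; that the code directly encodes the identifier is an assumption made for the example.

    import cv2  # OpenCV

    def identify_object_from_qr(frame):
        # Attempt to decode a QR fiducial attached to the gripped object
        # from a single camera frame; returns the decoded identifier,
        # or None if no code is found.
        detector = cv2.QRCodeDetector()
        data, points, _ = detector.detectAndDecode(frame)
        return data if data else None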
The computing device 110 may alternatively be operative to identify 312/332 the object based on a radio signal which is transmitted by the object, similar to what is described hereinbefore in relation to detecting that an object is gripped by the user. In particular, the object may be provided with an RFID sticker, such as RFID sticker 121, or any other type of radio transmitter which is comprised in the object or can be attached to an object and which is transmitting a distinct code, such as a MAC address or any other unique identifier associated with the radio transmitter or object, or a distinct radio-signal pattern. Preferably, the radio signal is a short-range radio signal, such as Bluetooth or NFC.
The computing device 110 is further operative to update 313/333 information pertaining to the object in a database 150 which is accessible by multiple computing devices 110. The database 150, which is also referred to as a shared database, may, e.g., be maintained in an application server, an edge server, or a cloud storage, which is accessible by the computing devices 110 through one or more wireless communications networks to which the computing devices 110 are connected via their wireless network interfaces 114. The computing device 110 is operative to update 313/333 the information pertaining to the object by transmitting information via its wireless network interface 114 using a suitable protocol, e.g., one or more of the Hypertext Transfer Protocol (HTTP), the Transmission Control Protocol/Internet Protocol (TCP/IP) suite, the Constrained Application Protocol (CoAP), the User Datagram Protocol (UDP), or the like. As an alternative, the shared database 150 may be maintained in a local data storage, i.e., memory, of each of the multiple computing devices 110, which multiple local databases are continuously synchronized with each other, or with a master database which is maintained in an application server, an edge server, or a cloud storage. Synchronization is achieved by transmitting and/or receiving information via the wireless network interfaces 114 using a suitable protocol, as is described hereinbefore.
The information which is updated in the database 150 comprises an identifier which is associated with the object, an identifier which is associated with the user, and position information which identifies the object as being co-located with the user. The identifier which is associated with the object is preferably generated based on a unique code or identifier which is obtained when the object is identified 312/332. Alternatively, an identifier which is associated with the object may be generated by the database 150, in particular if the object is not yet listed in the database 150 and a new database entry is created. The identifier which is associated with the user may, e.g., be a name, a user name, an account name, or a login name, of the user. Alternatively, the identifier which is associated with the user may be an identifier which is associated with the user's computing device 110, e.g., a MAC address of the computing device 110, a name associated with the computing device 110 (e.g., "Bob's iPhone"), or the like. The position information which identifies the object as being co-located with the user may, e.g., be an information field, or a flag, indicating that the object is co-located with the user who is identified by the identifier which is associated with the user (e.g., a Boolean flag). Alternatively, the position information which identifies the object as being co-located with the user may be the identifier which is associated with the user (e.g., "Bob") or with the computing device of the user (e.g., "Bob's iPhone"). Optionally, the computing device 110 may be operative to update 313/333 the position information identifying the object as being co-located with the user by recurrently updating the position information with a current position of the computing device 110 of the user who has gripped the object, which position information is acquired from the positioning sensor 113. The position information identifying the object as being co-located with the user may be updated 313/333 periodically, or in response to detecting that the position of the computing device, and thereby that of the user, has changed by more than a threshold distance.
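A minimal sketch of this update, assuming the shared database 150 is exposed through a hypothetical REST endpoint; the URL, field names, and use of a Boolean co-location flag follow the description above but are otherwise assumptions.

    import requests

    DATABASE_URL = "https://example.com/tracked-objects"  # hypothetical endpoint

    def update_on_grip(object_id, user_id, position):
        # Mark the object as co-located with the user who gripped it and
        # record the current position of the user's computing device
        # (acquired from the positioning sensor 113).
        record = {
            "object_id": object_id,        # e.g., code from RFID/QR identification
            "user_id": user_id,            # e.g., "Bob" or "Bob's iPhone"
            "co_located_with_user": True,  # Boolean co-location flag
            "position": position,          # e.g., {"lat": 59.33, "lon": 18.07}
        }
        requests.put(f"{DATABASE_URL}/{object_id}", json=record, timeout=5)

The same call could be repeated periodically, or whenever the device has moved by more than the threshold distance, to keep the co-located position current.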
The computing device 110 is further operative to detect 314/334 that the object is released by the user. This may be achieved in a similar way as is described hereinbefore in relation to detecting that an object is gripped by the user. For instance, the computing device 110 may be operative to detect 314/334 that the object is released by the user by detecting flexion of the fingers of a hand of the user, wherein the flexion of the fingers is characteristic of the hand releasing an object. The computing device 110 may, e.g., be operative to detect flexion of the fingers based on sensor data which is received from the sensor device which is worn close to the hand gripping the object, such as the smartwatch 140 or other wearable device worn close to the wrist and comprising haptic sensors, motion sensors, and/or ultrasound sensors, or a haptic glove worn by the user. The computing device 110 may alternatively be operative to detect 314/334 that the object is released by the user by performing image analysis on one or more images captured by a camera worn by the user, e.g., the camera 112 which is integrated into an AR headset, or HMD, embodying the computing device 110, as is illustrated in Fig. 1, or a camera which is integrated into a helmet worn by the user or which is fixated to the user's body, e.g., a body camera. Similar to what is described hereinbefore, this embodiment of the computing device 110 which relies on image analysis to detect that the object is released 314/334 by the user is based on an understanding that people typically gaze at objects they are about to release, by placing the object on a surface such as a table, a floor, a shelf, or the like. Accordingly, there is a high likelihood that an object which the user is about to release is visible in an image which is captured by the camera worn by the user, similar to image 200 which is shown in Fig. 2. As a further alternative, the computing device 110 may be operative to detect 314/334 that the object is released by the user by evaluating a strength of the radio signal which is transmitted by the object. The detecting 314/334 that the object is released may be achieved by concluding that the signal strength of the received radio signal suddenly decreases owing to a sudden increase of the distance between the object and the computing device 110, or the receiver device, when the object is released by the user. As yet a further alternative, the computing device 110 may be operative to detect 314/334 that the object is released by the user by analyzing an audio signal which is captured by a microphone comprised in the computing device 110, or by a microphone which is comprised in the smartwatch 140 or other wearable audio-recording device which is worn by the user. This may be achieved by detecting a sound which is characteristic of an object being placed on a surface, in particular a hard surface, using a trained machine-learning model.
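Complementing the grip heuristic sketched earlier, release could be inferred from a sudden drop in received signal strength as the hand moves away from the released object; the drop threshold and window length below are, again, illustrative assumptions.

    def release_detected_from_rssi(rssi, drop_db=6.0, recent=3):
        # True if the last few RSSI samples (dBm, oldest first) sit well
        # below the preceding level, matching the sudden decrease that
        # accompanies the hand moving away from a released object.
        if len(rssi) < recent + 2:
            return False
        before, after = rssi[:-recent], rssi[-recent:]
        return (sum(before) / len(before)) - (sum(after) / len(after)) >= drop_db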
The processing circuit 115 causes the computing device 110 to be further operative to update 315/335 the position information in the database 150 in response to detecting 314/334 that the object is released by the user. The position information in the database 150 is updated 315/335 with a position of the computing device 110 when the object was released by the user. The position information is obtained from the positioning sensor 113. Similar to what is described hereinbefore, the computing device 110 may be operative to update 315/335 the position information by transmitting information to the database 150 via its wireless network interface 114 using a suitable protocol, e.g., one or more of HTTP, TCP/IP, CoAP, UDP, or the like. As an alternative, if the database 150 is maintained in a local data storage of each of the multiple computing devices 110, the local databases 150 are continuously synchronized with each other, or with a master database which is maintained in an application server, an edge server, or a cloud storage, by transmitting information via the wireless network interfaces 114 using a suitable protocol.
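The counterpart of the grip-time update, reusing the hypothetical endpoint from the sketch above: the co-location flag is cleared and the record is pinned to the position reported by the positioning sensor 113 at the moment of release.

    def update_on_release(object_id, release_position):
        # Replace the co-location flag with the fixed position at which
        # the object was released.
        patch = {"co_located_with_user": False, "position": release_position}
        requests.patch(f"{DATABASE_URL}/{object_id}", json=patch, timeout=5)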
Optionally, the computing device 110 may further be operative to receive 321/341 a request from the user to locate an object, to query 322/342 the database 150 to retrieve 323/343 a current position of the object, and to guide 324/344 the user to the current position of the object. The request from the user may, e.g., be received as a spoken instruction (e.g., "Find spirit level.") which is captured by a microphone comprised in the computing device 110, and subjected to speech recognition, or via a graphical user interface through which the user interacts with the computing device 110. For instance, the computing device 110 may be operative to display a list of objects which currently are listed in the database 150 on a display of the computing device 110, from which list the user may select an object which he/she wishes to locate. The list of objects which currently are listed in the database 150 may be retrieved by querying the database 150 via the wireless network interface 114.
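The locate flow could be sketched against the same hypothetical endpoint: fetch the stored record and pass it on to whichever guidance mechanism is available (visual cues or audible sound, see below).

    def locate_object(object_id):
        # Retrieve the current record for an object from the shared
        # database; the record either carries a fixed release position
        # or marks the object as co-located with a user.
        response = requests.get(f"{DATABASE_URL}/{object_id}", timeout=5)
        response.raise_for_status()
        return response.json()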
With reference to Fig. 3, different scenarios are described which may arise when users of computing devices, such as the computing devices 110A and 110B, wish to locate an object. For instance, the user ("user A") of the computing device 110A may request 341 to locate an object which he/she has previously gripped 311 and released 314. This may, e.g., be the case if user A has forgotten where he/she has placed the object. It may also be the case that another user ("user B") of the computing device 110B has gripped 331 and subsequently released 334 the object after it has been released 314 by user A, such that user A is not aware of the current location (position) of the object. It may also be the case that the object is currently in use by user A, i.e., it has been gripped 311 but not yet released 314 by user A, and user B requests 321 to locate the object.
The computing device 110 may be operative to guide 324/344 the user to the current position of the object by displaying one or more cues guiding the user to the current position of the object. For instance, if the computing device 110 is embodied by an AR headset as is illustrated in Fig. 1, it may be operative to display arrows on the see-through display 111 which point in the direction of the current position of the object, in particular if the object is not visible in the field of view of the AR headset, i.e., if the object is currently located in a part of the real-world scene which is not visible to the user. Alternatively, if the object is within the field of view of the AR headset 110, the object may be highlighted, e.g., by displaying a circle around the object, or any other graphical marker close to the object.
Alternatively, the computing device 110 may be operative to guide 324/344 the user to the current position of the object by emitting an audible sound guiding the user to the current position of the object. In particular, the emitted audible sound may be varied to reflect a distance between the user and the current position of the object while the user is moving around to locate the object. For instance, a volume or a frequency of the audible sound may increase with decreasing distance. If the audible sound comprises repetitive beeps, the duration in-between beeps may be shortened to reflect a decrease in distance, similar to a metal detector.
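One way to realize the metal-detector-style guidance is to map the remaining distance onto the pause between beeps; the distance bounds and interval range in this sketch are illustrative assumptions.

    def beep_interval_s(distance_m, near_m=0.5, far_m=20.0,
                        min_interval_s=0.1, max_interval_s=1.5):
        # Map distance to the pause between repetitive beeps: the closer
        # the user is to the object, the shorter the interval, as with a
        # metal detector. Distances are clamped to [near_m, far_m].
        d = min(max(distance_m, near_m), far_m)
        frac = (d - near_m) / (far_m - near_m)
        return min_interval_s + frac * (max_interval_s - min_interval_s)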
As yet a further alternative, the computing device 110 may be operative, if the current position of the object is indicated as being co-located with another user, to guide 324/344 the user to the current position of the object by notifying the user that the object is currently co-located with the other user. This may, e.g., be achieved by providing an audible instruction to the user (e.g., "The spirit level is used by Bob."), or by displaying corresponding information to the user (e.g., "The spirit level is used by Bob."). Similar to what is described hereinbefore, the computing device 110 may be operative to guide the user to the object at its current position even if it is in use by another user, e.g., by displaying one or more cues or by emitting audible sound guiding the user requesting to locate the object to the user who has gripped the object.
Although embodiments of the computing device have in some cases been described with reference to the AR headset 110 illustrated in Fig. 1, also referred to as HMD, alternative embodiments of the computing device for tracking objects may easily be envisaged. For instance, the computing device for tracking objects may be a mobile phone, a smartphone, a tablet computer, or a Personal Digital Assistant (PDA).
In the following, embodiments of the processing circuit 115 comprised in the computing device for tracking objects, such as the computing device 110, are described with reference to Fig. 4. The processing circuit 115 may comprise one or more processors 402, such as Central Processing Units (CPUs), microprocessors, application-specific processors, Graphics Processing Units (GPUs), and Digital Signal Processors (DSPs) including image processors, or a combination thereof, and a memory 403 comprising a computer program 404 comprising instructions. When executed by the processor(s) 402, the computer program 404 causes the computing device 110 to perform in accordance with embodiments of the invention described herein. The memory 403 may, e.g., be a Random-Access Memory (RAM), a Read-Only Memory (ROM), a Flash memory, or the like. The computer program 404 may be downloaded to the memory 403 by means of the wireless network interface 114, as a data carrier signal carrying the computer program 404. The processor(s) 402 may further comprise one or more Application-Specific Integrated Circuits (ASICs), Field-Programmable Gate Arrays (FPGAs), or the like, which in cooperation with, or as an alternative to, the computer program 404 are operative to cause the computing device 110 to perform in accordance with embodiments of the invention described herein. In addition, the processing circuit 115 comprises one or more interface circuits 401 ("I/O" in Fig. 4) for controlling and/or receiving information from other components comprised in the computing device 110, such as the display 111, the camera 112, the positioning sensor 113, the wireless network interface 114, and any additional components which are comprised in the computing device 110, e.g., a microphone, a loudspeaker, or the like. The interface(s) 401 may be implemented by any kind of electronic circuitry, e.g., any one, or a combination of, analogue electronic circuitry, digital electronic circuitry, and processing circuits executing a suitable computer program, i.e., software.
In the following, embodiments of the method of tracking objects are described with reference to Fig. 5. The method 500 is performed by a computing device 110 and comprises detecting 501 that an object is gripped by a user carrying the computing device, and identifying 502 the object. The method 500 further comprises updating 503 information pertaining to the object in a database accessible by multiple computing devices. The information comprises an identifier associated with the object, an identifier associated with the user, and position information identifying the object as being co-located with the user. The method 500 further comprises detecting 504 that the object is released by the user, and in response thereto, updating 505 the position information in the database with a position of the computing device when the object was released by the user.
The detecting 501 that an object is gripped by a user carrying the computing device may comprise detecting flexion of the fingers of a hand of the user which is characteristic of the hand gripping an object. Optionally, the flexion of the fingers is detected based on sensor data received from a sensor device worn close to the hand gripping the object.
The detecting 501 that an object is gripped by a user carrying the computing device may alternatively comprise performing image analysis on an image captured by a camera worn by the user.
The detecting 501 that an object is gripped by a user carrying the computing device may alternatively comprise evaluating a strength of a radio signal transmitted by the object. Optionally, the strength of a radio signal transmitted by the object is evaluated based on data pertaining to the strength of the radio signal, which data is received from a receiver device worn close to the hand gripping the object and which has received the radio signal.
The identifying 502 the object may comprise performing object recognition on an image captured by a camera worn by the user.
Alternatively, the object may be identified 502 based on a radio signal transmitted by the object.
The updating 503 the position information identifying the object as being co-located with the user may comprise recurrently updating the position information with a current position of the computing device.
The method 500 may further comprise receiving 506 a request from the user to locate the object, querying 507 the database to retrieve a current position of the object, and guiding 508 the user to the current position of the object. The guiding 508 the user to the current position of the object may comprise displaying one or more cues guiding the user to the current position of the object. The guiding 508 the user to the current position of the object may alternatively comprise emitting audible sound guiding the user to the current position of the object. If the current position of the object is indicated as being co-located with another user, the guiding 508 the user to the current position of the object may alternatively comprise notifying the user that the object is currently co-located with the other user.
It will be appreciated that the method 500 may comprise additional, alternative, or modified, steps in accordance with what is described throughout this disclosure. An embodiment of the method 500 may be implemented as the computer program 404 comprising instructions which, when executed by the one or more processor(s) 402 comprised in the computing device 110, cause the computing device 110 to perform in accordance with embodiments of the invention described herein.
The person skilled in the art realizes that the invention by no means is limited to the embodiments described above. On the contrary, many modifications and variations are possible within the scope of the appended claims.

Claims (16)

1. A computing device (110) for tracking objects (120, 130), the computing device comprising:
a positioning sensor (113), a wireless network interface (114), and a processing circuit (115) causing the computing device to be operative to:
detect (311, 331) that an object is gripped by a user carrying the computing device by evaluating a strength of a radio signal transmitted by the object,
identify (312, 332) the object,
update (313, 333) information pertaining to the object in a database (150) accessible by multiple computing devices (110A, 110B), the information comprising an identifier associated with the object, an identifier associated with the user, and position information identifying the object as being co-located with the user,
detect (314, 334) that the object is released by the user, and in response thereto,
update (315, 335) the position information in the database with a position of the computing device when the object was released by the user.
2. The computing device according to claim 1, operative to detect (311, 331) that an object is gripped by a user carrying the computing device by detecting flexion of the fingers of a hand of the user which is characteristic of the hand gripping an object.
3. The computing device according to claim 2, operative to detect (311, 331) flexion of the fingers based on sensor data received from a sensor device (140) worn close to the hand gripping the object.
4. The computing device according to claim 1, operative to detect (311, 331) that an object is gripped by a user carrying the computing device by performing image analysis on an image captured by a camera (112) worn by the user.
5. The computing device according to claim 1, operative to evaluate a strength of a radio signal transmitted by the object based on data pertaining to the strength of the radio signal, which data is received from a receiver device (140) worn close to the hand gripping the object and which has received the radio signal.
6. The computing device according to any one of claims 1 to 5, operative to identify (312, 332) the object by performing object recognition on an image captured by a camera (112) worn by the user.
7. The computing device according to any one of claims 1 to 5, operative to identify (312, 332) the object based on a radio signal transmitted by the object.
8. The computing device according to any one of claims 1 to 7, operative to update (313, 333) the position information identifying the object as being co-located with the user by recurrently updating the position information with a current position of the computing device.
9. The computing device according to any one of claims 1 to 7, further operative to:
receive (321, 341) a request from the user to locate the object,
query (322, 342) the database (150) to retrieve (323, 343) a current position of the object, and
guide (324, 344) the user to the current position of the object.
10. The computing device according to claim 9, operative to guide (324, 344) the user to the current position of the object by displaying one or more cues guiding the user to the current position of the object.
11. The computing device according to claim 9, operative to guide (324, 344) the user to the current position of the object by emitting audible sound guiding the user to the current position of the object.
12. The computing device according to claim 9, operative, if the current position of the object is indicated as being co-located with another user, to guide (324, 344) the user to the current position of the object by notifying the user that the object is currently co-located with the other user.
13. The computing device according to any one of claims 1 to 12, being any of a mobile phone, a smartphone, a tablet computer, a Personal Digital Assistant, PDA, a Head-Mounted Display, HMD, and an Augmented-Reality, AR, headset (110).
14. A method (500) of tracking objects, performed by a computing device, the method comprising:
detecting (501) that an object is gripped by a user carrying the computing device by evaluating a strength of a radio signal transmitted by the object,
identifying (502) the object,
updating (503) information pertaining to the object in a database accessible by multiple computing devices, the information comprising an identifier associated with the object, an identifier associated with the user, and position information identifying the object as being co-located with the user,
detecting (504) that the object is released by the user, and in response thereto,
updating (505) the position information in the database with a position of the computing device when the object was released by the user.
15. A computer-readable storage medium (403) having stored thereon a computer program (404) comprising instructions which, when the computer program is executed by a processor (402) comprised in a computing device (110), cause the computing device to carry out the method according to claim 14.
16. A data carrier signal carrying a computer program (404) comprising instructions which, when the computer program is executed by a processor (402) comprised in a computing device (110), cause the computing device to carry out the method according to claim 14.
Application OA1202100564, priority date 2019-07-03: Computing device and method for tracking objects. OA20558A (en)

Publications (1)

Publication Number: OA20558A
Publication Date: 2022-10-27

