US20210266700A1 - Methods for Mobile Devices to Locate Persons - Google Patents
- Publication number
- US20210266700A1 (application Ser. No. 17/314,007)
- Authority
- US
- United States
- Prior art keywords
- mobile device
- user
- server
- location
- beaconing
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W4/00—Services specially adapted for wireless communication networks; Facilities therefor
- H04W4/02—Services making use of location information
- H04W4/025—Services making use of location information using location based information parameters
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W4/00—Services specially adapted for wireless communication networks; Facilities therefor
- H04W4/02—Services making use of location information
- H04W4/029—Location-based management or tracking services
Definitions
- This invention relates to methods and systems for locating persons and places using mobile devices.
- the invention is a method implemented in a server for providing beaconing sequences for location sharing to mobile devices.
- the invention is a server executing a method of locating a user using a beaconing mobile device.
- the invention is a non-transitory computer readable medium on a server that encodes a program to execute a method on a first mobile device that determines directions and/or distance between the first mobile device and a second mobile device.
- the invention is a server executing a method to remember a place on a mobile device.
- the invention is a system for providing beaconing sequences for location sharing to mobile devices.
- FIG. 1 is a network diagram that illustrates N mobile devices communicating with N servers.
- FIGS. 2A-2F illustrate the user interfaces of a mobile device.
- FIG. 3 illustrates user interactions and communications among the mobile devices and the servers.
- FIG. 4 illustrates a second mobile device user locating a first mobile device user in a crowd using the compass feature.
- FIG. 5 is a perspective view of a second mobile device user locating a first mobile device user in a crowd using a compass.
- FIG. 6 is a perspective view of a second mobile device user locating a first mobile device user in a crowd using the compass on a smart watch or an electronic wearable.
- FIG. 7 illustrates a second mobile device user locating a first mobile device user by using a beaconing mobile device in a crowd with other users.
- FIG. 8 is a perspective view of a second mobile device user locating a first mobile device user in a crowd by using the beaconing mobile device.
- FIG. 9 illustrates beaconing sequences used in FIGS. 7-8 .
- FIG. 10 illustrates a variety of beaconing sequences.
- FIG. 11 illustrates assigning beaconing sequences by location cells.
- FIG. 12 illustrates some beaconing sequences that could be used in location cells.
- FIG. 13 illustrates assigning beaconing sequences by venue.
- FIG. 14 illustrates examples of beaconing sequences that could be used in FIG. 13 .
- FIG. 15 illustrates a flow diagram for the beaconing logic that runs on the server of FIGS. 1 and 3 .
- FIG. 16 illustrates a flow diagram for the message logic that runs on the server of FIGS. 1 and 3 .
- FIG. 17 illustrates a flow diagram for the event logic that runs on the server of FIGS. 1 and 3 .
- FIGS. 18A-18F are a set of flow diagrams that illustrate the application logic that runs on the mobile device and interacts with the server.
- FIG. 1 is a network diagram that illustrates N mobile devices communicating with N servers.
- the network diagram illustrates a system that includes the first mobile device 74 , the second mobile device 76 , and the third mobile device 80 up to the Nth mobile device 84 .
- the communication links 72 , 78 , 82 , and 86 connect the first mobile device 74 through the Nth mobile device 84 to the computer network 40 , which connects to a first server 10 through a communication link 36 , a second server 26 through a communication link 22 , and an Nth server 38 through a communication link 39 .
- the first server 10 illustrates the software components.
- the first server 10 includes a web server 34 that listens for requests and passes them along to an email server 35 , beacon logic 32 ( FIG. 15 ), message logic 28 ( FIG. 16 ), and event logic 30 ( FIG. 17 ), which the first server 10 executes during operation of the system.
- a backplane 20 links the first server 10 , a database and storage 24 , the second server 26 , and the Nth server 38 .
- the second server 26 has the same software components (not shown) and illustrates hardware for the first server 10 through Nth server 38 . It includes a CPU-memory bus 14 that communicates with a processor 12 . The second server 26 includes memory 16 coupled to the processor 12 and interfaces 18 that connect through a communication link 22 to the network 40 .
- Data is defined as including user data, instructions, and metadata.
- Inputting data is defined as the input of parameters and data from user input, computer memory, and/or storage device(s).
- a processor could be any suitable general purpose processor running software.
- the processor could be one or more multicore processors made by Intel or AMD, or licensed from ARM.
- the processor could also be a low-cost single-board computer such as a Raspberry Pi.
- the arrangement and type of the processors is not essential to the invention.
- Each server may run an operating system such as Linux, UNIX, a Windows OS, or another suitable operating system. Tanenbaum, Modern Operating Systems (2008) describes operating systems in detail and is hereby incorporated by reference. Bovet and Cesati, Understanding the Linux Kernel (2005), and Bach, Design of the Unix Operating System (1986) describe operating systems in detail and are incorporated by reference herein.
- each server could be implemented on a virtual machine hosted by VMware, Hyper V, or open source software Xen.
- Lowe et al., Mastering VMware vSphere 5.5 (2013) describes the VMware virtualization software in detail and is incorporated by reference herein.
- the server will be implemented by hundreds or even thousands of computers in a data center such as Amazon Web Services, Google Compute Engine, Microsoft Azure, or Rackspace. It is not essential to the invention that a particular data center be used.
- Murty, Programming Amazon Web Services: S3, EC2, SQS, FPS, and SimpleDB (2008) describes the Amazon Web Services in detail and is incorporated by reference herein.
- Sanderson, Programming Google App Engine (2012) describes the Google App Engine in detail and is incorporated by reference herein.
- the server will be implemented using low cost single board computers such as Raspberry Pi that runs locally in the geographic area and/or near the mobile devices and communicates with the mobile devices using, for example, network protocols such as Bluetooth (e.g., low energy Bluetooth), Wi-Fi, or TCP/IP.
- the database and storage 24 stores the user and event information and communicates with the first server 10 , the second server 26 , and the Nth server 38 .
- the database and storage 24 can reside on a non-transitory computer-readable medium (e.g., a storage device, DVD, or USB storage device).
- Rockoff, The Language of SQL: How to Access Data in Relational Databases (2010) describes SQL databases in detail and is incorporated by reference herein.
- Redmond et al., Seven Databases in Seven Weeks (2012) describes non-SQL databases in detail and is incorporated by reference herein.
- a first mobile device 74 (e.g., a cell phone, a tablet, a smart watch, electronic wearable clothing, or glasses) includes an application 64 , including a communication interface 66 , application logic 68 , and a user interface 70 .
- the communication interface 66 includes a conventional network interface that enables the first mobile device 74 to communicate through a link 72 to a computer network 40 , which includes the cellular phone network (e.g., AT&T, Verizon, Sprint, NTT DoCoMo, Orange, and China Mobile) and/or the Internet. Tanenbaum, Computer Networks (2010) describes computer networks in detail and is incorporated by reference herein.
- the user interface 70 defines a set of graphical user interfaces that will be displayed on the screen of the mobile device 74 as shown in FIGS. 2A-2F .
- the application logic 68 defines the state of the application (e.g., what screen is being displayed and the mobile device beacon state), stores inputs such as locations of mobile devices obtained from the servers of the computer network 40 and calculates the relative distances and directions of the mobile devices.
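The relative distance and direction could be computed from the devices' latitude/longitude coordinates. A minimal sketch in Python, using the haversine formula for great-circle distance and the forward-azimuth formula for bearing (the function name and Earth-radius constant are illustrative assumptions, not part of the patent):

```python
import math

def distance_and_bearing(lat1, lon1, lat2, lon2):
    """Great-circle distance (meters) and initial bearing (degrees,
    clockwise from north) from point 1 to point 2."""
    R = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    # haversine formula for the central angle between the two points
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    dist = 2 * R * math.asin(math.sqrt(a))
    # forward azimuth (initial bearing) toward point 2
    y = math.sin(dl) * math.cos(p2)
    x = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dl)
    bearing = (math.degrees(math.atan2(y, x)) + 360) % 360
    return dist, bearing
```

To draw the on-screen compass arrow, the application would subtract the device's own heading (from the orientation components described below) from this bearing.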
- the application logic 68 can be written in a variety of languages. For example, if the mobile device is an Apple device, the language would be Objective-C. Kochan, Programming in Objective-C (5th Edition) (Developer's Library) (2012) describes Objective-C in detail and is incorporated by reference herein. For example, if the mobile device is an Android device, the language could be Java. Mednieks, Programming Android: Java Programming for the New Generation of Mobile Devices (2012) describes Android programming in detail and is incorporated by reference herein.
- the first mobile device 74 has a location determination component 52 that determines the location of the first mobile device 74 by using a variety of sources.
- a global positioning system (GPS) 42 and cell tower 44 are suitable as long range sources.
- Wi-Fi 46 , Bluetooth 48 , including Bluetooth low energy (LE), and a RFID protocol such as near field communication (NFC) 50 are suitable. Tanenbaum, Computer Networks (2010) describes these protocols in detail and is incorporated by reference herein.
- the first mobile device 74 has an orientation determination component 58 that determines the orientation of the first mobile device 74 with respect to magnetic north and with respect to the earth's surface by using a variety of sources.
- many mobile devices contain an accelerometer 54 , a compass 56 , and gyroscopes (not shown), which function as suitable sources of orientation.
- the first mobile device 74 optionally has a map component 62 that enables the first mobile device 74 to communicate over the Internet with a map provider such as Google maps, Apple maps, Nokia's HERE maps, and OpenStreetMap. It is not essential to the invention which map provider is used.
- FIGS. 2A-2F illustrate the details of the user interfaces of a mobile device.
- FIG. 2A illustrates that the first mobile device 74 includes a home screen 88 that displays a set of user-selectable buttons: a remember place button 89 , a share location button 90 , and a find things button 92 .
- the application logic 68 uses the user interface 70 of the first mobile device 74 to display a remember place screen 94 shown in FIG. 2B .
- the screen 94 displays a field 96 that contains the name of the item whose location a user wants to remember.
- the field 96 is a drop down menu that holds a value for Alan's car: “My Car” that is associated with location information in the parking lot.
- the field 96 can be an input field for names entered by the user. After entering the value either by menu or text field, the user can save that value by clicking on a save place button 98 .
- the user can tag the item with other information beside or in lieu of the location information by selecting the “add picture” button 95 , the “add voice memo” button 97 , and/or the “add text note” button 99 , which will call up respectively, the camera, voice memo, and/or text feature of the first mobile device 74 .
- if Alan wants to find the place of an item (e.g., where he parked his car or bike, where he left a package, or his favorite store or restaurant), he can use the remember place feature to store coordinates and/or other location-related information.
- Alan wants to share his location from 6-11 PM with Bob, Sarah, and Andrew at an event such as a concert.
- Alan adds values in the input fields such as “Concert” in the event field 101 , “6 PM” in the start time field 103 , “11 PM” in the end time field 105 , and the email addresses of Bob, Sarah, and Andrew in a guest field 104 .
- Alan will then press the send button 106 .
- the email addresses can be retrieved from the contact list of the first mobile device 74 or from Internet contact lists, e.g., LinkedIn, Facebook, and Yahoo contacts.
- the application logic 68 will use the communication interface 66 to send a message to the first server 10 ( FIG. 1 ) that will in turn send a message to a second mobile device 76 .
- the find things screen 108 optionally includes an on-off slider 111 to make the second mobile device 76 visible to the first mobile device 74 or make it visible to other mobile devices on the guest list, or invisible to everybody.
- FIG. 2E illustrates that the display screen 114 of the second mobile device 76 includes a compass 116 , a user-selectable request to wave button 118 , and a beacon indicator 120 .
- the compass 116 displays that the first user Alan is at an event Concert, 200 feet away in the direction of the compass arrow.
- the compass 116 may indicate north or the NSEW directions.
- FIG. 2F illustrates that the display screen 122 of the first mobile device 74 includes a beaconing display 124 that will be used to find the first user, Alan, as described in detail in FIGS. 3-8 .
- FIG. 3 illustrates interactions and communications among the mobile devices and a server.
- the first user 126 will press a Start App button 125 on the first mobile device 74 , which will send an HTTP request to compare the clock of the first mobile device 74 with the clock of the first server 10 . That time difference will be stored in the memory of the first server 10 , which will be used by the application later.
- the first mobile device 74 will display the home screen 88 as shown in FIG. 2A .
- the first user 126 will press the share location button 90 .
- the first mobile device 74 will display the screen 102 as shown in FIG. 2C .
- the first user 126 will complete the fields 101 , 103 , 104 , and 105 and press the send button 106 as illustrated in FIG. 2C .
- the first mobile device 74 sends a message 300 to the message logic 28 ( FIGS. 1 and 16 ) of the first server 10 , which will send a notification message 302 to the second mobile device 76 to notify the second user 128 (e.g., Bob) that a new item 107 (e.g., Alan@Concert) was added to the find things screen 108 of the second mobile device 76 .
- the first server 10 also sends a response 304 to the request 300 to confirm the notification 127 (e.g., email, text, or push notification) regarding the event (e.g., Concert) was sent to each guest on the guests list 104 .
- the second mobile device 76 will start the app 129 , which will send an HTTP request to compare the clock of the second mobile device 76 with the clock of the first server 10 . That time difference will be stored in the memory of the first server 10 , which will be used by the application later.
- the second mobile device 76 will display the home screen 88 as shown in FIG. 2A .
- the second user 128 will press the find things button 92 .
- the second mobile device 76 will send a request 306 to the event logic 30 ( FIGS. 1 and 17 ) of the first server 10 .
- the first server 10 reads the entries 107 , 109 , and 115 of the find things list in the database 24 ( FIG. 1 ) and sends a response 308 to the second mobile device 76 .
- the response 308 includes a list of the entries as illustrated in the find things screen 108 of the second mobile device 76 as shown in FIG. 2D .
- the first and second mobile devices 74 , 76 periodically update their coordinates to the first server 10 that keeps track of their respective locations.
- the second user 128 of the second mobile device 76 will press the find button 110 .
- the second mobile device 76 will display the compass screen 114 as shown in FIG. 2E .
- the second user 128 of the second mobile device 76 will then press the request wave button 118 .
- the second mobile device 76 will send a request 310 to the beacon logic 32 ( FIGS. 1 and 15 ) of the first server 10 .
- the first server 10 will read a list of users in the geographic area of interest and each of their beaconing sequences to identify what beaconing sequence is available for assignment as illustrated in FIG. 15 .
- the first server 10 sends a message 312 with the assigned beaconing sequence 142 ( FIG. 9 ) to the wave screen 122 of the first mobile device 74 as illustrated in FIG. 2F .
- the first server 10 sends a response 314 with the assigned beaconing sequence 144 ( FIG. 9 ) to the second mobile device 76 .
- a request to wave will trigger a beacon sequence on another mobile device so that it is conspicuous and easy to spot.
- the second user 128 (e.g., Bob) now uses the compass 116 to get the distance (e.g., 200 feet) and direction of the first user 126 (e.g., Alan).
- the second user 128 can use the beaconing sequence 144 on the beaconing indicator 120 of his second mobile device 76 to indicate the beaconing sequence 142 being displayed on the first mobile device 74 .
- the second user 128 will be able to spot the first user 126 as long as the first mobile device 74 is visible to the second user 128 .
- FIG. 4 illustrates a crowd gathered at an event such as concert with a stage 132 .
- the crowd 130 prevents or makes it difficult for a first user 126 (e.g., Alan) and the second user 128 (e.g., Bob) to locate each other.
- FIG. 5 is a perspective view of FIG. 4 that illustrates how the first user is found by the second user.
- the first user 126 holds the beaconing first mobile device 74 (e.g., cell phone) apart from the crowd.
- the second user 128 holds a second mobile device 76 (e.g., cell phone) displaying a compass 116 that points to the first user 126 and includes a request to wave button 118 that was pressed to trigger beaconing on the first mobile device 74 and a beacon indicator 120 .
- FIG. 6 is a perspective view of FIG. 4 that illustrates another context where a first user is found by a second user.
- the first user 126 holds the beaconing first mobile device 74 (e.g., a cell phone or tablet) apart from the crowd.
- the second user 128 holds or wears a second mobile device 77 (e.g., smart watch/electronic wearable) displaying a compass 116 that points to the first user 126 and includes a request to wave button 118 that was pressed to trigger beaconing on the first mobile device 74 and a beacon indicator 120 .
- FIG. 7 illustrates a crowd gathered at an event such as concert with a stage 132 .
- the crowd 130 prevents or makes it difficult for a first user 126 (e.g., Alan) and the second user 128 (e.g., Bob) to locate each other.
- unrelated persons at the event are using our method of location.
- the first user 126 is holding a first mobile device 74
- a third user 134 is holding a third mobile device 136
- a fourth user 138 is holding a fourth mobile device 140 , which are all beaconing.
- FIG. 8 is a perspective view of FIG. 7 that illustrates how our method would enable the second user to find the first user despite the presence of other beaconing mobile devices.
- the first user 126 holds the beaconing first mobile device 74 (e.g., cell phone) apart from the crowd 130 .
- the second user 128 holds a second mobile device 76 (e.g., cell phone) displaying a compass 116 that points to the first user 126 and includes a request to wave button 118 that was pressed to trigger beaconing on the first mobile device 74 and a beacon indicator 120 .
- the third user 134 is near the first user 126 .
- the compass 116 is no longer sufficient to locate the first user 126 , because it points generally toward the first user 126 and the third user 134 . This is one situation where assigning a unique beaconing sequence can help locate the first user 126 .
- FIG. 9 illustrates a beaconing sequence that can be flashed to locate a person in the crowd as shown in FIGS. 7-8 .
- the first mobile device 74 includes a screen 124 that flashes a yellow-green beaconing sequence 142 .
- the second mobile device 76 includes a beaconing indicator 120 that flashes the yellow-green beaconing sequence 144 .
- the third mobile device 136 includes a screen 121 that flashes an orange-blue sequence 146 .
- a fourth user 138 holding a fourth mobile device 140 is flashing a red-off beaconing sequence 148 .
- the compass 116 can eliminate the fourth user 138 from consideration, because it points away from the fourth mobile device 140 as shown in FIG. 8 . However, the compass 116 points toward the first mobile device 74 and the third mobile device 136 .
- the second user 128 can distinguish the first user 126 from the third user 134 , however, because the orange-blue beaconing sequence 146 contrasts with the yellow-green beaconing sequence 144 on the beaconing indicator 120 .
- the beaconing sequences 142 , 144 can be synchronized to assure the second user 128 that it is the first user 126 .
- FIG. 10 illustrates a variety of beaconing sequences which can be assigned to a given mobile device by the first server 10 .
- a beaconing sequence is a combination of colors, timing, icons and text delivered in segments that will enable a user to locate another user of interest in a crowd.
- the beaconing sequence could simply be a single color such as yellow 150 that is displayed.
- the beaconing sequence could be a flashing color such as red 152 , multiple alternating colors such as yellow-green-yellow 154 , or multiple colors in a repeating sequence of segments such as red, white, and blue 156 .
- the beaconing sequence can be multiple colors displayed at the same time such as a segment of yellow/green 158 , or an icon (static or dynamic) on a background color(s) such as orange 160 , or an icon on alternating background colors such as red and gold 162 , or a non-repeating sequence of multiple colors 164 .
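One way to model such a sequence is as a list of timed color segments. The sketch below is a hypothetical representation (the `Segment` type, color names, and durations are illustrative assumptions); it also shows how two devices sharing a server-synchronized clock would render the same color at the same instant, as in the synchronized sequences of FIG. 9:

```python
from dataclasses import dataclass

@dataclass
class Segment:
    color: str   # e.g. "yellow", "green", or "off"
    millis: int  # how long this segment is displayed

# a repeating yellow-green sequence, like sequences 142/144 in FIG. 9
yellow_green = [Segment("yellow", 500), Segment("green", 500)]

def color_at(sequence, t_millis):
    """Color shown t_millis after the (server-synchronized) start time.

    Any two devices that agree on the start time and call this with the
    same synchronized clock will display the same color simultaneously.
    """
    period = sum(s.millis for s in sequence)
    t = t_millis % period  # the sequence repeats every `period` ms
    for s in sequence:
        if t < s.millis:
            return s.color
        t -= s.millis
    return sequence[-1].color
```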
- FIGS. 11-12 illustrate assigning beaconing sequences in a location cell that would be useful in a crowded environment such as Times Square. Let's assume the second user 128 is at the corner of W. 47th Street and Broadway looking for the first user 126 who is at W. 46th Street and 7th Avenue at about 10 PM on December 31st, just before the New Year celebration. A third user 134 is at 46th and Broadway and a fourth user 138 at W. 49th Street and Broadway. The third user 134 and fourth user 138 are using the application but are not in the same group as defined by the first user 126 which included only the second user 128 . The third user 134 with a mobile device 136 is assigned an orange-blue beaconing sequence 254 .
- the first server 10 assigns a different beaconing sequence such as the yellow-green sequence 250 to be displayed on the first mobile device 74 of the first user 126 .
- the fourth user 138 with mobile device 140 in a non-adjacent cell such as location cell 244 can be assigned a similar yellow-green beaconing sequence 256 .
- the first server 10 can extend the assignment of the unique beaconing sequences for each location cell beyond the location cell 240 of the first user 126 and the location cell 242 of the second user 128 to one or more surrounding location cells such as cell location 246 , which is non-adjacent to location cell 240 .
- FIG. 11 illustrates the location cells as hexagonal, but a location cell could be a variety of planar geometric shapes such as a square, a rectangle, a circle, or a triangle, or any shape that can be arranged in a grid, or a 3D geometric shape such as a cube, a cylinder, or a prism.
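Under the cell scheme of FIGS. 11-12, a beaconing sequence only needs to be unique within a cell and its adjacent cells; non-adjacent cells can reuse it. A sketch using axial coordinates for a hexagonal grid (an assumed representation; the patent only requires that adjacency between cells be determinable):

```python
# the six axial-coordinate neighbor offsets of a hexagonal cell
HEX_DIRS = [(1, 0), (1, -1), (0, -1), (-1, 0), (-1, 1), (0, 1)]

def neighbors(cell):
    q, r = cell
    return {(q + dq, r + dr) for dq, dr in HEX_DIRS}

def can_reuse(seq, cell, assignments):
    """A sequence may be reused in `cell` unless it is already assigned
    in that cell or an adjacent one (as with sequences 250 and 256)."""
    nearby = neighbors(cell) | {cell}
    return all(assignments.get(c) != seq for c in nearby)
```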
- FIGS. 13-14 illustrate assigning beaconing sequences by venue, which would be useful in adjacent venues such as an amusement park, a stadium, and a shared parking lot.
- the first user 126 with a mobile device 74 and the fourth user 138 with a mobile device 140 are both in the first venue 258 (e.g., the shared parking lot).
- the first server 10 has assigned a yellow-blue beaconing sequence 268 to the first mobile device 74 , and the flashing red beaconing sequence 272 to the fourth mobile device 140 .
- the first and fourth users 126 , 138 are both using the application, but are not in the same group.
- the fourth user 138 travels to a location 266 in a second venue 260 (e.g., a sports stadium).
- the fourth user 138 can continue using the previously assigned beaconing sequence.
- the first server 10 will assign the flashing red beaconing sequence 274 to the first mobile device 74 to avoid a conflict with the third user 134 , whose mobile device 136 was previously assigned the yellow-blue beaconing sequence 278 in the third venue 262 .
- the second user 128 can readily find the first user 126 using the same flashing red beaconing sequences 274 and 276 .
- FIG. 15 illustrates a flow diagram for the beaconing logic that runs on the server of FIGS. 1 and 3 .
- Hunter et al., Java Servlet Programming (2001) describes Java Servlet programming that can be used to implement the beaconing logic and is incorporated by reference herein.
- the second mobile device 76 will send a request (e.g., HTTP request) to the first server 10 .
- the first server 10 will read the request, which includes the clock value of the second mobile device 76 .
- the first server 10 will read the location of the first and second mobile devices 74 , 76 .
- the first server 10 will perform a time sync that reads its own clock and receives the time stamp of each of the first and second mobile devices 74 , 76 .
- the first mobile device 74 also performs a time sync (e.g., FIG. 3 , Start of App 125 ) that includes a clock value of the first mobile device 74 .
- the time sync can be performed multiple times and averaged. The time sync may be done once or periodically and will depend on accuracy of the mobile device clocks, and clock drift.
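The stored time difference could be estimated as the average offset over several timestamp pairs. A simplified sketch (a production system might instead use an NTP-style exchange that also accounts for network delay, which the patent does not specify):

```python
def clock_offset(samples):
    """Estimate the device-minus-server clock offset in seconds.

    Each sample pairs a device timestamp with the server clock value
    read when that request arrived; averaging several samples smooths
    out network jitter and clock read noise.
    """
    offsets = [device - server for device, server in samples]
    return sum(offsets) / len(offsets)
```

The server would add this offset to device timestamps so that both mobile devices can start and step through a beaconing sequence in lockstep.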
- the first server 10 will look up the beaconing sequence history table, which is stored in the database 24 . Each record in the table will contain: beaconing sequence, the geographic location, and the time of the sequence.
- the beacon logic 32 will execute a database query that filters on location and time so that only recent beaconing sequences from other users in the area will be in the result. If there are no recent beaconing sequences used by other users in the area, the beacon logic 32 will move to step 174 .
- the beacon logic 32 will perform a lookup of the first user's preference for a beaconing sequence.
- the beacon logic 32 will calculate the beaconing sequence from the preference of the first user 126 .
- the beacon logic 32 will move to step 178 .
- the beacon logic 32 will read the record of the beacon sequence in the beaconing sequence history table that was recently used in the area.
- the beacon logic 32 will add the beacon sequence to a sequence list that contains all the beacon sequences that cannot be used in the area by the first user 126 .
- the beacon logic 32 will determine if there is another record returned from the query of the database 24 . If yes, the beacon logic 32 will return to read the next beacon history record. This loop continues until all of the beacon sequence records are processed. If no, the beacon logic 32 will go to step 184 .
- the beacon logic 32 will calculate the beacon sequence that will be used by the first user 126 . In an embodiment, this calculation can optimize the difference between the color(s) in the beaconing sequence to be used by the first user 126 and those already in use.
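Steps 178-184 could be implemented by collecting the nearby in-use colors into an exclusion list and then choosing the candidate farthest from all of them. The palette and the RGB-distance metric below are illustrative assumptions; the patent only requires that the chosen sequence contrast with those already in use:

```python
# hypothetical palette; the patent does not fix a specific color model
PALETTE = {"red": (255, 0, 0), "yellow": (255, 255, 0),
           "green": (0, 255, 0), "blue": (0, 0, 255),
           "orange": (255, 165, 0), "white": (255, 255, 255)}

def pick_color(excluded):
    """Pick the available color whose nearest excluded color is as far
    away as possible in RGB space, maximizing visual contrast."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    candidates = [c for c in PALETTE if c not in excluded]
    if not excluded:
        return candidates[0]
    return max(candidates,
               key=lambda c: min(dist(PALETTE[c], PALETTE[e]) for e in excluded))
```

A full implementation would apply the same idea to whole sequences (colors plus timing) rather than single colors.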
- the beacon logic 32 adds the calculated beacon sequence to the beacon history table.
- the beacon logic 32 sends a response with the calculated beacon sequence, which is received by the first mobile device 74 at step 120 ( FIG. 3 ).
- FIG. 16 illustrates a flow diagram for the message logic that runs in the server of FIGS. 1 and 3 .
- the first user 126 will complete the fields shown in FIG. 2C .
- the screen 102 displays the fields: an event name 101 , a start time 103 , an end time 105 , and a list of guests 104 .
- the list of guests 104 is the email addresses of the guests.
- the first user 126 presses the send button 106 ( FIG. 2C ), which causes the first mobile device 74 to send a request 300 to the first server 10 .
- the request 300 includes the information that the user input into one or more of the fields shown in FIG. 2C .
- the message logic 28 reads the request at step 190 .
- the message logic 28 adds the request information as an entry into the Event table in the database 24 .
- the message logic 28 will read the identifying information (e.g., email address or cell phone number) of the next guest in the list 104 ( FIG. 2C ).
- the message logic 28 will perform a lookup of the user in the database 24 .
- the message logic 28 will check if the user has the application. If yes, the message logic 28 will add an entry in the user event table at step 202 .
- the message logic 28 will perform a lookup of the user's notification preferences (e.g., email, text message, push notification, automated phone call, or a social media post).
- the message logic 28 will send a notification using the preferred method(s) to the second mobile device 76 .
- the message logic 28 will send a message to add an entry 107 to the find things screen 108 ( FIG. 2D ) of the second mobile device 76 .
- the message logic 28 will check if another user is on the guest list.
- the message logic 28 will return to step 194 . If not, that is, it was the last user on the guest list, the message logic 28 will send a response 304 ( FIG. 3 ) to the first mobile device 74 at step 210 . As a result, a confirmation is received by the first mobile device 74 at step 211 .
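The guest-processing loop of steps 194-208 can be sketched as follows. This is one plausible reading of the flow; the dictionary shapes, field names, and the `notify` callback are hypothetical stand-ins for the database 24 and the notification services.

```python
def process_guest_list(event, users_db, notify):
    """Sketch of steps 194-208: for each guest on the list, look the
    user up in the database; guests with the app get a user-event
    table entry, and each guest is notified by preferred method(s)."""
    user_event_table = []
    for guest in event["guests"]:
        user = users_db.get(guest, {})   # step 196: lookup in database 24
        if user.get("has_app"):          # step 200: does the guest have the app?
            user_event_table.append({"user": guest, "event": event["name"]})
        for method in user.get("notify_prefs", ["email"]):  # steps 204-206
            notify(guest, method, event["name"])
    return user_event_table
```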
- FIG. 17 illustrates a flow diagram for the event logic that runs on the server of FIGS. 1 and 3 .
- the second user 128 will press the find things button 92 ( FIG. 2A ) on the second mobile device 76 , which sends a request 306 to complete the find things screen to the first server 10 .
- the first server 10 reads the request that includes the identity of the second user 128 .
- the event logic 30 will perform a lookup of all entries for the second user 128 in the user event table in the database 24 . The event logic 30 will read each entry from the user event table at step 216 .
- the event logic 30 will add the information from the entry to the find things list that will eventually be returned to the second mobile device 76 to create the find things screen 108 in FIG. 2D .
- the event logic 30 will read if another entry exists in the user event table. If yes, the event logic 30 will return to step 216 . If no, the event logic 30 will go to step 222 .
- the event logic 30 will prioritize the find things list at step 222 then send it in a response 308 ( FIG. 3 ) at step 228 to generate the screen 108 ( FIG. 2D ) with entries pertaining to the second user 128 on the second mobile device 76 .
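Steps 214-222 can be condensed into a short sketch. The flow does not specify the prioritization criterion, so sorting by start time (soonest first) is an assumed example, and the table layout is hypothetical.

```python
def build_find_things_list(user_id, user_event_table):
    """Sketch of steps 214-222: collect this user's entries from the
    user event table, then prioritize them before the response 308."""
    entries = [e for e in user_event_table if e["user"] == user_id]
    return sorted(entries, key=lambda e: e["start_time"])  # step 222: prioritize
```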
- FIGS. 18A-18F are a set of flow diagrams that illustrate the application logic that runs on the mobile device.
- FIG. 18A illustrates the application that runs on the mobile device.
- the application 64 includes communication interface 66 , the application logic 68 , and the user interface 70 .
- the application logic 68 will use the communication interface 66 to send a time stamp as described in FIG. 3 .
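The time-stamp exchange lets the server normalize device clocks, since FIG. 3 stores each device's time difference in the server's memory. A minimal sketch of that bookkeeping follows; the function names are assumptions.

```python
import time

def record_clock_offset(device_timestamp, server_clock=time.time):
    """Server-side sketch: store how far the device's clock is behind
    or ahead of the server's clock when the time stamp arrives."""
    return server_clock() - device_timestamp

def to_server_time(device_timestamp, offset):
    """Convert a later device-local time stamp to server time."""
    return device_timestamp + offset
```

This ignores network transit time; a real implementation would also account for request latency.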
- the application logic 68 uses the user interface 70 to draw the screen 88 shown in FIG. 2A .
- the application logic 68 determines if the user pressed the remember place button 89 shown in FIG. 2A . If yes, the application logic 68 exits to entry point B on FIG. 18B . If no, the application logic 68 goes to step 360 .
- the application logic 68 determines if the user pressed the share location button 90 shown in FIG. 2A . If yes, the application logic 68 exits to entry point C on FIG. 18C . If not, the application logic 68 goes to step 364 . At step 364 , the application logic 68 determines if the user pressed the find things button 92 shown in FIG. 2A . If yes, the application logic 68 exits to entry point D on FIG. 18D . If no, the application logic 68 goes to step 368 . At step 368 , the application logic 68 waits a time period (e.g., 10 ms) then goes to step 356 .
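Steps 356-368 amount to a polling loop over the home screen buttons. A minimal sketch, where the button names and the injectable `sleep` are assumptions made for testability:

```python
import time

def home_screen_loop(get_pressed, handlers, sleep=time.sleep, interval=0.01):
    """Sketch of steps 356-368: poll the home screen buttons in turn,
    dispatch to the matching feature (entry points B, C, or D), and
    otherwise wait ~10 ms before checking again."""
    while True:
        button = get_pressed()         # which button, if any, was pressed?
        if button in handlers:
            return handlers[button]()  # exit to the feature's entry point
        sleep(interval)                # step 368: wait, then re-poll
```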
- FIG. 18B illustrates the application that runs on the mobile device.
- the application logic 68 uses the user interface 70 to draw the screen 94 shown in FIG. 2B .
- the application logic 68 determines if the user pressed the save place button 98 shown in FIG. 2B . If yes, the application logic 68 goes to the remember place at step 374 .
- the application logic 68 stores the current location in the memory of the mobile device and optionally sends the location information to be stored on the first server 10 and preferably the database 24 .
- the application logic 68 exits to entry point A on FIG. 18A .
- the application logic 68 determines if the user pressed the add picture button 95 shown in FIG. 2B . If yes, the application logic 68 will show the camera at step 378 .
- the application logic 68 stores the picture (e.g., a landmark near the place to be remembered) in the memory of the mobile device and optionally sends the picture to be stored on the first server 10 and preferably the database 24 .
- the application logic 68 determines if the user pressed the add memo button 97 shown in FIG. 2B . If yes, the application logic 68 goes to record voice feature at step 384 .
- the application logic 68 stores the voice memo (e.g., a description about the place to be remembered) in the memory of the mobile device and optionally sends the recorded voice memo to be stored on the first server 10 and preferably the storage 24 .
- the application logic 68 determines if the user pressed the add text note button 99 shown in FIG. 2B . If yes, the application logic 68 uses the user interface 70 to show a keyboard at step 390 .
- the application logic 68 stores the text note (e.g., a text note about the place to be remembered) in the memory of the mobile device and optionally sends the text note to be stored on the first server 10 and preferably the database 24 .
- the application logic 68 will determine if the user pressed the back button 100 as shown in FIG. 2B at step 394 . If not, application logic 68 waits a time period (e.g., 10 ms) at step 396 . After steps 380 , 386 , and 392 , the application logic 68 waits at step 396 .
- FIG. 18C illustrates another feature of the application that runs on the mobile device.
- FIG. 18C begins by entering from point C on FIG. 18A .
- the application logic 68 uses the user interface 70 to draw the share location screen 102 shown in FIG. 2C .
- the input fields of share location screen 102 are initially empty (not shown).
- the user will enter values into the input fields as previously shown in FIG. 2C .
- the application logic 68 will go to entry point A on FIG. 18A to draw the home screen 88 shown in FIG. 2A . If not, the application logic 68 will determine if the user pressed the send button 106 as shown in FIG. 2C .
- the application logic 68 waits a time period (e.g., 10 ms) at step 408 . If yes, the application logic 68 will determine if the user input is valid at step 404 . If not, the application logic 68 will notify the user that the input is not valid at step 410 . If yes, the application logic 68 will send share location information to be stored on the first server 10 and preferably on the database 24 at step 406 . At step 412 , the server will send a response to the mobile device then the application logic 68 exits to entry point A on FIG. 18A .
- FIG. 18D illustrates another feature of the application that runs on the mobile device.
- FIG. 18D begins by entering from point D on FIG. 18A or FIG. 18E .
- the application logic 68 will send a request with the identity of the user to the first server 10 ( FIG. 1 ).
- the event logic 30 will use the user identity to look up all of the entries of that user in the user event table then prepare a find things list which will be returned in a response to the user's mobile device.
- the application logic 68 of the mobile device will parse the response at step 416 .
- the application logic 68 will draw the find things screen 108 in FIG. 2D .
- the application logic 68 will update the status text in each entry of the find things screen 108 .
- in the entry 107 (e.g., Alan@Concert), for example, the status text is "200 feet NW."
- the “Alan@Concert” will not change and thus will not be updated.
- the application logic 68 determines if the second user 128 pressed any of the find buttons such as button 110 , 112 , 117 , or 119 as shown in FIG. 2D . If yes, the application logic 68 will go to entry point E on FIG. 18E .
- the application logic 68 goes to step 428 to determine if the second user 128 switched any visibility button on or off ( FIG. 2D , button 111 is on, and button 113 is off).
- when the visibility switch 111 is "on", the second user 128 (e.g., Bob) is visible to the first user 126 (e.g., Alan), and when the switch 111 is "off", the second user 128 is not visible.
- the visibility switch could be referred to as a toggle button.
- the application logic 68 uses communication interface 66 to call the first server 10 to change the visibility settings for the second user 128 in database 24 .
- the application logic 68 will wait for a time period (e.g., 10 ms) then proceed to step 420 .
- FIG. 18E illustrates another feature of the application that runs on the mobile device.
- FIG. 18E begins by entering from point E on FIG. 18D .
- the application logic 68 uses the communication interface 66 to call the first server 10 to get user locations.
- the application logic 68 calculates the distance between users and updates the compass 116 on the screen 114 shown in FIG. 2E .
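One conventional way to compute the distance and compass direction between two users from their latitude/longitude fixes is the haversine distance plus the initial great-circle bearing. The patent does not specify a formula, so this is an assumed illustration of the calculation that could drive the compass 116.

```python
import math

def distance_and_bearing(lat1, lon1, lat2, lon2):
    """Great-circle distance (meters) and initial compass bearing
    (degrees clockwise from north) from point 1 to point 2."""
    r = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dlat = math.radians(lat2 - lat1)
    dlon = math.radians(lon2 - lon1)
    # Haversine formula for the distance between the two fixes
    a = math.sin(dlat / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlon / 2) ** 2
    distance = 2 * r * math.asin(math.sqrt(a))
    # Initial bearing, normalized to 0-360 degrees
    y = math.sin(dlon) * math.cos(p2)
    x = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dlon)
    bearing = (math.degrees(math.atan2(y, x)) + 360) % 360
    return distance, bearing
```

The on-screen arrow would then be the bearing minus the device's own heading from the orientation determination component 58.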
- the application logic 68 will determine if there is a beacon notification on the first mobile device 74 . If yes, the application logic 68 will read a beacon sequence at step 442 , and at step 448 draw the screen 122 shown in FIG. 2F .
- the application logic 68 updates a segment of the beaconing sequence, then waits until the end of the segment at step 454 , then determines if the beaconing sequence is finished at step 456 . If not, the application logic 68 will repeat the loop beginning at step 450 . If the beaconing sequence is finished, the application logic 68 will draw another screen (e.g., FIGS. 2A, 2D, or 2E ) or a message screen (e.g., text or voice) to allow the first user 126 to communicate and assist in finding the second user 128 at step 457 . At step 453 , the application logic 68 will wait for a time period (e.g., 10 ms) then proceed to step 436 .
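The segment loop of steps 450-456 can be sketched as below. Modeling a segment as a (color, duration) pair is an assumption, as is the injectable `sleep` used to keep the sketch testable.

```python
import time

def play_beacon_sequence(sequence, show_segment, sleep=time.sleep):
    """Sketch of steps 450-456: show each segment of the beaconing
    sequence for its duration, then report that the sequence finished."""
    for color, seconds in sequence:
        show_segment(color)  # step 450: update the beacon display
        sleep(seconds)       # step 454: wait until the end of the segment
    return "finished"        # step 456: last segment done
```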
- FIG. 18F illustrates an embodiment of the application that runs on the mobile device.
- FIG. 18F begins by entering from point F on FIG. 18E .
- the application logic 68 uses the communication interface 66 to call the first server 10 to get the beacon sequence for the beacon indicator 120 on the screen 114 as shown in FIG. 2E .
- the application logic 68 reads the beacon sequence and updates the beacon indicator 120 ( FIG. 2E ) at step 464 .
- the application logic 68 will wait until the end of the current segment of the beacon sequence then proceeds to step 468 .
- the application logic 68 will determine if it is the last segment of the beacon sequence, that is, the beacon sequence is finished. If yes, the application logic 68 will exit to entry point E on FIG. 18E .
Abstract
Methods, computer readable storage media, and systems for mobile devices to locate persons or places are described. In a feature, the invention is a method implemented in a server for providing beaconing sequences to the mobile devices for location sharing. In a feature, the invention is a server executing a method of locating a user using a beaconing mobile device. In a feature, the invention is a non-transitory computer readable medium on a server that encodes a program to execute a method on a first mobile device that determines directions and/or distance between the first mobile device and a second mobile device. In a feature, the invention is a server executing a method to remember a place on a mobile device.
Description
- This application is a continuation of U.S. application Ser. No. 16/658,274, filed on Oct. 21, 2019, which is a continuation of U.S. application Ser. No. 16/048,305, filed on Jul. 29, 2018, now U.S. Pat. No. 10,455,359, which is a continuation of U.S. application Ser. No. 15/096,258, filed on Apr. 11, 2016, now U.S. Pat. No. 10,057,719, which is a continuation of U.S. application Ser. No. 14/092,846, filed on Nov. 27, 2013, now U.S. Pat. No. 9,344,849, which are all incorporated by reference herein.
- This invention relates to methods and systems for locating persons and places using mobile devices.
- People attend events where they want to meet up with friends and acquaintances. If there is a large crowd at the event, it can be difficult for people to locate each other. Sometimes last minute changes in plans also prevent meeting up. Other times the problem is finding exactly where one parked the car after the event. Because many people carry mobile devices (e.g., cellphones), it would be useful to provide methods and systems for locating persons and places that could be implemented with mobile devices in these and other circumstances.
- This invention relates to systems and methods for locating persons and places with mobile devices. In a feature, the invention is a method implemented in a server for providing beaconing sequences for location sharing to mobile devices. In a feature, the invention is a server executing a method of locating a user using a beaconing mobile device. In another feature, the invention is a non-transitory computer readable medium on a server that encodes a program to execute a method on a first mobile device that determines directions and/or distance between the first mobile device and a second mobile device. In a feature, the invention is a server executing a method to remember a place on a mobile device. In still another feature, the invention is a system for providing beaconing sequences for location sharing to mobile devices.
- FIG. 1 is a network diagram that illustrates N mobile devices communicating with N servers.
- FIGS. 2A-2F illustrate the user interfaces of a mobile device.
- FIG. 3 illustrates user interactions and communications among the mobile devices and the servers.
- FIG. 4 illustrates a second mobile device user locating a first mobile device user in a crowd using the compass feature.
- FIG. 5 is a perspective view of a second mobile device user locating a first mobile device user in a crowd using a compass.
- FIG. 6 is a perspective view of a second mobile device user locating a first mobile device user in a crowd using the compass on a smart watch or an electronic wearable.
- FIG. 7 illustrates a second mobile device user locating a first mobile device user by using a beaconing mobile device in a crowd with other users.
- FIG. 8 is a perspective view of a second mobile device user locating a first mobile device user in a crowd by using the beaconing mobile device.
- FIG. 9 illustrates beaconing sequences used in FIGS. 7-8.
- FIG. 10 illustrates a variety of beaconing sequences.
- FIG. 11 illustrates assigning beaconing sequences by location cells.
- FIG. 12 illustrates some beaconing sequences that could be used in location cells.
- FIG. 13 illustrates assigning beaconing sequences by venue.
- FIG. 14 illustrates examples of beaconing sequences that could be used in FIG. 13.
- FIG. 15 illustrates a flow diagram for the beaconing logic that runs on the server of FIGS. 1 and 3.
- FIG. 16 illustrates a flow diagram for the message logic that runs on the server of FIGS. 1 and 3.
- FIG. 17 illustrates a flow diagram for the event logic that runs on the server of FIGS. 1 and 3.
- FIGS. 18A-18F are a set of flow diagrams that illustrate the application logic that runs on the mobile device and interacts with the server.
- The following description includes the best mode of carrying out the invention. The detailed description illustrates the principles of the invention and should not be taken in a limiting sense. The scope of the invention is determined by reference to the claims. Each part (or step) is assigned its own part (or step) number throughout the specification and drawings. Because some flow diagrams don't fit on a single drawing sheet, we use capital letters (e.g., "A") to show how the flow diagrams connect (e.g., "A" connects the flowcharts of FIGS. 18A-18D). -
FIG. 1 is a network diagram that illustrates N mobile devices communicating with N servers. The network diagram illustrates a system that includes the first mobile device 74, the second mobile device 76, and the third mobile device 80 up to the Nth mobile device 84. The communication links connect the first mobile device 74 through the Nth mobile device 84 to the computer network 40 that connects to a first server 10 through a communication link 36, a second server 26 through a communication link 22, and an Nth server 38 through a communication link 39. - We now discuss the
first server 10 and the second server 26 to illustrate the software and hardware components that can be used to implement the invention. - The
first server 10 illustrates the software components. The first server 10 includes a web server 34 that will listen for requests and pass them along to an email server 35, beacon logic 32 ( FIG. 15 ), message logic 28 ( FIG. 16 ), and event logic 30 ( FIG. 17 ) that the first server 10 executes during operation of the system. A backplane 20 links the first server 10, a database and storage 24, the second server 26, and the Nth server 38. - The
second server 26 has the same software components (not shown) and illustrates the hardware for the first server 10 through the Nth server 38. It includes a CPU-memory bus 14 that communicates with a processor 12. The second server 26 includes memory 16 coupled to the processor 12 and interfaces 18 that connect through a communication link 22 to the network 40. - Data is defined as including user data, instructions, and metadata. Inputting data is defined as the input of parameters and data from user input, computer memory, and/or storage device(s).
- A processor could be any suitable general purpose processor running software. For example, the processor could be one or more multicore processors made by Intel or AMD, or licensed by ARM. In another example, we could use a low cost single board computer processor such as the Raspberry Pi. The arrangement and type of the processors are not essential to the invention.
- Hennessy and Patterson, Computer Architecture: A Quantitative Approach (2006), and Patterson and Hennessy, Computer Organization and Design: The Hardware/Software Interface (2007) describe computer hardware and software, storage systems, and networks and are incorporated by reference.
- Each server may run an operating system such as Linux, UNIX, a Windows OS, or another suitable operating system. Tanenbaum, Modern Operating Systems (2008) describes operating systems in detail and is hereby incorporated by reference. Bovet and Cesati, Understanding the Linux Kernel (2005), and Bach, Design of the Unix Operating System (1986) describe operating systems in detail and are incorporated by reference herein.
- In an embodiment, each server could be implemented on a virtual machine hosted by VMware, Hyper V, or open source software Xen. Lowe et al. Mastering VMware vSphere 5.5 (2013) describes the VMware virtualization software in detail and is incorporated by reference herein. Matthews et al., Running Xen: A Hands-On Guide to the Art of Virtualization (2008) describes the free open source Xen virtualization software in detail and is incorporated by reference herein.
- In a typical environment, the server will be implemented by hundreds or even thousands of computers in a data center such as Amazon Web Services, Google Compute Engine, Microsoft Azure, or Rackspace. It is not essential to the invention that a particular data center be used. Murty, Programming Amazon Web Services: S3, EC2, SQS, FPS, and SimpleDB (2008) describes the Amazon Web Services in detail and is incorporated by reference herein. Sanderson, Programming Google App Engine (2012) describes the Google App Engine in detail and is incorporated by reference herein.
- In an alternative embodiment, the server will be implemented using low cost single board computers such as the Raspberry Pi that run locally in the geographic area and/or near the mobile devices and communicate with the mobile devices using, for example, network protocols such as Bluetooth (e.g., Bluetooth low energy), Wi-Fi, or TCP/IP. Halfacree, Raspberry Pi User Guide (2012) describes this single board computer in detail and is incorporated by reference herein.
- The database and storage 24 stores the user and event information and communicates with the first server 10, the second server 26, and the Nth server 38. A non-transitory computer-readable medium (e.g., storage device, DVD, USB storage device) can be used to encode the software program instructions described in the methods below. Rockoff, The Language of SQL: How to Access Data in Relational Databases (2010) describes SQL databases in detail and is incorporated by reference herein. Redmond et al., Seven Databases in Seven Weeks (2012) describes non-SQL databases in detail and is incorporated by reference herein. - A first mobile device 74 (e.g., a cell phone, a tablet, a smart watch, electronic wearable clothing, or glasses) includes an
application 64, including a communication interface 66, application logic 68, and a user interface 70. - The
communication interface 66 includes a conventional network interface that enables the first mobile device 74 to communicate through a link 72 to a computer network 40, which includes the cellular phone network (e.g., AT&T, Verizon, Sprint, NTT DoCoMo, Orange, and China Mobile) and/or the Internet. Tanenbaum, Computer Networks (2010) describes computer networks in detail and is incorporated by reference herein. - The
user interface 70 defines a set of graphical user interfaces that will be displayed on the screen of the mobile device 74 as shown in FIGS. 2A-2F . - The
application logic 68 defines the state of the application (e.g., what screen is being displayed and the mobile device beacon state), stores inputs such as locations of mobile devices obtained from the servers of the computer network 40, and calculates the relative distances and directions of the mobile devices. - The
application logic 68 can be written in a variety of languages. For example, if the mobile device is an Apple device, the language would be Objective-C. Kochan, Programming in Objective-C (5th Edition) (Developer's Library) (2012) describes Objective-C in detail and is incorporated by reference herein. For example, if the mobile device is an Android device, the language could be Java. Mednieks, Programming Android: Java Programming for the New Generation of Mobile Devices (2012) describes Android programming in detail and is incorporated by reference herein. - The first
mobile device 74 has a location determination component 52 that determines the location of the first mobile device 74 by using a variety of sources. For example, a global positioning system (GPS) 42 and cell tower 44 are suitable as long range sources. For shorter range location determination, Wi-Fi 46, Bluetooth 48, including Bluetooth low energy (LE), and an RFID protocol such as near field communication (NFC) 50 are suitable. Tanenbaum, Computer Networks (2010) describes these protocols in detail and is incorporated by reference herein. - The first
mobile device 74 has an orientation determination component 58 that determines the orientation of the first mobile device 74 with respect to magnetic north and with respect to the earth's surface by using a variety of sources. For example, many mobile devices contain an accelerometer 54, a compass 56, and gyroscopes (not shown), which function as suitable sources of orientation. - The first
mobile device 74 optionally has a map component 62 that enables the first mobile device 74 to communicate over the Internet with a map provider such as Google maps, Apple maps, Nokia's HERE maps, and OpenStreetMap. It is not essential to the invention which map provider is used. -
FIGS. 2A-2F illustrate the details of the user interfaces of a mobile device. -
FIG. 2A illustrates that the first mobile device 74 includes a home screen 88 that displays a set of user selectable buttons: a remember place button 89, a share location button 90, and a find things button 92. - To illustrate the remember place feature, assume a first user named Alan is using the first
mobile device 74 to find his car in a parking lot. If Alan selects the remember place button 89, the application logic 68 uses the user interface 70 of the first mobile device 74 to display a remember place screen 94 shown in FIG. 2B . - As shown in
FIG. 2B , the screen 94 displays a field 96 that contains the name of the item whose location a user wants to remember. For example, the field 96 is a drop-down menu that holds a value for Alan's car, "My Car", that is associated with location information in the parking lot. - To increase the types of items that can be identified, the
field 96 can be an input field for names entered by the user. After entering the value either by menu or text field, the user can save that value by clicking on a save place button 98. The user can tag the item with other information beside or in lieu of the location information by selecting the "add picture" button 95, the "add voice memo" button 97, and/or the "add text note" button 99, which will call up, respectively, the camera, voice memo, and/or text feature of the first mobile device 74. Thus, for Alan to find the place of an item (e.g., where he parked his car or bike, where he left a package, or his favorite store or restaurant), he can use the remember place feature to store coordinates and/or other location related information. - Referring to
FIG. 2A , if Alan wants to share his location with others at an event such as a concert, he will select the share location button 90 of the home screen. Now the application logic 68 of the first mobile device 74 will display the share location screen 102 on the first mobile device 74 as shown in FIG. 2C . - As shown in
FIG. 2C , Alan wants to share his location from 6-11 PM with Bob, Sarah, and Andrew at an event such as a concert. Alan adds values in the input fields such as "Concert" in the event field 101, "6 PM" in the start time field 103, "11 PM" in the end time field 105, and the email addresses of Bob, Sarah, and Andrew in a guest field 104. Alan will then press the send button 106. The email addresses can be retrieved from the contact list of the first mobile device 74 or from Internet contact lists, e.g., LinkedIn, Facebook, and Yahoo contacts. On pressing the send button 106, the application logic 68 will use the communication interface 66 to send a message to the first server 10 ( FIG. 1 ) that will in turn send a message to a second mobile device 76. - Let's now assume a second user "Bob" uses the second
mobile device 76 that receives the message. Bob will see a new entry "Alan@Concert" on the list of the find things screen 108 as shown in FIG. 2D . Before 6 PM, the entry appears, but no distance will be indicated, because the time for location sharing has not begun. However, at 6 PM, the find things screen 108 will add Alan's current distance and direction from Bob, e.g., Alan is 200 feet NW of Bob. The distance and direction update periodically to track the locations of the first and second mobile devices. After the end time, the Alan@Concert entry 107 may be deleted from the second mobile device 76 since the time of location sharing is over. - The find things screen 108 optionally includes an on-off
slider 111 to make the second mobile device 76 visible to the first mobile device 74, or make it visible to other mobile devices on the guest list, or invisible to everybody. -
FIG. 2E illustrates that the display screen 114 of the second mobile device 76 includes a compass 116, a user selectable request to wave button 118, and a beacon indicator 120. As shown, the compass 116 displays that the first user Alan is at the event Concert, 200 feet away in the direction of the compass arrow. Optionally, the compass 116 may indicate north or the NWSE directions. -
FIG. 2F illustrates that the display screen 122 of the first mobile device 74 includes a beaconing display 124 that will be used to find the first user, Alan, as will be described in detail in FIGS. 3-8 . -
FIG. 3 illustrates interactions and communications among the mobile devices and a server. Initially, the first user 126 (e.g., Alan) of the first mobile device 74 will press a button Start App 125 that will send an HTTP request to compare the clock of the first mobile device 74 with the clock of the first server 10. That time difference will be stored in the memory of the first server 10, which will be used by the application later. The first mobile device 74 will display the home screen 88 as shown in FIG. 2A . The first user 126 will press the share location button 90. The first mobile device 74 will display the screen 102 as shown in FIG. 2C . The first user 126 will complete the fields and press the send button 106 as illustrated in FIG. 2C . The first mobile device 74 sends a message 300 to the message logic 28 ( FIGS. 1 and 16 ) of the first server 10, which will send a notification message 302 to the second mobile device 76 to notify the second user 128 (e.g., Bob) that a new item 107 (e.g., Alan@Concert) was added to the find things screen 108 of the second mobile device 76. The first server 10 also sends a response 304 to the request 300 to confirm the notification 127 (e.g., email, text, or push notification) regarding the event (e.g., Concert) was sent to each guest on the guest list 104. - Next, the second user 128 (e.g., Bob) of the second
mobile device 76 will start the app 129, which will send an HTTP request to compare the clock of the second mobile device 76 with the clock of the first server 10. That time difference will be stored in the memory of the first server 10, which will be used by the application later. The second mobile device 76 will display the home screen 88 as shown in FIG. 2A . The second user 128 will press the find things button 92. The second mobile device 76 will send a request 306 to the event logic 30 ( FIGS. 1 and 17 ) of the first server 10. The first server 10 reads the entries ( FIG. 1 ) and sends a response 308 to the second mobile device 76. The response 308 includes a list of the entries as illustrated in the find things screen 108 of the second mobile device 76 as shown in FIG. 2D . - The first and second
mobile devices 74 and 76 send their locations to the first server 10, which keeps track of their respective locations. - The
second user 128 of the second mobile device 76 will press the find button 110. The second mobile device 76 will display the compass screen 114 as shown in FIG. 2E . The second user 128 of the second mobile device 76 will then press the request wave button 118. The second mobile device 76 will send a request 310 to the beacon logic 32 ( FIGS. 1 and 15 ) of the first server 10. The first server 10 will read a list of users in the geographic area of interest and each of their beaconing sequences to identify what beaconing sequence is available for assignment as illustrated in FIG. 15 . The first server 10 sends a message 312 with the assigned beaconing sequence 142 ( FIG. 9 ) to the wave screen 122 of the first mobile device 74 as illustrated in FIG. 2F . Referring to FIG. 3 , the first server 10 sends a response 314 with the assigned beaconing sequence 144 ( FIG. 9 ) to the second mobile device 76. A request to wave will trigger a beacon sequence on another mobile device so it is conspicuous and can be seen. - The second user 128 (e.g., Bob) now uses the
compass 116 to get the distance (e.g., 200 feet) and direction of the first user 126 (e.g., Alan). In addition, the second user 128 can use the beaconing sequence 144 on the beaconing indicator 120 of his second mobile device 76 to identify the beaconing sequence 142 being displayed on the first mobile device 74. Now the second user 128 will be able to spot the first user 126 as long as the first mobile device 74 is visible to the second user 128. -
FIG. 4 illustrates a crowd gathered at an event such as a concert with a stage 132. Let's assume the crowd 130 prevents or makes it difficult for a first user 126 (e.g., Alan) and a second user 128 (e.g., Bob) to locate each other. The first mobile device 74 held by the first user 126 beacons, which is seen by the second user 128 holding the second mobile device 76. -
FIG. 5 is a perspective view of FIG. 4 that illustrates how the first user is found by the second user. The first user 126 holds the beaconing first mobile device 74 (e.g., cell phone) apart from the crowd. The second user 128 holds a second mobile device 76 (e.g., cell phone) displaying a compass 116 that points to the first user 126 and includes a request to wave button 118 that was pressed to trigger beaconing on the first mobile device 74 and a beacon indicator 120. -
FIG. 6 is a perspective view of FIG. 4 that illustrates another context where a first user is found by a second user. The first user 126 holds the beaconing first mobile device 74 (e.g., a cell phone or tablet) apart from the crowd. The second user 128 holds or wears a second mobile device 77 (e.g., a smart watch or electronic wearable) displaying a compass 116 that points to the first user 126 and includes a request to wave button 118 that was pressed to trigger beaconing on the first mobile device 74 and a beacon indicator 120. -
FIG. 7 illustrates a crowd gathered at an event such as a concert with a stage 132. Let's assume the crowd 130 prevents or makes it difficult for a first user 126 (e.g., Alan) and a second user 128 (e.g., Bob) to locate each other. In addition, unrelated persons at the event are using our method of location. Thus, the first user 126 is holding a first mobile device 74, a third user 134 is holding a third mobile device 136, and a fourth user 138 is holding a fourth mobile device 140, all of which are beaconing. -
FIG. 8 is a perspective view of FIG. 7 that illustrates how our method would enable the second user to find the first user despite the presence of other beaconing mobile devices. The first user 126 holds the beaconing first mobile device 74 (e.g., a cell phone) apart from the crowd 130. The second user 128 holds a second mobile device 76 (e.g., a cell phone) displaying a compass 116 that points to the first user 126 and includes a request to wave button 118, which was pressed to trigger beaconing on the first mobile device 74, and a beacon indicator 120. This time, however, the third user 134 is near the first user 126. The compass 116 is no longer sufficient to locate the first user 126, because it points generally toward both the first user 126 and the third user 134. This is one situation where assigning a unique beaconing sequence can help locate the first user 126. -
FIG. 9 illustrates a beaconing sequence that can be flashed to locate a person in the crowd as shown in FIGS. 7-8. The first mobile device 74 includes a screen 124 that flashes a yellow-green beaconing sequence 142. The second mobile device 76 includes a beaconing indicator 120 that flashes the yellow-green beaconing sequence 144. The third mobile device 136 includes a screen 121 that flashes an orange-blue sequence 146. A fourth user 138 holding a fourth mobile device 140 is flashing a red-off beaconing sequence 148. - The compass 116 can eliminate the fourth user 138 from consideration, because it points away from the fourth mobile device 140 as shown in FIG. 8. However, the compass 116 points toward both the first mobile device 74 and the third mobile device 136. The second user 128 can distinguish the first user 126 from the third user 134, however, because the orange-blue beaconing sequence 146 contrasts with the yellow-green beaconing sequence 144 on the beaconing indicator 120. The matching beaconing sequences 142, 144 confirm to the second user 128 that it is the first user 126. -
FIG. 10 illustrates a variety of beaconing sequences which can be assigned to a given mobile device by the first server 10. As shown, a beaconing sequence is a combination of colors, timing, icons, and text delivered in segments that will enable a user to locate another user of interest in a crowd. For example, the beaconing sequence could simply be a single color such as yellow 150 that is displayed. To increase visibility, the beaconing sequence could be a flashing color such as red 152, multiple alternating colors such as yellow-green-yellow 154, or multiple colors in a repeating sequence of segments such as red, white, and blue 156. Further, the beaconing sequence can be multiple colors displayed at the same time such as a segment of yellow/green 158, an icon (static or dynamic) on a background color such as orange 160, an icon on alternating background colors such as red and gold 162, or a non-repeating sequence of multiple colors 164. -
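The segmented sequences of FIG. 10 suggest a simple data representation: an ordered list of segments, each pairing one or more simultaneous colors (or an icon) with a duration. This is an illustrative sketch, not the patent's own encoding:

```python
from dataclasses import dataclass

@dataclass
class Segment:
    colors: tuple      # one or more colors shown at once, e.g. ("yellow", "green")
    duration_ms: int   # how long this segment stays on screen

# Hypothetical examples mirroring FIG. 10: a steady color, a flashing
# color (color alternating with "off"), and a repeating multi-color run.
steady_yellow = [Segment(("yellow",), 1000)]
flashing_red = [Segment(("red",), 500), Segment(("off",), 500)]
red_white_blue = [Segment(("red",), 400), Segment(("white",), 400), Segment(("blue",), 400)]

def total_period_ms(sequence):
    """Length of one full pass through the sequence, used to keep two
    devices' displays in step."""
    return sum(seg.duration_ms for seg in sequence)
```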
FIGS. 11-12 illustrate assigning beaconing sequences in a location cell, which would be useful in a crowded environment such as Times Square. Let's assume the second user 128 is at the corner of W. 47th Street and Broadway looking for the first user 126, who is at W. 46th Street and 7th Avenue at about 10 PM on December 31st, just before the New Year celebration. A third user 134 is at 46th and Broadway and a fourth user 138 is at W. 49th Street and Broadway. The third user 134 and fourth user 138 are using the application but are not in the same group as defined by the first user 126, which included only the second user 128. The third user 134 with a mobile device 136 is assigned an orange-blue beaconing sequence 254. Since the first user 126 and the third user 134 are in the same location cell 240, to avoid a conflict the first server 10 assigns a different beaconing sequence, such as the yellow-green sequence 250, to be displayed on the first mobile device 74 of the first user 126. In contrast, the fourth user 138 with mobile device 140, in a non-adjacent cell such as location cell 244, can be assigned a similar yellow-green beaconing sequence 256. The first server 10 can extend the assignment of unique beaconing sequences for each location cell beyond the location cell 240 of the first user 126 and the location cell 242 of the second user 128 to one or more surrounding location cells such as location cell 246, which is non-adjacent to location cell 240. FIG. 11 illustrates the location cells as hexagonal, but a cell could be any of a variety of planar geometric shapes, such as a square, a rectangle, a circle, or a triangle, or any shape that can be arranged in a grid, or a geometric 3D shape such as a cube, a cylinder, or a prism. -
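One way to realize this per-cell conflict avoidance is to quantize coordinates into grid cells and exclude any sequence already in use in the requesting user's cell or its neighbors. This sketch uses square cells and an assumed cell size for brevity (the figures show hexagons, and the palette is hypothetical):

```python
CELL_SIZE_DEG = 0.001  # roughly a city block; an assumed granularity

def cell_of(lat, lon):
    """Map a coordinate to a square grid cell."""
    return (int(lat // CELL_SIZE_DEG), int(lon // CELL_SIZE_DEG))

def neighborhood(cell):
    """The cell plus its 8 adjacent cells."""
    r, c = cell
    return {(r + dr, c + dc) for dr in (-1, 0, 1) for dc in (-1, 0, 1)}

def assign_sequence(lat, lon, in_use, palette):
    """Pick the first sequence in the palette not already used in the
    requesting user's cell or any adjacent cell. `in_use` maps
    cell -> set of sequence names (a stand-in for the history table)."""
    nearby = neighborhood(cell_of(lat, lon))
    taken = set().union(*(in_use.get(c, set()) for c in nearby))
    for seq in palette:
        if seq not in taken:
            return seq
    return None  # palette exhausted in this neighborhood
```

A user in a distant, non-adjacent cell can legitimately receive the same sequence, matching the reuse of yellow-green in cell 244.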
FIGS. 13-14 illustrate assigning beaconing sequences by venue, which would be useful in adjacent venues such as an amusement park, a stadium, and a shared parking lot. Initially, the first user 126 with a mobile device 74 and the fourth user 138 with a mobile device 140 are both in the first venue 258 (e.g., the shared parking lot). In the first venue 258, the first server 10 has assigned a yellow-blue beaconing sequence 268 to the first mobile device 74, and the flashing red beaconing sequence 272 to the fourth mobile device 140. The first and fourth users 126, 138 are not in the same group. Next, the fourth user 138 travels to a location 266 in a second venue 260 (e.g., a sports stadium). Since no other user is assigned the flashing red beaconing sequence 272 in the second venue 260, the fourth user 138 can continue using it. In contrast, when the first user 126 travels to a location 264 in a third venue 262 (e.g., an amusement park), the first server 10 will assign the flashing red beaconing sequence 274 to the first mobile device 74 to avoid a conflict with the third user 134, whose mobile device 136 was previously assigned the yellow-blue beaconing sequence 278 in the third venue 262. Now the second user 128 can readily find the first user 126 using the flashing red beaconing sequence 274. -
FIG. 15 illustrates a flow diagram for the beaconing logic that runs on the server of FIGS. 1 and 3. Hunter et al., Java Servlet Programming (2001) describes Java Servlet programming that can be used to implement the beaconing logic and is incorporated by reference herein. As shown in FIG. 15, the beacon logic 32 (FIG. 1) that runs in the first server 10 begins when the second user 128 presses the wave request button 118 on the screen 114 of the second mobile device 76 (FIG. 2E). The second mobile device 76 will send a request (e.g., an HTTP request) to the first server 10. At step 164, the first server 10 will read the request, which includes the clock value of the second mobile device 76. At step 168, the first server 10 will read the location of the first and second mobile devices 74, 76. The first server 10 will perform a time sync that reads its own clock and receives the time stamp of each of the first and second mobile devices 74, 76. The first mobile device 74 also performs a time sync (e.g., FIG. 3, Start of App 125) that includes a clock value of the first mobile device 74. In an embodiment, the time sync can be performed multiple times and averaged. The time sync may be done once or periodically and will depend on the accuracy of the mobile device clocks and clock drift. Any difference between the clocks is stored in a memory accessible to the first server 10 and represents an offset that can be used to ensure the beaconing sequences of the first mobile device 74 and the second mobile device 76 are on the same clock. At step 170, the first server 10 will look up the beaconing sequence history table, which is stored in the database 24. Each record in the table will contain: the beaconing sequence, the geographic location, and the time of the sequence. At step 172, the beacon logic 32 will execute a database query that filters on location and time so that only recent beaconing sequences from other users in the area will be in the result.
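The time sync described above reduces to a small offset computation. This sketch assumes each sync exchange yields a (server clock, device clock) pair and averages several exchanges, as the embodiment suggests; the function names are illustrative:

```python
def clock_offset(samples):
    """Estimate a device's clock offset from the server clock.

    Each sample is (server_time_ms, device_time_ms) captured in one
    request/response exchange. Averaging over samples smooths network
    jitter. A positive result means the device clock runs behind the
    server clock."""
    if not samples:
        raise ValueError("need at least one time sample")
    return sum(s - d for s, d in samples) / len(samples)

def to_server_time(device_time_ms, offset_ms):
    """Translate a device timestamp onto the server clock so two devices
    can start a beaconing segment at the same instant."""
    return device_time_ms + offset_ms
```

Storing one offset per device lets the server schedule the same segment boundaries on both screens despite differing local clocks.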
If there are no recent beaconing sequences used by other users in the area, the beacon logic 32 will move to step 174. At step 174, the beacon logic 32 will perform a lookup of the first user's preference for a beaconing sequence. At step 176, the beacon logic 32 will calculate the beaconing sequence from the preference of the first user 126. - If there are recent beaconing sequences used by other users in the area at step 172, the beacon logic 32 will move to step 178. At step 178, the beacon logic 32 will read the record of the beacon sequence in the beaconing sequence history table that was recently used in the area. At step 180, the beacon logic 32 will add the beacon sequence to a sequence list that contains all the beacon sequences that cannot be used in the area by the first user 126. At step 182, the beacon logic 32 will determine if there is another record returned from the query of the database 24. If yes, the beacon logic 32 will return to read the next beacon history record. This loop continues until all of the beacon sequence records are processed. If no, the beacon logic 32 will go to step 184. At step 184, the beacon logic 32 will calculate the beacon sequence that will be used by the first user 126. In an embodiment, this calculation can optimize the difference between the color(s) in the beaconing sequence to be used by the first user 126 and those already in use. At step 186, the beacon logic 32 adds the calculated beacon sequence to the beacon history table. At step 188, the beacon logic 32 sends a response with the calculated beacon sequence, which is received by the first mobile device 74 at step 120 (FIG. 3). -
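The optimization at step 184 is not specified in detail; one hedged sketch is to score each candidate color by its minimum RGB distance to the colors already in the exclusion list and pick the most distinct. The palette and RGB encoding here are assumptions:

```python
# Hypothetical named palette; the patent does not fix a color encoding.
PALETTE = {
    "red": (255, 0, 0), "green": (0, 255, 0), "blue": (0, 0, 255),
    "yellow": (255, 255, 0), "orange": (255, 165, 0), "white": (255, 255, 255),
}

def color_distance(a, b):
    """Euclidean distance between two RGB triples."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def most_distinct_color(in_use):
    """Choose the unused palette color whose minimum distance to every
    color already in use nearby is largest (maximin choice)."""
    candidates = [c for c in PALETTE if c not in in_use]
    return max(candidates,
               key=lambda c: min(color_distance(PALETTE[c], PALETTE[u]) for u in in_use))
```

The same scoring extends to multi-segment sequences by comparing segment lists element-wise.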
FIG. 16 illustrates a flow diagram for the message logic that runs in the server of FIGS. 1 and 3. The first user 126 will complete the fields shown in FIG. 2C. As shown in FIG. 2C, the screen 102 displays the fields: an event name 101, a start time 103, an end time 105, and a list of guests 104. In an embodiment, the list of guests 104 is the email addresses of the guests. - Referring to FIG. 16, at step 106, the first user 126 presses the send button 106 (FIG. 2C), which causes the first mobile device 74 to send a request 300 to the first server 10. The request 300 includes the information that the user input into one or more of the fields shown in FIG. 2C. The message logic 28 reads the request at step 190. At step 192, the message logic 28 adds the request information as an entry into the Event table in the database 24. At step 194, the message logic 28 will read the identifying information (e.g., email address or cell phone number) of the next guest in the list 104 (FIG. 2C). At step 196, the message logic 28 will perform a lookup of the user in the database 24. At step 198, the message logic 28 will check if the user has the application. If yes, the message logic 28 will add an entry in the user event table at step 202. Next, at step 204, the message logic 28 will perform a lookup of the user's notification preferences (e.g., email, text message, push notification, automated phone call, or a social media post). At step 206, the message logic 28 will send a notification using the preferred method(s) to the second mobile device 76. In addition, the message logic 28 will send a message to add an entry 107 to the find things screen 108 (FIG. 2D) of the second mobile device 76. Next, at step 208, the message logic 28 will check if another user is on the guest list. If yes, the message logic 28 will return to step 194. If not, that is, if it was the last user on the guest list, the message logic 28 will send a response 304 (FIG. 3) to the first mobile device 74 at step 210. As a result, a confirmation is received by the first mobile device 74 at step 211. -
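Steps 194-208 amount to a loop over the guest list. A minimal sketch, where `users_db` and the `send` callback are assumed stand-ins for the database 24 and the notification channels (the field names are hypothetical):

```python
def notify_guests(event, guests, users_db, send):
    """Walk the guest list: for each guest who has the application,
    dispatch a notification over each of the guest's preferred channels.
    `send(channel, address, message)` is an assumed transport callback;
    `users_db` maps a guest's email to {'has_app': bool, 'channels': [...]}.
    Returns the guests who were notified."""
    notified = []
    for email in guests:
        user = users_db.get(email)
        if not user or not user["has_app"]:
            continue  # the flow in FIG. 16 only notifies app users here
        for channel in user["channels"]:
            send(channel, email, "New event: " + event)
        notified.append(email)
    return notified
```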
FIG. 17 illustrates a flow diagram for the event logic that runs on the server of FIGS. 1 and 3. At step 92, the second user 128 will press the find things button 92 (FIG. 2A) on the second mobile device 76, which sends a request 306 to complete the find things screen to the first server 10. At step 212, the first server 10 reads the request, which includes the identity of the second user 128. At step 214, the event logic 30 will perform a lookup of all entries for the second user 128 in the user event table in the database 24. The event logic 30 will read each entry from the user event table at step 216. At step 218, the event logic 30 will add the information from the entry to the find things list that will eventually be returned to the second mobile device 76 to create the find things screen 108 in FIG. 2D. At step 220, the event logic 30 will read whether another entry exists in the user event table. If yes, the event logic 30 will return to step 216. If no, the event logic 30 will go to step 222. The event logic 30 will prioritize the find things list at step 222, then send it in a response 308 (FIG. 3) at step 228 to generate the screen 108 (FIG. 2D) with entries pertaining to the second user 128 on the second mobile device 76. -
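The prioritization at step 222 is left open by the description. One plausible ordering, offered purely as an assumption, is upcoming events before past ones, nearest start time first:

```python
def prioritize(entries, now):
    """Order find-things entries before returning them to the device.
    Assumed rule: upcoming events sorted soonest-first, then past events
    most-recent-first. Each entry is a dict with a 'start' timestamp."""
    upcoming = sorted((e for e in entries if e["start"] >= now),
                      key=lambda e: e["start"])
    past = sorted((e for e in entries if e["start"] < now),
                  key=lambda e: e["start"], reverse=True)
    return upcoming + past
```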
FIGS. 18A-18F are a set of flow diagrams that illustrate the application logic that runs on the mobile device. -
FIG. 18A illustrates the application that runs on the mobile device. The application 64 includes the communication interface 66, the application logic 68, and the user interface 70. At step 352, the application logic 68 will use the communication interface 66 to send a time stamp as described in FIG. 3. At step 354, the application logic 68 uses the user interface 70 to draw the screen 88 shown in FIG. 2A. At step 356, the application logic 68 determines if the user pressed the remember place button 89 shown in FIG. 2A. If yes, the application logic 68 exits to entry point B on FIG. 18B. If no, the application logic 68 goes to step 360. At step 360, the application logic 68 determines if the user pressed the share location button 90 shown in FIG. 2A. If yes, the application logic 68 exits to entry point C on FIG. 18C. If not, the application logic 68 goes to step 364. At step 364, the application logic 68 determines if the user pressed the find things button 92 shown in FIG. 2A. If yes, the application logic 68 exits to entry point D on FIG. 18D. If no, the application logic 68 goes to step 368. At step 368, the application logic 68 waits a time period (e.g., 10 ms) then goes to step 356. -
FIG. 18B illustrates the application that runs on the mobile device. At step 370, the application logic 68 uses the user interface 70 to draw the screen 94 shown in FIG. 2B. At step 372, the application logic 68 determines if the user pressed the save place button 98 shown in FIG. 2B. If yes, the application logic 68 goes to remember the place at step 374. At step 374, the application logic 68 stores the current location in the memory of the mobile device and optionally sends the location information to be stored on the first server 10, preferably in the database 24. The application logic 68 then exits to entry point A on FIG. 18A. At step 376, the application logic 68 determines if the user pressed the add picture button 95 shown in FIG. 2B. If yes, the application logic 68 will show the camera at step 378. At step 380, the application logic 68 stores the picture (e.g., a landmark near the place to be remembered) in the memory of the mobile device and optionally sends the picture to be stored on the first server 10, preferably in the database 24. At step 382, the application logic 68 determines if the user pressed the add memo button 97 shown in FIG. 2B. If yes, the application logic 68 goes to the record voice feature at step 384. At step 386, the application logic 68 stores the voice memo (e.g., a description of the place to be remembered) in the memory of the mobile device and optionally sends the recorded voice memo to be stored on the first server 10, preferably in the database 24. At step 388, the application logic 68 determines if the user pressed the add text note button 99 shown in FIG. 2B. If yes, the application logic 68 uses the user interface 70 to show a keyboard at step 390. At step 392, the application logic 68 stores the text note (e.g., a text note about the place to be remembered) in the memory of the mobile device and optionally sends the text note to be stored on the first server 10, preferably in the database 24.
If the user did not press buttons 98, 95, 97, or 99, the application logic 68 will determine if the user pressed the back button 100 as shown in FIG. 2B at step 394. If not, the application logic 68 waits a time period (e.g., 10 ms) at step 396. After steps 380, 386, and 392, the application logic 68 likewise waits at step 396. -
FIG. 18C illustrates another feature of the application that runs on the mobile device. FIG. 18C begins by entering from point C on FIG. 18A. At step 398, the application logic 68 uses the user interface 70 to draw the share location screen 102 shown in FIG. 2C. The input fields of the share location screen 102 are initially empty (not shown). At step 399, the user will enter values into the input fields as previously shown in FIG. 2C. At step 400, if the user presses the back button 100 (FIG. 2C), the application logic 68 will go to entry point A on FIG. 18A to draw the home screen 88 shown in FIG. 2A. If not, the application logic 68 will determine if the user pressed the send button 106 as shown in FIG. 2C. If not, the application logic 68 waits a time period (e.g., 10 ms) at step 408. If yes, the application logic 68 will determine if the user input is valid at step 404. If not, the application logic 68 will notify the user that the input is not valid at step 410. If yes, the application logic 68 will send the share location information to be stored on the first server 10, preferably in the database 24, at step 406. At step 412, the server will send a response to the mobile device, then the application logic 68 exits to entry point A on FIG. 18A. -
FIG. 18D illustrates another feature of the application that runs on the mobile device. FIG. 18D begins by entering from point D on FIG. 18A or FIG. 18E. At step 414, the application logic 68 will send a request with the identity of the user to the first server 10 (FIG. 1). The event logic 30 will use the user identity to look up all of the entries of that user in the user event table, then prepare a find things list which will be returned in a response to the user's mobile device. The application logic 68 of the mobile device will parse the response at step 416. At step 418, the application logic 68 will draw the find things screen 108 in FIG. 2D. At step 420, the application logic 68 will update the status text in each entry of the find things screen 108. Referring to screen 108 in FIG. 2D, in the entry 107 (e.g., Alan@Concert) the status text is "200 feet NW." In contrast, the "Alan@Concert" label will not change and thus will not be updated. At step 424, the application logic 68 determines if the second user 128 pressed any of the find buttons, such as the find button 110 shown in FIG. 2D. If yes, the application logic 68 will go to entry point E on FIG. 18E. If not, the application logic 68 goes to step 428 to determine if the second user 128 switched any visibility button on or off (in FIG. 2D, button 111 is on, and button 113 is off). When the visibility switch 111 is "on," the second user 128 (e.g., Bob) is visible to the first user 126 (e.g., Alan), and when the switch 111 is "off," the second user 128 is not visible. The visibility switch could be referred to as a toggle button. At step 430, the application logic 68 uses the communication interface 66 to call the first server 10 to change the visibility settings for the second user 128 in the database 24. At step 432, the application logic 68 will wait for a time period (e.g., 10 ms) then proceed to step 420. -
FIG. 18E illustrates another feature of the application that runs on the mobile device. FIG. 18E begins by entering from point E on FIG. 18D. At step 434, the application logic 68 uses the communication interface 66 to call the first server 10 to get user locations. At step 436, the application logic 68 calculates the distance between users and updates the compass 116 on the screen 114 shown in FIG. 2E. At step 440, the application logic 68 will determine if there is a beacon notification on the first mobile device 74. If yes, the application logic 68 will read a beacon sequence at step 442, and at step 448 draw the screen 122 shown in FIG. 2F. At step 450, the application logic 68 updates a segment of the beaconing sequence, then waits until the end of the segment at step 454, then determines if the beaconing sequence is finished at step 456. If not, the application logic 68 will repeat the loop beginning at step 450. If the beaconing sequence is finished, the application logic 68 will draw another screen (e.g., FIG. 2A, 2D, or 2E) or a message screen (e.g., text or voice) to allow the first user 126 to communicate and assist in finding the second user 128 at step 457. At step 453, the application logic 68 will wait for a time period (e.g., 10 ms) then proceed to step 436. -
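Steps 450-456 form a per-segment display loop. A sketch with an injectable clock, sleep, and `draw` callback (all assumptions, since the patent describes the loop only at the flow-diagram level):

```python
import time

def play_sequence(segments, draw, clock=time.monotonic, sleep=time.sleep):
    """Play one pass of a beaconing sequence: draw each segment, then hold
    it until its duration elapses. Each segment is (colors, duration_ms);
    `draw(colors)` is an assumed rendering callback. `clock` and `sleep`
    are injectable so the timing can be tested without real delays."""
    for colors, duration_ms in segments:
        start = clock()
        draw(colors)
        remaining = duration_ms / 1000.0 - (clock() - start)
        if remaining > 0:
            sleep(remaining)
```

Timing each segment against the clock-offset-corrected time keeps two devices' displays aligned segment for segment.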
FIG. 18F illustrates an embodiment of the application that runs on the mobile device. FIG. 18F begins by entering from point F on FIG. 18E. At step 458, the application logic 68 uses the communication interface 66 to call the first server 10 to get the beacon sequence for the beacon indicator 120 on the screen 114 as shown in FIG. 2E. At step 460, the application logic 68 reads the beacon sequence and updates the beacon indicator 120 (FIG. 2E) at step 464. At step 466, the application logic 68 will wait until the end of the current segment of the beacon sequence, then proceeds to step 468. At step 468, the application logic 68 will determine if it is the last segment of the beacon sequence, that is, if the beacon sequence is finished. If yes, the application logic 68 will exit to entry point E on FIG. 18E.
Claims (14)
1. A method executed on a server for displaying a visible beacon sequence to a first user of a first mobile device and to a second user of a second mobile device at an event in a geographic area, wherein the server has identified the first mobile device and the second mobile device, comprising:
receiving a request from the first mobile device to share location at an event with the second mobile device;
sending a message to the second mobile device to notify the second user that a new item representing the event was added to the screen of the second mobile device;
receiving a message from the second mobile device that causes the server to read an entry that corresponds to the new item on the server regarding the event at the location;
sending a response including the entry to the second mobile device;
accessing an update of the locations of the first mobile device and the second mobile device; and
after the server determines based on the update of the location that the first mobile device and the second mobile device are located at the event in the geographic area and the server receives a request from either the first user of the first mobile device or the second user of the second mobile device, sending a visible beacon sequence to the first mobile device and to the second mobile device that is conspicuous and can be seen in the line of sight by the first user or the second user to enable them to locate each other.
2. The method of claim 1 , further comprising using location information of the first mobile device and the second mobile device to determine the direction and/or distance from the second mobile device to the first mobile device.
3. The method of claim 2 , further comprising displaying a compass on the second mobile device that indicates the location of the first mobile device.
4. The method of claim 1 , wherein the assigned beacon sequence on the first mobile device is visibly different from any other beacon sequences used in the geographic area.
5. The method of claim 3 , further comprising updating coordinates for the compass on the second mobile device that indicates the location of the first mobile device.
6. The method of claim 1 , further comprising computing an offset for displaying the beaconing sequence at the same time on the first mobile device and the second mobile device.
7. The method of claim 1 , further comprising using venue or cell location to determine the geographic area.
8. The method of claim 1 , wherein the second mobile device is an electronic wearable.
9. The method of claim 1 , wherein the second mobile device is a smart watch.
10. The method of claim 1 , wherein the first mobile device is an electronic wearable.
11. The method of claim 1 , further comprising displaying a user selectable visibility button on the second mobile device.
12. The method of claim 6 , wherein computing the offset includes comparing the clock of the first mobile device and the second mobile device with the clock of the server.
13. The method of claim 1 , wherein the server receives a request from the first user of the first mobile device.
14. The method of claim 1 , wherein the server receives a request from the second user of the second mobile device.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US17/314,007 US20210266700A1 (en) | 2013-11-27 | 2021-05-06 | Methods for Mobile Devices to Locate Persons |
Applications Claiming Priority (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/092,846 US9344849B2 (en) | 2013-11-27 | 2013-11-27 | Methods and systems for locating persons and places with mobile devices |
US15/096,258 US10057719B2 (en) | 2013-11-27 | 2016-04-11 | Methods and systems for locating persons and places with mobile devices |
US16/048,305 US10455359B2 (en) | 2013-11-27 | 2018-07-29 | Methods for location of persons using beaconing sequences |
US16/658,274 US11006244B2 (en) | 2013-11-27 | 2019-10-21 | Methods on mobile devices to locate persons |
US17/314,007 US20210266700A1 (en) | 2013-11-27 | 2021-05-06 | Methods for Mobile Devices to Locate Persons |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/658,274 Continuation US11006244B2 (en) | 2013-11-27 | 2019-10-21 | Methods on mobile devices to locate persons |
Publications (1)
Publication Number | Publication Date |
---|---|
US20210266700A1 true US20210266700A1 (en) | 2021-08-26 |
Family
ID=59998484
Family Applications (5)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/096,258 Active US10057719B2 (en) | 2013-11-27 | 2016-04-11 | Methods and systems for locating persons and places with mobile devices |
US16/040,493 Active US10448213B2 (en) | 2013-11-27 | 2018-07-19 | Methods for location of persons with electronic wearables |
US16/048,305 Active US10455359B2 (en) | 2013-11-27 | 2018-07-29 | Methods for location of persons using beaconing sequences |
US16/658,274 Active US11006244B2 (en) | 2013-11-27 | 2019-10-21 | Methods on mobile devices to locate persons |
US17/314,007 Pending US20210266700A1 (en) | 2013-11-27 | 2021-05-06 | Methods for Mobile Devices to Locate Persons |
Family Applications Before (4)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/096,258 Active US10057719B2 (en) | 2013-11-27 | 2016-04-11 | Methods and systems for locating persons and places with mobile devices |
US16/040,493 Active US10448213B2 (en) | 2013-11-27 | 2018-07-19 | Methods for location of persons with electronic wearables |
US16/048,305 Active US10455359B2 (en) | 2013-11-27 | 2018-07-29 | Methods for location of persons using beaconing sequences |
US16/658,274 Active US11006244B2 (en) | 2013-11-27 | 2019-10-21 | Methods on mobile devices to locate persons |
Country Status (1)
Country | Link |
---|---|
US (5) | US10057719B2 (en) |
Families Citing this family (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10057719B2 (en) | 2013-11-27 | 2018-08-21 | Alan Snyder | Methods and systems for locating persons and places with mobile devices |
US10945129B2 (en) * | 2016-04-29 | 2021-03-09 | Microsoft Technology Licensing, Llc | Facilitating interaction among digital personal assistants |
US10827450B2 (en) * | 2016-09-20 | 2020-11-03 | Qualcomm Incorporated | Wireless device registration |
MX2020001749A (en) * | 2017-08-14 | 2020-03-24 | Sensus Spectrum Llc | Methods, systems and computer program products for identifying endpoints in a network. |
GB2579406A (en) * | 2018-11-30 | 2020-06-24 | Thales Holdings Uk Plc | Remote detector and display |
US11109187B2 (en) * | 2019-09-24 | 2021-08-31 | T-Mobile Usa, Inc. | Device to device communication and wayfinding |
Citations (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20080113618A1 (en) * | 2006-11-09 | 2008-05-15 | Sony Ericsson Mobile Communications Ab | Pairing system and method for mobile devices |
US20090287853A1 (en) * | 2008-05-13 | 2009-11-19 | Dale Carson | Portable wireless compatibility detection, location and communication device |
US20110191823A1 (en) * | 2010-02-03 | 2011-08-04 | Bump Technologies, Inc. | Bump validation |
US20110306366A1 (en) * | 2008-07-16 | 2011-12-15 | Glympse Inc. | Sharing of location information in a networked computing environment |
US20120159371A1 (en) * | 2010-12-15 | 2012-06-21 | Haley Thrapp | Relative-location maps based on personalized points of reference |
US20130134906A1 (en) * | 2010-06-04 | 2013-05-30 | Michel Picariello | Network of synchronous self-contained light beacons |
US20130178257A1 (en) * | 2012-01-06 | 2013-07-11 | Augaroo, Inc. | System and method for interacting with virtual objects in augmented realities |
US20130225205A1 (en) * | 2005-04-04 | 2013-08-29 | X One, Inc. | Methods and Systems for Temporarily Sharing Position Data Between Mobile-Device Users |
US20140018103A1 (en) * | 2011-01-12 | 2014-01-16 | Telefonaktiebolaget Lm Ericsson (Publ) | Nodes and methods for positioning |
US20140155098A1 (en) * | 2011-03-07 | 2014-06-05 | Isis Innovation Limited | System for providing information and associated devices |
US20140245354A1 (en) * | 2005-03-30 | 2014-08-28 | Rovi Guides, Inc. | Systems and methods for video-rich navigation |
US20140282620A1 (en) * | 2013-03-15 | 2014-09-18 | Frank Settemo NUOVO | System and method for triggering an event in response to receiving a device identifier |
US20150092676A1 (en) * | 2013-09-30 | 2015-04-02 | Blackberry Limited | Macrocell Enabled MM-Wave Superspot for Mobility |
US9344849B2 (en) * | 2013-11-27 | 2016-05-17 | Alan Michael Snyder | Methods and systems for locating persons and places with mobile devices |
US10057719B2 (en) * | 2013-11-27 | 2018-08-21 | Alan Snyder | Methods and systems for locating persons and places with mobile devices |
Family Cites Families (73)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5400246A (en) | 1989-05-09 | 1995-03-21 | Ansan Industries, Ltd. | Peripheral data acquisition, monitor, and adaptive control system via personal computer |
IL108549A (en) | 1994-02-03 | 1998-08-16 | Zelinkovsky Reuven | Transport system |
JP3141734B2 (en) | 1995-07-19 | 2001-03-05 | トヨタ自動車株式会社 | Roadside vehicle communication equipment |
JP3688833B2 (en) | 1996-12-02 | 2005-08-31 | 株式会社ビデオリサーチ | Car radio listening situation investigation device |
US5959529A (en) | 1997-03-07 | 1999-09-28 | Kail, Iv; Karl A. | Reprogrammable remote sensor monitoring system |
US6098048A (en) | 1998-08-12 | 2000-08-01 | Vnu Marketing Information Services, Inc. | Automated data collection for consumer driving-activity survey |
US6756913B1 (en) | 1999-11-01 | 2004-06-29 | Mourad Ben Ayed | System for automatically dispatching taxis to client locations |
US6615046B1 (en) | 1999-12-29 | 2003-09-02 | International Business Machines Corporation | Automatic dispatch of mobile services |
EP1263626A2 (en) | 2000-03-02 | 2002-12-11 | Donnelly Corporation | Video mirror systems incorporating an accessory module |
US6356838B1 (en) | 2000-07-25 | 2002-03-12 | Sunil Paul | System and method for determining an efficient transportation route |
US20020097193A1 (en) | 2001-01-23 | 2002-07-25 | Freecar Media | System and method to increase the efficiency of outdoor advertising |
US6859173B2 (en) * | 2001-06-21 | 2005-02-22 | The Rosum Corporation | Position location using broadcast television signals and mobile telephone signals |
KR100433734B1 (en) | 2001-06-18 | 2004-06-04 | 이재욱 | Automatic Connecting Service Method For Taxi By a Communication Network |
JP2003109122A (en) | 2001-09-28 | 2003-04-11 | Fujitsu Ltd | Transaction information registering terminal equipment, transaction information managing system, transaction information managing method, and program |
JP2003109191A (en) | 2001-09-28 | 2003-04-11 | Fujitsu Ltd | Vehicle allocation system and vehicle allocation processor |
US20030069692A1 (en) | 2001-10-10 | 2003-04-10 | Krasner Norman F. | Methods and apparatuses for controlling distribution of location information |
JP2004054444A (en) | 2002-07-17 | 2004-02-19 | Omron Corp | Operation service information mediating system |
US7627422B2 (en) | 2003-06-24 | 2009-12-01 | At&T Intellectual Property I, Lp | Methods, systems and computer program products for ride matching based on selection criteria and drive characteristic information |
US7154383B2 (en) | 2003-07-09 | 2006-12-26 | Steven Earl Berquist | Dynamic mobile advertising system |
US7136747B2 (en) | 2005-01-08 | 2006-11-14 | Stephen Raney | Method for GPS carpool rendezvous tracking and personal safety verification |
US20060229058A1 (en) | 2005-10-29 | 2006-10-12 | Outland Research | Real-time person-to-person communication using geospatial addressing |
US9392406B2 (en) * | 2005-02-03 | 2016-07-12 | Trueposition, Inc. | Method and system for location-based monitoring of a mobile device |
WO2006099395A2 (en) | 2005-03-11 | 2006-09-21 | Adobe Systems, Inc. | System and method for displaying information using a compass |
US20070190494A1 (en) | 2005-04-04 | 2007-08-16 | Outland Research, Llc | Multiplayer gaming using gps-enabled portable gaming devices |
US8275397B2 (en) | 2005-07-14 | 2012-09-25 | Huston Charles D | GPS based friend location and identification system and method |
US8249626B2 (en) | 2005-07-14 | 2012-08-21 | Huston Charles D | GPS based friend location and identification system and method |
US7801542B1 (en) | 2005-12-19 | 2010-09-21 | Stewart Brett B | Automatic management of geographic information pertaining to social networks, groups of users, or assets |
DE602006019946D1 (en) | 2006-05-02 | 2011-03-17 | Acterna France | System and method for monitoring a data network segment |
US7757710B2 (en) * | 2006-06-19 | 2010-07-20 | Tescom Corporation | High-pressure regulator |
US20080079539A1 (en) | 2006-08-15 | 2008-04-03 | Daley Robert C | Friends Finder Service for a Mobile Device in a Network |
US20080238768A1 (en) | 2007-03-26 | 2008-10-02 | Michael Albert Nosworthy | GPS tracking & behavior monitoring device with back-end software program service integrations |
US8335710B2 (en) * | 2007-06-28 | 2012-12-18 | Swiss Reinsurance Company Ltd. | Unidirectionally protected, fully automated asset allocation and asset monitoring apparatuses, and a corresponding method |
US20090176451A1 (en) | 2008-01-04 | 2009-07-09 | Microsoft Corporation | Encoded color information facilitating device pairing for wireless communication |
US8732246B2 (en) | 2008-03-14 | 2014-05-20 | Madhavi Jayanthi | Mobile social network for facilitating GPS based services |
US7940166B2 (en) | 2008-05-27 | 2011-05-10 | Konstantinos Koliopoulos | Emergency assistance beacon signal methodology and system for law enforcement, emergency and military personnel |
US8369867B2 (en) | 2008-06-30 | 2013-02-05 | Apple Inc. | Location sharing |
US8242884B2 (en) | 2008-09-24 | 2012-08-14 | Denso International America, Inc. | Car finder by cell phone |
US20110071889A1 (en) | 2009-09-24 | 2011-03-24 | Avaya Inc. | Location-Aware Retail Application |
US20100100233A1 (en) | 2008-10-22 | 2010-04-22 | Lockheed Martin Corporation | Universal intelligent postal identification code |
WO2010111704A2 (en) | 2009-03-27 | 2010-09-30 | Avaya, Inc. | Method and apparatus for providing visual/audible beacon to assist in locating an emergency caller |
US20100277313A1 (en) | 2009-04-29 | 2010-11-04 | Dimitry Vaysburg | Light Emitting Device History Log |
WO2011017494A1 (en) * | 2009-08-06 | 2011-02-10 | Rexecon International, Inc. | Fuel line ionizer |
US8103402B2 (en) | 2009-08-14 | 2012-01-24 | Douglas Everett Kozlay | Apparatus, method and system for enforcing vehicle operator policy compliance |
US20110053577A1 (en) | 2009-08-31 | 2011-03-03 | Lee Changkee | Methods and apparatus for communicating by vibrating or moving mobile devices |
US8611326B2 (en) | 2009-09-03 | 2013-12-17 | Apple Inc. | Location histories for location aware devices |
EP3522081A1 (en) | 2009-12-04 | 2019-08-07 | Uber Technologies, Inc. | System and method for arranging transport amongst parties through use of mobile devices |
US8200247B1 (en) | 2010-02-08 | 2012-06-12 | Google Inc. | Confirming a venue of user location |
WO2011116815A1 (en) | 2010-03-23 | 2011-09-29 | Nokia Siemens Networks Oy | Resource allocation for direct terminal-to-terminal communication in a cellular system |
DE102011079747A1 (en) * | 2010-07-27 | 2012-02-02 | Denso Corporation | Semiconductor device with switching element and freewheeling diode, and control method therefor |
US8635556B2 (en) | 2010-11-30 | 2014-01-21 | Alcatel Lucent | Human readable iconic display server |
KR20120090445A (en) | 2011-02-08 | 2012-08-17 | 삼성전자주식회사 | Method and apparatus for providing safety taxi service |
US20120209685A1 (en) | 2011-02-15 | 2012-08-16 | Microsoft Corporation | Check-ins to commercial venues |
US9226108B2 (en) | 2011-03-31 | 2015-12-29 | Microsoft Technology Licensing, Llc | Publishing location information |
US20120258741A1 (en) * | 2011-04-07 | 2012-10-11 | Compass Auto Tracker, Llc. | Apparatus and Method for Using a Wireless Mobile Handset Application to Locate Beacons |
US9191352B2 (en) | 2011-04-15 | 2015-11-17 | Microsoft Technology Licensing, Llc | On demand location sharing |
US9285944B1 (en) | 2011-04-22 | 2016-03-15 | Angel A. Penilla | Methods and systems for defining custom vehicle user interface configurations and cloud services for managing applications for the user interface and learned setting functions |
US20130013490A1 (en) * | 2011-07-08 | 2013-01-10 | Bank Of America Corporation | Mobile apparatus with back-up payment system |
WO2013012059A1 (en) | 2011-07-21 | 2013-01-24 | 日本電気株式会社 | Communication system |
US8457502B2 (en) | 2011-07-26 | 2013-06-04 | ByteLight, Inc. | Method and system for modulating a beacon light source in a light based positioning system |
US9445305B2 (en) | 2011-09-12 | 2016-09-13 | Microsoft Corporation | Low energy beacon encoding |
US9146115B2 (en) | 2011-10-18 | 2015-09-29 | Microsoft Technology Licensing, Llc | Location enhanced meetings and collaboration |
US9836763B2 (en) | 2011-11-04 | 2017-12-05 | Neal Harley Landers | Printing, displaying, and etching advertisements on various media |
WO2014020547A1 (en) * | 2012-07-31 | 2014-02-06 | Indoorgo Navigation Systems Ltd. | Navigation method and device |
US9157748B2 (en) | 2012-07-31 | 2015-10-13 | Flatiron Apps LLC | System and method for hailing taxicabs |
US9681468B2 (en) * | 2012-08-24 | 2017-06-13 | Qualcomm Incorporated | Joining communication groups with pattern sequenced light and/or sound signals as data transmissions |
US9258676B2 (en) | 2013-01-29 | 2016-02-09 | Blackberry Limited | Mobile device for creating, managing and sharing location information |
US9483875B2 (en) | 2013-02-14 | 2016-11-01 | Blackberry Limited | Augmented reality system with encoding beacons |
WO2014137613A1 (en) | 2013-02-22 | 2014-09-12 | Intel Corporation | Location based advertisement system using wi-fi |
US20140350840A1 (en) | 2013-05-23 | 2014-11-27 | Cellco Partnership D/B/A Verizon Wireless | Crowd proximity device |
US9356819B2 (en) * | 2013-09-27 | 2016-05-31 | Ebay Inc. | Systems and methods for checking a user into a location using a packet sequence including location information |
US10217135B2 (en) | 2014-02-05 | 2019-02-26 | Pacific Wave Technology, Inc. | Presenting an advertisement in a vehicle |
US20160241660A1 (en) * | 2014-08-26 | 2016-08-18 | Hoang Nhu | Sensors and systems for iot and ifttt applications and related methods |
US10186188B2 (en) | 2015-09-23 | 2019-01-22 | Motorola Solutions, Inc. | Multi-angle simultaneous view light-emitting diode display |
2016
- 2016-04-11 US US15/096,258 patent/US10057719B2/en active Active

2018
- 2018-07-19 US US16/040,493 patent/US10448213B2/en active Active
- 2018-07-29 US US16/048,305 patent/US10455359B2/en active Active

2019
- 2019-10-21 US US16/658,274 patent/US11006244B2/en active Active

2021
- 2021-05-06 US US17/314,007 patent/US20210266700A1/en active Pending
Patent Citations (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20140245354A1 (en) * | 2005-03-30 | 2014-08-28 | Rovi Guides, Inc. | Systems and methods for video-rich navigation |
US20130225205A1 (en) * | 2005-04-04 | 2013-08-29 | X One, Inc. | Methods and Systems for Temporarily Sharing Position Data Between Mobile-Device Users |
US20080113618A1 (en) * | 2006-11-09 | 2008-05-15 | Sony Ericsson Mobile Communications Ab | Pairing system and method for mobile devices |
US20090287853A1 (en) * | 2008-05-13 | 2009-11-19 | Dale Carson | Portable wireless compatibility detection, location and communication device |
US20110306366A1 (en) * | 2008-07-16 | 2011-12-15 | Glympse Inc. | Sharing of location information in a networked computing environment |
US20110191823A1 (en) * | 2010-02-03 | 2011-08-04 | Bump Technologies, Inc. | Bump validation |
US20130134906A1 (en) * | 2010-06-04 | 2013-05-30 | Michel Picariello | Network of synchronous self-contained light beacons |
US20120159371A1 (en) * | 2010-12-15 | 2012-06-21 | Haley Thrapp | Relative-location maps based on personalized points of reference |
US20140018103A1 (en) * | 2011-01-12 | 2014-01-16 | Telefonaktiebolaget Lm Ericsson (Publ) | Nodes and methods for positioning |
US20140155098A1 (en) * | 2011-03-07 | 2014-06-05 | Isis Innovation Limited | System for providing information and associated devices |
US20130178257A1 (en) * | 2012-01-06 | 2013-07-11 | Augaroo, Inc. | System and method for interacting with virtual objects in augmented realities |
US20140282620A1 (en) * | 2013-03-15 | 2014-09-18 | Frank Settemo NUOVO | System and method for triggering an event in response to receiving a device identifier |
US20150092676A1 (en) * | 2013-09-30 | 2015-04-02 | Blackberry Limited | Macrocell Enabled MM-Wave Superspot for Mobility |
US9344849B2 (en) * | 2013-11-27 | 2016-05-17 | Alan Michael Snyder | Methods and systems for locating persons and places with mobile devices |
US10057719B2 (en) * | 2013-11-27 | 2018-08-21 | Alan Snyder | Methods and systems for locating persons and places with mobile devices |
US10455359B2 (en) * | 2013-11-27 | 2019-10-22 | Alan Michael Snyder | Methods for location of persons using beaconing sequences |
Also Published As
Publication number | Publication date |
---|---|
US20190053006A1 (en) | 2019-02-14 |
US10448213B2 (en) | 2019-10-15 |
US20200053510A1 (en) | 2020-02-13 |
US20180338221A1 (en) | 2018-11-22 |
US10455359B2 (en) | 2019-10-22 |
US20170295468A1 (en) | 2017-10-12 |
US11006244B2 (en) | 2021-05-11 |
US10057719B2 (en) | 2018-08-21 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9344849B2 (en) | Methods and systems for locating persons and places with mobile devices | |
US11006244B2 (en) | Methods on mobile devices to locate persons | |
KR102494540B1 (en) | Creation of collectible items based on location information | |
KR102614536B1 (en) | Virtual vision system | |
US8437779B2 (en) | Modification of dynamic contact lists | |
US20140129557A1 (en) | Zone Oriented Applications, Systems and Methods | |
US11943303B2 (en) | Augmented reality objects registry | |
CN106201448B (en) | Information processing method and user terminal | |
US20230104981A1 (en) | Surfacing augmented reality objects | |
CN107016051A (en) | Geocoding is carried out to personal information | |
KR20240034259A (en) | Generating personalized map interface with enhanced icons | |
US9690838B2 (en) | Master data management | |
EP2629213A1 (en) | Method and system for associating user interests with zones and maps | |
Lautenschläger | Design and implementation of a campus navigation application with augmented reality for smartphones | |
JP6030891B2 (en) | An accommodation facility information system for providing vacancy information of an accommodation facility and for a user to communicate with the accommodation facility based on the information | |
US20180176726A1 (en) | System for monitoring the activity status of at least one venue | |
Kefas | A GPS-based mobile application for tourists | |
Ahsan et al. | Cloud-Based Shared Food Ordering System with Context Awareness: A Location Base Services Approach | |
US20160117338A1 (en) | Spatial information sharing service system using multiple ids and method thereof | |
TW202324286A (en) | Method for providing information and electronic device using the same |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: FINAL REJECTION MAILED |