US20210373181A1 - Geo-fusion between imaging device and mobile device - Google Patents
- Publication number
- US20210373181A1 (U.S. application Ser. No. 17/398,692)
- Authority
- US
- United States
- Prior art keywords
- imaging device
- position information
- data
- location
- auxiliary device
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G01S 19/48: Determining position by combining or switching between position solutions derived from the satellite radio beacon positioning system and position solutions derived from a further system
- G01S 19/03: Cooperating elements; interaction or communication between different cooperating elements or between cooperating elements and receivers
- G01S 19/396: Determining accuracy or reliability of position or pseudorange measurements
- G01S 19/423: Determining position by combining or switching between position solutions derived from different satellite radio beacon positioning systems
- G01S 5/0072: Transmission between mobile stations, e.g. anti-collision systems
- G01S 19/39: Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
Definitions
- Location capabilities of media devices can sometimes suffer from inaccuracies due to different factors, including poor environmental conditions, limited line of sight to navigation satellites, and hardware equipped with lower-performance satellite receivers.
- The resulting location inaccuracy can impact the quality of geotags for objects or events captured by the media devices.
- For example, an imaging device that uses built-in GNSS capabilities to geotag captured imagery might utilize a lower-power, lower-quality receiver in order to save cost and/or battery power.
- Even when the imaging device uses a high-quality location system, the accuracy of determined locations may be poor due to environmental conditions. Consequently, ways of improving the accuracy of locations obtained with such devices are desirable.
- This disclosure describes various embodiments that relate to improving the accuracy of location services for electronic devices.
- A method for improving position estimation includes receiving first position information from a first GNSS receiver.
- The first position information includes a first group of position estimates associated with a first group of time stamps.
- The method also includes receiving second position information from a second GNSS receiver proximate the first GNSS receiver.
- The second position information includes a second group of position estimates associated with a second group of time stamps.
- The method can also include time synchronizing the second position information with the first position information and, based on the time synchronizing, combining the first position information with the second position information to determine third position information.
- The third position information can include a third group of position estimates associated with a third group of time stamps.
- A method includes receiving first position information from a first GNSS receiver of an imaging device. Second position information is received from a second GNSS receiver of an auxiliary device proximate the imaging device. The method also includes combining the first position information with the second position information to determine a location of the imaging device. The location of the imaging device is then associated with imagery recorded by the imaging device. Finally, the location of the imaging device and the associated imagery are stored to a computer readable storage medium.
- A navigation system includes a first electronic device having a first GNSS receiver and a computer readable storage medium.
- The navigation system also includes a second electronic device near the first electronic device.
- The second electronic device includes a second GNSS receiver.
- The navigation system also includes a processor configured to receive first and second position information from the first and second GNSS receivers, respectively.
- The processor is also configured to combine the first and second position information together to determine an estimated position of the first electronic device, and then save the estimated position of the first electronic device to the computer readable storage medium of the first electronic device.
- FIG. 1A shows a front surface of an electronic device;
- FIG. 1B shows a rear surface of the electronic device depicted in FIG. 1A;
- FIG. 1C shows both the electronic device and an auxiliary device;
- FIG. 2 shows an internal view of a vehicle;
- FIG. 3 shows a top view of the vehicle travelling along a road and entering an intersection;
- FIGS. 4A-4D show a series of video frames from a camera module of the electronic device that includes footage of an event;
- FIG. 5 is a diagram illustrating communication between various devices;
- FIG. 6 shows a flow chart depicting a method for pairing a first device with a second device;
- FIG. 7 shows two streams of position data, P1 and P2, that can be fused into a combined signal; and
- FIGS. 8A-8C show examples of unsynchronized location data fused into a combined signal.
- A satellite-based navigation system is often referred to as a Global Navigation Satellite System (GNSS).
- One limitation of GNSS receivers is that the signals emitted by the satellites making up a GNSS can be attenuated, and may even fall below the noise level, so poor environmental conditions can adversely affect receiver performance.
- Another limitation of GNSS receivers is that performance can be degraded when obstructions get between the GNSS receivers and satellites making up the GNSS.
- A solution to such problems is to sample readings from multiple co-located GNSS receivers to determine the location of the GNSS receivers more precisely.
- The location can be refined by determining an average position reported by the GNSS receivers.
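The averaging described above can be sketched as follows. This is a minimal illustration, not code from the disclosure; the function name and the flat latitude/longitude averaging (which ignores datum and curvature effects over short baselines) are assumptions.

```python
# Hypothetical sketch: refine a position estimate by averaging fixes
# reported by several co-located GNSS receivers.

def average_position(fixes):
    """Return the mean (lat, lon) of a list of (lat, lon) fixes."""
    n = len(fixes)
    lat = sum(f[0] for f in fixes) / n
    lon = sum(f[1] for f in fixes) / n
    return (lat, lon)

# Two receivers in the same vehicle report slightly different fixes.
p1 = (37.3350, -122.0090)   # imaging-device receiver (illustrative)
p2 = (37.3354, -122.0094)   # cellular-phone receiver (illustrative)
refined = average_position([p1, p2])  # midpoint of the two fixes
```

For co-located devices the baseline is a few meters at most, so simple coordinate-wise averaging is a reasonable approximation.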
- For example, an imaging device such as a vehicle dashboard camera can include its own GNSS receiver that is used to geotag pictures or video frames taken by the imaging device.
- When the imaging device is in electronic communication with a mobile device such as a cellular phone, the location data from the imaging device can be combined with location data provided by a GNSS receiver of the cellular phone.
- A processor in one of the devices, or a processor located elsewhere, can then combine the location data from the two devices to obtain an averaged position.
- In some cases, one of the GNSS receivers may be known to generally provide more accurate position information.
- For example, a cellular phone may have a more accurate GNSS receiver than an imaging device.
- The processor can be configured to determine the accuracy of the imaging device position information relative to the cellular phone position information. The determined relative accuracy can then be used to generate a weighting factor, which can be used to weigh the position information obtained from the cellular phone more heavily than the position information received from the imaging device.
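One way the weighting factor described above could be realized is inverse-variance weighting of the two fixes. The disclosure does not specify a weighting scheme, so the 1/accuracy² rule and all names below are assumptions.

```python
# Hypothetical sketch of accuracy-based weighting: fixes from the more
# accurate receiver (here, the phone) are weighted more heavily.

def weighted_fuse(p_cam, acc_cam, p_phone, acc_phone):
    """Fuse two (lat, lon) fixes, weighting each by 1 / accuracy^2.

    acc_* are estimated accuracies in meters; a smaller value means a
    better fix and therefore a larger weight.
    """
    w_cam = 1.0 / acc_cam ** 2
    w_phone = 1.0 / acc_phone ** 2
    total = w_cam + w_phone
    lat = (w_cam * p_cam[0] + w_phone * p_phone[0]) / total
    lon = (w_cam * p_cam[1] + w_phone * p_phone[1]) / total
    return (lat, lon)

# Phone accuracy 5 m vs. camera accuracy 10 m: the fused fix lands
# much closer to the phone's reported position.
fused = weighted_fuse((37.3350, -122.0090), 10.0,
                      (37.3354, -122.0094), 5.0)
```

With these inputs the phone's fix carries four times the weight of the camera's, matching the qualitative behavior the text describes.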
- The cellular phone can also be configured to determine its location in other ways, such as by cellular tower triangulation or by using Wi-Fi derived location information. This additional location data can be used in many ways.
- In some embodiments, only the data from the GNSS receiver of the cellular phone is combined with the GNSS receiver data from the imaging device; the resulting data can then be subsequently combined with the additional location data to arrive at an estimated position. In other embodiments, the accuracy improvement provided by the additional location data can be combined with the cellular phone's GNSS receiver data prior to combination with the position data from the other GNSS receiver. It should be noted that when additional co-located devices include GNSS receivers or other location determining components, location data from three or more devices can be combined to further improve location accuracy.
- Combining the location data can also include synchronizing the times at which the location measurements were taken by the devices. Because devices with GNSS receivers receive GNSS time from the GNSS satellites, there is generally no need to perform a device clock synchronization step, since each device clock is generally already synchronized with GNSS time. Even though the device clocks are already synchronized, the devices may not record position data at the same rate and/or time. This can be problematic, particularly when a user of the devices is travelling at high speed. Consequently, data from at least one of the devices can be time synchronized to match the data from the other device(s) prior to combining location data across multiple devices.
- FIG. 1A shows a front surface of an electronic device 100 .
- Electronic device 100 can take the form of an imaging device configured to record images and/or video data.
- The images and/or video can be stored to a memory device within electronic device 100, such as a removable memory card or other internal storage medium.
- Alternatively, electronic device 100 can be configured to transmit the recorded images and/or video data to a remote storage location.
- The imaging device can take the form of a dashboard camera ("dash cam") configured to record video through a front-facing window of a vehicle.
- Electronic device 100 can include device housing 102 and display 104 for showing what is being recorded or can be recorded by electronic device 100 .
- Status bar 106 can be configured to display information such as current time, latitude/longitude, recording status and battery charge.
- Electronic device 100 can also include charging cord 108 for recharging a battery within electronic device 100 and providing energy to run electrical components disposed within device housing 102 .
- FIG. 1A also shows mounting device 110 which can be used to mount electronic device 100 to a window or other surface of a vehicle. Also depicted in FIG. 1A are controls 112 for manipulating settings and operations of electronic device 100 .
- FIG. 1B shows a rear surface of electronic device 100 .
- Camera module 114 protrudes from the depicted rear surface of electronic device 100 so that video footage shown in display 104 is oriented to substantially match the orientation of camera module 114.
- Electronic device 100 can include a GNSS receiver configured to detect signals from navigation satellites and a processor for associating the location information derived from the satellite data with imagery taken by camera module 114.
- FIG. 1C shows both electronic device 100 and auxiliary device 150 .
- Auxiliary device 150 can take the form of a cellular device having its own GNSS receiver and various other components. Both electronic device 100 and auxiliary device 150 can include various antennas or ports capable of carrying out unidirectional or bidirectional communication 116 between the devices. For example, cable 118 can be used to engage ports in electronic device 100 and auxiliary device 150 . Cable 118 can then be used to transfer data back and forth between the two devices at high speeds. Alternatively, antennas associated with each of the devices can be configured to exchange data using Bluetooth®, Near Field Communications, P2P Wi-Fi®, and/or other protocols.
- While an imaging device and a cellular phone are used here as exemplary devices, the described techniques could be used with any device.
- For example, the imaging device could take the form of a wearable video-recording device worn by a user.
- FIG. 2 shows an internal view of vehicle 200 .
- FIG. 2 shows electronic device 100 and auxiliary device 150 mounted within vehicle 200 .
- Electronic device 100 can be mounted to windshield 202 of vehicle 200 using mounting device 110 and positioned to record images and video of the area directly in front of vehicle 200 through windshield 202 .
- Auxiliary device 150 can also be mounted within vehicle 200, as depicted.
- Alternatively, auxiliary device 150 can be positioned within the coat pocket or purse of an occupant of vehicle 200, for example.
- Two devices such as electronic device 100 and auxiliary device 150 can be considered to be in substantially the same location when positioned anywhere within, or attached to, vehicle 200.
- Vehicle 200 can include its own suite of electrical components.
- Such components may include display 204, internal electrical components such as one or more processors, and a GNSS receiver.
- Vehicle 200 can also include externally mounted cameras in communication with the aforementioned internal circuitry of vehicle 200.
- Location data collected from a GNSS receiver of vehicle 200 can be combined with location data from electronic device 100 and/or auxiliary device 150.
- FIG. 3 shows a top view of vehicle 200 travelling along a road 300 and entering an intersection.
- FIG. 3 also depicts a field of view 302 of camera module 114 of electronic device 100 or a camera associated with vehicle 200 .
- Event 1 occurs within field of view 302.
- A camera within vehicle 200 is positioned to take multiple video frames of event 1 as it occurs.
- P1 and P2 represent locations determined by electronic device 100 and auxiliary device 150 , respectively. Use of either location P1 or P2 would result in misplacement of the location of event 1 at either location 1-1 or 1-2. By using both locations P1 and P2 a position of vehicle 200 can be determined with greater accuracy. Depending on how the location values are weighted, the increased amount of location data can be used to more accurately identify the location of event 1.
- Event 1 can be any event observable by an imaging device positioned within vehicle 200 . As depicted, event 1 occurs within field of view 302 of the camera positioned within vehicle 200 while vehicle 200 is crossing the intersection. Consequently, on account of the vehicle being in continuous motion, each frame of video taken by the camera can be taken from a different position/orientation. While the time signal provided by a GNSS constellation can accurately portray the timing of the event, a position of the imaging device during each frame of the video can also help in characterizing the location and details of event 1. For example, when the position of the car is known to a high enough level of precision, a position of the event within the frame of the display can be used to identify a location or at least a bearing of the event relative to the camera.
- The event position determination can be calculated by analyzing multiple frames of video taken by the camera.
- The respective position(s) of the vehicle associated with the times at which the frames were taken can be used to approximate the position of different participants or objects included in the event. While such a method would be particularly effective at determining the location of a stationary object within the field of view of the camera, approximate positions of moving objects could also be determined in this manner.
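The multi-frame estimate described above amounts to intersecting bearing lines taken from two known camera positions. The disclosure does not give a formula, so the following is a simplified sketch under assumed conditions: a flat local x/y frame in meters, bearings measured from the +x axis, and illustrative names throughout.

```python
import math

# Hypothetical sketch: locate a stationary object by intersecting two
# bearing rays observed from two known camera positions.

def intersect_bearings(p1, brg1, p2, brg2):
    """Intersect rays from p1 and p2 along bearings brg1/brg2 (radians,
    measured from the +x axis). Returns the (x, y) intersection."""
    d1 = (math.cos(brg1), math.sin(brg1))
    d2 = (math.cos(brg2), math.sin(brg2))
    # Solve p1 + t*d1 == p2 + s*d2 for t using 2-D cross products.
    denom = d1[0] * d2[1] - d1[1] * d2[0]
    if abs(denom) < 1e-12:
        raise ValueError("bearings are parallel; no unique intersection")
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    t = (dx * d2[1] - dy * d2[0]) / denom
    return (p1[0] + t * d1[0], p1[1] + t * d1[1])
```

With accurate vehicle positions for each frame, two (or more) such bearings to the same object bracket its location; averaging over many frame pairs would further reduce noise.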
- Deviations in the position of vehicle 200 can degrade the aforementioned location determination. For this reason, refining the position of vehicle 200 and/or the camera positioned within vehicle 200 can be very helpful.
- FIG. 3 also shows position P1 and position P2 of vehicle 200 as reported by two different position identification systems. As depicted, neither position P1 nor P2 is indicative of an accurate location of vehicle 200 .
- The camera can be configured to receive inputs from both position identification systems to refine a position of vehicle 200.
- The camera can determine a location of vehicle 200 to be at a midpoint between P1 and P2; however, when the position identification system associated with position P1 is known to have a higher accuracy than the position identification system associated with P2, the determined location of vehicle 200 can be weighted so that the estimated position of vehicle 200 falls much closer to P1 than to P2.
- Various devices can perform the combining of location data.
- The combining can be performed by electronic device 100 (e.g., the imaging device in this case), by auxiliary device 150 (e.g., a cell phone), or by another device (e.g., a remote server).
- FIGS. 4A-4D show a series of exemplary video frames from camera module 114 that capture portions of event 1.
- Event 1 is illustrated as a vehicle positioned down the road that is not oriented consistently with the flow of traffic.
- FIG. 4A shows event 1 when it is first identified by camera module 114 .
- FIGS. 4B-4D show how the vehicle representing event 1 tracks towards the left of the video frame over the series of video frames.
- The position of event 1 can be determined at least in part by the position of the vehicle when event 1 passes out of the frame, as shown in FIG. 4D.
- Associated timing information, in conjunction with the rate at which the car representing event 1 tracks across the screen, can be used to determine an approximate position of event 1.
- FIG. 5 is a diagram illustrating communication links that can be used for exchanging location information between various devices.
- Electronic device 100 may communicate with auxiliary device 150 directly, as shown in FIG. 1C and as depicted by communication link 502 of FIG. 5.
- Alternatively, electronic device 100 can communicate with auxiliary device 150 by way of cloud infrastructure 504 using communication links 506 and 508.
- When cloud infrastructure 504 is utilized, fusion of the transmitted data can be carried out by processors associated with the cloud infrastructure.
- The initial pairing or initialization of the depicted communication links can be accomplished in many ways, including manual pairing, semi-autonomous pairing, or autonomous pairing.
- In manual pairing, a user will generally manipulate both devices before a communications link is achieved.
- A semi-autonomous pairing can be carried out by logic that identifies devices likely to be useful in providing location data. The logic can then be configured to ask a user of the device to confirm whether to pair the two devices together.
- Autonomous pairing can be carried out by two devices that begin pairing with one another any time a communications link can be established between the devices. This autonomous behavior can be pre-programmed at a factory or specifically set up by a user in software on one or both of the devices.
- Criteria for initiating a pairing in the semi-autonomous or autonomous pairing modes can range from simple to complex.
- For example, a device could be configured to share data only when the devices share substantially the same inertial reference frame.
- Devices that move at different speeds or in different directions can be discarded from consideration as potential location data sharing platforms.
- An initial pairing could also require the devices to be within a particular distance of one another. These criteria could help to limit pairing suggestions to devices located within the same vehicle.
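The shared-inertial-frame and proximity criteria above can be sketched as a simple predicate. The thresholds and field names are illustrative assumptions; the disclosure does not specify numeric values.

```python
# Hypothetical sketch of semi-autonomous pairing criteria: suggest
# pairing only when two devices report similar speed and heading
# (roughly the same inertial reference frame) and are close together.

def should_suggest_pairing(dev_a, dev_b,
                           max_speed_diff=1.0,     # m/s (assumed)
                           max_heading_diff=10.0,  # degrees (assumed)
                           max_distance=5.0):      # meters (assumed)
    """dev_* are dicts with 'speed', 'heading', and 'position' (x, y)."""
    speed_ok = abs(dev_a["speed"] - dev_b["speed"]) <= max_speed_diff
    heading_diff = abs(dev_a["heading"] - dev_b["heading"]) % 360.0
    heading_diff = min(heading_diff, 360.0 - heading_diff)
    heading_ok = heading_diff <= max_heading_diff
    dx = dev_a["position"][0] - dev_b["position"][0]
    dy = dev_a["position"][1] - dev_b["position"][1]
    distance_ok = (dx * dx + dy * dy) ** 0.5 <= max_distance
    return speed_ok and heading_ok and distance_ok
```

A dash cam and a phone riding in the same vehicle would satisfy all three checks, while a pedestrian's phone on the sidewalk would fail the speed check and be discarded from consideration.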
- The devices may be configured to perform location data fusion only when the accuracy or reliability of the location data for one or more of the devices falls beneath a particular threshold.
- For example, the accuracy of a GNSS receiver can be in the realm of 5-10 meters in good conditions and 50-100 meters in poor conditions. Consequently, the system can be set up so that when the accuracy of one or more of the paired devices degrades beyond 20-30 meters, location data fusion is initiated.
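The threshold trigger above can be expressed as a one-line check. The 25 m value is an assumed midpoint of the 20-30 m band mentioned in the text.

```python
# Hypothetical sketch: trigger location-data fusion only when the
# reported accuracy of any paired receiver degrades beyond a threshold.

FUSION_THRESHOLD_M = 25.0  # assumed value within the 20-30 m band

def fusion_needed(accuracies_m):
    """accuracies_m: estimated accuracy (meters) per paired device;
    larger values mean a worse fix."""
    return any(a > FUSION_THRESHOLD_M for a in accuracies_m)

fusion_needed([8.0, 12.0])   # good conditions: no fusion needed
fusion_needed([8.0, 60.0])   # one receiver degraded: initiate fusion
```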
- location data fusion can be initiated when a predicted course of the device is calculated to pass through a region of poor satellite reception. Areas of poor satellite reception can be caused by numerous factors such as urban canyons, mountainous regions, and regions with high levels of electromagnetic radiation.
- FIG. 6 shows a flow chart depicting a technique for pairing a first device with a second device.
- First, the first device is detected.
- The first device can be detected by the second device using, e.g., a wireless protocol such as Bluetooth® or Near Field Communications protocols.
- Alternatively, the detection can be the result of a user requesting that a device perform a search for nearby compatible devices.
- Next, the second device can identify the first device and determine whether the second device has any pre-existing pairing instructions for interfacing with the first device.
- For example, the second device can have instructions to pair with the first device any time the first device is detected.
- Alternatively, the second device could be configured to pair with the first device only when particular proximity criteria are met.
- For example, the second device can be configured to check the inertial reference frame of the first device relative to that of the second device.
- When such criteria are not met, any pairing can be terminated.
- When the criteria are met, the user can be provided additional prompting indicating that the first device could be particularly effective as a secondary source of location data.
- A cloud computing processor can also be used to identify likely additional devices that could provide helpful location data.
- The second device can inform a user of the second device whether a connection has been made with the first device, whether a connection is recommended with the first device, or whether a connection is even possible with the first device.
- Alternatively, the user may receive no notification at all.
- FIG. 7 shows how two streams of position data, P1 and P2, can be fused into a combined stream of position data ("Combined").
- This signal combination can be performed in various ways.
- One way to combine the data is to take an average position based on the readings of the two devices by weighting the two streams of position data equally in accordance with Eq(1), below:
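The body of Eq(1) did not survive extraction. A reconstruction consistent with the equal-weighting description here, and with the n-sample form of Eq(3) later in the document taken with n = 2, would be:

```latex
\mathrm{location} = \left( \frac{X_1 + X_2}{2},\; \frac{Y_1 + Y_2}{2},\; \frac{Z_1 + Z_2}{2} \right) \qquad \text{Eq(1)}
```

Here $(X_1, Y_1, Z_1)$ and $(X_2, Y_2, Z_2)$ are contemporaneous position samples from streams P1 and P2, respectively.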
- The streams of position data can also be weighted differently in accordance with hardware specifications, internal device performance parameters, and/or other factors. Weighting the location data sources allows known accuracy differences and operating parameters to be accounted for during the fusion, often yielding better performance.
- For example, the P1 data may be associated with a GNSS receiver with lower-quality hardware incapable of providing precision better than 10 m.
- If the P2 data is capable of providing precision as fine as 5 m, then the P2 data could be weighted more heavily than the P1 data.
- As another example, one of the GNSS receivers may receive signals from more satellites than the other GNSS receiver. In such a situation, data from the GNSS receiver that is receiving signals from more satellites can be weighted more heavily than the data from the other GNSS receiver.
- The data samples discussed so far have been substantially time-aligned; the discussion in the sections below relates to synchronizing and fusing location data that is not time-aligned.
- FIGS. 8A and 8B show unsynchronized streams of location data P1 and P2.
- The location data samples from P2 are clearly not time-aligned with the location data samples from P1.
- At least one of the streams of location data can therefore be processed to generate a stream of interpolated location data samples that are time-aligned with the data samples of the other stream, so that data fusion can be performed.
- The fused location result is calculated based on a weighted combination of location data samples from stream P1 and interpolated location data samples based on stream P2.
- Position (X_x, Y_x, Z_x) represents a location data sample from stream P1.
- Generally, the more data samples used in the interpolation, the better the interpolation result.
- Various forms of linear and non-linear interpolation techniques can be applied, as would be understood by one of ordinary skill in the art.
- The location data sample (X_x, Y_x, Z_x) from stream P1 and the interpolated location data sample (X_i, Y_i, Z_i) based on stream P2 can be weighted equally or differently.
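The interpolation-then-fusion step described above can be sketched with the simplest case, linear interpolation between the two P2 samples that bracket a P1 sample time. Function names and sample values are illustrative assumptions.

```python
# Hypothetical sketch: linearly interpolate stream P2 at a P1 sample
# time t_x so the two samples can be fused. Samples are (t, x, y, z).

def interpolate_at(t_x, before, after):
    """Linear interpolation between two bracketing samples of P2."""
    t0, x0, y0, z0 = before
    t1, x1, y1, z1 = after
    f = (t_x - t0) / (t1 - t0)   # fraction of the way from t0 to t1
    return (x0 + f * (x1 - x0),
            y0 + f * (y1 - y0),
            z0 + f * (z1 - z0))

def fuse(p1_sample, p2_interp, w1=0.5, w2=0.5):
    """Weighted fusion of a P1 sample with an interpolated P2 sample."""
    return tuple(w1 * a + w2 * b for a, b in zip(p1_sample, p2_interp))

# P1 reports (100.0, 200.0, 5.0) at t_x = 10.0; P2 has samples at
# t = 9.5 and t = 10.5 that bracket t_x.
p2_i = interpolate_at(10.0, (9.5, 98.0, 199.0, 5.0),
                      (10.5, 102.0, 201.0, 5.0))
fused = fuse((100.0, 200.0, 5.0), p2_i)
```

Higher-order (non-linear) interpolation over more bracketing samples would follow the same pattern, trading latency for accuracy as the surrounding text discusses.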
- FIG. 8A shows an exemplary real-time fusion of unsynchronized streams of location data.
- Real-time fusion techniques are used when fused results must be generated while the two (or more) streams of location data P1 and P2 are being received.
- As noted above, the more data samples used in the interpolation, the better the interpolation result. Real-time fusion, however, reduces the number of data samples that can be used in the interpolation operation for generating the interpolated data sample (X_i, Y_i, Z_i). Accordingly, the accuracy of the fused location estimates is expected to be lower in the case of real-time fusion of unsynchronized streams than in the case of post-processed fusion of unsynchronized streams, which is discussed below.
- FIG. 8B shows an exemplary post-process fusion of unsynchronized streams of location data.
- Post-process techniques can be used when fused results can be generated after the two (or more) streams of location data P1 and P2 have been received.
- Post-process fusion thus increases the number of data samples that can be used in the interpolation. Compared to real-time fusion, post-process fusion can be expected to generate relatively more accurate fused location estimates.
- FIG. 8C shows three unsynchronized streams of location data.
- FIG. 8C shows how more than two streams of data can be combined together.
- FIG. 8C also shows that streams having different sampling rates can be combined. Even though the sampling rate of stream P3 is slower than the sampling rates of streams P1 and P2, the aforementioned interpolation techniques can still be used to calculate an interpolated sample (X_j, Y_j, Z_j).
- Eq(3), below, shows how any number of samples can be fused together; it can be used to combine all three streams of data at time t_x:
- location = ((X_a + X_b + … + X_n) / n, (Y_a + Y_b + … + Y_n) / n, (Z_a + Z_b + … + Z_n) / n)   Eq(3)
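The n-stream average of Eq(3) can be sketched directly; the function name and sample values are illustrative assumptions.

```python
# Hypothetical sketch of Eq(3): fuse any number of time-aligned (or
# interpolated) samples at time t_x by coordinate-wise averaging.

def fuse_n(samples):
    """samples: list of (x, y, z) tuples, one per stream, at time t_x."""
    n = len(samples)
    return (sum(s[0] for s in samples) / n,
            sum(s[1] for s in samples) / n,
            sum(s[2] for s in samples) / n)

# Three streams, e.g. P1, P2, and the slower P3 after interpolation.
combined = fuse_n([(100.0, 200.0, 5.0),
                   (101.0, 202.0, 5.2),
                   (99.0, 198.0, 4.8)])
```

Per-stream weights, as in the two-stream case, could replace the equal 1/n factors without changing the structure.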
- The various aspects, embodiments, implementations or features of the described embodiments can be used separately or in any combination.
- Various aspects of the described embodiments can be implemented by software, hardware or a combination of hardware and software.
- The described embodiments can also be embodied as computer readable code on a computer readable medium for controlling the operations of a navigation system, or as computer readable code on a computer readable medium for controlling the operation of an automobile in accordance with a navigation route.
- The computer readable medium is any data storage device that can store data which can thereafter be read by a computer system. Examples of the computer readable medium include read-only memory, random-access memory, CD-ROMs, HDDs, DVDs, magnetic tape, and optical data storage devices.
- The computer readable medium can also be distributed over network-coupled computer systems so that the computer readable code is stored and executed in a distributed fashion.
Abstract
Various methods and apparatus relating to synchronizing location information between two or more devices are described. In some embodiments, the devices are both configured to generate GNSS receiver data that is synchronized to achieve greater location accuracy. In some embodiments, the GNSS receiver data can be weighted when one set of GNSS receiver data is known to have a higher accuracy than another set of GNSS receiver data.
Description
- This application claims the priority of U.S. Provisional Application No. 62/357,160, filed Jun. 30, 2016, the entirety of which is hereby incorporated by reference.
- Location capabilities of media devices can sometimes suffer from inaccuracies due to different factors, including poor environmental conditions, limited line of sight to navigation satellites, and hardware equipped with lower performance satellite receivers. The resulting location inaccuracy can impact the quality of geotags for objects or events captured by the media devices. For example, an imaging device that uses built-in GNSS capabilities to geotag captured imagery might utilize a lower-power/quality receiver, in order to save cost and/or battery power. Even when the imaging device uses a high-quality location system, the accuracy of determined locations may be poor due to environmental conditions. Consequently, ways of improving the accuracy of locations obtained with such devices are desirable.
- This disclosure describes various embodiments that relate to improving the accuracy of location services for electronic devices.
- A method for improving position estimation is disclosed. The method includes receiving first position information from a first GNSS receiver. The first position information includes a first group of position estimates associated with a first group of time stamps. The method also includes receiving second position information from a second GNSS receiver proximate the first GNSS receiver. The second position information includes a second group of position estimates associated with a second group of time stamps. The method can also include time synchronizing the second position information with the first position information and based on the time synchronizing, combining the first position information with the second position information to determine third position information. The third position information can include a third group of position estimates associated with a third group of time stamps.
- A method is disclosed and includes receiving first position information from a first GNSS receiver of an imaging device. Second position information is received from a second GNSS receiver of an auxiliary device proximate the imaging device. The method also includes combining the first position information with the second position information to determine a location of the imaging device. The location of the imaging device is then associated with imagery recorded by the imaging device. Finally, the location of the imaging device and associated imagery is stored to a computer readable storage medium.
- A navigation system is disclosed. The navigation system includes a first electronic device having a first GNSS receiver and a computer readable storage medium. The navigation system also includes a second electronic device near the first electronic device. The second electronic device includes a second GNSS receiver. The navigation system also includes a processor configured to receive first and second position information from the first and second GNSS receivers respectively. The processor is also configured to combine the first and second position information together to determine an estimated position of the first electronic device, and then save the estimated position of the first electronic device to the computer readable storage medium of the first electronic device.
- The disclosure will be readily understood by the following detailed description in conjunction with the accompanying drawings, wherein like reference numerals designate like structural elements, and in which:
-
FIG. 1A shows a front surface of an electronic device; -
FIG. 1B shows a rear surface of the electronic device depicted in FIG. 1A; -
FIG. 1C shows both the electronic device and an auxiliary device; -
FIG. 2 shows an internal view of a vehicle; -
FIG. 3 shows a top view of the vehicle travelling along a road and entering an intersection; -
FIGS. 4A-4D show a series of video frames from a camera module of the electronic device that includes footage of an event; -
FIG. 5 is a diagram illustrating communication between various devices; -
FIG. 6 shows a flow chart depicting a method for pairing a first device with a second device; -
FIG. 7 shows two streams of position data, P1 and P2, that can be fused into a combined signal; and -
FIGS. 8A-8C show examples of unsynchronized location data fused into a combined signal. - Other aspects and advantages of the invention will become apparent from the following detailed description taken in conjunction with the accompanying drawings which illustrate, by way of example, the principles of the described embodiments.
- This description is presented to enable any person skilled in the art to make and use the embodiments, and is provided in the context of a particular application and its requirements. Various modifications to the disclosed embodiments will be readily apparent to those skilled in the art, and the general principles defined herein may be applied to other embodiments and applications without departing from the spirit and scope of the present disclosure. Thus, the invention is not limited to the embodiments shown, but is to be accorded the widest scope consistent with the principles and features disclosed herein.
- A satellite-based navigation system, often referred to as a Global Navigation Satellite System (GNSS), while capable of providing highly accurate position information, can suffer from a number of shortcomings. One limitation of GNSS receivers is that the signals emitted by the satellites making up a GNSS are weak and can be attenuated to below the noise level, so poor environmental conditions can adversely affect the performance of the GNSS receivers. Another limitation of GNSS receivers is that performance can be degraded when obstructions come between the GNSS receivers and the satellites making up the GNSS.
- According to various embodiments, a solution to such problems is to sample readings from multiple co-located GNSS receivers to determine the location of the GNSS receivers more precisely. In some embodiments, the location can be refined by determining an average position reported by the GNSS receivers. For example, an imaging device such as a vehicle dashboard camera can include its own GNSS receiver that is used to geotag pictures or video frames taken by the imaging device. When the imaging device is in electronic communication with a mobile device such as a cellular phone, the location data from the imaging device can be combined with location data provided by a GNSS receiver of the cellular phone. A processor in one of the devices, or a processor located elsewhere, can then combine the location data from the two devices to obtain an averaged position.
- In some embodiments, one of the GNSS receivers may be known to generally provide more accurate position information. For example, a cellular phone may have a more accurate GNSS receiver than an imaging device. In such a case, the processor can be configured to determine relative accuracy of the imaging device position information relative to the cellular phone position information. The determined relative accuracy can then be used to generate a weighting factor, which can be used to weigh the position information obtained from the cellular phone more heavily than the position information received from the imaging device. In some embodiments, the cellular phone can also be configured to determine its location in other ways, such as by cellular tower triangulation or by using Wi-Fi derived location information. This additional location data can be used in many ways. In some embodiments, only the data from the GNSS receiver of the cellular phone is combined with GNSS receiver data from the imaging device. After combining the GNSS receiver data, the resulting data can then be subsequently combined with the additional location data to arrive at an estimated position. In other embodiments, the accuracy improvement provided by the additional location data can be combined with the GNSS receiver data, prior to combination with the position data from the other GNSS receiver. It should be noted that when additional co-located devices include GNSS receivers or other location determining components, location data from three or more devices can be combined to further improve location accuracy.
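The two-stage combination described above (first fusing the two GNSS streams, then folding in the additional Wi-Fi or cell-tower derived fix) can be sketched as follows. The inverse-accuracy weighting scheme, the function names, and the numeric accuracy figures are illustrative assumptions rather than part of the disclosure:

```python
def weighted_fuse(fixes):
    """Fuse a list of (position, accuracy_m) pairs, weighting each fix
    by the inverse of its reported accuracy radius, so that a smaller
    radius (a better fix) counts more heavily."""
    weights = [1.0 / acc for _, acc in fixes]
    total = sum(weights)
    return tuple(
        sum(w * pos[axis] for (pos, _), w in zip(fixes, weights)) / total
        for axis in range(3)
    )

# Stage 1: fuse the imaging device's GNSS fix with the phone's GNSS fix.
gnss_only = weighted_fuse([((0.0, 0.0, 0.0), 10.0),   # dash cam, ~10 m accuracy
                           ((3.0, 0.0, 0.0), 5.0)])   # phone GNSS, ~5 m accuracy
# Stage 2: fold in the phone's Wi-Fi / cell-tower derived fix.
final = weighted_fuse([(gnss_only, 5.0),
                      ((5.0, 0.0, 0.0), 15.0)])       # Wi-Fi fix, ~15 m accuracy
```

The same routine serves both stages, which mirrors the alternative ordering described above: the additional location data can equally well be folded into one stream before the cross-device combination.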
- Combining the location data can also include synchronizing the times at which the location measurements were taken by the devices. Because devices with GNSS receivers receive GNSS time from the GNSS satellites, there is generally no need to perform a device clock synchronization step, since each device clock is generally already synchronized with GNSS time. Even though the device clocks are already synchronized, however, the devices may not record position data at the same rate and/or time. This can be problematic, particularly when a user of the devices is travelling at high speed. Consequently, data from at least one of the devices can be time synchronized to match the data from the other device(s) prior to combining location data across multiple devices.
- These and other embodiments are discussed below with reference to
FIGS. 1A-8C; however, those skilled in the art will readily appreciate that the detailed description given herein with respect to these figures is for explanatory purposes only and should not be construed as limiting. -
FIG. 1A shows a front surface of an electronic device 100. Electronic device 100 can take the form of an imaging device configured to record images and/or video data. In some embodiments, the images and/or video can be stored to a memory device within electronic device 100 such as a removable memory card or other internal storage medium. In some embodiments, electronic device 100 can be configured to transmit the recorded images and/or video data to a remote storage location. In one particular embodiment, the imaging device can take the form of a dashboard camera (“dash cam”) configured to record video through a front-facing window of a vehicle. Electronic device 100 can include device housing 102 and display 104 for showing what is being recorded or can be recorded by electronic device 100. Status bar 106 can be configured to display information such as current time, latitude/longitude, recording status and battery charge. Electronic device 100 can also include charging cord 108 for recharging a battery within electronic device 100 and providing energy to run electrical components disposed within device housing 102. FIG. 1A also shows mounting device 110, which can be used to mount electronic device 100 to a window or other surface of a vehicle. Also depicted in FIG. 1A are controls 112 for manipulating settings and operations of electronic device 100. FIG. 1B shows a rear surface of electronic device 100. In particular, camera module 114 protrudes from the depicted rear surface of electronic device 100 so that video footage shown in display 104 is oriented to substantially match the orientation of camera module 114. Electronic device 100 can include a GNSS receiver configured to detect navigation satellites and a processor for associating the location information derived from the satellite data with imagery taken by camera module 114. -
FIG. 1C shows both electronic device 100 and auxiliary device 150. Auxiliary device 150 can take the form of a cellular device having its own GNSS receiver and various other components. Both electronic device 100 and auxiliary device 150 can include various antennas or ports capable of carrying out unidirectional or bidirectional communication 116 between the devices. For example, cable 118 can be used to engage ports in electronic device 100 and auxiliary device 150. Cable 118 can then be used to transfer data back and forth between the two devices at high speeds. Alternatively, antennas associated with each of the devices can be configured to exchange data using Bluetooth®, Near Field Communications, P2P Wi-Fi®, and/or other protocols. It should be noted that while an imaging device and a cellular phone are used here as exemplary devices, the described techniques could be used with any device. For example, instead of an imaging device taking the form of a dash cam, the imaging device could take the form of a wearable video-recording device that could be worn by a user. -
FIG. 2 shows an internal view of vehicle 200. In particular, FIG. 2 shows electronic device 100 and auxiliary device 150 mounted within vehicle 200. Electronic device 100 can be mounted to windshield 202 of vehicle 200 using mounting device 110 and positioned to record images and video of the area directly in front of vehicle 200 through windshield 202. Auxiliary device 150 can also be mounted within vehicle 200, as depicted. Alternatively, auxiliary device 150 can be positioned within the coat pocket or purse of an occupant of vehicle 200, for example. According to various embodiments, two devices such as electronic device 100 and auxiliary device 150 can be considered to be in substantially the same location when positioned anywhere within or attached to vehicle 200. In some embodiments, vehicle 200 can include its own suite of electrical components. Such components may include display 204, internal electrical components such as one or more processors, and a GNSS receiver. In addition to the depicted internal components such as steering wheel 206, pedals 208, and mirrors 210, vehicle 200 can include externally mounted cameras in communication with the aforementioned internal circuitry of vehicle 200. In some embodiments, location data collected from a GNSS receiver of vehicle 200 can be combined with location data from electronic device 100 and/or auxiliary device 150. -
FIG. 3 shows a top view of vehicle 200 travelling along a road 300 and entering an intersection. FIG. 3 also depicts a field of view 302 of camera module 114 of electronic device 100 or a camera associated with vehicle 200. At time T0, event 1 occurs and takes place within field of view 302. A camera within vehicle 200 is positioned to take multiple video frames of event 1 as it occurs. P1 and P2 represent locations determined by electronic device 100 and auxiliary device 150, respectively. Use of either location P1 or P2 would result in misplacement of the location of event 1 at either location 1-1 or 1-2. By using both locations P1 and P2, a position of vehicle 200 can be determined with greater accuracy. Depending on how the location values are weighted, the increased amount of location data can be used to more accurately identify the location of event 1. -
Event 1 can be any event observable by an imaging device positioned within vehicle 200. As depicted, event 1 occurs within field of view 302 of the camera positioned within vehicle 200 while vehicle 200 is crossing the intersection. Consequently, on account of the vehicle being in continuous motion, each frame of video taken by the camera can be taken from a different position/orientation. While the time signal provided by a GNSS constellation can accurately portray the timing of the event, a position of the imaging device during each frame of the video can also help in characterizing the location and details of event 1. For example, when the position of the car is known to a high enough level of precision, a position of the event within the frame of the display can be used to identify a location or at least a bearing of the event relative to the camera. The event position determination can be calculated by analyzing multiple frames of video taken by the camera. The respective position(s) of the vehicle associated with the times at which the frames were taken can be used to approximate the position of different participants or objects included in the event. While such a method would be particularly effective at determining the location of a stationary object within the field of view of the camera, approximate positions of moving objects could also be determined in this manner. - Deviations in the position of vehicle 200 can degrade the aforementioned location determination. For this reason, refining the position of vehicle 200 and/or the camera positioned within vehicle 200 can be very helpful. FIG. 3 also shows position P1 and position P2 of vehicle 200 as reported by two different position identification systems. As depicted, neither position P1 nor P2 is indicative of an accurate location of vehicle 200. In some embodiments, the camera can be configured to receive inputs from both position identification systems to refine a position of vehicle 200. For example, where both position identification systems are determined to have a similar accuracy, the camera can determine a location of vehicle 200 to be at a midpoint between P1 and P2; however, when the position identification system associated with position P1 is known to have a higher accuracy than the position identification system associated with P2, the determined location of vehicle 200 can be weighted so that an estimated position of vehicle 200 is determined to be much closer to P1 than to P2. As mentioned previously, various devices can perform the combining of location data. For example, the combining can be performed by electronic device 100 (e.g., the imaging device in this case), auxiliary device 150 (e.g., a cell phone), or by another device (e.g., by a remote server). -
FIGS. 4A-4D show a series of exemplary video frames from camera module 114 that capture portions of event 1. In particular, event 1 is illustrated as a vehicle positioned down the road that is not oriented consistently with the flow of traffic. FIG. 4A shows event 1 when it is first identified by camera module 114. FIGS. 4B-4D show how the vehicle representing event 1 tracks towards the left of the video frame over the series of video frames. In some embodiments, the position of event 1 can be determined at least in part by the position of the vehicle when event 1 passes out of the frame, as shown in FIG. 4D. When the field of view and orientation of camera module 114 are known, associated timing information in conjunction with the rate at which the car representing event 1 tracks across the screen can be used to determine an approximate position of event 1. -
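The bearing determination mentioned above (a location "or at least a bearing" of the event relative to the camera) can be illustrated with a simple linear pixel-to-angle mapping. The function and its parameters are hypothetical and ignore lens distortion:

```python
def object_bearing(pixel_x, frame_width, camera_heading_deg, hfov_deg):
    """Approximate compass bearing of an object seen at horizontal pixel
    position pixel_x in a frame frame_width pixels wide, given the
    camera's heading and horizontal field of view (both in degrees).
    Assumes a linear pixel-to-angle mapping across the frame."""
    offset = (pixel_x / frame_width - 0.5) * hfov_deg
    return (camera_heading_deg + offset) % 360.0

# Object a quarter of the way in from the left edge of a 1920-px frame,
# camera heading due north with a 90-degree field of view:
bearing = object_bearing(480, 1920, 0.0, 90.0)  # 337.5, i.e. 22.5 deg west of north
```

Bearings computed from two or more frames, each paired with the vehicle position recorded at the time of that frame, could then be intersected to approximate the location of the event.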
FIG. 5 is a diagram illustrating communication links that can be used for exchanging location information between various devices. In some embodiments, electronic device 100 may communicate with auxiliary device 150 directly, as shown in FIG. 1C and as depicted by communication link 502 of FIG. 5. In some embodiments, electronic device 100 can communicate with auxiliary device 150 by way of cloud infrastructure 504 using communication links. When cloud infrastructure 504 is utilized, fusion of the transmitted data can be carried out by processors associated with the cloud infrastructure. - The initial pairing or initialization of the depicted communication links can be accomplished in many ways, including any of the following: manual pairing, semi-autonomous pairing, or autonomous pairing. In a manual pairing scenario, a user will generally manipulate both devices before a communications link is achieved. A semi-autonomous pairing can be carried out by logic that identifies devices likely to be useful in providing location data. The logic can then be configured to ask a user of the device to confirm whether to pair the two devices together. Finally, autonomous pairing can be carried out by two devices that begin pairing with one another anytime a communications link can be established between the devices. This autonomous behavior can be pre-programmed at a factory or specifically set up by a user in software on one or both of the devices.
- Criteria for initiating a pairing in the semi-autonomous or autonomous pairing modes can range from simple to complex. For example, a device could be configured to share data only when the devices share substantially the same inertial reference frame. For example, devices that moved at different speeds or in different directions can be discarded from consideration as potential location data sharing platforms. Furthermore, an initial pairing could also require the devices to be within a particular distance from one another. These criteria could help to limit pairing suggestions to devices located within the same vehicle.
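A filter implementing the criteria above (shared inertial reference frame, limited separation) might look like the following sketch; the thresholds and names are hypothetical placeholders:

```python
def should_suggest_pairing(v1, v2, separation_m,
                           max_speed_delta=2.0, max_separation_m=5.0):
    """Suggest pairing only when two devices appear to share an inertial
    reference frame (similar velocity vectors, in m/s) and sit within a
    few meters of one another, as they would inside a single vehicle."""
    delta = sum((a - b) ** 2 for a, b in zip(v1, v2)) ** 0.5
    return delta <= max_speed_delta and separation_m <= max_separation_m

# Two devices riding together in the same car:
same_car = should_suggest_pairing((15.0, 0.1, 0.0), (14.8, 0.0, 0.0), 1.5)
# A device in an oncoming vehicle is discarded from consideration:
other_car = should_suggest_pairing((15.0, 0.0, 0.0), (-15.0, 0.0, 0.0), 4.0)
```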
- Even after pairing two or more devices together in the aforementioned manner, the devices may only be configured to perform location data fusion when the accuracy or reliability of the location data for one or more of the devices falls beneath a particular threshold. For example, the accuracy of a GNSS receiver can be in the realm of 5-10 meters in good conditions and 50-100 meters in poor conditions. Consequently, the system can be set up so that when the accuracy of one or more of the paired devices falls below 20-30 meters, location data fusion is initiated. In some embodiments, location data fusion can be initiated when a predicted course of the device is calculated to pass through a region of poor satellite reception. Areas of poor satellite reception can be caused by numerous factors such as urban canyons, mountainous regions, and regions with high levels of electromagnetic radiation.
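The accuracy-threshold trigger described above might be expressed as follows; the 25 m default is an arbitrary midpoint of the 20-30 meter band mentioned, not a value from the disclosure:

```python
def fusion_needed(estimated_accuracy_m, threshold_m=25.0):
    """Initiate location data fusion once a device's estimated accuracy
    radius degrades past the threshold (larger radius = worse fix)."""
    return estimated_accuracy_m > threshold_m

good_conditions = fusion_needed(8.0)    # ~5-10 m in good conditions: no fusion
urban_canyon = fusion_needed(60.0)      # ~50-100 m in poor conditions: fuse
```

A predicted-course check, as also described above, could set the same flag in advance of entering a region of poor satellite reception.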
-
FIG. 6 shows a flow chart depicting a technique for pairing a first device with a second device. At 602, the first device is detected. The first device can be detected by the second device using, e.g., a wireless protocol such as Bluetooth® or Near Field Communications protocols. In some embodiments, the detection can be the result of a user requesting that a device perform a search for nearby compatible devices. At 604, the second device can identify the first device and determine whether the second device has any pre-existing pairing instructions for interfacing with the first device. For example, the second device can have instructions to pair with the first device any time the first device is detected. Alternatively, the second device could be configured to pair with the first device only when particular proximity criteria are met. At 606, the second device can be configured to check the inertial reference frame of the first device relative to that of the second device. In some embodiments, when the inertial reference frame of the first device differs too greatly from that of the second device, any pairing can be terminated. Conversely, when the inertial reference frames of the first and second devices are extremely close, the user can be provided additional prompting indicating that the first device could be particularly effective as a secondary source of location data. In embodiments where location data fusion is carried out in the cloud, the cloud computing processor can also be used in identifying likely additional devices that could provide helpful location data. At 608, the second device can inform a user of the second device if a connection has been made with the first device, if a connection is recommended with the first device, or if a connection is even possible with the first device. In some embodiments, where the internal logic within the second device deems successful pairing unlikely or impossible, the user may receive no notification at all. -
FIG. 7 shows how two streams of position data, P1 and P2, can be fused into a combined stream of position data (“Combined”), as depicted. This signal combination can be performed in various ways. One way to combine the data is to take an average position based on the readings of the two devices by weighting the two streams of position data equally in accordance with Eq(1), below: -
{t=t0, location=(0.5(X0+X1), 0.5(Y0+Y1), 0.5(Z0+Z1))}  Eq(1) - In some embodiments, the streams of position data can be weighted differently in accordance with hardware specifications, internal device performance parameters and/or other factors. Weighting the location data sources allows known accuracy differences and operating parameters to be accounted for during the fusion, often yielding better performance. For example, the P1 data can be associated with a GNSS receiver with lower-quality hardware incapable of providing precision below 10 m. Here, if the P2 data is capable of providing precision as high as 5 m, then the P2 data could be weighted more heavily than the P1 data. Similarly, one of the GNSS receivers may receive signals from more satellites than the other GNSS receiver. In such a situation, data from the GNSS receiver that is receiving signals from more satellites can be weighted more heavily than the data from the other GNSS receiver.
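Eq(1) and the unequal weighting just described can be transcribed directly. The inverse-precision scheme below is one illustrative way to realize "weighted more heavily," reusing the 10 m and 5 m figures from the example above:

```python
def fuse_eq1(p1, p2):
    """Eq(1): equal-weight fusion of two time-aligned samples (X, Y, Z)."""
    return tuple(0.5 * (a + b) for a, b in zip(p1, p2))

def fuse_unequal(p1, p2, precision1_m, precision2_m):
    """Unequal weighting: each stream weighted by its inverse precision,
    so a 5 m receiver counts twice as heavily as a 10 m receiver."""
    w1, w2 = 1.0 / precision1_m, 1.0 / precision2_m
    return tuple((w1 * a + w2 * b) / (w1 + w2) for a, b in zip(p1, p2))

equal = fuse_eq1((10.0, 20.0, 0.0), (12.0, 22.0, 2.0))    # (11.0, 21.0, 1.0)
skewed = fuse_unequal((10.0, 20.0, 0.0), (12.0, 22.0, 2.0), 10.0, 5.0)
```

In the skewed result each coordinate lands two thirds of the way toward the 5 m stream, rather than at the midpoint.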
- It should be noted that the depicted data samples shown in FIG. 7 are substantially time-aligned. The discussion in the sections below relates to synchronizing and fusing location data that is not time-aligned. -
FIGS. 8A and 8B show unsynchronized streams of location data P1 and P2. The location data samples from P2 are clearly not time-aligned with the location data samples from P1. Before fusing the location data, at least one of the streams of location data can be processed to generate a stream of interpolated location data samples that are time-aligned with the data samples of the other stream, so that data fusion can be performed. For example, referring to FIG. 8A, stream P1 has a data sample at t=tx. However, stream P2 does not have a data sample at t=tx. This is because stream P2 is not time-aligned with stream P1. Instead, at t=tx, stream P2 is “in between” data samples. According to various embodiments, an interpolation technique can be employed to generate an interpolated data sample (Xi, Yi, Zi) at t=tx for P2, based on data samples that are available in stream P2. -
{t=tx, location=(0.5(Xx+Xi), 0.5(Yx+Yi), 0.5(Zx+Zi))}  Eq(2) - In Eq(2), the fused location result is calculated based on a weighted combination of location data samples from stream P1 and interpolated location data samples based on stream P2. Position (Xx, Yx, Zx) represents a location data sample from stream P1. Position (Xi, Yi, Zi) represents the interpolated location data sample at t=tx, interpolated from available, but not time-aligned (i.e., not at t=tx), location data samples from stream P2. Generally speaking, the more data samples used in the interpolation, the better the interpolation result. Various forms of linear and non-linear interpolation techniques can be applied, as would be understood by one of ordinary skill in the art. These techniques may include linear interpolation, polynomial interpolation, spline interpolation, and/or others. Similarly to the synchronized fusion case described above, the location data sample (Xx, Yx, Zx) from stream P1 and the interpolated location data sample (Xi, Yi, Zi) based on stream P2 can be weighted equally or differently.
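A minimal sketch of Eq(2) using linear interpolation (the simplest of the techniques listed above) with equal weights; the function names are illustrative:

```python
def interpolate_at(stream, t):
    """Linearly interpolate a stream of (time, (x, y, z)) samples at time t.
    Polynomial or spline methods could be substituted here."""
    for (t0, p0), (t1, p1) in zip(stream, stream[1:]):
        if t0 <= t <= t1:
            f = (t - t0) / (t1 - t0)
            return tuple(a + f * (b - a) for a, b in zip(p0, p1))
    raise ValueError("t lies outside the sampled interval")

def fuse_eq2(p1_sample, p2_stream, t_x):
    """Eq(2): equal-weight fusion of stream P1's sample at t=tx with a
    sample interpolated from stream P2 at the same instant."""
    p2_interp = interpolate_at(p2_stream, t_x)
    return tuple(0.5 * (a + b) for a, b in zip(p1_sample, p2_interp))

# P2 has no sample at t=1.5, so one is interpolated between its neighbors.
p2 = [(1.0, (0.0, 0.0, 0.0)), (2.0, (4.0, 4.0, 0.0))]
fused = fuse_eq2((3.0, 1.0, 0.0), p2, 1.5)  # P2 interpolates to (2.0, 2.0, 0.0)
```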
-
FIG. 8A shows an exemplary real-time fusion of unsynchronized streams of location data. Real-time fusion techniques are used when fused results must be generated while the two (or more) streams of location data P1 and P2 are being received. In other words, the fused location result for t=tx is computed at or near time t=tx. Note that at time t=tx, the only available sample points from streams P1 and P2 are those shown in FIG. 8A as being at or to the “left” of t=tx (i.e., present and past data samples). Sample points from streams P1 and P2 that are shown in FIG. 8A as being to the “right” of t=tx (i.e., future data samples) have not yet been received. Thus, for real-time fusion, the interpolated data sample (Xi, Yi, Zi) can only be based on the interpolation of data points from stream P2 that are at or to the “left” of t=tx. This has a practical impact on the interpolation operation. As discussed previously, generally speaking, the more data samples used in the interpolation, the better the interpolation result. Real-time fusion thus reduces the number of data samples that can be used in the interpolation operation for generating the interpolated data sample (Xi, Yi, Zi). Accordingly, the accuracy of the fused location estimates is expected to be lower in the case of real-time fusion of unsynchronized streams, when compared to that of post-processed fusion of unsynchronized streams, which is discussed below. -
FIG. 8B shows an exemplary post-process fusion of unsynchronized streams of location data. Post-process techniques can be used when fused results can be generated after the two (or more) streams of location data P1 and P2 have been received. Thus, the fused location result for t=tx is computed, not at or near time t=tx, but at a later time when all the sample points from streams P1 and P2 shown in FIG. 8A have been received, including data samples at or to the “left” of t=tx and data samples to the “right” of t=tx. Consequently, for post-process fusion, the interpolated data sample (Xi, Yi, Zi) can be based on the interpolation of all the data points from stream P2, not just those that are at or to the “left” of t=tx. Post-process fusion thus increases the number of data samples that can be used in the interpolation. Compared to real-time fusion, post-process fusion can be expected to generate relatively more accurate fused location estimates. -
FIG. 8C shows three unsynchronized streams of location data. In particular, FIG. 8C shows how more than two streams of data can be combined together. Furthermore, FIG. 8C also shows that streams having different sampling rates can be combined. Even though the sampling rate of stream P3 is slower than the sampling rates of streams P1 and P2, the aforementioned interpolation techniques can still be used to calculate sample (Xj, Yj, Zj). Eq(3) below, which shows how any number of samples can be fused together, can then be used to combine all three streams of data at time tx. - {t=tx, location=((Xa+Xb+ . . . +Xn)/n, (Ya+Yb+ . . . +Yn)/n, (Za+Zb+ . . . +Zn)/n)}  Eq(3) -
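Eq(3) generalizes the fusion to any number of streams. A direct transcription, assuming the per-stream samples have already been time-aligned or interpolated as described:

```python
def fuse_eq3(samples):
    """Eq(3): average each coordinate over the n contributing samples,
    one per stream, all taken (or interpolated) at the same time tx."""
    n = len(samples)
    return tuple(sum(coords) / n for coords in zip(*samples))

# Samples at tx from P1 and P2, plus one interpolated from the slower
# stream P3 (values are illustrative):
fused = fuse_eq3([(1.0, 2.0, 0.0), (3.0, 4.0, 0.0), (2.0, 3.0, 3.0)])
```

With the two-stream case, fuse_eq3 reduces to the midpoint of Eq(1).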
- The various aspects, embodiments, implementations or features of the described embodiments can be used separately or in any combination. Various aspects of the described embodiments can be implemented by software, hardware or a combination of hardware and software. The described embodiments can also be embodied as computer readable code on a computer readable medium for controlling operations of a navigation system or as computer readable code on a computer readable medium for controlling the operation of an automobile in accordance with a navigation route. The computer readable medium is any data storage device that can store data which can thereafter be read by a computer system. Examples of the computer readable medium include read-only memory, random-access memory, CD-ROMs, HDDs, DVDs, magnetic tape, and optical data storage devices. The computer readable medium can also be distributed over network-coupled computer systems so that the computer readable code is stored and executed in a distributed fashion.
- The foregoing description, for purposes of explanation, used specific nomenclature to provide a thorough understanding of the described embodiments. However, it will be apparent to one skilled in the art that the specific details are not required in order to practice the described embodiments. Thus, the foregoing descriptions of specific embodiments are presented for purposes of illustration and description. They are not intended to be exhaustive or to limit the described embodiments to the precise forms disclosed. It will be apparent to one of ordinary skill in the art that many modifications and variations are possible in view of the above teachings.
Claims (19)
1-9. (canceled)
10. A method, comprising:
receiving first position information from a first GNSS receiver of an imaging device;
receiving second position information from a second GNSS receiver of an auxiliary device proximate the imaging device;
combining the first position information with the second position information to determine a position estimate of the imaging device;
associating the position estimate of the imaging device with imagery recorded by the imaging device; and
storing the location of the imaging device and associated imagery to a computer readable storage medium.
11. The method of claim 10, wherein the position estimate is associated with imagery taken at substantially the same time as the first and second position information was received.
12. The method of claim 10, wherein storing the location of the imaging device and associated imagery comprises associating the position estimate with a frame of a video file recorded by the imaging device.
13. The method of claim 10, wherein combining the first position information with the second position information comprises weighting the second position information more than the first position information.
14. The method of claim 13, wherein the auxiliary device is a cellular phone that receives position information from both the second GNSS receiver and other data sources.
15. The method of claim 14, wherein the other data sources comprise cellular triangulation and WiFi positioning services.
16-20. (canceled)
21. The method of claim 10, further comprising combining the first position information and the second position information with third position information received from a third GNSS receiver.
22. The method of claim 10, wherein the imaging device and the auxiliary device are traveling at high speeds, and wherein the method further comprises time synchronizing data from the auxiliary device and data from the imaging device prior to combining the first position information and the second position information.
23. The method of claim 10, wherein the imaging device comprises a dashboard camera configured to record video through a window of a vehicle.
24. The method of claim 10, wherein the imaging device and the auxiliary device are configured to support bidirectional communication.
25. The method of claim 10, wherein the position estimate is determined partially based on a field of view and an orientation of the imaging device and a rate at which an event captured by the imaging device tracks across a screen of the imaging device.
26. The method of claim 25, wherein the position estimate is calculated by analyzing multiple frames of a video taken by the imaging device.
27. The method of claim 10, further comprising sharing data between the imaging device and the auxiliary device when the imaging device and the auxiliary device share substantially a same inertial reference frame.
28. The method of claim 27, wherein sharing data between the imaging device and the auxiliary device when the imaging device and the auxiliary device share substantially a same inertial reference frame further comprises discarding the first position information and the second position information when the imaging device and the auxiliary device are moving at different speeds or in different directions.
29. The method of claim 10, further comprising requiring the imaging device and the auxiliary device to be within a particular distance from one another.
30. The method of claim 10, wherein the first position information is combined with the second position information to determine the position estimate of the imaging device only when an accuracy of the second position information falls beneath a threshold.
31. The method of claim 10, wherein the first position information is combined with the second position information to determine the position estimate of the imaging device only when a predicted course of either the auxiliary device or the imaging device is calculated to pass through a region of poor network reception.
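The conditional, weighted combining recited in claims 13 and 30 can be sketched as follows. This is an illustrative assumption, not the claimed method itself: the function name, the inverse-accuracy weighting scheme, and the reading of "accuracy falls beneath a threshold" as an error radius in meters are all choices made for the example.

```python
def combine_positions(pos1, pos2, acc1, acc2, threshold=10.0):
    """Fuse two (lat, lon) fixes by inverse-accuracy weighting.

    pos1/acc1: imaging-device fix and its accuracy radius in meters.
    pos2/acc2: auxiliary-device fix and its accuracy radius in meters.
    Fusion runs only when the auxiliary fix's accuracy radius falls
    beneath the threshold (a smaller radius means a better fix);
    otherwise the imaging device's own fix is returned unchanged.
    """
    if acc2 >= threshold:
        return pos1
    w1, w2 = 1.0 / acc1, 1.0 / acc2  # better accuracy -> larger weight
    total = w1 + w2
    return tuple((a * w1 + b * w2) / total for a, b in zip(pos1, pos2))

# The auxiliary fix is twice as accurate, so it is weighted twice as
# heavily, pulling the estimate toward pos2 (consistent with claim 13).
estimate = combine_positions((37.0, -122.0), (37.0006, -122.0006),
                             acc1=10.0, acc2=5.0, threshold=8.0)
```

With these numbers the fused estimate lands two-thirds of the way from the imaging-device fix to the auxiliary fix; when `acc2` is at or above the threshold, the auxiliary data is ignored entirely, matching the "only when" condition of claim 30.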
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US17/398,692 US20210373181A1 (en) | 2016-06-30 | 2021-08-10 | Geo-fusion between imaging device and mobile device |
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201662357160P | 2016-06-30 | 2016-06-30 | |
US15/639,176 US11092695B2 (en) | 2016-06-30 | 2017-06-30 | Geo-fusion between imaging device and mobile device |
US17/398,692 US20210373181A1 (en) | 2016-06-30 | 2021-08-10 | Geo-fusion between imaging device and mobile device |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/639,176 Division US11092695B2 (en) | 2016-06-30 | 2017-06-30 | Geo-fusion between imaging device and mobile device |
Publications (1)
Publication Number | Publication Date |
---|---|
US20210373181A1 (en) | 2021-12-02 |
Family
ID=62147583
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/639,176 Active 2038-11-08 US11092695B2 (en) | 2016-06-30 | 2017-06-30 | Geo-fusion between imaging device and mobile device |
US17/398,692 Pending US20210373181A1 (en) | 2016-06-30 | 2021-08-10 | Geo-fusion between imaging device and mobile device |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/639,176 Active 2038-11-08 US11092695B2 (en) | 2016-06-30 | 2017-06-30 | Geo-fusion between imaging device and mobile device |
Country Status (1)
Country | Link |
---|---|
US (2) | US11092695B2 (en) |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10732297B2 (en) * | 2016-08-30 | 2020-08-04 | Faraday&Future Inc. | Geo-pairing detection |
DE102018206658A1 (en) * | 2018-04-30 | 2019-10-31 | Audi Ag | Method for operating electronic data glasses in a motor vehicle and electronic data glasses |
GB2624026A (en) * | 2022-11-04 | 2024-05-08 | Nokia Technologies Oy | Method, apparatus and computer program |
GB2624028A (en) * | 2022-11-04 | 2024-05-08 | Nokia Technologies Oy | Method, apparatus and computer program |
Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20060187317A1 (en) * | 2005-02-24 | 2006-08-24 | Memory Matrix, Inc. | Systems and methods for processing images with positional data |
US20060257122A1 (en) * | 2003-04-08 | 2006-11-16 | Koninklijke Philips Electronics N.V. | Method of position stamping a photo or video clip taken with a digital camera |
US20070211143A1 (en) * | 2006-03-10 | 2007-09-13 | Brodie Keith J | Systems and methods for prompt picture location tagging |
US7558696B2 (en) * | 2000-06-30 | 2009-07-07 | Nokia Corporation | Method and device for position determination |
US7610123B2 (en) * | 2005-01-04 | 2009-10-27 | Deere & Company | Vision-aided system and method for guiding a vehicle |
US7646336B2 (en) * | 2006-03-24 | 2010-01-12 | Containertrac, Inc. | Automated asset positioning for location and inventory tracking using multiple positioning techniques |
CN102252681A (en) * | 2011-04-18 | 2011-11-23 | 中国农业大学 | Global positioning system (GPS) and machine vision-based integrated navigation and positioning system and method |
US20150073697A1 (en) * | 2012-11-27 | 2015-03-12 | CloudCar Inc. | Geographical location aggregation from multiple sources |
US9250097B2 (en) * | 2009-07-23 | 2016-02-02 | Broadcom Corporation | Coupled GPS phone and navigation system |
Family Cites Families (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5563607A (en) * | 1994-05-26 | 1996-10-08 | Trimble Navigation Limited | Time and/or location tagging of an event |
US6275707B1 (en) * | 1999-10-08 | 2001-08-14 | Motorola, Inc. | Method and apparatus for assigning location estimates from a first transceiver to a second transceiver |
US7139651B2 (en) * | 2004-03-05 | 2006-11-21 | Modular Mining Systems, Inc. | Multi-source positioning system for work machines |
US8606299B2 (en) * | 2006-01-09 | 2013-12-10 | Qualcomm Incorporated | Apparatus and methods for geographic position approximation of an event occurring on a wireless device |
US7769393B2 (en) * | 2006-03-27 | 2010-08-03 | Sony Ericsson Mobile Communications Ab | Cooperative global positioning system (GPS) processing by mobile terminals that communicate via an ad hoc wireless network |
CN101711368B (en) * | 2007-06-26 | 2013-09-11 | 瑞士优北罗股份有限公司 | Method and system for processing of satellite positioning system signals |
US20090079622A1 (en) * | 2007-09-26 | 2009-03-26 | Broadcom Corporation | Sharing of gps information between mobile devices |
US20090115657A1 (en) * | 2007-11-06 | 2009-05-07 | Ann-Tzung Cheng | Gnss receiver system and related method thereof |
US20090187300A1 (en) * | 2008-01-22 | 2009-07-23 | David Wayne Everitt | Integrated vehicle computer system |
US7659848B2 (en) * | 2008-04-29 | 2010-02-09 | Geotate B.V. | Event location determination |
GB2463481A (en) * | 2008-09-12 | 2010-03-17 | Geotate Bv | Means for estimating the location of an event using a satellite positioning system |
US20110102637A1 (en) * | 2009-11-03 | 2011-05-05 | Sony Ericsson Mobile Communications Ab | Travel videos |
US8364405B2 (en) * | 2009-12-18 | 2013-01-29 | Caterpillar Inc. | Surface mapping system and method |
US8787184B2 (en) * | 2010-02-12 | 2014-07-22 | Broadcom Corporation | Collaborative sharing of location information among devices in a network |
EP2530487B1 (en) * | 2011-06-01 | 2014-10-01 | u-blox A.G. | Satellite positioning with assisted calculation |
US20140375807A1 (en) * | 2013-06-25 | 2014-12-25 | Zf Friedrichshafen Ag | Camera activity system |
- 2017-06-30: US application 15/639,176 filed, granted as US 11092695 B2 (Active)
- 2021-08-10: US application 17/398,692 filed, published as US 20210373181 A1 (Pending)
Also Published As
Publication number | Publication date |
---|---|
US20180143327A1 (en) | 2018-05-24 |
US11092695B2 (en) | 2021-08-17 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20210373181A1 (en) | Geo-fusion between imaging device and mobile device | |
US10267924B2 (en) | Systems and methods for using a sliding window of global positioning epochs in visual-inertial odometry | |
US10371530B2 (en) | Systems and methods for using a global positioning system velocity in visual-inertial odometry | |
KR101730534B1 (en) | Camera enabled headset for navigation | |
US10088318B2 (en) | Cradle rotation insensitive inertial navigation | |
WO2019121746A1 (en) | Broadcast and utilization of precise gnss correction data | |
US10132933B2 (en) | Alignment of visual inertial odometry and satellite positioning system reference frames | |
WO2020146283A1 (en) | Vehicle pose estimation and pose error correction | |
US11536856B2 (en) | Position-window extension for GNSS and visual-inertial-odometry (VIO) fusion | |
US9607518B2 (en) | Vehicle accident recorder and method for generating accident information thereof | |
US20180188382A1 (en) | Selection of gnss data for positioning fusion in urban environments | |
US20180188381A1 (en) | Motion propagated position for positioning fusion | |
JP6816768B2 (en) | Image processing equipment and image processing method | |
WO2018016151A1 (en) | Image processing device and image processing method | |
KR20130129137A (en) | Apparatus and method for estimating location of mobile station in wireless local area network | |
KR20140023564A (en) | Precise positioning system using global navigation satellite system and a corresponding method thereof | |
CN112833880A (en) | Vehicle positioning method, positioning device, storage medium, and computer program product | |
KR100926274B1 (en) | The camera system for producing the panorama of a map information | |
JP2011226972A (en) | Scenery prediction display system for traffic means and display means for use in the same | |
JP2012124554A (en) | Imaging apparatus | |
KR101311998B1 (en) | Vehicle Hybrid Positioning System Using Multiple data | |
US20220260375A1 (en) | Sensor package for infrastructure data collection | |
WO2023065810A1 (en) | Image acquisition method and apparatus, mobile terminal, and computer storage medium | |
JP2022191814A (en) | recording device | |
CN203191563U (en) | Tour guide aid terminal system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| STPP | Information on status: patent application and granting procedure in general | DOCKETED NEW CASE - READY FOR EXAMINATION |
| AS | Assignment | Owner name: FARADAY&FUTURE INC., CALIFORNIA. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNOR: SONG, XIUFENG; REEL/FRAME: 061959/0349; effective date: 20160629 |
| STPP | Information on status: patent application and granting procedure in general | NON FINAL ACTION MAILED |
| STPP | Information on status: patent application and granting procedure in general | RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
| STPP | Information on status: patent application and granting procedure in general | FINAL REJECTION MAILED |