CN110174686B - Method, device and system for matching GNSS (global navigation satellite system) position and image in crowdsourcing map - Google Patents

Method, device and system for matching GNSS (global navigation satellite system) position and image in crowdsourcing map

Info

Publication number
CN110174686B
Authority
CN
China
Prior art keywords
gnss
matching
image
module
error
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910303781.4A
Other languages
Chinese (zh)
Other versions
CN110174686A (en)
Inventor
孙鹏 (Sun Peng)
马常杰 (Ma Changjie)
吴龙 (Wu Long)
赵子龙 (Zhao Zilong)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Baidu Online Network Technology Beijing Co Ltd
Original Assignee
Baidu Online Network Technology Beijing Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Baidu Online Network Technology Beijing Co Ltd filed Critical Baidu Online Network Technology Beijing Co Ltd
Priority to CN201910303781.4A priority Critical patent/CN110174686B/en
Publication of CN110174686A publication Critical patent/CN110174686A/en
Application granted granted Critical
Publication of CN110174686B publication Critical patent/CN110174686B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S19/00 - Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S19/38 - Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
    • G01S19/39 - Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S19/42 - Determining position

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Navigation (AREA)
  • Traffic Control Systems (AREA)

Abstract

The invention relates to the technical field of electronic maps and discloses a method for matching GNSS positions and images in a crowdsourced map. The method comprises the following steps: matching the image acquired by the camera module with the corresponding GNSS position acquired by the GNSS module according to the same time; the method further comprises adopting at least one of the following modes to improve the matching precision: 1) setting the clock of the camera module by taking the clock of the GNSS module as a reference; 2) adjusting the output timestamp of the GNSS position; 3) the image is grabbed from a plurality of images acquired by the camera module through the onPreviewFrame() function; 4) after matching, the matching error is adjusted. The embodiment of the invention can effectively solve the problem of high-precision matching between the GNSS position and camera frame capture and reduces the registration error.

Description

Method, device and system for matching GNSS (global navigation satellite system) position and image in crowdsourcing map
Technical Field
The invention relates to the field of maps, in particular to a method for matching GNSS positions and images in a crowdsourcing map, a device for matching the GNSS positions and the images in the crowdsourcing map, a system for matching the GNSS positions and the images in the crowdsourcing map, a vehicle event data recorder and a storage medium.
Background
Updating a high-precision map in a crowdsourcing mode is an important part of high-precision map production. In the crowdsourcing mode, changes to map elements in the real world are captured economically, quickly and at scale by consumer-grade driving recorders or similar low-cost equipment (hereinafter collectively referred to as crowdsourcing acquisition equipment), and the detected changes are then merged and released into the high-precision map product, so that intelligent driving vehicles obtain more accurate, reliable and fresh high-precision map data. GNSS (Global Navigation Satellite System) position and camera frame matching refers to the requirement that, when image data is acquired by the camera of a driving recorder in a moving vehicle, each frame must be accurately matched with the geographic position acquired by the GNSS module at the moment the frame was captured, so that the accurate geographic position of the map elements in the image can be obtained.
The conventional approach is to extract frames after recording a video and then match the GNSS trajectory against the timestamp of each frame (a timestamp is complete, verifiable data, usually a character sequence, that uniquely identifies a particular moment and can prove that a piece of data existed before that time), thereby obtaining the position coordinates (such as longitude and latitude) of the current image. However, several factors, such as (1) a large recording error in the video-recording start timestamp, (2) errors in the way the image timestamps are matched to the GNSS track timestamps, and (3) a large deviation between the GNSS module clock and the driving recorder clock, make the matching error close to one second and unreliable, which cannot meet the requirements of high-precision map crowdsourcing production. A high-precision GNSS position and camera frame matching technique is therefore developed, which is of great significance for updating high-precision map data in the crowdsourcing mode.
Disclosure of Invention
The invention aims to reduce the registration error of crowdsourcing map acquisition equipment when it acquires image data together with its GNSS position, and at least to solve the problem of matching precision between the GNSS position and camera frame capture.
In order to achieve the above object, a first aspect of the present invention provides a method for matching GNSS positions and images in a crowd-sourced map, including: matching the image acquired by the camera module with the GNSS position corresponding to the image acquired by the GNSS module according to the same time; the method further comprises adopting at least one of the following modes to improve the matching precision:
1) setting the clock of the camera module by taking the clock of the GNSS module as a reference;
2) adjusting an output timestamp of the GNSS location;
3) the image is grabbed from a plurality of images acquired by the camera module through the onPreviewFrame() function;
4) after matching, the matching error is adjusted.
Optionally, setting the clock of the camera module with the clock of the GNSS module as a reference includes: and updating the clock of the camera module periodically by adopting the clock of the GNSS module.
Optionally, the adjusting the output timestamp of the GNSS position includes:
t1 is the duration of the camera module or the equipment to which the camera module belongs after the system is started;
t2 is the time duration since "system start" of the GNSS module at the time of location update;
calculating T1-T2 to obtain a difference value;
obtaining an output timestamp T3 of the GNSS location;
and adjusting T3 by using the difference value to obtain the adjusted output timestamp of the GNSS position.
Optionally, the time durations are all nanoseconds.
Optionally, the method 3) further includes: setting a 'frame taking completion' mark; before the frame fetching is started, a 'frame fetching completion' flag is set as 'false', and after the frame fetching is completed, a 'frame fetching completion' flag is set as 'true'.
Optionally, the matching error in the mode 4) includes: frame rate error and/or system call error;
the frame rate error is a time error caused by onLocationChanged () when the onPreviewFrame () is called to take a frame;
the system call error refers to a time error caused by the call overhead of the operating system.
Optionally, the adjusting the matching error in the manner 4) includes: adjusting the GNSS position according to the motion information of the GNSS module at the moment corresponding to the GNSS position; the motion information includes a motion speed and a motion direction.
Optionally, the adjusting of the matching error in manner 4) includes: obtaining an adjusted GNSS position by adopting the ST_Project function in PostGIS, taking the GNSS position before adjustment, the movement direction and the error distance as parameters; the error distance is equal to {(T1-T2) + (T4-T5)} × motion speed, wherein:
t1 is the time length of the camera module or the equipment thereof after the system is started,
t2 is the time duration since "system start" of the GNSS module at the time of location update,
t4 is an output timestamp corresponding to the GNSS position before the adjustment;
and T5 is a timestamp corresponding to the camera module or the device to which the camera module belongs at the frame taking time.
In a second aspect of the present invention, there is also provided a device for matching GNSS positions with images in a crowd-sourced map, the device comprising: a memory and a processor;
the memory to store program instructions;
the processor is configured to invoke the program instructions stored in the memory to implement the matching method to match GNSS locations to images in a crowd-sourced map as described above.
Optionally, the device further includes a data input interface and a data output interface, where the data input interface is configured to input the GNSS position and image to be matched, and the data output interface is configured to output the successfully matched GNSS position and image.
Optionally, the data input interface is matched with an image output interface of the automobile data recorder.
In a third aspect of the present invention, there is provided a system for matching a GNSS position with an image in a crowd-sourced map, including a GNSS positioning apparatus and a camera apparatus, and further including a matching apparatus,
the matching device is configured to match the GNSS position output by the GNSS positioning device with the image output by the camera device according to the matching method described above.
In a fourth aspect of the present invention, a vehicle event data recorder is further provided, including a GNSS module and a camera module, and further including an image output interface, where the image output interface is configured to output an image that has been well matched with a GNSS position, and the image and the GNSS position are matched by using the aforementioned matching method.
In a fifth aspect of the present invention, there is also provided a storage medium having stored therein instructions, which, when run on a computer, cause the computer to perform the matching method as described above.
The invention provides a scheme for matching GNSS positions and images in a crowdsourced map. The technical scheme can effectively solve the problem of high-precision matching between the GNSS position and camera frame capture, thereby reducing the registration error of the crowdsourcing map acquisition equipment when it acquires image data together with its GNSS position.
Drawings
FIG. 1 is a schematic flow chart of a matching method according to an embodiment of the present invention;
FIG. 2 is a flow chart illustrating an output timestamp adjustment for a GNSS position in accordance with an alternative embodiment of the present invention;
fig. 3 is a schematic diagram of a frame fetching process according to an alternative embodiment of the present invention;
fig. 4 is a schematic structural diagram of a system according to an embodiment of the present invention.
Detailed Description
The following detailed description of embodiments of the invention refers to the accompanying drawings. It should be understood that the detailed description and specific examples, while indicating the present invention, are given by way of illustration and explanation only, not limitation.
Fig. 1 is a schematic flow chart of a matching method according to an embodiment of the present invention. As shown in fig. 1, the method for matching GNSS positions and images in a crowdsourced map includes the following steps: acquiring an image through a camera module, acquiring a GNSS position corresponding to the image through a GNSS module, and matching the GNSS position with the image according to the same moment; the method further comprises adopting at least one of the following modes to improve the matching precision:
1) setting the clock of the camera module by taking the clock of the GNSS module as a reference;
2) adjusting an output timestamp of the GNSS location;
3) the image is grabbed from a plurality of images acquired by the camera module through the onPreviewFrame() function;
4) after matching, the matching error is adjusted.
In this way, the problem of high-precision matching between the GNSS position and camera frame capture can be effectively solved, and the registration error of the crowdsourcing map acquisition equipment when it acquires image data together with its GNSS position is reduced.
The existing method acquires an image through a camera module, acquires the GNSS position corresponding to the image through a GNSS module, and matches the GNSS position and the image according to the same time. This is the general approach in map crowdsourcing, and the low matching precision it causes has been described in the background. The implementation provided by the invention improves the matching precision in four respects (time synchronization, GNSS output timestamp, frame-grabbing method and error elimination); the four improvements cover the front, middle and rear stages of the matching method, and those skilled in the art can select among them according to actual conditions.
Specifically, before matching, the clock of the GNSS module and the clock of the camera module are first aligned; selecting a high-precision clock as the matching reference is the key to improving matching precision. After the clocks are aligned, the frame-grabbing procedure for the image and the output timestamp of the GNSS position are each adjusted so that the image and the GNSS position are as close as possible to the moment to be matched, which improves the matching precision. After matching is finished, possible residual matching errors are further adjusted to reduce them. Each of the four improvements raises the matching precision; one or more of them can be chosen in an actual scenario, and the technical effect is best when all of them are used. The four modes are described in detail below.
The method 1) sets the clock of the camera module with the clock of the GNSS module as a reference.
GNSS satellites carry high-precision atomic clocks (such as the GPS rubidium atomic clock), which are often used as reference clocks for communication systems. The GNSS module can solve for both its spatial coordinate position and its clock offset; after clock-offset correction the GNSS clock error is at the millisecond level and is stable and reliable, so setting the clock of the camera module with the GNSS module clock as the reference largely solves the alignment problem. Since most camera modules do not have their own clocks but obtain time from the operating system of the device they belong to, the operating-system clock of the device is set from the GNSS clock at regular intervals. When the device operating system is Android, the clock can be set through a system setting interface, e.g. the AlarmManager.setTime() interface.
Further, in order to maintain real-time synchronization of the clocks, the synchronization may be performed in a manner of periodic update, that is: and updating the clock of the camera module periodically by adopting the clock of the GNSS module. This way, the accurate synchronization of the clocks of the camera modules can be guaranteed.
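As an illustration of mode 1) and the periodic update described above, a minimal Android sketch follows. It is not the patent's own code: the class name, the 60-second update period and the reliance on AlarmManager.setTime() (which needs the privileged android.permission.SET_TIME permission, i.e. a system-signed acquisition app) are assumptions.

    import android.app.AlarmManager;
    import android.content.Context;
    import android.location.Location;
    import android.os.SystemClock;

    // Sketch: when a GNSS fix arrives, push the GNSS UTC time into the device system
    // clock so the camera module (which reads time from the OS) inherits it.
    public final class GnssClockSync {
        private static final long SYNC_PERIOD_MS = 60_000; // illustrative update period
        private final AlarmManager alarmManager;
        private long lastSyncElapsedMs;

        public GnssClockSync(Context context) {
            alarmManager = (AlarmManager) context.getSystemService(Context.ALARM_SERVICE);
        }

        /** Call from LocationListener.onLocationChanged(). */
        public void onGnssFix(Location fix) {
            long now = SystemClock.elapsedRealtime();
            if (now - lastSyncElapsedMs >= SYNC_PERIOD_MS) {
                alarmManager.setTime(fix.getTime()); // fix.getTime() = UTC time from the GNSS module
                lastSyncElapsedMs = now;
            }
        }
    }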
Manner 2) adjust an output timestamp of the GNSS location.
According to the Android documentation for the Location.getTime() function: every location generated by the LocationManager has a valid UTC time, but this time may have drifted by the moment the location is actually delivered. We therefore need to adjust the output timestamp of the GNSS position. The adjustment adopted here is based on the number of nanoseconds elapsed since "system start". Fig. 2 is a schematic flow diagram of adjusting the output timestamp of a GNSS position according to an alternative embodiment of the present invention. As shown in fig. 2, specifically, the number of nanoseconds T1 (denoted gpsCaptureTSNS) since "system start" of the camera module or of the device it belongs to (typically an Android device such as a driving recorder or an Android smart terminal) is obtained through the system function SystemClock.elapsedRealtimeNanos(); the number of nanoseconds T2 since "system start" at the moment of the location update is obtained through Location.getElapsedRealtimeNanos(), and the difference T1 - T2 (denoted gpsDifferenceTSfromNS) is calculated. Statistics show that if the timestamp output by the GNSS module is not adjusted with this difference, 10% of samples have an error of more than 30 ms and 4% of samples have an error of more than 100 ms. Finally, the output timestamp of the GNSS position is adjusted by gpsDifferenceTSfromNS to obtain the adjusted output timestamp of the GNSS position. In this way, adjusting the GNSS output timestamp with Location.getElapsedRealtimeNanos() reduces the error of 10% of the data by 30 ms.
Since the Android functions called above return times in nanoseconds, the time unit needs to be converted as required in specific situations.
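A minimal sketch of mode 2), using the Android functions named above and the variable names from the text (gpsCaptureTSNS, gpsDifferenceTSfromNS). The sign of the final correction reflects one reading of "adjusting T3 by the difference" and is an assumption.

    import android.location.Location;
    import android.os.SystemClock;

    // Sketch of the GNSS output-timestamp adjustment (mode 2), not the patent's verbatim code.
    public final class GnssTimestampAdjuster {

        /** Returns the adjusted output timestamp (UTC milliseconds) for a GNSS fix. */
        public static long adjustedGnssTimestampMs(Location fix) {
            // T1: nanoseconds since system start of the device owning the camera module,
            // sampled when the fix is handled (gpsCaptureTSNS in the text).
            long gpsCaptureTSNS = SystemClock.elapsedRealtimeNanos();

            // T2: nanoseconds since system start when the location was actually produced.
            long locationElapsedNs = fix.getElapsedRealtimeNanos();

            // Difference between handling time and generation time of the fix.
            long gpsDifferenceTSfromNS = gpsCaptureTSNS - locationElapsedNs;

            // T3: the UTC timestamp reported with the fix, shifted by the measured lag (ns -> ms).
            return fix.getTime() + gpsDifferenceTSfromNS / 1_000_000L;
        }
    }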
Mode 3) the image is grabbed from a plurality of images acquired by the camera module through the onPreviewFrame() function.
A conventional camera acquires frames by recording a video stream and then extracting frames with an FFmpeg-like tool, using a timestamp taken before MediaRecorder.prepare() as the start time; experiments show, however, that actual recording of the video only starts at MediaRecorder.start(), and statistics show a difference of about 30 ms between the two moments. Even with this correction, practical tests show that there is still an error of around 800 ms between the timestamp of the extracted frame and the ISP timestamp in the image.
Taking the above factors into account, we do not record the video stream and then extract frames; instead we acquire frames in real time from the video-stream preview of the Camera service, that is, whenever the GNSS outputs a spatial position, one frame of image is acquired from the android.hardware.Camera preview through the onPreviewFrame() callback.
Further, the mode 3) further includes: setting a 'frame taking completion' mark; before the frame fetching is started, a 'frame fetching completion' flag is set as 'false', and after the frame fetching is completed, a 'frame fetching completion' flag is set as 'true'. The effect of setting the "frame fetching completion" flag here is to determine whether the frame fetching process is finished and avoid process conflicts.
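A minimal sketch of mode 3) with the "frame taking completion" flag. The class and field names are illustrative, not taken from the patent, and the legacy android.hardware.Camera preview API is assumed (as implied by onPreviewFrame()).

    import android.hardware.Camera;
    import java.util.concurrent.atomic.AtomicBoolean;

    // Sketch: grab exactly one preview frame per GNSS position update, guarded by a
    // "frame taking completion" flag so concurrent grabs do not conflict.
    public final class PreviewFrameGrabber implements Camera.PreviewCallback {
        private final AtomicBoolean frameTakingComplete = new AtomicBoolean(true);
        private volatile boolean grabRequested;
        private volatile byte[] lastFrame;
        private volatile long lastFrameTs; // frame timestamp (frameTS in the text)

        /** Call from onLocationChanged(): request the next preview frame. */
        public void requestFrame() {
            // set the flag to "false" before starting to grab
            if (frameTakingComplete.compareAndSet(true, false)) {
                grabRequested = true;
            }
        }

        @Override
        public void onPreviewFrame(byte[] data, Camera camera) {
            if (!grabRequested) {
                return; // no GNSS-triggered request pending, ignore this preview frame
            }
            grabRequested = false;
            lastFrame = data.clone();                 // keep the frame for matching
            lastFrameTs = System.currentTimeMillis(); // timestamp of the grab
            frameTakingComplete.set(true);            // set the flag back to "true" when done
        }
    }

The grabber would be registered with camera.setPreviewCallback(grabber), and requestFrame() would be called from onLocationChanged().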
Mode 4) after matching, the matching error is adjusted.
The match error, as described herein, includes: frame rate error and/or system call error;
the frame rate error is a time error caused by onLocationChanged (i.e. the aforementioned location update) when the onprevivewframe () is called to fetch a frame; the time error is related to the frame rate of the imaging device.
The system call error refers to the time error caused by operating-system call overhead. It exists mainly because the Android system is not a real-time operating system such as BlackBerry's QNX, so system-call overhead introduces a time error.
By measuring the difference between gpsCaptureTS and the timestamp taken on entry to onPreviewFrame(), statistics show that, on a device with a camera frame rate of 30 fps, grabbing a preview frame when the GPS position changes introduces a time difference of about 1-100 ms, 58 ms on average. This time difference should therefore be combined with the GPS speed information to perform the position registration. An example registration SQL statement, implemented here with the ST_Project function of PostGIS, is as follows:
[Registration SQL example, shown only as an image in the original publication]
Here, the return value of geography ST_Project(geography g1, float distance, float azimuth) is the point obtained by projecting along a geodesic from the starting point g1, using an azimuth measured in radians and a distance measured in meters. gnss_location is the longitude/latitude position before adjustment; camera_frame_ts_adjusted_part is the accumulated value of diffFrameGPSTS and gpsDifferenceTSfromNS, and this accumulated value multiplied by the speed in the formula above gives the distance value; gnss_bearing is the direction of travel acquired by the GNSS. It is measured relative to true north (azimuth zero), with east at azimuth 90° (π/2), and so on.
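Because the original SQL is published only as an image, the sketch below is a reconstruction rather than the patent's actual statement: the table name (track_points), the target column, and the assumptions that camera_frame_ts_adjusted_part is stored in seconds and gnss_bearing in degrees are all illustrative; only the ST_Project(geography, distance, azimuth) call itself follows the text.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;

    // Sketch: shift each GNSS point forward along its travel bearing by the error distance.
    public final class PositionRegistration {
        public static void adjustPositions(String jdbcUrl) throws Exception {
            String sql =
                "UPDATE track_points "
              + "SET gnss_location_adjusted = ST_Project("
              + "    gnss_location::geography, "               // position before adjustment
              + "    camera_frame_ts_adjusted_part * speed, "  // error distance = accumulated time error (s) * speed (m/s)
              + "    radians(gnss_bearing))";                  // travel azimuth, converted to radians from true north
            try (Connection conn = DriverManager.getConnection(jdbcUrl);
                 PreparedStatement ps = conn.prepareStatement(sql)) {
                ps.executeUpdate();
            }
        }
    }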
The adjustment of the matching error in mode 4) includes: adjusting the GNSS position according to the motion information of the GNSS module at the moment corresponding to that GNSS position; the motion information includes a motion speed and a motion direction. Fig. 3 is a schematic diagram of the frame-grabbing process according to an alternative embodiment of the present invention. As shown in fig. 3, specifically, the adjustment of the matching error in mode 4) includes: obtaining the adjusted GNSS position by using the ST_Project function in PostGIS with the GNSS position before adjustment, the movement direction and the error distance as parameters; the error distance is equal to {(T1-T2) + (T4-T5)} × motion speed, wherein:
T1 is the time length since "system start" of the camera module or of the device to which the camera module belongs, T2 is the time length since "system start" of the GNSS module at the time of the location update, and the difference between the two is gpsDifferenceTSfromNS in fig. 3;
T4 is the output timestamp corresponding to the GNSS position before the adjustment, i.e. gpsCaptureTS;
T5 is the timestamp of the camera module or the device to which it belongs at the moment of grabbing the frame, i.e. frameTS; the difference between T4 and T5 is diffFrameGPSTS in fig. 3;
gpsCaptureTSAdjusted in fig. 3 is the accumulated value of diffFrameGPSTS and gpsDifferenceTSfromNS.
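The error-distance formula can be restated as a small numeric helper. The sketch below follows the variable names of Fig. 3; converting every timestamp to seconds before applying the formula is an assumption, since the text only gives the formula itself.

    // Sketch of the mode 4) error-distance computation: {(T1 - T2) + (T4 - T5)} × speed.
    public final class ErrorDistance {
        /**
         * @param t1    seconds since system start of the camera device (T1)
         * @param t2    seconds since system start of the GNSS module at the location update (T2)
         * @param t4    GNSS output timestamp before adjustment, in seconds (gpsCaptureTS, T4)
         * @param t5    frame timestamp of the camera device, in seconds (frameTS, T5)
         * @param speed motion speed in metres per second
         * @return error distance in metres
         */
        public static double errorDistanceMeters(double t1, double t2, double t4, double t5, double speed) {
            double gpsDifferenceTSfromNS = t1 - t2;                               // clock lag of the GNSS fix
            double diffFrameGPSTS = t4 - t5;                                      // frame grab vs. GNSS output offset
            double gpsCaptureTSAdjusted = gpsDifferenceTSfromNS + diffFrameGPSTS; // accumulated time error
            return gpsCaptureTSAdjusted * speed;
        }
    }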
The embodiment of the invention also provides a device for matching the GNSS position in the crowdsourced map with the image, which comprises: a memory and a processor;
The memory is configured to store program instructions; the processor is configured to invoke the program instructions stored in the memory to implement the matching method described above for matching GNSS positions to images in a crowdsourced map. The device here has numerical calculation and logical operation capabilities; it comprises at least a central processing unit (CPU) with data-processing capability, random access memory (RAM), read-only memory (ROM), various I/O ports and an interrupt system, and the like. The device can be common hardware such as a single-chip microcomputer, a chip or a processor; more commonly it is an intelligent terminal or the processor of a PC (personal computer).
The embodiment of the invention also provides a device for matching the GNSS position in the crowdsourced map with the image, which comprises: a matching module configured to match the image acquired by the camera module with the GNSS position corresponding to the image acquired by the GNSS module according to the same time; the device also comprises at least one of the following modules to improve the matching precision:
1) a clock setting module configured to set a clock of the camera module with reference to a clock of the GNSS module;
2) a timestamp adjustment module configured to adjust an output timestamp of the GNSS location;
3) a frame taking module configured to take a frame from a plurality of images acquired by the camera module through an onPreviewFrame () function;
4) an error adjustment module configured to adjust a matching error after the matching.
The clock setting module is further configured to periodically update the clock of the camera module with the clock of the GNSS module.
The timestamp adjustment module is configured to adjust an output timestamp of the GNSS location, including:
t1 is the duration of the camera module or the equipment to which the camera module belongs after the system is started;
t2 is the time duration since "system start" of the GNSS module at the time of location update;
calculating T1-T2 to obtain a difference value;
obtaining an output timestamp T3 of the GNSS location;
and adjusting T3 by using the difference value to obtain the adjusted output timestamp of the GNSS position. The time durations are all nanoseconds.
The frame fetching module is further configured to: setting a 'frame taking completion' mark; before the frame fetching is started, a 'frame fetching completion' flag is set as 'false', and after the frame fetching is completed, a 'frame fetching completion' flag is set as 'true'.
The matching error in the error adjustment module includes: frame rate error and/or system call error;
the frame rate error is a time error caused by onLocationChanged () when the onPreviewFrame () is called to take a frame;
the system call error refers to a time error caused by the call overhead of the operating system.
The error adjustment module is further configured to adjust the match error, including: adjusting according to the motion information of the GNSS module at the moment corresponding to the GNSS position; the motion information includes a motion speed and a motion direction.
The error adjustment module is further configured to adjust the match error, including: obtaining an adjusted GNSS position by adopting the ST_Project function in PostGIS and taking the GNSS position before adjustment, the movement direction and the error distance as parameters; the error distance is equal to {(T1-T2) + (T4-T5)} × motion speed, wherein:
t1 is the time length of the camera module or the equipment thereof after the system is started,
t2 is the time duration since "system start" of the GNSS module at the time of location update,
t4 is an output timestamp corresponding to the GNSS position before the adjustment;
and T5 is a timestamp corresponding to the camera module or the device to which the camera module belongs at the frame taking time.
The matching device further comprises a data input interface and a data output interface, wherein the data input interface is used for inputting the GNSS position and the image to be matched, and the data output interface is used for outputting the GNSS position and the image which are successfully matched. The device mainly provides two universal interfaces, so that the device can be conveniently and quickly connected with other equipment for use, and the application range of the device is expanded.
Further, the data input interface is adapted to the image output interface of an automobile data recorder. Most of the popular map acquisition devices in common use at present are automobile data recorders, which have both position-acquisition and video-acquisition functions. Adapting the data input interface to the image output interface of an automobile data recorder therefore allows the device provided by this embodiment to be conveniently connected to an automobile data recorder to complete the matching function. The data input interface is usually a USB or Mini-USB port.
The embodiment of the invention also provides a system for matching GNSS positions and images in a crowdsourced map, which comprises a GNSS positioning device, a camera device and a matching device. Fig. 4 is a schematic diagram of the system structure according to an embodiment of the present invention, and the connection between the three is shown in fig. 4. The matching device is configured to match the GNSS position output by the GNSS positioning device with the image output by the camera device according to the aforementioned matching method. The GNSS positioning device and the camera device may be two separate devices that respectively acquire the GNSS position and the image, or they may be integrated in a single device entity such as a driving recorder, a mobile phone or a tablet computer.
The embodiment of the invention also provides a vehicle event data recorder, which comprises a GNSS module, a camera module and an image output interface, wherein the image output interface is configured to output an image matched with the GNSS position, and the image and the GNSS position are matched by adopting the matching method. Generally, the vehicle event data recorder itself includes a GNSS module and a camera module, where the method execution device is a processor inside the vehicle event data recorder, and the processor can match the GNSS position with the image by using the matching method, and then output the result through an image output interface.
It should be noted here that images and video are not strictly distinguished in the present invention and its embodiments; from a technical point of view, a video is a sequence of continuous images. Even when a video is matched to a GNSS position, it is in essence the frames of the video that are matched to the corresponding GNSS positions.
Embodiments of the present invention also provide a storage medium having stored therein instructions that, when run on a computer, cause the computer to perform a matching method as described above.
The above embodiment is currently applied in a crowdsourcing system; the original registration error of about 1 s is reduced to less than 50 ms, an improvement of more than a factor of 10, and the system runs stably with good robustness.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In a typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include forms of volatile memory in a computer readable medium, Random Access Memory (RAM) and/or non-volatile memory, such as Read Only Memory (ROM) or flash memory (flash RAM). The memory is an example of a computer-readable medium.
Computer-readable media, including both permanent and non-permanent, removable and non-removable media, may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, program modules, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information that can be accessed by a computing device. As defined herein, computer-readable media do not include transitory computer-readable media such as modulated data signals and carrier waves.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The above are merely examples of the present application and are not intended to limit the present application. Various modifications and changes may occur to those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present application should be included in the scope of the claims of the present application.

Claims (13)

1. A method for matching GNSS positions with images in a crowdsourced map comprises the following steps:
setting a clock of the camera module by taking the clock of the GNSS module as a reference;
adjusting an output timestamp of a GNSS position acquired by the GNSS module;
matching the image acquired by the camera module with the GNSS position corresponding to the image acquired by the adjusted GNSS module according to the same time; the image is framed from a plurality of images acquired by the camera module through an onPreviewFrame () function; characterized in that the method further comprises:
after matching, adjusting the matching error; the match error is related to the onPreviewFrame () function;
wherein adjusting the output timestamp of the GNSS position obtained by the GNSS module comprises:
t1 is the duration of the camera module or the equipment to which the camera module belongs after the system is started;
t2 is the time duration since "system start" of the GNSS module at the time of location update;
calculating T1-T2 to obtain a difference value;
obtaining an output timestamp T3 of the GNSS location;
and adjusting T3 by using the difference value to obtain an adjusted output time stamp of the GNSS position.
2. The method of claim 1, wherein setting the clock of the camera module with reference to the clock of the GNSS module comprises: and updating the clock of the camera module periodically by adopting the clock of the GNSS module.
3. The method of claim 2, wherein the time durations are all nanoseconds.
4. The method of claim 1, wherein when the image is framed from a plurality of images acquired by the camera module by an onPreviewFrame () function, the method further comprises: setting a 'frame taking completion' mark; before the frame fetching is started, a 'frame fetching completion' flag is set as 'false', and after the frame fetching is completed, a 'frame fetching completion' flag is set as 'true'.
5. The method of claim 1, wherein the match error comprises: frame rate error and/or system call error;
the frame rate error is a time error caused by onLocationChanged () when the onPreviewFrame () is called to take a frame;
the system call error refers to a time error caused by the call overhead of the operating system.
6. The method of claim 1, wherein the adjusting the match error comprises: adjusting the GNSS position according to the motion information of the GNSS module at the moment corresponding to the GNSS position; the motion information includes a motion speed and a motion direction.
7. The method of claim 6, wherein the adjusting the match error comprises: obtaining an adjusted GNSS position by adopting an ST_Project function in a PostGIS and taking the GNSS position before adjustment, the movement direction and the error distance as parameters; the error distance is equal to {(T1-T2) + (T4-T5)} × motion speed, wherein:
t1 is the time length of the camera module or the equipment thereof after the system is started,
t2 is the time duration since "system start" of the GNSS module at the time of location update,
t4 is an output timestamp corresponding to the GNSS position before the adjustment;
and T5 is a timestamp corresponding to the camera module or the device to which the camera module belongs at the frame taking time.
8. An apparatus for matching GNSS positions to images in a crowd-sourced map, the apparatus comprising: a memory and a processor;
the memory to store program instructions;
the processor to invoke the program instructions stored in the memory to implement the matching method of any of claims 1-7 to match GNSS locations to images in a crowd-sourced map.
9. The device of claim 8, further comprising a data input interface for inputting the GNSS position and image to be matched and a data output interface for outputting the GNSS position and image successfully matched.
10. The device of claim 9, wherein the data input interface matches an image output interface of a tachograph.
11. A matching system of GNSS positions and images in a crowdsourcing map comprises a GNSS positioning device and a camera device; characterized in that the system further comprises a matching device configured to: matching the GNSS position output by the GNSS positioning apparatus and the image output by the camera apparatus according to the matching method of any one of claims 1 to 7.
12. A tachograph comprising a GNSS module and a camera module, characterized in that it further comprises an image output interface configured to output an image that has been matched to a GNSS position, said image and GNSS position being matched using the matching method of any of claims 1 to 7.
13. A storage medium having stored therein instructions which, when run on a computer, cause the computer to perform the matching method according to any one of claims 1-7.
CN201910303781.4A 2019-04-16 2019-04-16 Method, device and system for matching GNSS (global navigation satellite system) position and image in crowdsourcing map Active CN110174686B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910303781.4A CN110174686B (en) 2019-04-16 2019-04-16 Method, device and system for matching GNSS (global navigation satellite system) position and image in crowdsourcing map

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910303781.4A CN110174686B (en) 2019-04-16 2019-04-16 Method, device and system for matching GNSS (global navigation satellite system) position and image in crowdsourcing map

Publications (2)

Publication Number Publication Date
CN110174686A CN110174686A (en) 2019-08-27
CN110174686B true CN110174686B (en) 2021-09-24

Family

ID=67689540

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910303781.4A Active CN110174686B (en) 2019-04-16 2019-04-16 Method, device and system for matching GNSS (global navigation satellite system) position and image in crowdsourcing map

Country Status (1)

Country Link
CN (1) CN110174686B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110645995B (en) * 2019-09-29 2021-07-13 百度在线网络技术(北京)有限公司 Map navigation simulation method and device
CN110647877B (en) * 2019-10-30 2022-11-25 武汉中海庭数据技术有限公司 Three-dimensional traffic facility positioning and deviation rectifying method and device based on neural network
CN111336995A (en) * 2020-05-19 2020-06-26 北京数字绿土科技有限公司 Multi-camera high-precision time synchronization and control device and method
CN111986347B (en) * 2020-07-20 2022-07-22 汉海信息技术(上海)有限公司 Device management method, device, electronic device and storage medium
CN114623838A (en) * 2022-03-04 2022-06-14 智道网联科技(北京)有限公司 Map data acquisition method and device based on Internet of vehicles and storage medium

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109587405A (en) * 2018-10-24 2019-04-05 科大讯飞股份有限公司 Method for synchronizing time and device

Family Cites Families (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1777233A (en) * 2004-11-16 2006-05-24 上海镁原信息技术有限公司 Method for position digital photograph pickup using GPS
KR101364534B1 (en) * 2006-11-16 2014-02-18 삼성전자주식회사 System for inputting position information in image and method thereof
CN102354449B (en) * 2011-10-09 2013-09-04 昆山市工业技术研究院有限责任公司 Networking-based method for realizing image information sharing for vehicle and device and system thereof
US9111402B1 (en) * 2011-10-31 2015-08-18 Replicon, Inc. Systems and methods for capturing employee time for time and attendance management
CN104133899B (en) * 2014-08-01 2017-10-13 百度在线网络技术(北京)有限公司 The generation method and device in picture searching storehouse, image searching method and device
CN104615673B (en) * 2015-01-20 2018-06-12 百度在线网络技术(北京)有限公司 A kind of client end interface shows method and device
CN105243119B (en) * 2015-09-29 2019-05-24 百度在线网络技术(北京)有限公司 Determine region to be superimposed, superimposed image, image presentation method and the device of image
CN106572193A (en) * 2016-11-23 2017-04-19 袁峰 Method for providing GPS information of acquisition point of image acquisition device for image acquisition device
CN107707962B (en) * 2017-09-05 2020-01-07 百度在线网络技术(北京)有限公司 Method for realizing synchronization of video frame data and GPS time position and FPGA
CN107613159B (en) * 2017-10-12 2024-05-14 北京工业职业技术学院 Image time calibration method and system
CN108200341A (en) * 2018-01-15 2018-06-22 青岛海信移动通信技术股份有限公司 Seamless handover method, device and the terminal device of camera
CN108921894B (en) * 2018-06-08 2021-06-29 百度在线网络技术(北京)有限公司 Object positioning method, device, equipment and computer readable storage medium
CN108957505A (en) * 2018-06-27 2018-12-07 四川斐讯信息技术有限公司 A kind of localization method, positioning system and portable intelligent wearable device
CN109270545B (en) * 2018-10-23 2020-08-11 百度在线网络技术(北京)有限公司 Positioning true value verification method, device, equipment and storage medium
CN109194436B (en) * 2018-11-01 2020-08-07 百度在线网络技术(北京)有限公司 Sensor timestamp synchronous testing method, device, equipment, medium and vehicle

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109587405A (en) * 2018-10-24 2019-04-05 科大讯飞股份有限公司 Method for synchronizing time and device

Also Published As

Publication number Publication date
CN110174686A (en) 2019-08-27

Similar Documents

Publication Publication Date Title
CN110174686B (en) Method, device and system for matching GNSS (global navigation satellite system) position and image in crowdsourcing map
US20200218906A1 (en) Robust lane association by projecting 2-d image into 3-d world using map information
CN111077555B (en) Positioning method and device
CN102980556A (en) Distance measuring method and device
WO2019134180A1 (en) Vehicle positioning method and device, electronic apparatus, and medium
CN111538032B (en) Time synchronization method and device based on independent drawing tracks of camera and laser radar
CN104748730A (en) Device and method for determining exposure moment of aerial survey camera in unmanned aerial vehicle
CN111007554A (en) Data acquisition time synchronization system and method
CN110319850B (en) Method and device for acquiring zero offset of gyroscope
CN107272038B (en) High-precision positioning method and device
CN114136315B (en) Monocular vision-based auxiliary inertial integrated navigation method and system
CN110023778B (en) Positioning method and device
CN114614934B (en) Time synchronization triggering device and method
CN113674424B (en) Method and device for drawing electronic map
CN114114369A (en) Autonomous vehicle positioning method and apparatus, electronic device, and storage medium
CN116678406B (en) Combined navigation attitude information determining method and device, terminal equipment and storage medium
CN111397602A (en) High-precision positioning method and device integrating broadband electromagnetic fingerprint and integrated navigation
CN110779517A (en) Data processing method and device of laser radar, storage medium and computer terminal
CN116147622A (en) Combined navigation system fusion positioning method based on graph optimization
CN113922910B (en) Sensor time synchronization processing method, device and system
CN112348903B (en) Method and device for calibrating external parameters of automobile data recorder and electronic equipment
JP2012211840A (en) Position information correction system
EP3956690B1 (en) System and method for converging mediated reality positioning data and geographic positioning data
CN113484879B (en) Positioning method and device of wearable device
CN113890665A (en) Time synchronization method, system, storage medium and processor

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant