CN113865620A - Time synchronization method and device for AR navigation simulation - Google Patents


Info

Publication number
CN113865620A
Authority
CN
China
Prior art keywords
data
read
video frame
reading
navigation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111189337.8A
Other languages
Chinese (zh)
Inventor
李映辉
冯遥
卓先进
周志鹏
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Apollo Zhilian Beijing Technology Co Ltd
Original Assignee
Apollo Zhilian Beijing Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Apollo Zhilian Beijing Technology Co Ltd filed Critical Apollo Zhilian Beijing Technology Co Ltd
Priority to CN202111189337.8A priority Critical patent/CN113865620A/en
Publication of CN113865620A publication Critical patent/CN113865620A/en
Pending legal-status Critical Current


Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C25/00: Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass
    • G: PHYSICS
    • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B: EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B25/00: Models for purposes not provided for in G09B23/00, e.g. full-sized devices for demonstration purposes
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00: Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40: Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43: Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/4302: Content synchronisation processes, e.g. decoder synchronisation
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00: Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80: Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/85: Assembly of content; Generation of multimedia applications
    • H04N21/854: Content authoring
    • H04N21/8547: Content authoring involving timestamps for synchronizing content

Abstract

Embodiments of the invention provide a time synchronization method and device for AR navigation simulation, belonging to the field of vehicle navigation. The method comprises the following steps: reading video frame data once every preset period, and transmitting each read video frame to an image recognition module of the AR navigation system; and, for each read video frame, reading other types of data according to the play timestamp of that video frame, and transmitting the other types of data read each time to a fusion positioning module of the AR navigation system, wherein the other types of data comprise one or more of the following: GPS data, inertial navigation data, and sensor data. The method effectively solves the time synchronization problem of the finally simulated AR navigation effect.

Description

Time synchronization method and device for AR navigation simulation
This application is a divisional application of the Chinese patent application filed on April 16, 2019, with application number 201910304850.3, entitled "time synchronization method and device for AR navigation simulation".
Technical Field
The invention relates to the field of vehicle navigation, in particular to a time synchronization method and a time synchronization device for AR navigation simulation.
Background
AR navigation products depend on real environment data to run. When the AR navigation effect must be simulated (for debugging or testing) or demonstrated indoors, the actual system cannot provide, in real time, data input that satisfies the operating conditions, so the AR navigation effect cannot be effectively simulated or demonstrated.
In the related art, schemes such as the following are adopted to address this problem: (1) testing in the field, which consumes manpower and financial resources and carries certain personal and property safety risks; (2) filming or screen-recording a run and keeping the video for presentation to users, which is suitable only for demonstration and not for debugging or testing.
Disclosure of Invention
The embodiment of the invention aims to provide a time synchronization method and a time synchronization device for AR navigation simulation, which are used for realizing the synchronous playing of various types of data in the AR navigation simulation process.
To achieve the above object, an embodiment of the present invention provides a time synchronization method for AR navigation simulation, the method comprising: reading video frame data once every preset period, and transmitting each read video frame to an image recognition module of the AR navigation system; and, for each read video frame, reading other types of data according to the play timestamp of that video frame, and transmitting the other types of data read each time to a fusion positioning module of the AR navigation system, wherein the other types of data comprise one or more of the following: GPS data, inertial navigation data, and sensor data.
Optionally, reading the video frame data once every preset period comprises: starting a timer with the preset period; and, every preset period, reading the video frame whose play timestamp is not greater than the current count value of the timer and which has not been read.
Optionally, for each read video frame, reading other types of data according to the play timestamp of the read video frame comprises: each time a video frame is read, reading the other types of data whose play timestamps are not greater than the play timestamp of the currently read video frame and which have not been read.
Optionally, the preset period is less than or equal to the minimum video capture period.
Optionally, the video frame data, the GPS data, the inertial navigation data, and the sensor data are collected and stored in advance.
Correspondingly, an embodiment of the invention also provides a time synchronization device for AR navigation simulation, the device comprising: a first reading module for reading video frame data once every preset period; a first transmission module for transmitting each read video frame to an image recognition module of the AR navigation system; a second reading module for reading, for each read video frame, other types of data according to the play timestamp of the read video frame; and a second transmission module for transmitting the other types of data read each time to a fusion positioning module of the AR navigation system, wherein the other types of data comprise one or more of the following: GPS data, inertial navigation data, and sensor data.
Optionally, the first reading module includes: a starting unit for starting the timer with the preset period; and a reading unit for reading, every preset period, the video frame whose play timestamp is not greater than the current count value of the timer and which has not been read.
Optionally, the second reading module is configured to, in a case that a video frame is read each time, read other types of data that have a playing timestamp that is not greater than a playing timestamp of a video frame that is currently read and that have not been read.
Optionally, the preset period is less than or equal to the minimum video capture period.
Optionally, the video frame data, the GPS data, the inertial navigation data, and the sensor data are collected and stored in advance.
Accordingly, an embodiment of the present invention further provides a time synchronization apparatus for AR navigation simulation, where the apparatus includes a memory and a processor, and the memory stores instructions that enable the processor to execute the time synchronization method for AR navigation simulation.
Accordingly, an embodiment of the present invention further provides a processor, configured to run a program, where the program is run to execute the time synchronization method for AR navigation simulation.
Accordingly, an embodiment of the present invention further provides a machine-readable storage medium, on which instructions are stored, and the instructions are configured to cause a machine to execute the above time synchronization method for AR navigation simulation.
According to the above technical solution, video frame data is read once every preset period and, for each read video frame, other types of data are read according to the play timestamp of that video frame, so that the images processed by the image recognition module and the other types of data processed by the fusion positioning module reach the display module in time synchronization. This yields the following advantages: (1) testing, debugging, and demonstration of AR navigation are no longer limited by time or place, and a suitable time and place can be chosen to ensure personal and property safety; (2) the synchronization makes timing and results repeatable, reducing problem localization and test verification costs; (3) the method is no longer limited by input hardware equipment, and simulation can accelerate development; (4) the simulation can run on non-original acquisition equipment, greatly reducing hardware requirements; (5) it can be used for demonstration purposes in commercial promotion of products.
Additional features and advantages of embodiments of the invention will be set forth in the detailed description which follows.
Drawings
The accompanying drawings, which are included to provide a further understanding of the embodiments of the invention and are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and together with the description serve to explain the embodiments of the invention without limiting the embodiments of the invention. In the drawings:
FIG. 1 shows a schematic flow diagram of a time synchronization method for AR navigation simulation according to an embodiment of the invention;
FIG. 2 shows a schematic flow diagram of a time synchronization method for AR navigation simulation according to another embodiment of the present invention; and
fig. 3 is a block diagram illustrating a time synchronization apparatus for AR navigation simulation according to an embodiment of the present invention.
Detailed Description
The following detailed description of embodiments of the invention refers to the accompanying drawings. It should be understood that the detailed description and specific examples, while indicating embodiments of the invention, are given by way of illustration and explanation only, not limitation.
To simulate or demonstrate the AR navigation effect, a mode in which each collected data source is played back independently can also be adopted. However, AR navigation places very strict requirements on data synchronization; if the data are not synchronized, the effect of independent playback will hardly meet those requirements. Embodiments of the invention provide a time synchronization method and device for AR navigation simulation, aiming to solve the time synchronization problem when each collected data source is played back independently to show the AR navigation effect.
FIG. 1 is a schematic flowchart of a time synchronization method for AR navigation simulation according to an embodiment of the present invention. As shown in fig. 1, the method may include steps S110 to S120.
In step S110, the video frame data is read once every preset period, and the video frame read each time is transmitted to the image recognition module of the AR navigation system.
In some cases, the acquisition frequency of the video frame data may be a fixed frequency. In other cases, the acquisition frequency of the video frame data may be variable, e.g., the acquisition frequency may be different for different time periods.
The image recognition module processes video frames one at a time. Therefore, step S110 may be arranged so that at most one video frame is read per read, which can be achieved through the preset period. For example, the preset period may be set to be less than or equal to the minimum video capture period. If the stored video frame data was acquired at a fixed frequency, the preset period may be set to be less than or equal to the acquisition period corresponding to that fixed frequency. If the stored video frame data was collected at a variable frequency, the preset period may be set to be less than or equal to the acquisition period corresponding to the maximum acquisition frequency. Note that when at most one video frame is read per read, some reads may return no video frame at all.
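As a sketch of this constraint, the minimum capture period of a variable-frequency recording can be derived from the stored capture timestamps, and the preset period chosen to be no larger than it. All timestamp values below are hypothetical, not taken from the patent:

```python
# Hypothetical capture timestamps (in seconds) of pre-stored video frames
# recorded at a variable frequency.
capture_ts = [0.0, 0.033, 0.066, 0.116, 0.149]

# Minimum video capture period = smallest gap between consecutive frames.
min_capture_period = min(b - a for a, b in zip(capture_ts, capture_ts[1:]))

# Setting the preset (timer) period no larger than this gap guarantees that
# at most one stored frame becomes due between two consecutive reads.
preset_period = min_capture_period
print(round(preset_period, 3))  # -> 0.033
```

With a fixed-frequency recording, all gaps are equal and the same rule reduces to "preset period ≤ the fixed acquisition period."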
The execution of step S110 may be triggered once again every preset period by setting a timer having a preset period.
In step S120, for each read video frame, other types of data are read according to the play timestamp of the read video frame, and the other types of data read each time are transmitted to the fusion positioning module of the AR navigation system.
Other types of data in embodiments of the invention may include one or more of the following: GPS data, inertial navigation data, and sensor data. Inertial navigation data may include, for example, accelerometer data, gyroscope data, and the like. The sensor data may include, for example, vehicle speed data, steering wheel angle, etc. The video frame data, the GPS data, the inertial navigation data, and the sensor data may be previously acquired and stored through a data acquisition program, and the acquired data may be stored in the same or different storage modules. The respective acquisition frequencies of the video frame data, the GPS data, the inertial navigation data, and the sensor data may be different, and correspondingly, the respective acquisition periods thereof may also be different. The respective acquisition frequencies of the GPS data, inertial navigation data, and sensor data may also be variable frequency.
The play time stamp in the embodiment of the present invention refers to a play time when the video frame data is played from the beginning, and is associated with the capture time stamp of the video frame. The acquisition timestamp of the video frame refers to the time when the video frame is acquired, and when the video frame is acquired in advance, the acquisition timestamp of each acquired video frame is recorded. The play time stamp of a video frame may refer to a time value obtained by subtracting the capture time stamp of the first video frame from the capture time stamp of the video frame.
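For instance, with hypothetical capture timestamps (milliseconds here, purely illustrative), the play timestamps follow directly from this definition:

```python
# Hypothetical capture timestamps (ms) recorded during the original drive.
frame_capture_ts = [100000, 100040, 100080, 100130]

# Play timestamp = capture timestamp minus the first frame's capture
# timestamp, so the replayed stream always starts at time 0.
frame_play_ts = [t - frame_capture_ts[0] for t in frame_capture_ts]
print(frame_play_ts)  # -> [0, 40, 80, 130]
```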
The reading of other types of data may be performed once each time a video frame is read, according to the play time stamp of the read video frame. The GPS data, inertial navigation data, and sensor data read may be one or more, respectively.
In the embodiment of the invention, reading the video frame data once every preset period and transmitting the read video frame to the image recognition module for processing is equivalent to re-capturing the pre-stored video frame data. Correspondingly, each time a video frame is read, a read of the other types of data is performed according to the play timestamp of that video frame, and the read data is transmitted to the fusion positioning module for processing; this is equivalent to re-capturing the other types of data at times synchronized with the play timestamps of the video frame data. The video frames played independently by the image recognition module and the other types of data played independently by the fusion positioning module therefore stay synchronized on the display module, so the simulated AR navigation effect can meet the requirements. The method is particularly suitable when the video frame data was acquired at a variable frequency.
FIG. 2 is a schematic flowchart of a time synchronization method for AR navigation simulation according to another embodiment of the present invention. As shown in fig. 2, the method includes steps S210 to S230. In this embodiment, video frame data and other types of data are collected and stored in advance, and each piece of collected data has a corresponding acquisition timestamp. The play timestamp of a piece of data may be the time value obtained by subtracting the acquisition timestamp of the first piece of data of the same type from its own acquisition timestamp. For example, the play timestamp of a video frame may be its acquisition timestamp minus the acquisition timestamp of the first video frame; the play timestamp of a GPS record may be its acquisition timestamp minus that of the first GPS record; and likewise for inertial navigation data and sensor data.
In step S210, a timer having a preset period is started.
The preset period may be less than or equal to the minimum video capture period. The timer may be set to start timing from 0 after starting, and to count once every preset period.
In step S220, every preset period, the video frame whose play timestamp is not greater than the current count value of the timer and which has not been read is read, and each read video frame is transmitted to the image recognition module of the AR navigation system.
A video frame read is performed on each timer tick. The frame read is the one whose play timestamp is not greater than the current count value of the timer and which has not yet been read. For example, when the timer count value is TP, the video frame whose play timestamp is not greater than TP and which has not been read is read, where TP = n × tp, tp is the period set for the timer, and n is a positive integer.
Because the timer period is less than or equal to the minimum video capture period, each read returns at most one qualifying video frame, and some reads may return none. Each time a video frame is read, it may be transmitted to the image recognition module for processing.
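The per-tick frame read described above can be sketched as follows; the helper name, the millisecond timestamps, and the return convention are illustrative assumptions, not taken verbatim from the patent:

```python
def read_due_frame(play_ts, timer_count, read_flags):
    """Return the index of the single unread frame whose play timestamp is
    not greater than the current timer count, or None if no frame is due."""
    for i, ts in enumerate(play_ts):
        if not read_flags[i] and ts <= timer_count:
            read_flags[i] = True  # mark as read so it is never re-read
            return i
    return None

# Frames at least 40 ms apart; timer period tp = 40 ms, count TP = n * tp.
frames = [0, 40, 90, 130]
flags = [False] * len(frames)
order = [read_due_frame(frames, n * 40, flags) for n in range(5)]
print(order)  # -> [0, 1, None, 2, 3]: the tick at TP = 80 reads nothing
```

Because tp does not exceed the smallest inter-frame gap, no tick ever finds two due frames, matching the "at most one" guarantee above.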
In step S230, in the case of reading a video frame each time, reading other types of data whose playing timestamp is not greater than the playing timestamp of the video frame read at the current time and which are not read, and transmitting the other types of data read each time to the fusion positioning module of the AR navigation system.
For example, if the play timestamp of the currently read video frame is tq, the other types of data whose play timestamps are not greater than tq and which have not been read are read. The other types of data may include one or more of the following: GPS data, inertial navigation data, and sensor data.
The GPS data, inertial navigation data, and sensor data read may be one or more, respectively. After the read data of other types are transmitted to the fusion positioning module, the fusion positioning module can process the received data of other types.
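A minimal sketch of this step, with hypothetical record names and timestamps:

```python
def read_other_data(records, tq, read_flags):
    """Collect every unread record whose play timestamp is not greater than
    tq, the play timestamp of the currently read video frame."""
    batch = []
    for i, (ts, payload) in enumerate(records):
        if not read_flags[i] and ts <= tq:
            read_flags[i] = True
            batch.append(payload)
    return batch

# Hypothetical GPS fixes sampled faster than the video frames (ms timestamps).
gps = [(0, "fix0"), (20, "fix1"), (35, "fix2"), (60, "fix3")]
gps_flags = [False] * len(gps)

# A video frame whose play timestamp is tq = 40 pulls in all earlier unread
# fixes; the fix at 60 ms waits for a later frame.
batch = read_other_data(gps, 40, gps_flags)
print(batch)  # -> ['fix0', 'fix1', 'fix2']
```

The same helper would serve for inertial navigation and sensor records, since each type carries its own play timestamps.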
In this embodiment, the video frame data is read once every preset period, at most one video frame is read each time, and each read video frame is transmitted to the image recognition module. Meanwhile, for each read video frame, the other types of data whose play timestamps are not greater than the play timestamp of the currently read video frame and which have not been read are read and transmitted to the fusion positioning module of the AR navigation system. The image recognition module processes the received video frames in real time, the fusion positioning module processes the received other types of data in real time, and the display module can display both results in real time to present the rendered AR navigation effect. This is equivalent to a data acquisition mode that synchronously re-acquires the stored video frame data and the other types of data, solving the time synchronization problem of the finally simulated AR navigation effect.
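Putting the two reads together, the whole replay can be sketched as one loop. Timer ticks are simulated rather than real-time, and the hand-off to the image recognition and fusion positioning modules is replaced by an event log; all names and values are illustrative assumptions:

```python
def simulate_replay(frame_ts, gps_ts, tp):
    """Replay pre-stored data: each simulated timer tick (count TP = n * tp)
    reads at most one due video frame, and each read frame triggers a read
    of all GPS records up to that frame's play timestamp (all in ms)."""
    events = []
    fi, gi = 0, 0                      # next unread frame / GPS record
    n = 0
    while fi < len(frame_ts):
        TP = n * tp                    # current timer count value
        if frame_ts[fi] <= TP:         # frame due and unread: "recapture" it
            tq = frame_ts[fi]
            events.append(("frame", tq))          # -> image recognition
            while gi < len(gps_ts) and gps_ts[gi] <= tq:
                events.append(("gps", gps_ts[gi]))  # -> fusion positioning
                gi += 1
            fi += 1
        n += 1
    return events

log = simulate_replay(frame_ts=[0, 40, 90], gps_ts=[0, 20, 35, 60], tp=40)
print(log)
# -> [('frame', 0), ('gps', 0), ('frame', 40), ('gps', 20), ('gps', 35),
#     ('frame', 90), ('gps', 60)]
```

Note how the frame due at 90 ms is skipped at TP = 80 and picked up at TP = 120, with its preceding GPS record read at the same tick, which is exactly the synchronization the method aims for.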
Fig. 3 is a block diagram illustrating a time synchronization apparatus for AR navigation simulation according to an embodiment of the present invention. As shown in fig. 3, an embodiment of the present invention further provides a time synchronization apparatus for AR navigation simulation, where the apparatus may include: the first reading module 310 is configured to read video frame data once every preset period; the first transmission module 320 is used for transmitting the video frame read each time to an image recognition module of the AR navigation system; the second reading module 330 is configured to, for each read video frame, read other types of data according to the play timestamp of the read video frame; and a second transmitting module 340 for transmitting other types of data read each time to the fused positioning module of the AR navigation system, wherein the other types of data include one or more of: GPS data, inertial navigation data, and sensor data.
The first reading module may include: a starting unit for starting the timer with the preset period; and a reading unit for reading, every preset period, the video frame whose play timestamp is not greater than the current count value of the timer and which has not been read. The second reading module may be configured to, each time a video frame is read, read the other types of data whose play timestamps are not greater than the play timestamp of the currently read video frame and which have not been read.
In a specific implementation, the video frame data and the other types of data are collected and stored in advance, and a corresponding acquisition timestamp is recorded for each piece of collected data. The preset period may be less than or equal to the minimum video capture period, so that the first reading module reads at most one video frame at a time.
The image recognition module carries out real-time processing on the received video frames, the fusion positioning module carries out real-time processing on the received data of other types, and the display module can display the real-time processed results of the image recognition module and the fusion positioning module in real time so as to display the rendered AR navigation effect. It can solve the time synchronization problem of the final simulated AR navigation effect.
The time synchronization device for AR navigation simulation provided by the embodiment of the present invention may be integrated in an AR navigation system; for example, a simulated data source interface and a real data source interface may be provided in the AR navigation system. The real data source interface is used for reading a real data source in real time, while the simulated data source interface is used for reading a data source collected and stored in advance. Alternatively, the device may be integrated in an AR navigation simulation system used only for simulation, which may be provided with only the simulated data source interface for reading a data source collected and stored in advance.
The specific working principle and benefits of the time synchronization device for AR navigation simulation provided by the embodiment of the present invention are similar to those of the time synchronization method for AR navigation simulation provided by the embodiment of the present invention, and will not be described herein again.
Accordingly, the time synchronization apparatus for AR navigation simulation according to an embodiment of the present invention may include a processor and a memory, the memory storing instructions that enable the processor to execute the time synchronization method for AR navigation simulation according to any embodiment of the invention. The first reading module, first transmission module, second reading module, second transmission module, starting unit, reading unit, and the like may be stored in the memory as program units, and the processor executes the program units stored in the memory to realize the corresponding functions. The processor comprises one or more kernels, and each kernel calls the corresponding program unit from the memory; the time synchronization method according to any embodiment of the invention is executed by adjusting kernel parameters. The memory may include volatile memory in a computer-readable medium, random access memory (RAM), and/or non-volatile memory such as read-only memory (ROM) or flash memory (flash RAM), and includes at least one memory chip.
An embodiment of the present invention further provides a processor, where the processor is configured to execute a program, where the program is configured to execute the time synchronization method for AR navigation simulation according to any embodiment of the present invention when the program is executed.
Embodiments of the present invention further provide a machine-readable storage medium having stored thereon instructions for causing a machine to execute a time synchronization method for AR navigation simulation according to any of the embodiments of the present invention.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In a typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include forms of volatile memory in a computer readable medium, Random Access Memory (RAM) and/or non-volatile memory, such as Read Only Memory (ROM) or flash memory (flash RAM). The memory is an example of a computer-readable medium.
Computer-readable media, including both permanent and non-permanent, removable and non-removable media, may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, program modules, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory computer-readable media such as modulated data signals and carrier waves.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The above are merely examples of the present application and are not intended to limit the present application. Various modifications and changes may occur to those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present application should be included in the scope of the claims of the present application.

Claims (9)

1. A method of time synchronization for AR navigation simulation, the method comprising:
reading video frame data once every preset period, and transmitting each read video frame to an image recognition module of the AR navigation system, wherein the preset period is less than or equal to the minimum video acquisition period; and
for each read video frame, reading other types of data according to the playing time stamp of the read video frame, and transmitting the other types of data read each time to a fusion positioning module of the AR navigation system,
wherein the other types of data include one or more of: GPS data, inertial navigation data, and sensor data,
wherein, for each read video frame, reading other types of data according to the playing timestamp of the read video frame comprises: each time a video frame is read, reading the other types of data that have not yet been read and whose playing timestamps are not greater than the playing timestamp of the currently read video frame.
2. The method of claim 1, wherein reading the video frame data once every preset period comprises:
starting a timer with the preset period; and
reading, once every preset period, the video frames that have not yet been read and whose playing timestamps are not greater than the current count value of the timer.
3. The method of claim 1, wherein the video frame data, the GPS data, the inertial navigation data, and the sensor data are pre-collected and stored.
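The reading and forwarding loop of claims 1 to 3 can be sketched as follows. This is an illustrative reconstruction, not the patented implementation: the function name `synchronize`, the sorted `(timestamp, payload)` lists, and the simulated timer count are all assumptions made for the sketch.

```python
# Hypothetical sketch of the timer-driven reading in claims 1-3.
# Data layouts and names are illustrative assumptions, not the patent's code.

def synchronize(frames, other_data, period):
    """frames / other_data: lists of (timestamp, payload), sorted by timestamp.
    period: the preset period, no greater than the minimum video acquisition period."""
    frame_idx = 0   # next not-yet-read video frame
    other_idx = 0   # next not-yet-read other-type datum (GPS / inertial / sensor)
    clock = 0.0     # simulated timer count value
    out = []        # (destination module, payload), in delivery order
    while frame_idx < len(frames):
        clock += period  # the timer fires once every preset period
        # Read every not-yet-read frame whose playing timestamp <= current count.
        while frame_idx < len(frames) and frames[frame_idx][0] <= clock:
            ts, frame = frames[frame_idx]
            frame_idx += 1
            out.append(("image_recognition", frame))
            # For this frame, read all not-yet-read other-type data whose
            # playing timestamp is not greater than the frame's timestamp.
            while other_idx < len(other_data) and other_data[other_idx][0] <= ts:
                out.append(("fusion_positioning", other_data[other_idx][1]))
                other_idx += 1
    return out
```

Because both streams are pre-collected and sorted (claim 3), two monotonically advancing indices suffice; no datum is ever delivered twice, and each sensor datum is released only once a frame with an equal or later timestamp has been read.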
4. A time synchronization apparatus for AR navigation simulation, the apparatus comprising:
a first reading module for reading video frame data once every preset period, wherein the preset period is less than or equal to the minimum video acquisition period;
a first transmission module for transmitting each read video frame to an image recognition module of the AR navigation system;
a second reading module for reading, for each read video frame, other types of data according to the playing timestamp of the read video frame; and
a second transmission module for transmitting the other types of data read each time to a fusion positioning module of the AR navigation system,
wherein the other types of data include one or more of: GPS data, inertial navigation data, and sensor data,
wherein the second reading module is configured to, each time a video frame is read, read the other types of data that have not yet been read and whose playing timestamps are not greater than the playing timestamp of the currently read video frame.
5. The apparatus of claim 4, wherein the first reading module comprises:
a starting unit for starting a timer with the preset period; and
a reading unit for reading, once every preset period, the video frames that have not yet been read and whose playing timestamps are not greater than the current count value of the timer.
6. The apparatus of claim 4, wherein the video frame data, the GPS data, the inertial navigation data, and the sensor data are pre-collected and stored.
7. A time synchronization apparatus for AR navigation simulation, the apparatus comprising a memory and a processor, the memory having instructions stored thereon that enable the processor to perform: the time synchronization method for AR navigation simulation of any of claims 1 to 3.
8. A processor configured to execute a program, wherein the program is configured to perform: the time synchronization method for AR navigation simulation of any of claims 1 to 3.
9. A machine-readable storage medium having instructions stored thereon for causing a machine to perform: the time synchronization method for AR navigation simulation of any of claims 1 to 3.
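The apparatus of claims 4 to 6 can likewise be sketched as a single object whose per-tick method plays the roles of the reading and transmission modules. The class name, callback interface, and internal state are assumptions made for illustration, not the patent's API.

```python
# Illustrative decomposition of the apparatus in claims 4-6; all names and
# the callback interface are hypothetical.

class TimeSyncApparatus:
    def __init__(self, frames, other_data, period,
                 image_recognition, fusion_positioning):
        self.frames = frames                          # pre-collected (timestamp, frame) pairs
        self.other_data = other_data                  # pre-collected (timestamp, datum) pairs
        self.period = period                          # preset period (<= video acquisition period)
        self.image_recognition = image_recognition    # first transmission target
        self.fusion_positioning = fusion_positioning  # second transmission target
        self._fi = 0        # next unread frame (first reading module state)
        self._oi = 0        # next unread other-type datum (second reading module state)
        self._count = 0.0   # timer count value (starting unit state)

    def tick(self):
        """Called once per preset period by the timer (claim 5)."""
        self._count += self.period
        # Reading unit: frames not yet read, timestamp <= current count value.
        while self._fi < len(self.frames) and self.frames[self._fi][0] <= self._count:
            ts, frame = self.frames[self._fi]
            self._fi += 1
            self.image_recognition(frame)
            # Second reading module: unread other-type data, timestamp <= frame's.
            while self._oi < len(self.other_data) and self.other_data[self._oi][0] <= ts:
                self.fusion_positioning(self.other_data[self._oi][1])
                self._oi += 1
```

In a simulation run, the two callbacks would stand in for the AR navigation system's image recognition and fusion positioning modules, so the downstream pipeline receives replayed data in the same relative order as during live collection.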
CN202111189337.8A 2019-04-16 2019-04-16 Time synchronization method and device for AR navigation simulation Pending CN113865620A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111189337.8A CN113865620A (en) 2019-04-16 2019-04-16 Time synchronization method and device for AR navigation simulation

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202111189337.8A CN113865620A (en) 2019-04-16 2019-04-16 Time synchronization method and device for AR navigation simulation
CN201910304850.3A CN110174120B (en) 2019-04-16 2019-04-16 Time synchronization method and device for AR navigation simulation

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CN201910304850.3A Division CN110174120B (en) 2019-04-16 2019-04-16 Time synchronization method and device for AR navigation simulation

Publications (1)

Publication Number Publication Date
CN113865620A true CN113865620A (en) 2021-12-31

Family

ID=67689930

Family Applications (3)

Application Number Title Priority Date Filing Date
CN201910304850.3A Active CN110174120B (en) 2019-04-16 2019-04-16 Time synchronization method and device for AR navigation simulation
CN202111189336.3A Pending CN114088111A (en) 2019-04-16 2019-04-16 Time synchronization method and device for AR navigation simulation
CN202111189337.8A Pending CN113865620A (en) 2019-04-16 2019-04-16 Time synchronization method and device for AR navigation simulation

Family Applications Before (2)

Application Number Title Priority Date Filing Date
CN201910304850.3A Active CN110174120B (en) 2019-04-16 2019-04-16 Time synchronization method and device for AR navigation simulation
CN202111189336.3A Pending CN114088111A (en) 2019-04-16 2019-04-16 Time synchronization method and device for AR navigation simulation

Country Status (1)

Country Link
CN (3) CN110174120B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110174120B (en) * 2019-04-16 2021-10-08 百度在线网络技术(北京)有限公司 Time synchronization method and device for AR navigation simulation
CN113203423B (en) * 2019-09-29 2024-02-02 百度在线网络技术(北京)有限公司 Map navigation simulation method and device
CN114218139A (en) * 2021-12-15 2022-03-22 北京航天控制仪器研究所 Simulation turntable high-speed synchronous acquisition method based on real-time operating system and FPGA
CN114338951A (en) * 2021-12-30 2022-04-12 智道网联科技(北京)有限公司 Sensor synchronization method, device and system and vehicle

Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103139636A (en) * 2011-12-05 2013-06-05 优视科技有限公司 Streaming media data processing method and device and streaming media data reproduction equipment
CN103471580A (en) * 2012-06-06 2013-12-25 三星电子株式会社 Method for providing navigation information, mobile terminal, and server
CN104535070A (en) * 2014-12-26 2015-04-22 上海交通大学 High-precision map data structure, high-precision map data acquiring and processing system and high-precision map data acquiring and processing method
CN105203127A (en) * 2014-06-30 2015-12-30 惠州市德赛西威汽车电子股份有限公司 Testing method and testing device of integrated navigation product
US20160055673A1 (en) * 2014-08-25 2016-02-25 Daqri, Llc Distributed aperture visual inertia navigation
CN107454387A (en) * 2017-08-28 2017-12-08 西安万像电子科技有限公司 Mass parameter acquisition methods and device for image coding and decoding Transmission system
CN107966723A (en) * 2017-11-22 2018-04-27 中国人民解放军国防科技大学 Multi-rate multi-channel time synchronization high-speed data recording system
CN108180921A (en) * 2017-12-22 2018-06-19 联创汽车电子有限公司 Utilize the AR-HUD navigation system and its air navigation aid of GPS data
CN108279430A (en) * 2017-12-25 2018-07-13 广州市中海达测绘仪器有限公司 Data synchronize method, apparatus, computer equipment and the storage medium of positioning
CN108801249A (en) * 2018-08-15 2018-11-13 北京七维航测科技股份有限公司 A kind of inertial navigation system
CN109307508A (en) * 2018-08-29 2019-02-05 中国科学院合肥物质科学研究院 A kind of panorama inertial navigation SLAM method based on more key frames
US20190045179A1 (en) * 2017-08-02 2019-02-07 Naver Business Platform Corporation Method and system for checking video call quality of mobile terminal
CN109581426A (en) * 2019-02-18 2019-04-05 帆美航空科技(北京)有限公司 A kind of method, system, equipment and storage medium identifying GNSS abnormal signal
CN110174120A (en) * 2019-04-16 2019-08-27 百度在线网络技术(北京)有限公司 Method for synchronizing time and device for AR navigation simulation

Family Cites Families (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0666940B2 (en) * 1985-12-20 1994-08-24 ソニー株式会社 Video disk playback device
WO2010020015A1 (en) * 2008-08-21 2010-02-25 Chronologic Pty Ltd Synchronisation and timing method and apparatus
BR112012007309A2 (en) * 2009-10-19 2016-04-19 Intergraph Technologies Co data search, analyzer and synchronization of video and telemetry data
KR20130040361A (en) * 2011-10-14 2013-04-24 주식회사 엘지유플러스 Method, server, and recording medium for providing traffic information based on ar navigation
CN102506901B (en) * 2011-11-25 2014-11-05 北京航空航天大学 Multi-serial-port navigation information simulation integrated system
US9288368B2 (en) * 2013-10-08 2016-03-15 Delightfit, Inc. Video and map data synchronization for simulated athletic training
CN103616710A (en) * 2013-12-17 2014-03-05 靳文瑞 Multi-sensor combined navigation time synchronizing system based on field programmable gate array (FPGA)
US9690375B2 (en) * 2014-08-18 2017-06-27 Universal City Studios Llc Systems and methods for generating augmented and virtual reality images
US10075623B2 (en) * 2015-03-30 2018-09-11 Myriad Sensors, Inc. Synchronizing wireless sensor data and video
US20180376193A1 (en) * 2016-03-17 2018-12-27 Hewlett-Packard Development Company, L.P. Frame transmission
CN107690053B (en) * 2016-08-05 2019-11-08 北京国基科技股份有限公司 A kind of method and system of the time shaft of determining video flowing
CN107172320A (en) * 2017-06-21 2017-09-15 成都理想境界科技有限公司 Method of data synchronization and picture pick-up device
CN107588782A (en) * 2017-08-25 2018-01-16 上海与德科技有限公司 The drive manner and system of a kind of virtual navigation
CN108629793B (en) * 2018-03-22 2020-11-10 中国科学院自动化研究所 Visual inertial ranging method and apparatus using on-line time calibration
US10726631B1 (en) * 2019-08-03 2020-07-28 VIRNECT inc. Augmented reality system and method with frame region recording and reproduction technology based on object tracking

Patent Citations (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103139636A (en) * 2011-12-05 2013-06-05 优视科技有限公司 Streaming media data processing method and device and streaming media data reproduction equipment
CN103471580A (en) * 2012-06-06 2013-12-25 三星电子株式会社 Method for providing navigation information, mobile terminal, and server
CN105203127A (en) * 2014-06-30 2015-12-30 惠州市德赛西威汽车电子股份有限公司 Testing method and testing device of integrated navigation product
US20160055673A1 (en) * 2014-08-25 2016-02-25 Daqri, Llc Distributed aperture visual inertia navigation
CN104535070A (en) * 2014-12-26 2015-04-22 上海交通大学 High-precision map data structure, high-precision map data acquiring and processing system and high-precision map data acquiring and processing method
US20190045179A1 (en) * 2017-08-02 2019-02-07 Naver Business Platform Corporation Method and system for checking video call quality of mobile terminal
CN107454387A (en) * 2017-08-28 2017-12-08 西安万像电子科技有限公司 Mass parameter acquisition methods and device for image coding and decoding Transmission system
CN107966723A (en) * 2017-11-22 2018-04-27 中国人民解放军国防科技大学 Multi-rate multi-channel time synchronization high-speed data recording system
CN108180921A (en) * 2017-12-22 2018-06-19 联创汽车电子有限公司 Utilize the AR-HUD navigation system and its air navigation aid of GPS data
CN108279430A (en) * 2017-12-25 2018-07-13 广州市中海达测绘仪器有限公司 Data synchronize method, apparatus, computer equipment and the storage medium of positioning
CN108801249A (en) * 2018-08-15 2018-11-13 北京七维航测科技股份有限公司 A kind of inertial navigation system
CN109307508A (en) * 2018-08-29 2019-02-05 中国科学院合肥物质科学研究院 A kind of panorama inertial navigation SLAM method based on more key frames
CN109581426A (en) * 2019-02-18 2019-04-05 帆美航空科技(北京)有限公司 A kind of method, system, equipment and storage medium identifying GNSS abnormal signal
CN110174120A (en) * 2019-04-16 2019-08-27 百度在线网络技术(北京)有限公司 Method for synchronizing time and device for AR navigation simulation
CN114088111A (en) * 2019-04-16 2022-02-25 阿波罗智联(北京)科技有限公司 Time synchronization method and device for AR navigation simulation

Also Published As

Publication number Publication date
CN110174120B (en) 2021-10-08
CN110174120A (en) 2019-08-27
CN114088111A (en) 2022-02-25

Similar Documents

Publication Publication Date Title
CN110174120B (en) Time synchronization method and device for AR navigation simulation
CN107277594B (en) Method and device for synchronizing video and audio with bullet screen
JP7100863B1 (en) Video processing device, video processing method, program
CN109032793A (en) Method, apparatus, terminal and the storage medium of resource distribution
CN103559713A (en) Method and terminal for providing augmented reality
CN110264280A (en) A kind of outdoor advertising monitoring method
CN109672837A (en) Equipment of taking photo by plane real-time video method for recording, mobile terminal and computer storage medium
WO2015172326A1 (en) Event-based record and replay for advanced applications
CN109495340A (en) A kind of Android application performance monitoring statisticss method and system
EP3993428A1 (en) Time delay error correction method, terminal device, server, and storage medium
CN105578290B (en) It is a kind of to carry out the method and device that plan video is got ready based on EPG
CN114547874A (en) Method and device for reproducing operation process of equipment
CN110944231B (en) Monitoring method and device of video player
CN112489522A (en) Method, device, medium and electronic device for playing simulation scene data
CN110322525B (en) Method and terminal for processing dynamic diagram
CN111949512A (en) Application program jamming detection method and device, terminal and medium
CN114553895B (en) Data synchronization method, system, storage medium and electronic equipment
CN104093069A (en) Video playing method and player device
CN114390204A (en) Intelligent medical treatment method and system
CN109146870B (en) Data interaction method and device between modules
CN111984519A (en) Test method and device for service system
CN112770190B (en) Method and device for generating time stamp of automobile data recorder
CN113938615B (en) Method and device for acquiring human face anti-counterfeiting data set and electronic equipment
CN115468607A (en) Automatic simulation test method and device for automobile data recorder
CN116156074B (en) Multi-camera acquisition time synchronization method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination