WO2023246050A1 - Millisecond-level data synchronization device and method for an optoelectronic pod - Google Patents

Millisecond-level data synchronization device and method for an optoelectronic pod

Info

Publication number
WO2023246050A1
Authority
WO
WIPO (PCT)
Prior art keywords
synchronization
imaging sensor
data
trigger
millisecond
Prior art date
Application number
PCT/CN2022/141710
Other languages
English (en)
French (fr)
Inventor
景彦哲
贺若飞
刘少鹏
丁伟
Original Assignee
北京航天控制仪器研究所
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 北京航天控制仪器研究所
Publication of WO2023246050A1

Links

Images

Classifications

    • G06F 16/27: Replication, distribution or synchronisation of data between databases or within a distributed database system; distributed database system architectures therefor
    • G06T 7/80: Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T 9/00: Image coding
    • G06V 10/147: Image acquisition; details of sensors, e.g. sensor lenses
    • G06V 10/19: Image acquisition by sensing codes defining pattern positions
    • G06V 10/22: Image preprocessing by selection of a specific region containing or referencing a pattern; locating or processing of specific regions to guide the detection or recognition
    • G06V 10/80: Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level

Definitions

  • the invention relates to a millisecond-level data synchronization device and method for an optoelectronic pod, which is suitable for time synchronization of equipment in the optoelectronic pod.
  • An optoelectronic pod, also known as an optoelectronic turret, is a visual-stabilization hardware system carried on platforms such as manned or unmanned aircraft, vehicles, or ships. It typically senses changes in the line-of-sight attitude through sensors such as gyroscopes and inertial navigation devices, and achieves line-of-sight stabilization and precise pointing through actuators such as servo motors and voice-coil fast-steering mirrors.
  • It can carry various optoelectronic payloads such as visible-light cameras, infrared thermal imagers, laser rangefinders, lidars, and synthetic aperture radars to provide high-precision, intelligent functions including target tracking/recognition, photographic evidence collection, precise positioning, speed and heading measurement, wide-area search, and situational awareness, and is widely used in military reconnaissance, security inspection, resource exploration, and other fields.
  • Sensors used in optoelectronic pods can be divided by function into imaging payloads and motion sensors.
  • In terms of typical signal acquisition frequency, the motion sensor operates between 100Hz and 1000Hz, higher than the 25Hz to 60Hz of the imaging payload; in terms of the amount of data collected per acquisition, the motion sensor produces a few bytes to a few tens of bytes per cycle, far less than the millions to tens of millions of bytes per cycle of the imaging payload.
  • The difference in signal acquisition frequency means that the collected pose data of the motion sensor cannot be strictly matched with the image data of the imaging payload, and the difference in data volume gives image data acquisition a much higher delay than pose data acquisition.
  • This further amplifies the timing error between different sensors; together these effects cause significant discrepancies between the temporal and spatial measurement information extracted from the data and the true values, which cannot meet the requirements of functions and scenarios demanding high accuracy.
  • the technical problem to be solved by the present invention is to overcome the shortcomings of the existing technology and solve the problem of high-precision time synchronization between different devices in the optoelectronic pod.
  • a millisecond-level data synchronization device for optoelectronic pods including a clock synchronization module, a sensor module, and an information processing module;
  • the sensor module includes external synchronized trigger imaging sensors and non-imaging sensors
  • the clock synchronization module uses the timing pulse-per-second signal of the satellite navigation system to correct the built-in clock; it is also used to trigger the externally synchronized trigger imaging sensor and send synchronization information to it, and to trigger the non-imaging sensor and send time information to it, to collect data from the non-imaging sensor and record the collection time, and to send the collected data and time to the information processing module;
  • the external synchronization triggers the imaging sensor to send the collected data and synchronization information to the information processing module;
  • the information processing module uses the acquisition time of the non-imaging sensor and the synchronization information of the external synchronization trigger imaging sensor to synchronize the data of the external synchronization trigger imaging sensor and the non-imaging sensor.
  • the data collection frequency of the non-imaging sensor is much higher than the data collection frequency of the external synchronized trigger imaging sensor.
  • the non-imaging sensor is a motion sensor.
  • the external synchronized trigger imaging sensor includes a visible light camera and/or an infrared thermal imager.
  • the clock synchronization module regularly sends millisecond data and exposure trigger pulse signals to the external synchronization trigger imaging sensor according to the imaging frequency of the external synchronization trigger imaging sensor.
  • the clock synchronization module regularly sends millisecond data and data collection trigger pulse signals to the non-imaging sensor according to the preset data collection frequency of the non-imaging sensor.
  • the external synchronization trigger imaging sensor uses the day-millisecond data as the synchronization code.
  • the synchronization code is also provided with check bits, and the synchronization code is located at a fixed position in the data collected by the external synchronization trigger imaging sensor.
  • a millisecond-level data synchronization method for optoelectronic pods including:
  • the clock synchronization module uses satellite navigation system timing second pulses to correct the built-in clock
  • the clock synchronization module triggers the external synchronization trigger imaging sensor and sends synchronization information to the external synchronization trigger imaging sensor;
  • the clock synchronization module triggers the non-imaging sensor and sends synchronization information to the non-imaging sensor; collects the data of the non-imaging sensor and records the collection time, and sends it to the information processing module;
  • the external synchronization trigger imaging sensor sends the collected data and synchronization information to the information processing module according to the trigger signal;
  • the information processing module uses the acquisition time of the non-imaging sensor and the synchronization information of the external synchronization trigger imaging sensor to synchronize the data of the external synchronization trigger imaging sensor and the non-imaging sensor.
  • the clock synchronization module regularly sends the exposure trigger pulse signal and the time synchronization code corresponding to the pulse signal to the external synchronization trigger imaging sensor according to the imaging frequency of the external synchronization trigger imaging sensor.
  • the present invention has the following beneficial effects:
  • the high-precision millisecond-level synchronous triggering method proposed by the present invention constrains the time difference between the imaging payload and the motion sensor in data signal acquisition from unmeasurable down to the millisecond level, effectively reducing the information fusion error between multiple sensors and fully satisfying the optoelectronic pod's requirements for high-precision functions such as long-range target positioning and measurement of the speed and heading of moving targets;
  • the present invention not only achieves time synchronization within the system, but also introduces the GNSS timing pulse-per-second signal to synchronize the system's internal time reference with the global positioning system; it is compatible with both stand-alone operation and networked timing environments, effectively improving the flexibility of the application of the present invention;
  • various imaging loads such as visible light cameras and infrared thermal imagers adopt external synchronization trigger exposure technology.
  • driven by the internal clock signal of the system of the present invention, image exposure starts simultaneously with millisecond-level accuracy, eliminating imaging deviations between different imaging payloads when observing moving scenes and moving targets;
  • the method proposed by this invention for expressing the day-millisecond as a synchronization code in images supports extracting the GNSS day-millisecond of the imaging instant from images saved with lossy compression, lossless compression, or no compression, has anti-disturbance and self-correcting features, and can realize millisecond-level time synchronization of offline data, on which high-precision data processing can be based.
  • Figure 1 is a block diagram of an exemplary implementation of the data synchronization device of the present invention
  • Figure 2 is a verification diagram of the synchronization code superimposed on a visible-light image in the present invention.
  • Figure 3 is a verification diagram of the synchronization code superimposed on an infrared image in the present invention.
  • the data synchronization device includes a clock synchronization module, a motion sensor module, an imaging payload module, and an information processing module.
  • the clock synchronization module includes GNSS (Global Navigation Satellite System) timing and positioning equipment, typically a GPS receiver, a BeiDou receiver, or the like, as well as a clock signal processor;
  • the motion sensor module includes pose sensors (such as gyroscopes, accelerometers, inertial navigation devices, etc.);
  • the imaging load module contains imaging sensors (such as visible light cameras, infrared thermal imagers, etc.) used to generate two-dimensional image data.
  • the hardware design adopts external-trigger exposure and global exposure;
  • the information processing module includes an image processor and a control processor.
  • the clock synchronization module uses GNSS timing second pulses to correct the built-in clock and trigger the external synchronization exposure of the imaging load module (i.e., the external synchronization triggering imaging sensor) with millisecond-level accuracy.
  • the imaging load module superimposes the exposure time on the image in a form that meets the requirements of Table 1.
  • the data is output to the information processing module; the data collection of the motion sensor module (i.e., non-imaging sensor) is triggered with millisecond-level accuracy.
  • the motion sensor module outputs the data collection time and sensor data to the information processing module; the information processing module caches the data queue online, extracts the synchronization information, and aligns the data to achieve online clock synchronization, and synchronization information can likewise be extracted from stored or transmitted compressed video or image data and the data aligned to achieve offline clock synchronization.
  • the clock synchronization module uses GNSS timing second pulses to correct the built-in clock
  • the clock synchronization module triggers the external synchronization exposure of the imaging load module with millisecond-level accuracy.
  • the imaging load module superimposes the exposure time on the image data in a form that meets the requirements of Table 1 and outputs it to the information processing module;
  • the clock synchronization module triggers the data collection of the motion sensor module with millisecond-level accuracy, and the motion sensor module outputs the data collection time and sensor data to the information processing module;
  • the information processing module caches the data queue online and aligns the data based on time information before processing to achieve clock synchronization.
  • the data synchronization device includes a clock synchronization module, a motion sensor module, an imaging load module, and an information processing module.
  • the clock synchronization module includes GNSS timing positioning equipment (such as GPS receivers, Beidou receivers, etc.) and clock signal processors;
  • the motion sensor module includes posture sensors (such as gyroscopes, accelerometers, inertial navigation devices, etc.);
  • the imaging payload module includes imaging sensors (such as visible light cameras, infrared thermal imagers, etc.) used to generate two-dimensional image data.
  • the hardware design adopts external trigger exposure and global exposure design;
  • the information processing module includes image processors and control processors.
  • the double-line connection line indicates that the device receives the GNSS signal provided externally
  • the dot connection line indicates the motion control signal sent to the outside of the device
  • the thin solid connection lines indicate the time-synchronization-related pulse-per-second signal, exposure trigger signal, and synchronization code data signal
  • the dashed connection lines indicate the pose data signal or the combined data signal of the pose data and the synchronization code
  • the thick solid connecting line represents the image signal data with the synchronization code superimposed.
  • the clock synchronization module includes GNSS timing positioning equipment and a clock signal processor.
  • the clock signal processor has a built-in microsecond clock source.
  • the GNSS timing and positioning equipment receives global navigation satellite timing signals and provides a pulse-per-second signal with an error of ±20 ns (nanoseconds) at a frequency of 1 Hz (hertz), which periodically corrects the built-in clock source to keep it synchronized with GNSS; based on the imaging frequency of the imaging payload (such as 25 Hz or 30 Hz), the clock signal processor periodically sends day-millisecond data (a 32-bit unsigned integer expressing the time of day with millisecond, i.e. thousandth-of-a-second, resolution) and the exposure trigger pulse signal to the imaging payload module; based on the preset data collection frequency of the motion sensor module (for example, 500 Hz or 1000 Hz), it periodically sends day-millisecond data and data-acquisition trigger pulse signals to the motion sensor module.
  • the imaging payload module contains imaging sensors (such as visible light cameras, infrared thermal imagers, etc.) used to generate two-dimensional image data. It receives the exposure trigger pulse signal and day-millisecond data provided by the clock synchronization module.
  • the hardware design uses external-trigger exposure and global exposure: the external-trigger exposure design starts a single global exposure upon each exposure trigger pulse sent periodically by the clock synchronization module, and the global exposure design makes all pixels start exposure simultaneously with an identical exposure duration. After the exposure is completed, the original image signal is collected.
  • the CMOS (complementary metal-oxide-semiconductor) detector of the visible-light camera directly acquires 12-bit Bayer-pattern RGB (red-green-blue) three-channel color data, and the uncooled vanadium oxide detector of the infrared thermal imager directly acquires 10-bit single-channel grayscale data; image preprocessing produces 8-bit single-channel grayscale, multi-channel RGB, or multi-channel YUV (Y denotes luminance, U and V denote chrominance; a color coding method that separates luminance from chrominance) data for output. The 32-bit day-millisecond data and 8-bit check data are required to be combined into a 40-bit "synchronization code" that fills the image pixels at the positions specified in Table 1, and the filled image data is sent to the information processing module.
  • the motion sensor module includes posture sensors (such as gyroscopes, accelerometers, inertial navigation devices, etc.), receives the data acquisition trigger pulse signal and 32-bit day-millisecond data provided by the clock synchronization module, and triggers the motion sensor data collection based on the signal. And send three-axis rotation rate, three-axis acceleration, space attitude and other position information, attitude information and 32-bit day-millisecond data to the information processing module.
  • the information processing module includes an image processor, a control processor, etc., and uses a FIFO (first-in, first-out) cache queue mechanism to cache the received data. Since the image data delay of the imaging payload module is typically on the order of a hundred milliseconds, much larger than the millisecond-level delay of the pose data of the motion sensor module, the image data enters the processing pipeline later than the pose data; to save cache, only the pose data is cached.
  • the 40-bit "synchronization code" in the image is analyzed according to the provisions of Table 1. After verifying its validity through the 8-bit check data, the 32-bit day-millisecond data is taken as the day-millisecond.
  • the cache queue uses the L1 Euclidean distance minimum principle to search for matching pose data according to Equation (1), which can achieve millisecond-level time synchronization of online image data and pose data, and perform high-precision data processing based on this.
  • i is the sequence number of each group of cached pose data
  • each group of pose data corresponds to a Euclidean distance Di
  • t is the day millisecond in the 40-bit "synchronization code" of the current image
  • Ti is the day millisecond of each group of pose data.
  • the video and image coding and compression algorithm used when saving is usually lossy compression.
  • the decoded image data differs from the original data, so sufficiently robust encoding rules must be adopted to strengthen the recoverability of the day-millisecond data.
  • the present invention proposes a synchronization-code technique for expressing the day-millisecond in images, which supports extracting the GNSS day-millisecond of the imaging instant from images saved with lossy compression, lossless compression, or no compression, and has anti-disturbance and self-correcting features, enabling millisecond-level time synchronization of offline data and high-precision data processing on that basis.
  • the following four constraints are designed to ensure the recoverability of synchronization codes:
  • an 8-bit checksum is added and expanded to a 40-bit "synchronization code" to avoid being unable to identify data damage caused by factors such as unstable image signals and electrical noise during transmission;
  • the "synchronization code” is filled with two repeated lines in the upper right corner of the image, which is used for verification during parsing and enhances data verification capabilities;
  • 1 bit of the "synchronization code” is expressed in the 8-bit data space in the image, that is, the 40-bit "synchronization code” is expressed in 320 bits, or 40 8-bit pixels, which greatly enhances the data redundancy capability.
  • the number of horizontal pixels is recorded as W. For example, for an image with a resolution of 1920 ⁇ 1080 and 8-bit pixels, the number of horizontal pixels is 1920, so W is 1920.
  • W-n represents the n+1th pixel from the last, for example W-0 or W represents the first pixel from the last, and W-39 represents the fortieth pixel from the last.
  • the agreed filling rule is that if a bit in the "synchronization code" is 1, it is expressed by pixel data with all 8 bits set to 1 (i.e. 255 in decimal or 0xFF in hexadecimal), and if a bit is 0, it is expressed by pixel data with all 8 bits set to 0 (i.e. 0); even if the image data undergoes lossy compression, noise, and interface protocol constraints, error correction can still be achieved from the distance between the pixel value and the mid-value, see formula (2);
  • b is one bit
  • x is an 8-bit pixel in the "synchronization code" superimposed on the image data.
  • the information processing module can encode and compress image data for real-time long-distance transmission or storage.
  • Commonly used compression algorithms include H.264 compression coding for video (a video compression algorithm/standard named after the standard, also known as AVC, Advanced Video Coding) and JPEG (Joint Photographic Experts Group) compression coding for photos;
  • the H.264 video compression standard stipulates that custom segments may be overlaid in NALUs (Network Abstraction Layer Units) of type 6, i.e. SEI (supplemental enhancement information), or of types 22 and 23, i.e. REV (reserved) units;
  • the overlaid custom segment is used to store custom information and will not affect commercial video playback software.
  • pose data can be saved in any of the above NALUs (recorded as "H.264 encoding convention field").
  • pose days and milliseconds are extracted from the H.264 encoding convention field.
  • after decoding, the image day-millisecond is recovered from the image according to Table 1, which achieves millisecond-level time synchronization of offline image data and pose data, and high-precision data processing can be performed on that basis.
  • the JPEG image compression standard stipulates that custom segments may be overlaid at the "APPn" positions (reserved for application segments, where n is a non-zero positive integer indicating the segment number) to store custom information without affecting the viewing of the photo file by image viewing software; therefore the pose data can be saved in any "APPn" field (recorded as the "JPEG encoding convention field"). Before decoding, the pose day-millisecond is extracted from the JPEG encoding convention field; after decoding, the image day-millisecond is recovered from the image according to Table 1, achieving millisecond-level time synchronization of offline image data and pose data, on which high-precision data processing can be based.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Databases & Information Systems (AREA)
  • Computing Systems (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • General Engineering & Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Vascular Medicine (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Computation (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Studio Devices (AREA)

Abstract

A millisecond-level data synchronization device for an optoelectronic pod, comprising a clock synchronization module, a sensor module, and an information processing module. The sensor module comprises an externally synchronized trigger imaging sensor and a non-imaging sensor. The clock synchronization module corrects a built-in clock using the timing pulse-per-second signal of a satellite navigation system; it is also used to trigger the externally synchronized trigger imaging sensor and send synchronization information to it, and to trigger the non-imaging sensor and send time information to it, to collect data from the non-imaging sensor and record the collection time, and to send the collected data and time to the information processing module. The externally synchronized trigger imaging sensor sends its collected data and the synchronization information to the information processing module. The information processing module uses the collection time of the non-imaging sensor and the synchronization information of the externally synchronized trigger imaging sensor to synchronize the data of the externally synchronized trigger imaging sensor and the non-imaging sensor.

Description

Millisecond-level data synchronization device and method for an optoelectronic pod
This application claims priority to the Chinese patent application No. 202210701058.3, filed with the Chinese Patent Office on June 20, 2022 and entitled "Millisecond-level data synchronization device and method for an optoelectronic pod", the entire contents of which are incorporated herein by reference.
Technical Field
The present invention relates to a millisecond-level data synchronization device and method for an optoelectronic pod, suitable for time synchronization of equipment within the optoelectronic pod.
Background Art
An optoelectronic pod, also known as an optoelectronic turret, is a visual-stabilization hardware system carried on platforms such as manned or unmanned aircraft, vehicles, or ships. It typically senses changes in the line-of-sight attitude through sensors such as gyroscopes and inertial navigation devices, and achieves line-of-sight stabilization and precise pointing through actuators such as servo motors and voice-coil fast-steering mirrors. It can carry various optoelectronic payloads such as visible-light cameras, infrared thermal imagers, laser rangefinders, lidars, and synthetic aperture radars to provide high-precision, intelligent functions including target tracking/recognition, photographic evidence collection, precise positioning, speed and heading measurement, wide-area search, and situational awareness, and is widely used in military reconnaissance, security inspection, resource exploration, and other fields.
The sensors used in an optoelectronic pod can be divided by function into imaging payloads and motion sensors. In terms of typical signal acquisition frequency, motion sensors operate between 100 Hz and 1000 Hz, higher than the 25 Hz to 60 Hz of imaging payloads; in terms of the amount of data per acquisition, a motion sensor produces a few bytes to a few tens of bytes per cycle, far less than the several million to tens of millions of bytes per cycle of an imaging payload. The difference in acquisition frequency means that the pose data collected by the motion sensor cannot be strictly matched with the image data of the imaging payload, while the difference in data volume gives image data acquisition a much higher latency than pose data acquisition, further amplifying the timing error between different sensors. Together, these effects cause significant discrepancies between the temporal and spatial measurements extracted from the data and their true values, which cannot satisfy functions and scenarios with high accuracy requirements.
Summary of the Invention
The technical problem to be solved by the present invention is to overcome the shortcomings of the prior art and to solve the problem of high-precision time synchronization between different devices within an optoelectronic pod.
The object of the present invention is achieved through the following technical solutions:
A millisecond-level data synchronization device for an optoelectronic pod, comprising a clock synchronization module, a sensor module, and an information processing module;
the sensor module comprises an externally synchronized trigger imaging sensor and a non-imaging sensor;
the clock synchronization module corrects a built-in clock using the timing pulse-per-second signal of a satellite navigation system; it is also used to trigger the externally synchronized trigger imaging sensor and send synchronization information to it; it is further used to trigger the non-imaging sensor and send time information to it, to collect the data of the non-imaging sensor and record the collection time, and to send the collected data and time to the information processing module;
the externally synchronized trigger imaging sensor sends its collected data and the synchronization information to the information processing module;
the information processing module uses the collection time of the non-imaging sensor and the synchronization information of the externally synchronized trigger imaging sensor to synchronize the data of the externally synchronized trigger imaging sensor and the non-imaging sensor.
Preferably, the data collection frequency of the non-imaging sensor is much higher than that of the externally synchronized trigger imaging sensor.
Preferably, the non-imaging sensor is a motion sensor.
Preferably, the externally synchronized trigger imaging sensor comprises a visible-light camera and/or an infrared thermal imager.
Preferably, the clock synchronization module periodically sends day-millisecond data and an exposure trigger pulse signal to the externally synchronized trigger imaging sensor according to the imaging frequency of the externally synchronized trigger imaging sensor.
Preferably, the clock synchronization module periodically sends day-millisecond data and a data-acquisition trigger pulse signal to the non-imaging sensor according to the preset data collection frequency of the non-imaging sensor.
Preferably, the externally synchronized trigger imaging sensor uses the day-millisecond data as a synchronization code.
Preferably, the synchronization code further includes check bits, and the synchronization code is located at a fixed position within the data collected by the externally synchronized trigger imaging sensor.
A millisecond-level data synchronization method for an optoelectronic pod, comprising:
the clock synchronization module corrects a built-in clock using the timing pulse-per-second signal of a satellite navigation system;
the clock synchronization module triggers the externally synchronized trigger imaging sensor and sends synchronization information to it;
the clock synchronization module triggers the non-imaging sensor and sends synchronization information to it; the data of the non-imaging sensor is collected and the collection time recorded, and both are sent to the information processing module;
the externally synchronized trigger imaging sensor sends its collected data and the synchronization information to the information processing module according to the trigger signal;
the information processing module uses the collection time of the non-imaging sensor and the synchronization information of the externally synchronized trigger imaging sensor to synchronize the data of the externally synchronized trigger imaging sensor and the non-imaging sensor.
Preferably, the clock synchronization module periodically sends the exposure trigger pulse signal, together with the time synchronization code corresponding to that pulse signal, to the externally synchronized trigger imaging sensor according to the imaging frequency of the externally synchronized trigger imaging sensor.
Compared with the prior art, the present invention has the following beneficial effects:
(1) The high-precision millisecond-level synchronous triggering method proposed by the present invention constrains the time difference between the imaging payload and the motion sensor in data signal acquisition from unmeasurable down to the millisecond level, effectively reducing the data fusion error between multiple sensors and fully satisfying the optoelectronic pod's requirements for high-precision functions such as long-range target positioning and measurement of the speed and heading of moving targets;
(2) The present invention not only achieves time synchronization within the system, but also introduces the GNSS timing pulse-per-second signal to synchronize the system's internal time reference with the global positioning system; it is compatible with both stand-alone operation and networked timing environments, effectively improving the flexibility of its application;
(3) In the present invention, imaging payloads such as visible-light cameras and infrared thermal imagers adopt externally synchronized trigger exposure: driven by the system's internal clock signal, image exposure starts simultaneously with millisecond-level accuracy, eliminating imaging deviations between different imaging payloads when observing moving scenes and moving targets;
(4) The method proposed by the present invention for expressing the day-millisecond as a synchronization code within images supports extracting the GNSS day-millisecond of the imaging instant from images saved with lossy compression, lossless compression, or no compression, and has anti-disturbance and self-correcting properties, enabling millisecond-level time synchronization of offline data and high-precision data processing on that basis.
Brief Description of the Drawings
Figure 1 is a block diagram of an exemplary embodiment of the data synchronization device of the present invention;
Figure 2 is a verification diagram of the synchronization code superimposed on a visible-light image in the present invention;
Figure 3 is a verification diagram of the synchronization code superimposed on an infrared image in the present invention.
Detailed Description of Embodiments
To make the objects, technical solutions, and advantages of the present invention clearer, embodiments of the present invention are described in further detail below with reference to the accompanying drawings.
A millisecond-level data synchronization device and method for an optoelectronic pod (hereinafter the "data synchronization device" and the "data synchronization method"): the data synchronization device comprises a clock synchronization module, a motion sensor module, an imaging payload module, and an information processing module.
The clock synchronization module includes GNSS (Global Navigation Satellite System) timing and positioning equipment, typically a GPS receiver, a BeiDou receiver, or the like, as well as a clock signal processor; the motion sensor module includes pose sensors (such as gyroscopes, accelerometers, inertial navigation devices, etc.); the imaging payload module includes imaging sensors (such as visible-light cameras, infrared thermal imagers, etc.) used to generate two-dimensional image data, with hardware designed for external-trigger exposure and global exposure; the information processing module includes an image processor, a control processor, and the like.
The clock synchronization module corrects its built-in clock using the GNSS timing pulse-per-second signal and triggers, with millisecond-level accuracy, the externally synchronized exposure of the imaging payload module (i.e., the externally synchronized trigger imaging sensor); the imaging payload module superimposes the exposure time on the image data in the form specified in Table 1 and outputs it to the information processing module. The clock synchronization module also triggers, with millisecond-level accuracy, the data acquisition of the motion sensor module (i.e., the non-imaging sensor); the motion sensor module outputs the data collection time together with the sensor data to the information processing module. The information processing module buffers the data queue online, extracts the synchronization information, and aligns the data to achieve online clock synchronization; synchronization information can likewise be extracted from stored or transmitted compressed video or image data and the data aligned to achieve offline clock synchronization.
More specifically:
the clock synchronization module corrects its built-in clock using the GNSS timing pulse-per-second signal;
the clock synchronization module triggers the externally synchronized exposure of the imaging payload module with millisecond-level accuracy, and the imaging payload module superimposes the exposure time on the image data in the form specified in Table 1 and outputs it to the information processing module;
the clock synchronization module triggers the data acquisition of the motion sensor module with millisecond-level accuracy, and the motion sensor module outputs the data collection time together with the sensor data to the information processing module;
the information processing module buffers the data queue online and, before processing, aligns the data according to the time information to achieve clock synchronization.
As shown in Figure 1, the data synchronization device comprises a clock synchronization module, a motion sensor module, an imaging payload module, and an information processing module.
In Figure 1, the clock synchronization module includes GNSS timing and positioning equipment (such as a GPS receiver, a BeiDou receiver, etc.) and a clock signal processor; the motion sensor module includes pose sensors (such as gyroscopes, accelerometers, inertial navigation devices, etc.); the imaging payload module includes imaging sensors (such as visible-light cameras, infrared thermal imagers, etc.) used to generate two-dimensional image data, with hardware designed for external-trigger exposure and global exposure; the information processing module includes an image processor, a control processor, and the like.
In Figure 1, regarding the signal connections: double lines indicate the externally provided GNSS signal received by the device; dotted lines indicate motion control signals sent outside the device; thin solid lines indicate the time-synchronization-related pulse-per-second signal, exposure trigger signal, and synchronization code data signal; dashed lines indicate the pose data signal or the combined signal of pose data and synchronization code; thick solid lines indicate the image signal data with the synchronization code superimposed.
The clock synchronization module includes GNSS timing and positioning equipment and a clock signal processor. The clock signal processor has a built-in microsecond-level clock source. The GNSS timing and positioning equipment receives global navigation satellite timing signals and provides a pulse-per-second signal with an error of ±20 ns (nanoseconds) at a frequency of 1 Hz (hertz), which periodically corrects the built-in clock source to keep it synchronized with GNSS. Based on the imaging frequency of the imaging payload (e.g., 25 Hz or 30 Hz), the clock signal processor periodically sends day-millisecond data (a 32-bit unsigned integer expressing the time of day with a resolution of one thousandth of a second, i.e., one millisecond) and the exposure trigger pulse signal to the imaging payload module; based on the preset data collection frequency of the motion sensor module (e.g., 500 Hz or 1000 Hz), it periodically sends day-millisecond data and the data-acquisition trigger pulse signal to the motion sensor module.
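The periodic triggering described above can be illustrated with a short sketch. The following Python snippet is a simplified software model of the clock signal processor, not the patent's hardware implementation: it derives 32-bit day-millisecond values from a GNSS-disciplined millisecond counter and emits exposure and data-acquisition trigger events at the two configured rates. All class, function, and parameter names are illustrative assumptions, and the integer-millisecond period (e.g., 33 ms for 30 Hz) is a simplification.

```python
from dataclasses import dataclass

MS_PER_DAY = 24 * 60 * 60 * 1000  # the day-millisecond counter wraps every 24 h


@dataclass
class TriggerEvent:
    channel: str  # "imaging" or "motion"
    day_ms: int   # 32-bit unsigned day-millisecond attached to the trigger pulse


def trigger_schedule(start_day_ms: int, duration_ms: int,
                     imaging_hz: int = 30, motion_hz: int = 1000):
    """Yield trigger pulses with their day-millisecond codes.

    Both channels are derived from the same GNSS-disciplined millisecond
    counter, so an imaging trigger and a motion trigger that coincide carry
    identical day-millisecond values.
    """
    imaging_period = 1000 // imaging_hz  # e.g. 33 ms at 30 Hz (integer model)
    motion_period = 1000 // motion_hz    # e.g. 1 ms at 1000 Hz
    for t in range(duration_ms):
        day_ms = (start_day_ms + t) % MS_PER_DAY
        if t % motion_period == 0:
            yield TriggerEvent("motion", day_ms)
        if t % imaging_period == 0:
            yield TriggerEvent("imaging", day_ms)


if __name__ == "__main__":
    # First few pulses starting at 12:00:00.000 (43,200,000 ms of day)
    for ev in trigger_schedule(43_200_000, 5):
        print(ev)
```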
The imaging payload module includes imaging sensors (such as visible-light cameras, infrared thermal imagers, etc.) used to generate two-dimensional image data, and receives the exposure trigger pulse signal and day-millisecond data provided by the clock synchronization module. The hardware design adopts external-trigger exposure and global exposure: the external-trigger exposure design starts a single global exposure upon each exposure trigger pulse sent periodically by the clock synchronization module, and the global exposure design makes all pixels start exposure simultaneously with an identical exposure duration. After exposure is completed, the raw image signal is collected. In the exemplary embodiment of the data synchronization device, the CMOS (complementary metal-oxide-semiconductor) detector of the visible-light camera directly acquires 12-bit Bayer-pattern (a pixel mosaic layout named after its inventor Bayer and granted a US patent in 1976) RGB (red-green-blue) three-channel color data, and the uncooled vanadium oxide detector of the infrared thermal imager directly acquires 10-bit single-channel grayscale data. Image preprocessing produces 8-bit single-channel grayscale, multi-channel RGB, or multi-channel YUV (Y denotes luminance, U and V denote chrominance; a color coding method that separates luminance from chrominance) data for output. As specified in Table 1, the 32-bit day-millisecond data and 8-bit check data are combined into a 40-bit "synchronization code" that fills the image pixels at the corresponding positions, and the filled image data is sent to the information processing module.
The motion sensor module includes pose sensors (such as gyroscopes, accelerometers, inertial navigation devices, etc.), receives the data-acquisition trigger pulse signal and the 32-bit day-millisecond data provided by the clock synchronization module, triggers the motion sensor data acquisition on that signal, and sends position and attitude information such as three-axis rotation rate, three-axis acceleration, and spatial attitude, together with the 32-bit day-millisecond data, to the information processing module.
The information processing module includes an image processor, a control processor, and the like, and uses a FIFO (first-in, first-out) buffer queue to cache the received data. Since the image data latency of the imaging payload module is typically on the order of a hundred milliseconds, far greater than the millisecond-level latency of the motion sensor module's pose data, image data enters the processing pipeline later than pose data; to save cache, only the pose data need be buffered. When image data is received, the 40-bit "synchronization code" in the image is parsed according to Table 1; after its validity has been verified with the 8-bit check data, the 32-bit day-millisecond data is taken as the day-millisecond, and the pose data buffer queue is searched for the matching pose sample using the minimum-distance principle of Equation (1). This achieves millisecond-level time synchronization of online image data and pose data, on which high-precision data processing can be based.
Di(t)=|t-Ti|    (1)
where i is the index of each buffered group of pose data, each group of pose data corresponds to a distance Di, t is the day-millisecond in the 40-bit "synchronization code" of the current image, and Ti is the day-millisecond of each group of pose data.
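A minimal sketch of the FIFO pose buffer and the Equation (1) matching step might look as follows in Python. The class name, buffer depth, and (day_ms, pose) tuple layout are illustrative assumptions rather than details taken from the patent.

```python
from collections import deque


class PoseSynchronizer:
    """FIFO cache of pose data plus the minimum-distance lookup of Eq. (1)."""

    def __init__(self, maxlen: int = 2000):
        self.buffer = deque(maxlen=maxlen)  # each entry: (day_ms, pose)

    def push_pose(self, day_ms: int, pose) -> None:
        """Buffer one motion-sensor sample tagged with its day-millisecond."""
        self.buffer.append((day_ms, pose))

    def match_image(self, image_day_ms: int):
        """Return the buffered (Ti, pose) pair minimizing Di(t) = |t - Ti|.

        Note: wrap-around at midnight is not handled here, matching the
        literal form of Eq. (1).
        """
        if not self.buffer:
            return None
        return min(self.buffer, key=lambda item: abs(image_day_ms - item[0]))
```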
In the motion sensor data and the image data, the day-millisecond is handled differently for the following reasons:
when the day-millisecond is combined with the pose data, no process such as compression changes the data representation, so the day-millisecond can simply be concatenated with the pose data;
when the day-millisecond is combined with the image data of the imaging payload module:
in the first case, for fully lossless image interfaces and protocols, the data can be recovered without loss, so the 32-bit day-millisecond data can directly overwrite some image pixel data;
in the second case, most image transmission interfaces and their corresponding protocols offer very limited support for embedding custom data in the image signal, and may even modify the image data irreversibly (for example, the HD-SDI (High Definition Serial Digital Interface) interface uses the SMPTE 292M protocol, which forcibly clips 8-bit values from the 0-255 range to 16-240); simply replacing image pixel data with day-millisecond data would make the day-millisecond unrecoverable, so the 32-bit day-millisecond data must be encoded according to predetermined rules to prevent corruption;
in the third case, varying degrees of electrical noise exist during transmission of the image's electrical signal, causing random disturbances such as bit flips, so additional data is needed to check the 32-bit day-millisecond data and determine whether it has been corrupted;
in the fourth case, after the image signal has been converted to data, the video and image coding and compression algorithms used for storage are usually lossy, so the decoded image differs from the original data, and sufficiently robust encoding rules must be adopted to strengthen the recoverability of the day-millisecond data.
The present invention therefore proposes a technique for expressing the day-millisecond as a synchronization code in the image, which supports extracting the GNSS day-millisecond of the imaging instant from images saved with lossy compression, lossless compression, or no compression, and has anti-disturbance and self-correcting properties, enabling millisecond-level time synchronization of offline data and high-precision data processing on that basis. The following four constraints are designed to guarantee the recoverability of the synchronization code:
first, on the basis of the 32-bit day-millisecond, an 8-bit check field is added, extending it to a 40-bit "synchronization code", so that data corruption caused by factors such as unstable image signals or electrical noise during transmission does not go undetected;
second, the "synchronization code" is written into two repeated rows in the upper-right corner of the image, which are checked against each other during parsing, strengthening data verification;
third, each bit of the "synchronization code" is expressed by an 8-bit data space in the image, i.e., the 40-bit "synchronization code" is expressed in 320 bits, or forty 8-bit pixels, greatly increasing data redundancy. The number of horizontal pixels is denoted W; for example, for a 1920×1080, 8-bit-pixel image, the number of horizontal pixels is 1920, so W is 1920. The pixel index W-n denotes the (n+1)-th pixel from the end; for example, W-0 (or W) denotes the last pixel and W-39 the fortieth pixel from the end. The agreed filling rule is: if a bit of the "synchronization code" is 1, it is expressed by pixel data with all 8 bits set to 1 (i.e., 255 in decimal, or 0xFF in hexadecimal); if a bit is 0, it is expressed by pixel data with all 8 bits set to 0 (i.e., 0). Even if the image data has undergone lossy compression, noise, and interface protocol constraints, error correction can still be achieved from the distance between the pixel value and the mid-value, see Equation (2);
Table 1
[Table 1 is provided only as an image in the original publication (Figure PCTCN2022141710-appb-000001); it specifies the image pixel positions and fill values used to express the 40-bit "synchronization code".]
b(x) = 0 (when x < 128) or 1 (when x ≥ 128)    (2)
where b is one bit and x is one 8-bit pixel of the "synchronization code" superimposed on the image data.
Fourth, it is agreed that for an 8-bit single-channel grayscale image, each pixel is filled with 255 or 0 according to the corresponding bit of the "synchronization code"; for an 8-bit three-channel RGB image, all three channels of the corresponding pixel are filled with the same value, 255 or 0, according to the corresponding bit of the "synchronization code"; and for an 8-bit three-channel YUV image, the Y component is filled with 255 or 0 according to the corresponding bit of the "synchronization code", while the U and V components are both filled with 128.
Through the above four constraints, the pixel-level expression of the synchronization code is formed in the visible-light image shown in Figure 2 and the infrared image shown in Figure 3; the central portions of Figure 2 and Figure 3 are magnified views of the synchronization code superimposed in the upper-right corner of each image.
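The embedding and recovery implied by the four constraints above can be sketched as follows for the 8-bit single-channel grayscale case. Two points are assumptions for illustration only: the exact pixel layout is defined by Table 1 (available only as an image), so the sketch uses the last forty pixels of the top two rows; and the algorithm of the 8-bit check field is not disclosed in the text, so a simple XOR of the four day-millisecond bytes is assumed.

```python
import numpy as np

MIDVALUE = 128  # threshold of Eq. (2)


def checksum8(day_ms: int) -> int:
    """Illustrative 8-bit check field: XOR of the four day-millisecond bytes
    (the patent does not disclose the actual check algorithm)."""
    b = day_ms.to_bytes(4, "big")
    return b[0] ^ b[1] ^ b[2] ^ b[3]


def embed_sync_code(gray: np.ndarray, day_ms: int) -> np.ndarray:
    """Write the 40-bit sync code into the last 40 pixels of rows 0 and 1
    (upper-right corner), one bit per 8-bit pixel, duplicated over two rows."""
    code = (day_ms << 8) | checksum8(day_ms)          # 40-bit value
    bits = [(code >> (39 - i)) & 1 for i in range(40)]
    out = gray.copy()
    w = gray.shape[1]
    for row in (0, 1):                                # two repeated rows
        out[row, w - 40:w] = [255 if b else 0 for b in bits]
    return out


def extract_sync_code(gray: np.ndarray):
    """Recover the day-millisecond; return None if the two rows or the
    check field disagree (corrupted synchronization code)."""
    w = gray.shape[1]
    rows = []
    for row in (0, 1):
        pix = gray[row, w - 40:w]
        bits = [1 if int(p) >= MIDVALUE else 0 for p in pix]  # Eq. (2)
        val = 0
        for b in bits:
            val = (val << 1) | b
        rows.append(val)
    if rows[0] != rows[1]:
        return None
    day_ms, chk = rows[0] >> 8, rows[0] & 0xFF
    return day_ms if chk == checksum8(day_ms) else None
```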
The information processing module can encode and compress the image data for real-time long-distance transmission or storage. Commonly used compression algorithms include H.264 compression coding for video (a video compression algorithm/standard named after the standard, also known as AVC, Advanced Video Coding) and JPEG (Joint Photographic Experts Group) compression coding for photographs.
According to the ITU (International Telecommunication Union) Recommendation "Rec ITU-T H.264 (06/2019)", the H.264 video compression standard allows custom segments to be overlaid in NALUs (Network Abstraction Layer Units) of type 6, i.e., SEI (supplemental enhancement information), or of types 22 and 23, i.e., REV (reserved) units, for storing custom information, without affecting playback of the video file by commercial video player software. The pose data can therefore be stored in any of the above NALUs (denoted the "H.264 encoding convention field"): before decoding, the pose day-millisecond is extracted from the H.264 encoding convention field; after decoding, the image day-millisecond is recovered from the image according to Table 1. Millisecond-level time synchronization of offline image data and pose data is thus achieved, and high-precision data processing can be performed on that basis.
According to the ITU Recommendation "Rec ITU-T T.81 (09/92)", the JPEG image compression standard allows custom segments to be overlaid at the "APPn" positions (reserved for application segments, where n is a non-zero positive integer indicating the segment number) for storing custom information, without affecting the viewing of the photo file by image viewing software. The pose data can therefore be stored in any "APPn" field (denoted the "JPEG encoding convention field"): before decoding, the pose day-millisecond is extracted from the JPEG encoding convention field; after decoding, the image day-millisecond is recovered from the image according to Table 1. Millisecond-level time synchronization of offline image data and pose data is thus achieved, and high-precision data processing can be performed on that basis.
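Similarly, a minimal sketch of writing and reading such a JPEG APPn segment, inserted right after the SOI marker, is shown below. The choice of APP9 and the "PODSYNC" tag are arbitrary illustrative values, not mandated by the patent or the JPEG standard.

```python
def insert_jpeg_appn(jpeg: bytes, payload: bytes, n: int = 9) -> bytes:
    """Insert an APPn marker segment carrying `payload` right after the SOI
    marker. (A stricter writer would place it after any existing APP0/EXIF
    segments; this sketch keeps it simple.)"""
    assert jpeg[:2] == b"\xff\xd8", "not a JPEG (missing SOI marker)"
    assert 1 <= n <= 15
    body = b"PODSYNC\x00" + payload            # tag + pose day-millisecond data
    length = len(body) + 2                     # segment length includes itself
    segment = bytes([0xFF, 0xE0 + n]) + length.to_bytes(2, "big") + body
    return jpeg[:2] + segment + jpeg[2:]


def read_jpeg_appn(jpeg: bytes, n: int = 9):
    """Scan marker segments and return the payload of the first matching APPn."""
    i, marker = 2, 0xE0 + n
    while i + 4 <= len(jpeg) and jpeg[i] == 0xFF:
        m, seglen = jpeg[i + 1], int.from_bytes(jpeg[i + 2:i + 4], "big")
        if m == marker and jpeg[i + 4:i + 12] == b"PODSYNC\x00":
            return jpeg[i + 12:i + 2 + seglen]
        if m == 0xDA:                          # start of scan: entropy-coded data follows
            break
        i += 2 + seglen
    return None
```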
Content not described in detail in the specification of the present invention belongs to technology well known to those skilled in the art.
Although the present invention has been disclosed above by way of preferred embodiments, they are not intended to limit the invention. Any person skilled in the art may, without departing from the spirit and scope of the present invention, use the methods and technical content disclosed above to make possible variations and modifications to the technical solution of the present invention. Therefore, any simple modification, equivalent change, or refinement made to the above embodiments in accordance with the technical essence of the present invention, without departing from the content of the technical solution of the present invention, falls within the protection scope of the technical solution of the present invention.

Claims (10)

  1. A millisecond-level data synchronization device for an optoelectronic pod, characterized in that it comprises a clock synchronization module, a sensor module, and an information processing module;
    the sensor module comprises an externally synchronized trigger imaging sensor and a non-imaging sensor;
    the clock synchronization module corrects a built-in clock using the timing pulse-per-second signal of a satellite navigation system; it is also used to trigger the externally synchronized trigger imaging sensor and send synchronization information to it, and to trigger the non-imaging sensor and send time information to it, to collect the data of the non-imaging sensor and record the collection time, and to send the collected data and time to the information processing module;
    the externally synchronized trigger imaging sensor sends its collected data and the synchronization information to the information processing module;
    the information processing module uses the collection time of the non-imaging sensor and the synchronization information of the externally synchronized trigger imaging sensor to synchronize the collected data of the externally synchronized trigger imaging sensor and the non-imaging sensor.
  2. The millisecond-level data synchronization device according to claim 1, characterized in that the data collection frequency of the non-imaging sensor is much higher than that of the externally synchronized trigger imaging sensor.
  3. The millisecond-level data synchronization device according to claim 1, characterized in that the non-imaging sensor is a motion sensor.
  4. The millisecond-level data synchronization device according to claim 1, characterized in that the externally synchronized trigger imaging sensor comprises a visible-light camera and/or an infrared thermal imager.
  5. The millisecond-level data synchronization device according to any one of claims 1 to 4, characterized in that the clock synchronization module periodically sends day-millisecond data and an exposure trigger pulse signal to the externally synchronized trigger imaging sensor according to the imaging frequency of the externally synchronized trigger imaging sensor.
  6. The millisecond-level data synchronization device according to claim 1, characterized in that the clock synchronization module periodically sends day-millisecond data and a data-acquisition trigger pulse signal to the non-imaging sensor according to the preset data collection frequency of the non-imaging sensor.
  7. The millisecond-level data synchronization device according to claim 5, characterized in that the externally synchronized trigger imaging sensor uses the day-millisecond data as a synchronization code.
  8. The millisecond-level data synchronization device according to claim 7, characterized in that the synchronization code further includes check bits, and the synchronization code is located at a fixed position within the data collected by the externally synchronized trigger imaging sensor.
  9. A millisecond-level data synchronization method for an optoelectronic pod, characterized in that it comprises:
    the clock synchronization module corrects a built-in clock using the timing pulse-per-second signal of a satellite navigation system;
    the clock synchronization module triggers an externally synchronized trigger imaging sensor and sends synchronization information to it;
    the clock synchronization module triggers a non-imaging sensor and sends synchronization information to it; the data of the non-imaging sensor is collected and the collection time recorded, and both are sent to an information processing module;
    the externally synchronized trigger imaging sensor sends its collected data and the synchronization information to the information processing module according to the trigger signal;
    the information processing module uses the collection time of the non-imaging sensor and the synchronization information of the externally synchronized trigger imaging sensor to synchronize the data of the externally synchronized trigger imaging sensor and the non-imaging sensor.
  10. The millisecond-level data synchronization method according to claim 9, characterized in that the clock synchronization module periodically sends the exposure trigger pulse signal, together with the time synchronization code corresponding to that pulse signal, to the externally synchronized trigger imaging sensor according to the imaging frequency of the externally synchronized trigger imaging sensor.
PCT/CN2022/141710 2022-06-20 2022-12-24 Millisecond-level data synchronization device and method for an optoelectronic pod WO2023246050A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202210701058.3 2022-06-20
CN202210701058.3A CN115269718A (zh) 2022-06-20 Millisecond-level data synchronization device and method for an optoelectronic pod

Publications (1)

Publication Number Publication Date
WO2023246050A1 true WO2023246050A1 (zh) 2023-12-28

Family

ID=83761914

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/141710 WO2023246050A1 (zh) 2022-06-20 2022-12-24 Millisecond-level data synchronization device and method for an optoelectronic pod

Country Status (2)

Country Link
CN (1) CN115269718A (zh)
WO (1) WO2023246050A1 (zh)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115269718A (zh) * 2022-06-20 2022-11-01 北京航天控制仪器研究所 一种用于光电吊舱的毫秒级数据同步装置及方法
CN115865300A (zh) * 2022-11-29 2023-03-28 成都纵横自动化技术股份有限公司 一种光电吊舱的数据同步系统及其方法

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103744371A (zh) * 2013-12-23 2014-04-23 广东电网公司电力科学研究院 Sensor integrated circuit for unmanned aerial vehicle power line inspection
CN112672415A (zh) * 2020-12-25 2021-04-16 之江实验室 Multi-sensor time synchronization method, apparatus and system, electronic device, and medium
CN112787740A (zh) * 2020-12-26 2021-05-11 武汉光庭信息技术股份有限公司 Multi-sensor time synchronization device and method
US20220182213A1 (en) * 2020-12-08 2022-06-09 Tusimple, Inc. Hardware-based time synchronization for heterogeneous sensors in autonomous vehicles
CN115269718A (zh) * 2022-06-20 2022-11-01 北京航天控制仪器研究所 Millisecond-level data synchronization device and method for an optoelectronic pod


Also Published As

Publication number Publication date
CN115269718A (zh) 2022-11-01

Similar Documents

Publication Publication Date Title
WO2023246050A1 (zh) Millisecond-level data synchronization device and method for an optoelectronic pod
US9639935B1 (en) Apparatus and methods for camera alignment model calibration
US11276149B2 (en) Double non-local means denoising
US20200221010A1 (en) High dynamic range processing based on angular rate measurements
US11871105B2 (en) Field of view adjustment
US20190098274A1 (en) Desaturation Control
US20200358944A1 (en) High dynamic range processing on spherical images
US11908111B2 (en) Image processing including noise reduction
US11238285B2 (en) Scene classification for image processing
US11563925B2 (en) Multiple tone control
CN102572235A (zh) 成像装置、图像处理方法和计算机程序
US11363214B2 (en) Local exposure compensation
US11412150B2 (en) Entropy maximization based auto-exposure
US9774842B2 (en) Device for 3D display of photo finish image
US20220188973A1 (en) Systems and methods for synthetic augmentation of cameras using neural networks
US11128814B2 (en) Image processing apparatus, image capturing apparatus, video reproducing system, method and program
US20180211413A1 (en) Image signal processing using sub-three-dimensional look-up tables
US6697573B1 (en) Hybrid stereoscopic motion picture camera with film and digital sensor
WO2023029567A1 (zh) Visualization method and system for multiple types of data collected by sensors
EP4210335A1 (en) Image processing device, image processing method, and storage medium
WO2021020214A1 (ja) 送信装置、受信装置及び通信システム

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22947785

Country of ref document: EP

Kind code of ref document: A1