CN111896972A - Airborne laser radar synchronous control and automatic digital image exterior orientation element list creation method - Google Patents


Info

Publication number
CN111896972A
CN111896972A (application CN202010548357.9A)
Authority
CN
China
Prior art keywords
digital image
list
time
image
file
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010548357.9A
Other languages
Chinese (zh)
Other versions
CN111896972B (en)
Inventor
窦延娟
潘文武
游安清
罗俊
辛宇亮
雍松林
刘志强
李光
王国亮
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Institute of Applied Electronics of CAEP
Original Assignee
Institute of Applied Electronics of CAEP
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Institute of Applied Electronics of CAEP filed Critical Institute of Applied Electronics of CAEP
Priority to CN202010548357.9A
Publication of CN111896972A
Application granted
Publication of CN111896972B
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/86 Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
    • G01S17/87 Combinations of systems using electromagnetic waves other than radio waves
    • G01S17/875 Combinations of systems using electromagnetic waves other than radio waves for determining attitude
    • G01S17/88 Lidar systems specially adapted for specific applications
    • G01S17/89 Lidar systems specially adapted for specific applications for mapping or imaging
    • G01S17/894 3D imaging with simultaneous measurement of time-of-flight at a 2D array of receiver pixels, e.g. time-of-flight cameras or flash lidar
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70 Circuitry for compensating brightness variation in the scene
    • H04N23/73 Circuitry for compensating brightness variation in the scene by influencing the exposure time
    • H04N5/00 Details of television systems
    • H04N5/04 Synchronising

Abstract

The invention discloses a method for synchronous control of an airborne laser radar (lidar) system and automatic creation of a digital-image exterior orientation element list, comprising the following steps. S1: the timing system of the GNSS receiver board receives satellite signals and keeps time in the UTC time system; the GNSS receiver board, the IMU, the laser scanner, and the digital camera are then brought under time synchronization control. S2: once time synchronization control is completed, the digital-image exterior orientation element list is created automatically through the following sub-steps. S21: automatically detect valid digital images among those shot by the digital camera. S22: rename the detected digital images. S23: automatically create the digital-image exterior orientation element list from the mounting relation between the digital camera and the IMU together with the exposure time, position, and attitude data of each digital image recorded in the POS data. The invention achieves accurate time synchronization and, on that basis, automatic creation of the digital-image exterior orientation element list.

Description

Airborne laser radar synchronous control and automatic digital image exterior orientation element list creation method
Technical Field
The invention relates to the technical fields of remote sensing and mapping and of electro-optical detection system control, and in particular to a method for synchronous control of an airborne laser radar and automatic creation of a digital-image exterior orientation element list.
Background
An airborne laser radar (lidar) system is an airborne active remote-sensing and mapping system composed of a digital camera, a laser scanner, a POS (position and orientation system, consisting of an IMU and a GNSS receiver), and an embedded computer. In recent years, driven by advances in unmanned aerial vehicle technology, UAV-borne lidar systems (carried on multi-rotor, fixed-wing, and vertical take-off and landing fixed-wing platforms) have been widely adopted across industries such as surveying and mapping, emergency response, highway/railway survey design and inspection, power transmission and transformation line survey and inspection, and water conservancy.
The basic principle of airborne lidar measurement and positioning is as follows: the laser scanner actively emits high-frequency laser pulses toward the target, receives the reflected echo signals, and records the times; the distance from the scanner to the target is computed from the time difference between pulse emission and echo reception, and the three-dimensional coordinates and attitude angles of the pulse's point of origin are obtained from the POS system, so that the three-dimensional coordinates of the reflecting ground target can be calculated. Meanwhile, the POS system records the position and attitude data of the flight platform, and the digital camera shoots a sequence of digital photographs at a fixed time interval; with synchronous control, the six exterior orientation elements of each photograph in the sequence can be obtained, and a digital orthophoto map can then be produced through a series of data-processing steps.
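As a minimal numeric illustration of the ranging relation just described (a sketch, not code from the patent), the scanner-to-target distance follows directly from the pulse round-trip time:

```python
# Illustrative sketch of the lidar ranging relation: range = c * dt / 2,
# where dt is the time between pulse emission and echo reception and the
# factor 1/2 accounts for the two-way travel of the pulse.

C = 299_792_458.0  # speed of light in vacuum, m/s

def slant_range(t_emit_s: float, t_echo_s: float) -> float:
    """Scanner-to-target distance (m) from the pulse round-trip time (s)."""
    return C * (t_echo_s - t_emit_s) / 2.0

# An echo received ~6.671 microseconds after emission puts the target
# roughly 1 km from the scanner.
```

The target's world coordinates then follow by combining this range with the scanner position and attitude angles taken from the POS system, which is why the time synchronization described below is essential.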
The position and attitude information of the digital camera and the laser scanner is obtained from the POS system, and the exposure interval of the digital camera is generally an integral number of seconds. The sensors differ in time reference and time precision: the laser scanner and the digital camera use the UTC time system, while the POS system generally uses the GPS time system. GPS time differs from UTC by an integer number of seconds, and the exact offset is published periodically by the timekeeping authorities. To obtain accurate position information, an airborne lidar system must therefore keep the time of all its sensors precisely synchronized.
The position and attitude information of both the high-density laser point cloud and the high-definition digital images in an airborne lidar system comes from the POS system, and in subsequent applications the processed digital images must be accurately registered to the point cloud. Accurate spatial matching is only possible with an unambiguous digital-image identifier and a list of GPS exposure trigger times, so automatic acquisition of the digital-image exterior orientation element list is a key step in matching the digital orthophotos produced by the system to the laser point cloud. Moreover, a UAV lidar mission often requires multiple sorties, and the initial file names of digital images from different sorties are identical; in subsequent orthophoto processing, the exterior orientation elements of these images could otherwise be obtained only through complex computation.
Disclosure of Invention
The technical problem to be solved by the invention is as follows: in view of the problems above, to provide a method for synchronous control of an airborne laser radar and automatic creation of a digital-image exterior orientation element list.
The technical scheme adopted by the invention is as follows:
A method for synchronous control of an airborne laser radar and automatic creation of a digital-image exterior orientation element list comprises the following steps:
S1, the timing system of the GNSS receiver board receives satellite signals and keeps time in the UTC time system, and the GNSS receiver board, the IMU, the laser scanner, and the digital camera are then brought under time synchronization control;
S2, after time synchronization control is completed, the digital-image exterior orientation element list is created automatically through the following sub-steps:
S21, automatically detecting valid digital images among those shot by the digital camera;
S22, renaming the detected digital images;
S23, automatically creating the digital-image exterior orientation element list from the mounting relation between the digital camera and the IMU together with the exposure time, position, and attitude data of each digital image recorded in the POS data.
In one embodiment, the time synchronization control of the GNSS receiver board, the IMU, and the laser scanner in S1 includes the following sub-steps:
S101, the GNSS receiver board receives a synchronization control instruction from the system control assembly;
S102, the GNSS receiver board sends the UTC initial time to the IMU and the laser scanner, and a UTC coarse-time timestamp is recorded in each;
S103, the UTC fine-time counters in the IMU and the laser scanner count starting from the UTC coarse-time timestamp;
S104, the GNSS receiver board continuously sends a synchronized pps pulse signal to the IMU and the laser scanner every second; on receiving the pps pulse, the IMU and the laser scanner reset their UTC fine-time counters to zero and count again.
Preferably, the UTC coarse time comprises year, month, day, hour, minute, and second, with millisecond precision.
Preferably, the UTC fine-time counter accumulates clock ticks with a period of 1 μs.
Preferably, after the IMU and the laser scanner receive the pps pulse signal, the UTC fine-time counter is reset and recounts on the rising edge of the pps pulse.
In one embodiment, the time synchronization control of the GNSS receiver board and the digital camera in S1 includes the following sub-steps:
S111, the digital camera receives a synchronization control instruction from the system control assembly;
S112, at the instant of exposure the digital camera sends an exposure TTL signal to the GNSS receiver board, which synchronously marks a UTC timestamp.
In one embodiment, the method for automatic detection of valid digital images in S21 includes:
S211, presetting a file-size threshold for the digital images;
S212, extracting the file size of each digital image shot by the digital camera;
S213, eliminating those digital images whose file size does not satisfy the file-size threshold.
In one embodiment, the method for renaming the detected digital images in S22 includes:
S221, extracting the shooting time of each digital image from the images shot by the digital camera;
S222, converting the shooting time into a date_seconds-of-day format;
S223, renaming all digital images in the date_seconds-of-day format or the projectname_date_seconds-of-day format.
In one embodiment, S23 includes the following sub-steps:
S231, creating a list L_t_n_s_v in which each row holds the name, position, attitude, and state of one image file, with rows sorted by the renamed image file names;
S232, obtaining the image file name of the first digital image in the list L_t_n_s_v and parsing the shooting time Tp1 from that file name;
S233, extracting from the POS data an event list file L_e containing the time, position, and attitude data of each image exposure; taking the exposure time Tb1 of the first digital image in L_e and computing the time difference between the shooting time and the exposure time of the first digital image as Ts = Tb1 - Tp1; reading any two adjacent rows of L_e and taking the difference of their exposure times (an integer number of seconds) to obtain the time interval Td at which the digital camera is triggered;
S234, in the list L_t_n_s_v, starting from the first digital image: if the shooting time of the nth digital image is Tpn, its non-precise exposure time is Tbcjn = Tpn + Ts, and the image is renamed with this non-precise exposure time by the method of S22;
S235, acquiring the exposure time Tbzn of the nth digital image row by row from the event list file L_e in the POS data;
S236, setting the decision threshold (-Td/2+0.1, Td/2-0.1) and checking row by row whether the difference between the exposure time Tbzn and the non-precise exposure time Tbcjn lies within (-Td/2+0.1, Td/2-0.1):
if so, the match succeeds; the corresponding state in the list L_t_n_s_v is set to "T", and S237 is executed;
if not, the match fails, and the corresponding state in the list L_t_n_s_v is set to "F";
S237, reading the position and attitude data of the nth digital image from the event list file L_e and comparing them with the preset threshold ranges for position and attitude data:
if they lie within the preset ranges, the state field of the corresponding row in the list L_t_n_s_v is set to "OK";
if not, the state field of the corresponding row in the list L_t_n_s_v is set to "False";
S238, incrementing n by 1 and repeating S235 to S237 until the last row of the event list file L_e is reached, completing the state marking of all digital images;
S239, extracting the image file names whose state is "OK" from the list L_t_n_s_v and copying the position and attitude data corresponding to those file names from the event list file L_e into L_t_n_s_v, i.e. establishing an initial digital-image exterior orientation element list Ltn;
S240, according to the mounting relation between the digital camera and the IMU, applying coordinate translation and rotation to the positions and attitudes in the initial list Ltn to obtain accurate exterior orientation element values, and establishing a file L_OE in one-to-one correspondence with the image file names, yielding the final digital-image exterior orientation element list.
In one embodiment, step S23 further includes, after renaming the digital images, automatically creating a digital-image exposure time list from the exposure time, position, and attitude data of the digital images recorded in the POS data, implemented through the following steps:
S0231, creating a list L_t_n_s_v in which each row holds the name, exposure time, and state of one image file, with rows sorted by the renamed image file names;
S0232, obtaining the image file name of the first digital image in the list L_t_n_s_v and parsing the shooting time Tp1 from that file name;
S0233, extracting from the POS data an event list file L_e containing the time, position, and attitude data of each image exposure; taking the exposure time Tb1 of the first digital image in L_e and computing the time difference between the shooting time and the exposure time as Ts = Tb1 - Tp1; reading any two adjacent rows of L_e and taking the difference of their exposure times (an integer number of seconds) to obtain the time interval Td at which the digital camera is triggered;
S0234, in the list L_t_n_s_v, starting from the first digital image: if the shooting time of the nth digital image is Tpn, its non-precise exposure time is Tbcjn = Tpn + Ts, and the image is renamed with this non-precise exposure time by the method of S22;
S0235, acquiring the exposure time Tbzn of the nth digital image row by row from the event list file L_e in the POS data;
S0236, setting the decision threshold (-Td/2+0.1, Td/2-0.1) and checking row by row whether the difference between the exposure time Tbzn and the non-precise exposure time Tbcjn lies within (-Td/2+0.1, Td/2-0.1):
if so, the match succeeds; the corresponding state in the list L_t_n_s_v is set to "T", and S0237 is executed;
if not, the match fails, and the corresponding state in the list L_t_n_s_v is set to "F";
S0237, reading the position and attitude data of the nth digital image from the event list file L_e and comparing them with the preset threshold ranges for position and attitude data:
if they lie within the preset ranges, the state field of the corresponding row in the list L_t_n_s_v is set to "OK";
if not, the state field of the corresponding row in the list L_t_n_s_v is set to "False";
S0238, incrementing n by 1 and repeating S0235 to S0237 until the last row of the event list file L_e is reached, completing the state marking of all digital images;
S0239, extracting the image file names whose state is "OK" from the list L_t_n_s_v and copying the exposure times corresponding to those file names from the event list file L_e into L_t_n_s_v, i.e. creating the digital-image exposure time list Ltn.
In summary, owing to the adoption of the above technical scheme, the invention has the following beneficial effects:
1. The invention achieves precise time synchronization between digital-camera shooting and the POS system and, on that basis, automatic creation of the exterior orientation element list for single-frame or multi-frame digital images, making it convenient for aerial-survey operators to perform digital rectification of the images with orthophoto-processing software.
2. The invention also achieves automatic creation of the exposure time list for single-frame or multi-frame digital images.
Drawings
In order to illustrate the technical solutions of the embodiments of the invention more clearly, the drawings needed in the embodiments are briefly described below. It should be understood that the following drawings illustrate only some embodiments of the invention and should not be considered as limiting its scope; those skilled in the art can derive other related drawings from them without inventive effort.
Fig. 1 is a flow chart of the method for synchronous control of airborne laser radar and automatic creation of digital image exterior orientation element list according to the present invention.
Fig. 2 is a schematic diagram of the time synchronization control of the present invention.
Fig. 3 is a schematic diagram of the time synchronization control of the GNSS receiver board, the IMU, and the laser scanner according to the present invention.
Fig. 4 is a schematic diagram of the count performed by the UTC fine time counter in the time synchronization control of the GNSS receiving board card, the IMU, and the laser scanner according to the present invention.
Fig. 5 is a block diagram illustrating a process of automatically creating an external orientation element list of a digital image according to the present invention.
Fig. 6 is a block diagram of the automatic creation process of the exposure time list of digital images according to the present invention.
Detailed Description
The features and properties of the present invention are described in further detail below with reference to examples.
As shown in Fig. 1, a method for synchronous control of an airborne laser radar and automatic creation of a digital-image exterior orientation element list includes:
S1, the timing system of the GNSS receiver board receives satellite signals and keeps time in the UTC time system, and the GNSS receiver board, the IMU, the laser scanner, and the digital camera are then brought under time synchronization control;
Because the timing system of the GNSS receiver board receives satellite signals and keeps time in the UTC time system, it accumulates no error over time, whereas the timing chips of the IMU, the laser scanner, and the digital camera accumulate error as time passes. The scheme therefore uses the GNSS receiver board's UTC timing to drive the time synchronization of the IMU, the laser scanner, and the digital camera. As shown in Fig. 2, the GNSS receiver board and the IMU together form the POS system; the GNSS receiver board is also connected to the digital camera, the laser scanner, and the system control assembly. The system control assembly, which can be implemented with an embedded computer, issues the synchronization control command to the GNSS receiver board.
In one embodiment, as shown in Fig. 3, the time synchronization control of the GNSS receiver board, the IMU, and the laser scanner in S1 includes the following sub-steps:
S101, the GNSS receiver board receives a synchronization control instruction from the system control assembly;
S102, the GNSS receiver board sends the UTC initial time (generally over a serial data link) to the IMU and the laser scanner, and a UTC coarse-time timestamp is recorded in each;
S103, the UTC fine-time counters in the IMU and the laser scanner count starting from the UTC coarse-time timestamp;
S104, the GNSS receiver board continuously sends a synchronized pps pulse signal to the IMU and the laser scanner every second to suppress the scanners' accumulated error; on receiving the pps pulse, the IMU and the laser scanner reset their UTC fine-time counters to zero and count again. The UTC coarse time plus the UTC fine time is the working time at which the laser scanner records laser pulses; in this way the scanner's time stays consistent with the time reference of the GNSS receiver board, as shown in Fig. 4.
In the above process, the UTC coarse time comprises year, month, day, hour, minute, and second, with millisecond precision.
In the above process, the UTC fine-time counter accumulates clock ticks with a period of 1 μs. The 1 μs period is merely the preferred value in this embodiment; 1 ns, 10 ns, or other values may be set as needed.
In the above process, after the IMU and the laser scanner receive the pps pulse signal, the UTC fine-time counter is reset and recounts on the rising edge of the pps pulse.
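The coarse/fine timing scheme of S102 to S104 can be sketched as follows. The class and method names are assumptions for illustration, and the PPS handling (snapping the coarse time to the next whole second while zeroing the fine counter) is one plausible reading of the text, not the patent's firmware:

```python
import math

class UtcFineClock:
    """Sketch of a sensor-side UTC clock: a coarse timestamp set once over
    the serial link, plus a fine counter zeroed on every PPS rising edge."""

    TICK_S = 1e-6  # fine-counter period: 1 us here; 1 ns or 10 ns also possible

    def __init__(self, coarse_utc_s: float):
        self.coarse_utc_s = coarse_utc_s  # UTC coarse time (ms precision)
        self.ticks = 0                    # UTC fine-time counter

    def tick(self, n: int = 1) -> None:
        self.ticks += n                   # clock accumulation counting

    def on_pps_rising_edge(self) -> None:
        # The PPS marks an exact UTC second: snap the coarse time to the
        # next whole second and restart the fine counter from zero, so
        # local-oscillator drift never accumulates past one second.
        self.coarse_utc_s = math.floor(self.coarse_utc_s) + 1
        self.ticks = 0

    def now(self) -> float:
        """Working time = UTC coarse time + fine-counter contribution."""
        return self.coarse_utc_s + self.ticks * self.TICK_S
```

The key property, visible in `on_pps_rising_edge`, is that any drift in the 1 μs counter is discarded every second, which is exactly why the pps signal keeps the scanner's time consistent with the GNSS time reference.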
In one embodiment, the time synchronization control of the GNSS receiver board and the digital camera in S1 includes the following sub-steps:
S111, the digital camera receives a synchronization control instruction from the system control assembly;
S112, at the instant of exposure the digital camera sends an exposure TTL signal to the GNSS receiver board, which synchronously marks a UTC timestamp. The UTC time recorded by the GNSS receiver board at camera exposure is thus obtained, so the exposure time is consistent with the board's time reference and the two are synchronized.
It should be noted that the synchronization control commands sent by the system control assembly to the GNSS receiver board and to the digital camera are distinct commands issued at the same moment; that is, the GNSS receiver board, the IMU, and the laser scanner are time-synchronized, and the GNSS receiver board and the digital camera are time-synchronized.
S2, after time synchronization control is completed, the digital-image exterior orientation element list is created automatically through the following sub-steps:
S21, automatically detecting valid digital images among those shot by the digital camera;
In one embodiment, the volume of digital images shot by the digital camera generally ranges from hundreds of gigabytes to tens of terabytes, and shooting anomalies can make the file size of a single image abnormally large; valid images must therefore be detected automatically and abnormal images eliminated, reducing the data volume and improving system efficiency. Specifically, the method for automatic detection of valid digital images in S21 includes:
S211, presetting a file-size threshold for the digital images (the threshold can be set as required);
S212, extracting the file size of each digital image shot by the digital camera;
S213, eliminating those digital images whose file size does not satisfy the file-size threshold.
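Steps S211 to S213 can be sketched as a simple file-size filter. The function name and the default threshold values are assumptions for illustration; the patent only fixes the idea of a preset size threshold:

```python
from pathlib import Path

def detect_valid_images(folder, min_bytes=1_000_000, max_bytes=80_000_000):
    """Keep images whose file size satisfies the preset threshold;
    the default bounds here are illustrative, not values from the patent."""
    valid = []
    for p in sorted(Path(folder).glob("*.jpg")):
        size = p.stat().st_size          # file size read from the file system
        if min_bytes <= size <= max_bytes:
            valid.append(p)              # within threshold: keep
        # otherwise the image is eliminated as abnormal
    return valid
```

Filtering on size alone is cheap because it never decodes the image data, which matters at the hundreds-of-gigabytes scale mentioned above.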
S22, renaming the detected digital images;
In one embodiment, to ensure that digital images from different dates and different sorties never share a name, and to avoid images shot on different sorties being mistakenly overwritten because of identical file names, the images need to be renamed. Specifically, the method for renaming the detected digital images in S22 includes:
S221, extracting the shooting time of each digital image from the images shot by the digital camera;
S222, converting the shooting time into a date_seconds-of-day format;
S223, renaming all digital images in the date_seconds-of-day format or the projectname_date_seconds-of-day format. That is, with a single sortie, renaming can use the date_seconds-of-day format; with multiple sorties, renaming can use the projectname_date_seconds-of-day format as circumstances require. In addition, to avoid data confusion, the renamed digital images are stored in a separate folder.
In the above process, both the file size and the shooting time of a digital image are read as a stream file and obtained by parsing the image's data-header format.
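One way to form the date_seconds-of-day name from the parsed shooting time is sketched below. The exact string layout (digit counts, separator, extension) is an assumption; the patent fixes only the date and seconds-of-day components and the optional project-name prefix:

```python
from datetime import datetime
from typing import Optional

def renamed(capture_time: datetime, project: Optional[str] = None) -> str:
    """Build a date_seconds-of-day file name, optionally prefixed with a
    project name for multi-sortie missions so names never collide."""
    date = capture_time.strftime("%Y%m%d")
    midnight = capture_time.replace(hour=0, minute=0, second=0, microsecond=0)
    sec_of_day = (capture_time - midnight).total_seconds()
    stem = f"{date}_{sec_of_day:.3f}"      # date_seconds-of-day, ms precision
    return f"{project}_{stem}.jpg" if project else f"{stem}.jpg"
```

Because the name embeds both the date and the time of day, lexicographic sorting of the renamed files reproduces shooting order, which the list-building steps below rely on.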
S23, automatically creating the digital-image exterior orientation element list from the mounting relation between the digital camera and the IMU together with the exposure time, position, and attitude data of each digital image recorded in the POS data.
In one embodiment, as shown in Fig. 5, S23 includes the following sub-steps:
S231, creating a list L_t_n_s_v in which each row holds the name, position, attitude, and state of one image file, with rows sorted by the renamed image file names;
S232, obtaining the image file name of the first digital image in the list L_t_n_s_v and parsing the shooting time Tp1 from that file name;
S233, extracting from the POS data an event list file L_e containing the time, position, and attitude data of each image exposure; taking the exposure time Tb1 of the first digital image in L_e and computing the time difference between the shooting time and the exposure time of the first digital image as Ts = Tb1 - Tp1; reading any two adjacent rows of L_e and taking the difference of their exposure times (an integer number of seconds) to obtain the time interval Td at which the digital camera is triggered;
S234, in the list L_t_n_s_v, starting from the first digital image: if the shooting time of the nth digital image is Tpn, its non-precise exposure time is Tbcjn = Tpn + Ts, and the image is renamed with this non-precise exposure time by the method of S22;
S235, acquiring the exposure time Tbzn of the nth digital image row by row from the event list file L_e in the POS data;
S236, setting the decision threshold (-Td/2+0.1, Td/2-0.1) and checking row by row whether the difference between the exposure time Tbzn and the non-precise exposure time Tbcjn lies within (-Td/2+0.1, Td/2-0.1):
if so, the match succeeds; the corresponding state in the list L_t_n_s_v is set to "T", and S237 is executed;
if not, the match fails, and the corresponding state in the list L_t_n_s_v is set to "F";
S237, reading the position and attitude data of the nth digital image from the event list file L_e and comparing them with the preset threshold ranges for position and attitude data:
if they lie within the preset ranges, the state field of the corresponding row in the list L_t_n_s_v is set to "OK";
if not, the state field of the corresponding row in the list L_t_n_s_v is set to "False";
S238, incrementing n by 1 and repeating S235 to S237 until the last row of the event list file L_e is reached, completing the state marking of all digital images;
S239, extracting the image file names whose state is "OK" from the list L_t_n_s_v and copying the position and attitude data corresponding to those file names from the event list file L_e into L_t_n_s_v, i.e. establishing an initial digital-image exterior orientation element list Ltn;
S240, according to the mounting relation between the digital camera and the IMU, applying coordinate translation and rotation to the positions and attitudes in the initial list Ltn to obtain accurate exterior orientation element values, and establishing a file L_OE in one-to-one correspondence with the image file names, yielding the final digital-image exterior orientation element list. The list file can be an Excel table, a text file, or another format suitable for orthophoto processing with lidar data-processing software such as Terrasolid.
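The time-matching core of S232 to S236 can be sketched as follows. The list layout is simplified to two parallel arrays of times, and the specific numbers in the usage below are illustrative; the real lists also carry file names, positions, attitudes, and states:

```python
def match_images(shoot_times, exposure_times):
    """Mark each image "T" when its clock-corrected shooting time matches
    the event-list exposure time within (-Td/2+0.1, Td/2-0.1), else "F"."""
    # Clock offset Ts from the first image (S233).
    ts = exposure_times[0] - shoot_times[0]
    # Trigger interval Td from two adjacent event rows, an integer number
    # of seconds (S233).
    td = round(exposure_times[1] - exposure_times[0])
    lo, hi = -td / 2 + 0.1, td / 2 - 0.1
    states = []
    for tp, tbz in zip(shoot_times, exposure_times):
        tbcj = tp + ts                     # non-precise exposure time (S234)
        states.append("T" if lo < tbz - tbcj < hi else "F")
    return states
```

Rows marked "T" would then continue to the position/attitude check of S237 before their exterior orientation elements are copied into the final list.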
In one embodiment, as shown in fig. 6, step S23 of the present embodiment further includes automatically creating a digital image exposure time list according to the exposure time, position and attitude data of the digital images recorded in the POS data after renaming the digital images; the implementation steps are as follows:
S0231, creating a list L_t_n_s_v in which each row contains the corresponding image file name, exposure time and state, the rows being sorted by the renamed image file names;
S0232, acquiring the image file name of the first digital image in the list L_t_n_s_v, and parsing the shooting time Tp1 from the image file name;
S0233, extracting from the POS data an event list file L_e containing the time, position and attitude data of each image exposure, taking the exposure time Tb1 of the first digital image in the event list file L_e, and calculating the time difference between the shooting time and the exposure time of the first digital image as Ts = Tb1 - Tp1; randomly reading two adjacent rows of data in the event list file L_e, and obtaining the time interval Td at which the digital camera is triggered to shoot by taking the difference of the exposure times in the two rows and rounding it to an integer;
S0234, in the list L_t_n_s_v, starting from the first digital image, assuming that the shooting time of the nth digital image is Tpn, the non-precise exposure time of the nth digital image is Tbcjn = Tpn + Ts, and the image is renamed with the non-precise exposure time according to the method of S22;
S0235, acquiring one by one the exposure time Tbzn of the nth digital image from the event list file L_e in the POS data;
S0236, setting a judgment threshold range (-Td/2+0.1, Td/2-0.1), and checking row by row whether the difference between the exposure time Tbzn and the non-precise exposure time Tbcjn falls within the range (-Td/2+0.1, Td/2-0.1):
if so, the matching is successful; the corresponding state in the list L_t_n_s_v is set to "T", and S0237 is then executed;
if not, the matching has failed, and the corresponding state in the list L_t_n_s_v is set to "F";
S0237, reading the position and attitude data of the nth digital image from the event list file L_e, and comparing them with the set threshold ranges of the position and attitude data:
if they are within the set threshold ranges, setting the state column of the corresponding row in the list L_t_n_s_v to "OK";
if not, setting the state column of the corresponding row in the list L_t_n_s_v to "False";
S0238, incrementing n by 1 and repeating S0235 to S0237 until the last line in the event list file L_e is reached, which completes the state marking of all digital images;
S0239, extracting the image file names whose state is "OK" from the list L_t_n_s_v, and updating the exposure times corresponding to those image file names in the event list file L_e into the list L_t_n_s_v, i.e. creating the digital image exposure time list Ltn. Similarly, the file format of the digital image exposure time list Ltn may be an Excel table, a text file, etc., and is suitable for processing orthoimages in the laser radar data processing software TerraSolid.
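The time-matching logic of S0232 to S0239 can be sketched as follows. The filename layout, function names and data shapes are illustrative assumptions; only the offset Ts = Tb1 - Tp1 and the judgment threshold (-Td/2+0.1, Td/2-0.1) come from the steps above.

```python
def parse_shot_time(filename):
    """Parse the shooting time (seconds of day) from a renamed image file
    such as 'proj_20200616_43200.5.jpg'; this filename layout is assumed."""
    stem = filename.rsplit('.jpg', 1)[0]
    return float(stem.rsplit('_', 1)[-1])

def match_exposures(rows, events, Td):
    """rows: [(filename, shot_time)] from the list L_t_n_s_v;
    events: exposure times from the event list file L_e;
    Td: camera trigger interval. Returns (filename, state) pairs, where
    'T' marks a successful match against the non-precise exposure time."""
    Ts = events[0] - rows[0][1]            # clock offset Ts = Tb1 - Tp1
    lo, hi = -Td / 2 + 0.1, Td / 2 - 0.1   # judgment threshold of S0236
    states = []
    for (name, tp), tb in zip(rows, events):
        diff = tb - (tp + Ts)              # exact minus non-precise time
        states.append((name, 'T' if lo < diff < hi else 'F'))
    return states
```

Rows marked 'T' would then be screened against the position and attitude thresholds (S0237) before being written into the exposure time list Ltn.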
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents and improvements made within the spirit and principle of the present invention are intended to be included within the scope of the present invention.

Claims (10)

1. A method for synchronously controlling an airborne laser radar and automatically creating a digital image exterior orientation element list is characterized by comprising the following steps:
S1, the timing system of the GNSS receiving board card receives satellite signals and keeps time according to the UTC time system, and time synchronization control is then performed on the GNSS receiving board card, the IMU, the laser scanner and the digital camera;
and S2, after the time synchronization control is completed, automatically creating the digital image exterior orientation element list through the following substeps:
S21, automatically detecting valid digital images among the digital images shot by the digital camera;
S22, renaming the detected digital images;
and S23, automatically creating the digital image exterior orientation element list according to the mounting position relation between the digital camera and the IMU and the exposure time, position and attitude data of the digital images recorded in the POS data.
2. The method for synchronously controlling the airborne lidar and automatically creating the digital image exterior orientation element list according to claim 1, wherein the step of controlling the time synchronization of the GNSS receiver board, the IMU and the laser scanner in S1 comprises the following sub-steps:
S101, the GNSS receiving board card receives a synchronization control instruction from the system control component;
S102, the GNSS receiving board card sends the UTC initial time to the IMU and the laser scanner, and timestamps of the UTC coarse time are marked in the IMU and the laser scanner;
S103, UTC fine time counters in the IMU and the laser scanner count, taking the timestamp of the UTC coarse time as the starting point;
and S104, the GNSS receiving board card synchronously sends a pps pulse signal to the IMU and the laser scanner every second, and after the IMU and the laser scanner receive the pps pulse signal, the UTC fine time counters are reset to zero and restart counting.
3. The method as claimed in claim 2, wherein the UTC coarse time comprises year, month, day, hour, minute and second, with millisecond precision.
4. The method of claim 2, wherein the UTC fine time counter counts up with a period of 1 μs.
5. The method of claim 2, wherein after the IMU and the laser scanner receive the pps pulse signal, the UTC fine time counter is reset to zero and restarts counting on the rising edge of the pps pulse signal.
6. The method for synchronously controlling an airborne lidar and automatically creating a list of external orientation elements of digital images according to claim 1, wherein the step of time synchronization control between the GNSS receiver board and the digital camera in S1 comprises the following sub-steps:
S111, the digital camera receives a synchronization control instruction from the system control component;
and S112, at the moment of exposure, the digital camera sends an exposure TTL signal to the GNSS receiving board card, and a UTC timestamp is synchronously marked on the GNSS receiving board card.
7. The method for synchronously controlling an airborne lidar and automatically creating an external orientation element list of a digital image according to claim 1, wherein the step of automatically detecting the digital image captured by the digital camera in S21 comprises:
S211, presetting a file size threshold for the digital images;
S212, extracting the file size of each digital image shot by the digital camera;
S213, eliminating, from the digital images shot by the digital camera, those whose file size does not meet the file size threshold.
8. The method for synchronously controlling an airborne lidar and automatically creating a list of external orientation elements of a digital image according to claim 1, wherein the step of renaming the detected digital image in S22 comprises:
S221, extracting the shooting time of each digital image shot by the digital camera;
S222, converting the shooting time into a date_seconds-of-day format;
S223, renaming all digital images in the date_seconds-of-day format or the project name_date_seconds-of-day format.
9. The method for synchronously controlling an airborne lidar and automatically creating a list of external orientation elements in a digital image according to claim 1, wherein S23 comprises the following sub-steps:
S231, creating a list L_t_n_s_v in which each row contains the corresponding image file name, position, attitude and state, the rows being sorted by the renamed image file names;
S232, acquiring the image file name of the first digital image in the list L_t_n_s_v, and parsing the shooting time Tp1 from the image file name;
S233, extracting from the POS data an event list file L_e containing the exposure time, position and attitude data of each image, taking the exposure time Tb1 of the first digital image in the event list file L_e, and calculating the time difference between the shooting time and the exposure time of the first digital image as Ts = Tb1 - Tp1; randomly reading two adjacent rows of data in the event list file L_e, and obtaining the time interval Td at which the digital camera is triggered to shoot by taking the difference of the exposure times in the two rows and rounding it to an integer;
S234, in the list L_t_n_s_v, starting from the first digital image, assuming that the shooting time of the nth digital image is Tpn, the non-precise exposure time of the nth digital image is Tbcjn = Tpn + Ts, and the image is renamed with the non-precise exposure time according to the method of S22;
S235, acquiring one by one the exposure time Tbzn of the nth digital image from the event list file L_e in the POS data;
S236, setting a judgment threshold range (-Td/2+0.1, Td/2-0.1), and checking row by row whether the difference between the exposure time Tbzn and the non-precise exposure time Tbcjn falls within the range (-Td/2+0.1, Td/2-0.1):
if so, the matching is successful; the corresponding state in the list L_t_n_s_v is set to "T", and S237 is then executed;
if not, the matching has failed, and the corresponding state in the list L_t_n_s_v is set to "F";
S237, reading the position and attitude data of the nth digital image from the event list file L_e, and comparing them with the set threshold ranges of the position and attitude data:
if they are within the set threshold ranges, setting the state column of the corresponding row in the list L_t_n_s_v to "OK";
if not, setting the state column of the corresponding row in the list L_t_n_s_v to "False";
S238, incrementing n by 1 and repeating steps S235 to S237 until the last line in the event list file L_e is reached, which completes the state marking of all the digital images;
S239, extracting the image file names whose state is "OK" from the list L_t_n_s_v, and updating the position and attitude data corresponding to those image file names in the event list file L_e into the list L_t_n_s_v, i.e. establishing an initial digital image exterior orientation element list Ltn;
and S240, according to the mounting position relation between the digital camera and the IMU, performing coordinate translation and coordinate rotation on the positions and attitudes in the initial image exterior orientation element list Ltn to obtain accurate exterior orientation element values, and establishing a file L_OE in one-to-one correspondence with the image file names to obtain the final digital image exterior orientation element list.
10. The method for synchronously controlling the airborne laser radar and automatically creating the digital image exterior orientation element list according to claim 1, wherein step S23 further comprises automatically creating a digital image exposure time list according to the exposure time, position and attitude data of the digital images recorded in the POS data after renaming the digital images, through the following implementation steps:
S0231, creating a list L_t_n_s_v in which each row contains the corresponding image file name, exposure time and state, the rows being sorted by the renamed image file names;
S0232, acquiring the image file name of the first digital image in the list L_t_n_s_v, and parsing the shooting time Tp1 from the image file name;
S0233, extracting from the POS data an event list file L_e containing the time, position and attitude data of each image exposure, taking the exposure time Tb1 of the first digital image in the event list file L_e, and calculating the time difference between the shooting time and the exposure time of the first digital image as Ts = Tb1 - Tp1; randomly reading two adjacent rows of data in the event list file L_e, and obtaining the time interval Td at which the digital camera is triggered to shoot by taking the difference of the exposure times in the two rows and rounding it to an integer;
S0234, in the list L_t_n_s_v, starting from the first digital image, assuming that the shooting time of the nth digital image is Tpn, the non-precise exposure time of the nth digital image is Tbcjn = Tpn + Ts, and the image is renamed with the non-precise exposure time according to the method of S22;
S0235, acquiring one by one the exposure time Tbzn of the nth digital image from the event list file L_e in the POS data;
S0236, setting a judgment threshold range (-Td/2+0.1, Td/2-0.1), and checking row by row whether the difference between the exposure time Tbzn and the non-precise exposure time Tbcjn falls within the range (-Td/2+0.1, Td/2-0.1):
if so, the matching is successful; the corresponding state in the list L_t_n_s_v is set to "T", and S0237 is then executed;
if not, the matching has failed, and the corresponding state in the list L_t_n_s_v is set to "F";
S0237, reading the position and attitude data of the nth digital image from the event list file L_e, and comparing them with the set threshold ranges of the position and attitude data:
if they are within the set threshold ranges, setting the state column of the corresponding row in the list L_t_n_s_v to "OK";
if not, setting the state column of the corresponding row in the list L_t_n_s_v to "False";
S0238, incrementing n by 1 and repeating S0235 to S0237 until the last line in the event list file L_e is reached, which completes the state marking of all digital images;
S0239, extracting the image file names whose state is "OK" from the list L_t_n_s_v, and updating the exposure times corresponding to those image file names in the event list file L_e into the list L_t_n_s_v, i.e. creating the digital image exposure time list Ltn.
CN202010548357.9A 2020-06-16 2020-06-16 Airborne laser radar synchronous control and automatic digital image exterior orientation element list creation method Active CN111896972B (en)


Publications (2)

Publication Number Publication Date
CN111896972A (en) 2020-11-06
CN111896972B (en) 2022-10-18

Family

ID=73207689


Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI816515B (en) * 2022-08-17 2023-09-21 國立臺北大學 Radar and image synchronization method and radar and image synchronization system

Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040073578A1 (en) * 2002-10-14 2004-04-15 Nam Kwang Woo Spatial image information system and method for supporting efficient storage and retrieval of spatial images
US20050177307A1 (en) * 2002-05-30 2005-08-11 Rafael-Armament Development Authority Ltd Airborne reconnaissance system
CN101241011A (en) * 2007-02-28 2008-08-13 北京北科天绘科技有限公司 High precision positioning and posture-fixing device on laser radar platform and method
CN101273301A (en) * 2005-09-07 2008-09-24 肖军 Self-helping digital image processing device, system and digital image processing method
CN103399484A (en) * 2013-07-23 2013-11-20 深圳市元征科技股份有限公司 Local clock calibrating method and vehicle-mounted equipment
JP2014527630A (en) * 2011-08-12 2014-10-16 ライカ ジオシステムズ アクチエンゲゼルシャフトLeica Geosystems AG Measuring device for determining the spatial posture of a measuring aid
CN104730539A (en) * 2015-03-06 2015-06-24 河南四维远见信息技术有限公司 Low-altitude light and small infrared and laser radar integrated system
EP2990828A1 (en) * 2014-08-26 2016-03-02 Kabushiki Kaisha Topcon Point cloud position data processing device, point cloud position data processing system, point cloud position data processing method, and program therefor
CN105656617A (en) * 2016-01-06 2016-06-08 中国洛阳电子装备试验中心 Time control pulse interval laser encoding and decoding method
CN107204037A (en) * 2016-03-17 2017-09-26 中国科学院光电研究院 3-dimensional image generation method based on main passive 3-D imaging system
CN108594255A (en) * 2018-04-20 2018-09-28 武汉大学 A kind of laser ranging auxiliary optical image association error compensation method and system
CN109359205A (en) * 2018-08-30 2019-02-19 中国农业大学 A kind of remote sensing image cutting method and equipment based on geographical grid
CN110209847A (en) * 2019-04-29 2019-09-06 中国科学院遥感与数字地球研究所 Quasi real time processing method, device and storage medium on Airborne Data Classification machine
CN215064560U (en) * 2021-07-29 2021-12-07 中国工程物理研究院应用电子学研究所 Autonomous positioning device based on laser radar imaging


Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
ALEX ZIHAO ZHU et al.: "The Multivehicle Stereo Event Camera Dataset: An Event Camera Dataset for 3D Perception", IEEE Robotics and Automation Letters *
MENGLAN HU et al.: "Holistic Scheduling of Real-Time Applications in Time-Triggered In-Vehicle Networks", IEEE Transactions on Industrial Informatics *
张爱武 et al.: "Instantaneous three-dimensional imaging method for mobile lidar", Acta Geodaetica et Cartographica Sinica *
杨浩 et al.: "Three-dimensional point cloud reconstruction and roaming method for vehicle-mounted lidar", Journal of Terahertz Science and Electronic Information Technology *




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant