CN111551152A - Monocular vision-based relative pose measurement method and device for near space aircraft - Google Patents

Monocular vision-based relative pose measurement method and device for near space aircraft

Info

Publication number
CN111551152A
CN111551152A (application CN202010499037.9A)
Authority
CN
China
Prior art keywords
coordinate system
measuring
air bag
monocular camera
embedded processor
Prior art date
Legal status
Granted
Application number
CN202010499037.9A
Other languages
Chinese (zh)
Other versions
CN111551152B (en)
Inventor
黄磊 (Huang Lei)
Current Assignee
Jiangsu Jicui Intelligent Photoelectric System Research Institute Co ltd
Original Assignee
Jiangsu Jicui Intelligent Photoelectric System Research Institute Co ltd
Priority date
Filing date
Publication date
Application filed by Jiangsu Jicui Intelligent Photoelectric System Research Institute Co ltd
Priority to CN202010499037.9A
Publication of CN111551152A
Application granted
Publication of CN111551152B
Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C11/00 Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G01C11/02 Picture taking arrangements specially adapted for photogrammetry or photographic surveying, e.g. controlling overlapping of pictures
    • G01C11/04 Interpretation of pictures

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention relates to a monocular vision-based method and device for measuring the relative pose of a near space vehicle. An airbag coordinate system, a pod coordinate system and a measurement system coordinate system are established, together with the relations among them. A monocular camera acquires images and transmits them to an embedded processor, which extracts the image center positions of the cooperative markers and matches them to the marker coordinates in the airbag coordinate system. The relative pose of the airbag coordinate system with respect to the measurement system coordinate system is then computed, and the relative pose of the airbag coordinate system with respect to the pod coordinate system is obtained through the known transformation between the pod and measurement system coordinate systems. Based on illuminated retro-reflective cooperative markers, a monocular camera and an onboard embedded processor, the method and device effectively measure the relative pose of the airbag and pod of a near space vehicle in flight, meet the requirements of light weight and low power consumption, and are of great significance for ensuring flight safety and controlling flight motion.

Description

Monocular vision-based relative pose measurement method and device for near space aircraft
Technical Field
The invention belongs to the technical field of near space vehicles, and in particular relates to a monocular vision-based method and device for measuring the relative pose of a near space vehicle.
Background
Airspace at an altitude of 20-100 km is generally called near space. As a newly exploited airspace from which both the space above and the air, sea and ground below can be influenced, it has become a hot spot of future research and application. A near space aircraft mainly comprises an airbag filled with nitrogen and a pod carrying the payload equipment, the two connected together by cables; such aircraft are mainly used for ground and space observation. Real-time measurement of the relative pose between the airbag and the pod during flight is of great practical significance for ensuring flight safety and controlling flight motion.
However, no method for measuring the relative pose of the airbag and pod of a near space vehicle has been publicly reported. Current pose measurement relies mainly on inertial navigation and global positioning systems, which can only measure the position and attitude of the airbag and the pod independently and cannot obtain the relative pose between them. Owing to advantages such as a large measurement range and a non-contact measurement process, visual measurement holds an irreplaceable position in the measurement field. Vision-based pose measurement is therefore the key technology for solving these problems.
The measurement of the relative pose of the near space aircraft aims at high-precision, real-time measurement of the position and attitude of the airbag relative to the pod during flight. The aircraft is large, so the camera needs a large, upward-facing measurement field of view, and calibrating the relations among the measurement coordinate systems is a major problem. Moreover, the measurement system faces complex imaging conditions. Observing the sky, it is easily disturbed by the sun; imaging conditions also change drastically from the ground to the maximum altitude: near the ground there are ground-reflected light, atmospheric refraction and uniform sky brightness, while with increasing altitude the air thins and frontlit and backlit imaging differ enormously. The measured airbag surface lacks distinctive features, and under such complex imaging conditions features are difficult to extract from it for computation. In addition, the measurement system must be installed in the pod as onboard equipment, which imposes strict requirements on its weight and power consumption.
Disclosure of Invention
To solve the above problems in the prior art, the invention aims to provide a monocular vision-based method and device for measuring the relative pose of a near space aircraft.
To achieve this purpose and the associated technical effects, the invention adopts the following technical scheme:
The monocular vision-based method for measuring the relative pose of a near space aircraft comprises the following steps:
a. before flight measurement starts, several cooperative markers are adhered in a suitable distribution to the bottom surface of the airbag; the monocular camera, illumination light source and embedded processor are fixed in the pod, with the camera and light source connected to the embedded processor;
b. the airbag coordinate system, the pod coordinate system and the monocular camera measurement system coordinate system are established and unified, and the calibrated parameters are imported into the embedded processor in preparation for measurement;
c. once measurement starts, the embedded processor keeps the illumination light source on while the monocular camera continuously acquires images and transmits them to the embedded processor, which processes each image and extracts the image center position of every cooperative marker;
d. using the spatial arrangement of the cooperative markers, the embedded processor matches each image position to the corresponding marker coordinate in the airbag coordinate system, computes the relative pose of the airbag coordinate system with respect to the monocular camera measurement system coordinate system, and obtains the relative pose of the airbag coordinate system with respect to the pod coordinate system through the relation between the pod and monocular camera measurement system coordinate systems;
e. the embedded processor transmits the result to the back-end controller in real time.
Further, in step a, black light-absorbing cloth and bright silver chemical-fiber retro-reflective cloth are cut to form X-shaped corner-point features serving as cooperative markers, and six such markers are adhered to the bottom surface of the airbag along the airbag skeleton.
Further, in step a, two windows are provided in the upper surface of the pod shell and covered with coated glass to isolate the pod interior from the outside. The monocular camera and the illumination light source are mounted at the two corresponding windows inside the pod, facing upward; their viewing angles are adjusted so that the camera field of view and the illuminated field cover the movement range of the cooperative markers. The monocular camera is rigidly fixed so that its position does not change during flight.
Further, in step b, the airbag coordinate system, the pod coordinate system and the measurement system coordinate system are established and calibrated as follows:
(1) before the monocular camera is installed, its intrinsic parameters are calibrated in a laboratory environment;
(2) the features that define the airbag coordinate system at design time are measured with a total station, and the airbag coordinate system is established;
(3) with the airbag coordinate system set as the built-in measurement coordinate system of the total station, the coordinates of the adhered cooperative markers are measured, yielding the coordinates of all markers in the airbag coordinate system, which are numbered according to a fixed rule;
(4) the features that define the pod coordinate system at design time are measured with a total station, and the pod coordinate system is established;
(5) the fixed monocular camera photographs the feature points on a calibration target while the target's position and attitude are varied; with the pod coordinate system set as the total station's built-in measurement coordinate system, the target feature points are measured synchronously to obtain their coordinates in the pod coordinate system. Since the relation between the lithographically etched target and its feature points is known, the rotation-translation between the pod coordinate system and the monocular camera measurement system coordinate system can be calibrated (a sketch of this alignment step follows after this list);
(6) the calibration results are imported into the embedded processor.
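One way to realize the rigid alignment in step (5) is to express the same target feature points in both frames (in the camera frame via PnP on the photographed target, in the pod frame via the total station) and solve for the rotation-translation between the two point sets. The sketch below is our own illustration of that computation using the SVD-based Kabsch/Umeyama method; the patent does not prescribe a specific algorithm, and all names are hypothetical.

```python
import numpy as np

def rigid_transform(pts_src: np.ndarray, pts_dst: np.ndarray):
    """Find R, T with pts_dst ~= R @ pts_src + T; both arrays are Nx3, N >= 3."""
    c_src = pts_src.mean(axis=0)                     # centroid of each point set
    c_dst = pts_dst.mean(axis=0)
    H = (pts_src - c_src).T @ (pts_dst - c_dst)      # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # forbid reflection
    R = Vt.T @ D @ U.T
    T = c_dst - R @ c_src
    return R, T
```

Applied with pts_src the target points in the camera frame and pts_dst the same points in the pod frame, R and T give the camera-to-pod transformation used in step d.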
Further, in step c, the embedded processor performs image processing as follows:
(1) during measurement, the embedded processor keeps the illumination light source on and issues a synchronous trigger signal to the monocular camera at a frequency of 100 Hz; the camera continuously acquires images and transmits them to the embedded processor over a gigabit Ethernet port;
(2) sub-pixel extraction of the X-shaped corner-point features is performed on the coarsely screened image regions using the Hessian matrix; since, apart from the markers, the airbag surface and the sky background lack X-shaped corner features, the sub-pixel corner centers of all required cooperative markers in the image can be obtained robustly;
The Hessian matrix of pixel $(x_0, y_0)$ is:

$$H(x_0, y_0) = \begin{bmatrix} r_{xx} & r_{xy} \\ r_{xy} & r_{yy} \end{bmatrix}$$

The second-order gradient value along the normal direction and the normal direction $(n_x, n_y)$ itself are obtained as the eigenvalue of largest absolute value of the Hessian matrix at that point and its corresponding eigenvector;

Let the sub-pixel corner coordinate be $(x_0 + s, y_0 + t)$ with $(s, t) \in [-0.5, 0.5] \times [-0.5, 0.5]$, i.e. the first-order zero crossing of the edge lies within the current pixel. A second-order Taylor expansion of the gray value about the point $(x_0, y_0)$ gives:

$$r(x_0 + s, y_0 + t) \approx r(x_0, y_0) + s\,r_x + t\,r_y + \frac{1}{2}\left(s^2 r_{xx} + 2st\,r_{xy} + t^2 r_{yy}\right)$$

Setting the first derivative along the normal direction to zero yields:

$$s = -\frac{n_x\,(n_x r_x + n_y r_y)}{n_x^2 r_{xx} + 2 n_x n_y r_{xy} + n_y^2 r_{yy}}, \qquad t = -\frac{n_y\,(n_x r_x + n_y r_y)}{n_x^2 r_{xx} + 2 n_x n_y r_{xy} + n_y^2 r_{yy}}$$

where $r_{xx}$, $r_{xy}$, $r_{yy}$ denote, respectively, the second-order gradient of the image at $(x_0, y_0)$ in the x direction, the mixed second-order gradient (first in x, then in y), and the second-order gradient in the y direction, and $r_x$, $r_y$ denote the first-order gradients in the x and y directions.
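A minimal sketch of this refinement step, assuming a float grayscale image and central-difference gradients; this is our own illustration of the Hessian and normal-direction computation above, not the patent's code, and all names are hypothetical.

```python
import numpy as np

def subpixel_corner(img: np.ndarray, x0: int, y0: int):
    """Refine integer corner (x0, y0) to sub-pixel; img is float, indexed [y, x]."""
    # First- and second-order gradients by central differences (interior pixels only).
    rx  = (img[y0, x0 + 1] - img[y0, x0 - 1]) / 2.0
    ry  = (img[y0 + 1, x0] - img[y0 - 1, x0]) / 2.0
    rxx = img[y0, x0 + 1] - 2.0 * img[y0, x0] + img[y0, x0 - 1]
    ryy = img[y0 + 1, x0] - 2.0 * img[y0, x0] + img[y0 - 1, x0]
    rxy = (img[y0 + 1, x0 + 1] - img[y0 + 1, x0 - 1]
           - img[y0 - 1, x0 + 1] + img[y0 - 1, x0 - 1]) / 4.0
    hessian = np.array([[rxx, rxy], [rxy, ryy]])
    evals, evecs = np.linalg.eigh(hessian)
    nx, ny = evecs[:, np.argmax(np.abs(evals))]      # normal direction (nx, ny)
    denom = nx * nx * rxx + 2.0 * nx * ny * rxy + ny * ny * ryy
    if abs(denom) < 1e-12:
        return None                                  # degenerate point, no refinement
    step = -(nx * rx + ny * ry) / denom              # zero crossing along the normal
    s, t = step * nx, step * ny
    if abs(s) <= 0.5 and abs(t) <= 0.5:              # crossing inside this pixel
        return x0 + s, y0 + t
    return None
```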
Further, step d comprises the following specific steps:
(1) the airbag marker feature points are matched to the marker pixel points in the image through the invariant spatial geometric-topological constraints of the optical feature points, yielding corresponding monocular image pixel coordinates and world coordinates in the airbag coordinate system;
(2) from the three-dimensional coordinates of each feature point in the airbag coordinate system and its two-dimensional pixel coordinates in the monocular image, the pose of the airbag coordinate system relative to the monocular camera measurement system coordinate system is solved;
(3) given the known transformation between the pod coordinate system and the monocular camera measurement system coordinate system, the pose transformation of the airbag coordinate system relative to the pod coordinate system is obtained by coordinate system conversion.
Further, in the device, several cooperative markers are arranged on the bottom surface of the airbag; the monocular camera, illumination light source and embedded processor are fixed in the pod; the camera and light source are connected to the embedded processor; and the embedded processor is connected to the back-end controller for data transmission.
Compared with the prior art, the invention has the following beneficial effects:
The invention discloses a monocular vision-based method and device for measuring the relative pose of a near space aircraft. The airbag, pod and monocular camera measurement system coordinate systems are first established, together with the relations among them. The monocular camera continuously acquires images and transmits them to the embedded processor, which processes each image, extracts the image center positions of the cooperative markers, matches them to the marker coordinates in the airbag coordinate system, computes the relative pose of the airbag coordinate system with respect to the measurement system coordinate system, and obtains the relative pose of the airbag coordinate system with respect to the pod coordinate system through the transformation between the pod and measurement system coordinate systems. The result computed by the embedded processor is transmitted to the back-end controller in real time as motion feedback. Based on illuminated retro-reflective cooperative markers, a monocular camera and an onboard embedded processor, the method and device effectively measure the relative pose of the airbag and pod during flight, meet the requirements of light weight and low power consumption, allow the cooperative markers to be extracted from the airbag surface for computation under complex imaging conditions, and are of great significance for ensuring the flight safety and controlling the flight motion of the near space vehicle.
Drawings
FIG. 1 is a block diagram of the steps of the invention;
FIG. 2 is a layout view of the cooperative markers of the invention on the airbag;
FIG. 3 is a schematic diagram of the electrical connections inside the pod of the invention;
FIG. 4 is a measurement schematic diagram of the invention.
Detailed Description
Embodiments of the invention are described in detail below with reference to the accompanying drawings, so that its advantages and features may be more easily understood by those skilled in the art and its scope of protection more clearly defined.
As shown in FIGS. 1-4, the monocular vision-based device for measuring the relative pose of a near space vehicle comprises an airbag 2 and a pod. Several cooperative markers 1 are arranged on the bottom surface of the airbag 2; a monocular camera, an illumination light source and an embedded processor are fixed in the pod; the camera and light source are connected to the embedded processor, which is connected to a back-end controller for data transmission.
The monocular vision-based method for measuring the relative pose of a near space aircraft comprises the following steps:
a. before flight measurement starts, the cooperative markers 1 are adhered in a suitable distribution to the bottom surface of the airbag 2, and the monocular camera, illumination light source and embedded processing system are installed and fixed in the pod; the camera and light source are mounted at the pod windows, facing upward, with properly adjusted viewing angles. Specifically:
(1) black light-absorbing cloth and bright silver chemical-fiber retro-reflective cloth are cut to form X-shaped corner-point features (the two materials alternate at 90° intervals) serving as cooperative markers. As shown in FIG. 2, six cooperative markers 1 are laid out on the surface of the airbag 2 according to the variation of the field of view; three of them are coded markers identifying the region in which the field of view lies, and the rest are ordinary markers. A marker's number is judged from the distance between an ordinary marker and a coded marker;
(2) two windows are provided in the upper surface of the pod shell and covered with coated glass to isolate the pod interior from the outside; the monocular camera and the illumination light source are mounted beneath the two corresponding windows, their viewing angles adjusted so that the camera field of view and the illuminated field cover the feature movement range; the camera is rigidly fixed so that its position does not change during flight;
(3) the embedded processor is fixed in the pod; it is connected to the monocular camera by a gigabit Ethernet port and a synchronous trigger line, to the illumination light source by a control line, and to the back-end controller by an RS-422 bus for transmitting result data.
b. After all modules are installed and fixed, the airbag and pod coordinate systems are established with a total station, the monocular camera measurement system coordinate system is established with the aid of a calibration target, and all coordinate systems are unified. The calibrated parameters are imported into the embedded processor in preparation for measurement. The specific steps are:
(1) to realize measurement, the system must establish the airbag coordinate system $(O_B X_B Y_B Z_B)$, the pod coordinate system $(O_S X_S Y_S Z_S)$ and the measurement camera coordinate system $(O_C X_C Y_C Z_C)$, where the monocular camera measurement system coordinate system is attached to the monocular camera. Calibration provides the coordinates $(P_{t1}, P_{t2}, \ldots, P_{t6})$ of all cooperative markers 1 in the airbag coordinate system, together with the rotation-translation between the monocular camera measurement system coordinate system and the pod coordinate system, i.e. a rotation matrix R and a translation vector T. Before the monocular camera is installed, its intrinsic parameters are calibrated in a laboratory environment;
(2) the features defining the airbag coordinate system at design time are measured with the total station, and the airbag coordinate system is established;
(3) with the airbag coordinate system set as the total station's built-in measurement coordinate system, the coordinates of the adhered cooperative markers 1 are measured, yielding the coordinates of all markers in the airbag coordinate system, which are numbered according to a fixed rule;
(4) the features defining the pod coordinate system at design time are measured with the total station, and the pod coordinate system is established;
(5) the fixed monocular camera photographs the feature points on the calibration target while the target's position and attitude are varied; with the pod coordinate system set as the total station's built-in measurement coordinate system, the target feature points are measured synchronously to obtain their coordinates in the pod coordinate system. Since the relation between the lithographically etched target and its feature points is known, the rotation-translation between the pod coordinate system and the monocular camera measurement system coordinate system can be calibrated;
(6) the calibration results are imported into the embedded processor.
c. Once measurement starts, the embedded processor keeps the illumination light source on, issues synchronous trigger signals at a frame rate of 100 Hz, and thereby controls the monocular camera to continuously acquire images and transmit them to the embedded processor, which processes each image and extracts the image center position of every cooperative marker. The specific steps are:
(1) the embedded processor is a control and computation unit built around an ARM CPU plus GPU, achieving miniaturization and low power consumption; it connects to the monocular camera through a gigabit Ethernet port, to the camera's hardware trigger input through one control signal, and to the illumination light source switch through another;
(2) during measurement, the embedded processor keeps the illumination light source on and issues a synchronous trigger signal to the camera at 100 Hz; the camera continuously acquires images and transmits them to the embedded processor over the gigabit Ethernet port;
(3) under the illumination light source, the retro-reflective material in each cooperative marker 1 appears bright in the image while the surrounding light-absorbing material appears dark, so the approximate position of each marker can be determined by adaptive binarization and morphological operations;
(4) sub-pixel extraction of the X-shaped corner features is then performed on the coarsely screened regions using the Hessian matrix; since the airbag surface and the sky background lack X-shaped corner features apart from the markers, the sub-pixel corner centers of all required cooperative markers 1 can be obtained robustly. The image processing is accelerated on the GPU for fast throughput.
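The coarse localization in step (3) could be realized, for example, with OpenCV as sketched below; the adaptive-threshold block size and offset and the blob-area limits are illustrative assumptions rather than values given in the patent.

```python
import cv2
import numpy as np

def coarse_marker_regions(gray: np.ndarray):
    """Return centroids of candidate marker blobs in an 8-bit grayscale image."""
    # Keep pixels notably brighter than their local neighborhood (retro-reflector).
    binary = cv2.adaptiveThreshold(gray, 255, cv2.ADAPTIVE_THRESH_MEAN_C,
                                   cv2.THRESH_BINARY, 51, -10)
    # Morphological opening suppresses small bright noise such as sun glints.
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
    opened = cv2.morphologyEx(binary, cv2.MORPH_OPEN, kernel)
    n, _, stats, centroids = cv2.connectedComponentsWithStats(opened)
    # Keep only blobs of plausible marker size (label 0 is the background).
    return [tuple(centroids[i]) for i in range(1, n)
            if 50 < stats[i, cv2.CC_STAT_AREA] < 5000]
```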
d. Based on the image center position of each cooperative marker 1 obtained by image processing, the airbag marker feature points are matched to the marker pixel points in the image through the invariant spatial geometric-topological constraints of the optical feature points, giving the corresponding coordinates in the monocular camera measurement system coordinate system and the airbag coordinate system. The relative pose between the airbag and measurement system coordinate systems is then solved algorithmically, and the final relative pose between the airbag and pod coordinate systems is obtained through the relation between the measurement system and pod coordinate systems. The specific steps are:
(1) the airbag marker feature points are matched to the image marker pixel points through the invariant spatial geometric-topological constraints of the optical feature points, yielding the corresponding coordinates in the monocular camera measurement system coordinate system and the airbag coordinate system;
(2) from the three-dimensional coordinates of each feature point in the airbag coordinate system and its two-dimensional pixel coordinates in the monocular image, the pose of the airbag coordinate system relative to the monocular camera measurement system coordinate system is solved by a PnP algorithm;
(3) given the known transformation between the pod and monocular camera measurement system coordinate systems, the real-time rotation matrix R and translation vector T of the airbag coordinate system relative to the pod coordinate system are computed by coordinate system conversion; these are the target parameters of the measurement system;
e. the result computed by the embedded processor is transmitted in real time to the back-end controller over the RS-422 bus as motion feedback.
Example 1
As shown in FIGS. 1-4, the monocular vision-based device for measuring the relative pose of a near space vehicle comprises an airbag 2 and a pod; several cooperative markers 1 are arranged on the bottom surface of the airbag 2; a monocular camera, an illumination light source and an embedded processor are fixed in the pod; the camera and light source are both connected to the embedded processor; the embedded processor is connected to the back-end controller by an RS-422 bus for data transmission, and to the monocular camera through an RJ45 network port.
In the monocular vision-based method for measuring the relative pose of a near space vehicle, good imaging under complex ambient light is achieved by illuminating the retro-reflective cooperative markers 1, and the ARM+GPU embedded processor provides the miniaturization, low power consumption and fast computation required of onboard equipment. The method involves the establishment and field calibration of the coordinate systems, the construction and imaging of target features under complex conditions, and image processing and pose solving on the onboard embedded system; these are the core techniques of visual measurement and the key to achieving the measurement goal and guaranteeing the system's measurement accuracy.
The monocular vision-based method for measuring the relative pose of a near space aircraft first establishes and calibrates the coordinate systems of the whole system, including the airbag coordinate system, the pod coordinate system and the measurement system coordinate system, and establishes the relations among them.
The method comprises the following specific steps:
a. before flight measurement starts, several cooperative markers 1 are adhered in a suitable distribution to the bottom surface of the airbag 2; the monocular camera, illumination light source and embedded processor are installed and fixed in the pod and connected together, with the light source switched on and off by the embedded processor; the camera and light source are mounted at the pod windows, facing upward, with properly adjusted viewing angles;
b. after all modules are installed and fixed, the airbag and pod coordinate systems are established with a total station, the monocular camera measurement system coordinate system is established with the aid of a calibration target, all coordinate systems are unified, and the calibrated parameters are imported into the embedded processor in preparation for measurement;
c. once measurement starts, the embedded processor keeps the illumination light source on, issues synchronous trigger signals at a frame rate of 100 Hz, and controls the monocular camera to continuously acquire images of the cooperative markers 1 and transmit them to the embedded processor, which processes each image and extracts the image center position of each marker 1;
d. from the image center positions of the markers 1 obtained by image processing, the embedded processor matches each position to the corresponding marker coordinate in the airbag coordinate system through the spatial geometric-topological constraints, computes the relative pose of the airbag coordinate system with respect to the measurement system coordinate system, and obtains the relative pose of the airbag coordinate system with respect to the pod coordinate system through the transformation between the pod and measurement system coordinate systems;
e. the result computed by the embedded processor is transmitted to the back-end controller in real time as motion feedback.
In step a, the cooperative markers 1, monocular camera, illumination light source and embedded processor are installed and fixed, and the viewing angles adjusted, as follows:
(1) black light-absorbing cloth and bright silver chemical-fiber retro-reflective cloth are cut to form X-shaped corner-point features (the two materials alternate at 90° intervals) serving as cooperative markers 1; six markers are adhered to the lower surface of the airbag 2 along the skeleton of the airbag 2;
(2) two windows are provided in the upper surface of the pod shell and covered with coated glass to isolate the pod interior from the outside; the monocular camera and illumination light source are mounted beneath the two corresponding windows, their viewing angles adjusted so that the camera field of view and the illuminated field cover the feature movement range; the camera is rigidly fixed so that its position does not change during flight;
(3) the embedded processor is fixed in the pod and electrically connected to the monocular camera, the illumination light source, the power supply and the back-end controller.
In step b, the airbag coordinate system, the pod coordinate system and the monocular vision measurement coordinate system are established and calibrated as follows:
(1) before the monocular camera is installed, the intrinsic parameters of each camera are calibrated in a laboratory environment;
(2) the features defining the airbag coordinate system at design time are measured with a total station, and the airbag coordinate system is established;
(3) with the airbag coordinate system set as the total station's built-in measurement coordinate system, the coordinates of the adhered cooperative markers 1 are measured, yielding the coordinates of all markers in the airbag coordinate system, numbered according to the rule $P_i\ (i = 1 \ldots 6)$;
(4) the features defining the pod coordinate system at design time are measured with a total station, and the pod coordinate system is established;
(5) the fixed monocular camera photographs the feature points on the target while the target's position and attitude are varied; with the pod coordinate system set as the total station's built-in measurement coordinate system, the target feature points are measured synchronously to obtain their coordinates in the pod coordinate system; since the relation between the lithographically etched target and its feature points is known, the rotation-translation between the pod and monocular camera measurement system coordinate systems can be calibrated;
(6) the calibration results are imported into the embedded processor.
In step c, the embedded processor performs image processing as follows:
(1) the embedded processor is a control and computation unit built around an ARM CPU plus GPU, achieving miniaturization and low power consumption; it receives and configures camera images through a gigabit Ethernet port, drives the camera's hardware trigger input through one control signal, and drives the illumination light source switch through another;
(2) during measurement, the embedded processor keeps the illumination light source on and issues a synchronous trigger signal to the monocular camera at 100 Hz; the camera continuously acquires images and transmits them to the embedded processor over the gigabit Ethernet port;
(3) under the illumination light source, the retro-reflective material in each cooperative marker 1 appears bright in the image while the surrounding light-absorbing material appears dark, so the approximate position of each marker can be determined by adaptive binarization and morphological operations; because the imaging contrast between the two materials is pronounced, the markers 1 can be coarsely segmented both in the brightly lit environment near the ground and in the dark backlit environment at high altitude;
(4) sub-pixel extraction of the X-shaped corner features is performed on the coarsely screened regions using the Hessian matrix; since the airbag surface and sky background lack X-shaped corner features apart from the markers, the sub-pixel corner centers of all required markers 1 can be obtained robustly;
The Hessian matrix of pixel $(x_0, y_0)$ is:

$$H(x_0, y_0) = \begin{bmatrix} r_{xx} & r_{xy} \\ r_{xy} & r_{yy} \end{bmatrix}$$

The second-order gradient value along the normal direction and the normal direction $(n_x, n_y)$ itself are obtained as the eigenvalue of largest absolute value of the Hessian matrix at that point and its corresponding eigenvector.

Let the sub-pixel corner coordinate be $(x_0 + s, y_0 + t)$ with $(s, t) \in [-0.5, 0.5] \times [-0.5, 0.5]$, i.e. the first-order zero crossing of the edge lies within the current pixel. A second-order Taylor expansion of the gray value about the point $(x_0, y_0)$ gives:

$$r(x_0 + s, y_0 + t) \approx r(x_0, y_0) + s\,r_x + t\,r_y + \frac{1}{2}\left(s^2 r_{xx} + 2st\,r_{xy} + t^2 r_{yy}\right)$$

Setting the first derivative along the normal direction to zero yields:

$$s = -\frac{n_x\,(n_x r_x + n_y r_y)}{n_x^2 r_{xx} + 2 n_x n_y r_{xy} + n_y^2 r_{yy}}, \qquad t = -\frac{n_y\,(n_x r_x + n_y r_y)}{n_x^2 r_{xx} + 2 n_x n_y r_{xy} + n_y^2 r_{yy}}$$

where $r_{xx}$, $r_{xy}$, $r_{yy}$ denote, respectively, the second-order gradient of the image at $(x_0, y_0)$ in the x direction, the mixed second-order gradient (first in x, then in y), and the second-order gradient in the y direction, and $r_x$, $r_y$ denote the first-order gradients in the x and y directions.
In step d, the specific steps are as follows:
(1) the airbag marker feature points are matched to the image marker pixel points through the invariant spatial geometric-topological constraints of the optical feature points, yielding corresponding monocular image pixel coordinates and world coordinates in the airbag coordinate system. The principle is as follows:
Under the camera's perspective projection transformation, the image feature points are matched to the X-shaped corner features by exploiting spatial geometric-topological relations that the projection leaves invariant. Analysis of the spatial topology gives two such properties:
1) straight lines are preserved: a spatial straight line remains a straight line after perspective projection, i.e. collinear feature points remain collinear in the image;
2) the clockwise ordering of coplanar points is preserved: the cyclic order of the feature points does not change under perspective projection.
Based on these two properties, the matching process is as follows (see the sketch after this list):
1) fit a straight line j by least squares to the pixel coordinates of the six extracted cooperative markers 1; j is the line on which markers $P_2$ to $P_5$ lie;
2) set a distance threshold and compute the distance $D_i\ (i = 1 \ldots 6)$ from each marker point $P_i$ to the line j; if $D_i$ is below the threshold, $P_i$ is taken to be one of $P_2$ to $P_5$; the remaining two feature points are $P_1$ and $P_6$;
3) fit a straight line k by least squares to the pixel coordinates of $P_1$ and $P_6$ obtained in the previous step;
4) find the intersection G of lines j and k; among $P_2$ to $P_5$, the feature point closest to G is $P_3$ and the one farthest from G is $P_5$; of the remaining two points, the one closer to $P_5$ is $P_4$ and the one farther from it is $P_2$;
5) the remaining feature points $P_1$ and $P_6$ are distinguished by their position relative to line j: constrained by the pod geometry, $P_1$ always lies above line j and $P_6$ below it, which completes the matching of all feature points.
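A minimal sketch of this line-based labeling, assuming exactly six detected corners and taking "above line j" as a positive signed cross product (the real sign convention depends on the camera mounting); every name here is our own illustration, not the patent's code.

```python
import numpy as np

def fit_line(pts):
    """Total-least-squares line through Nx2 points: centroid and unit direction."""
    c = pts.mean(axis=0)
    _, _, vt = np.linalg.svd(pts - c)
    return c, vt[0]

def cross2(u, v):
    """z-component of the 2D cross product (signed side/distance helper)."""
    return u[..., 0] * v[..., 1] - u[..., 1] * v[..., 0]

def match_markers(px):
    """px: 6x2 pixel coordinates of detected corners -> labels P1..P6."""
    px = np.asarray(px, float)
    c_j, d_j = fit_line(px)                              # line j through all six
    dist_j = np.abs(cross2(np.broadcast_to(d_j, px.shape), px - c_j))
    on_idx, off_idx = np.argsort(dist_j)[:4], np.argsort(dist_j)[4:]
    c_k, d_k = fit_line(px[off_idx])                     # line k through P1, P6
    st = np.linalg.solve(np.column_stack([d_j, -d_k]), c_k - c_j)
    G = c_j + st[0] * d_j                                # intersection of j and k
    quad = px[on_idx]
    order = np.argsort(np.linalg.norm(quad - G, axis=1))
    p3, p5 = quad[order[0]], quad[order[3]]              # nearest/farthest from G
    mid = quad[order[1:3]]
    d5 = np.linalg.norm(mid - p5, axis=1)
    p4, p2 = mid[np.argmin(d5)], mid[np.argmax(d5)]
    a, b = px[off_idx]
    p1, p6 = (a, b) if cross2(d_j, a - c_j) > 0 else (b, a)  # P1 "above" line j
    return {"P1": p1, "P2": p2, "P3": p3, "P4": p4, "P5": p5, "P6": p6}
```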
(2) The pose of the airbag coordinate system relative to the monocular camera measurement system coordinate system is solved by a PnP algorithm from the three-dimensional coordinates of the feature points in the airbag coordinate system and their two-dimensional pixel coordinates in the monocular image. The PnP algorithm is as follows:
The monocular camera perspective projection model is:

$$\lambda \begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = A \begin{bmatrix} R & T \end{bmatrix} \begin{bmatrix} X \\ Y \\ Z \\ 1 \end{bmatrix}, \qquad A = \begin{bmatrix} f_x & 0 & u_0 \\ 0 & f_y & v_0 \\ 0 & 0 & 1 \end{bmatrix}$$

where A is the intrinsic parameter matrix of the monocular camera, obtained by camera calibration; λ is a scale factor; $[X\ Y\ Z\ 1]^T$ is the homogeneous coordinate of a three-dimensional point in the airbag coordinate system; and $[u\ v\ 1]^T$ is the homogeneous coordinate of the point in the image coordinate system, in pixels;
The rotation matrix R and translation vector T are expressed as:

$$R = \begin{bmatrix} r_{11} & r_{12} & r_{13} \\ r_{21} & r_{22} & r_{23} \\ r_{31} & r_{32} & r_{33} \end{bmatrix}, \qquad T = \begin{bmatrix} t_x \\ t_y \\ t_z \end{bmatrix}$$
After the monocular camera is calibrated, its intrinsic parameters are known, and the coordinates of a two-dimensional image point in the monocular camera measurement system coordinate system can be computed. The rotation matrix R and translation vector T between the measurement system coordinate system and the world coordinate system can then be computed from the point coordinates in the two frames. R is a 3×3 orthonormal matrix with three degrees of freedom, so the transformation between the two coordinate systems has six degrees of freedom. Substituting the world coordinates of the points in the airbag coordinate system into the model above and rearranging gives:

$$\lambda \begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = A \begin{bmatrix} r_{11} & r_{12} & r_{13} & t_x \\ r_{21} & r_{22} & r_{23} & t_y \\ r_{31} & r_{32} & r_{33} & t_z \end{bmatrix} \begin{bmatrix} X \\ Y \\ Z \\ 1 \end{bmatrix}$$
Each pair of corresponding three-dimensional and two-dimensional points constrains two degrees of freedom. Thus, if the monocular camera is calibrated and the world coordinates of more than three three-dimensional points together with their corresponding image coordinates are known, R and T between the monocular camera measurement system coordinate system and the world coordinate system can be computed.
Since there are six cooperative markers 1 in total, the equation system for R and T is over-determined, so a least-squares solution for R and T can be found, yielding the relative pose of the airbag with respect to the monocular camera measurement system coordinate system; a sketch using an off-the-shelf solver follows.
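A minimal sketch of this solution using OpenCV's iterative solvePnP, which refines the pose by least squares over all six correspondences; the patent does not name a particular PnP solver, so this choice and all argument names are assumptions.

```python
import cv2
import numpy as np

def airbag_pose_in_camera(obj_pts, img_pts, K, dist):
    """obj_pts: 6x3 marker coords in the airbag frame; img_pts: 6x2 pixel coords.

    Returns R (3x3) and T (3x1) mapping the airbag frame to the camera frame.
    """
    ok, rvec, tvec = cv2.solvePnP(np.asarray(obj_pts, np.float64),
                                  np.asarray(img_pts, np.float64),
                                  K, dist, flags=cv2.SOLVEPNP_ITERATIVE)
    if not ok:
        raise RuntimeError("PnP failed")
    R, _ = cv2.Rodrigues(rvec)          # rotation vector -> rotation matrix
    return R, tvec
```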
(3) Given the known transformation between the pod coordinate system and the monocular camera measurement system coordinate system, the pose transformation of the airbag coordinate system relative to the pod coordinate system is obtained by coordinate system conversion. The conversion principle is as follows:
Coordinate system conversion transfers the description of a spatial entity's position from one coordinate system to another by establishing a one-to-one correspondence between the two frames. The conversion formula is:

$$\begin{bmatrix} X_1 \\ 1 \end{bmatrix} = \begin{bmatrix} R_2^1 & T_2^1 \\ O & 1 \end{bmatrix} \begin{bmatrix} X_2 \\ 1 \end{bmatrix}$$

where $R_2^1$ denotes the rotation matrix from coordinate system 2 to coordinate system 1, $T_2^1$ denotes the translation vector from coordinate system 2 to coordinate system 1, and O denotes a row vector of one row and three columns whose elements are all 0.

The formula for converting from the airbag coordinate system, through the monocular camera measurement system coordinate system, to the pod coordinate system is:

$$\begin{bmatrix} X_S \\ 1 \end{bmatrix} = \begin{bmatrix} R_2 & T_2 \\ O & 1 \end{bmatrix} \begin{bmatrix} R_1 & T_1 \\ O & 1 \end{bmatrix} \begin{bmatrix} X_B \\ 1 \end{bmatrix} = \begin{bmatrix} R_3 & T_3 \\ O & 1 \end{bmatrix} \begin{bmatrix} X_B \\ 1 \end{bmatrix}$$

where $R_3 = R_2 R_1$ and $T_3 = R_2 T_1 + T_2$ give the pose transformation of the airbag coordinate system relative to the pod coordinate system.
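This chaining is a plain composition of homogeneous transforms; the sketch below (our own illustration) takes R1, T1 as the airbag-to-camera pose from PnP and R2, T2 as the calibrated camera-to-pod transform.

```python
import numpy as np

def to_homogeneous(R, T):
    """Pack a 3x3 rotation and 3-vector translation into a 4x4 transform."""
    M = np.eye(4)
    M[:3, :3] = R
    M[:3, 3] = np.ravel(T)
    return M

def airbag_in_pod(R1, T1, R2, T2):
    """Compose airbag->camera with camera->pod: R3 = R2 @ R1, T3 = R2 @ T1 + T2."""
    M = to_homogeneous(R2, T2) @ to_homogeneous(R1, T1)
    return M[:3, :3], M[:3, 3]
```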
In step e, the measurement result is transmitted by the embedded processor to the back-end controller in real time over the RS-422 bus as motion feedback.
The parts of the invention not described in detail can be realized by the prior art and are not repeated here.
The above is only an embodiment of the invention and does not limit its scope of protection; all equivalent structural or process modifications made using the contents of this specification and the drawings, whether applied directly or indirectly in other related technical fields, likewise fall within the scope of protection of the invention.

Claims (7)

1. A monocular vision-based method for measuring the relative pose of a near space aircraft, characterized by comprising the following steps:
a. before flight measurement starts, several cooperative markers are adhered in a suitable distribution to the bottom surface of the airbag; the monocular camera, illumination light source and embedded processor are fixed in the pod, with the camera and light source connected to the embedded processor;
b. the airbag coordinate system, the pod coordinate system and the monocular camera measurement system coordinate system are established and unified, and the calibrated parameters are imported into the embedded processor in preparation for measurement;
c. once measurement starts, the embedded processor keeps the illumination light source on while the monocular camera continuously acquires images and transmits them to the embedded processor, which processes each image and extracts the image center position of every cooperative marker;
d. using the spatial arrangement of the cooperative markers, the embedded processor matches each image position to the corresponding marker coordinate in the airbag coordinate system, computes the relative pose of the airbag coordinate system with respect to the monocular camera measurement system coordinate system, and obtains the relative pose of the airbag coordinate system with respect to the pod coordinate system through the relation between the pod and monocular camera measurement system coordinate systems;
e. the embedded processor transmits the result to the back-end controller in real time.
2. The monocular vision-based method for measuring the relative pose of a near space vehicle according to claim 1, characterized in that in step a, black light-absorbing cloth and bright silver chemical-fiber retro-reflective cloth are cut to form X-shaped corner-point features serving as cooperative markers, and six such markers are adhered to the bottom surface of the airbag along the airbag skeleton.
3. The monocular vision-based method for measuring the relative pose of a near space vehicle according to claim 1, characterized in that in step a, two windows are provided in the upper surface of the pod shell and covered with coated glass to isolate the pod interior from the outside; the monocular camera and the illumination light source are mounted at the two corresponding windows inside the pod, facing upward; their viewing angles are adjusted so that the camera field of view and the illuminated field cover the movement range of the cooperative markers; and the monocular camera is rigidly fixed so that its position does not change during flight.
4. The monocular vision-based method for measuring the relative pose of a near space vehicle according to claim 1, characterized in that in step b, the airbag coordinate system, the pod coordinate system and the measurement system coordinate system are established and calibrated as follows:
(1) before the monocular camera is installed, its intrinsic parameters are calibrated in a laboratory environment;
(2) the features defining the airbag coordinate system at design time are measured with a total station, and the airbag coordinate system is established;
(3) with the airbag coordinate system set as the total station's built-in measurement coordinate system, the coordinates of the adhered cooperative markers are measured, yielding the coordinates of all markers in the airbag coordinate system, which are numbered according to a fixed rule;
(4) the features defining the pod coordinate system at design time are measured with a total station, and the pod coordinate system is established;
(5) the fixed monocular camera photographs the feature points on the target while the target's position and attitude are varied; with the pod coordinate system set as the total station's built-in measurement coordinate system, the target feature points are measured synchronously to obtain their coordinates in the pod coordinate system; since the relation between the lithographically etched target and its feature points is known, the rotation-translation between the pod coordinate system and the monocular camera measurement system coordinate system can be calibrated;
(6) the calibration results are imported into the embedded processor.
5. The monocular vision-based method for measuring the relative pose of a near space vehicle according to claim 1, characterized in that in step c, the embedded processor performs image processing as follows:
(1) during measurement, the embedded processor keeps the illumination light source on and issues a synchronous trigger signal to the monocular camera at a frequency of 100 Hz; the camera continuously acquires images and transmits them to the embedded processor over a gigabit Ethernet port;
(2) sub-pixel extraction of the X-shaped corner-point features is performed on the coarsely screened image regions using the Hessian matrix; since the airbag surface and the sky background lack X-shaped corner features apart from the markers, the sub-pixel corner centers of all required cooperative markers in the image can be obtained robustly;
The Hessian matrix of pixel $(x_0, y_0)$ is:

$$H(x_0, y_0) = \begin{bmatrix} r_{xx} & r_{xy} \\ r_{xy} & r_{yy} \end{bmatrix}$$

The second-order gradient value along the normal direction and the normal direction $(n_x, n_y)$ itself are obtained as the eigenvalue of largest absolute value of the Hessian matrix at that point and its corresponding eigenvector;

Let the sub-pixel corner coordinate be $(x_0 + s, y_0 + t)$ with $(s, t) \in [-0.5, 0.5] \times [-0.5, 0.5]$, i.e. the first-order zero crossing of the edge lies within the current pixel. A second-order Taylor expansion of the gray value about the point $(x_0, y_0)$ gives:

$$r(x_0 + s, y_0 + t) \approx r(x_0, y_0) + s\,r_x + t\,r_y + \frac{1}{2}\left(s^2 r_{xx} + 2st\,r_{xy} + t^2 r_{yy}\right)$$

Setting the first derivative along the normal direction to zero yields:

$$s = -\frac{n_x\,(n_x r_x + n_y r_y)}{n_x^2 r_{xx} + 2 n_x n_y r_{xy} + n_y^2 r_{yy}}, \qquad t = -\frac{n_y\,(n_x r_x + n_y r_y)}{n_x^2 r_{xx} + 2 n_x n_y r_{xy} + n_y^2 r_{yy}}$$

where $r_{xx}$, $r_{xy}$, $r_{yy}$ denote, respectively, the second-order gradient of the image at $(x_0, y_0)$ in the x direction, the mixed second-order gradient (first in x, then in y), and the second-order gradient in the y direction, and $r_x$, $r_y$ denote the first-order gradients in the x and y directions.
6. The monocular vision-based method for measuring the relative pose of a near space vehicle according to claim 1, characterized in that step d comprises the following specific steps:
(1) the airbag marker feature points are matched to the image marker pixel points through the invariant spatial geometric-topological constraints of the optical feature points, yielding corresponding monocular image pixel coordinates and world coordinates in the airbag coordinate system;
(2) from the three-dimensional coordinates of each feature point in the airbag coordinate system and its two-dimensional pixel coordinates in the monocular image, the pose of the airbag coordinate system relative to the monocular camera measurement system coordinate system is solved;
(3) given the known transformation between the pod coordinate system and the monocular camera measurement system coordinate system, the pose transformation of the airbag coordinate system relative to the pod coordinate system is obtained by coordinate system conversion.
7. A monocular vision-based device for measuring the relative pose of a near space vehicle according to any one of claims 1-6, comprising an airbag and a pod, wherein several cooperative markers are arranged on the bottom surface of the airbag; the monocular camera, illumination light source and embedded processor are fixed in the pod; the camera and light source are connected to the embedded processor; and the embedded processor is connected to the back-end controller for data transmission.
CN202010499037.9A 2020-06-04 2020-06-04 Monocular vision-based relative pose measurement method and device for near space aircraft Active CN111551152B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010499037.9A CN111551152B (en) 2020-06-04 2020-06-04 Monocular vision-based relative pose measurement method and device for near space aircraft

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010499037.9A CN111551152B (en) 2020-06-04 2020-06-04 Monocular vision-based relative pose measurement method and device for near space aircraft

Publications (2)

Publication Number Publication Date
CN111551152A true CN111551152A (en) 2020-08-18
CN111551152B CN111551152B (en) 2021-06-15

Family

ID=72003156

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010499037.9A Active CN111551152B (en) 2020-06-04 2020-06-04 Monocular vision-based relative pose measurement method and device for near space aircraft

Country Status (1)

Country Link
CN (1) CN111551152B (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112525158A (en) * 2020-11-16 2021-03-19 江苏集萃智能光电系统研究所有限公司 Double-shield six-degree-of-freedom measurement method and system based on monocular vision system
CN112556657A (en) * 2020-12-03 2021-03-26 北京强度环境研究所 Multi-view vision measurement system for flight motion parameters of separating body in vacuum environment
CN115355884A (en) * 2022-07-26 2022-11-18 中国人民解放军海军工程大学 Device and method for measuring relative pose of ship bearing
CN116958263A (en) * 2023-08-09 2023-10-27 苏州三垣航天科技有限公司 Monocular camera intelligent enhancement method in space observation target gesture recognition process

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101782385A (en) * 2010-02-02 2010-07-21 王建雄 Unmanned airship low-altitude photogrammetry
CN104417743A (en) * 2013-09-09 2015-03-18 陈德荣 Bag body telescopic stratospheric airship
CN109238235A (en) * 2018-06-29 2019-01-18 华南农业大学 Monocular sequence image realizes rigid body pose parameter continuity measurement method
CN109319161A (en) * 2018-10-08 2019-02-12 水利部南京水利水文自动化研究所 A kind of unmanned buoyance lift one aircraft nacelle device
JP2019151216A (en) * 2018-03-02 2019-09-12 トヨタ自動車株式会社 Steering device for vehicle

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101782385A (en) * 2010-02-02 2010-07-21 王建雄 Unmanned airship low-altitude photogrammetry
CN104417743A (en) * 2013-09-09 2015-03-18 陈德荣 Bag body telescopic stratospheric airship
JP2019151216A (en) * 2018-03-02 2019-09-12 トヨタ自動車株式会社 Steering device for vehicle
CN109238235A (en) * 2018-06-29 2019-01-18 华南农业大学 Monocular sequence image realizes rigid body pose parameter continuity measurement method
CN109319161A (en) * 2018-10-08 2019-02-12 水利部南京水利水文自动化研究所 A kind of unmanned buoyance lift one aircraft nacelle device

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
李思聪 (Li Sicong): "Research on relative pose measurement methods based on binocular vision" (基于双目视觉的相对位姿测量方法研究), China Master's Theses Full-text Database *
王宏伦 (Wang Honglun): "Close-range navigation method for autonomous aerial refueling based on binocular vision" (基于双目视觉的自动空中加油近距导航方法), Journal of Beijing University of Aeronautics and Astronautics *

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112525158A (en) * 2020-11-16 2021-03-19 江苏集萃智能光电系统研究所有限公司 Double-shield six-degree-of-freedom measurement method and system based on monocular vision system
CN112556657A (en) * 2020-12-03 2021-03-26 北京强度环境研究所 Multi-view vision measurement system for flight motion parameters of separating body in vacuum environment
CN112556657B (en) * 2020-12-03 2022-07-12 北京强度环境研究所 Multi-view vision measurement system for flight motion parameters of separating body in vacuum environment
CN115355884A (en) * 2022-07-26 2022-11-18 中国人民解放军海军工程大学 Device and method for measuring relative pose of ship bearing
CN115355884B (en) * 2022-07-26 2024-03-22 中国人民解放军海军工程大学 Relative pose measuring device and method for ship bearing
CN116958263A (en) * 2023-08-09 2023-10-27 苏州三垣航天科技有限公司 Monocular camera intelligent enhancement method in space observation target gesture recognition process
CN116958263B (en) * 2023-08-09 2024-04-12 苏州三垣航天科技有限公司 Monocular camera intelligent enhancement method in space observation target gesture recognition process

Also Published As

Publication number Publication date
CN111551152B (en) 2021-06-15

Similar Documents

Publication Publication Date Title
CN111551151B (en) Binocular vision-based near space vehicle relative pose measurement method and device
CN111551152B (en) Monocular vision-based relative pose measurement method and device for near space aircraft
CN111735479A (en) Multi-sensor combined calibration device and method
CN107121125B (en) A kind of communication base station antenna pose automatic detection device and method
CN108305264B (en) A kind of unmanned plane precision landing method based on image procossing
CN108038885B (en) More depth camera scaling methods
CN109634279A (en) Object positioning method based on laser radar and monocular vision
CN104200086A (en) Wide-baseline visible light camera pose estimation method
CN102538793B (en) Double-base-line non-cooperative target binocular measurement system
CN103759669A (en) Monocular vision measuring method for large parts
CN110006408A (en) LiDAR data " cloud control " aviation image photogrammetric survey method
CN112184812B (en) Method for improving identification and positioning precision of unmanned aerial vehicle camera to april tag and positioning method and system
CN109708649A (en) A kind of attitude determination method and system of remote sensing satellite
CN110470226A (en) A kind of bridge structure displacement measurement method based on UAV system
CN109242918A (en) A kind of helicopter-mounted binocular stereo vision scaling method
CN110503687A (en) A kind of aerial photoelectric measurement platform object localization method
CN108168472B (en) Method and device for measuring satellite antenna unfolding flatness and pointing accuracy
CN114923477A (en) Multi-dimensional space-ground collaborative map building system and method based on vision and laser SLAM technology
CN113031462A (en) Port machine inspection route planning system and method for unmanned aerial vehicle
CN111596259A (en) Infrared positioning system, positioning method and application thereof
CN103363961A (en) Aerial image-based determination method of three-dimensional information of traffic accident scene based on s
Peng et al. A measuring method for large antenna assembly using laser and vision guiding technology
CN113028990A (en) Laser tracking attitude measurement system and method based on weighted least square
CN113124821B (en) Structure measurement method based on curved mirror and plane mirror
CN102168973B (en) Automatic navigating Z-shaft positioning method for omni-directional vision sensor and positioning system thereof

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant