CN115406414A - Dynamic target measurement on-orbit illumination evaluation method for space station mechanical arm


Info

Publication number
CN115406414A
Authority
CN
China
Prior art keywords
illumination
image
camera
value
determining
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202210981620.2A
Other languages
Chinese (zh)
Other versions
CN115406414B (en)
Inventor
谭启蒙
陈磊
李大明
侯作勋
王飞
杜晓东
梁常春
危清清
潘冬
王瑞
郭宇
孙沂昆
贾馨
王友渔
高升
熊明华
唐自新
周永辉
吴志红
邹大力
张昕蕊
马超
程刚
许哲
沈莹
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Institute of Spacecraft System Engineering
Original Assignee
Beijing Institute of Spacecraft System Engineering
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Institute of Spacecraft System Engineering filed Critical Beijing Institute of Spacecraft System Engineering
Priority to CN202210981620.2A priority Critical patent/CN115406414B/en
Publication of CN115406414A publication Critical patent/CN115406414A/en
Application granted granted Critical
Publication of CN115406414B publication Critical patent/CN115406414B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G01C 11/02: Photogrammetry or videogrammetry; picture taking arrangements specially adapted for photogrammetry or photographic surveying, e.g. controlling overlapping of pictures
    • G01C 11/04: Photogrammetry or videogrammetry; interpretation of pictures
    • G01J 1/04: Photometry, e.g. photographic exposure meter; optical or mechanical parts, supplementary adjustable parts
    • G01J 1/4204: Photometry using electric radiation detectors, with determination of ambient light
    • G06T 5/90: Image enhancement or restoration; dynamic range modification of images or parts thereof
    • G06T 7/187: Image analysis; segmentation or edge detection involving region growing, region merging or connected component labelling
    • G06T 7/80: Image analysis; analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T 2207/30204: Indexing scheme for image analysis or enhancement; subject of image: marker
    • G06T 2207/30244: Indexing scheme for image analysis or enhancement; subject of image: camera pose
    • Y02B 20/40: Energy-efficient lighting technologies; control techniques providing energy savings, e.g. smart controller or presence detection

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Spectroscopy & Molecular Physics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Sustainable Development (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

A dynamic target measurement on-orbit illumination evaluation method for a space station mechanical arm, belonging to the field of photoelectric measurement, solves the problem of evaluating on-orbit illumination for dynamic target measurement by a space station mechanical arm. The method relies on the vision measurement system carried by the mechanical arm. Through a local improvement of the visible light vision camera, it rapidly and accurately estimates the luminance of a typical target in an on-orbit image captured by the camera and automatically computes the object-plane illumination compensation required of the camera's active light source. The illumination strategy of the active light source is thereby adjusted adaptively, which markedly improves on-orbit image quality, strengthens the robustness of the arm's vision camera against illumination interference, and provides a reliable guarantee for the smooth development of subsequent on-orbit tasks and space experiments of the mechanical arm.

Description

Dynamic target measurement on-orbit illumination evaluation method for space station mechanical arm
Technical Field
The invention relates to an on-orbit illumination evaluation method for dynamic target measurement by a space station mechanical arm. It is particularly suitable for the accurate measurement and compensation of the on-orbit illumination information required for target measurement when a large space-service mechanical arm faces a severe on-orbit illumination environment, and belongs to the field of photoelectric measurement.
Background
The vision measurement system configured on the space station mechanical arm is mainly used for visual monitoring of the arm's working area and for accurate estimation of the three-dimensional pose of a cooperative target; it provides the reference information, such as visual images and target poses, needed for accurate control of the arm. However, the space station mechanical arm routinely faces a complex and changeable illumination environment during on-orbit operation. Although interference factors such as stray light were fully considered at the camera design stage, the arm can crawl on the exterior of the station, so the angle between the optical axis of the vision camera and the direction of incident sunlight is effectively random. This easily produces large illumination differences within the camera's field of view, complex background types, discontinuous motion, multi-target occlusion, and similar problems. In particular, once the vision camera encounters high-intensity direct sunlight or a sudden change in the sunlight incidence angle, the contrast of the visual image drops sharply or the image is overexposed, which poses a serious challenge to the reliability of the arm's on-orbit tasks.
The large mechanical arm serving the International Space Station faces the same severe test of a complex, harsh, and variable space illumination environment during on-orbit operation. NASA proposed two targeted solutions: configuring an active light source for the arm's hand-eye camera, and developing ground-based digital image enhancement algorithms. The former lacks automatic adjustment of the active light source, so it suits only some illumination conditions, may lose detail information, and cannot cover the various extreme illumination cases; the latter requires images to be downlinked to the ground, introducing large delays that prevent continuous execution of the arm's on-orbit tasks. Effectively overcoming the complex and changeable space illumination effects is therefore one of the key technical problems restricting the arm's on-orbit environmental adaptability.
Disclosure of Invention
The technical problem to be solved by the invention is to overcome the defects of the prior art and to solve the problem of on-orbit illumination evaluation for dynamic target measurement by the space station mechanical arm.
The purpose of the invention is realized by the following technical scheme:
a dynamic target measurement on-orbit illumination evaluation method for a mechanical arm of a space station is disclosed, a visible light vision camera has two measurement modes of monocular and binocular, and the illumination evaluation method comprises the following steps:
determining a cooperative target pattern;
setting the active light source illuminator of the visible light vision camera to the off state, acquiring an image, determining the image gray-scale information, and performing target detection and pattern recognition with it; determining, from the detection and recognition results, the gray-level DN difference of the visual marker regions of the cooperative target pattern in the image; if that DN difference is not below a set threshold, no object-plane illumination compensation is needed; if it is below the threshold, turning on the active light source illuminator and re-acquiring the gray-level DN difference; if the re-acquired difference is not below the threshold, no object-plane illumination compensation is needed, and if it is still below the threshold, performing the following processing:
determining the camera image-plane illuminance from the gray-level DN difference of the visual marker regions;
determining the luminance of the visual marker surface from the camera image-plane illuminance;
determining the left-eye image object-plane illumination compensation value and the right-eye image object-plane illumination compensation value from the visual marker surface luminance;
selecting the maximum of the left-eye and right-eye object-plane illumination compensation values as the final object-plane illumination compensation value, and from it determining the light source illumination power.
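The decision flow above can be summarized in a short Python sketch. The camera interface and all helper functions are hypothetical names (concrete sketches of the helpers appear alongside the corresponding steps in the detailed description); the DN threshold of 70 comes from the detailed description below:

```python
DN_THRESHOLD = 70   # 8-bit gray-level DN difference threshold from the description

def evaluate_illumination(camera):
    """Run the evaluation flow; returns the lamp power to command, or None.

    `camera`, `marker_dn_difference_of` (detection of step 3 plus the contrast
    statistic of step 4), `image_plane_illuminance`, `marker_luminance`,
    `object_plane_compensation`, and `light_source_power` are hypothetical
    interfaces sketched further below, not names from the patent.
    """
    camera.set_illuminator(on=False)
    if marker_dn_difference_of(camera.grab_gray()) >= DN_THRESHOLD:
        return None                                  # ambient light is sufficient
    camera.set_illuminator(on=True)
    omega = marker_dn_difference_of(camera.grab_gray())
    if omega >= DN_THRESHOLD:
        return None                                  # illuminator alone suffices
    E_image = image_plane_illuminance(omega)         # step (5): image-plane E
    L = marker_luminance(E_image)                    # marker surface luminance
    E_left, E_right = object_plane_compensation(L)   # step (6): per-eye values
    return light_source_power(max(E_left, E_right))  # final compensation -> power
```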
Preferably, the cooperative target pattern simultaneously satisfies the following conditions:
the pattern comprises an isotropic set of circular marker points;
at the farthest observation distance, the number of pixels spanned by a marker point's diameter in the camera image is at least the camera's minimum resolvable pixel count;
the pattern uses a matched black-and-white scheme;
the pattern is made of a special diffuse-reflection material resistant to vacuum and to high and low temperatures;
the pattern is unique under rotation, translation, and scaling;
a single image taken at the camera's nearest observation distance contains at least 4 non-coplanar marker points.
Preferably, in the monocular measurement mode, after a Bayer image is acquired, an RGB color image is obtained by nonlinear interpolation and converted to the YUV image data format, in which the Y component is the image gray-scale information acquired by the camera.
Preferably, determining the camera image-plane illuminance from the gray-level DN difference of the visual marker regions comprises:
determining the statistical mean of the gray-level DN differences of the visual marker regions, resolving from it, according to the image quantization bit depth output by the camera, the image gray value output by the camera's image detector after photoelectric conversion, and from that determining the camera image-plane illuminance.
Preferably, the luminance of the visual marker surface is determined as:

$$L = \frac{4\,E_{image}}{\pi\,\tau\,(D/F)^2}$$

where τ is the transmittance of the camera optical system, D/F denotes the relative aperture of the camera (the reciprocal of the F-number), π is the circular constant, and E_image is the camera image-plane illuminance.
Preferably, the left-eye image object-plane illumination compensation value and the right-eye image object-plane illumination compensation value are determined from the visual marker surface luminance according to the geometric position relation between the active light source illuminator of the visible light vision camera and the cooperative target.
Preferably, the light source illumination power is determined as:

$$P = \frac{\pi\,\big(d\tan(\theta/2)\big)^2\cdot\max\big(E_{lamp\text{-}left},\,E_{lamp\text{-}right}\big)}{K\,\eta}$$

where d is the illumination distance of the active light source, π is the circular constant, θ is the illumination angle, K is the luminous efficacy, E_lamp-left and E_lamp-right are the object-plane illumination compensation values of the left-eye and right-eye images, max denotes the larger of the two, and η is the illumination uniformity.
Preferably, a monocular or binocular measurement mode is used to acquire the image and determine the image gray-scale information.
A three-dimensional measurement method for the dynamic target pose of a space station mechanical arm adopts the above illumination evaluation method and, after illumination evaluation, performs the following processing:
extracting, by a gray-scale weighted centroid method and according to the optimal candidate connected-domain subset in the target detection and pattern recognition results, the two-dimensional coordinates of the centers of all marked feature points of the cooperative target pattern in the image;
determining the measurement mode of the visible light vision camera;
for the monocular measurement mode, determining, based on the physical mapping between the two-dimensional imaging plane and three-dimensional space, an initial value of the target pose data from the two-dimensional coordinates and the camera intrinsic calibration data, then determining the target-to-end three-dimensional pose parameters using the extrinsic calibration between the camera and the end effector;
for the binocular measurement mode, reconstructing the three-dimensional coordinate system, performing coordinate conversion, and determining an initial value of the three-dimensional pose data from the two-dimensional coordinates, then determining the target-to-end three-dimensional pose parameters using the extrinsic calibration between the camera and the end effector.
Preferably, for the monocular measurement mode, a least squares method is used to determine the initial target pose data; for the binocular measurement mode, singular value decomposition is used to determine the initial three-dimensional pose data.
Compared with the prior art, the invention has the following beneficial effects:
(1) The cooperative target pattern design principles proposed by the invention help improve visual image contrast, markedly reduce the difficulty and computational load of illumination measurement, and raise the accuracy and precision of illumination estimation;
(2) The on-orbit illumination evaluation method for the mechanical arm's dynamic target inverts, in real time, the luminance of the space cooperative target surface and the object-plane illumination compensation value of the active light source, providing the basis for adaptive adjustment of the camera's active light source illumination strategy and markedly improving image quality under severe on-orbit illumination conditions;
(3) The target on-orbit illumination evaluation method objectively and quantitatively evaluates the on-orbit illumination environment of the space station mechanical arm and can further supply important data for on-orbit health monitoring of the arm;
(4) The dynamic target measurement on-orbit illumination evaluation system helps improve the space environment adaptability of the space station mechanical arm while effectively suppressing complex and changeable space illumination effects;
(5) The target pose measurement method supports both the monocular and binocular measurement modes, which solve the relative pose between the space cooperative target and the mechanical arm end effector in real time and back each other up, markedly improving the reliability and safety of the arm's on-orbit vision measurement function;
(6) By quickly and accurately estimating the luminance of a typical target in an on-orbit image captured by the vision camera, the method automatically computes the object-plane illumination compensation of the camera's active light source, adaptively adjusts the illumination strategy of the active light source, markedly improves on-orbit image quality, effectively strengthens the robustness of the arm's vision camera against illumination interference, and provides a reliable guarantee for subsequent on-orbit tasks and space experiments of the mechanical arm.
Drawings
Fig. 1 is a schematic diagram of the dynamic target measurement on-orbit illumination evaluation system for a space station mechanical arm.
Fig. 2 is a schematic diagram illustrating the definition of the left eye camera coordinate system and the right eye camera coordinate system.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, embodiments of the present invention will be described in detail with reference to the accompanying drawings.
As shown in fig. 1, the dynamic target measurement on-orbit illumination evaluation system for a space station mechanical arm mainly comprises a visible light vision camera, a space cooperative target, a multi-degree-of-freedom numerical control device, a ground support tool, an on-orbit illumination simulator, an illuminometer, and a ground test computer. The visible light vision camera contains a built-in active light source illuminator and is mainly used to capture visible light images containing the space cooperative target and transmit them to the ground test computer. The space cooperative target carries a visual marker of known pattern design and is fixed on the support platform of the multi-degree-of-freedom numerical control device. The numerical control device simulates the dial surface at the target adapter end of the space station mechanical arm, provides stable support and motion drive for the cooperative target, and transmits the target motion information to the ground test computer in real time. The ground support tool simulates the end of the mechanical arm's end effector and provides fixed support for the visible light vision camera. The on-orbit illumination simulator reproduces the various complex and changeable illumination environments. The illuminometer measures the actual illumination value on the target surface. The ground test computer receives the image data and target motion information collected by the visible light vision camera, processes the image information, and computes in real time quantities such as the luminance of the target surface and the object-plane illumination of the active light source.
The dynamic target measurement on-orbit illumination evaluation method for the space station mechanical arm involves two technical parts, cooperative target pattern design and on-orbit illumination evaluation; the dynamic target pose three-dimensional measurement method for the space station mechanical arm builds on the illumination evaluation method and adds a third part, target pose three-dimensional measurement.
Part one: the cooperative target pattern scheme.
The cooperative target pattern simultaneously satisfies the following conditions:
1) The pattern is preferably an isotropic set of circular marker points; unless otherwise required, all marker points are machined and mounted to the same thickness;
2) The pattern is sized, from the camera's imaging resolution and the observation distance to the cooperative target, so that even at the farthest observation distance the number of pixels spanned by a marker's diameter in the image is at least the camera's minimum resolvable pixel count, so that the camera still images the marker clearly (see the sketch following this list);
3) The pattern uses a matched black-and-white scheme with high contrast;
4) The pattern material is a special diffuse-reflection material resistant to vacuum and to high and low temperatures; as far as possible, a black material of high absorptivity and a white material of high reflectivity are chosen;
5) The pattern is unique under rotation, translation, and scaling;
6) The pattern layout follows the cooperative target's shape so that a single image taken at the camera's nearest observation distance contains at least 4 non-coplanar marker points; the number of marker points is increased as far as other payload equipment permits, further improving the accuracy and robustness of pose measurement.
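Condition 2) can be checked with a simple pinhole-projection estimate. The sketch below is illustrative only; the focal length, marker diameter, observation distance, and pixel pitch are assumed example values, not parameters from the patent:

```python
def marker_pixels(focal_mm: float, marker_mm: float,
                  distance_mm: float, pixel_um: float) -> float:
    """Projected marker diameter in pixels under a pinhole camera model."""
    image_mm = focal_mm * marker_mm / distance_mm    # similar triangles
    return image_mm * 1000.0 / pixel_um              # mm -> um, then pixels

# Example (all values assumed): a 40 mm dot at 5 m through a 12 mm lens
# on 5.5 um pixels spans marker_pixels(12, 40, 5000, 5.5) ~= 17 px.
```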
Part two: the on-orbit illumination evaluation scheme.
Step (1): set the active light source illuminator of the visible light vision camera to the off state.
Step (2): acquire the left (right) eye gray-scale image.
The left (right) camera image sensor automatically completes photoelectric conversion and acquires a Bayer image over the camera's optical field of view; an RGB color image is obtained by nonlinear interpolation and then converted to the YUV image data format, in which the Y component is the image gray-scale information acquired by the camera.
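The acquisition chain of this step can be illustrated with OpenCV. The BG Bayer layout is an assumption, and OpenCV's edge-aware demosaicing merely stands in for the camera's proprietary nonlinear interpolation:

```python
import cv2
import numpy as np

def bayer_to_gray(bayer: np.ndarray) -> np.ndarray:
    """Demosaic a Bayer frame and return the Y (luma) channel as the gray image."""
    bgr = cv2.cvtColor(bayer, cv2.COLOR_BayerBG2BGR_EA)   # edge-aware demosaic
    ycrcb = cv2.cvtColor(bgr, cv2.COLOR_BGR2YCrCb)        # YUV-family color space
    return ycrcb[:, :, 0]                                 # Y component = gray info
```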
Step (3): target detection and pattern recognition.
First, judge whether the envelope diameter of each target's edge-point list figure in the image exceeds 6 pixels, ensuring that every target region has a clear, sharp edge and that adjacent edge regions of regular connected domains do not alias;
secondly, label all connected domains in the image based on the region-growing principle and sort them by the number of pixels each occupies;
thirdly, compute the circularity of each connected domain and, taking a circularity of not less than 75% as the shape criterion, rapidly screen out the candidate connected-domain sequences that qualify;
finally, using the known number of marker points and their relative position layout from the cooperative target pattern design as a training sample, traverse the connected-domain sequences with a pattern recognition classification algorithm, select the optimal candidate connected-domain subset matching the training sample, and determine the position to which each connected domain in the optimal subset corresponds from the marker point sequence numbers of the training sample, establishing a one-to-one correspondence between the connected domains and the training sample.
Step (4): compute image contrast statistics for each visual marker point region.
Compute the statistical mean of the black-white gray-level DN differences of the visual marker point regions in a single frame. If this DN difference is not below 70, the current visual imaging illumination environment is ideal and the target pose three-dimensional measurement of part three can proceed directly, with no need for auxiliary illumination from the active light source. Otherwise the current illumination environment is judged poor: turn on the active light source illuminator of the visible light vision camera, return to step (2), and recompute the statistical mean of the black-white DN differences in a single frame; if it is now not below 70, no light source object-plane illumination compensation is needed, and if it is still below 70, execute step (5).
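The contrast statistic of this step might be computed as below; the mask-pair representation of each marker region is a hypothetical interface, not defined in the patent:

```python
import numpy as np

def marker_dn_difference(gray: np.ndarray, marker_regions) -> float:
    """Statistical mean of the black-white DN differences over all markers.

    `marker_regions` is a hypothetical list of (white_mask, black_mask)
    boolean-mask pairs, one per marker point, produced by the detection step.
    """
    diffs = [float(gray[white].mean()) - float(gray[black].mean())
             for white, black in marker_regions]
    return float(np.mean(diffs))   # compared against the DN threshold of 70
```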
Step (5): resolve the camera image-plane illuminance.
Substitute the statistical mean of the black-white gray-level DN differences of the visual marker point regions and, according to the image quantization bit depth output by the camera, resolve the physical gray value output by the camera image detector after photoelectric conversion. The calculation is as follows:
assume the quantization bit depth of the image output by the visible light vision camera is x bits and the statistical mean of the black-white gray-level DN differences of the visual marker point regions is ω; the gray value actually output by the corresponding image detector after photoelectric conversion is then:
$$DN_{image} = \omega \cdot 2^{\,x-8}$$
further, it can be derived that the camera image plane illumination calculation formula is expressed as follows:
Figure BDA0003798924730000082
in the formula, T exposal Representing the exposure time required for the camera to capture a single frame of image, unit: s (seconds); s respond Expressing the sensitivity and responsivity of the image detector device, the unit:DN/(nJ/cm 2 )=DN×cm 2 /nJ, where nJ is nanoJoule, 1nJ =10 -9 Joule.
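A sketch of this DN-to-illuminance step. The 2^(x−8) rescaling assumes ω was measured on an 8-bit image while the detector quantizes to x bits (consistent with the formula above), and the exposure time and responsivity defaults are placeholders, not flight values:

```python
def image_plane_illuminance(omega: float, x_bits: int = 10,
                            t_exposure_s: float = 0.01,
                            s_respond: float = 50.0) -> float:
    """E_image = DN_image / (S_respond * T_exposal)."""
    dn_image = omega * 2 ** (x_bits - 8)          # detector-scale gray value
    return dn_image / (s_respond * t_exposure_s)  # image-plane illuminance
```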
Illumination is an important index characterizing the illumination constraint, and its dimension has two common expressions: lux (lx) and watts per square meter (W/m²). To keep subsequent calculation units uniform and continuous, the conversion between the two is described as follows:
engineering optics specifies that a light source producing an illumination of 1 lx with monochromatic light of frequency 540 × 10¹² Hz has a radiation intensity of 1/683 W/m². This definition is only a reference value; the actual factor must be determined from the responsivity curve of the image sensing device (the sum of the responsivities of the RGB channels) over the white spectrum selected for the camera (420 to 700 nm), together with the illumination curve of the light source. Optical analysis gives the following conversion for the visible light vision camera observing white light:
[camera-specific lx to W/m² conversion formula; it appears only as an image in the source and is not reproduced here]
and (6) estimating the illumination compensation value of the object plane of the light source.
Substituting the optical design parameters into the visible light visual camera of the mechanical arm of the space station, and expressing the calculation formula of the surface luminous brightness of the visual mark as follows:
$$L = \frac{4\,E_{image}}{\pi\,\tau\,(D/F)^2}$$
where τ is the transmittance of the camera optical system, D/F denotes the relative aperture of the camera (the reciprocal of the F-number), and π is the circular constant.
Because the space station mechanical arm's visible light vision camera adopts an integrated binocular design, the left-eye lens always faces the cooperative target when imaging on orbit, keeping the visual marker in the central field-of-view region of the left-eye image, while the right-eye lens observes it with a certain positional offset, as shown in fig. 2.
For the left-eye image, the active light source is taken to illuminate the visual marker surface perpendicularly, so the angle between the incidence direction of the illuminator's exit optics and the marker surface normal is α = 0. For a diffuse (Lambertian) marker surface, luminance and illuminance are related by L = ρE/π, so the object-plane illumination compensation value of the marker under the light source is:
$$E_{lamp\text{-}left} = \frac{\pi\,L}{\rho}$$
where ρ is the reflectance of the white paint on the visual marker surface, usually expressed as a percentage, and L is the luminance corresponding to the light reflected from the marker surface, unit: cd/m².
For the right-eye image, there is an angle α (unit: degree) between the incidence direction of the illuminator's exit optics and the marker surface normal, and the object-plane illumination compensation value becomes:
$$E_{lamp\text{-}right} = \frac{\pi\,L}{\rho\cos\alpha}$$
therefore, the light source illumination power calculation formula is expressed as follows:
$$P = \frac{\pi\,\big(d\tan(\theta/2)\big)^2\cdot\max\big(E_{lamp\text{-}left},\,E_{lamp\text{-}right}\big)}{K\,\eta}$$
where d is the illumination distance of the active light source (unit: m); its default value is the farthest distance at which the visible light vision camera images clearly, and in some cases it can be adjusted dynamically from the output of the target pose three-dimensional measurement of part three; θ is the illumination angle (unit: °), generally required to cover the camera's entire field of view; K is the luminous efficacy, unit: lm/W; η is the illumination uniformity, dimensionless and usually given as a percentage.
In general, the luminous efficacy K, illumination uniformity η, and illumination angle θ are constant coefficients for a given camera light source; typically K is taken as 90 to 120 lm/W and η as 50% to 70%.
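The photometric chain of steps (5) and (6) condenses to a few lines. Every numeric default below is an illustrative assumption; only the formulas and the 90-120 lm/W and 50-70% bands come from the text:

```python
import math

def marker_luminance(E_image: float, tau: float = 0.85,
                     rel_aperture: float = 0.5) -> float:
    """L = 4*E_image / (pi * tau * (D/F)^2); tau and D/F are assumed values."""
    return 4.0 * E_image / (math.pi * tau * rel_aperture ** 2)

def object_plane_compensation(L: float, rho: float = 0.9,
                              alpha_deg: float = 15.0):
    """Left/right compensation from the Lambertian relation L = rho*E/pi."""
    E_left = math.pi * L / rho                       # alpha = 0 for the left eye
    E_right = math.pi * L / (rho * math.cos(math.radians(alpha_deg)))
    return E_left, E_right

def light_source_power(E_comp: float, d: float = 2.0, theta_deg: float = 60.0,
                       K: float = 100.0, eta: float = 0.6) -> float:
    """P = pi*(d*tan(theta/2))^2 * E_comp / (K*eta), in watts."""
    area = math.pi * (d * math.tan(math.radians(theta_deg) / 2.0)) ** 2
    return area * E_comp / (K * eta)
```

Chaining image_plane_illuminance from step (5) with these three functions reproduces the whole evaluation end to end.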
Part three: three-dimensional measurement of the target pose.
Step (7): locate and extract the two-dimensional coordinates of the visual marker point centers.
Accurately extract the two-dimensional coordinates of the centers of the marked feature points in the image using a gray-scale weighted centroid method.
Step (8): determine the measurement mode.
The space station mechanical arm's visible light vision camera adopts an integrated binocular design. If the monocular measurement mode is selected, the left (right) eye camera executes independently and the procedure continues at step (9); if the binocular measurement mode is selected, the left and right eye cameras must complete the image gray-scale acquisition synchronously, an important precondition for binocular measurement, and the procedure continues at step (10).
Step (9): solve the monocular pose initial value.
Substitute the left (right) camera intrinsic calibration data and the third-party-instrument-measured three-dimensional coordinates of each marker point, and solve the linear equation system by least squares to obtain the initial target pose of the monocular measurement (comprising rotation matrix R and translation vector T); then go to step (12).
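For illustration, OpenCV's solvePnP can stand in for the least-squares linear solve described here; it returns the same kind of (R, T) initial pose from the intrinsics and 2D-3D marker correspondences:

```python
import cv2
import numpy as np

def monocular_pose(object_pts: np.ndarray, image_pts: np.ndarray,
                   K: np.ndarray, dist: np.ndarray):
    """Initial (R, T) for the monocular mode from 2D-3D correspondences."""
    ok, rvec, tvec = cv2.solvePnP(object_pts.astype(np.float64),
                                  image_pts.astype(np.float64),
                                  K, dist, flags=cv2.SOLVEPNP_ITERATIVE)
    if not ok:
        raise RuntimeError("PnP failed")
    R, _ = cv2.Rodrigues(rvec)      # rotation vector -> rotation matrix R
    return R, tvec                  # initial pose (R, T)
```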
Step (10): three-dimensional reconstruction and coordinate conversion.
When the binocular measurement mode is selected, the three-dimensional reconstruction and coordinate conversion formulas are:
$$S_l\,[u_{li},\ v_{li},\ 1]^T = A_l\,P_{ci}$$

$$S_r\,[u_{ri},\ v_{ri},\ 1]^T = A_r\,(R_c\,P_{ci} + t_c)$$
wherein S is l ,S r Respectively, a non-zero scale factor of the left and right eye cameras, A l ,A r Representing the internal reference matrix of the left and right eye cameras, and the external reference calibration result between the two eye cameras is represented as a rotation matrix R c And a translation vector t c ,(u li ,v li ) And (u) ri ,v ri ) Respectively is a mark point P i And the two-dimensional coordinate values of the centers of the feature points in the left eye image and the right eye image are opposite. Now, assuming the left eye camera as the reference coordinate system, mark point P i The three-dimensional coordinate values in the levo-ocular camera coordinate system may be represented as P ci =[X ci ,Y ci ,Z ci ] T . In summary, A l ,A r ,(u li ,v li ),(u ri ,v ri ),R c ,t c All are known constant coefficients, and the mark point P can be solved according to the formula i Three-dimensional coordinate value P in the coordinate system of the left eye camera ci =[X ci ,Y ci ,Z ci ] T . The binocular measurement pose calculation formula can be converted into the following form:
$$P_{ci} = R\,P_{oi} + T$$

where P_oi denotes the three-dimensional coordinates of marker point P_i in the target coordinate system and P_ci = [X_ci, Y_ci, Z_ci]^T its reconstructed coordinates in the left-eye camera coordinate system.
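The projective equations above are exactly what cv2.triangulatePoints solves; a minimal sketch, taking the left-eye camera as the reference frame per the description:

```python
import cv2
import numpy as np

def triangulate_markers(uv_left, uv_right, A_l, A_r, R_c, t_c):
    """Reconstruct marker points P_ci in the left-eye camera frame."""
    P_l = A_l @ np.hstack([np.eye(3), np.zeros((3, 1))])   # [I | 0]
    P_r = A_r @ np.hstack([R_c, t_c.reshape(3, 1)])        # [R_c | t_c]
    X_h = cv2.triangulatePoints(P_l, P_r,
                                np.asarray(uv_left, float).T,
                                np.asarray(uv_right, float).T)
    return (X_h[:3] / X_h[3]).T                            # N x 3 array of P_ci
```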
Step (11): solve the binocular pose initial value.
Substitute the left/right camera calibration data and the measured three-dimensional coordinates of each marker point, and solve the initial target pose data of the binocular measurement mode by singular value decomposition (SVD).
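The SVD solution referenced here is the classical Kabsch rigid alignment; a minimal sketch under the assumption that the marker coordinates in the target frame are known from the pattern design:

```python
import numpy as np

def svd_pose(P_target: np.ndarray, P_camera: np.ndarray):
    """Rigid (R, T) with P_camera ~ R @ P_target + T, via SVD (Kabsch).

    P_target: N x 3 marker coordinates in the target frame;
    P_camera: N x 3 reconstructed coordinates P_ci from step (10).
    """
    ct, cc = P_target.mean(axis=0), P_camera.mean(axis=0)
    H = (P_target - ct).T @ (P_camera - cc)          # 3 x 3 correlation matrix
    U, _, Vt = np.linalg.svd(H)
    sign = np.sign(np.linalg.det(Vt.T @ U.T))        # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, sign]) @ U.T
    T = cc - R @ ct
    return R, T
```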
Step (12): calculate the target-to-end pose parameters.
Substituting the initial target pose (rotation matrix R and translation vector T) and the camera-to-end extrinsic calibration data (rotation matrix R_CE and translation vector T_CE) into the following formulas yields the target-to-end pose parameters (rotation matrix R_OE and translation vector T_OE):

$$R_{OE} = R_{CE}\,R,\qquad T_{OE} = R_{CE}\,T + T_{CE}$$
The three position components and three attitude angles of the target-to-end pose can then be calculated as:

$$[x_{OE},\ y_{OE},\ z_{OE}]^T = T_{OE},\qquad \beta_{OE} = -\arcsin R_{OE,31},\quad \alpha_{OE} = \arctan\!\frac{R_{OE,21}}{R_{OE,11}},\quad \gamma_{OE} = \arctan\!\frac{R_{OE,32}}{R_{OE,33}}$$
where the rotation matrix R_OE is a 3 × 3 matrix containing 3 mutually independent variables, namely the three rotational attitude angles α_OE, β_OE, γ_OE.
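Step (12) and the angle extraction can be sketched together as follows; the Z-Y-X Euler convention is an assumption, since the text states only that R_OE holds three independent attitude angles:

```python
import math
import numpy as np

def target_end_pose(R, T, R_CE, T_CE):
    """Compose target pose with camera-to-end extrinsics and extract angles."""
    R_OE = R_CE @ R                                  # target-to-end rotation
    T_OE = R_CE @ T + T_CE                           # target-to-end translation
    beta = math.asin(-R_OE[2, 0])                    # pitch (assumed Z-Y-X order)
    alpha = math.atan2(R_OE[1, 0], R_OE[0, 0])       # yaw
    gamma = math.atan2(R_OE[2, 1], R_OE[2, 2])       # roll
    return T_OE, (alpha, beta, gamma)
```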
Those skilled in the art will appreciate that those matters not described in detail in the present specification are well known in the art.
Although the present invention has been described with reference to preferred embodiments, it is not limited to them; those skilled in the art can make variations and modifications using the methods and technical content disclosed above without departing from the spirit and scope of the invention.

Claims (10)

1. A dynamic target measurement on-orbit illumination evaluation method for a space station mechanical arm, wherein the visible light vision camera has two measurement modes, monocular and binocular, the illumination evaluation method comprising the following steps:
determining a cooperative target pattern;
setting the active light source illuminator of the visible light vision camera to the off state, acquiring an image, determining the image gray-scale information, and performing target detection and pattern recognition with it; determining, from the detection and recognition results, the gray-level DN difference of the visual marker regions of the cooperative target pattern in the image; if that DN difference is not below a set threshold, no object-plane illumination compensation is needed; if it is below the threshold, turning on the active light source illuminator and re-acquiring the gray-level DN difference; if the re-acquired difference is not below the threshold, no object-plane illumination compensation is needed, and if it is still below the threshold, performing the following processing:
determining the camera image-plane illuminance from the gray-level DN difference of the visual marker regions;
determining the luminance of the visual marker surface from the camera image-plane illuminance;
determining the left-eye image object-plane illumination compensation value and the right-eye image object-plane illumination compensation value from the visual marker surface luminance;
selecting the maximum of the left-eye and right-eye object-plane illumination compensation values as the final object-plane illumination compensation value, and from it determining the light source illumination power.
2. The illumination evaluation method according to claim 1, wherein the cooperative target pattern simultaneously satisfies the following conditions:
the pattern comprises an isotropic set of circular marker points;
at the farthest observation distance, the number of pixels spanned by a marker point's diameter in the camera image is at least the camera's minimum resolvable pixel count;
the pattern uses a matched black-and-white scheme;
the pattern is made of a special diffuse-reflection material resistant to vacuum and to high and low temperatures;
the pattern is unique under rotation, translation, and scaling;
a single image taken at the camera's nearest observation distance contains at least 4 non-coplanar marker points.
3. The illumination evaluation method according to claim 1, wherein, in the monocular measurement mode, after a Bayer image is acquired, an RGB color image is obtained by nonlinear interpolation and converted to the YUV image data format, in which the Y component is the image gray-scale information acquired by the camera.
4. The illumination evaluation method according to claim 1, wherein determining the camera image-plane illuminance from the gray-level DN difference of the visual marker regions comprises:
determining the statistical mean of the gray-level DN differences of the visual marker regions, resolving from it, according to the image quantization bit depth output by the camera, the image gray value output by the camera's image detector after photoelectric conversion, and from that determining the camera image-plane illuminance.
5. The illumination evaluation method according to claim 1, wherein the luminance of the visual marker surface is determined as:

$$L = \frac{4\,E_{image}}{\pi\,\tau\,(D/F)^2}$$

where τ is the transmittance of the camera optical system, D/F denotes the relative aperture of the camera (the reciprocal of the F-number), π is the circular constant, and E_image is the camera image-plane illuminance.
6. The illumination evaluation method according to claim 1, wherein the left-eye image object-plane illumination compensation value and the right-eye image object-plane illumination compensation value are determined from the visual marker surface luminance according to the geometric position relation between the active light source illuminator of the visible light vision camera and the cooperative target.
7. The illumination evaluation method according to claim 1, wherein the light source illumination power is determined as:

$$P = \frac{\pi\,\big(d\tan(\theta/2)\big)^2\cdot\max\big(E_{lamp\text{-}left},\,E_{lamp\text{-}right}\big)}{K\,\eta}$$

where d is the illumination distance of the active light source, π is the circular constant, θ is the illumination angle, K is the luminous efficacy, E_lamp-left and E_lamp-right are the object-plane illumination compensation values of the left-eye and right-eye images, max denotes the larger of the two, and η is the illumination uniformity.
8. The illumination evaluation method according to claim 1, wherein a monocular or binocular measurement mode is used to acquire the image and determine the image gray-scale information.
9. A three-dimensional measurement method for the dynamic target pose of a space station mechanical arm, wherein, after illumination evaluation by the illumination evaluation method of any one of claims 1 to 8, the following processing is performed:
extracting, by a gray-scale weighted centroid method and according to the optimal candidate connected-domain subset in the target detection and pattern recognition results, the two-dimensional coordinates of the centers of all marked feature points of the cooperative target pattern in the image;
determining the measurement mode of the visible light vision camera;
for the monocular measurement mode, determining, based on the physical mapping between the two-dimensional imaging plane and three-dimensional space, an initial value of the target pose data from the two-dimensional coordinates and the camera intrinsic calibration data, then determining the target-to-end three-dimensional pose parameters using the extrinsic calibration between the camera and the end effector;
for the binocular measurement mode, reconstructing the three-dimensional coordinate system, performing coordinate conversion, and determining an initial value of the three-dimensional pose data from the two-dimensional coordinates, then determining the target-to-end three-dimensional pose parameters using the extrinsic calibration between the camera and the end effector.
10. The dynamic target pose three-dimensional measurement method according to claim 9, wherein, for the monocular measurement mode, a least squares method is used to determine the initial target pose data; and for the binocular measurement mode, singular value decomposition is used to determine the initial three-dimensional pose data.
CN202210981620.2A 2022-08-15 2022-08-15 Space station mechanical arm-oriented dynamic target measurement on-orbit illumination evaluation method Active CN115406414B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210981620.2A CN115406414B (en) 2022-08-15 2022-08-15 Space station mechanical arm-oriented dynamic target measurement on-orbit illumination evaluation method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210981620.2A CN115406414B (en) 2022-08-15 2022-08-15 Space station mechanical arm-oriented dynamic target measurement on-orbit illumination evaluation method

Publications (2)

Publication Number | Publication Date
CN115406414A (en) | 2022-11-29
CN115406414B (en) | 2024-03-29

Family

ID=84160330

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210981620.2A Active CN115406414B (en) 2022-08-15 2022-08-15 Space station mechanical arm-oriented dynamic target measurement on-orbit illumination evaluation method

Country Status (1)

Country Link
CN (1) CN115406414B (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003208599A (en) * 1993-10-27 2003-07-25 Matsushita Electric Ind Co Ltd Object recognition device
US5751863A (en) * 1996-06-18 1998-05-12 Hewlett Packard Company Method and system having relaxed front end distortion requirements
JP2007142911A (en) * 2005-11-21 2007-06-07 Seiko Epson Corp Image processing apparatus, image processing method, and program
CN101051117A (en) * 2007-04-05 2007-10-10 北京中星微电子有限公司 Method and device for correcting lens image non-uniformity and extracting lens parameter
CN113744354A (en) * 2021-11-05 2021-12-03 苏州浪潮智能科技有限公司 Method, device and equipment for adjusting brightness based on gray value and readable medium

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117314793A (en) * 2023-11-28 2023-12-29 中国建筑第五工程局有限公司 Building construction data acquisition method based on BIM model
CN117314793B (en) * 2023-11-28 2024-02-09 中国建筑第五工程局有限公司 Building construction data acquisition method based on BIM model

Also Published As

Publication number Publication date
CN115406414B (en) 2024-03-29

Similar Documents

Publication Publication Date Title
EP1462992B1 (en) System and method for shape reconstruction from optical images
JP7028333B2 (en) Lighting condition setting method, equipment, system and program, and storage medium
US7171037B2 (en) Optical inspection system and method for displaying imaged objects in greater than two dimensions
CN105424726B (en) Luminescent panel detection method based on machine vision
CN110942060A (en) Material identification method and device based on laser speckle and modal fusion
JP6054576B2 (en) Method and apparatus for generating at least one virtual image of a measurement object
CN101957188B (en) Method and device for determining properties of textured surfaces
CN101819024B (en) Machine vision-based two-dimensional displacement detection method
CN112567428A (en) Photographing method and photographing apparatus
CN109767425B (en) Machine vision light source uniformity evaluation device and method
JP2011027734A (en) Device for inspection of textured surface
CN103745055A (en) Space target visible light imaging method based on spectrum BRDF (Bidirectional Reflectance Distribution Function)
Ciortan et al. A practical reflectance transformation imaging pipeline for surface characterization in cultural heritage
CN114719749A (en) Metal surface crack detection and real size measurement method and system based on machine vision
CN115406414B (en) Space station mechanical arm-oriented dynamic target measurement on-orbit illumination evaluation method
CN114998308A (en) Defect detection method and system based on photometric stereo
CN113762161A (en) Intelligent obstacle monitoring method and system
KR20230042706A (en) Neural network analysis of LFA test strips
Varjo et al. Image based visibility estimation during day and night
Seulin et al. Simulation of specular surface imaging based on computer graphics: application on a vision inspection system
US20240096059A1 (en) Method for classifying images and method for optically examining an object
Yin et al. Learning based visibility measuring with images
Siatou et al. Reflectance Transformation Imaging (RTI) Data Analysis for Change Detection: Application to Monitoring Protective Coating Failure on Low Carbon Steel
CN112149578B (en) Face skin material calculation method, device and equipment based on face three-dimensional model
KR20190075283A (en) System and Method for detecting Metallic Particles

Legal Events

PB01: Publication
SE01: Entry into force of request for substantive examination
GR01: Patent grant