CN118247344A - Image acquisition system and sea wave image matching method - Google Patents


Info

Publication number
CN118247344A
Authority
CN
China
Prior art keywords
image
camera
point
matching
points
Prior art date
Legal status
Pending
Application number
CN202410233669.9A
Other languages
Chinese (zh)
Inventor
杨光 (Yang Guang)
姜文正 (Jiang Wenzheng)
Current Assignee
First Institute of Oceanography MNR
Original Assignee
First Institute of Oceanography MNR
Priority date
Filing date
Publication date
Application filed by First Institute of Oceanography MNR filed Critical First Institute of Oceanography MNR
Priority to CN202410233669.9A priority Critical patent/CN118247344A/en
Publication of CN118247344A publication Critical patent/CN118247344A/en
Pending legal-status Critical Current

Links

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis
    • G06T7/70: Determining position or orientation of objects or cameras
    • G06T7/73: Determining position or orientation of objects or cameras using feature-based methods
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis
    • G06T7/80: Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00: Arrangements for image or video recognition or understanding
    • G06V10/70: Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/74: Image or video pattern matching; Proximity measures in feature spaces
    • G06V10/75: Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00: Indexing scheme for image analysis or image enhancement
    • G06T2207/30: Subject of image; Context of image processing
    • G06T2207/30244: Camera pose

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Evolutionary Computation (AREA)
  • Databases & Information Systems (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Computing Systems (AREA)
  • Artificial Intelligence (AREA)
  • Health & Medical Sciences (AREA)
  • Multimedia (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)

Abstract

The invention discloses an image acquisition system and a sea wave image matching method, belonging to the technical field of ocean monitoring. The method comprises the following steps: calibrating the intrinsic parameters of the binocular camera; selecting at least one image pair from the sea wave image sequence acquired by the image acquisition system, each image pair comprising a left image and a right image; performing primary matching on the left image and the right image of the at least one image pair and removing gross-error points to obtain a primary matching result; extracting a plurality of feature points on the left image of the at least one image pair; and, in combination with the primary matching result, performing image matching on the extracted feature points to obtain a final matching result. The method solves the low image matching accuracy and reliability caused by the smooth, weakly textured and weakly featured sea surface in stereo-photographic wave measurement.

Description

Image acquisition system and sea wave image matching method
Technical Field
The invention belongs to the technical field of ocean monitoring, and particularly relates to an image acquisition system based on binocular stereoscopic vision and an ocean wave image matching method.
Background
Ocean waves are among the most common and most important physical phenomena in the ocean. The observation and analysis of wave parameters are the most basic work in ocean science research, ocean engineering construction and related fields. A wave observation device that is reliable, technically advanced, convenient and efficient is urgently needed.
At present there are many wave observation methods, which can be roughly divided into three categories: manual observation, instrument measurement and remote-sensing inversion. Manual observation is highly subjective and error-prone. Instrument measurement relies mainly on gravity-type, acoustic and pressure-type equipment, typified by ocean buoys and acoustic wave gauges; it is the most commonly used and relatively reliable approach, but observation is inflexible, accuracy is only moderate, the data are strongly affected by weather, and the equipment is easily lost. Remote-sensing inversion includes satellite remote sensing, radar wave measurement and stereo-photographic wave measurement. Satellite and radar wave observation require no direct contact with the sea surface and suit wide-area observation, but their measurement accuracy is low and the systems are expensive.
Stereo-photographic wave measurement uses two non-metric cameras (a binocular stereo vision system) mounted on the shore, an offshore platform or a ship to synchronously acquire sea-surface image pairs. An upper computer processes the pairs and, through the lens imaging principle and image matching, precisely measures the three-dimensional space-time distribution of the sea surface, from which the directional wave spectrum is obtained. The technique offers simple equipment, low cost, convenient deployment and high measurement accuracy. It can accurately measure micro-scale waves, sea-surface whitecap coverage and the sea-surface flow field, and can also serve ocean engineering construction such as oil platforms and marine ranches.
In practical application, however, stereo-photographic wave measurement faces several problems. First, the time precision of synchronous shooting by the binocular camera is poor: the acquired images differ markedly over the waves, feature-point matching is seriously affected, and the error of the three-dimensionally reconstructed wave model is large. Second, no camera control software meets the special requirements of ocean operation; the binocular camera offers little scope for secondary development, which limits its functional use. Third, because the sea surface is smooth and weakly textured, the reliability and accuracy of sea-surface image matching are far lower than for natural solid scenes, and such matching remains a major challenge.
Disclosure of Invention
To overcome these defects and problems, one aim of the invention is to provide an image acquisition system based on binocular stereoscopic vision that is convenient to deploy, simple to operate and low in cost, and that can be used for stereo-photographic wave measurement.
Another aim of the embodiments of the invention is to provide a sea wave image matching method based on the image acquisition system, which solves the low image matching accuracy and reliability caused by the smooth, weakly textured and weakly featured sea surface in stereo-photographic wave measurement.
In order to solve the technical problems, the invention is realized as follows:
In one aspect, an embodiment of the present invention provides an image acquisition system based on binocular stereoscopic vision, including:
A binocular camera,
And a synchronous triggering unit connected with the binocular camera and used to drive its synchronous image acquisition, the synchronous triggering unit comprising:
a main control board for outputting, at set times and in set quantities, a square-wave signal of user-defined frequency that serves as the external trigger signal for synchronous shooting by the binocular camera;
the network expansion board is connected with the main control board and at least one upper computer and is used for realizing interactive communication between the at least one upper computer and the main control board;
and the optocoupler isolation driving module is used for isolating the low-voltage control circuit of the main control board from the high-voltage execution circuit of the triggering side of the binocular camera.
In some embodiments, a digital input/output pin of the main control board serves as the square-wave output pin and, together with the GND terminal, is wired respectively to the low-voltage signal input and the COM terminal of the optocoupler isolation driving module;
the trigger wires of the two cameras of the binocular camera are connected to the high-voltage signal output side of the optocoupler isolation driving module, and the common ground wires of the cameras are connected to its GND terminal;
and the VCC and GND terminals of the optocoupler isolation driving module are connected respectively to the V+ and V- terminals of the DC switching power supply of the high-voltage execution circuit.
In some embodiments, the network expansion board is directly connected with the main control board in a pin header direct insertion manner; and/or
And connecting the network cable port of the network expansion board with the at least one upper computer through a network cable.
In some embodiments, the at least one host computer stores and executes a camera control program comprising:
the camera information display module is configured to display various hardware information and indexes of the binocular camera, and is used for checking whether a target camera is opened and checking whether the target camera is in a normal working state;
the image real-time display module is configured to display the images captured by the binocular camera in real time;
A camera parameter setting module configured to adjust various intrinsic parameters of the binocular camera to achieve the best image effect;
The camera trigger setting module is configured to select a camera shutter mode, select a trigger mode and set software trigger parameters;
the external trigger setting module is configured to control and operate the synchronous trigger unit, set external trigger parameters and send corresponding external trigger parameters to the synchronous trigger unit;
The camera acquisition operation module is configured to control the binocular camera to complete the operation of image acquisition;
and the image storage module is configured to store image data acquired by the binocular camera.
In some embodiments, the external trigger setting module sets an external trigger parameter, including:
Setting the frequency and the start-stop time of the square wave signal according to an actual acquisition requirement, and sending the frequency and the start-stop time of the square wave signal to the main control board through the network expansion board.
In some embodiments, the image storage module comprises a fast-storage mode and a transfer-storage mode:
image data acquired by the binocular camera are pre-stored on a solid-state drive through the fast-storage mode;
the pre-stored image data are then moved to a mechanical hard disk through the transfer mode.
On the other hand, the invention also provides a sea wave image matching method, which adopts the image acquisition system to acquire a sea wave image sequence, and comprises the following steps:
Calibrating intrinsic parameters of the binocular camera;
Selecting at least one image pair from the sea wave image sequence acquired by the image acquisition system, wherein each image pair comprises a left image and a right image;
Performing primary matching on the left image and the right image of the at least one image pair, and removing gross-error points, to obtain a primary matching result;
Extracting a plurality of feature points on a left image of the at least one image pair;
and combining the primary matching result, and performing image matching on the extracted plurality of feature points to obtain a final matching result.
In some embodiments, before the first matching of the left image and the right image of the at least one image pair, the method further comprises:
Filtering the at least one image pair, comprising:
filtering the selected at least one image pair by adopting a Wallis filtering operator, wherein the filtered image (written here in the standard Wallis form consistent with the symbols defined below) is expressed as:
g_c(x, y) = [g(x, y) - m_g] * (c * s_f) / (c * s_g + (1 - c) * s_f) + b * m_f + (1 - b) * m_g
wherein g_c(x, y) represents the image generated by filtering the image g(x, y), and (x, y) are the pixel coordinates; m_g and s_g are respectively the gray mean and gray deviation of a neighborhood of a pixel in the image; m_f and s_f are the target values of the image mean and image deviation, respectively; c is the image contrast expansion constant and b is the image brightness coefficient.
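As an illustration, the Wallis filtering step can be sketched host-side as follows; the window size, the target values m_f and s_f, and the weights c and b below are illustrative choices, not values taken from the patent:

```python
import numpy as np

def wallis_filter(img, win=7, m_f=127.0, s_f=60.0, c=0.8, b=0.9):
    """Wallis filter: push each pixel's local gray mean / deviation
    toward the target values m_f / s_f (symbols follow the text;
    window size, targets and weights here are illustrative)."""
    img = img.astype(np.float64)
    h, w = img.shape
    r = win // 2
    out = np.empty_like(img)
    for y in range(h):
        for x in range(w):
            # Local statistics over a (clipped) win x win neighborhood.
            patch = img[max(0, y - r):y + r + 1, max(0, x - r):x + r + 1]
            m_g = patch.mean()
            s_g = patch.std()
            gain = (c * s_f) / (c * s_g + (1.0 - c) * s_f)
            out[y, x] = (img[y, x] - m_g) * gain + b * m_f + (1.0 - b) * m_g
    return np.clip(out, 0.0, 255.0)
```

Applied to a low-contrast sea-surface image, the filter stretches local contrast, which is exactly what makes the weakly textured surface easier to match.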
In some embodiments, the primary matching of the left image and the right image of the at least one image pair adopts an epipolar-line-based pyramid image matching method, including:
uniformly selecting a plurality of target points on the left image;
determining, for each target point, its coordinates in the left camera coordinate system and in the pixel coordinate system;
searching, during pyramid image matching, along the epipolar line for the conjugate point matched with the target point;
determining the coordinates of the conjugate point matched with the target point in the right camera coordinate system and in the pixel coordinate system;
wherein, the coordinates of the conjugate point of the target point under the right camera coordinate system satisfy:
wherein the coordinates of a target point in the left camera coordinate system are a(x_a, y_a, -f) and its coordinates in the pixel coordinate system are (u_a, v_a); the coordinates of the conjugate point of the target point in the right camera coordinate system are a'(x_a', y_a', -f') and its coordinates in the pixel coordinate system are (u_a', v_a'); the distortion coefficients of the left camera and the right camera of the binocular camera are k_1 and k_1'; the intersection points of the principal optical axes of the left and right cameras with the image plane are (x_0, y_0) and (x_0', y_0'); and the focal lengths of the left and right cameras are f and f', respectively.
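The epipolar search can be illustrated with a simplified sketch: assuming rectified images (so that the epipolar line of a left-image point is a scanline of the right image), the conjugate point is taken where the normalized cross-correlation with the left-image window is maximal. This is a hedged stand-in for the patent's pyramid search, not its exact procedure:

```python
import numpy as np

def ncc(a, b):
    """Normalized cross-correlation of two equally sized patches."""
    a = a - a.mean()
    b = b - b.mean()
    d = np.sqrt((a * a).sum() * (b * b).sum())
    return (a * b).sum() / d if d > 0 else 0.0

def search_conjugate(left, right, u, v, win=5, max_disp=20):
    """Search along scanline v of the right image (the epipolar line
    under the rectified assumption) for the best match to the window
    centred on (u, v) in the left image; disparity is non-negative,
    so the search runs leftward from u."""
    r = win // 2
    tpl = left[v - r:v + r + 1, u - r:u + r + 1]
    best_u, best_score = u, -1.0
    for uu in range(max(r, u - max_disp), min(right.shape[1] - r, u + 1)):
        cand = right[v - r:v + r + 1, uu - r:uu + r + 1]
        s = ncc(tpl, cand)
        if s > best_score:
            best_score, best_u = s, uu
    return best_u, best_score
```

In the pyramid scheme this search would be run on each level, with the coarse-level result narrowing the search window at the next finer level.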
In some embodiments, gross-error points (outliers) are removed using a cross method, comprising:
if the distance from the conjugate point matched with a target point to the corresponding intersection point exceeds a distance threshold, identifying the conjugate point as a gross-error point and removing it; and
searching for the conjugate point of the target point along the epipolar line again;
wherein the straight line approximately fitted through the conjugate points of all target points in row i intersects the straight line approximately fitted through the conjugate points of all target points in column j to generate the intersection point, and the conjugate point a_ij' of the target point a_ij in row i, column j should lie near this intersection point.
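A sketch of the cross method under these definitions: fit a straight line through the conjugate points of row i and another through those of column j, intersect them, and flag a_ij' as a gross error when it lies farther than a threshold from the intersection. The least-squares line fit and the threshold value here are illustrative:

```python
import numpy as np

def fit_line(pts):
    """Least-squares line a*x + b*y + c = 0 (unit normal) through 2-D points."""
    pts = np.asarray(pts, dtype=float)
    centroid = pts.mean(axis=0)
    # The line normal is the right singular vector of the smallest
    # singular value of the centred point cloud.
    _, _, vt = np.linalg.svd(pts - centroid)
    a, b = vt[-1]
    c = -(a * centroid[0] + b * centroid[1])
    return a, b, c

def intersect(l1, l2):
    """Intersection point of two lines given as (a, b, c)."""
    A = np.array([l1[:2], l2[:2]])
    rhs = -np.array([l1[2], l2[2]])
    return np.linalg.solve(A, rhs)

def is_gross_error(point, row_pts, col_pts, threshold=2.0):
    """Flag `point` (= a_ij') if it lies farther than `threshold`
    pixels from the intersection of the fitted row-i / column-j lines."""
    p = intersect(fit_line(row_pts), fit_line(col_pts))
    return bool(np.linalg.norm(np.asarray(point, dtype=float) - p) > threshold)
```

Flagged conjugate points are discarded and re-searched along the epipolar line, as the text describes.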
In some embodiments, extracting a plurality of feature points on the left image of the at least one image pair includes:
determining a region centered on each target point of the left image; and
extracting at least one feature point in each region.
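One plausible realization (a simplified interest measure, not necessarily the operator used in the patent) is to pick, inside the region around each target point, the pixel with the largest gradient magnitude:

```python
import numpy as np

def extract_feature_points(img, targets, win=11):
    """For each target point (u, v), return the pixel inside the
    surrounding win x win region with the largest gradient magnitude
    (a simple stand-in for a full interest operator)."""
    img = img.astype(np.float64)
    gy, gx = np.gradient(img)          # row-wise, column-wise derivatives
    mag = np.hypot(gx, gy)
    r = win // 2
    feats = []
    for u, v in targets:
        y0, y1 = max(0, v - r), min(img.shape[0], v + r + 1)
        x0, x1 = max(0, u - r), min(img.shape[1], u + r + 1)
        region = mag[y0:y1, x0:x1]
        dy, dx = np.unravel_index(np.argmax(region), region.shape)
        feats.append((x0 + dx, y0 + dy))
    return feats
```

Because the grid of target points already has approximate conjugates from the primary matching, each extracted feature inherits a good starting position for the final matching step.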
In some embodiments, performing image matching on the extracted plurality of feature points in combination with the primary matching result to obtain a final matching result includes:
using the primary matching result, searching for the position of the conjugate point of each extracted feature point with the epipolar-line-based pyramid image matching method;
marking each feature point on the left image and the corresponding conjugate point on the right image; and
matching the marked left and right images by the least squares method to obtain the final matching result.
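The least-squares matching step can be illustrated in one dimension: starting from the integer conjugate position found along the epipolar line, Gauss-Newton iterations solve for the sub-pixel shift that minimizes the sum of squared gray differences. A full least-squares matching model also estimates geometric and radiometric parameters; this sketch keeps only the shift:

```python
import numpy as np

def lsm_refine_1d(f, g, x0, half=4, iters=10):
    """Refine the shift d such that f(x) ~ g(x + d) around x0,
    by Gauss-Newton on the sum of squared gray differences."""
    xs = np.arange(x0 - half, x0 + half + 1, dtype=float)
    idx = np.arange(len(g), dtype=float)
    d = 0.0
    for _ in range(iters):
        pos = xs + d
        g_i = np.interp(pos, idx, g)
        # Derivative of g at the shifted positions (half-pixel differences).
        g_p = np.interp(pos + 0.5, idx, g) - np.interp(pos - 0.5, idx, g)
        r = f[int(x0 - half):int(x0 + half + 1)] - g_i
        denom = (g_p * g_p).sum()
        if denom == 0:
            break
        d += (g_p * r).sum() / denom
    return d
```

In two dimensions the same normal-equation update runs over an image patch, yielding the sub-pixel conjugate positions that make up the final matching result.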
Compared with common image acquisition systems, the synchronous triggering unit of the image acquisition system provided by the embodiment of the invention greatly improves the time synchronism of binocular shooting: the timing precision reaches the millisecond level, meeting the requirements of current stereo-photographic wave measurement.
Meanwhile, the sea wave image matching method provided by the embodiment of the invention solves the low image matching accuracy and reliability caused by the smooth, weakly textured and weakly featured sea surface in stereo-photographic wave measurement.
Drawings
Fig. 1 shows an overall architecture diagram of an image acquisition system according to a first embodiment of the present invention;
FIG. 2 shows a block schematic of a camera control program;
Fig. 3 is a schematic flow chart of a sea wave image matching method according to a second embodiment of the present invention;
FIG. 4 shows a schematic diagram of a stereographic sea wave measurement geometry model;
Fig. 5 shows the final sea wave image matching result.
Wherein, the reference numerals are as follows:
10 binocular camera
20 Synchronous trigger unit
21 Main control board
22 Network expansion board
23 Optocoupler isolation drive module
30 Upper computer
401 Camera information display Module
402 Real-time image display module
403 Camera parameter setting Module
404 Camera trigger setting module
405 External trigger setting Module
406 Camera acquisition operation module
407, An image storage module.
Detailed Description
The following description of the embodiments of the present invention will be made clearly and fully with reference to the accompanying drawings, in which it is evident that the embodiments described are some, but not all embodiments of the invention. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
Fig. 1 is a schematic diagram of an overall image capturing system according to an embodiment of the present invention.
The image acquisition system is based on binocular stereoscopic vision and can be applied to stereo-photographic wave measurement to acquire sea wave images. Referring to fig. 1, the system specifically includes:
A binocular camera 10,
A synchronization triggering unit 20 connected to the binocular camera 10 for driving the synchronous shooting of the binocular camera, the synchronization triggering unit 20 comprising:
A main control board 21 for outputting a square wave signal with a self-defined frequency at fixed time and fixed quantity as an external trigger signal for synchronous shooting of the binocular camera;
The network expansion board 22 is connected with the main control board and at least one upper computer 30 and is used for realizing interactive communication between the at least one upper computer and the main control board;
an optocoupler isolation driving module 23, configured to isolate the low-voltage control circuit of the main control board from the high-voltage execution circuit on the triggering side of the binocular camera; and
At least one upper computer 30, the at least one upper computer 30 and the binocular camera 10 are in interactive communication through optical fibers.
In a specific implementation, the binocular camera supports two trigger modes: software trigger and external trigger. A software trigger initiates shooting through a software program; this occupies upper-computer resources, so an unmeasured time difference can arise when the two cameras are meant to capture images at the same instant. Especially when the observed object is a rapidly changing, weakly featured sea surface, that time difference makes the left and right images of a pair differ markedly; when such pairs are processed by stereo-photographic wave measurement, the numbers of detected feature points differ greatly, the matching degree is low, and the accuracy of three-dimensional wave reconstruction suffers. This embodiment therefore uses external triggering: both cameras capture, via their respective trigger lines, the rising or falling edge of a square-wave signal emitted by the same hardware system and complete shooting together. For a fast-moving, weakly featured subject such as sea waves, external triggering guarantees the synchronism of shooting times. The synchronous triggering unit designed in this embodiment realizes this external triggering; it greatly improves the time synchronism of binocular shooting, achieves millisecond-level timing precision, and meets the requirements of current stereo-photographic wave measurement.
In some embodiments, the main control board is connected to the at least one upper computer by a USB cable. An Arduino UNO R3 is adopted as the main control board: it provides 14 digital input/output pins (6 usable as PWM outputs), 6 analog inputs and a 16 MHz crystal oscillator clock, and can be powered and programmed over a USB data connection to a computer. It is reliable, capable and inexpensive.
Meanwhile, in some embodiments, a W5500 network expansion board is selected as the communication medium between the upper computer 30 and the lower main control board 21. It integrates a TCP/IP protocol stack and a 10/100 M Ethernet data link layer (MAC) and physical layer (PHY), allowing it to act as a simple Web server or to control network applications such as reading and writing digital and analog interfaces over the network, greatly improving communication efficiency and stability. More specifically, the network expansion board is attached directly to the main control board through its pin headers, and is connected to the at least one upper computer through Cat 5, Cat 5e, Cat 6 or other gigabit network cables.
In a specific implementation, the library files of the W5500 network expansion board are first imported into the Arduino IDE by copying the board's library (the Ethernet folder) into the libraries folder under the Arduino IDE installation path. A lower-computer socket communication program is then written in the Arduino IDE. First, the header files SPI.h and Ethernet.h are included; next, the MAC address, IP address and HTTP port number of the Arduino main control board are set. In the setup() function, Ethernet.begin() is called to initialize these settings and server.begin() is called to make the server start listening for messages from the upper computer. In the loop() function, the line EthernetClient client = server.available(); is written: when a client accesses the server, server.available() returns a client object through which information is fed back to that client.
Then, the trigger logic and a timed start/stop program are written for the Arduino main control board. A digital input/output pin of the board is defined as the square-wave output pin via the pinMode() function. The square-wave signal of user-defined frequency is generated by the digitalWrite() function in cooperation with the delayMicroseconds() function, whose time parameter is the frame-interval time from the external parameter settings sent by the upper computer and received through the network expansion board. Meanwhile, Timer0 (associated with pins 5 and 6 of the Arduino board) is used to time the emission of the square-wave signal; when the stop time or the sequence frame count sent by the upper computer is reached, the counter is reset to zero and the board stops emitting the square-wave signal.
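The timing logic can be sketched in a host-side simulation; the function below only computes the pulse schedule that the board would realize with digitalWrite() and delayMicroseconds() (the parameters are illustrative):

```python
def trigger_schedule(freq_hz, n_frames):
    """Return the rising-edge timestamps (in microseconds) of a square
    wave of the given frequency, stopping after n_frames pulses, which
    mirrors the board's timed, quantity-limited trigger output.
    Also return the half period the board would pass to its delay call."""
    period_us = int(round(1_000_000 / freq_hz))
    edges = [i * period_us for i in range(n_frames)]
    return edges, period_us // 2

# Example: a 10 Hz trigger limited to a 5-frame sequence.
edges, half_period_us = trigger_schedule(10.0, 5)
```

On the real board, each rising edge is produced by driving the output pin high, waiting half a period, driving it low, and waiting the other half, until the frame count or stop time is reached.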
In addition, in this embodiment, the optocoupler isolation driving module 23 isolates the low-voltage control circuit of the main control board 21 from the high-voltage execution circuit on the triggering side of the binocular camera 10, so as to control the on-off of the high-voltage circuit. In some embodiments, the voltage range of the trigger signal which can be captured by the selected binocular camera is 5V-24V, and the high-voltage execution circuit selects a 12V direct current switch power supply to work together with the optocoupler isolation driving module in consideration of loss of the voltage signal transmitted in the trigger line of the camera. Specifically, a digital input/output PIN (for example PIN 9) of the main control board 21 is used as an output PIN of the square wave signal and GND end, and is connected with the low voltage signal input end and COM end of the optocoupler isolation driving module 23 respectively through wires; the trigger wires of the left and right two cameras of the binocular camera 10 are connected with the high-voltage signal output side of the optocoupler isolation driving module 23, and the common ground wires of the cameras are connected with the GND end of the optocoupler isolation driving module; and the VCC end and the GND end of the optocoupler isolation driving module are respectively connected with the V+ end and the V-end of the direct current switch power supply of the high-voltage executing circuit.
In addition, in this embodiment, at least one upper computer stores a camera control program, and after receiving the instruction of the camera control program, the synchronization triggering unit executes the instruction of the camera control program to drive the synchronous image acquisition operation of the binocular camera. The camera control program adopts a multithread data acquisition mode, so that a thread is opened up for each of the left camera and the right camera of the binocular camera to acquire work, and the processing efficiency of the system and the stability of software operation are improved. Meanwhile, the camera control software has rich functions, can perform all-around setting on the binocular camera, and can ensure the image quality to the greatest extent under the complex and various ocean real environments.
Specifically, referring to fig. 2, the camera control program includes:
A camera information display module 401 configured to display various hardware information and indexes of the binocular camera, including a device name, a device type, a device serial number, a sensor serial number, whether color is supported, whether self-cooling is supported, a sensor temperature and a lens focal length; and for checking whether the target camera is turned on and checking whether the target camera is in a normal operating state.
The image real-time display module 402 is configured to display the image captured by the binocular camera in real time, and may perform operations of zooming in and out, displaying coordinates, drawing a cross center line, and the like on the real-time image.
The camera parameter setting module 403 is configured to adjust various intrinsic parameters of the binocular camera to achieve the best image effect, including selection of a "parameter independent" or "parameter synchronization" mode, adjustment of lens aperture and focus, selection of an exposure mode and a white balance mode, wherein manual exposure includes setting an exposure area, an exposure delay, a gain, and a sampling rate, and manual white balance includes setting a white balance area, luminosity, chromaticity, sharpness, and RGB channel values.
A camera trigger setting module 404 configured to select a camera shutter mode, including a global shutter or a rolling shutter; and selecting a triggering mode, including software triggering or external triggering; and setting software triggering parameters including sequence frame number, sequence interval and triggering delay.
The external trigger setting module 405 is configured to control and operate the synchronous trigger unit, set external trigger parameters, and send them to the synchronous trigger unit. Its operations include online query, time synchronization, wake-up and manual stop; it displays the communication parameters and allows selection of an available serial port and setting of the baud rate; and it sets external trigger parameters such as the sequence frame count, frame interval, trigger channel, trigger delay and start time. That is, the frequency and start/stop times of the square-wave signal are set according to the actual acquisition requirement and sent to the main control board through the network expansion board.
The camera acquisition operation module 406 is configured to control the binocular camera to complete the image acquisition operation, including free running, manual acquisition, automatic acquisition, and the like.
The image storage module 407 is configured to store the image data collected by the binocular camera, including setting the fast-storage path and the transfer path, setting image and pixel formats, and naming and numbering images. In a specific implementation, the image storage module comprises a fast-storage mode and a transfer-storage mode: at the start of acquisition, image data from the binocular camera are pre-stored via the fast-storage mode on a solid-state drive with a high read/write speed; then, during or after acquisition, the pre-stored data are moved via the transfer mode to a mechanical hard disk of larger capacity. This allocates computer resources sensibly and avoids the freezes and crashes that data reading and writing can otherwise cause while the software runs. In addition, stored images can be named in the form prefix + serial number + timestamp, which makes it easy to check whether the image pairs acquired by the two cameras belong to the same instant and facilitates the selection and processing of image pairs by the stereo-photographic wave measurement technique.
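The naming scheme and the fast-store/transfer flow can be sketched as follows; the file format, directory layout and helper names are illustrative, not taken from the patent:

```python
import os
import shutil
import tempfile
from datetime import datetime

def frame_name(prefix, index, ts=None):
    """Build a 'prefix_index_timestamp' file name so that left/right
    frames taken at the same instant can be paired by their shared
    timestamp."""
    ts = ts or datetime.now()
    return f"{prefix}_{index:06d}_{ts.strftime('%Y%m%d%H%M%S%f')}.bmp"

def transfer(fast_dir, archive_dir):
    """Move pre-stored frames from the fast (SSD) directory to the
    larger archive (HDD) directory, as in the transfer-storage mode."""
    os.makedirs(archive_dir, exist_ok=True)
    moved = []
    for name in sorted(os.listdir(fast_dir)):
        shutil.move(os.path.join(fast_dir, name),
                    os.path.join(archive_dir, name))
        moved.append(name)
    return moved
```

Keeping the acquisition loop writing only to the fast directory, and running the transfer in the background or after acquisition, is what decouples disk throughput from the camera frame rate.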
In a specific implementation, the workflow of the image acquisition system is as follows:
(1) Connect and configure the binocular camera with the upper computer. The data transmission interface of the binocular camera selected in this embodiment is PCIe 2.0, so the upper computer must be equipped with a PCIe 2.0 image acquisition card; the two are connected by optical fiber, which ensures stable transmission of large data streams. After installing the camera driver, use the test software provided by the camera manufacturer to check whether the cameras work normally.
(2) Connect the binocular camera with the synchronous trigger unit. Connect the trigger wire of each camera to the high-voltage signal output side of the optocoupler isolation driving module, and connect the common ground wire of each camera to the GND end of the optocoupler isolation driving module.
(3) Run the camera control program and turn on the cameras. After running the camera control program, click the QComboBox control in the upper right corner of the main window (displayed as "No Camera" by default) and select the cameras from the drop-down list; the prompt "Two cameras have been opened!" is then displayed. Clicking the QPushButton control "left camera information" or "right camera information" shows the current camera information, including device name, device type, device serial number, sensor serial number, whether color is supported, whether self-cooling is supported, sensor temperature, lens focal length, etc. Clicking the QPushButton control "free running" displays the real-time pictures shot by the binocular camera in the two QLabel controls in the middle of the main window. Clicking the QPushButton controls "cross line" and "coordinates" in the lower right corner displays the cross center line and the horizontal and vertical coordinates of the mouse position in the real-time pictures, and QWheelEvent() allows zooming the real-time pictures in and out with the mouse wheel.
(4) Configure the binocular camera parameters. Clicking the QPushButton control "camera configuration" in the main interface pops up a sub-window as the settings interface, which specifically includes the following:
① The "parameter synchronization" or "parameter independent" mode is selected according to the actual environment. In the parameter synchronization mode, only the parameters of one camera are adjusted, and the parameters of the other camera are adjusted in the same way. In the parameter independent mode, parameter configuration needs to be performed on the two cameras respectively.
② Select the exposure and white balance modes. In "automatic exposure" or "automatic white balance" mode, the camera adjusts the picture automatically according to actual conditions; in "manual exposure" or "manual white balance" mode, a corresponding settings sub-window pops up, in which parameters such as the exposure area, exposure delay, gain, sampling rate, white balance area, luminosity, chromaticity, and sharpness can be set manually.
③ Adjust the aperture and focus. Sliding the QSlider control left and right adjusts the aperture size, and clicking the QPushButton controls "<", ">", and ">>" performs fine, medium, and coarse focus adjustment respectively.
④ Select the image format and pixel format to be saved. Clicking the QComboBox control next to "pixel format" allows selecting among four pixel formats: GRAY8, RGB24, RGB32, and MONO8. The same operation on "photo format" allows selecting among PNG, JPEG, TIFF, BMP, and RAW.
⑤ Perform trigger setting. Clicking the QPushButton control "trigger setting" pops up a trigger setting window. First select the shutter mode of the binocular camera, then select the trigger mode; the selectable trigger modes include software trigger and external trigger. Software trigger can be selected while debugging the cameras, with the trigger delay, sequence frame count, and sequence interval set correspondingly. For formal sea wave image acquisition, external trigger must be selected.
⑥ Perform external trigger setting. After external trigger is selected, clicking the QPushButton control "external trigger settings" pops up a settings child window. Click the QPushButton control "online query" below; after the synchronous trigger unit of the lower computer connects successfully to the upper computer, the IP addresses of the upper and lower computers are displayed in the communication parameter module. In the serial port settings, the port of the Arduino main control board connected to the computer can be searched for and selected, and the baud rate set. Then set parameters such as the sequence frame count, frame interval, trigger channel, and trigger delay according to the actual acquisition requirements, and set the times at which the Arduino main control board starts or stops working. After the settings are complete, click "send settings" to send each parameter to the Arduino main control board of the synchronous trigger unit via socket communication; the sequence frame count and frame interval determine the number and frequency of the square wave signals sent by the Arduino main control board.
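A minimal sketch of sending the external-trigger parameters over a socket, as described above. The JSON payload and key names are assumptions for illustration only; the real firmware of the main control board defines its own protocol:

```python
import json
import socket

# Illustrative external-trigger parameters (sequence frame count, frame
# interval, trigger channel, trigger delay, start time). These key names
# are hypothetical, not the actual protocol of the trigger unit.
SETTINGS = {
    "frames": 100,       # number of square-wave pulses to emit
    "interval_ms": 500,  # pulse interval, which fixes the trigger frequency
    "channel": 1,
    "delay_ms": 0,
    "start": "2024.03.01.08.00.00",
}

def send_trigger_settings(host: str, port: int, settings: dict) -> None:
    """Send external-trigger parameters to the synchronous trigger unit's
    main control board over a plain TCP socket."""
    payload = json.dumps(settings).encode("utf-8")
    with socket.create_connection((host, port), timeout=5.0) as s:
        s.sendall(payload)
```

In this sketch the upper computer is the client and the trigger unit listens; the frame count and interval in the payload are what determine how many square-wave pulses are produced and at what rate.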
After all settings are completed, return to the camera configuration main interface; clicking the QPushButton control "apply" makes the parameters take effect.
(5) Set the image storage locations. In the camera configuration interface, set the storage paths for the images captured by the binocular camera. Click "fast storage path"; the fast storage path generally selects a solid state disk for the pre-storage operation, which ensures the read-write speed of image data and the stability of the running software. Then set the image prefix, serial number, and transfer path for each camera; the transfer path generally selects a mechanical hard disk with larger capacity.
(6) Acquire. After the cameras are configured successfully, clicking the QPushButton control "manual acquisition" makes the binocular camera capture one image pair; the manual acquisition mode is usually used to check whether the synchronous trigger unit works normally. Clicking the QPushButton control "automatic acquisition" makes the synchronous trigger unit receive a working signal and output square wave signals according to the external trigger parameters; the trigger wires of the binocular camera capture these signals and trigger shooting. Each time an image pair is acquired, the count in the QLineEdit control "number to be stored" in the lower left corner of the main interface increases by 2, and the image pair is written to the fast storage path. Clicking the QPushButton control "transfer data" writes the image pairs stored in the fast storage path into the transfer path, and the count in the QLineEdit control "stored number" in the lower left corner of the main interface increases by 2 accordingly. The stored images are named in the form prefix-serial number-timestamp, where the timestamp is the shooting time in the format yyyy.mm.dd.hh.mm.ss (year.month.day.hour.minute.second).
(7) After the acquisition and transfer work are finished, close the binocular camera and exit the software to release resources.
In summary, compared with a common image acquisition system, in the image acquisition system provided by this embodiment of the invention the synchronous trigger unit greatly improves the time synchronization of binocular camera shooting, with timing precision at the millisecond level, meeting the requirements of the existing stereo photography sea wave measurement technique. The system uses simpler equipment at lower cost, offers stronger functions and higher time synchronization precision, supports unattended operation, and contributes to the automation of marine equipment. It can be deployed on shore bases, offshore platforms, and ships, and is tailored to the special requirements of stereo photography sea wave measurement.
Further, based on the image acquisition system of the first embodiment, a second embodiment of the present invention provides a sea wave image matching method, which uses the image acquisition system of the first embodiment to acquire a sea wave image sequence. Referring to fig. 3, fig. 3 shows a specific flow diagram of the sea wave image matching method. The method specifically comprises the following steps:
S1, calibrating intrinsic parameters of the binocular camera;
In a specific implementation, the intrinsic parameters of the camera include the distortion coefficients k_1, k_1' of the two cameras, the coordinates (x_0, y_0), (x_0', y_0') of the intersection points of the principal optical axes of the left and right cameras with the image sensor plane (CMOS), and the focal lengths f and f' of the left and right cameras. In this embodiment, the distance equation is used as a condition together with the coplanarity equation, and 13 camera calibration parameters are determined by a conditional adjustment method with parameters; besides k_1, k_1', (x_0, y_0), (x_0', y_0'), f, and f', these include 5 relative orientation parameters describing the relative positions of the two cameras.
S2, selecting at least one image pair from the sea wave image sequence acquired by the image acquisition system, wherein each image pair comprises a left image and a right image.
In a specific implementation, the binocular camera is installed on an observation platform with the overlapping area of the image pairs made as large as possible; at least 100 pairs of valid sea wave images are collected in TIFF format, image pairs with large over-exposed areas or containing objects other than sea waves are removed, and at least one image pair is selected, each comprising a left image and a right image.
Then, in this embodiment, filtering is performed on the selected at least one image pair. Because sea surface texture is weak and the background of the whole sea wave image is dark, this embodiment adopts the Wallis filter operator to enhance image texture, suppress picture noise, and improve sharpening. The selected at least one image pair is filtered with the Wallis operator, and the filtered image is expressed as:

g_c(x, y) = [g(x, y) - m_g] · (c·s_f) / (c·s_g + (1 - c)·s_f) + b·m_f + (1 - b)·m_g

where g_c(x, y) represents the image generated by filtering the image g(x, y), and x, y are the x, y coordinates; m_g and s_g are respectively the gray mean and gray variance of a neighborhood of a pixel in the image; m_f and s_f are the target values of the image mean and image variance, respectively; c is the image contrast expansion constant and b is the image brightness coefficient. Generally, b ranges over [0, 1]; c also ranges over [0, 1] and increases with the size of the selected window. The specific parameter values are chosen empirically according to the actual situation, for example b = c = 0.5.
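A minimal sketch of the Wallis filtering step, following the parameter meanings above; the window size and target values are illustrative defaults, and the loop is written for clarity rather than speed:

```python
import numpy as np

def wallis_filter(img, win=31, m_f=127.0, s_f=60.0, b=0.5, c=0.5):
    """Wallis filtering: pull each pixel toward the target mean m_f and
    target spread s_f computed over a local window around the pixel."""
    img = img.astype(np.float64)
    pad = win // 2
    padded = np.pad(img, pad, mode="reflect")
    out = np.empty_like(img)
    h, w = img.shape
    for y in range(h):
        for x in range(w):
            patch = padded[y:y + win, x:x + win]
            m_g, s_g = patch.mean(), patch.std()
            # contrast gain blends the local and target spreads via c
            gain = c * s_f / (c * s_g + (1.0 - c) * s_f)
            out[y, x] = (img[y, x] - m_g) * gain + b * m_f + (1.0 - b) * m_g
    return np.clip(out, 0, 255)
```

On weak-texture sea surface imagery this stretches local contrast toward s_f while moving the dark background toward the target brightness b·m_f.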
S3, performing primary matching on the left image and the right image of the at least one image pair, and removing rough difference points to obtain a primary matching result.
In this embodiment, the left image and the right image of the at least one filtered image pair are then subjected to primary matching. Specifically, this embodiment adopts a pyramid image matching method based on epipolar lines to perform primary matching on the left and right images of the at least one image pair. A pyramid image is an image processing scheme in which every n×n = n² pixels of the initial image are averaged to form one pixel of the second-level image, and a third-level image is then formed on the basis of the second level, and so on. This method extracts feature points quickly and ensures robustness while keeping image textures clear.
In this embodiment, every 3×3 pixels of the lower level of the pyramid are averaged to generate one pixel of the upper level, and considering the actual marine environment, the pyramid is built up to the third level.
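The 3×3 averaging pyramid described above can be sketched as follows (a straightforward block-mean reduction; edge pixels that do not fill a complete block are dropped):

```python
import numpy as np

def build_pyramid(img, factor=3, levels=3):
    """Build an image pyramid where each upper-level pixel is the mean of
    a factor x factor block of the level below (3x3 blocks, three levels,
    as described above). Returns the list [level1, level2, level3]."""
    pyramid = [np.asarray(img, dtype=np.float64)]
    for _ in range(levels - 1):
        g = pyramid[-1]
        h = (g.shape[0] // factor) * factor
        w = (g.shape[1] // factor) * factor
        blocks = g[:h, :w].reshape(h // factor, factor, w // factor, factor)
        pyramid.append(blocks.mean(axis=(1, 3)))
    return pyramid
```

Matching proceeds coarse-to-fine: conjugate points found on the small top level constrain the search on the levels below.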
Pyramid image matching based on epipolar lines searches for conjugate points along the epipolar line rather than over a whole area during pyramid image matching, which greatly reduces the search region. Referring to fig. 4, fig. 4 shows a schematic diagram of the geometric model of stereo photography sea wave measurement. The coordinates of a target point uniformly selected on the left image (a point on the epipolar line) are denoted a(x_a, y_a, -f) in the left camera coordinate system, and the coordinates of its conjugate point in the right camera coordinate system are denoted a'(x_a', y_a', -f'); the epipolar line is the intersection of the plane determined by the baseline B and the image point with the right camera image plane. The epipolar line equation can be expressed as:

N' · (x', y', -f')^T = 0    (1)

where N' is the normal vector, in the right camera coordinate system, of the plane determined by the conjugate point and the two image points.
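Under the geometry above, the epipolar line on the right image can be obtained from the normal of the epipolar plane. The following sketch assumes the baseline vector and the left-image ray are already expressed in the right camera coordinate system; names and conventions are illustrative:

```python
import numpy as np

def epipolar_line(baseline, ray_right, f_prime):
    """Return coefficients (A, B, C) of the epipolar line
    A*x' + B*y' + C = 0 on the right image plane z' = -f'.
    The plane normal N' is the cross product of the baseline and the
    left-image ray, both given in the right camera coordinate system."""
    n = np.cross(baseline, ray_right)  # epipolar plane normal N'
    # N'_x * x' + N'_y * y' + N'_z * (-f') = 0
    return n[0], n[1], -n[2] * f_prime
```

Restricting the conjugate-point search to this one-dimensional line is what shrinks the two-dimensional search area during pyramid matching.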
Specifically, the present embodiment adopts a pyramid image matching method based on epipolar lines to perform primary matching on a left image and a right image of the at least one image pair, and specifically includes:
uniformly selecting a plurality of target points on the left image;
determining the coordinates of each target point in the left camera coordinate system according to its coordinates in the pixel coordinate system;
searching for the conjugate point matched with each target point along an epipolar line during pyramid image matching;
determining the coordinates of the conjugate point matched with the target point in a right camera coordinate system and the coordinates of the conjugate point in a pixel coordinate system;
wherein, according to the epipolar equation (1), the coordinates of the conjugate point of the target point in the right camera coordinate system are determined to satisfy equation (2), where the coordinates of a target point in the left camera coordinate system are a(x_a, y_a, -f), its coordinates in the pixel coordinate system are (u_a, v_a), the coordinates of the conjugate point of the target point in the right camera coordinate system are a'(x_a', y_a', -f'), and its coordinates in the pixel coordinate system are (u_a', v_a'); the distortion coefficients of the left and right cameras of the binocular camera are k_1, k_1', the coordinates of the intersection points of the principal optical axes of the left and right cameras with the image sensor plane are (x_0, y_0), (x_0', y_0'), and the focal lengths of the left and right cameras are f and f', respectively. Solving equation (2) yields the coordinates (u_a', v_a') of the conjugate point in the pixel coordinate system. The correlation coefficient between the target window and the search window is calculated to judge whether a candidate point is the conjugate point. In some embodiments, the correlation coefficient obtained by dividing the covariance of the two signals by the product of their standard deviations is used as the matching measure.
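The matching measure just mentioned, covariance divided by the product of the two windows' standard deviations, can be sketched as follows; the function name and flattening of the windows are illustrative:

```python
import numpy as np

def correlation_coefficient(target, search):
    """Normalized correlation coefficient between a target window and a
    candidate search window: covariance over the product of standard
    deviations. Values near 1 indicate a likely conjugate point."""
    t = np.asarray(target, dtype=np.float64).ravel()
    s = np.asarray(search, dtype=np.float64).ravel()
    t = t - t.mean()
    s = s - s.mean()
    denom = np.sqrt((t * t).sum() * (s * s).sum())
    return float((t * s).sum() / denom) if denom > 0 else 0.0
```

During the search along the epipolar line, the candidate window with the highest coefficient (above some acceptance threshold) is taken as the conjugate point.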
Furthermore, after the epipolar-line-based pyramid image matching process, rough difference (gross error) points produced during matching need to be removed. In this embodiment a cross method is used to remove them and obtain the primary matching result, which yields a smaller parallax range in which to search for conjugate points. Among the target points uniformly selected on the left image, the straight line approximately fitted to the conjugate points of all target points in row i intersects the straight line approximately fitted to the conjugate points of all target points in column j at an intersection point, and the conjugate point a_ij' corresponding to the target point a_ij in row i and column j should lie near that intersection point. Therefore, in this embodiment, the distance from a conjugate point matched with a target point to the intersection point is used to judge whether the conjugate point is a rough difference point: if the distance exceeds a distance threshold, the conjugate point is identified as a rough difference point and removed. In some embodiments the distance threshold is set to 8 pixel units; when the distance from a conjugate point to the intersection point is greater than 8 pixel units, the conjugate point is identified as a rough difference point, rough difference points are eliminated iteratively in order of decreasing distance to the intersection of the two lines, and the conjugate point of each affected target point is searched for again along the epipolar line. After this operation the primary matching result is obtained, and these conjugate points are used to estimate the magnitude of the parallax in their vicinity.
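The cross method described above can be sketched as follows, assuming conjugate points are organized by the row and column of their target points; the total-least-squares line fit and the function names are illustrative assumptions:

```python
import numpy as np

def fit_line(points):
    """Fit a 2-D line a*x + b*y + c = 0 through points by total least
    squares (the normal is the direction of least variance)."""
    pts = np.asarray(points, dtype=np.float64)
    centroid = pts.mean(axis=0)
    _, _, vt = np.linalg.svd(pts - centroid)
    a, b = vt[-1]
    c = -(a * centroid[0] + b * centroid[1])
    return a, b, c

def line_intersection(l1, l2):
    """Intersection of two lines given as (a, b, c) coefficient triples."""
    a1, b1, c1 = l1
    a2, b2, c2 = l2
    m = np.array([[a1, b1], [a2, b2]])
    return np.linalg.solve(m, np.array([-c1, -c2]))

def is_gross_error(conjugate, row_line, col_line, threshold=8.0):
    """Cross method check: the conjugate of target a_ij should lie near
    the intersection of the row-i and column-j fitted lines; a distance
    beyond the threshold (8 pixel units here) marks a rough difference
    point to be removed and re-searched along the epipolar line."""
    p = line_intersection(row_line, col_line)
    return float(np.hypot(*(np.asarray(conjugate, dtype=np.float64) - p))) > threshold
```

Flagged points are dropped in order of decreasing distance and re-matched, which matches the iterative elimination described above.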
S4, extracting a plurality of characteristic points from the left image of the at least one image pair.
Feature points are distinctive locations in an image, generally corners or edges; they are salient points, and extracting feature points for image matching can further improve matching accuracy and reliability. In this embodiment, a region is determined around each target point uniformly selected on the left image, specifically a square region with a side length of 4 pixel units; at least one feature point is extracted within each square region using the Förstner operator, so that a plurality of feature points are obtained on the left image.
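A simplified sketch of the Förstner interest operator mentioned above: each pixel is scored from the gradient normal matrix N accumulated over a small window, using the weight det(N)/tr(N) and the roundness 4·det(N)/tr(N)². The window size and the plain loops are illustrative assumptions:

```python
import numpy as np

def forstner_scores(img, r=2):
    """Förstner interest values for interior pixels. For each pixel,
    N = [[sum gx^2, sum gx*gy], [sum gx*gy, sum gy^2]] over a
    (2r+1)x(2r+1) window; large weight with roundness near 1 marks a
    well-defined corner-like feature point."""
    g = np.asarray(img, dtype=np.float64)
    gy, gx = np.gradient(g)
    gxx, gyy, gxy = gx * gx, gy * gy, gx * gy
    h, w = g.shape
    weight = np.zeros_like(g)
    roundness = np.zeros_like(g)
    for y in range(r, h - r):
        for x in range(r, w - r):
            a = gxx[y - r:y + r + 1, x - r:x + r + 1].sum()
            b = gyy[y - r:y + r + 1, x - r:x + r + 1].sum()
            c = gxy[y - r:y + r + 1, x - r:x + r + 1].sum()
            det, tr = a * b - c * c, a + b
            if tr > 0:
                weight[y, x] = det / tr
                roundness[y, x] = 4.0 * det / (tr * tr)
    return weight, roundness
```

Within each 4-pixel square region, the pixel with the largest weight (subject to a roundness threshold) would be kept as that region's feature point.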
And S5, combining the primary matching result, and performing image matching on the extracted feature points to obtain a final matching result.
In this embodiment, after determining the primary matching result in step S3 and extracting a plurality of feature points in step S4, image matching is performed on the extracted plurality of feature points by further combining the primary matching result, so as to obtain a final matching result. Fig. 5 shows the final sea wave image matching result, wherein the left image is a left image acquired by a left camera, and the right image is a right image acquired by a right camera.
Specifically, in this embodiment, the position of the conjugate point corresponding to each extracted feature point is searched for by again using the epipolar-line-based pyramid image matching method of step S3, combined with the parallax information reflected by the primary matching result. Each feature point is marked on the left image, and its corresponding conjugate point is marked on the right image; finally, the marked left and right images are matched by the least squares method to obtain the final matching result. Least squares image matching accounts for the geometric and radiometric distortion of the images during matching and makes full use of the information in the image window for adjustment calculation, so that image matching can reach an accuracy of 1/10 or even 1/100 of a pixel, i.e. sub-pixel level; it is a high-accuracy image matching method. The epipolar-line-based pyramid matching result provides accurate initial conjugate point positions, which solves the problem that the iterative computation of least squares image matching fails to converge when the initial values are poor. The gray functions around a conjugate point pair should satisfy:
g_2(x, y) = h_0 + h_1 · g_1(a_0 + a_1·x + a_2·y, b_0 + b_1·x + b_2·y)

where g_2(x, y) represents the output image, g_1(x, y) represents the input image, h_0 and h_1 represent the radiometric distortion parameters, and a_0, a_1, a_2, b_0, b_1, b_2 represent the geometric distortion parameters. In some embodiments, an indirect adjustment method is adopted and the maximum correlation coefficient criterion is used to solve for the radiometric and geometric distortion parameters, so that the conjugate point can be determined accurately. The maximum correlation coefficient criterion means that iteration stops when the correlation coefficient after an iteration becomes smaller than the correlation coefficient after the previous iteration.
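A minimal sketch of evaluating the least-squares matching observation model above, resampling the input window bilinearly under the two radiometric and six geometric parameters. Only the model evaluation is shown; the adjustment that actually estimates the eight parameters is omitted, and all names are illustrative:

```python
import numpy as np

def lsm_model(g1, params, shape):
    """Evaluate g2(x, y) = h0 + h1 * g1(a0 + a1*x + a2*y, b0 + b1*x + b2*y)
    by bilinear resampling of the input window g1 over an output window of
    the given shape. Samples falling outside g1 are left at zero."""
    h0, h1, a0, a1, a2, b0, b1, b2 = params
    g1 = np.asarray(g1, dtype=np.float64)
    out = np.zeros(shape)
    for y in range(shape[0]):
        for x in range(shape[1]):
            u = a0 + a1 * x + a2 * y  # geometrically distorted column
            v = b0 + b1 * x + b2 * y  # geometrically distorted row
            u0, v0 = int(np.floor(u)), int(np.floor(v))
            if 0 <= u0 < g1.shape[1] - 1 and 0 <= v0 < g1.shape[0] - 1:
                du, dv = u - u0, v - v0
                val = ((1 - du) * (1 - dv) * g1[v0, u0]
                       + du * (1 - dv) * g1[v0, u0 + 1]
                       + (1 - du) * dv * g1[v0 + 1, u0]
                       + du * dv * g1[v0 + 1, u0 + 1])
                out[y, x] = h0 + h1 * val  # radiometric correction
    return out
```

In the full method, the residual between this model and the observed right-image window drives the adjustment that refines the eight parameters until the correlation coefficient stops improving.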
In conclusion, the sea wave image matching method solves the problems of low image matching accuracy and reliability caused by the weak texture and indistinct features of the sea surface in stereo photography sea wave measurement, and provides solid early-stage data for three-dimensional reconstruction. Meanwhile, the proposed cross method for eliminating rough difference points can eliminate most rough difference points and find the correct conjugate points, avoiding the time-consuming and laborious manual elimination of rough difference points and realizing automated image matching.
In addition, an embodiment of the present invention further provides a readable storage medium storing a program or instructions which, when executed by a processor, implement the steps of the above sea wave image matching method and achieve the same technical effects.
The embodiments of the present invention have been described above with reference to the accompanying drawings, but the present invention is not limited to the above-described embodiments, which are merely illustrative and not restrictive, and many forms may be made by those having ordinary skill in the art without departing from the spirit of the present invention and the scope of the claims, which are to be protected by the present invention.

Claims (10)

1. An image acquisition system, comprising:
A pair of cameras of the binocular type,
And a synchronous triggering unit connected with the binocular camera and used for driving synchronous image acquisition operation of the binocular camera, wherein the synchronous triggering unit comprises:
the main control board is used for outputting a square wave signal with a self-defined frequency in a timing and quantitative mode and used as an external trigger signal for synchronous shooting of the binocular camera;
the network expansion board is connected with the main control board and at least one upper computer and is used for realizing interactive communication between the at least one upper computer and the main control board;
and the optocoupler isolation driving module is used for isolating the low-voltage control circuit of the main control board from the high-voltage execution circuit of the triggering side of the binocular camera.
2. The system of claim 1, wherein:
a digital input/output pin of the main control board serves as the square wave signal output pin and, together with a GND end, is connected by wires to the low-voltage signal input end and the COM end, respectively, of the optocoupler isolation driving module;
the trigger wires of the respective binocular cameras are connected to the high-voltage signal output side of the optocoupler isolation driving module, and the common ground wires of the respective cameras are connected to the GND end of the optocoupler isolation driving module;
and the VCC end and the GND end of the optocoupler isolation driving module are connected to the V+ end and the V- end, respectively, of the direct-current switching power supply of the high-voltage execution circuit.
3. The system of claim 1, wherein:
The at least one upper computer stores a camera control program, the camera control program comprising:
the camera information display module is configured to display various hardware information and indexes of the binocular camera, and is used for checking whether a target camera is opened and checking whether the target camera is in a normal working state;
the image real-time display module is configured to display the images captured by the binocular camera in real time;
A camera parameter setting module configured to adjust various intrinsic parameters of the binocular camera to achieve the best image effect;
The camera trigger setting module is configured to select a camera shutter mode, select a trigger mode and set software trigger parameters;
the external trigger setting module is configured to control and operate the synchronous trigger unit, set external trigger parameters and send corresponding external trigger parameters to the synchronous trigger unit;
The camera acquisition operation module is configured to control the binocular camera to complete the operation of image acquisition;
and the image storage module is configured to store image data acquired by the binocular camera.
4. The system of claim 3, wherein:
The image storage module comprises a fast storage mode and a transfer mode,
pre-storing the image data acquired by the binocular camera on a solid state disk through the fast storage mode;
and transferring the pre-stored image data to a mechanical hard disk through the transfer mode.
5. A method for matching sea wave images, characterized in that the image acquisition system according to any one of claims 1-4 is adopted to acquire a sea wave image sequence, and the method comprises the following steps:
Calibrating intrinsic parameters of the binocular camera;
Selecting at least one image pair from the sea wave image sequence acquired by the image acquisition system, wherein each image pair comprises a left image and a right image;
Performing primary matching on the left image and the right image of the at least one image pair, and removing rough difference points to obtain a primary matching result;
Extracting a plurality of feature points on a left image of the at least one image pair;
and combining the primary matching result, and performing image matching on the extracted plurality of feature points to obtain a final matching result.
6. The method of claim 5, further comprising, prior to the primary matching of the left image and the right image of the at least one image pair:
Filtering the at least one image pair, comprising:
filtering the selected at least one image pair by adopting a Wallis filtering operator, wherein the filtered image is expressed as:

g_c(x, y) = [g(x, y) - m_g] · (c·s_f) / (c·s_g + (1 - c)·s_f) + b·m_f + (1 - b)·m_g

wherein g_c(x, y) represents the image generated by filtering the image g(x, y), and x, y represent the x, y coordinates; m_g and s_g are respectively the gray mean and gray variance of a neighborhood of a pixel in the image; m_f and s_f are target values of the image mean and image variance, respectively; c is the image contrast expansion constant and b is the image brightness coefficient.
7. The method of claim 5, wherein:
The pyramid image matching method based on the epipolar line is adopted to perform primary matching on the left image and the right image of the at least one image pair, and comprises the following steps:
uniformly selecting a plurality of target points on the left image;
determining the coordinates of each target point in the left camera coordinate system according to its coordinates in the pixel coordinate system;
searching for the conjugate point matched with each target point along an epipolar line during pyramid image matching;
determining the coordinates of the conjugate point matched with the target point in a right camera coordinate system and the coordinates of the conjugate point in a pixel coordinate system;
wherein, the coordinates of the conjugate point of the target point under the right camera coordinate system satisfy:
wherein the coordinates of a target point in the left camera coordinate system are a(x_a, y_a, -f), the coordinates of the target point in the pixel coordinate system are (u_a, v_a), the coordinates of the conjugate point of the target point in the right camera coordinate system are a'(x_a', y_a', -f'), and the coordinates of the conjugate point in the pixel coordinate system are (u_a', v_a'); the distortion coefficients of the left and right cameras of the binocular camera are k_1, k_1', the coordinates of the intersection points of the principal optical axes of the left and right cameras with the image sensor plane are (x_0, y_0), (x_0', y_0'), and the focal lengths of the left and right cameras are f and f', respectively.
8. The method of claim 7, wherein:
Removing the rough difference points by using a cross method, comprising:
if the distance from the conjugate point matched with a target point to an intersection point exceeds a distance threshold, identifying the conjugate point as a rough difference point and removing it; and
Searching the conjugate point of the target point along the epipolar line again;
wherein a straight line approximately fitted to the conjugate points corresponding to all target points in row i intersects a straight line approximately fitted to the conjugate points corresponding to all target points in column j to generate the intersection point, and the conjugate point a_ij' corresponding to the target point a_ij in row i and column j should be located near the intersection point.
9. The method of claim 7, wherein extracting feature points on the left image of the at least one image pair comprises:
determining an area by taking each target point of the left image as a center;
extracting at least one feature point in each region.
10. The method of claim 9, wherein:
And combining the primary matching result, performing image matching on the extracted plurality of feature points to obtain a final matching result, wherein the final matching result comprises the following steps of:
Combining the primary matching result, and searching the position of the conjugate point corresponding to each extracted characteristic point by adopting a pyramid image matching method based on a epipolar line;
Marking each characteristic point on the left image, and marking the conjugate point corresponding to the characteristic point on the right image;
and matching the left image and the right image after identification by a least square method to obtain the final matching result.
CN202410233669.9A 2024-03-01 2024-03-01 Image acquisition system and sea wave image matching method Pending CN118247344A (en)

Publications (1)

Publication Number Publication Date
CN118247344A (en) 2024-06-25

Family

ID=91554685

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202410233669.9A Pending CN118247344A (en) 2024-03-01 2024-03-01 Image acquisition system and sea wave image matching method

Country Status (1)

Country Link
CN (1) CN118247344A (en)

Similar Documents

Publication Publication Date Title
EP3248374B1 (en) Method and apparatus for multiple technology depth map acquisition and fusion
US7616885B2 (en) Single lens auto focus system for stereo image generation and method thereof
CN108269238B (en) Depth image acquisition device, depth image acquisition system and image processing method thereof
CN106097318A (en) A kind of grain volume measuring system and method
CN102901489A (en) Pavement water accumulation and ice accumulation detection method and apparatus thereof
WO2022183685A1 (en) Target detection method, electronic medium and computer storage medium
US11989827B2 (en) Method, apparatus and system for generating a three-dimensional model of a scene
CN107092905B (en) Method for positioning instrument to be identified of power inspection robot
CN114581817B (en) Method and system for intelligently detecting wave height and wind speed from offshore wave monitoring video
JP2001524211A (en) 3D object measurement system using laser
CN109141236A (en) Laser strobe dimensional visual measurement system and method based on vibration mirror scanning
CN110044266B (en) Photogrammetry system based on speckle projection
CN108416091A (en) A kind of measurement method of easy camera ground resolution and drone flying height relationship
WO2020179439A1 (en) Displacement detection method, photography instruction method, displacement detection device, and photography instruction device
CN117332370A (en) Underwater target acousto-optic panorama cooperative identification device and identification method
CN118247344A (en) Image acquisition system and sea wave image matching method
CN101980299B (en) Chessboard calibration-based camera mapping method
CN102826209A (en) Method for realizing stereo shooting of ship draft image by using one-armed wall-climbing robot
CN114184127B (en) Single-camera target-free building global displacement monitoring method
CN115830225A (en) Underwater RGB-D three-dimensional reconstruction system and method
CN110018436B (en) Power tester and power testing method based on image recognition technology
Qu et al. Computer vision-based 3D coordinate acquisition of surface feature points of building structures
Haussmann et al. Streak detection of space debris by a passive optical sensor
CN111780683A (en) Portable scanning system and method of use
CN112540364A (en) Time delay measuring method, measuring device and measuring system of TOF depth camera

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination