CN106683043B - Parallel image splicing method and device of multi-channel optical detection system - Google Patents

Publication number
CN106683043B
Authority
CN
China
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201510762866.0A
Other languages
Chinese (zh)
Other versions
CN106683043A (en)
Inventor
曹扬
胡荣
吴京辉
邵光征
周武
魏正宜
杨柳
Current Assignee
China Aerospace Ke Gong Group 4th Research Institute's Command Automation Technical Research And Application Center
Original Assignee
China Aerospace Ke Gong Group 4th Research Institute's Command Automation Technical Research And Application Center
Priority date
Filing date
Publication date
Application filed by China Aerospace Ke Gong Group 4th Research Institute's Command Automation Technical Research And Application Center
Priority to CN201510762866.0A
Publication of CN106683043A
Application granted
Publication of CN106683043B
Legal status: Active
Anticipated expiration

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00: Geometric image transformations in the plane of the image
    • G06T3/40: Scaling of whole images or parts thereof, e.g. expanding or contracting
    • G06T3/4038: Image mosaicing, e.g. composing plane images from plane sub-images
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00: Indexing scheme for image analysis or image enhancement
    • G06T2207/10: Image acquisition modality
    • G06T2207/10052: Images from lightfield camera


Abstract

The application provides a parallel image stitching method for a multi-channel optical detection system, belonging to the field of image processing, which comprises the following steps: preprocessing the collected images according to a set down-sampling ratio to obtain down-sampled images to be stitched; selecting a reference image from the images to be stitched, and setting the stitching order of the images; extracting features of adjacent images to be stitched according to the set order, screening them, and determining the preferred feature point pair information and the transformation matrix of each group of images corresponding to that information; calculating the transformation matrix of each image to be stitched relative to the reference image according to the set order; and stitching all the images according to the transformation matrix of each image relative to the reference image. The method allows the images to be stitched onto the reference image in parallel, which further improves stitching efficiency.

Description

Parallel image splicing method and device of multi-channel optical detection system
Technical Field
The present application relates to the field of image processing, and in particular, to a parallel image stitching method and apparatus for a multi-channel optical detection system.
Background
In many fields, both military and civil, there is a strong demand for optical detection systems with a large field of view and high resolution. Traditional visible-light reconnaissance and detection devices are limited by the available technology and by physical principles: high angular resolution and a large field angle cannot be achieved simultaneously, so reliable wide-area target detection and identification cannot be realized. To address this, researchers have developed multi-channel optical detection systems based on the bionic compound-eye technique and have obtained promising results.
A multi-channel optical detection system offers a large field angle, high resolution, and other advantages, and can overcome the above limitations of typical optical reconnaissance systems to a certain extent. Such a system mainly comprises several image acquisition devices (such as high-resolution cameras) that photograph a target area, an image stitching device, and an image display device. The image stitching device, which combines the images acquired over multiple channels, is the core component of the system: in practical applications it must merge the multiple images acquired by the multi-channel optics into one large image and display that image at a certain frame rate.
Image stitching based on invariant features is a well-known, reliable approach. In practice, however, existing methods still have problems, and both their computational efficiency and their stitching accuracy need further improvement. A typical invariant-feature method in the prior art stitches serially: the second image is stitched to the first according to the stitching order, then the third image is stitched to the composite of the first two, and so on, so stitching efficiency is low.
Therefore, one technical problem that needs to be urgently solved by those skilled in the art is: how to improve the image splicing efficiency.
Disclosure of Invention
The technical problem to be solved by the application is to provide an image splicing method of a multi-channel optical detection system, which can improve the efficiency of image splicing.
In order to solve the above problem, the present application discloses a parallel image stitching method for a multi-channel optical detection system, which includes: preprocessing the collected images according to a set down-sampling proportion to obtain down-sampled images to be spliced; selecting a reference image from the images to be spliced, and setting the splicing sequence of the images to be spliced; respectively extracting the characteristics of the adjacent images to be spliced according to a set splicing sequence to obtain candidate characteristic point pair information of each group of images to be spliced; screening the candidate characteristic point pair information, and determining preferred characteristic point pair information and a transformation matrix of each group of images to be spliced corresponding to the preferred characteristic point pair information; calculating a transformation matrix of each image to be spliced relative to the reference image according to a set splicing sequence; and splicing all the images to be spliced according to the transformation matrix of each image to be spliced relative to the reference image.
Selecting a reference image from the images to be spliced, and setting the splicing sequence of the images to be spliced, wherein the method comprises the following steps: and selecting the image to be spliced corresponding to the center of the field of view of the multi-channel optical detection system as a reference image, and setting the splicing sequence on the two sides respectively according to the sequence that the distance between the image to be spliced and the reference image is from small to large by taking the reference image as the center.
Further, the calculating a transformation matrix of each image to be stitched relative to the reference image according to the set stitching order includes: and according to a set splicing sequence, multiplying transformation matrixes corresponding to all the images to be spliced which are located at the same side of the reference image and are prior to the images to be spliced in the splicing sequence by the transformation matrixes of the images to be spliced to obtain the transformation matrix of the images to be spliced relative to the reference image.
Further, in another embodiment, the selecting a reference image and setting the stitching order of the images to be stitched includes: selecting the image to be stitched corresponding to the center of the field of view of the multi-channel optical detection system as the reference image, according to the field-of-view positions of the images acquired by the image acquisition devices of the system, and dividing the images to be stitched into rows centered on the reference image; selecting, within each row, the image closest to the reference image as that row's reference image; setting the stitching order of the images on both sides of each row reference image in order of increasing distance from it; and setting the stitching order of the row reference images on both sides of the reference image in order of increasing distance from it.
Correspondingly, the calculating of the transformation matrix of each image to be stitched relative to the reference image according to the set stitching order comprises: multiplying together the transformation matrices of all images that lie in the same row as the current image, on the same side of that row's reference image, and earlier in the stitching order, together with the transformation matrix of the current image, to obtain the transformation matrix of the current image relative to its row reference image; and then multiplying the transformation matrices of all row reference images that lie on the same side of the reference image and earlier in the stitching order with the transformation matrix of the current image relative to its row reference image, to obtain the transformation matrix of the current image relative to the reference image.
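The running products described above can be sketched in code. This is a minimal illustration (function and variable names are my own, not from the application), assuming each pairwise 3 x 3 matrix maps an image onto its predecessor in the stitching order along one side of the reference image:

```python
import numpy as np

def chain_to_reference(pairwise, order):
    """Compose pairwise homographies into per-image transforms
    relative to the reference image.

    pairwise[img] is the 3x3 matrix mapping image `img` onto its
    predecessor in `order`; order[0] is the reference image.
    Returns a dict: image id -> 3x3 transform into the reference frame.
    """
    to_ref = {order[0]: np.eye(3)}
    for prev, cur in zip(order, order[1:]):
        # the transform of `cur` relative to the reference is the
        # running product of all earlier pairwise transforms on this side
        m = to_ref[prev] @ pairwise[cur]
        to_ref[cur] = m / m[2, 2]  # keep the matrix normalized
    return to_ref
```

For a two-level (row-based) layout, the same helper can be applied first within each row and then along the column of row reference images.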
In another embodiment of the present application, the method further comprises: determining the stitching region of the images to be stitched as the overlapping region of adjacent images. In that case, the feature extraction step extracts features only from the stitching regions of adjacent images to be stitched, according to the set stitching order, to obtain the candidate feature point pair information of each group of images.
In yet another embodiment of the present application, the method further comprises: and automatically setting the down-sampling proportion according to the time for splicing one frame of image by the multi-channel optical detection system and the set display frame rate.
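The application does not give a formula for this automatic setting. The sketch below assumes, hypothetically, that stitching time scales roughly with pixel count (i.e. falls as 1/T² with the down-sampling multiple T) and picks the smallest T whose per-frame stitching time fits within the display period; the function name and cap are my own:

```python
def auto_downsample_ratio(stitch_time_full, target_fps, max_ratio=8):
    """Smallest integer down-sampling ratio T such that one frame can
    be stitched within the display period 1/target_fps.

    stitch_time_full: seconds to stitch one frame at full resolution.
    Assumes stitching time ~ stitch_time_full / T**2 (time proportional
    to pixel count); this rule is one plausible choice, not the
    application's own formula.
    """
    period = 1.0 / target_fps
    for t in range(1, max_ratio + 1):
        if stitch_time_full / t ** 2 <= period:
            return t
    return max_ratio  # give up and use the largest allowed ratio
```

With a hypothetical 3.2 s full-resolution stitch time and a 5 fps display requirement, this rule also yields T = 4, matching the worked example in the description.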
Correspondingly, the present application also discloses a parallel image stitching apparatus for a multi-channel optical detection system, comprising: a down-sampling module for preprocessing the acquired images according to a set down-sampling ratio to obtain down-sampled images to be stitched; a stitching order setting module for selecting a reference image from the images to be stitched and setting their stitching order; a feature extraction module for extracting features of adjacent images to be stitched according to the set order to obtain candidate feature point pair information for each group of images; a feature screening module for screening the candidate feature point pair information and determining the preferred feature point pair information and the transformation matrix of each group of images corresponding to it; and a stitching parameter calculation module for calculating the transformation matrix of each image to be stitched relative to the reference image according to the set order, all the images then being stitched according to the transformation matrix of each image relative to the reference image.
Further, the splicing sequence setting module is specifically configured to select an image to be spliced corresponding to the center of the field of view of the multi-channel optical detection system as a reference image, and set the splicing sequence on both sides of the reference image in the order from small to large according to the distance between the image to be spliced and the reference image.
Further, the stitching parameter calculation module is specifically configured to, according to the set stitching order, multiply together the transformation matrices of all images to be stitched that lie on the same side of the reference image and earlier in the stitching order than the current image, together with the transformation matrix of the current image, to obtain the transformation matrix of the current image relative to the reference image.
In another embodiment of the present application, the stitching order setting module is specifically configured to: select the image to be stitched corresponding to the center of the field of view of the multi-channel optical detection system as the reference image, according to the field-of-view positions of the images acquired by the image acquisition devices, and divide the images to be stitched into rows centered on the reference image; select, within each row, the image closest to the reference image as that row's reference image; set the stitching order of the images on both sides of each row reference image in order of increasing distance from it; and set the stitching order of the row reference images on both sides of the reference image in order of increasing distance from it.
Further, the stitching parameter calculation module is specifically configured to: according to the set stitching order, multiply together the transformation matrices of all images that lie in the same row as the current image, on the same side of that row's reference image, and earlier in the stitching order, together with the transformation matrix of the current image, to obtain the transformation matrix of the current image relative to its row reference image; and multiply the transformation matrices of all row reference images that lie on the same side of the reference image and earlier in the stitching order with the transformation matrix of the current image relative to its row reference image, to obtain the transformation matrix of the current image relative to the reference image.
Compared with the prior art, the present application has the following advantages: once the stitching order of the images is set, the feature point pairs and transformation matrices of the images to be stitched can be extracted and computed separately; because feature extraction and transformation matrix computation for non-adjacent image pairs have no data dependency, they lend themselves to parallel computation, which improves stitching efficiency. Meanwhile, the transfer between image coordinate systems is realized through the running product of the transformation matrices, yielding every image's transformation matrix relative to the reference image; adjacent images therefore need not be stitched one after another during the actual stitching, all images can be stitched onto the reference image in parallel, and stitching efficiency is further improved.
Drawings
FIG. 1 is a schematic flowchart of a first embodiment of a parallel image stitching method of a multi-channel optical detection system according to the present application;
FIG. 2 is a schematic diagram of the distribution of image positions acquired by the multi-channel optical detection system of the present application;
FIG. 3 is another schematic diagram of the distribution of image positions acquired by the multi-channel optical detection system of the present application;
FIG. 4 is a schematic flowchart of a fourth embodiment of a parallel image stitching method of the multi-channel optical detection system of the present application;
FIG. 5 is a schematic structural diagram of a parallel image stitching apparatus disclosed in the fifth embodiment of the present application;
FIG. 6 is a schematic structural diagram of a parallel image stitching apparatus according to still another embodiment of the present application.
Detailed Description
In order to make the aforementioned objects, features and advantages of the present application more comprehensible, the present application is described in further detail with reference to the accompanying drawings and the detailed description.
The first embodiment is as follows:
referring to fig. 1, fig. 1 shows a schematic flowchart of a first embodiment of a parallel image stitching method of a multi-channel optical detection system according to the present application, where the parallel image stitching method includes the following steps:
step 100, preprocessing the collected image according to a set down-sampling proportion to obtain a down-sampled image to be spliced;
step 110, selecting a reference image from the images to be spliced, and setting a splicing sequence of the images to be spliced;
step 130, respectively extracting the features of the adjacent images to be spliced according to a set splicing sequence to obtain candidate feature point pair information of each group of images to be spliced;
step 140, screening the candidate characteristic point pair information, and determining preferred characteristic point pair information and a transformation matrix of each group of images to be stitched corresponding to the preferred characteristic point pair information;
step 150, calculating a transformation matrix of each image to be spliced relative to the reference image according to a set splicing sequence;
and step 160, splicing all the images to be spliced according to the transformation matrix of each image to be spliced relative to the reference image.
Wherein the step 100 specifically comprises:
before image splicing, denoising image frames acquired by each channel of the multi-channel optical detection system to remove interference noise. The optical image may be denoised by the prior art, which is not described herein.
Then, in order to reduce the amount of stitching computation performed by the system while ensuring that the processed images are not distorted, the denoised images are down-sampled to obtain the images to be stitched. In a specific implementation, the images acquired by the multi-channel optical detection system can be down-sampled according to a preset down-sampling ratio T. This ratio, i.e. the down-sampling multiple T, is calculated from the required output display frame rate, the computing performance of the multi-channel optical detection system, and the resolution of the acquired images, and can be preset. For example, if the user requires a high display frame rate and the computing performance of the system is only average, the acquired images must be down-sampled by a large multiple. Taking as an example a system whose image acquisition devices are seven 5-megapixel camera channels, with a required output display frame rate of 5 frames per second and an HP Z820 workstation (E5-2690 processor, ECC DDR3 1600 MHz memory) as the computing device, experience shows that the acquired images should be down-sampled by a factor of 4; that is, each side length of the down-sampled image becomes 1/4 of that of the original, and a single-channel image retains only about 300,000 pixels. Down-sampling the acquired images effectively reduces the computation required for stitching and display.
If the numbers of pixels of the original acquired image in the length and width directions are X and Y respectively, then after down-sampling the corresponding pixel counts become ceil(X/T) and ceil(Y/T), where ceil is the ceiling function (rounding up to the nearest integer).
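As a small numerical check of the ceil formula (2592 x 1944 is one common 5-megapixel resolution, used here only as an assumed example):

```python
import math

def downsampled_size(x, y, t):
    """Pixel counts of the image after down-sampling by an integer
    factor t, using the ceiling function ceil as in the description."""
    return math.ceil(x / t), math.ceil(y / t)

# e.g. a 2592 x 1944 frame at T = 4 keeps 648 x 486 = ~315,000 pixels,
# consistent with "about 300,000 pixels" per channel in the text
```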
In step 110, to improve stitching efficiency, a reference image is selected, and the stitching information of every other image to be stitched is then computed relative to that reference image, so that image stitching can proceed in parallel. The reference image may be set manually. The stitching order of the images may also be set manually from experience; for example, it can follow the field-of-view positions of the images acquired by the acquisition devices of the multi-channel optical detection system from left to right and from top to bottom. Suppose the system includes five image acquisition channels; as shown in fig. 2, the images they acquire are P1, P2, P3, P4 and P5 respectively. The stitching order could then be set as: P1 is the reference image; P2 is stitched to P1, P3 to P2, P4 to P3, and P5 to P4. Preferably, however, the stitching order of each image is set from the center outward according to the field-of-view position of the image acquired by each acquisition device; in a specific implementation the field-of-view position can be determined from the installation position of each image acquisition device.
In that case, the image to be stitched corresponding to the center of the field of view of the multi-channel optical detection system is selected as the reference image, and, with the reference image as the center, the stitching order on each side is set in order of increasing distance from the reference image: the smaller the distance between an image and the reference image, the earlier it is stitched. Taking fig. 2 as an example, P3 at the center of the field of view is selected as the reference image, and the stitching order set on the two sides is: P4 to P3, then P5 to P4; P2 to P3, then P1 to P2. The stitching order can be marked with stitching sequence numbers, and the target image to which a given image is stitched is indexed by its sequence number, i.e. an identifier is recorded for each image. For example: the sequence number of P3 is 0, and it needs no stitching; the sequence number of P4 is 1, and it is stitched to P3, as shown in the following table:
Image to be stitched       P1  P2  P3  P4  P5
Stitching sequence number   4   2   0   1   3
Adjacent image              P2  P3   -  P3  P4
In a specific implementation, for convenience of calculation, the reference image is given the smallest sequence number; with the reference image as the center, the images on one side are given odd sequence numbers and those on the other side even sequence numbers, numbered in order of increasing distance from the reference image, and images with smaller numbers are stitched earlier. This is only one way of representing the stitching order, given by way of example; the present application does not limit how the stitching order is represented.
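One possible encoding of this odd/even numbering scheme is sketched below (the helper name and list layout are assumptions, not from the application; images are assumed to be listed in field-of-view order with the reference image somewhere in the middle):

```python
def stitch_order_numbers(images, ref_index):
    """Assign stitching sequence numbers around a central reference
    image: the reference gets 0, one side gets the odd numbers and the
    other side the even numbers, with smaller numbers for images closer
    to the reference (smaller numbers are stitched earlier)."""
    order = {images[ref_index]: 0}
    right = images[ref_index + 1:]       # e.g. P4, P5
    left = images[:ref_index][::-1]      # e.g. P2, P1 (nearest first)
    for k, img in enumerate(right):
        order[img] = 2 * k + 1           # 1, 3, 5, ...
    for k, img in enumerate(left):
        order[img] = 2 * k + 2           # 2, 4, 6, ...
    return order
```

For the five images of fig. 2 with P3 as reference, this reproduces the sequence numbers in the table above.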
An invariant-feature stitching method extracts invariant features from the images to be stitched, obtains the feature descriptions of the candidate feature points together with their coordinates on the respective images, and determines the feature point pairs used for stitching from the matching degree of the feature descriptions, thereby completing the stitching of two images. Therefore, in step 130, the features of adjacent images to be stitched are extracted according to the stitching order set in step 110, and the extracted features are then matched to obtain the candidate feature point pair information of each group of images. This embodiment takes the extraction of candidate feature point pair information for P3 and P4 as an example.
Many methods exist for extracting the invariant image features on which stitching is based, such as the Scale-Invariant Feature Transform (SIFT) and SURF (Speeded-Up Robust Features); the "feature descriptions" obtained by different extraction methods differ somewhat, so the corresponding feature-matching methods also differ slightly. The present application does not limit the specific method used for extracting and matching invariant features. In this embodiment, SIFT is taken as the example feature extraction method, and the "nearest distance to next-nearest distance" ratio test as the example matching method.
Let i denote the image to be stitched P3 and j denote P4. Feature extraction is performed on the stitching regions of images i and j with the SIFT algorithm, yielding the feature descriptions of the candidate feature points and their coordinates on the images.
Assume that the stitching region of image i is i' and that of image j is j'. Applying the SIFT algorithm to i' and j' gives the candidate feature point description sets:

Feature_i' = {F_1, F_2, ..., F_P},  Feature_j' = {M_1, M_2, ..., M_Q}    (1)

where Feature_i' and Feature_j' are the feature description sets of the candidate feature points extracted from i' and j' respectively; F_p is the feature description of the p-th candidate point extracted from i', a 128-dimensional vector F_p = (f_p1, f_p2, ..., f_p128); M_q is the feature description of the q-th candidate point extracted from j', a 128-dimensional vector M_q = (m_q1, m_q2, ..., m_q128); and P and Q are the numbers of candidate feature points extracted from i' and j' respectively, both positive integers.
And then, performing feature matching operation on the extracted candidate feature points to obtain candidate feature point pairs.
In this embodiment, the "nearest distance to next-nearest distance" method is adopted: candidate feature points are matched on the basis of the computed feature descriptions to obtain candidate feature point pairs.
The distance between each feature description in Feature_i' and each feature description in Feature_j' is computed. Taking F_p and M_q as an example, the distance between them is

Dis(F_p, M_q) = sqrt( sum_{k=1}^{128} (f_pk - m_qk)^2 )    (2)

where Dis(F_p, M_q) is the distance between the feature descriptions F_p and M_q.

For a feature description F_p in Feature_i', its distance to every feature description in Feature_j' is computed. Let the description in Feature_j' closest to F_p be M_H (1 ≤ H ≤ Q, H a positive integer), and the next closest be M_S (1 ≤ S ≤ Q, S a positive integer). If the ratio of the closest distance to the next-closest distance is at most a set threshold, e.g. 0.7, i.e. Dis(F_p, M_H)/Dis(F_p, M_S) ≤ 0.7, then F_p and M_H are a matched pair of feature descriptions; in that case the corresponding pixel points on the stitching region i' of image i and the stitching region j' of image j form one candidate feature point pair. The threshold may be any positive number less than 1; a smaller value may be chosen if a higher confidence rate is desired.
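A minimal sketch of this ratio test with Euclidean distances as in formula (2), assuming the descriptors are stored as NumPy arrays of shape (P, 128) and (Q, 128) (function name and array layout are my own):

```python
import numpy as np

def match_ratio_test(feat_i, feat_j, ratio=0.7):
    """Nearest/next-nearest matching of SIFT-style descriptors.

    feat_i, feat_j: arrays of shape (P, 128) and (Q, 128), Q >= 2.
    Returns index pairs (p, q) whose nearest-neighbour distance is at
    most `ratio` times the next-nearest distance.
    """
    pairs = []
    for p, f in enumerate(feat_i):
        d = np.sqrt(((feat_j - f) ** 2).sum(axis=1))  # formula (2)
        h, s = np.argsort(d)[:2]                      # nearest, next-nearest
        if d[h] <= ratio * d[s]:
            pairs.append((p, h))
    return pairs
```

A brute-force scan like this is O(P·Q); a k-d tree or an OpenCV matcher would be used for speed in practice.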
All candidate feature point pairs on i 'and j' are obtained based on the above method, and all candidate feature point pairs of the image i and the image j are obtained.
The candidate feature point pair sets of the images to be stitched are obtained by the above method. Taking the images of fig. 2 as an example, step 130 yields 4 candidate feature point pair sets: the set F'_1->0 of P4 and the reference image P3; the set F'_2->0 of P2 and P3; the set F'_3->1 of P5 and P4; and the set F'_4->2 of P1 and P2. The elements of each set comprise the feature descriptions of the candidate feature points and their coordinates on the respective images.
In this embodiment, the stitching region of an image to be stitched may be the whole image or a partial region of it. If the stitching region is a partial region of the image to be stitched, the size of the region needs to be set.
In step 140, to further reduce the stitching computation and improve the stitching accuracy, the candidate feature point pair information is screened to determine the preferred feature point pair information and the transformation matrix of each group of images to be stitched corresponding to that preferred feature point pair information. In this embodiment, the candidate feature point pairs are screened to obtain the preferred feature point pairs and the corresponding preferred transformation matrix.
In a specific implementation, a Random Sample Consensus (RANSAC) algorithm is used to obtain the preferred feature point pairs and the transformation matrix. The specific method for determining the preferred feature point pair information and the transformation matrix is described below, again taking images i and j as an example.
The coordinates of a candidate feature point pair on image i and image j are (x, y) and (x', y'), respectively, and the following transformation relationship is satisfied between (x, y) and (x', y'):

[x, y, 1]^T = Matrix · [x', y', 1]^T    (3)

Matrix is the transformation matrix between image i and image j, a 3 × 3 matrix, and each feature point pair yields an equation of the form of formula (3).
Firstly, randomly selecting 4 pairs of candidate feature point pairs from the candidate feature point pairs of the image i and the image j, and calculating a transformation Matrix between the image i and the image j by using a formula (3).
Then, an inlier set is solved based on the calculated transformation Matrix.
(Nx', Ny') and (Mx, My) are the coordinates of a candidate feature point pair on image j and image i, respectively. First, the actual coordinates corresponding to (Nx', Ny') in image i are solved using the computed transformation Matrix:

[Nx, Ny, 1]^T = Matrix · [Nx', Ny', 1]^T    (4)
where (Nx, Ny) are the real coordinates corresponding to (Nx', Ny') in image i. The Euclidean distance between (Nx, Ny) and (Mx, My) is calculated and recorded as d_1.
Similarly, the real coordinates corresponding to (Mx, My) in image j are solved:

[Mx', My', 1]^T = Matrix^{-1} · [Mx, My, 1]^T    (5)
where Matrix^{-1} is the inverse of the transformation Matrix and (Mx', My') are the real coordinates corresponding to (Mx, My) in image j. The Euclidean distance between (Mx', My') and (Nx', Ny') is calculated and recorded as d_2. The sum of d_1 and d_2 is used as the criterion for whether the candidate feature point pair is an inlier: when (d_1 + d_2) ≤ Th, the candidate feature point pair is considered an inlier. Th is set to 1, an empirical value that can be adjusted as required.
Based on the transformation Matrix, the above method is used to judge which of all the candidate feature point pairs of image i and image j are inliers, their number is counted, and the set formed by the inliers is recorded.
Finally, the preferred transformation matrix and the preferred feature point pairs are solved. A cycle number SN is set, for example SN = 1000. In each cycle, 4 pairs are selected from the candidate feature point pairs of image i and image j, the transformation Matrix between image i and image j is calculated using formula (3), and the inlier set is solved based on that Matrix. The transformation Matrix corresponding to the inlier set with the largest number of elements is the preferred transformation Matrix, and the candidate feature point pairs in that inlier set are the preferred feature point pairs. The cycle number SN is an empirical value determined according to requirements; preferably, the cycle number SN is greater than 4/4 of the number of pixel points in the stitching region.
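The RANSAC loop above can be sketched as follows with NumPy. This is an illustrative sketch, not the patent's code: the helper names are hypothetical, and `fit_homography` solves formula (3) by a direct linear transform, which is one common solver the patent does not itself specify.

```python
import numpy as np

def apply_h(m, pts):
    """Apply a 3x3 transformation to (N, 2) points, with homogeneous division."""
    q = np.c_[pts, np.ones(len(pts))] @ m.T
    return q[:, :2] / q[:, 2:3]

def fit_homography(dst_i, src_j):
    """Fit Matrix of formula (3) from point pairs ((x, y) on image i, (x', y') on j)."""
    rows = []
    for (x, y), (xp, yp) in zip(dst_i, src_j):
        rows.append([xp, yp, 1, 0, 0, 0, -x * xp, -x * yp, -x])
        rows.append([0, 0, 0, xp, yp, 1, -y * xp, -y * yp, -y])
    _, _, vt = np.linalg.svd(np.asarray(rows, float))
    h = vt[-1].reshape(3, 3)
    return None if abs(h[2, 2]) < 1e-12 else h / h[2, 2]

def ransac_homography(pts_i, pts_j, sn=1000, th=1.0, rng=None):
    """SN cycles: sample 4 candidate pairs, fit Matrix, count inliers with the
    (d1 + d2) <= Th test; keep the Matrix whose inlier set is largest."""
    rng = np.random.default_rng(rng)
    n = len(pts_i)
    best_m, best_inliers = None, np.array([], dtype=int)
    for _ in range(sn):
        sample = rng.choice(n, size=4, replace=False)
        m = fit_homography(pts_i[sample], pts_j[sample])
        if m is None or abs(np.linalg.det(m)) < 1e-12:
            continue                                            # degenerate sample
        d1 = np.linalg.norm(apply_h(m, pts_j) - pts_i, axis=1)                 # eq. (4)
        d2 = np.linalg.norm(apply_h(np.linalg.inv(m), pts_i) - pts_j, axis=1)  # eq. (5)
        inliers = np.where(d1 + d2 <= th)[0]
        if len(inliers) > len(best_inliers):
            best_m, best_inliers = m, inliers
    return best_m, best_inliers
```

The symmetric forward/backward error d_1 + d_2 mirrors the two projections of formulas (4) and (5), so a pair is an inlier only if the model explains it in both directions.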
Through step 140, the preferred feature point pair sets of all adjacent images to be stitched and the transformation matrix of each group of images to be stitched can be obtained. Taking the images to be stitched shown in fig. 2 as an example, step 140 screens the candidate feature point pair sets F'_{1->0}, F'_{2->0}, F'_{3->1}, F'_{4->2} to obtain 4 preferred feature point pair sets F_{1->0}, F_{2->0}, F_{3->1}, F_{4->2} and 4 transformation matrices Matrix_{1->0}, Matrix_{2->0}, Matrix_{3->1}, Matrix_{4->2}, corresponding respectively to: the stitching parameters of P4 and the reference image P3, of P2 and the reference image P3, of P5 and P4, and of P1 and P2. The subscripts, e.g. 2->0, indicate the stitching sequence numbers and the stitching relationship of the stitched images.
In step 150, according to the set stitching order, the transformation matrices of all the images to be stitched that lie on the same side of the reference image as the current image to be stitched and precede it in the stitching order are multiplied with the transformation matrix of the current image, giving the transformation matrix of the current image relative to the reference image. For example, if Matrix_{A->A-1} is the transformation matrix of the image with stitching sequence number A relative to the image with stitching sequence number A-1, then the transformation matrix of the image with sequence number A relative to the reference image is: Matrix_{A->0} = Matrix_{1->0} × ... × Matrix_{A-1->A-2} × Matrix_{A->A-1}, where Matrix_{1->0} is the transformation matrix, relative to the reference image, of the image to be stitched that is closest to the reference image (i.e. stitched first). The images to be stitched shown in fig. 2 are still used as an example. No image on the same side of the reference image as P4 precedes P4 in the stitching order, so the transformation matrix of image P4 relative to the reference image P3 is Matrix_{1->0}. Image P4 precedes P5 in the stitching order and lies on the same side of the reference image, so the transformation matrix of image P5 relative to the reference image P3 is the product of the transformation matrices of P4 and P5, i.e. Matrix_{3->0} = Matrix_{1->0} × Matrix_{3->1}. Likewise, no image on the same side of the reference image as P2 precedes P2 in the stitching order, so the transformation matrix of image P2 relative to the reference image P3 is Matrix_{2->0}; image P2 precedes P1 and lies on the same side, so the transformation matrix of image P1 relative to the reference image P3 is Matrix_{4->0} = Matrix_{2->0} × Matrix_{4->2}.
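The continued product of step 150 can be sketched as follows; the function name is illustrative, and simple translations stand in for general 3 × 3 transformation matrices.

```python
import numpy as np

def matrix_to_reference(chain):
    """Given chain = [Matrix_{1->0}, Matrix_{2->1}, ..., Matrix_{A->A-1}] for one
    side of the reference image, return Matrix_{A->0}, the transformation of the
    image with stitching sequence number A relative to the reference image."""
    m = np.eye(3)
    for step in chain:
        m = m @ step    # accumulate Matrix_{1->0} x Matrix_{2->1} x ...
    return m
```

Because matrix products do not commute, the chain keeps the first-stitched image's matrix leftmost, matching Matrix_{A->0} = Matrix_{1->0} × ... × Matrix_{A->A-1}.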
In the step 160, all the images to be stitched are stitched to the reference image according to the transformation matrix of each image to be stitched relative to the reference image, so as to obtain a final stitched image.
The embodiment of the application preprocesses the collected image according to the set down-sampling proportion to obtain the down-sampled image to be spliced; selecting a reference image from the images to be spliced, and then setting the splicing sequence of the images to be spliced; respectively extracting the characteristics of the adjacent images to be spliced according to a set splicing sequence to obtain candidate characteristic point pair information of each group of images to be spliced; screening the candidate characteristic point pair information, and determining preferred characteristic point pair information and a transformation matrix of each group of images to be spliced corresponding to the preferred characteristic point pair information; then, calculating a transformation matrix of each image to be spliced relative to the reference image according to a set splicing sequence; and finally, splicing all the images to be spliced according to the transformation matrix of each image to be spliced relative to the reference image to obtain a final spliced image.
After the stitching order of the images to be stitched is set, the feature point pairs and the transformation matrices of the adjacent images to be stitched can be extracted and calculated separately; the feature extraction and transformation matrix calculation for non-adjacent images to be stitched have no dependency on one another, which facilitates parallel computation and improves stitching efficiency. Meanwhile, the transfer of image coordinate systems is realized through the continued product of transformation matrices, and the transformation matrices of all the other images relative to the reference image are calculated; during the actual stitching, adjacent images need not be stitched in sequence, and all the images to be stitched can be stitched onto the reference image in parallel, which further improves stitching efficiency.
Example two:
in a preferred embodiment of the present application, the selecting a reference image from the images to be stitched, and setting a stitching order of the images to be stitched further includes: selecting an image to be spliced corresponding to the center of the field of view of the multi-channel optical detection system as a reference image according to the field of view position of the image to be spliced acquired by an image acquisition device in the multi-channel optical detection system, and dividing the image to be spliced into lines by taking the reference image as the center; selecting the image to be spliced which is closest to the reference image from each line of images as the line reference image of the line; according to the sequence that the distance between the image to be spliced and the line reference image is from small to large, the splicing sequence of the image to be spliced is respectively arranged on the two sides, and the image to be spliced with the smaller distance from the reference image is spliced firstly; according to the sequence that the distance between the line reference image and the reference image is from small to large, the splicing sequence of the line reference images is respectively arranged on the two sides, and the line reference images with smaller distances from the reference image are spliced first. Fig. 3 is a schematic diagram of images acquired by a multi-channel optical detection system, as shown in fig. 3, 8 images to be stitched are selected, an image P3 to be stitched corresponding to the center of the field of view of the multi-channel optical detection system is selected as a reference image, and the image to be stitched is divided into 4 lines, L1-L4, by taking the reference image as the center. Wherein, the line L1 includes P4, the line L2 includes P2, P3, P5, the line L3 includes P1, P6, P7, the line L4 includes P8. 
The image to be stitched closest to the reference image, P4, is selected from the line-L1 images as the line reference image of that line; the image closest to the reference image, P3, is selected from the line-L2 images as its line reference image; the image closest to the reference image, P6, is selected from the line-L3 images as its line reference image; and the image closest to the reference image, P8, is selected from the line-L4 images as its line reference image. Taking the line-L3 images to be stitched as an example, the stitching order of the images to be stitched is set on both sides in order of increasing distance from the line reference image, that is: P7 is stitched to P6, and P1 is stitched to P6. The stitching order of the images to be stitched can be marked by setting stitching sequence numbers, as shown in the following table:
Image to be stitched             P1    P6    P7
Stitching sequence number        2     0     1
Adjacent image to be stitched    P6    -     P6
The stitching order of the line reference images is set on both sides in order of increasing distance between the line reference image and the reference image: P4 is stitched to P3, P6 is stitched to P3, and P8 is stitched to P6. The stitching order of these images to be stitched can be marked by setting stitching sequence numbers, as shown in the following table:
Image to be stitched             P4    P3    P6    P8
Stitching sequence number        10    0     11    13
Adjacent image to be stitched    P3    -     P3    P6
The method for extracting the features of the image to be stitched in the row and the row reference image and generating the transformation matrix refer to step 130 and step 140 in the first embodiment, which are not described herein again.
Calculating the transformation matrix of each image to be stitched relative to the reference image according to the set stitching order further comprises: according to the set stitching order, multiplying the transformation matrices of all the images to be stitched that lie on the same side of the line reference image of the current line as the current image to be stitched and precede it in the stitching order with the transformation matrix of the current image, to obtain the transformation matrix of the current image relative to the line reference image of its line; then multiplying the transformation matrices of all the line reference images that lie on the same side of the reference image as the line reference image of that line and precede it in the stitching order with the transformation matrix of the current image relative to its line reference image, to obtain the transformation matrix of the current image to be stitched relative to the reference image. For convenience of description, in this embodiment the stitching sequence number of the reference image is 0; the stitching sequence numbers of the other images to be stitched within a line are odd (e.g. 1, 3, 5, ...) on one side of the line reference image and even (e.g. 2, 4, 6, ..., A-2, A, ...) on the other; and the stitching sequence numbers of the line reference images are odd (e.g. 11, 13, 15, ...) on one side of the reference image and even (e.g. 12, 14, 16, ..., (B-2), (B), ...) on the other. On this basis, the calculation of the stitching matrix of the current image to be stitched relative to the reference image is described below.
The transformation matrix sequence of the images to be stitched in each line is {Matrix_{A->A-2}}, and the transformation matrix sequence of the line reference images is {Matrix_{(B)->(B-2)}}, where A, A-2, (B) and (B-2) are stitching sequence numbers; Matrix_{A->A-2} is the transformation matrix required to stitch the image with stitching sequence number A onto the image with stitching sequence number A-2, and Matrix_{(B)->(B-2)} is the transformation matrix required to stitch the line reference image with stitching sequence number (B) onto the line reference image with stitching sequence number (B-2). After step 140 obtains the sequences {Matrix_{A->A-2}} and {Matrix_{(B)->(B-2)}}, the transformation matrix of the image with stitching sequence number A in a given line relative to the reference image is calculated by matrix multiplication according to the stitching order, as follows:
Matrix_{(BM)->(0)} × ... × Matrix_{(BN-2)->(BN-4)} × Matrix_{(BN)->(BN-2)} × Matrix_{AN->0} × ... × Matrix_{A-2->A-4} × Matrix_{A->A-2}, where
AN is the stitching sequence number of the image to be stitched that lies in the same line as the current image to be stitched (the image with stitching sequence number A in that line), on the same side of the line reference image, and is stitched first (i.e. the image closest to the line reference image); (BN) is the stitching sequence number of the line reference image of the line in which the current image to be stitched lies; and (BM) is the stitching sequence number of the line reference image that lies on the same side of the reference image as the current line reference image and is stitched first (i.e. the line reference image closest to the reference image).
Taking the images to be stitched shown in fig. 3 as an example, step 140 obtains the transformation matrix sequence {Matrix_{1->0}, Matrix_{2->0}} of the line-L3 images to be stitched, where 1 is the stitching sequence number of the image to be stitched P7 within line L3, 2 is the stitching sequence number of the image to be stitched P1 within line L3, and 0 is the stitching sequence number of the line reference image P6; Matrix_{1->0} is the transformation matrix of P7 relative to the line reference image P6, and Matrix_{2->0} is the transformation matrix of P1 relative to the line reference image P6. Step 140 also obtains the line reference image transformation matrix sequence {Matrix_{10->0}, Matrix_{11->0}, Matrix_{13->11}}, where 10 is the stitching sequence number of the line reference image P4, 11 is that of the line reference image P6, and 13 is that of the line reference image P8, and Matrix_{11->0} is the transformation matrix of the line reference image P6 relative to the reference image P3. The transformation matrix of the line-L3 image to be stitched P1 relative to the reference image P3 is therefore: Matrix_{11->0} × Matrix_{2->0}.
Example three:
in another specific embodiment of the present application, after determining the preferred feature point pair information of the adjacent images to be stitched, the method further includes: for a gray image, calculating an initial image balance parameter of the image to be spliced according to the gray value of the optimal characteristic point pair of the image to be spliced; for the color image, calculating an initial image equalization parameter of the image to be spliced according to the color information of the preferred characteristic point pair of the image to be spliced; and calculating the image balance parameter of each image to be spliced relative to the reference image by using the initial image balance parameter and combining the set splicing sequence. The step 160 further includes performing image equalization processing on the images to be stitched according to the image equalization parameters of each image to be stitched relative to the reference image, and stitching all the images to be stitched after the image equalization processing according to the transformation matrix of each image to be stitched relative to the reference image. The following describes a method for acquiring the equalization parameters of the initial image, by taking the two cases of the gray image and the color image as examples.
For grayscale images, the initial image equalization parameter is calculated from the gray values of the preferred feature point pairs of the images to be stitched; that is, the initial image equalization parameter of each group of images to be stitched is the ratio of the vectors formed by the gray values corresponding to the preferred feature point pairs. Specifically: the gray value matrix corresponding to image i is I and the gray value matrix corresponding to image j is J, and the gray values corresponding to the preferred feature point pairs on image i and image j form the vectors V_i and V_j:

V_i = (I_1, I_2, ..., I_n)^T,    V_j = (J_1, J_2, ..., J_n)^T,

where n is the number of feature point pairs, I_s is the gray value corresponding to the s-th feature point pair on image i, and J_s is the gray value corresponding to the s-th feature point pair on image j. According to the formula V_i = K·V_j, the real number K is calculated as the initial image equalization parameter of image j relative to image i.
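For n > 1 the relation V_i = K·V_j is overdetermined, and the patent only states that the real number K is calculated; a least-squares fit is one natural way to do so. A minimal sketch, with an illustrative function name:

```python
import numpy as np

def initial_equalization_param(gray_i, gray_j):
    """Scalar K with V_i ≈ K * V_j over the matched gray values.

    Solves argmin_K ||V_i - K V_j||^2, whose closed form is the ratio of
    inner products; for perfectly proportional vectors this is exactly V_i/V_j."""
    vi = np.asarray(gray_i, float)
    vj = np.asarray(gray_j, float)
    return float(vi @ vj / (vj @ vj))
```

When image j is uniformly 1/1.2 as bright as image i at the matched points, the fitted K is 1.2, and multiplying image j by K equalizes it toward image i.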
For color images, the initial image equalization parameters are calculated from the color information of the preferred feature point pairs of the images to be stitched; that is, the initial image equalization parameter for each color component of each group of images to be stitched is the ratio of the vectors formed by the corresponding color components of the preferred feature point pairs. Specifically: the color image i corresponds to the three-dimensional matrix [I_R, I_G, I_B], where I_R, I_G, I_B are the red, green and blue components of image i, and the color image j corresponds to the three-dimensional matrix [J_R, J_G, J_B], where J_R, J_G, J_B are the red, green and blue components of image j. The color information corresponding to the preferred feature point pairs of image i forms the vectors V_i^R, V_i^G, V_i^B, and the color information corresponding to the preferred feature point pairs of image j forms the vectors V_j^R, V_j^G, V_j^B, where n is the number of feature point pairs. The red, green and blue component values corresponding to the preferred feature point pairs of image i and image j are formed into vectors respectively; the real number K_R calculated according to the formula V_i^R = K_R·V_j^R is the initial image equalization parameter of image j relative to image i for the red component; the real number K_G calculated according to the formula V_i^G = K_G·V_j^G is the initial image equalization parameter for the green component; and the real number K_B calculated according to the formula V_i^B = K_B·V_j^B is the initial image equalization parameter for the blue component.
The image equalization parameters of all the other images to be stitched relative to the reference image are calculated from the initial image equalization parameters in combination with the stitching order. The stitching sequence numbers are 0, 1, 2, ...; the image to be stitched with stitching sequence number A is taken as an example to illustrate its image equalization parameter relative to the reference image, where the image with stitching sequence number 0 is the reference image.
For grayscale images, define K_{seq1->seq2} as the initial image equalization parameter of the image with stitching sequence number seq1 relative to the image with stitching sequence number seq2. The image equalization parameter of the image with sequence number A relative to the reference image is then: K' = K_{1->0} × ... × K_{A-1->A-2} × K_{A->A-1}, where K_{1->0} is the initial image equalization parameter, relative to the reference image, of the first image to be stitched other than the reference image, and K_{A->A-1} is the initial image equalization parameter of the image with sequence number A relative to the image with sequence number A-1. The matrix J' corresponding to the image gray values after equalization is obtained according to the formula J' = K'·J.
For color images, define K^R_{seq1->seq2} as the initial image equalization parameter of the image with stitching sequence number seq1 relative to the red component of the image with stitching sequence number seq2. The equalization parameter of the red component of the image with sequence number A relative to the reference image is then: K'_R = K^R_{1->0} × ... × K^R_{A-1->A-2} × K^R_{A->A-1}, where K^R_{1->0} is the initial image equalization parameter of the first image to be stitched other than the reference image relative to the red component of the reference image, and K^R_{A->A-1} is the initial image equalization parameter of the image with sequence number A relative to the red component of the image with sequence number A-1. According to the formula J'_R = K'_R·J_R, the matrix J'_R of the red component after image equalization is obtained.
Similarly, the image equalization parameter K'_G of the green component and the image equalization parameter K'_B of the blue component of the image with any stitching sequence number relative to the reference image can be obtained; the equalized green component J'_G is obtained according to the formula J'_G = K'_G·J_G, and the equalized blue component J'_B according to the formula J'_B = K'_B·J_B.
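The chained equalization parameters and their per-channel application can be sketched as follows; the helper names are illustrative, and the gains K'_R, K'_G, K'_B are passed in as a tuple.

```python
import numpy as np

def chained_gain(initial_gains):
    """K' for the image with stitching sequence number A: the continued product
    K_{1->0} * ... * K_{A->A-1} of the initial equalization parameters."""
    return float(np.prod(initial_gains))

def equalize_color(img_rgb, gains_rgb):
    """Apply per-channel gains (K'_R, K'_G, K'_B): J'_c = K'_c * J_c."""
    out = img_rgb.astype(float)
    for c, k in enumerate(gains_rgb):
        out[..., c] *= k
    return out
```

A grayscale image is the same computation with a single gain applied to the single channel.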
Because of differences in illumination and in the sensors, the brightness, contrast, etc. of the images acquired by the individual channels differ to some extent. If the images were stitched directly, different regions of the stitched image would differ greatly in brightness and contrast, the stitching seams would be obvious, and practical application would be affected.
Example four:
based on the first, second, and third embodiments of the present application, in another preferred embodiment of the present application, as shown in fig. 4, the method further includes:
Step 120: determining the stitching region of the images to be stitched as the overlapping region of adjacent images to be stitched.
The step 130 further includes, according to the set stitching sequence, respectively extracting features of the stitching regions of the adjacent images to be stitched, to obtain candidate feature point pair information of each group of images to be stitched.
In order to obtain a complete and accurate image, the images acquired by the individual image acquisition devices of the multi-channel optical detection system usually partially overlap; in this embodiment, the stitching region of the images to be stitched is the overlapping region of adjacent images to be stitched, for example the overlapping region of the reference image P3 and the image to be stitched P4 in fig. 2.
The overlapping region of adjacent images to be stitched is determined from the overlap of the shooting fields of view of the image acquisition devices in the multi-channel optical detection system. In a specific implementation, the overlapping region of the stitching reference image and the image to be stitched can be set according to the actual overlap of the images shot by the multiple cameras. If the side length of one side of the image shot by a camera is Z, the camera field-of-view angle in that direction is P, and the field-of-view overlap angle with the adjacent camera in that direction is P', then the image size parameter of the stitching region in that direction is Z·P'/P. The image size parameter of the stitching region refers to the image size involved in the feature extraction computation.
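The sizing rule Z·P'/P is a direct proportion. As a hypothetical example (the numbers are not from the patent), a 1920-pixel-wide image from a camera with a 60° field of view and a 15° overlap with its neighbour gives a 480-pixel stitching region:

```python
def stitch_region_size(side_len_px, fov_angle_deg, overlap_angle_deg):
    """Image size parameter of the stitching region along one direction: Z * P' / P."""
    return side_len_px * overlap_angle_deg / fov_angle_deg
```

Only this 480-pixel strip, rather than the full 1920-pixel width, then enters the feature extraction of step 130.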
See example one for the detailed implementation of the other steps in this embodiment. The beneficial effect of this embodiment is that feature extraction is performed only on the image overlap region of the reference image and the other images to be stitched, rather than on the whole of the reference image and of each image to be stitched as in the traditional method, which reduces the amount of computation and improves stitching efficiency.
Example five:
based on the foregoing first to fourth embodiments, in another preferred embodiment of the present disclosure, in order to adapt to a change in the computational performance of the multi-channel optical detection system, the present disclosure further includes:
Step 170: the down-sampling ratio is set automatically according to the time the multi-channel optical detection system takes to stitch one frame and the set display frame rate. When the frame rate to be displayed is frame, the down-sampling multiple is first set to T = 1, and the time taken by the multi-channel optical detection system to stitch one frame is measured as time. If time > 1/frame, then to meet the display requirement the down-sampling multiple of the multi-channel images must be increased: T is automatically incremented by 1, giving T = 2, and the next frame is stitched. The time to stitch one frame is measured again; if it now satisfies time ≤ 1/frame, the down-sampling ratio can meet the requirement of smoothly outputting and displaying image frames, the adaptive image down-sampling is complete, and the down-sampling multiple is T = 2; otherwise, the down-sampling multiple is again incremented by 1, and the process is repeated until time ≤ 1/frame, completing the adaptive setting of the image down-sampling multiple.
For example, when the frame rate to be displayed is frame (e.g. 5 frames per second), the time to stitch one frame can be at most 1/frame (i.e. 0.2 seconds). The system times the image stitching: if a frame is not output within 1/frame seconds, the down-sampling multiple is automatically increased by 1, and the stitching time continues to be measured until the time to stitch one frame is below the limit. Aging, changes, and resource occupation in the multi-channel optical detection system can alter its computing performance; the adaptive setting of the down-sampling multiple allows the stitched image to be output in time as the system's computing performance changes.
Similarly, to handle the case where the resources of the multi-channel optical detection system are heavily occupied for a period of time and its computing power increases once those resources are released, a period can be set in a specific implementation. Within the set period, the time taken by the system to stitch one frame is measured as time; when the time to process one frame satisfies time ≤ 1/(2·frame), the down-sampling multiple is automatically decreased by 1. Adaptively adjusting the down-sampling multiple in this way balances the stitching quality and the stitching efficiency.
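The adaptive increase of step 170 can be sketched as follows. This is an illustrative sketch: `frame_time(T)` is a hypothetical stand-in that stitches one frame at down-sampling multiple T and returns the measured time in seconds, and the cap `t_max` is an added safeguard not stated in the patent.

```python
def adapt_downsample(frame_time, frame_rate, t=1, t_max=16):
    """Raise the down-sampling multiple T until one frame fits the display budget.

    frame_time(T): stitch one frame at multiple T, return elapsed seconds.
    frame_rate: the display frame rate 'frame'; budget per frame is 1/frame."""
    budget = 1.0 / frame_rate
    while t < t_max and frame_time(t) > budget:
        t += 1                   # too slow: increase the multiple and retry
    return t
```

The symmetric decrease described above (when time ≤ 1/(2·frame) over a set period) would run the same comparison in the opposite direction.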
The resampling method used in image down-sampling may be any method, such as linear averaging or Gaussian averaging, which is not limited in this application.
Example six:
correspondingly, the present application also discloses an image stitching apparatus of a multi-channel optical detection system, as shown in fig. 5, including:
the down-sampling module 500 is used for preprocessing the acquired image according to a set down-sampling proportion to obtain a down-sampled image to be spliced;
a splicing sequence setting module 510, configured to select a reference image from the images to be spliced, and set a splicing sequence of the images to be spliced;
the feature extraction module 530 is configured to extract features of the adjacent images to be stitched respectively according to a set stitching order, so as to obtain candidate feature point pair information of each group of images to be stitched;
a feature screening module 540, configured to screen the candidate feature point pair information, and determine preferred feature point pair information and a transformation matrix of each group of images to be stitched corresponding to the preferred feature point pair information;
a stitching parameter calculating module 550, configured to calculate a transformation matrix of each image to be stitched relative to the reference image according to a set stitching order;
and a splicing module 560, configured to splice all the images to be spliced according to the transformation matrix of each image to be spliced with respect to the reference image.
In an embodiment of the application, the stitching order setting module 510 is specifically configured to select an image to be stitched corresponding to a center of a field of view of the multi-channel optical detection system as a reference image, and set the stitching orders on two sides of the reference image in an order from small to large according to a distance between the image to be stitched and the reference image.
The stitching parameter calculating module 550 is specifically configured to, following the set stitching order, cumulatively multiply the transformation matrices of all images to be stitched that are located on the same side of the reference image as the current image to be stitched and precede it in the stitching order by the transformation matrix of the current image to be stitched, to obtain the transformation matrix of the current image to be stitched relative to the reference image.
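This cumulative multiplication along one side of the reference image can be sketched as follows. The 3×3 representation and the left-to-right multiplication order are assumptions about the matrix convention; the text only specifies that the matrices of all preceding same-side images are multiplied together with the current image's matrix:

```python
def matmul3(a, b):
    """Product of two 3x3 matrices (planar transforms / homographies)."""
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def chain_to_reference(pairwise):
    """Cumulative product along one side of the reference image.

    pairwise[i] maps image i+1 onto image i (image 0 is the reference),
    so the transform of image n relative to the reference is
    H_1 * H_2 * ... * H_n.
    """
    transforms = []
    acc = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]  # identity: the reference itself
    for h in pairwise:
        acc = matmul3(acc, h)
        transforms.append([row[:] for row in acc])
    return transforms
```

For pure translations the chained result is the sum of the offsets, which makes the composition easy to verify by hand.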
In another specific embodiment of the present application, the stitching order setting module 510 is specifically configured to select, according to the field-of-view positions of the images to be stitched acquired by the image acquisition devices in the multi-channel optical detection system, the image to be stitched corresponding to the center of the system's field of view as the reference image, and to divide the images to be stitched into rows with the reference image at the center; to select from each row the image to be stitched closest to the reference image as that row's row reference image; to set the stitching order of the images in each row on the two sides of the row reference image in order of increasing distance from it; and to set the stitching order of the row reference images on the two sides of the reference image in order of increasing distance from it.
The stitching parameter calculating module 550 is specifically configured to, following the set stitching order, cumulatively multiply the transformation matrices of all images to be stitched that are located in the same row as the current image to be stitched, on the same side of that row's row reference image, and precede the current image in the stitching order, by the transformation matrix of the current image to be stitched, to obtain the transformation matrix of the current image relative to that row's row reference image; and to cumulatively multiply the transformation matrices of all row reference images that are located on the same side of the reference image and precede that row's row reference image in the stitching order by the current image's transformation matrix relative to its row reference image, to obtain the transformation matrix of the current image to be stitched relative to the reference image.
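The row-based ordering composes transforms at two levels: first chain an image to its own row reference image within the row, then chain that row reference image to the global reference. A minimal sketch, using pure translation offsets in place of 3×3 matrices so the composition order is easy to check (the function names and offsets are illustrative, not from the patent):

```python
def compose_chain(offsets):
    """Cumulative composition along a chain. Pure (dx, dy) translations
    stand in for the 3x3 transforms; real matrices compose the same way."""
    out = []
    ax, ay = 0, 0
    for dx, dy in offsets:
        ax, ay = ax + dx, ay + dy
        out.append((ax, ay))
    return out

def to_global_reference(row_offsets, rowref_offset):
    """Two-level scheme: chain each image to its row reference image
    within the row, then compose with the (already chained) transform
    from that row reference image to the global reference."""
    rx, ry = rowref_offset
    return [(x + rx, y + ry) for x, y in compose_chain(row_offsets)]
```

Here an image two steps from its row reference, in a row whose reference sits one row below the global reference, ends up with both displacements combined.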
In another embodiment of the present application, the apparatus further includes an image equalization parameter calculation module, configured to calculate an initial image equalization parameter of each image to be stitched from the gray values of its preferred feature point pairs (for a color image, from the color information of the preferred feature point pairs); and to calculate the image equalization parameter of each image to be stitched relative to the reference image from the initial image equalization parameters, combined with the set stitching order.
The stitching module 560 is further configured to perform image equalization processing on the images to be stitched according to the image equalization parameter of each image to be stitched relative to the reference image, and to splice all the equalized images to be stitched according to the transformation matrix of each image to be stitched relative to the reference image.
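The equalization parameters chain along the stitching order just like the transformation matrices. A minimal sketch under an assumed model, a single multiplicative gain per image; the text fixes only that the parameter is computed from the matched feature points' gray values (or color channels, which would be handled per channel):

```python
def pair_equalization_gain(ref_grays, img_grays):
    """Initial equalization parameter for one adjacent pair: the ratio
    of summed gray values over the preferred (matched) feature points.
    The single-gain model is an assumption for illustration."""
    return sum(ref_grays) / sum(img_grays)

def chain_equalization(pairwise_gains):
    """Gain of each image relative to the reference image: the running
    product of the pairwise gains along the stitching order."""
    gains = []
    acc = 1.0
    for g in pairwise_gains:
        acc *= g
        gains.append(acc)
    return gains
```

If an image's matched points are half as bright as the reference's, its gain is 2.0; an image further down the chain multiplies all intermediate gains.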
In a further embodiment of the present application, as shown in fig. 6, the parallel image stitching apparatus of the multi-channel optical detection system further includes: a splicing region determining module 520, configured to determine that a splicing region of images to be spliced is an overlapping region of adjacent images to be spliced. The feature extraction module 530 is further configured to: and respectively extracting the characteristics of the splicing areas of the adjacent images to be spliced according to the set splicing sequence to obtain the candidate characteristic point pair information of each group of images to be spliced. For a specific implementation manner of the splicing region determining module 520, refer to the method embodiment, which is not described herein again.
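Restricting feature extraction to the overlap of adjacent fields of view is what the splicing region determining module 520 enables. A sketch of how such regions of interest might be derived for a horizontally adjacent pair, assuming the approximate inter-channel offset dx is known from the channel layout (the function name and parameters are illustrative; the patent derives the overlap from the channels' field positions):

```python
def overlap_rois(width, height, dx):
    """Stitching-region ROIs, each as (x, y, w, h), for two horizontally
    adjacent images of size width x height whose origins are dx apart.
    Extracting features only inside these ROIs cuts the matching cost."""
    w = max(0, width - dx)                 # overlap width in pixels
    left_roi = (width - w, 0, w, height)   # right edge of the left image
    right_roi = (0, 0, w, height)          # left edge of the right image
    return left_roi, right_roi
```

For 640×480 channels offset by 600 pixels, only a 40-pixel-wide strip of each image needs feature extraction.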
In another embodiment of the present application, the apparatus for stitching parallel images of a multi-channel optical detection system further includes: and the self-adaptive down-sampling module is used for automatically setting down-sampling proportion according to the time for splicing one frame of image by the multi-channel optical detection system and the set display frame rate.
The embodiments in the present specification are described in a progressive manner, each embodiment focuses on differences from other embodiments, and the same and similar parts among the embodiments are referred to each other. For the device embodiment, since it is basically similar to the method embodiment, the description is simple, and for the relevant points, refer to the partial description of the method embodiment.
The above-described system embodiments are merely illustrative, wherein the modules described as separate components may or may not be physically separate, and the components shown as modules may or may not be physical modules, may be located in one place, or may be distributed on a plurality of network modules. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the present embodiment. One of ordinary skill in the art can understand and implement it without inventive effort.
Through the above description of the embodiments, those skilled in the art will clearly understand that each embodiment can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware. Based on such understanding, the above technical solutions essentially or contributing to the prior art may be embodied in the form of a software product, which can be stored in a computer-readable storage medium, such as ROM/RAM, magnetic disk, optical disk, etc., and includes several instructions for enabling a computer device (which may be a personal computer, a server, or a network device, etc.) to execute the method according to the embodiments or some parts of the embodiments.

Claims (10)

1. A parallel image splicing method of a multi-channel optical detection system is characterized by comprising the following steps:
preprocessing the collected images according to a set down-sampling proportion to obtain down-sampled images to be spliced;
selecting a reference image from the images to be spliced, and setting the splicing sequence of the images to be spliced;
respectively extracting the characteristics of the adjacent images to be spliced according to a set splicing sequence to obtain candidate characteristic point pair information of each group of images to be spliced;
screening the candidate characteristic point pair information, and determining preferred characteristic point pair information and a transformation matrix of each group of images to be spliced corresponding to the preferred characteristic point pair information;
calculating a transformation matrix of each image to be spliced relative to the reference image according to a set splicing sequence;
splicing all the images to be spliced according to the transformation matrix of each image to be spliced relative to the reference image;
wherein, according to the set splicing sequence, calculating the transformation matrix of each image to be spliced relative to the reference image comprises:
selecting an image to be spliced corresponding to the center of the field of view of the multi-channel optical detection system as a reference image according to the field of view position of the image to be spliced acquired by an image acquisition device in the multi-channel optical detection system, and dividing the image to be spliced into lines by taking the reference image as the center;
selecting the image to be spliced which is closest to the reference image from each line of images as the line reference image of the line;
according to the sequence that the distance between the image to be spliced and the line reference image is from small to large, the splicing sequence of the image to be spliced is respectively arranged on the two sides;
according to the sequence that the distance between the line reference image and the reference image is from small to large, the splicing sequence of the line reference images is respectively arranged on the two sides.
2. The parallel image stitching method according to claim 1, wherein the selecting a reference image from the images to be stitched and setting the stitching order of the images to be stitched comprises: and selecting the image to be spliced corresponding to the center of the field of view of the multi-channel optical detection system as a reference image, and setting the splicing sequence on the two sides respectively according to the sequence that the distance between the image to be spliced and the reference image is from small to large by taking the reference image as the center.
3. The parallel image stitching method according to claim 2, wherein the calculating the transformation matrix of each image to be stitched relative to the reference image according to the set stitching order comprises: according to the set splicing sequence, accumulating and multiplying the transformation matrices corresponding to all the images to be spliced which are located on the same side of the reference image as the current image to be spliced and are prior to it in the splicing sequence with the transformation matrix of the current image to be spliced, to obtain the transformation matrix of the current image to be spliced relative to the reference image.
4. The parallel image stitching method according to claim 1, wherein the calculating the transformation matrix of each image to be stitched relative to the reference image according to the set stitching order comprises:
according to the set splicing sequence, accumulating and multiplying the transformation matrices corresponding to all the images to be spliced which are located in the same line as the current image to be spliced, on the same side of the line reference image of the line, and are prior to the current image to be spliced in the splicing sequence, with the transformation matrix of the current image to be spliced, to obtain a transformation matrix of the current image to be spliced relative to the line reference image of the line;
and accumulating and multiplying the transformation matrices of all the line reference images which are located on the same side of the reference image and are prior to the line reference image of the line in the splicing sequence with the transformation matrix of the current image to be spliced relative to the line reference image of the line, to obtain the transformation matrix of the current image to be spliced relative to the reference image.
5. The method of parallel image stitching according to claim 1, further comprising: determining a splicing area of the images to be spliced as an overlapping area of adjacent images to be spliced;
the method comprises the following steps of extracting the characteristics of the adjacent images to be spliced according to a set splicing sequence to obtain candidate characteristic point pair information of each group of images to be spliced, and further comprising the following steps:
and respectively extracting the characteristics of the splicing areas of the adjacent images to be spliced according to the set splicing sequence to obtain the candidate characteristic point pair information of each group of images to be spliced.
6. The parallel image stitching method according to any one of claims 1 to 5, characterized in that the method further comprises: and automatically setting the down-sampling proportion according to the time for splicing one frame of image by the multi-channel optical detection system and the set display frame rate.
7. A parallel image stitching apparatus of a multi-channel optical detection system, comprising:
the down-sampling module is used for preprocessing the acquired images according to a set down-sampling proportion to obtain down-sampled images to be spliced;
the splicing sequence setting module is used for selecting a reference image from the images to be spliced and setting the splicing sequence of the images to be spliced;
the characteristic extraction module is used for respectively extracting the characteristics of the adjacent images to be spliced according to a set splicing sequence to obtain candidate characteristic point pair information of each group of images to be spliced;
the characteristic screening module is used for screening the candidate characteristic point pair information and determining the preferred characteristic point pair information and a transformation matrix of each group of images to be spliced corresponding to the preferred characteristic point pair information;
the splicing parameter calculation module is used for calculating a transformation matrix of each image to be spliced relative to the reference image according to a set splicing sequence;
and the splicing module is used for splicing all the images to be spliced according to the transformation matrix of each image to be spliced relative to the reference image;
wherein the splicing sequence setting module is specifically configured to:
selecting an image to be spliced corresponding to the center of the field of view of the multi-channel optical detection system as a reference image according to the field of view position of the image to be spliced acquired by an image acquisition device in the multi-channel optical detection system, and dividing the image to be spliced into lines by taking the reference image as the center;
selecting the image to be spliced which is closest to the reference image from each line of images as the line reference image of the line;
according to the sequence that the distance between the image to be spliced and the line reference image is from small to large, the splicing sequence of the image to be spliced is respectively arranged on the two sides;
according to the sequence that the distance between the line reference image and the reference image is from small to large, the splicing sequence of the line reference images is respectively arranged on the two sides.
8. The parallel image stitching device according to claim 7, wherein the stitching sequence setting module is specifically configured to select an image to be stitched corresponding to a center of a field of view of the multi-channel optical detection system as a reference image, and set the stitching sequence on two sides of the reference image in an order from small to large according to a distance between the image to be stitched and the reference image.
9. The parallel image stitching device according to claim 8, wherein the stitching parameter calculation module is specifically configured to, according to a set stitching sequence, accumulate and multiply the transformation matrices corresponding to all the images to be stitched that are located on the same side of the reference image as the current image to be stitched and are prior to it in the stitching sequence with the transformation matrix of the current image to be stitched, to obtain the transformation matrix of the current image to be stitched relative to the reference image.
10. The parallel image stitching device of claim 9, wherein the stitching parameter calculation module is specifically configured to,
according to the set splicing sequence, accumulating and multiplying the transformation matrices corresponding to all the images to be spliced which are located in the same line as the current image to be spliced, on the same side of the line reference image of the line, and are prior to the current image to be spliced in the splicing sequence, with the transformation matrix of the current image to be spliced, to obtain a transformation matrix of the current image to be spliced relative to the line reference image of the line;
and accumulating and multiplying the transformation matrices of all the line reference images which are located on the same side of the reference image and are prior to the line reference image of the line in the splicing sequence with the transformation matrix of the current image to be spliced relative to the line reference image of the line, to obtain the transformation matrix of the current image to be spliced relative to the reference image.
CN201510762866.0A 2015-11-10 2015-11-10 Parallel image splicing method and device of multi-channel optical detection system Active CN106683043B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510762866.0A CN106683043B (en) 2015-11-10 2015-11-10 Parallel image splicing method and device of multi-channel optical detection system


Publications (2)

Publication Number Publication Date
CN106683043A CN106683043A (en) 2017-05-17
CN106683043B true CN106683043B (en) 2020-03-06

Family

ID=58864743

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510762866.0A Active CN106683043B (en) 2015-11-10 2015-11-10 Parallel image splicing method and device of multi-channel optical detection system

Country Status (1)

Country Link
CN (1) CN106683043B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110068271B (en) * 2019-04-19 2021-03-30 怡得乐电子(杭州)有限公司 PIN needle position degree detection method for large-size product with sub-pixel precision
CN111526302A (en) * 2020-04-28 2020-08-11 飞友科技有限公司 Stackable panoramic video real-time splicing method
CN111858992B (en) * 2020-07-04 2023-10-20 广东粤源工程咨询有限公司 Hydraulic engineering photo management method and system based on GPS and tag information
CN112017120A (en) * 2020-09-04 2020-12-01 北京伟杰东博信息科技有限公司 Image synthesis method and device

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102801899A (en) * 2012-08-29 2012-11-28 广东威创视讯科技股份有限公司 Method and device for improving image display quality of spliced screen
EP2688286A2 (en) * 2012-07-18 2014-01-22 Nokia Corporation Robust two dimensional panorama generation using light field camera capture
CN103925912A (en) * 2014-04-02 2014-07-16 中国人民解放军总参谋部测绘研究所 Internal view field optical partitional large-area-array CCD (charge coupled device) image geometric splicing method
CN104079915A (en) * 2014-07-03 2014-10-01 清华大学深圳研究生院 Parallel virtual view point synthetizing method

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3889650B2 (en) * 2002-03-28 2007-03-07 三洋電機株式会社 Image processing method, image processing apparatus, computer program, and recording medium
CN101916452B (en) * 2010-07-26 2012-04-25 中国科学院遥感应用研究所 Method for automatically stitching unmanned aerial vehicle remote sensing images based on flight control information
CN102819835A (en) * 2012-07-26 2012-12-12 中国航天科工集团第三研究院第八三五七研究所 Method for screening matching pairs of feature points to splice images
CN104867137B (en) * 2015-05-08 2017-12-08 中国科学院苏州生物医学工程技术研究所 A kind of method for registering images based on improvement RANSAC algorithms
CN104966270B (en) * 2015-06-26 2018-03-09 浙江大学 A kind of more image split-joint methods




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant