KR20150142846A - mosaic image of black box - Google Patents

mosaic image of black box Download PDF

Info

Publication number
KR20150142846A
Authority
KR
South Korea
Prior art keywords
image
transformation matrix
mosaic
matrix
coordinate system
Prior art date
Application number
KR1020140071241A
Other languages
Korean (ko)
Inventor
홍승우
Original Assignee
주식회사그린티
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 주식회사그린티 filed Critical 주식회사그린티
Priority to KR1020140071241A priority Critical patent/KR20150142846A/en
Publication of KR20150142846A publication Critical patent/KR20150142846A/en

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00 - Geometric image transformation in the plane of the image
    • G06T3/40 - Scaling the whole image or part thereof
    • G06T3/4038 - Scaling the whole image or part thereof for image mosaicing, i.e. plane images composed of plane sub-images

Abstract

A mosaic processing system for black box images is disclosed. Disclosed are a transform matrix estimation apparatus and method that extract feature points from multiple images, match the feature points between images, and estimate transformation matrices between images based on the matched points; a mosaic image generation apparatus that stably generates a mosaic image; and a computer-readable recording medium on which a program realizing the method is recorded. The system comprises: first feature point matching means for receiving multiple images from outside and matching image feature points; first transform matrix estimation means for estimating a transformation matrix using the feature points matched by the first feature point matching means; reference image selection means for selecting a reference image using the relationship of the transformation matrices estimated by the first transform matrix estimation means; and transformation matrix correction means for setting the image coordinate system of the reference image selected by the reference image selection means as a reference coordinate system and correcting each transformation matrix into a transformation matrix for the reference coordinate system.

Description

{Mosaic image of black box}

The present invention relates to a mosaic processing system for black box (vehicle dashboard camera) images, and more particularly, to a system for automatically mosaicing images acquired from a black box.

Video recorded by a black box captures various accident scenes, and these scenes serve as important evidence in disputes. However, the recorded video may also capture the faces of other people or the license plates of other vehicles, so uploading such video to a server can raise concerns about privacy invasion.

Therefore, when a video is uploaded to a server, a technique is required that automatically recognizes the faces of people or the license plate numbers of other vehicles and then applies mosaic processing to those regions.
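
For illustration only, the following sketch shows one common way to implement this kind of automatic mosaic processing with OpenCV: detect faces with the library's bundled Haar cascade and pixelate each detected region. The cascade choice, block size, and detection parameters are assumptions for the example, not values taken from this disclosure, and license plate regions would need their own detector.

import cv2

def mosaic_faces(frame, block=12):
    """Detect faces in a BGR frame and pixelate (mosaic) each detected region."""
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5):
        roi = frame[y:y + h, x:x + w]
        # Downscale, then upscale with nearest-neighbour interpolation to get the mosaic look.
        small = cv2.resize(roi, (max(1, w // block), max(1, h // block)))
        frame[y:y + h, x:x + w] = cv2.resize(small, (w, h), interpolation=cv2.INTER_NEAREST)
    return frame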

Korean Patent Laid-Open No. 10-2013-0055447 (Mosaic image generating apparatus and method) is related prior art.

The present invention relates to a technique for mosaicing the regions of a black box image in which privacy must be protected.

According to an aspect of the present invention,

An apparatus for generating a mosaic image using a transformation matrix estimated using image feature point matching,

First feature point matching means for receiving multiple images from outside and matching image feature points;

First transform matrix estimation means for estimating a transform matrix using the feature points matched by the first feature point matching means;

A reference image selecting means for selecting a reference image using the relationship of each transformation matrix estimated by the first transformation matrix estimating means;

A transformation matrix modification means for setting an image coordinate system of the reference image selected by the reference image selection means to a reference coordinate system and modifying the transformation matrix to a transformation matrix for the reference coordinate system;

A mosaic image size determining means for determining a size of a mosaic image to be generated by each transformation matrix modified by the transformation matrix modification means;

A mosaic image mapping means for mapping each image to a mosaic image; And

There is provided a mosaic processing system for a black box image, including color value correcting means for correcting color changes in the mosaic image caused by differences in the conditions under which each image was acquired.

The present invention provides a technique for mosaicing the regions of a black box image in which privacy must be protected.

FIG. 1 is an explanatory diagram of a system environment to which the present invention is applied.
FIG. 2 is a block diagram of a transform matrix estimation apparatus using image feature point matching according to the present invention and a mosaic image generation apparatus using the same.
FIG. 3 is a detailed block diagram of an embodiment of a feature point matching unit and a transform matrix estimator according to the present invention.
FIG. 4 is a flowchart of a method for estimating a transform matrix using image feature point matching according to the present invention and a method for generating a mosaic image using the transform matrix.
FIG. 5 is a detailed flowchart of an embodiment of a feature point matching process and a transform matrix estimation process according to the present invention.
FIG. 6 is an explanatory diagram of a transformation matrix function according to the present invention.
FIG. 7 is an explanatory diagram of a transformation matrix function for transformation into a reference coordinate system according to the present invention.
FIG. 8 is an explanatory diagram of a color value correction process according to the present invention.

The above-mentioned objects, features and advantages will become more apparent from the following detailed description in conjunction with the accompanying drawings. Hereinafter, a preferred embodiment of the present invention will be described in detail with reference to the accompanying drawings.

FIG. 1 is an explanatory diagram of a system environment to which the present invention is applied. In FIG. 1, reference numeral 100 denotes a real space, reference numeral 110 denotes an image acquisition device such as a camera or a camcorder, reference numeral 120 denotes acquired multiple images, and reference numeral 130 denotes acquired single images.

As shown in FIG. 1, images of a real-world object or background 100 are acquired by a camera or camcorder 110, and the acquired multiple images 120 are input to a computer, which generates a mosaic image from them. Here, each single image 130 may be of arbitrary size and of any digital image format that can be converted to an RGB representation.

FIG. 2 is a block diagram of an apparatus for estimating a transform matrix using image feature point matching according to an embodiment of the present invention and a mosaic image generating apparatus using the same. In FIG. 2, reference numeral 200 denotes a feature point matching unit between two images, 210 denotes a transform matrix estimator, 220 denotes a reference image selection unit, 230 denotes a transformation matrix correction unit that corrects each transformation matrix so that it maps image coordinates to reference image coordinates, 240 denotes a mosaic image size determination unit that determines the size of the mosaic image, 250 denotes a mosaic image mapping unit that maps each image to the mosaic image, and 260 denotes a color value correction unit that corrects the color values of the mosaic image.

The transform matrix estimation apparatus using image feature point matching according to the present invention and the mosaic image generation apparatus using the same include: a feature point matching unit 200 that receives multiple images from outside and matches feature points of neighboring image pairs; a transform matrix estimator 210 that estimates a transform matrix for each image pair using the feature points matched by the feature point matching unit 200; a reference image selection unit 220 that selects a representative (reference) image using the relationship of the estimated transformation matrices; a transformation matrix corrector 230 that, once the representative image is selected, sets the image coordinate system of the representative image as the reference coordinate system and corrects each transformation matrix (a matrix that transforms the coordinates of one image into those of a neighboring image) into a transformation matrix for the reference coordinate system; a mosaic image size determination unit 240 that determines the size of the mosaic image to be generated from the corrected transformation matrices; a mosaic image mapping unit 250 that maps each image to the mosaic image; and a color value correction unit 260 that corrects color changes in the mosaic image caused by the photographing conditions (illumination changes, camera settings, etc.) at the time each image was acquired.

FIG. 3 is a detailed block diagram of an embodiment of the feature point matching unit 200 and the transform matrix estimator 210 according to the present invention. In FIG. 3, reference numeral 300 denotes a feature point extraction unit between two images, 310 denotes a feature point matching unit between two images, 320 denotes a transform matrix calculator between two images, and 330 denotes an optimal transform matrix estimator.

The feature point matching unit 200 according to the present invention includes a feature point extraction unit 300 that receives multiple images from outside and extracts feature points by applying the same edge point detector to each image, and a feature point matching unit 310 that matches image points by applying a normalized correlation coefficient (NCC) scheme to the two extracted feature point sets.

The transform matrix estimator 210 according to the present invention includes a transform matrix calculator 320 that calculates an initial transform matrix from the matched set of image point pairs, and an optimal transform matrix estimator 330 that estimates an optimal transform matrix starting from the initial transform matrix.

FIG. 4 is a flowchart of a method of estimating a transform matrix using image feature point matching according to the present invention and a method of generating a mosaic image using the transform matrix. It shows how the multiple images obtained from an image acquisition device such as a camera or camcorder are analyzed, how the transformation matrices are calculated, and how a mosaic image is generated from the multiple images.

As shown in FIG. 4, feature point matching is first performed for each neighboring image pair (400), and a transformation matrix for each image pair is estimated from the matched feature points (410).

Then, the central image is selected as the representative image using the relationship of the respective transformation matrices (420). When the representative image is selected, the image coordinate system of the representative image is set as the reference coordinate system, and the transformation matrix of each image is modified to a transformation matrix for the reference coordinate system (430).

Thereafter, the size of the mosaic image to be generated from the modified transformation matrices is determined (440), and each image is mapped to the mosaic image (450). Finally, color changes in the mosaic image caused by the photographing conditions (illumination changes, camera settings, etc.) at the time of acquiring each image are corrected (460).
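
As a point of comparison rather than an implementation of the claimed apparatus, OpenCV's built-in stitcher follows essentially the same sequence of steps (feature matching, transform estimation, warping into a common frame, canvas sizing, mapping, and exposure/color compensation). The frame file names below are placeholders assumed for the example.

import cv2

# Placeholder frame paths; in the black box setting these would be frames
# extracted from the recorded video.
imgs = [cv2.imread(p) for p in ["frame0.jpg", "frame1.jpg", "frame2.jpg"]]
stitcher = cv2.Stitcher_create(cv2.Stitcher_PANORAMA)
status, mosaic = stitcher.stitch(imgs)
if status == cv2.Stitcher_OK:
    cv2.imwrite("mosaic.jpg", mosaic)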

Each of the above processes will be described in detail with reference to FIGS. 5 to 8.

FIG. 5 is a detailed flowchart of a feature point matching process and a transform matrix estimation process according to an embodiment of the present invention.

In the present invention, when the number of images in the multiple-image set is n, image point matching is performed on neighboring image pairs from image 1 to image n. As the first step of matching feature points between images, the feature points to be matched are selected: the same edge point detector is applied to each image to extract feature points (500).

Next, image point matching is performed by applying a normalized correlation coefficient (NCC) method to the two feature point sets extracted from each image pair (510). For the i-th image with ni image points and the j-th image with nj image points, the point set {(xi,1, yi,1), ..., (xi,ni, yi,ni)} of the i-th image and the point set {(xj,1, yj,1), ..., (xj,nj, yj,nj)} of the j-th image are matched so that corresponding image points are paired.
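
A compact sketch of this kind of NCC-based point matching, assuming Shi-Tomasi corners as the feature points and a fixed square patch around each point; the patch size, corner-detector parameters, and acceptance threshold are arbitrary choices for the example, not values from the patent.

import cv2
import numpy as np

def ncc_match(img_i, img_j, patch=15, thresh=0.8, max_pts=300):
    """Match feature points between two grayscale images using the normalized
    correlation coefficient (NCC) of small patches around each point."""
    r = patch // 2
    pts_i = cv2.goodFeaturesToTrack(img_i, max_pts, 0.01, 8).reshape(-1, 2)
    pts_j = cv2.goodFeaturesToTrack(img_j, max_pts, 0.01, 8).reshape(-1, 2)

    def cut(img, p):
        x, y = int(p[0]), int(p[1])
        if r <= x < img.shape[1] - r and r <= y < img.shape[0] - r:
            return img[y - r:y + r + 1, x - r:x + r + 1].astype(np.float32)
        return None

    matches = []
    for pi in pts_i:
        a = cut(img_i, pi)
        if a is None:
            continue
        best, best_pj = -1.0, None
        for pj in pts_j:
            b = cut(img_j, pj)
            if b is None:
                continue
            # cv2.matchTemplate with TM_CCOEFF_NORMED on equal-size patches gives the NCC score.
            score = cv2.matchTemplate(a, b, cv2.TM_CCOEFF_NORMED)[0, 0]
            if score > best:
                best, best_pj = score, pj
        if best >= thresh:
            matches.append((tuple(pi), tuple(best_pj)))
    return matches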

Then, a transformation matrix is calculated from the obtained pairs of corresponding points {(x'i,1, y'i,1, x'j,1, y'j,1), ..., (x'i,m, y'i,m, x'j,m, y'j,m)} (520). A typical two-dimensional perspective transformation is represented by a 3x3 matrix. The transformation is unique only up to scale, so one element can be fixed to 1. The transformation matrix therefore has eight degrees of freedom, and calculating it amounts to estimating eight unknowns.

Each pair of corresponding points provides one equation for the x-coordinate and one for the y-coordinate, so the transformation matrix can be calculated from four or more pairs of corresponding points.
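
As a sketch of this computation, the eight unknowns can be solved with a direct linear formulation: fixing h33 = 1, each correspondence contributes the two linear equations mentioned above, and four or more pairs give an (over)determined system. This is a generic least-squares formulation, not necessarily the exact procedure used in the patent; in practice cv2.findHomography with a robust estimator such as RANSAC is often used instead.

import numpy as np

def homography_from_pairs(pts_src, pts_dst):
    """Estimate a 3x3 perspective transform H (with h33 fixed to 1) from
    m >= 4 corresponding point pairs, mapping pts_src -> pts_dst."""
    A, b = [], []
    for (x, y), (xp, yp) in zip(pts_src, pts_dst):
        # x' = (h11*x + h12*y + h13) / (h31*x + h32*y + 1)
        A.append([x, y, 1, 0, 0, 0, -x * xp, -y * xp]); b.append(xp)
        # y' = (h21*x + h22*y + h23) / (h31*x + h32*y + 1)
        A.append([0, 0, 0, x, y, 1, -x * yp, -y * yp]); b.append(yp)
    h, *_ = np.linalg.lstsq(np.asarray(A, float), np.asarray(b, float), rcond=None)
    return np.append(h, 1.0).reshape(3, 3)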

Then, an optimal transformation matrix is estimated using the obtained transformation matrix as an initial value (530).

FIG. 6 is an explanatory diagram of the transformation matrix function according to the present invention. In FIG. 6, 600 denotes an image in the multiple images, 610 denotes a neighboring image of 600, 620 denotes the image coordinate system of 600, 630 denotes the image coordinate system of 610, 640 denotes the transformation matrix that expresses points of 610 in the coordinate system 620, and 650 denotes the region of 610 transformed by 640.

Here, the matrix that transforms coordinates expressed in the image coordinate system of the j-th image into the coordinates of the corresponding position in the i-th image coordinate system is denoted Tij. To obtain Tij, the transformation matrix Tij0 calculated from the pairs of corresponding points between the two images is first used as the initial value of Tij. The transformation matrix Ti,i+1 (640) between the neighboring i-th image 600 and the (i+1)-th image 610 maps a point of the (i+1)-th image, expressed in the image coordinate system 630, to the position of the corresponding point in the i-th image coordinate system 620. When transformed into the image coordinates of the i-th image, the region of the (i+1)-th image appears as an arbitrary quadrilateral 650. Calculating the optimal transformation matrix Tij is very important for generating the mosaic image. For accurate mapping, an image brightness difference error is defined for the current transformation matrix, and the Levenberg-Marquardt nonlinear minimization method is used to find the transformation matrix Tij with the minimum error. An optimal transform matrix is calculated for each neighboring image pair, yielding n-1 transform matrices.
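
A hedged sketch of this refinement step: the eight free elements of Tij are taken as parameters, the j-th image is warped into the i-th image's frame, and SciPy's Levenberg-Marquardt solver minimizes the per-pixel brightness difference. Freezing the overlap mask at the initial estimate, subsampling the residuals, and using cv2.warpPerspective are shortcuts for the example, not details taken from the patent.

import cv2
import numpy as np
from scipy.optimize import least_squares

def refine_homography(img_i, img_j, H0, step=4):
    """Refine an initial homography H0 (mapping img_j coordinates into img_i
    coordinates) by minimizing the image brightness difference with LM.
    Both inputs are assumed to be single-channel (grayscale) images."""
    h, w = img_i.shape[:2]
    gi = img_i.astype(np.float32)
    gj = img_j.astype(np.float32)

    # Overlap mask frozen at the initial estimate so the residual vector
    # keeps a fixed length across iterations.
    mask0 = cv2.warpPerspective(np.ones_like(gj), H0, (w, h)) > 0.5
    idx = np.flatnonzero(mask0.ravel())[::step]

    def residuals(p):
        H = np.append(p, 1.0).reshape(3, 3)   # h33 fixed to 1
        warped = cv2.warpPerspective(gj, H, (w, h))
        return (warped - gi).ravel()[idx]

    p0 = (H0 / H0[2, 2]).ravel()[:8]
    result = least_squares(residuals, p0, method="lm", max_nfev=100)
    return np.append(result.x, 1.0).reshape(3, 3)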

When the transformation matrices for the n-1 image pairs have been obtained for all images (from the first image to the n-th image) as described above, a representative image is selected from these transformation matrices. For every i from 1 to n, the transformation matrices are re-expressed with respect to the image coordinates of the i-th image, and the distance Dij between the center of the i-th image and the center of the j-th image in that mapping space is calculated. Dij is thus a two-dimensional array of distance values. Let (im, jm) be the index of the cell with the minimum value among the entries of Dij with i ≠ j. The images im and jm are placed next to each other; call this arrangement <im, jm>. Dim,jm is set to infinity (∞) so that it is not selected as a minimum afterwards. Next, the cell with the smallest value in the im-th column or the jm-th column of the Dij array is selected; denote it Dim,jm' (or Dim',jm). The new image jm' (or im') is attached to the corresponding end, changing the arrangement to <jm', im, jm> (or <im, jm, im'>), and Dim,jm' (or Dim',jm) is set to ∞ so that it is not selected again. The cell with the minimum value among the columns of the two end images of the current arrangement (the jm'-th and jm-th columns, or the im-th and im'-th columns) is then selected, and this process is repeated until all images are included in the arrangement. In the final arrangement, the center image is designated as the reference image. When n is even, of the two central images, the one with the smaller sum of distances to the centers of its adjacent images is designated as the reference image.
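
A rough sketch of this greedy ordering and center selection, given a symmetric array D of pairwise image-center distances (how D is derived from the transformation matrices is omitted here); attaching each new image to the nearer end of the chain and simply taking the middle index are simplifications of the procedure described above, including the even-n tie rule.

import numpy as np

def select_reference(D):
    """Greedily chain images by smallest center distance and return the index
    of the center image of the resulting arrangement plus the arrangement."""
    D = D.astype(float).copy()
    n = D.shape[0]
    np.fill_diagonal(D, np.inf)

    i0, j0 = np.unravel_index(np.argmin(D), D.shape)   # closest pair starts the chain
    order = [i0, j0]
    while len(order) < n:
        left, right = order[0], order[-1]
        remaining = [k for k in range(n) if k not in order]
        # Attach the remaining image whose center is closest to either end of the chain.
        candidates = [(D[left, k], 0, k) for k in remaining]
        candidates += [(D[right, k], 1, k) for k in remaining]
        _, side, k = min(candidates)
        if side == 0:
            order.insert(0, k)
        else:
            order.append(k)
    # Center image of the final arrangement (left-of-center when n is even).
    return order[(n - 1) // 2], order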

FIG. 7 is an explanatory diagram of the transformation matrix function for transformation into the reference coordinate system according to the present invention. In FIG. 7, 700 denotes the reference image (i-th image) in the aligned multiple images, 710 denotes the reference coordinate system, which is the image coordinate system of the reference image, 720 denotes an image preceding the reference image (the (i-k)-th image) in the aligned multiple images, 730 denotes an image following the reference image (the (i+k)-th image), 740 denotes the transformation matrix that transforms 720 into reference coordinates, 750 denotes the transformation matrix that transforms 730 into reference coordinates, 760 denotes the mapped image when 720 is transformed by 740, and 770 denotes the mapped image when 730 is transformed by 750.

When the reference image has been selected as described above, the image coordinate system of the reference image is set as the reference coordinate system, and the transformation matrix of each image is converted into a transformation matrix for the reference coordinate system. Assuming that the i-th image 700 is the reference image in the rearranged multiple images, the transformation matrix 740 that transforms points of the (i-k)-th image 720 into the reference coordinate system 710 is computed as T(i-1,i)^-1 · T(i-2,i-1)^-1 · ... · T(i-k,i-k+1)^-1, and the transformation matrix 750 that transforms points of the (i+k)-th image 730 into the reference coordinate system is computed as T(i,i+1) · T(i+1,i+2) · ... · T(i+k-1,i+k). By transforming the four corner points of each image with its modified transformation matrix, the position of that image in the reference frame is determined (760, 770). The maximum and minimum values of these positions determine the size of the mosaic image. Once the size is determined, memory for the mosaic image is allocated, and the pixel information of each image is mapped to its position in the mosaic image.
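
A hedged sketch of the re-referencing and size-determination step: given the neighboring homographies T_pairs[i] (each mapping image i+1 into image i, 0-based) and the reference index, it chains the matrices exactly as in the products above, then warps each image's four corner points and takes the extremes as the mosaic bounds. The helper names and 0-based indexing are conventions of the example.

import numpy as np

def to_reference(T_pairs, ref, n):
    """T_pairs[i] maps image i+1 coordinates into image i coordinates.
    Returns, for each image k, the homography mapping it into the reference frame."""
    H = [None] * n
    H[ref] = np.eye(3)
    for k in range(ref + 1, n):               # images after the reference
        H[k] = H[k - 1] @ T_pairs[k - 1]
    for k in range(ref - 1, -1, -1):          # images before the reference
        H[k] = H[k + 1] @ np.linalg.inv(T_pairs[k])
    return H

def mosaic_extent(H, sizes):
    """Warp each image's four corner points and return the mosaic bounding box."""
    pts = []
    for Hk, (w, h) in zip(H, sizes):
        corners = np.array([[0, 0, 1], [w, 0, 1], [w, h, 1], [0, h, 1]], float).T
        mapped = Hk @ corners
        pts.append((mapped[:2] / mapped[2]).T)
    pts = np.vstack(pts)
    (xmin, ymin), (xmax, ymax) = pts.min(axis=0), pts.max(axis=0)
    return xmin, ymin, xmax, ymax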

FIG. 8 is a diagram for explaining a color value correction process according to the present invention. In FIG. 8, 800 denotes an image in multiple images, and 810 denotes an image in multiple images overlapping 800.

The color changes in the mosaic image caused by the shooting conditions (illumination change, camera settings, etc.) at the time each image was acquired are corrected as follows. Consider an embodiment in which two images mapped into the mosaic overlap, as shown in FIG. 8. The overlapping area of the first image 800 and the second image 810 is represented by a polygon. For each line segment of this polygon, it is determined whether the region just outside the segment belongs to image 1 or image 2, and the segments are classified into two groups accordingly. In the drawing, the segments in contact with image 1 are l1a and l1b, and the segments in contact with image 2 are l2a, l2b, and l2c. The color value of a given point p in the overlapping region of the mosaic image is calculated from the distances from p to these segments. The distance from p to each segment is computed; let d1 be the minimum of the distances d1a and d1b to the segments in contact with image 1, and let d2 be the minimum of the distances d2a, d2b, and d2c to the segments in contact with image 2. The weight applied to the color value (R1, G1, B1) of image 1 at p is then w = d1 / (d1 + d2).

The color value (R, G, B) at p in the mosaic image is R = R1 * w + R2 * (1-w), G = G1 * w + G2 * (1-w), B = B1 * w + B2 * (1-w). When three or more images overlap, the color values in the overlapping region are first calculated for the first two images, and the calculation then proceeds step by step using the already calculated color value and each remaining overlapping image.
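
A small sketch of this distance-weighted blending at a single point, assuming the overlap polygon's segments have already been classified into the two groups described above (segs_img1 touching image 1, segs_img2 touching image 2); the point-to-segment distance helper is generic geometry, not something specified in the patent.

import numpy as np

def point_segment_distance(p, a, b):
    """Distance from point p to the segment a-b (all 2D numpy arrays)."""
    ab, ap = b - a, p - a
    t = np.clip(np.dot(ap, ab) / max(np.dot(ab, ab), 1e-12), 0.0, 1.0)
    return float(np.linalg.norm(p - (a + t * ab)))

def blend_color(p, color1, color2, segs_img1, segs_img2):
    """Blend the two images' colors at overlap point p using w = d1 / (d1 + d2),
    where d1 and d2 are the minimum distances from p to the segments classified
    as touching image 1 and image 2, respectively."""
    d1 = min(point_segment_distance(p, a, b) for a, b in segs_img1)
    d2 = min(point_segment_distance(p, a, b) for a, b in segs_img2)
    w = d1 / (d1 + d2) if d1 + d2 > 0 else 0.5
    return w * np.asarray(color1, float) + (1.0 - w) * np.asarray(color2, float)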

It will be apparent to those of ordinary skill in the art that various modifications and variations can be made in the present invention without departing from the spirit or scope of the invention.

200: feature point matching unit 210: transform matrix estimator
220: reference image selection unit 230: transformation matrix correction unit
240: Mosaic image size determination unit 250: Mosaic image mapping unit
260: Color value correcting unit

Claims (1)

An apparatus for generating a mosaic image using a transformation matrix estimated using image feature point matching,
First feature point matching means for receiving multiple images from outside and matching image feature points;
First transform matrix estimation means for estimating a transform matrix using the feature points matched by the first feature point matching means;
A reference image selecting means for selecting a reference image using the relationship of each transformation matrix estimated by the first transformation matrix estimating means;
A transformation matrix modification means for setting an image coordinate system of the reference image selected by the reference image selection means to a reference coordinate system and modifying the transformation matrix to a transformation matrix for the reference coordinate system;
A mosaic image size determining means for determining a size of a mosaic image to be generated by each transformation matrix modified by the transformation matrix modification means;
A mosaic image mapping means for mapping each image to a mosaic image; And
a color value correcting means for correcting color changes in the mosaic image caused by the conditions at the time each image was acquired.
KR1020140071241A 2014-06-12 2014-06-12 mosaic image of black box KR20150142846A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020140071241A KR20150142846A (en) 2014-06-12 2014-06-12 mosaic image of black box

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020140071241A KR20150142846A (en) 2014-06-12 2014-06-12 mosaic image of black box

Publications (1)

Publication Number Publication Date
KR20150142846A true KR20150142846A (en) 2015-12-23

Family

ID=55082134

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020140071241A KR20150142846A (en) 2014-06-12 2014-06-12 mosaic image of black box

Country Status (1)

Country Link
KR (1) KR20150142846A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106851130A (en) * 2016-12-13 2017-06-13 北京搜狐新媒体信息技术有限公司 A kind of video-splicing method and device

Similar Documents

Publication Publication Date Title
JP4371130B2 (en) Method for generating a composite image from a set of images
WO2014080613A1 (en) Color correction device, method, and program
WO2015112652A1 (en) Image demosaicing
EP2785047B1 (en) Image pickup apparatus, image processing system, image pickup system, image processing method, image processing program, and storage medium
JP5825172B2 (en) Image determination apparatus, image determination method, and computer program for image determination
US20150286044A1 (en) Method and system for extended depth of field calculation for microscopic images
JP6521626B2 (en) Object tracking device, method and program
CN104184935A (en) Image shooting device and method
CN110378250B (en) Training method and device for neural network for scene cognition and terminal equipment
US20170289516A1 (en) Depth map based perspective correction in digital photos
Zhang et al. A new modified panoramic UAV image stitching model based on the GA-SIFT and adaptive threshold method
US20170116739A1 (en) Apparatus and method for raw-cost calculation using adaptive window mask
CN110490924B (en) Light field image feature point detection method based on multi-scale Harris
KR100362171B1 (en) Apparatus, method and computer readable medium for computing a transform matrix using image feature point matching technique, and apparatus, method and computer readable medium for generating mosaic image using the transform matrix
CN110598795A (en) Image difference detection method and device, storage medium and terminal
WO2017032096A1 (en) Method for predicting stereoscopic depth and apparatus thereof
JP7146461B2 (en) Image processing method, image processing device, imaging device, program, and storage medium
KR20150142846A (en) mosaic image of black box
CN113298177B (en) Night image coloring method, device, medium and equipment
JP6006675B2 (en) Marker detection apparatus, marker detection method, and program
CN107005643A (en) Image processing apparatus, image processing method and program
JP6891808B2 (en) Image alignment system, method and program
CN109753930B (en) Face detection method and face detection system
JP7034781B2 (en) Image processing equipment, image processing methods, and programs
CN109029779B (en) Real-time human body temperature rapid detection method

Legal Events

Date Code Title Description
WITN Withdrawal due to no request for examination