KR101819984B1 - Image synthesis method in real time - Google Patents

Image synthesis method in real time

Info

Publication number
KR101819984B1
Authority
KR
South Korea
Prior art keywords
image
weight
new
pixel
data value
Prior art date
Application number
KR1020160072216A
Other languages
Korean (ko)
Other versions
KR20170139816A (en)
Inventor
임현국
Original Assignee
(주)한국미래기술
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by (주)한국미래기술 filed Critical (주)한국미래기술
Priority to KR1020160072216A priority Critical patent/KR101819984B1/en
Priority to PCT/KR2017/003112 priority patent/WO2017213335A1/en
Publication of KR20170139816A publication Critical patent/KR20170139816A/en
Application granted granted Critical
Publication of KR101819984B1 publication Critical patent/KR101819984B1/en

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/222Studio circuitry; Studio devices; Studio equipment
    • H04N5/262Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/222Studio circuitry; Studio devices; Studio equipment
    • H04N5/262Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
    • H04N5/265Mixing
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/64Circuits for processing colour signals
    • H04N9/74Circuits for processing colour signals for obtaining special effects
    • H04N9/75Chroma key

Abstract

A real-time image synthesis method is introduced, comprising: deriving a conventional weight for the conventional image that existed before compositing, for a redundant pixel located within the overlap region generated when a plurality of images are synthesized into one image; deriving a new weight for the new image added at the redundant pixel by the compositing; deriving the conventional weight and the new weight for the redundant pixel of each composited image according to the compositing order of the images; and calculating a data value for each image by applying the conventional weight and the new weight to the data value of the redundant pixel of that image, then summing all the per-image data values to calculate the final data value for the redundant pixel.

Description

{IMAGE SYNTHESIS METHOD IN REAL TIME}

The present invention relates to a real-time image synthesis method in which a plurality of images are synthesized into a single image in real time while distortion, deformation, and visible seams at the boundary portions are minimized.

A number of techniques have been provided that output images acquired by a plurality of cameras through a single monitor, so that a user can view images from various angles at a glance. A typical example is the around-view system applied to automobiles; in the case of a large robot, an around-view system can likewise be mounted so that the user can recognize the environment surrounding the robot.

To do this, a plurality of images must be synthesized into one image, and the synthesis and output must be performed in real time.

Existing techniques for image mosaicking, image stitching, and panoramic imaging typically rely on dedicated hardware for high-speed computation: FPGA-based high-speed image processing, GPU-based parallel image processing, or ASIC-based high-speed image processing. More recently, CPU-based real-time image synthesis without dedicated high-speed hardware has been proposed, but such methods generate the composite image by simple pixel averaging or pixel selection.

One conventional image processing apparatus includes: a signal input unit that converts input signals of a first gamut representing image data into linear first video signals; a display unit that displays a second gamut narrower than the first gamut; a gamut conversion unit that converts the first video signals into second video signals of the second gamut; a blend coefficient setting unit that sets the ratio at which the first video signals and the second video signals are to be combined, based on the saturation obtained from the input signals; and a color synthesizer that generates composite video signals by combining the first and second video signals at the ratio given by the blend coefficient. When the boundaries of the first and second gamuts are transformed into the L*a*b* color space, the blend coefficient setting unit sets the blend coefficient from the chroma components of the two gamut boundaries so that the saturation of the composite video signals does not exceed the upper saturation limit of the second video signals.

It should be understood that the foregoing description of the background art is provided merely to promote an understanding of the background of the present invention, and is not an admission that it corresponds to prior art already known to those skilled in the art.

KR 10-2015-0098566 A

SUMMARY OF THE INVENTION The present invention has been proposed to solve the above problem, and it is an object of the present invention to provide a real-time image synthesis method that composites a plurality of images into a single image very quickly while minimizing distortion, deformation, and visible seams at the boundary portions.

According to an aspect of the present invention, there is provided a real-time image synthesis method comprising: deriving a conventional weight for the conventional image that existed before compositing, for a redundant pixel located in the overlap region generated when a plurality of images are synthesized into a single image; deriving a new weight for the new image added at the redundant pixel by the compositing; performing the two derivation steps for the redundant pixel of each composited image according to the compositing order of the images; and calculating a data value for each image by applying the conventional weight and the new weight to the data value of the redundant pixel of that image, then summing all the per-image data values to calculate the final data value for the redundant pixel.

The overlap region may be bounded by a conventional boundary line existing before the compositing and a new boundary line added by the newly composited image.

The conventional weight may be a value obtained by dividing the shortest distance between the redundant pixel and the new boundary line by the sum of the shortest distance between the redundant pixel and the conventional boundary line and the shortest distance between the redundant pixel and the new boundary line.

The new weight may be a value obtained by subtracting the conventional weight from 1, or by dividing the shortest distance between the redundant pixel and the conventional boundary line by the sum of the shortest distance between the redundant pixel and the conventional boundary line and the shortest distance between the redundant pixel and the new boundary line.
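As a concrete illustration, the weight definitions above can be written as a short function. This is a minimal sketch in Python, not part of the patent; the two shortest distances are assumed to have been measured already.

```python
# Minimal sketch (not part of the patent text): the weight definitions
# above, assuming the two shortest distances are already known.
def weights(d_conventional: float, d_new: float) -> tuple[float, float]:
    """Return (conventional weight, new weight) for one redundant pixel.

    d_conventional: shortest distance to the conventional boundary line;
    d_new: shortest distance to the new boundary line.
    """
    total = d_conventional + d_new
    w_conventional = d_new / total   # as stated in the text above
    w_new = 1.0 - w_conventional     # equals d_conventional / total
    return w_conventional, w_new
```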

In the step of deriving the conventional weights and the new weights, the conventional weights and the new weights derived for the overlapping pixels for each image can be stored in the memory as a table.

In the step of calculating the final data value for the redundant pixel, the final data value for the corresponding redundant pixel may be calculated using the data value of the redundant pixel in each image being composited together with the conventional weights and new weights stored in the table in the memory.

The per-image data value for the first image, placed first, may be calculated by multiplying the data value of the redundant pixel of the first image by the new weight of the first image and by the conventional weights of all images placed after it.

The per-image data value for an intermediate image, composited in the middle, may be calculated by multiplying the data value of the redundant pixel of the corresponding intermediate image by the new weight of the intermediate image and by the conventional weights of the images placed after it.

The per-image data value for the last image, composited last, may be calculated by multiplying the data value of the redundant pixel of the last image by the new weight of the last image.
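Combining the three rules above (a reconstruction from these statements, since the patent's own formula images are not reproduced here), the final value of a redundant pixel covered by N images can be written, with w_1 = 0 for the first image, as:

$$P(u, v) = \sum_{k=1}^{N} (1 - w_k) \left( \prod_{j=k+1}^{N} w_j \right) \mathrm{Image}_k(u_k, v_k), \qquad w_1 = 0$$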

According to the real-time image synthesis method of the present invention, when a plurality of images are composited into one image, the synthesis is performed very quickly in real time, and at the same time distortion, deformation, and visible seams at the boundary portions can be minimized.

FIG. 1 is a conceptual diagram of a system for performing a real-time image synthesis method according to an embodiment of the present invention.
FIG. 2 is a flowchart of a real-time image synthesis method according to an embodiment of the present invention.
FIGS. 3 and 4 illustrate the derivation of image positions and boundary lines in a real-time image synthesis method according to an embodiment of the present invention.
FIGS. 5 to 7 illustrate the compositing process of a real-time image synthesis method according to an embodiment of the present invention.

FIG. 1 is a conceptual diagram of a system for performing a real-time image synthesis method according to an embodiment of the present invention, FIG. 2 is a flowchart of the method, FIGS. 3 and 4 illustrate how the image positions and boundary lines are derived, and FIGS. 5 to 7 illustrate the compositing process.

A real-time image synthesis method according to the present invention comprises: deriving a conventional weight for the conventional image that existed before compositing, for a redundant pixel located in the overlap region generated when a plurality of images are synthesized into one image; deriving a new weight for the new image added at the redundant pixel by the compositing; performing the two derivation steps for the redundant pixel of each composited image according to the compositing order of the images; and calculating a data value for each image by applying the conventional weight and the new weight to the data value of the redundant pixel of that image, then summing all the per-image data values to calculate the final data value for the redundant pixel.

A system for performing the real-time image synthesis method of the present invention is shown in FIG. 1. A plurality of cameras capture a plurality of images, and each camera transmits its image to a data receiving unit. The rules for compositing the images are determined in advance and stored in a memory unit. An arithmetic unit loads the compositing rules from the memory, substitutes the image data received by the data receiving unit into them to generate the final composite image data, and outputs the result as an image.

As shown in FIG. 3, the homography matrices H1^-1 and H2^-1 between the plurality of camera images 210 and 220 and the output monitor image 100 are derived. Through these matrices, the image acquired by each camera is placed at its determined position. As shown in FIG. 4, the edge of each camera image is projected onto the output monitor image, and the boundary line of each image can be recognized from the projected edge.
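The patent names no implementation for this projection; as one hedged sketch, OpenCV's warpPerspective can apply a known 3x3 homography to place each camera image on the output canvas (the function and variable names here are illustrative, not from the patent):

```python
# Hedged sketch: OpenCV's warpPerspective is one common way to project
# a camera image onto the output monitor image with its 3x3 homography.
import cv2
import numpy as np

def project_to_output(camera_img: np.ndarray, H: np.ndarray,
                      out_size: tuple[int, int]) -> np.ndarray:
    """Warp one camera image into output-monitor coordinates.

    H maps camera coordinates to output coordinates; out_size is the
    (width, height) of the output monitor image.
    """
    return cv2.warpPerspective(camera_img, H, out_size)
```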

Then, the overlap region produced by compositing the plurality of images is calculated. When a plurality of images are combined into one image, they must connect naturally at the boundaries, as if they had originally been a single image, so the boundaries should be blended as smoothly as possible. To this end, for each redundant pixel in each overlap region of the images, a conventional weight is derived for the conventional image that existed before compositing, and a new weight is derived for the new image added at that pixel by the compositing. The conventional weight and the new weight are derived for the redundant pixel of each composited image according to the compositing order of the images. A data value is then calculated for each image by applying the conventional weight and the new weight to the data value of the redundant pixel of that image, and all the per-image data values are summed to give the final data value for the redundant pixel.

That is, if three images are composited into one image, the compositing order of the images is determined in advance, and images placed earlier are overlapped by images composited later. Pixels located in the overlap regions are influenced both by the image placed first and by the images overlapped later, each influence depending on the pixel's distance from the corresponding image boundary. As a result, even where images overlap, no sharp boundary line appears and the images connect naturally.

FIG. 5 shows an example in which three images are composited. First, the first image 210 is placed in the output image; in this state no compositing occurs. When the second image 220 and then the third image 230 are placed, a region a appears where the first image 210 and the second image 220 overlap, and a region b appears where the first image 210, the second image 220, and the third image 230 all overlap.

Region a is shown in detail in FIG. 6. The problem here is to derive data values such as saturation, hue, and contrast for the redundant pixel Pa located in region a, and these depend on the weight given to each image. That is, the weights are derived from the shortest distances from the redundant pixel Pa to the boundary lines 210' and 220' of the respective images.

As shown in FIGS. 5 and 6, the first image 210 and the second image 220 overlap in region a, so the influences of both images must be reflected. For this, the shortest distance d_C(2) from the redundant pixel Pa to the boundary 210' of the first image, which was already present, is obtained. Then the shortest distance d_N(2) from Pa to the boundary 220' of the newly composited second image is obtained.
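The distances d_C(2) and d_N(2) are point-to-boundary distances. Below is a minimal sketch, assuming each projected boundary is available as a polyline of 2D vertices (an assumption for illustration; the patent does not prescribe a representation):

```python
# Sketch of the distance terms, assuming each boundary line is given
# as a polyline of 2D vertices (the projected edge of one image).
import numpy as np

def point_segment_distance(p: np.ndarray, a: np.ndarray, b: np.ndarray) -> float:
    """Distance from point p to the line segment a-b."""
    ab = b - a
    t = np.clip(np.dot(p - a, ab) / np.dot(ab, ab), 0.0, 1.0)
    return float(np.linalg.norm(p - (a + t * ab)))

def shortest_distance_to_boundary(p, polyline) -> float:
    """Shortest distance from a redundant pixel p to a boundary polyline."""
    p = np.asarray(p, dtype=float)
    pts = [np.asarray(v, dtype=float) for v in polyline]
    return min(point_segment_distance(p, pts[i], pts[i + 1])
               for i in range(len(pts) - 1))
```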

For the redundant pixel Pa, a weight is applied to the data value of the corresponding pixel in the first image and to the data value of the corresponding pixel in the second image, and the weighted values are combined to give the final data value of the redundant pixel Pa.

That is, the overlap region is bounded by the conventional boundary line 210' of the conventional image 210 that existed before compositing and the new boundary line 220' of the new image 220 added by the compositing. The conventional weight is the shortest distance d_N(2) between the redundant pixel Pa and the new boundary line 220', divided by the sum of the shortest distance d_C(2) between the redundant pixel and the conventional boundary line and the shortest distance d_N(2) between the redundant pixel and the new boundary line. The new weight is obtained by subtracting the conventional weight from 1, or equivalently by dividing the shortest distance between the redundant pixel and the conventional boundary line by the same sum.

That is, the weights in FIG. 6 can be expressed by the following equations.

w_C = d_N(2) / (d_C(2) + d_N(2)),    w_N = 1 - w_C = d_C(2) / (d_C(2) + d_N(2))

Here w_C is the conventional weight and w_N is the new weight. Since the conventional weight represents the influence of the conventional first image with boundary 210', it is proportional to the pixel's distance from the boundary 220' of the second image: a redundant pixel close to the new boundary 220' receives a small conventional weight, so the influence of the first image is minimized there.

Accordingly, the final data value of the redundant pixel can be expressed as follows.

Pa = w2 · Image1(u1, v1) + (1 - w2) · Image2(u2, v2)

Here, Image1(u1, v1) is the data value of the redundant pixel in the first image, Image2(u2, v2) is the data value of the redundant pixel in the second image, w1 is the conventional weight for the redundant pixel Pa of the first image (taken as 0, since nothing precedes the first image), and w2 is the conventional weight for the redundant pixel Pa of the second image. Under this equation, when two images overlap, the final value is the sum of the first image's redundant-pixel data multiplied by the conventional weight at the time the second image is composited, and the second image's redundant-pixel data multiplied by the corresponding new weight. Intuitively, the influence of the first image is reflected through its redundant-pixel data value and the influence of the second image through its redundant-pixel data value, and the two are combined to give the data value of the pixel.
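A worked numeric example may help (the numbers are illustrative, not from the patent): for a pixel one unit from the conventional boundary 210' and three units from the new boundary 220', the stated definition gives w2 = 3/4, so the first image dominates.

```python
# Worked example with illustrative numbers (not from the patent).
d_c, d_n = 1.0, 3.0
w2 = d_n / (d_c + d_n)                   # conventional weight = 0.75
image1_val, image2_val = 200.0, 120.0    # pixel data in each image
pa = w2 * image1_val + (1.0 - w2) * image2_val
print(pa)                                # 0.75*200 + 0.25*120 = 180.0
```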

FIG. 7 illustrates how the data value is obtained at a point where all three images overlap. First, the conventional weight w2 is obtained for the situation where the first and second images overlap (here the conventional weight w1 is regarded as 0). Then the third image is composited: the shortest distance d_C(3) from the redundant pixel Pb to the boundary lines 210' and 220' of the previously placed first and second images is obtained, together with the shortest distance d_N(3) to the boundary line 230' of the third image, and from these the corresponding conventional weight w3 is obtained. For the redundant pixel Pb, the final data value is then given by the following equation.

Pb = w3 · [ w2 · Image1(u1, v1) + (1 - w2) · Image2(u2, v2) ] + (1 - w3) · Image3(u3, v3)

That is, at a point where three images are composited, all three images exert an influence. The equation multiplies by the conventional weight w3 the contribution of the existing first and second images to the redundant pixel's data value (calculated in the same manner as the equation for Pa), and adds the data value of the redundant pixel in the third image multiplied by the new weight 1 - w3.

That is, when a plurality of images overlap, the data value is updated sequentially each time a newly composited image is overlapped on the existing result.
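This sequential update can be expressed directly as a loop. The sketch below assumes per-pixel weight maps are given as arrays, with the first image's conventional weight identically zero; this is an illustration, not the patent's implementation.

```python
# Sketch of the sequential update for N aligned single-channel images.
# weight_maps[k] is the per-pixel conventional-weight map w_k (HxW),
# with weight_maps[0] == 0 everywhere since nothing precedes image 1.
import numpy as np

def composite_sequentially(images, weight_maps):
    out = np.zeros_like(images[0], dtype=float)
    for img, w in zip(images, weight_maps):
        out = w * out + (1.0 - w) * img    # update with each new image
    return out
```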

Accordingly, the final formula can be expressed as:

P(u, v) = Σ_{k=1}^{N} (1 - w_k) · ( Π_{j=k+1}^{N} w_j ) · Image_k(u_k, v_k),    with w_1 = 0

That is, the per-image data value contributed by the first image, placed first, is the data value Image1(u1, v1) of its redundant pixel multiplied by the new weight 1 - w1 of the first image and by the conventional weights w2 through wN of all images placed after it:

Image1(u1, v1) · (1 - w1) · w2 · w3 · … · wN

The per-image data value contributed by an intermediate image k, composited in the middle, is the data value of its redundant pixel multiplied by the new weight of that intermediate image and by the conventional weights of the images placed after it:

Image_k(u_k, v_k) · (1 - w_k) · w_{k+1} · … · w_N

The per-image data value contributed by the last image, composited last, is the data value of its redundant pixel multiplied by the new weight 1 - w_N of the last image:

Image_N(u_N, v_N) · (1 - w_N)
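Expanding the recursion in this way gives each image a single combined coefficient. The sketch below (illustrative; math.prod requires Python 3.8+) computes these coefficients for one redundant pixel and checks that they sum to one:

```python
# Illustrative computation of the combined per-image coefficients
# c_k = (1 - w_k) * prod(w_j for j > k) at one redundant pixel.
import math

def coefficients(ws):
    """ws[k] is the conventional weight of image k; ws[0] = 0."""
    return [(1.0 - ws[k]) * math.prod(ws[k + 1:]) for k in range(len(ws))]

cs = coefficients([0.0, 0.75, 0.6])   # three overlapping images
print(cs)                             # approximately [0.45, 0.15, 0.4]
assert abs(sum(cs) - 1.0) < 1e-12     # the coefficients sum to one
```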

On the other hand, according to this formula, the following values are stored in advance in the table of the memory unit. That is, in the step of deriving the conventional weights and the new weights, the conventional weights w_k and the new weights 1 - w_k derived for the redundant pixels of each image can be stored as a table in the memory:

w_k(u, v),    1 - w_k(u, v)    (per-pixel weight tables for each image k)

c_k(u, v) = (1 - w_k) · Π_{j=k+1}^{N} w_j    (combined per-pixel coefficient applied to Image_k)

As described above, since the positions at which the plurality of images are composited are determined in advance, the influence of each image at every redundant pixel can be calculated in advance. In the step of calculating the final data value for a redundant pixel, that value can then be calculated using the data value of the redundant pixel in each image being composited together with the conventional weights and new weights stored in the table in the memory.
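A hedged sketch of the offline/online split described here: the per-image coefficient maps are computed once and stored as tables, and each incoming frame set is composited by a per-pixel multiply-add applied to each of the R, G, and B channels (array shapes and names are assumptions for illustration).

```python
# Sketch: coefficient tables are computed once (offline), then each
# frame is composited by a per-pixel multiply-add (online).
import numpy as np

def precompute_tables(weight_maps):
    """weight_maps[k]: HxW conventional-weight map w_k for image k.
    Returns c_k = (1 - w_k) * prod_{j>k} w_j as one HxW table per image."""
    tables = []
    for k in range(len(weight_maps)):
        c = 1.0 - weight_maps[k]
        for w_later in weight_maps[k + 1:]:
            c = c * w_later
        tables.append(c)
    return tables

def composite_frame(frames, tables):
    """frames[k]: HxWx3 color image already warped into output coordinates.
    The same coefficient is applied to each of the R, G, and B channels."""
    out = np.zeros_like(frames[0], dtype=float)
    for f, c in zip(frames, tables):
        out += c[..., None] * f           # broadcast over the channels
    return np.clip(out, 0, 255).astype(np.uint8)
```

At runtime no distances or weights are recomputed; the per-frame work is a fixed multiply-add per pixel, which is what makes real-time output feasible without dedicated hardware.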

When the actual image data values are input, the final data values are derived simply by combining the data of each input image with the predetermined coefficients, and the result is displayed on the output monitor image. A monochrome image can be processed in a single calculation, but for a color image the R, G, and B channels must each be calculated so that the correct color is output.

According to the real-time image synthesis method of the present invention, since a table stored in memory is used when compositing a plurality of images into one image, the synthesis is performed very quickly, in real time. At the same time, since the influence of each image is reflected according to the pixel's distance from the image boundaries, distortion, deformation, and visible seams at the boundary portions can be minimized.

While the invention has been shown and described with reference to specific embodiments, it will be apparent to those of ordinary skill in the art that various changes in form and detail may be made without departing from the spirit and scope of the invention as defined by the following claims.

100: output monitor image    210, 220, 230: plurality of images
210', 220', 230': boundary lines of the respective images

Claims (9)

1. A real-time image synthesis method comprising:
deriving a conventional weight for the conventional image that existed before compositing, for a redundant pixel located within the overlap region generated when a plurality of images are synthesized into one image;
deriving a new weight for the new image added at the redundant pixel by the compositing;
performing the deriving of the conventional weight and the deriving of the new weight for the redundant pixel of each composited image according to the compositing order of the images; and
calculating a data value for each image by applying the conventional weight and the new weight to the data value of the redundant pixel of that image, and summing all the per-image data values to calculate a final data value for the redundant pixel,
wherein the overlap region is bounded by a conventional boundary line existing before the compositing and a new boundary line added by the newly composited image, and
wherein the conventional weight is a value obtained by dividing the shortest distance between the redundant pixel and the new boundary line by the sum of the shortest distance between the redundant pixel and the conventional boundary line and the shortest distance between the redundant pixel and the new boundary line.
2. (Deleted)
3. (Deleted)
4. The method according to claim 1, wherein the new weight is a value obtained by subtracting the conventional weight from 1, or by dividing the shortest distance between the redundant pixel and the conventional boundary line by the sum of the shortest distance between the redundant pixel and the conventional boundary line and the shortest distance between the redundant pixel and the new boundary line.
5. The method according to claim 1, wherein, in the step of deriving the conventional weight and the new weight, the conventional weight and the new weight derived for the redundant pixel of each image are stored as a table in a memory.
6. The method according to claim 5, wherein, in the step of calculating the final data value for the redundant pixel, the final data value for the corresponding redundant pixel is calculated using the data value of the redundant pixel of each image being composited together with the conventional weights and new weights stored in the table in the memory.
7. The method according to claim 1, wherein the per-image data value for the first image, placed first, is calculated by multiplying the data value of the redundant pixel of the first image by the new weight of the first image and by the conventional weights of all images placed after it.
8. The method according to claim 7, wherein the per-image data value for an intermediate image, composited in the middle, is calculated by multiplying the data value of the redundant pixel of the corresponding intermediate image by the new weight of the intermediate image and by the conventional weights of the images placed after it.
9. The method according to claim 7, wherein the per-image data value for the last image, composited last, is calculated by multiplying the data value of the redundant pixel of the last image by the new weight of the last image.
KR1020160072216A 2016-06-10 2016-06-10 Image synthesis method in real time KR101819984B1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
KR1020160072216A KR101819984B1 (en) 2016-06-10 2016-06-10 Image synthesis method in real time
PCT/KR2017/003112 WO2017213335A1 (en) 2016-06-10 2017-03-23 Method for combining images in real time

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020160072216A KR101819984B1 (en) 2016-06-10 2016-06-10 Image synthesis method in real time

Publications (2)

Publication Number Publication Date
KR20170139816A KR20170139816A (en) 2017-12-20
KR101819984B1 (en) 2018-01-18

Family

ID=60578819

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020160072216A KR101819984B1 (en) 2016-06-10 2016-06-10 Image synthesis method in real time

Country Status (2)

Country Link
KR (1) KR101819984B1 (en)
WO (1) WO2017213335A1 (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109951634B (en) * 2019-03-14 2021-09-03 Oppo广东移动通信有限公司 Image synthesis method, device, terminal and storage medium
KR102390433B1 (en) * 2020-12-21 2022-04-25 서울시립대학교 산학협력단 Global convergence video production device and production method
US20230316675A1 (en) * 2022-04-04 2023-10-05 Genome International Corporation Traveling in time and space continuum

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007274377A (en) * 2006-03-31 2007-10-18 Denso Corp Periphery monitoring apparatus, and program
JP2009171570A (en) 2008-01-21 2009-07-30 Denso Internatl America Inc Image data processing method and image data processing system

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH08254499A (en) * 1995-03-17 1996-10-01 Sharp Corp Displaying/appearance inspection device
FR2880453A1 (en) * 2005-01-06 2006-07-07 Thomson Licensing Sa METHOD AND DEVICE FOR PROCESSING IMAGE MOSAIC
KR20130036593A (en) * 2011-10-04 2013-04-12 삼성디스플레이 주식회사 3d display apparatus prevneting image overlapping

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007274377A (en) * 2006-03-31 2007-10-18 Denso Corp Periphery monitoring apparatus, and program
JP2009171570A (en) 2008-01-21 2009-07-30 Denso Internatl America Inc Image data processing method and image data processing system

Also Published As

Publication number Publication date
WO2017213335A1 (en) 2017-12-14
KR20170139816A (en) 2017-12-20

Similar Documents

Publication Publication Date Title
JP5751986B2 (en) Image generation device
JP5451782B2 (en) Image processing apparatus and image processing method
JP7093015B2 (en) Panorama video compositing device, panoramic video compositing method, and panoramic video compositing program
KR101819984B1 (en) Image synthesis method in real time
JP2010034964A (en) Image composition apparatus, image composition method and image composition program
JP6087612B2 (en) Image processing apparatus and image processing method
JP5195841B2 (en) On-vehicle camera device and vehicle
JP2011211556A (en) Device and method for generating image, and program
JP2007067714A (en) Image processing circuit, and camera device and vehicle-mounted camera device using same
JP4828479B2 (en) Composite image generation apparatus and composite image generation program
JP2009141490A (en) Composite image generation device and composite image generation method
TW201806378A (en) Information processing device, information processing method, and program
JP6715217B2 (en) Video processing device, video processing method, and video processing program
JP2011205380A (en) Image processing apparatus
WO2018087856A1 (en) Image synthesis device and image synthesis method
US9723283B2 (en) Image processing device, image processing system, and image processing method
JP6391356B2 (en) Image processing apparatus, image processing method, and program
JP6770442B2 (en) Image processing device and its program
JP5588394B2 (en) Color correction apparatus, color correction method, and color correction program
JP2012150614A (en) Free viewpoint image generation device
JPH04329484A (en) Image synthesizer
JP2018182550A (en) Image processing apparatus
JP2019036902A (en) Video processing apparatus, video processing method, and video processing program
JP2017216634A (en) Video synthesizing device, video synthesizing method, and program
JP7458281B2 (en) Color correction device and its program

Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
E701 Decision to grant or registration of patent right
N231 Notification of change of applicant