JP2003317089A - Method and system for image correction - Google Patents

Method and system for image correction

Info

Publication number
JP2003317089A
JP2003317089A
Authority
JP
Japan
Prior art keywords
image
original image
end portion
overlap
boundary
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
JP2002121719A
Other languages
Japanese (ja)
Other versions
JP4184703B2 (en)
Inventor
Naoki Kawai
直樹 河合
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Dai Nippon Printing Co Ltd
Original Assignee
Dai Nippon Printing Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Dai Nippon Printing Co Ltd filed Critical Dai Nippon Printing Co Ltd
Priority to JP2002121719A
Publication of JP2003317089A
Application granted
Publication of JP4184703B2
Anticipated expiration
Current status: Expired - Fee Related

Landscapes

  • Image Processing (AREA)
  • Editing Of Facsimile Originals (AREA)

Abstract

PROBLEM TO BE SOLVED: To provide an image correction method and system that can create an image with an inconspicuous seam when images are joined, with high processing efficiency and excellent reproducibility, and without requiring operator skill or manual work.
SOLUTION: A source image to be subjected to endless processing is input (S1). After an overlap amount is set (S2), an overlap region is created near the ends of the source image (S3), and within this overlap region an optimum boundary, namely a minimum error boundary, is determined from the pixel values at one end of the source image and at the opposite end (S4). The pixel values are then corrected according to the optimum boundary (S5), and the endless-processed, corrected image is output (S6).
COPYRIGHT: (C)2004,JPO

Description

Detailed Description of the Invention

[0001]

【産業上の利用分野】本発明は、壁、床、天井等の建築
物内装表面、家具、建具、造作部表面等を加飾する建材
印刷物に利用される画像の作成に関し、特に繋ぎ目が目
立たないようにシームレスなエンドレス画像を作成する
ための技術に関する。
BACKGROUND OF THE INVENTION 1. Field of the Invention The present invention relates to the creation of images used for printed building materials for decorating interior surfaces of buildings such as walls, floors and ceilings, furniture, fittings, surfaces of featured parts, etc. The present invention relates to a technique for creating a seamless endless image so as not to be conspicuous.

[0002]

2. Description of the Related Art Printed building materials are much longer than general-purpose printed matter such as publications and commercial print. For example, for a printed sheet decorating a wall, the floor-to-ceiling height of a building is usually about 2.5 m, and the horizontal wall-to-wall length of a room is about 3.5 m along the long side of a six-tatami room. An image for printed building materials must therefore continue without a joint over at least that length; in other words, an endless image is required.

[0003] Such a long image cannot be fitted within one circumference of a printing plate cylinder. The plate is therefore made by connecting (joining), on the plate cylinder, the leading-edge side and the trailing-edge side, in the printing direction, of the image for one circumference. During printing, the one-circumference image is printed repeatedly and without interruption each time the plate cylinder makes one rotation, so that a long printed product of arbitrary length is obtained.

[0004] However, if the leading-edge and trailing-edge sides of the one-circumference image are simply butted together, the image becomes discontinuous at the connection (joint) and no endless image is obtained. To make the plate-cylinder image endless, the discontinuity between the images on either side of the joint must be smoothed away so that the joint cannot be recognized by eye; that is, the joint must be made seamless.

[0005] Accordingly, as a technique for making the plate-cylinder image endless by making the image joint seamless, which is particularly effective for wood-grain patterns, it has long been common practice, as described for example in Japanese Examined Patent Publication No. 43-2065, to give the circumferential joint a wavy or zigzag shape so that the joint itself is hard to notice, and to have a skilled worker manually retouch, bleach, dye or delete the area near the joint, thereby correcting the discontinuity of the image and making it seamless (prior art).

[0006] More recently, advances in computer-based digital image processing have improved workability: the original image is read with a scanner and converted into digital data, the portion to be corrected is designated with a mouse, tablet, cursor or the like on a display device such as a CRT, and a suitable image is copied from another part of the screen and transplanted (prior art). In addition, a method has been proposed that makes the seam-smoothing work more efficient by computing it automatically with a predetermined formula, namely by averaging the images on both sides of the joint (prior art).

[0007]

[Problems to be Solved by the Invention] However, in the prior art described above in which all of the seamless processing is done by hand by skilled workers, work efficiency is low, and in recent years it has also become difficult to secure such workers. Furthermore, it is difficult to reproduce exactly the same seamless-processed plate when reprinting.

[0008] In the prior art that adopts digitized, mechanized processing for the seamless work, the efficiency of steps such as joining the images is improved compared with the purely manual approach. The seamless work itself, however, still depends on the intuition and handwork of skilled workers, and in that respect has hardly been improved.

[0009] In the prior art that automates the seamless work itself by digital processing, efficiency is indeed improved. However, because that approach simply adds (averages) the images on both sides at the joint, high spatial-frequency components are lost near the joint and the fine structure, or sharpness, of the image is degraded. The problem therefore remains that the joint can still be recognized when viewed from a distance.

[0010] In view of the above, an object of the present invention is to provide an image correction method and system that can create a highly accurate image with an inconspicuous joint, that requires neither operator skill nor manual work, and that offers high processing efficiency and excellent reproducibility.

[0011]

[Means for Solving the Problems] To solve the above problems, the present invention provides a method of correcting an image so that, when one end of the image is joined to the opposite end of the same image, the joint is inconspicuous. The method comprises: an image input step of inputting the original image to be corrected; a step of setting an overlap amount at the joint portion of the original image; a step of creating an overlap region based on the set overlap amount; a step of determining the boundary between the end of the input original image and the opposite end, based on the accumulation of pixels for which the difference between the pixel value of the end and the pixel value of the opposite end within the overlap region is minimum; and a step of creating a corrected image by correcting, based on the determined boundary, the pixel values at the end and the opposite end of the original image.

[0012] According to the present invention, within the overlap region created from the overlap amount at the joint portion of the original image, the boundary between one end of the original image and the opposite end is determined from the accumulation of pixels whose pixel-value difference is minimum. As a result, an image whose joints are inconspicuous can be created even when multiple copies of the image are arranged side by side.

[0013]

DETAILED DESCRIPTION OF THE INVENTION An embodiment of the present invention will now be described in detail with reference to the drawings. (1. System Configuration) First, the system configuration of the image correction system according to the present invention is described. FIG. 1 is a system configuration diagram showing an embodiment of the image correction system. In FIG. 1, reference numeral 1 denotes an original image input means, 2 an overlap amount setting means, 3 an arithmetic control device, 3a an overlap region creating means, 3b an optimum boundary determining means, 3c a pixel value correcting means, and 4 an image output means.

[0014] The original image input means 1 is used to input the original image to be subjected to the seamless/endless processing, and is realized by having a digital-data reading device read the original image data. When the original image is drawn on a paper medium or the like, it is digitized with a scanner, digital camera or the like and then input. The overlap amount setting means 2 is used to set how far one end of the original image and the opposite end are to be overlapped, and is realized by an input device such as a mouse or keyboard.

[0015] The arithmetic control device 3 plays the central role in the present invention; based on the input original image and the set overlap amount, it creates an endless corrected image by correcting both end portions of the original image. In practice it consists of a computer's CPU and memory, and is realized by loading a dedicated program into memory and having the CPU execute it sequentially. The arithmetic control device 3 further comprises an overlap region creating means 3a, an optimum boundary determining means 3b, and a pixel value correcting means 3c. The overlap region creating means 3a calculates the size of the overlap region from the size of the input original image and the set overlap amount, and reserves a memory space of the calculated size. The optimum boundary determining means 3b determines the optimum boundary between an end portion of the original image and the opposite end portion, based on the pixel values of those two end portions within the overlap region. The pixel value correcting means 3c changes pixel values of the original image according to the determined optimum boundary. Each of these three means is realized by the CPU executing its own dedicated program. The image output means 4 outputs the created corrected image; a display device for showing the image, a printer for printing it, an external storage device such as a hard disk for outputting it as data, a recording device for a portable electronic recording medium, or a communication device for transmitting it to a remote location over a network can be used.

[0016] (2. Process Flow) Next, the image correction method according to the present invention is described together with the processing operation of the image correction system shown in FIG. 1. FIG. 2 is a flowchart outlining the image correction method according to the present invention. First, the original image to be subjected to the endless processing is input from the original image input means 1 (step S1). Next, the overlap amount is set using the overlap amount setting means 2 (step S2). The overlap amount is the amount by which one end portion of the original image and the opposite end portion are made to overlap each other, and the overlap region is determined from this amount. The overlap region is the region near the boundary between the two end portions of the original image in which the optimum boundary is determined. Without this optimum-boundary determination, a step would stand out at the mutual boundaries when multiple copies of the original image are arranged side by side. In the present embodiment, the overlap amount is specified as the number of pixels by which the two opposing end portions of the original image overlap each other in one direction.

[0017] Next, the overlap region is created from the size of the input original image and the set overlap amount (step S3). Specifically, the overlap region creating means 3a reserves, in the main storage of the computer, the memory space required for the overlap region. Suppose, for example, that the size (in pixels) of the original image G_ORI is xp horizontally by yp vertically. As shown in FIG. 3(b), when several copies of G_ORI are arranged in the x-axis direction (left-right in the figure), the side in contact with the adjacent copy is yp pixels long; if the overlap amount is α, an overlap region O of size yp × α is created. The present invention processes only a single original image G_ORI, and the left and right images arranged as in FIG. 3(b) are the same image G_ORI, but for convenience of explanation one copy is here called the original image G_L and the other the original image G_R. As shown in FIG. 3(b), the overlap region of size yp × α is created straddling the original images G_L and G_R.
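
As a rough illustration of step S3, the following Python/NumPy sketch (the function and variable names are ours, not the patent's) extracts the two opposing end strips of a single original image and reserves a yp × α working buffer for the overlap region; it assumes the image is an array indexed as [row, column(, channel)] and tiled along the x axis.

import numpy as np

def make_overlap_region(img: np.ndarray, alpha: int):
    """Prepare the overlap region of step S3 (illustrative sketch only).

    img   : original image G_ORI, shape (yp, xp) or (yp, xp, channels)
    alpha : overlap amount, i.e. the number of overlapping columns

    In the tiled arrangement of Fig. 3(b) the right end of the left copy
    (G_L) meets the left end of the right copy (G_R), so the right-end
    strip of the single image plays the role of G_L and the left-end
    strip plays the role of G_R inside the overlap region.
    """
    yp = img.shape[0]
    gl_strip = img[:, -alpha:]                          # right end of G_ORI -> G_L side
    gr_strip = img[:, :alpha]                           # left end of G_ORI  -> G_R side
    overlap = np.zeros((yp, alpha), dtype=np.float64)   # yp x alpha working buffer
    return gl_strip, gr_strip, overlap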

[0018] Once the overlap region has been created, the optimum boundary between the original images G_L and G_R within this region is determined (step S4). Specifically, a minimum error boundary is found within the overlap region such that the accumulated difference between the pixel values of G_L and G_R is minimized, and this minimum error boundary is used as the boundary between G_L and G_R in place of the original left and right sides of the image. The minimum error boundary thus obtained is the optimum boundary.

[0019] Since various techniques for determining such a minimum error boundary have been proposed, well-known algorithms such as those of Floyd or Dijkstra may be applied; here, the technique used in the present embodiment is described concretely. FIG. 4 shows the vicinity of the overlap region O of the original images G_L and G_R when the overlap amount is α = 6. In the example of FIG. 4, since α = 6, six pixels are provided in the overlap region in the x-axis direction (left-right in the figure). For such an overlap region, the pixel values of G_L are first compared with the pixel values of G_R for the six pixels of the first row. To calculate the position that minimizes the difference between the two, a value generally known as the L2 distance is used. For example, at a certain pixel Q, if the pixel value of G_L corresponding to Q is Q_L and the pixel value of G_R corresponding to Q is Q_R, the L2 distance is calculated by the following [Formula 1].

[0020] [Formula 1]   L2 = (Q_L − Q_R)^2

[0021] When this value is computed over the entire overlap region, the resulting array is exactly a map of the error that would arise if the boundary were placed there. Accordingly, for every path through this region of (6 × yp) pixels from the top row (the first row) to the bottom row (the yp-th row), the L2 values of the elements along the path are accumulated, and the path with the smallest accumulated cost (value) is taken as the most preferable boundary. When the target image is a colour image, the sum of the L2 values of the channels corresponding to each colour is used. By determining the path with the smallest accumulated cost in this way, the minimum error boundary is obtained.
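
A minimal sketch of [Formula 1] evaluated over the whole overlap region, with the per-channel values summed for colour images as described above; the names are illustrative and the strips are those contributed by G_L and G_R inside the overlap.

import numpy as np

def l2_cost_map(gl_strip: np.ndarray, gr_strip: np.ndarray) -> np.ndarray:
    """Per-pixel L2 values over the yp x alpha overlap region.

    gl_strip : pixel values contributed by G_L, shape (yp, alpha[, channels])
    gr_strip : pixel values contributed by G_R, same shape
    """
    diff = gl_strip.astype(np.float64) - gr_strip.astype(np.float64)
    cost = diff ** 2                    # [Formula 1]: L2 = (Q_L - Q_R)^2
    if cost.ndim == 3:                  # colour image: sum the channels' L2 values
        cost = cost.sum(axis=2)
    return cost                         # error map, shape (yp, alpha)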

[0022] Once the optimum boundary has been determined, the pixel value correcting means 3c corrects the pixel values of the original image based on it (step S5). Consider, for example, the case where the minimum error boundary shown in FIG. 5 is obtained in the overlap region. In FIG. 5, the shaded pixels form the minimum error boundary. In this case, pixels to the left of the minimum error boundary take the pixel values of the original image G_L, and pixels to the right of it take the pixel values of the original image G_R. In the present embodiment, a pixel lying on the minimum error boundary takes the average of the pixel values of G_L and G_R. In FIG. 5, pixels marked "L" are given the pixel values of G_L and pixels marked "R" are given the pixel values of G_R. When the corrected image has been obtained by correcting the pixel values of the original image, it is output from the image output means 4 (step S6).
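
Step S5 can be sketched as below, assuming (our own representation) that the boundary is given as one column index per row of the overlap region: pixels left of the boundary take G_L's values, pixels right of it keep G_R's values, and the boundary pixel takes their average, as in Fig. 5.

import numpy as np

def merge_overlap(gl_strip: np.ndarray, gr_strip: np.ndarray,
                  boundary: np.ndarray) -> np.ndarray:
    """Correct the overlap pixel values along the minimum error boundary.

    gl_strip : G_L's pixel values in the overlap (the "L" pixels of Fig. 5)
    gr_strip : G_R's pixel values in the overlap (the "R" pixels of Fig. 5)
    boundary : for each row y, the column index of the boundary pixel
    """
    merged = gr_strip.astype(np.float64).copy()      # start from the "R" side
    for y, b in enumerate(boundary):
        merged[y, :b] = gl_strip[y, :b]              # left of the boundary: "L" pixels
        merged[y, b] = (gl_strip[y, b].astype(np.float64)
                        + gr_strip[y, b].astype(np.float64)) / 2.0   # boundary: average
    return merged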

[0023] Here the procedure for finding the minimum error boundary is summarized; an outline is given in the flowchart of FIG. 6. First, for every pixel of the first row (six pixels in the example of the figures), the L2 value of that pixel is stored as its accumulated cost (step S11). Next, for each pixel of the second row, the neighbour among its upper-left, directly-above and upper-right pixels that has the smallest accumulated cost is stored as the shortest path to this pixel, and the accumulated cost of that upper pixel plus the pixel's own L2 value is stored as its accumulated cost (step S12). The processing of step S12 is repeated down to the last row, the yp-th row in the examples of FIGS. 4 and 5. Among all the pixels of the last row, the pixel with the smallest accumulated cost and the path leading to it are taken as the minimum error boundary (step S13).
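
The procedure of steps S11 to S13 amounts to a dynamic programme over the error map; the following is a sketch consistent with Fig. 6, in which implementation details such as the backtracking array are our own.

import numpy as np

def minimum_error_boundary(cost: np.ndarray) -> np.ndarray:
    """Return, for each row, the column of the minimum error boundary.

    cost : (yp, alpha) map of per-pixel L2 values over the overlap region.
    """
    yp, alpha = cost.shape
    acc = cost.astype(np.float64)                # step S11: first row keeps its own L2 values
    back = np.zeros((yp, alpha), dtype=np.int64)

    for y in range(1, yp):                       # step S12: accumulate row by row
        for x in range(alpha):
            lo, hi = max(0, x - 1), min(alpha - 1, x + 1)
            prev = acc[y - 1, lo:hi + 1]         # upper-left / above / upper-right
            best = int(np.argmin(prev)) + lo
            back[y, x] = best                    # remember the shortest path to (y, x)
            acc[y, x] += acc[y - 1, best]        # own L2 plus smallest accumulated cost

    boundary = np.empty(yp, dtype=np.int64)      # step S13: cheapest pixel in the last row,
    boundary[-1] = int(np.argmin(acc[-1]))       # then trace the stored path back upwards
    for y in range(yp - 1, 0, -1):
        boundary[y - 1] = back[y, boundary[y]]
    return boundary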

[0024] As a result, the optimum boundary B shown in FIG. 7(a) is obtained in the overlap region O shown in FIG. 3(b). In FIG. 7(a), the pixel values of the original image G_R are given to the right of the optimum boundary B, and the pixel values of the original image G_L to its left. Intuitively, this means that when copies of the original image are arranged in the x-axis direction, images delimited by the optimum boundary as shown in FIG. 7(b) are arranged instead of the rectangular original images. It would not be practical, however, for the actual image to have the curved outline shown in FIG. 7(b). In practice, the overlap region on one side (for example, the right side) of the original image is deleted, and the pixel values of that region on the right side of the optimum boundary are copied into the corresponding overlap region on the left side. Specifically, as shown in FIG. 7(c), the pixels of the region denoted P_R' are copied onto the pixels of the region denoted P_L of the original image G_ORI, and the regions denoted P_R and P_R' of G_ORI are deleted. In other words, although the above explanation used two images arranged side by side as in FIG. 3(b) for convenience, in reality the pixel values at the two end portions of a single original image (the left and right end portions in the examples of FIGS. 3(a) and 7(c)) are corrected and deleted. By arranging several copies of the corrected image obtained in this way in the x-axis direction, the joints become inconspicuous.
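
One consistent reading of the copy-and-delete bookkeeping of FIG. 7(c), sketched with hypothetical names: the right-end overlap strip of the single image is deleted and the left-end strip is replaced by the overlap merged along the boundary, so the output is α pixels narrower than the input and tiles seamlessly in the x direction.

import numpy as np

def apply_endless_crop(img: np.ndarray, merged_overlap: np.ndarray,
                       alpha: int) -> np.ndarray:
    """Build the corrected (endless) image from the merged overlap.

    img            : original image G_ORI, shape (yp, xp[, channels])
    merged_overlap : overlap pixel values corrected along the optimum boundary
    alpha          : overlap amount in pixels
    """
    out = img.astype(np.float64)[:, :img.shape[1] - alpha].copy()  # delete the right-end strip
    out[:, :alpha] = merged_overlap                                # overwrite the left-end strip
    return out

Tiling the result, for example with np.concatenate([out, out], axis=1), then places the joint along the minimum error boundary instead of along a straight vertical line.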

[0025] In the example above, the image was corrected so as to become an endless image when copies of the original image are arranged in the x-axis direction; when they are arranged in the y-axis direction, the x and y axes are simply interchanged and essentially the same processing is performed. When images are to be arranged in both the x-axis and y-axis directions, applying the above image correction processing in both directions makes it possible to obtain a corrected image in which none of the joints is conspicuous even when the images are tiled two-dimensionally.
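
For the y-axis case just mentioned, one simple realization (an assumption on our part; any x-direction routine, for example a composition of the sketches above, could be passed in) is to transpose the image, reuse the x-direction processing, and transpose back.

import numpy as np

def make_endless_vertically(img: np.ndarray, alpha: int,
                            endless_horizontally) -> np.ndarray:
    """Apply an x-direction endless routine along the y axis by swapping axes.

    endless_horizontally : callable taking (image, alpha) and returning the
                           horizontally endless image.
    """
    axes = (1, 0, 2) if img.ndim == 3 else (1, 0)   # swap x and y, keep channels last
    transposed = np.transpose(img, axes)
    result = endless_horizontally(transposed, alpha)
    return np.transpose(result, axes)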

[0026] Although the example above dealt with arranging several copies of the same image, a seamless image can also be created from a sequence of different images whose connecting sides have the same length (number of pixels): the images are joined in order so that each joint is inconspicuous, and the left end portion of the leftmost image and the right end portion of the rightmost image are likewise connected at an optimum boundary within an overlap region. In that case, the correction processing is carried out using different images as the original images G_R and G_L of FIGS. 3(b) and 7(a).

[0027]

[Effects of the Invention] As described above, according to the present invention, the original image to be corrected into an endless image is input, the overlap amount with the adjacent original image is set, an overlap region is created based on the set overlap amount, the boundary between one end portion of the input original image and the opposite end portion is determined from the accumulation of pixels for which the difference between the pixel values of the two end portions within the overlap region is minimum, and an endless image is created by correcting, based on the determined boundary, the pixel values at the end portions of the original image. This has the effect that an image whose joints are inconspicuous can be created even when multiple copies are arranged side by side.

[Brief Description of the Drawings]

FIG. 1 is a configuration diagram of an image correction system according to the present invention.

FIG. 2 is a flowchart of an image correction method according to the present invention.

FIG. 3 is a diagram showing the original image G_ORI and the overlap region O formed when several copies of G_ORI are arranged side by side.

FIG. 4 is a diagram showing the vicinity of the overlap region O.

FIG. 5 is a diagram showing the vicinity of the overlap region O after the optimum boundary has been determined.

FIG. 6 is a flowchart outlining how the minimum error boundary is found.

FIG. 7 is a diagram showing the state of the corrected image obtained by determining the optimum boundary.

[Explanation of Symbols]

1 ... original image input means, 2 ... overlap amount setting means, 3 ... arithmetic control device, 3a ... overlap region creating means, 3b ... optimum boundary determining means, 3c ... pixel value correcting means, 4 ... image output means

Claims (3)

[Claims]
[Claim 1] A method of correcting an image so that, when an end portion of the image is joined to the opposite end portion of the same image, the joint is inconspicuous, the method comprising: an image input step of inputting an original image to be corrected; a step of setting an overlap amount at a joint portion of the original image; a step of creating an overlap region based on the set overlap amount; a step of determining a boundary between the end portion of the input original image and the opposite end portion, based on an accumulation of pixels for which the difference between the pixel value of the end portion and the pixel value of the opposite end portion within the overlap region is minimum; and a step of creating a corrected image by correcting, based on the determined boundary, the pixel values at the end portion and the opposite end portion of the original image.
[Claim 2] An image correction system comprising: an original image input means for inputting an original image to be corrected; an overlap amount setting means for setting an overlap amount at a joint portion of the original image; an overlap region creating means for creating an overlap region based on the set overlap amount; an optimum boundary determining means for determining a boundary between an end portion of the input original image and the opposite end portion, based on an accumulation of pixels for which the difference between the pixel value of the end portion and the pixel value of the opposite end portion within the overlap region is minimum; a pixel value correcting means for creating a corrected image by correcting, based on the determined boundary, the pixel values at the end portion and the opposite end portion of the original image; and an image output means for outputting the obtained corrected image.
[Claim 3] A program for causing a computer to execute: an image input step of inputting an original image to be corrected; a step of setting an overlap amount at a joint portion of the original image; a step of creating an overlap region based on the set overlap amount; a step of determining a boundary between an end portion of the input original image and the opposite end portion, based on an accumulation of pixels for which the difference between the pixel value of the end portion and the pixel value of the opposite end portion within the overlap region is minimum; and a step of creating a corrected image by correcting, based on the determined boundary, the pixel values at the end portion and the opposite end portion of the original image.
JP2002121719A 2002-04-24 2002-04-24 Image correction method and system Expired - Fee Related JP4184703B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2002121719A JP4184703B2 (en) 2002-04-24 2002-04-24 Image correction method and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2002121719A JP4184703B2 (en) 2002-04-24 2002-04-24 Image correction method and system

Publications (2)

Publication Number Publication Date
JP2003317089A true JP2003317089A (en) 2003-11-07
JP4184703B2 JP4184703B2 (en) 2008-11-19

Family

ID=29537536

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2002121719A Expired - Fee Related JP4184703B2 (en) 2002-04-24 2002-04-24 Image correction method and system

Country Status (1)

Country Link
JP (1) JP4184703B2 (en)

Cited By (31)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2005088251A1 (en) * 2004-02-27 2005-09-22 Intergraph Software Technologies Company Forming a single image from overlapping images
JP2005326944A (en) * 2004-05-12 2005-11-24 Hitachi Ltd Device and method for generating map image by laser measurement
US7787659B2 (en) 2002-11-08 2010-08-31 Pictometry International Corp. Method and apparatus for capturing, geolocating and measuring oblique images
JP2010272093A (en) * 2009-05-25 2010-12-02 Asahi Koyo Kk Image connecting method, device and program
US7873238B2 (en) 2006-08-30 2011-01-18 Pictometry International Corporation Mosaic oblique images and methods of making and using same
US7991226B2 (en) 2007-10-12 2011-08-02 Pictometry International Corporation System and process for color-balancing a series of oblique images
JP2011148184A (en) * 2010-01-21 2011-08-04 Dainippon Printing Co Ltd Texture data processor, method for processing texture data, program, device for manufacturing embossed plate, method for manufacturing embossed plate, and sheet
JP2012173424A (en) * 2011-02-18 2012-09-10 Canon Inc Image display apparatus and control method thereof
US8385672B2 (en) 2007-05-01 2013-02-26 Pictometry International Corp. System for detecting image abnormalities
US8401222B2 (en) 2009-05-22 2013-03-19 Pictometry International Corp. System and process for roof measurement using aerial imagery
US8477190B2 (en) 2010-07-07 2013-07-02 Pictometry International Corp. Real-time moving platform management system
US8520079B2 (en) 2007-02-15 2013-08-27 Pictometry International Corp. Event multiplexer for managing the capture of images
US8531472B2 (en) 2007-12-03 2013-09-10 Pictometry International Corp. Systems and methods for rapid three-dimensional modeling with real façade texture
US8588547B2 (en) 2008-08-05 2013-11-19 Pictometry International Corp. Cut-line steering methods for forming a mosaic image of a geographical area
US8593518B2 (en) 2007-02-01 2013-11-26 Pictometry International Corp. Computer system for continuous oblique panning
US8823732B2 (en) 2010-12-17 2014-09-02 Pictometry International Corp. Systems and methods for processing images with edge detection and snap-to feature
US9183538B2 (en) 2012-03-19 2015-11-10 Pictometry International Corp. Method and system for quick square roof reporting
US9262818B2 (en) 2007-05-01 2016-02-16 Pictometry International Corp. System for detecting image abnormalities
US9275080B2 (en) 2013-03-15 2016-03-01 Pictometry International Corp. System and method for early access to captured images
US9292913B2 (en) 2014-01-31 2016-03-22 Pictometry International Corp. Augmented three dimensional point collection of vertical structures
US9330494B2 (en) 2009-10-26 2016-05-03 Pictometry International Corp. Method for the automatic material classification and texture simulation for 3D models
US9612598B2 (en) 2014-01-10 2017-04-04 Pictometry International Corp. Unmanned aircraft structure evaluation system and method
US9753950B2 (en) 2013-03-15 2017-09-05 Pictometry International Corp. Virtual property reporting for automatic structure detection
US9881163B2 (en) 2013-03-12 2018-01-30 Pictometry International Corp. System and method for performing sensitive geo-spatial processing in non-sensitive operator environments
US9953112B2 (en) 2014-02-08 2018-04-24 Pictometry International Corp. Method and system for displaying room interiors on a floor plan
US10325350B2 (en) 2011-06-10 2019-06-18 Pictometry International Corp. System and method for forming a video stream containing GIS data in real-time
US10402676B2 (en) 2016-02-15 2019-09-03 Pictometry International Corp. Automated system and methodology for feature extraction
US10502813B2 (en) 2013-03-12 2019-12-10 Pictometry International Corp. LiDAR system producing multiple scan paths and method of making and using same
US10671648B2 (en) 2016-02-22 2020-06-02 Eagle View Technologies, Inc. Integrated centralized property database systems and methods
US12079013B2 (en) 2016-01-08 2024-09-03 Pictometry International Corp. Systems and methods for taking, processing, retrieving, and displaying images from unmanned aerial vehicles
US12123959B2 (en) 2023-07-18 2024-10-22 Pictometry International Corp. Unmanned aircraft structure evaluation system and method

Cited By (92)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7995799B2 (en) 2002-11-08 2011-08-09 Pictometry International Corporation Method and apparatus for capturing geolocating and measuring oblique images
US10607357B2 (en) 2002-11-08 2020-03-31 Pictometry International Corp. Method and apparatus for capturing, geolocating and measuring oblique images
US11069077B2 (en) 2002-11-08 2021-07-20 Pictometry International Corp. Method and apparatus for capturing, geolocating and measuring oblique images
US9443305B2 (en) 2002-11-08 2016-09-13 Pictometry International Corp. Method and apparatus for capturing, geolocating and measuring oblique images
US9811922B2 (en) 2002-11-08 2017-11-07 Pictometry International Corp. Method and apparatus for capturing, geolocating and measuring oblique images
US7787659B2 (en) 2002-11-08 2010-08-31 Pictometry International Corp. Method and apparatus for capturing, geolocating and measuring oblique images
WO2005088251A1 (en) * 2004-02-27 2005-09-22 Intergraph Software Technologies Company Forming a single image from overlapping images
JP4533659B2 (en) * 2004-05-12 2010-09-01 株式会社日立製作所 Apparatus and method for generating map image by laser measurement
JP2005326944A (en) * 2004-05-12 2005-11-24 Hitachi Ltd Device and method for generating map image by laser measurement
US7873238B2 (en) 2006-08-30 2011-01-18 Pictometry International Corporation Mosaic oblique images and methods of making and using same
US9805489B2 (en) 2006-08-30 2017-10-31 Pictometry International Corp. Mosaic oblique images and methods of making and using same
US9959653B2 (en) 2006-08-30 2018-05-01 Pictometry International Corporation Mosaic oblique images and methods of making and using same
US10489953B2 (en) 2006-08-30 2019-11-26 Pictometry International Corp. Mosaic oblique images and methods of making and using same
US9437029B2 (en) 2006-08-30 2016-09-06 Pictometry International Corp. Mosaic oblique images and methods of making and using same
US11080911B2 (en) 2006-08-30 2021-08-03 Pictometry International Corp. Mosaic oblique images and systems and methods of making and using same
US8593518B2 (en) 2007-02-01 2013-11-26 Pictometry International Corp. Computer system for continuous oblique panning
US8520079B2 (en) 2007-02-15 2013-08-27 Pictometry International Corp. Event multiplexer for managing the capture of images
US10679331B2 (en) 2007-05-01 2020-06-09 Pictometry International Corp. System for detecting image abnormalities
US8385672B2 (en) 2007-05-01 2013-02-26 Pictometry International Corp. System for detecting image abnormalities
US9262818B2 (en) 2007-05-01 2016-02-16 Pictometry International Corp. System for detecting image abnormalities
US9959609B2 (en) 2007-05-01 2018-05-01 Pictometry International Corporation System for detecting image abnormalities
US10198803B2 (en) 2007-05-01 2019-02-05 Pictometry International Corp. System for detecting image abnormalities
US11100625B2 (en) 2007-05-01 2021-08-24 Pictometry International Corp. System for detecting image abnormalities
US9633425B2 (en) 2007-05-01 2017-04-25 Pictometry International Corp. System for detecting image abnormalities
US11514564B2 (en) 2007-05-01 2022-11-29 Pictometry International Corp. System for detecting image abnormalities
US11087506B2 (en) 2007-10-12 2021-08-10 Pictometry International Corp. System and process for color-balancing a series of oblique images
US9503615B2 (en) 2007-10-12 2016-11-22 Pictometry International Corp. System and process for color-balancing a series of oblique images
US10580169B2 (en) 2007-10-12 2020-03-03 Pictometry International Corp. System and process for color-balancing a series of oblique images
US7991226B2 (en) 2007-10-12 2011-08-02 Pictometry International Corporation System and process for color-balancing a series of oblique images
US9520000B2 (en) 2007-12-03 2016-12-13 Pictometry International Corp. Systems and methods for rapid three-dimensional modeling with real facade texture
US10573069B2 (en) 2007-12-03 2020-02-25 Pictometry International Corp. Systems and methods for rapid three-dimensional modeling with real facade texture
US8531472B2 (en) 2007-12-03 2013-09-10 Pictometry International Corp. Systems and methods for rapid three-dimensional modeling with real façade texture
US10229532B2 (en) 2007-12-03 2019-03-12 Pictometry International Corporation Systems and methods for rapid three-dimensional modeling with real facade texture
US10896540B2 (en) 2007-12-03 2021-01-19 Pictometry International Corp. Systems and methods for rapid three-dimensional modeling with real façade texture
US11263808B2 (en) 2007-12-03 2022-03-01 Pictometry International Corp. Systems and methods for rapid three-dimensional modeling with real façade texture
US9836882B2 (en) 2007-12-03 2017-12-05 Pictometry International Corp. Systems and methods for rapid three-dimensional modeling with real facade texture
US9972126B2 (en) 2007-12-03 2018-05-15 Pictometry International Corp. Systems and methods for rapid three-dimensional modeling with real facade texture
US9275496B2 (en) 2007-12-03 2016-03-01 Pictometry International Corp. Systems and methods for rapid three-dimensional modeling with real facade texture
US8588547B2 (en) 2008-08-05 2013-11-19 Pictometry International Corp. Cut-line steering methods for forming a mosaic image of a geographical area
US11551331B2 (en) 2008-08-05 2023-01-10 Pictometry International Corp. Cut-line steering methods for forming a mosaic image of a geographical area
US10424047B2 (en) 2008-08-05 2019-09-24 Pictometry International Corp. Cut line steering methods for forming a mosaic image of a geographical area
US10839484B2 (en) 2008-08-05 2020-11-17 Pictometry International Corp. Cut-line steering methods for forming a mosaic image of a geographical area
US9898802B2 (en) 2008-08-05 2018-02-20 Pictometry International Corp. Cut line steering methods for forming a mosaic image of a geographical area
US8401222B2 (en) 2009-05-22 2013-03-19 Pictometry International Corp. System and process for roof measurement using aerial imagery
US9933254B2 (en) 2009-05-22 2018-04-03 Pictometry International Corp. System and process for roof measurement using aerial imagery
JP2010272093A (en) * 2009-05-25 2010-12-02 Asahi Koyo Kk Image connecting method, device and program
US10198857B2 (en) 2009-10-26 2019-02-05 Pictometry International Corp. Method for the automatic material classification and texture simulation for 3D models
US9330494B2 (en) 2009-10-26 2016-05-03 Pictometry International Corp. Method for the automatic material classification and texture simulation for 3D models
US9959667B2 (en) 2009-10-26 2018-05-01 Pictometry International Corp. Method for the automatic material classification and texture simulation for 3D models
JP2011148184A (en) * 2010-01-21 2011-08-04 Dainippon Printing Co Ltd Texture data processor, method for processing texture data, program, device for manufacturing embossed plate, method for manufacturing embossed plate, and sheet
US11483518B2 (en) 2010-07-07 2022-10-25 Pictometry International Corp. Real-time moving platform management system
US8477190B2 (en) 2010-07-07 2013-07-02 Pictometry International Corp. Real-time moving platform management system
US8823732B2 (en) 2010-12-17 2014-09-02 Pictometry International Corp. Systems and methods for processing images with edge detection and snap-to feature
US11003943B2 (en) 2010-12-17 2021-05-11 Pictometry International Corp. Systems and methods for processing images with edge detection and snap-to feature
US10621463B2 (en) 2010-12-17 2020-04-14 Pictometry International Corp. Systems and methods for processing images with edge detection and snap-to feature
JP2012173424A (en) * 2011-02-18 2012-09-10 Canon Inc Image display apparatus and control method thereof
US10325350B2 (en) 2011-06-10 2019-06-18 Pictometry International Corp. System and method for forming a video stream containing GIS data in real-time
US9183538B2 (en) 2012-03-19 2015-11-10 Pictometry International Corp. Method and system for quick square roof reporting
US10346935B2 (en) 2012-03-19 2019-07-09 Pictometry International Corp. Medium and method for quick square roof reporting
US11525897B2 (en) 2013-03-12 2022-12-13 Pictometry International Corp. LiDAR system producing multiple scan paths and method of making and using same
US10311238B2 (en) 2013-03-12 2019-06-04 Pictometry International Corp. System and method for performing sensitive geo-spatial processing in non-sensitive operator environments
US10502813B2 (en) 2013-03-12 2019-12-10 Pictometry International Corp. LiDAR system producing multiple scan paths and method of making and using same
US9881163B2 (en) 2013-03-12 2018-01-30 Pictometry International Corp. System and method for performing sensitive geo-spatial processing in non-sensitive operator environments
US10311089B2 (en) 2013-03-15 2019-06-04 Pictometry International Corp. System and method for early access to captured images
US9275080B2 (en) 2013-03-15 2016-03-01 Pictometry International Corp. System and method for early access to captured images
US9753950B2 (en) 2013-03-15 2017-09-05 Pictometry International Corp. Virtual property reporting for automatic structure detection
US9805059B2 (en) 2013-03-15 2017-10-31 Pictometry International Corp. System and method for early access to captured images
US10037463B2 (en) 2014-01-10 2018-07-31 Pictometry International Corp. Unmanned aircraft structure evaluation system and method
US10037464B2 (en) 2014-01-10 2018-07-31 Pictometry International Corp. Unmanned aircraft structure evaluation system and method
US11747486B2 (en) 2014-01-10 2023-09-05 Pictometry International Corp. Unmanned aircraft structure evaluation system and method
US9612598B2 (en) 2014-01-10 2017-04-04 Pictometry International Corp. Unmanned aircraft structure evaluation system and method
US10032078B2 (en) 2014-01-10 2018-07-24 Pictometry International Corp. Unmanned aircraft structure evaluation system and method
US11120262B2 (en) 2014-01-10 2021-09-14 Pictometry International Corp. Unmanned aircraft structure evaluation system and method
US10181081B2 (en) 2014-01-10 2019-01-15 Pictometry International Corp. Unmanned aircraft structure evaluation system and method
US10318809B2 (en) 2014-01-10 2019-06-11 Pictometry International Corp. Unmanned aircraft structure evaluation system and method
US10204269B2 (en) 2014-01-10 2019-02-12 Pictometry International Corp. Unmanned aircraft obstacle avoidance
US10181080B2 (en) 2014-01-10 2019-01-15 Pictometry International Corp. Unmanned aircraft structure evaluation system and method
US11087131B2 (en) 2014-01-10 2021-08-10 Pictometry International Corp. Unmanned aircraft structure evaluation system and method
US9292913B2 (en) 2014-01-31 2016-03-22 Pictometry International Corp. Augmented three dimensional point collection of vertical structures
US11686849B2 (en) 2014-01-31 2023-06-27 Pictometry International Corp. Augmented three dimensional point collection of vertical structures
US10571575B2 (en) 2014-01-31 2020-02-25 Pictometry International Corp. Augmented three dimensional point collection of vertical structures
US10338222B2 (en) 2014-01-31 2019-07-02 Pictometry International Corp. Augmented three dimensional point collection of vertical structures
US9542738B2 (en) 2014-01-31 2017-01-10 Pictometry International Corp. Augmented three dimensional point collection of vertical structures
US10942276B2 (en) 2014-01-31 2021-03-09 Pictometry International Corp. Augmented three dimensional point collection of vertical structures
US9953112B2 (en) 2014-02-08 2018-04-24 Pictometry International Corp. Method and system for displaying room interiors on a floor plan
US11100259B2 (en) 2014-02-08 2021-08-24 Pictometry International Corp. Method and system for displaying room interiors on a floor plan
US12079013B2 (en) 2016-01-08 2024-09-03 Pictometry International Corp. Systems and methods for taking, processing, retrieving, and displaying images from unmanned aerial vehicles
US10796189B2 (en) 2016-02-15 2020-10-06 Pictometry International Corp. Automated system and methodology for feature extraction
US11417081B2 (en) 2016-02-15 2022-08-16 Pictometry International Corp. Automated system and methodology for feature extraction
US10402676B2 (en) 2016-02-15 2019-09-03 Pictometry International Corp. Automated system and methodology for feature extraction
US10671648B2 (en) 2016-02-22 2020-06-02 Eagle View Technologies, Inc. Integrated centralized property database systems and methods
US12123959B2 (en) 2023-07-18 2024-10-22 Pictometry International Corp. Unmanned aircraft structure evaluation system and method

Also Published As

Publication number Publication date
JP4184703B2 (en) 2008-11-19

Similar Documents

Publication Publication Date Title
JP2003317089A (en) Method and system for image correction
US6411742B1 (en) Merging images to form a panoramic image
JP4501481B2 (en) Image correction method for multi-projection system
JP2007525770A (en) Technology to form a single image from multiple overlapping images
US6583823B1 (en) Methods, apparatuses, and mediums for repairing a pixel associated with motion-picture processes
CN109300084B (en) Image stitching method and device, electronic equipment and storage medium
CN107301005B (en) Method for determining touch position and touch projection system using same
CN105975236A (en) Automatic positioning method, automatic positioning display system and display devices
JP2005210418A (en) Image processing device, method, and program
JP2012221235A (en) Image processing method, image processing device, building image diagnosis method, and building image diagnosis apparatus
JP2007081611A (en) Method of setting display screen correction parameter
JP2003319165A (en) Image composition method and system
US8320016B2 (en) Image processing method for fast fill-in of a figure and computer readable medium therefor
JP4184704B2 (en) Image correction method and system
JP2004021578A (en) Image processing method
JP2003187244A (en) Image correction method
JP6508874B2 (en) Image projection apparatus, image projection method, and computer program
JP3830598B2 (en) Color correction method
JPH11341266A (en) Method and device for picture processing
JPH09200496A (en) Automatic image editing device
JP6090004B2 (en) Reading apparatus, reading method and reading program
JPH11345313A (en) Picture processor and picture processing method
JP4202661B2 (en) Image creation method
JP3584195B2 (en) Image processing method and apparatus
JP2000276584A (en) Method and device for image processing, and storage medium

Legal Events

Date Code Title Description
A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20050418

A977 Report on retrieval

Free format text: JAPANESE INTERMEDIATE CODE: A971007

Effective date: 20080513

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20080520

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20080808

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20080904

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20110912

Year of fee payment: 3

R150 Certificate of patent or registration of utility model

Free format text: JAPANESE INTERMEDIATE CODE: R150

Ref document number: 4184703

Country of ref document: JP

Free format text: JAPANESE INTERMEDIATE CODE: R150

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20120912

Year of fee payment: 4

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20130912

Year of fee payment: 5

LAPS Cancellation because of no payment of annual fees