WO2012023370A1 - Target position determination device - Google Patents
Target position determination device
- Publication number
- WO2012023370A1 (PCT/JP2011/066124)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- color
- target
- grayscale
- profile
- target position
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/64—Circuits for processing colour signals
- H04N9/70—Circuits for processing colour signals for colour killing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/80—Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/20—Image preprocessing
- G06V10/25—Determination of region of interest [ROI] or a volume of interest [VOI]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
- G06V10/56—Extraction of image or video features relating to colour
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/40—Picture signal circuits
- H04N1/40012—Conversion of colour to monochrome
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N17/00—Diagnosis, testing or measuring for television systems or their details
- H04N17/002—Diagnosis, testing or measuring for television systems or their details for television cameras
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10024—Color image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30204—Marker
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30248—Vehicle exterior or interior
- G06T2207/30252—Vehicle exterior; Vicinity of vehicle
Definitions
- the present invention relates to a target position determination apparatus that determines the position of a color target on a captured image by performing image processing on the captured image obtained by capturing a color target composed of a combination of different colors.
- an increasing number of vehicles are equipped with a camera so that the driver of the vehicle can visually recognize a scene such as the side or rear of the vehicle via a monitor inside the vehicle.
- an apparatus that supports driving such as parking by performing image processing using a photographed image of the camera has been developed.
- calibration such as optical axis adjustment is required with high accuracy.
- the target position specifying device is used for the calibration process of such an in-vehicle camera.
- a technique is known in which a marker (target) with a black-and-white checkered pattern, placed at two locations in the camera's field of view, is photographed with an in-vehicle camera, the center point (calibration point) of the marker is detected through image processing, and the in-vehicle camera is calibrated (see, for example, Patent Document 1).
- when a monochrome pattern target is used as described above, it is necessary to use a monochrome camera adjusted to emphasize the brightness difference of the monochrome pattern in order to obtain that brightness difference appropriately.
- a color camera that generates a color image from the viewpoint of visibility and the like is often adopted as a camera used to project a scene such as a side or a rear side of a vehicle on a monitor inside the vehicle.
- if the color image signal output from the color camera is used for evaluating the luminance difference of the monochrome pattern, there is a possibility that the luminance difference cannot be obtained sufficiently.
- a target position specifying device using a color camera and a color target is known (for example, see Patent Document 2).
- the target position specifying device processes the pixel values of a photographed image of a target composed of a combination of a first color and a second color, and includes: a color difference conversion unit that generates a first color component value, a second color component value, and a luminance value; a color region determination unit that determines the first color region and the second color region based on the first and second color component values using a determination condition based on the luminance value; a boundary detection unit that detects the boundary between the first color and the second color based on the determination result; and a target position calculation unit that calculates the target position based on the boundary detection result. Since a color image signal is composed of three color component signals (for example, RGB signals), the calculation load of the image processing becomes larger than that for a black-and-white (grayscale) image, and the cost burden increases.
- an object of the present invention is to provide a target position determination device that can reduce a calculation load and accurately recognize a color pattern while using a color camera and a color target.
- the present invention is characterized by comprising: a grayscale profile storage unit that stores a grayscale profile that converts the first color and the second color into a first monochrome color and a second monochrome color, respectively, such that the luminance difference between the first monochrome color converted from the first color and the second monochrome color converted from the second color is larger than the luminance difference between the first color and the second color; a grayscale conversion unit that converts the color image data into grayscale image data using the grayscale profile; and a target position determination module that detects the boundary between the first target area and the second target area of the color target in the grayscale image data and determines the target position in the photographed image.
- since a grayscale profile is used that increases the luminance difference between the first monochrome color (the conversion destination of the first color) and the second monochrome color (the conversion destination of the second color), the boundary between the first monochrome color region and the second monochrome color region can be detected with high accuracy.
- since the boundary detection is performed on the grayscale image, that is, based only on the luminance value, the calculation load is reduced compared with color processing.
- a photographed image converted to grayscale in this way largely loses its color balance (more precisely, its gray balance), so the color image data before grayscale conversion is preferably used for the monitor display with which the driver confirms the surroundings.
- preferably, the grayscale profile converts a pixel closer to white as the color component of the first color is larger, and closer to black as the color component of the second color is larger.
- for example, suppose the first color is red and the second color is blue so that the target can be easily distinguished by the naked eye.
- a first relationship is created such that, when the ratio of the R component value among the RGB pixel values is larger than the other two color component values, the output is closer to white.
- a second relationship is created such that, when the ratio of the B component value among the RGB values is larger than the other two color component values, the output is closer to black.
- by combining these, a grayscale profile is created that expresses the relationship that the larger the red component, the whiter the output, and the larger the blue component, the darker the output.
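The two relationships above can be combined into a single mapping. A minimal sketch in Python, assuming a simple ratio-based rule; the constants are illustrative, not from the patent:

```python
def grayscale_profile(r, g, b):
    """Map an RGB pixel to one grayscale value N (0-255) so that
    red-dominant pixels come out light and blue-dominant pixels dark."""
    total = r + g + b
    if total == 0:
        return 128  # no dominant component: mid gray
    # Ratio-based weighting: the R share pushes N toward white (255),
    # the B share pushes it toward black (0).
    n = 128 + 127 * (r - b) / total
    return int(round(min(max(n, 0), 255)))
```

With this rule an approximately red pixel such as (240, 30, 30) maps to a light gray, while an approximately blue pixel such as (10, 20, 240) maps to a dark gray, which is the separation the profile is meant to produce.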
- considering that red-and-blue targets are relatively often used because, although discrimination by luminance difference is difficult in ordinary grayscale conversion, they are easily distinguished by human vision, a color target in which the first color is red and the second color is blue is a preferred application example of the present invention.
- the target color of the color target may differ depending on the purpose of use and other circumstances.
- a grayscale profile for each different color combination of the first color and the second color used for the color target is stored in the grayscale profile storage unit. According to this configuration, various color patterns can be recognized accurately by creating and storing in advance grayscale profiles matching the target color combinations that may be used.
- depending on the illumination, the original first color or second color may be largely shifted in the color image data acquired by the color camera.
- therefore, in a preferred embodiment, the colors into which the original first color and second color are shifted by a specific illumination light source type are obtained in advance, and the grayscale profile is created for each illumination light source type illuminating the color target and stored in the grayscale profile storage unit.
- FIG. 1 schematically shows the principle of gray scale processing employed in the target position determination apparatus according to the present invention.
- the color target 2 is configured by a combination (pattern) of two different colors.
- one of the two colors is named the first color (here, a bluish color) and is represented by C1(R1, G1, B1) in the RGB color system.
- the other is named the second color (here, a reddish color) and is represented by C2(R2, G2, B2). That is, the characteristic colors of the color target 2 are the first color and the second color, and accurate position detection is performed through detection of the boundary line between the first target area painted with the first color and the second target area painted with the second color.
- Color image data is generated by photographing the color target 2 with a color camera.
- the pixel values of the first color and the second color in the color image data are likewise denoted C1(R1, G1, B1) and C2(R2, G2, B2).
- Converting the color image data to gray scale image data is the gray scale processing.
- in the grayscale processing, the first color C1(R1, G1, B1) and the second color C2(R2, G2, B2) in the color image data are converted into the first monochrome color D1(N1) and the second monochrome color D2(N2) in the grayscale image data, respectively.
- R1, G1, B1, R2, G2, and B2 take values from 0 to 255, and N1 and N2 likewise take values from 0 to 255.
- C (0,0,0) and D (0) represent black
- C (255,255,255) and D (255) represent white.
- the grayscale profile is designed so that the luminance difference between the first monochrome color converted from the first color and the second monochrome color converted from the second color is greater than the luminance difference between the first color and the second color.
- for this conversion, a color conversion profile, also called a color conversion matrix, is used; in the following it is called a grayscale profile.
- the conversion can be expressed as M[C(R, G, B)] = D(N), where R, G, and B are pixel values of the color image data and N is the pixel value of the grayscale image data.
- note that the color conversion profile here is also used during normal operation (when used by the user) for color adjustment of the camera image viewed on the display device; in that use it derives another color pixel value Cn(Rn, Gn, Bn) from a color pixel value Cm(Rm, Gm, Bm). When determining the target position according to the present invention, however, it functions as a grayscale profile that derives a single grayscale pixel value D(N) from a color pixel value Cm(Rm, Gm, Bm).
- in FIG. 1, this grayscale profile is shown in a form composed of a number of R-B plane tables created for each unit position on the G axis. That is, the pixel value N of the grayscale image data is written at the coordinate position determined by the R and B color component values in the R-B plane table selected by the value of the G color component.
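The R-B plane-table structure can be modeled as a three-dimensional lookup table indexed by G, R, and B. A sketch assuming 8-bit channels; the fill formula is illustrative, not the patented profile:

```python
import numpy as np

# Build a grayscale profile as 256 R-B plane tables, one per G value.
# profile[g, r, b] holds the grayscale output N for input color (r, g, b).
r = np.arange(256).reshape(256, 1)
b = np.arange(256).reshape(1, 256)
# Illustrative fill: red-heavy -> light, blue-heavy -> dark.
rb_plane = np.clip(128 + (r - b) / 2, 0, 255).astype(np.uint8)
profile = np.broadcast_to(rb_plane, (256, 256, 256))  # same table for every g here

def to_grayscale(img_rgb, profile):
    """Convert an HxWx3 uint8 image using the table: N = profile[G, R, B]."""
    r_ch, g_ch, b_ch = img_rgb[..., 0], img_rgb[..., 1], img_rgb[..., 2]
    return profile[g_ch, r_ch, b_ch]
```

`np.broadcast_to` keeps the stack of identical planes as a view, so a real profile would instead fill a distinct R-B table per G value.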
- FIG. 1(a) shows a conventional grayscale profile used for ordinary grayscale conversion; for example, the pixel value N of the grayscale image data is obtained by dividing the sum of the R, G, and B color component values by three.
- accordingly, as shown in FIG. 1(a), the first color C1 with R, G, B values (10, 20, 240) (approximately blue) is converted to the first monochrome color D1 with N value 90 (medium gray), and the second color C2 with R, G, B values (240, 30, 30) (approximately red) is converted to the second monochrome color D2 with N value 100 (medium gray), leaving only a small luminance difference.
- in the present invention, by contrast, a grayscale profile is prepared such that, when the predetermined first color and second color are converted to grayscale, they become a first monochrome color and a second monochrome color having a large luminance difference.
- in the illustrated example, M(10, 20, 240) = 20, so the first color C1 (approximately blue) is converted to the first monochrome color D1 (dark gray) with N value 20, and M(240, 30, 30) = 185, so the second color C2 (approximately red) is converted to the second monochrome color D2 (light gray) with N value 185.
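The worked values can be verified with a short calculation; here the invention-side mapping M is stubbed as a two-entry table, since the full profile is not given in the text:

```python
def conventional_gray(r, g, b):
    # Ordinary grayscale: simple average of the three channels.
    return round((r + g + b) / 3)

# Stub of the patent's profile M for the two example colors only.
M = {(10, 20, 240): 20,    # approximate blue -> dark gray
     (240, 30, 30): 185}   # approximate red  -> light gray

blue, red = (10, 20, 240), (240, 30, 30)
conv_diff = abs(conventional_gray(*red) - conventional_gray(*blue))  # |100 - 90| = 10
new_diff = abs(M[red] - M[blue])                                     # |185 - 20| = 165
```

The conventional average leaves a luminance difference of only 10, while the purpose-built profile yields 165, which is what makes the subsequent boundary detection robust.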
- This grayscale profile is preferably created so that bluish R, G, B values are converted as continuously as possible into the dark gray region, and reddish R, G, B values as continuously as possible into the light gray region.
- for example, the N value for R, G, B values shifted from pure blue, represented by (0, 0, 255), toward an approximate blue may be calculated by a weight calculation that sets a weighting coefficient corresponding to the shift amount.
- gray scale image data in which a large luminance difference is generated between the first monochrome color and the second monochrome color is output.
- with grayscale image data having such a luminance difference, the boundary line between the first target area painted with the first color and the second target area painted with the second color can be detected easily and accurately.
- calibration of the color camera (hereinafter simply referred to as camera) 11 is performed in a state where the vehicle 1 is stopped at a predetermined position. For example, if the vehicle 1 is moved back or forward and the wheels are stopped by a tire groove or a tire stop provided at a predetermined position, the vehicle 1 can be stopped at an accurate position.
- two color targets (hereinafter simply referred to as targets) 2 (2a, 2b) are arranged on the floor surface.
- the distance between the two targets 2 a and 2 b is narrower than the tread width of the vehicle 1, and the target 2 is not easily stepped on by the wheels of the vehicle 1.
- an arrangement may be adopted in which the distance between the two targets 2a and 2b is wider than the tread width of the vehicle 1 so that the target 2 is not easily stepped on by the wheels of the vehicle 1.
- the vehicle 1 is stopped so that the floor surface at the center of the rear end is the origin Ow of the world coordinate system (reference coordinate system, Xw, Yw, Zw).
- the axes of the camera coordinate system (Xc, Yc, Zc) centered on the optical center Oc of the camera 11 are not parallel to the world coordinate system.
- Both the world coordinate system and the camera coordinate system are right-handed coordinate systems; the vertical Xw axis and the substantially vertical Xc axis are omitted from the drawing. Coordinate conversion between the world coordinate system and the camera coordinate system can be performed using a well-known calculation method.
- the target 2 is arranged in at least two places within the field of view of the camera 11.
- the target 2 is arranged so that its coordinates are known in the world coordinate system.
- the target 2 has a checkered pattern of blue (first color color) and red (second color color) as characteristic colors characterizing the target 2, as shown in FIG.
- a point Q at the center of the pattern is a calibration point, which is a reference for calibration of the camera 11. That is, the target 2 is arranged so that the coordinates of the calibration point Q are known in the world coordinate system.
- although a total of four rectangles, two blue (first target area) and two red (second target area), is shown here, a total of four or more rectangles may be used; the number and shape are not limited to those shown here.
- the dimensions of the target 2 are appropriately determined so that the calibration point Q can be detected with high accuracy according to the resolution of the camera 11, the performance of the image processing function for processing the image taken by the camera 11, the position of the target, and the like.
- for example, when D1 and D2 are about 1 to 2 m and W1 and W2 are about 0.5 m, a target 2 with each square of the checkered pattern 10 to 15 cm square and the whole 20 to 30 cm square, as shown in FIG. 3, is used.
- the target position specifying apparatus is substantially constituted by an image processing unit having a computer as a core member.
- FIG. 4 is a block diagram schematically showing the image processing function.
- this image processing unit receives a color image signal (color image data) from an image signal output unit 12 incorporated in a camera 11 as a photographing device.
- the image processing unit includes a gray scale module 4, a target position determination module 5, and an attachment accuracy determination module 6.
- the gray scale conversion module 4 inputs color image data that is a color photographed image sent from the image signal output unit 12 and develops it in a working memory (not shown). Further, this color image data is converted into grayscale image data which is a black and white photographed image.
- the target position determination module 5 obtains the position of the target 2, particularly the position of the calibration point Q of the target 2 from the grayscale image data that is a black and white photographed image developed in the working memory.
- the attachment accuracy determination module 6 determines the attachment accuracy of the camera 11 from the difference between the position of the calibration point Q of the target 2 specified by the target position determination module 5 and the target calibration point position.
- the gray scale module 4 includes a target color setting unit 41, a gray scale profile storage unit 42, a gray scale profile selection unit 43, and a gray scale unit 44.
- the target color setting unit 41 sets a first color color and a second color color that are characteristic colors of the target 2 to be processed through an input operation from the keyboard 14.
- a method of estimating and setting the first color color and the second color color from the input color image data may be employed.
- the grayscale profile storage unit 42 stores a grayscale profile (color space conversion matrix) as a grayscale conversion table used for the grayscale processing described with reference to FIG.
- each grayscale profile is created for each combination of specific color configurations of the target 2 (here, blue as the first color and red as the second color), and a database is formed so that a profile can be retrieved using the color configuration of the target 2 set by the target color setting unit 41 as a search keyword.
- the grayscale profile selection unit 43 selects the grayscale profile that matches the color configuration of the target 2 set by the target color setting unit 41, here the blue-and-red checkered pattern, and gives it to the grayscale conversion unit 44.
- the grayscale conversion unit 44 uses the selected grayscale profile to generate grayscale image data from the color image data.
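A minimal sketch of the storage and selection flow of units 42-44, assuming a dictionary keyed by the target's color pair; the class and method names are hypothetical:

```python
from typing import Callable, Dict, Tuple

Profile = Callable[[int, int, int], int]  # (R, G, B) -> N

class GrayscaleProfileStore:
    """Holds one grayscale profile per (first color, second color) pair."""
    def __init__(self):
        self._profiles: Dict[Tuple[str, str], Profile] = {}

    def register(self, first_color: str, second_color: str, profile: Profile):
        self._profiles[(first_color, second_color)] = profile

    def select(self, first_color: str, second_color: str) -> Profile:
        # The search keyword is the target's color configuration.
        return self._profiles[(first_color, second_color)]

store = GrayscaleProfileStore()
# Blue-red checkered target: blue -> dark, red -> light (illustrative profile).
store.register("blue", "red",
               lambda r, g, b: max(0, min(255, 128 + (r - b) // 2)))
```

In the light-source-aware embodiment described later, the key would simply be extended to (first color, second color, light source type).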
- the target position determination module 5 includes a preprocessing unit 51, a luminance difference calculation unit 52, a threshold setting unit 53, a target region determination unit 54, a boundary detection unit 55, and a target position calculation unit 56.
- the preprocessing unit 51 corrects image distortion caused by the lens characteristics of the camera 11 as necessary.
- the luminance difference calculation unit 52 calculates the luminance difference between the first monochrome color and the second monochrome color obtained by converting the first color and the second color to grayscale using the selected grayscale profile.
- based on the calculated luminance difference, the threshold setting unit 53 sets a specific color detection threshold as a determination condition for determining whether a target pixel (target region) is the first monochrome color (first color: blue) or the second monochrome color (second color: red).
- the area determination unit 54 sequentially scans the grayscale image data including the target 2 using the threshold values set by the threshold setting unit 53, and determines the first target area (blue area) and the second target area (red area).
- the boundary detection unit 55 detects the boundary between the blue region and the red region in the target 2 using the determination result of the first target region (blue region) and the second target region (red region) by the region determination unit 54. Since the intersection of the two boundary lines detected by the boundary detection unit 55 becomes the calibration point Q, the target position calculation unit 56 can calculate the position of the target 2 in the captured image, that is, the calibration point Q, based on the boundary detection result.
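Once the two boundary lines have been detected, the calibration point Q is their intersection. A sketch assuming each boundary has already been fitted to the line form a*x + b*y = c:

```python
def line_intersection(l1, l2):
    """Intersect two lines given as (a, b, c) with a*x + b*y = c.
    Returns (x, y), or None if the lines are (near-)parallel."""
    a1, b1, c1 = l1
    a2, b2, c2 = l2
    det = a1 * b2 - a2 * b1  # Cramer's rule determinant
    if abs(det) < 1e-9:
        return None
    x = (c1 * b2 - c2 * b1) / det
    y = (a1 * c2 - a2 * c1) / det
    return x, y
```

For a checkered target the two boundaries are roughly perpendicular, so the determinant is well away from zero and the intersection (the calibration point) is numerically stable.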
- the color image signal acquired by the camera 11 and output from the image signal output unit 12 is displayed as a color photographed image on the monitor 13 through the video signal generation unit 33.
- the vehicle 1 is accurately positioned and stopped at a predetermined position of the inspection site (# 01).
- the camera 11 is operated to photograph the vicinity of the vehicle (# 02).
- the camera is set so that the color-captured image by the camera 11 of the vehicle 1 parked at a predetermined position includes the images of the two targets 2 even if the mounting accuracy of the camera 11 is somewhat poor.
- the color image data output through the image signal output unit 12 is subjected to basic image processing such as white balance correction (# 03).
- the first color and the second color, the characteristic colors of the target 2 to be processed that are set in advance in the target color setting unit 41, are read out (#04).
- a suitable grayscale profile is selected (# 05).
- the color image data is converted into grayscale image data (# 06).
- the converted grayscale image data is used for target position determination processing.
- the luminance difference calculation unit 52 calculates the luminance difference calculation values of the first monochrome color and the second monochrome color (# 11).
- based on the luminance difference calculation value, the threshold setting unit 53 sets the detection threshold for the first target area (blue area) of the target 2 (#12) and the detection threshold for the second target area (red area) (#13). For example, if the luminance difference (luminance difference calculation value) is large, it can be inferred that the image was taken in a good light environment, so the detection thresholds are set strictly (the detection threshold for the first target region (blue region) and the detection threshold for the second target region (red region) are increased) to raise the automatic detection accuracy. Conversely, if the luminance difference is small, it is presumed that the image was taken in an unfavorable (dark) light environment, so the detection thresholds are set loosely (both detection thresholds are reduced) to make automatic detection possible. Such a method of making the thresholds variable is suitable when manual fine adjustment by an operator is performed after automatic detection.
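Steps #12 and #13 — tightening or loosening both detection thresholds according to the measured luminance difference — might be sketched as follows; the threshold constants and the scaling rule are assumptions, not values from the patent:

```python
def set_detection_thresholds(luminance_diff):
    """Return (blue_area_threshold, red_area_threshold).
    Large luminance difference -> good light environment -> strict
    (higher) thresholds; small difference -> loose (lower) thresholds."""
    # Map the difference (0..255) to a strictness factor in 0.5 .. 1.0.
    strictness = 0.5 + 0.5 * min(luminance_diff, 255) / 255
    blue_thr = int(60 * strictness)    # illustrative base value for the dark (blue) side
    red_thr = int(180 * strictness)    # illustrative base value for the light (red) side
    return blue_thr, red_thr
```

A fixed-threshold configuration, as the text also allows, would simply return constants regardless of the luminance difference.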
- the determination of the first target area (blue area) of the target 2 (# 14) and the determination of the second target area (red area) (# 15) are performed.
- a boundary line between the first target area and the second target area of the target 2 is detected from the determination result of the first target area and the second target area (# 16).
- the determination of the first target region and the second target region of the target 2 and the detection of the boundary line between the first target region and the second target region can be performed simultaneously.
- from the detected boundary lines, the calibration point coordinates are calculated with the intersection of the two boundary lines as the calibration point (#17). Note that a configuration may be used in which a predetermined fixed threshold is used without making the detection threshold variable as in steps #12 and #13 described above.
- in this way, the position of the target 2, that is, the coordinate position of the calibration point, is obtained. The deviation amount between the preset target calibration point and the calibration point calculated in step #17 is then calculated (#18), and based on this calculated deviation amount the attachment accuracy determination module 6 determines the attachment accuracy of the camera 11 (#19).
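Steps #18 and #19 reduce to a distance check between the detected and target calibration points; a sketch with an assumed pixel tolerance:

```python
import math

def mounting_accuracy_ok(detected_q, target_q, tol_px=5.0):
    """Pass/fail mounting-accuracy check: Euclidean deviation between the
    detected calibration point and its target position, in pixels.
    The 5-pixel tolerance is an illustrative assumption."""
    dx = detected_q[0] - target_q[0]
    dy = detected_q[1] - target_q[1]
    deviation = math.hypot(dx, dy)
    return deviation <= tol_px, deviation
```

In practice the deviation would be checked for both targets 2a and 2b, and the signed offsets could further feed an optical-axis correction.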
- FIG. 6 is a functional block diagram showing an image processing function in another embodiment.
- the differences between the image processing function in this other embodiment and the embodiment shown in FIG. 4 are that a light source estimation unit 45 is added and that the contents of the grayscale profiles differ.
- the light source estimation unit 45 performs simulation calculations such as weight calculations and rule calculations using various image feature amounts calculated from the color image data received from the image signal output unit 12, for example, average luminance, histogram characteristics, and color component ratios, as input parameters, and estimates and outputs the light source type.
- the weight calculation here is a general term for calculations, such as those used in neural networks, that assign a weighting coefficient to each input parameter and correct the coefficients by iterative learning so as to minimize the error of the output result.
- a rule calculation is a general term for calculations that derive a result based on predetermined rules such as if-then statements.
- the content of a rule itself is, for example, "if the B (blue) color component value is higher than a predetermined value and the R (red) and G (green) component values are lower than predetermined values, the probability of an orange lamp is a predetermined percentage or more."
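A rule of the kind quoted above can be encoded directly as an if-then operation; the threshold values and the probability returned are illustrative assumptions:

```python
def orange_lamp_probability(mean_r, mean_g, mean_b,
                            high=150, low=90, base_prob=0.7):
    """One rule operation: if the B component is above `high` while R and G
    are below `low`, report at least `base_prob` probability of an orange
    lamp, following the rule quoted in the text. All constants are
    illustrative, not from the patent."""
    if mean_b > high and mean_r < low and mean_g < low:
        return base_prob
    return 0.0
```

A full rule calculation would evaluate one such function per light source type and output the type with the highest probability.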
- examples of the light source type include tungsten lamps, sodium lamps, fluorescent lamps, and LED lamps; sunlight, and combinations of these with one another, may also be treated as light source types estimated here.
- the same light source type may also be subdivided according to the intensity and the magnitude of the color influence of each lamp.
- the grayscale profile stored in the grayscale profile storage unit 42 is not only created for each combination of the specific color configuration (first color color and second color color) of the target 2 as described above. Furthermore, it is created for each light source type estimated by the light source estimation unit 45. Therefore, the grayscale profile selection unit 43 selects a grayscale profile using the color configuration of the target 2 set by the target color setting unit 41 and the light source type estimated by the light source estimation unit 45 as search keywords. With this configuration, it is possible to determine a target position in which adverse effects due to the type of light source during target imaging are suppressed.
- Each functional part in the gray scale conversion module 4 and the target position determination module 5 described above indicates a division as a function and does not necessarily have to be provided independently.
- each function may be realized by cooperation of hardware such as a microcomputer and software such as a program executed on the hardware.
- the target 2 to be processed by the color target position determination device is the target 2 for determining the in-vehicle camera mounting position, but the target for stopping at a parking lot or a battery charging station. It may be. Also, the present invention may be applied by regarding the white line or yellow line drawn on the road as the target 2.
- in the above-described embodiment, as the grayscale profile having the characteristic of reducing the color influence of the light source on the target 2, a common blue-red grayscale profile was used that combines a profile for reducing the color influence on the first color (blue) with a profile for reducing the color influence on the second color (red). Instead, a blue-corrected grayscale profile for reducing the color influence on blue and a red-corrected grayscale profile for reducing the color influence on red may be prepared and applied separately.
- the present invention can be widely used in image processing technology for detecting a boundary between different colors by converting a target color image characterized by a combination of different colors into a grayscale image.
- 2: Color target
- 4: Grayscale conversion module
- 41: Target color setting unit
- 42: Grayscale profile storage unit
- 43: Grayscale profile selection unit
- 44: Grayscale conversion unit
- 45: Light source estimation unit
- 5: Target position determination module
- 6: Mounting accuracy determination module
- Q: Calibration point (target center position)
一般的なグレースケール化において輝度差による識別が困難ではあるが人間の視覚による識別性が良好なために赤色と青色によるターゲットが比較的良く用いられることを考慮すると、前記第1カラー色が赤色であり、前記第2カラー色が青色であるカラーターゲットが、本発明の好適な適用例となる。
図1には、本発明によるターゲット位置決定装置で採用されている、グレースケール化処理の原理が模式的に示されている。ここでは、カラーターゲット2は、2つの異なる色の組み合わせ(パターン)によって構成されている。2つの色のうちの一方は第1カラー色(ここでは青色系としておく)と名づけてC1(R1,G1,B1)(RGB表色系)で表している。2つの色のうちの他方は第2カラー色(ここでは赤色系としておく)と名づけてC2(R2,G2,B2)で表している。つまり、このカラーターゲット2を特徴付けている特徴色は、第1カラー色と第2カラー色であり、この第1カラー色で塗られた第1ターゲット領域と第2カラー色で塗られた第2ターゲット領域との間の境界線の検出を通じて正確な位置検出が行われる。
M[C(R,G,B) ]=D(N)、
where R, G, B are the pixel values of the color image data and N is the pixel value of the grayscale image data.
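The mapping M[C(R,G,B)] = D(N) can be held as a simple lookup table, as in the following minimal sketch. The helper name and the fallback value are illustrative assumptions; the two sample entries follow the example values given in the text.

```python
# Hypothetical grayscale conversion profile as a lookup table.
grayscale_profile = {
    (10, 20, 240): 20,    # bluish target color -> dark gray
    (240, 30, 30): 185,   # reddish target color -> light gray
}

def apply_profile(profile, pixel, default=128):
    """Look up the grayscale value N for a color pixel C(R, G, B)."""
    return profile.get(pixel, default)
```

In practice the table would cover the full color space (or be interpolated), but the essential point is that each color pixel value maps to a single grayscale value.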
Note that the color conversion profile here is normally (during user operation) also used for color adjustment of the camera image that the user views on the display device, but when determining the target position according to the present invention, it functions as a grayscale conversion profile. When used as an ordinary color conversion profile, it derives another color pixel value Cn(Rn, Gn, Bn) from a given color pixel value Cm(Rm, Gm, Bm); when used as a grayscale conversion profile, it derives a single grayscale pixel value D(N) from a given color pixel value Cm(Rm, Gm, Bm).
M(10,20,240)=20、
M(240,30,30)=185、
This grayscale conversion profile is preferably created so that bluish R, G, B values are mapped as continuously as possible into a dark gray region, and reddish R, G, B values are mapped as continuously as possible into a light gray region.
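One possible continuous realization of such a profile is the following illustrative formula (an assumption for sketching purposes, not the patent's actual profile): the red-minus-blue difference shifts the output away from mid-gray, so bluish pixels land in the dark range and reddish pixels in the light range.

```python
def continuous_profile(r, g, b):
    """Map bluish pixels toward dark gray and reddish pixels toward light gray."""
    n = 128 + (r - b) // 2          # blue-dominant -> below 128, red-dominant -> above
    return max(0, min(255, n))      # clamp to the 8-bit grayscale range
```

For the sample colors above, this formula yields a strongly darkened value for (10, 20, 240) and a strongly lightened value for (240, 30, 30), in the spirit of the 20/185 example, though not those exact numbers.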
The boundary detection unit 55 uses the determination results for the first target region (blue region) and the second target region (red region) from the region determination unit 54 to detect the boundary between the blue region and the red region of the target 2. Since the boundary detected by the boundary detection unit 55, that is, the intersection of the two boundary lines, becomes the calibration point Q, the target position calculation unit 56 can calculate the position of the target 2 in the captured image, that is, the calibration point Q, based on the boundary detection result of the boundary detection unit 55.
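The final intersection step can be sketched as follows. The helper and the line representation are hypothetical, assuming each detected boundary has been fitted as a line a·x + b·y = c; the calibration point Q is then the solution of the two line equations, here via Cramer's rule.

```python
def line_intersection(l1, l2):
    """l1, l2 are (a, b, c) coefficients of a*x + b*y = c; returns (x, y) or None."""
    (a1, b1, c1), (a2, b2, c2) = l1, l2
    det = a1 * b2 - a2 * b1
    if abs(det) < 1e-12:            # parallel boundary lines: no single point Q
        return None
    x = (c1 * b2 - c2 * b1) / det
    y = (a1 * c2 - a2 * c1) / det
    return (x, y)
```

For example, the vertical line x = 2 and the horizontal line y = 3 intersect at (2, 3), which would serve as the calibration point Q in image coordinates.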
First, the vehicle 1 is accurately positioned and stopped at a predetermined position in the inspection site (#01). After confirming the stop, the camera 11 is operated to photograph the surroundings of the vehicle (#02). The camera is set so that the color image captured by the camera 11 of the vehicle 1 stopped at the predetermined position contains the images of the two targets 2 even if the mounting accuracy of the camera 11 is somewhat poor. The color image data output through the image signal output unit 12 is subjected to basic image processing such as white balance correction (#03).
(1) FIG. 6 is a functional block diagram showing the image processing functions of another embodiment. The differences from the image processing functions of the embodiment shown in FIG. 4 are the addition of a light source estimation unit 45 and the content of the grayscale conversion profiles. The light source estimation unit 45 takes various image feature quantities calculated from the color image data received from the image signal output unit 12, for example average luminance, histogram characteristics, and color component ratios, as input parameters, performs simulation operations such as weighted operations and rule-based operations, and estimates and outputs the light source type. The weighted operation here is a general term for operations using arithmetic expressions in which a weight coefficient is assigned to each input parameter, as in a neural network, and the weight coefficients are corrected by iterative learning so as to minimize the error of the output result. The rule-based operation is a general term for operations that derive a result based on predetermined rules such as if-then statements. An example of such a rule is: "If the B (blue) component value is higher than a predetermined value and the R (red) and G (green) component values are lower than predetermined values, the probability of an orange lamp is at least a predetermined percentage." Light source types include tungsten lamps, sodium lamps, fluorescent lamps, and LED lamps; combinations of these, including sunlight, may also be estimated as light source types. The same light source type may also be subdivided according to the intensity of each lamp and the magnitude of its color influence.
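The rule-based part of the light source estimation could look like the following sketch. The thresholds (180, 80) and the returned probability are hypothetical example values for the single rule quoted in the text; a real system would combine many such rules with the learned weighted operation as well.

```python
def estimate_light_source(mean_r, mean_g, mean_b, hi=180, lo=80):
    """Apply the example rule from the description: returns (source, probability %)."""
    # Rule from the text: B component high, R and G components low -> orange lamp.
    if mean_b > hi and mean_r < lo and mean_g < lo:
        return ("orange lamp", 70)
    return ("unknown", 0)
```

The estimated light source type would then drive the selection of the matching grayscale conversion profile from the grayscale conversion profile storage unit 42.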
2: Color target
4: Grayscale conversion module
41: Target color setting unit
42: Grayscale conversion profile storage unit
43: Grayscale conversion profile selection unit
44: Grayscale conversion unit
45: Light source estimation unit
5: Target position determination module
6: Mounting accuracy judgment module
Q: Calibration point (target center position)
Claims (8)
- A target position determination device that determines the position, in a captured image, of a color target consisting of a combination of a first target region of a first color and a second target region of a second color having color components different from the first color, based on color image data obtained by photographing the color target, the device comprising:
a grayscale conversion profile storage unit that stores a grayscale conversion profile for converting the first color and the second color into a first monochrome color and a second monochrome color, respectively, such that the luminance difference between the first monochrome color converted from the first color and the second monochrome color converted from the second color is larger than the luminance difference between the first color and the second color;
a grayscale conversion unit that converts the color image data into grayscale image data using the grayscale conversion profile; and
a target position determination module that recognizes the boundary between the first target region and the second target region of the color target in the grayscale image data and determines the target position in the captured image.
- The target position determination device according to claim 1, wherein the grayscale conversion profile is created such that the larger the color component of the first color, the closer the converted value is to white, and the larger the color component of the second color, the closer the converted value is to black.
- The target position determination device according to claim 2, wherein the first color is red and the second color is blue.
- The target position determination device according to claim 2, wherein the first color is blue and the second color is red.
- The target position determination device according to any one of claims 1 to 4, wherein the grayscale conversion profile storage unit stores a grayscale conversion profile for each different color combination of the first color and the second color used for the color target.
- The target position determination device according to any one of claims 1 to 5, wherein the grayscale conversion profiles are created for each type of illumination light source illuminating the color target and stored in the grayscale conversion profile storage unit.
- The target position determination device according to any one of claims 1 to 6, wherein the first target region and the second target region are arranged adjacent to each other.
- The target position determination device according to claim 7, wherein the first target region and the second target region are arranged so as to create a plurality of straight boundary lines, and the intersection of the straight boundary lines is taken as the center position of the target.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN2011800400564A CN103069808A (zh) | 2010-08-19 | 2011-07-14 | 靶位置决定装置 |
EP11818014.0A EP2608545A4 (en) | 2010-08-19 | 2011-07-14 | DEVICE FOR DETERMINING A TARGET LOCATION |
JP2012529527A JP5252247B2 (ja) | 2010-08-19 | 2011-07-14 | ターゲット位置決定装置 |
US13/805,480 US20130101211A1 (en) | 2010-08-19 | 2011-07-14 | Target location determination device |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2010184185 | 2010-08-19 | ||
JP2010-184185 | 2010-08-19 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2012023370A1 true WO2012023370A1 (ja) | 2012-02-23 |
Family
ID=45605031
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2011/066124 WO2012023370A1 (ja) | 2010-08-19 | 2011-07-14 | ターゲット位置決定装置 |
Country Status (5)
Country | Link |
---|---|
US (1) | US20130101211A1 (ja) |
EP (1) | EP2608545A4 (ja) |
JP (1) | JP5252247B2 (ja) |
CN (1) | CN103069808A (ja) |
WO (1) | WO2012023370A1 (ja) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9135718B2 (en) * | 2012-03-30 | 2015-09-15 | Canon Kabushiki Kaisha | Image processing apparatus and control method therefor |
CN108256521B (zh) * | 2017-12-29 | 2021-06-22 | 济南中维世纪科技有限公司 | 用于车身颜色识别的有效区域定位方法 |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2008131250A (ja) | 2006-11-20 | 2008-06-05 | Aisin Seiki Co Ltd | 車載カメラの校正装置及び当該装置を用いた車両の生産方法 |
JP2010016661A (ja) * | 2008-07-04 | 2010-01-21 | Murata Mach Ltd | 画像処理装置 |
WO2010016379A1 (ja) | 2008-08-05 | 2010-02-11 | アイシン精機株式会社 | ターゲット位置特定装置 |
Family Cites Families (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6891960B2 (en) * | 2000-08-12 | 2005-05-10 | Facet Technology | System for road sign sheeting classification |
EP1220182A3 (en) * | 2000-12-25 | 2005-08-17 | Matsushita Electric Industrial Co., Ltd. | Image detection apparatus, program, and recording medium |
US7231063B2 (en) * | 2002-08-09 | 2007-06-12 | Intersense, Inc. | Fiducial detection system |
JP4155049B2 (ja) * | 2003-02-14 | 2008-09-24 | 富士ゼロックス株式会社 | ドキュメント処理装置 |
US8462384B2 (en) * | 2004-09-29 | 2013-06-11 | Apple Inc. | Methods and apparatuses for aesthetically enhanced image conversion |
KR100782505B1 (ko) * | 2006-09-19 | 2007-12-05 | 삼성전자주식회사 | 이동통신 단말기의 명암색을 이용한 영상 표시 방법 및장치 |
US8596541B2 (en) * | 2008-02-22 | 2013-12-03 | Qualcomm Incorporated | Image capture device with integrated barcode scanning |
2011
- 2011-07-14 US US13/805,480 patent/US20130101211A1/en not_active Abandoned
- 2011-07-14 WO PCT/JP2011/066124 patent/WO2012023370A1/ja active Application Filing
- 2011-07-14 EP EP11818014.0A patent/EP2608545A4/en not_active Withdrawn
- 2011-07-14 JP JP2012529527A patent/JP5252247B2/ja not_active Expired - Fee Related
- 2011-07-14 CN CN2011800400564A patent/CN103069808A/zh active Pending
Non-Patent Citations (1)
Title |
---|
See also references of EP2608545A4 |
Also Published As
Publication number | Publication date |
---|---|
CN103069808A (zh) | 2013-04-24 |
EP2608545A1 (en) | 2013-06-26 |
EP2608545A4 (en) | 2013-06-26 |
US20130101211A1 (en) | 2013-04-25 |
JP5252247B2 (ja) | 2013-07-31 |
JPWO2012023370A1 (ja) | 2013-10-28 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 201180040056.4 Country of ref document: CN |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 11818014 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 13805480 Country of ref document: US |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2011818014 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2012529527 Country of ref document: JP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |