WO2017138754A1 - Vehicle undercarriage photographing apparatus and vehicle undercarriage photographing method for operating the same - Google Patents

Vehicle undercarriage photographing apparatus and vehicle undercarriage photographing method for operating the same

Info

Publication number
WO2017138754A1
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
camera
image
images
perspective distortion
Prior art date
Application number
PCT/KR2017/001434
Other languages
English (en)
Korean (ko)
Inventor
이정희
Original Assignee
㈜잼시큐리티시스템
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by ㈜잼시큐리티시스템 filed Critical ㈜잼시큐리티시스템
Publication of WO2017138754A1

Links

Images

Classifications

    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G 1/00 Traffic control systems for road vehicles
    • G08G 1/01 Detecting movement of traffic to be counted or controlled
    • G08G 1/017 Detecting movement of traffic to be counted or controlled identifying vehicles
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G 1/00 Traffic control systems for road vehicles
    • G08G 1/01 Detecting movement of traffic to be counted or controlled
    • G08G 1/04 Detecting movement of traffic to be counted or controlled using optical or ultrasonic detectors
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60 Control of cameras or camera modules
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 7/00 Television systems
    • H04N 7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast

Definitions

  • The present invention relates to a vehicle undercarriage photographing apparatus and a vehicle undercarriage photographing method using the same, in which a camera located at a close distance from the floor of a vehicle to be photographed captures a plurality of images, the perspective distortion generated by the close-range photographing is corrected, and the corrected images are matched into one clear image.
  • A vehicle is fundamentally a simple means of transportation.
  • However, the threat of terrorism using means of transportation is unending in many parts of the world.
  • The underside of a vehicle is not easy to monitor, so it is frequently exploited as a tool by terrorists.
  • The most common inspection method is to visually check the underside of the vehicle using a reflecting mirror, but this gives only a glimpse of the edge of the vehicle bottom; the center of the bottom is difficult to see, the area is inherently dark, and it is very difficult to identify signs of dangerous goods.
  • Korean Patent Application Publication No. 10-2011-0043048 discloses an apparatus composed of a lower light for illuminating the underside of a moving vehicle and a lower inspection unit mounted below the moving vehicle to photograph the bottom surface of the vehicle.
  • Korean Patent Laid-Open Publication No. 10-2011-0043148 improves the dark shooting environment of the vehicle bottom through the lower light, but due to the characteristics of the vehicle, the camera must use a wide angle of view and the distance between the camera position and the vehicle bottom is short.
  • When the angle of view is widened, the center of the vehicle bottom directly facing the camera may be displayed at its actual size, but the outer portion of the vehicle bottom suffers perspective distortion. When the perspective distortion is severe, the distorted image makes it difficult to detect dangerous goods mounted on the vehicle floor.
  • Korean Patent Laid-Open Publication No. 10-2014-0043148 discloses a control computing device that calculates the speed of a moving vehicle, takes a plurality of images of the vehicle bottom surface, and combines them into one image.
  • However, this image matching method depends only on the moving speed of the vehicle, so errors frequently occur during image matching.
  • The speed at which the vehicle passes over the lower inspection unit where the camera is arranged is not always uniform, and the vehicle speed while accelerating from a stop inevitably differs from the average speed.
  • Patent Document 1 Korean Unexamined Patent Publication No. 10-2014-0043148 (Vehicle Undercarriage Apparatus)
  • The present inventors have conducted various studies to solve this problem and, as a result, have found that a clear vehicle floor image can be obtained without distortion through an under-vehicle imaging apparatus including a perspective distortion correction unit, which corrects the perspective distortion of the photographed image by calculating the distance difference between the camera and the vehicle bottom according to the angle of view of the camera, and an image matching unit, which combines the corrected images into one by specifying corresponding points having the same RGB (Red-Green-Blue) value per pixel.
  • An object of the present invention is to provide an undercarriage photographing apparatus capable of acquiring a clear vehicle floor image without perspective distortion and with a high degree of matching accuracy, and a photographing method for operating the same.
  • In the undercarriage photographing apparatus and the method of operating the same according to the present invention, the camera is located close to the vehicle floor surface when photographing, the perspective distortion of the image generated by this close-range photographing is corrected, and the corrected images are matched using the RGB value and transparency per pixel.
  • the under vehicle imaging apparatus and the method of operating the same according to the present invention can accurately detect various dangerous objects that can be installed on the vehicle floor.
  • FIG. 1 is a plan view of the undercarriage photographing apparatus according to an embodiment of the present invention.
  • FIG. 2 is a schematic diagram of a correction module operating the under vehicle imaging apparatus shown in FIG. 1.
  • FIG. 3 shows a perspective-distorted image generated when a vehicle is photographed at close range and an image in which the perspective distortion has been corrected.
  • FIG. 4 is a schematic diagram illustrating a process of matching the plurality of images by the image matching unit illustrated in FIG. 2.
  • FIG. 5 is a perspective view of the undercarriage photographing system equipped with the undercarriage photographing apparatus shown in FIG. 1.
  • The under-vehicle imaging apparatus according to an embodiment of the present invention includes a camera for capturing an image of the bottom surface of a vehicle,
  • An illumination module for irradiating light to a vehicle floor to be photographed by the camera
  • and a correction module for correcting the image of the vehicle bottom surface obtained by the camera.
  • Here, the correction module includes:
  • Perspective distortion correction unit for correcting the perspective distortion caused by the distance difference of the vehicle floor surface from the camera according to the angle of view of the camera
  • an image matching unit for comparing corresponding RGB values per pixel of the images corrected by the perspective distortion correction unit, specifying corresponding points to which the RGB values match, and matching the plurality of images into one image based on the corresponding points.
  • In addition, the image matching unit may specify the corresponding points by further comparing the transparency (α) value together with the RGB value per pixel.
  • The lighting module may include a light source unit for irradiating light onto the vehicle floor, a sensor unit for receiving the light reflected from the vehicle floor,
  • and a light source control unit for comparing the illuminance of the light received from the sensor unit and controlling the light source unit so that the entire vehicle floor surface has a constant illuminance.
  • FIG. 1 is a plan view of a lower vehicle imaging apparatus according to an embodiment of the present invention
  • FIG. 2 is a schematic diagram of the correction module for operating the under-vehicle imaging apparatus shown in FIG. 1.
  • The under-vehicle imaging apparatus 100 of the present invention photographs the vehicle bottom surface while moving along the underside of the vehicle.
  • the under vehicle imaging apparatus 100 includes a camera 110 for capturing an image of a vehicle floor, a lighting module 120, and a correction module 130 (not shown in FIG. 1) for correcting the photographed image.
  • Referring to FIGS. 1 and 2, the under-vehicle imaging apparatus 100 photographs the vehicle bottom surface while moving along the underside of the vehicle.
  • The camera 110 may consist of a single camera, or a plurality of cameras may be disposed so as to cover an area wider than the vehicle floor.
  • the camera 110 may include a center camera 111 and edge cameras 112 and 113.
  • the center camera 111 may be disposed at a position parallel to the floor of the vehicle to photograph the center of the floor of the vehicle.
  • the edge cameras 112 and 113 may be inclined at an oblique angle with the vehicle bottom surface or the center camera 111, and may photograph edges of the vehicle bottom surface.
  • the center camera 111 and the edge cameras 112 and 113 may be spaced apart from each other at a position parallel to the floor of the vehicle.
  • the center camera 111 and the edge cameras 112 and 113 are not particularly limited as long as they can be used to acquire an image.
  • For example, the center camera 111 and the edge cameras 112 and 113 may include a charge-coupled device (CCD) camera.
  • As shown in FIG. 3-(1), the photographing apparatus 100 equipped with a camera performs photographing at a position close to the vehicle bottom surface due to the characteristics of the vehicle.
  • Accordingly, the angle of view is widened when the vehicle bottom surface is photographed.
  • As a result, severe perspective distortion of the vehicle floor surface occurs, as shown in FIG. 3-(2).
  • FIG. 3-(3) is an image of the underside of the vehicle obtained by correcting the perspective distortion. Comparing FIG. 3-(3) and FIG. 3-(2), FIG. 3-(2) shows no perspective distortion at the portion of the vehicle bottom directly facing the camera, but perspective distortion occurs in which the image gradually narrows toward the edges of the vehicle.
  • This perspective distortion occurs because the distance Z from the lens position of the center camera 111 to the edge of the vehicle bottom surface is longer than the distance A from the lens position of the center camera 111 to the center of the vehicle bottom surface.
  • Perspective distortion causes the edges of the vehicle bottom to be displayed narrower than they actually are, making identification of dangerous goods or the like installed on the bottom of the vehicle inaccurate and difficult.
  • The correction module 130 corrects the perspective distortion of the images acquired by the center camera 111 and the edge cameras 112 and 113, matches the corrected images, and provides them as one image.
  • the correction module 130 may include a perspective distortion corrector 131, an image matcher 132, and a distance measurer 133.
  • the perspective distortion correction unit 131 corrects the perspective distorted image obtained from the camera 110.
  • The distance measuring unit 133 may measure the distance between the position of the camera 110 and the bottom of the vehicle.
  • For example, the distance measuring unit 133 may calculate the distance from the camera 110 to the vehicle bottom by irradiating infrared rays and measuring the time until the infrared rays are reflected from the vehicle bottom surface and return. Alternatively, the distance measuring unit 133 may calculate the distance from the camera 110 to the vehicle floor by emitting ultrasonic waves and acquiring the time until the ultrasonic waves are reflected from the vehicle floor and return.
  • the distance measuring unit 133 measures the distance between the camera 110 and the vehicle floor and transmits the distance information to the perspective distortion correction unit 131.
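  • As an illustration of the time-of-flight calculation described above, the following minimal Python sketch converts a measured round-trip time into a distance; the helper name and constants are assumptions for illustration and are not part of the patent. Infrared measurements would use the speed of light, ultrasonic measurements roughly the speed of sound in air.

        # Illustrative time-of-flight distance calculation (not from the patent itself).
        SPEED_OF_LIGHT_M_S = 299_792_458.0   # propagation speed for infrared light
        SPEED_OF_SOUND_M_S = 343.0           # approximate speed of ultrasound in air at 20 °C

        def distance_from_round_trip(round_trip_time_s: float, propagation_speed_m_s: float) -> float:
            """Distance to the vehicle floor: half the round trip multiplied by the propagation speed."""
            return propagation_speed_m_s * round_trip_time_s / 2.0

        # Example: an ultrasonic echo returning after 2.0 ms corresponds to about 0.343 m.
        print(distance_from_round_trip(0.002, SPEED_OF_SOUND_M_S))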
  • The perspective distortion correction unit 131 corrects the perspective-distorted images using the distance between the camera 110 and the vehicle bottom surface received from the distance measuring unit 133 (A in FIG. 3-(2)) and the angle of view of the camera (B in FIG. 3-(2)).
  • the perspective distortion correction unit 131 may obtain information of an angle of view when capturing from the camera 110.
  • From these values, the image enlargement value is calculated to enlarge the image regions narrowed by the perspective distortion.
  • As described above, the center camera 111 is disposed at a position parallel to the vehicle floor surface to capture the center of the vehicle floor surface, and the edge cameras 112 and 113 are disposed at an oblique angle to the vehicle floor surface so that the edges of the vehicle floor can be photographed.
  • the image magnification value G of the center camera 111 may be calculated through Equation 1 below.
  • the perspective distortion corrector 131 enlarges the width of the distorted image using the acquired image enlargement value G.
  • When the perspective-distorted image is enlarged, the corrected image can be displayed once the newly created pixels are filled with RGB information; a pixel that requires RGB information after the image is expanded by the image enlargement value (G) may be assigned the same RGB value as its adjacent pixel and displayed in that color.
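  • The pixel-filling step described above can be sketched as a simple horizontal nearest-neighbor enlargement. The following Python/NumPy sketch is an illustration under stated assumptions: it applies a single magnification value G uniformly to the image width and copies each new pixel's RGB value from its nearest source pixel, in line with the adjacent-pixel rule above; Equation 1 itself is not reproduced in this extract, so G is taken as a given input.

        import numpy as np

        def enlarge_width(image: np.ndarray, g: float) -> np.ndarray:
            """Stretch an H x W x 3 RGB image horizontally by magnification value g,
            filling each new pixel with the RGB value of its nearest source pixel."""
            width = image.shape[1]
            new_width = int(round(width * g))
            # For each destination column, pick the closest source column.
            src_cols = np.clip((np.arange(new_width) / g).astype(int), 0, width - 1)
            return image[:, src_cols, :]

        # Example: widen a perspective-narrowed strip by G = 1.5.
        strip = np.random.randint(0, 256, size=(4, 100, 3), dtype=np.uint8)
        corrected = enlarge_width(strip, 1.5)   # resulting shape: (4, 150, 3)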
  • The perspective distortion correction unit 131 may calculate the image magnification value G1 of the edge cameras 112 and 113 in the following order. First, the distance F between the edge cameras 112 and 113 and the vehicle floor surface is measured through the distance measuring unit 133 disposed at a position close to the edge cameras 112 and 113. The perspective distortion correction unit 131 may then calculate the image magnification value G1 of the edge cameras 112 and 113 by substituting the distance F and the A and G values of Equation 1 into Equation 2 below.
  • G1 = (F × G) / A
  • Equation 2 calculates the image magnification value G1 of the edge cameras 112 and 113 relative to the image magnification value G of the center camera 111, using the ratio between the distance A from the center camera 111 to the vehicle floor surface and the distance F from the edge cameras 112 and 113 to the vehicle floor surface.
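  • As a worked example with purely illustrative numbers (not values from the patent): if the center camera is A = 0.5 m from the vehicle floor with an image magnification value of G = 1.2, and an edge camera is F = 0.8 m from the floor, then Equation 2 gives G1 = (0.8 × 1.2) / 0.5 = 1.92, so the edge-camera image is stretched more strongly than the center-camera image.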
  • the perspective distortion corrector 131 enlarges the width of the distorted image acquired through the edge cameras 112 and 113 by using the acquired image enlargement value G1. Pixels that are enlarged to an image enlargement value G1 and need RGB information may be displayed as color by inputting the same information as the RGB value of the pixel adjacent thereto.
  • Alternatively, the edge cameras 112 and 113 may be disposed spaced apart from each other in a plane parallel to the vehicle floor surface, in the same manner as the center camera 111. In this case, the image magnification values of the center camera 111 and the edge cameras 112 and 113 may all be obtained through Equation 1 described above.
  • the perspective distortion corrector 131 may sequentially correct the distorted images over several times.
  • the perspective distortion correction unit 131 transmits the corrected images to the image matching unit 132.
  • the image matching unit 132 matches the plurality of images received from the perspective distortion correction unit 131 to be displayed as one image. In order to precisely match a plurality of images photographing different vehicle floors, the image matching unit 132 compares the RGB values per pixel of the plurality of images, specifies corresponding points to which the RGB values match, and overlaps the corresponding points. Match.
  • FIG. 4 is a schematic diagram illustrating a process of matching the plurality of images by the image matching unit illustrated in FIG. 2. Referring to FIGS. 2 and 4, the image matching unit 132 screens pixels having the same RGB value across the images in order to match the plurality of images. When an image is enlarged, it can be seen that numerous pixels are combined to form the image, and each pixel has its own RGB value (pixel information). As shown in FIG. 4, the image matching unit 132 specifies pixels having matching RGB values as a corresponding point Y, and provides the plurality of images as one image based on the corresponding point Y.
  • the corresponding point Y may be set to a plurality of pixels in which RGB values match.
  • For example, when the RGB values of ten consecutive pixels match between images, the image matching unit 132 may set those ten pixels as a corresponding point.
  • In this embodiment, 10 pixels are specified as the corresponding point Y, but the number of pixels that can be specified as a corresponding point may be variously changed in consideration of the processing speed of the image matching unit 132 and the accuracy of matching.
  • In addition, the image matching unit 132 may specify the corresponding point by further comparing the transparency (α) value together with the RGB value per pixel.
  • Here, transparency (α) is information containing chromaticity, which is a property of color that ignores brightness, and turbidity, which is obtained by optically measuring the degree to which light is scattered.
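  • The corresponding-point idea can be sketched as follows. This Python/NumPy sketch is an illustration under stated assumptions, not the patent's implementation: it works on single RGBA pixel rows, treats a run of 10 consecutive exactly matching pixels as a corresponding point, and overlaps two rows at that point; the function names and the exact-match criterion are assumptions.

        import numpy as np
        from typing import Optional, Tuple

        RUN_LENGTH = 10  # number of consecutive matching pixels treated as a corresponding point

        def find_corresponding_point(row_a: np.ndarray, row_b: np.ndarray) -> Optional[Tuple[int, int]]:
            """Return (index_a, index_b) of the first run of RUN_LENGTH pixels whose
            RGBA values match exactly between the two rows, or None if no run matches."""
            for i in range(len(row_a) - RUN_LENGTH + 1):
                window = row_a[i:i + RUN_LENGTH]
                for j in range(len(row_b) - RUN_LENGTH + 1):
                    if np.array_equal(window, row_b[j:j + RUN_LENGTH]):
                        return i, j
            return None

        def stitch(row_a: np.ndarray, row_b: np.ndarray) -> np.ndarray:
            """Overlap two rows so that their corresponding points coincide."""
            match = find_corresponding_point(row_a, row_b)
            if match is None:
                raise ValueError("no corresponding point found; illumination may be uneven")
            i, j = match
            return np.concatenate([row_a[:i], row_b[j:]])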
  • the image matching unit 132 may transmit the image of the matched vehicle floor to the display device, and the inspector may inspect the vehicle floor surface through the display device.
  • Since the image matching unit 132 specifies the corresponding point based on the RGB value and the transparency, the illuminance of the light illuminating the lower part of the vehicle should be kept constant. If the illuminance of the illumination is not constant, the RGB value and the transparency measured on the vehicle floor change, making it difficult for the image matching unit 132 to specify a corresponding point, and a matching error may occur. In particular, when transparency is used as a criterion for specifying a corresponding point, the uniformity of illuminance under the vehicle is an important factor.
  • the lighting module 120 irradiates light onto the vehicle floor to be photographed by the camera.
  • the lighting module 120 includes a light source unit 121, a sensor unit 122, and a light source control unit 123.
  • the light source unit 121 irradiates light on the bottom surface of the vehicle to brighten a dark photographing environment due to the characteristics of the lower portion of the vehicle.
  • the light source unit 121 may be mounted with a plurality of LED chips, and the number of LED chips may be changed in various ways in consideration of the vehicle lower imaging apparatus 100 and the size of the vehicle to be photographed.
  • the sensor unit 122 receives the reflected light from the light emitted from the light source unit 121 reflected on the bottom surface of the vehicle.
  • the sensor unit 122 senses illuminance information of the received reflected light and transmits illuminance information to the light source control unit 123.
  • In FIG. 1, two sensor units 122 are illustrated on both sides of the camera 110, but the number of sensor units may be variously changed in consideration of the number of LED chips mounted on the light source unit 121 and the number of cameras.
  • the light source control unit 123 collects illuminance information received from the sensor unit 122 and compares respective illuminance information.
  • When a sensor unit 122 with low detected illuminance is identified from the collected illuminance information, the light source control unit 123 may control the light source unit 121 to increase the amount of light emitted by the LED chips located near that sensor unit 122.
  • In this way, the light source control unit 123 compensates for the uneven illuminance that may be caused by the external environment or the shape of the vehicle bottom, so that the image matching unit 132 can specify the corresponding points of the plurality of images without error.
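  • The feedback described above amounts to a simple closed-loop balance. The Python sketch below is an assumption-based illustration of one control iteration: the zone names, tolerance, and duty-cycle step are invented for the example and are not values from the patent.

        def balance_illumination(sensor_lux: dict, led_duty: dict,
                                 tolerance_lux: float = 20.0, step: float = 0.05) -> dict:
            """One control iteration: raise the duty cycle of LED groups whose nearby
            sensor reads below the average illuminance by more than the tolerance."""
            average = sum(sensor_lux.values()) / len(sensor_lux)
            adjusted = dict(led_duty)
            for zone, lux in sensor_lux.items():
                if lux < average - tolerance_lux:
                    adjusted[zone] = min(1.0, adjusted[zone] + step)
            return adjusted

        # Example: the left zone reads darker than the right, so only its duty cycle is increased.
        print(balance_illumination({"left": 480.0, "right": 560.0}, {"left": 0.6, "right": 0.6}))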
  • FIG. 5 is a perspective view of the undercarriage photographing system equipped with the undercarriage photographing apparatus shown in FIG. 1.
  • the vehicle lower imaging system 1000 includes a vehicle lower imaging apparatus 100, a stage 200, and a guide unit 300.
  • the vehicle is positioned above the stage 200.
  • In the stage 200, the portions where the wheels of the vehicle are positioned may protrude slightly higher.
  • the stage 200 may have a form inclined in one direction, which allows rainwater to drain naturally.
  • a laser or an infrared sensor may be attached to the stage 200 to determine whether the vehicle has stopped at the correct position of the stage 200.
  • A sensing camera 400 that detects vehicle numbers may be mounted on the stage 200 to recognize the number of a vehicle entering the stage 200.
  • the guide part 300 extends along the longitudinal direction of the central portion of the stage 200 and moves the under vehicle imaging apparatus 100. Accordingly, the vehicle lower imaging apparatus 100 may move while photographing the vehicle bottom surface along the guide unit 300.
  • The under-vehicle imaging method of the present invention includes (A) photographing a plurality of images of the underside of a vehicle using a camera, (B) correcting the perspective-distorted portions of the plurality of photographed images using the angle of view of the camera and the distance between the camera and the bottom surface of the vehicle, and (C) comparing the RGB values per pixel of the perspective-corrected images and overlapping the plurality of images based on corresponding points where the RGB values match, to form one image.
  • Step (A) first irradiates light onto the vehicle floor to be photographed using a light source (A-1). Subsequently, the light reflected from the vehicle bottom surface is received to measure the illuminance of the vehicle bottom surface, and the light irradiation amount of the light source is adjusted so that the illuminance of the reflected light is uniform (A-2). Thereafter, the camera photographs the vehicle floor (A-3).
  • Step (B) measures the distance between the camera and the vehicle floor to be photographed (B-1), checks the angle of view of the camera (B-2), calculates the image magnification value for enlarging the width of the perspective-distorted image (B-3), and enlarges the distorted portion of the perspective-distorted images by the image magnification value (B-4). Since the method of calculating the image magnification value has been described above, its detailed description is omitted. Subsequently, the newly created pixels of the enlarged image are assigned the same RGB values as the adjacent pixels before correction (B-5).
  • Step (C) compares the RGB values per pixel between the corrected images, specifies corresponding points consisting of at least 10 consecutive pixels having the same RGB values on the image (C-1), and overlaps the plurality of images at the corresponding points to match them into one image (C-2).
  • The corresponding point may include pixels having the same RGB value and transparency (α) value per pixel across the plurality of images.
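  • Tying steps (A) through (C) together, the following Python sketch shows one possible end-to-end flow; every object and helper name here (camera, light_source, sensors, distance_sensor, magnification_value, enlarge_width, stitch) is a placeholder corresponding to the steps above, not an API defined by the patent.

        def photograph_underside(camera, light_source, sensors, distance_sensor):
            """Illustrative flow of steps (A)-(C) with placeholder components."""
            # (A) illuminate, equalize illuminance, then capture a series of images
            light_source.on()                                               # (A-1)
            light_source.equalize(sensors.read_illuminance())               # (A-2)
            images = [camera.capture() for _ in range(camera.frame_count)]  # (A-3)

            # (B) correct perspective distortion using distance and angle of view
            distance = distance_sensor.measure()                            # (B-1)
            angle_of_view = camera.angle_of_view                            # (B-2)
            g = magnification_value(distance, angle_of_view)                # (B-3), per Equation 1
            corrected = [enlarge_width(img, g) for img in images]           # (B-4)/(B-5)

            # (C) match the corrected images into one image at corresponding points
            result = corrected[0]
            for img in corrected[1:]:
                result = stitch(result, img)                                # (C-1)/(C-2)
            return result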
  • As described above, the under-vehicle imaging apparatus and the method of operating the same according to the present invention correct the perspective distortion of the image that arises because the camera is located close to the vehicle floor when photographing the vehicle bottom surface.
  • Furthermore, by specifying corresponding points of the plurality of images using the RGB values and the transparency and then matching the plurality of images, a clear image of the vehicle floor can be obtained without distortion. Therefore, the under-vehicle imaging apparatus and the method of operating the same according to the present invention can accurately detect dangerous goods that may be installed on the vehicle floor.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)
  • Image Processing (AREA)
  • Closed-Circuit Television Systems (AREA)

Abstract

The present invention relates to a vehicle undercarriage photographing apparatus for capturing a clear image of the bottom surface of a vehicle without perspective distortion, and to an operating method therefor. The present invention comprises: a camera for capturing an image of the bottom surface of a vehicle; a lighting module for irradiating light onto the bottom surface of the vehicle to be photographed by the camera; and a correction module for correcting the image of the bottom surface of the vehicle obtained by the camera. The correction module comprises: a perspective distortion correction unit for correcting the perspective distortion produced by the distance difference between the bottom surface of the vehicle and the camera according to the angle of view of the camera; and an image matching unit for specifying a corresponding point having matching RGB values by comparing the RGB values of each pixel of the images corrected by the perspective distortion correction unit, and matching multiple images into a single image on the basis of the corresponding point.
PCT/KR2017/001434 2016-02-12 2017-02-10 Vehicle undercarriage photographing apparatus and vehicle undercarriage photographing method for operating the same WO2017138754A1 (fr)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
KR10-2016-0016340 2016-02-12
KR20160016340 2016-02-12
KR10-2016-0027489 2016-03-08
KR1020160027489A KR101630596B1 (ko) 2016-02-12 2016-03-08 Vehicle undercarriage photographing apparatus and vehicle undercarriage photographing method for operating the same

Publications (1)

Publication Number Publication Date
WO2017138754A1 true WO2017138754A1 (fr) 2017-08-17

Family

ID=56192082

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2017/001434 WO2017138754A1 (fr) 2016-02-12 2017-02-10 Vehicle undercarriage photographing apparatus and vehicle undercarriage photographing method for operating the same

Country Status (2)

Country Link
KR (1) KR101630596B1 (fr)
WO (1) WO2017138754A1 (fr)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106524952B (zh) * 2016-12-22 2022-03-01 桂林施瑞德科技发展有限公司 Monocular camera 3D wheel aligner
KR101910062B1 (ko) * 2017-07-06 2018-10-19 (주)잼시큐리티시스템 Vehicle undercarriage photographing system
CA3136110A1 (fr) * 2019-04-02 2020-10-08 ACV Auctions Inc. Systeme d'imagerie de train roulant de vehicule
US10893213B2 (en) 2019-04-02 2021-01-12 ACV Auctions Inc. Vehicle undercarriage imaging system
US11770493B2 (en) 2019-04-02 2023-09-26 ACV Auctions Inc. Vehicle undercarriage imaging system
KR20240047180A (ko) * 2022-10-04 2024-04-12 현대자동차주식회사 Image processing apparatus, image processing system and image processing method


Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4079680B2 (ja) * 2002-04-26 2008-04-23 独立行政法人産業技術総合研究所 Image composition apparatus and method
JP2012008020A (ja) * 2010-06-25 2012-01-12 Clarion Co Ltd In-vehicle device
KR101164915B1 (ko) * 2011-01-21 2012-07-19 (주)옵티스 Distortion correction apparatus for a stereoscopic camera
US9258086B2 (en) 2011-08-03 2016-02-09 Qualcomm Incorporated Allocating physical hybrid ARQ indicator channel (PHICH) resources

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20090008560A (ko) * 2007-07-18 2009-01-22 삼성전자주식회사 Method and apparatus for implementing a panorama function of a camera in a portable terminal
JP2009128261A (ja) * 2007-11-27 2009-06-11 Takashima Giken Kk Appearance inspection method and apparatus
KR20120095220A (ko) * 2011-02-18 2012-08-28 엘지전자 주식회사 Image display device and method thereof
KR20130069347A (ко) * 2011-12-14 2013-06-26 삼성전자주식회사 Imaging apparatus and imaging method
KR101450733B1 (ко) * 2013-12-19 2014-10-16 인천국제공항공사 Apparatus and method for inspecting the underside of a vehicle

Also Published As

Publication number Publication date
KR101630596B1 (ko) 2016-06-14

Similar Documents

Publication Publication Date Title
WO2017138754A1 (fr) 2017-08-17 Vehicle undercarriage photographing apparatus and vehicle undercarriage photographing method for operating the same
CN108760765B (zh) 一种基于侧视相机拍摄的表面损伤缺陷检测装置及方法
KR102178903B1 (ko) 외관 검사 장치, 및 외관 검사 장치의 조명 조건 설정 방법
US7471381B2 (en) Method and apparatus for bump inspection
KR101949257B1 (ko) 디스플레이 모듈 검사장치 및 검사방법
TWI471542B (zh) Tire shape inspection device and tire shape inspection method
WO2013009065A2 (fr) Appareil pour contrôler la vision 3d d'un composant led et procédé de contrôle de vision
WO2011087337A2 (fr) Dispositif d'inspection de substrat
WO2013062345A1 (fr) Procédé de commande d'éclairage en couleur pour améliorer la qualité d'image dans un système de vision
WO2016163840A1 (fr) Appareil de mesure de forme tridimensionnelle
JP2006189421A (ja) 高知能デジタルイメージ検査システム及びその検査方法
WO2013176482A1 (fr) Procédé de mesure de la hauteur pour un dispositif de mesure de formes tridimensionnelles
WO2012134146A1 (fr) Appareil pour inspecter la vision à l'aide d'une vision stéréo et d'un modèle de grille
WO2016099154A1 (fr) Procédé de contrôle et appareil de contrôle pour un substrat sur lequel des composants sont chargés
WO2015026210A1 (fr) Procédé d'inspection de joint de soudure
JP4383071B2 (ja) ウエハ収納カセットの検査装置及び方法
WO2017204452A1 (fr) Système de détection de défaut de film optique et procédé de détection de défaut de film optique
KR102014171B1 (ko) 유기발광소자의 혼색 불량 검출장치 및 검출방법
WO2012134147A1 (fr) Appareil d'inspection visuelle utilisant une grille de partition de la lumière visible et une grille de partition de la lumière invisible
WO2016024648A1 (fr) Appareil d'inspection de plan de grande surface
JP2009080004A (ja) 検査装置
WO2015076513A1 (fr) Appareil d'inspection de transmittance de motif imprimé pour capteur ir
KR101198406B1 (ko) 패턴 검사 장치
US10091443B2 (en) Camera system and method for inspecting and/or measuring objects
WO2013180394A1 (fr) Dispositif d'inspection visuelle de moiré utilisant un réseau uniforme

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17750446

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17750446

Country of ref document: EP

Kind code of ref document: A1