KR101576273B1 - System for recognizing multiplex hybrid object car number - Google Patents


Info

Publication number
KR101576273B1
Authority
KR
South Korea
Prior art keywords
image
vehicle
unit
control signal
channel region
Prior art date
Application number
KR1020150116939A
Other languages
Korean (ko)
Inventor
김홍기
Original Assignee
(주)샤인정보통신
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by (주)샤인정보통신 filed Critical (주)샤인정보통신
Priority to KR1020150116939A priority Critical patent/KR101576273B1/en
Application granted granted Critical
Publication of KR101576273B1 publication Critical patent/KR101576273B1/en

Classifications

    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/01Detecting movement of traffic to be counted or controlled
    • G08G1/04Detecting movement of traffic to be counted or controlled using optical or ultrasonic detectors
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/181Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)
  • Image Analysis (AREA)

Abstract

The present invention relates to a multi-complex object vehicle number recognition system capable of obtaining front and rear images of a vehicle regardless of the number of vehicles and their driving directions, and of recognizing the vehicle number from the obtained images. The system includes: a first sensor part installed toward a first area, sensing the movement direction of a vehicle in the first area and generating a first control signal according to the sensed direction; a second sensor part installed toward a second area, sensing the movement direction of a vehicle in the second area and generating a second control signal according to the sensed direction; a first photographing part installed toward the first area, photographing a vehicle moving in the first area in response to the first or second control signal; a second photographing part installed toward the second area, photographing a vehicle moving in the second area in response to the first or second control signal; and an image control part receiving the first image photographed by the first photographing part and the second image photographed by the second photographing part, extracting and decoding the vehicle number from each image, and generating a third image by combining, based on the decoding result, images from the first and second images that share the same vehicle number.

Description

BACKGROUND OF THE INVENTION

1. Field of the Invention

An embodiment of the present invention relates to a multi-compound object car number identification system.

Systems that recognize vehicle numbers using cameras have recently become common. As crime rates and safety accidents increase, CCTV-linked control systems are increasingly used to track down criminals who flee by vehicle or to locate a particular vehicle, and this trend continues to grow.

In a conventional vehicle identification system, a sensor for detecting the movement of the vehicle and a camera for photographing its license plate are installed facing the front of the vehicle. In this case, only an image of the front of the vehicle can be acquired, and an image of the rear of the vehicle cannot be obtained.

In addition, even if sensors and cameras are provided in both the forward and backward directions along the traveling direction of the vehicle, only different images of the front of the vehicle can be acquired. Moreover, depending on the detection method of the sensor, the image of an escaping vehicle cannot be photographed when that vehicle moves in the backward direction.

Furthermore, when a plurality of vehicles move, an existing vehicle number recognition system cannot acquire both the front and rear images of a given vehicle, nor a matched pair of front and rear images for each driving direction.

  • Registered Patent No. 10-1489859 (Feb. 29, 2015), 'Vehicle number recognition system using X-ray'
  • Registered Patent Publication No. 10-0968433 (June 30, 2010), 'Vehicle identification information storage system and vehicle image information retrieval system using the system'
  • Open Patent Publication No. 10-2008-0023178 (Mar. 12, 2008), 'Automatic Parking Management System'

An embodiment of the present invention provides a multi-compound object vehicle number recognition system capable of acquiring front and rear images of a vehicle regardless of the running direction of the vehicle and the number of vehicles, and recognizing the vehicle number from the obtained images.

A multi-compound object vehicle number recognition system according to an embodiment of the present invention includes: a first sensor unit installed toward a first area, sensing the moving direction of a vehicle with respect to the first area and generating a first control signal according to the sensed moving direction; a second sensor unit installed toward a second area, sensing the moving direction of a vehicle with respect to the second area and generating a second control signal according to the sensed moving direction; a first photographing unit installed toward the first area, which photographs a vehicle moving in the first area in response to the first control signal or the second control signal; a second photographing unit installed toward the second area, which photographs a vehicle moving in the second area in response to the first control signal or the second control signal; and an image control unit which receives a first image captured by the first photographing unit and a second image captured by the second photographing unit, extracts and reads the vehicle number from each of the first and second images, and, based on the reading result, generates a third image by combining images from the first and second images that have the same vehicle number.

The first area and the second area may be regions opposite to each other with reference to the first sensor unit, the second sensor unit, the first photographing unit, and the second photographing unit.

The first sensor unit may include a first ultra wide band (UWB) sensor, and the second sensor unit may include a second UWB sensor. The first UWB sensor senses a vehicle moving toward an area close to the first photographing unit, generates the first control signal, and transmits it to the image control unit. The second UWB sensor senses a vehicle moving toward an area close to the second photographing unit, generates the second control signal, and transmits it to the image control unit.

The first UWB sensor senses the first region by dividing it into a first channel region through an N-th channel region. The first control signal is generated when a signal is sequentially detected in the direction from the N-th channel region toward the first channel region, and is not generated when a signal is detected in only one of the first through N-th channel regions. Here, the first channel region extends from the installation point of the first UWB sensor to a point spaced a predetermined distance from that point, and the N-th channel region extends to the detectable critical point of the first UWB sensor.

The second sensor unit includes a second ultra wide band (UWB) sensor, and the second UWB sensor senses the second region by dividing it into a first channel region through an N-th channel region. The second control signal is generated when a signal is sequentially detected in the direction from the N-th channel region toward the first channel region, and is not generated when a signal is detected in only one of the first through N-th channel regions. Here, the first channel region extends from the installation point of the second UWB sensor to a point spaced a predetermined distance from that point, and the N-th channel region extends to the detectable critical point of the second UWB sensor.

In addition, each of the first photographing unit and the second photographing unit may include at least one of an IP camera, a three-dimensional camera, and a depth camera.

The image control unit may include: a control signal transmitting/receiving unit that receives the first control signal generated by the first sensor unit and transmits it to each of the first and second photographing units, and that receives the second control signal generated by the second sensor unit and transmits it to each of the first and second photographing units; an image processor that performs an image recognition algorithm on the first image and the second image to extract and read the vehicle number from each; and a front and rear image generating unit that compares the vehicle numbers extracted and read from the first image with those extracted and read from the second image, and combines the images having the same vehicle number to form the third image.

Alternatively, the first photographing unit and the second photographing unit may be a first X-ray camera and a second X-ray camera, which irradiate X-rays onto a vehicle moving in the first area and the second area, respectively, and photograph the resulting X-ray images.

In this case, the image control unit may include: a control signal transmitting and receiving unit that receives the first control signal generated by the first sensor unit and transmits it to each of the first and second X-ray cameras, and that receives the second control signal generated by the second sensor unit and transmits it to each of the first and second X-ray cameras; an image processor that extracts and reads the vehicle number from a first X-ray image captured by the first X-ray camera and a second X-ray image captured by the second X-ray camera, respectively; and a front and rear image generating unit that compares the vehicle numbers read from the first X-ray image with those read from the second X-ray image, and combines the X-ray images having the same vehicle number to generate a third X-ray image.

Also, the image control unit may include: an image determination unit that determines whether a license plate exists in each of the first X-ray image and the second X-ray image; and an image capture unit that captures the first X-ray image and the second X-ray image when no license plate exists in them. The image processor can perform the shadow reading algorithm when a license plate exists in each of the first X-ray image and the second X-ray image.

According to the embodiments of the present invention, there is provided a multi-compound object vehicle number recognition system capable of acquiring front and rear images of a vehicle regardless of the running direction and the number of vehicles, and of recognizing the vehicle number from the obtained images.

FIG. 1 is a diagram illustrating the overall configuration of a multi-compound object car number identification system according to an embodiment of the present invention.
FIG. 2 is a block diagram showing the configuration and operation of a first sensor unit, a second sensor unit, a first photographing unit, a second photographing unit, and an image control unit according to an embodiment of the present invention.
FIG. 3 is a block diagram illustrating the operation of a control signal transmitting and receiving unit according to the control signals of the first and second sensor units according to an embodiment of the present invention.
FIGS. 4 to 6 are views showing the operation of the first sensor unit, the second sensor unit, the first photographing unit, and the second photographing unit according to an embodiment of the present invention.
FIG. 7 is a block diagram showing the configuration and operation of a first sensor unit, a second sensor unit, a first photographing unit, a second photographing unit, and an image control unit according to another embodiment of the present invention.

The terms used in this specification will be briefly described and the present invention will be described in detail.

While the present invention has been described in connection with what is presently considered to be the most practical and preferred embodiment, it is to be understood that the invention is not limited to the disclosed embodiments. In certain cases, a term may have been selected arbitrarily by the applicant, in which case its meaning is described in detail in the corresponding description. Therefore, the terms used in the present invention should be defined based on their meaning in the context of the entire disclosure, not simply on their names.

When an element is said to "include" a component throughout the specification, it is to be understood that the element may include other components as well, without departing from the spirit or scope of the present invention. Also, the terms "part," "module," and the like described in the specification mean units for processing at least one function or operation, which may be implemented in hardware, software, or a combination of the two.

Hereinafter, embodiments of the present invention will be described in detail with reference to the accompanying drawings so that those skilled in the art can easily carry out the present invention. The present invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. In order to clearly illustrate the present invention, parts not related to the description are omitted, and similar parts are denoted by like reference characters throughout the specification.

FIG. 1 is a diagram illustrating the overall configuration of a multi-compound object car number identification system according to an embodiment of the present invention. FIG. 2 is a block diagram showing the configuration and operation of a first sensor unit, a second sensor unit, a first photographing unit, a second photographing unit, and an image control unit according to an embodiment of the present invention. FIG. 3 is a block diagram illustrating the operation of a control signal transmitting and receiving unit according to the control signals of the first and second sensor units according to an embodiment of the present invention.

Referring to FIG. 1, a multi-compound object vehicle number recognition system 1000 according to an exemplary embodiment of the present invention includes a first sensor unit 100, a second sensor unit 200, a first photographing unit 300, a second photographing unit 400, and an image control unit 500. In addition, the multi-compound object car number identification system 1000 according to an embodiment of the present invention may further include a management server 600.

Hereinafter, the suffix "A" is appended to the reference numerals of components according to one embodiment of the present invention, and the suffix "B" to those according to another embodiment, so that the example structures are distinguished from each other.

The first sensor unit 100 is installed toward the first area S1, senses the moving direction of the vehicle 10 moving in the first area S1, and can generate the first control signal according to the sensed direction. The first sensor unit 100 may include a first ultra wide band (UWB) sensor 100A. When the vehicle 10 enters the first area S1 and moves from the region remote from the first photographing unit 300 toward the region close to it, the first UWB sensor 100A may generate the first control signal and transmit it to the image control unit 500.

The UWB sensor is a broadband sensor with a bandwidth of 450–500 MHz, unlike a microwave sensor, which has a bandwidth of several tens of MHz. In an embodiment of the present invention, the type of the first sensor unit 100 is not limited; a plurality of photo reflectors or ultrasonic sensors arranged to replace the function of the UWB sensor may also be used. When the movement of the vehicle 10 is sensed, the first sensor unit 100 generates a first control signal for operating the first and second photographing units 300 and 400 and transmits it to the image control unit 500.

The second sensor unit 200 is installed toward the second area S2, senses the moving direction of the vehicle 10 moving in the second area S2, and can generate the second control signal according to the sensed direction. The second sensor unit 200 may include a second ultra wide band (UWB) sensor 200A. When the vehicle 10 enters the second area S2 and moves from the region remote from the second photographing unit 400 toward the region close to it, the second UWB sensor 200A may generate the second control signal and transmit it to the image control unit 500.

In an embodiment of the present invention, the type of the second sensor unit 200 is likewise not limited, and a plurality of photo reflectors or ultrasonic sensors arranged to replace the function of the UWB sensor may also be used. When the movement of the vehicle 10 is sensed, the second sensor unit 200 generates a second control signal for operating the first and second photographing units 300 and 400 and transmits it to the image control unit 500.

The first and second regions S1 and S2 may be disposed on opposite sides with respect to the first sensor unit 100, the second sensor unit 200, the first photographing unit 300, and the second photographing unit 400.

The first UWB sensor 100A senses the first region S1 by dividing it into a first channel region through an N-th channel region, and generates the first control signal when a signal is detected sequentially in the direction from the N-th channel region toward the first channel region. The first UWB sensor 100A does not generate the first control signal when a signal is detected in only one of the first through N-th channel regions. Here, the first channel region is the area from the installation point of the first UWB sensor 100A to a position spaced apart by a certain distance (a, b, c, d), and the N-th channel region extends from the (N-1)-th channel region to the detectable critical point of the first UWB sensor 100A.

The second UWB sensor 200A senses the second region S2 by dividing it into a first channel region through an N-th channel region, and may generate the second control signal when a signal is detected sequentially in the direction from the N-th channel region toward the first channel region. The second UWB sensor 200A does not generate the second control signal when a signal is detected in only one of the first through N-th channel regions. Here, the first channel region is the area from the installation point of the second UWB sensor 200A to a position spaced apart by a certain distance (a', b', c', d'), and the N-th channel region extends from the (N-1)-th channel region to the detectable critical point of the second UWB sensor 200A.

A specific method by which the first and second sensor units 100 and 200 sense the moving direction of the vehicle will be described later.

The first photographing unit 300 is installed toward the first area S1 and can photograph a vehicle moving in the first area S1 in response to the first control signal or the second control signal. The first photographing unit 300 may start a photographing operation when the first or second control signal is received from the image controller 500.

The first photographing unit 300 may include at least one of an IP camera, a three-dimensional camera, and a depth camera. Hereinafter, an embodiment of the present invention will be described in which the first photographing unit 300 is the first camera 300A.

The second photographing unit 400 may be installed toward the second area S2 and photograph a vehicle moving in the second area S2 in response to the first control signal or the second control signal. The second photographing unit 400 may start a photographing operation when the first or second control signal is received from the image controller 500.

The second photographing unit 400 may include at least one of an IP camera, a three-dimensional camera, and a depth camera. Hereinafter, an embodiment of the present invention will be described in which the second photographing unit 400 is a second camera 400A.

As described above, when the vehicle 10 moves toward the first camera 300A in the first area S1, the first UWB sensor 100A senses this and transmits the first control signal to the image control unit 500. The image control unit 500 transmits the received first control signal to the first and second cameras 300A and 400A so that both cameras start photographing. Accordingly, the first and second cameras 300A and 400A can photograph both front and rear images of the vehicle 10 traveling from the first area S1 to the second area S2.

The image control unit 500 receives a first image photographed by the first camera 300A and a second image photographed by the second camera 400A, extracts and reads the vehicle number from each image, and generates a third image by combining, based on the reading result, the images from the first and second images that have the same vehicle number.

As shown in FIG. 2, the image control unit 500A according to an exemplary embodiment of the present invention may include a control signal transmission/reception unit 510A, an image processing unit 520A, a front/rear image generation unit 530A, and a communication interface unit 540A.

As shown in FIG. 3A, the control signal transmitting/receiving unit 510A receives the first control signal generated by the first UWB sensor 100A and transmits it to the first camera 300A and the second camera 400A, respectively. As shown in FIG. 3B, the control signal transmitting/receiving unit 510A receives the second control signal generated by the second UWB sensor 200A and transmits it to the first camera 300A and the second camera 400A, respectively.
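This fan-out behavior can be sketched in Python (a minimal sketch; the class and method names below are illustrative, not from the patent): whichever sensor's control signal arrives, the transmitting/receiving unit forwards it to both cameras so that front and rear views are captured for the same vehicle.

```python
# Illustrative sketch of the control signal fan-out of unit 510A.
# Names (ControlSignalRelay, FakeCamera, start_capture) are hypothetical.

class FakeCamera:
    """Stands in for camera 300A / 400A."""
    def __init__(self, name):
        self.name = name

    def start_capture(self, signal):
        # A real camera would begin photographing here.
        return (self.name, signal)

class ControlSignalRelay:
    """Stands in for the control signal transmitting/receiving unit 510A."""
    def __init__(self, cameras):
        self.cameras = cameras

    def on_control_signal(self, signal):
        # Forward the first OR second control signal to every camera,
        # so both the front and the rear of the vehicle are photographed.
        return [cam.start_capture(signal) for cam in self.cameras]

relay = ControlSignalRelay([FakeCamera("300A"), FakeCamera("400A")])
events = relay.on_control_signal("first_control_signal")
```

Either control signal produces the same fan-out, which is exactly why a single sensor detection yields both a front and a rear image.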

The image processing unit 520A may perform an image recognition algorithm for the first image and the second image to extract and read the vehicle number, respectively.

More specifically, the image processing unit 520A can detect a vehicle as a tracked object from the first image and the second image using an image recognition algorithm. The image processing unit 520A may hold preset image templates in the shape of vehicles, such as templates for a compact car, a medium-sized car, a large car, a bus, and a freight truck, and specific coordinates of each template may correspond to the license plate (front and rear). Using these preset templates, the image processing unit 520A checks whether the photographed object in the first and second image data is a vehicle; when the similarity of the photographed object to a template is at or above a predetermined degree, it judges the object to be a vehicle and extracts the portions of the image matching the template's plate coordinates. The image processing unit 520A can then extract and read the text, including the letters and numbers of the front and rear license plates, using an image recognition algorithm.
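As a rough illustration of this flow (the template names, threshold, coordinates, and stub functions below are assumptions, not from the patent; a real system would use image-matching and OCR libraries), the template check and plate read might be structured like this:

```python
# Hypothetical skeleton of the image processing unit 520A flow:
# 1) compare the photographed object against preset vehicle templates,
# 2) if the similarity clears a threshold, judge it to be a vehicle,
# 3) read the plate text at the matched template's plate coordinates.
# similarity_fn and ocr_fn are stubs for real image-recognition steps.

PLATE_COORDS = {  # per-template plate position (illustrative values)
    "compact": (110, 290),
    "bus": (150, 420),
}
SIMILARITY_THRESHOLD = 0.8

def recognize_plate(image, similarity_fn, ocr_fn):
    best_template, best_score = None, 0.0
    for template in PLATE_COORDS:
        score = similarity_fn(image, template)
        if score > best_score:
            best_template, best_score = template, score
    if best_score < SIMILARITY_THRESHOLD:
        return None  # photographed object judged not to be a vehicle
    return ocr_fn(image, PLATE_COORDS[best_template])

# Stub behaviour: the frame resembles a compact car carrying plate "12GA3456".
plate = recognize_plate(
    "frame.jpg",
    similarity_fn=lambda img, t: 0.9 if t == "compact" else 0.3,
    ocr_fn=lambda img, coords: "12GA3456",
)
```

The threshold check mirrors the "predetermined degree of similarity or more" condition in the text: objects below it (people, animals, clutter) never reach the plate-reading step.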

When a plurality of vehicles are photographed in one image, the image processor 520A can divide it into a plurality of images, one for each photographed vehicle. Accordingly, the image processing unit 520A generates and processes first images (front images) having different vehicle numbers for the first area S1, and second images (rear images) having different vehicle numbers for the second area S2.

The forward and backward image generation unit 530A compares the vehicle numbers extracted and read from the first image with those extracted and read from the second image, and can generate the third image by combining a first image and a second image having the same vehicle number. The third image may include the first image data (front image), the second image data (rear image), and the vehicle number of the same vehicle. That is, the forward and backward image generation unit 530A combines the images having the same vehicle number between the first images (front images) and the second images (rear images) processed by the image processing unit 520A to generate third images (front and rear images). In addition, the third image may include information such as the traveling direction of the vehicle, the shooting location, the passing time, the passing date, the passing speed, and the vehicle type.
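A minimal sketch of this pairing step (the data shapes and names are assumptions for illustration): plate numbers read from the front images are matched against those read from the rear images, and each match yields a combined record corresponding to the "third image".

```python
# Illustrative pairing of front and rear reads by identical plate number,
# as the forward and backward image generation unit 530A is described as doing.

def combine_front_rear(front_reads, rear_reads):
    """front_reads / rear_reads map plate number -> image reference."""
    third_images = {}
    for plate, front_img in front_reads.items():
        rear_img = rear_reads.get(plate)
        if rear_img is not None:
            # Same plate read in both areas: combine into one record.
            third_images[plate] = {"front": front_img, "rear": rear_img}
    return third_images

fronts = {"12GA3456": "cam1_0001.jpg", "34NA7890": "cam1_0002.jpg"}
rears = {"12GA3456": "cam2_0005.jpg"}  # only one vehicle read in area S2
combined = combine_front_rear(fronts, rears)
```

Keying on the decoded plate number is what lets the system stay correct when a plurality of vehicles pass at once: unmatched reads simply produce no third image.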

The communication interface unit 540A is a device for transmitting and receiving data to and from the management server 600. The communication interface unit 540A establishes a predetermined communication channel with the management server 600 based on the protocol stack defined for the communication network, and transmits the result information of the front and rear image generation unit 530A to the management server 600 using the communication protocol defined in the communication program provided in the management server 600. In one embodiment of the present invention, however, the type of the communication network is not limited: a WiFi, ZigBee, Bluetooth, 3G, 4G, or LTE scheme may also be applied.

FIGS. 4 to 6 illustrate the operation of the first UWB sensor 100A, the second UWB sensor 200A, the first camera 300A, and the second camera 400A according to an embodiment of the present invention.

Referring to FIGS. 4 to 6, the manner in which the first UWB sensor 100A and the second UWB sensor 200A sense the moving direction of the vehicle 10 and generate a control signal will now be described.

The first UWB sensor 100A analyzes the signal sensed in each of the first to fourth channel regions A, B, C, and D to sense the moving direction 1 of the vehicle. For example, as shown in FIG. 4, when the first UWB sensor 100A detects a vehicle moving in direction 1 in the first region S1, first in the fourth channel region D and then sequentially in the third channel region C, the second channel region B, and the first channel region A, it recognizes that the vehicle 10 is approaching the first camera 300A and generates the first control signal for operating the first and second cameras 300A and 400A. Accordingly, the first and second cameras 300A and 400A photograph the vehicle moving in direction 1 through the first and second areas S1 and S2, acquiring its front and rear images.

The second UWB sensor 200A analyzes the signal sensed in each of the first to fourth channel regions A', B', C', and D' to sense the moving direction of the vehicle. For example, as shown in FIG. 4, when the second UWB sensor 200A detects a vehicle moving in direction 2 in the second region S2, first in the fourth channel region D' and then sequentially in the third channel region C', the second channel region B', and the first channel region A', it recognizes that the vehicle is approaching the second camera 400A and generates the second control signal for operating the first and second cameras 300A and 400A. Accordingly, the first and second cameras 300A and 400A photograph the vehicle moving in direction 2 through the second and first areas S2 and S1, acquiring its front and rear images.

Similarly, as shown in FIG. 5, when the first UWB sensor 100A detects the vehicle first in the third channel region C and then sequentially in the second channel region B and the first channel region A, it recognizes that the vehicle is approaching the first camera 300A and generates the first control signal for operating the first and second cameras 300A and 400A. Accordingly, the first and second cameras 300A and 400A photograph the vehicle moving in direction 1 through the first and second areas S1 and S2, acquiring its front and rear images.

Likewise, when the second UWB sensor 200A detects the vehicle first in the third channel region C' and then sequentially in the second channel region B' and the first channel region A', it recognizes that the vehicle is approaching the second camera 400A and generates the second control signal for operating the first and second cameras 300A and 400A. Accordingly, the first and second cameras 300A and 400A photograph the vehicle moving in direction 2 through the second and first areas S2 and S1, acquiring its front and rear images.

On the other hand, as shown in FIG. 6, when the first UWB sensor 100A detects an object only in the second channel region B, it recognizes that the object is simply passing near the first camera 300A and does not generate the first control signal for operating the first and second cameras 300A and 400A. That is, whenever an object is detected in only one channel region, the first UWB sensor 100A judges that the object is not a vehicle but, for example, a person or an animal crossing the road, and generates no control signal. Thus, for the first UWB sensor 100A to recognize a vehicle, detections must occur sequentially from the channel remote from the first camera 300A toward the channels close to it; only then does it generate the first control signal for obtaining front and rear images of the vehicle through the first and second cameras 300A and 400A.

The second UWB sensor 200A likewise recognizes that an object is simply passing near the second camera 400A when the object is detected only in the second channel region B', and does not generate the second control signal for operating the first and second cameras 300A and 400A. That is, whenever an object is detected in only one channel region, the second UWB sensor 200A judges that the object is not a vehicle but, for example, a person or an animal crossing the road, and generates no control signal. Thus, for the second UWB sensor 200A to recognize a vehicle, detections must occur sequentially from the channel remote from the second camera 400A toward the channels close to it; only then does it generate the second control signal for obtaining front and rear images of the vehicle through the first and second cameras 300A and 400A.
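The channel-region logic applied by both sensors can be summarized in a short sketch (the channel numbering convention and function below are illustrative): a control signal is produced only when detections sweep from the far channel toward the near channel, and a single-channel detection is rejected as a non-vehicle.

```python
# Illustrative direction test for a UWB sensor with N channel regions.
# Channel 1 is nearest the sensor/camera; channel N is at the detection limit.

def should_generate_control_signal(detections):
    """detections: channel indices in the order they fired (e.g. [4, 3, 2, 1])."""
    if len(set(detections)) < 2:
        # Object seen in only one channel region (FIG. 6): likely a person
        # or animal crossing the road, so no control signal is generated.
        return False
    # Require a sweep from the far channel toward channel 1 (FIGS. 4 and 5).
    return all(near < far for far, near in zip(detections, detections[1:]))

signal_fig4 = should_generate_control_signal([4, 3, 2, 1])  # approach from D
signal_fig5 = should_generate_control_signal([3, 2, 1])     # first seen at C
signal_fig6 = should_generate_control_signal([2])           # only channel B
```

This single predicate captures both the FIG. 4/5 trigger cases and the FIG. 6 rejection case without needing to model the sensor hardware.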

FIG. 7 is a block diagram showing a configuration and an operation method of a first sensor unit, a second sensor unit, a first photographing unit, a second photographing unit, and an image control unit according to another embodiment of the present invention.

In the embodiment of FIG. 7, first and second X-ray cameras 300B and 400B are applied, and the configuration of the image control unit 500B is changed according to the first and second X-ray cameras 300B and 400B.

The first and second X-ray cameras 300B and 400B irradiate X-rays onto vehicles moving in the first area S1 and the second area S2, respectively, and photograph the X-ray images diffracted from the irradiated vehicles.

According to the application of the first and second X-ray cameras 300B and 400B, the image control unit 500B may include a control signal transmitting/receiving unit 510B, an image determination unit 520B, an image processing unit 530B, an image capture unit 540B, a front and rear image generation unit 550B, and a communication interface unit 560B.

Since the control signal transmitting/receiving unit 510B, the front and rear image generation unit 550B, and the communication interface unit 560B of this embodiment are similar in configuration to the control signal transmitting/receiving unit 510A, the front and rear image generation unit 530A, and the communication interface unit 540A, the description below focuses on the image determination unit 520B, the image processing unit 530B, and the image capture unit 540B.

The control signal transmitting/receiving unit 510B receives the first control signal generated by the first sensor unit (hereinafter, the first UWB sensor) 100B and transmits it to the first X-ray camera 300B and the second X-ray camera 400B, respectively, and likewise receives the second control signal generated by the second sensor unit (hereinafter, the second UWB sensor) 200B and transmits it to the first and second X-ray cameras 300B and 400B, respectively.

The image determination unit 520B may determine whether a license plate exists in the first X-ray image and the second X-ray image.

The image processing unit 530B may perform a shade reading algorithm on the first X-ray image and the second X-ray image to extract and read the vehicle number from each. More specifically, when the image determination unit 520B determines that the front and rear license plates are present in the first and second X-ray images, the image processing unit 530B performs the shade reading operation on the first and second X-ray images. The image processing unit 530B reads each X-ray image with the shade reading algorithm and extracts the characters and numbers using the shades (intaglio, relief) of the front and rear plates. For example, the image processing unit 530B binarizes the X-ray images transmitted from the first and second X-ray cameras 300B and 400B into a form corresponding to black and white, extracts fine lines, each corresponding to an area of adjacent pixels sharing the same binarized value (black or white), and determines each number by comparing the fine-line counts against the parameters for distinguishing the numbers stored in a memory (not shown).

When the front and rear license plate areas of the vehicle 10 are tilted, the image processing unit 530B corrects the angles, performs shading correction and noise correction, binarizes the images so that the letters or numbers can be distinguished, and extracts and reads the text. The image processing unit 530B then transmits the letter or number information of the read front and rear license plates to the front and rear image generation unit 550B. The present invention is not limited to the shade reading method described above; any shade reading algorithm generally used for recognizing objects in X-ray images may be applied.
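The binarization and fine-line extraction steps above can be sketched as follows. Since the patent fixes no concrete algorithm, this is a minimal illustration assuming a NumPy grayscale array; the threshold value, the row-wise run-length encoding, and the function names are illustrative stand-ins for the shade reading operation, not the claimed method.

```python
import numpy as np

def binarize_plate_region(gray: np.ndarray, threshold: int = 128) -> np.ndarray:
    """Binarize a grayscale X-ray crop of a license plate into black/white,
    the first step of the shade (intaglio/relief) reading described above."""
    return (gray >= threshold).astype(np.uint8)  # 1 = white, 0 = black

def extract_fine_lines(binary: np.ndarray) -> list[tuple[int, int]]:
    """Collect runs of adjacent same-valued pixels along each row — a crude
    stand-in for the 'fine line' extraction whose counts are compared
    against per-digit parameters held in memory."""
    runs = []
    for r, row in enumerate(binary):
        start = 0
        for c in range(1, len(row) + 1):
            if c == len(row) or row[c] != row[start]:
                runs.append((r, c - start))  # (row index, run length)
                start = c
    return runs
```

A real reader would additionally apply the tilt, shading, and noise corrections mentioned above before counting runs, and would scan columns as well as rows.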

When no license plate exists in the first X-ray image or the second X-ray image, the image capture unit 540B captures those images, generating a still image for the first X-ray image and a still image for the second X-ray image, respectively. The image capture unit 540B may transmit the still images to the communication interface unit 560B.

The front and rear image generation unit 550B compares the vehicle numbers read from the first X-ray image with those read from the second X-ray image, and combines the X-ray images having the same vehicle number into one, thereby generating a third X-ray image. The third image may include the first X-ray image (front image), the second X-ray image (rear image), and the vehicle number of the same vehicle. That is, the front and rear image generation unit 550B combines the first X-ray image (front image) and the second X-ray image (rear image) processed through the image processing unit 530B into a new third image (front and rear images). In addition, the third image may include information such as the traveling direction of the vehicle, the shooting location, the passing time, the passing date, the passing speed, and the vehicle type.
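The pairing performed by the front and rear image generation unit — matching a front reading with the rear reading that carries the same vehicle number and merging them into one "third image" record — can be sketched as below. The `PlateReading` type, the field names, and the dictionary output are assumptions for illustration only.

```python
from dataclasses import dataclass

@dataclass
class PlateReading:
    plate: str     # vehicle number read from the image
    image_id: str  # handle to the captured frame
    meta: dict     # direction, location, pass time, speed, vehicle type...

def pair_front_rear(front: list[PlateReading],
                    rear: list[PlateReading]) -> list[dict]:
    """Match front (first-image) and rear (second-image) readings whose
    plate numbers agree, producing one combined record per vehicle."""
    rear_by_plate = {r.plate: r for r in rear}
    combined = []
    for f in front:
        r = rear_by_plate.get(f.plate)
        if r is not None:
            combined.append({
                "plate": f.plate,
                "front_image": f.image_id,
                "rear_image": r.image_id,
                # carry over per-pass metadata from the front capture
                **f.meta,
            })
    return combined
```

Readings whose plate number appears on only one side would, per the description above, instead be routed to the image capture unit as still images.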

The communication interface unit 560B is a device for transmitting and receiving data with the management server 600. The communication interface unit 560B establishes a predetermined communication channel with the management server 600 based on a protocol stack defined for the communication network, and transmits the result information of the image capture unit 540B and the front and rear image generation unit 550B to the management server 600 using a communication protocol defined in a communication program provided for this purpose. However, the type of communication network is not limited in this embodiment; a Wi-Fi, ZigBee, Bluetooth, 3G, 4G, or LTE scheme may also be applied.
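Since the patent only states that a predefined protocol stack and communication program carry the results to the management server 600, the following sketch shows one plausible framing of a result message: a length-prefixed JSON payload. The field names, version string, and the framing itself are all assumptions, not part of the claimed protocol.

```python
import json

def frame_result(third_image: dict, protocol_version: str = "1.0") -> bytes:
    """Serialize a combined front/rear result into a length-prefixed JSON
    frame suitable for a stream channel to the management server."""
    body = json.dumps({"version": protocol_version, "result": third_image},
                      separators=(",", ":")).encode("utf-8")
    # 4-byte big-endian length prefix so the server can delimit messages
    return len(body).to_bytes(4, "big") + body
```

The receiving side would read four bytes, decode the length, then read exactly that many bytes and parse the JSON body.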

While the multi-complex object vehicle number recognition system according to the present invention has been described above with reference to specific embodiments, the present invention is not limited thereto. It will be understood by those of ordinary skill in the art that various changes in form and details may be made therein without departing from the spirit and scope of the invention.

1000: Multi-Complex Object Vehicle Number Recognition System
100: first sensor unit
200: second sensor unit
300: First shooting section
400: Second shooting section
500: Image control unit
510A: Control signal transmission/reception unit
520A: Image processing unit
530A: Forward and backward image generation unit
540A: Communication interface unit
510B: Control signal transmission/reception unit
520B: Image determination unit
530B: Image processing unit
540B: image capture unit
550B: forward and backward image generating unit
560B: Communication interface unit
600: management server

Claims (9)

A first sensor unit installed toward the first area and sensing a moving direction of the vehicle with respect to the first area and generating a first control signal in accordance with the moving direction of the sensed vehicle;
A second sensor unit installed toward the second area and sensing a moving direction of the vehicle with respect to the second area and generating a second control signal in accordance with the moving direction of the sensed vehicle;
A first photographing unit which is installed toward the first area and photographs a vehicle moving in the first area by the first control signal or the second control signal;
A second photographing unit which is installed toward the second area and photographs a vehicle moving in the second area by the first control signal or the second control signal; And
An image control unit for receiving a first image captured through the first photographing unit and a second image captured through the second photographing unit, extracting and reading the vehicle number from the first image and the second image, respectively, and generating a third image by combining images having the same vehicle number among the first image and the second image,
Wherein the first area and the second area are areas opposite to each other with reference to the first sensor part, the second sensor part, the first photographing part, and the second photographing part,
The first sensor unit includes a first ultra wide band (UWB) sensor,
The first UWB sensor includes:
The first region is divided into a first channel region to an N-th channel region,
Wherein the first control signal is generated when signals are sequentially detected in a direction from the N-th channel region toward the first channel region,
When the signal is detected in only one channel region of the first channel region to the N-th channel region, the first control signal is not generated,
Wherein the first channel region is a region from the installation point of the first UWB sensor to a point spaced from the installation point by a predetermined distance,
Wherein the N-th channel region is an area up to a detectable critical point of the first UWB sensor that is spaced apart from the N-1 channel region by a predetermined distance.
delete
delete
The method according to claim 1,
The second sensor unit includes a second ultra wide band (UWB) sensor,
Wherein the second UWB sensor comprises:
The second region is divided into a first channel region to an N-th channel region,
And generating the second control signal when signals are sequentially detected in a direction from the N-th channel region toward the first channel region,
The second control signal is not generated when a signal is detected only in one of the first channel region to the N-th channel region,
Wherein the first channel region is a region from the installation point of the second UWB sensor to a point spaced from the installation point by a predetermined distance,
And the N-th channel region is an area up to a detectable critical point of the second UWB sensor that is spaced apart from the N-1 channel region by a predetermined distance.
The method according to claim 1,
Wherein each of the first photographing unit and the second photographing unit includes at least one of an IP camera, a three-dimensional camera, and a depth camera.
6. The method of claim 5,
The image control unit includes:
A control signal transmitting/receiving unit for receiving the first control signal generated by the first sensor unit and transmitting it to the first photographing unit and the second photographing unit, respectively, and receiving the second control signal generated by the second sensor unit and transmitting it to the first photographing unit and the second photographing unit, respectively;
An image processor for performing an image recognition algorithm on the first image and the second image to respectively extract and read the vehicle number; And
And a front and rear image generation unit for comparing the vehicle numbers extracted and read from the first image with the vehicle numbers extracted and read from the second image, and combining the images having the same vehicle number with each other to generate a third image constituting the front and rear images of the vehicle.
The method according to claim 1,
Wherein the first photographing unit and the second photographing unit respectively include a first X-ray camera and a second X-ray camera which irradiate X-rays onto a vehicle moving in the first area and the second area, respectively, and photograph an X-ray image diffracted from the vehicle irradiated with the X-rays.
8. The method of claim 7,
The image control unit includes:
A control signal transmitting/receiving unit for receiving the first control signal generated by the first sensor unit and transmitting it to the first X-ray camera and the second X-ray camera, respectively, and receiving the second control signal generated by the second sensor unit and transmitting it to the first X-ray camera and the second X-ray camera, respectively;
An image processing unit for performing a shade reading algorithm on a first X-ray image captured through the first X-ray camera and a second X-ray image captured through the second X-ray camera, to extract and read the vehicle number from each; And
A front and rear image generation unit for comparing the vehicle numbers read from the first X-ray image with the vehicle numbers read from the second X-ray image, and combining the X-ray images having the same vehicle number to generate a third X-ray image constituting the front and rear images of the vehicle.
9. The method of claim 8,
The image control unit includes:
An image judging unit for judging whether a license plate exists in the first X-ray image and the second X-ray image; And
And an image capturing unit capturing the first X-ray image and the second X-ray image when no license plate exists in the first X-ray image and the second X-ray image, respectively,
Wherein the image processing unit performs the shade reading algorithm when a license plate exists in the first X-ray image and the second X-ray image, respectively.
KR1020150116939A 2015-08-19 2015-08-19 System for recognizing multiplex hybrid object car number KR101576273B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020150116939A KR101576273B1 (en) 2015-08-19 2015-08-19 System for recognizing multiplex hybrid object car number


Publications (1)

Publication Number Publication Date
KR101576273B1 true KR101576273B1 (en) 2015-12-21

Family

ID=55083857

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020150116939A KR101576273B1 (en) 2015-08-19 2015-08-19 System for recognizing multiplex hybrid object car number

Country Status (1)

Country Link
KR (1) KR101576273B1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102332517B1 * 2021-08-05 2021-12-01 주식회사 태영정보 Image surveillance control apparatus
KR102521567B1 * 2022-10-27 2023-04-14 엘텍코리아 주식회사 Apparatus and method for recognizing illegal driving of two-wheeled vehicle

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20080023178A (en) 2006-09-07 2008-03-12 주식회사 코리아카파크 Automatic parking management system
KR20080087618A (en) * 2007-07-06 2008-10-01 렉스젠(주) A system and method for photographing the car
KR100968433B1 (en) 2010-04-07 2010-07-08 주식회사 비스타씨엔씨 Store system for the license plate images of vehicle and, search system for images of vehicle using that store system
KR101304900B1 (en) * 2012-05-18 2013-09-05 휴앤에스(주) Photographing system of multiline road
JP2014130435A (en) * 2012-12-28 2014-07-10 Fujitsu Ltd Information processing device and method
KR101422217B1 (en) * 2014-06-16 2014-08-13 (주)샤인정보통신 Method for providing number cognition service of vehicle
KR101489859B1 (en) 2013-11-28 2015-02-06 김홍기 System for recognizing car number using x-ray camera


Similar Documents

Publication Publication Date Title
US20200372267A1 (en) Multi-camera vision system and method of monitoring
KR101999993B1 (en) Automatic traffic enforcement system using radar and camera
KR101647370B1 (en) road traffic information management system for g using camera and radar
KR101971878B1 (en) Video surveillance system and method using deep-learning based car number recognition technology in multi-lane environment
KR101935399B1 (en) Wide Area Multi-Object Monitoring System Based on Deep Neural Network Algorithm
KR20140004291A (en) Forward collision warning system and forward collision warning method
US20140002658A1 (en) Overtaking vehicle warning system and overtaking vehicle warning method
KR100948382B1 (en) Security service method and system
JPH08329393A (en) Preceding vehicle detector
KR101576273B1 (en) System for recognizing multiplex hybrid object car number
JP4123138B2 (en) Vehicle detection method and vehicle detection device
KR20100004706A (en) Security image monitoring system and method using rfid reader
KR20160038943A (en) vehicle detecting system and method using laser sensors in stero camera
JP6431271B2 (en) Vehicle detection and vehicle number recognition device
KR101761357B1 (en) System and method for vehicle monitoring
KR101420242B1 (en) vehicle detector and method using stereo camera
JP5132164B2 (en) Background image creation device
JP4088182B2 (en) Image information processing system
KR101653820B1 (en) System for sensing object and emergency based on thermal detective operation, and method for tracking object using the same
JP2010020557A (en) Image processor and image processing method
JP2005140754A (en) Method of detecting person, monitoring system, and computer program
KR101489859B1 (en) System for recognizing car number using x-ray camera
KR101824042B1 (en) Camera-integrated laser detector and driving method thereof
JP2007259076A (en) Device for recognizing pedestrian
KR20080054094A (en) Method for object recognizing and distance measuring

Legal Events

Date Code Title Description
E902 Notification of reason for refusal
E701 Decision to grant or registration of patent right
GRNT Written decision to grant
FPAY Annual fee payment

Payment date: 20180928

Year of fee payment: 4

FPAY Annual fee payment

Payment date: 20190924

Year of fee payment: 5