KR101691312B1 - Method for showing parking lot and system - Google Patents

Method for showing parking lot and system

Info

Publication number
KR101691312B1
KR101691312B1 (application KR1020150171745A)
Authority
KR
South Korea
Prior art keywords
parking
vehicle
parking lot
image
user
Prior art date
Application number
KR1020150171745A
Other languages
Korean (ko)
Inventor
김대권
박기범
Original Assignee
주식회사 넥스파시스템
주식회사 넥스쿼드
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 주식회사 넥스파시스템, 주식회사 넥스쿼드
Priority to KR1020150171745A
Application granted granted Critical
Publication of KR101691312B1

Links

Images

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00 Systems or methods specially adapted for specific business sectors, e.g. utilities or tourism
    • G06Q50/30 Transportation; Communications
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/14 Traffic control systems for road vehicles indicating individual free spaces in parking areas
    • G08G1/141 Traffic control systems for road vehicles indicating individual free spaces in parking areas with means giving the indication of available parking spaces
    • G08G1/143 Traffic control systems for road vehicles indicating individual free spaces in parking areas with means giving the indication of available parking spaces inside the vehicles
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/14 Traffic control systems for road vehicles indicating individual free spaces in parking areas
    • G08G1/141 Traffic control systems for road vehicles indicating individual free spaces in parking areas with means giving the indication of available parking spaces
    • G08G1/144 Traffic control systems for road vehicles indicating individual free spaces in parking areas with means giving the indication of available parking spaces on portable or mobile units, e.g. personal digital assistant [PDA]
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04W4/003
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W4/00 Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/02 Services making use of location information
    • H04W4/023 Services making use of location information using mutual or relative location information between multiple location based services [LBS] targets or of distance thresholds
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W88/00 Devices specially adapted for wireless communication networks, e.g. terminals, base stations or access point devices
    • H04W88/02 Terminal devices

Abstract

The present invention relates to a method and a system for guiding a user to a parking lot. Specifically, it provides route guidance that takes into account the parking space state, cost, and operation policy of a parking lot close to the user or to the destination, as well as the portion of the trip the user must make on foot. According to one embodiment, a parking guidance method using a terminal and a plurality of parking lots networked with the terminal comprises: a first step of determining the current position of the terminal user; a second step of displaying, on the terminal, those parking lots among the plurality that lie within a preset distance of the user's current position or of the final destination input by the user; a third step in which the user selects a first parking lot from the displayed parking lots; a fourth step of displaying, on the terminal, the route to the first parking lot and the time required; a fifth step in which the terminal receives at least one of the parking space state, cost, and operation policy of the first parking lot from the first parking lot; a sixth step in which the terminal displays at least one of the parking space state, cost, and operation policy of the first parking lot; and a seventh step of displaying the route to the final destination and the time required, taking into account that the user walks on foot from the first parking lot to the final destination.

Description

FIELD OF THE INVENTION [0001]

The present invention relates to a parking guidance method and system. More particularly, it relates to a parking guidance method and system that provides route guidance based on the parking space state, cost, and operation policy of a parking lot close to the user or the destination, and on the fact that part of the trip is made on foot.

A terminal such as a personal computer, a notebook computer, or a mobile phone can be configured to perform various functions. Examples include data and voice communication, capturing photos and video through a camera, voice recording, music file playback through a speaker system, and displaying images or video. Some terminals additionally run games, and others are implemented as multimedia devices. Moreover, recent terminals can receive broadcast or multicast signals to show video or television programs.

In general, terminals can be divided into mobile (portable) terminals and stationary terminals according to their mobility. Mobile terminals can further be divided into handheld terminals and vehicle-mounted terminals according to whether the user can carry them directly.

As such functions diversify, terminals are implemented as multimedia devices with composite capabilities such as capturing photos and video, playing music or video files, running games, and receiving broadcasts.

In order to support and enhance such terminal functions, improvement of the structural and/or software parts of the terminal may be considered.

As terminals become more intelligent and advanced, navigation devices that provide navigation services based on mobile computing technology have appeared; telematics is a representative example.

A terminal (navigation device) is a device that provides a navigation service to a driver (or passenger) based on a position measurement system (e.g., the Global Positioning System) and a geographic information system (GIS); it may be an intelligent computing system that also provides services such as traffic information, emergency response, remote vehicle diagnosis, and Internet use (e.g., financial transactions, news, and mail) to the driver (or passenger).

However, the navigation service that is the core function of a navigation device has the problem that only a uniform, non-intelligent service is provided, owing to the public nature of the service and the uniformity of the navigation content supplied to the device.

In particular, when a user wants to move to a nearby parking lot, conventional navigation only guides the route and travel time to the parking lot; it cannot tell the user whether the parking lot actually has an available space, what it costs, or what its operation policy is.

Therefore, a solution to the above-mentioned problem is required.

Korea Patent Office Registration No. 10-0853191

The present invention provides a parking guidance method and system that gives route guidance based on the parking space state, cost, and operation policy of the parking lot closest to the user or the destination, and on the on-foot portion of the trip.

The present invention also determines the presence of a parking space either manually or automatically, depending on whether the parking lot has an entry/exit section through which entering and departing vehicles can be counted, and provides a method and system for accurately determining the parking space state of a parking lot using video detection, loop coils, ultrasonic sensors, geomagnetic sensors, and the like.

The present invention also provides a method and system for easily settling parking costs through a payment system by recognizing the vehicle type and license plate and comparing them with the information registered in a specific application.

The technical objects to be achieved by the present invention are not limited to those mentioned above; other objects not mentioned will be clearly understood by those skilled in the art from the description below.

According to an embodiment of the present invention for solving the above problem, there is provided a parking guidance method using a terminal and a plurality of parking lots networked with the terminal, the method comprising: a first step of determining the current position of the terminal user; a second step of displaying, on the terminal, those parking lots among the plurality that lie within a predetermined distance of the user's current position or of the final destination input by the user; a third step in which the user selects a first parking lot from the displayed parking lots; a fourth step of displaying, on the terminal, the route to the first parking lot and the time required; a fifth step in which the terminal receives at least one of the presence or absence of a parking space, the cost, and the operation policy of the first parking lot from the first parking lot; a sixth step in which the terminal displays at least one of the presence or absence of a parking space, the cost, and the operation policy of the first parking lot; and a seventh step in which the route to the final destination and the time required are displayed on the terminal, taking into account that the user moves on foot from the first parking lot to the final destination.
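The seven steps above can be sketched as a single guidance pass. The data model, speeds, and the haversine distance filter below are illustrative assumptions, not the patent's disclosed implementation; in particular, step 3 is a user choice in the method, while this sketch simply picks the nearest candidate.

```python
from dataclasses import dataclass
from math import radians, sin, cos, asin, sqrt
from typing import Optional

@dataclass
class ParkingLot:
    name: str
    lat: float
    lon: float
    free_spaces: Optional[int] = None   # step 5: state received from the lot
    fee_per_hour: Optional[int] = None  # step 5: cost
    policy: str = ""                    # step 5: operation policy

def haversine_km(a, b):
    """Great-circle distance in kilometres between two (lat, lon) pairs."""
    lat1, lon1, lat2, lon2 = map(radians, (*a, *b))
    h = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(h))

def guidance(user_pos, destination, lots, max_km=1.0, drive_kmh=30.0, walk_kmh=4.0):
    """Steps 2-7: filter lots near the destination, pick one, report both legs."""
    # Step 2: lots within a preset distance of the destination (or user position).
    candidates = [l for l in lots
                  if haversine_km(destination, (l.lat, l.lon)) <= max_km]
    if not candidates:
        return None
    # Step 3 is a user selection; here we just take the nearest candidate.
    first = min(candidates,
                key=lambda l: haversine_km(destination, (l.lat, l.lon)))
    drive_km = haversine_km(user_pos, (first.lat, first.lon))
    walk_km = haversine_km((first.lat, first.lon), destination)
    return {
        "lot": first.name,
        "drive_min": 60 * drive_km / drive_kmh,  # step 4: route leg by car
        "free_spaces": first.free_spaces,        # step 6: displayed lot state
        "walk_min": 60 * walk_km / walk_kmh,     # step 7: on-foot leg to destination
    }
```

The constant travel speeds stand in for real route and time estimation, which the method delegates to the terminal's navigation function.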

In addition, when the user's current position changes in the first step, the second step may display the parking lots within the predetermined distance of the changed position.

In addition, between the fourth step and the fifth step, the method may further include determining whether the first parking lot has a vehicle entry/exit section, determining the presence of a parking space manually if there is no entry/exit section, and determining it automatically if such a section exists.
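The branch described above reduces to a simple mode decision plus shared occupancy bookkeeping. A minimal sketch; the enum, counter fields, and function names are assumptions for illustration only.

```python
from enum import Enum, auto

class DetectionMode(Enum):
    MANUAL = auto()     # no entry/exit section: staff count arrivals and departures
    AUTOMATIC = auto()  # entry/exit section present: sensors do the counting

def choose_detection_mode(has_entry_exit_section: bool) -> DetectionMode:
    """Decide between manual and automatic determination of parking spaces."""
    return DetectionMode.AUTOMATIC if has_entry_exit_section else DetectionMode.MANUAL

def free_spaces(capacity: int, entered: int, departed: int) -> int:
    """Either mode reduces to the same bookkeeping: capacity minus net occupancy."""
    occupied = max(0, entered - departed)
    return max(0, capacity - occupied)
```

Whichever mode supplies the counts, the lot reports `free_spaces` back to the terminal in the fifth step.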

The method for automatically determining the presence of a parking space may include camera image detection, a method using a loop coil, a method using ultrasonic waves, a method using a geomagnetic sensor, a method using an infrared sensor, a method using a laser, or a method using a pressure sensor.
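As one concrete example of automatic determination, a loop coil senses a vehicle through the drop in coil inductance caused by a metal chassis above it. The baseline value and threshold below are invented for illustration; real installations calibrate each coil individually.

```python
def occupied_by_loop_coil(inductance_uH: float,
                          baseline_uH: float = 120.0,
                          drop_threshold: float = 0.02) -> bool:
    """Report occupancy when inductance drops more than ~2% below the baseline."""
    return (baseline_uH - inductance_uH) / baseline_uH > drop_threshold

def occupancy_map(readings: dict) -> dict:
    """Per-space occupancy from raw micro-henry readings, one coil per space."""
    return {space: occupied_by_loop_coil(uH) for space, uH in readings.items()}
```

The other sensing methods listed (ultrasonic, geomagnetic, infrared, laser, pressure) fit the same shape: a per-space measurement compared against a calibrated threshold.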

The method may further include a step 9-1 in which, when a vehicle departs through the entry/exit section while the user is running the predetermined application on the terminal, the vehicle type and license plate of the departing vehicle are recognized; a step 9-2 of comparing the recognized vehicle type and license plate with the information registered in the predetermined application; and a step 9-3 of settling the parking cost of the departing vehicle through a payment system interlocked with the predetermined application if the recognized vehicle type and license plate match the registered information.
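The departure and settlement steps above amount to matching the recognized plate and vehicle type against the application's registration, then charging a fee. The registration table, flat hourly rate, and per-commenced-hour rounding below are assumptions for illustration; the patent does not prescribe a billing rule.

```python
import math
from datetime import datetime
from typing import Optional

# Hypothetical registration data held by the predetermined application.
REGISTERED = {"12가3456": {"vehicle_type": "sedan", "payment_token": "tok_demo"}}

def settle_parking(plate: str, vehicle_type: str,
                   entered: datetime, departed: datetime,
                   fee_per_hour: int = 2000) -> Optional[int]:
    """Return the amount charged if plate and vehicle type match, else None."""
    reg = REGISTERED.get(plate)  # compare against the registered information
    if reg is None or reg["vehicle_type"] != vehicle_type:
        return None  # no match: fall back to manual payment at the booth
    hours = (departed - entered).total_seconds() / 3600
    amount = math.ceil(hours) * fee_per_hour  # bill per commenced hour (assumed)
    # A real system would now charge reg["payment_token"] through the
    # payment system interlocked with the application.
    return amount
```

Requiring both plate and vehicle type to match guards against plate misreads by the recognition step before any charge is made.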

The terminal and the plurality of parking lots communicate using at least one of short-range communication and wireless communication. The short-range communication may use at least one of Wi-Fi (wireless fidelity), Bluetooth, radio frequency identification (RFID), infrared data association (IrDA), ultra-wideband (UWB), and ZigBee technologies, and the wireless communication may use at least one of code division multiple access (CDMA), frequency division multiple access (FDMA), time division multiple access (TDMA), orthogonal frequency division multiple access (OFDMA), and single carrier frequency division multiple access (SC-FDMA) techniques.

Meanwhile, according to another embodiment of the present invention for solving the above problem, there is provided a terminal networked with a plurality of parking lots, the terminal comprising: a location information module for determining the current position of the terminal user; an input unit for receiving the final destination to which the user wants to move; a display unit for displaying those parking lots among the plurality that lie within a predetermined distance of the user's current position or of the final destination input by the user; a control unit which, when the user selects a first parking lot from the displayed parking lots, controls the display unit to display the route to the first parking lot and the time required; and a wireless communication unit for receiving at least one of the presence or absence of a parking space, the cost, and the operation policy of the first parking lot from the first parking lot, wherein the control unit controls the display unit to display at least one of the presence or absence of a parking space, the cost, and the operation policy of the first parking lot, and to display the route to the final destination and the time required, taking into account that the user moves on foot from the first parking lot to the final destination.

The parking guidance method and system according to at least one embodiment of the present invention configured as described above can provide route guidance that takes into consideration the parking space state, cost, and operation policy of the parking lot closest to the user's present location or destination.

According to the present invention, the presence of a parking space can be checked either manually or automatically, depending on whether the parking lot has an entrance/exit, and the parking space state of a parking lot can be accurately determined using video detection, loop coils, ultrasonic sensors, geomagnetic sensors, infrared sensors, lasers, pressure sensors, and the like.

In addition, the present invention recognizes the vehicle type and license plate of the vehicle and compares them with the information registered in a specific application, so that the parking cost can easily be settled through a payment system.

The effects obtained by the present invention are not limited to those mentioned above; other effects not mentioned will be clearly understood by those skilled in the art to which the present invention belongs from the description below.

FIG. 1 is a block diagram of a mobile terminal according to an embodiment of the present invention.
FIG. 2 is a flowchart for explaining a parking guidance method related to the present invention.
FIG. 3 is a flowchart illustrating an example of determining the presence or absence of a parking space in FIG. 2.
FIGS. 4A to 4C are diagrams illustrating an example of determining the presence or absence of a parking space using a loop coil.
FIGS. 5A to 5D are diagrams illustrating another example of determining the presence or absence of a parking space using the loop coil.
FIGS. 6A and 6B are diagrams illustrating an example of determining the presence or absence of a parking space using video detection.
FIGS. 7A and 7B are diagrams illustrating an example of determining the presence or absence of a parking space using an LPR system.
FIG. 8 is a flowchart of recognizing the vehicle type and license plate of a vehicle and easily settling the parking cost through a payment system.

Hereinafter, an image display apparatus to which the present invention can be applied will be described in detail with reference to the drawings. The suffixes "module" and "unit" used for components in the following description are assigned or used interchangeably only for ease of drafting the specification, and do not by themselves have distinct meanings or roles.

The image display apparatus described in this specification is assumed to be a mobile terminal. Such a mobile terminal may include a mobile phone, a smart phone, a laptop computer, a digital broadcasting terminal, a personal digital assistant (PDA), a portable multimedia player (PMP), and a navigation device. However, those skilled in the art will appreciate that the configuration according to the embodiments described herein may also be applied to fixed terminals such as a digital TV, a desktop computer, and a digital media player.

FIG. 1 is a block diagram of a mobile terminal according to an embodiment of the present invention.

The mobile terminal 100 may include a wireless communication unit 110, an audio/video (A/V) input unit 120, a user input unit 130, a sensing unit 140, an output unit 150, a memory 160, an interface unit 170, a controller 180, a power supply 190, and the like. The components shown in FIG. 1 are not essential, so a mobile terminal having more or fewer components may be implemented.

Hereinafter, the components will be described in order.

The wireless communication unit 110 may include one or more modules that enable wireless communication between the mobile terminal 100 and a wireless communication system, or between the mobile terminal 100 and the network in which the mobile terminal 100 is located. For example, the wireless communication unit 110 may include a broadcast receiving module 111, a mobile communication module 112, a wireless Internet module 113, a short range communication module 114, and a location information module 115.

The broadcast receiving module 111 receives broadcast signals and / or broadcast-related information from an external broadcast management server through a broadcast channel.

The broadcast channel may include a satellite channel and a terrestrial channel. The broadcast management server may refer to a server that generates and transmits a broadcast signal and/or broadcast-related information, or a server that receives a previously generated broadcast signal and/or broadcast-related information and transmits it to a terminal. The broadcast signal may include a TV broadcast signal, a radio broadcast signal, a data broadcast signal, and a broadcast signal in which a data broadcast signal is combined with a TV or radio broadcast signal.

The broadcast-related information may refer to information related to a broadcast channel, a broadcast program, or a broadcast service provider. The broadcast-related information may also be provided through a mobile communication network, in which case it may be received by the mobile communication module 112.

The broadcast-related information may exist in various forms, for example, in the form of an electronic program guide (EPG) of digital multimedia broadcasting (DMB) or an electronic service guide (ESG) of digital video broadcast-handheld (DVB-H).

For example, the broadcast receiving module 111 may receive digital broadcast signals using digital broadcast systems such as Digital Multimedia Broadcasting-Terrestrial (DMB-T), Digital Multimedia Broadcasting-Satellite (DMB-S), Media Forward Link Only (MediaFLO), Digital Video Broadcast-Handheld (DVB-H), and Integrated Services Digital Broadcast-Terrestrial (ISDB-T). Of course, the broadcast receiving module 111 may also be adapted to broadcasting systems other than the digital broadcasting systems described above.

The broadcast signal and / or broadcast related information received through the broadcast receiving module 111 may be stored in the memory 160.

The mobile communication module 112 transmits and receives radio signals to and from at least one of a base station, an external terminal, and a server on a mobile communication network. The radio signal may include a voice call signal, a video call signal, or various types of data according to text/multimedia message transmission and reception.

The wireless Internet module 113 is a module for wireless Internet access and may be built into or externally attached to the mobile terminal 100. WLAN (Wi-Fi), WiBro (wireless broadband), WiMAX (World Interoperability for Microwave Access), HSDPA (High Speed Downlink Packet Access), and the like can be used as wireless Internet technologies.

The short-range communication module 114 refers to a module for short-range communication. Bluetooth, Radio Frequency Identification (RFID), infrared data association (IrDA), Ultra Wideband (UWB), ZigBee, and the like can be used as a short range communication technology.

The position information module 115 is a module for obtaining the position of the mobile terminal, and a representative example thereof is a Global Position System (GPS) module.

Referring to FIG. 1, the audio/video (A/V) input unit 120 is for inputting an audio or video signal and may include a camera 121 and a microphone 122. The camera 121 processes image frames, such as still images or moving images, obtained by its image sensor in a video call mode or a photographing mode. The processed image frames can be displayed on the display unit 151.

The image frames processed by the camera 121 may be stored in the memory 160 or transmitted to the outside through the wireless communication unit 110. Two or more cameras 121 may be provided depending on the use environment.

The microphone 122 receives an external sound signal in a call mode, a recording mode, a voice recognition mode, or the like, and processes it into electrical voice data. In the call mode, the processed voice data can be converted into a form transmittable to a mobile communication base station through the mobile communication module 112 and then output. Various noise reduction algorithms may be implemented in the microphone 122 to remove noise generated while receiving the external sound signal.

The user input unit 130 generates input data with which the user controls the operation of the terminal. The user input unit 130 may include a keypad, a dome switch, a touch pad (resistive/capacitive), a jog wheel, a jog switch, and the like.

The sensing unit 140 senses the current state of the mobile terminal 100, such as its open/closed state, its position, the presence or absence of user contact, and its orientation, and generates a sensing signal for controlling the operation of the mobile terminal 100. For example, when the mobile terminal 100 is a slide phone, the sensing unit can sense whether the slide phone is opened or closed. It can also sense whether the power supply unit 190 is supplying power, whether the interface unit 170 is connected to an external device, and the like. Meanwhile, the sensing unit 140 may include a proximity sensor 141.

The output unit 150 generates output related to the visual, auditory, or tactile senses and may include a display unit 151, an audio output module 152, an alarm unit 153, a haptic module 154, a projector module 155, and the like.

The display unit 151 displays (outputs) information processed by the mobile terminal 100. For example, when the mobile terminal is in a call mode, it displays a user interface (UI) or graphic user interface (GUI) associated with the call. When the mobile terminal 100 is in a video call mode or a photographing mode, it displays the photographed and/or received video, or the UI and GUI.

The display unit 151 may include at least one of a liquid crystal display (LCD), a thin film transistor-liquid crystal display (TFT LCD), an organic light-emitting diode (OLED) display, a flexible display, and a 3D display.

Some of these displays may be of a transparent or light-transmissive type so that the outside can be seen through them. Such a display can be called a transparent display, and a typical example is the transparent OLED (TOLED). The rear structure of the display unit 151 may also be of a light-transmissive type. With this structure, the user can see an object located behind the terminal body through the area occupied by the display unit 151 of the terminal body.

There may be two or more display units 151 according to the embodiment of the mobile terminal 100. For example, in the mobile terminal 100, a plurality of display portions may be spaced apart from one another or may be disposed integrally with each other, or may be disposed on different surfaces.

When the display unit 151 and a sensor for sensing a touch operation (hereinafter, a 'touch sensor') form a mutual layer structure (hereinafter, a 'touch screen'), the display unit 151 can be used as an input device in addition to an output device. The touch sensor may take the form of, for example, a touch film, a touch sheet, or a touch pad.

The touch sensor may be configured to convert a change in a pressure applied to a specific portion of the display unit 151 or a capacitance generated in a specific portion of the display unit 151 into an electrical input signal. The touch sensor can be configured to detect not only the position and area to be touched but also the pressure at the time of touch.

If there is a touch input to the touch sensor, the corresponding signal(s) is sent to a touch controller. The touch controller processes the signal(s) and transmits the corresponding data to the controller 180. In this way, the controller 180 can know which area of the display unit 151 has been touched.

The proximity sensor 141 may be disposed in an inner region of the mobile terminal surrounded by the touch screen, or in the vicinity of the touch screen. The proximity sensor refers to a sensor that detects, without mechanical contact, the presence or absence of an object approaching a predetermined detection surface or an object existing nearby, using the force of an electromagnetic field or infrared rays. The proximity sensor has a longer life span than a contact sensor, and its utility is also higher.

Examples of the proximity sensor include a transmission-type photoelectric sensor, a direct-reflection photoelectric sensor, a mirror-reflection photoelectric sensor, a high-frequency oscillation proximity sensor, a capacitive proximity sensor, a magnetic proximity sensor, and an infrared proximity sensor. When the touch screen is electrostatic, it is configured to detect the proximity of the pointer by the change in the electric field caused by the pointer's approach. In this case, the touch screen (touch sensor) may be classified as a proximity sensor.

Hereinafter, for convenience of explanation, the act of bringing a pointer close to the touch screen without contact so that it is recognized as being positioned on the touch screen is referred to as a "proximity touch", and the act of actually bringing the pointer into contact with the touch screen is referred to as a "contact touch". The position of a proximity touch on the touch screen means the position at which the pointer corresponds vertically to the touch screen when the pointer makes the proximity touch.

The proximity sensor detects a proximity touch and a proximity touch pattern (e.g., a proximity touch distance, a proximity touch direction, a proximity touch speed, a proximity touch time, a proximity touch position, a proximity touch movement state, and the like). Information corresponding to the detected proximity touch operation and the proximity touch pattern may be output on the touch screen.

The audio output module 152 may output audio data received from the wireless communication unit 110 or stored in the memory 160 in a call signal reception mode, a call mode, a recording mode, a voice recognition mode, a broadcast reception mode, and the like. The audio output module 152 also outputs sound signals related to functions performed in the mobile terminal 100 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output module 152 may include a receiver, a speaker, a buzzer, and the like.

The alarm unit 153 outputs a signal notifying the occurrence of an event in the mobile terminal 100. Examples of events occurring in the mobile terminal include call signal reception, message reception, key signal input, and touch input. The alarm unit 153 may output a signal notifying the occurrence of an event in a form other than a video or audio signal, for example, vibration. Since the video or audio signal may also be output through the display unit 151 or the audio output module 152, these may be classified as part of the alarm unit 153.

The haptic module 154 generates various tactile effects that the user can feel. A typical example of the tactile effect generated by the haptic module 154 is vibration. The intensity and pattern of the vibration generated by the haptic module 154 are controllable. For example, different vibrations may be combined and output, or output sequentially.

In addition to vibration, the haptic module 154 can generate various tactile effects, such as an effect of a pin arrangement moving vertically against the contacted skin surface, a jet or suction force of air through a jet port or suction port, brushing against the skin surface, contact with an electrode, an electrostatic force, and an effect of reproducing a sense of cold or warmth using an element capable of absorbing or emitting heat.

The haptic module 154 can be implemented not only to transmit a tactile effect through direct contact but also to let the user feel a tactile effect through the kinesthetic sense of a finger or arm. Two or more haptic modules 154 may be provided according to the configuration of the mobile terminal 100.

The projector module 155 is a component for performing an image projection function using the mobile terminal 100, and can display, on an external screen or wall, an image identical to, or at least partly different from, the image displayed on the display unit 151 in accordance with a control signal of the controller 180.

Specifically, the projector module 155 may include a light source (not shown) that generates light (for example, laser light) for outputting an image to the outside, image generating means (not shown) for producing the image to be output using the light generated by the light source, and a lens (not shown) for enlarging the image and outputting it to the outside at a predetermined focal distance. Further, the projector module 155 may include a device (not shown) capable of mechanically moving the lens or the entire module to adjust the projection direction of the image.

The projector module 155 can be divided into a cathode ray tube (CRT) module, a liquid crystal display (LCD) module, and a digital light processing (DLP) module according to the type of display means. In particular, the DLP module, which enlarges and projects an image generated by reflecting the light from the light source off a digital micromirror device (DMD) chip, can be advantageous for miniaturizing the projector module 155.

Preferably, the projector module 155 is provided longitudinally on a side, the front, or the back of the mobile terminal 100. It goes without saying, however, that the projector module 155 may be provided at any position of the mobile terminal 100 as the occasion demands.

The memory unit 160 may store programs for the processing and control performed by the control unit 180, and may temporarily store input/output data (for example, a telephone directory, messages, audio, still images, and moving images). The memory unit 160 may also store the frequency of use of each item of data (for example, the frequency of use of each telephone number, each message, and each piece of multimedia). In addition, the memory unit 160 may store data on the various patterns of vibration and sound output when a touch is input on the touch screen.

The memory 160 may include at least one storage medium of a flash memory type, a hard disk type, a multimedia card micro type, a card type memory (for example, SD or XD memory), RAM (Random Access Memory), SRAM (Static Random Access Memory), ROM (Read-Only Memory), EEPROM (Electrically Erasable Programmable Read-Only Memory), PROM (Programmable Read-Only Memory), a magnetic memory, a magnetic disk, and an optical disk. The mobile terminal 100 may also operate in association with a web storage that performs the storage function of the memory 160 on the Internet.

The interface unit 170 serves as a path to all external devices connected to the mobile terminal 100. The interface unit 170 receives data from an external device, receives power and delivers it to each component in the mobile terminal 100, or transmits data from the mobile terminal 100 to the external device. For example, a wired/wireless headset port, an external charger port, a wired/wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video input/output (I/O) port, an earphone port, and the like may be included in the interface unit 170.

The identification module is a chip that stores various information for authenticating the right to use the mobile terminal 100, and may include a User Identity Module (UIM), a Subscriber Identity Module (SIM), a Universal Subscriber Identity Module (USIM), and the like. A device having an identification module (hereinafter referred to as an "identification device") may be manufactured in a smart card format. Accordingly, the identification device can be connected to the terminal 100 through a port.

When the mobile terminal 100 is connected to an external cradle, the interface unit may serve as a path through which power from the cradle is supplied to the mobile terminal 100, or as a path through which various command signals input by the user to the cradle are transmitted to the mobile terminal. The various command signals or the power input from the cradle may serve as signals for recognizing that the mobile terminal is correctly mounted on the cradle.

The controller 180 typically controls the overall operation of the mobile terminal, for example, voice calls, data communication, and video calls. The control unit 180 may include a multimedia module 181 for multimedia playback. The multimedia module 181 may be implemented within the control unit 180 or may be implemented separately from the control unit 180.

The controller 180 may perform a pattern recognition process for recognizing handwriting input or drawing input performed on the touch screen as characters and images, respectively.

The power supply unit 190 receives external power and internal power under the control of the controller 180 and supplies power necessary for operation of the respective components.

The various embodiments described herein may be embodied in a recording medium readable by a computer or similar device using, for example, software, hardware, or a combination thereof.

According to a hardware implementation, the embodiments described herein may be implemented using at least one of application-specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field-programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, and other electronic units for performing functions. In some cases, the embodiments described herein may be implemented by the control unit 180 itself.

According to a software implementation, embodiments such as the procedures and functions described herein may be implemented with separate software modules. Each of the software modules may perform one or more of the functions and operations described herein. Software code can be implemented in a software application written in a suitable programming language. The software code may be stored in the memory 160 and executed by the control unit 180.

Meanwhile, as terminals have become more intelligent and advanced, navigation devices that provide a navigation service to the user based on mobile computing technology have appeared, and telematics terminals are a typical example of such navigation devices.

A terminal (navigation device) is an intelligent computing system that provides a navigation service to a driver (or passenger) based on a position measurement system (for example, the Global Positioning System) and a geographic information system (GIS), and that provides various additional services such as traffic information, emergency response, remote vehicle diagnosis, and Internet use (for example, financial transactions, news, and mail).

However, with respect to the navigation service that is the unique function of a navigation device, there is a problem in that only a uniform, unintelligent navigation service is provided, owing to the public nature of the service and the uniformity of the navigation content provided on the device.

In particular, when the user wishes to move to a nearby parking lot, the device merely guides the route and the time required to reach the parking lot; it cannot indicate whether a parking space actually exists in that parking lot.

SUMMARY OF THE INVENTION Accordingly, the present invention provides a parking lot guidance method and system that provides guidance based on the parking space availability, cost, and operation policy of the parking lot closest to the user or to the destination.

According to the present invention, entry and exit counting can be confirmed manually or automatically depending on whether the parking lot has a controlled entrance/exit, thereby enabling guidance on parking space availability, and a method and system are provided for accurately determining the parking space of a parking lot using video detection, loop coils, ultrasonic sensors, geomagnetic sensors, and the like.

In addition, the present invention provides a method and system for easily settling the parking cost through a payment system, via a step of recognizing the vehicle type and license plate and comparing them with information registered in a specific application.

FIG. 2 is a flowchart for explaining a parking lot guidance method related to the present invention.

Referring to FIG. 2, first, a step S10 in which the user executes a specific application of the mobile terminal 100 may be performed.

Step S10 triggers the parking lot search system proposed by the present invention. Besides executing the specific application proposed by the present invention, it may be performed through an operation such as inputting a predetermined touch to the mobile terminal 100 or applying a preset gesture to the terminal 100.

Thereafter, a step S20 of displaying nearby parking lot candidates based on the current position of the user may be performed.

That is, the current position of the user is determined through a GPS (Global Positioning System) module, which is a typical example of the position information module 115, and parking lots existing within a certain distance of the detected position of the user can be displayed on the display unit 151.

However, when the user has set other conditions in advance, the parking lots arranged according to the set conditions can be displayed on the display unit 151.

In addition, when the current position changes as the user moves, parking lot candidates located within a short distance of the changed position may be displayed on the display unit 151.

In addition, although this is described as being performed in step S60, it is also possible for the user to input the final destination in advance through the user input unit at this stage.

In this case, a list of parking lots located within a certain distance of the user's current position or final destination may be displayed.
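As an illustration of the distance filter in step S20, the sketch below keeps only the lots within a radius of the user's GPS fix (or of the entered destination) and orders them nearest first. The haversine formula, the field names, and the default radius are assumptions for the sketch, not part of the claimed system.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two GPS fixes."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def nearby_parking_lots(reference, lots, radius_km=1.0):
    """Return the lots within radius_km of the reference point, nearest first."""
    measured = [
        (haversine_km(reference[0], reference[1], lot["lat"], lot["lon"]), lot)
        for lot in lots
    ]
    return [lot for d, lot in sorted(measured, key=lambda x: x[0]) if d <= radius_km]
```

The same helper could be reused for the destination-centred list, simply by passing the destination coordinates as `reference`.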

Thereafter, a step S30 of selecting a parking lot to which the user wishes to move may be performed.

Step S30 may be performed by the user inputting a command through the user input unit 130 or by touch input to the display unit 151 which is a touch screen.

Thereafter, step S40 may be performed in which the route to the selected parking lot is searched from the memory 160 according to the control of the controller 180 and the route guidance is automatically performed.

In addition, the step (S50) of receiving at least one of the presence / absence of the parking space, the cost and the operation policy from the selected parking lot is performed.

In step S50, the user can receive information on at least one of the parking space availability, the cost, and the operation policy from the selected parking lot, either on request or periodically.

In the step S50, the aforementioned short-range communication or long-distance communication may be used.

FIG. 3 is a flowchart illustrating an example of performing the determination of parking space availability in FIG. 2.

Referring to FIG. 3, first, it is determined whether or not the parking lot has a vehicle entrance/exit (S100).

If the parking lot has no vehicle entrance/exit, a step S110 is performed in which entry and exit counting is confirmed manually.

If there is a vehicle entrance/exit, a step S120 of confirming entry and exit counting automatically is performed.

In step S120, camera image detection, loop coils, and detection using ultrasonic, geomagnetic, infrared, laser, or pressure sensors can be used; this will be described in detail with reference to FIGS. 4A to 7B.
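The branch of FIG. 3 (S100, then S110 or S120) can be summarised in code. The function name and the sensor labels are illustrative assumptions:

```python
def determine_occupancy_mode(has_gateway, sensors):
    """Mirror the FIG. 3 decision: manual confirmation when the lot has no
    controlled entrance/exit (S110), automatic counting otherwise (S120)."""
    if not has_gateway:
        return "manual"  # S110: an attendant confirms entries and exits
    # S120: pick any available automatic detector, in the order listed
    supported = {"camera", "loop_coil", "ultrasonic", "geomagnetic",
                 "infrared", "laser", "pressure"}
    usable = [s for s in sensors if s in supported]
    return "auto:" + usable[0] if usable else "manual"
```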

Returning to FIG. 2, after receiving at least one of the parking space availability, cost, and operation policy from the selected parking lot, the user can select a final destination, and in response a step S60 is performed in which the control unit 180 guides the route while reflecting the time for which the user moves on foot.

If the final destination was already input in step S20, it may not be necessary to select the final destination separately in step S60.

As a result, the user is not simply given guidance to the parking lot and the time required; rather, guidance to the parking lot is performed automatically, and when the user's final destination has been input, the route and time, including the walk from the parking lot to the destination, can be guided.
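A minimal sketch of the combined guidance of step S60, under the assumption that the drive time, walking distance, and free-space count are already known for each candidate lot (the field names and the 4 km/h walking speed are illustrative):

```python
def guide_total_time(drive_min_to_lot, walk_speed_kmh, walk_km_to_destination):
    """Door-to-door estimate: drive to the lot, then walk to the destination."""
    walk_min = walk_km_to_destination / walk_speed_kmh * 60
    return drive_min_to_lot + walk_min

def best_lot(lots, walk_speed_kmh=4.0):
    """Pick the lot minimising drive time plus walking time, skipping
    lots with no free parking space."""
    candidates = [l for l in lots if l["free_spaces"] > 0]
    return min(candidates,
               key=lambda l: guide_total_time(l["drive_min"], walk_speed_kmh,
                                              l["walk_km"]))
```

Note that a lot that is quicker to drive to can lose to one closer to the destination once walking time is included, which is exactly the point of step S60.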

As described above, in step S120, camera image detection, detection through loop coils, and detection using ultrasonic, geomagnetic, infrared, laser, or pressure sensors can be used. Hereinafter, these will be described in detail.

FIGS. 4A to 5D illustrate a parking space recognition method using a loop coil, FIGS. 6A and 6B illustrate a parking space recognition method using camera image detection, and FIGS. 7A and 7B illustrate a method of determining parking space availability using an LPR system.

However, the methods described below are merely examples for application of the present invention, and can be implemented by a variety of methods, and the present invention is not limited thereto.

First, a parking space recognition method using a loop coil will be described.

FIGS. 4A to 4C are diagrams for illustrating an example of performing a parking space presence / absence determination using a loop coil.

Referring to FIGS. 4A and 4B, a loop coil 4120 is disposed at the center of each parking surface 4110 of the parking lot 4100. The loop coil 4120 detects the entry and exit of a vehicle on each parking surface 4110 based on changes in the magnetic field. Each loop coil 4120 is linked to a parking lot database server 4170.

A remote camera 4130 serving as a surveillance camera is installed at a position from which the entire parking lot 4100 can be photographed. The remote camera 4130 either records a moving picture of the entire parking lot 4100 continuously, or, when it includes a motion detection device or is interlocked with one, records a moving picture of the entire parking lot 4100 only when motion is detected.

The remote camera 4130 is usually fixed so as to photograph and record the entire parking lot 4100. Alternatively, the remote camera 4130 may be a pan/tilt/zoom camera capable of adjusting its direction and zoom, so as to shoot a close-up image of a vehicle when the loop coil 4120 detects that the vehicle is entering or leaving a parking surface 4110.

The moving image photographed by the remote camera 4130 is stored in the parking lot database server 4170.

The number recognition camera 4140 is disposed at the rear of each parking surface 4110, that is, on the roadside. The height and direction of the number recognition camera 4140 are determined so that the license plate of a vehicle parked on each parking surface 4110 can be photographed. The number recognition camera 4140 shoots an image including the license plate of the vehicle when the entry or exit of the vehicle is detected by the loop coil 4120. The number recognition camera 4140 may photograph a single still image of the license plate, but preferably photographs still images of the license plate continuously, or takes a moving image, to improve the number recognition rate. The image photographed by the number recognition camera 4140 is stored in the parking lot database server 4170. The vehicle number may be extracted from the image by a computing device connected to each number recognition camera 4140 and transmitted to the parking lot database server 4170, or extracted by a computing device connected to the parking lot database server 4170. The extracted vehicle number is also stored in the parking lot database server 4170.

The unmanned fare adjuster 4150 allows the parking fees for all the parking surfaces 4110 in the parking lot 4100 to be settled in an integrated manner. The unmanned fare adjuster 4150 is connected to the parking lot database server 4170. When the parking lot user inputs the number of the parking surface 4110 on which the vehicle is parked, the unmanned fare adjuster 4150 receives from the parking lot database server 4170, and displays on its screen, the entry time of the vehicle detected by the loop coil 4120, the parking time from the entry time to the current time, the parking fee calculated from the parking time, the images captured by the remote camera 4130 and the number recognition camera 4140, and the vehicle number extracted from the image of the vehicle.

Alternatively, the parking lot user can enter the last four digits of his or her vehicle number. When the user inputs the vehicle number, the unmanned fare adjuster 4150 receives from the parking lot database server 4170, and displays on its screen, the parking surface 4110 on which the vehicle corresponding to the input number is parked, the entry time of the vehicle detected by the loop coil 4120, the parking time from the entry time to the present, the parking fee calculated from the parking time, the images taken by the remote camera 4130 and the number recognition camera 4140 at entry, and the vehicle number extracted from the image of the vehicle. If a plurality of vehicles with the input number are parked, the unmanned fare adjuster 4150 can prompt the user to select the parking surface of his or her own vehicle from among the parking surfaces on which vehicles with the input number are parked.
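The four-digit lookup described above might look like the following sketch; the record layout and the helper name are assumptions. When more than one record matches, the kiosk would present the candidates for the user to choose from:

```python
def find_by_plate_suffix(parked, suffix):
    """Return every parked-vehicle record whose extracted plate number ends
    with the digits the user typed; multiple matches mean the kiosk must
    ask the user to pick his or her own vehicle."""
    return [rec for rec in parked if rec["plate"].endswith(suffix)]
```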

At this time, the unmanned fare adjuster 4150 may display the entry time, parking time, image, and the like of the vehicle parked on each such parking surface, so that the user can make the selection easily.

In addition, for vehicles with previously unpaid parking fees, the unmanned fare adjuster 4150 can display a fare with the unpaid fees added. After confirming the contents displayed on the screen, the user settles the parking charge using a common means of payment such as cash, a credit card, or a transit card.
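A hedged sketch of the settlement arithmetic: the fee from the entry time to the current time, plus any unpaid balance forwarded by the integrated management server. The per-10-minute rate and the rounding rule are invented for illustration; the patent does not fix a tariff:

```python
import math
from datetime import datetime

def settle_fee(entry_time, now, rate_per_10min=500, unpaid_balance=0):
    """Fee charged per started 10-minute block from entry to now, plus any
    previously unpaid balance reported by the integrated management server."""
    minutes = max(0.0, (now - entry_time).total_seconds() / 60)
    units = math.ceil(minutes / 10)  # started blocks are charged in full
    return units * rate_per_10min + unpaid_balance
```

For example, a 25-minute stay at 500 won per block, with 1,000 won outstanding from an earlier visit, would settle at 2,500 won under these assumed rules.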

The unmanned fare adjuster 4150 may further include an Internet phone to enable bidirectional communication between the parking lot user and an administrator of the integrated management server 4180 to be described later. When a problem arises in the parking fee settlement process, the user can solve the problem using the Internet phone.

The parking lot database server 4170 stores the entry and exit times of each parking surface 4110 detected by the loop coil 4120, the distant or close-up moving images photographed by the remote camera 4130, the vehicle numbers extracted from the images photographed by the number recognition camera 4140, whether the parking fee has been settled through the unmanned fare adjuster 4150, and the information on fee-unpaid vehicles received from the integrated management server 4180. The parking lot database server 4170 then transmits the stored data to the integrated management server 4180.

The integrated management server 4180 is connected to the unmanned settlement system installed in each parking lot and manages each system in an integrated manner. It receives updated information from the database server 4170 of each parking lot and transmits updates to the data on fee-unpaid vehicles to the database server 4170 of each parking lot. This exchange of data may be performed in real time, repeated at predetermined time intervals, or performed according to an instruction of the integrated management server 4180.

The integrated management server 4180 collects data from each parking lot and integrates and manages each parking lot. The manager of the integrated management server 4180 can check the moving images and data of each parking lot transmitted from the parking lot database server 4170 and control each parking lot. Also, when the parking lot user connects with the manager through the Internet phone, the manager of the integrated management server 4180 can solve the problem by talking with the user.

Information collected in the integrated management server 4180 may be provided to an intelligent transport system (ITS), the Internet, or a PDA to provide parking information to a person who intends to use the parking lot.

In addition, the integrated management server 4180 can collect information on unpaid parking fees for each parking lot and issue a bill for the unpaid parking fee to the address of the owner of the fee-unpaid vehicle.

Referring to FIG. 4C, the remote camera 4130 continuously records the entire parking lot 4100 (step S410). Alternatively, the remote camera 4130 may include a motion sensing device and record when it senses the motion of an entering vehicle or person. When the remote camera 4130 has a pan/tilt function, it can also track the movement of the vehicle while recording. When the remote camera 4130 has a zoom function, it can also shoot a close-up moving image of the vehicle.

When a vehicle enters the parking lot 4100 and moves onto a parking surface 4110, the loop coil 4120 detects the vehicle's entry onto the parking surface 4110 (step S420). At this time, the entry time of the vehicle onto the parking surface 4110 is stored in the parking lot database server 4170.

When the loop coil 4120 detects the entry, the number recognition camera 4140 operates to photograph an image of the region including the license plate located at the front (front-in parking) or rear (rear-in parking) of the vehicle (step S430). The captured image may be a single still image, a plurality of still images, or a moving image. To improve the number recognition rate, it is preferable to photograph the license plate continuously rather than capture a single still image.

The remote camera 4130 also shoots the image of the vehicle.

The vehicle number is extracted from the license plate of the photographed vehicle (step S440). The moving picture of the vehicle photographed in step S410, the entrance time measured in step S420, the image of the vehicle photographed in step S430, and the number of the vehicle extracted in step S440 are stored in the parking lot database server 4170 (step S450).

The user can settle the parking charge through the unmanned fare adjuster 4150 when leaving. If the user inputs the number of the parking surface 4110 on which the vehicle is parked into the unmanned fare adjuster 4150 (step S460: Yes), the unmanned fare adjuster 4150 displays the parking fee calculated from the entry time and the current time of the vehicle (step S470). In addition, the unmanned fare adjuster 4150 can further display the photographed image, the recognized number, the entry time, and the like obtained for the vehicle in steps S420 to S440. When the vehicle corresponds to a fee-unpaid vehicle reported by the integrated management server 4180, the unmanned fare adjuster 4150 can further display the unpaid parking fee transmitted from the integrated management server 4180. The user can confirm the parking fee and other details displayed on the unmanned fare adjuster 4150 and settle the parking charge using cash, a credit card, a cash card, or the like (step S480).

The user can leave with or without settling the parking charge. When the vehicle leaves the parking surface 4110, the loop coil 4120 detects the vehicle's exit (step S490). The exit time of the vehicle is then recorded in the parking lot database server 4170.

When the exit of the vehicle is detected by the loop coil 4120, the number recognition camera 4140 shoots an image including the license plate on the front (front-in parking) or rear (rear-in parking) of the vehicle (step S4100). The captured image may be a single still image, a plurality of still images, or a moving image. To improve the number recognition rate, it is preferable to photograph the license plate continuously rather than capture a single still image. The remote camera 4130 may also be used to capture an image of the vehicle.

The vehicle number is extracted from the image of the license plate photographed by the number recognition camera 4140 (step S4110).

The exit time of the vehicle, the image at the time of exit, the vehicle number, and whether the parking fee has been settled are stored in the parking lot database server 4170 (step S4120). Even if the user leaves without paying the parking fee, the parking charge can later be billed to the user based on the records of the vehicle's entry and exit times, the entry and exit images, and the vehicle number.

The times, images, vehicle numbers, and parking fee records at entry and exit stored in the parking lot database server 4170 can be transmitted to the integrated management server 4180. The integrated management server 4180 collects and manages the information transmitted from each parking lot and transmits the information on fee-unpaid vehicles to the parking lot database server 4170 of each parking lot. Communication between the parking lot database server 4170 and the integrated management server 4180 can take place in real time, at predetermined time intervals, or when there is an instruction from the integrated management server 4180.
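One exchange cycle between the parking lot database server 4170 and the integrated management server 4180 could be sketched as below, with plain dictionaries standing in for the two databases (the field names and the `synced` flag are assumptions). In production the cycle would run in real time, on a fixed interval, or on demand, as described above:

```python
def sync(parking_db, integrated_db):
    """One exchange cycle: the lot server uploads records not yet sent,
    and the integrated server pushes back its unpaid-vehicle list.
    Returns the number of records uploaded this cycle."""
    pending = [r for r in parking_db["records"] if not r.get("synced")]
    integrated_db["records"].extend(pending)
    for r in pending:
        r["synced"] = True
    # The integrated server's unpaid list replaces the lot's local copy.
    parking_db["unpaid_vehicles"] = list(integrated_db["unpaid_vehicles"])
    return len(pending)
```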

5A to 5D are diagrams for illustrating another example for performing the determination of existence of a parking space using the loop coil.

FIGS. 5A and 5B show a direction-discriminating vehicle counter that can be applied to the present invention.

The direction-discriminating vehicle counter according to the present invention includes a rain cover 510, a front door 511, a door key 512, a hub 513, a TCP/IP conversion board 514, a power breaker 515, an I/O board 517, a detector 518, a ground terminal 519, and the like.

The direction-discriminating vehicle counter according to the present invention is a device for counting incoming and outgoing vehicles using loop sensors, and can count vehicles by determining the forward/reverse direction of a vehicle detected by the loop sensors.

In addition, by determining the forward/reverse direction, it can accurately count the number of incoming and outgoing vehicles and guide vehicles smoothly.

In addition, the system can be easily configured without using a separate wiring by using the TCP / IP communication method.

The TCP/IP communication method also makes it easy to monitor the equipment status remotely.

In addition, entrance guide, full-lot guide, and zone guide lamps can be displayed.

Referring to FIGS. 5C and 5D, when a vehicle passes from Loop A to Loop B at the time of departure, a decrease of one in the number of vehicles can be recognized.

Conversely, if the loops are triggered in the order Loop B to Loop A, or if only Loop A or only Loop B is detected, the event may not be processed.

Thereafter, the vehicle counter data DATA can be transmitted to the control PC, and finally, the information can be transmitted to the user terminal 100.
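The Loop A/Loop B departure logic of FIGS. 5C and 5D can be sketched as a small state machine; the class and method names are illustrative, and only the decrement path described above is modelled:

```python
class DirectionalVehicleCounter:
    """Direction-discriminating exit counter over two loop sensors: an
    A-then-B crossing is a valid departure, while B-then-A or a
    single-loop blip is ignored."""

    def __init__(self, initial_count=0):
        self.count = initial_count  # vehicles currently in the lot
        self._last = None           # last loop sensor that fired

    def on_loop(self, loop):
        """Feed one loop detection event, "A" or "B"."""
        if loop == "B" and self._last == "A":
            # Passing Loop A -> Loop B: one vehicle has left.
            self.count = max(0, self.count - 1)
        self._last = loop
```

The counter's running value is what would be packaged as the DATA frame sent to the control PC and, ultimately, to the user terminal 100.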

Next, a parking space recognition method using camera image detection will be described.

FIGS. 6A and 6B are diagrams for illustrating an example of performing parking presence / absence determination using video detection.

Referring to FIG. 6A, each parking surface 630 is provided with a space in which one vehicle 640 can be parked.

A camera module 610 is installed above the parking lot to photograph the plurality of parking surfaces 630 partitioned in the parking lot and the vehicles 640 parked on them. In an underground parking lot, the camera may be installed on the ceiling of the parking lot passageway. However, in the ground-level parking lot to which the present invention is applied, the camera module 610 cannot be installed over the passageway as in an underground parking lot, and is therefore provided on a support 62 in the region where the parking surfaces are arranged.

The camera module 610 is composed of an omnidirectional dual camera 612 and a number recognition camera 616.

The support 62 is fixed to the ground of the parking lot to which the present invention is applied, preferably by bolting or by being inserted and fixed into the ground. The support 62 may be a structure already installed in a conventional parking lot, such as a streetlight, and has sufficient height to allow the camera module 610 to photograph a wide area. The support 62 may be hollow so that an electric wire supplying power to the camera module 610 can be run through it from the ground, and may be made of a metal that does not corrode in rainwater or the like.

A crossbar may be provided on the support 62 and the camera module 610 installed on it, so that the camera module 610 can be placed at a position suitable for photographing while minimizing changes to an existing support 62 already installed in the parking lot. As with the support 62, the crossbar may be hollow and made of metal.

Further, an illumination unit (not shown) for irradiating light may be provided on the upper portion of the support 62. The illumination unit can be controlled by a separate controller and provides a suitable environment for the camera module 610 to capture images.

The camera module 610 may be installed on the upper portion of the support 62, approximately 3 to 5 meters above the ground, so that the camera module 610 can photograph several parking surfaces 630. The present invention is suitable for an outdoor or ground-level parking lot because the camera module 610 is installed at a high position and the detection and position determination of vehicles are performed using the images taken by the camera module 610.

The camera module 610 is composed of an omnidirectional dual camera 612 and a number recognition camera 616.

The omnidirectional dual camera 612 is implemented using cameras equipped with fisheye lenses. When a fisheye lens with a wide angle of view is used, it is possible to photograph an omnidirectional (360°) region around the omnidirectional dual camera 612.

With a general camera, it is difficult to photograph many parking surfaces 630 with a single camera. Therefore, in order to cover a large number of parking surfaces 630 even at a relatively low mounting height, the present invention uses cameras equipped with fisheye lenses. Further, the present invention aims to overcome the limitations of recognition distance and usage environment by using a stereo-type binocular camera.

The omnidirectional dual camera 612 is equipped with a first fisheye lens 614a for taking a first detection image in all directions and a second fisheye lens 614b for taking a second detection image in all directions. The first fisheye lens 614a operates in cooperation with a first image sensor, and the second fisheye lens 614b operates in cooperation with a second image sensor. The first image sensor and the second image sensor are connected to an image processing unit. The omnidirectional dual camera 612 applied to the present invention mounts two lenses and two image sensors, but a single image processing unit.

The image processor processes signals transmitted from the first image sensor and the second image sensor. The image processing unit generates a first detection image in all directions using the signal transmitted from the first image sensor and generates a second detection image in all directions using the signal transmitted from the second image sensor.

The detection image photographed by the omnidirectional dual camera 612 can be converted into an image from which the distortion of each region has been removed by adjusting the scale ratio according to distance. To correct such distorted image information, forward mapping, which interpolates correction coefficients, may be used, or inverse mapping, which assumes the corrected image in advance and finds which point of the distorted image corresponds to each point of the corrected image, may be used. The omnidirectional dual camera 612 photographs the surroundings of the place where it is installed. The area photographed by the omnidirectional dual camera 612 includes the vehicle detection area 20 and the security video area 622 of the parking lot.
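The inverse-mapping idea can be sketched numerically: for every pixel of the assumed corrected (rectilinear) image, compute the matching point to sample in the distorted fisheye image. The equidistant projection model and the focal-length parameters below are assumptions for the sketch, not the patent's actual optics:

```python
import numpy as np

def inverse_map_equidistant(out_shape, f_fish, f_rect, center):
    """Inverse mapping for an equidistant fisheye (r = f * theta): for every
    pixel of the corrected rectilinear image, return the fisheye-image
    coordinates to sample, as (map_x, map_y) lookup arrays."""
    h, w = out_shape
    cx, cy = center
    ys, xs = np.mgrid[0:h, 0:w].astype(np.float64)
    dx, dy = xs - cx, ys - cy
    r_rect = np.hypot(dx, dy)                 # radius in the corrected image
    theta = np.arctan2(r_rect, f_rect)        # ray angle from the optical axis
    r_fish = f_fish * theta                   # equidistant projection radius
    # Radial rescaling; the optical center maps to itself (scale = 1 there).
    scale = np.divide(r_fish, r_rect,
                      out=np.ones_like(r_rect), where=r_rect > 0)
    return cx + dx * scale, cy + dy * scale
```

The resulting lookup arrays would then drive a remapping/interpolation step; because the equidistant radius grows more slowly than the rectilinear one, off-center pixels are pulled inward toward the optical center, which is the distortion being undone.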

The vehicle detection area 20 includes the parking surfaces 630 near the support on which the omnidirectional dual camera 612 is installed, and one omnidirectional dual camera 612 can photograph a vehicle detection area 20 including several parking surfaces 630. The security video area 622 covers a wider area than the vehicle detection area 20.

The number recognition camera 616 photographs the number plate of the parked vehicle 640 on the parking surface 630. The number recognition camera 616 can use a speed dome camera of 2,000,000 pixels or more and perform pan, tilt, and zoom operations to photograph a license plate of a vehicle illegally parked in the prohibited area.

A license plate recognition (LPR) camera may be used as the number recognition camera 616. The LPR camera has a tilting device capable of tilting in the x-axis, y-axis, and z-axis directions, respectively. Such an LPR camera has a configuration capable of photographing a zoom-in image or a zoom-out image with respect to the vehicle by rotating the camera at designated x, y coordinates.

As shown in FIG. 6B, the omnidirectional dual camera 612 and the number recognition camera 616 installed in the parking lot are connected to the image analysis apparatus 650. The first detection image and the second detection image generated by the omnidirectional dual camera 612, and the license plate image of the vehicle generated by the number recognition camera 616, are transmitted to the image analysis apparatus 650.

The image analysis apparatus 650 includes a vehicle sensing module 660 and a security module 670. The vehicle sensing module 660 includes a sensing unit 661, a position determination unit 665, a number recognition unit 666, and the like.

The sensing unit 661 receives the first detection image and the second detection image from the omnidirectional dual camera 612 and determines, using these images, whether vehicles 640 are parked on the plurality of parking surfaces 630. The sensing unit 661 includes an area setting unit 662, a feature detecting unit 663, and a detecting unit 664.

The area setting unit 662 sets, in the detection image, the area of the parking surface 630 where a vehicle 640 may be parked. The feature detecting unit 663 extracts a feature vector from the area of the parking surface 630 set by the area setting unit 662. The detecting unit 664 determines whether a vehicle 640 is parked in the area of the parking surface 630 by judging whether the extracted feature vector corresponds to a vehicle 640.

Here, the feature vector may be extracted from the detection image using a vehicle recognition algorithm for recognizing the vehicle 640. Any one of Scale-Invariant Feature Transform (SIFT), Speeded-Up Robust Features (SURF), Histogram of Oriented Gradients (HOG), Haar-like features, and a Gabor filter can be used as the vehicle recognition algorithm.

The feature vector extracted by the vehicle recognition algorithm can be compared with pre-stored learning data. Since the learning data includes feature vectors of general vehicle images, it is possible to determine whether the object photographed in the parking area of the monitored image is a vehicle, and thereby recognize the vehicle.
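As an illustration of comparing an extracted feature vector against pre-stored learning data, the sketch below uses a simplified, global HOG-style descriptor (a single gradient-orientation histogram, not the full blockwise HOG) and a cosine-similarity threshold; both the descriptor and the threshold are assumptions for demonstration.

```python
import numpy as np

def hog_like_feature(img, bins=9):
    """A minimal HOG-style descriptor: a global histogram of gradient
    orientations weighted by gradient magnitude (a sketch only)."""
    img = img.astype(float)
    gy, gx = np.gradient(img)
    mag = np.hypot(gx, gy)
    ang = np.degrees(np.arctan2(gy, gx)) % 180.0
    hist, _ = np.histogram(ang, bins=bins, range=(0, 180), weights=mag)
    n = np.linalg.norm(hist)
    return hist / n if n > 0 else hist

def is_vehicle(feature, learning_data, threshold=0.8):
    """Compare an extracted feature vector against pre-stored learning
    data (feature vectors of known vehicle images) by cosine similarity."""
    sims = [float(feature @ ref) for ref in learning_data]
    return max(sims) >= threshold
```

A production system would instead feed such descriptors into a trained classifier; the nearest-reference comparison above only mirrors the "compare with learning data" step described in the text.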

Meanwhile, the position determination unit 665 and the number recognition unit 666 operate in response to the sensing unit 661 determining that a vehicle 640 is parked; that is, they operate after the sensing unit 661 detects the vehicle 640 on the parking surface 630.

The position determination unit 665 measures the distance (depth information) between the parked vehicle and the omnidirectional dual camera using the first detection image and the second detection image. At this time, the distance between the parked vehicle and the omnidirectional dual camera can be measured using a factor related to the difference (disparity) between the first detection image and the second detection image.

In addition, the position determination unit 665 can determine the position where the vehicle is parked in the parking lot based on the measured distance.
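The distance measurement from the two detection images follows the standard pinhole stereo relation Z = f·B/d, where d is the pixel disparity between the first and second detection images. The focal length and baseline values below are illustrative assumptions; the patent does not specify them.

```python
def stereo_depth(disparity_px, focal_px, baseline_m):
    """Depth from the pixel disparity between the first and second
    detection images: Z = f * B / d (pinhole stereo model)."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_m / disparity_px

# Example: assumed 800 px focal length, 10 cm lens baseline, 40 px disparity
depth = stereo_depth(40, focal_px=800, baseline_m=0.10)  # 2.0 m
```

Given the camera's known mounting position in the lot, such a depth estimate is what lets the position determination unit 665 place the parked vehicle within the parking lot.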

The number recognition unit 666 receives the image photographed by the number recognition camera and extracts the license plate information of the vehicle from the received image. For character recognition, one of three license-plate-localization methods can be executed first. The first detects the feature region of the license plate using vertical and horizontal edge information from the photographed image. The second detects the position of the license plate by scan-data analysis. The third detects the exact license plate by directly searching for numbers and letters.

When the position of the license plate is detected, the recognition algorithm recognizes the characters by template matching, classifying the numbers and Hangul letters (consonants and vowels) in detail and re-confirming the recognized characters, thereby minimizing errors in decoding the characters.

The security module 670 includes a memory 671 for storing the first detection image and the second detection image. The detection images stored in the memory 671 can be set to be accessible when a security-related problem occurs in the parking facility to which the present invention is applied.

The memory 671 included in the security module 670 may be a flash memory type, a hard disk type, a multimedia card micro type, a card-type memory (e.g., SD or XD memory), a random access memory (RAM), a static random access memory (SRAM), a read-only memory (ROM), an electrically erasable programmable read-only memory (EEPROM), a programmable read-only memory (PROM), a magnetic memory, a magnetic disk, an optical disk, or the like.

Meanwhile, in the parking lot to which the present invention is applied, the plurality of parking surfaces 630 may be provided with a full/vacant indicator (not shown) showing whether or not a vehicle 640 is parked.

The full/vacant indicator displays a sign from which a driver entering the parking lot can easily tell whether parking space is available. The display may include a visual output of a predetermined color, an auditory output providing a voice announcement of whether the lot is full or has vacancies, and the like.

At this time, the image analysis apparatus 650 can control the output of the full/vacant indicator according to the determination of the sensing unit 661 for the parking surfaces 630. For example, if it is determined that vehicles 640 are parked on all the parking surfaces 630, the indicator may output a "full" display; if it is determined that space for parking a vehicle 640 remains among the parking surfaces 630, the indicator may output a "vacant" display. In addition, the indicator may be configured to individually indicate whether or not a vehicle 640 is parked on each parking surface 630.

Meanwhile, in addition to the above-described manner, the methods of Application Nos. 1020090069833, 1020090119171, 1020140151299, 1020110015134, 1020060081445, 1020150032491, 1020140001338, 1020140151298, 1020080043838, 1020080023986, 1020060017773, 1020140151297, 1020040100238, 1020060035180, 1020070125958, 1020070087392, 1020110055115, 2020100008900, 1020050055515, and 1020070007861 can also be applied to the present invention.

Next, a method for performing parking presence / absence determination using the LPR system will be described.

FIGS. 7A and 7B are diagrams for illustrating an example of performing parking presence / absence determination using the LPR system.

Referring to FIG. 7A, the LPR system 7100 of the present invention may include a photographing module 710, a number recognition module 720, a determination unit 730, an image processing unit 740, a vehicle recognition module 780, 790, and the like.

However, the components shown in FIG. 7A are not essential, and an LPR system 7100 having more or fewer components may be implemented. The components shown in FIG. 7A may be interconnected in an interdependent manner, and each component may be implemented separately or integrally. Hereinafter, each component will be described.

The photographing module 710 is installed in the LPR system to photograph vehicles: it photographs a vehicle located in a predetermined section and generates an original image. The original image taken by the photographing module 710 captures both the vehicle and its license plate. The original image is transmitted to the number recognition module 720, where it is used for license plate recognition, and to the vehicle recognition module 780, where it is used for recognizing the type of the vehicle.

The photographing module 710 includes a tilting device capable of tilting in the x-, y-, and z-axis directions, respectively. Such a photographing module 710 has a configuration capable of photographing a zoom-in image or a zoom-out image with respect to the vehicle by rotating the camera at specified x, y coordinates.

The photographing module 710 may be implemented using a camera equipped with a fisheye lens. When a fisheye lens having a wide angle of view is used, an image of the omnidirectional (360°) area around the photographing module 710 can be photographed.

The photographing module 710 is equipped with a CCD sensor or a CMOS sensor, preferably a CCD sensor. CCD and CMOS image sensors both have a light-receiving section that receives light and converts it into an electric signal. The CCD type image sensor transfers the electric signal through the CCD and converts it to a voltage at the last stage, whereas the CMOS type image sensor converts the signal to a voltage at each pixel and transfers it to the outside. That is, the CCD type image sensor moves the electrons generated by light directly to the output section using gate pulses, while the CMOS type image sensor converts the electrons generated by light into a voltage in each pixel and outputs it through a switch; this is the difference between the two.

In the CCD type image sensor, a smear phenomenon occurs due to this signal processing method. The smear phenomenon refers to a vertical line appearing on the screen when strong reflected light from a light source or illumination lamp is photographed. It is often seen when using a high-speed shutter and when shooting very bright objects such as light sources. The CCD type image sensor stores the charge of a single photosite per cell; when the amount of charge that one cell can store overflows, reflection and interference between cells cause the smear phenomenon.

The smear phenomenon easily occurs in the buffer area used for storing or transferring charge in the image sensor, depending on light exposure under high-speed shutter settings. The high-speed shutter of a CCD adjusts exposure via the exposure time through the shutter of the camera body, and by directly controlling the CCD at shutter speeds higher than the synchronization speed. If the shutter of the camera body remains open when acquiring an image with the electronic shutter of the CCD, light continues to fall on the photodiodes and charge overflows the storage space; when the charge of the vertically arrayed CCD is then read out, the overflowed charge is read together with it and a smear phenomenon is generated.

The smear phenomenon thus generated can distort the photographed image, prevent a system that detects or inspects vehicles from grasping the vehicle shape, and obstruct recognition of the vehicle number.

Meanwhile, the number recognition module 720 of the LPR system 7100 of the present invention may include a determination unit 730, an image processing unit 740, and the like, as shown in FIG. 7A.

The discrimination unit 730 classifies the original image generated by the photographing module 710 into one of a low-illuminance image, a high-illuminance image, and an unprocessed image based on the discriminant related to the original image. Here, the discriminant related to the original image is the intensity of the light of the original image converted to gray scale.

Specifically, when the light intensity of the original image is higher than a predetermined threshold value, the determination unit 730 classifies the original image as an unprocessed image. When the light intensity of the original image is lower than the threshold value, the original image is classified as either a low-illuminance image or a high-illuminance image according to its light intensity: if the intensity is less than a predetermined first value, the original image is classified as a low-illuminance image, and if it is higher than the first value, the original image is classified as a high-illuminance image.
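The determination logic of unit 730 can be sketched as a simple threshold classifier on the mean gray intensity. The numeric thresholds below are illustrative assumptions; the patent only states that a threshold and a first value exist.

```python
def classify_illuminance(mean_gray, threshold=120.0, first_value=60.0):
    """Classify an original image by the mean intensity of its gray-scale
    version, following the three-way split the text describes.
    Threshold values are assumed for illustration."""
    if mean_gray > threshold:
        return "unprocessed"              # no correction needed
    # below the threshold: split into low / high illuminance
    return "low" if mean_gray < first_value else "high"
```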

Meanwhile, the image processing unit 740 improves the quality of an original image classified as a low-illuminance or high-illuminance image. The image processing unit 740 includes a low-illuminance image processing unit 750 for original images classified as low-illuminance images, a high-illuminance image processing unit 760 for original images classified as high-illuminance images, and the like.

The low-illuminance image processing unit 750 generates a corrected image from the original image using the Advanced Clipped Histogram Equalization (A_CHE) method when the original image is classified as a low-illuminance image.

Specifically, according to the improved clipped histogram equalization method, an adaptive clip ratio for the original image is determined, a clipped histogram is generated in which the upper region of the original image's histogram is removed according to the determined adaptive clip ratio, and at least a part of the removed region is reassigned to the clipped histogram.
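The clipping-and-redistribution step can be illustrated with plain Clipped Histogram Equalization. This is a sketch of the CHE family only; A_CHE's section-wise, distance-ratio redistribution (described later in the text) is not reproduced here, and the clip ratio is an assumed value.

```python
import numpy as np

def clipped_hist_equalize(img, clip_ratio=0.03):
    """Clipped Histogram Equalization sketch: clip each histogram bin at
    a ceiling derived from the clip ratio, redistribute the clipped mass
    uniformly, then equalize with the resulting CDF."""
    hist, _ = np.histogram(img.ravel(), bins=256, range=(0, 256))
    ceiling = max(1, int(clip_ratio * img.size))
    excess = np.maximum(hist - ceiling, 0).sum()
    hist = np.minimum(hist, ceiling)
    hist = hist + excess // 256           # redistribute clipped counts
    cdf = np.cumsum(hist).astype(float)
    cdf = (cdf - cdf.min()) / (cdf.max() - cdf.min() + 1e-12)
    lut = np.round(cdf * 255).astype(np.uint8)
    return lut[img.astype(np.uint8)]
```

Clipping the bins before building the CDF is what limits the contrast over-stretching (and noise amplification) of plain histogram equalization.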

Referring to FIG. 7B, the photographing module 710 equipped with the CCD sensor photographs an original image including the license plate of the vehicle, and the original image photographed by the photographing module 710 is input to the number recognition module 720 (S710). The vehicle is photographed in the original image; in general, the license plate of the vehicle is located in the lower part of the original image.

Then, the determination unit 730 determines whether there is a change in the intensity of light in the original image (S720). In step S720, the original image is converted into a gray-scale image, and the characteristic change is observed for an ROI (region of interest), which is a part of the original image.

Then, the determination unit 730 determines whether the light intensity of the original image is a low-illuminance component (S730). According to the determination of step S730, the present invention can largely correct two kinds of distortion information: first, an image enhancement method that expands the dynamic range for the low-illuminance and high-illuminance regions; second, a method for detecting and restoring the smear caused by the overflow of light charge in the high-illuminance region.

Then, the image processing unit 740 generates a corrected image from the original image. Specifically, when the original image is classified as a low-illuminance image, the low-illuminance image processing unit 750 generates a corrected image using the improved Clipped Histogram Equalization (S740); when the original image is classified as a high-illuminance image, the high-illuminance image processing unit 760 removes the smear generated in the original image to generate a corrected image (S742).

In step S740, the improved clipped histogram equalization method, one of the histogram equalization methods, is used to improve the image quality of the original image.

In general, histogram equalization improves an image whose brightness distribution is shifted to one side or non-uniform by redistributing the brightness values uniformly. The ultimate goal of histogram equalization is to produce a histogram with a uniform distribution. However, since the brightness can change significantly depending on the input image and undesired noise can be amplified, a method that increases contrast while maintaining the average brightness is preferable.

Since the histogram processing method is a simple method for solving the degraded image quality, there are various methods.

Typical examples are Bi Histogram Equalization, Recursive Mean-Separate Histogram Equalization, and Clipped Histogram Equalization.

Among them, the Clipped Histogram Equalization (CHE) method is the most effective: it maintains the amount of information in the image and introduces no image distortion. This method controls the maximum value of the histogram by setting an arbitrary ceiling and clipping the upper portion of the histogram that exceeds it, then redistributing the clipped portion over the entire range, which keeps the brightness change after the histogram transformation within a minimum range.

A dynamic threshold that follows changes in image characteristics can also be set by assigning an initial threshold per image. In this case, because the clipped upper part of the histogram is redistributed over the whole range, the method is robust to noise; for general images, however, its contrast improvement is inefficient compared with other methods.

Therefore, in the present invention, instead of redistributing the upper part of the histogram over the entire range, the histogram is divided into several sections and the biased distribution is spread evenly into the neighboring sections according to a distance ratio. We propose this improved A_CHE method of CHE as a way to improve image contrast.

As a result, the dynamic range of the low-illuminance region is improved, and the high-illuminance region can likewise be processed into a robustly improved image.

Meanwhile, in step S742, smear is detected and removed from the original image using image processing.

After the original image is input from the photographing module 710, it is analyzed statistically. The extracting unit of the detecting unit converts the input original image into a signal distribution curve of column-unit signals: the curve represents, for each column of the original image, the sum of the gray values of the pixels constituting that column.

Further, the converting unit of the detecting unit may convert the signal distribution curve of the input original image into a normal distribution curve. That is, when smear is generated at a specific place by sunlight or by passive light sources such as vehicle lamps, it can generally be modeled as a normal distribution.

After the original image is expressed as a normal distribution, the presence or absence of smear is determined. Owing to the characteristics of smear, it arises along the columns of the image, appearing as white, bright vertical regions.

Thus, smear can be detected from the column-wise sums of gray values in the signal distribution curve: when a portion of the curve has a frequency that is specific and significantly higher than the other portions of the normal distribution, it can be determined that smear is generated in the original image.

After the presence of smear in the original image is determined, its position is located: the portion of the signal distribution curve with a specific and remarkably higher frequency than the other portions is judged to be the region in which the smear occurs.

After the smear region is determined to exist and its position is located, a binary pattern map (alpha map) for smear removal and restoration is generated.

To remove the smear, the smear intensity and the exact background intensity are estimated. The binary pattern map is generated by applying an average filter to each column of the original image.

In this case, a column has the value 1 in the binary pattern map when its signal intensity on the normal distribution is larger than a predetermined threshold value, and the value 0 when it is smaller.

After the binary pattern map is generated, the smear position is refined using it. When each pixel column is analyzed, it consists of vehicle, noise, background, and smear components. The intensity of the smear signal is estimated by adjusting the smear-region search size so as to align the gray values of the pixels in each column using the applied filter, and the accurate position is determined.

After the smear position is determined, the smear is removed: using the determined area and intensity of the smear, it can be removed from the entire image.
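The column-sum detection and removal steps above can be sketched as follows. The z-score threshold and the nearest-clean-column replacement are illustrative assumptions; the text's actual pipeline estimates smear and background intensities and restores by patch-based inpainting.

```python
import numpy as np

def detect_smear_columns(img, z_thresh=2.0):
    """Detect smear via the column-wise signal distribution the text
    describes: smear shows up as columns whose summed gray value lies
    far above the distribution of the other columns."""
    col_sum = img.astype(float).sum(axis=0)
    mu, sigma = col_sum.mean(), col_sum.std() + 1e-12
    # binary pattern (alpha) map: 1 where a column is anomalously bright
    return (col_sum - mu) / sigma > z_thresh

def remove_smear(img, alpha):
    """Replace smear columns with the nearest clean neighbour column
    (a simple stand-in for the patch-based restoration in the text)."""
    out = img.copy()
    clean = np.where(~alpha)[0]
    for c in np.where(alpha)[0]:
        nearest = clean[np.argmin(np.abs(clean - c))]
        out[:, c] = img[:, nearest]
    return out
```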

After removing the smear, the image is restored. There are various restoration methods; in the present invention, inpainting is applied. Although the image could be restored by interpolation, interpolation is not well suited to this region, so the image is restored using patches of a certain size taken from the surrounding area.

Then, the image reconstruction unit generates a reconstructed image by applying a focus-deterioration removal method, which combines a high-resolution image generation method and a dithering method, to the target image (S750).

Specifically, in step S750, up-scaling is performed on the focus-deteriorated target image according to an up-scale coefficient to generate a super-resolution image, and a high-resolution image is calculated by applying bicubic interpolation to the generated super-resolution image. The process may be repeated according to the values of the predetermined coefficients, preferably until the focus deterioration no longer improves.

After the above process is performed repeatedly, a high-resolution image from which part of the focus deterioration has been removed can be obtained. To restore the high-resolution image thus produced to a clear image of good visual quality, the dithering method can be applied.

However, step S750 is not an essential step; it is also possible to omit step S750 and proceed directly to step S760.

Then, the number recognition module 720 recognizes characters of license plates of the vehicle using the target image (S760).

For character recognition, one of three license-plate-localization methods can be executed first. The first detects the feature region of the license plate using vertical and horizontal edge information from the photographed image. The second detects the position of the license plate by scan-data analysis. The third detects the exact license plate by directly searching for numbers and letters.

When the position of the license plate is detected, the recognition algorithm recognizes the characters by template matching, classifying the numbers and Hangul letters (consonants and vowels) in detail and re-confirming the recognized characters, thereby minimizing errors in decoding the characters.
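The template-matching step can be sketched with normalised cross-correlation against stored character templates. The descriptor and scoring below are illustrative; a production recognizer would also segment glyphs and apply the re-confirmation pass the text mentions.

```python
import numpy as np

def match_template(glyph, templates):
    """Template matching for plate characters (a sketch): score the
    segmented glyph against each stored character template with
    normalised cross-correlation and return the best label."""
    g = glyph.astype(float)
    g = (g - g.mean()) / (g.std() + 1e-12)
    best_label, best_score = None, -np.inf
    for label, tmpl in templates.items():
        t = tmpl.astype(float)
        t = (t - t.mean()) / (t.std() + 1e-12)
        score = float((g * t).mean())   # normalised cross-correlation
        if score > best_score:
            best_label, best_score = label, score
    return best_label, best_score
```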

However, the method using the LPR system is not limited to the above; the contents of Application Nos. 1020150057097, 1020060035180, 1020150050779, 1020040069107, 1020040100238, 1020070087392, 1020130103651, 1020130047465, 1020150032491, 1020140151298, 1020140001336, 1020130130131, 1020130109702, 1020130099774, 1020130057752, 1020130012960, 1020080043838, 1020070007861, and 1020100015924 are also applicable to the present invention.

According to another embodiment of the present invention, there is provided a method of recognizing the vehicle type and license plate and easily settling the parking charge through a payment system.

FIG. 8 is a flowchart of recognizing the vehicle type and license plate of the vehicle and easily settling the parking cost through the payment system.

Referring to FIG. 8, first, a step in which the parked vehicle departs (S810) is performed.

Thereafter, step S820 of recognizing the vehicle type and number plate is performed through at least one of the plurality of methods described above.

At this time, it is also possible to additionally obtain information other than the vehicle type and license plate.

Thereafter, a step in which the user executes a specific application (S830) is performed.

For example, in step S830, payment related applications such as PayPal, Kakao Pay, and Paynay can be executed.

In addition, the control unit 180 compares the recognized vehicle type and license plate with the information registered by the user in the specific application (S840).

Thereafter, when the recognized information matches the registered information, the controller 180 performs a step of settling the parking cost through the payment system linked to the specific application (S850).
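The comparison-and-settlement flow of steps S840 and S850 can be sketched as below. The field names, rate, and duration are illustrative assumptions; the actual billing policy comes from the lot operator and the linked payment application.

```python
def settle_parking_fee(recognized, registered, hourly_rate, hours):
    """Sketch of steps S840-S850: compare the recognized vehicle type
    and plate with what the user registered in the payment application,
    and compute the fee only on a match."""
    if (recognized["type"], recognized["plate"]) != (
            registered["type"], registered["plate"]):
        return None                      # no match: do not bill
    return hourly_rate * hours

# Example with assumed values: a 3-hour stay at 2,000 won per hour
fee = settle_parking_fee(
    {"type": "sedan", "plate": "12가3456"},
    {"type": "sedan", "plate": "12가3456"},
    hourly_rate=2000, hours=3)
```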

Therefore, in the present invention, when the vehicle type and license plate are recognized, the user can easily pay the parking cost while leaving the vehicle, which provides additional convenience.

When the above-described configuration of the present invention is applied, guidance can be provided taking into consideration the parking space availability, cost, and operation policy of the parking lot closest to the user or the destination, as well as the segment the user travels on foot.

According to the present invention, entry and departure can be checked manually or automatically depending on whether the parking lot has a gateway, thereby enabling guidance on parking space, and the available parking space of the parking lot can be accurately determined using video detection, a loop coil, ultrasonic sensing, or a geomagnetic sensor.

In addition, the present invention recognizes the vehicle type and license plate of the vehicle and compares them with the information registered in a specific application, so that the parking cost can be easily settled through the payment system.

Further, according to an embodiment of the present invention, the above-described method can be implemented as processor-readable code on a medium on which a program is recorded. Examples of processor-readable media include ROM, RAM, CD-ROM, magnetic tape, floppy disks, and optical data storage; implementation in the form of a carrier wave (e.g., transmission over the Internet) is also included.

The above-described parking area guidance method and system are not limited to the configurations and methods of the embodiments described above; all or some of the embodiments may be selectively combined so that various modifications can be made.

Claims (7)

A method of guiding a parking area using a terminal and a plurality of parking lots connected over an established network, the method comprising:
A first step of determining a current position of the terminal user;
A second step of displaying a parking lot within a predetermined distance from the current location of the user or the final destination inputted by the user among the plurality of parking lots;
A third step of the user selecting a first parking lot among the displayed parking lots;
A fourth step of displaying a route to the first parking lot and a required time on the terminal;
A fifth step in which the terminal receives at least one of the presence, cost, and operation policy of the parking space of the first parking lot from the first parking lot;
A sixth step in which the terminal displays at least one of a presence / absence of a parking space, a required cost, and an operation policy of the first parking lot; And
And a seventh step in which the route to the final destination and the time required are displayed on the terminal, in consideration of the segment in which the user moves on foot from the first parking lot to the final destination,
Between the fourth step and the fifth step,
A step 4-1 of determining whether an entry/exit section for vehicles exists in the first parking lot; And
A step 4-2 of automatically determining whether the parking space is present when the entry/exit section for vehicles exists,
When the vehicle departs from the entry/exit section and the vehicle type and license plate of the departing vehicle are recognized in the step 4-2, the method further comprising:
A step 9-1 of the user executing a predetermined application of the terminal;
A step 9-2 of comparing the vehicle type and license plate of the recognized vehicle with information registered in the predetermined application; And
A step 9-3 of settling the parking cost of the departing vehicle through the payment system interlocked with the predetermined application if the recognized vehicle type and license plate match the information registered in the predetermined application.
The method according to claim 1,
Wherein, if the current position of the user is changed in the first step,
the second step displays the parking lot within the predetermined distance from the changed current position of the user.
delete
The method according to claim 1,
Wherein the method of automatically determining whether the parking space is present uses at least one of a camera-image detection method, a method using a loop coil, a method using ultrasonic waves, a method using a geomagnetic sensor, a method using an infrared sensor, and a method using a laser.
delete
The method according to claim 1,
Wherein the terminal and the plurality of parking lots communicate using at least one of a short-distance communication and a wireless communication,
The near-field communication uses at least one of WiFi, Bluetooth, Radio Frequency Identification (RFID), IrDA, Ultra Wideband (UWB), and ZigBee technology,
The wireless communication uses at least one of code division multiple access (CDMA), frequency division multiple access (FDMA), time division multiple access (TDMA), orthogonal frequency division multiple access (OFDMA), and single-carrier frequency division multiple access (SC-FDMA).
delete
KR1020150171745A 2015-12-03 2015-12-03 Method for showing parking lot and system KR101691312B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020150171745A KR101691312B1 (en) 2015-12-03 2015-12-03 Method for showing parking lot and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020150171745A KR101691312B1 (en) 2015-12-03 2015-12-03 Method for showing parking lot and system

Publications (1)

Publication Number Publication Date
KR101691312B1 true KR101691312B1 (en) 2016-12-30

Family

ID=57737290

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020150171745A KR101691312B1 (en) 2015-12-03 2015-12-03 Method for showing parking lot and system

Country Status (1)

Country Link
KR (1) KR101691312B1 (en)

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101859342B1 * 2017-12-19 2018-05-17 Kim Hang-jung The method for detecting an illegal parking car in handicapped person's parking area
WO2019225820A1 * 2018-05-21 2019-11-28 SK Telecom Co., Ltd. Parking guidance apparatus and method
US11605293B2 2018-05-21 2023-03-14 SK Telecom Co., Ltd. Parking guidance apparatus and method
CN109584602A * 2018-10-15 2019-04-05 甘龙龙 A kind of residential property parking management system
CN109523826A * 2018-11-29 2019-03-26 中山市博安通通信技术有限公司 Parking stall reservation navigation method and reservation navigation system
CN109830119A * 2019-02-18 2019-05-31 Nanjing University of Posts and Telecommunications A kind of efficient parking lot management method and its system
KR102261664B1 * 2019-12-19 2021-06-09 Dongguk University Industry-Academic Cooperation Foundation Parking helper system through smart devices
CN111882908A * 2020-07-14 2020-11-03 苏秋燕 Vehicle service system based on cloud computing and data analysis
KR102556036B1 * 2023-03-29 2023-07-18 Roadmap Co., Ltd. System for managing outdoor parking lot using image analysis technology by AI deep learning and Lidar sensor

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100853191B1 (en) 2006-12-08 2008-08-20 한국전자통신연구원 Apparatus and method of intelligent parking information
JP2012230001A (en) * 2011-04-26 2012-11-22 Zenrin Datacom Co Ltd Route search system and route search method
KR20140087080A (en) * 2012-12-24 2014-07-09 (주)포모스트원 Parking system and method for controlling thereof
KR20150129336A (en) * 2014-05-08 2015-11-20 (주)데이타뱅크시스템즈 Payment processing method for parking fee and processing system thereof


Similar Documents

Publication Publication Date Title
KR101691312B1 (en) Method for showing parking lot and system
CN111339846B (en) Image recognition method and device, electronic equipment and storage medium
US10694175B2 (en) Real-time automatic vehicle camera calibration
US10529205B2 (en) Surveillance camera system and surveillance method
KR101937833B1 (en) Parking Management Systems and Methods based Image Processing
KR101743878B1 (en) System for providing services of parking management
US20170032199A1 (en) Video data analyzing method and apparatus and parking lot monitoring system
JP2021520017A (en) Graphic code recognition method and device, as well as terminals and programs
KR101873438B1 (en) System for paying parking charge using smart device
JP2012063869A (en) License plate reader
KR102090907B1 (en) vehicle detection method and number cognition method using image enhancement and deep learning, and park systems using the method
CN115641518A (en) View sensing network model for unmanned aerial vehicle and target detection method
CN112862856A (en) Method, device and equipment for identifying illegal vehicle and computer readable storage medium
KR20190113393A (en) Vehicle recognition device, method and computer program
KR101986463B1 (en) Parking guidance system and method for controlling thereof
CN110728390B (en) Event prediction method and device
CN110933314A (en) Focus-following shooting method and related product
CN111444749A (en) Method and device for identifying road surface guide mark and storage medium
KR101416457B1 (en) Road crime prevention system using recognition of opposite direction drive and pedestrian
EP3349201B1 (en) Parking assist method and vehicle parking assist system
CN109829393A (en) A kind of mobile object detection method, device and storage medium
CN115223143A (en) Image processing method, apparatus, device, and medium for automatically driving vehicle
JP3222136U (en) Unmanned parking lot management system
CN110971813B (en) Focusing method and device, electronic equipment and storage medium
KR101892636B1 (en) Mobile terminal and method for forming 3d image thereof

Legal Events

Date Code Title Description
GRNT Written decision to grant