WO2007015631A1 - Smart video monitoring system and method communicating with auto-tracking radar system - Google Patents

Smart video monitoring system and method communicating with auto-tracking radar system

Info

Publication number
WO2007015631A1
Authority
WO
WIPO (PCT)
Prior art keywords
data
image
radar
target
target object
Prior art date
Application number
PCT/KR2006/003067
Other languages
French (fr)
Inventor
Chang-Ho Park
Kyoung-Jae Lee
Sang-Hwan Oh
Jae-Sik Yoo
Do-Han Lee
Ik-Jung Jeong
Dong-Moon Lee
Yong-Ho An
Original Assignee
Isenteck Enc Co., Ltd.
Korea Electric Power Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from KR1020050071447A (KR100594513B1)
Priority claimed from KR1020060073033A (KR100720595B1)
Application filed by Isenteck Enc Co., Ltd. and Korea Electric Power Corporation
Priority to CN2006800343611A (CN101268383B)
Publication of WO2007015631A1

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 7/00: Details of systems according to groups G01S 13/00, G01S 15/00, G01S 17/00
    • G01S 7/003: Transmission of data between radar, sonar or lidar systems and remote stations
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 13/00: Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S 13/66: Radar-tracking systems; Analogous systems
    • G01S 13/72: Radar-tracking systems; Analogous systems for two-dimensional tracking, e.g. combination of angle and range tracking, track-while-scan radar
    • G01S 13/723: Radar-tracking systems; Analogous systems for two-dimensional tracking, by using numerical data
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 13/00: Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S 13/86: Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • G01S 13/867: Combination of radar systems with cameras

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar Systems Or Details Thereof (AREA)
  • Studio Devices (AREA)

Abstract

A smart video monitoring system interacting with an auto-tracking radar system and a method for controlling the same are disclosed. The smart video monitoring system converts radar data received from a management server via a radar signal processor into a two-dimensional image, removes images of fixed objects (i.e., GIS data) using a mask map, acquires target information using a linear-motion prediction algorithm, tracks the target object, records information of the target object in a database (DB), manages the recorded information, and provides the target information to a plurality of users over a network. The smart video monitoring system communicates with an auto-tracking radar system, adds a high-quality image captured by a camera to two-dimensional position data acquired by a radar system, allows a user or operator to easily recognize a target object displayed on a display, and tracks/monitors the target object located at a monitoring area at a remote site.

Description

SMART VIDEO MONITORING SYSTEM AND METHOD COMMUNICATING WITH AUTO-TRACKING RADAR SYSTEM
Technical Field
[1] The present invention relates to a radar system, and more particularly to a smart video monitoring system for communicating with an auto-tracking radar system, which adds a high-quality image captured by a camera to two-dimensional position data acquired by a radar system, allows a user or operator to easily recognize a target object displayed on a display, and tracks/monitors the target object located at a monitoring area at a remote site, and a method for controlling the smart video monitoring system.
Background Art
[2] A conventional radar system includes an antenna, a transceiver, a radar signal processor, and a radar display.
[3] The antenna performs transmission or reception of radio or electric waves. The antenna rotates through 360° and at the same time performs incidence or projection of the radio waves. The transceiver receives the radio waves from the antenna, or transmits the radio waves to a destination via the antenna. The radar signal processor analyzes radio signals reflected from a target object using a radio-wave projection algorithm, and processes the analyzed radio signals. The radar display displays the object's position processed by the radar signal processor in the form of dark dots.
[4] Although the above-mentioned radar system allows a user to recognize position information of a desired target object, it does not enable the user to easily identify the target object.
[5] The above-mentioned conventional radar system has been developed for special purposes (e.g., military or navigation purposes).
[6] The conventional radar system is configured in the form of a console or a Peer-to-Peer (P2P) network system over a dedicated network (e.g., a telephone line), such that substantial costs are required to upgrade or extend the radar system, and an engineer skilled in operating the radar system is also required. Therefore, in order to solve the above-mentioned problems and provide the user with precise monitoring data of the target object, a high-performance radar system must be developed and introduced to the market.
Disclosure of Invention
Technical Problem
[7] Therefore, the present invention has been made in view of the above problems, and it is an object of the present invention to provide a smart video monitoring system for communicating with an auto-tracking radar system, which adds a high-quality image captured by a camera to two-dimensional position data acquired by a radar system, allows a user or operator to easily recognize a target object displayed on a display, and tracks/monitors the target object located at a monitoring area at a remote site, and a method for controlling the smart video monitoring system.
[8] It is another object of the present invention to provide a smart video monitoring system for communicating with an auto-tracking radar system, which can be controlled by a user at a remote site, and can provide the user with two-dimensional position data of a target object and video data captured by a camera, and a method for controlling the smart video monitoring system.
Technical Solution
[9] In accordance with one aspect of the present invention, the above and other objects can be accomplished by the provision of a smart video monitoring system interacting with an auto-tracking radar system comprising: a transmitter for transmitting position data of a target object by the radar system, generating video data using a camera, and transmitting the generated video data, which includes: an antenna unit including: an antenna for rotating, transmitting a radar signal to the target object, and receiving incident waves reflected from the target object to detect the presence of the target object and its position data, a pedestal for physically controlling operations of the antenna according to a control signal, and a housing for forming a waveguide and an overall layout; an embedded radar data converter for processing the radar signal received via the antenna unit, finding information of the target object, transmitting the found target-object information, and driving the antenna unit according to a control signal; and a video monitoring unit for capturing an image of the target object detected by the antenna unit at high resolution; a receiver for receiving output data of the transmitter, storing the camera-captured image, performing signal processing of the stored image, acquiring data of the target object, and controlling operations of the transmitter, such that a user is able to view the acquired target-object data in real time; and a signal processor for performing data communication between the transmitter and the receiver.
[10] In accordance with another aspect of the present invention, there is provided a smart video monitoring system interacting with an auto-tracking radar system, including: a transmitter for transmitting position data of a target object by the radar system, generating video data using a camera, and transmitting the generated video data; a receiver for receiving output data of the transmitter, storing the camera-captured image, performing signal processing of the stored image, acquiring data of the target object, and controlling operations of the transmitter, such that a user is able to view the acquired target-object data in real time; and a signal processor for performing data communication between the transmitter and the receiver, the smart video monitoring system comprising: the receiver including: a management server for controlling operations of the camera and the antenna unit at a remote site, transmitting information acquired from the target object by replying to control signals of the camera and the antenna unit, managing target information acquired by performing signal processing on radar data generated from the transmitter, and controlling the receiver to output image information corresponding to position information of the target object, an embedded video switcher for transmitting the video data captured by the camera to a user, a DVR (Digital Video Recorder) for converting the output image of the embedded video switcher into digital video data, storing the digital video data, and playing the stored video data, and a plurality of user terminals; and the management server for converting radar data received from a radar signal processor into a two-dimensional image, removing images of fixed objects from the two-dimensional image using a mask map, applying a linear-motion prediction algorithm to the resultant data, acquiring/tracking an image of the target object, recording information of the target object in a database (DB), managing the target-object information recorded in the DB, and providing a plurality of users with the target-object information over a network.
Advantageous Effects
[11] A smart video monitoring system according to the present invention communicates with an auto-tracking radar system, adds a high-quality image captured by a camera to two-dimensional position data acquired by a radar system, allows a user or operator to easily recognize a target object displayed on a display, and tracks/monitors the target object located at a monitoring area at a remote site.
[12] The smart video monitoring system interacts with a camera to increase the accuracy of radar monitoring data, combines a local-area common-type radar system with an imaging system, reduces the production costs of the system, and uses an open IP (Internet Protocol) as a radar communication protocol, thereby easily extending a network area of the system.
[13] In addition, the smart video monitoring system uses radar monitoring software (SAV) available for personal computers (PCs), resulting in greater convenience of the user and the implementation of monitoring automation. The smart video monitoring system is combined with a video monitoring system, such that it implements an upgradable or extensible system capable of being applied to a variety of monitoring information (e.g., forest fires or sea contamination). The smart video monitoring system records the monitoring data in a database (DB), and manages the recorded monitoring data, such that it can allow a user or operator to systematically monitor the position of a target object.
Brief Description of the Drawings
[14] The above and other objects, features and other advantages of the present invention will be more clearly understood from the following detailed description taken in conjunction with the accompanying drawings, in which:
[15] FIG. 1 is a block diagram illustrating a smart video monitoring system interacting with an auto-tracking radar system according to a preferred embodiment of the present invention.
[16] FIG. 2 is a detailed block diagram illustrating a transmitter of FIG. 1 according to the present invention.
[17] FIG. 3 is a detailed block diagram illustrating a receiver of FIG. 1 according to the present invention.
[18] FIG. 4 is a conceptual diagram illustrating a combination implementation example of the smart video monitoring system of FIG. 1, the transmitter of FIG. 2, and the receiver of FIG. 3 according to the present invention.
[19] FIG. 5 is a detailed block diagram illustrating a management server of FIG. 3 according to the present invention.
[20] FIG. 6 is a conceptual diagram illustrating the flow of data or signals for use in the management server of FIG. 3 according to the present invention.
[21] FIG. 7 is a flow chart illustrating a smart video monitoring method interacting with an auto-tracking radar system according to a preferred embodiment of the present invention.
[22] FIG. 8 is a flow chart illustrating operations of a radar signal processor of FIG. 7 according to the present invention.
[23] FIG. 9 is a flow chart illustrating operations of a video processor of FIG. 7 according to the present invention.
[24] FIG. 10 is a flow chart illustrating operations of a target processor of FIG. 7 according to the present invention.
[25] FIG. 11 is a conceptual diagram illustrating mapping operations of a radar signal processor according to the present invention.
[26] FIG. 12 is a conceptual diagram illustrating radar signal conversion operations of a radar signal processor according to the present invention.
[27] FIG. 13 is a conceptual diagram illustrating display mapping operations of a radar signal processor according to the present invention.
[28] FIG. 14 is a conceptual diagram illustrating video mixing operations of a video processor according to the present invention.
[29] FIG. 15 is a conceptual diagram illustrating a process for generating a target image using a video processor according to the present invention.
[30] FIG. 16 is a conceptual diagram illustrating the target-processing result of a target processor according to the present invention.
[31] FIG. 17 is a conceptual diagram illustrating a process for establishing a pre-monitoring area of a receiver according to the present invention.
[32] FIG. 18 shows a plurality of exemplary images illustrating the simulation results of radar-data processing according to the present invention.
[33] FIG. 19 shows an implementation example of a radar/video system monitoring program according to the present invention.
Mode for the Invention
[34] Now, preferred embodiments of the present invention will be described in detail with reference to the annexed drawings. In the drawings, the same or similar elements are denoted by the same reference numerals even though they are depicted in different drawings. In the following description, a detailed description of known functions and configurations incorporated herein will be omitted when it may make the subject matter of the present invention rather unclear.
[35] A smart video monitoring system communicating with an auto-tracking radar system, and a method for controlling the same according to the present invention will hereinafter be described with reference to the annexed drawings.
[36] It should be noted that most terminology disclosed in the present invention is defined in consideration of functions of the present invention, and may be determined differently according to the intentions of those skilled in the art or customary practice. Therefore, it is preferable that the above-mentioned terminology be understood on the basis of all contents disclosed in the present invention.
[37] The smart video monitoring system according to the present invention communicates with an auto-tracking radar system, adds a high-quality image captured by a camera to two-dimensional position data acquired by a radar system, allows a user or operator to easily recognize a target object displayed on a display, and tracks/monitors the target object located at a monitoring area at a remote site. The smart video monitoring system interacts with a camera to increase the accuracy of radar monitoring data, combines a local-area common-type radar system with an imaging system, reduces the production costs of the system, and uses an open IP (Internet Protocol) as a radar communication protocol, thereby easily extending a network area of the system. In addition, the smart video monitoring system uses radar monitoring software (SAV) available for personal computers (PCs), resulting in greater convenience of the user and the implementation of monitoring automation. The smart video monitoring system is combined with a video monitoring system, such that it implements an upgradable or extensible system capable of being applied to a variety of monitoring information (e.g., forest fires or sea contamination). The smart video monitoring system records the monitoring data in a database (DB), and manages the recorded monitoring data, such that it can allow a user or operator to systematically monitor the position of a target object.
[38] In order to accomplish the above-mentioned purposes, the smart video monitoring system according to the present invention monitors a wide area using a common-type probing radar system, includes a combination system of an infrared camera (also called a thermal vision camera) and an ultra-zoom camera in order to observe a shaded area, identify a target object, and track the position of the target object, and manages the combination system using an Internet Protocol (IP), such that it can implement integrated, open, and extensible systems in a monitoring/management area.
[39] Generally, with the increasing demand for power and communication services by users or consumers who live on islands, the number of submarine cables and associated devices is rapidly increasing.
[40] The higher the number of submarine cables, the higher their installation costs and the higher their maintenance or management costs. If the submarine cables are unexpectedly damaged or lost, a significant financial loss unavoidably occurs.
[41] Typically, the submarine cables are indirectly monitored by unofficial civilian observers or patrol ships. Therefore, since an administrator or manager has difficulty in recognizing unexpected problems of the submarine cables in real time, the administrator or manager cannot rapidly solve the above-mentioned problems, such that there is a limitation in extending the range of a monitoring area.
[42] With the increasing number of terrorists and terrorist activities throughout the world, a monitoring system for harbors, which act as the principal physical distribution bases of countries, must be implemented. Since the monitoring area is too wide to be easily monitored, substantial costs are required to implement the above-mentioned monitoring system using only cameras, and continuous maintenance costs are also required to keep the monitoring system in effective operation.
[43] The principal areas (e.g., danger areas or the Military Demarcation Line) of individual countries are being monitored by a variety of systems (e.g., a border surveillance system, a fence monitoring system, and a video monitoring system). In this case, a target area having a broad range must be monitored by the above-mentioned systems 24 hours a day, and skilled administrators or managers must be assigned to effectively manage the individual systems.
[44] To address the above-mentioned problems, the smart video monitoring system according to the present invention includes a local-area radar system capable of simultaneously monitoring the broad target area, and installs a camera at a desired shaded area to reduce the costs of the system, resulting in the implementation of a high-performance monitoring system capable of monitoring the target area day and night.
[45] The conventional operating/monitoring tasks performed by a radar administrator or manager generally rely on a user-oriented management system, such that performance is greatly affected by the skill of the administrator or manager, the performance of the radar system, and the environment of the radar system.
[46] In order to obviate the above-mentioned disadvantages, the smart video monitoring system according to the present invention correctly identifies a monitoring target using an ultra-zoom camera, and tracks the position of the target using the ultra-zoom camera, resulting in the increased performance of the radar monitoring system. Also, a sensor-oriented passive monitoring system (e.g., a camera or fence monitoring system) must be developed or upgraded to an active radar monitoring system.
[47] Therefore, the smart video monitoring system interacting with the auto-tracking radar system includes a transmitter, a receiver, and a signal processor. The transmitter generates position data indicating the target via an antenna, and generates image data interacting with the position data using a video monitoring unit. The receiver receives data from the transmitter, stores the received data, and controls operations of the transmitter, such that it allows a user to view the stored data in real time. The signal processor performs data communication between the transmitter and the receiver.
[48] A smart video monitoring system communicating with an auto-tracking radar system, and a method for controlling the same according to the present invention will hereinafter be described with reference to the annexed drawings.
[49] FIG. 1 is a block diagram illustrating a smart video monitoring system interacting with an auto-tracking radar system according to a preferred embodiment of the present invention.
[50] Referring to FIG. 1, the smart video monitoring system according to the present invention includes: a transmitter 100 for generating radar position data of a target object; a receiver 300 for receiving an output signal of the transmitter and outputting a control signal to the transmitter in response to the received signal; and a signal processor 200 located between the transmitter 100 and the receiver 300 to perform data communication between the transmitter 100 and the receiver 300.
[51] FIG. 2 is a detailed block diagram illustrating a transmitter of FIG. 1 according to the present invention. FIG. 3 is a detailed block diagram illustrating a receiver of FIG. 1 according to the present invention. FIG. 4 is a conceptual diagram illustrating a combination implementation example of the smart video monitoring system of FIG. 1, the transmitter of FIG. 2, and the receiver of FIG. 3 according to the present invention.
[52] The transmitter 100 and the receiver 300 will hereinafter be described with reference to FIGS. 2-3.
[53] As shown in FIG. 2, the transmitter 100 includes an antenna unit 110, an embedded radar data converter 120, and a video monitoring unit 130. The transmitter 100 is connected to the receiver 300 via the signal processor 200 acting as a data transmitter, transmits information of the target object to the receiver 300, and is driven by a control signal of the receiver 300. In this case, the receiver 300 includes a management server 320 equipped with a display 360, an embedded video switcher 330, a DVR (Digital Video Recorder) 310, a switching hub 250, and a plurality of user terminals 340.
[54] The antenna unit 110 includes an antenna, a pedestal, and a housing. The antenna rotates through 360° and at the same time transmits a radar beam to a target monitoring area. If a target is found, the antenna receives incident waves reflected from the target object. The pedestal physically controls the driving of the antenna. The housing forms a waveguide and an overall layout.
[55] The antenna unit 110 is connected to the embedded radar data converter 120, which processes a radar signal reflected from the target object to provide position data of the target object. The embedded radar data converter 120 displays position data acquired via the antenna unit 110 on its own screen, converts the reflected radar signal into a digital signal, and transmits the digital signal to the management server 320.
[56] The video monitoring unit 130, installed in the smart video monitoring system along with the antenna unit 110, adds video data to radar position data of the target object to acquire desired data, and includes a CCD camera 131, an infrared camera 132, and a receiver 133. The CCD camera 131 outputs high-quality video data (i.e., a high-quality image), and captures an image of the target object even at night.
[57] The video monitoring unit 130 is driven 24 hours a day according to a control signal, and captures an image of the target object by interacting with the target information received in the antenna unit 110. In the daytime or ordinary climate conditions, the CCD camera 131 is operated. At night, or when thick fog, heavy rain, or snowfall occurs, the CCD camera 131 is unable to capture an image of the target object, such that the infrared camera 132 starts its operation. A zoom lens is mounted on each camera, such that the camera can zoom in on the image of the target object located at a remote site. The housing is located on the outside of the CCD camera 131 and the infrared camera 132, such that it can prevent the cameras from being damaged by a variety of climates varying with time. The housing includes a heater and a blower to constantly maintain an inner temperature, such that it can allow the cameras to be operated at optimum conditions.
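The day/night hand-over between the CCD camera 131 and the infrared camera 132 amounts to a simple selection rule. As an illustrative sketch only (the disclosure states merely that the infrared camera takes over at night or in thick fog, heavy rain, or snowfall; the visibility threshold and function name below are assumptions):

```python
def select_camera(is_daytime, visibility_km):
    """Choose the active camera feed. The 1 km visibility threshold is
    an illustrative stand-in for 'thick fog, heavy rain, or snowfall'."""
    if is_daytime and visibility_km >= 1.0:
        return "ccd"       # CCD camera 131: daytime, ordinary climate
    return "infrared"      # infrared camera 132: night or bad weather
```

In practice such a rule would be driven by a clock/photo-sensor and a weather sensor, with the receiver 133 switching the feed accordingly.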
[58] The CCD camera 131 and the infrared camera 132 are connected to the receiver 133, which controls the driving of the cameras and their power-supply voltage in response to a control signal. The receiver 133 includes a variety of functions, for example, a function for tilting/panning the CCD camera 131 and the infrared camera 132, a zoom in/out function, a focusing function, and a preset function for establishing a pre-monitoring area.
[59] The signal processor 200 receives output data configured in the form of an optical signal from the transmitter 100, and transmits the output data of the transmitter 100 to the receiver 300. The signal processor 200 includes an optical transmitter 210 connected to the transmitter 100, and an optical receiver 220 connected to the receiver 300.
[60] Technologies associated with optical signal transmission are classified into wired optical transmission technology and wireless optical transmission technology, both of which transmit desired information configured in the form of an optical signal via a transmission path. Particularly, the wired optical transmission technology has been widely used to correctly transmit a large amount of information. The wireless optical transmission technology has been partially made available within a local area (e.g., a short-distance area); however, it has difficulty in transmitting a large amount of information over a long-distance area. Therefore, in order to effectively transmit a large amount of information to a desired destination regardless of the long-distance or short-distance area, the wired optical transmission technology employing an optical fiber path made of glass has been widely used.
[61] The optical transmitter 210 converts digital radar position data acquired from the target object into a form capable of being transmitted to a destination located at a remote site, using an optical modulation process and an optical multiplexing process. The optical modulation process converts video data captured by the camera into an optical signal. The optical multiplexing process performs simultaneous communication of a plurality of signals.
[62] The optical receiver 220 receives data of the target object (hereinafter referred to as target data) from the optical transmitter 210 by wire or wirelessly, performs optical demodulation and optical distribution on the received target data, and transmits the resultant data to the receiver 300.
[63] The receiver 300 depicted in FIG. 3 includes the management server 320, the embedded video switcher 330, the DVR 310, and a plurality of user terminals 340.
[64] The management server 320 manages overall operations of the system, processes the radar position data and camera-captured video data received from the transmitter 100, and controls the driving of the transmitter 100 and its data transmission. The embedded video switcher 330 converts a video signal captured by the CCD camera 131 and the infrared camera 132 into digital signals, and transmits video data to the user terminals 340. The DVR 310 stores/manages analog video data received from the embedded video switcher 330.
[65] The embedded video switcher 330 receives analog video data including the target's moving images captured by the cameras 131 and 132, and outputs the received analog video data to the DVR 310. The DVR 310 converts the received analog video data into digital data, and records the digital data according to the MPEG moving-image standard format, such that it manages the target's video data captured by the cameras 131 and 132 contained in the transmitter 100.
[66] The management server 320 performs data processing of the signals received from the transmitter 100 using a radar signal processor 321, a video processor 322, and a target processor 323, such that it tracks/records the position of the target object, and at the same time properly controls the receiver 133. Also, the management server transmits unified monitoring information to the display 360 and the user terminals 340 via a service unit 324.
[67] The radar signal processor 321 contained in the management server 320 converts a one-dimensional 3-bit-level antenna signal into a two-dimensional image with a resolution of 1280 x 1280 pixels, allowing the user's eyes to easily identify the target object. In order to correctly track the position of the target object, the video processor 322 removes images of the remaining objects other than the target object from the two-dimensional image, and performs an image processing step capable of minimizing the number of unnecessary images.
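The scan conversion performed by the radar signal processor 321, together with the mask-map removal of fixed objects mentioned in the Abstract, can be sketched as follows. The sweep representation, the image-size parameter, and the function names are illustrative assumptions, not the patented implementation:

```python
import numpy as np

def scan_convert(sweeps, size=1280):
    """Convert one-dimensional radar sweeps (one array of 3-bit range-bin
    intensities, 0..7, per antenna bearing) into a two-dimensional
    top-down image centred on the antenna.

    sweeps: dict mapping bearing in degrees -> 1-D sequence of intensities.
    """
    image = np.zeros((size, size), dtype=np.uint8)
    cx = cy = size // 2
    for bearing_deg, bins in sweeps.items():
        theta = np.radians(bearing_deg)
        for r, intensity in enumerate(bins):
            # Map (range, bearing) polar coordinates to pixel coordinates,
            # with bearing measured clockwise from north (image "up").
            x = int(cx + r * np.sin(theta))
            y = int(cy - r * np.cos(theta))
            if 0 <= x < size and 0 <= y < size:
                image[y, x] = max(image[y, x], intensity)
    return image

def remove_fixed_objects(image, mask_map):
    """Zero out returns from known fixed objects (coastline, structures)
    using a boolean mask map of the same size as the image."""
    return np.where(mask_map, 0, image)
```

A production scan converter would interpolate between bearings to avoid gaps at long range; this sketch only shows the polar-to-Cartesian mapping and the mask step.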
[68] The target processor 323 detects speed, position, size, and a moving direction of the target object from the image acquired from the video processor 322, searches for the target object to be monitored, and starts tracking the target object. In order to manage data of the target object, the target processor 323 assigns a tracking number to the target object, and records a variety of information (e.g., a target tracking time, a moving path of the target object, and speed of the target object) of the target object in a predetermined DB (database).
[69] If the radar signal processor 321 starts tracking the target object, the target processor 323 transmits position information of the target object to the transmitter 100 according to the movement of the target object, such that the two cameras 131 and 132 can capture the image of the target object under optimum conditions. In this case, the receiver 133, located at a specific position approximating the actual position value of the target object, searches for a preset area, and transmits information of the preset area to the transmitter 100.
[70] The service unit 324 of the management server 320 transmits the monitored information of the target object to the user terminals 340 via Web services. A variety of services are supplied by the service unit 324, for example, a log-in service for system security, a monitoring-information service, and a remote-control service for the receiver 133 that controls the two cameras 131 and 132. The monitoring-information service combines the target information recorded in the DB with an electronic marine chart, and provides a user with the combined result, such that the user can easily read and recognize the position data of the target object. The service unit 324 receives remote-control information for the cameras 131 and 132 from the user terminals 340, and transmits the camera remote-control information to the receiver 133 via the signal processor 200, such that the user can acquire detailed visual information.
[71] The display 360 has SXGA resolution composed of 1280 x 1024 pixels, and is connected to the management server 320, such that the target position data processed by the management server 320 and the video data of the embedded video switcher 330 are displayed on the display 360.
[72] The user terminals 340 are connected to the management server 320 via a switching hub 350. If a subscriber pre-registered as a member in the management server 320 gains access to the management server 320, the service unit 324 assigns specific authority to the subscriber, such that the subscriber can control overall operations of the system. Therefore, the user terminals 340 control data transmission from the transmitter 100 to the receiver 300 at a remote site, properly control motions of the antenna unit 110 and the cameras 131 and 132 of the transmitter 100, and can process the radar position data stored in an auxiliary memory and the video data corresponding to the radar position data, thereby displaying target data on the display 360.
[73] The above-mentioned video monitoring system interacting with the local-area radar system will hereinafter be described in detail.
[74] The antenna unit 110 of the transmitter 100 rotates through 360° and at the same time transmits a radar beam to a target monitoring area. If the target is found, the antenna unit 110 receives the incident waves reflected from the target object, and outputs the received incident waves to the embedded radar data converter 120. Upon receiving the incident waves from the antenna unit 110, the embedded radar data converter 120 displays the acquired position information on its screen, and converts the radar position information of the target object into digital signals, such that the position information of the target object can be transmitted to the management server 320 located at a remote site.
[75] The video monitoring unit 130, installed in the smart video monitoring system along with the antenna unit 110, controls the CCD camera 131 or the infrared camera 132 to capture an image of the target object detected by the antenna unit 110, acquires video data of the target object, and transmits the acquired video data to the optical transmitter 210. In this case, in order to more precisely capture the image of the target object, the video monitoring unit 130 precisely adjusts the cameras 131 and 132 upon receiving a control signal from the receiver 133 before capturing the image.
[76] The optical transmitter 210 performs optical modulation and optical multiplexing processes on data received from the antenna unit 110 and the video monitoring unit 130, and transmits the resultant data to the optical receiver 220.
[77] By the optical demodulation and optical distribution steps, the camera video data is received by the embedded video switcher 330, and the radar signals are received by the management server 320.
[78] The output of the embedded video switcher 330 is transmitted to the service unit of the management server 320, such that the video data is transmitted to the user terminals 340. The DVR 310 receives the output data of the embedded video switcher 330, and stores the received data. In this case, the radar signals received by the management server 320 undergo a variety of data processing steps for two-dimensional image conversion. In other words, the radar signals are processed by the radar signal processor 321, the video processor 322 for removing unnecessary images or fixed objects from the acquired image, and the target processor 323 for extracting the target information, and the resultant signals are recorded in the DB. Also, the data recorded in the DB is outputted to the display 360 and the user terminals 340 via the service unit 324, which combines the target data with map data, such that the user can simultaneously recognize the position and image of the target object.
[79] The management server 320 controls a video processor 325 to output a control signal, such that the target information acquired from the target processor 323 is transmitted to the receiver 133. Therefore, the cameras 131 and 132 also move according to the target position data, such that the management server 320 receives the target position data varying with time, and the video data corresponding to that position data, in real time. The management server 320 assigns a tracking number to a desired target object, and indicates the position of the moving target object using residual images or lines, such that the user can easily identify the moving path of the target object. Also, the speed, position, and size information of the target object is displayed on the display.
[80] The smart video monitoring system interacting with the auto-tracking radar system and a method for controlling the same according to the present invention will hereinafter be described with reference to FIG. 5.
[81] FIG. 5 is a detailed block diagram illustrating the management server 320 of FIG. 3 according to the present invention.
[82] Referring to FIG. 5, the management server 320 includes the radar signal processor 321, the video processor 322, the target processor 323, and the service unit 324.
[83] In this case, the management server 320 converts the radar data received from the radar signal processor 321 into a two-dimensional image, removes images of fixed objects from the two-dimensional image using a mask map, applies a linear-motion prediction algorithm to the resultant data, acquires/tracks an image of a desired target object, records the information of the target object in the DB, manages the information recorded in the DB, and provides a plurality of users with the information of the target object over a network.
[84] The radar signal processor 321 contained in the management server 320 converts a one-dimensional, 3-bit-level antenna signal into a two-dimensional image with a resolution of 1280 x 1280 pixels, which allows the user to easily identify the target object by eye.
[85] In order to correctly track the position of the target object, the video processor 322 removes images of objects other than the target object from the two-dimensional image, and performs an image processing step that minimizes the number of unnecessary images.
[86] The target processor 323 detects the speed, position, size, and moving direction of the target object from the image acquired from the video processor 322, searches for the target object to be monitored, and starts tracking the target object. In order to manage data of the target object, the target processor 323 assigns a tracking number to the target object, and records a variety of information (e.g., a target tracking time, a moving path of the target object, and speed of the target object) in the DB.
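For illustration only, one possible shape for such a per-target DB record is sketched below in Python. The field names and types are assumptions made for this sketch; the patent lists only the tracking time, moving path, and speed as recorded information.

```python
from dataclasses import dataclass, field
from typing import List, Tuple


@dataclass
class TargetTrack:
    """One illustrative DB record per tracked target object."""
    tracking_number: int
    first_seen: float                 # tracking start time (epoch seconds)
    path: List[Tuple[float, float]] = field(default_factory=list)  # (x, y) points
    speed: float = 0.0                # latest speed estimate

    def update(self, x: float, y: float, speed: float) -> None:
        """Append the latest position to the moving path and refresh the speed."""
        self.path.append((x, y))
        self.speed = speed


# Usage: create a record when a tracking number is assigned, then update per scan.
track = TargetTrack(tracking_number=1, first_seen=0.0)
track.update(10.0, 20.0, 3.5)
```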
[87] The service unit 324 provides a plurality of user terminals 340 with the monitored information of the target object via Web services.
[88] FIG. 6 is a conceptual diagram illustrating the flow of data or signals for use in the management server of FIG. 3 according to the present invention.
[89] Referring to FIG. 6, the management server 320 includes the radar signal processor 321, the video processor 322, the target processor 323, and the service unit 324.
[90] FIG. 7 is a flow chart illustrating a smart video monitoring method interacting with the auto-tracking radar system according to a preferred embodiment of the present invention.
[91] Referring to FIG. 7, the smart video monitoring system used in the smart video monitoring method performs transmission/reception of one-dimensional radar signal data. Upon receiving the one-dimensional radar data, the system records the received data in a file format in a memory, and performs radar signal processing of the recorded file-format data at step ST1.
[92] The smart video monitoring system creates a radar image using a monitoring-area map, a GIS map, and a user map, removes images of the remaining objects other than a desired target object from the acquired radar image, minimizes the number of unnecessary images contained in the radar image, and performs image processing on the resultant data at step ST2.
[93] Thereafter, the smart video monitoring system extracts at least one of the speed, position, size, and moving direction of the target object from the processed radar image, searches for the target object to be monitored, tracks the position of the target object, assigns a tracking number to the target object, records specific information including at least one of the tracking time, moving path, and speed of the target object in the DB, and performs target processing of the recorded information at step ST3.
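For illustration only, steps ST1 to ST3 can be viewed as a three-stage pipeline. The sketch below uses placeholder stage functions standing in for the radar signal processor, video processor, and target processor; the function boundaries are an assumption made for this sketch, not a prescribed implementation.

```python
def monitoring_pipeline(radar_lines, to_image, remove_clutter, extract_targets):
    """Chain the three processing stages of FIG. 7.

    to_image:        ST1, converts 1-D radar line data into a radar image
    remove_clutter:  ST2, masks out unwanted objects from the image
    extract_targets: ST3, extracts and records target information
    """
    image = to_image(radar_lines)        # ST1: radar signal processing
    cleaned = remove_clutter(image)      # ST2: image processing
    return extract_targets(cleaned)      # ST3: target processing


# Usage with trivial stand-in stages, purely to show the data flow:
result = monitoring_pipeline(
    [1, 2, 3],
    to_image=lambda lines: sum(lines),
    remove_clutter=lambda img: img * 2,
    extract_targets=lambda img: [img],
)
```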
[94] FIG. 8 is a flow chart illustrating operations of the radar signal processor of FIG. 7 according to the present invention.
[95] Referring to FIG. 8, if the radar signal processor 321 of the management server 320 receives line data from the embedded radar data converter 120 at step ST11, it determines whether the number of received lines is equal to a specific line number corresponding to one rotation of the antenna at step ST12.
[96] If it is determined that the number of received lines is equal to the one-rotation line number, the radar signal processor 321 maps the one-dimensional data to two-dimensional coordinates at step ST13. The radar signal processor 321 generates a radar image, and forms a BMP file composed of 1280 x 1280 pixels.
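For illustration only, the mapping of one rotation of line data onto a 1280 x 1280 image can be sketched in Python as follows. The clockwise bearing convention, the 3-bit echo levels, and the nearest-pixel rounding are assumptions made for this sketch; the patent specifies only the output resolution.

```python
import numpy as np


def polar_to_cartesian(lines, size=1280):
    """Map one rotation of 1-D radar line data onto a 2-D image.

    `lines` has shape (n_bearings, n_range_bins), one echo array per
    antenna bearing; bearing 0 points straight up and bearings increase
    clockwise, as on a PPI display. The antenna sits at the image centre.
    """
    n_bearings, n_bins = lines.shape
    image = np.zeros((size, size), dtype=np.uint8)
    cx = cy = size // 2
    max_radius = size // 2 - 1
    for b in range(n_bearings):
        theta = 2.0 * np.pi * b / n_bearings
        for r in range(n_bins):
            radius = r * max_radius / (n_bins - 1)
            x = int(round(cx + radius * np.sin(theta)))
            y = int(round(cy - radius * np.cos(theta)))
            # Keep the strongest echo when several bins land on one pixel.
            image[y, x] = max(image[y, x], lines[b, r])
    return image
```

The resulting array could then be written out as the BMP file mentioned above.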
[97] FIG. 9 is a flow chart illustrating operations of the video processor 322 of FIG. 7 according to the present invention.
[98] Referring to FIG. 9, if the video processor 322 contained in the management server 320 receives the radar image at step ST21, it determines the presence or absence of fixed objects (i.e., GIS (Geographic Information System) data) in the radar image at step ST22.
[99] If the GIS data is detected at step ST22, the video processor 322 performs masking of the detected GIS data at step ST23, resulting in the creation of a masked radar image. Otherwise, if the GIS data is not detected at step ST22, or once the mask processing has been performed, the video processor 322 determines the presence or absence of a monitoring-area map (i.e., hazard data) at step ST24. If the hazard data is detected at step ST24, the video processor 322 performs the masking process on the received data at step ST25, such that a new image combining the GIS-processed data and the hazard data is formed. The video processor then creates a radar image, and forms a BMP file composed of 1280 x 1280 pixels.
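For illustration only, the ST21 to ST25 flow can be sketched as two optional masking stages. Representing each mask as a binary array (1 = area covered by the map, 0 = elsewhere) is an assumption made for this sketch; the flow chart itself only specifies the presence/absence checks and the masking steps.

```python
import numpy as np


def build_target_image(radar, gis_mask=None, hazard_mask=None):
    """Apply the two masking stages of FIG. 9 to a radar image.

    gis_mask marks fixed objects to suppress (ST23); hazard_mask marks
    the monitoring areas to keep (ST25). Either stage is skipped when
    its map is absent, mirroring the ST22/ST24 checks.
    """
    img = radar.copy()
    if gis_mask is not None:
        img = img * (1 - gis_mask)   # ST23: remove fixed GIS objects
    if hazard_mask is not None:
        img = img * hazard_mask      # ST25: keep only monitored areas
    return img
```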
[100] FIG. 10 is a flow chart illustrating operations of the target processor 323 of FIG. 7 according to the present invention.
[101] Referring to FIG. 10, the target processor 323 receives an object image at step ST31.
[102] The target processor 323 removes unnecessary images (e.g., interference, sea waves, and snow/rain) from the received object image at step ST32. The target processor 323 assigns an object number to each corresponding object using an object-searching algorithm at step ST33, and calculates a variety of information (e.g., size, position, and center) of each object at step ST34. The target processor 323 then determines whether consistency between a current object and a previous object is maintained, using a linear-motion prediction algorithm, at step ST35. If the consistency between the current object and the previous object is maintained at step ST35, the target processor 323 assigns the same tracking number as that of the previous object to the current object at step ST36. Otherwise, if the consistency is not maintained at step ST35, the target processor 323 assigns a new tracking number to the current object at step ST37. Thereafter, the target processor 323 records the target's tracking information (e.g., the number and position of the target) in the DB at step ST38. The target processor 323 reads danger information (e.g., speed, position, and direction) established by a user at step ST39, calculates danger levels of the individual objects at step ST40, and determines the presence or absence of any dangerous object at step ST41. If a dangerous object is present at step ST41, the target processor 323 searches for the object of the highest danger level at step ST42, searches for a preset warning area approximating the position of the found object at step ST43, and transmits a camera control command to the receiver 133 at step ST44.
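For illustration only, the ST35 consistency test using linear-motion prediction can be sketched as follows. The distance-gate rule and its radius are illustrative assumptions; the patent names the linear-motion prediction algorithm but does not specify how consistency is scored.

```python
import math


def predict_position(prev_pos, velocity, dt=1.0):
    """Linear-motion prediction: extrapolate the previous position by
    the previous velocity over one scan interval."""
    return (prev_pos[0] + velocity[0] * dt,
            prev_pos[1] + velocity[1] * dt)


def is_same_target(predicted, observed, gate=10.0):
    """Illustrative ST35 test: the new detection continues the old track
    (same tracking number, ST36) when it lies within a distance gate of
    the predicted position; otherwise a new number is assigned (ST37)."""
    return math.hypot(observed[0] - predicted[0],
                      observed[1] - predicted[1]) <= gate


# Usage: a target moving right at 5 units/scan is expected at (5, 0);
# a detection near that point keeps its tracking number.
expected = predict_position((0.0, 0.0), (5.0, 0.0))
```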
[103] FIG. 11 is a conceptual diagram illustrating mapping operations of the radar signal processor 321 according to the present invention.
[104] Referring to FIG. 11, the radar signal processor 321 generates a two-dimensional radar image according to the number of directions for each scale and resolution for each dot.
[105] FIG. 12 is a conceptual diagram illustrating radar signal conversion operations of the radar signal processor according to the present invention.
[106] Referring to FIG. 12, the radar signal processor 321 receives one-dimensional radar data, and converts the received one-dimensional radar data into a two-dimensional image.
[107] FIG. 13 is a conceptual diagram illustrating display mapping operations of the radar signal processor according to the present invention.
[108] Referring to FIG. 13, the radar signal processor 321 converts the one-dimensional radar data, and maps it to a two-dimensional map, thereby acquiring the mapped resultant image.
[109] FIG. 14 is a conceptual diagram illustrating video mixing operations of the video processor according to the present invention.
[110] Referring to FIG. 14, the video processor 322 receives height (H) information (h1), width (W) information (w1), position (P) information (p1), and scale (S) information (s1) from the two-dimensional image acquired by conversion of the one-dimensional radar data.
[111] The video processor 322 receives h2-, w2-, p2-, and s2- data of the target image, h3-, w3-, p3-, and s3- data of a User Hazard Map, h4-, w4-, p4-, and s4- data of an Original Hazard Map, h5-, w5-, p5-, and s5- data of a User Drawing Map, and h6-, w6-, p6-, and s6- data of a Sea Map obtained by GIS, and processes the received data. The video processor 322 outputs h7-, w7-, p7-, and s7- data of a Service Map.
[112] FIG. 15 is a conceptual diagram illustrating a process for generating a target image using the video processor according to the present invention.
[113] Referring to FIG. 15, the video processor 322 defines or limits a target tracking range, removes data of fixed objects (e.g., GIS objects and user-defined objects) from an acquired image, and converts only the data of a desired target object into a radar image.
[114] Therefore, the video processor 322 removes the colored parts of the "GIS Map & User Map" image denoted by "2" from the radar image denoted by "1" in FIG. 15, and extracts only the colored parts of the "Hazard Map & User Map" denoted by "3" from the resultant image, thereby forming a target image denoted by "4".
[115] In this case, the above-mentioned radar image is a picture file formed by converting the one-dimensional radar data into two-dimensional data.
[116] The "GIS Map & User Map" image denoted by "2" is used to remove fixed objects (e.g., islands and breakwaters) from the radar tracking result. The "Hazard Map & User Map" image denoted by "3" indicates the principal hazard areas of the radar system, such that the user can determine the danger levels of individual objects. The GIS Map picture is formed by capturing an electronic marine chart on the basis of the geographical information of the submarine cables and the radar antenna direction according to a predetermined scale.
[117] If the user desires to release the danger area differently from the electronic marine chart, the user can release the danger area by inserting the User Map picture into the radar image. The video processor 322 creates the GIS Map and the User Map as a single picture file, and processes the created picture file. Each of the remaining areas other than the hazard areas is filled with a pixel value of "0" (i.e., the binary number "0"), and each of the hazard areas is filled with the binary number "1111", and a logical AND operation is then performed between the resultant image and the radar image.
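For illustration only, the AND masking described above can be sketched as follows. The 8-bit pixel depth is an assumption made for this sketch; the patent specifies only the two fill values, "0" and binary "1111" (0x0F).

```python
import numpy as np


def and_mask(radar_image, keep_area):
    """AND a radar image with a mask built as the text describes.

    Pixels inside the area to keep are filled with binary 1111 (0x0F),
    and all other pixels with 0, so the AND operation passes radar
    echoes through the kept area unchanged and zeroes everything else.
    """
    mask = np.where(keep_area, np.uint8(0x0F), np.uint8(0))
    return np.bitwise_and(radar_image, mask)
```

Because the echo levels fit within four bits, ANDing with 0x0F preserves them exactly inside the kept area.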
[118] The "Hazard Map & User Map" denoted by "3" in FIG. 15 indicates the setup of the main radar hazard areas (i.e., the main radar monitoring areas) used to determine the presence or absence of danger in each area. If a danger area is established by the user, the User Map picture is inserted into the resultant image. The Hazard Map and the User Map are configured in the form of a single picture file, such that the single picture file including the Hazard Map and the User Map is processed by the video processor 322. Each area to be used as a hazard area is filled with a specific pixel value equal to the binary number "1111", and each of the remaining areas is filled with the binary number "0", and a logical AND operation is then performed between the resultant image and the radar image.
[119] The Target Image denoted by "4" in FIG. 15 is indicative of the final target-object picture acquired by the video mixing of the video processor 322.
[120] FIG. 16 is a conceptual diagram illustrating the target-processing result of the target processor according to the present invention.
[121] Referring to FIG. 16, the target processor 323 extracts a target from the target image, analyzes the extracted target, and records the analyzed data in the DB. In this case, the information extracted from the target image includes an ID number, position, speed, etc.
[122] FIG. 17 is a conceptual diagram illustrating a process for establishing a pre-monitoring area of a receiver according to the present invention.
[123] As shown in FIG. 17, target monitoring information (e.g., position, speed, and operating time) and the warning level are displayed.
[124] FIG. 18 shows a plurality of exemplary images illustrating the simulation result of a radar-data processing according to the present invention.
[125] Referring to FIG. 18, the radar signal processor for the radar-data processing receives an original radar image denoted by "1".
[126] The radar signal processor establishes a fixed-object mask denoted by "2". The radar signal processor can extract only the image of the fixed-object mask from the original radar image, as shown in the fixed-object-removing image denoted by "3". The radar signal processor establishes a submarine cable mask denoted by "4". Therefore, only the target object image denoted by "5", corresponding to the submarine cable mask denoted by "4", is extracted from the fixed-object-removing image denoted by "3", such that the extracted image denoted by "5" remains. Target (i.e., object) information can be displayed as denoted by "6" in FIG. 18.
[127] FIG. 19 shows an implementation example of a radar/video control monitoring program according to the present invention.
[128] FIG. 19 shows an image illustrating the summarized result of the video monitoring information program interacting with the local-area radar system. The image of FIG. 19 can be found on the display 360 or the user terminal 340.
[129] The target information is configured in the form of a table, in which the electronic submarine chart is combined with the target information. The images captured by the cameras 131 and 132 are overlaid with each other, such that the overlay result is displayed on a single screen. The camera images can be moved to another position, hidden, or zoomed in on. The target information is represented by symbols of various colors and sizes, such that the user can easily recognize the monitoring information. The moving path, speed, and tracking time of the target object are determined by the system, which automatically generates a warning or alert sound, resulting in greater convenience for the user.
Industrial Applicability
[130] As apparent from the above description, the smart video monitoring system communicating with an auto-tracking radar system according to the present invention adds a high-quality image captured by a camera to two-dimensional position data acquired by a radar system, allows a user or operator to easily recognize a target object displayed on a display, and tracks/monitors the target object located at a monitoring area at a remote site.
[131] Although the preferred embodiments of the present invention have been disclosed for illustrative purposes, those skilled in the art will appreciate that various modifications, additions and substitutions are possible, without departing from the scope and spirit of the invention as disclosed in the accompanying claims.

Claims
[1] A smart video monitoring system interacting with an auto-tracking radar system comprising: a transmitter for transmitting position data of a target object by the radar system, generating video data using a camera, and transmitting the generated video data, which includes: an antenna unit including: an antenna for rotating, transmitting a radar signal to the target object, and receiving incident waves reflected from the target object to detect the presence of the target object and its position data, a pedestal for physically controlling operations of the antenna according to a control signal, and a housing for forming a waveguide and an overall layout; an embedded radar data converter for processing the radar signal received via the antenna unit, finding information of the target object, transmitting the found target-object information, and driving the antenna unit according to a control signal; and a video monitoring unit for capturing an image of the target object detected by the antenna unit at high resolution; a receiver for receiving output data of the transmitter, storing the camera- captured image, performing signal processing of the stored image, acquiring data of the target object, and controlling operations of the transmitter, such that a user is able to view the acquired target-object data in real time; and a signal processor for performing data communication between the transmitter and the receiver.
[2] The smart video monitoring system according to claim 1, wherein the video monitoring unit includes two cameras and a receiver.
[3] The smart video monitoring system according to claim 2, wherein the two cameras are a CCD camera and an infrared camera, respectively.
[4] The smart video monitoring system according to claim 3, wherein each of the two cameras includes a zoom lens capable of capturing an image of the target object located at a remote site at high resolution.
[5] The smart video monitoring system according to claim 4, wherein the CCD camera and the infrared camera are driven 24 hours a day, in which: the CCD camera is driven in the daytime or ordinary climate environments, and the infrared camera is driven in the night or a thick fog, a heavy rain, or a snowfall.
[6] The smart video monitoring system according to claim 2, wherein the receiver receives the control signal, and transmits the received control signal to the cameras, such that the cameras can be controlled at a remote site.
[7] The smart video monitoring system according to claim 1, wherein the signal processor includes: an optical transmitter for receiving radar position data received via the antenna unit and video data generated from the video monitoring unit, and performing an optical modulation process and an optical multiplexing process on the received position and video data; and an optical receiver for receiving the position and video data of the optical transmitter, performing optical demodulation and optical distribution on the received data via an optical fiber or wireless communication, and transmitting the resultant data to the receiver.
[8] The smart video monitoring system according to claim 1, wherein the receiver includes: a management server for controlling operations of the camera and the antenna unit at a remote site, transmitting information acquired from the target object by replying to control signals of the camera and the antenna unit, managing target information acquired by performing signal processing on radar data generated from the transmitter, and controlling the receiver to output image information corresponding to position information of the target object; an embedded video switcher for transmitting the video data captured by the camera to a user; a DVR (Digital Video Recorder) for converting the output image of the embedded video switcher into digital video data, storing the digital video data, and playing the stored video data; and a plurality of user terminals.
[9] The smart video monitoring system according to claim 7, wherein: if a specific user pre-registered as a member from among users of the user terminals gains access to the management server over a communication network, the management server assigns specific authority to the specific user, such that the specific user is able to control the management server using the authority.
[10] The smart video monitoring system according to claim 8, wherein the management server assigns a tracking number of the target object, indicates target's position data varying with time using residual images or lines, allows the user to easily identify a moving path of the target object, and outputs speed-, position-, and size- information of the target object on a display.
[11] The smart video monitoring system according to claim 7, wherein the video data is generated by an NTSC (National Television Systems Committee) or PAL (Phase Alternation Line) scheme.
[12] The smart video monitoring system according to claim 1, wherein the management server removes images of fixed objects from a radar image using a video processing algorithm for removing images of fixed objects, and performs signal processing on the resultant image, such that it tracks the position of the target object.
[13] The smart video monitoring system according to claim 6, wherein the receiver presets a range of a local-area monitoring area (i.e., a local-area hazard area), and controls the position of the camera moving with the movement of the target object according to the preset value of the receiver.
[14] A smart video monitoring system interacting with an auto-tracking radar system, including: a transmitter for transmitting position data of a target object by the radar system, generating video data using a camera, and transmitting the generated video data; a receiver for receiving output data of the transmitter, storing the camera-captured image, performing signal processing of the stored image, acquiring data of the target object, and controlling operations of the transmitter, such that a user is able to view the acquired target-object data in real time; and a signal processor for performing data communication between the transmitter and the receiver, the smart video monitoring system comprising: the receiver including: a management server for controlling operations of the camera and the antenna unit at a remote site, transmitting information acquired from the target object by replying to control signals of the camera and the antenna unit, managing target information acquired by performing signal processing on radar data generated from the transmitter, and controlling the receiver to output image information corresponding to position information of the target object, an embedded video switcher for transmitting the video data captured by the camera to a user, a DVR (Digital Video Recorder) for converting the output image of the embedded video switcher into digital video data, storing the digital video data, and playing the stored video data, and a plurality of user terminals; and the management server for converting radar data received from a radar signal processor into a two-dimensional image, removing images of fixed objects from the two-dimensional image using a mask map, applying a linear-motion prediction algorithm to the resultant data, acquiring/tracking an image of the target object, recording information of the target object in a database (DB), managing the target-object information recorded in the DB, and providing a plurality of users with the 
target-object information over a network.
[15] The smart video monitoring system according to claim 14, wherein the management server includes: a radar signal processor for converting a one-dimensional antenna signal into the two-dimensional image; a video processor for removing images of the remaining objects other than the target object from the two-dimensional image, and minimizing the number of unnecessary images in the two-dimensional image; a target processor for detecting speed, position, size, and a moving direction of the target object from the image acquired from the video processor, searching for the target object to be monitored, starting tracking the target object, assigning a tracking number to the target object, and recording a variety of information (i.e., a target tracking time, a moving path of the target object, and speed of the target object) of the target object in the DB, thereby managing the data of the target object; and a service unit for receiving the data recorded in the DB from the target processor, and providing the user terminals with monitoring information of the target object via Web services.
[16] The smart video monitoring system according to claim 14, wherein the management server includes: a radar signal processor for performing transmission/reception of the radar data, receiving one-dimensional radar data, recording the received one-dimensional radar data as a file format in a memory, performing a radar-imaging process, and converting the one-dimensional radar data into the two-dimensional image; a video processor for receiving a scanned radar image, a GIS map image, and a hazard map image, and generating a target image using the received images; a target processor for extracting an image of the target object using a target-searching algorithm, analyzing an ID number, position, size, and speed of the target object using a linear-motion prediction algorithm, recording the analyzed information in a database (DB), starting tracking the target object, and commanding the camera to focus on the most harmful target object; a service unit for combining the target information, the radar image, a GIS map, and a hazard map with one another, providing the users with the combined resultant data, and providing a link to the embedded video switcher; and a video processor for controlling the camera according to request signals of the service unit and the target processor.
[17] The smart video monitoring system according to claim 16, wherein the target processor controls panning/tilting/zooming (P/T/Z) operations using a preset function of the receiver.
[18] The smart video monitoring system according to claim 16, wherein the management server includes: a radar signal processor for performing transmission/reception of the radar data, receiving one-dimensional radar data, recording the received one-dimensional radar data as a file format in a memory, performing a radar-imaging process, and converting the one-dimensional radar data into the two-dimensional image; a video processor for receiving the two-dimensional image from the radar signal processor, defining a target tracking range, removing images of fixed objects, converting only data of a desired target object into a radar image, and outputting a target image using the radar image, a "hazard map + user map" image, and a "GIS map + user map" image; a target processor for extracting an image of the target object from the target image generated by the video processor, analyzing the extracted target image, recording target information (i.e., a picture number, ID number, position, size, and speed) in the DB, determining the stored target information, and commanding the camera to focus on the most harmful target object; a service unit for receiving the radar image, the "hazard map + user map" image, and the "GIS map + user map" image, receiving the target information recorded in the DB from the target processor, mixing the received information, and providing a client with a monitoring picture indicating the mixed result; a video controller for providing a monitoring unit with moving images, and controlling the camera upon receiving control commands from the target processor and the service unit; and a monitoring unit connected to the service unit, for displaying a radar monitoring picture, the target information, and a camera-captured image (i.e., a stream), and processing a camera control operation.
[19] The smart video monitoring system according to claim 18, wherein the target processor controls panning/tilting/zooming (P/T/Z) operations using a preset function of the receiver.
[20] A smart video monitoring method interacting with an auto-tracking radar system, comprising the steps of: a) receiving one-dimensional radar data, transmitting the received one-dimensional radar data, recording the received one-dimensional radar data as a file format in a memory, and performing radar signal processing on the recorded data; b) generating a radar image using a scanned radar image, a hazard map image, a GIS map image, and a user map, removing images of the remaining objects other than a target object to be monitored from the radar image, minimizing the number of unnecessary images in the radar image, and performing image processing on the resultant radar image; and c) analyzing speed, position, size, and moving-direction data of the target object contained in the image-processed radar image, searching for the target object to be monitored according to the analyzed result, tracking the position of the target object, assigning a tracking number to the target object, recording specific information including at least one of the tracking time, moving path, and speed of the target object in a database (DB), and performing target-processing of the recorded information.
[21] The smart video monitoring method according to claim 20, wherein the step (a) includes the steps of: a1) if a radar signal processor of a management server receives line data from an embedded radar data converter, determining, by the radar signal processor, whether the number of received lines is equal to a specific line number corresponding to one rotation; a2) if the number of received lines is equal to the one-rotation line number, mapping the one-dimensional data to two-dimensional coordinates; and a3) generating a radar image, and forming a two-dimensional image file.
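The one-rotation mapping of steps a2) and a3) amounts to a polar-to-Cartesian conversion: each received line is one azimuth sweep of echo amplitudes ordered by range bin, and a full rotation of lines is painted onto a two-dimensional image. A minimal sketch in Python; the function name, the equal-angle-per-sweep assumption, and the max-hold rule for overlapping pixels are illustrative assumptions, not taken from the patent:

```python
import math

def map_scan_to_image(lines, image_size):
    """Map one rotation of 1-D radar line data onto a 2-D image.

    Each entry in `lines` is one azimuth sweep: a list of echo
    amplitudes ordered by range bin.  The rotation is assumed to
    cover 360 degrees in equal angular steps (real antennas would
    report an azimuth code per line).
    """
    n_lines = len(lines)
    n_bins = len(lines[0])
    center = image_size // 2
    # Scale range bins so the longest range touches the image edge.
    scale = center / n_bins
    image = [[0] * image_size for _ in range(image_size)]
    for i, line in enumerate(lines):
        theta = 2 * math.pi * i / n_lines        # azimuth of this sweep
        for r, amplitude in enumerate(line):
            x = center + int(r * scale * math.cos(theta))
            y = center + int(r * scale * math.sin(theta))
            if 0 <= x < image_size and 0 <= y < image_size:
                # Keep the strongest echo when sweeps overlap a pixel.
                image[y][x] = max(image[y][x], amplitude)
    return image
```

The resulting 2-D array is what step a3) would serialize as the two-dimensional image file.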
[22] The smart video monitoring method according to claim 20, wherein the step (b) includes the steps of: b1) if a video processor of a management server receives the radar image, determining, by the video processor, the presence or absence of GIS data (i.e., a GIS map); b2) if the presence of the GIS data is determined, performing a masking process; b3) if the absence of the GIS data is determined at the step (b1), or if the masking process is performed, determining the presence or absence of the hazard map; b4) if the presence of the hazard map is determined, performing the masking process; and b5) generating the radar image, and forming a two-dimensional image file.
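Steps b1) through b4) describe conditional masking passes: when a GIS map (and then a hazard map) is available, its mask is applied to the radar image before the two-dimensional file is formed. A hedged sketch of that flow; the 0/1 mask convention and the function name are assumptions for illustration only:

```python
def apply_masks(radar_image, gis_mask=None, hazard_mask=None):
    """Zero out radar pixels that fall on masked (no-interest) areas.

    Masks are same-sized 2-D arrays where 1 keeps a pixel and 0
    removes it -- e.g. a GIS-map land mask or a hazard-map region
    of interest.  Either mask may be absent, mirroring the
    presence/absence checks of steps b1) through b4).
    """
    out = [row[:] for row in radar_image]
    for mask in (gis_mask, hazard_mask):
        if mask is None:        # absence of the map: skip this masking pass
            continue
        for y, row in enumerate(mask):
            for x, keep in enumerate(row):
                if not keep:
                    out[y][x] = 0
    return out
```

With both masks absent the image passes through unchanged, matching the branch in which neither map is present.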
[23] The smart video monitoring method according to claim 20, wherein the step (c) includes the steps of: c1) receiving an image of the target object from a target processor of a management server, and determining whether consistency between a current object and a previous object is maintained; c2) if the consistency between the current object and the previous object is maintained, assigning the same tracking number as that of the previous object to the current object; c3) if the consistency between the current object and the previous object is not maintained, assigning a new tracking number to the current object; c4) recording target-tracking information in the DB, reading danger information established by a user, calculating a danger level of each object, and determining the presence or absence of any dangerous object; and c5) if the dangerous object is found, performing camera control operations.
[24] The smart video monitoring method according to claim 23, wherein the step (c1) includes the steps of: c1-1) receiving an image of the target object from the target processor of the management server; c1-2) removing unnecessary images from the received image; c1-3) assigning a target number to the target object; c1-4) calculating target information; and c1-5) determining whether consistency between a current object and a previous object is maintained.
[25] The smart video monitoring method according to claim 24, wherein the step c1-3) includes the step of: assigning the target number to the target object using a target-searching algorithm.
[26] The smart video monitoring method according to claim 24, wherein the step c1-5) includes the step of: determining whether consistency between the current object and the previous object is maintained using a linear-motion prediction algorithm.
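The consistency test of claim 26, based on a linear-motion prediction algorithm, can be sketched as nearest-neighbour association against each track's predicted position (position plus velocity per frame): a detection within the gate inherits the old tracking number, otherwise it gets a new one. The distance gate, data layout, and function name below are illustrative assumptions, not the patented algorithm itself:

```python
import math

def associate_tracks(prev_tracks, detections, max_dist=5.0):
    """Assign tracking numbers using simple linear-motion prediction.

    prev_tracks: dict of track_id -> ((x, y), (vx, vy)).
    detections:  list of (x, y) positions from the current radar frame.
    A detection keeps a previous track's number when it lies within
    `max_dist` of that track's predicted position; otherwise a new
    number is assigned, as in steps c2) and c3).
    """
    next_id = max(prev_tracks, default=0) + 1
    assignments = {}
    unused = dict(prev_tracks)
    for det in detections:
        best_id, best_d = None, max_dist
        for tid, ((px, py), (vx, vy)) in unused.items():
            pred = (px + vx, py + vy)              # linear prediction
            d = math.hypot(det[0] - pred[0], det[1] - pred[1])
            if d <= best_d:
                best_id, best_d = tid, d
        if best_id is None:                        # no consistent track
            best_id = next_id
            next_id += 1
        else:
            unused.pop(best_id)
        assignments[best_id] = det
    return assignments
```

Tracks left in `unused` after a frame are candidates for deletion; the patent leaves that policy, and the gate size, to the implementation.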
[27] The smart video monitoring method according to claim 23, wherein the step c5) includes the steps of: c5-1) if a dangerous or harmful target object is found at step c4), searching for the most harmful object from among a plurality of objects; c5-2) searching for a preset area approximating a position of the most harmful object; and c5-3) transmitting a camera control command to a receiver.
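Steps c5-1) through c5-3) select the most harmful tracked object, then the camera preset nearest to it, before a P/T/Z command is sent to the receiver. A minimal sketch under assumed data shapes; the danger scores, preset coordinates, and function name are hypothetical:

```python
import math

def select_preset(objects, presets):
    """Pick the camera preset nearest the most harmful tracked object.

    objects: list of (danger_level, (x, y)) tuples from the target
             processor's danger calculation.
    presets: dict of preset_number -> (x, y) camera aiming point.
    Returns (preset_number, target_position).
    """
    # Step c5-1): the most harmful object has the highest danger level.
    _, pos = max(objects, key=lambda o: o[0])
    # Step c5-2): find the preset whose aiming point best approximates
    # that object's position.
    preset = min(presets, key=lambda p: math.dist(presets[p], pos))
    # Step c5-3) would now send `preset` to the receiver as a P/T/Z command.
    return preset, pos
```

How the danger level itself is computed from the user-established danger information of step c4) is left open by the claims.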
PCT/KR2006/003067 2005-08-04 2006-08-04 Smart video monitoring system and method communicating with auto-tracking radar system WO2007015631A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN2006800343611A CN101268383B (en) 2005-08-04 2006-08-04 Smart video monitoring system and method communicating with auto-tracking radar system

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
KR1020050071447A KR100594513B1 (en) 2005-08-04 2005-08-04 Image monitoring system connected with close range radar
KR10-2005-0071447 2005-08-04
KR1020060073033A KR100720595B1 (en) 2006-08-02 2006-08-02 Smart video monitoring system and method communicating with auto-tracking radar system
KR10-2006-0073033 2006-08-02

Publications (1)

Publication Number Publication Date
WO2007015631A1 true WO2007015631A1 (en) 2007-02-08

Family

ID=37708892

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2006/003067 WO2007015631A1 (en) 2005-08-04 2006-08-04 Smart video monitoring system and method communicating with auto-tracking radar system

Country Status (1)

Country Link
WO (1) WO2007015631A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0815406A (en) * 1994-06-28 1996-01-19 Mitsubishi Electric Corp Video display apparatus
US6011515A (en) * 1996-10-08 2000-01-04 The Johns Hopkins University System for measuring average speed and traffic volume on a roadway
KR20000017043A (en) * 1998-08-04 2000-03-25 요코미조 히로시 Three-dimensional radar apparatus and method for displaying three-dimensional radar image
KR20010017644A (en) * 1999-08-13 2001-03-05 김계호 Complex Image Displaying Apparatus for Vessel

Cited By (26)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8314816B2 (en) 2009-06-08 2012-11-20 Honeywell International Inc. System and method for displaying information on a display element
EP2267409A1 (en) * 2009-06-08 2010-12-29 Honeywell International Inc. System and method for displaying information on a display element
GB2508770B (en) * 2011-09-09 2017-11-22 Accipiter Radar Tech Inc Device and method for 3D sampling with avian radar
US9055201B2 (en) 2011-10-14 2015-06-09 Samsung Techwin Co., Ltd. Apparatus and method of storing and searching for image
CN104280730A (en) * 2013-07-09 2015-01-14 北京瑞达恩科技股份有限公司 Low-altitude search radar
CN104135644A (en) * 2014-07-31 2014-11-05 天津市亚安科技股份有限公司 Intelligent tracking cradle head having radar monitoring function and monitoring method
JP2016057185A (en) * 2014-09-10 2016-04-21 日本無線株式会社 Buried object survey device
CN108037501A (en) * 2018-01-30 2018-05-15 长沙深之瞳信息科技有限公司 It is a kind of to obtain area outlook radar system and method for the target pitch to angle
CN108037501B (en) * 2018-01-30 2023-10-03 长沙深之瞳信息科技有限公司 Regional warning radar system and method capable of acquiring pitching angle of target
CN110297234A (en) * 2018-03-22 2019-10-01 西安航通测控技术有限责任公司 A kind of big region passive type air target intersection measuring method of networking and system
CN110297234B (en) * 2018-03-22 2023-03-14 西安航通测控技术有限责任公司 Networked large-area passive air target intersection determination method and system
CN109188429A (en) * 2018-08-30 2019-01-11 国网电力科学研究院武汉南瑞有限责任公司 Electricity transmission line monitoring method and monitoring system based on radar and two waveband video camera
CN109188429B (en) * 2018-08-30 2022-11-25 国网电力科学研究院武汉南瑞有限责任公司 Power transmission line monitoring method and monitoring system based on radar and dual-band camera
CN110046130B (en) * 2019-05-17 2022-10-18 大连海事大学 Radar signal file organization and management method
CN110046130A (en) * 2019-05-17 2019-07-23 大连海事大学 A kind of Organization And Management's method of radar signal file
US10937232B2 (en) 2019-06-26 2021-03-02 Honeywell International Inc. Dense mapping using range sensor multi-scanning and multi-view geometry from successive image frames
CN110413836A (en) * 2019-07-18 2019-11-05 湖南宏动光电有限公司 A kind of panorama search system
CN111402296A (en) * 2020-03-12 2020-07-10 浙江大华技术股份有限公司 Target tracking method based on camera and radar and related device
CN111402296B (en) * 2020-03-12 2023-09-01 浙江大华技术股份有限公司 Target tracking method and related device based on camera and radar
CN111695771A (en) * 2020-05-07 2020-09-22 国网安徽省电力有限公司淮南供电公司 Intelligent electric power material detection, management and control system and method based on Internet of things technology
CN111695771B (en) * 2020-05-07 2024-02-27 国网安徽省电力有限公司淮南供电公司 Electric power material intelligent detection management and control system and method based on Internet of things technology
CN113759366A (en) * 2020-06-02 2021-12-07 杭州海康威视数字技术股份有限公司 Target detection method and device
CN113596325A (en) * 2021-07-15 2021-11-02 盛景智能科技(嘉兴)有限公司 Picture capturing method and device, electronic equipment and storage medium
CN113596325B (en) * 2021-07-15 2023-05-05 盛景智能科技(嘉兴)有限公司 Method and device for capturing images, electronic equipment and storage medium
CN113790667A (en) * 2021-11-18 2021-12-14 中大检测(湖南)股份有限公司 Dam deformation detection method based on radar
CN115980739A (en) * 2023-03-21 2023-04-18 安徽隼波科技有限公司 Automatic defense deploying method for radar guided photoelectric tracking

Similar Documents

Publication Publication Date Title
WO2007015631A1 (en) Smart video monitoring system and method communicating with auto-tracking radar system
KR100720595B1 (en) Smart video monitoring system and method communicating with auto-tracking radar system
KR100594513B1 (en) Image monitoring system connected with close range radar
US7801331B2 (en) Monitoring device
KR101120131B1 (en) Intelligent Panorama Camera, Circuit and Method for Controlling thereof, and Video Monitoring System
US6529234B2 (en) Camera control system, camera server, camera client, control method, and storage medium
US9762864B2 (en) System and method for monitoring at least one observation area
EP1585332A1 (en) Remote video display method, video acquisition device, method thereof, and program thereof
US20110310219A1 (en) Intelligent monitoring camera apparatus and image monitoring system implementing same
EP2284814A1 (en) Systems and methods for night time surveillance
US20050036036A1 (en) Camera control apparatus and method
US20110157358A1 (en) Confined motion detection for pan-tilt cameras employing motion detection and autonomous motion tracking
US20050099500A1 (en) Image processing apparatus, network camera system, image processing method and program
KR101502448B1 (en) Video Surveillance System and Method Having Field of Views of 360 Degrees Horizontally and Vertically
US10397474B2 (en) System and method for remote monitoring at least one observation area
CN102868875A (en) Multidirectional early-warning positioning and automatic tracking and monitoring device for monitoring area
WO2009066988A2 (en) Device and method for a surveillance system
KR101096157B1 (en) watching apparatus using dual camera
CN202818503U (en) Multidirectional monitoring area early warning positioning automatic tracking and monitoring device
KR101075874B1 (en) Video transmission system
KR20090015311A (en) Video surveillance system
KR101281687B1 (en) Method for monitoring region on bad visuality
CN113497877A (en) Image pickup apparatus, control method, and storage medium
KR102259637B1 (en) Broadcasting Transmission System Based on Artificial Intelligence for Unmanned Broadcasting
CN113572946B (en) Image display method, device, system and storage medium

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase

Ref document number: 200680034361.1

Country of ref document: CN

121 Ep: the epo has been informed by wipo that ep was designated in this application
WWE Wipo information: entry into national phase

Ref document number: 12008500271

Country of ref document: PH

NENP Non-entry into the national phase

Ref country code: DE

WWE Wipo information: entry into national phase

Ref document number: 577/KOLNP/2008

Country of ref document: IN

WWE Wipo information: entry into national phase

Ref document number: 1200800539

Country of ref document: VN

122 Ep: pct application non-entry in european phase

Ref document number: 06783516

Country of ref document: EP

Kind code of ref document: A1