WO2012050244A1 - Video surveillance apparatus and object search method therefor - Google Patents
Video surveillance apparatus and object search method therefor
- Publication number
- WO2012050244A1 (published as WO 2012/050244 A1; application PCT/KR2010/006929, KR 2010006929 W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- area
- displaying
- image
- characteristic value
- detection frequency
- Prior art date
Links
Images
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
- H04N7/188—Capturing isolated or intermittent images triggered by the occurrence of a predetermined event, e.g. an object reaching a predetermined position
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/50—Information retrieval; Database structures therefor; File system structures therefor of still image data
- G06F16/58—Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
- G06F16/583—Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using metadata automatically derived from the content
- G06F16/5854—Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using metadata automatically derived from the content using shape and object relationship
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G06T7/77—Determining position or orientation of objects or cameras using statistical methods
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B13/00—Burglar, theft or intruder alarms
- G08B13/18—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
- G08B13/189—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
- G08B13/194—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
- G08B13/196—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
- G08B13/19602—Image analysis to detect motion of the intruder, e.g. by frame subtraction
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/40—Software arrangements specially adapted for pattern recognition, e.g. user interfaces or toolboxes therefor
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20076—Probabilistic image processing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30232—Surveillance
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/52—Surveillance or monitoring of activities, e.g. for recognising suspicious objects
Definitions
- the method for searching an object of the video surveillance apparatus further includes inputting a characteristic value of an object to be searched in the surveillance image prior to displaying in the first area,
- the displaying step is characterized by separately displaying the detection frequency of the object having the input characteristic value in the first area.
- the object searching method of the video surveillance apparatus further includes inputting time information to be searched in the surveillance image prior to the step of displaying in the first area, and the displaying step separately displays, in the first area, the detection frequency of the object having the input characteristic value on the basis of the input time information.
- the object searching method of the video surveillance apparatus further includes inputting area information to be searched in the surveillance image before displaying in the first area, and the displaying step separately displays, in the first area, the detection frequency of the object having the input characteristic value based on the input area information.
- the input characteristic value includes a first characteristic value and a second characteristic value of an object to be searched for in the surveillance image.
- the step of displaying in the first area includes separately displaying the detection frequency of an object having a value between the first characteristic value and the second characteristic value.
- the method for searching an object of the video surveillance apparatus may further include, before displaying in the first area: determining the number of times an object having the characteristic value is detected in the surveillance image; and generating the detection frequency by normalizing the determined detection count.
- the characteristic value includes at least one of a size, a speed, and a hue of an object detected from the monitored image.
- the step of displaying in the first area is a step of graphically displaying the characteristic value and the detection frequency in the first area.
- the step of displaying in the second area may include displaying a frame selected from the searched frames.
- the frame displayed in the second area is a thumbnail image.
- the object searching method of the video surveillance apparatus further includes displaying an image related to a thumbnail image selected from the thumbnail images.
- a user can efficiently filter the characteristics of an object to be searched for based on statistical data on the characteristics of objects detected in the surveillance image. Accordingly, the accuracy of searching for an object of interest is improved, the user can perform searches easily, and object search that is robust to changes in object characteristics caused by the surveillance camera's installation environment can be provided.
- FIG. 1 is a block diagram illustrating a video surveillance system according to an embodiment of the present invention.
- FIG. 2 is a block diagram of a video surveillance apparatus according to an embodiment of the present invention.
- FIGS. 3(a) and 3(b) are views showing the structure of image data and metadata according to an embodiment of the present invention.
- FIG. 4 is a diagram illustrating a structure of a characteristic information table according to an embodiment of the present invention.
- FIG. 5 is a flowchart illustrating an object search process according to an exemplary embodiment of the present invention.
- FIG. 6 is a diagram illustrating an object search screen according to an embodiment of the present invention.
- FIG. 7 is a flowchart illustrating an object search process according to an embodiment of the present invention.
- FIGS. 8 and 9 are diagrams showing object search screens according to an embodiment of the present invention.
- FIG. 10 is a flowchart illustrating an object search process according to an embodiment of the present invention.
- FIG. 11 is a view showing an object search screen according to an embodiment of the present invention.
- FIG. 12 is a flowchart illustrating an object search process according to an embodiment of the present invention.
- FIG. 13 is a diagram illustrating an object search screen according to an embodiment of the present invention.
- " object " described in the present specification means a person or object to be monitored in the surveillance image.
- a person or object judged to move may be an object.
- event refers to events or facts that can occur in a surveillance image.
- " monitoring event " refers to an event that allows the user to effectively achieve the monitoring purpose.
- the video surveillance system 10 acquires a surveillance image through the at least one image acquisition apparatus 200, processes and analyzes the acquired surveillance image, and provides the processing and analysis results to the user through the output apparatus 300.
- the video surveillance system 10 includes an image surveillance apparatus 100, an image acquisition apparatus 200, and an output apparatus 300.
- the output device 300 can output information related to processing and analysis of the monitoring image received from the video monitoring apparatus 100 and generation of the monitoring event sensed in the monitoring image.
- The output device 300 may be a terminal, a network computer, a wireless device (e.g., a PDA), a wireless telephone, an information appliance, a workstation, a minicomputer, a mainframe computer, or a similar device.
- the user may be remotely provided with the processing/analysis results of the surveillance image through an output device 300 having text, messaging, and imaging capabilities.
- the user can be provided with the warning message generated by the video surveillance apparatus 100 by the output apparatus 300.
- the alert message may include an image.
- the image processing unit 110 analyzes the image obtained from the image capturing apparatus 200, and generates compressed image data and metadata.
- the image processing unit 110 stores the generated image data in the storage unit 130 and outputs the metadata to the monitoring control unit 120.
- the storage unit 130 stores the image acquired through the image acquisition apparatus 200.
- the storage unit 130 may be a VCR (Video Cassette Recorder), a DVR (Digital Video Recorder), a RAID (Redundant Array of Independent Disks) array, a USB (Universal Serial Bus) hard drive, an optical disk recorder, a multipurpose computer, a high-definition imaging device, a de-interlacer, a scaler, or any processing and storage element for storing and/or processing images.
- the user input unit 140 accepts an input command that the user enters to control the operation of the video surveillance apparatus 100 and transmits the received command to the monitoring control unit 120 so that the monitoring control unit 120 operates according to the command.
- the user input unit 140 may include a key pad, a dome switch, a touch pad (static / static), a jog wheel, a jog switch, and the like.
- the monitoring control unit 120 detects an object on a frame basis in the surveillance image acquired from the image acquisition apparatus 200. Further, the monitoring control unit 120 calculates the characteristic value of the detected object and records, in the metadata, the identification information of the frame in which the object is detected so as to correspond to the calculated characteristic value. Also, the monitoring control unit 120 accumulates, in the metadata, the number of object detections for each calculated characteristic value.
- the metadata recorded in this manner includes the number of times an object having each characteristic value has been detected in the surveillance image so far, and the identification information of the frames in which an object having each characteristic value was detected.
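The per-frame accumulation described above can be sketched as follows. This is an illustrative in-memory structure, not the patent's actual implementation; the class and attribute names (`CharacteristicMetadata`, `detection_count`, `frame_ids`) are assumptions chosen to mirror the description of detection counts and frame identification information.

```python
from collections import defaultdict

class CharacteristicMetadata:
    """Sketch of the metadata record described above: for each characteristic
    value, the cumulative detection count and the IDs of frames in which an
    object with that value was detected."""

    def __init__(self):
        self.detection_count = defaultdict(int)  # characteristic value -> count
        self.frame_ids = defaultdict(set)        # characteristic value -> frame IDs

    def record(self, frame_id, characteristic_value):
        """Called once per detected object, per frame."""
        self.detection_count[characteristic_value] += 1
        self.frame_ids[characteristic_value].add(frame_id)

meta = CharacteristicMetadata()
meta.record(1, "red-small")
meta.record(1, "blue-large")  # two objects in one frame: each counted separately
meta.record(2, "red-small")
```

Each object detection increments its characteristic value's count, while the frame ID set records each frame only once per value, matching the distinction the table description below draws between detection counts and frame counts.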
- the monitoring control unit 120 stores the meta data in the storage unit 130.
- the monitoring control unit 120 displays a characteristic value of an object detected in the supervisory image stored in the storage unit 130 and a detection frequency of the object having the characteristic value in the first area through the display unit.
- the monitoring control unit 120 reads the number of times of detection of the object having each characteristic value from the metadata stored in the storage unit 130.
- the read detection counts are normalized or regularized to generate the detection frequency of the object having each characteristic value.
- the monitoring control unit 120 displays the generated detection frequency for each characteristic value through the display unit. In this case, each property value and the detection frequency of the object having each property value can be displayed in a graph.
- the detection frequency is a value obtained by normalizing the detection count, i.e., a relative measure of the count. For example, when an object having the first characteristic value is detected 50 times and an object having the second characteristic value is detected 100 times in the surveillance image, the detection frequency of the object having the first characteristic value may be 1 (50 divided by 50), and the detection frequency of the object having the second characteristic value may be 2 (100 divided by 50).
- the monitoring control unit 120 may display the searched frame in the second area through the display unit.
- the monitoring control unit 120 extracts a frame corresponding to the identification information of the frame read from the metadata from the video data stored in the storage unit 130.
- the monitoring control unit 120 may display the extracted frame in the second area through the display unit.
- the frame displayed in the second area may be a thumbnail image representing a series of images from when the object is first detected in the surveillance image until the object leaves the surveillance image. Also, when one or more thumbnail images are selected from the thumbnail images displayed in the second area, the series of images represented by the selected thumbnail images may be displayed.
- FIGS. 3(a) and 3(b) are views showing the structure of image data and metadata according to an embodiment of the present invention.
- the video data and the metadata include link information and a data body.
- the data body is the data of the supervised video.
- the data body includes information indicating the object to be monitored and attribute information defining a descriptive method of information indicating the object to be monitored.
- the link information includes related information indicating the relation between the video data and the meta data, and attribute information defining a description method of the contents of the related information.
- as the related information, for example, a time stamp or a sequence number specifying the video data is used.
- the time stamp means information (time information) indicating the generation time of the video data.
- the sequence number means information (sequence information) indicating the generation order of the content data.
- the related information may include information (e.g., a manufacturer name, model name, manufacturing number, etc.) for specifying the image acquisition apparatus.
- a markup language defined as a method of describing information exchanged on the Web may be used. Using a markup language in this way makes exchange of information over the network easy. Further, when XML, a markup language used for exchanging documents and electronic data, is employed, exchange of image data and metadata is facilitated. When XML is used, XML Schema, for example, serves as the attribute information.
- the data body of the metadata includes image acquisition environment information such as the position and angle of the image acquisition device, object detection information such as the ID and characteristics of the detected object, whether a monitoring event has occurred, and the like.
- FIG. 4 is a diagram illustrating a structure of a characteristic information table according to an embodiment of the present invention.
- the characteristic information table 400 includes a characteristic value field 410, a detection frequency field 420, a frame number field 430 and an identification information field 440 of frames.
- the property value field 410 stores the property value of the object.
- a1 is determined by the following Equation (1).
- a2 is determined by the following Equation (2).
- a3 is determined by the following Equation (3).
- c is an RGB value of the object, and m is a natural number.
- m is a factor that controls the precision of object retrieval: the smaller m is, the more precisely the user can retrieve detected objects by characteristic value; the larger m is, the more coarsely detected objects are grouped by characteristic value.
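Equations (1) through (3) are not reproduced in this text, so the exact mapping from c and m to a1, a2, a3 is unknown. One plausible reading, offered purely as an illustration of the precision trade-off described above, is that each color channel is binned by integer division by m:

```python
def quantize_channel(c, m):
    """Hypothetical quantization of one RGB channel value c (0-255) into a
    bin of width m. This is NOT the patent's Equation (1)-(3), which are
    elided from this text; it only illustrates how a natural number m can
    trade retrieval precision against grouping."""
    return c // m

# Smaller m -> more bins -> finer-grained retrieval by characteristic value.
fine = quantize_channel(200, 1)    # every value is its own bin
coarse = quantize_channel(200, 64) # 256 values collapse into 4 bins
```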
- the detection count field 420 stores the number of times objects having the characteristic value have been detected. When a plurality of such objects are detected in one frame, each of them is counted separately in the detection count.
- the frame number field 430 stores the number of frames in which an object having the characteristic value is detected. When a plurality of such objects are detected in one frame, that frame is counted only once in the frame count.
- the identification information field 440 of frames denotes identification information of frames in which an object having the property value is detected.
- the field stores the identification information of the first frame, the identification information of the second frame, ..., and the identification information of the n-th frame in which the object is detected (n is a natural number).
- the monitoring control unit 120 can determine the characteristic value of an object detected in the surveillance image and the detection frequency of the object having the characteristic value. For example, the monitoring control unit 120 can determine the detection frequency of a characteristic value by referring to the detection frequency field 420 in the characteristic information table 400. Also, the monitoring control unit 120 can search the surveillance image for the frames in which an object having the selected characteristic value is detected. For example, the monitoring control unit 120 may refer to the frame identification information field 440 in the characteristic information table 400 to find the frames in which an object having the selected characteristic value is detected.
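The two lookups just described (count by value via field 420, frames by value via field 440) can be sketched against an in-memory form of table 400. The dictionary layout and key names here are assumptions that mirror the field descriptions, not the patent's storage format.

```python
# Illustrative in-memory form of the characteristic information table 400.
# Keys mirror the described fields: detection count (420), frame count (430),
# and frame identification information (440).
table = {
    "red-small": {"count": 50, "frame_count": 48, "frames": [3, 7, 12]},
    "blue-large": {"count": 100, "frame_count": 95, "frames": [3, 20]},
}

def detection_count(table, value):
    """Lookup corresponding to the detection frequency field 420."""
    return table[value]["count"]

def frames_with(table, value):
    """Lookup corresponding to the frame identification field 440:
    frames in which an object having the selected value was detected."""
    return table[value]["frames"]
```

Note that `frame_count` can be smaller than `count` because, per the field descriptions, multiple objects in one frame each increment the detection count while the frame itself is counted once.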
- FIG. 5 is a flowchart illustrating an object search process according to an exemplary embodiment of the present invention.
- in step S300, the monitoring control unit 120 searches the surveillance image stored in the storage unit 130 for frames in which an object having a characteristic value selected from the characteristic values displayed in the first area is detected.
- the monitoring control unit 120 receives an input from the user through the user input unit 140 selecting one or more of the characteristic values displayed in the first area.
- the monitoring control unit 120 reads identification information of a frame in which an object having a property value selected from the metadata stored in the storage unit 130 is detected.
- the monitoring control unit 120 displays the searched frame in the second area through the display unit (S400).
- the monitoring control unit 120 extracts a frame corresponding to the identification information of the frame read from the metadata from the video data stored in the storage unit 130.
- the monitoring control unit 120 may display the extracted frame in the second area through the display unit.
- FIG. 6 is a diagram illustrating an object search screen according to an embodiment of the present invention.
- the object search screen 500 includes a detection frequency display area 510 and a frame display area 520.
- in the detection frequency display area 510, the characteristic value of an object detected in the surveillance image and the detection frequency of the object having that characteristic value are displayed.
- in the frame display area 520, frames in which an object having a characteristic value selected from among the characteristic values displayed in the detection frequency display area 510 is detected are displayed.
- when a specific characteristic value item 512 is selected from among the characteristic value items displayed in the detection frequency display area 510, frames in which an object having the characteristic value corresponding to the selected item 512 is detected are displayed in the frame display area 520.
- when a specific frame 522 is selected from the frames displayed in the frame display area 520, an image related to the selected frame is displayed in the area where the selected frame was displayed.
- the monitoring control unit 120 displays a characteristic value for an object detected in the supervisory image and a detection frequency of the object having the characteristic value through the display unit.
- the monitoring control unit 120 separately displays the detection frequency of the object having the input characteristic value (S114).
- the monitoring control unit 120 can separately display the detection frequencies of objects having each of the plurality of input characteristic values.
- FIGS. 8 and 9 are diagrams showing object search screens according to an embodiment of the present invention.
- the object search screen 600 includes a characteristic value input area 610 and a detection frequency display area 620.
- the property value input area 610 is provided with an interface for inputting a property value of an object to be searched.
- the monitoring image can be displayed in the characteristic value input area 610.
- in the detection frequency display area 620, the characteristic values of objects detected in the surveillance image and the detection frequency of the object having each characteristic value are displayed. In this case, the detection frequency corresponding to the characteristic value input in the characteristic value input area 610 is displayed separately in the detection frequency display area 620.
- the detection frequency item 622 corresponding to the characteristic value 614 of the selected object is displayed separately in the detection frequency display area 620.
- the detection frequency item 622 may be distinguished from other detection frequency items by shape or color.
- a user can directly input a plurality of characteristic values by referring to an object 612 displayed in the surveillance image in the characteristic value input area 610.
- the user may refer to the object 612 to generate an item 616 representing the first characteristic value of the object and an item 618 representing the second characteristic value.
- a detection frequency item 624 corresponding to the characteristic value between the first characteristic value and the second characteristic value is displayed in the detection frequency display region 620 so as to be distinguished from other detection frequency items.
- FIG. 10 is a flowchart illustrating an object search process according to an embodiment of the present invention.
- the monitoring control unit 120 receives characteristic values of an object to be searched in the supervisory image and time information to be searched (S122). For example, the monitoring control unit 120 may receive the property value of the object to be searched in the supervisory image and the time information to be searched through the user input unit 140.
- the monitoring control unit 120 displays the property value of the object detected in the supervisory image and the detection frequency of the object having the property value through the output unit 150 based on the input time information.
- the monitoring control unit 120 separately displays the detection frequency of the object having the input characteristic value (S124).
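Steps S122-S124 amount to computing detection frequencies restricted to the input time window. A minimal sketch, assuming per-detection timestamps are available (the text only says frequency is shown "based on the input time information"; the `(timestamp, value)` pair representation is an assumption):

```python
def frequency_in_window(detections, start, end):
    """Count detections per characteristic value within [start, end].

    `detections` is an assumed list of (timestamp, characteristic_value)
    pairs; detections outside the input start/end times are ignored, so
    only the windowed counts are displayed."""
    counts = {}
    for ts, value in detections:
        if start <= ts <= end:
            counts[value] = counts.get(value, 0) + 1
    return counts

dets = [(5, "a"), (15, "a"), (25, "b"), (35, "a")]
windowed = frequency_in_window(dets, 10, 30)  # only ts 15 and 25 qualify
```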
- FIG. 11 is a view showing an object search screen according to an embodiment of the present invention.
- the characteristic value input area 710 further includes an item 716 for entering time information.
- the time information may include a start time and an end time.
- in the detection frequency display area 720, the characteristic value of an object detected in the surveillance image and the detection frequency, between the input start time and end time, of the object having the characteristic value are displayed.
- the detection frequency item 722 corresponding to the characteristic value 714 of the selected object is displayed in the detection frequency display area 720 so as to be distinguished from the other detection frequency items.
- FIG. 12 is a flowchart illustrating an object search process according to an embodiment of the present invention.
- FIG. 13 is a diagram illustrating an object search screen according to an embodiment of the present invention.
- the object search screen 800 includes a characteristic value input area 810 and a detection frequency display area 820.
- The characteristic value input area 810 provides an interface for inputting a characteristic value of an object to be searched.
- the monitoring image can be displayed in the characteristic value input area 810.
- in the detection frequency display area 820, the characteristic value of an object detected in the surveillance image and the detection frequency of the object having the characteristic value are displayed. In this case, the detection frequency corresponding to the characteristic value input in the characteristic value input area 810 is displayed separately in the detection frequency display area 820.
Abstract
Description
Claims (15)
- A method for searching for an object in a video surveillance apparatus, the method comprising: displaying, in a first area, a characteristic value of an object detected in a surveillance image and a detection frequency of an object having the characteristic value; searching the surveillance image for a frame in which an object having a characteristic value selected from the characteristic values displayed in the first area is detected; and displaying the searched frame in a second area.
- The method of claim 1, further comprising, before the displaying in the first area, receiving as input a characteristic value of an object to be searched for in the surveillance image, wherein the displaying in the first area comprises separately displaying, in the first area, the detection frequency of an object having the input characteristic value.
- The method of claim 2, further comprising, before the displaying in the first area, receiving as input time information to be searched in the surveillance image, wherein the displaying in the first area comprises separately displaying, in the first area, the detection frequency of an object having the input characteristic value based on the input time information.
- The method of claim 2, further comprising, before the displaying in the first area, receiving as input area information to be searched in the surveillance image, wherein the displaying in the first area comprises separately displaying, in the first area, the detection frequency of an object having the input characteristic value based on the input area information.
- The method of claim 2, wherein the input characteristic value includes a first characteristic value and a second characteristic value of an object to be searched for in the surveillance image, and the displaying in the first area comprises separately displaying, in the first area, the detection frequency of an object having a value between the first characteristic value and the second characteristic value.
- The method of claim 2, further comprising, before the receiving as input, displaying the surveillance image in a third area, wherein the receiving as input comprises calculating a characteristic value of an object selected from among objects detected in the third area.
- The method of claim 1, further comprising, before the displaying in the first area, storing the detection frequency for characteristic values of objects detected in the surveillance image.
- The method of claim 1, further comprising, before the displaying in the first area: determining the number of times an object having the characteristic value is detected in the surveillance image; and generating the detection frequency by normalizing the determined detection count.
- The method of claim 1, wherein the characteristic value includes at least one of a size, a speed, and a color of an object detected in the surveillance image.
- The method of claim 1, wherein the displaying in the first area comprises displaying the characteristic value and the detection frequency in the first area as a graph.
- The method of claim 1, wherein the displaying in the second area comprises displaying a frame selected from the searched frames.
- The method of claim 1, wherein the frame displayed in the second area is a thumbnail image.
- The method of claim 12, further comprising displaying an image related to a thumbnail image selected from among the thumbnail images.
- A video surveillance apparatus comprising: an output unit configured to display, in a first area, a characteristic value of an object detected in a surveillance image and a detection frequency of an object having the characteristic value; and a control unit configured to search the surveillance image for a frame in which an object having a characteristic value selected from the characteristic values displayed in the first area is detected, wherein the output unit displays the searched frame in a second area.
- The apparatus of claim 14, further comprising an input unit configured to receive a characteristic value of an object to be searched for in the surveillance image, wherein the output unit separately displays, in the first area, the detection frequency of an object having the input characteristic value.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP10858431.9A EP2629516B1 (en) | 2010-10-11 | 2010-10-11 | Image-monitoring device and method for searching for objects therefor |
PCT/KR2010/006929 WO2012050244A1 (ko) | 2010-10-11 | 2010-10-11 | 영상 감시 장치 및 그 오브젝트 검색 방법 |
CN201080069558.5A CN103155550B (zh) | 2010-10-11 | 2010-10-11 | 图像监视装置和用于其的搜索对象的方法 |
US13/878,554 US20130201333A1 (en) | 2010-10-11 | 2010-10-11 | Image-monitoring device and method for ssearching for objects therefor |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/KR2010/006929 WO2012050244A1 (ko) | 2010-10-11 | 2010-10-11 | 영상 감시 장치 및 그 오브젝트 검색 방법 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2012050244A1 true WO2012050244A1 (ko) | 2012-04-19 |
Family
ID=45938448
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/KR2010/006929 WO2012050244A1 (ko) | 2010-10-11 | 2010-10-11 | 영상 감시 장치 및 그 오브젝트 검색 방법 |
Country Status (4)
Country | Link |
---|---|
US (1) | US20130201333A1 (ko) |
EP (1) | EP2629516B1 (ko) |
CN (1) | CN103155550B (ko) |
WO (1) | WO2012050244A1 (ko) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2018124510A1 (ko) * | 2016-12-27 | 2018-07-05 | 한화테크윈 주식회사 | 이벤트 저장 장치, 이벤트 검색 장치, 및 이벤트 알람 장치 |
Families Citing this family (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11039108B2 (en) * | 2013-03-15 | 2021-06-15 | James Carey | Video identification and analytical recognition system |
CN108197025B (zh) * | 2017-12-29 | 2021-04-09 | 大陆汽车车身电子系统(芜湖)有限公司 | 一种仪表压力测试系统及仪表压力测试方法 |
CN108540195B (zh) * | 2018-03-05 | 2019-01-08 | 杭州掌门物联科技有限公司 | 狭窄空间网络中继系统及方法 |
CN110324528A (zh) * | 2018-03-28 | 2019-10-11 | 富泰华工业(深圳)有限公司 | 摄像装置、影像处理系统及方法 |
CN111666453B (zh) * | 2019-03-07 | 2024-01-02 | 杭州海康威视数字技术股份有限公司 | 一种视频管理、检索方法、装置、电子设备及存储介质 |
CN114792461B (zh) * | 2021-01-25 | 2024-01-16 | 广州汽车集团股份有限公司 | 一种车内求救信号发送的方法及装置 |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20070030555A (ko) * | 2005-09-13 | 2007-03-16 | 엘지전자 주식회사 | 디지털 비디오 레코더에서의 검색 제어방법 |
KR20070038227A (ko) * | 2005-10-05 | 2007-04-10 | 엘지전자 주식회사 | 디지털 비디오 레코더에서의 오브젝트 검출 연결 녹화제어방법 |
KR20100033863A (ko) * | 2008-09-22 | 2010-03-31 | 엘지전자 주식회사 | 감시 시스템 및 감시 시스템의 동작 방법 |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3826598B2 (ja) * | 1999-01-29 | 2006-09-27 | 株式会社日立製作所 | 画像監視装置及び記録媒体 |
US7624337B2 (en) * | 2000-07-24 | 2009-11-24 | Vmark, Inc. | System and method for indexing, searching, identifying, and editing portions of electronic multimedia files |
US20090040303A1 (en) * | 2005-04-29 | 2009-02-12 | Chubb International Holdings Limited | Automatic video quality monitoring for surveillance cameras |
US8630497B2 (en) * | 2007-11-27 | 2014-01-14 | Intelliview Technologies Inc. | Analyzing a segment of video |
CN101459828A (zh) * | 2007-12-10 | 2009-06-17 | Zhang Xuebin | Image surveillance system based on object retrieval and recognition |
CN101770644A (zh) * | 2010-01-19 | 2010-07-07 | Zhejiang Forestry University | Smoke and flame recognition method for remote video surveillance of forest fires |
- 2010
- 2010-10-11 CN CN201080069558.5A patent/CN103155550B/zh not_active Expired - Fee Related
- 2010-10-11 EP EP10858431.9A patent/EP2629516B1/en not_active Not-in-force
- 2010-10-11 WO PCT/KR2010/006929 patent/WO2012050244A1/ko active Application Filing
- 2010-10-11 US US13/878,554 patent/US20130201333A1/en not_active Abandoned
Non-Patent Citations (1)
Title |
---|
See also references of EP2629516A4 * |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2018124510A1 (ko) * | 2016-12-27 | 2018-07-05 | Hanwha Techwin Co., Ltd. | Event storage device, event search device, and event alarm device |
KR20180075983A (ko) * | 2016-12-27 | 2018-07-05 | Hanwha Aerospace Co., Ltd. | Imaging device with a variable event detection condition |
CN110089104A (zh) * | 2016-12-27 | 2019-08-02 | Hanwha Techwin Co., Ltd. | Event storage device, event search device, and event alarm device |
CN110089104B (zh) * | 2016-12-27 | 2022-02-11 | Hanwha Techwin Co., Ltd. | Event storage device, event search device, and event alarm device |
US11308777B2 (en) | 2016-12-27 | 2022-04-19 | Hanwha Techwin Co., Ltd. | Image capturing apparatus with variable event detecting condition |
KR102674150B1 (ko) * | 2016-12-27 | 2024-06-10 | Hanwha Vision Co., Ltd. | Imaging device with a variable event detection condition |
Also Published As
Publication number | Publication date |
---|---|
CN103155550B (zh) | 2016-10-26 |
EP2629516B1 (en) | 2017-03-22 |
US20130201333A1 (en) | 2013-08-08 |
CN103155550A (zh) | 2013-06-12 |
EP2629516A4 (en) | 2014-04-16 |
EP2629516A1 (en) | 2013-08-21 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2012046899A1 (ko) | Video surveillance apparatus and event detection method thereof | |
WO2012050244A1 (ko) | Video surveillance apparatus and object search method thereof | |
KR100896949B1 (ko) | Image processing surveillance system and method capable of object identification | |
JP4847165B2 (ja) | Video recording and playback method and apparatus | |
WO2018124510A1 (ko) | Event storage device, event search device, and event alarm device | |
CN103209316B (zh) | Video surveillance system | |
WO2018066742A1 (ko) | Image providing apparatus and method | |
WO2014193065A1 (en) | Video search apparatus and method | |
WO2021167374A1 (ko) | Video search device and network surveillance camera system including the same | |
WO2012137994A1 (ko) | Image recognition apparatus and video surveillance method thereof | |
KR100872878B1 (ko) | Event detection imaging system for a surveillance camera | |
WO2018097384A1 (ko) | Crowd density notification apparatus and method | |
WO2021172943A1 (ko) | Video search device and network surveillance camera system including the same | |
JP2002152721A (ja) | Video display method and apparatus for a recording and playback device | |
KR100718521B1 (ko) | Video output system and computer-readable recording medium storing a control program for outputting video | |
JPH11252534A (ja) | Camera system | |
JP2002133540A (ja) | Security monitoring device | |
JP2001224011A (ja) | Abnormality monitoring method and apparatus | |
KR100472546B1 (ko) | Security recording system and control method thereof | |
JP2004064438A (ja) | Monitoring system and monitoring method | |
US20180063470A1 (en) | Monitoring camera system and reproduction method | |
JPH09198577A (ja) | Multimedia security system | |
JP2001126157A (ja) | Security system using video search | |
KR200238157Y1 (ко) | Event history recording device for a closed-circuit television system | |
JP5999879B2 (ja) | Monitoring device and program |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase | Ref document number: 201080069558.5; Country of ref document: CN |
121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 10858431; Country of ref document: EP; Kind code of ref document: A1 |
WWE | Wipo information: entry into national phase | Ref document number: 13878554; Country of ref document: US |
NENP | Non-entry into the national phase | Ref country code: DE |
REEP | Request for entry into the european phase | Ref document number: 2010858431; Country of ref document: EP |
WWE | Wipo information: entry into national phase | Ref document number: 2010858431; Country of ref document: EP |