WO2014006859A1 - Position management device, position management system, position management method, and position management program - Google Patents
Position management device, position management system, position management method, and position management program
- Publication number
- WO2014006859A1 (PCT/JP2013/004030)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- position information
- time
- monitoring object
- value
- photographing
- Prior art date
Classifications
- G06T7/292 — Physics; Computing; Image data processing or generation, in general; Image analysis; Analysis of motion; Multi-camera tracking
- H04N7/188 — Electricity; Electric communication technique; Pictorial communication, e.g. television; Television systems; Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast; Capturing isolated or intermittent images triggered by the occurrence of a predetermined event, e.g. an object reaching a predetermined position
- G06T2207/10016 — Indexing scheme for image analysis or image enhancement; Image acquisition modality; Video; Image sequence
- G06T2207/10024 — Indexing scheme for image analysis or image enhancement; Image acquisition modality; Color image
- G06T2207/30232 — Indexing scheme for image analysis or image enhancement; Subject of image; Context of image processing; Surveillance
Definitions
- the present invention relates to a position management device for calculating position information of a monitoring object from an image taken by a monitoring camera.
- Patent Document 1 discloses an apparatus that performs data processing on moving images captured by an infrared camera and a PTZ (Pan Tilt Zoom) camera, identifies a person walking with a lit cigarette as the monitoring target, and tracks the identified person with the PTZ camera.
- Patent Document 2 discloses an apparatus that, even when a detection error occurs in the plural pieces of position information of the same person photographed by a plurality of monitoring cameras, accurately integrates the position information and detects the position of the person.
- Patent Document 3 discloses an apparatus that determines that an object i and an object j are the same when the difference between the coordinates of the center of gravity of the object i at time k and the coordinates of the center of gravity of the object j at time (k+1) is less than a predetermined reference value.
- In such systems, the position of the monitoring object is calculated using image data obtained by photographing the same monitoring object with a plurality of monitoring cameras. If any one of the monitoring cameras produces an abnormal output at that time, there is a problem that the accuracy of integrating the position information output by the plurality of monitoring cameras may be greatly reduced.
- the main object of the present invention is to provide a position management device, a position management system, a position management method, and a position management program that solve the above-mentioned problems.
- The position management device of the present invention assigns monitoring object identification information to a monitoring object periodically photographed by a photographing device, calculates position information on the monitoring object from the photographed image, and stores it in a storage area in association with the identification information and the photographing time.
- The device compares, against a predetermined reference value, the absolute values of the differences between the current position information value and the stored position information values at the photographing times one cycle and two cycles before the current time. Even if the absolute value of the difference for the photographing time one cycle before is greater than or equal to the reference value, the current position information is output when the absolute value of the difference for the photographing time two cycles before is less than the reference value.
- The position management method of the present invention assigns monitoring object identification information to a monitoring object periodically imaged by an imaging device, calculates position information on the monitoring object from the captured image, and stores the position information in a storage area in association with the monitoring object identification information and the imaging time. The method then compares, against a predetermined reference value, the absolute values of the differences between the current position information value for the monitoring object and the position information values stored for the imaging times one cycle and two cycles before. Even if the absolute value of the difference for the imaging time one cycle before is greater than or equal to the reference value, the current position information is output if the absolute value of the difference for the imaging time two cycles before is less than the reference value.
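As a minimal sketch (not code from the patent itself), the acceptance rule above can be written as follows, assuming scalar position values; the claimed method applies the same comparison per coordinate:

```python
def accept_current_position(current, prev1, prev2, reference):
    """Accept (output) the current position value when its absolute
    difference from the value one cycle ago OR two cycles ago is
    below the predetermined reference value."""
    diff_one_cycle = abs(current - prev1)
    diff_two_cycles = abs(current - prev2)
    # Even if the one-cycle difference is >= the reference value, the
    # current value is still output when the two-cycle difference is
    # below it.
    return diff_one_cycle < reference or diff_two_cycles < reference
```

For example, a momentary glitch in the stored value at time (t-1) does not cause the value at time t to be discarded, because the comparison against time (t-2) can still succeed.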
- The position management program of the present invention causes a computer to execute: a calculation process of assigning monitoring object identification information to a monitoring object periodically photographed by a photographing device and calculating position information on the monitoring object from the photographed image; a storage process of storing the position information in a storage area in association with the monitoring object identification information and the photographing time; and a comparison process of comparing, against a predetermined reference value, the absolute values of the differences between the current position information value for the monitoring object and the position information values stored for the photographing times one cycle and two cycles before the current time, and outputting the current position information when, even if the absolute value of the difference for the photographing time one cycle before is greater than or equal to the reference value, the absolute value of the difference for the photographing time two cycles before is less than the reference value.
- Another aspect of the present invention can be realized by a computer-readable non-volatile storage medium storing the position management program (computer program).
- the present invention makes it possible to calculate position information related to a monitoring object with high accuracy from an image taken by a monitoring camera.
- FIG. 1 is a block diagram showing the configuration of the location management system according to the first embodiment of the present invention.
- the position management system 1 of the present embodiment includes a position management device 10 and three photographing devices 20-1 to 20-3.
- the photographing devices 20-1 to 20-3 are devices for photographing images, and generally correspond to a color camera, a monochrome camera, a thermo camera, and the like.
- Data formats of images captured by the imaging devices 20-1 to 20-3 are BMP (Bit MaP) and JPEG (Joint Photographic Experts Group), which are image formats generally used during image analysis processing.
- the photographing devices 20-1 to 20-3 photograph the monitoring target object 2 from different directions periodically, for example, every second, and transmit the photographed image data to the position management device 10.
- the monitoring target object 2 in this embodiment is a person, for example.
- the location management device 10 includes calculation units 100-1 to 100-3, comparison units 101-1 to 101-3, a specifying unit 102, and a storage unit 103.
- the calculation units 100-1 to 100-3 give identification information for identifying the monitoring target 2 recorded in the image data received from the imaging devices 20-1 to 20-3, respectively.
- The calculation units 100-1 to 100-3 calculate the position information of the monitoring target 2 for each shooting time based on the image data received from the imaging devices 20-1 to 20-3, and transmit the calculation results to the comparison units 101-1 to 101-3, respectively.
- the calculation units 100-1 to 100-3 also transmit the above calculation results to the storage unit 103.
- calculation units 100-1 to 100-3 and the comparison units 101-1 to 101-3 may be bundled to form one calculation unit and one comparison unit.
- the bundled calculation unit and comparison unit perform the above-described processing on the three image data received from the imaging devices 20-1 to 20-3.
- The storage unit 103 stores the position information on the monitoring target 2 received from the calculation units 100-1 to 100-3 as the position calculation result 104, in association with identification information for identifying the imaging devices 20-1 to 20-3, identification information for identifying the monitoring target 2, and the shooting time.
- A configuration example of the position calculation result 104 is shown in FIG. 4.
- the imaging device ID, the object ID, the time, and the coordinates of the object captured by the imaging device at each time are associated.
- the photographing device ID is identification information for identifying the photographing devices 20-1 to 20-3, and the photographing device IDs of the photographing devices 20-1 to 20-3 are 001 to 003, respectively.
- the object ID is identification information given to the monitoring object 2 by the calculation units 100-1 to 100-3, and the object ID of the monitoring object 2 is 01a. Although not shown in FIG. 1, the object ID of 01b is assigned to the other monitoring objects other than the monitoring object 2 by the calculation units 100-1 to 100-3.
- The current time is t, and the position information on the monitoring object 2 calculated by the calculation units 100-1 to 100-3 from the image data obtained by the imaging devices 20-1 to 20-3 photographing the monitoring object 2 at time t is the coordinate A.
- The position information on the monitoring object 2 in the present embodiment treats the target area imaged by the imaging devices 20-1 to 20-3 as a virtual space, and is expressed in two-dimensional coordinates with the horizontal axis of that virtual space as the X coordinate and the vertical axis as the Y coordinate.
- The coordinates A calculated from the image data photographed by the photographing devices 20-1 to 20-3 are (x1, y1), (x1, y1), and (x1', y1'), respectively.
- The coordinates A (x1', y1') calculated from the image data photographed by the photographing device 20-3 differ from the coordinates A (x1, y1) calculated from the image data photographed by the photographing devices 20-1 and 20-2 because the photographing device 20-3 abnormally output image data due to a problem such as image disturbance at time t.
- the position calculation result 104 includes the above-described data recorded at periodic shooting times.
- The position information of the monitoring object 2 calculated by the calculation units 100-1 to 100-3 from the image data captured by the imaging devices 20-1 to 20-3 at the time (t-1) one cycle before time t, for example one second before, is the coordinate B.
- The position information of the monitoring object 2 calculated by the calculation units 100-1 to 100-3 from the image data captured by the imaging devices 20-1 to 20-3 at the time (t-2) two cycles before time t, for example two seconds before, is the coordinate C.
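A hypothetical in-memory rendering of the position calculation result 104 (FIG. 4) might look like the following; the field names and sample values are illustrative only, not taken from the patent:

```python
# Each row associates an imaging device ID, an object ID, a shooting
# time, and the coordinates calculated for that device at that time.
position_calculation_result = [
    # (device_id, object_id, time, (x, y))
    ("001", "01a", 10, (3.0, 4.0)),   # coordinate A from device 20-1
    ("002", "01a", 10, (3.0, 4.0)),   # coordinate A from device 20-2
    ("003", "01a", 10, (9.5, 1.2)),   # abnormal output, cf. device 20-3
]

def coords_at(result, device_id, object_id, time):
    """Look up the calculated coordinates for one device/object/time."""
    for dev, obj, t, xy in result:
        if (dev, obj, t) == (device_id, object_id, time):
            return xy
    return None
```

The comparison units would query such a table for the rows at times (t-1) and (t-2) when validating the coordinates at time t.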
- The comparison units 101-1 to 101-3 receive the position information of the monitoring object 2 for the imaging devices 20-1 to 20-3 at time t from the calculation units 100-1 to 100-3, respectively, and determine the validity of the received position information. Details of the validity determination operation of the comparison units 101-1 to 101-3 will be described later.
- The comparison units 101-1 to 101-3 transmit the received position information of the monitored object 2 at time t to the specifying unit 102 only when they determine it to be valid. For example, when the comparison unit 101-3 determines that the position information calculated from the image data captured by the imaging device 20-3 is invalid, the specifying unit 102 receives only the position information calculated from the image data captured by the imaging devices 20-1 and 20-2, from the comparison units 101-1 and 101-2.
- the specifying unit 102 specifies the position of the monitoring object 2 at time t based on the position information received from the comparison units 101-1 to 101-3.
- As a method of specifying the position, the specifying unit 102 may, for example, calculate the average value of the plurality of pieces of position information, or regard pieces of position information within a predetermined error of one another as the same value and specify the value by majority vote.
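The two specification strategies mentioned above can be sketched as follows (illustrative code under the assumption of 2D coordinate tuples, not code from the patent):

```python
def specify_by_average(positions):
    """Specify the position as the average of the candidate coordinates
    received from the comparison units."""
    xs = [x for x, _ in positions]
    ys = [y for _, y in positions]
    return (sum(xs) / len(xs), sum(ys) / len(ys))

def specify_by_majority(positions, error):
    """Regard candidates within `error` of one another (per axis) as the
    same value and pick the candidate supported by the most cameras."""
    best, best_count = None, 0
    for px, py in positions:
        count = sum(1 for qx, qy in positions
                    if abs(px - qx) <= error and abs(py - qy) <= error)
        if count > best_count:
            best, best_count = (px, py), count
    return best
```

With three cameras, the majority-vote variant lets two agreeing cameras outvote one outlier that slipped past the per-camera validity check.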
- the specifying unit 102 transmits a specifying result related to the position of the monitoring object 2 to the storage unit 103.
- the storage unit 103 stores the position specifying result of the monitoring target 2 received from the specifying unit 102 as the position specifying result 105 in association with the identification information for identifying the monitoring target 2 and the photographing time.
- FIG. 5 shows a configuration example of the position specifying result 105.
- the object ID, the time, and the coordinates of the object specified by the specifying unit 102 at each time are associated with each other.
- The specified coordinates A, B, and C at times t, (t-1), and (t-2) for the monitoring object 2 with the object ID 01a are (x4, y4), (x5, y5), and (x6, y6), respectively.
- When the comparison unit 101-i receives, from the calculation unit 100-i, the position information of the monitoring target 2 calculated from the image data captured by the imaging device 20-i at time t, it extracts the position information of the monitoring object 2 for the imaging device 20-i at times (t-1) and (t-2) from the position calculation result 104 and the position specifying result 105.
- The comparison unit 101-i then calculates, for the position information of the monitoring target 2 for the imaging device 20-i, the absolute values of the differences between the value at time t and the values at times (t-1) and (t-2).
- the comparison unit 101-i compares the absolute value of the above difference with a predetermined reference value.
- The reference value α for the X coordinate and the reference value β for the Y coordinate are set in the comparison unit 101-i by the administrator of the position management system 1.
- The administrator of the position management system 1 calculates the values of α and β from the movement speed of the person monitored by the position management system 1, based on the distance the person may move during one cycle time.
- The comparison unit 101-i compares α and β against the absolute values of the differences between the value at time t and the values at times (t-1) and (t-2) of the position information of the monitored object 2. The values of α and β therefore represent the distance a person may move within up to two cycle times, and are set by the administrator of the location management system 1. For example, when one cycle time is one second, the values of α and β are on the order of several meters.
- Instead of using separate reference values for the X coordinate and the Y coordinate, there is also a method in which the comparison unit 101-i calculates the distance traveled by the monitored object 2 from the coordinate values before and after the movement and uses a single reference value indicating the moving distance.
- The comparison unit 101-i compares the absolute values of the differences with the reference values α and β, and when the absolute value of the difference for the X coordinate is less than α and the absolute value of the difference for the Y coordinate is less than β, it first determines that there is no problem with the coordinate value of the monitored object 2.
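A sketch of this first-stage per-axis check follows; the reference values (rendered here as alpha and beta) are hypothetical placeholders, where the real values would be derived by the administrator from the walking speed as described above:

```python
ALPHA = 2.0  # hypothetical X-coordinate reference value (metres)
BETA = 2.0   # hypothetical Y-coordinate reference value (metres)

def no_problem(curr, prev, alpha=ALPHA, beta=BETA):
    """Return True when both the X difference and the Y difference
    between the current and previous coordinates are below their
    respective reference values."""
    dx = abs(curr[0] - prev[0])
    dy = abs(curr[1] - prev[1])
    return dx < alpha and dy < beta
```

The single-reference-value alternative mentioned above would instead compare the Euclidean distance between `curr` and `prev` with one moving-distance threshold.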
- FIG. 6 shows an example of a list of validity determination results of position information related to the imaging device 20-i in the comparison unit 101-i.
- the data extraction source, the time, the extracted data for each photographing device at the time, the calculation result in the comparison unit 101, and the determination result are associated with each other.
- The format of the data extracted from the imaging device and the position calculation result 104 is {imaging device ID, object ID, calculated coordinates of the object, imaging time}. For example, the data {001, 01a, (x1, y1), t} means that "the coordinates at time t of the monitoring object 2 with object ID 01a, calculated from a photographed image taken by the photographing device 20-1 whose photographing device ID is 001, are (x1, y1)".
- the comparison unit 101-1 determines that the coordinate value of the monitoring target 2 calculated from the captured image captured by the imaging device 20-1 at time t is OK.
- the comparison unit 101-2 similarly determines that the coordinate value of the monitoring target 2 calculated from the captured image captured by the imaging device 20-2 at time t is OK.
- The absolute values of the differences for the X coordinate and the Y coordinate calculated from the captured image of the imaging device 20-3 are larger than α and β, respectively. Accordingly, the comparison unit 101-3 determines that the coordinate value of the monitored object 2 at time t calculated from the captured image captured by the imaging device 20-3 is NG.
- Similarly, for the data extracted from the position specifying result 105, the comparison unit 101-i calculates the absolute values of the differences for the X coordinate and the Y coordinate at time t for the imaging device 20-i, and compares them with the reference values α and β.
- The format of the data extracted from the position specifying result 105 is {object ID, specified coordinates of the object, photographing time}. For example, the data {01a, (x5, y5), (t-1)} means that "the coordinates specified by the specifying unit 102 at time (t-1) for the monitoring target 2 with object ID 01a are (x5, y5)".
- the comparison unit 101-1 determines that the coordinate value of the monitoring target 2 calculated from the captured image captured by the imaging device 20-1 at time t is OK.
- the comparison unit 101-2 similarly determines that the coordinate value of the monitoring target 2 calculated from the captured image captured by the imaging device 20-2 at time t is OK.
- The X coordinate and the Y coordinate movement distances from time (t-1) to time t for the monitored object 2, calculated from the captured image captured by the imaging device 20-3, exceed the reference values, so the comparison unit 101-3 again determines that the coordinate value is NG.
- Based on the above comparison results, the comparison unit 101-i finally determines, according to a predetermined determination flow, whether or not the position information of the monitoring object 2 for the imaging device 20-i at time t is valid.
- the determination flow will be described in the description of the operation flow of the present embodiment.
- the photographing device 20-i transmits a photographed image obtained by photographing the monitoring object 2 at time t to the calculation unit 100-i (S102).
- The calculation unit 100-i assigns identification information for identifying the monitoring object 2, calculates position information on the monitoring object 2 from the captured image captured by the imaging device 20-i, and transmits the calculation result to the comparison unit 101-i and the storage unit 103 (S103).
- The storage unit 103 stores the position information on the monitoring object 2 received from the calculation unit 100-i at time t as the position calculation result 104, in association with the identification information for identifying the imaging device 20-i, the identification information for identifying the monitoring object 2, and the imaging time (S104).
- the comparison unit 101-i determines the validity of the position information received at time t received from the calculation unit 100-i (S105).
- The comparison unit 101-i extracts the position information at time (t-1) for the monitored object 2 from the position calculation result 104 in the storage unit 103, calculates the absolute value of its difference from the position information at time t received from the calculation unit 100-i, and compares it with the reference value (S201). If the absolute value of the difference is greater than or equal to the reference value (Yes in S202), the process branches to S203. If the absolute value of the difference is less than the reference value (No in S202), the process branches to S205.
- The comparison unit 101-i extracts the position information at time (t-2) for the monitored object 2 from the position calculation result 104 in the storage unit 103, calculates the absolute value of its difference from the position information at time t received from the calculation unit 100-i, and compares it with the reference value (S203). If the absolute value of the difference is greater than or equal to the reference value (Yes in S204), the process branches to S210. If the absolute value of the difference is less than the reference value (No in S204), the process branches to S205.
- The comparison unit 101-i extracts the position information at time (t-1) for the monitored object 2 specified by the specifying unit 102 from the position specifying result 105 in the storage unit 103, calculates the absolute value of its difference from the position information at time t received from the calculation unit 100-i, and compares it with the reference value (S205). If the absolute value of the difference is greater than or equal to the reference value (Yes in S206), the process branches to S207. If the absolute value of the difference is less than the reference value (No in S206), the process branches to S209.
- The comparison unit 101-i extracts the position information at time (t-2) for the monitoring target 2 specified by the specifying unit 102 from the position specifying result 105 in the storage unit 103, calculates the absolute value of its difference from the position information at time t received from the calculation unit 100-i, and compares it with the reference value (S207). If the absolute value of the difference is greater than or equal to the reference value (Yes in S208), the process branches to S210. If the absolute value of the difference is less than the reference value (No in S208), the process branches to S209.
- the comparison unit 101-i determines that the position information received from the calculation unit 100-i at time t is valid (S209), and the process returns to S105.
- the comparison unit 101-i determines that the position information received from the calculation unit 100-i at time t is invalid (S210), and the process returns to S105.
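Putting steps S201-S210 together, the determination flow can be sketched as follows (an illustrative sketch assuming scalar values; the actual flow compares the X and Y coordinates against their respective reference values):

```python
def is_valid(current, calc_prev1, calc_prev2, spec_prev1, spec_prev2, ref):
    """Validity determination flow S201-S210.

    calc_prev1/calc_prev2: values at (t-1)/(t-2) from the position
    calculation result 104; spec_prev1/spec_prev2: values at
    (t-1)/(t-2) from the position specifying result 105.
    """
    if abs(current - calc_prev1) >= ref:        # S201, Yes in S202
        if abs(current - calc_prev2) >= ref:    # S203, Yes in S204
            return False                        # S210: invalid
    if abs(current - spec_prev1) >= ref:        # S205, Yes in S206
        if abs(current - spec_prev2) >= ref:    # S207, Yes in S208
            return False                        # S210: invalid
    return True                                 # S209: valid
```

Note that a single out-of-range comparison never invalidates the value on its own: both the (t-1) and (t-2) checks against the same table must fail before S210 is reached.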
- the comparison unit 101-i transmits the position information at time t to the specifying unit 102 (S107).
- the specifying unit 102 specifies the position of the monitoring object 2 based on the position information at time t received from the comparison units 101-1 to 101-3, and transmits the specifying result to the storage unit 103 (S110).
- The storage unit 103 stores the specifying result received from the specifying unit 102 as the position specifying result 105 in association with the identification information for identifying the monitoring object 2 and the photographing time (S111), and the entire process ends.
- This embodiment has an effect of calculating the position information related to the monitoring object with high accuracy from the images photographed by a plurality of photographing devices.
- This is because the comparison unit 101-i compares the differences between the position information of the monitoring target calculated from the images photographed by the respective photographing devices and the position information in the position calculation result 104 and the position specifying result 105, and discards the position information calculated from an image photographed by a photographing device when the comparison result does not satisfy the predetermined standard.
- When the monitored object is a walking person, the person cannot move at high speed, and there is an upper limit on the distance the person can move in unit time. If the moving distance of the monitored object, derived from the position information calculated from the image captured by a certain imaging device, exceeds this upper limit, there is a problem with the image captured by that imaging device, and there is a high possibility that the calculation unit has calculated incorrect position information.
- The storage unit 103 stores position information for each cycle time for the monitoring target, calculated from the images captured by the plurality of imaging devices. The comparison unit 101-i then discards defective position information based on the differences between the position information one and two cycle times before and the current position information, together with the movement upper limit described above. As a result, the specifying unit 102 can specify the position of the monitoring object with high accuracy.
- the comparison unit 101-i refers to not only the position calculation result 104 but also the position specification result 105 for the above-described comparison.
- Since the position calculation result 104 may include position information that is not very accurate, the comparison unit 101-i also refers to the position specifying result 105, which further increases the accuracy of the result specified by the specifying unit 102.
- FIG. 7 is a block diagram showing the configuration of the location management system according to the second embodiment of the present invention.
- the location management system 1 includes a location management device 10 and a photographing device 20-1.
- the location management device 10 includes a calculation unit 100-1, a comparison unit 101-1, and a storage unit 103.
- the calculation unit 100-1 assigns monitoring object identification information to the monitoring object 2 periodically imaged by the imaging device 20-1, and captures the position information related to the monitoring object 2 by imaging the monitoring object 2. Calculate from the image.
- the storage unit 103 stores the above-described position information as the position calculation result 104 in association with the identification information for identifying the monitoring object 2 and the photographing time.
- The comparison unit 101-1 compares, against a predetermined reference value, the absolute values of the differences between the current position information value of the monitoring object 2 and the position information values stored in the storage unit 103 for the shooting times one cycle and two cycles before the current time. The comparison unit 101-1 then outputs the current position information when the absolute value of the difference is less than the reference value at at least one of these shooting times.
- For the data extracted from the position calculation result 104, the comparison unit 101-1 calculates the absolute values of the differences for the X coordinate and the Y coordinate at time t for the imaging device 20-1, and compares them with the reference values α and β. The individual validity determination operations for the position information of the monitoring target 2 in the comparison unit 101-1 are the same as those in the first embodiment.
- the photographing device 20-1 transmits a photographed image obtained by photographing the monitoring object 2 at time t to the calculating unit 100-1 (S301).
- The calculation unit 100-1 assigns identification information for identifying the monitoring object 2, calculates position information on the monitoring object 2 from the captured image of the imaging device 20-1, and transmits the calculation result to the comparison unit 101-1 and the storage unit 103 (S302).
- The storage unit 103 stores the position information on the monitoring object 2 received from the calculation unit 100-1 at time t as the position calculation result 104, in association with the identification information for identifying the imaging device 20-1, the identification information for identifying the monitoring object 2, and the imaging time (S303).
- The comparison unit 101-1 extracts the position information at time (t-1) for the monitoring object 2 from the position calculation result 104 in the storage unit 103, calculates the absolute value of its difference from the position information at time t received from the calculation unit 100-1, and compares it with the reference value (S304). If the absolute value of the difference is less than the reference value, the comparison unit 101-1 outputs the position information at time t (S308) and the entire process ends; otherwise the process proceeds to S306.
- The comparison unit 101-1 extracts the position information at time (t-2) for the monitoring object 2 from the position calculation result 104 in the storage unit 103, calculates the absolute value of its difference from the position information at time t received from the calculation unit 100-1, and compares it with the reference value (S306). When the absolute value of the difference is greater than or equal to the reference value (Yes in S307), the comparison unit 101-1 discards the position information at time t (S309), and the entire process ends. When the absolute value of the difference is less than the reference value (No in S307), the comparison unit 101-1 outputs the position information at time t (S308), and the entire process ends.
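The second-embodiment flow (S304-S309) reduces to the following sketch, again assuming scalar values, with the per-coordinate comparison left out for brevity:

```python
def filter_position(current, prev1, prev2, ref):
    """Output the current position when it is within the reference value
    of the position one cycle ago (S304) or, failing that, two cycles
    ago (S306); otherwise discard it (S309)."""
    if abs(current - prev1) < ref:
        return current   # S308: output
    if abs(current - prev2) < ref:
        return current   # S308: output
    return None          # S309: discard
```

Unlike the first embodiment, only the position calculation result 104 is consulted here, since the single-camera configuration has no specifying unit.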
- This embodiment has an effect of calculating the position information related to the monitoring target with high accuracy from the image photographed by the photographing device.
- This is because the comparison unit 101-1 compares the differences between the position information on the monitoring target calculated from the image captured by the image capturing device and the position information in the position calculation result 104, and discards the position information calculated from the captured image when the comparison result does not satisfy the predetermined reference.
- As in the first embodiment, the position information output by the comparison unit 101-1 is used, for example, to specify the position of the monitoring target; by discarding position information with a large error, the comparison unit 101-1 avoids a decrease in the accuracy of position specification.
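The discard rule described above (accept a new position unless its differences from both the stored position one cycle back and the stored position two cycles back reach the reference value) can be sketched as follows. This is a minimal illustration of the comparison unit's logic, not the patent's implementation; the names `PositionFilter`, `reference_value`, and `submit` are assumptions introduced for this sketch.

```python
class PositionFilter:
    """Illustrative sketch of the comparison unit's outlier filter (S304-S309).

    Positions are simplified to scalar values; the patent's position
    information would typically be 2D coordinates.
    """

    def __init__(self, reference_value):
        self.reference_value = reference_value
        self.history = []  # accepted position values, oldest first

    def submit(self, position):
        """Return the position if accepted, or None if discarded."""
        # Absolute differences against the positions one and two cycles back.
        diffs = [abs(position - p) for p in self.history[-2:]]
        # Discard only when every available difference is at or above the
        # reference value; if either difference is small, the new position
        # is considered plausible and is output.
        if diffs and all(d >= self.reference_value for d in diffs):
            return None
        self.history.append(position)
        return position
```

For example, with `reference_value=5.0`, a sequence 10.0, 11.0 is accepted, a jump to 50.0 is discarded (both differences exceed 5.0), and a subsequent 12.0 is accepted again because it is close to the retained history.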
- <Hardware configuration example> In the above-described embodiments, each unit shown in FIGS. 1 and 7 can be regarded as a function (processing) unit (software module) of a software program. However, the division into the units shown in these drawings is a configuration adopted for convenience of explanation, and various configurations can be assumed for implementation. An example of the hardware environment in this case will be described with reference to FIG. 9.
- FIG. 9 is a diagram exemplarily illustrating the configuration of an information processing apparatus 900 (computer) that can execute the position management apparatus according to the exemplary embodiments of the present invention. That is, FIG. 9 shows the configuration of a computer (information processing apparatus) that can realize the apparatus shown in FIGS. 1 and 7, and represents a hardware environment in which each function of the above-described embodiments can be realized.
- The information processing apparatus 900 illustrated in FIG. 9 is a general computer in which a CPU 901 (Central Processing Unit), a ROM 902 (Read Only Memory), a RAM 903 (Random Access Memory), a hard disk 904 (storage device), a communication interface 905 (Interface, hereinafter "I/F"), a reader/writer 908 capable of reading and writing data stored in a storage medium 907 such as a CD-ROM (Compact Disc Read Only Memory), and an input/output interface 909 are connected via a bus 906 (communication line).
- FIG. 1 A block configuration diagram
- FIG. 2 A block configuration diagram
- FIG. 3 A block configuration diagram
- FIG. 8 A block configuration diagram
- The computer program can be supplied to such hardware by a procedure that is common at present, such as installing it in the apparatus via various storage media 907 such as a CD-ROM, or downloading it from the outside via a communication line such as the Internet.
- In such a case, the present invention can be regarded as being configured by the code constituting the computer program, or by the storage medium 907 storing that code.
Abstract
Description
FIG. 1 is a block diagram showing the configuration of the position management system according to the first exemplary embodiment of the present invention.
Next, the second exemplary embodiment of the present invention will be described in detail with reference to the drawings.
<Hardware configuration example>
In the above-described embodiments, each unit shown in FIGS. 1 and 7 can be regarded as a function (processing) unit (software module) of a software program. However, the division into the units shown in these drawings is a configuration adopted for convenience of explanation, and various configurations can be assumed for implementation. An example of the hardware environment in this case will be described with reference to FIG. 9.
2 Monitoring object
10 Position management device
100-1 to 100-3 Calculation unit
101-1 to 101-3 Comparison unit
102 Specification unit
103 Storage unit
104 Position calculation result
105 Position specification result
106 Determination result list
20-1 to 20-3 Imaging device
900 Information processing apparatus
901 CPU
902 ROM
903 RAM
904 Hard disk
905 Communication interface
906 Bus
907 Storage medium
908 Reader/writer
909 Input/output interface
Claims (7)
- A calculation means for assigning monitoring object identification information to a monitoring object periodically photographed by an imaging device, and for calculating position information regarding the monitoring object from a photographed image of the monitoring object;
a storage means for storing the position information in association with the monitoring object identification information and a photographing time; and
a comparison means for comparing, for the monitoring object, an absolute value of a difference between a current value of the position information and a value of the position information stored in the storage means at each of the photographing times one cycle before and two cycles before the present with a predetermined reference value, and for outputting the current position information when the absolute value of the difference at the photographing time two cycles before is less than the reference value, even if the absolute value of the difference at the photographing time one cycle before is equal to or greater than the reference value;
a position management device comprising the above. - A plurality of the calculation means, each receiving one of the photographed images photographed by a plurality of the imaging devices;
a plurality of the comparison means, each receiving the position information calculated by one of the plurality of the calculation means; and
a specification means for integrating the position information output from the plurality of the comparison means and outputting position specification information specifying a position of the monitoring object, further comprising:
wherein the storage means stores the position specification information in association with the monitoring object identification information and the photographing time, and
the comparison means compares, for the monitoring object, an absolute value of a difference between the current value of the position information and a value of the position specification information stored in the storage means at each of the photographing times one cycle before and two cycles before the present with a predetermined reference value, and outputs the current position information when the absolute value of the difference at the photographing time two cycles before is less than the reference value, even if the absolute value of the difference at the photographing time one cycle before is equal to or greater than the reference value.
The position management device according to claim 1. - A position management system including the position management device according to claim 1 or 2 and the imaging device.
- Assigning monitoring object identification information to a monitoring object periodically photographed by an imaging device, and calculating position information regarding the monitoring object from a photographed image of the monitoring object;
storing the position information in a storage area in association with the monitoring object identification information and a photographing time; and
comparing, for the monitoring object, an absolute value of a difference between a current value of the position information and a value of the position information stored in the storage area at each of the photographing times one cycle before and two cycles before the present with a predetermined reference value, and outputting the current position information when the absolute value of the difference at at least one of the photographing times is less than the reference value.
A position management method. - Inputting each of the photographed images photographed by a plurality of the imaging devices;
inputting each piece of the position information calculated from the photographed images photographed by the plurality of the imaging devices;
integrating the plural pieces of the position information and outputting position specification information specifying a position of the monitoring object;
storing the position specification information in the storage area in association with the monitoring object identification information and the photographing time; and
comparing, for the monitoring object, an absolute value of a difference between the current value of the position information and a value of the position specification information stored in the storage area at each of the photographing times one cycle before and two cycles before the present with a predetermined reference value, and outputting the current position information when the absolute value of the difference at the photographing time two cycles before is less than the reference value, even if the absolute value of the difference at the photographing time one cycle before is equal to or greater than the reference value.
The position management method according to claim 4. - A calculation process of assigning monitoring object identification information to a monitoring object periodically photographed by an imaging device, and of calculating position information regarding the monitoring object from a photographed image of the monitoring object;
a storage process of storing the position information in a storage area in association with the monitoring object identification information and a photographing time; and
a comparison process of comparing, for the monitoring object, an absolute value of a difference between a current value of the position information and a value of the position information stored in the storage area at each of the photographing times one cycle before and two cycles before the present with a predetermined reference value, and of outputting the current position information when the absolute value of the difference at the photographing time two cycles before is less than the reference value, even if the absolute value of the difference at the photographing time one cycle before is equal to or greater than the reference value;
a position management program causing a computer to execute the above. - A plurality of the calculation processes, each receiving one of the photographed images photographed by a plurality of the imaging devices;
a plurality of the comparison processes, each receiving the position information calculated by one of the plurality of the calculation processes; and
a specification process of integrating the position information output from the plurality of the comparison processes and outputting position specification information specifying a position of the monitoring object, which the program causes a computer to execute;
wherein the storage process stores the position specification information in the storage area in association with the monitoring object identification information and the photographing time, and
the comparison process compares, for the monitoring object, an absolute value of a difference between the current value of the position information and a value of the position specification information stored in the storage area at each of the photographing times one cycle before and two cycles before the present with a predetermined reference value, and outputs the current position information when the absolute value of the difference at the photographing time two cycles before is less than the reference value, even if the absolute value of the difference at the photographing time one cycle before is equal to or greater than the reference value.
The position management program according to claim 6.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2014523585A JP6015756B2 (ja) | 2012-07-02 | 2013-06-28 | 位置管理装置、位置管理システム、位置管理方法、及び、位置管理プログラム |
US14/411,985 US9418428B2 (en) | 2012-07-02 | 2013-06-28 | Position management device, position management system, position management method, and position management program |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2012-148404 | 2012-07-02 | ||
JP2012148404 | 2012-07-02 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2014006859A1 true WO2014006859A1 (ja) | 2014-01-09 |
Family
ID=49881635
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2013/004030 WO2014006859A1 (ja) | 2012-07-02 | 2013-06-28 | 位置管理装置、位置管理システム、位置管理方法、及び、位置管理プログラム |
Country Status (3)
Country | Link |
---|---|
US (1) | US9418428B2 (ja) |
JP (1) | JP6015756B2 (ja) |
WO (1) | WO2014006859A1 (ja) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105072414B (zh) * | 2015-08-19 | 2019-03-12 | 浙江宇视科技有限公司 | 一种目标检测和跟踪方法及系统 |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP6495705B2 (ja) * | 2015-03-23 | 2019-04-03 | 株式会社東芝 | 画像処理装置、画像処理方法、画像処理プログラムおよび画像処理システム |
JPWO2020145004A1 (ja) * | 2019-01-10 | 2021-10-28 | 日本電気株式会社 | 撮影ガイド装置 |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2005217786A (ja) * | 2004-01-29 | 2005-08-11 | Victor Co Of Japan Ltd | 映像記録装置及び映像記録方法 |
JP2010206404A (ja) * | 2009-03-02 | 2010-09-16 | Secom Co Ltd | 画像監視装置 |
JP2010237971A (ja) * | 2009-03-31 | 2010-10-21 | Saxa Inc | 歩きたばこ監視装置 |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6693648B1 (en) * | 2000-11-22 | 2004-02-17 | Campus Crusade For Christ, Inc. | Pointer interactive apparatus |
US7424056B2 (en) * | 2003-07-04 | 2008-09-09 | Sigmatel, Inc. | Method for motion estimation and bandwidth reduction in memory and device for performing the same |
JP3949628B2 (ja) | 2003-09-02 | 2007-07-25 | 本田技研工業株式会社 | 車両の周辺監視装置 |
JP4650669B2 (ja) * | 2004-11-04 | 2011-03-16 | 富士ゼロックス株式会社 | 動体認識装置 |
-
2013
- 2013-06-28 WO PCT/JP2013/004030 patent/WO2014006859A1/ja active Application Filing
- 2013-06-28 US US14/411,985 patent/US9418428B2/en active Active
- 2013-06-28 JP JP2014523585A patent/JP6015756B2/ja active Active
Also Published As
Publication number | Publication date |
---|---|
JP6015756B2 (ja) | 2016-10-26 |
US20150161794A1 (en) | 2015-06-11 |
JPWO2014006859A1 (ja) | 2016-06-02 |
US9418428B2 (en) | 2016-08-16 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 13813301 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2014523585 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 14411985 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 13813301 Country of ref document: EP Kind code of ref document: A1 |