WO2023166648A1 - Movement trajectory information processing device

Movement trajectory information processing device

Info

Publication number
WO2023166648A1
Authority
WO
WIPO (PCT)
Prior art keywords
camera
image
information
movement trajectory
trajectory information
Application number
PCT/JP2022/009080
Other languages
French (fr)
Japanese (ja)
Inventor
拓也 小川 (Takuya Ogawa)
Original Assignee
日本電気株式会社 (NEC Corporation)
Application filed by 日本電気株式会社 (NEC Corporation)
Priority to PCT/JP2022/009080
Publication of WO2023166648A1

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00: Television systems
    • H04N7/18: Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast

Definitions

  • The present invention relates to a movement trajectory information processing device, a movement trajectory information processing method, and a recording medium.
  • A conventional apparatus uses a camera with fixed pan, tilt, and zoom values (PTZ values) to continuously photograph an object such as a moving person.
  • The conventional apparatus extracts the position of the object from the image obtained at each photographing time and uses the camera calibration information (intrinsic and extrinsic parameters) to convert the coordinate values of the object on the image into coordinate values in real space.
  • The conventional apparatus uses the time series of real-space positions obtained in this way as movement trajectory information.
  • A camera with fixed PTZ values has a limited field of view. Therefore, to monitor a wider range, it is important to be able to calculate the movement trajectory information of an object from a plurality of images captured continuously by a camera that changes its PTZ values so as to keep the moving object within its imaging range.
  • However, when the PTZ values change, the camera calibration information for that camera also changes.
  • Recalibration requires manual work by an operator. For this reason, a conventional apparatus that directly converts coordinate values on each of a plurality of continuously captured images into real-space coordinate values using camera calibration information has had difficulty calculating movement trajectory information from images captured by a camera that keeps a moving object within its imaging range.
  • An object of the present invention is to provide a movement trajectory information processing apparatus that solves the above-described problem.
  • A movement trajectory information processing device according to one aspect of the present invention includes: calculation means for calculating movement trajectory information, which is a time series of position information of a moving object on the image, from a plurality of images of the object photographed continuously by a camera that keeps the moving object within its imaging range; and conversion means for calculating a mapping function from the image to a reference image, calculating, using the calculated mapping function, position information on the reference image corresponding to the position information of the object on the image, and converting the obtained position information on the reference image into position information in a world coordinate system using camera calibration information.
  • A movement trajectory information processing method according to another aspect includes: calculating movement trajectory information, which is a time series of position information of a moving object on the image, from a plurality of images of the object photographed continuously by a camera that keeps the moving object within its imaging range; calculating a mapping function from the image to a reference image; calculating, using the calculated mapping function, position information on the reference image corresponding to the position information of the object on the image; and converting the obtained position information on the reference image into position information in a world coordinate system using camera calibration information.
  • A computer-readable recording medium according to a further aspect records a program for causing a computer to perform: a process of calculating movement trajectory information, which is a time series of position information of a moving object on the image, from a plurality of images of the object photographed continuously by a camera that keeps the moving object within its imaging range; a process of calculating a mapping function from the image to a reference image; a process of calculating, using the calculated mapping function, position information on the reference image corresponding to the position information of the object on the image; and a process of converting the obtained position information on the reference image into position information in a world coordinate system using camera calibration information.
  • According to the present invention, movement trajectory information, which is a time series of the real-space positions of an object, can be calculated from a plurality of images of the object photographed continuously by a camera that keeps the moving object within its photographing range.
  • FIG. 3 is a diagram showing an example of the tracked object movement trajectory information used by the control device in the tracking system according to one embodiment of the present invention.
  • FIG. 4 is a diagram showing an example of the to-be-verified movement trajectory information used by the control device in the tracking system according to one embodiment of the present invention.
  • FIG. 5 is a schematic diagram for explaining an example of a method for converting position information on an image from an image coordinate system to a world coordinate system in the control device in the tracking system according to one embodiment of the present invention.
  • FIG. 6 is a flowchart showing an example of tracking processing performed by the control device using one camera in the tracking system according to one embodiment of the present invention.
  • FIG. 7 is a flowchart showing an example of tracking processing performed by the control device using the other camera in the tracking system according to one embodiment of the present invention.
  • The final figure is a block diagram of an embodiment of a movement trajectory information processing device according to the present invention.
  • A camera with fixed PTZ values has a limited field of view.
  • By using a PTZ camera whose pan, tilt, and zoom can be changed, the field of view can be expanded.
  • Even so, the field of view of a single camera is still limited. Therefore, a system is considered in which a plurality of PTZ cameras are distributed over a monitoring area so that parts of their fields of view overlap, and a person is tracked over a wider range both spatially and temporally.
  • In such a system, it is desirable that, when a person being tracked by one camera enters the field of view of another camera, the system automatically recognizes this and the other camera takes over tracking of that person.
  • A person's movement trajectory can be used to determine which of the many persons present in the field of view of the other camera should be taken over for tracking. That is, the tracking target is determined by collating information on the movement trajectory of the person being tracked by one camera (tracking target movement trajectory information) with information on the movement trajectories of persons within the field of view of the other camera (to-be-verified movement trajectory information). For such collation, the tracking target movement trajectory information and the to-be-verified movement trajectory information must be expressed in the same coordinate system.
  • Coordinate systems that express the position of a person captured by a camera include the image coordinate system and the camera coordinate system, which are unique to each camera, and the world coordinate system, which is common to multiple cameras.
  • The image coordinate system is a two-dimensional coordinate system on the image sensor, and the position of a point on an image is usually represented in this coordinate system.
  • The camera coordinate system is defined by the image coordinate system and the intrinsic parameters of the camera. Therefore, the image coordinate system and the camera coordinate system of a camera can be converted into each other using the camera's intrinsic parameters.
  • The world coordinate system is related to the image coordinate system through the camera calibration information, which consists of the camera's intrinsic and extrinsic parameters. Therefore, the image coordinate system of a camera and the world coordinate system can be converted into each other using the camera's calibration information, as in the sketch below.
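  • The conversion just described can be sketched minimally as follows, assuming the monitored person stands on a flat ground plane Z = 0 (the embodiment below likewise assumes flat monitoring areas). The function name and the calibration values K, R, t are illustrative assumptions, not values from the patent.

```python
import numpy as np

def image_to_world(pt_img, K, R, t):
    """Map a pixel (u, v) to world coordinates (X, Y) on the Z = 0 plane."""
    # For points on the Z = 0 plane, the pinhole projection reduces to a
    # 3x3 homography:  s * [u, v, 1]^T = K [r1 r2 t] [X, Y, 1]^T
    H = K @ np.column_stack((R[:, 0], R[:, 1], t))
    world = np.linalg.inv(H) @ np.array([pt_img[0], pt_img[1], 1.0])
    return world[:2] / world[2]  # dehomogenize

# Hypothetical calibration values for illustration only.
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])   # intrinsic parameters
R = np.eye(3)                     # extrinsic rotation
t = np.array([0.0, 0.0, 3.0])     # extrinsic translation

print(image_to_world((320.0, 240.0), K, R, t))  # principal point -> (0, 0)
```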
  • A conventional device extracts the position of a person from a photographed image in the image coordinate system and uses camera calibration information to convert the extracted coordinate values from the image coordinate system into coordinate values in the world coordinate system.
  • However, changing the PTZ values of the camera to track a person changes the camera calibration information.
  • A tracking system 10 according to an embodiment to which the present invention is applied will now be described in detail with reference to the drawings.
  • FIG. 1 is a schematic diagram showing a configuration example of the tracking system 10 according to one embodiment of the present invention.
  • The tracking system 10 is a system for detecting and tracking a person in a surveillance area, and includes two PTZ cameras 11 and 12 and a control device 20.
  • The cameras 11 and 12 are PTZ cameras, each including a solid-state imaging device such as a CMOS or CCD sensor and a pan head.
  • The cameras 11 and 12 generate a plurality of images by continuously photographing their imaging ranges at regular intervals, for example, in accordance with instructions from the control device 20. It is preferable that the imaging cycles and imaging timings of the cameras 11 and 12 be the same or similar, but they may differ.
  • The cameras 11 and 12 can change their imaging ranges by changing pan, tilt, and zoom in accordance with commands from the control device 20.
  • "Pan" means turning the camera left and right.
  • "Tilt" means turning the camera up and down.
  • "Zoom" means changing the angle of view toward telephoto or wide angle.
  • The cameras 11 and 12 are installed at different locations in the monitoring area so that parts of their photographing ranges overlap.
  • The camera 11 monitors monitoring areas 13A, 13B, and 13C, which are set on a passage extending in the horizontal direction of the drawing.
  • The camera 12 monitors the monitoring area 13C and a monitoring area 13D, which are set on a passage extending in the vertical direction of the drawing.
  • The monitoring area 13C corresponds to the corner of the passage. That is, the corner area of the passage is the surveillance area 13C common to the two cameras 11 and 12.
  • A person enters the passage from the left side of the drawing, passes through the corner, and exits at the bottom of the drawing.
  • The monitoring areas 13A to 13D are assumed to be flat surfaces.
  • The control device 20 is connected to the cameras 11 and 12 by wire or wirelessly.
  • The control device 20 calculates the camera calibration information (intrinsic and extrinsic parameters) of each camera when the cameras 11 and 12 are installed.
  • This camera calibration information is calculated with the PTZ values of the cameras 11 and 12 set to the reference PTZ of each camera.
  • The reference PTZ of the camera 11 is set so as to provide a camera field of view capable of photographing a person in the monitoring area 13A.
  • The reference PTZ of the camera 12 is set so as to provide a camera field of view capable of photographing a person in the monitoring area 13C.
  • The control device 20 displays, on the screen display unit, the images continuously captured by the camera 11 fixed to the reference PTZ, for monitoring.
  • The control device 20 tracks the designated person while changing the PTZ values of the camera 11. The control device 20 then calculates, in real time, information on the movement trajectory of the person being tracked (tracking target movement trajectory information) in the world coordinate system.
  • The control device 20 also obtains, in real time, information on the movement trajectories of all persons in the monitoring area 13C (to-be-verified movement trajectory information) in the world coordinate system. The control device 20 then collates, in real time, the tracking target movement trajectory information of the person being tracked by the camera 11 with the to-be-verified movement trajectory information of all persons captured by the camera 12. Thereby, the control device 20 determines which person captured by the camera 12 is the person being tracked by the camera 11. Next, the control device 20 continues tracking with the camera 12 while keeping the determined person within the imaging range of the camera 12, and displays the images on the screen display unit for monitoring.
  • FIG. 2 is a block diagram showing an example of the control device 20.
  • The control device 20 includes a communication I/F (interface) unit 21, an operation input unit 22, a screen display unit 23, a storage unit 24, and an arithmetic processing unit 25.
  • The communication I/F unit 21 is composed of a data communication circuit and is configured to perform data communication with the cameras 11 and 12 and other external devices (not shown) by wire or wirelessly.
  • The operation input unit 22 is composed of operation input devices such as a keyboard and a mouse and is configured to detect an operator's operation and output it to the arithmetic processing unit 25.
  • The screen display unit 23 is composed of a display device such as an LCD (Liquid Crystal Display) and is configured to display images captured by the cameras 11 and 12 and other information.
  • The storage unit 24 is composed of one or more types of storage devices, such as hard disks and memories, and is configured to store the processing information and the program 241 required for the various processes in the arithmetic processing unit 25.
  • The program 241 realizes various processing units when read and executed by the arithmetic processing unit 25. It is read in advance from an external device or a recording medium (not shown) via a data input/output function such as the communication I/F unit 21 and stored in the storage unit 24.
  • The main processing information stored in the storage unit 24 includes camera calibration information 242-1 and 242-2, reference images 243-1 and 243-2, image DBs (databases) 244-1 and 244-2, tracked object movement trajectory information 245, and to-be-verified movement trajectory information 246.
  • The camera calibration information 242-1 is the camera calibration information of the camera 11 (intrinsic and extrinsic parameters of the camera 11).
  • The camera calibration information 242-2 is the camera calibration information of the camera 12 (intrinsic and extrinsic parameters of the camera 12).
  • The reference image 243-1 is an image of the surveillance area 13A captured by the camera 11 set to the pan, tilt, and zoom values used when obtaining the camera calibration information 242-1.
  • In addition to the image itself, the reference image 243-1 may include the pan, tilt, and zoom values of the camera 11 at the time the reference image was captured (the reference PTZ value of the camera 11) and the camera position (the position of the camera 11 in the world coordinate system).
  • The reference image 243-2 is an image of the monitoring area 13C captured by the camera 12 set to the pan, tilt, and zoom values used when obtaining the camera calibration information 242-2.
  • In addition to the image itself, the reference image 243-2 may include the pan, tilt, and zoom values at the time the reference image was captured (the reference PTZ value of the camera 12) and the camera position (the position of the camera 12 in the world coordinate system).
  • The image DB 244-1 accumulates the time series of images taken by the camera 11. Each image stored in the image DB 244-1 is tagged with the camera ID of the camera 11, the shooting time, and the PTZ values at the time of shooting.
  • The image DB 244-2 accumulates the time series of images taken by the camera 12. Each image stored in the image DB 244-2 is tagged with the camera ID of the camera 12, the shooting time, and the PTZ values at the time of shooting.
  • The tracked object movement trajectory information 245 is information on the movement trajectory of the tracked person, calculated based on the images captured by the camera 11.
  • FIG. 3 is a diagram showing a configuration example of the tracked object movement trajectory information 245.
  • The tracked object movement trajectory information 245 in this example is composed of an entry 2451, which contains a tracked object ID uniquely identifying the tracked object, the camera ID of the camera 11, and association information, and a plurality of entries 2452 corresponding one-to-one to the plurality of images captured continuously by the camera 11.
  • The plurality of entries 2452 are linked in a line in order of photographing time by the association information in the entry 2451 and the association information in each entry 2452.
  • Each entry 2452 consists of a shooting time, a person area, position information (image coordinate system), image features, position information (world coordinate system), a movement vector, a movement speed, an acceleration, and association information.
  • The shooting time represents the shooting time of the corresponding image.
  • The person area represents, for example, the bounding rectangle of the person in the corresponding image.
  • The position information (image coordinate system) represents the coordinate values, in the image coordinate system, of one point that represents the person area in the same entry.
  • The point representing the person area may be, for example, the center of gravity of the person area, but is not limited to this; it may be the position of the face, the feet, or the like.
  • The image features represent feature amounts extracted from the person area in the corresponding image.
  • The image features may include, but are not limited to, face features, clothing features, and person size.
  • The position information (world coordinate system) is set to the coordinate values obtained by converting the position information (image coordinate system) in the same entry from the image coordinate system to the world coordinate system.
  • The movement vector represents the amount and direction of movement of the tracked person between the corresponding image and an adjacent image.
  • The movement speed represents the moving speed of the tracked person.
  • The acceleration represents the acceleration of the tracked person.
  • The movement vector, movement speed, and acceleration are calculated, for example, based on the position information (world coordinate system) in the entry 2452 of the image adjacent to the corresponding image (see the sketch after this list).
  • The types of data included in the entry 2452 are an example and are not limited to the above.
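  • As a concrete illustration, one entry and the derivation of its motion fields might look like the following minimal sketch. The field and function names are hypothetical; the patent specifies only the kinds of data each entry holds.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class TrajectoryEntry:
    shooting_time: float                        # capture time of the image
    person_area: Tuple[int, int, int, int]      # bounding rectangle (x, y, w, h)
    pos_image: Tuple[float, float]              # position (image coordinate system)
    image_feature: Optional[list] = None        # e.g. face/clothing features, size
    pos_world: Optional[Tuple[float, float]] = None  # position (world coordinate system)
    movement_vector: Optional[Tuple[float, float]] = None
    speed: Optional[float] = None
    acceleration: Optional[float] = None

def update_motion(prev: TrajectoryEntry, cur: TrajectoryEntry) -> None:
    """Fill the motion fields of `cur` from the adjacent previous entry, as the
    patent computes them from neighboring entries (world coordinates)."""
    dt = cur.shooting_time - prev.shooting_time
    dx = cur.pos_world[0] - prev.pos_world[0]
    dy = cur.pos_world[1] - prev.pos_world[1]
    cur.movement_vector = (dx, dy)
    cur.speed = (dx * dx + dy * dy) ** 0.5 / dt
    if prev.speed is not None:
        cur.acceleration = (cur.speed - prev.speed) / dt
```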
  • The to-be-verified movement trajectory information 246 is information on the movement trajectory of a person extracted from the images captured by the camera 12.
  • One piece of to-be-verified movement trajectory information 246 exists for each person extracted from the images captured by the camera 12.
  • FIG. 4 is a diagram showing a configuration example of the to-be-verified movement trajectory information 246.
  • The to-be-verified movement trajectory information 246 in this example is composed of an entry 2461, which contains a person ID uniquely identifying a person, the camera ID of the camera 12, and association information, and a plurality of entries 2462 corresponding one-to-one to the plurality of images captured continuously by the camera 12.
  • The plurality of entries 2462 are linked in a line in order of photographing time by the association information in the entry 2461 and the association information in each entry 2462.
  • Each entry 2462 is composed of the same kinds of data as the entry 2452 of the tracked object movement trajectory information 245 described with reference to FIG. 3.
  • The types of data included in the entry 2462 are an example and are not limited to the above.
  • The arithmetic processing unit 25 has one or more microprocessors, such as MPUs, and their peripheral circuits. By reading and executing the program 241 from the storage unit 24, the hardware and the program 241 cooperate to realize various processing units.
  • The main processing units realized by the arithmetic processing unit 25 are a camera calibration unit 251 and two monitoring units 252 and 253.
  • The camera calibration unit 251 is configured to acquire the camera calibration information of the cameras 11 and 12 through interactive processing with the operator via the operation input unit 22 and the screen display unit 23.
  • The camera calibration method used is not particularly limited.
  • The monitoring unit 252 is configured to use the camera 11 to detect and track a person moving within the monitoring areas 13A to 13C.
  • The monitoring unit 252 includes a detection unit 2521, a tracking unit 2522, and a coordinate conversion unit 2523.
  • The detection unit 2521 is configured to detect a specific person as a tracking target from among all persons present in the monitoring area 13A, based on images of the monitoring area 13A continuously captured by the camera 11 set to the reference PTZ value.
  • The tracking unit 2522 is configured to calculate information on the movement trajectory of the tracking target in the image coordinate system of the camera 11, while tracking the tracking target detected by the detection unit 2521 with the camera 11 so as to keep it within the imaging range, and to store it in the storage unit 24 as the tracked object movement trajectory information 245. Since the tracking unit 2522 changes the PTZ values of the camera 11 to track the tracking target, the tracking target can be kept within the shooting range even if it moves from the monitoring area 13A to the monitoring area 13B or 13C.
  • The coordinate conversion unit 2523 is configured to convert the tracked object movement trajectory information 245 calculated by the tracking unit 2522 from the image coordinate system of the camera 11 to the world coordinate system. That is, the coordinate conversion unit 2523 converts the position information on the image from the image coordinate system to the world coordinate system.
  • The coordinate conversion unit 2523 may also be configured to convert points in the image coordinate system into a three-dimensional world coordinate system.
  • FIG. 5 is a schematic diagram for explaining an example of a method for converting position information on an image from the image coordinate system to the world coordinate system.
  • An image I0 is an image captured by the camera 11 set to the same PTZ values as when the camera calibration information 242-1 was calculated.
  • Images I1, I2, and I3 are a plurality of images taken continuously by the camera 11 while changing the PTZ values after the image I0 was taken.
  • The coordinate conversion unit 2523 converts position information on the image I0 from the image coordinate system to the world coordinate system directly, using the camera calibration information 242-1.
  • To convert position information on the image I1 to the world coordinate system, the coordinate conversion unit 2523 first converts it to the corresponding position information on the image I0 using a mapping function f1.
  • The coordinate conversion unit 2523 then converts the position information on the image I0 obtained by this conversion from the image coordinate system to the world coordinate system using the camera calibration information 242-1.
  • To obtain the mapping function f1, the coordinate conversion unit 2523 obtains a group of corresponding points between the image I1 and the image I0 using a known feature point extraction method, and calculates a planar projective transformation matrix H1 from the obtained corresponding points.
  • The group of corresponding points between the image I1 and the image I0 is, in other words, the set of pairs of corresponding feature points between the image I1 and the image I0.
  • Similarly, the coordinate conversion unit 2523 obtains a group of corresponding points between the image I2 and the image I1 using a known feature point extraction method and calculates a planar projective transformation matrix H2 from the obtained corresponding points.
  • Likewise, the coordinate conversion unit 2523 obtains a group of corresponding points between the image I3 and the image I2 and calculates a planar projective transformation matrix H3 from the obtained corresponding points. Position information on the image I2 is thus mapped to the image I0 by applying H2 and then H1, and position information on the image I3 by applying H3, H2, and then H1.
  • The method of obtaining the mapping function is not limited to the above.
  • The common area between adjacent images shot continuously by the camera 11 while tracking a moving person is relatively large, but the common area between two images tends to shrink as the time interval between them increases. Between images with only a small common area, it is difficult to obtain corresponding points.
  • Therefore, the photographing time interval between the two images used to obtain each planar projective transformation matrix is determined in consideration of the above circumstances. A sketch of this estimation and chaining follows.
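  • The estimation of each planar projective transformation matrix and the chaining back to the reference image can be sketched with OpenCV as follows. The detector choice (ORB), match filtering, and RANSAC threshold are assumptions; the patent only requires a known feature point extraction method.

```python
import cv2
import numpy as np

def homography_between(img_cur, img_prev):
    """Estimate H mapping points of img_cur into img_prev from feature matches."""
    orb = cv2.ORB_create(2000)
    kp1, des1 = orb.detectAndCompute(img_cur, None)
    kp2, des2 = orb.detectAndCompute(img_prev, None)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)
    src = np.float32([kp1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 3.0)
    return H

def map_to_reference(pt, homographies):
    """Carry a point from image In back to I0, given [H1, H2, ..., Hn] where
    Hk maps Ik onto I(k-1): apply Hn first, then H(n-1), ..., finally H1."""
    p = np.array([pt[0], pt[1], 1.0])
    for H in homographies[::-1]:
        p = H @ p
    return p[:2] / p[2]
```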
  • The monitoring unit 253 is configured to use the camera 12 to detect the person being tracked by the camera 11 from among the persons in the monitoring area 13C, and to track the detected person in the monitoring areas 13C and 13D.
  • The monitoring unit 253 includes a matching unit 2531, a tracking unit 2532, and a coordinate conversion unit 2533.
  • The matching unit 2531 is configured to calculate information on the movement trajectory of each person moving in the monitoring area 13C (the to-be-verified movement trajectory information 246) in the image coordinate system of the camera 12, from a plurality of images of the monitoring area 13C continuously captured by the camera 12 set to the reference PTZ value.
  • The coordinate conversion unit 2533 is configured to convert the to-be-verified movement trajectory information 246 calculated by the matching unit 2531 from the image coordinate system of the camera 12 to the world coordinate system using the camera calibration information 242-2.
  • The matching unit 2531 is configured to collate the to-be-verified movement trajectory information 246 converted into the world coordinate system with the tracked object movement trajectory information 245 of the person being tracked by the monitoring unit 252.
  • By this collation, the matching unit 2531 determines which of the persons photographed by the camera 12 and moving in the monitoring area 13C is the person being tracked by the camera 11.
  • The matching unit 2531 matches the latest N frames of each trajectory, where N is a predetermined value of 2 or more.
  • The matching unit 2531 extracts N entries 2452, counting back from the last entry 2452 of the tracked object movement trajectory information 245, and uses the extracted N entries 2452 as the tracking target side of the collation. If the tracked object movement trajectory information 245 does not yet contain N entries 2452, the matching unit 2531 does not perform matching at that point. Likewise, the matching unit 2531 extracts the last N entries 2462 from each piece of to-be-verified movement trajectory information 246 and uses the extracted N entries 2462 as the to-be-verified side of the collation.
  • The matching unit 2531 does not collate any to-be-verified movement trajectory information 246 that does not yet contain N entries 2462.
  • If no to-be-verified movement trajectory information 246 contains N entries 2462, the matching unit 2531 does not perform collation at that point.
  • The matching unit 2531 collates the tracked object movement trajectory information with each piece of to-be-verified movement trajectory information based on a shape matching degree and an orientation matching degree.
  • To eliminate the influence of absolute position when calculating the shape matching degree, the matching unit 2531 first normalizes each movement trajectory by its center of gravity.
  • The movement trajectory of the tracked object is obtained by connecting the position information (world coordinate system) in the N entries 2452 constituting the tracked object movement trajectory information with line segments in order of photographing time.
  • Likewise, the to-be-verified movement trajectory is obtained by connecting the position information (world coordinate system) in the N entries 2462 constituting the to-be-verified movement trajectory information with line segments in order of photographing time.
  • The matching unit 2531 obtains the error between the positions of the two movement trajectories at the same times using, for example, the method of least squares, and calculates as the shape matching degree a value that increases as this error decreases.
  • The matching unit 2531 calculates as the orientation matching degree a value that increases as the sum of the errors between the movement vectors of the two movement trajectories at the same times decreases.
  • The matching unit 2531 calculates a matching result from the shape matching degree and the orientation matching degree.
  • For example, the matching unit 2531 may use the product of the shape matching degree and the orientation matching degree as the matching result.
  • Alternatively, the matching unit 2531 may use the sum of the shape matching degree and the orientation matching degree as the matching result. A sketch of this scoring follows.
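  • A minimal sketch of this collation, assuming two trajectories sampled at the same N times. The 1/(1 + error) scoring form is an assumption; the patent only requires values that grow as the errors shrink.

```python
import numpy as np

def match_score(traj_a: np.ndarray, traj_b: np.ndarray) -> float:
    """traj_a, traj_b: (N, 2) world-coordinate positions at the same N times."""
    a = traj_a - traj_a.mean(axis=0)   # normalize by center of gravity
    b = traj_b - traj_b.mean(axis=0)
    shape_err = np.mean(np.sum((a - b) ** 2, axis=1))  # least-squares position error
    shape_deg = 1.0 / (1.0 + shape_err)                # larger as the error shrinks
    va = np.diff(traj_a, axis=0)       # movement vectors between adjacent times
    vb = np.diff(traj_b, axis=0)
    orient_err = np.sum(np.linalg.norm(va - vb, axis=1))
    orient_deg = 1.0 / (1.0 + orient_err)
    return shape_deg * orient_deg      # product form; the sum is also possible
```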
  • The above is one example of a method for collating the tracked object movement trajectory information 245 and the to-be-verified movement trajectory information 246 in the matching unit 2531.
  • The matching method is not limited to the above.
  • The matching unit 2531 may also collate the two pieces of movement trajectory information while additionally considering the image features, movement speed, and acceleration included in the entries 2452 of the tracked object movement trajectory information 245 and the entries 2462 of the to-be-verified movement trajectory information 246.
  • The matching unit 2531 is configured to determine, based on the results of collating the tracked object movement trajectory information 245 with one or more pieces of to-be-verified movement trajectory information 246, which of the persons moving in the monitoring area 13C captured by the camera 12 is the person being tracked by the camera 11. For example, if the highest matching result is equal to or greater than a certain threshold, the matching unit 2531 determines that the person associated with the to-be-verified movement trajectory information 246 having the highest matching result is the person being tracked by the camera 11. Otherwise, the matching unit 2531 determines that the person being tracked by the camera 11 is not among the people photographed by the camera 12.
  • The tracking unit 2532 is configured to track the tracking target determined by the matching unit 2531 (the person captured by the camera 12 who is being tracked by the camera 11) with the camera 12 so as to keep the target within the shooting range.
  • Since the tracking unit 2532 changes the PTZ values of the camera 12 to track the tracking target, the tracking target can be kept within the imaging range even if it moves from the monitoring area 13C to the monitoring area 13D.
  • Next, the operation of the control device 20 will be explained. First, the camera calibration performed before system operation will be described.
  • The camera calibration unit 251 of the control device 20 calibrates the cameras 11 and 12 at any time before system operation, such as when the cameras 11 and 12 are installed.
  • In calibrating the camera 11, the camera calibration unit 251 first adjusts the PTZ values of the camera 11 so that the entire monitoring area 13A can be captured. Next, the camera calibration unit 251 calculates the camera calibration information of the camera 11 using a predetermined camera calibration method, through interactive processing with the operator via the operation input unit 22 and the screen display unit 23. The camera calibration unit 251 stores the calculated camera calibration information in the storage unit 24 as the camera calibration information 242-1. The camera calibration unit 251 also acquires the PTZ values of the camera 11 at the time of calibration as the reference PTZ value, and acquires an image of the monitoring area 13A captured by the camera 11 set to the reference PTZ value as a reference image. The camera calibration unit 251 attaches the reference PTZ value to the obtained reference image and stores it in the storage unit 24 as the reference image 243-1.
  • In calibrating the camera 12, the camera calibration unit 251 first adjusts the PTZ values of the camera 12 so that the entire monitoring area 13C (the area shared with the camera 11) can be captured. Next, the camera calibration unit 251 calculates the camera calibration information of the camera 12 using a predetermined camera calibration method, through interactive processing with the operator via the operation input unit 22 and the screen display unit 23. The camera calibration unit 251 stores the calculated camera calibration information in the storage unit 24 as the camera calibration information 242-2. The camera calibration unit 251 also acquires the PTZ values of the camera 12 at the time of calibration as the reference PTZ value, and acquires an image of the monitoring area 13C captured by the camera 12 set to the reference PTZ value as a reference image. The camera calibration unit 251 attaches the reference PTZ value to the obtained reference image and stores it in the storage unit 24 as the reference image 243-2.
  • FIG. 6 is a flowchart showing an example of the tracking processing performed using the camera 11. The tracking process performed by the control device 20 using the camera 11 will be described below with reference to FIG. 6.
  • The detection unit 2521 in the monitoring unit 252 of the control device 20 performs initialization (step S11). In this initialization, the detection unit 2521 sets the camera 11 to the reference PTZ value and clears all entries of the tracked object movement trajectory information 245. Next, the detection unit 2521 acquires an image of the surveillance area 13A captured by the camera 11 set to the reference PTZ value, saves it in the image DB 244-1, and displays it on the screen display unit 23 (step S12). Next, the detection unit 2521 detects all persons in the newly saved image using techniques such as pattern recognition and machine learning (step S13). Next, the detection unit 2521 displays the detection result on the screen display unit 23 (step S14).
  • For example, the detection unit 2521 generates an image in which the circumscribed rectangle of each person detected in the image is superimposed on the image, and displays it on the screen display unit 23.
  • The detection unit 2521 then determines whether a person to be tracked has been designated through the operation input unit 22 (step S15). For example, the operator can designate a person to be tracked by clicking the rectangle of that person on the image of the camera 11 displayed on the screen display unit 23. If no person to be tracked has been designated, the detection unit 2521 returns to step S12 and repeats the same processing as described above. On the other hand, when a person to be tracked is designated, the detection unit 2521 updates the tracked object movement trajectory information 245 (step S16).
  • In updating the tracked object movement trajectory information 245 in step S16, the detection unit 2521 sets the ID assigned to the tracked person designated by the operator and the camera ID of the camera 11 in the entry 2451. Further, the detection unit 2521 secures one empty entry 2452 and sets association information in the secured entry 2452 and the entry 2451, thereby relating them to each other. Next, the detection unit 2521 focuses on the secured entry 2452, sets the shooting time, person area, position information (image coordinate system), and image features in the focused entry 2452, and sets the position information (world coordinate system), movement vector, movement speed, and acceleration to NULL values.
  • The coordinate conversion unit 2523 of the monitoring unit 252 then performs coordinate conversion on the focused entry 2452 of the tracked object movement trajectory information 245 (step S17).
  • Specifically, the coordinate conversion unit 2523 first uses the camera calibration information 242-1 to convert the position information (image coordinate system) set in the focused entry 2452 from the image coordinate system to the world coordinate system.
  • The coordinate conversion unit 2523 then sets the position information (world coordinate system) obtained by this conversion in the focused entry 2452.
  • Next, the tracking unit 2522 of the monitoring unit 252 performs tracking control (step S18). For example, the tracking unit 2522 generates a command for controlling the PTZ values of the camera 11 according to the position information (image coordinate system) set in the focused entry 2452, and sends the command to the camera 11 through the communication I/F unit 21.
  • At this time, for example, the tracking unit 2522 adjusts the pan and tilt so that the center of gravity of the circumscribing rectangle of the person to be tracked, represented by the position information (image coordinate system) of the focused entry 2452, is displayed at the center of the image.
  • The zoom may also be adjusted so that the entire circumscribing rectangle falls within a predetermined angle of view, as in the sketch below.
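  • A minimal sketch of such a PTZ command, centering the circumscribing rectangle and sizing it to a target fraction of the frame. The gains, the command format, and the target fill ratio are hypothetical; the patent does not specify a control law.

```python
def make_ptz_command(bbox, frame_w, frame_h, target_fill=0.3,
                     k_pan=0.05, k_tilt=0.05):
    """bbox: circumscribing rectangle (x, y, w, h) of the tracked person."""
    x, y, w, h = bbox
    cx, cy = x + w / 2.0, y + h / 2.0
    d_pan = k_pan * (cx - frame_w / 2.0)    # positive: turn right
    d_tilt = k_tilt * (cy - frame_h / 2.0)  # positive: turn down
    fill = (w * h) / float(frame_w * frame_h)
    d_zoom = target_fill - fill             # zoom in while the person looks small
    return {"pan": d_pan, "tilt": d_tilt, "zoom": d_zoom}
```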
  • The camera 11 changes its imaging range by changing pan, tilt, and zoom in response to the above command.
  • Next, the tracking unit 2522 acquires an image captured by the camera 11 set to the changed PTZ values, saves it in the image DB 244-1, and displays it on the screen display unit 23 (step S19).
  • Next, the tracking unit 2522 detects the person to be tracked in the newly saved image using techniques such as pattern recognition and machine learning (step S20), generates an image in which the circumscribed rectangle of the detected person is superimposed on the image, and displays it on the screen display unit 23 (step S21). Thereby, the operator can confirm the tracking status of the designated person in real time on the image displayed on the screen display unit 23.
  • Next, the tracking unit 2522 updates the tracked object movement trajectory information 245 (step S22).
  • Specifically, the tracking unit 2522 first secures one empty entry 2452, sets the association information between this secured entry 2452 and the currently focused entry 2452, and then shifts its attention to the newly secured entry 2452.
  • The tracking unit 2522 sets the shooting time, person area, position information (image coordinate system), and image features in the newly focused entry 2452, and sets the position information (world coordinate system), movement vector, movement speed, and acceleration to NULL values.
  • Next, the coordinate conversion unit 2523 of the monitoring unit 252 converts the position information (image coordinate system) set in the focused entry 2452 from the image coordinate system to the world coordinate system (step S23).
  • The image corresponding to the focused entry 2452 was captured by the camera 11 set to PTZ values different from those of the reference image. Therefore, as described with reference to FIG. 5, the coordinate conversion unit 2523 first calculates a mapping function for converting the position information (image coordinate system) set in the focused entry 2452 into position information on the reference image, and then converts the position information (image coordinate system) into position information on the reference image using the calculated mapping function.
  • Next, the coordinate conversion unit 2523 converts the position information on the reference image obtained by this conversion from the image coordinate system to the world coordinate system using the camera calibration information 242-1 (see the sketch below). Further, in step S23, the coordinate conversion unit 2523 calculates the movement vector, movement speed, and acceleration based on the position information (world coordinate system) of the focused entry 2452 and the entries 2452 preceding it, and sets them in the focused entry 2452.
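  • Step S23 thus composes the two earlier sketches: the hypothetical map_to_reference() (homography chaining) and image_to_world() (conversion via calibration information) from the blocks above.

```python
def entry_position_to_world(pos_image, homographies, K, R, t):
    """Convert an entry's position (image coordinate system) to world coordinates:
    first onto the reference image I0, then into the world coordinate system."""
    pos_ref = map_to_reference(pos_image, homographies)  # chained homographies
    return image_to_world(pos_ref, K, R, t)              # calibration, Z = 0 plane
```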
  • Next, the tracking unit 2522 of the monitoring unit 252 determines whether to end tracking (step S24). For example, the tracking unit 2522 may determine to end tracking when it detects that the person to be tracked is no longer present in the monitoring areas 13A, 13B, and 13C, or when it detects that the person can no longer be tracked by the camera 11. If the tracking unit 2522 does not determine to end tracking, it returns to step S18 and repeats the same processing as described above. As a result, the tracking of the person by the camera 11 continues, and the tracked object movement trajectory information 245 is updated accordingly.
  • When tracking ends, the tracking unit 2522 determines whether to end monitoring (step S25). For example, the tracking unit 2522 determines to end monitoring when the operator inputs a monitoring end command through the operation input unit 22. If it does not determine to end monitoring, the monitoring unit 252 returns to step S11 and repeats the same processing as described above. If it determines to end monitoring, the monitoring unit 252 ends the processing shown in FIG. 6.
  • FIG. 7 is a flowchart showing an example of the tracking processing performed using the camera 12. The tracking process performed by the control device 20 using the camera 12 will be described below with reference to FIG. 7.
  • The matching unit 2531 in the monitoring unit 253 of the control device 20 performs initialization (step S31).
  • In this initialization, the matching unit 2531 sets the camera 12 to the reference PTZ value.
  • The matching unit 2531 also clears all entries of the to-be-verified movement trajectory information 246.
  • Next, the matching unit 2531 acquires an image of the monitoring area 13C captured by the camera 12 set to the reference PTZ value, saves it in the image DB 244-2, and displays it on the screen display unit 23 (step S32).
  • Next, the matching unit 2531 detects all persons in the newly saved image using techniques such as pattern recognition and machine learning (step S33).
  • Next, the matching unit 2531 displays the detection result on the screen display unit 23 (step S34).
  • Next, the matching unit 2531 updates the to-be-verified movement trajectory information 246 based on the person detection result (step S35). Since all persons detected in the first image acquired from the camera 12 after initialization are detected for the first time, in step S35 the matching unit 2531 assigns one piece of to-be-verified movement trajectory information 246 to each detected person and performs the following processing for each assigned piece.
  • The matching unit 2531 sets the person ID assigned to the detected person and the camera ID of the camera 12 in the entry 2461 of the to-be-verified movement trajectory information 246.
  • The matching unit 2531 secures one empty entry 2462, sets information associating the secured entry 2462 with the entry 2461, and then focuses on the secured entry 2462.
  • The matching unit 2531 sets the shooting time, person area, position information (image coordinate system), and image features in the focused entry 2462, and sets the position information (world coordinate system), movement vector, movement speed, and acceleration to NULL values.
  • Next, the coordinate conversion unit 2533 of the monitoring unit 253 performs the following processing on the focused entry 2462 (step S36).
  • The coordinate conversion unit 2533 uses the camera calibration information 242-2 to convert the position information (image coordinate system) set in the focused entry 2462 from the image coordinate system to the world coordinate system.
  • The coordinate conversion unit 2533 sets the position information in the world coordinate system obtained by this conversion as the position information (world coordinate system) of the focused entry 2462.
  • Next, the matching unit 2531 of the monitoring unit 253 collates the tracked object movement trajectory information 245 with the one or more pieces of to-be-verified movement trajectory information 246 (step S37).
  • By this collation, the matching unit 2531 determines which of the persons photographed by the camera 12 and moving in the monitoring area 13C is the person being tracked by the camera 11.
  • The matching unit 2531 then determines whether the person being tracked by the camera 11 has been successfully detected (step S38). If the detection has not succeeded, the matching unit 2531 returns to step S32 and repeats the same processing as described above. On the other hand, if the detection has succeeded, the matching unit 2531 passes the person ID of the person determined to be the one being tracked by the camera 11 to the tracking unit 2532.
  • The tracking unit 2532 focuses on the last entry 2462 of the to-be-verified movement trajectory information 246 whose entry 2461 contains the person ID received from the matching unit 2531, and tracks the target person using the camera 12. For example, the tracking unit 2532 generates a command for controlling the PTZ values of the camera 12 according to the position information (image coordinate system) set in the focused entry 2462, and sends the command to the camera 12 through the communication I/F unit 21. At this time, for example, the tracking unit 2532 adjusts the pan and tilt so that the center of gravity of the circumscribing rectangle of the person to be tracked, represented by the position information (image coordinate system) of the focused entry 2462, is displayed at the center of the image.
  • The zoom may also be adjusted so that the entire circumscribing rectangle falls within a predetermined angle of view.
  • The camera 12 changes its imaging range by changing pan, tilt, and zoom in response to the above command.
  • The tracking unit 2532 acquires images captured by the camera 12 after the PTZ change and performs processing such as detecting the person to be tracked and displaying the images on the screen display unit 23.
  • The tracking unit 2532 determines whether to end tracking (step S40). For example, the tracking unit 2532 may determine to end tracking when it detects that the person to be tracked no longer exists in the monitoring areas 13C and 13D, or when it detects that the person can no longer be tracked by the camera 12. If the tracking unit 2532 does not determine to end tracking, it returns to step S38 and repeats the same processing as described above. As a result, the tracking of the person by the camera 12 continues.
  • When tracking ends, the tracking unit 2532 determines whether to end monitoring (step S41). For example, the tracking unit 2532 determines to end monitoring when the operator inputs a monitoring end command through the operation input unit 22. If it does not determine to end monitoring, the monitoring unit 253 returns to step S31 and repeats the same processing as described above. If it determines to end monitoring, the monitoring unit 253 ends the processing shown in FIG. 7.
  • As described above, in the present embodiment, the tracking unit 2522 calculates the position information (image coordinate system) of each entry 2452 of the tracked object movement trajectory information 245 from a plurality of images captured continuously while tracking by the camera 11, which has a tracking function that keeps the moving tracking target person within the shooting range.
  • The coordinate conversion unit 2523 calculates a mapping function from the image corresponding to each entry 2452 captured by the camera 11 to the reference image, converts the position information (image coordinate system) of each entry 2452 of the tracked object movement trajectory information 245 into position information on the reference image using the calculated mapping function, and converts the position information on the reference image obtained by this conversion from the image coordinate system to the world coordinate system using the camera calibration information 242-1.
  • Unlike the camera calibration information, the mapping function can be calculated without manual intervention. Therefore, according to the present embodiment, movement trajectory information, which is the time series of the position of a person in real space, can be calculated while tracking the person with a PTZ camera having a tracking function that keeps the moving person within the shooting range. As a result, the movement trajectories of a person can be collated in real time between a plurality of PTZ cameras, and a moving object can be handed over automatically between PTZ cameras.
  • As a modification, the coordinate conversion unit 2523 may calculate the planar projective transformation matrix from an image captured by the camera 11 to the reference image based on the PTZ values of the camera 11 at the time the image was captured and the reference PTZ value, as in the sketch below.
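  • For a camera that rotates and zooms about its optical center, this matrix can be derived analytically with the standard pure-rotation model H = K_ref * R_delta * K_cur^-1. The sketch below treats zoom as a change of focal length and pan/tilt as rotations about the camera's y and x axes; both are modeling assumptions, not details from the patent.

```python
import numpy as np

def rot_y(a):  # pan
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]])

def rot_x(a):  # tilt
    c, s = np.cos(a), np.sin(a)
    return np.array([[1.0, 0.0, 0.0], [0.0, c, -s], [0.0, s, c]])

def intrinsics(f, cx, cy):
    return np.array([[f, 0.0, cx], [0.0, f, cy], [0.0, 0.0, 1.0]])

def ptz_homography(ptz_cur, ptz_ref, cx, cy):
    """ptz = (pan_rad, tilt_rad, focal_px); returns H mapping points in the
    current image to the reference image, assuming pure rotation and zoom."""
    pan_c, tilt_c, f_c = ptz_cur
    pan_r, tilt_r, f_r = ptz_ref
    R_delta = rot_x(tilt_r - tilt_c) @ rot_y(pan_r - pan_c)  # relative rotation
    return intrinsics(f_r, cx, cy) @ R_delta @ np.linalg.inv(intrinsics(f_c, cx, cy))
```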
  • A computer may be mounted on the camera 11, and all or part of the functions of the monitoring unit 252 may be implemented on that computer.
  • Similarly, a computer may be mounted on the camera 12, and all or part of the functions of the monitoring unit 253 may be implemented on that computer.
  • A computer connected to the cameras 11 and 12 via a network may be configured to include the operation input unit 22, the screen display unit 23, and the storage unit 24.
  • The monitoring area may be a place other than a passage, such as a store, a factory, a station platform, a playing field, or a gymnasium.
  • The target to be tracked may be a moving object other than a person, such as an animal, a car, or a walking robot.
  • The cameras that share part of their fields of view are not limited to the two cameras 11 and 12; there may be three or more.
  • The movement trajectory information processing device 30 includes calculation means 31 and conversion means 32.
  • The calculation means 31 is configured to calculate movement trajectory information, which is a time series of position information of an object on the images, from a plurality of images of the object photographed continuously by a camera that keeps the moving object within its photographing range.
  • The calculation means 31 can be configured, for example, in the same manner as the tracking unit 2522 in FIG. 2, but is not limited thereto.
  • The conversion means 32 is configured to calculate a mapping function from the image to a reference image, calculate position information on the reference image corresponding to the position information of the object on the image using the calculated mapping function, and convert the obtained position information on the reference image into position information in the world coordinate system using camera calibration information.
  • The conversion means 32 can be configured, for example, in the same manner as the coordinate conversion unit 2523 in FIG. 2, but is not limited thereto.
  • The movement trajectory information processing device 30 configured in this way operates as follows. First, the calculation means 31 calculates the movement trajectory information, which is the time series of position information of the object on the images, from a plurality of images of the object photographed continuously by a camera that keeps the moving object within its photographing range. Next, the conversion means 32 calculates a mapping function from the image to the reference image, calculates position information on the reference image corresponding to the position information of the object on the image using the calculated mapping function, and converts the calculated position information on the reference image into position information in the world coordinate system using the camera calibration information.
  • With the movement trajectory information processing device 30 configured and operating as described above, movement trajectory information, which is a time series of the real-space positions of an object, can be calculated from a plurality of images of the moving object photographed continuously by a camera that keeps the object within its photographing range. This is because, unlike the camera calibration information, the mapping function can be calculated without manual intervention.
  • In addition to monitoring and tracking, the movement trajectory of an object calculated by the present invention may be used for purposes such as personal authentication and abnormal behavior detection based on the movement trajectory.
  • The camera used in the present invention does not necessarily have a tracking function. For example, the present invention can also be applied when a photographer manually changes the orientation of the camera so that the moving object is kept within the shooting range.
  • The present invention can be used for systems that detect and track objects from multiple images captured continuously by a camera, such as a PTZ camera, that keeps moving objects within its shooting range.
[Appendix 1]
A movement trajectory information processing device comprising:
calculation means for calculating movement trajectory information, which is a time series of position information of a moving object on an image, from a plurality of images of the object continuously photographed by a camera that captures the moving object within an imaging range; and
conversion means for calculating a mapping function from the image to a reference image, calculating position information on the reference image corresponding to the position information of the object on the image using the calculated mapping function, and converting the calculated position information on the reference image into position information in a world coordinate system using camera calibration information.
[Appendix 2]
The movement trajectory information processing device according to appendix 1, wherein the conversion means calculates a planar projective transformation matrix from the image to the reference image and calculates the mapping function using the calculated planar projective transformation matrix.
[Appendix 3]
The movement trajectory information processing device according to appendix 1 or 2, further comprising collation means for collating the movement trajectory information with other movement trajectory information.
[Appendix 4]
The movement trajectory information processing device according to appendix 3, wherein the other movement trajectory information is movement trajectory information calculated from a plurality of images continuously captured by another camera installed so that a part of its camera field of view overlaps with that of the camera.
[Appendix 5]
The movement trajectory information processing device according to any one of appendices 1 to 4, wherein the camera is a PTZ camera.
[Appendix 6]
A movement trajectory information processing method comprising:
calculating movement trajectory information, which is a time series of position information of a moving object on an image, from a plurality of images of the object continuously photographed by a camera that captures the moving object within an imaging range; and
calculating a mapping function from the image to a reference image, calculating position information on the reference image corresponding to the position information of the object on the image using the calculated mapping function, and converting the calculated position information on the reference image into position information in a world coordinate system using camera calibration information.
[Appendix 7]
The movement trajectory information processing method according to appendix 6, wherein, in the conversion, a planar projective transformation matrix from the image to the reference image is calculated and the mapping function is calculated using the calculated planar projective transformation matrix.
[Appendix 8]
The movement trajectory information processing method according to appendix 6 or 7, wherein the other movement trajectory information is movement trajectory information calculated from a plurality of images continuously captured by another camera installed so that a part of its camera field of view overlaps with that of the camera.
[Appendix 9]
A computer-readable recording medium recording a program for causing a computer to perform:
a process of calculating movement trajectory information, which is a time series of position information of a moving object on an image, from a plurality of images of the object continuously photographed by a camera that captures the moving object within an imaging range; and
a process of calculating a mapping function from the image to a reference image, calculating position information on the reference image corresponding to the position information of the object on the image using the calculated mapping function, and converting the calculated position information on the reference image into position information in a world coordinate system using camera calibration information.
[Reference signs list]
10 tracking system
11, 12 PTZ cameras
13A to 13D surveillance areas
20 control device
21 communication I/F unit
22 operation input unit
23 screen display unit
24 storage unit
25 arithmetic processing unit
30 movement trajectory information processing device
31 calculation means
32 conversion means
241 program
242-1, 242-2 camera calibration information
243-1, 243-2 reference images
244-1, 244-2 image DBs
245 tracking target movement trajectory information
246 to-be-verified movement trajectory information
251 camera calibration unit
252, 253 monitoring units
2521 detection unit
2522, 2532 tracking units
2523, 2533 coordinate conversion units
2531 matching unit

Abstract

This movement trajectory information processing device comprises a calculation means and a conversion means. The calculation means calculates movement trajectory information, which is a time series of information pertaining to the position of a moving object in an image, from a plurality of images in which the object is continuously captured by a camera that captures the moving object within an imaging range. The conversion means calculates a function for mapping from the image to a reference image, converts the information pertaining to the position of the object in the image to information pertaining to the position of the object in the reference image using the calculated mapping function, and converts the information pertaining to the position of the object in the reference image obtained through this conversion to information pertaining to the position of the object in a world coordinate system using camera calibration information.

Description

Movement trajectory information processing device
The present invention relates to a movement trajectory information processing device, a movement trajectory information processing method, and a recording medium.
An example of a conventional device that calculates movement trajectory information is described in Patent Document 1. The conventional apparatus uses a camera with fixed pan, tilt, and zoom values (PTZ values) to continuously photograph an object such as a moving person. At each photographing, the conventional apparatus extracts the position of the object from the image obtained by the camera and, using camera calibration information (intrinsic and extrinsic parameters), converts the coordinate values of the object on the image into coordinate values in real space. The conventional apparatus uses the time series of the positions of the object in real space obtained in this way as movement trajectory information.
WO2014/010174
A camera with fixed PTZ values has a limited field of view. Therefore, for the purpose of monitoring a wider range, it is important to be able to calculate the movement trajectory information of an object from a plurality of images continuously captured by a camera that keeps the moving object within its imaging range while changing the PTZ values. However, when the PTZ values of a camera change, the camera calibration information for that camera changes, and acquiring the camera calibration information corresponding to the changed PTZ values requires calibration work by an operator. For this reason, with the conventional apparatus, which uses camera calibration information to directly convert the coordinate values of the object on each continuously captured image into coordinate values in real space, it has been difficult to calculate movement trajectory information from a plurality of images of an object continuously captured by a camera that keeps the moving object within its imaging range.
An object of the present invention is to provide a movement trajectory information processing apparatus that solves the above-described problems.
A movement trajectory information processing device according to one aspect of the present invention includes:
calculation means for calculating movement trajectory information, which is a time series of position information of a moving object on an image, from a plurality of images of the object continuously photographed by a camera that captures the moving object within an imaging range; and
conversion means for calculating a mapping function from the image to a reference image, calculating position information on the reference image corresponding to the position information of the object on the image using the calculated mapping function, and converting the calculated position information on the reference image into position information in a world coordinate system using camera calibration information.
A movement trajectory information processing method according to another aspect of the present invention includes:
calculating movement trajectory information, which is a time series of position information of a moving object on an image, from a plurality of images of the object continuously photographed by a camera that captures the moving object within an imaging range; and
calculating a mapping function from the image to a reference image, calculating position information on the reference image corresponding to the position information of the object on the image using the calculated mapping function, and converting the calculated position information on the reference image into position information in a world coordinate system using camera calibration information.
A computer-readable recording medium according to another aspect of the present invention records a program for causing a computer to perform:
a process of calculating movement trajectory information, which is a time series of position information of a moving object on an image, from a plurality of images of the object continuously photographed by a camera that captures the moving object within an imaging range; and
a process of calculating a mapping function from the image to a reference image, calculating position information on the reference image corresponding to the position information of the object on the image using the calculated mapping function, and converting the calculated position information on the reference image into position information in a world coordinate system using camera calibration information.
With the above-described configuration, the present invention can calculate movement trajectory information, which is a time series of the positions of an object in real space, from a plurality of images of the object continuously photographed by a camera that captures the moving object within the photographing range.
FIG. 1 is a schematic diagram showing a configuration example of a tracking system according to one embodiment of the present invention.
FIG. 2 is a block diagram showing an example of the control device in the tracking system according to one embodiment of the present invention.
FIG. 3 is a diagram showing an example of tracking target movement trajectory information used by the control device in the tracking system according to one embodiment of the present invention.
FIG. 4 is a diagram showing an example of to-be-verified movement trajectory information used by the control device in the tracking system according to one embodiment of the present invention.
FIG. 5 is a schematic diagram for explaining an example of a method for converting position information on an image from an image coordinate system to a world coordinate system in the control device in the tracking system according to one embodiment of the present invention.
FIG. 6 is a flowchart showing an example of tracking processing performed by the control device using one camera in the tracking system according to one embodiment of the present invention.
FIG. 7 is a flowchart showing an example of tracking processing performed by the control device using the other camera in the tracking system according to one embodiment of the present invention.
FIG. 8 is a block diagram of an embodiment of the movement trajectory information processing device according to the present invention.
Next, embodiments of the present invention will be described in detail with reference to the drawings. The Japanese original distinguishes two terms of similar meaning, 追跡 and 追尾 (both rendered here as "tracking"); the former is used mainly in parts strongly related to people and the latter mainly in parts strongly related to the device, the distinction being made merely for convenience of explanation.
[First embodiment]
First, to facilitate understanding of the first embodiment of the present invention, the problem assumed by the first embodiment will be described.
Detecting and tracking people in surveillance camera images is useful for crime prevention, marketing, and the like. With a camera whose PTZ values are fixed, the camera's field of view is limited. The field of view can be expanded by changing the camera's PTZ values, but the field of view of a single camera is still limited. Therefore, consider a system in which a plurality of PTZ cameras, sharing parts of their fields of view, are distributed over a monitoring area so that a person can be tracked over a wider range both spatially and temporally. In such a tracking system, it is desirable that the system automatically recognize that a person being tracked by one camera has entered the field of view of another camera, and that the other camera then start tracking that person. To determine, among the many persons in a camera's field of view, the person whose tracking should be taken over, the person's movement trajectory can be used. That is, the tracking target is determined by collating information on the movement trajectory of the person being tracked by one camera (tracking target movement trajectory information) with information on the movement trajectories of the persons within the field of view of the other camera (to-be-verified movement trajectory information). To perform such collation, the tracking target movement trajectory information and the to-be-verified movement trajectory information must be expressed in the same coordinate system.
In general, the coordinate systems used to express the position of a person captured by a camera include an image coordinate system and a camera coordinate system unique to each camera, and a world coordinate system common to multiple cameras. The image coordinate system is a two-dimensional coordinate system on the image sensor, and the position of a point on the image is usually represented in this image coordinate system. The camera coordinate system is a coordinate system determined by the image coordinate system and the intrinsic parameters of the camera. Therefore, the image coordinate system of a camera and the camera coordinate system of that camera can be mutually converted using the camera's intrinsic parameters. The world coordinate system is a coordinate system determined by the image coordinate system and the camera calibration information, which consists of the camera's intrinsic and extrinsic parameters. Therefore, the image coordinate system of a camera and the world coordinate system can be mutually converted using that camera's calibration information.
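As a hedged illustration of these relationships, a standard pinhole-camera sketch follows (not text from the disclosure; the numeric intrinsic and extrinsic values are placeholders):

    import numpy as np

    K = np.array([[800.0, 0.0, 320.0],   # intrinsic parameters (placeholder)
                  [0.0, 800.0, 240.0],
                  [0.0,   0.0,   1.0]])
    R, t = np.eye(3), np.array([0.0, 0.0, 5.0])  # extrinsics (placeholder)

    def world_to_image(X_world):
        """World coordinate system -> camera coordinate system -> image coordinate system."""
        X_cam = R @ X_world + t          # extrinsics: world -> camera
        x = K @ X_cam                    # intrinsics: camera -> image (homogeneous)
        return x[:2] / x[2]

    # For points on the z = 0 plane the projection collapses to a 3x3 matrix,
    # so the reverse mapping (image -> world) also has a closed form.
    H = K @ np.column_stack((R[:, 0], R[:, 1], t))

    def image_to_world_ground(u, v):
        q = np.linalg.inv(H) @ np.array([u, v, 1.0])
        return q[:2] / q[2]              # (X, Y) on the z = 0 world plane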
A conventional device extracts the position of a person from a captured image in the image coordinate system and uses camera calibration information to convert the extracted coordinate values from the image coordinate system into coordinate values in the world coordinate system. However, changing the PTZ values of the camera in order to track a person changes the camera calibration information, and it is difficult to acquire the camera calibration information corresponding to the changed PTZ values in real time. For this reason, it has been difficult for conventional devices to calculate movement trajectory information while tracking an object with a camera having a tracking function that keeps the moving object within the imaging range. As a result, it has been difficult to collate a person's movement trajectory between a plurality of PTZ cameras in real time.
This embodiment solves the above problems. A tracking system 10 according to an embodiment to which the present invention is applied will now be described in detail with reference to the drawings.
FIG. 1 is a schematic diagram showing a configuration example of a tracking system 10 according to one embodiment of the present invention. Referring to FIG. 1, the tracking system 10 is a system for detecting and tracking a person in a surveillance area, and includes two PTZ cameras 11 and 12 and a control device 20.
The PTZ cameras (hereinafter simply referred to as cameras) 11 and 12 each include a solid-state imaging device, such as a CMOS or CCD sensor, and a pan head. In accordance with instructions from the control device 20, the cameras 11 and 12 generate a plurality of images by continuously photographing their imaging ranges, for example at regular intervals. It is preferable that the imaging cycles and timings of the cameras 11 and 12 be the same or similar, but they may differ. The cameras 11 and 12 can change their imaging ranges by changing pan, tilt, and zoom in accordance with commands from the control device 20. Here, "pan" means turning the camera left and right, "tilt" means turning the camera up and down, and "zoom" means changing the angle of view toward telephoto or wide angle. The cameras 11 and 12 are installed dispersedly in the monitoring area so that parts of their imaging ranges overlap. In the example shown in FIG. 1, the camera 11 monitors monitoring areas 13A, 13B, and 13C set on a passage extending in the horizontal direction of the drawing, while the camera 12 monitors the monitoring area 13C and a monitoring area 13D set on a passage extending in the vertical direction of the drawing. The monitoring area 13C corresponds to the corner of the passage; that is, the corner area of the passage is the monitoring area 13C common to the two cameras 11 and 12. On the passage, a person enters from the left side of the drawing, passes through the corner, and exits toward the bottom of the drawing. The monitoring areas 13A to 13D are assumed to be flat surfaces.
The control device 20 is connected to the cameras 11 and 12 by wire or wirelessly. The control device 20 calculates the camera calibration information (intrinsic and extrinsic parameters) of each camera when the cameras 11 and 12 are installed. This camera calibration information is calculated with the PTZ values of the cameras 11 and 12 set to each camera's reference PTZ. For example, the reference PTZ of the camera 11 is set so that the camera's field of view can capture a person in the monitoring area 13A, and the reference PTZ of the camera 12 is set so that the camera's field of view can capture a person in the monitoring area 13C. At the start of system operation, the control device 20 displays, on the screen display unit, images continuously captured by the camera 11 fixed at its reference PTZ. When the operator designates a person to be tracked on the displayed image, the control device 20 tracks the designated person while changing the PTZ values of the camera 11. The control device 20 then calculates, in real time in the world coordinate system, information on the movement trajectory of the person being tracked (tracking target movement trajectory information).
Meanwhile, at the start of system operation, the control device 20 calculates, in real time in the world coordinate system, information on the movement trajectories of all persons in the monitoring area 13C (to-be-verified movement trajectory information) based on images continuously captured by the camera 12 fixed at its reference PTZ. The control device 20 then collates, in real time, the tracking target movement trajectory information of the person being tracked by the camera 11 with the to-be-verified movement trajectory information of all persons captured by the camera 12, and thereby determines which of the persons captured by the camera 12 is the person being tracked by the camera 11. Next, the control device 20 continues tracking the determined person with the camera 12 while keeping the person within the imaging range of the camera 12, and displays the resulting images on the screen display unit.
The above is the outline of the tracking system 10. Next, the control device 20 will be described in detail.
FIG. 2 is a block diagram showing an example of the control device 20. Referring to FIG. 2, the control device 20 includes a communication I/F (interface) unit 21, an operation input unit 22, a screen display unit 23, a storage unit 24, and an arithmetic processing unit 25.
The communication I/F unit 21 is composed of a data communication circuit and is configured to perform data communication with the cameras 11 and 12 and other external devices (not shown) by wire or wirelessly. The operation input unit 22 is composed of an operation input device such as a keyboard and a mouse and is configured to detect an operator's operation and output it to the arithmetic processing unit 25. The screen display unit 23 is composed of a display device such as an LCD (Liquid Crystal Display) and is configured to display images captured by the cameras 11 and 12 and the like.
The storage unit 24 is composed of one or more storage devices of one or more types, such as hard disks and memories, and is configured to store the processing information and the program 241 required for the various processes of the arithmetic processing unit 25. The program 241 realizes various processing units by being read into and executed by the arithmetic processing unit 25; it is read in advance from an external device or a recording medium (not shown) via a data input/output function such as the communication I/F unit 21 and stored in the storage unit 24. The main processing information stored in the storage unit 24 includes camera calibration information 242-1 and 242-2, reference images 243-1 and 243-2, image DBs (databases) 244-1 and 244-2, tracking target movement trajectory information 245, and to-be-verified movement trajectory information 246.
The camera calibration information 242-1 is the camera calibration information of the camera 11 (the intrinsic and extrinsic parameters of the camera 11). The camera calibration information 242-2 is the camera calibration information of the camera 12 (the intrinsic and extrinsic parameters of the camera 12).
The reference image 243-1 is an image of the monitoring area 13A captured by the camera 11 set to the pan, tilt, and zoom values used when the camera calibration information 242-1 was obtained. In addition to the image itself, the reference image 243-1 may include the pan, tilt, and zoom values at the time of capture (the reference PTZ values of the camera 11) and the camera position (the position of the camera 11 in the world coordinate system). Similarly, the reference image 243-2 is an image of the monitoring area 13C captured by the camera 12 set to the pan, tilt, and zoom values used when the camera calibration information 242-2 was obtained, and may include the reference PTZ values of the camera 12 and the camera position (the position of the camera 12 in the world coordinate system).
The image DB 244-1 accumulates the time series of images captured by the camera 11. To each image stored in the image DB 244-1, the camera ID of the camera 11, the shooting time, and the PTZ values at the time of shooting are attached. Likewise, the image DB 244-2 accumulates the time series of images captured by the camera 12, and to each stored image, the camera ID of the camera 12, the shooting time, and the PTZ values at the time of shooting are attached.
The tracking target movement trajectory information 245 is information on the movement trajectory of the tracked person, calculated based on the images captured by the camera 11. FIG. 3 shows a configuration example of the tracking target movement trajectory information 245. In this example, the tracking target movement trajectory information 245 is composed of an entry 2451, which consists of a tracking target ID uniquely identifying the tracking target, the camera ID of the camera 11, and association information, and a plurality of entries 2452 corresponding one-to-one to the images continuously captured by the camera 11. The entries 2452 are linked in a line in order of shooting time by the association information in the entry 2451 and in each entry 2452.
Each entry 2452 consists of a shooting time, a person area, position information (image coordinate system), an image feature, position information (world coordinate system), a movement vector, a movement speed, an acceleration, and association information. The shooting time is the shooting time of the corresponding image. The person area represents, for example, the circumscribed rectangle of the person area in the corresponding image. The position information (image coordinate system) represents the coordinate values, in the image coordinate system, of one point representing the person area in the same entry; this point may be, for example, the center of gravity of the person area, but is not limited thereto and may be the center of gravity of the face, the feet, or the like. The image feature represents a feature amount extracted from the person area in the corresponding image, such as a face feature, a clothing feature, or the person's size, though it is not limited to these. The position information (world coordinate system) is set to the coordinate values obtained by converting the position information (image coordinate system) in the same entry from the image coordinate system to the world coordinate system. The movement vector represents the amount and direction of movement of the tracked person between the corresponding image and an adjacent image, and the movement speed and acceleration represent the movement speed and acceleration of the tracked person. The movement vector, movement speed, and acceleration are calculated, for example, based on the position information (world coordinate system) in the entries 2452 of the corresponding image and an adjacent image. The types of data included in the entry 2452 are an example and are not limited to the above.
The to-be-verified movement trajectory information 246 is information on the movement trajectory of a person extracted from the images captured by the camera 12, and exists for each person so extracted. FIG. 4 shows a configuration example of the to-be-verified movement trajectory information 246. In this example, the to-be-verified movement trajectory information 246 is composed of an entry 2461, which consists of a person ID uniquely identifying the person, the camera ID of the camera 12, and association information, and a plurality of entries 2462 corresponding one-to-one to the images continuously captured by the camera 12. The entries 2462 are linked in a line in order of shooting time by the association information in the entry 2461 and in each entry 2462. Each entry 2462 consists of the same data as the entry 2452 of the tracking target movement trajectory information 245 described with reference to FIG. 3; the types of data included are an example and are not limited to the above.
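As an illustration of the entry layout described above, a minimal sketch follows; the field names follow the description, while the linked-list representation via an index is an assumption of this sketch.

    from dataclasses import dataclass
    from typing import List, Optional, Tuple

    @dataclass
    class TrajectoryEntry:
        shooting_time: float                          # capture time of the image
        person_box: Tuple[int, int, int, int]         # circumscribed rectangle
        pos_image: Tuple[float, float]                # representative point (image coords)
        image_feature: Optional[List[float]]          # e.g. face/clothing features
        pos_world: Optional[Tuple[float, float]]      # filled in after conversion
        motion_vector: Optional[Tuple[float, float]] = None
        speed: Optional[float] = None
        acceleration: Optional[float] = None
        next_entry: Optional[int] = None              # association info linking the
                                                      # entries in shooting-time order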
The arithmetic processing unit 25 has one or more microprocessors, such as MPUs, and their peripheral circuits, and is configured to realize various processing units through the cooperation of this hardware and the program 241 read from the storage unit 24 and executed. The main processing units realized by the arithmetic processing unit 25 are a camera calibration unit 251 and two monitoring units 252 and 253.
The camera calibration unit 251 is configured to acquire the camera calibration information of the cameras 11 and 12 through interactive processing with the operator via the operation input unit 22 and the screen display unit 23. The camera calibration method used is not particularly limited.
The monitoring unit 252 is configured to detect and track, using the camera 11, a person moving within the monitoring areas 13A to 13C. The monitoring unit 252 includes a detection unit 2521, a tracking unit 2522, and a coordinate conversion unit 2523.
The detection unit 2521 is configured to detect a specific person as a tracking target from among all persons present in the monitoring area 13A, based on images of the monitoring area 13A continuously captured by the camera 11 set to the reference PTZ values.
The tracking unit 2522 is configured to track the tracking target detected by the detection unit 2521 with the camera 11 so as to keep it within the imaging range, while calculating information on the movement trajectory of the tracking target in the image coordinate system of the camera 11 and storing it in the storage unit 24 as the tracking target movement trajectory information 245. Since the tracking unit 2522 changes the PTZ values of the camera 11 to track the target, it can keep the target within the imaging range even if the target moves from the monitoring area 13A to the monitoring area 13B or 13C.
The coordinate conversion unit 2523 is configured to convert the tracking target movement trajectory information 245 calculated by the tracking unit 2522 from the image coordinate system of the camera 11 to the world coordinate system. That is, the coordinate conversion unit 2523 converts position information on the image from the image coordinate system to the world coordinate system. Here, the coordinate conversion unit 2523 may use the plane (common map) at z = 0 (that is, height 0) as the world coordinate system. Compared with converting points in the image coordinate system into a three-dimensional world coordinate system, conversion into a two-dimensional world coordinate system reduces the amount of computation and the error. However, the coordinate conversion unit 2523 may instead be configured to convert points in the image coordinate system into a three-dimensional world coordinate system.
FIG. 5 is a schematic diagram for explaining an example of a method for converting position information on an image from the image coordinate system to the world coordinate system. In FIG. 5, the image I0 is an image captured by the camera 11 set to the same PTZ values as when the camera calibration information 242-1 was calculated. The images I1, I2, and I3 are a plurality of images captured continuously by the camera 11 while changing the PTZ values after the image I0 was captured.
The coordinate conversion unit 2523 converts position information on the image I0 from the image coordinate system to the world coordinate system using the camera calibration information 242-1.
To convert position information on the image I1 to the world coordinate system, the coordinate conversion unit 2523 first converts the position information on I1 into the corresponding position information on I0 using the mapping function f1, and then converts the resulting position information on I0 from the image coordinate system to the world coordinate system using the camera calibration information 242-1. The coordinate conversion unit 2523 calculates the mapping function f1 as the planar projective transformation matrix H1 from the image I1 to the image I0; that is, f1 = H1. To calculate H1, the coordinate conversion unit 2523 obtains the corresponding points between the images I1 and I0 using a known feature point extraction method, and calculates H1 from those corresponding points. The corresponding points between I1 and I0 are, in other words, pairs of corresponding feature points in the two images.
Similarly, to convert position information on the image I2 to the world coordinate system, the coordinate conversion unit 2523 first converts the position information on I2 into the corresponding position information on I0 using the mapping function f2, and then converts the result from the image coordinate system to the world coordinate system using the camera calibration information 242-1. The coordinate conversion unit 2523 obtains the mapping function f2 from the mapping function f1 and the planar projective transformation matrix H2 from the image I2 to the image I1; that is, f2 = f1 * H2. The matrix H2 is calculated from the corresponding points between the images I2 and I1, obtained using a known feature point extraction method.
Likewise, to convert position information on the image I3 to the world coordinate system, the coordinate conversion unit 2523 first converts the position information on I3 into the corresponding position information on I0 using the mapping function f3, and then converts the result from the image coordinate system to the world coordinate system using the camera calibration information 242-1. The coordinate conversion unit 2523 obtains the mapping function f3 from the mapping function f2 and the planar projective transformation matrix H3 from the image I3 to the image I2; that is, f3 = f2 * H3. The matrix H3 is calculated from the corresponding points between the images I3 and I2, obtained using a known feature point extraction method.
The method of obtaining the mapping function is not limited to the above. For example, the coordinate conversion unit 2523 may calculate the mapping function f3 as f3 = f1 * H31, using the mapping function f1 and the planar projective transformation matrix H31 from the image I3 to the image I1. Alternatively, it may calculate f3 directly as f3 = H30, using the planar projective transformation matrix H30 from the image I3 to the image I0. Note, however, that while the common area between adjacent images captured continuously by the camera 11 while tracking a moving person is relatively large, the common area between two images tends to shrink the farther apart in time they are. When the common area is small, it becomes difficult to obtain the corresponding points between the two images, and when the difference between the two images is large, the corresponding points may not be obtained accurately even if a common area exists. It is therefore desirable to determine the shooting time interval between the two images used to obtain a planar projective transformation matrix with these circumstances in mind.
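A hedged sketch of this chaining follows, assuming OpenCV's ORB features and cv2.findHomography for the corresponding-point step; the description does not fix a particular feature point extraction method, and the inputs are assumed to be grayscale NumPy arrays.

    import cv2
    import numpy as np

    orb = cv2.ORB_create()
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

    def homography(img_from, img_to):
        """Planar projective transformation matrix mapping img_from onto img_to."""
        k1, d1 = orb.detectAndCompute(img_from, None)
        k2, d2 = orb.detectAndCompute(img_to, None)
        matches = matcher.match(d1, d2)   # corresponding point pairs
        src = np.float32([k1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
        dst = np.float32([k2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
        H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
        return H

    def chain_mapping_functions(reference_image, images):
        """Yield f_k (image I_k -> reference image I_0) as f_k = f_{k-1} * H_k."""
        f, prev = np.eye(3), reference_image
        for img in images:                # I_1, I_2, I_3, ...
            H_k = homography(img, prev)   # I_k -> I_{k-1}
            f = f @ H_k                   # compose with the accumulated mapping
            prev = img
            yield f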
Referring again to FIG. 2, the monitoring unit 253 is configured to use the camera 12 to detect, from among the persons in the monitoring area 13C, the person being tracked by the camera 11, and to track the detected person in the monitoring areas 13C and 13D. The monitoring unit 253 includes a matching unit 2531, a tracking unit 2532, and a coordinate conversion unit 2533.
The matching unit 2531 is configured to calculate, in the image coordinate system of the camera 12, information on the movement trajectory of each person moving in the monitoring area 13C (the to-be-verified movement trajectory information 246) from a plurality of images of the monitoring area 13C continuously captured by the camera 12 set to the reference PTZ values.
The coordinate conversion unit 2533 is configured to convert the to-be-verified movement trajectory information 246 calculated by the matching unit 2531 from the image coordinate system of the camera 12 to the world coordinate system using the camera calibration information 242-2. Like the coordinate conversion unit 2523, the coordinate conversion unit 2533 uses the plane (common map) at z = 0 (that is, height 0) as the world coordinate system.
The matching unit 2531 is also configured to collate the to-be-verified movement trajectory information 246 converted into the world coordinate system with the tracking target movement trajectory information 245 of the person being tracked by the monitoring unit 252, and, based on the result of the collation, to determine which of the persons moving in the monitoring area 13C captured by the camera 12 is the person being tracked by the camera 11.
An example of the method by which the matching unit 2531 collates the tracking target movement trajectory information 245 with the to-be-verified movement trajectory information 246 will now be described.
First, the matching unit 2531 takes the latest N frames of each trajectory as the object of collation, where N is a predetermined value of 2 or more. The matching unit 2531 therefore extracts N entries 2452 in order from the last entry of the tracking target movement trajectory information 245 and uses them as the trajectory information of the tracking target; if N entries 2452 do not exist, it does not perform collation at that point. Similarly, from each piece of to-be-verified movement trajectory information 246, the matching unit 2531 extracts N entries 2462 in order from the last entry and uses them as the trajectory information to be verified. Any to-be-verified movement trajectory information 246 without N entries 2462 is not collated at that point, and if no to-be-verified movement trajectory information 246 with N entries 2462 exists, no collation is performed at that point.
Next, the matching unit 2531 collates the trajectory information of the tracking target with each piece of trajectory information to be verified, based on a shape matching degree and a direction matching degree.
In calculating the shape matching degree, the matching unit 2531 first normalizes each trajectory by its center of gravity in order to eliminate the influence of absolute position. Here, the trajectory of the tracking target is obtained by connecting, with line segments in order of shooting time, the position information (world coordinate system) in the N entries 2452 constituting the tracking target's trajectory information; the trajectory to be verified is obtained in the same way from the N entries 2462. The matching unit 2531 then obtains the error between the positions of the two trajectories at the same times, for example by the least squares method, and calculates as the shape matching degree a value that becomes higher as this error becomes smaller.
In calculating the direction matching degree, the matching unit 2531 calculates as the direction matching degree a value that becomes higher as the sum of the errors between the movement vectors of the two trajectories at the same times becomes smaller.
Next, the matching unit 2531 calculates the collation result from the shape matching degree and the direction matching degree. For example, the matching unit 2531 may use the product of the shape matching degree and the direction matching degree as the collation result, or it may use their sum.
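A hedged sketch of this scoring follows, assuming NumPy and taking the two trajectories as (N, 2) arrays of world coordinates sampled at the same times; the exact score formulas are illustrative, since the description only requires values that grow as the errors shrink.

    import numpy as np

    def match_score(track, candidate):
        """track, candidate: (N, 2) arrays of world-coordinate positions."""
        # Shape matching degree: normalize by the centroid to remove the
        # influence of absolute position, then compare positions pointwise.
        a = track - track.mean(axis=0)
        b = candidate - candidate.mean(axis=0)
        shape_err = np.mean(np.sum((a - b) ** 2, axis=1))  # least-squares error
        shape_score = 1.0 / (1.0 + shape_err)

        # Direction matching degree: compare per-step movement vectors.
        va, vb = np.diff(track, axis=0), np.diff(candidate, axis=0)
        dir_err = np.sum(np.linalg.norm(va - vb, axis=1))
        dir_score = 1.0 / (1.0 + dir_err)

        return shape_score * dir_score    # or shape_score + dir_score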
The above is an example of the method by which the matching unit 2531 collates the tracking target movement trajectory information 245 with the to-be-verified movement trajectory information 246. The collation method is not limited to this; for example, the matching unit 2531 may also take into account the image features, movement speeds, and accelerations included in the entries 2452 and 2462 when collating the two pieces of trajectory information.
Based on the results of collating the tracking target movement trajectory information 245 with the one or more pieces of to-be-verified movement trajectory information 246 as described above, the matching unit 2531 is configured to determine which of the persons moving in the monitoring area 13C captured by the camera 12 is the person being tracked by the camera 11. For example, if the highest collation result is equal to or greater than a certain threshold, the matching unit 2531 determines that the person associated with the to-be-verified movement trajectory information 246 that produced that result is the person being tracked by the camera 11; otherwise, it determines that the person being tracked by the camera 11 is not among the persons captured by the camera 12.
The tracking unit 2532 is configured to track, with the camera 12, the tracking target determined by the matching unit 2531 (the person captured by the camera 12 who is being tracked by the camera 11) so as to keep the target within the imaging range. Since the tracking unit 2532 changes the PTZ values of the camera 12 to track the target, it can keep the target within the imaging range even if the target moves from the monitoring area 13C to the monitoring area 13D.
Next, the operation of the control device 20 will be described. First, the camera calibration performed before system operation will be described.
The camera calibration unit 251 of the control device 20 calibrates the cameras 11 and 12 at any time before system operation, such as when the cameras 11 and 12 are installed.
To calibrate the camera 11, the camera calibration unit 251 first adjusts the PTZ values of the camera 11 so that the entire monitoring area 13A can be captured. Next, the camera calibration unit 251 calculates the camera calibration information of the camera 11 using a predetermined camera calibration method, through interactive processing with the operator via the operation input unit 22 and the screen display unit 23, and stores the result in the storage unit 24 as the camera calibration information 242-1. The camera calibration unit 251 also acquires the PTZ values of the camera 11 at the time of calibration as its reference PTZ values, acquires an image of the monitoring area 13A captured by the camera 11 set to those values as a reference image, attaches the reference PTZ values to it, and stores it in the storage unit 24 as the reference image 243-1.
To calibrate the camera 12, the camera calibration unit 251 first adjusts the PTZ values of the camera 12 so that the entire monitoring area 13C (the area common with the camera 11) can be captured. Next, the camera calibration unit 251 calculates the camera calibration information of the camera 12 using a predetermined camera calibration method, through interactive processing with the operator via the operation input unit 22 and the screen display unit 23, and stores the result in the storage unit 24 as the camera calibration information 242-2. The camera calibration unit 251 also acquires the PTZ values of the camera 12 at the time of calibration as its reference PTZ values, acquires an image of the monitoring area 13C captured by the camera 12 set to those values as a reference image, attaches the reference PTZ values to it, and stores it in the storage unit 24 as the reference image 243-2.
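The description leaves the calibration method open; as one common possibility, a checkerboard-based sketch with OpenCV follows. The board size, square size, and the cv2.calibrateCamera workflow are assumptions of this example, not part of the disclosure.

    import cv2
    import numpy as np

    def calibrate(images, board=(9, 6), square=0.025):
        """Return intrinsics K, distortion, and per-view extrinsics (rvecs, tvecs)."""
        objp = np.zeros((board[0] * board[1], 3), np.float32)
        objp[:, :2] = np.mgrid[0:board[0], 0:board[1]].T.reshape(-1, 2) * square
        obj_pts, img_pts, size = [], [], None
        for img in images:
            gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
            size = gray.shape[::-1]
            found, corners = cv2.findChessboardCorners(gray, board)
            if found:
                obj_pts.append(objp)      # known 3D board points
                img_pts.append(corners)   # detected 2D image points
        _, K, dist, rvecs, tvecs = cv2.calibrateCamera(
            obj_pts, img_pts, size, None, None)
        return K, dist, rvecs, tvecs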
Next, the operation of the control device 20 during system operation will be described. FIG. 6 is a flowchart showing an example of the tracking process performed using the camera 11. The tracking process performed by the control device 20 using the camera 11 is described below with reference to FIG. 6.
First, the detection unit 2521 in the monitoring unit 252 of the control device 20 performs initialization (step S11). In this initialization, the detection unit 2521 sets the camera 11 to the reference PTZ value and clears all entries of the tracked-object movement trajectory information 245. Next, the detection unit 2521 acquires an image of the monitoring area 13A photographed by the camera 11 set to the reference PTZ value, saves it in the image DB 244-1, and displays it on the screen display unit 23 (step S12). Next, the detection unit 2521 detects all persons in the saved image using techniques such as pattern recognition or machine learning (step S13), and displays the detection result on the screen display unit 23 (step S14). For example, the detection unit 2521 generates an image in which the circumscribed rectangle of each detected person is superimposed on the original image and displays it on the screen display unit 23. Next, the detection unit 2521 determines whether a person to be tracked has been designated via the operation input unit 22 (step S15). The operator can designate the person to be tracked, for example, by clicking that person's rectangle on the image of the camera 11 displayed on the screen display unit 23. If no person to be tracked is designated, the detection unit 2521 returns to step S12 and repeats the processing described above. When a person to be tracked is designated, the detection unit 2521 updates the tracked-object movement trajectory information 245 (step S16).
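The detector used in step S13 is likewise left open ("pattern recognition, machine learning, or other techniques"). As one hedged possibility, a classical HOG pedestrian detector from OpenCV could fill that role; the function name and the parameters below are assumptions.

    # Hedged sketch of step S13: detect all persons in the stored frame.
    import cv2

    hog = cv2.HOGDescriptor()
    hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

    def detect_persons(frame):
        # Returns circumscribed rectangles (x, y, w, h) of detected persons.
        rects, _weights = hog.detectMultiScale(frame, winStride=(8, 8))
        return list(rects)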
In updating the tracked-object movement trajectory information 245 in step S16, the detection unit 2521 sets, in the entry 2451, the ID assigned to the tracked person designated by the operator and the camera ID of the camera 11. The detection unit 2521 also secures one empty entry 2452 and sets association information in both the secured entry 2452 and the entry 2451, thereby associating the two entries with each other. Next, the detection unit 2521 focuses on the secured entry 2452 and sets, in this entry of interest, the shooting time, the person region, the position information (image coordinate system), and the image features, while setting the position information (world coordinate system), the movement vector, the movement speed, and the acceleration to NULL.
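For concreteness, the fields listed above for an entry 2452 could be laid out as in the following sketch; every field name is an assumption introduced for illustration only.

    # Illustrative layout of one trajectory entry (entry 2452).
    from dataclasses import dataclass
    from typing import Optional, Tuple
    import numpy as np

    @dataclass
    class TrajectoryEntry:
        captured_at: float                       # shooting time
        person_rect: Tuple[int, int, int, int]   # person region (x, y, w, h)
        pos_image: Tuple[float, float]           # position (image coordinate system)
        features: np.ndarray                     # image features for matching
        pos_world: Optional[Tuple[float, float]] = None  # NULL until step S17/S23
        move_vector: Optional[np.ndarray] = None
        speed: Optional[float] = None
        acceleration: Optional[float] = None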
Next, the coordinate conversion unit 2523 of the monitoring unit 252 performs coordinate conversion on the entry 2452 of interest in the tracked-object movement trajectory information 245 (step S17). In step S17, the coordinate conversion unit 2523 first converts the position information (image coordinate system) set in the entry of interest from the image coordinate system to the world coordinate system using the camera calibration information 242-1, and then sets the resulting position information (world coordinate system) in the entry 2452 of interest.
Next, the tracking unit 2522 of the monitoring unit 252 performs tracking control (step S18). For example, the tracking unit 2522 generates a command for controlling the PTZ value of the camera 11 according to the position information (image coordinate system) set in the entry 2452 of interest, and transmits the command to the camera 11 through the communication I/F unit 21. At this time, the tracking unit 2522 may, for example, adjust the pan and tilt so that the centroid of the circumscribed rectangle of the tracked person, represented by the position information (image coordinate system) of the entry of interest, is displayed at the center of the image, and adjust the zoom so that the entire circumscribed rectangle fits within a predetermined angle of view. The camera 11 changes its imaging range by changing the pan, tilt, and zoom in response to the command.
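The text does not specify how the PTZ command is derived from the rectangle. Under a small-angle pinhole model, the centering-and-fitting rule above could be sketched as follows; the field-of-view values, the fill fraction, and the function name are assumptions.

    # Hedged sketch of step S18: derive a PTZ command from the target rectangle.
    def ptz_command(rect, img_w, img_h, fov_deg=(60.0, 35.0), fill=0.5):
        x, y, w, h = rect
        cx, cy = x + w / 2.0, y + h / 2.0
        # Pan/tilt offsets (degrees) that move the centroid to the image centre.
        d_pan = (cx - img_w / 2.0) / img_w * fov_deg[0]
        d_tilt = (cy - img_h / 2.0) / img_h * fov_deg[1]
        # Zoom ratio that makes the rectangle occupy `fill` of the frame,
        # keeping the whole rectangle within the angle of view.
        zoom_ratio = min(fill * img_w / w, fill * img_h / h)
        return d_pan, d_tilt, zoom_ratio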
Next, the tracking unit 2522 acquires an image photographed by the camera 11 set to the changed PTZ value, saves it in the image DB 244-1, and displays it on the screen display unit 23 (step S19). The tracking unit 2522 then detects the person being tracked in the saved image using techniques such as pattern recognition or machine learning (step S20), generates an image in which the circumscribed rectangle of the detected person is superimposed on the original image, and displays it on the screen display unit 23 (step S21). This allows the operator to check, in real time, the tracking status of the person designated on the image displayed on the screen display unit 23.
Next, the tracking unit 2522 updates the tracked-object movement trajectory information 245 (step S22). In this update, the tracking unit 2522 first secures one empty entry 2452, sets association information in the secured empty entry 2452 and the current entry of interest, and then moves its attention to the secured entry. Next, the tracking unit 2522 sets, in the newly focused entry 2452, the shooting time, the person region, the position information (image coordinate system), and the image features, while setting the position information (world coordinate system), the movement vector, the movement speed, and the acceleration to NULL.
Next, the coordinate conversion unit 2523 of the monitoring unit 252 converts the position information (image coordinate system) set in the entry 2452 of interest from the image coordinate system to the world coordinate system (step S23). The image corresponding to the entry of interest was photographed by the camera 11 set to a PTZ value different from that of the reference image. Therefore, as described with reference to FIG. 5, the coordinate conversion unit 2523 first calculates a mapping function that converts the position information (image coordinate system) set in the entry of interest into position information on the reference image, and performs that conversion using the calculated mapping function. The coordinate conversion unit 2523 then converts the resulting position information on the reference image from the image coordinate system to the world coordinate system using the camera calibration information 242-1. In step S23, the coordinate conversion unit 2523 also calculates the movement vector, the movement speed, and the acceleration based on the position information (world coordinate system) of the entry of interest and of the preceding entries 2452, and sets them in the entry of interest.
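A minimal sketch of step S23 follows. It assumes the mapping function is an image-to-image homography estimated from ORB feature matches (the text does not fix the estimator; the planar projective transformation matrix of the modification described later is one admissible choice), and that, for a target moving on the ground plane, the camera calibration information reduces to a single reference-image-to-world homography H_ref2world. All names, and the kinematics helper for the movement vector, speed, and acceleration, are assumptions.

    # Hedged sketch of step S23: image -> reference image -> world coordinates.
    import cv2
    import numpy as np

    orb = cv2.ORB_create(1000)
    bf = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

    def image_to_reference_homography(frame, reference):
        g1 = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        g2 = cv2.cvtColor(reference, cv2.COLOR_BGR2GRAY)
        k1, d1 = orb.detectAndCompute(g1, None)
        k2, d2 = orb.detectAndCompute(g2, None)
        matches = sorted(bf.match(d1, d2), key=lambda m: m.distance)[:200]
        src = np.float32([k1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
        dst = np.float32([k2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
        H, _mask = cv2.findHomography(src, dst, cv2.RANSAC, 3.0)
        return H  # mapping function: current image -> reference image

    def to_world(pt_image, H_img2ref, H_ref2world):
        p = np.float32([[pt_image]])                    # shape (1, 1, 2)
        p_ref = cv2.perspectiveTransform(p, H_img2ref)  # position on reference image
        p_world = cv2.perspectiveTransform(p_ref, H_ref2world)
        return tuple(p_world[0, 0])

    def kinematics(p_now, p_prev, p_prev2, dt):
        v = np.subtract(p_now, p_prev) / dt             # movement vector
        v_prev = np.subtract(p_prev, p_prev2) / dt
        a = (v - v_prev) / dt                           # acceleration
        return v, float(np.linalg.norm(v)), a           # vector, speed, acceleration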
Next, the tracking unit 2522 of the monitoring unit 252 determines whether to end tracking (step S24). For example, the tracking unit 2522 may determine to end tracking when it detects that the tracked person is no longer present in the monitoring areas 13A, 13B, and 13C, or when it detects that the tracked person can no longer be tracked by the camera 11. If the tracking unit 2522 does not determine to end tracking, it returns to step S18 and repeats the processing described above. As a result, tracking of the person by the camera 11 continues, and the tracked-object movement trajectory information 245 is updated accordingly.
When the tracking unit 2522 determines to end tracking, it determines whether to end monitoring (step S25). For example, the tracking unit 2522 determines to end monitoring when the operator inputs a monitoring-end command through the operation input unit 22. If it is not determined to end monitoring, the monitoring unit 252 returns to step S11 and repeats the processing described above. If it is determined to end monitoring, the monitoring unit 252 ends the processing shown in FIG. 6.
FIG. 7 is a flowchart showing an example of the tracking process performed using the camera 12. The tracking process performed by the control device 20 using the camera 12 is described below with reference to FIG. 7.
First, the matching unit 2531 in the monitoring unit 253 of the control device 20 performs initialization (step S31). In this initialization, the matching unit 2531 sets the camera 12 to the reference PTZ value and clears all entries of the to-be-matched movement trajectory information 246. Next, the matching unit 2531 acquires an image of the monitoring area 13C photographed by the camera 12 set to the reference PTZ value, saves it in the image DB 244-2, and displays it on the screen display unit 23 (step S32). Next, the matching unit 2531 detects all persons in the saved image using techniques such as pattern recognition or machine learning (step S33), and displays the detection result on the screen display unit 23 (step S34).
Next, the matching unit 2531 updates the to-be-matched movement trajectory information 246 based on the person detection result (step S35). Since every person detected in the first image acquired from the camera 12 after initialization is a newly detected person, the matching unit 2531 allocates, in step S35, one piece of to-be-matched movement trajectory information 246 to each detected person and performs the following processing for each allocated piece.
First, the matching unit 2531 sets, in the entry 2461 of the to-be-matched movement trajectory information 246, the person ID assigned to the detected person and the camera ID of the camera 12. Next, the matching unit 2531 secures one empty entry 2462, sets information associating the secured entry 2462 with the entry 2461, and then focuses on the secured entry 2462. Next, the matching unit 2531 sets, in the entry 2462 of interest, the shooting time, the person region, the position information (image coordinate system), and the image features, while setting the position information (world coordinate system), the movement vector, the movement speed, and the acceleration to NULL.
Next, the coordinate conversion unit 2533 of the monitoring unit 253 performs the following processing on the entry 2462 of interest (step S36). First, the coordinate conversion unit 2533 converts the position information (image coordinate system) set in the entry of interest from the image coordinate system to the world coordinate system using the camera calibration information 242-2. The coordinate conversion unit 2533 then sets the resulting position information as the position information (world coordinate system) of the entry 2462 of interest.
Next, the matching unit 2531 of the monitoring unit 253 matches the tracked-object movement trajectory information 245 against one or more pieces of to-be-matched movement trajectory information 246 (step S37). Through this matching, the matching unit 2531 determines which of the persons moving in the monitoring area 13C photographed by the camera 12 is the person being tracked by the camera 11. Next, the matching unit 2531 determines whether the person being tracked by the camera 11 has been successfully detected (step S38). If the detection is not successful, the matching unit 2531 returns to step S32 and repeats the processing described above. If the detection is successful, the matching unit 2531 passes to the tracking unit 2532 the person ID of the person determined to be the one being tracked by the camera 11.
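The criterion used in step S37 is not fixed by the text. One plausible reading, sketched below, compares trajectories only by their world-coordinate positions (image features could equally be combined in), scoring each candidate by the mean planar distance between time-aligned samples; the distance threshold is an assumption.

    # Hedged sketch of step S37: match the camera-11 target trajectory against
    # camera-12 candidate trajectories in the world coordinate system.
    import numpy as np

    def match_trajectories(target, candidates, max_mean_dist=0.5):
        # target: list of (t, x, y); candidates: dict person_id -> list of (t, x, y).
        best_id, best_score = None, float("inf")
        for cid, traj in candidates.items():
            dists = []
            for t, x, y in traj:
                # Nearest-in-time sample of the target trajectory.
                _tt, tx, ty = min(target, key=lambda s: abs(s[0] - t))
                dists.append(np.hypot(x - tx, y - ty))
            score = float(np.mean(dists)) if dists else float("inf")
            if score < best_score:
                best_id, best_score = cid, score
        return best_id if best_score <= max_mean_dist else None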
The tracking unit 2532 focuses on the last entry 2462 of the to-be-matched movement trajectory information 246 whose entry 2461 holds the person ID received from the matching unit 2531, and tracks the target person using the camera 12. For example, the tracking unit 2532 generates a command for controlling the PTZ value of the camera 12 according to the position information (image coordinate system) set in the entry 2462 of interest, and transmits the command to the camera 12 through the communication I/F unit 21. At this time, the tracking unit 2532 may, for example, adjust the pan and tilt so that the centroid of the circumscribed rectangle of the tracked person, represented by the position information (image coordinate system) of the entry of interest, is displayed at the center of the image, and adjust the zoom so that the entire circumscribed rectangle fits within a predetermined angle of view. The camera 12 changes its imaging range by changing the pan, tilt, and zoom in response to the command. In step S39, the tracking unit 2532 also acquires an image photographed by the camera 12 after the PTZ change and performs processing such as detecting the target person and displaying the image on the screen display unit 23.
Further, the tracking unit 2532 determines whether to end tracking (step S40). For example, the tracking unit 2532 may determine to end tracking when it detects that the tracked person is no longer present in the monitoring areas 13C and 13D, or when it detects that the tracked person can no longer be tracked by the camera 12. If the tracking unit 2532 does not determine to end tracking, it returns to step S38 and repeats the processing described above, so that tracking of the person by the camera 12 continues. When the tracking unit 2532 determines to end tracking, it determines whether to end monitoring (step S41). For example, the tracking unit 2532 determines to end monitoring when the operator inputs a monitoring-end command through the operation input unit 22. If it is not determined to end monitoring, the monitoring unit 253 returns to step S31 and repeats the processing described above. If it is determined to end monitoring, the monitoring unit 253 ends the processing shown in FIG. 7.
As described above, according to the present embodiment, the tracking unit 2522 acquires the position information (image coordinate system) of the tracked person from a plurality of images photographed continuously while the person is tracked by the camera 11, which has a tracking function for keeping the moving person within the imaging range, and calculates each entry 2452 of the tracked-object movement trajectory information 245. The coordinate conversion unit 2523 calculates a mapping function from the image corresponding to each entry 2452 photographed by the camera 11 to the reference image, converts the position information (image coordinate system) of each entry 2452 into position information on the reference image using the calculated mapping function, and converts the resulting position information on the reference image from the image coordinate system to the world coordinate system using the camera calibration information 242-1. Unlike camera calibration information, the mapping function can be calculated without manual intervention. Therefore, according to the present embodiment, movement trajectory information, which is a time series of the person's position in real space, can be calculated while the person is being tracked by a PTZ camera having a tracking function for keeping the moving person within the imaging range. As a result, the movement trajectories of a person can be matched between a plurality of PTZ cameras in real time, enabling automatic matching of a moving object between PTZ cameras.
Next, modifications of the present embodiment will be described.
The coordinate conversion unit 2523 may calculate the planar projective transformation matrix from an image photographed by the camera 11 to the reference image based on the PTZ value of the camera 11 at the time the image was photographed and on the reference PTZ value.
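For a camera rotating about its optical center, this modification admits a closed form: the homography from the current image to the reference image is K_ref * R^T * K_cur^(-1), where K is the intrinsic matrix built from the zoom-dependent focal length and R is the rotation given by the pan/tilt offsets from the reference PTZ value. The sketch below illustrates this under that pure-rotation assumption; the focal-length model f_of_zoom and all names are assumptions.

    # Hedged sketch: planar projective transformation matrix from PTZ values.
    import numpy as np

    def K(f, cx, cy):
        return np.array([[f, 0, cx], [0, f, cy], [0, 0, 1.0]])

    def rot(pan_rad, tilt_rad):
        cp, sp = np.cos(pan_rad), np.sin(pan_rad)
        ct, st = np.cos(tilt_rad), np.sin(tilt_rad)
        R_pan = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
        R_tilt = np.array([[1, 0, 0], [0, ct, -st], [0, st, ct]])
        return R_tilt @ R_pan

    def homography_from_ptz(ptz_cur, ptz_ref, cx, cy, f_of_zoom):
        pan, tilt, zoom = ptz_cur
        pan0, tilt0, zoom0 = ptz_ref
        R = rot(np.radians(pan - pan0), np.radians(tilt - tilt0))
        K_cur = K(f_of_zoom(zoom), cx, cy)
        K_ref = K(f_of_zoom(zoom0), cx, cy)
        return K_ref @ R.T @ np.linalg.inv(K_cur)  # current image -> reference image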
A computer may be mounted in the camera 11, and all or part of the functions of the monitoring unit 252 may be implemented on that computer. Similarly, a computer may be mounted in the camera 12, and all or part of the functions of the monitoring unit 253 may be implemented on that computer. Further, the operation input unit 22, the screen display unit 23, and the storage unit 24 may be implemented on a computer connected to the cameras 11 and 12 through a network.
The monitoring area may be a place other than a passage, for example, a store, a factory, a station platform, a sports ground, or a gymnasium.
The target to be tracked may be a moving object other than a person, for example, an animal, an automobile, or a walking robot.
The cameras sharing part of their fields of view are not limited to the two cameras 11 and 12; there may be three or more such cameras.
[Second Embodiment]
Next, a second embodiment of the present invention will be described in detail with reference to the drawings.
Referring to FIG. 8, a movement trajectory information processing device 30 according to the present embodiment includes calculation means 31 and conversion means 32.
The calculation means 31 is configured to calculate movement trajectory information, which is a time series of position information of a moving object on an image, from a plurality of images of the object photographed continuously by a camera that keeps the moving object within its imaging range. The calculation means 31 can be configured, for example, in the same manner as the tracking unit 2522 in FIG. 2, but is not limited thereto.
The conversion means 32 is configured to calculate a mapping function from the image to a reference image, calculate position information on the reference image corresponding to the position information of the object on the image using the calculated mapping function, and convert the calculated position information on the reference image into position information in the world coordinate system using camera calibration information. The conversion means 32 can be configured, for example, in the same manner as the coordinate conversion unit 2523 in FIG. 2, but is not limited thereto.
The movement trajectory information processing device 30 configured as described above operates as follows. First, the calculation means 31 calculates movement trajectory information, which is a time series of position information of a moving object on an image, from a plurality of images of the object photographed continuously by a camera that keeps the moving object within its imaging range. Next, the conversion means 32 calculates a mapping function from the image to the reference image, calculates position information on the reference image corresponding to the position information of the object on the image using the calculated mapping function, and converts the calculated position information on the reference image into position information in the world coordinate system using the camera calibration information.
According to the movement trajectory information processing device 30 configured and operating as described above, movement trajectory information, which is a time series of the object's position in real space, can be calculated from a plurality of images of the object photographed continuously by a camera that keeps the moving object within its imaging range. This is because, unlike camera calibration information, the mapping function can be calculated without manual intervention.
Although the present invention has been described above with reference to the embodiments, the present invention is not limited to the embodiments described above. Various changes that can be understood by those skilled in the art can be made to the configuration and details of the present invention within the scope of the present invention.
For example, the movement trajectory of an object calculated according to the present invention may be used not only for monitoring and tracking but also for purposes such as personal authentication and abnormal-behavior detection based on the movement trajectory. In addition, the camera used in the present invention does not necessarily need to have a tracking function. For example, the present invention is also applicable to the case where the movement trajectory information of an object is calculated from a plurality of images photographed continuously while a photographer manually changes the attitude of the camera so as to keep the moving object within the imaging range.
The present invention can be used, for example, in a system that detects and tracks an object from a plurality of images of the object photographed continuously by a camera, such as a PTZ camera, that keeps the moving object within its imaging range.
Some or all of the above embodiments may also be described as in the following supplementary notes, but are not limited thereto.
[Appendix 1]
A movement trajectory information processing device comprising:
calculation means for calculating movement trajectory information, which is a time series of position information of a moving object on an image, from a plurality of images of the object photographed continuously by a camera that keeps the moving object within an imaging range; and
conversion means for calculating a mapping function from the image to a reference image, calculating, using the calculated mapping function, position information on the reference image corresponding to the position information of the object on the image, and converting the calculated position information on the reference image into position information in a world coordinate system using camera calibration information.
[Appendix 2]
The movement trajectory information processing device according to Appendix 1, wherein the conversion means calculates a planar projective transformation matrix from the image to the reference image and calculates the mapping function using the calculated planar projective transformation matrix.
[Appendix 3]
The movement trajectory information processing device according to Appendix 1 or 2, further comprising matching means for matching the movement trajectory information against other movement trajectory information.
[Appendix 4]
The movement trajectory information processing device according to Appendix 3, wherein the other movement trajectory information is movement trajectory information calculated from a plurality of images photographed continuously by another camera installed such that its field of view partially overlaps that of the camera.
[Appendix 5]
The movement trajectory information processing device according to any one of Appendices 1 to 4, wherein the camera is a PTZ camera.
[Appendix 6]
A movement trajectory information processing method comprising:
calculating movement trajectory information, which is a time series of position information of a moving object on an image, from a plurality of images of the object photographed continuously by a camera that keeps the moving object within an imaging range; and
calculating a mapping function from the image to a reference image, calculating, using the calculated mapping function, position information on the reference image corresponding to the position information of the object on the image, and converting the calculated position information on the reference image into position information in a world coordinate system using camera calibration information.
[Appendix 7]
The movement trajectory information processing method according to Appendix 6, wherein the conversion calculates a planar projective transformation matrix from the image to the reference image and calculates the mapping function using the calculated planar projective transformation matrix.
[Appendix 8]
The movement trajectory information processing method according to Appendix 6 or 7, further comprising matching the movement trajectory information against other movement trajectory information.
[Appendix 9]
The movement trajectory information processing method according to Appendix 8, wherein the other movement trajectory information is movement trajectory information calculated from a plurality of images photographed continuously by another camera installed such that its field of view partially overlaps that of the camera.
[Appendix 10]
A computer-readable recording medium recording a program for causing a computer to perform:
a process of calculating movement trajectory information, which is a time series of position information of a moving object on an image, from a plurality of images of the object photographed continuously by a camera that keeps the moving object within an imaging range; and
a process of calculating a mapping function from the image to a reference image, calculating, using the calculated mapping function, position information on the reference image corresponding to the position information of the object on the image, and converting the calculated position information on the reference image into position information in a world coordinate system using camera calibration information.
10  tracking system
11, 12  PTZ camera
13A to 13D  monitoring area
20  control device
21  communication I/F unit
22  operation input unit
23  screen display unit
24  storage unit
25  arithmetic processing unit
30  movement trajectory information processing device
31  calculation means
32  conversion means
241  program
242-1, 242-2  camera calibration information
243-1, 243-2  reference image
244-1, 244-2  image DB
245  tracked-object movement trajectory information
246  to-be-matched movement trajectory information
251  camera calibration unit
252, 253  monitoring unit
2521  detection unit
2522, 2532  tracking unit
2523, 2533  coordinate conversion unit
2531  matching unit

Claims (10)

1. A movement trajectory information processing device comprising:
calculation means for calculating movement trajectory information, which is a time series of position information of a moving object on an image, from a plurality of images of the object photographed continuously by a camera that keeps the moving object within an imaging range; and
conversion means for calculating a mapping function from the image to a reference image, calculating, using the calculated mapping function, position information on the reference image corresponding to the position information of the object on the image, and converting the calculated position information on the reference image into position information in a world coordinate system using camera calibration information.
2. The movement trajectory information processing device according to claim 1, wherein the conversion means calculates a planar projective transformation matrix from the image to the reference image and calculates the mapping function using the calculated planar projective transformation matrix.
3. The movement trajectory information processing device according to claim 1 or 2, further comprising matching means for matching the movement trajectory information against other movement trajectory information.
4. The movement trajectory information processing device according to claim 3, wherein the other movement trajectory information is movement trajectory information calculated from a plurality of images photographed continuously by another camera installed such that its field of view partially overlaps that of the camera.
5. The movement trajectory information processing device according to any one of claims 1 to 4, wherein the camera is a PTZ camera.
6. A movement trajectory information processing method comprising:
calculating movement trajectory information, which is a time series of position information of a moving object on an image, from a plurality of images of the object photographed continuously by a camera that keeps the moving object within an imaging range; and
calculating a mapping function from the image to a reference image, calculating, using the calculated mapping function, position information on the reference image corresponding to the position information of the object on the image, and converting the calculated position information on the reference image into position information in a world coordinate system using camera calibration information.
7. The movement trajectory information processing method according to claim 6, wherein the conversion calculates a planar projective transformation matrix from the image to the reference image and calculates the mapping function using the calculated planar projective transformation matrix.
8. The movement trajectory information processing method according to claim 6 or 7, further comprising matching the movement trajectory information against other movement trajectory information.
9. The movement trajectory information processing method according to claim 8, wherein the other movement trajectory information is movement trajectory information calculated from a plurality of images photographed continuously by another camera installed such that its field of view partially overlaps that of the camera.
10. A computer-readable recording medium recording a program for causing a computer to perform:
a process of calculating movement trajectory information, which is a time series of position information of a moving object on an image, from a plurality of images of the object photographed continuously by a camera that keeps the moving object within an imaging range; and
a process of calculating a mapping function from the image to a reference image, calculating, using the calculated mapping function, position information on the reference image corresponding to the position information of the object on the image, and converting the calculated position information on the reference image into position information in a world coordinate system using camera calibration information.
PCT/JP2022/009080 2022-03-03 2022-03-03 Movement trajectory information processing device WO2023166648A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2022/009080 WO2023166648A1 (en) 2022-03-03 2022-03-03 Movement trajectory information processing device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2022/009080 WO2023166648A1 (en) 2022-03-03 2022-03-03 Movement trajectory information processing device

Publications (1)

Publication Number Publication Date
WO2023166648A1 true WO2023166648A1 (en) 2023-09-07

Family

ID=87883276

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/009080 WO2023166648A1 (en) 2022-03-03 2022-03-03 Movement trajectory information processing device

Country Status (1)

Country Link
WO (1) WO2023166648A1 (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2009110417A1 (en) * 2008-03-03 2009-09-11 ティーオーエー株式会社 Device and method for specifying installment condition of rotatable camera and camera control system equipped with the installment condition specifying device
JP2018160219A (en) * 2017-03-24 2018-10-11 株式会社 日立産業制御ソリューションズ Moving route prediction device and method for predicting moving route

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2009110417A1 (en) * 2008-03-03 2009-09-11 ティーオーエー株式会社 Device and method for specifying installment condition of rotatable camera and camera control system equipped with the installment condition specifying device
JP2018160219A (en) * 2017-03-24 2018-10-11 株式会社 日立産業制御ソリューションズ Moving route prediction device and method for predicting moving route

Similar Documents

Publication Publication Date Title
CN107438173B (en) Video processing apparatus, video processing method, and storage medium
JP6532217B2 (en) IMAGE PROCESSING APPARATUS, IMAGE PROCESSING METHOD, AND IMAGE PROCESSING SYSTEM
Senior et al. Acquiring multi-scale images by pan-tilt-zoom control and automatic multi-camera calibration
JP4451122B2 (en) Video tracking system and method
US10812686B2 (en) Method and system for mimicking human camera operation
US20100013917A1 (en) Method and system for performing surveillance
US20040119819A1 (en) Method and system for performing surveillance
US20050237390A1 (en) Multiple camera system for obtaining high resolution images of objects
CN108198199B (en) Moving object tracking method, moving object tracking device and electronic equipment
EP3726424A1 (en) Determination of audience attention
KR20170082735A (en) Object image provided method based on object tracking
KR20150019230A (en) Method and apparatus for tracking object using multiple camera
CN114037923A (en) Target activity hotspot graph drawing method, system, equipment and storage medium
WO2023166648A1 (en) Movement trajectory information processing device
WO2023166649A1 (en) Movement path information processing device, movement path information processing method, and recording medium
TWI471825B (en) System and method for managing security of a roof
KR20190064540A (en) Apparatus and method for generating panorama image
JP2002101408A (en) Supervisory camera system
KR101996907B1 (en) Apparatus for tracking object
JP2001285849A (en) Photographing system
CN110595443A (en) Projection device
WO2005120070A2 (en) Method and system for performing surveillance
Yeh et al. Cooperative dual camera surveillance system for real-time object searching and close-up viewing
JP2023091490A (en) Information processing apparatus, method for controlling information processing apparatus, and program
KR20240002340A (en) System and method for surveilling video using collaboration of PTZ cameras, and a recording medium recording a computer readable program for executing the method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22929793

Country of ref document: EP

Kind code of ref document: A1