CN111670456B - Information processing apparatus, tracking method, and recording medium - Google Patents

Information processing apparatus, tracking method, and recording medium

Info

Publication number: CN111670456B
Application number: CN201880088321.8A
Authority: CN (China)
Prior art keywords: coordinates, detection, coordinate, detection object, representative
Legal status: Active (granted)
Other versions: CN111670456A
Other languages: Chinese (zh)
Inventors: 饭野晋, 助野顺司, 小池正英, 道簱聪
Current assignee: Mitsubishi Electric Corp
Original assignee: Mitsubishi Electric Corp

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/20 Analysis of motion

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)

Abstract

An information processing device (100) includes: a position detection unit (110) that detects a plurality of detection object positions indicating the position of a 1 st detection object, based on a plurality of pieces of detection information obtained by a plurality of detection devices each periodically detecting the 1 st detection object; a conversion unit (120) that converts the plurality of detection object positions into coordinates based on the space in which the plurality of detection devices are installed; a storage unit (140) that stores a 1 st coordinate, which is the position where the 1 st detection object was located before the plurality of detection devices performed the detection; a classification unit (150) that acquires, from among the plurality of converted coordinates, those obtained from detection information detected within a 1 st time longer than the detection period, and extracts from the acquired converted coordinates a plurality of 2 nd coordinates expected to be related to the 1 st coordinate; and a representative coordinate calculation unit (160) that calculates a representative coordinate from the plurality of 2 nd coordinates and determines the representative coordinate as the position of the 1 st detection object after it has moved from the 1 st coordinate.

Description

Information processing apparatus, tracking method, and recording medium
Technical Field
The invention relates to an information processing apparatus, a tracking method, and a recording medium.
Background
Owing to recent advances in semiconductor technology, SoCs (System on Chip) for industrial equipment and digital cameras and image sensors with resolutions exceeding VGA (Video Graphics Array) have become high-performance and inexpensive. Systems using digital cameras or image sensors can therefore be realized easily. For example, such a system can track a person or an object within a space.
Techniques for tracking a person or an object have been proposed (see Patent Documents 1 and 2). For example, the people counting device of Patent Document 1 extracts a person from image data captured by a plurality of cameras and tracks the extracted person. The control device of Patent Document 2 acquires images from a plurality of cameras, extracts features representing a vehicle from the images, and tracks the movement of the vehicle.
Prior art literature
Patent literature
Patent document 1: japanese patent laid-open No. 10-49718
Patent document 2: japanese patent laid-open No. 10-269362
Disclosure of Invention
Problems to be solved by the invention
A movement trajectory of a person or an object is calculated by integrating a plurality of pieces of position information. The position information may, however, include outliers; an outlier is, for example, the position information produced when the position of the person or object is erroneously detected. When position information containing outliers is integrated, the accuracy of the movement trajectory deteriorates.
Simply increasing the frame rate of the camera does not solve this problem, and neither does merely shortening the sampling period of the sensor.
An object of the present invention is to improve the accuracy of the movement trajectory.
Means for solving the problems
An information processing device according to one embodiment of the present invention is provided. The information processing device includes: a position detection unit that detects a plurality of detection object positions indicating positions of a 1 st detection object, based on a plurality of pieces of detection information obtained by a plurality of detection devices each periodically detecting the 1 st detection object; a conversion unit that converts the plurality of detection object positions into coordinates based on the space in which the plurality of detection devices are installed; a storage unit that stores a 1 st coordinate, which is a coordinate based on the space and represents the position where the 1 st detection object was located before the plurality of detection devices performed the detection; a classification unit that acquires, from among the plurality of converted coordinates obtained by converting the plurality of detection object positions into coordinates based on the space, the converted coordinates derived from detection information detected by the plurality of detection devices within a 1 st time longer than the detection period, and extracts, from the acquired converted coordinates, a plurality of 2 nd coordinates expected to be related to the 1 st coordinate; and a representative coordinate calculation unit that calculates a representative coordinate from the plurality of 2 nd coordinates and determines the calculated representative coordinate as the position of the 1 st detection object after it has moved from the 1 st coordinate.
Effects of the invention
According to the invention, the accuracy of the movement track can be improved.
Drawings
Fig. 1 is a diagram (first) showing a tracking system according to embodiment 1.
Fig. 2 is a diagram (second) showing a tracking system according to embodiment 1.
Fig. 3 is a diagram showing a main hardware configuration of the information processing apparatus according to embodiment 1.
Fig. 4 is a functional block diagram showing the configuration of the information processing apparatus of embodiment 1.
Fig. 5 is a diagram showing an example of the movement trajectory table of embodiment 1.
Fig. 6 is a flowchart showing the storage processing of converted coordinates in embodiment 1.
Fig. 7 is a flowchart showing the calculation processing of representative coordinates in embodiment 1.
Fig. 8 is a diagram (first) showing a specific example of the tracking process according to embodiment 1.
Fig. 9 is a diagram (second) showing a specific example of the tracking process according to embodiment 1.
Fig. 10 is a diagram (third) showing a specific example of the tracking process according to embodiment 1.
Fig. 11 is a functional block diagram showing the configuration of an information processing apparatus according to embodiment 2.
Fig. 12 is a flowchart (first) showing the calculation processing of representative coordinates in embodiment 2.
Fig. 13 is a flowchart (second) showing the calculation processing of representative coordinates in embodiment 2.
Detailed Description
The embodiments will be described below with reference to the drawings. The following embodiments are merely examples, and various modifications can be made within the scope of the present invention.
Embodiment 1
Fig. 1 is a diagram (first) showing a tracking system according to embodiment 1. The tracking system includes an information processing apparatus 100 and cameras 200, 201, 202. The information processing apparatus 100 and the cameras 200, 201, 202 are connected via a network.
The information processing apparatus 100 can perform tracking of the subject person. For example, the information processing apparatus 100 is a computer. The information processing apparatus 100 acquires images captured by the cameras 200, 201, and 202.
The cameras 200, 201, 202 are also referred to as image capturing devices or image generating devices. The cameras 200, 201, 202 may also be sensors. The cameras 200, 201, 202 may also contain sensors. The camera, the sensor or the camera comprising the sensor is also referred to as a detection device. In fig. 1, a case of 3 cameras is shown. However, the number of cameras is not limited to 3.
The cameras 200, 201, 202 are provided in the imaging target space. The imaging target space is a space in which a camera is provided, and is a space in which a camera can capture an image. The cameras 200, 201, 202 are disposed above the imaging target space. For example, the cameras 200, 201, 202 are provided in the ceiling of a room as a shooting target space. The cameras 200, 201, 202 capture images of the subject from above. The information processing apparatus 100 acquires a captured image. The captured image corresponds to the time of capturing.
In fig. 1, the cameras 200, 201, 202 are shown in a state of being installed in a ceiling of a room. The information processing apparatus 100 may be present in the room or may be present in a place different from the room.
Fig. 2 is a diagram (second) showing a tracking system according to embodiment 1. Fig. 2 shows the arrangement of fig. 1 viewed from the side. Part of the range that can be captured by the camera 200 overlaps part of the range that can be captured by the camera 201, and likewise for the cameras 201 and 202. Overlapping parts of the capturable ranges of a plurality of cameras in this way means that the subject is captured by the plurality of cameras.
In fig. 2, the information processing apparatus 100 is not shown.
Next, a main hardware configuration of the information processing apparatus 100 will be described.
Fig. 3 is a diagram showing a main hardware configuration of the information processing apparatus according to embodiment 1. The information processing apparatus 100 has a processor 101, a volatile storage 102, and a nonvolatile storage 103.
The processor 101 controls the entire information processing apparatus 100. For example, the processor 101 is a CPU (Central Processing Unit: central processing unit) or an FPGA (Field Programmable Gate Array: field programmable gate array) or the like. The processor 101 may also be a multiprocessor. The information processing apparatus 100 may be implemented by a processing circuit, or may be implemented by software, firmware, or a combination thereof. In addition, the processing circuit may be a single circuit or a composite circuit.
The volatile storage 102 is a main storage of the information processing apparatus 100. For example, the volatile storage 102 is RAM (Random Access Memory: random Access memory). The nonvolatile storage device 103 is a secondary storage device of the information processing device 100. For example, the nonvolatile storage device 103 is an HDD (Hard Disk Drive) or an SSD (Solid State Drive: solid state Disk).
Fig. 4 is a functional block diagram showing the structure of the information processing apparatus of embodiment 1. The information processing apparatus 100 has a position detecting section 110, a converting section 120, a coordinate accumulating section 130, a storing section 140, a classifying section 150, a representative coordinate calculating section 160, a display control section 170, and a timer 180. The position detecting section 110 includes object detecting sections 111, 112, 113. The conversion unit 120 includes coordinate conversion units 121, 122, and 123.
Part or all of the position detecting section 110, the object detecting sections 111, 112, 113, the converting section 120, the coordinate converting sections 121, 122, 123, the classifying section 150, the representative coordinate calculating section 160, and the display control section 170 may be realized by the processor 101. Part or all of the position detecting section 110, the object detecting sections 111, 112, 113, the converting section 120, the coordinate converting sections 121, 122, 123, the classifying section 150, the representative coordinate calculating section 160, and the display control section 170 may be implemented as modules of a program executed by the processor 101. The program is stored in the volatile storage 102 or the nonvolatile storage 103. Further, the program is a trace program. In this way, the position detecting unit 110, the object detecting units 111, 112, 113, the converting unit 120, the coordinate converting units 121, 122, 123, the classifying unit 150, the representative coordinate calculating unit 160, and the display control unit 170 can be implemented as modules of the tracking program executed by the processor 101 included in the information processing apparatus 100 (for example, a computer).
The coordinate storage unit 130 and the storage unit 140 are implemented as storage areas secured in the volatile storage device 102 or the nonvolatile storage device 103.
The position detection unit 110 acquires, from each of the cameras 200, 201, and 202, a plurality of images of the subject captured periodically, and detects a plurality of image coordinates indicating the position of the subject. The subject is also referred to as the detection object, and may be expressed as the 1 st detection object. A captured image is also referred to as detection information, and the image coordinates are also called the detection object position.
The plurality of captured images may be images of the subject captured by the cameras 200, 201, and 202 with the same period, or images captured with different periods.
The conversion unit 120 converts the plurality of image coordinates into coordinates based on the space in which the cameras 200, 201, and 202 are installed. This space may be expressed as a space that unifies the imaging target spaces of the cameras 200, 201, and 202. Coordinates based on this space are referred to as common system coordinates.
The coordinate storage unit 130 stores a plurality of converted coordinates in which a plurality of image coordinates are converted into common system coordinates.
The storage unit 140 stores a plurality of pieces of position information indicating where the subject was located in the past. Each piece of position information is represented by coordinates, namely common system coordinates. The display control unit 170 can generate a movement trajectory by integrating the plurality of pieces of position information (i.e., the plurality of coordinates). The movement trajectory may also be expressed as a flow line. Among the plurality of coordinates on which the movement trajectory is based, the coordinate corresponding to the latest time is called the latest coordinate. The latest coordinate may also be expressed as the head coordinate, i.e., the coordinate at the head of the movement trajectory.
For example, the storage unit 140 stores the coordinates of the position where a certain subject (also referred to as the 1 st detection object) was located before the cameras 200, 201, and 202 detected that subject. These coordinates are coordinates based on the space (i.e., common system coordinates). As described later, this coordinate is the latest coordinate closest to a converted coordinate of the subject, and may be expressed as the 1 st coordinate.
The classifying unit 150 extracts a plurality of coordinates expected to have a relation with the latest coordinates from the plurality of converted coordinates stored in the coordinate accumulating unit 130. Further, the plurality of coordinates is also referred to as a plurality of 2 nd coordinates.
The representative coordinate calculation unit 160 calculates a representative coordinate from the plurality of coordinates. The representative coordinate calculation unit 160 determines the representative coordinate as the position of the imaging subject after the movement from the latest coordinate.
As described above, the display control unit 170 generates a two-dimensional map representing the movement trajectory from the plurality of pieces of position information (i.e., the plurality of coordinates) stored in the storage unit 140. The two-dimensional map is, for example, a view of the space seen from above, showing the movement trajectory of a person or an object. The display control unit 170 displays the two-dimensional map on a display provided in the information processing apparatus 100, so that the user can see the movement trajectory of the person or object.
Next, information stored in the storage unit 140 will be described.
Fig. 5 is a diagram showing an example of the movement trajectory table of embodiment 1. The movement trajectory table 141 is stored in the storage section 140. The movement trajectory table 141 has items of item numbers, data contents, data forms, and data sizes.
The item-number column holds an identifier. The data-content column describes the data. The data-form column shows the data format, and the data-size column shows the size of the data in bytes.
For example, item number 2 shows that the number of movement trajectory IDs (identifiers) is N (N is a positive integer). In item number 3, the coordinates of the movement start position of the subject whose movement trajectory ID is T1 (hereinafter written as "movement trajectory ID: T1") are registered.
In item number 4, the subject ID corresponding to movement trajectory ID: T1 is registered. In item number 5, the number of coordinates used when calculating the movement trajectory corresponding to movement trajectory ID: T1 is registered; in fig. 5 this number is shown as m1. In item number 6, the final update time of movement trajectory ID: T1 is registered. From item number 7 onward, the positions of the subject corresponding to movement trajectory ID: T1 after each movement are registered in chronological order. Fig. 5 illustrates the case of three-dimensional coordinates, but two-dimensional coordinates are also possible.
Fig. 5 also shows the latest coordinate of movement trajectory ID: T1, namely the x coordinate m1x, the y coordinate m1y, and the z coordinate m1z.
By using the m1 coordinates, the display control unit 170 can generate the movement trajectory of the subject ID corresponding to movement trajectory ID: T1.
In the movement trajectory table 141, information on movement trajectory IDs T2, ..., TN is registered after the information on movement trajectory ID: T1. As with movement trajectory ID: T1, the latest coordinate is registered for each movement trajectory ID.
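As a concrete illustration of the table in fig. 5, the following Python sketch shows one way a movement trajectory record could be held in memory. The class and field names are illustrative assumptions; the patent specifies only the data contents.

```python
from dataclasses import dataclass, field

# Illustrative sketch of one record of the movement trajectory table 141 (Fig. 5).
# Names are assumptions; the patent specifies only the data contents.
@dataclass
class Trajectory:
    trajectory_id: str                # e.g. "T1"
    subject_id: str                   # subject ID corresponding to the trajectory
    start_position: tuple             # coordinates of the movement start position
    final_update_time: float          # final update time
    coordinates: list = field(default_factory=list)  # moved positions, in chronological order

    @property
    def latest_coordinate(self):
        # the coordinate registered last is the latest (head) coordinate
        return self.coordinates[-1] if self.coordinates else self.start_position
```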
Next, a process before the converted coordinates are stored in the coordinate storage unit 130 will be described with reference to a flowchart.
The processing performed by the object detection unit 111 is the same as that performed by the object detection units 112 and 113. Therefore, in fig. 6, the process performed by the object detection section 111 will be described. Further, the description of the processing performed by the object detection units 112 and 113 is omitted.
The processing performed by the coordinate conversion section 121 is the same as the processing performed by the coordinate conversion sections 122, 123. Therefore, in fig. 6, the processing performed by the coordinate conversion section 121 will be described. Further, the description of the processing performed by the coordinate conversion units 122 and 123 is omitted.
Fig. 6 is a flowchart showing the storage processing of the converted coordinates of embodiment 1. The process of fig. 6 is executed every time the camera 200 performs shooting. In the description of the processing of fig. 6, reference is made to fig. 1, 2 and 4.
The object detection unit 111 acquires an image captured by the camera 200 (step S11).
The object detection unit 111 performs recognition processing on the image to detect the subject (step S12). For example, the recognition processing is background difference processing, inter-frame difference processing, general object recognition technology, or specific object recognition technology. In addition, when a plurality of objects exist in the image, the object detection unit 111 detects the plurality of objects.
The object detection unit 111 detects the position of the subject in the image (step S13). That is, the object detection section 111 detects the image coordinates. The image coordinates are relative positions with respect to the camera 200. Further, the object detection unit 111 detects a plurality of image coordinates when a plurality of imaging subjects are detected.
The coordinate conversion unit 121 converts the image coordinates into common system coordinates (step S14). For this conversion, the installation positions and orientations of the cameras 200, 201, and 202 in the common system coordinates are measured in advance, and parameters for converting the coordinates are calculated from them. The coordinate conversion unit 121 converts the image coordinates into common system coordinates using these parameters.
The coordinate conversion unit 121 may convert the image coordinates into either two-dimensional or three-dimensional common system coordinates. For example, when converting the image coordinates into three-dimensional common system coordinates, the coordinate conversion unit 121 projects the two-dimensional image coordinates onto a known plane such as the ground or a floor.
The coordinate conversion unit 121 stores the converted coordinates, obtained by converting the image coordinates into common system coordinates, in the coordinate storage unit 130 (step S15).
In this way, the coordinate storage unit 130 stores a plurality of converted coordinates based on the images periodically captured by the cameras 200, 201, and 202.
Each converted coordinate obtained by converting image coordinates is associated with the imaging time at which the image containing those image coordinates was captured. The imaging time is also referred to as the detection time.
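One common way to realize the conversion of step S14 is a pre-calibrated homography that maps image coordinates onto the floor plane of the common coordinate system. The sketch below assumes such a homography; the matrix values are placeholders, not parameters from the patent.

```python
import numpy as np

# Hypothetical calibration result for camera 200: a 3x3 homography H mapping
# image coordinates onto the floor plane of the common system coordinates.
# The values are placeholders; in practice H would be computed from the measured
# installation position and orientation of the camera.
H = np.array([[0.02, 0.0,  -6.4],
              [0.0,  0.02, -4.8],
              [0.0,  0.0,   1.0]])

def to_common_coordinates(u, v, H):
    """Convert an image coordinate (u, v) into two-dimensional common system
    coordinates by projecting onto the known floor plane."""
    p = H @ np.array([u, v, 1.0])
    return p[0] / p[2], p[1] / p[2]   # normalize the homogeneous coordinate

x, y = to_common_coordinates(320, 240, H)   # e.g. the image center of a VGA camera
```

A homography is only one possibility; any calibrated camera model that allows a pixel to be projected onto a known plane would serve the same purpose.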
Here, a case where the camera 200 or the like is a sensor will be described. For example, the sensor is an infrared sensor. The sensor detects the detection object using infrared rays or the like. The object detection unit 111 obtains detection information of a detection object from a sensor. The object detection unit 111 detects the detection target position based on the detection information. For example, the detection object position is information indicating a distance from the sensor to the detection object. The coordinate conversion section 121 converts the detection target position into a common system coordinate. The coordinate conversion unit 121 stores the converted coordinates, in which the detection target position is converted into the common system coordinates, in the coordinate storage unit 130. The information processing apparatus 100 stores the converted coordinates obtained by converting the detection object position in the coordinate storage unit 130 every time the sensor detects the detection object. The coordinates correspond to the detection time when the detection object is detected.
In this way, when the camera 200 is a sensor, the information processing apparatus 100 performs processing similar to the processing shown in fig. 6.
Fig. 7 is a flowchart showing the calculation processing of representative coordinates of embodiment 1. The process of fig. 7 starts when the classification section 150 receives a cycle trigger. The cycle trigger is generated by the timer 180 and sent to the classification section 150. In the description of the processing of fig. 7, reference is made to fig. 1, 2 and 4.
(step S21) the classifying unit 150 obtains the latest coordinates of each movement locus ID from the movement locus table 141. For example, the classification unit 150 obtains the movement trajectory ID from the movement trajectory table 141: the latest coordinates of T1, movement trajectory ID: the latest coordinates of T2, etc.
(step S22) the classifying unit 150 acquires from the coordinate storage unit 130 the converted coordinates obtained from images captured between a predetermined time before the start of step S22 and the start of step S22. That is, the classifying unit 150 acquires a plurality of converted coordinates from the coordinate storage unit 130 based on the imaging time (i.e., the detection time). The predetermined time is longer than the sampling period of the cameras 200, 201, and 202, and may be determined from the time resolution required in actual use; for example, it is about 0.1 to 2 seconds. The predetermined time is also referred to as the 1 st time. In this way, the classifying unit 150 can acquire a plurality of converted coordinates according to its own operation time.
The classifying unit 150 may acquire all the converted coordinates stored in the coordinate storage unit 130.
Further, the time at which a converted coordinate is stored in the coordinate storage unit 130 (also referred to as the storage time) can be associated with that converted coordinate. The classifying unit 150 may then acquire, based on the storage time, the converted coordinates stored between a predetermined time before the start of step S22 and the start of step S22.
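A minimal sketch of the acquisition in step S22, assuming the coordinate storage unit 130 holds (detection time, converted coordinate) pairs:

```python
def coordinates_within(accumulated, now, window):
    """Step S22 as a sketch: `accumulated` holds (detection_time, coordinate)
    pairs from the coordinate storage unit 130. Keep only the coordinates
    detected within the 1 st time `window` (e.g. 0.1 to 2 seconds) before `now`."""
    return [coord for t, coord in accumulated if now - window <= t <= now]
```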
(step S23) the classifying unit 150 selects one transformed coordinate from the plurality of transformed coordinates acquired in step S22.
(step S24) the classifying unit 150 finds, among the latest coordinates of the movement trajectory IDs, the latest coordinate closest to the converted coordinate selected in step S23, and attaches that movement trajectory ID to the selected converted coordinate. For example, the classifying unit 150 calculates the distance between the latest coordinate of each movement trajectory ID and the converted coordinate selected in step S23. Suppose the calculation shows that the closest latest coordinate is that of movement trajectory ID: T1. The classifying unit 150 then attaches movement trajectory ID: T1 to the converted coordinate selected in step S23.
The latest coordinate closest to the converted coordinate selected in step S23 is also referred to as the 1 st coordinate. A converted coordinate to which a movement trajectory ID has been attached can be said to be a coordinate expected to be related to the latest coordinate of that movement trajectory ID.
(step S25) the classifying unit 150 determines whether all of the plurality of converted coordinates acquired in step S22 are selected. If all of the plurality of converted coordinates acquired in step S22 are not selected (step S25: no), the classifying unit 150 advances the process to step S23. When all of the plurality of converted coordinates acquired in step S22 are selected (yes in step S25), the classifying unit 150 advances the process to step S26.
Here, once all of the converted coordinates have been selected, each of them carries a movement trajectory ID. By executing step S24, the classifying unit 150 thus classifies the plurality of converted coordinates by movement trajectory ID; that is, step S24 performs clustering.
When step S25 ends, the plurality of converted coordinates have been classified by movement trajectory ID. For example, let the latest coordinate of movement trajectory ID: T1 be the 1 st coordinate, and let the latest coordinates of the movement trajectory IDs other than T1 be a plurality of 3 rd coordinates. The converted coordinates to which movement trajectory ID: T1 is attached are then the coordinates for which, among the coordinates consisting of the 1 st coordinate and the plurality of 3 rd coordinates, the 1 st coordinate is the closest.
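Steps S23 to S25 amount to a nearest-neighbor assignment, as in the following sketch (function and variable names are assumptions):

```python
import numpy as np

def classify(converted, latest_by_id):
    """Sketch of steps S23-S25: attach to each converted coordinate the movement
    trajectory ID whose latest coordinate is nearest.
    `converted` is a list of coordinates; `latest_by_id` maps each movement
    trajectory ID to its latest coordinate."""
    clusters = {tid: [] for tid in latest_by_id}
    for c in converted:
        nearest = min(latest_by_id, key=lambda tid: np.linalg.norm(
            np.asarray(c) - np.asarray(latest_by_id[tid])))
        clusters[nearest].append(c)   # step S24: attach the nearest trajectory ID
    return clusters
```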
(step S26) the representative coordinate calculation unit 160 calculates the representative coordinates for each movement locus ID added in step S24. That is, the representative coordinate calculation unit 160 calculates the representative coordinates from the plurality of converted coordinates classified for each movement locus ID.
For example, when the converted coordinates are two-dimensional, the representative coordinate calculation unit 160 selects one converted coordinate from among the converted coordinates carrying the same movement trajectory ID, calculates the distance between the selected converted coordinate and each of the others, and takes the sum of those distances. It calculates this sum of distances for every converted coordinate in the same way, and determines the converted coordinate with the smallest sum as the representative coordinate.
Alternatively, the representative coordinate calculation unit 160 randomly selects several converted coordinates from among those carrying the same movement trajectory ID and calculates their average coordinate. When the number of converted coordinates lying within a predetermined range around the average coordinate is equal to or greater than a threshold, the representative coordinate calculation unit 160 determines the average coordinate as the representative coordinate; when it is smaller than the threshold, the random selection is performed again. Both methods are sketched in code below.
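The two calculation methods of step S26 can be sketched as follows; the second resembles a random-sample consensus check. Parameter names and default values are assumptions.

```python
import random
import numpy as np

def representative_by_distance_sum(coords):
    """First method of step S26: for each converted coordinate, sum its distances
    to all the others, and return the coordinate with the smallest sum."""
    pts = np.asarray(coords)
    sums = [np.linalg.norm(pts - p, axis=1).sum() for p in pts]
    return pts[int(np.argmin(sums))]

def representative_by_random_average(coords, radius, count_th, k=3, max_tries=100):
    """Second method of step S26: average k randomly selected coordinates and
    accept the average when at least `count_th` coordinates lie within `radius`
    of it; otherwise select again."""
    pts = np.asarray(coords)
    for _ in range(max_tries):
        mean = np.asarray(random.sample(list(pts), k)).mean(axis=0)
        if (np.linalg.norm(pts - mean, axis=1) <= radius).sum() >= count_th:
            return mean
    return None   # no acceptable average found within max_tries
```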
The representative coordinate calculation unit 160 determines the representative coordinate calculated for each movement trajectory ID as the position to which the subject has moved from the latest coordinate of that movement trajectory ID. For example, the representative coordinate calculated from the converted coordinates carrying movement trajectory ID: T1 is determined as the position to which the subject corresponding to movement trajectory ID: T1 has moved from the latest coordinate of movement trajectory ID: T1.
(step S27) the representative coordinate calculation unit 160 adds, for each movement trajectory ID, the representative coordinate to the movement trajectory table 141 as the new latest coordinate, and registers the point in time at which the representative coordinate was added. For example, the point in time at which the representative coordinate of movement trajectory ID: T1 is added is registered in the movement trajectory table 141 as the final update time of movement trajectory ID: T1.
(step S28) the classifying unit 150 stands by for a certain time. The classifying unit 150 advances the process to step S21 after waiting. Further, the certain time is a time longer than the sampling period of the cameras 200, 201, 202. For example, the certain time is more than 2 times the sampling period.
Here, a general representative coordinate calculation removes spatial outliers. In embodiment 1, however, the converted coordinates classified by the classifying unit 150 (i.e., those carrying the same movement trajectory ID) are based on images captured at a plurality of capturing times, so spatial outliers and temporal outliers are removed simultaneously. As a result, the movement trajectory table 141 contains no outliers, and the information processing apparatus 100 can generate a highly accurate movement trajectory from it. In this way, the information processing apparatus 100 improves the accuracy of the movement trajectory.
Further, the first representative coordinate of each movement trajectory ID may be calculated by any method.
In step S22, the converted coordinates of images captured between a predetermined time before the start of step S22 and the start of step S22 are acquired from the coordinate storage unit 130. This predetermined time may be equal to or longer than the standby time of step S28.
The classifying unit 150 attaches a movement trajectory ID to each of the converted coordinates acquired in step S22. However, the classifying unit 150 may attach the movement trajectory ID of the closest latest coordinate only when the distance between the converted coordinate selected in step S23 and that closest latest coordinate is equal to or smaller than a threshold Th3 (also referred to as the 3 rd threshold). Equivalently, among the converted coordinates to which the same movement trajectory ID was attached in step S24, the classifying unit 150 leaves unchanged those whose distance from the latest coordinate of that movement trajectory ID is equal to or smaller than Th3, and removes the movement trajectory ID from those whose distance exceeds Th3. The representative coordinate calculation unit 160 then calculates the representative coordinate from the converted coordinates whose distance from the latest coordinate is equal to or smaller than Th3. In this way, the classifying unit 150 can exclude outliers.
The classifying unit 150 may also execute the following processing. When the distance between the converted coordinate selected in step S23 and the closest latest coordinate exceeds a threshold Th1 (also referred to as the 1 st threshold), the classifying unit 150 does not attach any movement trajectory ID. A converted coordinate with no attached movement trajectory ID is thus a coordinate whose distance from the latest coordinate of every movement trajectory ID exceeds Th1. In this way, the classifying unit 150 extracts from the converted coordinates acquired in step S22 those to which no movement trajectory ID is attached; such a coordinate is also referred to as a 1 st converted coordinate. This processing can be expressed as the classifying unit 150 extracting a plurality of 1 st converted coordinates whose distance from each of the coordinates consisting of the 1 st coordinate and the plurality of 3 rd coordinates exceeds the 1 st threshold.
The classifying unit 150 detects a plurality of features among the converted coordinates with no attached movement trajectory ID, and classifies those coordinates by feature. For example, based on a 1 st feature among the plurality of features, the classifying unit 150 extracts converted coordinates (also referred to as 2 nd converted coordinates) from the converted coordinates with no attached movement trajectory ID. As a definition of the 1 st feature, for example, the following is used: the number of other converted coordinates within a predetermined range around the coordinate is equal to or greater than a threshold Th6. When the number of extracted converted coordinates is equal to or greater than a threshold Th2 (also referred to as the 2 nd threshold), the classifying unit 150 determines that a new person or object has been detected. In that case, the representative coordinate calculation unit 160 calculates a representative coordinate from the extracted converted coordinates, registers it in the movement trajectory table 141 as the position where the newly detected person or object was found, and attaches a new movement trajectory ID to the newly detected person or object. In this way, the classifying unit 150 can detect the initial position of a new person or object and begin tracking it, as sketched below.
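The new-track detection just described might look like the following sketch. The thresholds th1, th2, th6 and the `radius` of the predetermined range are assumptions, and the mean is used here as a simple stand-in for the representative coordinate calculation of step S26.

```python
import numpy as np

def find_new_track(converted, latest_coords, th1, th2, th6, radius):
    """Sketch of new detection-object detection: converted coordinates farther
    than Th1 from every latest coordinate are unattached; a sufficiently dense
    group of them is treated as a newly detected person or object."""
    pts = [np.asarray(p) for p in converted]
    latest = [np.asarray(q) for q in latest_coords]
    # 1 st converted coordinates: distance to every latest coordinate exceeds Th1
    unattached = [p for p in pts
                  if all(np.linalg.norm(p - q) > th1 for q in latest)]
    # 1 st feature: at least Th6 other unattached coordinates within the range
    dense = [p for p in unattached
             if sum(np.linalg.norm(p - q) <= radius for q in unattached) - 1 >= th6]
    if len(dense) >= th2:            # enough evidence of a new person or object
        return np.mean(dense, axis=0)    # representative coordinate of the new track
    return None
```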
Fig. 8 is a diagram (first) showing a specific example of the tracking process according to embodiment 1. Fig. 8 shows the cameras 200 and 201 installed in the ceiling of a room. The subjects U1 and U2 walk on the floor 300 of the room, crossing paths.
Fig. 9 is a diagram (second) showing a specific example of the tracking process according to embodiment 1. The status information 400 shows the position of the subject person U1 detected from the image captured by the camera 200 at the time T1 and the position of the subject person U2 detected from the image captured by the camera 201 at the time T1.
The position of the photographing subject detected from the image photographed by the camera 200 and the position of the photographing subject detected from the image photographed by the camera 201 are converted coordinates that are converted into common system coordinates.
Region 301 is the range that the camera 200 can capture, and region 302 is the range that the camera 201 can capture. Region 303 is the range captured by both the camera 200 and the camera 201; a range captured redundantly by multiple cameras is called an overlap region. In fig. 9, the grid-patterned area is the overlap region.
The status information 401 shows the position of the subject person U1 detected from the image captured by the camera 200 at the time T2 and the positions of the subject persons U1, U2 detected from the image captured by the camera 201 at the time T2.
The status information 402 shows the positions of the subjects U1, U2 detected from the image captured by the camera 200 at the time T3 and the positions of the subjects U1, U2 detected from the image captured by the camera 201 at the time T3.
The status information 403 shows the positions of the subjects U1, U2 detected from the image captured by the camera 200 at the time T4 and the position of the subject U1 detected from the image captured by the camera 201 at the time T4.
The status information 404 shows the position of the subject person U2 detected from the image captured by the camera 200 at the time T5 and the position of the subject person U1 detected from the image captured by the camera 201 at the time T5.
Fig. 9 shows the case where the position detecting unit 110 and the converting unit 120 perform coordinate conversion ideally. In the overlap region of fig. 9, the positions of the subjects detected from the images captured by the camera 200 and the camera 201 substantially coincide. When the detected positions substantially coincide in this way, the positions of the subjects U1 and U2 can be estimated with high accuracy, for example by averaging the positions detected by the cameras 200 and 201 at the times T2 to T4. Generating a movement trajectory from positions estimated with such accuracy yields a highly accurate movement trajectory.
Fig. 10 is a diagram (third) showing a specific example of the tracking process according to embodiment 1. Fig. 10 differs from fig. 9 in that the status information 402 is replaced by status information 402a.
The status information 402a shows the positions of the subjects U1, U2 detected from the image captured by the camera 200 at the time T3 'and the positions of the subjects U1, U2 detected from the image captured by the camera 201 at the time T3'.
The status information 402a differs from the status information 402 in the position of the subject person U1 detected from the image captured by the camera 200 at the time T3'. For example, this position results from erroneous detection of the subject person U1 by the position detecting unit 110, or arises because the conversion parameters of the converting unit 120 are insufficiently accurate.
Thus, fig. 10 shows an example in which the detection position of the subject person U1 is greatly deviated at the time T3'. Generating the movement trajectory of the subject person U1 using the position where the detection position greatly deviates may reduce the accuracy of the movement trajectory of the subject person U1.
In embodiment 1, even when the detection position of the subject person U1 is greatly deviated, the accuracy of the movement trajectory of the subject person U1 can be improved. The reason will be described below.
The coordinate storage unit 130 stores the information of the status information 401, 402a, and 403; that is, it stores the positions of the subjects U1 and U2 detected from the images captured by the cameras 200 and 201 at the times T2, T3', and T4. From these positions, the classifying unit 150 extracts those closest to the position of the subject person U1 at the time T1, for example the positions (i.e., converted coordinates) inside the regions 501, 502, and 503 of fig. 10. The extracted positions are positions expected to be related to the position of the subject person U1 at the time T1. The representative coordinate calculation unit 160 determines the representative coordinate from the positions inside the regions 501, 502, and 503. The position of the subject person U1 in the status information 402a is an outlier and is excluded from the representative coordinate calculation, so the representative coordinate is a position other than that one. The determined representative coordinate is then taken as the position to which the subject person U1 has moved from its position in the status information 400. Because the information processing apparatus 100 does not include the position of the subject person U1 in the status information 402a among the coordinates (i.e., the position information) representing the movement trajectory of the subject person U1, the accuracy of that movement trajectory is improved.
Embodiment 2
Next, embodiment 2 will be described. The matters different from those of embodiment 1 will be mainly described, and the description of the matters similar to embodiment 1 will be omitted. In the description of embodiment 2, reference is made to fig. 1 to 6.
Fig. 11 is a functional block diagram showing the structure of an information processing apparatus according to embodiment 2. The information processing apparatus 100a has a classification section 150a and a representative coordinate calculation section 160a. The functions of the classification section 150a and the representative coordinate calculation section 160a will be described in detail later.
The structure of fig. 11 that is identical to or corresponds to the structure shown in fig. 4 is given the same reference numerals as those shown in fig. 4.
Fig. 12 is a flowchart (first) showing the calculation processing of representative coordinates in embodiment 2. The process of fig. 12 starts when the classifying unit 150a receives a cycle trigger. The cycle trigger is generated by the timer 180 and sent to the classifying unit 150a. In addition, a plurality of converted coordinates are stored in the coordinate storage unit 130. In the description of the processing of fig. 12, reference is made to fig. 11.
(step S31) the classifying unit 150a obtains the latest coordinates of each movement track ID from the movement track table 141.
Here, in the representative coordinate calculation process of embodiment 2, the current representative coordinate candidate is used. The current representative coordinate candidate is information used in the processing from step S34. Further, different movement locus IDs are respectively associated with the plurality of current representative coordinate candidates.
(step S32) the classifying unit 150a sets the latest coordinate of each movement trajectory ID as the current representative coordinate candidate of that movement trajectory ID. For example, the classifying unit 150a sets the latest coordinate of movement trajectory ID: T1 as the current representative coordinate candidate of movement trajectory ID: T1. The latest coordinate of each movement trajectory ID is also called the 4 th coordinate.
(step S33) the classifying unit 150a acquires from the coordinate storage unit 130 the converted coordinates obtained from images captured between a predetermined time before the start of step S33 and the start of step S33. That is, the classifying unit 150a acquires a plurality of converted coordinates from the coordinate storage unit 130 based on the imaging time (i.e., the detection time).
The classifying unit 150a may acquire all the converted coordinates stored in the coordinate storage unit 130.
(step S34) the classifying unit 150a selects one transformed coordinate from the plurality of transformed coordinates acquired in step S33.
(step S35) the classifying unit 150a finds, among the current representative coordinate candidates of the movement trajectory IDs, the candidate closest to the converted coordinate selected in step S34, and attaches that movement trajectory ID to the selected converted coordinate.
(step S36) the classifying unit 150a determines whether all of the plurality of converted coordinates acquired in step S33 are selected. If all of the plurality of converted coordinates acquired in step S33 are not selected (step S36: no), the classifying unit 150a advances the process to step S34. When all of the plurality of converted coordinates acquired in step S33 are selected (yes in step S36), the classifying unit 150a advances the process to step S37.
In this way, for each of the plurality of 4 th coordinates (i.e., the latest coordinates of the movement trajectory IDs), the classifying unit 150a extracts from the converted coordinates acquired in step S33 a plurality of coordinates expected to be related to that 4 th coordinate.
(step S37) the representative coordinate calculation unit 160a calculates the representative coordinates for each movement locus ID added in step S35. That is, the representative coordinate calculation unit 160a calculates the representative coordinates from the plurality of converted coordinates classified for each movement locus ID. The calculation method of the representative coordinates is the same as in step S26.
Then, the representative coordinate calculation unit 160a advances the process to step S41.
Fig. 13 is a flowchart (second) showing the calculation processing of the representative coordinates in embodiment 2.
(step S41) the representative coordinate calculation unit 160a calculates, for each movement trajectory ID, the distance between the current representative coordinate candidate and the representative coordinate. For example, it calculates the distance between the current representative coordinate candidate of movement trajectory ID: T1 and the representative coordinate of movement trajectory ID: T1 calculated in step S37.
(step S42) the representative coordinate calculation unit 160a determines whether every distance calculated for each movement trajectory ID is equal to or less than a threshold Th4 (also referred to as the 4 th threshold), or whether the number of repetitions exceeds a threshold Th5. For example, if at least one of the distances calculated in step S41 is longer than Th4, the representative coordinate calculation unit 160a advances the process to step S43.
The number of repetitions is the number of times the processing from step S43 onward has been executed after a determination of no in step S42, returning to the determination of step S42 again. The first determination of no in step S42 is counted as the 1 st repetition.
When the condition is satisfied (step S42: yes), the representative coordinate calculation unit 160a advances the process to step S44. When it is not satisfied (step S42: no), the process advances to step S43.
(step S43) the representative coordinate calculation unit 160a sets the representative coordinate of each movement trajectory ID as the current representative coordinate candidate of that movement trajectory ID. For example, it sets the representative coordinate of movement trajectory ID: T1 calculated in step S37 as the current representative coordinate candidate of movement trajectory ID: T1.
Then, the representative coordinate calculation unit 160a advances the process to step S34.
(step S44) the representative coordinate calculation unit 160a adds the representative coordinate of each movement trajectory ID to the movement trajectory table 141 as the latest coordinate, and registers the point in time at which the representative coordinate was added in the movement trajectory table 141.
(step S45) the classifying unit 150a stands by for a certain period of time. The classifying unit 150a advances the process to step S31 after waiting.
According to embodiment 2, the information processing apparatus 100a repeats the determination of step S42 so that the representative coordinates converge to appropriate values. The representative coordinates registered in the movement trajectory table 141 therefore represent the position of the subject with high accuracy, and the information processing apparatus 100a can generate a highly accurate movement trajectory from the coordinates registered in the movement trajectory table 141. A sketch of this iteration appears below.
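Reusing classify() and representative_by_distance_sum() from the sketches in embodiment 1, the iteration of figs. 12 and 13 might be written as follows (a sketch under the assumption that those helpers are available):

```python
import numpy as np

def refine_representatives(converted, latest_by_id, th4, th5):
    """Sketch of Figs. 12 and 13: start from the latest coordinates (step S32),
    repeat classification (steps S34-S36) and representative calculation
    (step S37) until every representative moves at most Th4, or the number of
    repetitions exceeds Th5 (step S42)."""
    candidates = dict(latest_by_id)          # step S32: current representative candidates
    for _ in range(th5 + 1):
        clusters = classify(converted, candidates)
        reps = {tid: representative_by_distance_sum(c)
                for tid, c in clusters.items() if c}
        moved = max((np.linalg.norm(np.asarray(reps[tid]) - np.asarray(candidates[tid]))
                     for tid in reps), default=0.0)
        candidates.update(reps)              # step S43: representatives become candidates
        if moved <= th4:                     # step S42: converged
            break
    return candidates                        # step S44: register as latest coordinates
```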
Modification examples
In embodiments 1 and 2, the cameras 200, 201, and 202 were described. However, the cameras 200, 201, and 202 may be any sensors that can detect the relative position of the subject as two-dimensional coordinates from above, at least within the imaging target space. For example, the sensor is an image sensor. When the subject is limited to a person, the image sensor may be an infrared image sensor or a thermal image sensor. In an infrared image obtained from an infrared image sensor, a person region appears as a region with a higher temperature than its surroundings. The position detecting unit 110 can therefore extract the person region by background difference processing, using the difference image between an infrared image obtained from the infrared image sensor and an infrared image obtained in advance with no person present. Furthermore, the position detecting unit 110 can extract a circular area as the head of the person, and can thereby detect the person and the person's position.
In addition, the cameras 200, 201, and 202 may also be ToF (Time of Flight) sensors. A ToF sensor measures distance by measuring the time of flight of light, and its output is obtained as a depth image measured from the sensor. By comparing the depth image acquired from the ToF sensor with a depth image acquired when no subject is present, the position detection unit 110 can detect the relative position of the subject from above as two-dimensional coordinates. Since a person appears in the depth image as a region of small depth, the position detecting unit 110 computes, as a background difference, the difference image obtained by subtracting the depth image acquired when no person is present, and extracts regions of locally small depth as person head regions. Using a ToF sensor has the effect that detection of the subject is more stable than with an image sensor. A sketch of this extraction follows.
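A sketch of the depth-based head extraction described above, assuming the depth images are numpy arrays and that `height_th` and `local_window` are illustrative parameters:

```python
import numpy as np

def detect_heads_from_depth(depth, background, height_th, local_window=15):
    """Sketch of the ToF variant: subtract the current depth image from the
    background depth image; pixels that moved closer to the sensor by more than
    `height_th` are person candidates, and local depth minima among them are
    taken as head positions."""
    mask = (background - depth) > height_th          # background difference
    heads = []
    w = local_window // 2
    for y, x in zip(*np.nonzero(mask)):
        patch = depth[max(0, y - w):y + w + 1, max(0, x - w):x + w + 1]
        if depth[y, x] <= patch.min():               # local minimum of depth = head top
            heads.append((int(x), int(y)))
    return heads
```

A practical implementation would additionally merge adjacent candidate pixels (for example by connected-component labeling) so that each head yields a single coordinate.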
Thus, the cameras 200, 201, 202 may also be image sensors or ToF sensors. The image sensor or the ToF sensor is also referred to as a detection device, an imaging device or an image generation device.
The features of the embodiments described above can be combined with each other as appropriate.
Description of the reference numerals
100, 100a: information processing device; 110: position detection unit; 111, 112, 113: object detection unit; 120: conversion unit; 121, 122, 123: coordinate conversion unit; 130: coordinate storage unit; 140: storage unit; 141: movement trajectory table; 150, 150a: classification unit; 160, 160a: representative coordinate calculation unit; 170: display control unit; 180: timer; 200, 201, 202: camera.

Claims (7)

1. An information processing apparatus comprising:
a position detection unit that detects a plurality of detection object positions indicating the position of a 1st detection object, based on a plurality of pieces of detection information obtained by a plurality of detection devices each periodically detecting the 1st detection object;
a conversion unit that converts the plurality of detection object positions into coordinates based on a space in which the plurality of detection devices are provided;
a storage unit that stores a 1st coordinate, which is a coordinate, based on the space, of the position where the 1st detection object was located before the plurality of detection devices performed the detection, and stores a plurality of 3rd coordinates, which are coordinates, based on the space, of the positions where detection objects other than the 1st detection object were located before the plurality of detection devices performed the detection;
a classification unit that acquires, from a plurality of converted coordinates, which are the plurality of coordinates obtained by converting the plurality of detection object positions into coordinates based on the space, the converted coordinates of the detection object positions detected from detection information detected by the plurality of detection devices within a 1st time longer than the period of the detection, and extracts, from the acquired converted coordinates, a plurality of 2nd coordinates, which are coordinates expected to be related to the 1st coordinate, the plurality of 2nd coordinates being the coordinates whose closest coordinate, among a plurality of coordinates including the 1st coordinate and the plurality of 3rd coordinates, is the 1st coordinate; and
a representative coordinate calculation unit that calculates a representative coordinate from the plurality of 2nd coordinates and determines the calculated representative coordinate as the position of the 1st detection object after moving from the 1st coordinate,
wherein the classification unit extracts, from the plurality of converted coordinates, a plurality of 1st converted coordinates whose distances from each of the plurality of coordinates including the 1st coordinate and the plurality of 3rd coordinates exceed a 1st threshold, detects a plurality of features from the plurality of 1st converted coordinates, and extracts, from the plurality of 1st converted coordinates, 2nd converted coordinates based on a 1st feature among the plurality of features, and
when the number of the 2nd converted coordinates is equal to or greater than a 2nd threshold, the representative coordinate calculation unit calculates a representative coordinate from the 2nd converted coordinates and stores the calculated representative coordinate in the storage unit as the position of a newly detected detection object.
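Purely as an illustration of the new-detection branch recited in claim 1, and not as the claimed implementation, the following sketch clusters the far-away converted coordinates; DBSCAN stands in for the unspecified feature-detection step, and all parameter values are hypothetical.

```python
import numpy as np
from sklearn.cluster import DBSCAN

def detect_new_objects(known_coords, converted, t1=1.0, t2=5):
    """Converted coordinates farther than the 1st threshold t1 from every
    known position ("1st converted coordinates") are clustered; a cluster
    with at least the 2nd threshold t2 members ("2nd converted coordinates")
    yields the representative coordinate of a newly detected object."""
    known = np.asarray(known_coords, dtype=float)   # 1st coordinate and 3rd coordinates
    pts = np.asarray(converted, dtype=float)
    dists = np.linalg.norm(pts[:, None, :] - known[None, :, :], axis=2)
    outliers = pts[dists.min(axis=1) > t1]
    new_reps = []
    if len(outliers) == 0:
        return new_reps
    features = DBSCAN(eps=0.5, min_samples=1).fit_predict(outliers)
    for f in set(features):
        cluster = outliers[features == f]
        if len(cluster) >= t2:
            new_reps.append(cluster.mean(axis=0))   # register as a new object position
    return new_reps
```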
2. The information processing apparatus according to claim 1, wherein
the representative coordinate calculation unit calculates the representative coordinate from, among the plurality of 2nd coordinates, the coordinates whose distance from the 1st coordinate is equal to or less than a 3rd threshold.
3. An information processing apparatus comprising:
a position detection unit that detects a plurality of detection object positions indicating the position of a 1st detection object, based on a plurality of pieces of detection information obtained by a plurality of detection devices each periodically detecting the 1st detection object;
a conversion unit that converts the plurality of detection object positions into coordinates based on a space in which the plurality of detection devices are provided;
a storage unit that stores a 1st coordinate, which is a coordinate, based on the space, of the position where the 1st detection object was located before the plurality of detection devices performed the detection,
the storage unit further storing a plurality of 4th coordinates, which are coordinates, based on the space, of the positions where a plurality of detection objects including the 1st detection object were located before the plurality of detection devices performed the detection;
a classification unit that acquires, from a plurality of converted coordinates, which are the coordinates obtained by converting the plurality of detection object positions into coordinates based on the space, the converted coordinates of the detection object positions detected from detection information detected by the plurality of detection devices within a 1st time longer than the period of the detection, and extracts, from the acquired converted coordinates, a plurality of 2nd coordinates, which are coordinates expected to be related to the 1st coordinate,
the classification unit extracting, for each 4th coordinate of the plurality of 4th coordinates, a plurality of coordinates expected to be related to that 4th coordinate from the plurality of converted coordinates; and
a representative coordinate calculation unit that calculates a representative coordinate from the plurality of 2nd coordinates and determines the calculated representative coordinate as the position of the 1st detection object after moving from the 1st coordinate,
the representative coordinate calculation unit calculating a plurality of representative coordinates from the plurality of coordinates extracted for each of the 4th coordinates,
wherein, when the distances between the plurality of representative coordinates and the plurality of 4th coordinates exceed a 4th threshold, the classification unit further extracts, for each representative coordinate of the plurality of representative coordinates, a plurality of coordinates expected to be related to that representative coordinate from the plurality of converted coordinates, and
the representative coordinate calculation unit newly calculates a plurality of representative coordinates based on the plurality of coordinates extracted for each of the plurality of representative coordinates.
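Again only as an illustrative sketch of claim 3, the multi-object update can be read as nearest-neighbour classification followed by centroid recomputation, repeated while any representative coordinate moves by more than the 4th threshold; the function name and all values are hypothetical.

```python
import numpy as np

def update_representatives(prev_coords, converted, t4=0.5, max_iter=10):
    """Classify each converted coordinate to its nearest previous position
    (the 4th coordinates), average per object, and re-classify around the
    new representative coordinates while any displacement exceeds t4."""
    reps = np.asarray(prev_coords, dtype=float)
    pts = np.asarray(converted, dtype=float)
    if len(pts) == 0:
        return reps
    for _ in range(max_iter):
        dists = np.linalg.norm(pts[:, None, :] - reps[None, :, :], axis=2)
        nearest = dists.argmin(axis=1)
        new_reps = np.array([
            pts[nearest == k].mean(axis=0) if np.any(nearest == k) else reps[k]
            for k in range(len(reps))
        ])
        if np.all(np.linalg.norm(new_reps - reps, axis=1) <= t4):
            return new_reps          # displacements within the 4th threshold
        reps = new_reps              # extract again around the new representatives
    return reps
```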
4. A tracking method, wherein,
an information processing apparatus detects a plurality of detection object positions indicating the position of a 1st detection object, based on a plurality of pieces of detection information obtained by a plurality of detection devices each periodically detecting the 1st detection object,
the information processing apparatus converts the plurality of detection object positions into coordinates based on a space in which the plurality of detection devices are provided,
the information processing apparatus acquires a 1st coordinate, which is a coordinate, based on the space, of the position where the 1st detection object was located before the plurality of detection devices performed the detection, and acquires a plurality of 3rd coordinates, which are coordinates, based on the space, of the positions where detection objects other than the 1st detection object were located before the plurality of detection devices performed the detection,
the information processing apparatus acquires, from a plurality of converted coordinates, which are the plurality of coordinates obtained by converting the plurality of detection object positions into coordinates based on the space, the converted coordinates of the detection object positions detected from detection information detected within a 1st time longer than the period of the detection, and extracts, from the acquired converted coordinates, a plurality of 2nd coordinates, which are coordinates expected to be related to the 1st coordinate, the plurality of 2nd coordinates being the coordinates whose closest coordinate, among a plurality of coordinates including the 1st coordinate and the plurality of 3rd coordinates, is the 1st coordinate,
the information processing apparatus calculates a representative coordinate from the plurality of 2nd coordinates,
the information processing apparatus determines the calculated representative coordinate as the position of the 1st detection object after moving from the 1st coordinate,
wherein the information processing apparatus extracts, from the plurality of converted coordinates, a plurality of 1st converted coordinates whose distances from each of the plurality of coordinates including the 1st coordinate and the plurality of 3rd coordinates exceed a 1st threshold, detects a plurality of features from the plurality of 1st converted coordinates, and extracts, from the plurality of 1st converted coordinates, 2nd converted coordinates based on a 1st feature among the plurality of features, and,
when the number of the 2nd converted coordinates is equal to or greater than a 2nd threshold, calculates a representative coordinate from the 2nd converted coordinates and stores the calculated representative coordinate in a storage unit as the position of a newly detected detection object.
5. A tracking method, wherein,
an information processing apparatus detects a plurality of detection object positions indicating the position of a 1st detection object, based on a plurality of pieces of detection information obtained by a plurality of detection devices each periodically detecting the 1st detection object,
the information processing apparatus converts the plurality of detection object positions into coordinates based on a space in which the plurality of detection devices are provided,
the information processing apparatus acquires a 1st coordinate, which is a coordinate, based on the space, of the position where the 1st detection object was located before the plurality of detection devices performed the detection,
the information processing apparatus acquires, from a plurality of converted coordinates, which are the coordinates obtained by converting the plurality of detection object positions into coordinates based on the space, the converted coordinates of the detection object positions detected from detection information detected by the plurality of detection devices within a 1st time longer than the period of the detection, and extracts, from the acquired converted coordinates, a plurality of 2nd coordinates, which are coordinates expected to be related to the 1st coordinate,
the information processing apparatus calculates a representative coordinate from the plurality of 2nd coordinates,
the information processing apparatus determines the calculated representative coordinate as the position of the 1st detection object after moving from the 1st coordinate,
wherein the information processing apparatus stores a plurality of 4th coordinates, which are coordinates, based on the space, of the positions where a plurality of detection objects including the 1st detection object were located before the plurality of detection devices performed the detection,
the information processing apparatus extracts, for each 4th coordinate of the plurality of 4th coordinates, a plurality of coordinates expected to be related to that 4th coordinate from the plurality of converted coordinates,
the information processing apparatus calculates a plurality of representative coordinates from the plurality of coordinates extracted for each 4th coordinate of the plurality of 4th coordinates,
when the distances between the plurality of representative coordinates and the plurality of 4th coordinates exceed a 4th threshold, the information processing apparatus further extracts, for each representative coordinate of the plurality of representative coordinates, a plurality of coordinates expected to be related to that representative coordinate from the plurality of converted coordinates, and
the information processing apparatus newly calculates a plurality of representative coordinates based on the plurality of coordinates extracted for each of the plurality of representative coordinates.
6. A recording medium having recorded thereon a tracking program for causing a computer to execute:
detecting a plurality of detection object positions indicating the position of a 1st detection object, based on a plurality of pieces of detection information obtained by a plurality of detection devices each periodically detecting the 1st detection object,
converting the plurality of detection object positions into coordinates based on a space in which the plurality of detection devices are provided,
acquiring a 1st coordinate, which is a coordinate, based on the space, of the position where the 1st detection object was located before the plurality of detection devices performed the detection,
further acquiring a plurality of 3rd coordinates, which are coordinates, based on the space, of the positions where detection objects other than the 1st detection object were located before the plurality of detection devices performed the detection,
acquiring, from a plurality of converted coordinates, which are the plurality of coordinates obtained by converting the plurality of detection object positions into coordinates based on the space, the converted coordinates of the detection object positions detected from detection information detected by the plurality of detection devices within a 1st time longer than the period of the detection, and extracting, from the acquired converted coordinates, a plurality of 2nd coordinates, which are coordinates expected to be related to the 1st coordinate, the plurality of 2nd coordinates being the coordinates whose closest coordinate, among a plurality of coordinates including the 1st coordinate and the plurality of 3rd coordinates, is the 1st coordinate,
calculating a representative coordinate from the plurality of 2nd coordinates,
determining the calculated representative coordinate as the position of the 1st detection object after moving from the 1st coordinate,
extracting, from the plurality of converted coordinates, a plurality of 1st converted coordinates whose distances from each of the plurality of coordinates including the 1st coordinate and the plurality of 3rd coordinates exceed a 1st threshold, detecting a plurality of features from the plurality of 1st converted coordinates, and extracting, from the plurality of 1st converted coordinates, 2nd converted coordinates based on a 1st feature among the plurality of features, and,
when the number of the 2nd converted coordinates is equal to or greater than a 2nd threshold, calculating a representative coordinate from the 2nd converted coordinates, and storing the calculated representative coordinate as the position of a newly detected detection object.
7. A recording medium having recorded thereon a tracking program for causing a computer to execute:
detecting a plurality of detection object positions indicating the position of a 1st detection object, based on a plurality of pieces of detection information obtained by a plurality of detection devices each periodically detecting the 1st detection object,
converting the plurality of detection object positions into coordinates based on a space in which the plurality of detection devices are provided,
acquiring a 1st coordinate, which is a coordinate, based on the space, of the position where the 1st detection object was located before the plurality of detection devices performed the detection,
acquiring, from a plurality of converted coordinates, which are the coordinates obtained by converting the plurality of detection object positions into coordinates based on the space, the converted coordinates of the detection object positions detected from detection information detected by the plurality of detection devices within a 1st time longer than the period of the detection, and extracting, from the acquired converted coordinates, a plurality of 2nd coordinates, which are coordinates expected to be related to the 1st coordinate,
calculating a representative coordinate from the plurality of 2nd coordinates,
determining the calculated representative coordinate as the position of the 1st detection object after moving from the 1st coordinate,
acquiring a plurality of 4th coordinates, which are coordinates, based on the space, of the positions where a plurality of detection objects including the 1st detection object were located before the plurality of detection devices performed the detection,
extracting, for each 4th coordinate of the plurality of 4th coordinates, a plurality of coordinates expected to be related to that 4th coordinate from the plurality of converted coordinates,
calculating a plurality of representative coordinates from the plurality of coordinates extracted for each 4th coordinate of the plurality of 4th coordinates,
when the distances between the plurality of representative coordinates and the plurality of 4th coordinates exceed a 4th threshold, further extracting, for each representative coordinate of the plurality of representative coordinates, a plurality of coordinates expected to be related to that representative coordinate from the plurality of converted coordinates, and
newly calculating a plurality of representative coordinates based on the plurality of coordinates extracted for each of the plurality of representative coordinates.
CN201880088321.8A 2018-02-08 2018-11-22 Information processing apparatus, tracking method, and recording medium Active CN111670456B (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2018-020592 2018-02-08
JP2018020592 2018-02-08
PCT/JP2018/043204 WO2019155727A1 (en) 2018-02-08 2018-11-22 Information processing device, tracking method, and tracking program

Publications (2)

Publication Number Publication Date
CN111670456A CN111670456A (en) 2020-09-15
CN111670456B true CN111670456B (en) 2023-09-15

Family

ID=67549551

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201880088321.8A Active CN111670456B (en) 2018-02-08 2018-11-22 Information processing apparatus, tracking method, and recording medium

Country Status (3)

Country Link
JP (1) JP6789421B2 (en)
CN (1) CN111670456B (en)
WO (1) WO2019155727A1 (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111222404A (en) * 2019-11-15 2020-06-02 北京市商汤科技开发有限公司 Method, device and system for detecting co-pedestrian, electronic equipment and storage medium
JP2021093037A (en) * 2019-12-11 2021-06-17 株式会社東芝 Calculation system, calculation method, program, and storage medium
JPWO2021210213A1 (en) * 2020-04-13 2021-10-21
JP7459248B2 (en) 2020-05-25 2024-04-01 三菱電機株式会社 Air conditioning control system, controller, and air conditioning control method

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104718562A (en) * 2012-10-17 2015-06-17 富士通株式会社 Image processing device, image processing program and image processing method
JP2015219679A (en) * 2014-05-16 2015-12-07 株式会社リコー Image processing system, information processing device, and program
JP2016091468A (en) * 2014-11-10 2016-05-23 株式会社豊田中央研究所 Target trajectory calculation device and program
KR101645451B1 (en) * 2015-04-14 2016-08-12 공간정보기술 주식회사 Spatial analysis system using stereo camera
JP2016212500A (en) * 2015-04-30 2016-12-15 三菱電機株式会社 Shooting direction change detection device and shooting direction change detection method
JP2017016356A (en) * 2015-06-30 2017-01-19 キヤノン株式会社 Image processing apparatus, image processing method, and program
JPWO2015087730A1 (en) * 2013-12-10 2017-03-16 株式会社日立国際電気 Monitoring system
WO2017086741A1 (en) * 2015-11-19 2017-05-26 중앙대학교 산학협력단 Signal position detection device using array detector capable of reducing errors

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5147761B2 (en) * 2009-03-02 2013-02-20 セコム株式会社 Image monitoring device
US8831677B2 (en) * 2010-11-17 2014-09-09 Antony-Euclid C. Villa-Real Customer-controlled instant-response anti-fraud/anti-identity theft devices (with true-personal identity verification), method and systems for secured global applications in personal/business e-banking, e-commerce, e-medical/health insurance checker, e-education/research/invention, e-disaster advisor, e-immigration, e-airport/aircraft security, e-military/e-law enforcement, with or without NFC component and system, with cellular/satellite phone/internet/multi-media functions
US8855427B2 (en) * 2011-12-16 2014-10-07 Harris Corporation Systems and methods for efficiently and accurately detecting changes in spatial feature data

Also Published As

Publication number Publication date
WO2019155727A1 (en) 2019-08-15
CN111670456A (en) 2020-09-15
JPWO2019155727A1 (en) 2020-05-28
JP6789421B2 (en) 2020-11-25

Similar Documents

Publication Publication Date Title
CN111670456B (en) Information processing apparatus, tracking method, and recording medium
JP6741130B2 (en) Information processing system, information processing method, and program
JP6674584B2 (en) Video surveillance system
US10212324B2 (en) Position detection device, position detection method, and storage medium
JP5559335B2 (en) Behavior analysis device
JP6036824B2 (en) Angle of view variation detection device, angle of view variation detection method, and field angle variation detection program
JP6217635B2 (en) Fall detection device, fall detection method, fall detection camera, and computer program
JP2021048617A (en) Information processing system, information processing method, and program
US10467461B2 (en) Apparatus for searching for object and control method thereof
JPWO2014155958A1 (en) Object monitoring system, object monitoring method and monitoring object extraction program
JP6292540B2 (en) Information processing system, information processing method, and program
JP6503079B2 (en) Specific person detection system, specific person detection method and detection device
US8923552B2 (en) Object detection apparatus and object detection method
JP2008035301A (en) Mobile body tracing apparatus
US11544926B2 (en) Image processing apparatus, method of processing image, and storage medium
JP2018186397A (en) Information processing device, image monitoring system, information processing method, and program
KR101355206B1 (en) A count system of coming and going using image analysis and method thereof
JP5877725B2 (en) Image monitoring device
JP5864230B2 (en) Object detection device
KR102614895B1 (en) Real-time object tracking system and method in moving camera video
JP2018128800A (en) Bed identification apparatus
JPH11328365A (en) Device and method for monitoring image
JP2018195872A (en) Information processing device, information processing system, information processing method, and program
JP7357649B2 (en) Method and apparatus for facilitating identification
JP2010250571A (en) Person counting device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant