WO2016194770A1 - Image processing device for recognition of end of queue - Google Patents
Image processing device for recognition of end of queue
- Publication number
- WO2016194770A1 (PCT/JP2016/065586)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- matrix
- farthest point
- pixel
- image
- queue
- Prior art date
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
Definitions
- The present invention relates to an image processing apparatus for recognizing the end of a queue, for example one suited to uses such as waiting-time estimation for a queue that appears in front of an entrance gate or a ticketing machine.
- In a first conventional example (see Patent Document 1), attention is paid to the individual persons forming the queue and their images are extracted; adjacent images are connected with straight lines to calculate the total length of the queue line, and this is divided by a separately estimated queue movement speed to calculate the waiting time of the queue (hereinafter, the "first conventional example").
- In a second conventional example (see Patent Document 2), the residual image obtained as the difference between a background-difference image and an inter-frame difference image is regarded as an aggregate of people, and the largest such aggregate is recognized as the queue; the area of the recognized queue is divided by a predetermined unit area to obtain the number of people in the queue, and this is divided by a predetermined number of entrants per unit time to obtain the waiting time (hereinafter, the "second conventional example").
- In a third conventional example (see Patent Document 3), edge information of only moving objects is tracked based on an image obtained by applying labeling processing to an image in which an edge background-difference image and an inter-frame difference image are superimposed, contour images of the moving objects are generated, and the convex shapes of the contour images are detected to extract the heads of the persons in the image, the number of people being counted from those heads (hereinafter, the "third conventional example").
- In the second conventional example, complicated processing is avoided by focusing on a group of people rather than on each individual; however, the algorithm that regards the residual image obtained from the difference between the background-difference image and the inter-frame difference image as a collection of people and immediately recognizes the largest collection as the queue is easily affected by passersby around the queue.
- In the third conventional example, the algorithm of extracting human heads by detecting the convex shapes of a contour image may be applicable to a relatively dispersed crowd, such as in a station plaza, but when applied to an image in which many people are likely to overlap, such as a rear bird's-eye view of a queue, extracting the head of every individual appears quite difficult.
- An object of the present invention is therefore to provide an image processing apparatus for queue-end recognition that can recognize, with relatively high accuracy, the end of a queue appearing in front of an entrance gate, a ticketing machine, or the like, and that, by using this, also makes it possible to determine the waiting time of the queue.
- To achieve this object, the image processing apparatus for queue-end recognition according to the present invention is configured as follows. That is, it is an apparatus for recognizing the end of a queue from a moving image obtained by bird's-eye photography of an area in which a queue of one or more lanes is expected to appear, and it includes non-stationary object extracting means, pixel block grouping means, farthest point searching means, and farthest point determining means.
- The non-stationary object extracting means extracts the pixels constituting non-stationary objects from each of the series of frame images constituting the moving image.
- The pixel block grouping means divides the extracted non-stationary pixels into pixel blocks, each formed by gathering mutually adjacent pixels into a single cluster, and stores, among the divided pixel blocks, a series of pixel blocks that are close to one another in a predetermined queue extending direction as pixel blocks belonging to one group.
- The farthest point searching means searches all pixel blocks belonging to one pixel block group for the farthest point in the predetermined queue extending direction and stores it as a queue-end candidate.
- When the farthest point stored as the queue-end candidate changes beyond a predetermined allowable value between consecutive frame images, the farthest point determining means determines, based on the direction of the change and the stability after the change, the farthest point stored as the queue-end candidate as the farthest point to be recognized as the true queue end.
- As a method for certifying "stability after the change", not only a method that determines whether the changed value is maintained substantially the same over a fixed number of consecutive frames, but also various other methods that take the influence of noise into account can be employed, such as determining whether the value is maintained substantially the same in at least a fixed proportion of a fixed number of consecutive frames, as sketched below.
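As an illustration only, the two certification methods mentioned above might be sketched as follows in Python; the window length, pixel tolerance, and ratio are hypothetical parameters, not values taken from the claims.

```python
def stable_over_run(history, n=90, tol=2):
    """Method 1: the changed value stays substantially the same (within tol
    pixels, a hypothetical play) over the last n consecutive frames."""
    recent = history[-n:]
    return len(recent) == n and max(recent) - min(recent) <= tol

def stable_by_ratio(history, n=90, tol=2, ratio=0.9):
    """Method 2 (noise-tolerant): the value matches the latest one, within tol,
    in at least `ratio` of the last n frames."""
    recent = history[-n:]
    if len(recent) < n:
        return False
    hits = sum(abs(v - recent[-1]) <= tol for v in recent)
    return hits / n >= ratio
```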
- With this configuration, a series of pixel blocks close to one another in the predetermined queue extending direction is stored as pixel blocks belonging to one group, so that, without being affected by passersby around the queue, only the pixel blocks truly likely to form the queue can be reliably traced from head to end.
- Moreover, because the farthest point is searched for among all pixel blocks belonging to the group, rather than only the pixel block added to the group last, the candidacy for the queue end is re-examined within the whole group and the pixel most likely to be the queue end can be taken as the farthest point.
- In addition, the farthest point so obtained is not immediately recognized as the true farthest point but is reconfirmed under different criteria according to whether the queue has been extended or shortened; recognition responsiveness is thereby maintained while misrecognition of the queue end caused by passersby near it is avoided. Furthermore, since the persons in the queue are not individually recognized, the image processing as a whole is a combination of relatively simple elemental techniques; the processing is simple, the processing time is comparatively short, and implementation is easy.
- The farthest point determining means may include processing that, when the direction of change of the farthest point is away from the head of the queue, determines the farthest point stored as the queue-end candidate as the farthest point to be recognized as the true queue end only when the changed farthest point is maintained substantially constant over a predetermined number of frames.
- The farthest point determining means may also include processing that, when the direction of change of the farthest point is toward the head of the queue, immediately (without waiting for a predetermined number of frames) determines the farthest point stored as the queue-end candidate as the farthest point to be recognized as the true queue end.
- The bird's-eye photography may be performed from behind the expected queue, with the predetermined queue extending direction being the downward direction of each frame image.
- With this configuration, the assumed queue extending direction coincides with the vertical direction of the frame image, so whether each pixel is near or far from the end of the queue can be determined easily using the vertical coordinate value of the pixel as it is.
- The non-stationary object extracting means may include contour extraction means that converts the frame image into a line image by extracting the contours of the various objects contained in it, background deletion means that deletes the line image portions corresponding to the background using background-difference processing, and sharpening means that sharpens the lines constituting the line image after the background-difference processing.
- With this configuration, contour extraction is performed prior to background deletion, and the line image is further sharpened after contour extraction; the non-stationary objects in the original frame image are thus reduced to sharpened outlines, and the various subsequent arithmetic processes can be applied only to the pixels of those outlines. The reduction in the number of processed pixels lowers the processing load and enables high-speed operation.
- If the contour extraction means includes grayscale conversion means that converts the frame image into a monochrome image, spatial first-differentiation means that applies spatial first-differentiation processing to the monochrome image, and binarization means that binarizes the differentiated image to generate a line image, a line image emphasizing the luminance differences between all the objects in the original frame image can be obtained easily.
- The background deletion means may compare the value of each pixel on the lines constituting the line image with the value of that pixel a predetermined number of frames earlier, and delete pixels whose values match as background pixels. In that case, not only true background objects such as buildings and tables, but also the line images of objects that have remained stationary for more than the predetermined number of frames (left luggage, for example) are deleted as background.
- If the sharpening means includes contraction means that removes the fine specks caused by noise and the like in the line image after background deletion, and expansion means that connects and thickens the broken, wavy portions of the lines contained in that line image, blurring and discontinuities in the lines are repaired and noise specks in the spaces surrounded by the contours are deleted; since the pixel regions receiving the same label in the labeling processing become larger, the entire non-stationary line image is divided into relatively few, relatively large pixel blocks, which contributes to the efficiency of the grouping processing described later.
- The pixel block grouping means may include labeling means that attaches the same label to mutually adjacent pixels among the extracted non-stationary pixels, thereby dividing them into pixel blocks, and proximity pixel block search means that, taking as a reference the pixel located at the farthest point in the predetermined queue extending direction within the pixel block corresponding to the head of the queue, searches a predetermined search area set toward the far side of the extending direction for a pixel bearing another label; when such a pixel exists, the pixel block containing it is stored as a pixel block belonging to the same group, and the search is repeated with the newly stored pixel block as the reference until no pixel with another label remains, this being performed for every head-equivalent pixel block.
- With this configuration, a search region set in the expected extending direction of the queue is applied to each of the uniquely labeled pixel blocks (contour images); the pixel blocks making up surrounding passersby are excluded, and only the series of pixel blocks from head to tail that actually make up the queue of each lane is reliably associated, in order, with one group.
- The farthest point searching means may include processing that identifies the farthest point within a group by repeating, for all pixel blocks belonging to the group, a search for the pixel located at the farthest point in the predetermined queue extending direction among the pixels of each block.
- The farthest point determining means may include counter control means that increments a predetermined frame counter when the difference between the current frame's value of the farthest point in the group and the previous frame's value is within a predetermined allowable value, and clears the frame counter when the allowable value is exceeded, and true farthest point update control means that, when the difference exceeds the allowable value, immediately updates the true farthest point value with the current frame's value if the direction of change is toward the near side, and, if the direction of change is toward the far side, updates the true farthest point value with the current frame's value only after waiting for the frame counter to reach a predetermined number of frames.
- With this configuration, when the direction of change of the farthest point is toward the head of the queue, the farthest point stored as the queue-end candidate is fixed immediately, without delay, as the true queue end; when the direction of change is away from the head, it suffices to determine over how many frames the changed farthest point persists, so the stability of the farthest point can be judged reliably and misidentification based on a false farthest point caused by a passerby or the like can be avoided with simple processing.
- Viewed from another aspect, the present invention can also be grasped as an image processing method for queue-end recognition. That is, this method recognizes the end of a queue from a moving image obtained by bird's-eye photography of an area in which a queue of one or more lanes is expected to appear, and includes a non-stationary object extraction step, a pixel block grouping step, a farthest point search step, and a farthest point determination step.
- In the non-stationary object extraction step, pixels constituting non-stationary objects are extracted from each of the series of frame images constituting the moving image.
- In the pixel block grouping step, the extracted non-stationary pixels are divided into pixel blocks, each formed by gathering mutually adjacent pixels into a single cluster, and a series of pixel blocks close to one another in the predetermined queue extending direction is stored as pixel blocks belonging to one group.
- In the farthest point search step, the farthest point in the predetermined queue extending direction is searched for among all pixel blocks belonging to one pixel block group and stored as a queue-end candidate.
- In the farthest point determination step, when the farthest point stored as the queue-end candidate changes beyond a predetermined allowable value between consecutive frame images, the farthest point stored as the queue-end candidate is determined, based on the direction of the change and the stability after the change, as the farthest point to be recognized as the true queue end.
- When grasped as a program, in addition to the same operational effects as the apparatus invention being obtained, the program need not be resident in a personal computer, as when an image processing program is installed in and executed on the personal computer; it may instead be placed in an external storage device or a so-called cloud center and transferred to the personal computer for execution as needed.
- The present invention can likewise be grasped as a computer program for causing a computer to function as an image processing apparatus for queue-end recognition. That is, this computer program recognizes the end of a queue from a moving image obtained by bird's-eye photography of an area in which a queue of one or more lanes is expected to appear, and includes non-stationary object extracting means, pixel block grouping means, farthest point searching means, and farthest point determining means.
- Each of these means operates as described above for the apparatus for queue-end recognition.
- The present invention can further be grasped as a queue-end recognition system that includes a camera for bird's-eye photography of an area in which a queue of one or more lanes is expected to appear, and an image processing apparatus for recognizing the end of the queue from the moving image obtained from the camera.
- the image processing apparatus includes a non-stationary object extracting unit, a pixel block grouping unit, a farthest point searching unit, and a farthest point determining unit.
- Each of these means operates as described above for the apparatus for queue-end recognition.
- When grasped as a system, in addition to the same operational effects as the apparatus invention being obtained, not only the image processing apparatus but also the bird's-eye camera itself constitutes the invention; it therefore goes without saying that rights can be exercised against unauthorized acts of implementation, such as manufacture and sale by a third party, covering not only the computer constituting the image processing apparatus but also the camera for bird's-eye photography.
- The image processing apparatus for queue-end recognition according to the present invention may further include waiting time estimation means that estimates the waiting time of the queue by collating the queue end position obtained from the image processing apparatus against a statistically predetermined correlation between queue end positions and queue waiting times.
- With this configuration, by obtaining in advance the statistical waiting times found by repeated actual measurement for each of a plurality of queue-end reference positions, the waiting time corresponding to an automatically generated queue end position can be estimated.
- When queues appear in a plurality of lanes, the current waiting time may be estimated for each lane, and one overall waiting time corresponding to the plurality of lanes may be determined from those waiting times.
- Further, if the waiting time is estimated for each departure gate, a boarding passenger can select a relatively uncrowded departure gate.
- According to the image processing apparatus for queue-end recognition of the present invention, a series of pixel blocks close to one another in the predetermined queue extending direction, among the one or more pixel blocks constituting the non-stationary pixels, is stored as pixel blocks belonging to one group; therefore, without being affected by passersby around the queue, only the pixel blocks truly likely to form the queue are reliably traced from head to end, and the farthest point is searched for among all pixel blocks in the group rather than only the pixel block added to the group last.
- The pixel most likely to be the queue end can thus be taken as the farthest point; in addition, the farthest point so obtained is not immediately recognized as the true farthest point but is reconfirmed under different criteria according to whether the queue has been extended or shortened, so that recognition responsiveness is maintained while misrecognition of the queue end caused by passersby near it is avoided.
- Moreover, since the persons in the queue are not individually recognized, the image processing as a whole is a combination of relatively simple elemental techniques; the processing is simple, the processing time is comparatively short, and implementation is easy.
- FIG. 1 is a configuration diagram of a waiting time estimation system at an airport departure gate according to an embodiment of the present invention.
- FIG. 2 is an explanatory diagram showing the relationship between the queue at the entrance of the security checkpoint and the shooting direction of the camera.
- FIG. 3 is an explanatory diagram showing the relationship between the end of the queue and the statistical waiting time.
- FIG. 4 is a general flowchart showing the entire program executed by the waiting time generation PC.
- FIG. 5 is a detailed flowchart of the queue-end recognition processing.
- FIG. 6 is a detailed flowchart of the non-stationary body constituent pixel extraction process.
- FIG. 7 is a detailed flowchart of the contour extraction and sharpening process.
- FIG. 8 is a detailed flowchart of the background deletion process.
- FIG. 9 is a detailed flowchart of the grouping process between adjacent pixel blocks.
- FIG. 10 is a detailed flowchart of the search process for the lowest point in each group.
- FIG. 11 is a detailed flowchart of the lowest point determination process in each group.
- FIG. 12 is a detailed flowchart of the waiting time estimation process.
- FIG. 13 is an explanatory diagram (part 1) schematically showing a process of image processing.
- FIG. 14 is an explanatory diagram (part 2) schematically showing a process of image processing.
- FIG. 15 is an explanatory diagram (part 3) schematically illustrating a process of image processing.
- FIG. 16 is an explanatory diagram (part 4) schematically showing a process of image processing.
- FIG. 17 is an explanatory diagram illustrating an image example after the contour extraction processing.
- FIG. 18 is an explanatory diagram illustrating an example of an image after background deletion and sharpening processing.
- FIG. 19 is an explanatory diagram showing the relationship between an example of an image after grouping processing and the recognized queue end positions and waiting times.
- Besides the airport departure gates of the embodiment described below, the present invention can be widely applied, for example, to queues that appear in front of ticket offices at railway stations, public racing venues, and the like, or to queues that appear in front of checkout counters (registers) at supermarkets, convenience stores, and the like.
- In this embodiment, each departure gate of an airport is provided with a security checkpoint having a plurality of mutually parallel passenger passages (lanes). Each passenger passage is provided with various inspection devices, such as an X-ray inspection machine for baggage inspection and a metal detector for body inspection. The image processing apparatus for queue-end recognition can be used to automatically recognize, in real time, the end of the queue in each lane of a departure gate and to estimate the waiting time, that is, the time required to pass through, for each lane or each departure gate.
- <About the airport facilities> As shown in FIG. 1, at the airport to which the system of this embodiment is applied, four departure gates, the first to fourth gates, are provided at appropriate distances from one another. As shown in FIG. 3, each gate has a security checkpoint with a four-lane passenger passage consisting of the first lane (L1) to the fourth lane (L4), and it is customary for a queue of passengers to appear in front of the passage of each lane. Four queues therefore appear at each departure gate, extending substantially parallel to one another along the extending direction of the passenger passages of lanes (L1) to (L4).
- Each of the four departure gates, the first to fourth gates, is provided with a video camera for bird's-eye photography of the passenger queues.
- The number of cameras for each gate and their shooting directions are determined so that the four queues appear spaced apart from one another, in other words do not overlap, within the field of view of each camera. From this viewpoint there are various options for the number and orientation of the cameras per gate; as one example, if a single camera is used, it is preferable to photograph the four queues from behind and above at a sufficient distance.
- If such a camera position cannot be secured, two or more cameras may be used; for example, with two cameras, a method may be employed in which the first camera photographs two adjacent queues from behind and above and the second camera photographs the other two adjacent queues from behind and above.
- FIGS. 1 to 3 show an example in which the four queues in front of each departure gate are photographed from above, with a single camera, at a sufficient distance to the rear. That is, the four queues of the first gate are photographed by the first camera 2-1, those of the second gate by the second camera 2-2, those of the third gate by the third camera 2-3, and those of the fourth gate by the fourth camera 2-4, each from an overhead position separated to the rear.
- In the drawings, reference numeral 3 denotes a person forming a queue. In FIG. 3, to simplify the drawing, only part of each queue of persons 3 is shown and the majority of the remaining queue is omitted. Note also that in FIG. 3 the distance from the entrance of the departure gate to the mounting position of the camera 2 is greatly reduced because of space limitations.
- As each camera, a ceiling-mounted dome camera having a shooting rate of 30 frames per second and a communication function is employed; as shown in FIG. 2, it is mounted on the ceiling at a position sufficiently separated from, and facing, the entrance of the security checkpoint. The first camera 2-1 through the fourth camera 2-4, which take bird's-eye shots of the queues of the first to fourth gates respectively, are each connected via a LAN cable to a personal computer for generating waiting times (hereinafter, the "waiting time generation PC") 1.
- As the personal computer, a commercially available computer of ordinary performance, comprising a main unit (including a CPU, hard disk, memory, wireless LAN board, and the like), a display, and operation devices (including a mouse and keyboard), may be chosen arbitrarily; since the hardware configuration and system configuration (OS and so on) of such a computer are well known to those skilled in the art from various documents, their description is omitted.
- the waiting time generation PC 1 can be connected to an airport management server via a wireless LAN.
- FIG. 4 shows a general flowchart schematically showing the entire process (computer program) executed by the waiting time generation PC 1. As shown in the figure, this process basically sets a camera pointer to its initial value (step 100) and then updates it one after another through the first, second, third, and fourth cameras (step 106); each time it is updated, a series of processes is performed consisting of acquiring an image from the camera designated by the pointer (step 101), FIFO processing that retains the latest 120 frames (the past 4 seconds) (step 102), queue-end recognition processing (step 103), and waiting time estimation processing (step 104), and this series is repeated until the processing for all cameras is complete (step 105 NO). When the processing for all cameras is complete (step 105 YES), the estimated waiting time values are saved in a predetermined file (step 107) and the file is transferred to the airport management server (step 108); the above series of processes (steps 100 to 108) is repeated until operation is stopped by a predetermined operation (step 109 NO), whereupon (step 109 YES) the process ends. A rough sketch of this flow is given below.
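As an illustration only, the FIG. 4 flow might be sketched as follows in Python; the camera objects and the step 103/104 routines are placeholders (assumptions), and the 120-frame FIFO reflects the 30 fps / 4-second figures above.

```python
import collections

def waiting_time_generation_loop(cameras, recognize_queue_end, estimate_wait,
                                 save_to_file, upload_to_server, stop_requested):
    """Rough sketch of FIG. 4. All callables are placeholders for the steps
    described in the text; each camera is assumed to yield one frame per read()."""
    buffers = [collections.deque(maxlen=120) for _ in cameras]  # past 4 s at 30 fps
    waits = [None] * len(cameras)
    while not stop_requested():                        # step 109: run until stopped
        for i, cam in enumerate(cameras):              # steps 100/106: camera pointer
            frame = cam.read()                         # step 101: acquire image
            buffers[i].append(frame)                   # step 102: FIFO of latest 120 frames
            tail = recognize_queue_end(frame, buffers[i])   # step 103: queue-end recognition
            waits[i] = estimate_wait(tail)             # step 104: waiting time estimation
        save_to_file(waits)                            # step 107: save estimates to a file
        upload_to_server(waits)                        # step 108: transfer to airport server
```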
- The details of the queue-end recognition processing (step 103) are shown in FIG. 5. As shown in the figure, this processing basically consists of non-stationary pixel extraction processing (step 1030), grouping processing between adjacent pixel blocks (step 1031), search processing for the lowest point in each group (step 1032), and determination processing for the lowest point in each group (step 1033).
- The non-stationary pixel extraction processing (step 1030) extracts, from each frame image acquired every 1/30 second, the pixels constituting non-stationary objects such as people. As will be described in detail with reference to FIG. 6, it includes contour extraction processing (step 10300), background deletion processing (step 10301), and sharpening processing (step 10302).
- The grouping processing between adjacent pixel blocks (step 1031), described in detail later with reference to FIG. 9, divides the extracted non-stationary pixels into pixel blocks, each formed by gathering mutually adjacent pixels, and stores a series of pixel blocks close to one another in the predetermined queue extending direction among the divided pixel blocks as pixel blocks belonging to one group.
- The search processing for the lowest point in each group (step 1032), described in detail later with reference to FIG. 10, searches all pixel blocks belonging to one pixel block group for the farthest point in the predetermined queue extending direction (in this example, the downward direction of the screen, i.e., the lowest point) and stores it as a queue-end candidate.
- The determination processing for the lowest point in each group (step 1033), described in detail later with reference to FIG. 11, is processing that, when the farthest point stored as the queue-end candidate changes beyond a predetermined allowable value between consecutive frame images, determines the farthest point stored as the queue-end candidate as the farthest point to be recognized as the true queue end, based on the direction of the change and the stability after the change.
- As noted above, the non-stationary pixel extraction processing consists of contour extraction processing that converts the frame image into a line image, background deletion processing that deletes the line image portions corresponding to the background by background-difference processing, and sharpening processing (step 10302) that sharpens the lines constituting the line image after the background-difference processing. In this non-stationary pixel extraction processing, therefore, not the whole of each pixel block composed of non-stationary pixels but only its contour is selectively extracted; the pixels targeted by the subsequent search and collation processing are limited to the contour portions, and the large reduction in the number of target pixels contributes to improved processing speed.
- As shown in FIG. 7A, the contour extraction processing (step 10300) includes grayscale processing (step 10300-1) that converts the frame image into a monochrome image, spatial first-differentiation processing (step 10300-2) that applies spatial first differentiation to the monochrome image, and binarization processing (step 10300-3) that binarizes the differentiated image to generate a line image.
- The grayscale processing averages the RGB values based on any of various algorithms; in the present invention, any known method can be employed.
- The spatial first-differentiation processing is, in short, processing that generates pixel values corresponding to the luminance differences between adjacent pixels, thereby bringing out the contour portions of the pixel blocks contained in the grayscaled image.
- The binarization processing binarizes each pixel value of the spatially differentiated image with an appropriate luminance threshold, thereby extracting a line image corresponding to the contour portions of the pixel blocks. A minimal sketch of these three steps is given below.
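A minimal sketch of steps 10300-1 to 10300-3 using OpenCV (the patent does not name a library, so this is an assumption), with a hypothetical threshold value:

```python
import cv2
import numpy as np

def extract_contour_line_image(frame, thresh=40):
    """Contour extraction (step 10300): grayscale -> spatial first
    differentiation -> binarization. `thresh` is a hypothetical value."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)       # step 10300-1: grayscale
    gx = cv2.Sobel(gray, cv2.CV_32F, 1, 0)               # step 10300-2: first derivative in x
    gy = cv2.Sobel(gray, cv2.CV_32F, 0, 1)               #               and in y
    mag = cv2.magnitude(gx, gy)                          # luminance difference between neighbors
    mag8 = np.uint8(np.clip(mag, 0, 255))
    _, line_img = cv2.threshold(mag8, thresh, 255, cv2.THRESH_BINARY)  # step 10300-3
    return line_img                                      # white contour lines on black
```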
- FIG. 13A schematically shows an example of an image after gray scale processing
- FIG. 13B schematically shows an example of an image after spatial primary differentiation processing and binarization processing.
- In FIG. 13, PCG11, PCG12, and PCG13 are pixel blocks corresponding to the persons and the like forming the queue; PCG2 is a pixel block corresponding to a person or the like not forming the queue; PCG31 and PCG32 are pixel blocks corresponding to background objects; 4 denotes the contour line of a pixel block; and 5 denotes a dark speck corresponding to noise.
- Through the contour extraction processing described above, a line drawing including the contour lines 4 is obtained. At this stage, however, the thickness of the contour lines 4 is not uniform, there are partially broken, wavy portions, and quite a few specks 5 corresponding to noise remain.
- The background deletion processing (step 10301) compares the value of each pixel on the lines constituting the line image with the value of that pixel a predetermined number of frames earlier, and deletes matching pixels as background pixels. As shown in FIG. 8, this processing shifts the pixel designation (pointer) value from its initial value (for example, the upper left of the screen) one pixel at a time in the horizontal direction and one line at a time vertically downward (steps 10301-1, 10301-6); for each pixel, only if the pixel is part of a contour and matches the value of the same pixel 120 frames earlier is it deleted as background (steps 10301-2, 10301-3, 10301-4), and this is repeated until all pixels have been processed (step 10301-5 NO).
- With this background deletion processing, the image serving as the reference for the background difference is not fixed to an initial image containing only original background objects such as columns and tables but is always updated every 120 frame times; even a suitcase or other piece of luggage placed on the airport floor is deleted as a background item once it has been left for more than 120 frame times (4 seconds), so such luggage is not mistaken for part of the queue. A vectorized sketch of this rolling comparison is given below.
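The 120-frame rolling background difference could be sketched as below; this is an illustrative vectorized rendering of the per-pixel loop of FIG. 8, where a contour pixel whose value matches the same pixel 120 frames earlier is erased as background.

```python
import collections
import numpy as np

class RollingBackgroundDeleter:
    """Background deletion (step 10301, sketched): erase contour pixels whose
    value matches the same pixel 120 frames (4 s at 30 fps) earlier."""

    def __init__(self, n_frames=120):
        # keep n_frames + 1 images so history[0] is exactly n_frames before now
        self.history = collections.deque(maxlen=n_frames + 1)

    def apply(self, line_img):
        self.history.append(line_img.copy())
        if len(self.history) < self.history.maxlen:
            return line_img                  # not enough history yet: keep everything
        old = self.history[0]                # the line image 120 frames earlier
        out = line_img.copy()
        out[(line_img == 255) & (old == 255)] = 0   # matching contour pixel -> background
        return out
```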
- Details of the sharpening processing (step 10302) are shown in FIGS. 14A and 14B. Through this processing, the contour lines 4 become contour lines 4A in which the broken, wavy portions are repaired and thickened, and the noise specks 5 are erased.
- With the thickened contour lines 4A, the range to which the same label is attached when the labeling processing is executed later becomes wider, which contributes to larger and fewer pixel blocks. As sketched below, the contraction and expansion correspond to morphological erosion and dilation.
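As a sketch only, the contraction and expansion of step 10302 can be rendered with morphological erosion and dilation; the kernel size and iteration counts are hypothetical.

```python
import cv2
import numpy as np

def sharpen_line_image(line_img):
    """Sharpening (step 10302, sketched): contraction removes the noise specks 5,
    expansion reconnects and thickens broken contour lines (4 -> 4A)."""
    kernel = np.ones((3, 3), np.uint8)                   # hypothetical kernel size
    shrunk = cv2.erode(line_img, kernel, iterations=1)   # contraction: specks vanish
    thick = cv2.dilate(shrunk, kernel, iterations=2)     # expansion: gaps close, lines thicken
    return thick
```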
- The grouping processing between adjacent pixel blocks (step 1031) basically consists of labeling processing, which attaches the same label to mutually adjacent pixels among the extracted non-stationary pixels and thereby divides the pixels constituting the non-stationary objects into pixel blocks, and proximity pixel block search processing, which, taking as a reference the pixel located at the farthest point in the predetermined queue extending direction (in this example, the lowest point on the screen) within the pixel block corresponding to the head of the queue, searches a predetermined search area set toward the far side of the extending direction (in this example, downward on the screen; for instance, X pixels horizontally and Y pixels downward) for a pixel bearing another label, and, when such a pixel exists, stores the pixel block containing it as a pixel block belonging to the same group.
- More specifically, the labeling processing (step 10310) focuses on each pixel forming the image and attaches the same label to the pixels adjacent to it (that is, the four pixels above, below, left, and right, or the eight pixels additionally including the upper right, lower right, upper left, and lower left). Here, the pixels forming the image are the pixels included in the thickened contour lines 4A described above with reference to FIG. 14.
- By this labeling processing, in the image of FIG. 14B, the first label (label 1) is attached to all pixels forming the contour line 4A of the pixel block PCG11, the second label (label 2) to all pixels forming the contour line 4A of the pixel block PCG12, the third label (label 3) to all pixels forming the contour line 4A of the pixel block PCG13, and the fourth label (label 4) to all pixels forming the remaining contour line 4A.
- Next, the proximity pixel block search processing (steps 10311 to 10315) will be described in more detail with reference to FIG. 15. First, the target label is set to an initial value (for example, label 1) (step 10311). Then it is determined whether a pixel with a different label (for example, label 2 or label 3) exists in the predetermined search area A set with reference to the lowest point (Pz) of the pixel block bearing the target label (step 10312). If such a pixel exists, the pixel block containing it (for example, the contour 4A portion of the pixel block PCG12) is stored as belonging to the same group (for example, group G1, the first group) (step 10313). Taking the newly stored pixel block as the reference, the above processing is repeated until no pixel with another label is found in the predetermined search area A as viewed from the lowest point (Pz) of the pixel block (steps 10312, 10313). When no further pixel is found, the target label is updated to an unprocessed value (step 10314), and the above processing (steps 10312, 10313) is repeated until the processing is complete for all labels (step 10315 NO).
- Because the search area A used in this proximity pixel search processing (steps 10311 to 10315) extends mainly downward on the screen, on the assumption that the queue extends downward in the screen, the constituent pixels of successive pixel blocks are found within it, and the found pixel blocks PCG12 and PCG13 are incorporated into the same group G1. Group G1 therefore excludes the passersby around the queue and contains only the series of pixel blocks PCG11, PCG12, and PCG13 that have a high possibility of forming the queue. A sketch of this labeling and grouping is given below.
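Steps 10310 to 10315 might look as follows; connected-component labeling stands in for the labeling step, the X-by-Y search area A below each block's lowest point Pz is parameterized with hypothetical values, and the choice of the head-equivalent seed block (the one highest on screen) is an assumption.

```python
import cv2
import numpy as np

def group_pixel_blocks(line_img, search_x=40, search_y=60):
    """Labeling (step 10310) plus proximity search (steps 10311-10315), sketched.
    search_x / search_y are hypothetical half-width and depth of search area A."""
    n_labels, labels = cv2.connectedComponents(line_img)   # one label per pixel block
    h, w = labels.shape
    lowest = {}                                            # lowest point Pz of each block
    for lab in range(1, n_labels):
        ys, xs = np.nonzero(labels == lab)
        i = ys.argmax()
        lowest[lab] = (int(xs[i]), int(ys[i]))

    groups, unassigned = [], set(lowest)
    while unassigned:
        # assumption: the head-equivalent block of each lane is the one
        # whose lowest point lies nearest the top of the frame
        seed = min(unassigned, key=lambda lab: lowest[lab][1])
        group, frontier = {seed}, [seed]
        while frontier:                                    # grow until area A finds nothing new
            x, y = lowest[frontier.pop()]
            window = labels[y:min(h, y + search_y),
                            max(0, x - search_x):min(w, x + search_x)]
            for other in np.unique(window):
                other = int(other)
                if other != 0 and other in unassigned and other not in group:
                    group.add(other)
                    frontier.append(other)
        groups.append(group)
        unassigned -= group
    return groups, lowest, labels
```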
- This in-group lowest point search processing is, in short, processing that identifies the farthest point (the lowest point) within a group by repeating, for all pixel blocks belonging to the group, a search for the pixel located at the farthest point (the lowest point) in the predetermined queue extending direction (downward on the screen in this example) among the pixels of each block.
- More specifically, this processing sets the group designation (pointer) value to an initial value (for example, group G1 corresponding to the first lane) (step 10320), searches all the pixel blocks belonging to that group (for example, PCG11, PCG12, and PCG13) for the pixel forming the lowest point (step 10321), and stores the found lowest point (step 10322); the group designation is updated one after another (step 10324) and the processing (steps 10321, 10322) is repeated until the search of all groups is complete (step 10323), whereupon the processing ends. A sketch of this search is given below.
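Given the grouping result sketched above, the FIG. 10 search (steps 10320 to 10324) reduces, for one group, to taking the pixel with the greatest y coordinate over all of the group's blocks:

```python
import numpy as np

def lowest_point_of_group(group, labels):
    """In-group lowest point search (steps 10320-10324, sketched): the farthest
    point in the queue extending direction (largest y) over all blocks of a group."""
    ys, xs = np.nonzero(np.isin(labels, list(group)))   # pixels of every block in the group
    i = ys.argmax()                                     # screen-downward = far side of the queue
    return int(xs[i]), int(ys[i])
```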
- The determination processing for the lowest point in each group basically includes counter control processing, which increments a predetermined frame counter when the difference between the current frame's value of the farthest point in the group (in this example, the lowest point) and the previous frame's value is within a predetermined allowable value, and clears the frame counter when the allowable value is exceeded; and true farthest point update control processing, which, when the difference exceeds the allowable value, immediately updates the true farthest point value (the true lowest point) with the current frame's value if the direction of change is toward the near side (in this example, upward on the screen), and, if the direction of change is toward the far side (in this example, downward on the screen), updates the true farthest point value with the current frame's value only after waiting for the frame counter to reach a predetermined number of frames.
- More specifically, when the processing is started, the group designation (pointer) value is first initialized to the value corresponding to the queue of the first lane (step 10330), and it is then determined whether the difference between the lowest point of the current frame and the lowest point of the previous frame is within a predetermined allowable value (step 10331). If it is within the allowable value (step 10331 YES), a predetermined frame counter for measuring the stable time is incremented by 1 (step 10332), and it is then determined whether the value of the frame counter has exceeded a specified value (for example, 90 frames, corresponding to 3 seconds), that is, whether a predetermined stable time has been reached (step 10333). If the specified value has been exceeded (step 10333 YES), it is judged that extension of the queue has been confirmed, and the current lowest point is determined to be the true lowest point (step 10334); if the specified value has not yet been exceeded (step 10333 NO), it cannot yet be determined whether the queue has been extended, and the lowest point determination processing (step 10334) is skipped.
- If it is determined that the difference between the lowest point of the current frame and the lowest point of the previous frame is not within the allowable value (step 10331 NO), the queue may have been extended or shortened; the frame counter is cleared (step 10337), and it is then determined whether the lowest point of the current frame has fallen below the lowest point of the previous frame (step 10338). If it has (step 10338 YES), the queue may have been extended, so the lowest point determination processing (step 10334) is skipped; if it has not (step 10338 NO), it is immediately judged that the queue has been shortened, and the lowest point determination processing (step 10334) is executed. Thereafter, the group designation value is updated one after another (step 10336) and the above series of processing is executed until the processing for all groups (all lanes) is complete (step 10335 YES), whereupon the processing ends.
- By the above processing (steps 10330 to 10338) being repeated for each frame, when the queue is shortened and the queue end moves forward, the queue end is immediately updated to the post-movement value, whereas when the queue is extended and the queue end moves backward, the queue end is updated to the post-movement value only after a predetermined confirmation time. Further, as shown in FIG. 16, even when a passerby crosses behind the end of a normal queue (see FIG. 16A), as indicated by the arrow 6, and a pixel block PCG22 temporarily appears beyond the queue end, the confirmation time prevents it from being immediately mistaken for the queue end. A sketch of this per-lane state machine is given below.
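The per-frame logic of FIG. 11 (steps 10330 to 10338) for a single lane might be sketched as the following state machine; the pixel tolerance is hypothetical, while the 90-frame stable time follows the figure quoted above.

```python
class TailPointTracker:
    """Per-lane sketch of FIG. 11: shortening is confirmed immediately,
    extension only after a stable time. `tol` (pixels) is hypothetical;
    the 90-frame stable time follows the embodiment."""

    def __init__(self, tol=5, stable_frames=90):
        self.tol, self.stable_frames = tol, stable_frames
        self.counter = 0
        self.prev_y = None
        self.true_tail_y = None

    def update(self, lowest_y):
        if self.prev_y is None:                        # first frame: adopt as-is
            self.prev_y = self.true_tail_y = lowest_y
            return self.true_tail_y
        if abs(lowest_y - self.prev_y) <= self.tol:    # step 10331: within tolerance
            self.counter += 1                          # step 10332: count stable frames
            if self.counter >= self.stable_frames:     # step 10333: stable time reached
                self.true_tail_y = lowest_y            # step 10334: extension confirmed
        else:
            self.counter = 0                           # step 10337: clear the counter
            if lowest_y < self.prev_y:                 # step 10338 NO: tail moved up
                self.true_tail_y = lowest_y            # step 10334: shortening, confirm now
            # step 10338 YES: tail moved down (possible extension) -> wait for stability
        self.prev_y = lowest_y
        return self.true_tail_y
```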
- The details of the waiting time estimation processing (step 104) are shown in FIG. 12. This processing first sets the group designation (pointer) to an initial value (the value corresponding to the first lane) (step 1041), then reads the lowest point of the group designated by the pointer (step 1042), obtains the waiting time by referring to the correlation table with the read lowest point (step 1043), and records the obtained waiting time as the waiting time of the corresponding lane (step 1044).
- This correlation table is prepared in advance by a statistical method. That is, as shown in FIG. 3, reference lines Pref1 to Pref5 are defined for the queue end, the waiting time when the queue end lies at the position of each of the reference lines Pref1 to Pref5 is examined statistically, and the following correlation table showing the relationship between each reference position and each waiting time so obtained is stored in a predetermined memory.
- Correlation table (queue end position: waiting time) — Pref1: 5 minutes or less; Pref2: about 10 minutes; Pref3: about 15 minutes; Pref4: about 20 minutes; Pref5: about 30 minutes or more.
- In the waiting time acquisition processing (step 1043), the queue end position derived from the read lowest point is collated against the reference lines Pref1 to Pref5 of the correlation table, and where necessary the waiting time is generated by interpolating the gaps between the reference lines, for example by linear approximation. The waiting time thus obtained is recorded as the waiting time of the corresponding lane (step 1044). A sketch of this lookup with interpolation is given below.
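A sketch of the table lookup of step 1043 with linear interpolation between reference lines; the y coordinates of Pref1 to Pref5 are hypothetical, while the minute values follow the correlation table above.

```python
# (y coordinate of reference line, waiting time in minutes); the y values are
# hypothetical, the minute values follow the correlation table above.
CORRELATION_TABLE = [(100, 5), (180, 10), (260, 15), (340, 20), (420, 30)]

def estimate_wait_minutes(tail_y, table=CORRELATION_TABLE):
    """Waiting time acquisition (step 1043, sketched): collate the queue end
    position against Pref1-Pref5 and linearly interpolate between them."""
    if tail_y <= table[0][0]:
        return table[0][1]                     # at or before Pref1: 5 minutes or less
    for (y0, t0), (y1, t1) in zip(table, table[1:]):
        if tail_y <= y1:                       # between two reference lines
            return t0 + (t1 - t0) * (tail_y - y0) / (y1 - y0)
    return table[-1][1]                        # beyond Pref5: about 30 minutes or more
```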
- The above series of processing (steps 1042 to 1044) is executed repeatedly while the group designation is updated (step 1046), until the processing of all groups is complete (step 1045 YES); thereafter, processing for obtaining the waiting time of the entire gate from the per-lane waiting times (step 1047) and processing for recording the obtained waiting time as the waiting time of the corresponding gate (step 1048) are executed in order, and the processing ends.
- As the processing for obtaining the waiting time of the entire gate from the per-lane waiting times (step 1047), various methods may be selected and employed as appropriate, for example taking the maximum of the per-lane waiting times as the gate's waiting time, or taking their average as the gate's waiting time.
- FIG. 17 shows an image example after contour extraction processing
- FIG. 18 shows an example image after background deletion and sharpening processing
- FIG. 19 shows an example image after grouping processing
- In FIG. 19, each queue end position is indicated by a one-dot chain line, and the corresponding waiting time is given at the end of its leader line; the pixel blocks grouped as part of a queue are hatched, while the pixel blocks corresponding to other staff members and the like are shown with a matte texture. A white rectangular mark corresponding to the search area A is drawn in the lower left corner of the screen.
- <Use of queue waiting time information> Returning to FIG. 4, when the series of processing described above (steps 101 to 104) has been completed for all the gate cameras of the first to fourth gates (step 105 YES), the waiting time estimates thus obtained are stored in a predetermined file (step 107), and this file is transferred to the airport-side management server for various uses. As one example of use, the waiting time for each departure gate can be listed on a Web site operated by the airport for browsing by passengers.
- According to the embodiment described above, since processing that compares between frames the lowest point found within a group is adopted, the reliability of end recognition can be maintained even when the number of pixel blocks varies from frame to frame, without being affected by that variation.
- Further, since the contraction processing and the expansion processing are introduced to remove the noise specks and to make the contour lines continuous and thick, good connection between pixels is achieved in the subsequent labeling processing, the pixel blocks become larger and fewer, and the efficiency of the proximity search between pixel blocks in the subsequent grouping processing is thereby improved.
- In the embodiment described above, the image processing apparatus for queue-end recognition according to the present invention is applied for the purpose of estimating waiting times; in addition, it can of course be applied to various other uses, such as automatically increasing or decreasing the number of open lanes according to the maximum value of the recognized queue end, automatically calling checkout staff to increase or decrease the number of open registers in the case of a supermarket, or analyzing gate usage trends from the time-series record of recognized queue ends to find an optimal gate operation method.
- According to the image processing apparatus for queue-end recognition of the present invention, the end of a queue appearing in front of an entrance gate, a ticketing machine, or the like can be recognized with relatively high accuracy, and by using this, the waiting time of the queue, for example, can be obtained.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Image Analysis (AREA)
Abstract
[Problem] To enable the recognition, with relatively high accuracy, of the end of a queue formed in front of an entrance gate, ticket machine, or the like, and to estimate waiting time for said queue on the basis thereof. [Solution] Non-static body pixels extracted from a series of frames constituting a video are segmented into pixel clusters, which are formed by consolidating neighboring pixels into a single cluster. In conjunction with this, among said segmented pixel clusters, a series of pixel clusters that are adjacent to each other in the predetermined direction in which the queue extends is stored as a single group of pixel clusters. The farthest point in the predetermined direction in which the queue extends is searched for among the pixel clusters belonging to the one pixel cluster group, and this point is stored as a candidate end of the queue. When the stored farthest point changes between continuous frames in such a way as to exceed the predetermined allowance, said farthest point, which was stored as a candidate end of the queue, is confirmed as the true farthest point on the basis of the direction of the change as well as post-change stability.
Description
The present invention relates to an image processing apparatus for recognizing the end of a queue and, for example, to an image processing apparatus for queue-end recognition suited to uses such as waiting-time estimation for a queue that appears in front of an entrance gate or a ticketing machine.
Various conventional examples are known as image processing devices used for purposes such as waiting-time estimation for a queue that appears in front of an entrance gate or a ticketing machine.
As one conventional example, a device is known that pays attention to the individual persons forming the queue and extracts their images, connects adjacent images with straight lines to calculate the total length of the queue line, and divides this by a separately estimated queue movement speed to calculate the waiting time of the queue (hereinafter, the "first conventional example"; see Patent Document 1).
As another conventional example, a device is known that regards the residual image obtained as the difference between a background-difference image and an inter-frame difference image as an aggregate of people and recognizes the largest such aggregate as the queue, divides the area of the recognized queue by a predetermined unit area to obtain the number of people in the queue, and divides this by a predetermined number of entrants per unit time to obtain the waiting time (hereinafter, the "second conventional example"; see Patent Document 2).
As yet another conventional example, a device is known that tracks the edge information of only moving objects based on an image obtained by applying labeling processing to an image in which an edge background-difference image and an inter-frame difference image are superimposed, generates contour images of the moving objects, detects the convex shapes of the contour images to extract the heads of the persons in the image, and counts the number of people from those heads (hereinafter, the "third conventional example"; see Patent Document 3).
The first conventional example has the advantage of being applicable not only to a queue extending in a substantially straight line but also to one that bends or meanders; however, individually extracting person images and linking them requires complicated, time-consuming processing and a heavy computer load, individual extraction is difficult in images where many persons overlap, and for all the complicated processing the waiting-time accuracy that can be expected is limited.
The second conventional example avoids complicated processing by focusing on a group of people rather than on each individual; however, the algorithm that regards the residual image obtained from the difference between the background-difference image and the inter-frame difference image as a collection of people and immediately recognizes the largest collection as the queue is easily affected by passersby around the queue.
In the third conventional example, the algorithm of extracting human heads by detecting the convex shapes of a contour image may be applicable to a relatively dispersed crowd, such as in a station plaza; however, when applied to an image in which many people are likely to overlap, such as a rear bird's-eye view of a queue, extracting the heads of all individual persons appears quite difficult.
The present invention has been made in view of the technical background described above. Its object is to provide an image processing apparatus for queue-tail recognition that can recognize, with relatively high accuracy, the tail of a queue appearing in front of, for example, an entrance gate or a ticketing machine, and that, by using this recognition, also makes it possible to determine, for example, the waiting time of the queue.
Still other objects and operational effects of the present invention will be readily understood by those skilled in the art by referring to the following description in this specification.
To achieve the above object, the image processing apparatus for queue-tail recognition according to the present invention is configured as follows. That is, the image processing apparatus for queue-tail recognition according to the present invention is an image processing apparatus for recognizing the tail of a queue from a moving image obtained by overhead photography of an area in which a queue consisting of one or more lanes is expected to appear, and comprises non-stationary-object extraction means, pixel-block grouping means, farthest-point search means, and farthest-point determination means.
The non-stationary-object extraction means extracts the pixels constituting non-stationary objects from each of the series of frame images making up the moving image. The pixel-block grouping means partitions the extracted non-stationary-object pixels into pixel blocks, each formed by gathering mutually adjacent pixels into one cluster, and stores, among the partitioned pixel blocks, a series of pixel blocks that are close to one another in a predetermined queue-extension direction as pixel blocks belonging to one group. The farthest-point search means searches, among all the pixel blocks belonging to one pixel-block group, for the farthest point in the predetermined queue-extension direction and stores it as a queue-tail candidate. When the farthest point stored as the queue-tail candidate changes between consecutive frame images by more than a predetermined tolerance, the farthest-point determination means fixes the farthest point stored as the queue-tail candidate as the farthest point to be recognized as the true queue tail, on the basis of the direction of the change and the stability after the change. As a method of assessing the "stability after the change", various methods that take the influence of noise into account can be adopted: not only a method that determines whether the changed value is maintained substantially unchanged over a fixed number of consecutive frames, but also, for example, a method that determines whether it is maintained substantially unchanged in at least a fixed proportion of a fixed number of consecutive frames.
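To make the division of labor among these four means concrete, the following is a minimal Python sketch of the per-frame pipeline, written as a composition of four stage functions. The function names and the idea of passing the stages in as callables are illustrative assumptions for this sketch, not part of the specification; concrete sketches of individual stages appear in later sections.

```python
def recognize_queue_tail(frame, extract, group, find_farthest, confirm):
    """One frame of queue-tail recognition, as a chain of the four means.

    extract(frame)     -> non-stationary-object pixels (stage 1)
    group(pixels)      -> pixel-block groups chained along the
                          queue-extension direction (stage 2)
    find_farthest(g)   -> farthest point of one group, i.e. the
                          queue-tail candidate (stage 3)
    confirm(candidate) -> true tail, accepted or held back based on
                          change direction and stability (stage 4)
    """
    pixels = extract(frame)
    groups = group(pixels)
    candidates = [find_farthest(g) for g in groups]
    return [confirm(c) for c in candidates]
```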
With this configuration, among the one or more pixel blocks making up the non-stationary-object pixels, a series of pixel blocks that are close to one another in the predetermined queue-extension direction is stored as pixel blocks belonging to one group; consequently, only the pixel blocks that are truly likely to form the queue can be reliably traced from head to tail without interference from passersby and the like around the queue. Furthermore, the farthest point is searched for not merely in the pixel block last added to the group but among all the pixel blocks contained in that group, so that the group is surveyed once more and the pixel most likely to be the queue tail can be taken as the farthest point. In addition, the farthest points so obtained are not all accepted as the true farthest point as they are; they are reconfirmed under different criteria depending on whether the queue has lengthened or shortened, so that misrecognition of the queue tail caused by passersby and the like near the tail can be avoided while recognition responsiveness is maintained. Moreover, since the people in the queue are not recognized individually, the image processing as a whole is a combination of relatively simple elemental techniques; the processing is therefore simple, the processing time is comparatively short, and the apparatus can easily be implemented on an ordinary commercially available personal computer.
In a preferred embodiment of the present invention, the farthest-point determination means may include processing in which, when the direction of change of the farthest point is away from the head of the queue, the farthest point stored as the queue-tail candidate is fixed as the farthest point to be recognized as the true queue tail only if the changed farthest point remains substantially constant over a predetermined number of frames.
With this configuration, when the direction of change of the farthest point is away from the head of the queue, whether the changed farthest point is stable can be reliably judged simply by determining for how many frames it persists, and misrecognition of the farthest point based on a false farthest point caused by a passerby or the like can be avoided by simple processing.
In a preferred embodiment of the present invention, the farthest-point determination means may include processing in which, when the direction of change of the farthest point is toward the head of the queue, the farthest point stored as the queue-tail candidate is fixed immediately (without waiting a predetermined number of frames) as the farthest point to be recognized as the true queue tail.
With this configuration, when the queue tail approaches the head, for example because the queue has moved forward, this is judged to be a normal tail movement and the changed farthest point is immediately recognized as the true farthest point; thus, when the queue tail has advanced normally, that fact can be recognized without delay.
According to a preferred embodiment of the present invention, the overhead photography may be performed from behind the expected queue, and the predetermined queue-extension direction may be the downward direction of each frame image.
With this configuration, the tail of the queue in a frame image moves in the vertical direction of that image, so the assumed queue-extension direction coincides with the vertical direction of the frame image, and whether each pixel is near or far from the queue tail can easily be judged by using the vertical coordinate value of the pixel as it is.
According to a preferred embodiment of the present invention, the non-stationary-object extraction means may include contour extraction means that converts the frame image into a line image formed by extracting the contours of the various objects contained in it, background deletion means that deletes, using background-difference processing, the portions of the line image corresponding to the background, and sharpening means that sharpens the lines constituting the line image after the background-difference processing.
With this configuration, the contour extraction processing is performed before the background deletion processing, and the line image is further sharpened after contour extraction. The non-stationary-object image obtained in this way is therefore one in which the interiors of the non-stationary objects contained in the original frame image are left blank and their contours are sharpened, so the various subsequent arithmetic operations can be restricted to the pixels contained in these contour images; the reduction in the number of pixels to be processed lightens the processing load and enables high-speed operation.
In this case, if the contour extraction means includes grayscale conversion means that converts the frame image into a monochrome image, spatial first-derivative means that applies spatial first-derivative processing to the monochrome image, and binarization means that binarizes the image after the spatial first-derivative processing to generate the line image, a line image emphasizing the luminance differences of all the objects contained in the original frame image can easily be obtained.
If the background deletion means includes processing that compares the value of each pixel contained in the lines constituting the line image with the value of that pixel a predetermined number of frames earlier and deletes, as a background pixel, any pixel whose two values coincide, then, among the objects contained in the line image, only the line images of objects that have appeared within the predetermined number of past frames can be reliably extracted, that is, line images from which not only true background objects such as buildings and tables but also luggage and the like that has remained in place for a fixed time or longer have been removed.
Furthermore, if the sharpening means includes erosion means that removes fine specks caused by noise and the like contained in the line image after background deletion, and dilation means that joins the broken, dashed portions of the lines contained in the line image after background processing and thickens them, then blurring and breaks in the lines of the line image are repaired and the noise specks present in the spaces enclosed by the contours are removed. This enlarges the pixel regions given the same label in the later labeling processing, so the entire line image of a non-stationary object is partitioned into a relatively small number of larger pixel blocks, which contributes to the efficiency of the grouping processing described later.
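As one concrete way to realize such sharpening, the erosion and dilation could be implemented with standard morphological operations, for example using OpenCV. The kernel size and iteration counts below are illustrative assumptions, not values given in the specification.

```python
import cv2
import numpy as np

def sharpen_line_image(binary_line_image):
    """Sharpen a binarized contour image: erode to remove small noise
    specks, then dilate to bridge broken line segments and thicken lines."""
    kernel = np.ones((3, 3), np.uint8)  # illustrative 3x3 structuring element
    eroded = cv2.erode(binary_line_image, kernel, iterations=1)
    dilated = cv2.dilate(eroded, kernel, iterations=2)
    return dilated
```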
According to a preferred embodiment of the present invention, the pixel-block grouping means may include labeling means that partitions the pixels constituting the extracted non-stationary objects into pixel blocks by attaching the same label to mutually adjacent pixels, and proximity-block search means that, taking as a reference the pixel located at the farthest point in the predetermined queue-extension direction among the pixels contained in one pixel block corresponding to the head of a queue, searches a predetermined search region, set toward the far side in the extension direction, for a pixel bearing a different label; when such a pixel exists, the pixel block containing it is stored as a pixel block belonging to the same group, and this processing is repeated for each pixel block newly stored in the group until no pixel bearing a different label is found any longer, and is carried out for every head-equivalent pixel block. A code sketch of this chaining procedure is given below.
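The following is a minimal sketch of this labeling-and-chaining idea, assuming a binary (uint8) image of contour pixels, 8-connectivity labeling via OpenCV, and a rectangular search region extending downward (the assumed queue-extension direction). Which blocks count as queue heads is assumed to be given, for example from known lane entrance positions, and the region dimensions are hypothetical parameters.

```python
import cv2
import numpy as np

def group_pixel_blocks(binary, head_labels, search_h=40, search_w=60):
    """Chain labeled pixel blocks along the downward queue direction.

    binary: uint8 image with non-stationary contour pixels set to 255.
    head_labels: labels of the blocks taken as queue heads.
    Returns one set of labels (one group) per queue head.
    """
    n, labels = cv2.connectedComponents(binary, connectivity=8)
    groups = []
    for head in head_labels:
        group = {head}
        frontier = [head]
        while frontier:
            lab = frontier.pop()
            ys, xs = np.nonzero(labels == lab)
            y, x = ys.max(), xs[ys.argmax()]   # farthest (lowest) pixel
            # Search region set toward the far side (downward).
            win = labels[y + 1 : y + 1 + search_h,
                         max(0, x - search_w) : x + search_w]
            for other in np.unique(win):
                if other != 0 and other not in group:
                    group.add(other)
                    frontier.append(other)
        groups.append(group)
    return groups
```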
With this configuration, by applying, to the plurality of pixel blocks (contour images) each bearing its own label, a proximity-block search based on a search region set toward the expected extension direction of the queue, the pixel blocks formed by surrounding passersby and the like are excluded, and only the series of pixel blocks forming the queue of each lane, from head to tail, can be extracted reliably in order and associated with one group.
According to a preferred embodiment of the present invention, the farthest-point search means may include processing that identifies the farthest point within one group by repeating, for every pixel block belonging to that group, the processing of searching the pixels contained in a pixel block for the pixel located at the farthest point in the predetermined queue-extension direction.
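Under the same assumptions as the grouping sketch above (a label image and downward queue extension), the within-group search reduces to taking the maximum vertical coordinate over all pixels of all blocks in the group. The helper below is illustrative only.

```python
import numpy as np

def find_farthest_point(labels, group):
    """Return (y, x) of the lowest pixel over all blocks in a group,
    i.e. the farthest point in the downward queue-extension direction.

    labels: label image as produced by the grouping sketch above.
    group:  set of labels belonging to one queue (assumed non-empty).
    """
    ys, xs = np.nonzero(np.isin(labels, list(group)))
    i = ys.argmax()                 # largest row index = lowest pixel
    return int(ys[i]), int(xs[i])
```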
With this configuration, the processing of searching for the pixel located at the farthest point in the predetermined queue-extension direction is executed not only for the pixel block last associated with a group but for all the pixel blocks in that group, so the farthest point most likely to be the queue tail can be found more reliably.
According to a preferred embodiment of the present invention, the farthest-point determination means may include counter control means that increments a predetermined frame counter when the difference between the current-frame value of the within-group farthest point and its value in the immediately preceding frame is within a predetermined tolerance, and clears the frame counter when the tolerance is exceeded, and farthest-point true-value update control means that, when the within-group farthest point changes from the immediately preceding frame by more than the predetermined tolerance, updates the true farthest-point value with the current-frame value immediately if the direction of change is toward the head, and, if the direction of change is away from the head, updates the true farthest-point value with the current-frame value only after the frame counter has reached a predetermined number of frames.
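This asymmetric update rule could be expressed as a small per-group state machine, sketched below in Python. Coordinates are assumed to grow downward, so a larger value means farther from the queue head; the tolerance and the required number of stable frames are hypothetical parameters.

```python
class FarthestPointConfirmer:
    """Confirm the true queue tail from per-frame farthest-point values.

    y grows in the queue-extension direction (downward in the frame),
    so an increasing y means the tail candidate moved away from the head.
    """

    def __init__(self, tolerance=10, stable_frames=30):
        self.tolerance = tolerance          # allowed jitter in pixels
        self.stable_frames = stable_frames  # e.g. about 1 s at 30 fps
        self.prev = None                    # previous frame's candidate
        self.true_tail = None               # confirmed farthest point
        self.counter = 0                    # consecutive stable frames

    def update(self, y):
        if self.prev is None:
            self.prev = self.true_tail = y
            return self.true_tail
        if abs(y - self.prev) <= self.tolerance:
            self.counter += 1               # candidate is holding steady
        else:
            self.counter = 0                # large jump: restart the count
        if y < self.true_tail - self.tolerance:
            self.true_tail = y              # toward the head: accept at once
        elif y > self.true_tail + self.tolerance:
            if self.counter >= self.stable_frames:
                self.true_tail = y          # away from the head: accept only
                                            # after sustained stability
        self.prev = y
        return self.true_tail
```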
With this configuration, when the direction of change of the farthest point is toward the head of the queue, the farthest point stored as the queue-tail candidate is fixed immediately, without delay, as the farthest point to be recognized as the true queue tail; when the direction of change is away from the head of the queue, whether the changed farthest point is stable can be reliably judged simply by determining for how many frames it persists, and misrecognition of the farthest point based on a false farthest point caused by a passerby or the like can be avoided by simple processing.
Viewed from another aspect, the present invention can also be understood as an image processing method for queue-tail recognition. That is, this image processing method for queue-tail recognition is a method for recognizing the tail of a queue from a moving image obtained by overhead photography of an area in which a queue consisting of one or more lanes is expected to appear, and comprises a non-stationary-object extraction step, a pixel-block grouping step, a farthest-point search step, and a farthest-point determination step.
The non-stationary-object extraction step extracts the pixels constituting non-stationary objects from each of the series of frame images making up the moving image. The pixel-block grouping step partitions the extracted non-stationary-object pixels into pixel blocks, each formed by gathering mutually adjacent pixels into one cluster, and stores, among the partitioned pixel blocks, a series of pixel blocks that are close to one another in a predetermined queue-extension direction as pixel blocks belonging to one group. The farthest-point search step searches, among all the pixel blocks belonging to one pixel-block group, for the farthest point in the predetermined queue-extension direction and stores it as a queue-tail candidate. When the farthest point stored as the queue-tail candidate changes between consecutive frame images by more than a predetermined tolerance, the farthest-point determination step fixes the farthest point stored as the queue-tail candidate as the farthest point to be recognized as the true queue tail, on the basis of the direction of the change and the stability after the change.
According to this method invention, in addition to the same operational effects as the apparatus invention, when the invention is carried out by, for example, installing the necessary image processing program on a personal computer, the program can also be executed without being resident on that personal computer, as in the case where the program is normally kept on an external third-party storage device or in a so-called cloud center and transferred to the personal computer as needed.
Viewed from yet another aspect, the present invention can also be understood as a computer program for causing a computer to function as an image processing apparatus for queue-tail recognition. That is, in order to recognize the tail of a queue from a moving image obtained by overhead photography of an area in which a queue consisting of one or more lanes is expected to appear, this computer program causes the computer to include non-stationary-object extraction means, pixel-block grouping means, farthest-point search means, and farthest-point determination means.
The non-stationary-object extraction means extracts the pixels constituting non-stationary objects from each of the series of frame images making up the moving image. The pixel-block grouping means partitions the extracted non-stationary-object pixels into pixel blocks, each formed by gathering mutually adjacent pixels into one cluster, and stores, among the partitioned pixel blocks, a series of pixel blocks that are close to one another in a predetermined queue-extension direction as pixel blocks belonging to one group. The farthest-point search means searches, among all the pixel blocks belonging to one pixel-block group, for the farthest point in the predetermined queue-extension direction and stores it as a queue-tail candidate. When the farthest point stored as the queue-tail candidate changes between consecutive frame images by more than a predetermined tolerance, the farthest-point determination means fixes the farthest point stored as the queue-tail candidate as the farthest point to be recognized as the true queue tail, on the basis of the direction of the change and the stability after the change.
According to this program invention, in addition to the same operational effects as the apparatus invention, it goes without saying that rights can be exercised, on the basis of the program alone, against acts of working such as manufacture and sale by an unauthorized third party.
Viewed from another aspect, the present invention can also be understood as a queue-tail recognition system comprising a camera that takes overhead images of an area in which a queue consisting of one or more lanes is expected to appear, and an image processing apparatus for recognizing the tail of the queue from the moving image obtained from the camera.
In this case, the image processing apparatus includes non-stationary-object extraction means, pixel-block grouping means, farthest-point search means, and farthest-point determination means.
Here, the non-stationary-object extraction means extracts the pixels constituting non-stationary objects from each of the series of frame images making up the moving image. The pixel-block grouping means partitions the extracted non-stationary-object pixels into pixel blocks, each formed by gathering mutually adjacent pixels into one cluster, and stores, among the partitioned pixel blocks, a series of pixel blocks that are close to one another in a predetermined queue-extension direction as pixel blocks belonging to one group. The farthest-point search means searches, among all the pixel blocks belonging to one pixel-block group, for the farthest point in the predetermined queue-extension direction and stores it as a queue-tail candidate. When the farthest point stored as the queue-tail candidate changes between consecutive frame images by more than a predetermined tolerance, the farthest-point determination means fixes the farthest point stored as the queue-tail candidate as the farthest point to be recognized as the true queue tail, on the basis of the direction of the change and the stability after the change.
According to this system invention, in addition to the same operational effects as the apparatus invention, not only the image processing apparatus but also the overhead photography camera itself constitutes equipment forming part of the invention; it therefore goes without saying that, against acts of working such as manufacture and sale by an unauthorized third party, rights can be exercised not only over the computer constituting the image processing apparatus but also over the overhead photography camera.
The image processing apparatus for queue-tail recognition according to the present invention can further be developed into a queue waiting-time estimation device by adding waiting-time estimation means that estimates the current queue waiting time by checking the queue-tail position obtained from the image processing apparatus against a statistically predetermined correlation between queue-tail positions and the waiting times of the queue.
With such a queue waiting-time estimation device, for example, a table can be prepared in advance that pairs each of a plurality of stepped queue-tail reference positions with the statistical waiting time obtained by repeated actual measurement for that reference position; by executing table lookup processing and appropriate interpolation processing on the basis of the queue-tail position obtained from the device of the present invention, the corresponding waiting time can be estimated from the automatically generated queue-tail position. A sketch of such a lookup is shown below.
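A minimal illustration of table lookup with linear interpolation, assuming tail positions measured as vertical pixel coordinates and a small hypothetical calibration table (the numbers are invented for illustration, not measured values):

```python
import numpy as np

# Hypothetical calibration: tail reference positions (pixels from the
# top of the frame) paired with statistically measured waiting times.
TAIL_POSITIONS = np.array([100, 200, 300, 400, 500])  # farther = longer queue
WAIT_MINUTES   = np.array([  2,   5,  10,  18,  30])

def estimate_wait_minutes(tail_y):
    """Estimate waiting time by table lookup with linear interpolation."""
    return float(np.interp(tail_y, TAIL_POSITIONS, WAIT_MINUTES))

print(estimate_wait_minutes(250))  # -> 7.5 (midway between 5 and 10)
```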
At this time, the current waiting time may be estimated for each of the queues of a plurality of lanes, and a single overall waiting time corresponding to the plurality of lanes may be determined on the basis of those waiting times.
With such a configuration, in a case where a plurality of gates exist, as at airport departure gates, and a queue of a plurality of lanes forms at each gate, the waiting time of each gate can be estimated and communicated to prospective passengers via appropriate notification means, enabling prospective passengers to choose a comparatively uncrowded departure gate.
According to the image processing apparatus for queue-tail recognition of the present invention, among the one or more pixel blocks making up the non-stationary-object pixels, a series of pixel blocks that are close to one another in the predetermined queue-extension direction is stored as pixel blocks belonging to one group; consequently, only the pixel blocks truly likely to form the queue can be reliably traced from head to tail without interference from passersby and the like around the queue. Furthermore, the farthest point is searched for not merely in the pixel block last added to the group but among all the pixel blocks contained in that group, so that the group is surveyed once more and the pixel most likely to be the queue tail can be taken as the farthest point. In addition, the farthest points so obtained are not all accepted as the true farthest point as they are, but are reconfirmed under different criteria depending on whether the queue has lengthened or shortened, so that misrecognition of the queue tail caused by passersby and the like near the tail can be avoided while recognition responsiveness is maintained. Moreover, since the people in the queue are not recognized individually, the image processing as a whole is a combination of relatively simple elemental techniques; the processing is therefore simple, the processing time comparatively short, and the apparatus can easily be implemented on an ordinary commercially available personal computer.
A waiting-time estimation device according to a preferred embodiment of the present invention will now be described in detail with reference to the accompanying drawings, taking as an example a queue appearing in front of an airport departure gate. Needless to say, the present invention is of course also widely applicable to queues appearing in front of ticket offices at railway stations, public racing venues, and the like, as well as to queues appearing in front of checkout counters (cash registers) at supermarkets, convenience stores, and so on.
<Introduction>
In general, each departure gate of an airport is provided with a security checkpoint having a plurality of mutually parallel passenger lanes. Each passenger passage is equipped with various inspection devices, such as a metal detector for body inspection and an X-ray inspection device for the fluoroscopic screening of belongings.
In addition to these inspection devices being of considerably high accuracy, inspections have recently come to be carried out more carefully owing to counterterrorism measures and the like, so the minimum inspection time required per passenger tends to grow longer, and a queue of people awaiting inspection customarily appears in front of the entrance to the passenger passage of each lane of the security checkpoint. Consequently, the time required to pass from the general-visitor floor area into the passenger floor area through the security checkpoint of a departure gate (that is, the waiting time of the queue) varies greatly with the length of this inspection queue.
The image processing apparatus for queue-tail recognition according to the present invention can be used to automatically recognize, in real time, the tail of the queue for each lane of a departure gate and to estimate the waiting time, that is, the required transit time, for each lane and for each departure gate.
<About airport facilities>
As shown in Fig. 1, the airport to which the system of this embodiment is applied is provided with four departure gates, the first to fourth gates, spaced an appropriate distance from one another; inside each departure gate, as shown in Fig. 3, there is a security checkpoint having four passenger lanes, the first lane (L1) to the fourth lane (L4). A queue of passengers customarily appears in front of the entrance to the passenger passage of each lane. Accordingly, at each departure gate, four queues appear, extending substantially parallel to one another along the extension direction of the passenger passages of the lanes (L1) to (L4).
<About the overhead photography cameras and their arrangement>
Each of the four departure gates, the first to fourth gates, is provided with a video camera for overhead photography of the passenger queues. The number of cameras per gate and their shooting directions are determined so that the four queues fit within each camera's field of view while remaining separated from one another, in other words, so that the four queues do not overlap one another within the field of view. From this standpoint, there are various options for the number of cameras per gate and their shooting directions. As one example, if a single camera is used, it is preferable to photograph the four queues from overhead, from behind them and at a sufficient distance. If, however, the surrounding buildings or other circumstances make it difficult to secure a sufficient distance, two or more cameras may be used; for example, with two cameras, a method may be adopted in which the first camera photographs two adjacent queues from overhead behind them and the second camera photographs the other two adjacent queues from overhead behind them.
Figs. 1 to 3 show an example in which the four queues in front of each departure gate are photographed from overhead behind them by a single camera at a sufficient distance. That is, the four queues of the first gate are photographed from overhead by the first camera 2-1, those of the second gate by the second camera 2-2, those of the third gate by the third camera 2-3, and those of the fourth gate by the fourth camera 2-4, each from a position behind the queues and separated by a sufficient distance. In Fig. 2, reference numeral 3 denotes a person forming part of a queue. In Fig. 3, to simplify the drawing, only part of each queue of people 3 is shown and the greater part of the remaining queue is omitted. Note further that, in Fig. 3, the distance from the departure gate entrance to the mounting position of the camera 2 is greatly compressed for reasons of space.
Various types of camera can be adopted as the camera 2; in this example, a ceiling-mounted dome camera having a shooting rate of 30 frames per second and a communication function is adopted and, as shown in Fig. 2, is mounted on the ceiling at a position sufficiently far in front of the entrance of the security checkpoint.
<About the waiting-time generation PC constituting the image processing apparatus of the present invention>
As shown in Fig. 1, the first camera 2-1, which photographs the four queues of the first gate from overhead, the second camera 2-2, which photographs the four queues of the second gate, the third camera 2-3, which photographs the four queues of the third gate, and the fourth camera 2-4, which photographs the four queues of the fourth gate, are each connected via a LAN cable to a single personal computer 1 for ultimately generating the waiting times (hereinafter referred to as the "waiting-time generation PC"). As the personal computer, a commercially available machine of ordinary performance, comprising a main unit (including a CPU, hard disk, memory, wireless LAN board, and so on), a display, and an operating section (including a mouse, keyboard, and so on), can be used; its hardware configuration and its system configuration, such as the OS, are well known to those skilled in the art from various documents, so their description is omitted. Although not shown, the waiting-time generation PC 1 can also connect to the airport's management server via wireless LAN.
<About the overall computer program installed in the waiting-time generation PC>
Fig. 4 shows a general flowchart schematically presenting the entire processing (computer program) executed by the waiting-time generation PC 1. As shown in the figure, this processing, in essence, successively updates the value of a camera pointer from its initial value (step 100) in the order first camera, second camera, third camera, fourth camera (step 106), and at each update performs a series of steps: processing for acquiring an image from the camera designated by the pointer (step 101), FIFO processing for storing the most recent 120 frames of images (within the past 4 seconds) (step 102), processing for recognizing the queue tail (step 103), and processing for estimating the waiting time (step 104). This series is repeated until the processing for all cameras is complete (step 105 NO); once completion is confirmed (step 105 YES), the estimated waiting-time values are saved to a predetermined file (step 107) and the file is transferred to the airport's management server (step 108). The entire series of steps (steps 100 to 109) is repeated until operation is stopped by a predetermined action (step 109 NO), and when the stop of operation is confirmed (step 109 YES), the processing ends. A skeleton of this loop is sketched below.
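Transcribed into code, the flowchart might look like the skeleton below. The camera objects and the per-step helper functions are hypothetical stand-ins supplied by the caller; only the loop structure and the 120-frame FIFO follow Fig. 4.

```python
from collections import deque

FIFO_FRAMES = 120  # most recent 4 seconds of frames at 30 fps (step 102)

def run(cameras, recognize_tail, estimate_wait, save_file, transfer, stopped):
    """Main loop of the waiting-time generation PC, per the Fig. 4 flow."""
    buffers = [deque(maxlen=FIFO_FRAMES) for _ in cameras]
    while not stopped():                      # step 109: loop until stopped
        waits = []
        for i, cam in enumerate(cameras):     # steps 100/106: camera pointer
            frame = cam.get_image()           # step 101: acquire image
            buffers[i].append(frame)          # step 102: FIFO of 120 frames
            tail = recognize_tail(frame, buffers[i])   # step 103
            waits.append(estimate_wait(tail))          # step 104
        save_file(waits)                      # step 107: save estimates
        transfer(waits)                       # step 108: send to server
```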
<Details of the queue-tail recognition processing>
Fig. 5 shows the details of the queue-tail recognition processing (step 103). As shown in the figure, this processing essentially comprises extraction processing of non-stationary-object pixels (step 1030), grouping processing of mutually proximate pixel blocks (step 1031), search processing for the lowest point within each group (step 1032), and determination processing of the lowest point within each group (step 1033).
Here, the extraction processing of non-stationary-object pixels (step 1030) is processing that extracts, from the frame images acquired every 1/30 second, the constituent pixels of non-stationary objects corresponding to people and the like, and, as described in detail later with reference to Fig. 6, comprises contour extraction processing (step 10300), background deletion processing (step 10301), and sharpening processing (step 10302).
The grouping processing of mutually proximate pixel blocks (step 1031), as described in detail later with reference to Fig. 9, is processing that partitions the extracted non-stationary-object pixels into pixel blocks, each formed by gathering mutually adjacent pixels into one cluster, and stores, among the partitioned pixel blocks, a series of pixel blocks that are close to one another in the predetermined queue-extension direction as pixel blocks belonging to one group.
The search processing for the lowest point within each group (step 1032), as described in detail later with reference to Fig. 10, is processing that searches, among all the pixel blocks belonging to one pixel-block group, for the farthest point in the predetermined queue-extension direction (in this example, the downward direction of the screen), that is, in this example the lowest point, and stores it as a queue-tail candidate.
The determination processing of the lowest point within each group (step 1033), as described in detail later with reference to Fig. 11, is processing that, when the farthest point stored as the queue-tail candidate changes between consecutive frame images by more than a predetermined tolerance, fixes the farthest point stored as the queue-tail candidate as the farthest point to be recognized as the true queue tail, on the basis of the direction of the change and the stability after the change.
<Details of the extraction processing of non-stationary-object pixels>
Fig. 6 shows the details of the extraction processing of non-stationary-object pixels. As shown in the figure, this processing comprises contour extraction processing (step 10300) that converts the frame image into a line image formed by extracting the contours of the various objects contained in it, background deletion processing (step 10301) that deletes, using background-difference processing, the portions of the line image corresponding to the background, and sharpening processing (step 10302) that sharpens the lines constituting the line image after the background-difference processing. In this extraction processing, therefore, not the whole of each individual pixel block composed of non-stationary-object pixels but only the contour of each pixel block is selectively extracted; as a result, the pixels to be processed in the subsequent search and matching processing are limited to the contour portions, and the large reduction in the number of target pixels contributes to an improvement in processing speed.
<輪郭抽出処理について>
輪郭抽出処理(ステップ10300)は、この例にあっては、図7(a)に示されるように、フレーム画像をモノクロ画像に変換するグレースケール化処理(ステップ10300-1)と、モノクロ画像に対して、公知のsobelフィルタを使用して、空間一次微分を施す空間一次微分処理(ステップ10300-2)と、空間一次微分処理後の画像を二値化して線画像を生成する二値化処理(ステップ10300-3)とを含んで構成されている。
グレースケール化処理(ステップ10300-1)は、周知のように、RGB値を様々なアルゴリズムに基づいて均等化する処理であるが、本発明においては、公知の手法のものを任意に採用することができる。空間一次微分処理(ステップ10300-2)は、要するに、隣接する画素の輝度差に相当する画素値を生成する処理であり、これによりグレースケール化後の画像に含まれる画素塊の輪郭部分を朧気ながら浮き立たせることができる。二値化処理(ステップ10300-3)は、空間一次微分後の画像を構成する各画素値を、適当な輝度しきい値をもって二値化することにより、画素塊の輪郭部分に相当する線像を浮き立たせることができる。 <About contour extraction processing>
In this example, the contour extraction process (step 10300) includes, as shown in FIG. 7A, a grayscale process (step 10300-1) for converting a frame image into a monochrome image, and a monochrome image. On the other hand, using a known sobel filter, spatial primary differentiation processing (step 10300-2) for performing spatial primary differentiation, and binarization processing for generating a line image by binarizing the image after the spatial primary differentiation processing (Step 10300-3).
As is well known, the gray scale processing (step 10300-1) is a processing for equalizing the RGB values based on various algorithms. In the present invention, a known method is arbitrarily employed. Can do. The spatial first differentiation process (step 10300-2) is, in short, a process for generating a pixel value corresponding to the luminance difference between adjacent pixels, and thereby, the contour portion of the pixel block included in the grayscaled image is aspirated. While being able to stand up. The binarization process (step 10300-3) is a line image corresponding to the contour portion of the pixel block by binarizing each pixel value constituting the image after spatial primary differentiation with an appropriate luminance threshold value. Can be raised.
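As a rough illustration of steps 10300-1 through 10300-3, the contour extraction stage could be sketched in Python with OpenCV as follows. The use of the gradient magnitude of the two Sobel components, the 3x3 kernel, and the threshold value of 40 are assumptions made for illustration; the embodiment itself only specifies a Sobel filter and an appropriate luminance threshold.

    import cv2
    import numpy as np

    def extract_contour_line_image(frame_bgr, threshold=40):
        # Step 10300-1: grayscale conversion of the frame image.
        gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
        # Step 10300-2: spatial first derivative with a Sobel filter;
        # the gradient magnitude brings out luminance differences
        # between adjacent pixels, i.e. the contour portions.
        gx = cv2.Sobel(gray, cv2.CV_32F, 1, 0, ksize=3)
        gy = cv2.Sobel(gray, cv2.CV_32F, 0, 1, ksize=3)
        magnitude = cv2.magnitude(gx, gy)
        # Step 10300-3: binarize with a luminance threshold to obtain
        # the line image of contour pixels (255) on background (0).
        _, line_image = cv2.threshold(magnitude, threshold, 255, cv2.THRESH_BINARY)
        return line_image.astype(np.uint8)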
FIG. 13(a) schematically shows an example image after the grayscale conversion process, and FIG. 13(b) an example image after the spatial first-derivative and binarization processes. In these figures, PCG11, PCG12, and PCG13 are pixel blocks corresponding to the persons and so forth who form the queue, PCG2 is a pixel block corresponding to a person or the like who does not form part of the queue, PCG31 and PCG32 are pixel blocks corresponding to background objects, 4 is the contour line of a pixel block, and 5 is a blot corresponding to noise. As is apparent from a comparison of the images of FIGS. 13(a) and 13(b), applying the spatial first-derivative and binarization processes to the grayscaled image yields a line drawing that includes the contour lines 4. In this state, however, the thickness of the contour lines 4 is uneven, some portions are broken into dashed lines, and quite a few blots 5 corresponding to noise remain.
<Background deletion process>
Details of the background deletion process (step 10301) are shown in FIG. 8. In essence, the background deletion process compares each pixel contained in the lines constituting the line image with the value of that pixel a predetermined number of frames earlier, and deletes, as background pixels, those pixels whose values match. That is, as shown in the figure, the process moves a pixel-designating pointer one pixel at a time horizontally from an initial value (for example, the upper left of the screen), shifting one pixel at a time vertically downward (steps 10301-1, 10301-6); while examining each pixel value, it deletes a pixel, regarding it as background, only when that pixel is part of a contour and matches the pixel value 120 frames earlier (steps 10301-2, 10301-3, 10301-4), repeating until all pixels have been processed (step 10301-5 NO). In this process, the reference image for the background subtraction is not fixed to an initial image containing the original background objects such as pillars and tables, but is always updated every 120 frame times. Therefore, even suitcases and other baggage placed on the airport floor are regarded as background objects and deleted if they remain in place continuously for 120 frame times (4 seconds) or longer, so that such baggage is never mistaken for part of the queue.
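A minimal sketch of this rolling background deletion, assuming the line images arrive as single-channel 8-bit NumPy arrays of identical size; the vectorized mask stands in for the pixel-by-pixel pointer scan of steps 10301-1 to 10301-6, and the 120-frame delay matches the value given above.

    from collections import deque

    class BackgroundDeleter:
        def __init__(self, delay_frames=120):
            # Rolling buffer: the oldest entry is the line image
            # exactly delay_frames frames ago.
            self.history = deque(maxlen=delay_frames)

        def process(self, line_image):
            result = line_image.copy()
            if len(self.history) == self.history.maxlen:
                reference = self.history[0]
                # A contour pixel whose value matches its own value 120
                # frames earlier is regarded as background and deleted
                # (steps 10301-2 to 10301-4), so even long-stationary
                # baggage eventually disappears from the line image.
                background = (line_image > 0) & (line_image == reference)
                result[background] = 0
            self.history.append(line_image.copy())
            return result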
<Sharpening process>
Details of the sharpening process (step 10302) are shown in FIG. 7. As shown in the figure, this process comprises an erosion process that removes the fine blots caused by noise and the like contained in the line image after background deletion, and a dilation process that joins the broken, dashed portions of the lines contained in the line image after the background processing and thickens them.
FIG. 14(a) schematically shows an example image after the background deletion process, and FIG. 14(b) an example image after the sharpening process. As is apparent from a comparison of the images of FIGS. 14(a) and 14(b), applying the sharpening process to the image after background deletion turns the contour lines 4 present in that image into contour lines 4A whose dashed portions have been repaired and which have been thickened, and erases the noise blots 5. Thanks to these thickened contour lines 4A, the area to which the same label is attached becomes wider when the labeling process is executed later, which contributes to making the pixel blocks larger and fewer.
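A minimal sketch of the sharpening stage, assuming OpenCV's morphological erosion and dilation stand in for the contraction and expansion of step 10302; the 3x3 structuring element and the iteration counts are illustrative assumptions.

    import cv2
    import numpy as np

    def sharpen_line_image(line_image, erode_iter=1, dilate_iter=2):
        kernel = np.ones((3, 3), np.uint8)  # assumed structuring element
        # Erosion removes the fine noise blots 5 ...
        opened = cv2.erode(line_image, kernel, iterations=erode_iter)
        # ... and dilation reconnects the broken, dashed contour segments
        # and thickens them into the contour lines 4A.
        return cv2.dilate(opened, kernel, iterations=dilate_iter)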
<Grouping of neighboring pixel blocks>
Details of the process of grouping neighboring pixel blocks are shown in FIG. 9. As shown in the figure, this process comprises, in essence, a labeling process (step 10310) that partitions the extracted non-stationary-body pixels into pixel blocks by attaching the same label to mutually adjacent pixels, and a neighboring-pixel-block search process (steps 10311 to 10315) that, taking as a reference the pixel located at the farthest point in a predetermined queue-extension direction (in this example, the lowest point of the screen) among the pixels contained in one pixel block corresponding to the head of a queue, searches a predetermined search region set toward the far side of the extension direction (in this example, downward on the screen; here, a rectangular search region A measuring X pixels to each side and Y pixels downward) for pixels bearing a different label and, when such a pixel exists, stores the pixel block containing it as a pixel block belonging to the same group; this is repeated, for each pixel block newly stored in the group, until no differently labeled pixel remains, and for every head-equivalent pixel block.
As is well known to those skilled in the art, the labeling process (step 10310) is a process that focuses on one pixel forming some image and, whenever another image-forming pixel exists adjacent to it (that is, among the four pixels above, below, left, and right, or among the eight pixels that additionally include the upper right, lower right, upper left, and lower left), attaches to that pixel the same label as its own, repeating this operation. Here, the image-forming pixels are the pixels contained in the thickened contour lines 4A described above with reference to FIG. 14(b).
As a result, when the above labeling process is completed, a first label (label 1) is attached to all the pixels forming the contour line 4A of pixel block PCG11 in the image of FIG. 14(b), a second label (label 2) to all the pixels forming the contour line 4A of pixel block PCG12, a third label (label 3) to all the pixels forming the contour line 4A of pixel block PCG13, and a fourth label (label 4) to all the pixels forming the contour line 4A of pixel block PCG2.
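One way to realize this labeling step is OpenCV's connected-component labeling, sketched below; with 8-connectivity it performs the attach-the-same-label-to-adjacent-pixels operation described above.

    import cv2

    def label_pixel_blocks(line_image):
        # Pixels of the thickened contour lines 4A that touch one another
        # (8-connectivity) receive the same label; label 0 is the empty
        # background, labels 1..num_labels-1 are the pixel blocks.
        num_labels, labels = cv2.connectedComponents(line_image, connectivity=8)
        return num_labels, labels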
Describing the neighboring-pixel-block search process (steps 10311 to 10315) in more detail with reference to FIG. 15: first, the label of interest (a pointer) is set to an initial value (for example, label 1) (step 10311); then, viewed from the pixel (Pz) located at the lowest point among the pixels contained in the pixel block bearing that label (for example, the contour-line 4A portion of pixel block PCG11), it is determined whether a pixel bearing a different label (for example, label 2, label 3, and so on) exists within the predetermined search region A shown in FIG. 15(b) (for example, a rectangular search region of X pixels to each side and Y pixels downward) (step 10312).
Here, when a pixel bearing a different label (for example, label 2) exists (step 10312 YES), the pixel block containing that pixel (for example, the contour-line 4A portion of pixel block PCG12) is associated with the same group (for example, group G1, the first group) (step 10313).
The above processing is repeated for each pixel block newly incorporated into group G1 until no pixel bearing a different label is found within the predetermined search region A as viewed from the lowest point (Pz) of that pixel block (steps 10312, 10313).
When no pixel bearing a different label is found any longer (step 10312 NO), the label of interest is updated to an as-yet-unprocessed value (step 10314), and the above processing (steps 10312, 10313) is repeated until all labels have been processed (step 10315 NO).
The search region A used in the above neighboring-pixel search process (steps 10311 to 10315) extends mainly downward on the screen, on the assumption that the queue extends downward within the screen; within it, the constituent pixels of pixel blocks are searched one after another, and the pixel blocks found, PCG12 and PCG13, are incorporated into the same group G1. As a result, group G1 comes to contain only the series of pixel blocks PCG11, PCG12, and PCG13 that are highly likely to form the queue, excluding the pixel blocks corresponding to passersby and the like around the queue.
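Assuming 'labels' is the label image from the previous step, that row indices grow downward on the screen, and that the label of the head-of-queue block is known, the grouping loop of steps 10311 to 10315 could be sketched as follows. Re-searching from the group's current lowest point, rather than from each newly added block in turn, is a simplification of the flowchart; the names and margins are illustrative.

    import numpy as np

    def group_pixel_blocks(labels, head_label, x_margin, y_margin):
        group = {head_label}
        grew = True
        while grew:
            grew = False
            # Lowest point (largest row index) over all blocks in the group.
            ys, xs = np.nonzero(np.isin(labels, list(group)))
            row, col = ys.max(), xs[ys.argmax()]
            # Rectangular search region A: X pixels to each side, Y pixels down.
            window = labels[row:row + y_margin + 1,
                            max(0, col - x_margin):col + x_margin + 1]
            for label in np.unique(window):
                if label != 0 and int(label) not in group:
                    group.add(int(label))  # absorb the neighboring block
                    grew = True
        return group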
<Search for the lowest point within each group>
Details of the process of searching for the lowest point within each group are shown in FIG. 10. In essence, this process identifies the farthest point (the lowest point) within one group by repeating, for all the pixel blocks belonging to that group, a search for the pixel located at the farthest point in the predetermined queue-extension direction (downward on the screen in this example) among the pixels contained in each pixel block.
That is, as shown in the figure, this process sets the group-designating pointer to an initial value (for example, group G1, corresponding to the first lane) (step 10320), searches all the pixel blocks belonging to that group (for example, PCG11, PCG12, PCG13) for the pixel that is the lowest point (step 10321), and stores the lowest point found (step 10322); it updates the group designation one after another (step 10324) and repeats (steps 10321, 10322) until all groups have been searched (step 10323), whereupon the process ends.
By thus searching for the lowest point among all the pixel blocks belonging to the group, rather than only within the pixel block most recently incorporated into the group, the lowest point that is most likely to be the end of the queue can be found reliably.
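A minimal sketch of this per-group search, under the same assumptions as above (a larger row index means lower on the screen, i.e. farther in the queue-extension direction):

    import numpy as np

    def lowest_point_of_group(labels, group):
        # Steps 10320-10324: take the farthest (lowest) point over ALL
        # pixel blocks in the group, not just the last one added.
        ys, xs = np.nonzero(np.isin(labels, list(group)))
        i = ys.argmax()
        return int(ys[i]), int(xs[i])  # (row, column) of the queue-end candidate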
<Determination of the lowest point within each group>
Details of the process of determining the lowest point within each group are shown in FIG. 11. In essence, this determination process comprises a counter control process that increments a predetermined frame counter when the difference between the current-frame value and the previous-frame value of the farthest point within a group (the lowest point in this example) is within a predetermined allowable value, and clears the frame counter when the allowable value is exceeded; and an update control process for the true value of the farthest point (the true lowest point) that, when the difference between the current-frame value and the previous-frame value of the farthest point within a group changes beyond the predetermined allowable value, updates the true farthest point (the lowest point in this example) with the current-frame value immediately if the direction of change is toward the head of the queue (upward on the screen in this example), but, if the direction of change is away from the head (downward on the screen in this example), waits until the frame counter reaches a predetermined number of frames before updating the true farthest point with the current-frame value.
That is, in the figure, when the process is started (step 1330), the group-designating pointer is first initialized to the value corresponding to the queue of the first lane (step 10330), and it is then determined whether the difference between the lowest point of the current frame and the lowest point of the previous frame is within the predetermined allowable value (step 10331).
Here, when the difference is determined to be within the allowable value, that is, when the state is stable with almost no change (step 10331), a predetermined frame counter for judging the duration of stability is incremented by 1 (step 10332), and it is then determined whether the value of the frame counter has exceeded a specified value (for example, 90 frames, corresponding to 3 seconds), that is, whether a predetermined stable time has been reached (step 10333).
When the specified value is determined to have been exceeded (step 10333 YES), it is judged that the queue has lengthened, and the current lowest point is fixed as the true lowest point (step 10334); when the specified value is determined not yet to have been exceeded (step 10333 NO), it is judged to be still unclear whether the queue has lengthened, and the lowest-point determination process (step 10334) is skipped.
In contrast, when the difference between the lowest point of the current frame and the lowest point of the previous frame is determined not to be within the allowable value (step 10331 NO), it is judged that the queue may have lengthened or shortened, the frame counter is cleared (step 10337), and it is then determined whether the lowest point of the current frame has fallen below the lowest point of the previous frame (step 10338).
Here, when the lowest point of the current frame is determined to have fallen below the lowest point of the previous frame (step 10338), it is judged that the queue may have lengthened, and the lowest-point determination process (step 10334) is skipped for the time being; when the lowest point of the current frame is determined not to have fallen below that of the previous frame (step 10338 NO), it is immediately judged that the queue has shortened, and the lowest-point determination process (step 10334) is executed. Thereafter, the group designation is updated one after another (step 10336) and the series of processing described above is executed until the processing for all groups (all lanes) is completed (step 10335 YES), whereupon the process ends.
As a result of the above series of processing (steps 10330 to 10338) being repeated for every frame, when the queue shortens and its end moves forward, the queue end is immediately updated to the post-movement value, whereas when the queue lengthens and its end moves backward, the queue end is updated to the post-movement value only after a predetermined confirmation time. Furthermore, as shown for example in FIG. 16, when a passerby crosses the end of a normal queue (see FIG. 16(a)) as indicated by arrow 6, so that a pixel block PCG22 temporarily appears at the end of the queue and the queue appears to lengthen by ΔY (see FIG. 16(b)), the value of the frame counter is cleared (step 10337) before it reaches the specified value (step 10333 NO), as described above; therefore, even such a temporary backward shift of the queue end caused by a crossing passerby or the like never causes the true end of the queue to be misjudged.
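The hysteresis described above could be sketched per group as follows; the state dictionary and the tolerance value are hypothetical, while the 90-frame (3-second) stability requirement follows the example given earlier.

    def update_confirmed_tail(candidate_y, state, tolerance=3, stable_frames=90):
        prev_y = state.get('prev_y', candidate_y)
        if abs(candidate_y - prev_y) <= tolerance:
            # Stable: count frames (step 10332); confirm once the queue
            # end has stayed put long enough (steps 10333-10334).
            state['counter'] = state.get('counter', 0) + 1
            if state['counter'] > stable_frames:
                state['confirmed_y'] = candidate_y
        else:
            # Changed: restart the stability timer (step 10337).
            state['counter'] = 0
            if candidate_y < prev_y:
                # Moved up, i.e. the queue shortened: accept at once.
                state['confirmed_y'] = candidate_y
            # Moved down (possible lengthening, or a crossing passerby):
            # wait for confirmation instead of updating (step 10338).
        state['prev_y'] = candidate_y
        return state.get('confirmed_y', candidate_y)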
<Waiting time estimation process>
Next, details of the waiting time estimation process (step 104) are shown in FIG. 12. As shown in the figure, this process first sets the group-designating pointer to an initial value (the value corresponding to the first lane) (step 1041), reads out the lowest point of the group designated by the pointer (step 1042), obtains a waiting time from the read lowest point by referring to a correlation table (step 1043), and records the obtained waiting time as the waiting time of the corresponding lane (step 1044).
This correlation table is created and prepared in advance by a statistical method. That is, as shown in FIG. 3, a predetermined memory stores the following correlation table, which shows the relationship between the reference lines Pref1 to Pref5 for the queue ends and the waiting times obtained by statistically examining how long the waiting time is when the queue end lies at the position of each of the reference lines Pref1 to Pref5.
Queue end position    Waiting time
Pref1                 5 minutes or less
Pref2                 About 10 minutes
Pref3                 About 15 minutes
Pref4                 About 20 minutes
Pref5                 About 30 minutes or more
(Correlation table)
In the waiting time estimation process described above (step 1043), the queue end position derived from the read lowest point is matched against the reference lines Pref1 to Pref5 of the correlation table, and where necessary the gaps between the reference lines are interpolated by linear approximation or the like to generate the waiting time. The waiting time thus obtained is recorded as the waiting time of the corresponding lane (step 1044).
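A minimal sketch of the table lookup with linear interpolation between the reference lines; the pixel rows assigned here to Pref1 to Pref5 are made-up example values, since the actual positions and times are calibrated statistically per camera.

    import numpy as np

    REF_ROWS = [120, 200, 280, 360, 440]   # hypothetical rows of Pref1..Pref5
    REF_MINUTES = [5, 10, 15, 20, 30]      # waiting times from the table

    def estimate_wait_minutes(tail_row):
        # Step 1043: interpolate linearly between the reference lines;
        # positions beyond Pref1/Pref5 clamp to the table's end values.
        return float(np.interp(tail_row, REF_ROWS, REF_MINUTES))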
The above series of processing (steps 1042 to 1044) is executed repeatedly while the group designation is updated (step 1046); once the processing of all groups is complete (step 1045 YES), a process that obtains the waiting time of the entire gate from the per-lane waiting times (step 1047) and a process that records the obtained waiting time as the waiting time of the corresponding gate (step 108) are executed in sequence, and the process ends.
As the process for obtaining the waiting time of the entire gate from the per-lane waiting times (step 1047), various methods may be selected and adopted as appropriate: for example, taking the largest of the per-lane waiting times as the gate's waiting time, taking the average of the per-lane waiting times as the gate's waiting time, and so on.
<Example images showing the course of the image processing>
FIGS. 17 to 19 collectively show example images obtained when the image processing described above with reference to the flowcharts of FIGS. 5 to 11 and the schematic diagrams of FIGS. 13 to 16 is applied to the queues before the departure gates of an actual airport. Among them, FIG. 17 is an example image after the contour extraction process, FIG. 18 an example image after the background deletion and sharpening processes, and FIG. 19 an example image after the grouping process; the end position of each queue is indicated by a dash-dot line whose leader carries the corresponding waiting time. Furthermore, among the pixel blocks, those grouped as part of a queue are hatched, while the remaining pixel blocks, corresponding to attendants and the like, are shown stippled. A white rectangular mark corresponding to the search region A is drawn in the lower left corner of the screen.
<Use of the queue waiting time information>
Returning to FIG. 4, once the series of processing described above (steps 101 to 104) has been completed for the cameras of all the gates, the first through fourth gates (step 105), the waiting time estimates thus obtained are saved in a predetermined file (step 107), and this file is transferred to the airport-side management server to be put to various uses. One example of such use is to post the waiting time of each departure gate on a Web site operated by the airport so that passengers can consult it.
<Effects of the system of this embodiment>
The airport departure gate waiting time estimation system described above basically adopts a configuration in which the waiting time is estimated by a statistical method from the result of recognizing the end of each queue. Compared with systems that join the images of individual persons to obtain the total length of the queue and divide it by the queue's average advance speed, it therefore has the basic advantages of simpler processing and a lighter load on the personal computer. In addition, because the recognition of the queue end on which the waiting time estimate rests employs four processes, namely the extraction of non-stationary-body constituent pixels, the grouping of neighboring pixel blocks, the search for the lowest point within each group, and the determination of the lowest point within each group, the end of the queue can be recognized with high accuracy by the simple procedure of extracting pixel blocks, grouping those close to one another toward the bottom of the screen, and finding the pixel located at the lowest point in the queue-extension direction within each group. Moreover, since the content of the lowest-point determination process differs between the case where the queue lengthens and the case where it shortens, the responsiveness of the recognition as a whole is not greatly reduced while its reliability is maintained.
Furthermore, because only the contours of the non-stationary-body pixel blocks are extracted in the non-stationary-body pixel extraction process, the number of pixels subjected to the various arithmetic processes in the subsequent stages can be greatly reduced, which likewise contributes to faster processing and a lighter load on the personal computer.
In addition, because the technique adopted for increasing the reliability of end recognition while the queue end is lengthening compares the lowest points found within a group between frames, end-recognition reliability can be maintained even when the number of pixel blocks fluctuates between frames, without being affected by that fluctuation.
Furthermore, when the contours of the non-stationary-body pixel blocks are extracted, erosion and dilation are introduced to remove the noise blots and to make the contour lines continuous as thick lines; this improves the connectivity between pixels in the subsequent labeling process and makes the pixel blocks larger and fewer, thereby making the proximity search between pixel blocks in the later grouping process more efficient.
<Others>
In the embodiment above, the image processing apparatus for queue-end recognition according to the present invention was applied to waiting time estimation. It can, of course, also be applied to a variety of other uses: automatically increasing or decreasing the number of open lanes in accordance with the maximum value of the recognized queue end; in the case of supermarket checkouts, automatically calling checkout staff to increase or decrease the number of registers in service; or analyzing the usage trend of a gate on the basis of the time-series record of the recognized queue end in order to find the optimal way to operate the gate; and so on.
Furthermore, the flowcharts above (FIGS. 4 to 12) are intended to explain the present invention conceptually; it goes without saying that in actual programming, various noise-filtering processes are applied in consideration of the noise components and the like that are to be expected.
According to the image processing apparatus for queue-end recognition of the present invention, the end of a queue appearing in front of, for example, an entrance gate or a ticketing machine can be recognized with comparatively high accuracy, and by making use of this, the waiting time of the queue, among other things, can also be obtained.
1 Waiting time generation PC
2-1 to 2-4 Cameras
3 Persons forming the queue
4 Contour line
4A Thickened contour line
5 Blot corresponding to noise
6 Arrow indicating the direction in which a passerby crosses
A Search region
L1 to L4 First lane to fourth lane
PCG11, PCG12, PCG13 Pixel blocks forming the queue
PCG2 Pixel block other than the queue (passersby, etc.)
PCG31, PCG32 Pixel blocks forming background objects
Claims (16)
- 1もしくは2以上のレーンからなる待ち行列の出現が想定される領域を俯瞰撮影することにより得られる動画像から当該待ち行列の末尾を認識するための画像処理装置であって、
前記動画像を構成する一連のフレーム画像のそれぞれから非静止体を構成する画素を抽出する非静止体抽出手段と、
前記抽出された非静止体画素を、互いに隣接する画素を一塊に纏めてなる画素塊に区分すると共に、それらの区分された画素塊のうち、所定の行列延在方向において互いに近接する一連の画素塊を1のグループに属する画素塊として記憶する画素塊グループ化手段と、
1の画素塊グループに属する全画素塊の中から、所定の行列延在方向における最遠点を探索して行列末尾候補として記憶する最遠点探索手段と、
前記行列末尾候補として記憶される最遠点が相連続するフレーム画像間で所定の許容値を超えて変化したとき、その変化方向と変化後の安定性とに基づいて、前記行列末尾候補として記憶される最遠点を、真の行列末尾と認識されるべき最遠点として確定する最遠点確定手段とを包含する、行列末尾認識用の画像処理装置。 An image processing apparatus for recognizing the end of a queue from a moving image obtained by bird's-eye shooting of an area where a queue composed of one or more lanes is expected to appear,
Non-stationary object extraction means for extracting pixels constituting a non-stationary object from each of a series of frame images constituting the moving image;
The extracted non-stationary body pixels are divided into pixel blocks formed by collecting adjacent pixels together, and a series of pixels adjacent to each other in the predetermined matrix extending direction among the divided pixel blocks. Pixel block grouping means for storing the block as a pixel block belonging to one group;
A farthest point search means for searching for the farthest point in a predetermined matrix extending direction from all the pixel blocks belonging to one pixel block group and storing it as a matrix end candidate;
When the farthest point stored as the matrix tail candidate changes beyond a predetermined allowable value between consecutive frame images, it is stored as the matrix tail candidate based on the change direction and the stability after the change. An image processing apparatus for recognizing the end of a matrix, including farthest point determination means for determining the farthest point to be recognized as the farthest point to be recognized as the true end of the matrix. - 前記最遠点確定手段が、前記最遠点の変化方向が行列先頭から遠ざかる方向であるときには、その変化後の最遠点が所定フレーム数に亘りほぼ一定に維持されたときに限り、前記行列末尾候補として記憶される最遠点を、真の行列末尾と認識されるべき最遠点として確定する処理を含む、請求項1に記載の行列末尾認識用の画像処理装置。 When the farthest point determination means is such that the direction of change of the farthest point is away from the top of the matrix, the matrix only when the farthest point after the change is maintained substantially constant over a predetermined number of frames. The image processing device for recognizing a matrix end according to claim 1, further comprising a process of determining a farthest point stored as a tail candidate as a farthest point to be recognized as a true matrix end.
- 前記最遠点確定手段が、前記最遠点の変化方向が行列先頭へと近づく方向であるときには、直ちに、前記行列末尾候補として記憶される最遠点を、真の行列末尾と認識されるべき最遠点として確定する処理を含む、請求項2に記載の行列末尾認識用の画像処理装置。 When the farthest point determination means is such that the direction of change of the farthest point approaches the top of the matrix, the farthest point stored as the matrix end candidate should be immediately recognized as the true end of the matrix. The image processing apparatus for recognizing the end of a matrix according to claim 2, comprising a process for determining the farthest point.
- 前記俯瞰撮影は、想定される待ち行列の後方より行われ、かつ前記所定の行列延在方向とは、各フレーム画像の下方向とされる、請求項1に記載の行列末尾認識用の画像処理装置。 The image processing for recognizing the end of a matrix according to claim 1, wherein the overhead view shooting is performed from the rear of an assumed queue, and the predetermined matrix extending direction is a downward direction of each frame image. apparatus.
- 前記非静止体抽出手段が、
前記フレーム画像をそれに含まれる各種の像の輪郭を抽出してなる線画像に変換する輪郭抽出手段と、
前記線画像の中で、背景に相当する線画像部分を、背景差分処理を用いて削除する背景削除手段と、
前記背景差分処理後の線画像を構成する線を鮮明化する鮮明化手段とを含む、請求項1に記載の行列末尾認識用の画像処理装置。 The non-stationary body extraction means is
Contour extracting means for converting the frame image into a line image obtained by extracting contours of various images included therein;
In the line image, a background image deletion unit that deletes a line image portion corresponding to a background using background difference processing;
The image processing apparatus for recognizing the end of a matrix according to claim 1, further comprising: a sharpening unit that sharpens lines constituting the line image after the background difference processing. - 前記輪郭抽出手段が、
前記フレーム画像をモノクロ画像に変換するグレースケール化手段と、
前記モノクロ画像に対して空間一次微分処理を施す空間一次微分手段と、
前記空間一次微分処理後の画像を二値化して線画像を生成する二値化手段とを含む、請求項5に記載の行列末尾認識用の画像処理装置。 The contour extracting means is
Gray scale converting means for converting the frame image into a monochrome image;
Spatial primary differentiation means for performing spatial primary differentiation on the monochrome image;
The image processing apparatus for recognizing the end of a matrix according to claim 5, further comprising: binarization means for binarizing the image after the spatial first differentiation process to generate a line image. - 前記背景削除手段が、
前記線画像を構成する線に含まれる画素の値を、当該画素の所定複数フレーム数前の値と照合し、両値が一致する画素を背景画素として削除する処理を含む、請求項5に記載の行列末尾認識用の画像処理装置。 The background deletion means
6. The method according to claim 5, further comprising: comparing a value of a pixel included in a line constituting the line image with a value of a predetermined number of frames before the pixel, and deleting a pixel having the same value as a background pixel. Image processing device for matrix end recognition. - 前記鮮明化手段が、
前記背景削除後の線画像に含まれるノイズ等を原因とする微細な汚点を除去する収縮手段と、
前記背景処理後の線画像に含まれる線の途切れた波線部分を繋いで太線化する膨張手段とを含む、請求項5に記載の行列末尾認識用の画像処理装置。 The sharpening means is
Contraction means for removing fine stains caused by noise or the like included in the line image after the background deletion;
6. The image processing apparatus for recognizing a matrix end according to claim 5, further comprising expansion means for connecting the broken wavy lines included in the line image after the background processing to thicken the line. - 前記画素塊グループ化手段が、
前記抽出された非静止体を構成する画素のうちで、互いに隣接する画素に同一のラベルを付すことにより、非静止体を構成する画素を画素塊毎に区分するラベリング手段と、
行列先頭に相当する1の画素塊に含まれる画素のうちで、所定の行列延在方向の最遠点に位置する画素を基準として、前記延在方向の遠方側に向けて設定された所定の探索領域内に、別のラベルの付された画素が存在するか否かを探索し、存在するときには、その画素が含まれる画素塊を同一グループに属する画素塊として記憶する処理を、新たに同一グループとして記憶された画素塊について、別のラベルの付された画素が存在しなくなるまで、全ての先頭相当画素塊について、繰り返す近接画素塊探査手段とを含む、請求項1に記載の行列末尾認識用の画像処理装置。 The pixel block grouping means includes:
Among the pixels constituting the extracted non-stationary body, labeling means for classifying the pixels constituting the non-stationary body for each pixel block by attaching the same label to pixels adjacent to each other;
Among the pixels included in one pixel block corresponding to the top of the matrix, the pixel located at the farthest point in the predetermined matrix extending direction is used as a reference, and a predetermined value set toward the far side in the extending direction The search area is searched for whether or not another labeled pixel exists, and when it exists, the process of storing the pixel block including the pixel as a pixel block belonging to the same group is newly the same. 2. The matrix end recognition according to claim 1, further comprising: adjacent pixel block search means that repeats all of the head equivalent pixel blocks until no other labeled pixels exist for the pixel blocks stored as a group. Image processing device. - 前記最遠点探索手段が、同一グループに属する1の画素塊に含まれる画素のうちで、所定の行列延在方向の最遠点に位置する画素を探索する処理を、当該グループに属する全ての画素塊について繰り返すことにより、当該同一グルーブ内における最遠点を特定する処理を含む、請求項1に記載の行列末尾認識用の画像処理装置。 The farthest point searching means searches for a pixel located at the farthest point in a predetermined matrix extending direction among all the pixels included in one pixel block belonging to the same group. The image processing apparatus for recognizing the end of a matrix according to claim 1, comprising a process of specifying the farthest point in the same groove by repeating the pixel block.
- 前記最遠点確定手段が、
同一グループ内最遠点の現フレームの値と1つ前のフレームの値との差が所定の許容値以内であるときには所定のフレームカウンタをインクリメントする一方、前記許容値を超えるときには前記フレームカウンタをクリアするカウンタ制御手段と、
同一グループ内最遠点の現フレームの値と1つ前のフレームの値との差が所定の許容値を超えて変化するとき、その変化方向が近方向のときには、現フレームの値により真の最遠点の値を更新する一方、その変化方向が遠方向であるときには、前記フレームカウンタの値が所定フレーム数に達するのを待って、現フレームの値により真の最遠点の値を更新する最遠点真値の更新制御手段とを含む、請求項1に記載の行列末尾認識用の画像処理装置。 The farthest point determination means,
When the difference between the value of the current frame at the farthest point in the same group and the value of the previous frame is within a predetermined allowable value, the predetermined frame counter is incremented. Counter control means to clear,
When the difference between the value of the current frame at the farthest point in the same group and the value of the previous frame changes beyond a predetermined tolerance, when the direction of change is near, the value of the current frame While the farthest point value is updated, if the change direction is far direction, the value of the frame counter waits for the predetermined number of frames, and the true farthest point value is updated with the current frame value. The farthest point true value update control means for the matrix end recognition image processing apparatus according to claim 1. - 1もしくは2以上のレーンからなる待ち行列の出現が想定される領域を俯瞰撮影することにより得られた動画像から当該待ち行列の末尾を認識するための画像処理方法であって、
- An image processing method for recognizing the end of a queue from a moving image obtained by overhead (bird's-eye) photography of an area in which a queue consisting of one or more lanes is expected to appear, the method comprising: a non-stationary body extraction step of extracting, from each of a series of frame images constituting the moving image, the pixels constituting non-stationary bodies; a pixel block grouping step of partitioning the extracted non-stationary body pixels into pixel blocks, each formed by gathering mutually adjacent pixels, and storing, among the partitioned pixel blocks, a series of pixel blocks close to one another in a predetermined queue extension direction as pixel blocks belonging to one group; a farthest point search step of searching, among all pixel blocks belonging to one pixel block group, for the farthest point in the predetermined queue extension direction and storing it as a queue-end candidate; and a farthest point determination step of, when the farthest point stored as the queue-end candidate changes between successive frame images by more than a predetermined tolerance, determining, on the basis of the direction of the change and the stability after the change, the farthest point stored as the queue-end candidate as the farthest point to be recognized as the true end of the queue.
- A computer program for causing a computer to function, in order to recognize the end of a queue from a moving image obtained by overhead photography of an area in which a queue consisting of one or more lanes is expected to appear, as an image processing apparatus for queue-end recognition comprising: non-stationary body extraction means for extracting, from each of a series of frame images constituting the moving image, the pixels constituting non-stationary bodies; pixel block grouping means for partitioning the extracted non-stationary body pixels into pixel blocks, each formed by gathering mutually adjacent pixels, and storing, among the partitioned pixel blocks, a series of pixel blocks close to one another in a predetermined queue extension direction as pixel blocks belonging to one group; farthest point search means for searching, among all pixel blocks belonging to one pixel block group, for the farthest point in the predetermined queue extension direction and storing it as a queue-end candidate; and farthest point determination means for, when the farthest point stored as the queue-end candidate changes between successive frame images by more than a predetermined tolerance, determining, on the basis of the direction of the change and the stability after the change, the farthest point stored as the queue-end candidate as the farthest point to be recognized as the true end of the queue.
- A queue-end recognition system comprising: a camera for overhead photography of an area in which a queue consisting of one or more lanes is expected to appear; and an image processing apparatus for recognizing the end of the queue from the moving image obtained from the camera, wherein the image processing apparatus includes: non-stationary body extraction means for extracting, from each of a series of frame images constituting the moving image, the pixels constituting non-stationary bodies; pixel block grouping means for partitioning the extracted non-stationary body pixels into pixel blocks, each formed by gathering mutually adjacent pixels, and storing, among the partitioned pixel blocks, a series of pixel blocks close to one another in a predetermined queue extension direction as pixel blocks belonging to one group; farthest point search means for searching, among all pixel blocks belonging to one pixel block group, for the farthest point in the predetermined queue extension direction and storing it as a queue-end candidate; and farthest point determination means for, when the farthest point stored as the queue-end candidate changes between successive frame images by more than a predetermined tolerance, determining, on the basis of the direction of the change and the stability after the change, the farthest point stored as the queue-end candidate as the farthest point to be recognized as the true end of the queue.
- A queue waiting time estimation apparatus comprising: the image processing apparatus for queue-end recognition according to claim 1; and waiting time estimation means for estimating the current queue waiting time by referring the queue-end position obtained from the image processing apparatus to a statistically predetermined relationship between queue-end position and queue waiting time.
- The queue waiting time estimation apparatus according to claim 15, which estimates the current waiting time for each of a plurality of lanes of queues and, on the basis of those waiting times, determines a single overall waiting time corresponding to the plurality of lanes.
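The processing recited in the claims above can be illustrated with short code sketches (Python with NumPy/SciPy throughout; none of this code appears in the patent itself, and every parameter value is an illustrative assumption). The contraction and expansion means correspond to the standard morphological erosion and dilation operations; a minimal sketch, where the 3×3 structuring element and iteration count are choices of this example, not of the patent:

```python
import numpy as np
from scipy import ndimage

def clean_line_image(line_img: np.ndarray) -> np.ndarray:
    """Erode to remove fine speckle noise, then dilate to reconnect and
    thicken broken line segments (cf. the contraction/expansion means)."""
    se = np.ones((3, 3), dtype=bool)  # illustrative structuring element
    eroded = ndimage.binary_erosion(line_img, structure=se)
    return ndimage.binary_dilation(eroded, structure=se, iterations=2)
```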
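The pixel block grouping means can be read as connected-component labeling followed by an iterative chaining of blocks along the queue direction. A minimal sketch, assuming the queue extends toward increasing image y, using SciPy's `ndimage.label` in place of whatever labeling the embodiment employs; `head_xy`, `search_depth`, and `search_halfwidth` are hypothetical names and values:

```python
import numpy as np
from scipy import ndimage

def group_pixel_blocks(mask, head_xy, search_depth=15, search_halfwidth=20):
    """Label non-stationary pixels into pixel blocks and chain together the
    blocks lying close to one another along the queue extension direction."""
    # Connected-component labeling: mutually adjacent foreground pixels
    # receive the same label (8-connectivity via the 3x3 structure).
    labels, _ = ndimage.label(mask, structure=np.ones((3, 3)))
    head_label = labels[head_xy[1], head_xy[0]]  # block containing the queue head
    if head_label == 0:
        return labels, set()                     # head position fell on background
    group = {int(head_label)}
    frontier = [int(head_label)]
    h, w = mask.shape
    while frontier:
        lbl = frontier.pop()
        ys, xs = np.nonzero(labels == lbl)
        i = int(np.argmax(ys))                   # farthest pixel of this block (+y)
        y0, x0 = int(ys[i]), int(xs[i])
        # Search region set toward the far side of that farthest pixel.
        win = labels[y0 + 1 : min(y0 + 1 + search_depth, h),
                     max(x0 - search_halfwidth, 0) : min(x0 + search_halfwidth + 1, w)]
        for new in set(int(v) for v in np.unique(win)) - {0} - group:
            group.add(new)                       # differently labeled block joins
            frontier.append(new)                 # the group and is expanded next
    return labels, group
```

For a multi-lane queue the same routine would be run once per head-equivalent block, one per lane.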
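Given a group, the farthest point search reduces to a scan over all pixels carrying the group's labels — again a sketch assuming increasing y is the queue extension direction:

```python
def farthest_point(labels, group):
    """Queue-end candidate: the pixel, over every block of the group,
    lying farthest along the queue extension direction (+y here)."""
    ys, xs = np.nonzero(np.isin(labels, list(group)))
    i = int(np.argmax(ys))
    return int(xs[i]), int(ys[i])
```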
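The farthest point determination means behaves like a small hysteresis filter: a shrinking queue is accepted at once, while apparent growth (which may be a passer-by crossing behind the tail) must persist for a number of frames before it is believed. A sketch, where `tol` and `confirm_frames` are illustrative parameters and the queue-end position is a scalar coordinate that decreases toward the near side:

```python
class FarthestPointFilter:
    """Temporal confirmation of the queue-end candidate (a sketch; the
    tolerance and frame count are this example's, not the patent's)."""

    def __init__(self, tol=10.0, confirm_frames=30):
        self.tol = tol
        self.confirm_frames = confirm_frames
        self.prev = None          # candidate seen in the previous frame
        self.true_value = None    # confirmed ("true") queue-end position
        self.counter = 0          # consecutive frames the candidate stayed stable
        self.pending_far = False  # a far-direction jump awaits confirmation

    def update(self, current):
        if self.prev is None:
            self.prev = self.true_value = current
            return self.true_value
        diff = current - self.prev
        if abs(diff) <= self.tol:
            self.counter += 1                 # stable: count up
        else:
            self.counter = 0                  # jump: restart the stability count
            if diff < 0:
                self.true_value = current     # near direction: accept immediately
                self.pending_far = False
            else:
                self.pending_far = True       # far direction: wait for stability
        if self.pending_far and self.counter >= self.confirm_frames:
            self.true_value = current         # growth persisted: now accept it
            self.pending_far = False
        self.prev = current
        return self.true_value
```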
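Finally, the waiting time estimation presupposes a statistically obtained relationship between queue-end position and waiting time. A plausible sketch, in which the calibration arrays are invented for illustration and `np.interp` stands in for whatever lookup the embodiment uses; the patent also leaves the multi-lane aggregation rule open, so the `min` below is only one possibility (visitors picking the shortest lane):

```python
import numpy as np

# Hypothetical calibration: queue-end position (pixels along the lane)
# versus measured waiting time (minutes), gathered statistically beforehand.
POSITIONS = np.array([0, 100, 200, 300, 400])
WAIT_MIN = np.array([0.0, 2.5, 5.5, 9.0, 14.0])

def estimate_wait(tail_position: float) -> float:
    """Piecewise-linear interpolation of the calibrated relationship."""
    return float(np.interp(tail_position, POSITIONS, WAIT_MIN))

def overall_wait(per_lane_waits):
    """One possible single waiting time over several lanes."""
    return min(per_lane_waits)
```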
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2015109631A JP6501302B2 (en) | 2015-05-29 | 2015-05-29 | Image processing apparatus for matrix end recognition |
JP2015-109631 | 2015-05-29 | |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2016194770A1 (en) | 2016-12-08 |
Family
ID=57440535
Family Applications (1)
Application Number | Priority Date | Filing Date | Title
---|---|---|---
PCT/JP2016/065586 WO2016194770A1 (en) | 2015-05-29 | 2016-05-26 | Image processing device for recognition of end of queue
Country Status (2)
Country | Link |
---|---|
JP (1) | JP6501302B2 (en) |
WO (1) | WO2016194770A1 (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115345862A (en) * | 2022-08-23 | 2022-11-15 | Chengdu Zhiyuanhui Information Technology Co., Ltd. | Method and device for simulating X-ray machine scanning imaging based on column data and display |
US11983967B2 (en) | 2020-01-21 | 2024-05-14 | Nec Corporation | Control apparatus, control method, and non-transitory storage medium |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP7237467B2 (en) * | 2018-05-30 | 2023-03-13 | Canon Inc. | Information processing device, information processing method, and program |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2007317052A (en) * | 2006-05-29 | 2007-12-06 | Japan Airlines International Co Ltd | System for measuring waiting time for lines |
JP2008519567A (en) * | 2004-11-02 | 2008-06-05 | Sensormatic Electronics Corporation | System and method for queue monitoring |
JP2013109395A (en) * | 2011-11-17 | 2013-06-06 | Sharp Corp | Queue management device, queue management method, and queue management system |
Also Published As
Publication number | Publication date |
---|---|
JP6501302B2 (en) | 2019-04-17 |
JP2016224651A (en) | 2016-12-28 |
Similar Documents
Publication | Title | Publication Date
---|---|---
JP5102410B2 (en) | Moving body detection apparatus and moving body detection method | |
US8588466B2 (en) | Object area detection system, device, method, and program for detecting an object | |
Segen | A camera-based system for tracking people in real time | |
KR101870902B1 (en) | Image processing apparatus and image processing method | |
KR101716646B1 (en) | Method for detecting and recogniting object using local binary patterns and apparatus thereof | |
Kumar et al. | An efficient approach for detection and speed estimation of moving vehicles | |
CN103366154B (en) | Reconfigurable clear path detection system | |
JP6474126B2 (en) | Object tracking method, apparatus and program | |
WO2009004479A2 (en) | System and process for detecting, tracking and counting human objects of interest | |
US8724851B2 (en) | Aerial survey video processing | |
WO2016194770A1 (en) | Image processing device for recognition of end of queue | |
JP5371040B2 (en) | Moving object tracking device, moving object tracking method, and moving object tracking program | |
CN112930535A (en) | Crowd behavior anomaly detection based on video analysis | |
US20210142064A1 (en) | Image processing apparatus, method of processing image, and storage medium | |
KR102332229B1 (en) | Method for Augmenting Pedestrian Image Data Based-on Deep Learning | |
KR20200119369A (en) | Apparatus and method for detecting object | |
JP2015210819A (en) | System and method for video-based detection of drive-offs and walk-offs in vehicular and pedestrian queues | |
JPWO2018025336A1 (en) | Deterioration detection device, deterioration detection method, and program | |
Kumar et al. | Traffic surveillance and speed limit violation detection system | |
JP6028972B2 (en) | Image processing apparatus, image processing method, and image processing program | |
Nalepa et al. | Real-time people counting from depth images | |
Raikar et al. | Automatic building detection from satellite images using internal gray variance and digital surface model | |
Chen et al. | Head-shoulder detection using joint HOG features for people counting and video surveillance in library | |
JP6851246B2 (en) | Object detector | |
Hsieh et al. | Grid-based template matching for people counting |
Legal Events
Date | Code | Title | Description
---|---|---|---
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 16803205; Country of ref document: EP; Kind code of ref document: A1
| NENP | Non-entry into the national phase | Ref country code: DE
| 122 | Ep: pct application non-entry in european phase | Ref document number: 16803205; Country of ref document: EP; Kind code of ref document: A1