JP2004355082A - Optical flow detection system, detection method, and detection program - Google Patents

Info

Publication number: JP2004355082A (application number JP2003148765A)
Authority: JP (Japan)
Prior art keywords: optical flow, past, frame, image, evaluation amount
Legal status: Granted; currently Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to its accuracy)
Other languages: Japanese (ja)
Other versions: JP4269781B2
Inventor: Kazuyuki Sakurai (和之 櫻井)
Original assignee: NEC Corp (日本電気株式会社)

Abstract

PROBLEM TO BE SOLVED: To detect small optical flows with high accuracy from a moving image in which optical flows of different magnitudes are mixed, and to detect the optical flow at high speed.

SOLUTION: An evaluation amount calculation means 103 calculates an evaluation amount related to the magnitude of the optical flow, and a past frame selection means 104 specifies, based on the evaluation amount, the past image frame to be used for calculating the optical flow. For a pixel whose optical flow is expected to be small, the optical flow is calculated using a past image frame with a long time interval from the current time; for a pixel whose optical flow is expected to be large, the optical flow is calculated using a past image frame with a short time interval from the current time.

COPYRIGHT: (C)2005,JPO&NCIPI

Description

[0001]
TECHNICAL FIELD OF THE INVENTION
The present invention relates to an optical flow detection system, a detection method, and a detection program for detecting the optical flow, a vector indicating the motion of an object or pixel in a moving image.
[0002]
[Prior art]
In recent years, methods have been proposed for grasping the motion of surrounding objects from images captured by an imaging device installed on, for example, a vehicle. Such methods grasp the movement of a captured object by measuring the optical flow between frames of a moving image.
[0003]
The optical flow is obtained for each frame of the image by a feature matching method (pattern matching), a spatiotemporal gradient method, or the like.
[0004]
When the time interval between the current image frame and the past image frame is short, the optical flow may be too small to detect with high accuracy. A method has therefore been proposed that detects the optical flow with high accuracy by using a past image frame whose time interval from the current image frame is long (for example, see Patent Document 1).
[0005]
[Patent Document 1]
JP-A-2003-44861 (pages 4-7, FIG. 1)
[0006]
[Problems to be solved by the invention]
However, in the method described in Patent Document 1, in order to detect a small optical flow with high accuracy, the optical flow is detected at each coordinate using a plurality of past image frames having different time intervals from the current frame. This raises the problem that the amount of data processing required to detect the optical flow is large.
[0007]
Therefore, an object of the present invention is to provide an optical flow detection system, a detection method, and a detection program capable of detecting small optical flows with high accuracy and with a small amount of data processing.
[0008]
[Means for Solving the Problems]
The optical flow detection system according to the present invention includes an image storage unit that stores a plurality of past frames, an evaluation amount calculation unit that calculates an evaluation amount, a numerical value corresponding to the size of the optical flow, and a past frame selection unit that determines, based on the evaluation amount, which of the past frames stored in the image storage unit is to be used.
[0009]
The evaluation amount calculating means may calculate an evaluation amount corresponding to the size of the optical flow based on the coordinate position of the object on the image frame. According to such a configuration, since the past image frame used for optical flow detection can be specified based on information obtained from the image frame itself, the optical flow can be detected without complicating the configuration of the optical flow detection system.
[0010]
The evaluation amount calculated by the evaluation amount calculating means may be the vertical coordinate value of the object in the image frame. In an image frame such as a road image, an object higher in the frame is considered to be distant, so its optical flow is considered to be small, while an object lower in the frame is considered to be nearby, so its optical flow is considered to be large. Therefore, in an image frame such as a road image, the evaluation amount can be calculated from information that is easy to obtain, such as the vertical coordinate value.
[0011]
The evaluation amount calculating means may calculate an evaluation amount corresponding to the size of the optical flow based on the size of the optical flow detected in the past. According to such a configuration, it is not necessary to newly extract the evaluation amount from the image frame, so that the evaluation amount can be calculated more quickly.
[0012]
An optical flow detection system according to another aspect of the present invention includes an image storage unit that stores a plurality of past frames, a distance measurement unit that measures the distance to an object appearing in the current frame, and a past frame selection unit that determines, based on the distance measured by the distance measurement unit, which of the past frames stored in the image storage unit is to be used. According to such a configuration, since the selection is based on the distance to an object included in the image, a past image frame with a long time interval from the current time can be selected when detecting the optical flow at the coordinates of a distant object, which is considered to have a small optical flow, so the optical flow can be detected with high accuracy. Conversely, when detecting the optical flow at the coordinates of a nearby object, which is considered to have a large optical flow, a past image frame with a short time interval from the current time is selected, so the area searched for a corresponding point can be narrowed around the position of the object in the current image frame, and the optical flow can be detected at high speed.
[0013]
An area detection unit for detecting an area of an object on an image based on the detected optical flow may be provided. According to such a configuration, an object moving on the image can be monitored and tracked.
[0014]
The input moving image may be a video of the outside taken from a running vehicle, and the area detection unit may detect another vehicle or an obstacle. According to such a configuration, the region of an object moving on the image can be extracted; for example, when the region of another vehicle is extracted from a moving image captured from a running vehicle, a warning can be issued to the driver of the vehicle to prevent a collision.
[0015]
BEST MODE FOR CARRYING OUT THE INVENTION
Embodiment 1.
FIG. 1 is a block diagram showing a first embodiment of an image area detection system including an optical flow detection system according to the present invention. The image region detection system shown in FIG. 1 includes an image input device 110 for inputting a moving image and an image region detection device 100.
[0016]
The image area detection device 100 includes: an image storage means 101 that stores the input moving image; a processing target coordinate setting means 102 that sets the coordinates at which an optical flow is detected; an evaluation amount calculation means 103 that calculates the evaluation amount of the coordinates at which the optical flow is detected; a past frame selection means 104 that specifies, based on the calculated evaluation amount, the image frame to be used for calculating the optical flow; a corresponding point search means 105 that specifies the corresponding point in the past image frame for a point in the current frame; an optical flow calculation means 106 that calculates the optical flow from the time interval between the time at which the current frame was captured and the time at which the past frame was captured, the point in the current frame, and its corresponding point; an optical flow output means 107 that outputs the calculated optical flow to a display device (not shown) or the like; and an area detection means 108 that detects the area of the image of an object based on the direction and magnitude of the optical flow in the image frame.
[0017]
In the configuration shown in FIG. 1, the optical flow detection system corresponds to the portion of the image area detection device 100 excluding the optical flow output means 107 and the area detection means 108. However, a system including the optical flow output means 107 and the area detection means 108 may also be regarded as an optical flow detection system. The image area detection device 100 is realized by a data processing device such as a computer.
[0018]
That is, the processing target coordinate setting unit 102, the evaluation amount calculation unit 103, the past frame selection unit 104, the corresponding point search unit 105, the optical flow calculation unit 106, the optical flow output unit 107, and the area detection unit 108 are realized by the operation of a computer based on a program stored in a storage device. Accordingly, the processing of these means described below is realized by a program executed by a computer. The image storage unit 101 is realized by a storage device such as a hard disk built into or connected to the computer. The image input device 110 is realized by one or more moving image capturing devices such as a video camera, an infrared camera, or a night-vision camera, by a device that obtains a two-dimensional or three-dimensional distribution of a measured quantity, such as a radar or an ultrasonic transmitting/receiving device, or by a combination of these.
[0019]
The image input device 110 captures a moving image and inputs it, image frame by image frame, to the image storage unit 101 and the processing target coordinate setting unit 102. The image storage unit 101 stores each input image frame as a past frame. The processing target coordinate setting unit 102 sets the coordinates at which an optical flow is detected in the current frame (the latest frame, i.e., the image frame at the current time), and outputs the current frame to the evaluation amount calculation unit 103 and the corresponding point search unit 105. The processing target coordinate setting means 102 sets the coordinates, for example, in raster scan order over the video.
[0020]
The evaluation amount calculation means 103 calculates the evaluation amount of the processing target coordinates set as the coordinates at which the optical flow is detected. The evaluation amount is a quantity related to the size of the optical flow. For example, when an image frame of a black-and-white image obtained by photographing the rear of a traveling vehicle, as shown in FIG. 2, is input, an object imaged in the range where the y-coordinate value is small is considered to be far from the photographing position, so the size of its optical flow is considered to be small. Conversely, an object imaged in the range where the y-coordinate value is large is considered to be near the photographing position, so the size of its optical flow is considered to be large. Therefore, when a road image or the like is input, the y-coordinate value can be used as the evaluation amount.
[0021]
As another example, the size of the optical flow calculated in the past at the coordinates at which the optical flow is detected, or around those coordinates, may be used as the evaluation amount. The user may set the evaluation amount based on prior knowledge, or a new evaluation amount may be calculated by combining the evaluation amounts obtained by these methods. The evaluation amount is not limited to those obtained by the methods described above, and may be any quantity related to the size of the optical flow. The evaluation amount calculation unit 103 outputs the calculated evaluation amount to the past frame selection unit 104.
[0022]
The past frame selection unit 104 specifies, based on the evaluation amount, the image frame to be used for detecting the optical flow from among the past image frames stored by the image storage unit 101. For example, when the evaluation amount is less than a predetermined threshold, the image frame two frames before the current image frame is specified as the image frame used for detecting the optical flow, and when the evaluation amount is equal to or greater than the predetermined threshold, the image frame immediately before the current image frame is specified. Hereinafter, the image frame specified by the past frame selection unit 104 is referred to as the past image frame. The past frame selection unit 104 reads the past image frame from the moving image stored in the image storage unit 101 and outputs it to the corresponding point search unit 105.
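The threshold-based selection rule described above can be sketched as follows. The class name, the concrete threshold value, and the deque-based image storage are illustrative assumptions, not details from the patent:

```python
from collections import deque


class PastFrameSelector:
    """Sketch of the past frame selection means (104) plus image storage.

    A small evaluation amount (small expected optical flow) selects the
    frame two frames before the current one, so the inter-frame motion is
    larger and easier to measure accurately; a large evaluation amount
    selects the immediately preceding frame. THRESHOLD is an illustrative
    value (e.g. a y-coordinate in pixels), not taken from the patent.
    """

    THRESHOLD = 120  # assumed threshold, analogous to Sy in the text

    def __init__(self, max_frames=3):
        # Image storage means: keeps only the few most recent past frames.
        self.past_frames = deque(maxlen=max_frames)

    def store(self, frame):
        self.past_frames.appendleft(frame)  # index 0 = most recent past frame

    def select(self, evaluation_amount):
        """Return (past_frame, time_interval_in_frames)."""
        if evaluation_amount < self.THRESHOLD:
            return self.past_frames[1], 2  # two frames before the current one
        return self.past_frames[0], 1      # immediately preceding frame
```

At least two frames must have been stored before `select` can reach two frames back; a fuller implementation would handle start-up separately.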
[0023]
The corresponding point searching means 105 specifies a corresponding point which is the coordinate of the past image frame corresponding to the coordinate for detecting the optical flow of the current frame, and outputs information on the position of the specified corresponding point to the optical flow calculating means 106. The method of specifying the corresponding points will be described later.
[0024]
The optical flow calculation means 106 calculates the optical flow based on the coordinates at which the optical flow is detected, the coordinates of the corresponding point, and the time interval, i.e., the difference between the time when the current frame was captured and the time when the past image frame was captured. The optical flow is calculated as the quotient obtained by dividing the difference vector between the coordinates at which the optical flow is detected and the corresponding point by the time interval. The optical flow calculation means 106 then holds the calculated optical flow and outputs it to the optical flow output means 107. Holding the optical flow is realized by storing it in a temporary storage unit such as a memory mounted in the computer. The optical flow output means 107 outputs the input optical flow to the area detection means 108. The area detection means 108 detects, based on the input optical flow, an area where the same object is considered to exist.
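The quotient described above can be written down directly; the function name and the tuple representation of vectors are assumptions for illustration:

```python
def compute_optical_flow(coord, corresponding_point, time_interval):
    """Optical flow as per-frame displacement (optical flow calculation means 106).

    coord: (x, y) in the current frame where the flow is being detected.
    corresponding_point: (x, y) of the matched point in the past image frame.
    time_interval: number of frame intervals between the two frames.
    Returns the difference vector divided by the time interval.
    """
    dx = (coord[0] - corresponding_point[0]) / time_interval
    dy = (coord[1] - corresponding_point[1]) / time_interval
    return (dx, dy)
```

With a past frame two frames back, a displacement of (4, 2) pixels thus yields a flow of (2.0, 1.0) pixels per frame.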
[0025]
Next, the operation of this embodiment will be described. FIG. 3 is a flowchart for explaining the operation of this embodiment.
[0026]
The moving image input from the image input device 110 to the image area detection device 100 is input, image frame by image frame, to the image storage unit 101 and the processing target coordinate setting unit 102 (step S101). The image storage unit 101 stores the input image frame as a past frame (step S102). The processing target coordinate setting means 102 determines whether the process of detecting the optical flow has been performed for all the coordinates included in the detection area (step S103). When coordinates remain, the processing target coordinate setting means 102 sets the processing target coordinates at which the optical flow is detected in the current frame (step S104); that is, the coordinates following the previously processed coordinates are set as the processing target coordinates. The processing target coordinates are then output to the evaluation amount calculation means 103 and the corresponding point search means 105. The evaluation amount calculation means 103 calculates the evaluation amount of the processing target coordinates (step S105).
[0027]
The past frame selection unit 104 specifies, based on the evaluation amount, the image frame to be used for detecting the optical flow from among the past image frames accumulated by the image storage unit 101 (step S106). The corresponding point search means 105 then specifies, within a predetermined search range, the corresponding point, i.e., the coordinates in the past image frame corresponding to the coordinates at which the optical flow is detected in the current frame, and outputs information on the position of the specified corresponding point to the optical flow calculation means 106 (step S107). The optical flow calculation means 106 calculates the optical flow based on the coordinates at which the optical flow is detected, the coordinates of the corresponding point, and the difference between the time when the current frame was captured and the time when the past image frame was captured (step S108). The optical flow calculation means 106 holds the calculated optical flow (step S109).
[0028]
The processing target coordinate setting means 102 then determines whether the process of detecting the optical flow has been performed for all the coordinates included in the detection area (step S103). If it determines that no coordinates remain, the optical flow held by the optical flow calculation unit 106 is output to the optical flow output unit 107 (step S110). The optical flow output unit 107 outputs the input optical flow to the area detection unit 108. The area detection unit 108 detects, based on the input optical flow, an area where the image of the same object is considered to exist (step S111).
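The per-coordinate loop of steps S103 through S109 can be sketched as below. Every callable here is a hypothetical stand-in for the corresponding means in the text, not the patent's implementation:

```python
def detect_optical_flows(current_frame, region, eval_amount, select_past, search, flow):
    """Raster-scan loop over the detection region (steps S103-S109), a sketch.

    region: iterable of (x, y) processing target coordinates.
    eval_amount(coord) -> number            (evaluation amount calculation, 103)
    select_past(e) -> (past_frame, n)       (past frame selection, 104)
    search(cur, past, coord) -> (x, y)      (corresponding point search, 105)
    flow(coord, cp, n) -> (dx, dy)          (optical flow calculation, 106)
    Returns a dict mapping each coordinate to its optical flow vector.
    """
    flows = {}
    for coord in region:                         # processing target coordinate setting (102)
        e = eval_amount(coord)                   # evaluation amount for this coordinate
        past, interval = select_past(e)          # pick past frame by evaluation amount
        cp = search(current_frame, past, coord)  # corresponding point in the past frame
        flows[coord] = flow(coord, cp, interval) # difference vector / time interval
    return flows
```

The held flows (step S109) are what the loop's return value represents; outputting them (step S110) and detecting object areas (step S111) would follow.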
[0029]
The operation when the corresponding point search means 105 specifies the corresponding point in the past image frame in step S107 will now be described. In this embodiment, when searching for a corresponding point, the sum of the absolute values of the pixel-by-pixel differences between regions (hereinafter referred to as SAD, for sum of absolute differences) is used to evaluate the similarity between the current frame and the past image frame. The evaluation of the similarity is not limited to the method using SAD; for example, the sum of the squares of the pixel-by-pixel differences between regions (SSD, sum of squared differences), a correlation value between regions, a normalized correlation value between regions, or another similarity evaluation value used in such processing may be used.
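The two similarity measures named here can be written down directly; the helper names and the flat-list block representation are assumptions for illustration:

```python
def sad(block_a, block_b):
    """Sum of absolute differences between two equal-sized pixel blocks.

    Smaller values indicate higher similarity; 0 means identical blocks.
    """
    return sum(abs(a - b) for a, b in zip(block_a, block_b))


def ssd(block_a, block_b):
    """Sum of squared differences, the alternative measure the text mentions.

    Squaring penalizes large single-pixel differences more heavily than SAD.
    """
    return sum((a - b) ** 2 for a, b in zip(block_a, block_b))
```

For the blocks [10, 20] and [12, 17], SAD is 2 + 3 = 5 while SSD is 4 + 9 = 13, showing how SSD weights the larger difference more.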
[0030]
FIG. 4 is a flowchart for explaining the operation when the corresponding point search means 105 specifies a corresponding point. The corresponding point search means 105 extracts pixel values such as luminance or image density from the area around the processing target coordinates at which the optical flow is detected in the current frame (for example, a rectangular area of 5 by 5 pixels centered on the processing target coordinates) (step S201). Next, an appropriate initial value is set as the minimum SAD (step S202). The minimum SAD set here is a relatively large value: since, in general, a smaller SAD indicates higher similarity, the initial value is set larger than any value that would be judged to indicate similarity, where the appropriate value depends on the type of image (color, black-and-white, or binary) and its gradation.
[0031]
The corresponding point search means 105 determines whether corresponding points have been searched for all pixels in a predetermined search range (step S203). The search range is, for example, a range of 31 by 31 pixels in the past image frame centered on the same coordinates as the processing target coordinates in the current frame. However, this range is merely an example, and is set to an appropriate value according to conditions such as the resolution of the image. If a pixel remains in the search range that has not yet been examined as a corresponding point, the corresponding point search means 105 sets the corresponding point candidate coordinates in the past image frame to the coordinates following the previously processed coordinates (step S204). The search proceeds, for example, in raster scan order from the upper left of the search range; immediately after step S202 ends, the candidate coordinates are set to the upper-left pixel of the search range. The corresponding point search means 105 then extracts the pixel values around the corresponding point candidate coordinates (for example, a rectangular area of 5 by 5 pixels centered on the candidate coordinates) from the past image frame (step S205).
[0032]
The corresponding point search means 105 calculates the SAD between the pixel values around the processing target coordinates in the current frame and the pixel values around the corresponding point candidate coordinates (step S206). It then determines whether the calculated SAD is smaller than the value currently set as the minimum SAD (step S207). If so, the corresponding point search means 105 sets the calculated SAD as the new minimum SAD and sets the corresponding point candidate coordinates as the corresponding point coordinates (step S208). When the SAD calculated in step S206 is equal to or greater than the minimum SAD, or when the process of step S208 is completed, the process returns to step S203.
[0033]
When the corresponding point search means 105 determines that corresponding points have been searched for all pixels in the search range (step S203), it specifies the pixel at the corresponding point coordinates as the corresponding point and outputs information on the position of the specified corresponding point to the optical flow calculation means 106 (step S209).
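Taken together, steps S201 through S209 amount to an exhaustive block-matching search. A minimal sketch follows; the function name, the list-of-rows image representation, and the default window sizes are assumptions (the text's 5-by-5 window and 31-by-31 search range correspond to `block=2` and `search=15`):

```python
def find_corresponding_point(current, past, coord, block=2, search=15):
    """Exhaustive SAD block matching (steps S201-S209), a sketch.

    current, past: 2-D greyscale images as lists of rows of pixel values.
    coord: (x, y) of the processing target in the current frame.
    block: half-width of the comparison window (5x5 window -> block=2).
    search: half-width of the search range (31x31 range -> search=15).
    Returns the (x, y) in the past frame with the smallest SAD.
    """
    x0, y0 = coord
    h, w = len(past), len(past[0])

    def window(img, cx, cy):
        # Flatten the (2*block+1) x (2*block+1) area centered at (cx, cy).
        return [img[cy + dy][cx + dx]
                for dy in range(-block, block + 1)
                for dx in range(-block, block + 1)]

    target = window(current, x0, y0)          # step S201
    best_sad, best = float("inf"), coord      # step S202: large initial minimum SAD
    # Raster scan of the search range, clipped to stay inside the image.
    for cy in range(max(block, y0 - search), min(h - block, y0 + search + 1)):
        for cx in range(max(block, x0 - search), min(w - block, x0 + search + 1)):
            cand = window(past, cx, cy)       # step S205
            s = sum(abs(a - b) for a, b in zip(target, cand))  # step S206
            if s < best_sad:                  # steps S207-S208
                best_sad, best = s, (cx, cy)
    return best                               # step S209
```

A real implementation would vectorize the inner SAD computation, but the control flow mirrors the flowchart of FIG. 4.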
[0034]
Next, the operation of this embodiment will be described using, as an example, a moving image obtained by photographing the area behind a running vehicle. FIG. 2 shows one of the image frames of such a moving image. When the rear of the traveling vehicle is photographed, the optical flow of the scenery portion of the image points away from the photographing position, whereas the optical flow of a vehicle overtaking the photographing vehicle points toward the photographing position. Further, an object imaged in the upper part of the road portion of the image is considered to be distant, so the size of its optical flow is considered to be small, while an object imaged in the lower part of the road portion is considered to be near the photographing position, so the size of its optical flow is considered to be large. Detection of an overtaking vehicle and the search for corresponding points make use of these properties.
[0035]
FIG. 5 is a flowchart illustrating the operation of this embodiment with a specific example. The image input device 110 captures, for example, the view behind a vehicle traveling on a road as a 256-gradation black-and-white image (step S301). The moving image output by the image input device 110 is therefore a video of the outside taken from a running vehicle. The image input device 110 inputs the captured moving image (road image) to the image storage unit 101 and the processing target coordinate setting unit 102 for each image frame. The image storage unit 101 stores the input road image as a past frame (step S302). The processing target coordinate setting means 102 first determines the area in which the optical flow is to be detected. Here, it is assumed that the area below the line Lp shown in FIG. 2 has been specified as the area for detecting the optical flow. For example, the processing target coordinate setting unit 102 determines the line Lp by excluding the area where the y-coordinate value is equal to or less than a predetermined value, or by receiving a designation from the user.
[0036]
The processing target coordinate setting means 102 determines whether or not the processing for detecting the optical flow has been performed for all the processing target coordinates included in the area for detecting the optical flow (step S303). When it is determined that the processing target coordinates for detecting the optical flow remain, the processing target coordinate setting unit 102 sets the coordinates for detecting the optical flow in the moving image (step S304). That is, the coordinates following the processed coordinates are set as the processing target coordinates. Note that, as the processing target coordinates for detecting the optical flow, the coordinates are set in the raster scan direction from the upper left end to the lower right end in the region below Lp.
[0037]
The evaluation amount calculation unit 103 sets the y-coordinate value as the evaluation amount of the coordinates subject to the optical flow detection process (step S305). If the y-coordinate value is equal to or greater than the threshold value Sy, the optical flow is detected using the past image frame immediately before the current frame; if the y-coordinate value is less than the threshold value Sy, the optical flow is detected using the past image frame two frames before the current frame (steps S306, S307, S308). That is, in FIG. 2, the area below the line of Sy is the area where the optical flow is detected using the past image frame immediately before the current frame, and the area above it is the area where the optical flow is detected using the past image frame two frames before the current frame.
[0038]
When the optical flow is detected using the past image frame immediately before the image frame at the current time, the corresponding point search means 105 detects the corresponding point in that image frame (step S309), and outputs information on the position of the detected corresponding point to the optical flow calculation means 106. The optical flow calculation means 106 sets the difference vector between the processing target coordinates in the current frame and the coordinates of the corresponding point as the optical flow (step S311).
[0039]
When the optical flow is detected using the past image frame two frames before the image frame at the current time, the corresponding point search means 105 detects the corresponding point in the past image frame two frames before the current frame (step S310), and outputs information on the position of the detected corresponding point to the optical flow calculation means 106. The optical flow calculation means 106 sets, as the optical flow, the vector obtained by multiplying the difference vector between the processing target coordinates in the current frame and the coordinates of the corresponding point by 1/2 (step S312). In general, when a past frame n frames before is used, the difference vector is multiplied by 1/n to obtain the optical flow.
[0040]
The optical flow calculation means 106 holds the calculated optical flow (step S313). The processing target coordinate setting unit 102 then determines whether the process of detecting the optical flow has been performed for all the coordinates included in the detection area (step S303). If the processing target coordinate setting unit 102 determines that no processing target coordinates remain, the optical flow calculation unit 106 outputs the held optical flow to the optical flow output unit 107 (step S314). The optical flow output unit 107 outputs the input optical flow to the area detection unit 108.
[0041]
The area detection unit 108 can extract, as the area of an overtaking vehicle, a pixel area whose optical flow points toward the vehicle on which the image area detection device 100 is mounted, and output the extracted area to a display unit or the like (step S315). At this time, the area of the overtaking vehicle may be monitored or tracked.
[0042]
According to this embodiment, since the past image frame used to search for a corresponding point is selected for each coordinate in the image frame, the optical flow can be detected with high accuracy for a pixel with a small optical flow by selecting a past image frame with a long time interval from the current time, and the optical flow can be detected at high speed for a pixel with a large optical flow by selecting a past image frame with a short time interval, which narrows the area searched for corresponding points.
[0043]
In this embodiment, the processing is divided into two cases: using the past image frame one frame before the current frame and using the past image frame two frames before. However, image frames three or more frames before may also be used, dividing the processing into more cases.
[0044]
Further, in this embodiment, the area detection unit 108 detects, based on the optical flow, an area where another vehicle exists; it is also possible to detect, for example, an area where an obstacle or the like approaching the vehicle equipped with the image area detection device 100 exists.
[0045]
Furthermore, in this embodiment, since a road image is taken as the example input, the y-coordinate value is used as the evaluation amount. However, the evaluation amount is not limited to the y-coordinate value; any other quantity related to the size of the optical flow may be used. For example, the size of the optical flow calculated in the past at or around the coordinates at which the optical flow is detected may be used as the evaluation amount, the user of the optical flow detection system may set the evaluation amount based on prior knowledge, or a new evaluation amount may be calculated by combining the evaluation amounts obtained by these methods.
[0046]
Embodiment 2.
FIG. 6 is a block diagram showing a second embodiment of the image area detection system including the optical flow detection system according to the present invention. The image area detection system illustrated in FIG. 6 includes an image input device 110 that inputs a moving image, an image area detection device 200, and a distance measurement device 300. Unlike the image region detection device 100 according to the first embodiment, the image region detection device 200 does not include the evaluation amount calculation unit 103. Further, the operation of the past frame selecting unit 201 is different from the operation of the past frame selecting unit 104 in the first embodiment.
[0047]
The distance measurement device 300 measures the distance (actual distance) between an object included in the moving image captured by the image input device 110 and the shooting position. The past frame selecting unit 201 specifies the past image frame used for calculating the optical flow based on the distance between the shooting position and the object measured by the distance measuring device 300. The distance measuring device 300 is realized by, for example, a radar, a stereo camera, or an ultrasonic transmitting/receiving device.
[0048]
Next, the operation of this embodiment will be described. FIG. 7 is a flowchart for explaining the operation of this embodiment.
[0049]
The moving image input from the image input device 110 to the image region detection device 200 for each image frame is input to the image storage unit 101 and the processing target coordinate setting unit 102 in the image region detection device 200 (step S401). The image storage unit 101 stores the input image frame as a past frame (step S402). The processing target coordinate setting means 102 determines whether or not the processing for detecting the optical flow has been performed for all the coordinates included in the area for detecting the optical flow (step S403). If it is determined that the coordinates for detecting the optical flow remain, the processing target coordinate setting means 102 sets the processing target coordinates for detecting the optical flow in the current frame (step S404). That is, the coordinates following the processed coordinates are set as the processing target coordinates. Then, the current frame is output to the past frame selecting means 201 and the corresponding point searching means 105. The distance measuring device 300 measures the distance (actual distance) between the object existing in the current frame and the installation position of the image region detecting device 200 (step S405). It is assumed that the distance measurement device 300 is installed near the image region detection device 200.
[0050]
The distance measuring device 300 receives from the image region detecting device 200, for example, information indicating the direction within its measurement field of view that corresponds to the processing target coordinates. It then determines the measurement direction based on the supplied information and measures the distance to the object existing in the current frame. The distance measuring device 300 performs the measurement in accordance with, for example, an instruction from software implementing the past frame selecting unit 201.
[0051]
The past frame selection unit 201 specifies, from among the past image frames stored in the image storage unit 101, the image frame used for detecting the optical flow, based on the distance to the object (step S406). The corresponding point searching means 105 then specifies the corresponding point, that is, the coordinates in the past image frame corresponding to the coordinates at which the optical flow of the current frame is detected, by searching within a predetermined search area, and outputs the specified corresponding point to the optical flow calculation means 106 (step S407). The optical flow calculation means 106 calculates the optical flow based on the processing target coordinates, the coordinates of the corresponding point, and the times at which the current frame and the past image frame were shot (step S408). The optical flow calculation means 106 holds the calculated optical flow (step S409).
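Steps S407 and S408 combine a corresponding-point search with a flow calculation that accounts for how far back the selected past frame lies. The following is a minimal sketch of the calculation step only, assuming coordinates as (x, y) tuples and a fixed nominal frame interval; the function name and the normalization of the displacement to a per-frame-interval flow are illustrative assumptions.

```python
def compute_flow(p_current, p_past, t_current, t_past, frame_interval):
    """Compute the optical flow at p_current given its corresponding
    point p_past found in a past image frame.

    The displacement is divided by the number of frame intervals
    between the two shooting times, so flows computed against older
    past frames are expressed per single frame interval and remain
    comparable across pixels that used different past frames.
    """
    n_intervals = round((t_current - t_past) / frame_interval)
    dx = (p_current[0] - p_past[0]) / n_intervals
    dy = (p_current[1] - p_past[1]) / n_intervals
    return (dx, dy)
```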
[0052]
Then, the processing target coordinate setting means 102 determines whether or not all the coordinates for detecting the optical flow have been processed (step S403). If the processing target coordinate setting unit 102 determines that no coordinates for detecting the optical flow remain, the optical flow calculation unit 106 outputs the held optical flows to the optical flow output unit 107 (step S410). The optical flow output unit 107 outputs the input optical flows to the area detection unit 108. The region detection unit 108 extracts a region that is considered to include an image of the same object based on the input optical flows (step S411).
[0053]
Next, the operation of this embodiment will be described using, as an example, a moving image obtained by photographing the area behind a running vehicle. FIG. 2 shows an image frame of such a moving image.
[0054]
FIG. 8 is a flowchart illustrating the operation of this embodiment using a specific example. The image input device 110 captures, for example, the rear of a vehicle traveling on a road as a monochrome image of 256 gradations (step S501). The image input device 110 inputs the captured moving image (road image) to the image storage unit 101 and the processing target coordinate setting unit 102 for each image frame. The image storage unit 101 stores the input road image as a past frame (step S502). The processing target coordinate setting means 102 first determines the area in which the optical flow is to be detected. Here, it is assumed that the area below the line Lp shown in FIG. 4 has been identified as the area for detecting the optical flow. For example, the processing target coordinate setting unit 102 determines the line Lp by excluding an area where the y-coordinate value is equal to or less than a predetermined value, or by receiving a designation from the user.
[0055]
The processing target coordinate setting unit 102 determines whether or not the processing for detecting the optical flow has been performed for all the coordinates included in the area for detecting the optical flow (step S503). If it is determined that the processing target coordinates for detecting the optical flow remain, the processing target coordinate setting unit 102 sets the coordinates for detecting the optical flow in the moving image (step S504). That is, the coordinates following the processed coordinates are set as the processing target coordinates. Note that, as the processing target coordinates for detecting the optical flow, the coordinates are set in the raster scan direction from the upper left end to the lower right end in the region below Lp.
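The raster-scan enumeration of processing target coordinates in the region below Lp can be sketched as a generator; the function name and the representation of the line Lp as a single y value are assumptions for illustration.

```python
def coordinates_below_line(height, width, lp_y):
    """Enumerate processing target coordinates in raster-scan order
    (upper left to lower right), restricted to the region below the
    line Lp (rows with y > lp_y), where the road surface appears."""
    for y in range(lp_y + 1, height):
        for x in range(width):
            yield (x, y)
```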
[0056]
The distance measuring device 300 measures the actual distance corresponding to the processing target coordinates (step S505). Using 30 m as a threshold, for example, if the distance between the shooting position and the object is less than 30 m, the optical flow is detected using the past image frame one frame before the current image frame; if the distance is 30 m or more, the optical flow is detected using the past image frame two frames before the current image frame (steps S506, S507, and S508). The distance threshold may be input by the user via input means such as a keyboard and a mouse.
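The branch in steps S506 to S508 amounts to a simple threshold test on the measured distance. The following is a hedged sketch, assuming a two-frame buffer ordered from newest to oldest; the function name is an assumption, and the 30 m default merely follows the example threshold in the text.

```python
def select_frame_by_distance(distance_m, frames, threshold_m=30.0):
    """Choose the past frame for the corresponding-point search from
    the measured distance to the object.

    A nearby object (< threshold) is expected to have a large optical
    flow, so the frame one before the current frame is used (short
    interval, small search area). A distant object (>= threshold) is
    expected to have a small flow, so the frame two before is used
    (long interval, better accuracy).

    frames[0] = one frame before, frames[1] = two frames before.
    """
    if distance_m < threshold_m:
        return frames[0]
    return frames[1]
```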
[0057]
When the optical flow is detected using the past image frame one frame before the current image frame, the corresponding point searching unit 105 searches for the corresponding point in that past image frame (step S509). Information on the position of the detected corresponding point is then output to the optical flow calculation means 106. The optical flow calculation means 106 takes, as the optical flow, the vector of the difference between the processing target coordinates in the current image frame and the coordinates of the corresponding point (step S511).
[0058]
When the optical flow is detected using the past image frame two frames before the current image frame, the corresponding point searching unit 105 searches for the corresponding point in that past image frame (step S510). Information on the position of the detected corresponding point is then output to the optical flow calculation means 106. The optical flow calculation means 106 takes, as the optical flow, the vector of the difference between the processing target coordinates in the current image frame and the coordinates of the corresponding point, multiplied by 1/2 (step S512).
[0059]
The optical flow calculation means 106 holds the calculated optical flow (step S513). Then, the processing target coordinate setting unit 102 determines whether or not the processing for detecting the optical flow has been performed on all the coordinates included in the area for detecting the optical flow (step S503). If the processing target coordinate setting unit 102 determines that no coordinates for detecting the optical flow remain, the optical flow calculation unit 106 outputs the held optical flows to the area detection means 108 via the optical flow output unit 107 (step S514).
[0060]
The region detection unit 108 outputs, as the region of an overtaking vehicle, a region of pixels whose optical flow points toward the photographing vehicle, to the display unit or the like (step S515). At this time, the region of the overtaking vehicle may be monitored or tracked.
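The extraction in step S515 can be sketched as a test of whether each pixel's flow has a positive component along the direction toward the photographing vehicle. The function name, the dictionary representation of the flow field, and the default approach direction are assumptions for illustration, not part of the patent.

```python
def overtaking_vehicle_mask(flows, approach_direction=(0.0, 1.0)):
    """Mark pixels whose optical flow points toward the
    camera-equipped vehicle, i.e. candidate pixels of an
    overtaking vehicle.

    flows: dict mapping (x, y) -> (dx, dy) flow vectors.
    approach_direction: assumed unit vector pointing toward the
    photographing vehicle in image coordinates.
    """
    mask = set()
    for coord, (dx, dy) in flows.items():
        # Positive projection on the approach direction = approaching.
        if dx * approach_direction[0] + dy * approach_direction[1] > 0:
            mask.add(coord)
    return mask
```

In practice the approach direction would be derived from the camera geometry rather than fixed, and the resulting mask would be grouped into connected regions before being output or tracked.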
[0061]
According to this embodiment, the past image frame used for optical flow detection is selected according to the distance to the object included in the image. For the coordinates of a distant object, whose optical flow is considered to be small, selecting a past image frame with a large time interval allows the optical flow to be detected with high accuracy. For the coordinates of a nearby object, whose optical flow is considered to be large, selecting a past image frame with a small time interval narrows the area searched for corresponding points, so the optical flow can be detected at high speed.
[0062]
According to this embodiment, since the past image frame used for detecting the optical flow is selected according to the distance to the object included in the image, there is no need to predict the size of the optical flow in advance. The present invention can therefore also be applied when it is difficult to predict the size of the optical flow.
[0063]
[Effect of the invention]
As described above, according to the present invention, the past image frame used for detecting the optical flow is determined based on an evaluation amount, which is a numerical value corresponding to the size of the optical flow. As a result, the optical flow can be detected with high accuracy and at high speed.
[Brief description of the drawings]
FIG. 1 is a block diagram showing a first embodiment of the present invention.
FIG. 2 is a diagram illustrating an example of an image frame of a moving image input to an image input device.
FIG. 3 is a flowchart illustrating an operation according to the first exemplary embodiment of the present invention.
FIG. 4 is a flowchart illustrating the operation of a corresponding point search unit according to the present invention.
FIG. 5 is a flowchart illustrating the operation of the first embodiment using a specific example.
FIG. 6 is a block diagram showing a second embodiment of the present invention.
FIG. 7 is a flowchart illustrating an operation of the second exemplary embodiment of the present invention.
FIG. 8 is a flowchart illustrating the operation of the second embodiment using a specific example.
[Explanation of symbols]
100 computer
101 Image storage means
102 Processing target coordinate setting means
103 Evaluation amount calculation means
104 Past frame selection means
105 Corresponding point search means
106 Optical flow calculation means
107 Optical flow output means
108 area detecting means
110 Image input device
300 Distance measuring device

Claims (11)

  1. An optical flow detection system that detects an optical flow using a current frame and a past frame in an input moving image,
    Image storage means for storing a plurality of past frames;
    Evaluation amount calculating means for calculating an evaluation amount that is a numerical value corresponding to the size of the optical flow;
    Past frame selecting means for determining, based on the evaluation amount calculated by the evaluation amount calculating means, which past frame among the past frames stored in the image storage means is to be used.
  2. The optical flow detection system according to claim 1, wherein the evaluation amount calculation means calculates an evaluation amount corresponding to a size of the optical flow based on a coordinate position of the object on the image frame.
  3. The optical flow detection system according to claim 2, wherein the evaluation amount calculated by the evaluation amount calculation means is a vertical coordinate value of the object in an image frame.
  4. 2. The optical flow detection system according to claim 1, wherein the evaluation amount calculating means calculates an evaluation amount corresponding to the size of the optical flow based on the size of the optical flow detected in the past.
  5. An optical flow detection system that detects an optical flow using a current frame and a past frame in an input moving image,
    Image storage means for storing a plurality of past frames;
    Distance measuring means for measuring a distance to an object appearing in the current frame; and
    Past frame selecting means for determining, based on the distance measured by the distance measuring means, which past frame among the past frames stored in the image storage means is to be used.
  6. The optical flow detection system according to any one of claims 1 to 5, further comprising an area detection unit configured to detect an area of the object on the image based on the detected optical flow.
  7. The input moving image is a video of the outside taken from a running vehicle,
    7. The optical flow detection system according to claim 6, wherein the area detection unit detects another vehicle or an obstacle.
  8. An optical flow detection method for detecting an optical flow using a current frame and a past frame in an input moving image,
    Memorize multiple past frames,
    Calculate an evaluation amount that is a numerical value corresponding to the size of the optical flow,
    An optical flow detection method, comprising: determining which past frame among the stored past frames to use based on the calculated evaluation amount.
  9. An optical flow detection method for detecting an optical flow using a current frame and a past frame in an input moving image,
    Memorize multiple past frames,
    Measure the distance to the object that appears in the current frame,
    An optical flow detection method characterized by determining which past frame among the stored past frames to use based on the measured distance.
  10. An optical flow detection program for executing a process of detecting an optical flow using a current frame and a past frame in an input moving image,
    On the computer,
    A process of storing a plurality of past frames;
    A process of calculating an evaluation amount that is a numerical value corresponding to the size of the optical flow; and a process of determining, based on the calculated evaluation amount, which past frame among the stored past frames to use.
  11. An optical flow detection program for executing a process of detecting an optical flow using a current frame and a past frame in an input moving image,
    On the computer,
    A process of storing a plurality of past frames;
    Processing to measure the distance between the object appearing in the current frame,
    An optical flow detection program for executing, based on the measured distance, a process of determining which of the stored past frames is to be used.
JP2003148765A 2003-05-27 2003-05-27 Optical flow detection system, detection method and detection program Active JP4269781B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2003148765A JP4269781B2 (en) 2003-05-27 2003-05-27 Optical flow detection system, detection method and detection program

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2003148765A JP4269781B2 (en) 2003-05-27 2003-05-27 Optical flow detection system, detection method and detection program

Publications (2)

Publication Number Publication Date
JP2004355082A true JP2004355082A (en) 2004-12-16
JP4269781B2 JP4269781B2 (en) 2009-05-27

Family

ID=34045049

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2003148765A Active JP4269781B2 (en) 2003-05-27 2003-05-27 Optical flow detection system, detection method and detection program

Country Status (1)

Country Link
JP (1) JP4269781B2 (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007286724A (en) * 2006-04-13 2007-11-01 Clarion Co Ltd Onboard information processing method, onboard information processing program, onboard information processor and onboard information processing system
JP2010204805A (en) * 2009-03-02 2010-09-16 Konica Minolta Holdings Inc Periphery-monitoring device and method
JP2010286926A (en) * 2009-06-09 2010-12-24 Konica Minolta Holdings Inc Surroundings monitoring device
US8509480B2 (en) 2007-03-22 2013-08-13 Nec Corporation Mobile detector, mobile detecting program, and mobile detecting method


Also Published As

Publication number Publication date
JP4269781B2 (en) 2009-05-27


Legal Events

Date Code Title Description
RD04 Notification of resignation of power of attorney

Free format text: JAPANESE INTERMEDIATE CODE: A7424

Effective date: 20051117

RD03 Notification of appointment of power of attorney

Free format text: JAPANESE INTERMEDIATE CODE: A7423

Effective date: 20051117

A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20051213

A977 Report on retrieval

Free format text: JAPANESE INTERMEDIATE CODE: A971007

Effective date: 20081028

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20081104

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20081225

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20090203


A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20090216

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20120306

Year of fee payment: 3

R150 Certificate of patent or registration of utility model

Ref document number: 4269781

Country of ref document: JP

Free format text: JAPANESE INTERMEDIATE CODE: R150


FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20130306

Year of fee payment: 4


FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20140306

Year of fee payment: 5