WO2011065149A1 - Object distance measuring device and vehicle equipped with the device - Google Patents
- Publication number
- WO2011065149A1 (PCT/JP2010/068253; JP2010068253W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- human body
- height
- image
- size
- candidate
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods
- G06T7/75—Determining position or orientation of objects or cameras using feature-based methods involving models
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/60—Analysis of geometric attributes
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- G08G1/166—Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30196—Human being; Person
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30248—Vehicle exterior or interior
- G06T2207/30252—Vehicle exterior; Vicinity of vehicle
- G06T2207/30261—Obstacle
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
Definitions
- the present invention relates to an object distance measuring device using a human body detection device that detects a human body based on an image acquired by an imaging device, and a vehicle on which the device is mounted.
- In a vehicle periphery monitoring device using a human body detection device according to the prior art, information on objects, such as pedestrians, that may come into contact with the host vehicle is obtained from images of the surroundings of the host vehicle captured by two infrared cameras, and the information is provided to the driver of the vehicle.
- In this vehicle periphery monitoring apparatus, a high-temperature portion of the images of the host vehicle's surroundings captured by a pair of left and right infrared cameras (stereo cameras) is treated as the object, and the parallax of the object between the left and right images is obtained.
- From the parallax, the distance to the object is calculated; objects that may affect the travel of the host vehicle are then detected from the moving direction and position of the object, and an alarm is output (see Japanese Patent No. 3970876).
- Japanese Patent No. 3970876 also describes a technique in which a candidate for the head of an object is detected and it is determined whether the detected head candidate is larger than the lateral width (head width) of a human head, thereby discriminating whether the object is a human body or a structure.
- A technique for addressing this inconvenience is described in Japanese Patent Laid-Open No. 2007-213561.
- an object around the vehicle is photographed at least twice (two frames) at a predetermined time interval using a single infrared camera mounted on the vehicle.
- The higher the relative speed between the object and the vehicle periphery monitoring device, the larger the change in the size of the object's image in the current captured image relative to its size in the previous image.
- An object existing ahead of the vehicle has a shorter time to reach the vehicle as the relative speed between the object and the vehicle increases.
- the vicinity of the vehicle can be monitored by estimating the arrival time to the vehicle from the rate of change in the size of the image portion of the same object during a predetermined time interval.
- Described later are Reference 1 {Japanese physique survey report, measured 1978-1981, published 1984; Ministry of Economy, Trade and Industry (formerly the Ministry of International Trade and Industry), Institute of Industrial Technology, Japan Standards Association} and Reference 2 {Japanese human body measurement data, measured 1992-1994, Human Life Engineering Research Center (HQL)}.
- The vehicle periphery monitoring device can display a pedestrian ahead, who is difficult for the driver to see with the naked eye when driving at night, as a target object.
- The height Hc in real space of the human body candidate 2, which is the object, is assumed to be, for example, about 170 [cm], the average height of a Japanese adult.
- The distance in real space from the position of the camera 3, that is, the position of the vehicle, to the human body candidate 2 can then be calculated by the following equation (1).
- As shown in FIG. 15, however, the true human body 5 in real space corresponding to the human body candidate 2 may be a person taller or shorter than the assumed height of the human body candidate 2.
- In that case the assumed height Hc has a height error ΔH with respect to the true height Htrue of the actual human body 5, and a distance error ΔZ (ranging error) given by the following equation (2) occurs.
- ΔZ = F × ΔH / h
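These two relations can be sketched numerically; a minimal illustration in Python, assuming the focal length F is expressed in pixels (the function names are ours; the symbols follow equations (1) and (2)):

```python
def distance_from_assumed_height(F, h_px, Hc_cm=170.0):
    """Eq. (1): Zc = Hc * F / h -- distance assuming a fixed real-space height Hc."""
    return Hc_cm * F / h_px

def ranging_error(F, h_px, dH_cm):
    """Eq. (2): dZ = F * dH / h -- ranging error when the assumed height is off by dH."""
    return F * dH_cm / h_px
```

With F = 1000 px and an image height of h = 100 px, the assumed 170 cm person is placed at 17 m; an assumed-height error of 17 cm already shifts the estimate by 1.7 m, which is the inconvenience the invention addresses.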
- The present invention has been made in view of such problems, and an object of the present invention is to provide an object distance measuring device capable of calculating, with higher accuracy, the distance from the imaging device to the human body candidate in real space (the true human body 5 in real space corresponding to the human body candidate 2 in the image) based on the size of the human body candidate in the image, and a vehicle equipped with the device.
- The object distance measuring device according to the present invention is an object distance measuring device using a human body detection device that performs human body detection based on an image acquired by an imaging device, and includes a human body candidate region extraction unit that extracts a human body candidate region from the image;
- a reference feature extraction unit that extracts, from the extracted human body candidate region, one predetermined human body feature as a reference feature from among the human body features of total height, full width, torso, arms, legs, and head;
- a comparison feature extraction unit that extracts, from the extracted human body candidate region, one human body feature other than the reference feature as a comparison feature from among the same human body features; a real space size estimation unit that estimates the size in real space of the human body candidate in the image based on the ratio between the size of the reference feature extracted by the reference feature extraction unit and the size of the comparison feature extracted by the comparison feature extraction unit; and a distance calculation unit that calculates the distance from the imaging device to the human body candidate in real space based on the estimated size in real space and the size of the human body candidate in the image.
- the human body candidate area is extracted from the image acquired by the imaging device by the human body candidate area extraction unit, and the human body features of the overall height, the entire width, the torso, the arms, the legs, and the head are extracted from the extracted human body candidate areas.
- one predetermined human body feature is extracted by the reference feature extraction unit, and another human body feature excluding the extracted reference feature is extracted as a comparison feature by the comparison feature extraction unit.
- the size of the human body candidate in the image is estimated by a real space size estimation unit.
- Then, the distance calculation unit calculates the distance from the imaging device to the human body candidate in real space.
- the distance to the human body candidate in the real space can be calculated in a short time with a simple configuration with high accuracy.
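The estimation chain can be sketched as follows; the 16 cm average head width is an illustrative assumption of ours, whereas the patent looks the value up from the physique-survey map:

```python
AVG_HEAD_WIDTH_CM = 16.0  # assumed adult average; the patent uses map data instead

def estimate_real_height(h_px, head_w_px, head_width_cm=AVG_HEAD_WIDTH_CM):
    """Scale the in-image ratio R = height / head width by the nearly
    age-invariant real head width to recover the real-space height HT."""
    return (h_px / head_w_px) * head_width_cm

def distance_to_candidate(F, h_px, HT_cm):
    """Eq. (3): Zr = HT * F / h."""
    return HT_cm * F / h_px
```

For a candidate 200 px tall with a 20 px head, the estimated height is 160 cm rather than a blanket 170 cm, so the subsequent distance calculation tracks the actual person.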
- The size of the head of a Japanese person, as can be seen from the physique data of Reference 1 {Japanese physique survey report, measured 1978-1981, published 1984; Ministry of Economy, Trade and Industry (formerly the Ministry of International Trade and Industry), Institute of Industrial Technology, Japan Standards Association}, whose drawings are reprinted in FIGS. 1A, 1B, 1C and 2, and from the infant physique data of Reference 2 {Japanese anthropometric data, measured 1992-1994, Human Life Engineering Research Center (HQL)} shown in FIGS. 3A, 3B, 3C and 4, changes little with age.
- Therefore, the real space size estimation unit can accurately estimate the size in real space of the human body candidate in the image. The distance calculation unit can then accurately calculate the distance from the imaging device to the human body candidate in real space based on the estimated size in real space and the size of the human body candidate in the image.
- The shoulder size of a Japanese person, that is, the shoulder width (also referred to as the full width), as can be seen from the physique data of Reference 1 reprinted in FIGS. 1A, 1B, 1C and 2 and the infant physique data of Reference 2 shown in FIGS. 3A, 3B, 3C and 4, increases almost monotonically with age.
- The reference feature extraction unit in this case functions as a full width (shoulder width) extraction unit.
- Using the shoulder width extracted from the image as the reference feature, and at least one of the human body features that change with age (total height, torso, arms, legs) extracted from the human body candidate region by the comparison feature extraction unit as the comparison feature,
- the real space size estimation unit can accurately estimate the size in real space of the human body candidate in the image for ages from about 5 to 99.
- the distance calculation unit can accurately calculate the distance from the imaging device to the human body candidate in the real space based on the estimated size in the real space and the size of the human body candidate in the image.
- The distance can thus be calculated from a single shot (one frame).
- A vehicle on which the object distance measuring device using the human body detection device is mounted, the vehicle having a vehicle periphery monitoring device including a notification unit that notifies the driver when the human body detection device detects a human body, is also included in the present invention.
- the distance to the human body candidate in the real space can be calculated in a short time with a simple configuration with high accuracy.
- FIG. 1A is an explanatory diagram of human body characteristics of a person over 7 years old
- FIG. 1B is an explanatory diagram of the head length and total head height of a person over 7 years old
- FIG. 1C is a graph of human body measurement values for persons aged 7 to 99 years
- FIG. 3A is an explanatory diagram of the human body characteristics of an infant under 6 years old
- FIG. 3B is an explanatory diagram of the head width and shoulder width of an infant under 6 years old
- FIG. 3C is a graph of infant body measurement values showing the height, shoulder width, and head width of infants aged 0 to 6 years; FIG. 4 is a table of the infant physique data
- FIG. 5 is a block diagram showing a configuration of the vehicle periphery monitoring device 10 in which the object distance measuring device 50 using the human body detection device according to the embodiment of the present invention is incorporated.
- FIG. 6 is a schematic diagram of the vehicle 12 equipped with the vehicle periphery monitoring device 10 in which the object distance measuring device 50 using the human body detection device shown in FIG. 5 is incorporated.
- The vehicle periphery monitoring device 10 includes an image processing unit 14 (processing device) that controls the vehicle periphery monitoring device 10 and the object distance measuring device 50, and, connected to the image processing unit 14:
- an infrared camera 16 (imaging device)
- a vehicle speed sensor 18 that detects the vehicle speed Vs of the vehicle 12
- a brake sensor 20 that detects the driver's brake pedal operation amount (brake operation amount) Br
- a yaw rate sensor 22 that detects the yaw rate Yr of the vehicle 12
- a speaker 24 (notification unit) that issues alarms and the like by voice
- an image display device 26 (notification unit), such as a HUD 26a (Head-Up Display), that displays the image captured by the infrared camera 16 so that objects with a high risk of contact, such as pedestrians (moving objects), can be seen.
- The image display device 26 is not limited to the HUD 26a; the display of a navigation system can also be used.
- The image processing unit 14 detects a moving object such as a pedestrian ahead of the vehicle from the infrared image of the surroundings of the vehicle 12 and from signals indicating the vehicle running state (here, the vehicle speed Vs, the brake operation amount Br, and the yaw rate Yr), and issues an alarm through the speaker 24 when it determines that the possibility of contact is high.
- The image processing unit 14 includes an A/D conversion circuit that converts input analog signals into digital signals; an image memory (part of the storage unit 14m) that stores the digitized image signal; a CPU (central processing unit) 14c that performs various arithmetic processes; and a storage unit 14m comprising a RAM (Random Access Memory) used by the CPU 14c to store data during calculation and a ROM (Read Only Memory) that stores programs, tables, maps, and the like.
- The CPU 14c of the image processing unit 14 captures these digital signals and executes the programs, thereby functioning as the various functional units (also referred to as functional means) described below, and sends drive signals (audio and display signals) to the speaker 24 and the image display device 26 when a warning is issued.
- the CPU 14c functions as a human body candidate region extraction unit 30, a human body comparison feature extraction unit 31, a human body reference feature extraction unit 33, a real space size estimation unit 34, a distance calculation unit 36, and the like.
- the human body reference feature extraction unit 33 includes a head extraction unit 32 and a shoulder extraction unit 37. As will be described later, the head extraction unit 32 is used as the first embodiment, and the shoulder extraction unit 37 is used as the second embodiment.
- The image processing unit 14 and the infrared camera 16 constitute the object distance measuring device 50 according to this embodiment.
- a normal video camera can be used instead of the infrared camera 16.
- the infrared camera 16 that detects far-infrared rays is disposed on the front bumper portion of the host vehicle 12 and at the center in the vehicle width direction of the host vehicle 12.
- the infrared camera 16 has a characteristic that its output signal level increases (brightness increases) as the temperature of the object increases.
- The HUD 26a is provided so that its display screen appears at a position on the front windshield of the host vehicle 12 that does not obstruct the driver's forward view.
- In relation to the physique of a pedestrian, which is a human body, the storage unit 14m stores in advance, as a characteristic (map or calculation formula), the ratio R between a part that changes little as the human body grows (referred to in this first embodiment as the human body reference feature, here the head width) and a part that changes greatly (the human body comparison feature, here the height).
- The ratio R = height / head width is calculated and stored in the storage unit 14m.
- The designations human body reference feature and human body comparison feature are a convenience for ease of understanding; the head width may equally be defined as the human body comparison feature and the height as the human body reference feature.
- the storage unit 14m also stores in advance, as a characteristic, a ratio R ′ of a portion having a different speed of change with the growth of the human body in relation to the second embodiment described later.
- the shoulder width is called a human body reference feature
- the height is called a human body comparison feature
- the ratio R ′ (shoulder width / height) is stored in the storage unit 14m.
- The table shown in FIG. 7 is a map (table) 60 of the ratio R (height / head width) according to the first embodiment, calculated from the age, height HT (average value), and head width (average value) extracted from the Japanese physique data of FIGS. 1A, 1B, 1C and 2 and the infant physique data of FIGS. 3A, 3B, 3C and 4.
- The characteristic diagram shown in FIG. 8 plots the height, the head width, and the calculated ratio R (height / head width) according to the first embodiment against ages 4 to 99. Note that the head width is substantially constant with age, so the characteristic Cr of the ratio R (height / head width) changes substantially in proportion to the height.
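A map like FIG. 7 can be consulted as a lookup table at runtime; a sketch with purely illustrative (ratio, height) pairs derived from the average values quoted in the text, using linear interpolation between entries:

```python
# Hypothetical excerpt of a FIG. 7-style map: ratio R = height/head width -> height [cm].
# The pairs are illustrative, not the patent's actual table values.
RATIO_TO_HEIGHT = [
    (6.5, 104.8),   # ~4 years old
    (9.9, 158.6),   # ~70-79 years old
    (10.7, 170.5),  # ~20-24 years old
]

def height_from_ratio(R):
    """Look up (with linear interpolation) the real-space height for a measured ratio R."""
    pts = sorted(RATIO_TO_HEIGHT)
    if R <= pts[0][0]:
        return pts[0][1]
    if R >= pts[-1][0]:
        return pts[-1][1]
    for (r0, h0), (r1, h1) in zip(pts, pts[1:]):
        if r0 <= R <= r1:
            return h0 + (h1 - h0) * (R - r0) / (r1 - r0)
```

Because Cr is nearly proportional to height, such a table stays monotonic and the interpolation is well behaved.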
- FIG. 9 is a flowchart explaining operations of the image processing unit 14 such as the detection of an object (e.g., a pedestrian), estimation of the object's size, and calculation of the distance.
- In step S1 of FIG. 9, the image processing unit 14 obtains the infrared image, the output signal within a predetermined field-angle range ahead of the vehicle captured frame by frame by the infrared camera 16, performs A/D conversion, and stores the resulting grayscale image in the image memory of the storage unit 14m.
- In step S2, the image signal is binarized: areas brighter than a luminance threshold for determining human body luminance are set to “1” (white) and darker areas to “0” (black), yielding, for each captured frame, a binarized image corresponding to the grayscale image, which is stored in the storage unit 14m.
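Step S2 reduces to a per-pixel threshold; a minimal sketch on a grayscale image represented as a list of pixel rows:

```python
def binarize(gray, threshold):
    """Step S2 sketch: set pixels brighter than the human-body luminance
    threshold to 1 (white) and all other pixels to 0 (black)."""
    return [[1 if v > threshold else 0 for v in row] for row in gray]
```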
- In step S3, the human body candidate region extraction unit 30 converts the “1” (white) pixels of the binarized image into run-length data for each scan line in the x direction (horizontal direction); run lines that overlap in the y direction are regarded as one object, a label is attached to the circumscribed rectangle of each object, and this labeling process produces the human body candidate region 52 surrounded by the circumscribed rectangle in FIG. 10.
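The patent describes run-length encoding with y-overlap grouping; an equivalent connected-component sketch (BFS flood fill instead of run-length merging, which yields the same circumscribed rectangles for 4-connected regions):

```python
from collections import deque

def label_candidate_regions(binary):
    """Step S3 sketch: group connected '1' pixels into objects and return the
    circumscribed rectangle (x0, y0, x1, y1) of each candidate region."""
    h, w = len(binary), len(binary[0])
    seen = [[False] * w for _ in range(h)]
    boxes = []
    for y in range(h):
        for x in range(w):
            if binary[y][x] == 1 and not seen[y][x]:
                q, box = deque([(y, x)]), [x, y, x, y]
                seen[y][x] = True
                while q:
                    cy, cx = q.popleft()
                    box = [min(box[0], cx), min(box[1], cy),
                           max(box[2], cx), max(box[3], cy)]
                    for ny, nx in ((cy-1, cx), (cy+1, cx), (cy, cx-1), (cy, cx+1)):
                        if 0 <= ny < h and 0 <= nx < w \
                                and binary[ny][nx] == 1 and not seen[ny][nx]:
                            seen[ny][nx] = True
                            q.append((ny, nx))
                boxes.append(tuple(box))
    return boxes
```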
- In step S4, the object is extracted. The human body comparison feature extraction unit 31 sets a mask area 53 (the area surrounded by the alternate long and short dash line in FIG. 10) slightly larger than the labeled human body candidate area 52 in the frame image, and searches it while scanning the pixel values from the upper side to the lower side and from the left side to the right side. The position where the scanned pixel values change is determined to be the boundary between the human body candidate 4 and the road surface 51 in the image, and is set as the lower end OBb of the object.
- When the boundary between the human body candidate 4 and the road surface 51 is not obvious in the binarized image, for example because the legs are covered by slacks or shoes are worn, the boundary with the road surface 51 is discriminated from the grayscale image.
- Specifically, the average luminance and the luminance variance of the pixels in the mask area 53 are obtained; for example, the average luminance or the luminance variance is calculated for each small rectangular pixel group composed of a plurality of pixels, proceeding from the upper side to the lower side and from the left side to the right side of the image in the mask area 53, and where small areas of low luminance variance are consecutive, that position may be taken as the boundary between the human body candidate 4 and the road surface 51, that is, the lower end OBb of the object.
- In step S4b, the human body comparison feature extraction unit 31 further searches the image in the mask area 53 above the lower end OBb of the object, scanning pixel by pixel from the left side to the right side. The position where the scanned values change is taken as the upper end OBt of the object, which is the edge of the boundary between the human body candidate 4 and the background. From the lower end OBb and the upper end OBt, the size of the height h of the human body candidate 4 in the image (expressed as a number of pixels n) can be obtained.
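A simplified stand-in for steps S4 and S4b, which finds OBt, OBb, and the pixel height h within the mask columns (row/column scanning rather than the patent's luminance-change search):

```python
def object_vertical_extent(binary, x0, x1):
    """Within the mask columns [x0, x1] of a binarized image, find the top row
    (OBt) and bottom row (OBb) containing object pixels; their span is the
    in-image height h in pixels."""
    rows = [y for y, row in enumerate(binary) if any(row[x0:x1 + 1])]
    if not rows:
        return None
    obt, obb = min(rows), max(rows)
    return obt, obb, obb - obt + 1
```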
- In step S5, the head extraction unit 32 scans the binarized image in the horizontal direction, from the upper end OBt of the object that is the human body candidate 4 downward and from the right side to the left side, and extracts the two vertical edges 56 and 58 corresponding to the head width hw. The size of the head width hw (number of pixels m) is thereby obtained.
- Since the human head 62 is one of the brightest parts in the infrared image, the value “1” is continuous there, and the vertical edges 56 and 58 of the head 62 are easy to detect accurately.
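A rough sketch of the step S5 measurement, taking the widest continuous run of “1” pixels in the first few rows below OBt as the head width (the 3-row window is our assumption, not a value from the patent):

```python
def head_width_px(binary, obt, rows=3):
    """Step S5 sketch: in the first few rows below the object's top OBt,
    measure the widest continuous run of '1' pixels as the head width hw [px]."""
    widest = 0
    for y in range(obt, min(obt + rows, len(binary))):
        run = best = 0
        for v in binary[y]:
            run = run + 1 if v == 1 else 0
            best = max(best, run)
        widest = max(widest, best)
    return widest
```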
- The distance Zr to the human body candidate is then calculated by the following equation (3), in the same manner as equation (1) above.
- Zr = HT × F / h …(3)
- The calculated distance Zr is shown in FIG. 8 in comparison with the distance Zc obtained under the prior-art assumption, described with reference to FIG. 15, of a uniform height Hc of, for example, 170 [cm].
- Because the head width is almost a fixed value from age 4 onward while the height fluctuates with age until about the late teens (Japanese average values: 104.8 [cm] at 4 years old; 170.5 [cm] at 20-24 years old; 158.6 [cm] at 70-79 years old; etc.),
- the distance error ΔZ is extremely small.
- The height calculation error ΔH relative to the true height value Htrue is also extremely small.
- In step S8, a high-temperature moving object is detected as the object from the grayscale and binarized images obtained for each frame over time, and the movement vector (speed and direction) of the moving object is detected. Further, in step S8, based on the brake operation amount Br, the vehicle speed Vs, and the yaw rate Yr output by the brake sensor 20, the vehicle speed sensor 18, and the yaw rate sensor 22, and on the distance Zr to the object detected in step S7, it is determined whether there is a possibility that the vehicle 12 will contact the object whose distance Zr has been calculated. If it is determined that there is a possibility of contact, information is provided to the driver in step S9: the grayscale image of the pedestrian is displayed on the HUD 26a and an alarm is generated through the speaker 24 (notification unit), notifying the driver and prompting the driver of the vehicle 12 to perform a contact avoidance operation.
- The object distance measuring device 50 using the human body detection device according to the first embodiment described above uses a single infrared camera 16 as the imaging device (in the present invention, a normal video camera capturing the visible region can also be used, but an infrared camera is preferable at night).
- Human body detection is performed based on the acquired image, and the human body candidate region 52 is extracted from the image (the binarized image and/or the grayscale image) by the human body candidate region extraction unit 30.
- the head extraction unit 32 constituting the human body reference feature extraction unit 33 extracts the head 62 from the extracted human body candidate region 52 based on a temperature pattern specific to the head having a particularly high surface temperature.
- The human body comparison feature extraction unit 31 extracts, from the extracted human body candidate region 52, at least one human body feature as a human body comparison feature (in this first embodiment, the height) from among the total height (height in FIG. 1A), the full width (shoulder width in FIG. 1A), the torso (upper limb length in FIG. 1A), the arms (arm length), and the legs (inseam height in FIG. 1A).
- The above size is, in this first embodiment, the height HT in FIGS. 7, 8, and 11.
- The ratio includes the ratio R between the head (reference feature) and the height (comparison feature), and may also be, for example, the ratio between the head (human body reference feature) and the upper limb length (human body comparison feature), or between the head (human body reference feature) and the inseam height (leg: human body comparison feature).
- The distance calculation unit 36 can calculate, by the above equation (3), the distance from the position of the infrared camera 16, which is the imaging device, to the human body candidate 2r in real space, based on the height HT that is the estimated size in real space and the height h that is the size of the human body candidate region 52 in the image.
- The table shown in FIG. 12 is a map (table) 80 of the ratio R′ (shoulder width / height) according to the second embodiment, calculated from the age, height (average value), and shoulder width (average value) extracted from the Japanese physique data of FIGS. 1A, 1B, 1C and 2 and the infant physique data of FIGS. 3A, 3B, 3C and 4.
- The characteristic diagram shown in FIG. 13 plots the height, the shoulder width, and the calculated ratio R′ (shoulder width / height) according to the second embodiment against ages 4 to 99. Note that the characteristic Cr′ of the ratio R′ (shoulder width / height) changes substantially in proportion to the height from about age 5 onward.
- In step S4, the size of the height h of the human body candidate 4 in the image (number of pixels n) is obtained from the lower end OBb and the upper end OBt of the human body candidate 4 (object) shown in FIG. 14.
- In step S5′, which replaces step S5, the shoulder extraction unit 37 scans the binarized image in the horizontal direction, from the upper end OBt of the object that is the human body candidate 4 downward and from the right side to the left side, as described with reference to FIG. 14, and extracts the two vertical edges 76 and 78 corresponding to the shoulder width sw of the shoulder portions 72 and 73, which are luminance change points and constitute the full width (maximum width) in the image. The size of the shoulder width sw (number of pixels p) is thereby obtained.
- The shoulder portions 72 and 73 of the human body are among the high-luminance portions of the infrared image when exposed (for example, when sleeveless clothes are worn),
- so the “1” values are continuous there, and the vertical edges 76 and 78 of the shoulder portions 72 and 73 are easy to detect accurately.
- Even otherwise, the outer sides of the arms in the vicinity of the arm joints can similarly be detected as the vertical edges 76 and 78 of the shoulders 72 and 73. The second embodiment is therefore preferably applied, for example, to light summer wear.
- the distance Zr to the candidate is calculated by the above-described equation (3).
- The calculated distance Zr has a smaller distance error ΔZ than the distance Zc obtained under the prior-art assumption, described with reference to FIG. 15, of a uniform height Hc of, for example, 170 [cm]. Accordingly, the height calculation error ΔH relative to the true height value Htrue is also smaller.
- The characteristic Cr′ of the ratio R′ may also be linearly approximated as a characteristic Cr″, and the height HT in real space corresponding to the calculated ratio Ri′ obtained with reference to the characteristic Cr″.
- In that case, the height HT is estimated (calculated) as HT ≈ 152 [cm].
- Even if the distance Zr to the human body candidate is calculated using the characteristic Cr′ or the characteristic Cr″, a practically sufficient effect is obtained.
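The linear approximation Cr″ can be obtained by an ordinary least-squares fit over (ratio, height) samples from the map; a sketch with hypothetical sample points (the patent does not specify the fitting method):

```python
def fit_linear(points):
    """Least-squares line h = a*r + b through (ratio, height) samples -- a sketch
    of approximating the characteristic Cr' by a straight line Cr''."""
    n = len(points)
    sr = sum(r for r, _ in points)
    sh = sum(h for _, h in points)
    srr = sum(r * r for r, _ in points)
    srh = sum(r * h for r, h in points)
    a = (n * srh - sr * sh) / (n * srr - sr * sr)
    b = (sh - a * sr) / n
    return a, b

def height_from_ratio_linear(ri, a, b):
    """Estimate the real-space height HT [cm] for a measured ratio Ri' using Cr''."""
    return a * ri + b
```

Evaluating the fitted line at a measured Ri′ replaces the table lookup with two multiply-adds, which is attractive on an embedded image processing unit.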
- the present invention is not limited to the above-described embodiment, and it is needless to say that various configurations can be adopted based on the contents described in this specification.
- In the first embodiment, the head is detected and the height HT is estimated based on the ratio R; in the second embodiment, the shoulder is detected and the height HT is estimated based on the ratio R′.
- The distance may also be calculated based on other parts, such as the torso (upper limb length in FIG. 1A), the arms (arm length), or the legs (inseam height in FIG. 1A), using whichever human body candidate feature has high reliability.
- In the above description, the human body features are the total height, full width, torso, arms, legs, and head, but the distance may also be calculated based on other parts, such as the foot size, the knee length, or the hand size.
- In the above description, the distance is calculated using a single, so-called monocular photographing device (an infrared camera or a normal video camera), but a so-called compound-eye photographing device (stereo camera) using two infrared cameras (or normal video cameras) can also be used.
Landscapes
- Physics & Mathematics (AREA)
- Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Theoretical Computer Science (AREA)
- Geometry (AREA)
- Image Analysis (AREA)
- Image Processing (AREA)
- Length Measuring Devices By Optical Means (AREA)
- Traffic Control Systems (AREA)
Abstract
Description
Zc=Hc×F/h=170[cm]×F/h …(1)
ΔZ=F×ΔH/h …(2)
Zr=HT×F/h …(3)
Claims (4)
- In an object distance measuring device (50) using a human body detection device that performs human body detection based on an image acquired by an imaging device (16), the object distance measuring device comprising:
a human body candidate region extraction unit (30) that extracts a human body candidate region (52) from the image;
a reference feature extraction unit (33) that extracts, from the extracted human body candidate region (52), one predetermined human body feature as a reference feature from among the human body features of total height, full width, torso, arms, legs, and head;
a comparison feature extraction unit (31) that extracts, from the extracted human body candidate region (52), one human body feature other than the reference feature as a comparison feature from among the human body features of total height, full width, torso, arms, legs, and head;
a real space size estimation unit (34) that estimates the size in real space of the human body candidate in the image based on the ratio (R) between the size of the reference feature and the size of the comparison feature; and
a distance calculation unit (36) that calculates the distance from the imaging device (16) to the human body candidate in real space based on the estimated size of the human body candidate in real space and the size of the human body candidate in the image. - The object distance measuring device according to claim 1, wherein
the reference feature is the head (62) and the comparison feature is the total height. - The object distance measuring device according to claim 1, wherein
the reference feature is the full width and the comparison feature is the total height. - A vehicle (12) equipped with the object distance measuring device (50) according to any one of claims 1 to 3, wherein
the vehicle (12) includes a notification unit (24, 26) that notifies the driver when the human body detection device detects a human body.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2011543170A JP5258977B2 (ja) | 2009-11-25 | 2010-10-18 | Object distance measuring device and vehicle equipped with the device |
CN201080053321.8A CN102640182B (zh) | 2009-11-25 | 2010-10-18 | Monitored-object distance measuring device and vehicle equipped with the device |
US13/511,074 US8983123B2 (en) | 2009-11-25 | 2010-10-18 | Target-object distance measuring device and vehicle mounted with the device |
EP10832996.2A EP2506211B1 (en) | 2009-11-25 | 2010-10-18 | Target-object distance measuring device and vehicle mounted with the device |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2009267793 | 2009-11-25 | ||
JP2009-267793 | 2009-11-25 | ||
JP2010090407 | 2010-04-09 | ||
JP2010-090407 | 2010-04-09 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2011065149A1 (ja) | 2011-06-03 |
Family
ID=44066254
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2010/068253 WO2011065149A1 (ja) | 2009-11-25 | 2010-10-18 | Target-object distance measuring device and vehicle mounted with the device |
Country Status (5)
Country | Link |
---|---|
US (1) | US8983123B2 (ja) |
EP (1) | EP2506211B1 (ja) |
JP (1) | JP5258977B2 (ja) |
CN (1) | CN102640182B (ja) |
WO (1) | WO2011065149A1 (ja) |
Cited By (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102431485A (zh) * | 2011-10-13 | 2012-05-02 | 中国人民解放军总后勤部军需装备研究所 | Vehicle-mounted non-contact three-dimensional automatic human body measurement system |
JP2013002884A (ja) * | 2011-06-14 | 2013-01-07 | Honda Motor Co Ltd | Distance measuring device |
JP2014122871A (ja) * | 2012-12-21 | 2014-07-03 | Zong Jing Investment Inc | Distance measuring method and computer program product |
JP5642785B2 (ja) * | 2010-06-07 | 2014-12-17 | 本田技研工業株式会社 | Vehicle periphery monitoring device |
US20150035962A1 (en) * | 2012-03-12 | 2015-02-05 | Honda Motor Co., Ltd. | Vehicle periphery monitor device |
JP2016080550A (ja) * | 2014-10-17 | 2016-05-16 | オムロン株式会社 | Area information estimation device, area information estimation method, and air conditioner |
CN110488811A (zh) * | 2019-07-22 | 2019-11-22 | 上海有个机器人有限公司 | Method for a robot to predict pedestrian trajectories based on a social network model |
JP2020201746A (ja) * | 2019-06-11 | 2020-12-17 | トヨタ自動車株式会社 | Distance estimation device, distance estimation method, and computer program for distance estimation |
JP2021026265A (ja) * | 2019-07-31 | 2021-02-22 | 富士通株式会社 | Image processing device, image processing program, and image processing method |
Families Citing this family (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2013047088A1 (ja) * | 2011-09-28 | 2013-04-04 | 本田技研工業株式会社 | Living body recognition device |
US11615460B1 (en) | 2013-11-26 | 2023-03-28 | Amazon Technologies, Inc. | User path development |
US9444988B2 (en) * | 2014-01-23 | 2016-09-13 | Kiomars Anvari | Fast image sensor for body protection gear or equipment |
US10586203B1 (en) * | 2015-03-25 | 2020-03-10 | Amazon Technologies, Inc. | Segmenting a user pattern into descriptor regions for tracking and re-establishing tracking of a user within a materials handling facility |
US11205270B1 (en) | 2015-03-25 | 2021-12-21 | Amazon Technologies, Inc. | Collecting user pattern descriptors for use in tracking a movement of a user within a materials handling facility |
US10679177B1 (en) | 2015-03-25 | 2020-06-09 | Amazon Technologies, Inc. | Using depth sensing cameras positioned overhead to detect and track a movement of a user within a materials handling facility |
US10810539B1 (en) | 2015-03-25 | 2020-10-20 | Amazon Technologies, Inc. | Re-establishing tracking of a user within a materials handling facility |
US9896022B1 (en) * | 2015-04-20 | 2018-02-20 | Ambarella, Inc. | Automatic beam-shaping using an on-car camera system |
KR20160126802A (ko) * | 2015-04-24 | 2016-11-02 | 삼성전자주식회사 | Method for measuring human body information and electronic device thereof |
JP6627680B2 (ja) | 2016-07-27 | 2020-01-08 | 株式会社Jvcケンウッド | Person detection device, person detection system, person detection method, and person detection program |
US9891625B1 (en) * | 2016-11-20 | 2018-02-13 | Veripat, LLC | Vehicle with autonomous feature override for theft prevention |
US11328513B1 (en) | 2017-11-07 | 2022-05-10 | Amazon Technologies, Inc. | Agent re-verification and resolution using imaging |
US11186295B2 (en) * | 2017-12-22 | 2021-11-30 | Veripat, LLC | Vehicle with escape feature using synthesized vehicle view |
US10991130B2 (en) * | 2019-07-29 | 2021-04-27 | Verizon Patent And Licensing Inc. | Systems and methods for implementing a sensor based real time tracking system |
CN116107394B (zh) * | 2023-04-06 | 2023-08-04 | 合肥联宝信息技术有限公司 | Adjustment method and apparatus, electronic device, and storage medium |
CN116953680B (zh) * | 2023-09-15 | 2023-11-24 | 成都中轨轨道设备有限公司 | Image-based real-time target-object distance measuring method and system |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2007213561A (ja) | 2006-01-16 | 2007-08-23 | Honda Motor Co Ltd | Vehicle periphery monitoring device |
JP3970876B2 (ja) | 2004-11-30 | 2007-09-05 | 本田技研工業株式会社 | Vehicle periphery monitoring device |
JP2009088709A (ja) * | 2007-09-27 | 2009-04-23 | Fujifilm Corp | Height estimation device and photographing device |
Family Cites Families (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4135123B2 (ja) * | 1998-05-13 | 2008-08-20 | 日産自動車株式会社 | Display processing device |
DE10301468B4 (de) * | 2002-01-18 | 2010-08-05 | Honda Giken Kogyo K.K. | Device for observing the surroundings of a vehicle |
DE102005056647B4 (de) * | 2004-11-30 | 2011-02-03 | Honda Motor Co., Ltd. | Vehicle surroundings monitoring device |
JP4062306B2 (ja) * | 2004-12-15 | 2008-03-19 | 日産自動車株式会社 | Image processing device and method |
JP4267657B2 (ja) * | 2006-10-31 | 2009-05-27 | 本田技研工業株式会社 | Vehicle periphery monitoring device |
CN100567886C (zh) * | 2007-01-15 | 2009-12-09 | 中国北方车辆研究所 | Three-point calibration measurement method |
CN101016053A (zh) * | 2007-01-25 | 2007-08-15 | 吉林大学 | Rear-end collision warning method and system for vehicles on high-grade highways |
CN100567891C (zh) * | 2007-02-16 | 2009-12-09 | 北京航空航天大学 | Vehicle-mounted environment and distance measurement system based on binocular vision |
EP2227406B1 (en) | 2007-11-12 | 2015-03-18 | Autoliv Development AB | A vehicle safety system |
2010
- 2010-10-18 EP EP10832996.2A patent/EP2506211B1/en not_active Not-in-force
- 2010-10-18 JP JP2011543170A patent/JP5258977B2/ja not_active Expired - Fee Related
- 2010-10-18 WO PCT/JP2010/068253 patent/WO2011065149A1/ja active Application Filing
- 2010-10-18 CN CN201080053321.8A patent/CN102640182B/zh active Active
- 2010-10-18 US US13/511,074 patent/US8983123B2/en active Active
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3970876B2 (ja) | 2004-11-30 | 2007-09-05 | 本田技研工業株式会社 | Vehicle periphery monitoring device |
JP2007213561A (ja) | 2006-01-16 | 2007-08-23 | Honda Motor Co Ltd | Vehicle periphery monitoring device |
JP2009088709A (ja) * | 2007-09-27 | 2009-04-23 | Fujifilm Corp | Height estimation device and photographing device |
Non-Patent Citations (4)
Title |
---|
"Measured Data of Japanese Human Bodies, Measured from 1992 to 1994", 1992, RESEARCH INSTITUTE OF HUMAN ENGINEERING FOR QUALITY LIFE |
"Report on the Investigation of Japanese Body Frames, Measured from 1978 to 1981", 1978, AGENCY OF INDUSTRIAL SCIENCE AND TECHNOLOGY, JAPANESE STANDARDS ASSOCIATION |
"Report on the Investigation of Japanese Body Frames, Measured from 1978 to 1981", 1978, JAPANESE STANDARDS ASSOCIATION |
See also references of EP2506211A4 |
Cited By (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5642785B2 (ja) * | 2010-06-07 | 2014-12-17 | 本田技研工業株式会社 | Vehicle periphery monitoring device |
JP2013002884A (ja) * | 2011-06-14 | 2013-01-07 | Honda Motor Co Ltd | Distance measuring device |
CN102431485B (zh) * | 2011-10-13 | 2013-04-17 | 中国人民解放军总后勤部军需装备研究所 | Vehicle-mounted non-contact three-dimensional automatic human body measurement system |
CN102431485A (zh) * | 2011-10-13 | 2012-05-02 | 中国人民解放军总后勤部军需装备研究所 | Vehicle-mounted non-contact three-dimensional automatic human body measurement system |
US10565438B2 (en) * | 2012-03-12 | 2020-02-18 | Honda Motor Co., Ltd. | Vehicle periphery monitor device |
US20150035962A1 (en) * | 2012-03-12 | 2015-02-05 | Honda Motor Co., Ltd. | Vehicle periphery monitor device |
JP2014122871A (ja) * | 2012-12-21 | 2014-07-03 | Zong Jing Investment Inc | Distance measuring method and computer program product |
JP2016080550A (ja) * | 2014-10-17 | 2016-05-16 | オムロン株式会社 | Area information estimation device, area information estimation method, and air conditioner |
JP2020201746A (ja) * | 2019-06-11 | 2020-12-17 | トヨタ自動車株式会社 | Distance estimation device, distance estimation method, and computer program for distance estimation |
JP7003972B2 (ja) | 2019-06-11 | 2022-01-21 | トヨタ自動車株式会社 | Distance estimation device, distance estimation method, and computer program for distance estimation |
CN110488811A (zh) * | 2019-07-22 | 2019-11-22 | 上海有个机器人有限公司 | Method for a robot to predict pedestrian trajectories based on a social network model |
CN110488811B (zh) * | 2019-07-22 | 2023-04-07 | 上海有个机器人有限公司 | Method for a robot to predict pedestrian trajectories based on a social network model |
JP2021026265A (ja) * | 2019-07-31 | 2021-02-22 | 富士通株式会社 | Image processing device, image processing program, and image processing method |
Also Published As
Publication number | Publication date |
---|---|
CN102640182B (zh) | 2014-10-15 |
JPWO2011065149A1 (ja) | 2013-04-11 |
JP5258977B2 (ja) | 2013-08-07 |
EP2506211B1 (en) | 2014-05-07 |
US20120281878A1 (en) | 2012-11-08 |
EP2506211A4 (en) | 2013-05-22 |
EP2506211A1 (en) | 2012-10-03 |
US8983123B2 (en) | 2015-03-17 |
CN102640182A (zh) | 2012-08-15 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5258977B2 (ja) | Target-object distance measuring device and vehicle mounted with the device | |
JP4410292B1 (ja) | Vehicle periphery monitoring device | |
US7233233B2 (en) | Vehicle surroundings monitoring apparatus | |
US7436982B2 (en) | Vehicle surroundings monitoring apparatus | |
JP5639283B2 (ja) | Vehicle periphery monitoring device | |
JP3764086B2 (ja) | Vehicle information providing device | |
JP4173901B2 (ja) | Vehicle periphery monitoring device | |
JP4631096B2 (ja) | Vehicle periphery monitoring device | |
US8810653B2 (en) | Vehicle surroundings monitoring apparatus | |
JP4173902B2 (ja) | Vehicle periphery monitoring device | |
JP4528283B2 (ja) | Vehicle periphery monitoring device | |
JP4644273B2 (ja) | Vehicle periphery monitoring device | |
US7885430B2 (en) | Automotive environment monitoring device, vehicle with the automotive environment monitoring device, and automotive environment monitoring program | |
JP5004923B2 (ja) | Vehicle driving support device | |
JP2006151300A (ja) | Vehicle periphery monitoring device | |
JP2009126493A (ja) | Obstacle detection device | |
JP4704998B2 (ja) | Image processing device | |
JP3994954B2 (ja) | Object detection device and object detection method | |
JP3844750B2 (ja) | Infrared image recognition device and alarm device using the infrared image recognition device | |
JP5430633B2 (ja) | Vehicle periphery monitoring device | |
JP2004362265A (ja) | Infrared image recognition device | |
JP4922368B2 (ja) | Vehicle periphery monitoring device | |
JP5904927B2 (ja) | Vehicle periphery monitoring device | |
JP5907849B2 (ja) | Vehicle periphery monitoring device | |
JP4283266B2 (ja) | Vehicle periphery monitoring device | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 201080053321.8 Country of ref document: CN |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 10832996 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2011543170 Country of ref document: JP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 13511074 Country of ref document: US |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2010832996 Country of ref document: EP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |