WO2014007286A1 - 状態認識システム及び状態認識方法 - Google Patents
状態認識システム及び状態認識方法 Download PDFInfo
- Publication number
- WO2014007286A1 (PCT/JP2013/068241)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- state
- vehicle
- road
- image
- travel path
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/588—Recognition of the road, e.g. of lane markings; Recognition of the vehicle driving pattern in relation to the road
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/98—Detection or correction of errors, e.g. by rescanning the pattern or by human intervention; Evaluation of the quality of the acquired patterns
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
Definitions
- the present invention relates to a system for recognizing a state, and more particularly, to a system mounted on a vehicle and recognizing a road surface state.
- Determining the road surface condition is preferable for safe driving of vehicles such as automobiles.
- Japanese Patent Laid-Open No. 2011-174794 discloses a technique that applies short-cycle and long-cycle Gabor filtering to an image captured by a camera, normalizes the two sets of filtered data, and determines whether the road surface is wet or dry by comparing the normalized value with a predetermined threshold.
- An object of the present invention is to suitably determine whether the state of a traveling road is on-road or off-road even in a situation where the road surface state cannot be clearly observed.
- A representative example of the invention disclosed in the present application is as follows: a state recognition system for recognizing the state of a traveling path on which a vehicle travels, comprising an analysis unit that analyzes an image captured by an in-vehicle camera mounted on the vehicle; a traveling path discriminating unit that determines the state of the traveling path based on the result of the analysis; and a lens diagnosis unit that extracts deposits on the lens of the in-vehicle camera from the image, determines the type and amount of the extracted deposits, and diagnoses the state of the lens based on the determination result.
- FIG. 1 is a block diagram showing the configuration of an external environment recognition system 1 according to the first embodiment of the present invention.
- The external environment recognition system 1 of the present embodiment is a computer that includes a processor (CPU) 101, a memory 102, and an input/output interface 103, connected to one another by a bus.
- the processor 101 is an arithmetic device that executes a program stored in the memory 102.
- the memory 102 is, for example, a non-volatile storage device such as a flash memory or a high-speed and volatile storage device such as a DRAM (Dynamic Random Access Memory), and stores an operating system (OS) and application programs.
- When the processor 101 executes the operating system, the basic functions of the computer are realized; when it executes the application programs, the functions provided by the external environment recognition system 1 are realized.
- When the processor 101 executes a predetermined program, the streetlight analysis unit 201, the road surface analysis unit 202, the traveling road determination unit 203, and the lens diagnosis unit 204 are implemented in the external environment recognition system 1.
- The in-vehicle camera that captures the image input to the external environment recognition system 1 is either a wide-angle camera with a wide field of view or a narrow-angle camera that can capture distant objects within a narrow field of view, and is attached to the front or rear of the vehicle.
- A camera attached to the rear of the vehicle is more exposed to mud and snow thrown up by the vehicle, so its lens is more likely to become dirty and the benefit of applying this embodiment is greater.
- The streetlight analysis unit 201 extracts high-luminance regions brighter than a predetermined brightness in the area above the horizon (the sky) of the input image, thereby analyzing the lighting around the road (streetlights and lights of surrounding buildings).
- The streetlight analysis unit 201 has a day/night determination function and is controlled to operate only at night.
- The day/night determination function can determine whether it is day or night from the brightness of the input image or from the auto-exposure information of the camera that captures the image.
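The day/night determination described above can be sketched as follows. The threshold values, and the combination of mean image luminance with exposure time and sensor gain, are illustrative assumptions, not values from the patent.

```python
def is_night(image_luma_mean, exposure_time_ms, gain_db,
             luma_thresh=60.0, exposure_thresh_ms=20.0):
    """Hypothetical day/night check: a dark image combined with a long
    auto-exposure time (or high sensor gain) suggests night.
    All thresholds are illustrative."""
    dark_image = image_luma_mean < luma_thresh
    long_exposure = exposure_time_ms > exposure_thresh_ms or gain_db > 12.0
    return dark_image and long_exposure
```

In this sketch, the streetlight analysis unit would run only when `is_night(...)` returns `True`, matching the control described above.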
- the road surface analysis unit 202 extracts road surface features from the input image, and outputs road surface feature amounts and recognition results as road surface information. Details of the configuration of the road surface analysis unit 202 will be described later with reference to FIG.
- The travel path determination unit 203 determines whether the traveling road is on-road or off-road based on the lighting information analyzed by the streetlight analysis unit 201, the road surface information analyzed by the road surface analysis unit 202, and the lens state diagnosed by the lens diagnosis unit 204. That is, the travel path determination unit 203 analyzes the lighting information and the road surface information in time series to determine whether the travel path is on-road (a paved road) or off-road (an unpaved road, mud road, snowy road, grassland, or desert, for example). Details of the processing in the travel path determination unit 203 will be described later with reference to FIG. 4.
- the lens diagnosis unit 204 diagnoses the lens state based on the image information and the traveling road state, and includes a stain extraction unit 2041 and a stain determination unit 2042.
- Typical lens stains include water droplets, mud, snow, and water-droplet traces. The stain extraction unit 2041 extracts regions suspected to be stains (deposit candidate regions), for example based on edges, color, shape, size, and temporal change in the input image.
- The stain determination unit 2042 determines the type of each extracted deposit candidate region based on the travel path state determined by the travel path determination unit 203, and outputs the determined type and amount of deposits as lens state information. For example, since it is difficult to distinguish water droplets from muddy water at night, a more accurate determination is made using the traveling road state: when a deposit that looks like a water droplet or muddy water is detected on-road, it is determined to be a water droplet; when such a deposit is detected off-road, it is determined to be muddy water. Muddy water, which has a large effect as lens dirt, is thereby determined more appropriately.
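The disambiguation rule above can be sketched as a minimal function; the string labels are illustrative names, not identifiers from the patent.

```python
def classify_waterlike_deposit(travel_road_state):
    """Resolve a deposit that looks like either a water droplet or muddy
    water by using the travel-road state, as described above."""
    if travel_road_state == "on-road":
        return "water droplet"   # paved roads rarely splash mud onto the lens
    return "muddy water"         # off-road, muddy splashes are the likely cause
```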
- the external environment recognition system 1 may have a secondary storage device.
- The secondary storage device is a large-capacity non-volatile storage device such as a magnetic storage device or a flash memory, and may store a program executed by the processor 101 and data used by the processor 101 when the program is executed.
- the program is read from the secondary storage device, loaded into the memory 102, and executed by the processor 101.
- the input / output interface 103 is an interface for inputting / outputting data such as USB.
- the input / output interface 103 may include a communication interface connected to the in-vehicle network.
- the program executed by the processor 101 is provided to the computer via a nonvolatile storage medium or a network.
- the computer may be provided with an interface for reading a storage medium (CD-ROM, flash memory, etc.).
- FIG. 2 is a block diagram showing the configuration of the road surface analysis unit 202 of the first embodiment.
- the road surface analysis unit 202 includes a solid line lane feature extraction unit 2020, a solid line lane presence / absence determination unit 2021, a periodic lane feature extraction unit 2022, a periodic lane presence / absence determination unit 2023, an outside road feature extraction unit 2024, an outside road stability presence / absence determination unit 2025, A road surface paint feature extracting unit 2026, a road surface paint presence / absence determining unit 2027, a road surface color feature extracting unit 2028, and a road surface covering presence / absence determining unit 2029 are provided.
- In this way, the traveling road state can be suitably determined even in bad weather or at night.
- the solid line lane feature extraction unit 2020 extracts a solid line component from the left and right lane analysis region images, analyzes the degree and color of the extracted solid line, and outputs a feature amount of the solid line lane.
- the solid line lane presence / absence determination unit 2021 determines the presence / absence of a solid line lane based on the solid line lane feature amount, and outputs solid line lane information indicating the probability that a solid line lane exists.
- the periodic lane feature extraction unit 2022 extracts a shape having a periodicity (such as a broken line) from the images of the left and right lane analysis regions, analyzes the periodicity of the extracted shape, and outputs a feature amount of the periodic lane.
- the periodic lane presence / absence determining unit 2023 determines the presence / absence of a periodic lane based on the periodic lane feature amount, and outputs periodic lane information indicating the probability that the periodic lane exists.
- Lanes can be broadly classified into solid-line types (white, yellow) and periodic types (broken lines, Botts' dots, etc.); the solid line lane feature extraction unit 2020 and the periodic lane feature extraction unit 2022 extract these different types of features. If a solid line lane and/or a periodic lane is detected, the vehicle is likely to be on-road (on a paved road).
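The distinction between solid and periodic lanes can be illustrated with a toy periodicity check on the lengths of detected marks and gaps along a lane candidate; the tolerance and the representation as length lists are illustrative assumptions, not the patent's feature extraction.

```python
def lane_periodicity(mark_lengths, gap_lengths, tol=0.3):
    """Toy lane-type check: near-constant mark and gap lengths indicate a
    periodic (broken-line) lane; one unbroken mark with no gaps indicates
    a solid lane. Lengths are in arbitrary units."""
    if not gap_lengths:                    # no gaps -> solid line
        return "solid"

    def near_constant(vals):
        mean = sum(vals) / len(vals)
        return all(abs(v - mean) <= tol * mean for v in vals)

    if near_constant(mark_lengths) and near_constant(gap_lengths):
        return "periodic"
    return "unknown"
```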
- the out-of-road feature extraction unit 2024 extracts the feature amount (luminance, etc.) of the images in the left and right out-of-road analysis areas and outputs the out-of-road feature amount.
- The out-of-road stability presence/absence determination unit 2025 analyzes the time-series change (periodicity, stability) of the out-of-road feature amount to determine whether the objects outside the road are artificial objects or natural objects such as grass, snow, and mud, and outputs out-of-road information indicating the probability that the objects outside the road are artifacts. For example, when the brightness of the out-of-road analysis area does not change over time, it can be determined that a roadside barrier is installed outside the road.
- When the luminance of an out-of-road analysis area changes periodically, it can be determined that artifacts (for example, guardrails or noise barriers) are installed outside the road. If artifacts are detected outside the road, the vehicle is likely to be on-road (on a paved road); if natural objects are detected outside the road, it is likely to be off-road (on an unpaved road).
- The road surface paint feature extraction unit 2026 extracts features of road surface paint (for example, pedestrian crossings, stop lines, and speed-limit characters) in the area between the left and right lane analysis regions.
- the road surface paint presence / absence determination unit 2027 analyzes the feature amount of the road surface paint, determines the presence or absence of road surface paint, and outputs road surface paint information indicating the probability that the road surface paint exists. If road surface paint is detected, there is a high possibility of being on-road (paved road).
- the road surface color feature extraction unit 2028 determines the color ratio of the image.
- The road surface covering presence/absence determining unit 2029 determines whether the road surface is covered based on the determined image colors, and outputs road surface covering information indicating the probability that the road surface is covered. For example, when white is the dominant color in the image, the road surface is likely to be covered with snow.
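The snow-covering check above can be sketched as a dominant-white-pixel ratio over the road-surface area; the RGB threshold and the ratio threshold are illustrative assumptions.

```python
def road_covering_probability(pixel_colors, white_thresh=200, ratio_thresh=0.6):
    """Sketch of the road-surface covering check: if near-white pixels
    dominate the road area, the road is likely snow-covered.
    pixel_colors is a list of (r, g, b) tuples from the road region."""
    white = sum(1 for (r, g, b) in pixel_colors
                if min(r, g, b) >= white_thresh)   # all channels bright
    ratio = white / len(pixel_colors)
    return ratio, ratio >= ratio_thresh
```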
- FIG. 3 is a diagram illustrating an example of an image input to the external environment recognition system 1 according to the first embodiment.
- the image is input from the in-vehicle camera to the external environment recognition system 1 of the first embodiment.
- The in-vehicle camera is either a wide-angle camera with a wide field of view or a narrow-angle camera with a narrow field of view that can capture distant objects, and is attached to the front or rear of the vehicle.
- FIG. 3 illustrates an image acquired by the wide-angle camera.
- The image contains an area 401 where the vehicle body and/or the light shielding plate is reflected, and an area 402 in the lower part where the vehicle body is reflected.
- lane analysis areas 403 for analyzing the lane in which the vehicle travels are provided on the left and right of the image.
- an outside road analysis area 404 for analyzing a state outside the running road is provided on the left and right outside of the lane analysis area.
- On paved roads, characteristics such as a solid line lane 411, a periodic lane 412, road surface paint 413, a streetlight 414, and a stable roadside, as shown in FIG. 3, often appear; on unpaved roads they rarely appear. These characteristics are therefore suitable for determining whether the traveling road is on-road or off-road, and they can be extracted without a great loss of clarity even in bad weather or at night.
- FIG. 4 is a flowchart of the process executed by the travel path determination unit 203 of the first embodiment.
- The travel path determination unit 203 first determines, using the lens state information output by the lens diagnosis unit 204, whether the lens is in a state in which the travel path can be determined (S101). For example, if the lens state is such that the road surface is almost invisible, the road surface is likely to be erroneously determined as off-road, so it is determined that the travel path cannot be determined.
- the travel path determination unit 203 uses the road surface information analyzed by the road surface analysis unit 202 to calculate an off-road probability (S102).
- the probability of being off-road can be obtained by the following Bayesian estimation.
- Let the road class be C ∈ {off-road, on-road}, and let the road surface information be an N-dimensional vector x_LANE = (solid line lane information, periodic lane information, out-of-road information, road surface paint information, road surface covering information, ...).
- The posterior probability p(C | x_LANE) of the road class after the road surface information is obtained is given by Bayes' theorem: p(C | x_LANE) = p(x_LANE | C) p(C) / p(x_LANE).
- Assuming the observations are conditionally independent given the road class, the posterior probability can be calculated by learning in advance the probability of each observation value given the road class and multiplying the prior probability by their product: p(C | x_LANE) ∝ p(C) ∏ p(x_i | C).
- the prior probability may be set to an arbitrary value or may be given in a uniform distribution.
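The Bayesian estimation described above can be sketched as a naive-Bayes computation over the two road classes with a uniform prior. The likelihood table and feature names here are illustrative placeholders, not learned values from the patent.

```python
def offroad_posterior(observations, likelihoods, prior=None):
    """Naive-Bayes sketch of p(C | x_LANE): multiply the class prior by
    per-feature likelihoods p(x_i | C), then normalise over the classes.
    observations: {feature_name: observed_value}
    likelihoods:  {feature_name: {road_class: {value: probability}}}"""
    classes = ("off-road", "on-road")
    if prior is None:
        prior = {c: 0.5 for c in classes}        # uniform prior
    post = {}
    for c in classes:
        p = prior[c]
        for feat, value in observations.items():
            p *= likelihoods[feat][c][value]     # conditional independence
        post[c] = p
    z = sum(post.values())                       # normalising constant
    return {c: p / z for c, p in post.items()}
```

A detected solid lane, for instance, shifts the posterior strongly toward on-road because its likelihood under the on-road class is higher.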
- Alternatively, the travel route class may be determined by a discriminative model that models a discriminant function.
- the probability of the road class and the discrimination result may be obtained after obtaining observation information using an algorithm such as support vector machine or boosting.
- the probability of the travel route class and the discrimination result after obtaining the observation information may be obtained by using a time series generation model such as a hidden Markov model (HMM: Hidden Markov Model).
- Next, the travel path determination unit 203 uses the lighting information vector x_LIGHT (high-luminance area ratio, high-luminance area concentration, high-luminance area score, ...) analyzed by the streetlight analysis unit 201 to calculate the off-road probability p(C | x_LIGHT) by the following equation (S103). For example, if the area ratio of high-luminance regions is at or above a predetermined threshold, the vehicle is likely to be on-road; when the number of high-luminance points is large, a score can be assigned from the viewpoint that they are likely to be streetlights or surrounding buildings, which again indicates on-road.
- The travel path determination unit 203 then calculates an instantaneous off-road score p(C | x) as the weighted sum p(C | x) = α·p(C | x_LANE) + β·p(C | x_LIGHT), where α is the weight of the road surface information and β is the weight of the lighting information.
- A predetermined number of off-road scores from a plurality of frames (an offset time) are accumulated, and their average or median is taken to calculate a time-series representative value of the off-road score (S106). It is then determined whether the time-series representative value exceeds a predetermined threshold (S107). If it exceeds the threshold, the traveling road is determined to be off-road (S108); otherwise, the road is determined to be on-road (S109).
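Steps S106 to S109 can be sketched as a sliding-window decision using the median as the time-series representative value. The window size and threshold are illustrative assumptions; the patent leaves these parameters unspecified.

```python
from collections import deque
from statistics import median

class OffroadDecider:
    """Accumulate per-frame off-road scores, take the median over a fixed
    window (S106), and compare it against a threshold (S107-S109)."""
    def __init__(self, window=10, threshold=0.5):
        self.scores = deque(maxlen=window)   # keeps only the latest frames
        self.threshold = threshold

    def update(self, frame_score):
        self.scores.append(frame_score)
        representative = median(self.scores)  # robust to single-frame outliers
        return "off-road" if representative > self.threshold else "on-road"
```

Using the median rather than the instantaneous score keeps a single noisy frame from flipping the decision.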
- In the above description, the off-road score was obtained as a weighted sum, but the off-road score of the current frame may instead be obtained by using the road surface information and the streetlight information comprehensively.
- As described above, the first embodiment of the present invention has a streetlight analysis unit that analyzes the lighting in the image and a road surface analysis unit that analyzes the characteristics of the road surface from the image, so these two kinds of features can be used to accurately determine whether the vehicle is on-road or off-road.
- Since the road surface information used by the road surface analysis unit consists of features that do not lose much clarity even in bad weather or at night (solid line lanes, periodic lanes, out-of-road stability, road surface paint, etc.), a suitable travel route determination can be made without being greatly affected by the environment. That is, erroneous and failed determinations of the traveling road state can be greatly reduced.
- Since the traveling path is determined taking the lens state into account, forcing a determination when the lens is extremely dirty can be avoided. That is, erroneous and failed determinations caused by lens contamination can be greatly reduced.
- FIG. 5 is a block diagram showing the configuration of the external environment recognition system 1 according to the second embodiment of the present invention.
- The external environment recognition system 1 of the second embodiment differs from that of the first embodiment in that information from in-vehicle sensors is additionally used to determine whether the road is off-road or on-road. For this purpose, the external environment recognition system 1 of the second embodiment has a vehicle behavior analysis unit 205.
- the same components and processes as those in the first embodiment described above are denoted by the same reference numerals, and description thereof is omitted.
- a plurality of in-vehicle sensors are mounted on a vehicle on which the external environment recognition system 1 of the second embodiment is mounted.
- the external environment recognition system 1 can use information output from various in-vehicle sensors.
- As in-vehicle sensor information, the vehicle speed output from a vehicle speed sensor, the steering angle output from a steering angle sensor, the lateral and longitudinal acceleration output from an acceleration sensor, the yaw rate output from a yaw rate sensor, and the wheel speeds output from wheel speed sensors can be used.
- operation information of various control systems such as an anti-lock braking system (ABS), a traction control system (TCS), and a vehicle dynamics control (VDC) can be used as in-vehicle sensor information.
- The pitching amount may be determined using a vehicle height sensor that measures the sinking of the vehicle and an acceleration sensor that measures vertical vibration, and this in-vehicle sensor information may also be acquired and used. Information from sensors other than those described above may be used as well.
- the vehicle behavior analysis unit 205 is implemented by the processor 101 executing a predetermined program.
- the vehicle behavior analysis unit 205 analyzes information output from the in-vehicle sensor and outputs vehicle behavior information indicating the behavior of the vehicle to the travel path determination unit 203.
- the vehicle behavior analysis unit 205 outputs vehicle behavior information to the road surface analysis unit 202
- the road surface analysis unit 202 outputs road surface information to the vehicle behavior analysis unit 205.
- the lane change can be detected with high accuracy by using both the solid line lane information and periodic lane information output by the road surface analysis unit 202 and the steering angle information output by the vehicle behavior analysis unit 205.
- the road surface analysis unit 202 can determine whether the feature amount (periodicity) extracted by the periodic lane feature extraction unit 2022 is appropriate based on the vehicle speed acquired from the vehicle behavior analysis unit 205.
- The traveling road discrimination unit 203 determines whether the state of the traveling road is on-road or off-road based on the lighting information analyzed by the streetlight analysis unit 201, the road surface information analyzed by the road surface analysis unit 202, the vehicle behavior information analyzed by the vehicle behavior analysis unit 205, and the lens state diagnosed by the lens diagnosis unit 204. In other words, the travel path determination unit 203 analyzes the lighting information, road surface information, and vehicle behavior information in time series. Details of the processing in the travel path determination unit 203 will be described later with reference to FIG. 6.
- FIG. 6 is a flowchart of processing executed by the travel path determination unit 203 of the second embodiment.
- the travel route determination unit 203 of the second embodiment determines the state of the travel route based on the vehicle behavior information analyzed by the vehicle behavior analysis unit 205 in addition to the lighting information and the road surface information. For this reason, step S110 is added.
- After calculating the probability of being off-road using the lighting information in step S103, the travel path determination unit 203 calculates the probability of being off-road using the vehicle behavior information (S110).
- If the vehicle speed is extremely high (for example, exceeding 100 km/h), it is estimated that the vehicle is traveling on a highway, and the vehicle is likely to be on-road (on a paved road).
- If a large steering angle is continuously detected during traveling, that is, if the turning radius is small, it is estimated that the vehicle is turning at an intersection or turning around in a parking lot or the like, and the vehicle is likely to be on-road.
- If the variation in vehicle speed and/or steering angle is large, the vehicle is likely to be off-road.
- If the variation among the wheel speeds is large, it is estimated that the wheels have different grip, and the vehicle is likely to be off-road.
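The vehicle-behavior cues listed above can be sketched as a simple score: high speed pushes toward on-road, while large speed/steering variation and unequal wheel speeds push toward off-road. All thresholds and weight increments are illustrative assumptions, not values from the patent.

```python
from statistics import pstdev

def offroad_prob_from_behavior(speeds_kmh, steering_deg, wheel_speeds):
    """Map vehicle-behavior cues to a rough off-road probability in [0, 1].
    Starts neutral at 0.5 and adjusts per cue."""
    score = 0.5
    if max(speeds_kmh) > 100:                           # highway-like speed
        score -= 0.3
    if pstdev(speeds_kmh) > 10:                         # jerky speed profile
        score += 0.15
    if pstdev(steering_deg) > 15:                       # constant corrections
        score += 0.15
    if max(wheel_speeds) - min(wheel_speeds) > 2.0:     # uneven wheel grip
        score += 0.15
    return min(max(score, 0.0), 1.0)
```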
- With the vehicle behavior information represented by the vector x_VEHICLE, the off-road probability p(C | x_VEHICLE) is calculated by the following equation (S110).
- The travel path determination unit 203 calculates an instantaneous off-road score p(C | x) as the weighted sum p(C | x) = α·p(C | x_LANE) + β·p(C | x_LIGHT) + γ·p(C | x_VEHICLE), where α is the weight of the road surface information, β is the weight of the lighting information, and γ is the weight of the vehicle behavior information.
- In the above description, the off-road score was obtained as a weighted sum, but the instantaneous off-road score may instead be obtained by using the road surface information, streetlight information, and vehicle behavior information comprehensively.
- The available information can be divided into ambiguous information suited to description by a probabilistic model (information that cannot determine the travel route on its own but is effective in combination) and determinable information not suited to such description (information that can determine the travel route by itself). Accordingly, the pieces of information suited to the probabilistic model are selected from the road surface information, streetlight information, and vehicle behavior information to construct a road surface/streetlight/vehicle-behavior information vector, and the instantaneous off-road score may be obtained in the same manner as described above. The instantaneous off-road score may then be forcibly overwritten using the determinable information.
- Even in environments such as darkness or bad weather, which are difficult to judge from image information alone, or when the lens is dirty, the on-road/off-road determination performance can be further improved.
- By using features with high discrimination performance obtained from the in-vehicle sensors, the number of cases in which the travel path can be determined to be on-road or off-road can be increased. That is, the rate at which the traveling road state is correctly determined can be increased.
- FIG. 7 is a block diagram showing a configuration of an information system of a vehicle equipped with the external environment recognition system 1 of the first or second embodiment described above.
- The external environment recognition system 1 of the third embodiment differs from those of the first and second embodiments in that it includes a control unit 206 and an object detection unit 207.
- the same components and processes as those in the first or second embodiment described above are denoted by the same reference numerals, and description thereof is omitted.
- the input / output interface 103 is connected with a camera 301, an in-vehicle sensor 302, and a car navigation system 303.
- The camera 301 is either a wide-angle camera with a wide field of view or a narrow-angle camera with a narrow field of view that can capture distant objects.
- the in-vehicle sensor 302 is a vehicle speed sensor, a steering angle sensor, an acceleration sensor, a yaw rate sensor, a wheel speed sensor, and the like, and provides vehicle behavior information to the external environment recognition system 1.
- In-vehicle sensor 302 may include various control systems such as ABS, TCS, and VDC.
- The car navigation system 303 is a system that provides route guidance from the position of the host vehicle to the destination, and provides road information such as the road type and road width to the external environment recognition system 1.
- the input / output interface 103 is connected to a speaker 311, a display 312, a lamp (LED) 313, and a dirt removing device 314.
- the speaker 311, the display 312, the lamp 313, and the dirt removing device 314 are devices to be controlled by the external environment recognition system 1, and their operations are controlled by the control unit 206.
- Speaker 311 is a device that converts an electrical signal into sound.
- the display 312 is a display device that displays a predetermined message, and includes, for example, a liquid crystal display panel (LCD).
- the lamp 313 is a light emitting element such as a light emitting diode.
- the speaker 311, the display 312, and the lamp 313 are notification devices that notify the driver or passengers that the amount of dirt exceeds a predetermined threshold.
- the speaker 311, the display 312, and the lamp 313 also function as a notification device that notifies a warning regarding a collision object detected by the object detection unit 207.
- the dirt removing device 314 is a device that removes lens dirt of the camera 301, and is a wiper, a washer, an air injection device, or the like.
- the control unit 206 and the object detection unit 207 are implemented by the processor 101 executing a predetermined program.
- The control unit 206 controls the controlled devices (the dirt removing device, the object detection unit, and the notification devices). For example, it obtains the type and amount of deposits on the lens, and when the amount of dirt exceeds a predetermined threshold, it operates the dirt removing device (a wiper, washer, air injection device, or the like for removing lens dirt).
- the object detection unit 207 extracts an object (such as a vehicle, a two-wheeled vehicle, or a pedestrian) that is included in the input image and that may affect the traveling of the vehicle.
- The travel route determination unit 203 may determine whether the travel route is on-road or off-road based on the road type acquired from the car navigation system 303. For example, if the road currently being traveled is an expressway or a general road, it can be determined that the vehicle is on-road.
- When the traveling path determination unit 203 determines that the vehicle is traveling off-road, the object detection unit 207 may raise the determination threshold used for object detection, because a collision object is less likely to exist.
- the traveling road can be determined with high accuracy.
- the present invention is not limited to the above-described embodiments, and includes various modifications and equivalent configurations within the scope of the appended claims.
- the above-described embodiments have been described in detail for easy understanding of the present invention, and the present invention is not necessarily limited to those having all the configurations described.
- a part of the configuration of one embodiment may be replaced with the configuration of another embodiment.
- another configuration may be added, deleted, or replaced.
- Each of the above-described configurations, functions, processing units, processing means, etc. may be partly or entirely realized in hardware, for example by designing them as an integrated circuit, or realized in software by a processor interpreting and executing a program that implements each function.
- Information such as programs, tables, and files that realize each function can be stored in a storage device such as a memory, a hard disk, and an SSD (Solid State Drive), or a recording medium such as an IC card, an SD card, and a DVD.
- the control lines and information lines shown are those considered necessary for the explanation, and do not necessarily represent all control lines and information lines required for implementation. In practice, almost all components can be considered to be connected to each other.
Description
FIG. 1 is a block diagram showing the configuration of the external environment recognition system 1 according to the first embodiment of the present invention.
Next, a second embodiment of the present invention will be described.
Next, a third embodiment of the present invention will be described.
Claims (12)
- A state recognition system for recognizing a state of a travel path on which a vehicle travels, the system comprising: an analysis unit that analyzes an image captured by an in-vehicle camera mounted on the vehicle; a travel path determination unit that determines the state of the travel path based on a result of the analysis by the analysis unit; and a lens diagnosis unit that extracts, from the image, matter adhering to a lens of the in-vehicle camera, determines a type and an amount of the extracted adhering matter, and diagnoses a state of the lens based on a result of the determination.
- The state recognition system according to claim 1, wherein the travel path determination unit determines the state of the travel path based on the image and the state of the lens diagnosed by the lens diagnosis unit.
- The state recognition system according to claim 1, wherein the vehicle has a sensor that detects a traveling state of the vehicle, the state recognition system further comprises a behavior analysis unit that analyzes behavior of the vehicle based on information output from the sensor, and the travel path determination unit determines the state of the travel path based on the image and the behavior of the vehicle analyzed by the behavior analysis unit.
- The state recognition system according to claim 1, wherein the analysis unit has a first analysis unit that analyzes lights in the image and a second analysis unit that analyzes features of a road surface from the image, and the travel path determination unit determines the state of the travel path based on light information analyzed by the first analysis unit and the road surface features analyzed by the second analysis unit.
- The state recognition system according to claim 1, wherein the state recognition system is connected to a navigation system that outputs information on a road on which the vehicle is traveling, and the travel path determination unit determines the state of the travel path based on the image and the road information acquired from the navigation system.
- The state recognition system according to claim 1, wherein the lens diagnosis unit determines whether the adhering matter is water or mud based on the state of the travel path determined by the travel path determination unit.
- A state recognition method for an in-vehicle system to recognize a state of a travel path on which a vehicle travels, wherein the in-vehicle system has a processor that performs predetermined processing by executing a program and a memory that stores the program, the method comprising: an analysis step in which the processor analyzes an image captured by an in-vehicle camera mounted on the vehicle; a travel path determination step in which the processor determines the state of the travel path based on an analysis result of the analysis step; and a lens diagnosis step in which the processor extracts, from the image, matter adhering to a lens of the in-vehicle camera, determines a type and an amount of the extracted adhering matter, and diagnoses a state of the lens based on a result of the determination.
- The state recognition method according to claim 7, wherein, in the travel path determination step, the processor determines the state of the travel path based on the image and the state of the lens diagnosed in the lens diagnosis step.
- The state recognition method according to claim 7, wherein the vehicle has a sensor that detects a traveling state of the vehicle, the method further comprises a behavior analysis step in which the processor analyzes behavior of the vehicle based on information output from the sensor, and, in the travel path determination step, the processor determines the state of the travel path based on the image and the behavior of the vehicle analyzed in the behavior analysis step.
- The state recognition method according to claim 7, wherein the analysis step includes a first analysis step of analyzing lights in the image and a second analysis step of analyzing features of a road surface from the image, and, in the travel path determination step, the processor determines the state of the travel path based on light information analyzed in the first analysis step and the road surface features analyzed in the second analysis step.
- The state recognition method according to claim 7, wherein the in-vehicle system is connected to a navigation system that outputs information on a road on which the vehicle is traveling, and, in the travel path determination step, the processor determines the state of the travel path based on the image and the road information acquired from the navigation system.
- The state recognition method according to claim 7, wherein, in the lens diagnosis step, the processor determines whether the adhering matter is water or mud based on the state of the travel path determined in the travel path determination step.
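As a rough, non-authoritative sketch of the claimed flow (image analysis, lens diagnosis, and a travel path determination that uses the diagnosed lens state), the following Python fragment illustrates the idea. All class names, function names, and numeric thresholds are invented here and are not part of the publication.

```python
from dataclasses import dataclass

@dataclass
class LensDiagnosis:
    deposit_type: str   # e.g. "water" or "mud"
    coverage: float     # fraction of the lens covered by deposits, 0.0-1.0
    usable: bool        # diagnosed lens state

def diagnose_lens(deposit_type: str, coverage: float,
                  max_coverage: float = 0.3) -> LensDiagnosis:
    # Lens diagnosis step: judge the type and amount of adhering matter,
    # then diagnose whether the lens is still usable for recognition.
    return LensDiagnosis(deposit_type, coverage, coverage <= max_coverage)

def determine_travel_path(roughness_score: float,
                          lens: LensDiagnosis) -> str:
    # Travel path determination using both an analyzed image feature
    # (here a single road-surface roughness score) and the diagnosed
    # lens state: with an unusable lens the result is withheld rather
    # than guessed.
    if not lens.usable:
        return "unknown"
    return "off-road" if roughness_score > 0.5 else "on-road"
```

Feeding the determined travel path state back into the lens diagnosis (to decide whether a deposit is more likely water or mud) would close the loop described in claims 6 and 12.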
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/412,385 US9542605B2 (en) | 2012-07-03 | 2013-07-03 | State recognition system and state recognition method |
JP2014523764A JP6133861B2 (ja) | 2012-07-03 | 2013-07-03 | State recognition system and state recognition method |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2012-149322 | 2012-07-03 | ||
JP2012149322 | 2012-07-03 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2014007286A1 (ja) | 2014-01-09 |
Family
ID=49882038
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2013/068241 WO2014007286A1 (ja) | State recognition system and state recognition method | 2012-07-03 | 2013-07-03 |
Country Status (3)
Country | Link |
---|---|
US (1) | US9542605B2 (ja) |
JP (1) | JP6133861B2 (ja) |
WO (1) | WO2014007286A1 (ja) |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2015008566A1 (ja) * | 2013-07-18 | 2015-01-22 | Clarion Co., Ltd. | In-vehicle device |
JP2018195118A (ja) * | 2017-05-18 | 2018-12-06 | Nippon Telegraph and Telephone Corp. | Road surface data collection device, method, and program |
WO2019124014A1 (ja) * | 2017-12-18 | 2019-06-27 | AutoNetworks Technologies, Ltd. | In-vehicle imaging device, cleaning method, and computer program |
US10518751B2 | 2017-09-18 | 2019-12-31 | Ford Global Technologies, Llc | Vehicle image sensor cleaning |
CN110705527A (zh) * | 2019-11-25 | 2020-01-17 | 甘肃建投重工科技有限公司 | High-pressure road-surface pre-cleaning device for a washing-sweeping truck based on automatic machine-vision recognition |
CN111223320A (zh) * | 2020-02-18 | 2020-06-02 | SAIC Volkswagen Automotive Co., Ltd. | V2I-based intelligent driving safety control method for low-adhesion road surfaces |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9566963B2 (en) * | 2015-06-25 | 2017-02-14 | Robert Bosch Gmbh | Method of decreasing braking distance |
JP6611353B2 * | 2016-08-01 | 2019-11-27 | Clarion Co., Ltd. | Image processing device and external environment recognition device |
US10771665B1 (en) * | 2019-02-27 | 2020-09-08 | Ford Global Technologies, Llc | Determination of illuminator obstruction by known optical properties |
GB2584611B (en) * | 2019-05-09 | 2021-10-27 | Jaguar Land Rover Ltd | Apparatus and method for use with a vehicle |
KR102203475B1 * | 2019-08-20 | 2021-01-18 | LG Electronics Inc. | Method and apparatus for controlling a vehicle in an autonomous driving system |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2003255430A (ja) * | 2002-02-28 | 2003-09-10 | Toshiba Corp | Video diagnostic device and video diagnostic system for an in-vehicle video monitoring device |
JP2006315489A (ja) * | 2005-05-11 | 2006-11-24 | Toyota Motor Corp | Vehicle surroundings warning device |
JP2010146284A (ja) * | 2008-12-18 | 2010-07-01 | Mazda Motor Corp | Object detection device for vehicle and driving support system for vehicle |
Family Cites Families (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7103208B2 (en) * | 2002-08-26 | 2006-09-05 | Eastman Kodak Company | Detecting and classifying blemishes on the transmissive surface of an image sensor package |
JP2007071539A | 2005-09-02 | 2007-03-22 | Denso Corp | In-vehicle navigation device |
JP5022609B2 * | 2006-02-27 | 2012-09-12 | Hitachi Automotive Systems, Ltd. | Imaging environment recognition device |
JP2008258982A * | 2007-04-05 | 2008-10-23 | Canon Inc | Image processing apparatus, control method therefor, and program |
JP4991384B2 | 2007-05-08 | 2012-08-01 | Hitachi, Ltd. | Approaching object detection device and approaching object detection program |
JP2009002797A * | 2007-06-21 | 2009-01-08 | Aisin AW Co., Ltd. | Vehicle state determination device |
JP5075152B2 * | 2009-03-24 | 2012-11-14 | Hitachi Automotive Systems, Ltd. | Vehicle control device |
JP5457224B2 | 2010-02-24 | 2014-04-02 | Kyushu Institute of Technology | Road surface state detection device |
JP5704416B2 * | 2011-11-01 | 2015-04-22 | Aisin Seiki Co., Ltd. | Obstacle alarm device |
JP6117634B2 * | 2012-07-03 | 2017-04-19 | Clarion Co., Ltd. | Lens adhering matter detection device, lens adhering matter detection method, and vehicle system |
2013
- 2013-07-03 WO PCT/JP2013/068241 patent/WO2014007286A1/ja active Application Filing
- 2013-07-03 US US14/412,385 patent/US9542605B2/en active Active
- 2013-07-03 JP JP2014523764A patent/JP6133861B2/ja active Active
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2015008566A1 (ja) * | 2013-07-18 | 2015-01-22 | Clarion Co., Ltd. | In-vehicle device |
US10095934B2 (en) | 2013-07-18 | 2018-10-09 | Clarion Co., Ltd. | In-vehicle device |
JP2018195118A (ja) * | 2017-05-18 | 2018-12-06 | Nippon Telegraph and Telephone Corp. | Road surface data collection device, method, and program |
US10518751B2 (en) | 2017-09-18 | 2019-12-31 | Ford Global Technologies, Llc | Vehicle image sensor cleaning |
WO2019124014A1 (ja) * | 2017-12-18 | 2019-06-27 | AutoNetworks Technologies, Ltd. | In-vehicle imaging device, cleaning method, and computer program |
CN110705527A (zh) * | 2019-11-25 | 2020-01-17 | 甘肃建投重工科技有限公司 | High-pressure road-surface pre-cleaning device for a washing-sweeping truck based on automatic machine-vision recognition |
CN110705527B (zh) * | 2019-11-25 | 2023-03-03 | 甘肃建投重工科技有限公司 | High-pressure road-surface pre-cleaning device for a washing-sweeping truck based on automatic machine-vision recognition |
CN111223320A (zh) * | 2020-02-18 | 2020-06-02 | SAIC Volkswagen Automotive Co., Ltd. | V2I-based intelligent driving safety control method for low-adhesion road surfaces |
Also Published As
Publication number | Publication date |
---|---|
US9542605B2 (en) | 2017-01-10 |
US20150169967A1 (en) | 2015-06-18 |
JP6133861B2 (ja) | 2017-05-31 |
JPWO2014007286A1 (ja) | 2016-06-02 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6133861B2 (ja) | State recognition system and state recognition method | |
US11390276B2 (en) | Control device, control method, and non-transitory storage medium | |
JP7291129B2 (ja) | Method and device for recognizing and evaluating environmental influences based on road surface conditions and weather | |
JP7499256B2 (ja) | System and method for classifying driver behavior | |
US11688174B2 (en) | System and method for determining vehicle data set familiarity | |
US9218535B2 (en) | Arrangement and method for recognizing road signs | |
EP2549457B1 (en) | Vehicle-mounting vehicle-surroundings recognition apparatus and vehicle-mounting vehicle-surroundings recognition system | |
JP5981322B2 (ja) | In-vehicle image processing device | |
WO2014007175A1 (ja) | In-vehicle environment recognition device | |
WO2014017403A1 (ja) | In-vehicle image recognition device | |
JP2015095886A (ja) | Surrounding environment recognition device | |
US9965690B2 (en) | On-vehicle control device | |
US20110074955A1 (en) | Method and system for weather condition detection with image-based road characterization | |
CN104507765A (zh) | Camera device, three-dimensional object detection device, and lens cleaning method | |
US10839263B2 | System and method for evaluating a trained vehicle data set familiarity of a driver assistance system | |
US20140347487A1 (en) | Method and camera assembly for detecting raindrops on a windscreen of a vehicle | |
JP4940177B2 (ja) | Traffic flow measurement device | |
Kuo et al. | Vision-based vehicle detection in the nighttime | |
JPWO2019174682A5 (ja) | ||
US20190057272A1 (en) | Method of detecting a snow covered road surface | |
CN114255452A (zh) | Target ranging method and device | |
Yang | An enhanced three-frame-differencing approach for vehicle detection under challenging environmental conditions | |
Guan et al. | Development of advanced driver assist systems using smart phones | |
CN117409600A (zh) | Emergency stopping lane occupancy early-warning method based on keypoint pose detection | |
Persson Rukin | Active safety sensor verification-Development of image analysis support tool |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 13812985 Country of ref document: EP Kind code of ref document: A1 |
ENP | Entry into the national phase |
Ref document number: 2014523764 Country of ref document: JP Kind code of ref document: A |
WWE | Wipo information: entry into national phase |
Ref document number: 14412385 Country of ref document: US |
NENP | Non-entry into the national phase |
Ref country code: DE |
122 | Ep: pct application non-entry in european phase |
Ref document number: 13812985 Country of ref document: EP Kind code of ref document: A1 |