WO2018190362A1 - Method and device for detecting pedestrians around a vehicle - Google Patents

Method and device for detecting pedestrians around a vehicle

Info

Publication number
WO2018190362A1
WO2018190362A1 (PCT/JP2018/015194; JP2018015194W)
Authority
WO
WIPO (PCT)
Prior art keywords
rider
frame
vehicle
pedestrians
riders
Prior art date
Application number
PCT/JP2018/015194
Other languages
English (en)
Japanese (ja)
Inventor
依若 戴
楊 張
孝一 照井
健 志磨
Original Assignee
日立オートモティブシステムズ株式会社 (Hitachi Automotive Systems, Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 日立オートモティブシステムズ株式会社 (Hitachi Automotive Systems, Ltd.)
Priority to JP2019512544A priority Critical patent/JP6756908B2/ja
Publication of WO2018190362A1 publication Critical patent/WO2018190362A1/fr

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/103 Static body considered as a whole, e.g. static pedestrian or occupant recognition
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/16 Anti-collision systems
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/16 Anti-collision systems
    • G08G1/166 Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes

Definitions

  • the present invention relates to a method and apparatus for detecting pedestrians around a vehicle.
  • Patent Document 1 discloses a rider detection algorithm in which, using an on-board camera as a sensor, a human and a bicycle part are first detected, and whether the human is a rider is then determined from the spatial positional relationship between the two.
  • However, since Patent Document 1 requires detection of the circular wheel of a bicycle, it cannot be applied when the bicycle wheel is not visible, and it cannot then determine whether a pedestrian is a rider.
  • The operating environment and conditions of in-vehicle sensors are very severe, and the time available to avoid a collision when the vehicle is traveling at high speed is extremely short. The in-vehicle collision avoidance system is therefore required to offer strong real-time performance and high processing speed.
  • In an actual traffic situation there are many light vehicles and motor vehicles, as well as many pedestrians and stationary objects such as barricades. A plurality of riders may therefore be detected at the same time.
  • An algorithm with complicated computation and a large calculation amount often cannot satisfy real-time requirements. Under the prior art it therefore becomes difficult to detect a plurality of riders in real time.
  • The present invention therefore provides a method and an apparatus for detecting pedestrians around a vehicle that can quickly distinguish a plurality of riders from a plurality of pedestrians and can detect the plurality of riders in real time.
  • An apparatus for detecting pedestrians around a vehicle comprises: an acquisition unit that acquires image information around the vehicle from an imaging device of the vehicle and detects the current vehicle speed of the vehicle; an initial candidate frame parameter calculation unit that calculates, based on the image information, initial candidate frame parameters at each time point for each of a plurality of pedestrians around the vehicle, including an orientation that is the angle of the pedestrian with respect to the imaging device; an adjustment unit that adjusts the initial candidate frame parameters based on the orientation to obtain adjusted candidate frame parameters at each time point; and a classification unit that classifies the plurality of pedestrians based on the adjusted candidate frame parameters.
  • The initial candidate frame parameters further include the frame centroid coordinates, the frame width w, and the frame height h.
  • The adjusted candidate frame parameters include the frame centroid coordinates, the adjusted frame width w′, the frame height h, and the orientation θ.
  • the classification unit sets the adjusted candidate frame parameter at each time point of each of the one or more riders as the rider detection frame parameter.
  • the classification unit uses the initial candidate frame parameters of each of the one or more pedestrians as pedestrian detection frame parameters at the respective time points of the one or more pedestrians
  • The risk factor setting unit sets a risk factor for each pedestrian among the one or more pedestrians based on the pedestrian detection frame parameters, the state information of each pedestrian, and the change frequency of the orientation.
  • The risk factor setting unit acquires the relative speed and the relative movement direction of each rider with respect to the vehicle based on the frame centroid coordinates of the rider detection frame parameters at the previous time point and those at the current time point.
  • The risk factor setting unit determines whether each rider is a new rider. If the rider is a new rider, the risk factor is set to 1; if the rider is not a new rider, the risk factor is set based on the frame centroid coordinates and orientation of the current rider detection frame parameters, the relative speed, the relative movement direction, the state information, and the change frequency of the orientation.
  • the state information is a normal state or an abnormal state
  • When the risk factor setting unit determines, based on the frame height of the rider detection frame parameters, that the rider is a child, or determines an abnormality from the image information, the state information is determined to be an abnormal state.
  • A method for detecting pedestrians around a vehicle includes a first step of acquiring image information around the vehicle from an imaging device of the vehicle and detecting the current vehicle speed of the vehicle, with further steps performed based on the image information.
  • A risk factor is set for each rider among the one or more riders based on the rider detection frame parameters, the state information of each of the one or more riders, and the orientation change frequency.
  • FIG. 1 is a structural diagram of an apparatus for detecting pedestrians around a vehicle according to an embodiment of the present invention.
  • FIG. 2 is a flowchart of a method for detecting pedestrians around a vehicle according to an embodiment of the present invention. FIG. 3(a) is a schematic diagram of an initial candidate frame at time t1 of one pedestrian among a plurality of pedestrians according to an embodiment of the present invention. FIG. 3(b) is a schematic diagram of the adjusted candidate frame of the pedestrian shown in FIG. 3(a). FIG. 4(a) is a schematic diagram of an initial candidate frame at time t1 of another pedestrian among the plurality of pedestrians. FIG. 4(b) is a schematic diagram of the adjusted candidate frame of the other pedestrian shown in FIG. 4(a). FIG. 5 is a schematic diagram showing the definition of the orientation.
  • FIG. 1 is a structural diagram of an apparatus 10 for detecting pedestrians around a vehicle according to an embodiment of the present invention.
  • the device 10 for detecting pedestrians around a vehicle includes an acquisition unit 11, an initial candidate frame parameter calculation unit 12, an adjustment unit 13, a classification unit 14, and a risk coefficient setting unit 15.
  • FIG. 2 is a flowchart of a method for detecting pedestrians around a vehicle according to an embodiment of the present invention.
  • the acquisition unit 11 acquires image information around the vehicle from the vehicle imaging device and detects the current vehicle speed of the vehicle.
  • The imaging device (not shown) is one or more camera systems attached to appropriate positions on the vehicle (the upper end of the windshield, the rear end of the vehicle tail, or both sides of the vehicle body), which collect and store image information from the front, rear, and both sides of the vehicle.
  • a camera system includes an optical system and a camera.
  • the optical system may have a zoom function, an autofocus function, and the like, and a color CCD (charge coupled device) video camera may be used as the camera.
  • The initial candidate frame parameter calculation unit 12 calculates, based on the image information, initial candidate frame parameters at each time point for the plurality of pedestrians around the vehicle, including the orientation θ, which is the angle of the pedestrian's torso with respect to the imaging device.
  • FIG. 5 is a schematic diagram showing the definition of the orientation θ.
  • The orientation θ is 0 when the pedestrian faces right with respect to the imaging device, π/2 when the pedestrian faces the imaging device (front-facing), π or −π when the pedestrian faces left, and −π/2 when the pedestrian faces away from the imaging device (back-facing).
  • Other orientations are output according to the actual angle and normalized to [−π, π].
  • Orientations falling within a given angular region may be grouped into a single class, as shown in FIG. 5.
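  • As an illustrative sketch (not part of the disclosed embodiment), the normalization to [−π, π] and the grouping of orientations into angular classes described above can be written as follows; the eight equal 45° classes are an assumption standing in for classes I to VIII of FIG. 5, which is not reproduced here.

```python
import math

def normalize_orientation(theta):
    """Wrap an orientation angle into [-pi, pi], as the text describes."""
    return math.atan2(math.sin(theta), math.cos(theta))

def orientation_class(theta, n_classes=8):
    """Bin a normalized orientation into one of n_classes equal sectors
    (a hypothetical stand-in for classes I to VIII in FIG. 5)."""
    theta = normalize_orientation(theta)
    width = 2 * math.pi / n_classes
    # Shift by half a sector so each class is centered on a canonical
    # direction (0 = right-facing, pi/2 = front-facing, ...).
    return int(math.floor((theta + width / 2) / width)) % n_classes
```

Under this assumed layout, class 0 corresponds to the right-facing orientation (θ = 0) and class 2 to the front-facing orientation (θ = π/2).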
  • the initial candidate frame parameters further include frame centroid coordinates, frame width w, and frame height h.
  • FIG. 3A is a schematic diagram of the initial candidate frame F at the time point t1 of one pedestrian among a plurality of pedestrians according to an embodiment of the present invention.
  • The parameters include the frame centroid coordinates (x_t1, y_t1, z_t1), the frame width w_t1, the frame height h_t1, and the orientation θ_t1 (not shown).
  • The frame centroid coordinates are expressed in the vehicle's world coordinate system.
  • the coordinate origin is a position directly below the middle of the vehicle head.
  • For example, feature extraction methods (dense features: DPM deformable part models, ACF aggregated channel features, HOG histograms of oriented gradients, dense edges, etc.; sparse features: size, shape, sparse edges, body parts, gait, texture, grayscale/edge symmetry, etc.), contour template matching, and the like can be used.
  • An initial candidate frame including the frame centroid coordinates, frame width, and frame height is calculated, for example by a deep learning algorithm of machine learning.
  • the orientation of the pedestrian is analyzed by a deep learning algorithm or a normal machine learning algorithm.
  • Different models based on dedicated orientation detection networks or on different histograms of oriented gradients can be used to divide the input pedestrian initial candidate frames into different orientations.
  • This type of method is called a cascade model.
  • Alternatively, the frame centroid coordinates, frame width, frame height, and orientation can be calculated together by the same deep learning network or ordinary machine learning model, that is, by a single-model algorithm.
  • In that case, the pedestrian orientation is trained and detected as part of the regression loss function in the fully connected layers at the end of the deep learning network.
  • the initial candidate frame parameter calculation unit 12 can identify all pedestrians from the image information by the above calculation.
  • Here, "pedestrian" in the broad sense includes riders and/or pedestrians on foot.
  • the adjustment unit 13 adjusts the initial candidate frame parameter based on the orientation, and acquires the adjusted candidate frame parameter at each time point.
  • FIG. 3(b) is a schematic diagram of the adjusted candidate frame of the pedestrian shown in FIG. 3(a).
  • The adjustment unit 13 adjusts the initial candidate frame parameters of the initial candidate frame F at time t1 of the pedestrian, and acquires the adjusted candidate frame F′ at time t1.
  • The candidate frame parameters of the adjusted candidate frame F′ include the frame centroid coordinates (x_t1, y_t1, z_t1), the adjusted frame width w′_t1, the frame height h_t1, and the orientation θ_t1 (not shown).
  • The adjustment unit 13 changes the frame width w_t1 of the initial candidate frame F based on the frame height h_t1 and the orientation θ_t1 to obtain the adjusted frame width w′_t1 of the candidate frame F′, while keeping the other parameters constant. In this way, the adjusted candidate frame parameters at time t1 of each pedestrian among the plurality of pedestrians can be acquired.
  • In some cases, the size of the initial candidate frame is not changed; that is, the adjusted candidate frame parameters are identical to the initial candidate frame parameters.
  • In other cases, the frame height of the initial candidate frame is kept constant and the frame width is adjusted to be equal to the frame height.
  • In still other cases, the frame height of the initial candidate frame is kept constant and the frame width is adjusted to half the frame height.
  • The change (adjustment) is performed on the premise that the frame centroid coordinates are held constant. As shown in FIG. 3(b), once the frame width changes, the coordinates of the four vertices of the adjusted candidate frame change correspondingly.
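  • The width-adjustment rules above can be sketched as follows. The mapping from orientation to rule (side-facing frames widened to the frame height, front/back-facing frames narrowed to half the height, others unchanged) is an assumption; the embodiment lists the rules but does not state which orientation triggers which.

```python
import math

def adjust_candidate_frame(centroid, w, h, theta):
    """Adjust the frame width based on orientation theta, keeping the
    centroid and frame height constant. The orientation-to-rule mapping
    below is hypothetical; only the three rules come from the text."""
    c = abs(math.cos(theta))
    if c > math.cos(math.pi / 8):       # within 22.5 deg of side-facing
        w_adj = h                       # widen to the frame height
    elif c < math.sin(math.pi / 8):     # within 22.5 deg of front/back-facing
        w_adj = h / 2                   # narrow to half the frame height
    else:
        w_adj = w                       # leave the frame unchanged
    return centroid, w_adj, h, theta
```

The centroid is returned untouched, matching the premise that the frame centroid coordinates are held constant during adjustment.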
  • In step S24, the classification unit 14 classifies the plurality of pedestrians into one or more riders and one or more pedestrians based on the adjusted candidate frame parameters, and obtains rider detection frame parameters at each time point for each of the one or more riders.
  • The classification unit 14 processes the adjusted candidate frame parameters of each pedestrian with, for example, a deep learning classification algorithm, and classifies the pedestrians based on the result. Since the specific calculation and classification method is the same as conventional methods, a redundant description is omitted.
  • FIG. 4(a) is a schematic diagram of an initial candidate frame G at time t1 of another pedestrian among the plurality of pedestrians according to the embodiment of the present invention, and FIG. 4(b) is a schematic diagram of the adjusted candidate frame G′ of that pedestrian.
  • For example, the classification unit 14 classifies the pedestrian in FIGS. 3(a) and 3(b) as a rider (hereinafter, "rider A") and the other pedestrian in FIGS. 4(a) and 4(b) as a pedestrian (hereinafter, "pedestrian B").
  • the classification unit 14 uses the adjusted candidate frame parameter at each time point of each of one or more riders as the rider detection frame parameter.
  • For example, the classification unit 14 uses the candidate frame parameters of the adjusted candidate frame F′ at time t1 shown in FIG. 3(b) as rider A's rider detection frame parameters. That is, the rider detection frame parameters include the frame centroid coordinates (x_t1, y_t1, z_t1), the frame width w′_t1, the frame height h_t1, and the orientation θ_t1 (not shown).
  • the classification unit 14 sets the initial candidate frame parameter of each of one or more pedestrians as the pedestrian detection frame parameter at each time point of one or more pedestrians.
  • For example, the classification unit 14 uses the initial candidate frame parameters of the initial candidate frame G at time t1 shown in FIG. 4(a) as pedestrian B's pedestrian detection frame parameters.
  • For pedestrians, the pedestrian detection frame parameters are thus the same as the initial candidate frame parameters; in other words, the frame width of the pedestrian detection frame is smaller than that of the adjusted candidate frame, which narrows the detection range for pedestrians.
  • The risk factor setting unit 15 sets a risk factor for each rider among the one or more riders based on the rider detection frame parameters, the state information of each rider, and the change frequency of the orientation.
  • Specifically, the risk factor setting unit 15 acquires the relative speed and relative movement direction of each rider with respect to the vehicle based on the frame centroid coordinates of the rider detection frame parameters at the previous time point and those at the current time point.
  • the previous time is t1, for example, and the current time is t2, for example.
  • The frame centroid coordinates of the rider detection frame parameters at t1 are, for example, (x_t1, y_t1, z_t1), and those at t2 are, for example, (x_t2, y_t2, z_t2).
  • The relative speed v and the relative movement direction r of rider A with respect to the vehicle are acquired from the amount of change between (x_t1, y_t1, z_t1) and (x_t2, y_t2, z_t2).
  • the relative speed and relative movement direction of each of the other riders can be obtained by a similar method.
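  • A minimal sketch of deriving the relative speed v and relative movement direction r from the change in frame centroid coordinates between t1 and t2; the sampling interval dt and the planar definition of r are assumptions, since the excerpt does not specify them:

```python
import math

def relative_motion(c_prev, c_curr, dt):
    """Relative speed v and planar movement direction r of a rider with
    respect to the vehicle, from frame centroid coordinates at two time
    points. dt is the assumed interval t2 - t1 in seconds."""
    dx = c_curr[0] - c_prev[0]
    dy = c_curr[1] - c_prev[1]
    dz = c_curr[2] - c_prev[2]
    v = math.sqrt(dx * dx + dy * dy + dz * dz) / dt  # speed magnitude
    r = math.atan2(dy, dx)                           # ground-plane direction
    return v, r
```

The same function applies unchanged to every other tracked rider, matching the statement that the remaining riders are handled by a similar method.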
  • The risk factor setting unit 15 determines whether each rider is a new rider. If the rider is a new rider, the risk factor is set to 1; if not, the risk factor is set based on the frame centroid coordinates and orientation of the current rider detection frame parameters, the relative speed, the relative movement direction, the state information, and the orientation change frequency.
  • Specifically, it is first determined whether rider detection frame parameters exist for rider A at the previous time point t1. If they do not exist, rider A is a new rider, that is, a rider newly appearing at the current time point t2; the risk factor W is set to 1, and the next rider is evaluated.
  • When it is determined that rider A is not a new rider, the risk factor W is initialized to 0, and the risk factor is set for rider A based on the following Condition 1 to Condition 5.
  • Condition 1 to Condition 5 will be described in detail.
  • Condition 1: It is determined whether the distance obtained from the frame centroid coordinates of the rider detection frame parameters at the current time point t2 is smaller than the alarm safety distance d_w, which is calculated from the following equation (1).
  • v is the relative speed of rider A
  • t_f is the braking time required for the vehicle
  • t_d is the response time of the vehicle driver
  • μ is the friction coefficient of the road on which the vehicle is located
  • g is the gravitational acceleration
  • v_h is the current vehicle speed detected by the acquisition unit 11 in step S21
  • The value of the friction coefficient μ of the road can be determined based on the captured image information.
  • the coordinate origin is a position directly below the middle of the vehicle head, and in this case, the distance between the frame center of gravity coordinates and the coordinate origin is the distance between the rider and the middle position of the vehicle head.
  • Since the frame centroid coordinates of rider A's rider detection frame parameters at the current time point t2 are (x_t2, y_t2, z_t2), the distance D between rider A and the middle position of the vehicle front can be obtained.
  • It is then determined whether the distance D is smaller than the alarm safety distance d_w.
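  • As an illustration only: the excerpt names the variables of equation (1) but does not reproduce the formula itself, so the function below uses an assumed common form of a warning distance (driver-response travel, braking travel, and the rider's closing motion over the whole interval), not the patent's exact equation.

```python
def alarm_safety_distance(v_rel, v_h, t_f, t_d, mu, g=9.81):
    """Hypothetical reconstruction of equation (1): distance covered while
    the driver responds, distance to brake to a stop, plus the rider's
    closing motion during the interval. An assumed form, not the
    patent's exact formula."""
    reaction = v_h * t_d                 # travel during driver response
    braking = v_h ** 2 / (2 * mu * g)    # travel while braking to a stop
    closing = v_rel * (t_f + t_d)        # rider closing in meanwhile
    return reaction + braking + closing

def condition_1(distance_d, d_w):
    """Condition 1: the rider's distance D is below the alarm distance."""
    return distance_d < d_w
```

For a vehicle at 10 m/s with t_f = 1.0 s, t_d = 0.5 s, and μ = 0.7, this assumed form yields an alarm distance of roughly 12 m.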
  • Condition 2: It is determined from the relative movement direction of rider A whether rider A is approaching the vehicle, for example based on the calculated relative movement direction r.
  • Condition 3: It is determined whether the state information of rider A indicates an abnormal state.
  • The state information is either a normal state or an abnormal state. The risk factor setting unit 15 determines the state information to be abnormal when it determines, based on the frame height of the rider detection frame parameters, that the rider is a child, or when it determines from the image information that the rider is too close to another rider or is interfered with by an external object.
  • For example, since the frame height of rider A's rider detection frame parameters at the current time point t2 is h_t2, whether rider A is a child is determined from h_t2; if rider A is a child, the state is determined to be abnormal.
  • The distance between rider A and another rider, for example rider C, is the distance D_AC between the frame centroid coordinates of rider A and those of rider C.
  • The predetermined safety distance d_s is obtained, for example, from the following equation (3).
  • k is the relative speed between rider A and rider C
  • t′_f is the braking time required for the rider
  • t′_d is the response time of the rider
  • μ is the friction coefficient of the road on which rider A is located (obtained, for example, from equation (2) above)
  • g is the gravitational acceleration
  • k can be acquired from the amount of change between the frame centroid coordinates of rider A and those of rider C.
  • The distance D_AC is compared with the predetermined safety distance d_s. If D_AC is smaller than d_s, the state is determined to be abnormal; if D_AC is equal to or greater than d_s, the state is determined to be normal.
  • Whether rider A is interfered with by an external object is determined, for example, by whether an unknown object appears within a predetermined angle range centered on the orientation θ of rider A. If such an object appears, the state is determined to be abnormal; otherwise, it is determined to be normal.
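  • The three abnormality checks of Condition 3 can be sketched as follows; the child-height threshold and the boolean result of the external-object check are assumed inputs, since the excerpt gives neither concrete values nor the object-detection step itself:

```python
import math

def centroid_distance(centroid_a, centroid_c):
    """Distance D_AC between the frame centroids of rider A and rider C."""
    return math.dist(centroid_a, centroid_c)

def is_abnormal_state(frame_height, child_height_threshold,
                      d_ac, d_s, object_in_orientation_cone):
    """Condition 3: abnormal if the frame height suggests a child, if
    another rider is closer than the safety distance d_s, or if an
    unknown object appears within the cone around the rider's
    orientation (the last flag is assumed precomputed)."""
    if frame_height < child_height_threshold:
        return True          # rider judged to be a child
    if d_ac < d_s:
        return True          # too close to another rider
    return object_in_orientation_cone
```

The child-height threshold (e.g., a value derived from h_t2 statistics) is hypothetical; the patent only states that a child determination is made from the frame height.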
  • Condition 4: It is determined whether rider A can see the vehicle, based on the orientation θ of rider A at the current time point t2.
  • If the orientation θ of the rider is directly rearward, left-rearward, right-rearward, directly leftward, or directly rightward, that is, if θ belongs to any of classes VI, V, VII, IV, and VIII shown in FIG. 5, it is determined that the rider cannot see the vehicle; otherwise, it is determined that the rider can see the vehicle.
  • Condition 5: It is determined whether the orientation change frequency is greater than a predetermined threshold.
  • First, the orientation change frequency is calculated.
  • The predetermined threshold is obtained, for example, from statistical data collected on a large number of riders. It is then determined whether the orientation change frequency is greater than the predetermined threshold.
  • Regarding Conditions 4 and 5, when it is determined that the vehicle is visible to the rider, or that the orientation change frequency is greater than the predetermined threshold, 1 is further added to the risk factor determined based on the preceding conditions.
  • each rider's risk factor at each point in time can be obtained, each rider can be tracked according to the risk factor, and an alarm can be issued to the vehicle if necessary.
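  • The overall per-rider flow can be sketched as below. Treating each of Conditions 1 to 3 as adding 1 to W is an assumption; the excerpt states only that W starts at 1 for new riders and at 0 otherwise, and that a satisfied Condition 4 or 5 adds a further 1.

```python
def set_risk_factor(is_new_rider, within_alarm_distance, approaching,
                    abnormal_state, cond4_or_cond5):
    """Risk factor W for one rider. New riders get W = 1; otherwise W is
    initialized to 0, Conditions 1-3 each add 1 (assumed increments),
    and a satisfied Condition 4 or 5 adds a further 1."""
    if is_new_rider:
        return 1
    w = 0
    w += int(within_alarm_distance)   # Condition 1
    w += int(approaching)             # Condition 2
    w += int(abnormal_state)          # Condition 3
    if cond4_or_cond5:                # Condition 4 or Condition 5
        w += 1
    return w
```

The resulting W per time point would then drive tracking priority and, if necessary, an alarm to the vehicle, as the text describes.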
  • The risk factor setting unit 15 can further set a risk factor for each pedestrian among the one or more pedestrians based on the pedestrian detection frame parameters, the state information of each pedestrian, and the change frequency of the orientation.
  • For example, a risk factor can be set for pedestrian B by the rider risk factor setting method described above.
  • In this way, riders and pedestrians can be quickly distinguished among the detected pedestrians, and a plurality of riders can be detected in real time.
  • the present invention is not limited to the above-described embodiments, and includes various modifications.
  • the above-described embodiments have been described in detail for easy understanding of the present invention, and are not necessarily limited to those having all the configurations described.

Abstract

The invention relates to a method and a device for detecting pedestrians around a vehicle, the method and device being capable of quickly distinguishing, among pedestrians, a plurality of riders from a plurality of pedestrians and of detecting a plurality of riders in real time. A device for detecting pedestrians around a vehicle comprises: an acquisition unit for acquiring image information around the vehicle from an imaging device on the vehicle and detecting a current vehicle speed; an initial candidate frame parameter calculation unit for calculating, based on the image information, initial candidate frame parameters of the pedestrians around the vehicle, each including an orientation that is the angle of the pedestrian with respect to the imaging device; an adjustment unit for adjusting the initial candidate frame parameters based on the orientations to acquire adjusted candidate frame parameters; a classification unit for classifying the plurality of pedestrians into riders and pedestrians based on the adjusted candidate frame parameters to acquire rider detection frame parameters of the riders; and a risk coefficient setting unit for setting a risk coefficient for each rider based on the rider detection frame parameters, state information on each rider, and the frequency of change of the orientation.
PCT/JP2018/015194 2017-04-12 2018-04-11 Method and device for detecting pedestrians around a vehicle WO2018190362A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2019512544A JP6756908B2 (ja) 2017-04-12 2018-04-11 Method and device for detecting pedestrians around a vehicle

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201710234918.6 2017-04-12
CN201710234918.6A CN108694363A (zh) 2017-04-12 2017-04-12 Method and device for detecting pedestrians around a vehicle

Publications (1)

Publication Number Publication Date
WO2018190362A1 true WO2018190362A1 (fr) 2018-10-18

Family

ID=63792644

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/015194 WO2018190362A1 (fr) Method and device for detecting pedestrians around a vehicle

Country Status (3)

Country Link
JP (1) JP6756908B2 (fr)
CN (1) CN108694363A (fr)
WO (1) WO2018190362A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111856493A (zh) * 2019-04-25 2020-10-30 北醒(北京)光子科技有限公司 Lidar-based camera triggering device and method

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109409309A (zh) * 2018-11-05 2019-03-01 电子科技大学 Intelligent alarm system and method based on human body detection
SG11201811455RA (en) 2018-11-09 2020-06-29 Beijing Didi Infinity Technology & Development Co Ltd System and method for detecting in-vehicle conflicts
CN111429754A (zh) * 2020-03-13 2020-07-17 南京航空航天大学 Risk assessment method for vehicle collision avoidance trajectories in pedestrian crossing scenarios
CN115527074B (zh) * 2022-11-29 2023-03-07 深圳依时货拉拉科技有限公司 Method, device, and computer equipment for generating a vehicle detection frame

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009053925A * 2007-08-27 2009-03-12 Toyota Motor Corp Behavior prediction device
WO2017056382A1 * 2015-09-29 2017-04-06 ソニー株式会社 Information processing device, information processing method, and program
WO2017158983A1 * 2016-03-18 2017-09-21 株式会社Jvcケンウッド Object recognition device, object recognition method, and program
JP2017194432A * 2016-04-22 2017-10-26 株式会社デンソー Object detection device and object detection method

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105216792A * 2014-06-12 2016-01-06 株式会社日立制作所 Method and device for recognizing and tracking obstacle targets in the surrounding environment
EP3282392B1 * 2015-08-28 2021-09-29 Veoneer Sweden AB Vision system and method for a motor vehicle

Also Published As

Publication number Publication date
JPWO2018190362A1 (ja) 2020-01-16
CN108694363A (zh) 2018-10-23
JP6756908B2 (ja) 2020-09-16

Similar Documents

Publication Publication Date Title
WO2018190362A1 (fr) Method and device for detecting pedestrians around a vehicle
US11390276B2 (en) Control device, control method, and non-transitory storage medium
US20220073068A1 (en) Rider assistance system and method
CN106485233B (zh) 可行驶区域检测方法、装置和电子设备
KR101891460B1 (ko) 차도 위의 반사체를 인식하고 평가하기 위한 방법 및 장치
CN111033510A (zh) 用于运行驾驶员辅助系统的方法和装置以及驾驶员辅助系统和机动车
LU101647B1 (en) Road pedestrian classification method and top-view pedestrian risk quantitative method in two-dimensional world coordinate system
Smaldone et al. The cyber-physical bike: A step towards safer green transportation
GB2538572A (en) Safety system for a vehicle to detect and warn of a potential collision
CN113044059A (zh) 用于车辆的安全系统
US9870513B2 (en) Method and device for detecting objects from depth-resolved image data
WO2009101660A1 (fr) Dispositif de surveillance de périphérie de véhicule, véhicule et programme de surveillance de périphérie de véhicule
JP2023532045A (ja) マイクロモビリティユーザーのリスクを判断するための外観と動きに基づくモデル
Rajendar et al. Prediction of stopping distance for autonomous emergency braking using stereo camera pedestrian detection
CN111081045A (zh) 姿态轨迹预测方法及电子设备
CN105679090B (zh) 一种基于智能手机的夜间司机驾驶辅助方法
Michalke et al. Towards a closer fusion of active and passive safety: Optical flow-based detection of vehicle side collisions
JP6764378B2 (ja) 車外環境認識装置
WO2018212090A1 (fr) Dispositif de commande et procédé de commande
JP6171608B2 (ja) 物体検出装置
KR102345798B1 (ko) 교차로 꼬리물기 인지 및 영상 저장 장치
JP7454685B2 (ja) 車両走行路内のデブリの検出
Rammohan et al. Automotive Collision Avoidance System: A Review
KR20150092505A (ko) 차간 거리 경고 방법 및 전방 충돌 감지 장치
Khandelwal et al. Automatic braking system for two wheeler with object detection and depth perception

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18784623

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2019512544

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18784623

Country of ref document: EP

Kind code of ref document: A1