WO2016163590A1 - Infrared-image-based vehicle assistance method and device - Google Patents

Infrared-image-based vehicle assistance method and device

Info

Publication number
WO2016163590A1
Authority
WO
WIPO (PCT)
Prior art keywords
collision
vehicle
infrared image
possibility
target
Prior art date
Application number
PCT/KR2015/005573
Other languages
English (en)
Korean (ko)
Inventor
박광일
임상묵
김진혁
Original Assignee
(주)피엘케이 테크놀로지
Priority date
Filing date
Publication date
Application filed by (주)피엘케이 테크놀로지
Publication of WO2016163590A1

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60W: CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W10/00: Conjoint control of vehicle sub-units of different type or different function
    • B60W10/20: Conjoint control of vehicle sub-units of different type or different function including control of steering systems
    • B60W30/00: Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
    • B60W30/08: Active safety systems predicting or avoiding probable or impending collision or attempting to minimise its consequences
    • B60W40/00: Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/02: Estimation or calculation of non-directly measurable driving parameters related to ambient conditions
    • B60W50/00: Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08: Interaction between the driver and the control system
    • B60W50/14: Means for informing the driver, warning the driver or prompting a driver intervention

Definitions

  • the present invention relates to vehicle assistance technology based on infrared images and, more particularly, to an infrared-image-based vehicle assistance apparatus and method capable of detecting, through an infrared camera, an object that may collide with a vehicle.
  • vehicles are equipped with a variety of convenience features to provide the user with a more stable and comfortable driving experience.
  • vehicle safety devices include active safety devices, such as the antilock braking system (ABS) and electronically controlled suspension (ECS), which prevent accidents and dangers in advance, and passive safety devices, such as vehicle black boxes, which help determine the cause of an accident.
  • Korean Patent Laid-Open No. 10-2013-0046136 discloses a vehicle collision avoidance system using an infrared depth sensor and a radar and a method thereof.
  • in that system, the infrared depth sensor installed in the vehicle makes the primary determination of the presence of an obstacle; when an obstacle is confirmed, the radar transmit pulse period is shortened below the default period and the radar received-signal-strength detection reference level is lowered below the default level.
  • Korean Patent Application Publication No. 10-1995-0011231 provides a vehicle collision avoidance apparatus using infrared rays, and a method thereof, in which an object detection sensor using an infrared light-emitting diode and a photodiode is mounted on the left and right sides of a vehicle, separately from the side mirrors, to monitor blind spots outside the field of view of the side mirrors. This technique allows the driver to monitor blind spots that are not visible in the side mirrors.
  • An embodiment of the present invention provides an infrared-image-based vehicle assistance apparatus that can detect, through an infrared camera, a target that may collide with the vehicle.
  • An embodiment of the present invention provides an infrared-image-based vehicle assistance apparatus that can determine whether an object in the infrared image is a pedestrian according to its possibility of movement and, in the case of a pedestrian, determine the possibility of collision with the vehicle.
  • An embodiment of the present invention provides an infrared-image-based vehicle assistance apparatus that can determine the possibility of collision with the vehicle by evaluating the proximity of objects in the infrared image to the vehicle's expected movement path.
  • the vehicle assistance apparatus based on an infrared image may include an infrared image receiver configured to receive a series of infrared images, a collision target detector configured to detect a collision target from the latest infrared image among the infrared images, and a collision possibility determination unit configured to determine an expected movement path of the vehicle based on the steering angle of the steering wheel and to determine the possibility of collision with the collision target based on the expected movement path.
  • the collision target detection unit may extract an object from the latest infrared image, determine the possibility of movement of the extracted object, and determine an object that may move as the collision target.
  • the collision target detector may determine a collision object that has moved or is moving from the series of infrared images.
  • the collision possibility determination unit may determine the expected movement path of the vehicle based on the steering angle of the steering wheel received from the vehicle.
  • the collision probability determination unit may calculate an anticipated movement path of the vehicle by calculating a turning radius based on the steering angle of the steering wheel and Equation 1 below.
  • Da is the wheelbase of the vehicle
  • ⁇ s is the steering wheel steering angle
  • ⁇ w is the wheel steering angle
  • RD is the turning radius
  • rs is the steering ratio of the vehicle (about 12:1 to 20:1).
  • Rfl, Rrl, Rrr and Rfr are the radius of rotation of the left front wheel, left rear wheel, right rear wheel and right front wheel, respectively
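The patent's Equation 1 appears only as an image in the original publication, so as a hedged sketch of the relationship the variables above describe (consistent with the later statement that the turning radius is roughly the wheelbase times the steering ratio divided by the steering-wheel angle), a bicycle-model approximation gives θw = θs / rs and RD = Da / tan(θw):

```python
import math

def turning_radius(wheelbase_m, steering_wheel_deg, steering_ratio):
    """Estimate the turning radius RD from the steering-wheel angle.

    Assumed bicycle-model relationship (not the patent's exact Equation 1):
    wheel angle theta_w = theta_s / rs, then RD = Da / tan(theta_w).
    """
    theta_w = math.radians(steering_wheel_deg) / steering_ratio
    if abs(theta_w) < 1e-9:
        return float("inf")  # driving straight: infinite turning radius
    return wheelbase_m / math.tan(theta_w)

# e.g. a 2.7 m wheelbase, 90-degree steering-wheel input, 16:1 steering ratio
rd = turning_radius(2.7, 90.0, 16.0)
```

For small wheel angles tan(θw) ≈ θw, which recovers the RD ≈ Da · rs / θs approximation stated later in the text.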
  • the collision probability determination unit may calculate a distance adjacency on the expected moving path with respect to the collision target.
  • the collision probability determination unit may determine whether the distance adjacency is increasing or decreasing based on the moving direction of the collision target, so that the visual guidance corresponding to the collision target includes a proximity indicator near the collision target.
  • the collision probability determination unit may vary the distance adjacency in proportion to the speed of the vehicle.
  • while the vehicle is turning, the collision probability determination unit increases the distance adjacency in the turning direction relative to the distance adjacency in the opposite direction.
  • the apparatus may further include a collision possibility display unit that overlays and displays visual guidance on the collision target.
  • the collision possibility display unit may determine the color of the visual guidance through a color gradient according to the degree of collision possibility.
  • the collision possibility display unit may display the expected movement path of the vehicle on the screen.
  • the collision possibility display unit may recognize a lane on a road based on a visible light image captured by a visible light camera, and display the lane on the road on a screen.
  • the collision possibility display unit may model a lane on a road as shown in Equation 2 below.
  • A0,k, A1,k, A2,k, and A3,k are state variables that determine the shape of the lane, k is the order, and X and Z are coordinates on the road.
  • the collision possibility display unit may calculate a next state variable value by performing a prediction and correction process on a previous state variable value using a Kalman filter.
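As a hedged illustration of the predict-and-correct step above, the sketch below runs a minimal scalar Kalman filter over each lane-model coefficient, assuming a constant-coefficient process model; the noise values and class name are illustrative, not from the patent:

```python
class LaneCoefficientFilter:
    """Minimal scalar Kalman filter for one lane-model coefficient A_k.

    Prediction keeps the previous value and inflates its variance;
    correction blends the prediction with the new per-frame lane fit.
    """
    def __init__(self, initial, process_var=1e-4, meas_var=1e-2):
        self.x = initial          # coefficient estimate
        self.p = 1.0              # estimate variance
        self.q = process_var      # process noise
        self.r = meas_var         # measurement noise

    def update(self, measured):
        self.p += self.q                      # predict: uncertainty grows
        k = self.p / (self.p + self.r)        # Kalman gain
        self.x += k * (measured - self.x)     # correct toward measurement
        self.p *= (1.0 - k)
        return self.x

# Track the four coefficients of X = A0 + A1*Z + A2*Z**2 + A3*Z**3
filters = [LaneCoefficientFilter(0.0) for _ in range(4)]
smoothed = [f.update(m) for f, m in zip(filters, [1.8, 0.01, 0.0, 0.0])]
```

Repeated calls to `update` converge the state variables toward a steady lane shape while damping single-frame detection noise.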
  • the apparatus may further include an autonomous emergency braking unit configured to check the possibility of collision with the collision target based on the distance between the collision target and the vehicle, and to control at least one of reducing the vehicle's speed and generating a warning sound.
  • the vehicle assistance method based on the infrared image may include receiving a series of infrared images, detecting a collision target in the latest infrared image among the infrared images, determining an expected movement path of the vehicle based on the steering angle of the steering wheel, and determining the possibility of collision with the collision target based on the expected movement path.
  • the method may further include overlaying and displaying visual guidance on the collision target.
  • the method may further include determining the possibility of collision with the collision target based on the distance between the collision target and the vehicle, and controlling at least one of reducing the vehicle's speed and generating a warning sound.
  • the vehicle assistance apparatus based on the infrared image may detect, through the infrared camera, a target that may collide with the vehicle.
  • the infrared-image-based vehicle assistance apparatus may determine whether an object in the infrared image is a pedestrian according to its possibility of movement and, in the case of a pedestrian, may determine the possibility of collision with the vehicle.
  • the infrared-image-based vehicle assistance apparatus may determine the possibility of collision with the vehicle by evaluating the proximity of objects in the infrared image to the vehicle's expected movement path.
  • FIG. 1 is a view for explaining a vehicle assistance apparatus according to an embodiment of the present invention.
  • FIG. 2 is a block diagram illustrating an example of the vehicle assistance apparatus of FIG. 1.
  • FIG. 3 is a diagram illustrating a process of calculating an expected moving path of a vehicle performed by the vehicle assistance apparatus of FIG. 1.
  • FIG. 4 is a diagram illustrating a process of displaying, on a screen, an expected movement path of a vehicle performed by the vehicle assistance apparatus of FIG. 1.
  • FIG. 5 is a block diagram illustrating another example of the vehicle assistance apparatus of FIG. 1.
  • FIG. 6 is a diagram illustrating a process of modeling a lane performed by the vehicle assistance apparatus of FIG. 5.
  • FIG. 7 is a diagram illustrating a process of calculating a state variable value performed by the vehicle assistance apparatus of FIG. 5.
  • FIG. 8 is a diagram illustrating a process, performed by the vehicle assistance apparatus of FIG. 5, of displaying a lane on a screen.
  • FIG. 9 is a flowchart illustrating a vehicle assistance process performed in the vehicle assistance apparatus of FIG. 1.
  • FIG. 10 is a diagram illustrating visual guidance according to a location of a collision target provided by the vehicle assistance apparatus of FIG. 1.
  • FIG. 11 is a diagram illustrating an expected movement path and proximity indicator of a vehicle provided by the vehicle assistance apparatus of FIG. 1.
  • FIG. 1 is a schematic view of a vehicle equipped with an infrared camera according to an embodiment of the present invention.
  • the vehicle assistance apparatus 100 detects a collision target in front of the vehicle 10 based on a series of infrared images obtained from the infrared camera 20 installed in the vehicle 10.
  • the infrared camera 20 may be installed at the front of the vehicle 10 to capture infrared images of the surrounding environment ahead of the vehicle 10.
  • the infrared camera 20 may recognize a collision target in front of the vehicle 10 through a near infrared light source using a laser.
  • the vehicle assistance apparatus 100 may generate a visual guidance according to the possibility of collision of the vehicle 10 with respect to the collision object in front of the vehicle 10.
  • the visual guidance means a guide that is visually provided to the driver, and may be displayed by being transparently overlaid on the collision target in the same shape as the collision target.
  • the vehicle assistance apparatus 100 may determine the possibility of collision with the collision target according to the expected movement path of the vehicle 10. According to an embodiment, the vehicle assistance apparatus 100 may calculate an expected movement path of the vehicle 10 based on the steering angle of the steering wheel, and determine a possibility of collision with the collision target according to the calculated expected movement path.
  • the vehicle assistance apparatus 100 may control at least one of speed reduction and warning sound generation of the vehicle 10 according to a possibility of collision of the vehicle 10 with respect to the collision target.
  • the vehicle assistance apparatus 100 is described in detail with reference to FIG. 2.
  • the vehicle assistance apparatus 100 extracts a collision target from an infrared image in consideration of an ego path (that is, a lane).
  • FIG. 2 is a block diagram of a vehicle assistance apparatus according to an embodiment of the present invention.
  • the vehicle assistance apparatus 100 may include an infrared image receiver 210, a collision target detector 220, a collision probability determiner 230, a collision possibility display 240, an autonomous emergency braking unit 250, and a controller 260.
  • the infrared image receiver 210 receives a series of infrared images from the infrared camera 20.
  • the series of infrared images refers to a plurality of images that appear continuously over time.
  • the infrared image receiving unit 210 may generate an around view based on the infrared image obtained from the infrared camera 20.
  • the around view is generated by merging at least two images captured by the infrared camera 20 around the vehicle 10 at the same time, and its horizontal extent is wider than that of a single image.
  • here, the same time means a time within 0.01 to 5 seconds.
  • the around view may be provided in real time based on the front of the vehicle 10.
  • the infrared image receiver 210 may determine whether to drive the infrared camera 20 by receiving an ECU (Electronic Control Unit) signal of the vehicle 10.
  • the ECU controls the state of the vehicle 10 by a computer, and may generate a signal based on the state of the vehicle 10.
  • the state of the vehicle 10 may include a stop state, a driving reserve state, and a driving state.
  • the collision object detector 220 detects a collision object in a recent infrared image among the infrared images.
  • the collision target means an object (e.g., a pedestrian or an animal) in an infrared image.
  • the collision target detection unit 220 may calculate the color values of pixels in the image to group regions having similar color values and extract one group as one object.
  • the pixels of the object may be grouped into one region based on characteristics having similar color values.
  • the collision target detection unit 220 detects boundaries in the image using an edge detection algorithm such as the Canny, line, or Laplacian edge detection algorithm, and can extract objects from the detected boundaries.
  • the collision target detection unit 220 may extract an object by grouping areas separated from the background area based on the detected boundary line.
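The grouping of similar-valued pixels described above can be sketched with a simple flood fill; this is an illustrative stand-in for the patent's region grouping, with a hypothetical tolerance parameter and toy 4x4 frame:

```python
def extract_objects(image, tol=10):
    """Group 4-connected pixels with similar intensity into objects.

    Pixels whose values differ by at most `tol` from a seed pixel are
    flood-filled into one object, returned as a set of (row, col).
    """
    h, w = len(image), len(image[0])
    seen = [[False] * w for _ in range(h)]
    objects = []
    for r in range(h):
        for c in range(w):
            if seen[r][c]:
                continue
            seed, stack, region = image[r][c], [(r, c)], set()
            seen[r][c] = True
            while stack:
                y, x = stack.pop()
                region.add((y, x))
                for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                    if 0 <= ny < h and 0 <= nx < w and not seen[ny][nx] \
                            and abs(image[ny][nx] - seed) <= tol:
                        seen[ny][nx] = True
                        stack.append((ny, nx))
            objects.append(region)
    return objects

# A bright 2x2 blob (a warm body in an infrared frame) on a dark background
frame = [[0, 0, 0, 0],
         [0, 200, 210, 0],
         [0, 205, 208, 0],
         [0, 0, 0, 0]]
blobs = [o for o in extract_objects(frame)
         if all(frame[y][x] > 100 for y, x in o)]  # keep only hot regions
```

Filtering by intensity separates the warm-object region from the background region.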
  • the collision target detection unit 220 may determine the possibility of movement of an object in the latest infrared image and, if it determines that the object may move, may determine the object as the collision target.
  • the object may include both dynamic objects (eg, pedestrians, animals) and static objects (eg, objects such as street lights) in the infrared image.
  • the collision target detector 220 may determine the possibility of movement of the object based on differences between infrared images of the same object from the current time point back to a specific past time point.
  • the collision target detection unit 220 may determine that the object may move when its position changes between the image at the current time point and the image at a specific past time point.
  • the collision target detector 220 may calculate the difference between the pixel values of the object region in the image at the current time point and in the image at a past time point, and determine the region where the difference between the two pixel values is minimal as the position of the object.
  • the collision target detection unit 220 may determine the movement possibility of the object based on whether the position of the object changes, and may determine the movement direction of the object based on the direction of the position change of the object.
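The position search described above can be sketched as minimum-difference template matching; the 1-D frames and pixel values below are illustrative, not from the patent:

```python
def locate_object(frame, template):
    """Find the offset where `template` best matches `frame`.

    Simplified 1-D version of the search above: slide the previous
    object patch over the current frame and pick the offset with the
    minimum sum of absolute pixel differences.
    """
    best_off, best_cost = 0, float("inf")
    for off in range(len(frame) - len(template) + 1):
        cost = sum(abs(frame[off + i] - t) for i, t in enumerate(template))
        if cost < best_cost:
            best_off, best_cost = off, cost
    return best_off

patch = [200, 210, 205]                 # object pixels in the past frame
prev_pos = 2                            # object position in the past frame
curr = [0, 0, 0, 0, 199, 212, 204, 0]   # current frame: object shifted right
curr_pos = locate_object(curr, patch)
moving_right = curr_pos > prev_pos      # movement direction from the shift
```

The sign of the position change gives the movement direction, and a nonzero change flags the object as potentially moving.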
  • the collision target detector 220 may detect a moving object that has moved or is moving from a series of infrared images, and may determine the moving object as a collision target when the moving object is detected.
  • for example, the collision target detection unit 220 may detect an object that moved forward, backward, left, or right between a past time point and the current time point and then stopped, and/or an object that is still moving forward, backward, left, or right at the current time point.
  • the collision target detection unit 220 may determine an extracted object that matches a predetermined feature point pattern (for example, a shape pattern of a pedestrian, a roadside tree, or a facility around a road) as a collision target.
  • the collision object detection unit 220 may extract an object in the infrared image through a feature point extraction algorithm such as the scale-invariant feature transform (SIFT) or speeded-up robust features (SURF), and match it against predefined feature point patterns.
  • the collision object detector 220 may estimate a collision target region for the collision target based on the perspective (near-far degree) of the infrared image received from the infrared camera 20.
  • the collision target region means the space occupied by the collision target; a collision target located closer to the vehicle occupies a larger region than one located farther away.
  • a perspective measure indicating the near-far degree between the collision target and the vehicle may thereby be determined.
  • the collision probability determination unit 230 determines an expected moving path of the vehicle 10 in the infrared image, and determines the collision possibility with the collision target.
  • the expected movement path means the path along which the vehicle 10 is headed after the current time.
  • the collision possibility determination unit 230 may calculate an expected movement path of the vehicle 10 based on a steering angle received from the vehicle 10. According to an embodiment, the collision possibility determination unit 230 may receive a steering angle of a steering wheel from a steering sensor.
  • FIG. 3 is a diagram illustrating a process of calculating an expected moving path of a vehicle performed by a vehicle assistance apparatus according to an exemplary embodiment of the present invention.
  • the vehicle 10 makes a circular motion about the center O of the virtual circle.
  • the radius of the circle (RD) is determined by the wheelbase of the vehicle (the distance between the front and rear wheel axles) and the steering angle of the front wheels.
  • the steering angle of the front wheel can be calculated from the steering angle of the steering wheel and the steering ratio of the vehicle.
  • the steering ratio is a ratio of the steering wheel angle to the wheel angle and may vary depending on the type of vehicle.
  • the collision probability determination unit 230 calculates a turning radius, that is, a circle radius RD, based on a steering angle of a steering wheel and a steering ratio of a vehicle through Equation 1 below.
  • Da is the wheelbase of the vehicle
  • ⁇ s is the steering wheel steering angle
  • ⁇ w is the wheel steering angle
  • RD is the radius of rotation
  • rs is the steering ratio of the vehicle (about 12:1 to 20:1).
  • Rfl, Rrl, Rrr, and Rfr represent rotation radiuses of the front left wheel, rear left, rear right, and front right, respectively.
  • that is, the turning radius of the vehicle 10 may be approximated as the wheelbase Da multiplied by the steering ratio rs and divided by the steering wheel angle θs.
  • the collision possibility determination unit 230 may calculate an expected movement path of the vehicle 10 by calculating a turning radius in real time based on the steering angle of the steering wheel received from the steering sensor and the equation (1).
  • the collision possibility determination unit 230 may determine the collision possibility between the vehicle 10 and the object (collision target) based on the calculated movement path of the vehicle 10 and the movement direction of the object. For example, when the moving path of the object along the moving direction and the expected moving path of the vehicle 10 intersect, the collision possibility determining unit 230 may determine that there is a possibility of collision.
  • the collision possibility determination unit 230 may determine the possibility of collision between the vehicle 10 and the object (collision target) based on the expected movement path and speed of the vehicle 10 and the moving direction and speed of the object.
  • for example, if the movement path of the object along its moving direction and the expected movement path of the vehicle 10 are expected to intersect at the same time (or within a predetermined time range) and within a predetermined distance range, the collision possibility determination unit 230 may determine that there is a possibility of collision.
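The space-and-time intersection test above can be sketched as follows; the path sampling, linear object motion, and the 1.5 m hit distance are illustrative assumptions, not values from the patent:

```python
def may_collide(vehicle_path, obj_pos, obj_vel, hit_dist=1.5):
    """Hedged sketch of the intersection test described above.

    `vehicle_path` is a list of (t, x, y) points on the expected path;
    the object moves linearly from `obj_pos` with velocity `obj_vel`.
    A collision is flagged if, at roughly the same time, vehicle and
    object are predicted to be within `hit_dist` metres of each other.
    """
    for t, vx, vy in vehicle_path:
        ox = obj_pos[0] + obj_vel[0] * t
        oy = obj_pos[1] + obj_vel[1] * t
        if ((vx - ox) ** 2 + (vy - oy) ** 2) ** 0.5 <= hit_dist:
            return True
    return False

# Vehicle drives straight ahead at 10 m/s, sampled every 0.5 s for 3 s
path = [(i * 0.5, 0.0, i * 0.5 * 10.0) for i in range(7)]  # (t, x, y)
# A pedestrian crossing from the left reaches the path as the vehicle does
crossing = may_collide(path, obj_pos=(-5.0, 15.0), obj_vel=(3.0, 0.0))
# A stationary object beside the path never comes within the hit distance
safe = not may_collide(path, obj_pos=(-5.0, 15.0), obj_vel=(0.0, 0.0))
```

Only the combination of intersecting paths and matching timing flags a collision; a stationary bystander at the same location does not.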
  • the collision possibility determination unit 230 may calculate a distance adjacency on the expected moving path with respect to the collision target.
  • the distance adjacency indicates how close the collision target is to the space through which the vehicle 10 may move, that is, how near the collision target is to the vehicle 10.
  • when the collision target is detected on the expected movement path of the vehicle 10, the collision possibility determiner 230 may calculate a higher distance adjacency than when it is not.
  • the collision possibility determination unit 230 may vary the distance adjacency in proportion to the speed of the vehicle 10. For example, the collision possibility determination unit 230 may increase the distance adjacency as the speed of the vehicle 10 increases.
  • the collision possibility determination unit 230 may increase the distance adjacency in the turning direction relative to the distance adjacency in the opposite direction while the vehicle 10 is turning. For example, when the vehicle 10 is about to turn left and collision targets are located at the same distance to the left and right of the vehicle 10, the collision possibility determination unit 230 may increase the left distance adjacency over the right distance adjacency.
  • the collision possibility determination unit 230 may also determine the lane in which the vehicle 10 is currently located, and increase the distance adjacency for a collision target within that lane over the distance adjacency for a collision target outside it.
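The weighting rules above (speed-proportional scaling, a boost on the turning side, and a boost inside the ego lane) can be combined into one hedged scoring sketch; all numeric factors are illustrative assumptions, not values from the patent:

```python
def distance_adjacency(base, speed_kmh, target_side, turn_dir=None,
                       in_ego_lane=False):
    """Hypothetical weighting of the distance adjacency score.

    The score grows with vehicle speed, is boosted for targets on the
    side the vehicle is turning toward, and is boosted for targets
    inside the vehicle's own lane.
    """
    score = base * (1.0 + speed_kmh / 100.0)    # proportional to speed
    if turn_dir is not None and target_side == turn_dir:
        score *= 1.5                            # turning toward the target
    if in_ego_lane:
        score *= 1.5                            # target within the ego lane
    return score

# Two targets equidistant left and right while the vehicle turns left
left = distance_adjacency(1.0, 50.0, "left", turn_dir="left")
right = distance_adjacency(1.0, 50.0, "right", turn_dir="left")
```

With equal base distances, the target on the turning side ends up with the higher adjacency, matching the left-turn example above.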
  • the collision possibility determination unit 230 may determine whether the distance proximity is increased or decreased based on the moving direction of the collision target so that the visual guidance includes the proximity indicator near the collision target.
  • the proximity indicator refers to an arrow for indicating a moving direction of the collision target, and the proximity indicator may be determined through a motion vector for the collision target.
  • the collision possibility determination unit 230 may determine the length and the direction of the proximity indicator based on the moving direction and the speed with respect to the collision target.
  • the collision possibility display unit 240 transparently overlays and displays visual guidance on the collision target.
  • the visual guidance may be updated based on at least one of the moving direction and speed of the collision target and the moving direction and speed of the vehicle 10.
  • the collision possibility display unit 240 may display the color of the visual guidance by determining a color gradient according to the degree of collision possibility. For example, if the collision possibility display unit 240 displays the collision possibility in three stages, the visual guidance for a collision target with a high probability of collision can be displayed in red, the visual guidance for a collision target with a medium probability in orange (or yellow), and the visual guidance for a collision target with little possibility of collision in green.
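The three-stage color scheme above can be sketched as a simple mapping; the risk scale in [0, 1] and the stage thresholds are illustrative assumptions:

```python
def guidance_color(risk):
    """Map a collision-possibility score in [0, 1] to a display colour.

    Illustrative three-stage scheme matching the description above:
    high risk is red, medium is orange, low is green.
    """
    if risk >= 0.66:
        return "red"
    if risk >= 0.33:
        return "orange"
    return "green"

colors = [guidance_color(r) for r in (0.9, 0.5, 0.1)]
```

A finer gradient could interpolate between these colors instead of switching at fixed thresholds.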
  • the collision possibility display unit 240 may display the calculated movement path of the vehicle 10 on the screen.
  • the collision possibility display unit 240 may display the anticipated movement path of the vehicle 10 by overlaying it on the infrared camera image.
  • the collision possibility display unit 240 may correct the expected moving path of the vehicle 10 by reflecting the characteristics of the camera lens that captured the image, and display the corrected estimated moving path on the screen.
  • FIG. 4 is an explanatory diagram for displaying an expected movement route of a vehicle performed by the vehicle assistance apparatus according to an embodiment of the present invention.
  • FIG. 4A is a diagram illustrating an expected moving path of the vehicle calculated by Equation 1.
  • the vehicle 10 moves along the trajectory of a circle having the calculated turning radius. Therefore, the expected trajectory of the vehicle 10 may be calculated by Equation 2.
  • w is the width of the vehicle and R is the radius of rotation.
  • (a) is an equation representing the left boundary of the expected trajectory, and (b) is an equation representing the right boundary of the expected trajectory.
  • the collision possibility display unit 240 may correct the expected movement path of the vehicle 10 by reflecting the characteristic of the camera lens, that is, the distortion, that captured the image, and display the corrected expected movement path on the screen.
  • FIG. 4B is a diagram illustrating an expected movement path corrected by reflecting distortion.
  • the collision possibility display unit 240 may correct the predicted trajectory calculated by Equation 2 by Equation 3.
  • f represents the focal length of the camera
  • x and y represent coordinates on the screen
  • X and Z represent coordinates of the predicted locus
  • h represents the height of the camera.
  • the collision possibility display unit 240 calculates coordinates (X, Z) of the predicted trajectory satisfying Equation 2, and applies them to Equation 3 to obtain the on-screen coordinates (x, y) corresponding to each (X, Z).
  • the collision possibility display unit 240 may display the corrected expected movement path on the screen by calculating coordinates (x, y) on the screen corresponding to each of the left boundary and the right boundary of the expected trajectory.
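The patent's Equation 3 appears only as an image in the original, so the sketch below uses the standard pinhole-camera projection consistent with the symbols it lists (focal length f, camera height h, ground-plane coordinates X and Z); the focal length, camera height, and boundary offsets are illustrative assumptions:

```python
def project_to_screen(points, f, h):
    """Project ground-plane path points (X, Z) to screen pixels (x, y).

    Assumed pinhole projection consistent with the listed symbols:
    x = f * X / Z and y = f * h / Z, valid for points ahead (Z > 0).
    """
    return [(f * X / Z, f * h / Z) for X, Z in points if Z > 0]

# Samples of the path's left boundary, 10 to 40 m ahead of the vehicle
left_boundary = [(-0.9, z) for z in (10.0, 20.0, 40.0)]
pixels = project_to_screen(left_boundary, f=800.0, h=1.2)
```

Farther points land higher on the screen (smaller y), which is how the overlaid path converges toward the horizon.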
  • the collision possibility display unit 240 may display the expected movement path of the vehicle 10 and visual guidance for the collision target on the screen, and the user can drive safely with the aid of the expected movement path and collision-target guidance displayed on the screen.
  • the autonomous emergency braking unit 250 determines the possibility of collision with the collision target based on the distance between the collision target and the vehicle 10, and controls at least one of reducing the speed of the vehicle 10 and generating a warning sound. For example, the autonomous emergency braking unit 250 may generate a warning sound when the distance between the collision target and the vehicle 10 is less than a first reference distance, and may both reduce the speed of the vehicle 10 and generate a warning sound when the distance is less than a second reference distance.
  • the second reference distance is shorter than the first reference distance.
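The two-stage policy above (warning inside the first reference distance, warning plus braking inside the shorter second one) can be sketched as follows; the 20 m and 10 m thresholds are illustrative assumptions, not values from the patent:

```python
def aeb_action(distance_m, first_ref=20.0, second_ref=10.0):
    """Two-stage autonomous emergency braking policy described above.

    Beyond the first reference distance nothing happens; inside it a
    warning sounds; inside the (shorter) second reference distance the
    vehicle also decelerates.
    """
    if distance_m < second_ref:
        return ("warning", "brake")
    if distance_m < first_ref:
        return ("warning",)
    return ()

# Target at 25 m, 15 m, and 5 m: no action, warning only, warning + brake
actions = [aeb_action(d) for d in (25.0, 15.0, 5.0)]
```

The second threshold must be strictly shorter than the first so the stages escalate as the target closes in.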
  • the autonomous emergency braking unit 250 may apply emergency braking when a collision target that was not detected in the previous infrared image is detected in the current infrared image. For example, if a collision target suddenly appears near the vehicle 10, the autonomous emergency braking unit 250 may determine it to be a pedestrian and emergency-brake the vehicle 10.
  • the controller 260 controls the overall operation of the vehicle assistance apparatus 100, and controls the operation of, and the data flow between, the infrared image receiver 210, the collision target detector 220, the collision possibility determiner 230, the collision possibility display 240, and the autonomous emergency braking unit 250.
  • FIG. 5 is a block diagram of a vehicle assistance apparatus according to another embodiment of the present invention.
  • the vehicle assistance apparatus 500 may include an infrared image receiver 510, a visible light image receiver 515, a collision target detector 520, a collision possibility determiner 530, a collision possibility display 540, an autonomous emergency braking unit 550, and a controller 560.
  • the visible light image receiver 515 receives a series of visible light images from the visible light camera 22.
  • the visible light camera 22 may be disposed in the vehicle 10 so that it can photograph the area in front of the vehicle 10, like the infrared camera 20.
  • the visible light camera 22 may capture visible light images at night as well as during the day. For example, at night, the visible light camera 22 may photograph the front of the vehicle illuminated by the headlights of the vehicle 10.
  • the collision possibility display unit 540 may recognize the lane on the road based on the visible light image received by the visible light image receiving unit 510, and additionally display the recognized lane on the infrared image screen.
  • the collision possibility display unit 540 recognizes a lane in the visible light image received by the visible light image receiver 510.
  • the collision possibility display unit 540 detects boundary lines in the image by using an edge detection algorithm such as the Canny edge detection algorithm, a line edge detection algorithm, or the Laplacian edge detection algorithm.
  • the collision possibility display unit 540 may recognize a lane by matching an area separated from the background by the detected boundary lines against preset lane patterns (for example, a continuous solid-line pattern or a dashed-line pattern).
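As an illustration of the boundary-line step, a minimal gradient-based edge detector (a simplified stand-in for the Canny or Laplacian detectors named above, not the patent's actual implementation) can be sketched as:

```python
def horizontal_edges(img, thresh=50):
    """Mark pixels whose horizontal intensity jump exceeds `thresh`.
    A minimal gradient detector; real systems would use Canny or
    Laplacian operators as the text suggests."""
    h, w = len(img), len(img[0])
    edges = [[False] * w for _ in range(h)]
    for r in range(h):
        for c in range(1, w - 1):
            grad = abs(img[r][c + 1] - img[r][c - 1])  # central difference
            edges[r][c] = grad > thresh
    return edges

# A bright vertical stripe (a lane marking) on a dark background
row = [0, 0, 200, 200, 0, 0]
img = [row] * 4
edges = horizontal_edges(img)
```

The resulting edge mask is what the lane-pattern matching step would then compare against the preset solid-line and dashed-line templates.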
  • the collision possibility display unit 540 may model the lane recognized in the image as a cubic equation based on a reference coordinate system in the image (for example, a two-dimensional (x, y) coordinate system).
  • the collision possibility display unit 540 may individually model each lane.
  • FIG. 6 is an explanatory diagram of a lane modeling performed by the vehicle assistance apparatus according to another exemplary embodiment of the present invention.
  • FIG. 6A is a diagram illustrating lanes on an actual road
  • FIG. 6B is a diagram illustrating lanes in a visible light image.
  • When modeling a lane on an actual road, the lane equation on the actual road is expressed by Equation 4.
  • A0,k, A1,k, A2,k, and A3,k are state variables that determine the shape of the lane, k denotes the time step, and X and Z are coordinates on the road.
  • When the state variables are expressed as a state equation, the result is Equation 5, where xk is the k-th state variable vector, A is the state transition matrix, and wk is the state noise vector for the k-th state variable.
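Since Equations 4 and 5 are not reproduced in this excerpt, the cubic lane model can be sketched under a common lane-model convention: the lateral road coordinate X expressed as a cubic polynomial in the forward distance Z. The assignment of X versus Z and the coefficient values are assumptions for illustration.

```python
def lane_offset(Z, A):
    """Evaluate a cubic lane model in the spirit of Equation 4: the
    lateral road coordinate X as a polynomial in the forward distance Z,
    with state variables A = (A0, A1, A2, A3) fixing the lane shape.
    The X-as-function-of-Z convention is an assumption; the excerpt
    does not reproduce the equation itself."""
    A0, A1, A2, A3 = A
    return A0 + A1 * Z + A2 * Z ** 2 + A3 * Z ** 3

# A gently curving lane: 1 m base offset, small heading and curvature terms
A = (1.0, 0.01, 0.002, 0.0)
offset_near = lane_offset(0.0, A)   # lateral offset at the vehicle
offset_far = lane_offset(10.0, A)   # lateral offset 10 m ahead
```

The four coefficients are exactly the state variables that the state equation (Equation 5) propagates from frame k to frame k+1.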
  • the collision possibility display unit 540 models each lane individually using Equation 4. Hereinafter, for convenience of description, the description will be made based on the modeling of one lane.
  • Substituting Equation 3 into the lane equation on the actual road (Equation 4) yields Equation 6.
  • the focal length of the infrared camera and the focal length of the visible light camera may be different values. Therefore, when modeling a lane on the actual road by reflecting the distortion of the visible light image, the collision possibility display unit 540 applies the focal length of the visible light camera.
  • the collision possibility display unit 540 extracts the coordinates (x, y) of the lanes recognized in the visible light image, calculates a measurement variable z, and expresses the calculated measurement variable as the measurement equation of Equation 7.
  • In Equation 7, zk denotes the k-th measurement variable, H denotes the measurement matrix, xk denotes the k-th state variable, and vk denotes the measurement noise vector.
  • That is, the measurement variable can be represented as the matrix product of the measurement matrix and the state variable plus the measurement noise vector vk.
  • the collision possibility display unit 540 may calculate the lane equation on the actual road using the coordinates (x, y) of the lane recognized in the visible light image and Equation 6.
  • the collision possibility display unit 540 calculates the state variables A0,k, A1,k, A2,k, and A3,k that determine the shape of the lane, based on the calculated lane equation on the actual road. According to an exemplary embodiment, the collision possibility display unit 540 may calculate the state variable values by repeatedly performing a prediction and correction process on the state variable values using a Kalman filter.
  • FIG. 7 is a conceptual diagram of calculating a state variable value in a vehicle assistance apparatus according to another embodiment of the present invention.
  • the collision possibility display unit 540 may predict the second state variable value using the initial state variable value and the prediction model of the Kalman filter.
  • the collision possibility display unit 540 corrects the prediction model of the Kalman filter using the predicted second state variable value and the measured value (the state variable value calculated by Equation 6 from the coordinates (x, y) of the lanes recognized in the visible light image).
  • the collision possibility display unit 540 calculates an error covariance by weighting the second state variable value against the measured value, and applies the calculated error covariance to the prediction model of the Kalman filter to predict the third state variable value.
  • the collision possibility display unit 540 may determine the equation of the lane on the actual road based on the state variable value calculated using the Kalman filter.
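The predict/correct cycle of FIG. 7 can be sketched with a generic Kalman filter step. The matrices below are illustrative placeholders for the patent's state transition matrix A (Equation 5) and measurement matrix H (Equation 7), which are not reproduced in this excerpt.

```python
import numpy as np

def kalman_step(x, P, z, A, H, Q, R):
    """One predict/correct cycle of the Kalman filter sketched above:
    x is the state (lane coefficients), P the error covariance, z the
    measurement built from recognized lane pixels. A and H stand in
    for the patent's Equations 5 and 7; Q and R are noise covariances."""
    # Predict
    x_pred = A @ x
    P_pred = A @ P @ A.T + Q
    # Correct: the Kalman gain weights the prediction against the measurement
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)
    x_new = x_pred + K @ (z - H @ x_pred)
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new

# Track 4 lane coefficients assumed constant between frames (A = I)
n = 4
A = np.eye(n); H = np.eye(n)
Q = 1e-4 * np.eye(n); R = 1e-2 * np.eye(n)
x = np.zeros(n); P = np.eye(n)
true_state = np.array([1.0, 0.01, 0.002, 0.0])
for _ in range(50):  # repeated prediction/correction, as in FIG. 7
    x, P = kalman_step(x, P, true_state, A, H, Q, R)
```

With noise-free measurements the estimated coefficients converge to the true lane shape, which is the behavior the repeated prediction/correction process relies on.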
  • FIG. 8 is a diagram illustrating a process, performed by the vehicle assistance apparatus according to another embodiment of the present invention, of displaying a lane on a screen.
  • FIG. 8A is a diagram illustrating a lane on an actual road
  • FIG. 8B is a diagram illustrating a lane corrected by reflecting a distortion of an infrared camera.
  • the collision possibility display unit 540 may correct the lane equation by reflecting the characteristics of the camera lens capturing the image, and display the corrected lane on the image screen. For example, when displaying the corrected lane on the infrared camera image, the collision possibility display unit 540 applies the focal length of the infrared camera lens and the coordinates (X, Z) of the lane on the actual road to Equation 3. The collision possibility display unit 540 may then calculate the on-screen coordinates (x, y) corresponding to the coordinates (X, Z) of the lane on the actual road and display the corrected lane on the screen.
  • the collision possibility display unit 240 may display the expected movement path of the vehicle 10, the visual guidance for the collision target, and the lane on the road, and the user can drive the vehicle safely by referring to the expected movement path, the collision-target guidance, and the lane displayed on the screen.
  • the collision possibility display unit 240 may selectively display at least one of the expected movement path of the vehicle 10, the visual guidance for the collision target, and the lane on the road. For example, the collision possibility display unit 240 may make this selection according to a command input by the user.
  • FIG. 9 is a flowchart illustrating a vehicle assistance process performed in the vehicle assistance apparatus according to the present invention.
  • the infrared image receiver 210 receives a series of infrared images from the infrared camera 20 (S901).
  • the collision target detection unit 220 detects a collision target in the most recent infrared image among the infrared images (S902).
  • the collision target detection unit 220 may determine the possibility of movement of the object in the latest infrared image, and if there is a possibility of movement of the object, may determine the object as a collision target.
  • the collision target detector 220 may detect a moving object that has moved or is moving from a series of infrared images, and may determine the moving object as a collision target when the moving object is detected.
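The patent does not specify the moving-object detection algorithm; a minimal frame-differencing sketch over two consecutive infrared frames, with illustrative thresholds, is:

```python
def moving_pixels(prev, curr, thresh=30):
    """Mark pixels whose intensity changed by more than `thresh`
    between two consecutive infrared frames. A minimal sketch of the
    moving-object detection described above; the actual algorithm is
    not specified in the excerpt."""
    h, w = len(curr), len(curr[0])
    return [[abs(curr[r][c] - prev[r][c]) > thresh for c in range(w)]
            for r in range(h)]

def has_moving_object(prev, curr, min_pixels=2):
    """Declare a moving object (a candidate collision target) when
    enough pixels changed; `min_pixels` is an illustrative threshold."""
    mask = moving_pixels(prev, curr)
    return sum(p for row in mask for p in row) >= min_pixels

prev = [[10, 10, 10], [10, 10, 10]]
curr = [[10, 200, 200], [10, 10, 10]]
```

Any region flagged this way would then be handed to the collision possibility determination step.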
  • the collision possibility determination unit 230 determines an expected moving path of the vehicle 10 in the infrared image, and determines a collision possibility with the collision target (S903).
  • the collision probability determination unit 230 calculates the expected movement path of the vehicle 10 from the steering angle received from the vehicle 10, and the collision possibility display unit 240 may display the calculated expected movement path on the infrared image.
  • the process of calculating the expected movement path of the vehicle 10 based on the steering angle and displaying the calculated expected movement path on the infrared image is as described above using Equations 1 to 3.
  • the collision possibility display unit 240 may recognize a lane based on the visible light image captured by the visible light camera, and additionally display the recognized lane on the infrared image.
  • the process of displaying the recognized lane on the infrared image is as described using Equations 4 to 7.
  • the collision possibility display unit 240 generates a visual guidance for the collision target and overlays it transparently (S904 and S905).
  • FIG. 10 is a diagram illustrating a visual guidance according to the position of the collision target of the vehicle assistance apparatus according to the present invention.
  • When it is determined that there is a possibility of collision with the collision target, the collision possibility display unit 540 may display the visual guidance by transparently overlaying it on the collision target; when it is determined that there is no possibility of collision, the collision possibility display unit 540 may omit the visual guidance for that target.
  • the collision possibility display unit 540 may determine the color of the visual guidance according to the collision probability between the collision target and the vehicle 10 and display the visual guidance on the collision target. For example, if the collision probability between the collision target and the vehicle 10 is low, the collision possibility display unit 540 sets the color to green and displays the visual guidance on the collision target (corresponding to FIG. 10(c)); if the collision probability is high, it sets the color to red and displays the visual guidance on the collision target (corresponding to FIG. 10(d)).
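The green-to-red color choice can be sketched as an interpolation over the collision probability. Only the green and red extremes come from the text; the linear blend in between is an assumption for illustration.

```python
def guidance_color(p):
    """Map a collision probability p in [0, 1] to an (R, G, B) overlay
    colour: green at low probability, red at high, matching the two
    extremes named in the text. The linear blend between them is an
    assumption; the patent only specifies the endpoints."""
    p = min(max(p, 0.0), 1.0)  # clamp out-of-range probabilities
    return (int(255 * p), int(255 * (1.0 - p)), 0)
```

The returned tuple would be used as the fill colour of the transparent overlay drawn on the collision target.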
  • FIG. 11 is an exemplary view illustrating a predicted movement path and proximity indicator of a vehicle provided by the vehicle assistance apparatus according to the present invention.
  • the collision possibility determination unit 530 may display the predicted movement path (a) of the vehicle 10 on the infrared image based on the steering angle of the vehicle 10 (FIG. 11(a)).
  • the collision possibility determination unit 530 increases the distance adjacency when the moving direction of the collision target is the same as the expected movement path of the vehicle 10, and decreases the distance adjacency when the moving direction of the collision target is opposite to the expected movement path of the vehicle 10.
  • when the collision target moves to the left (or right), the collision possibility determination unit 530 may display a left (or right) arrow (visual guidance) corresponding to the proximity indicator near the collision target, for example above the collision target (FIG. 11(b), (c)).
  • the left (or right) arrow may vary in length depending on the moving speed of the collision target.
  • For example, the left (or right) arrow may be displayed longer when the collision target moves at a second speed faster than a first speed than when it moves at the first speed.
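The speed-dependent arrow length can be sketched as a clamped linear scaling; all pixel constants below are illustrative assumptions, not values from the patent.

```python
def arrow_length(speed_mps, base_px=20, px_per_mps=8, max_px=120):
    """Length in pixels of the left/right proximity arrow: grows with
    the collision target's speed, as described above, and is clamped
    so very fast targets do not draw off-screen arrows. All constants
    are illustrative."""
    return min(max_px, base_px + int(px_per_mps * speed_mps))
```

A target moving at the faster "second speed" thus always gets a longer arrow than one at the slower "first speed", matching the behavior described in the text.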

Landscapes

  • Engineering & Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Transportation (AREA)
  • Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Human Computer Interaction (AREA)
  • Traffic Control Systems (AREA)

Abstract

The present invention relates to an infrared-image-based vehicle assistance device, the vehicle assistance device comprising: an infrared image receiving unit for receiving a series of infrared images; a collision-target detection unit for detecting an object to be collided with from the most recent infrared image among the infrared images; and a collision-possibility determination unit for determining an expected movement path of a vehicle based on a steering angle of the steering wheel, and determining the possibility of a collision with the collision target based on the expected movement path. Accordingly, an object at risk of colliding with the vehicle can be detected by means of an infrared camera.
PCT/KR2015/005573 2015-04-10 2015-06-03 Procédé et dispositif auxiliaire de véhicule basés sur une image infrarouge WO2016163590A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020150051036A KR101568745B1 (ko) 2015-04-10 2015-04-10 Infrared-image-based vehicle assistance apparatus and method
KR10-2015-0051036 2015-04-10

Publications (1)

Publication Number Publication Date
WO2016163590A1 true WO2016163590A1 (fr) 2016-10-13

Family

ID=54610120

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2015/005573 WO2016163590A1 (fr) 2015-04-10 2015-06-03 Procédé et dispositif auxiliaire de véhicule basés sur une image infrarouge

Country Status (2)

Country Link
KR (1) KR101568745B1 (fr)
WO (1) WO2016163590A1 (fr)


Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101789294B1 (ko) * 2015-12-29 2017-11-21 Daegu Gyeongbuk Institute of Science and Technology Around-view system for a vehicle and operating method thereof
KR102452774B1 (ko) * 2017-08-24 2022-10-12 Hyundai Motor Company Vehicle simulation system and vehicle simulation method
KR102365361B1 (ко) * 2021-04-22 2022-02-23 AI Matics Co., Ltd. Apparatus and method for determining the driving state of a vehicle

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20050060896A (ko) * 2003-12-17 2005-06-22 Hyundai Mobis Co., Ltd. Rear parking monitoring apparatus for a vehicle
JP2012533468A (ja) * 2009-07-22 2012-12-27 Robert Bosch GmbH Pedestrian warning system for an electric or hybrid vehicle
KR20130021990A (ко) * 2011-08-24 2013-03-06 Hyundai Mobis Co., Ltd. Pedestrian collision warning system for a vehicle and method thereof
KR20140071121A (ко) * 2012-12-03 2014-06-11 Hyundai Motor Company Differentiated warning system and method based on recognition of pedestrian behavior patterns


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114466776A (zh) * 2019-06-07 2022-05-10 Mando Mobility Solutions Corporation Vehicle control method, vehicle control device, and vehicle control system including the same
US20220234581A1 (en) * 2019-06-07 2022-07-28 Mando Mobility Solutions Corporation Vehicle control method, vehicle control device, and vehicle control system including same

Also Published As

Publication number Publication date
KR101568745B1 (ko) 2015-11-12

Similar Documents

Publication Publication Date Title
CN106485233B (zh) Drivable area detection method and device, and electronic apparatus
JP2667924B2 (ja) Aircraft docking guidance device
JP4612635B2 (ja) Moving object detection using computer vision adaptable to low-illumination depth
JP3739693B2 (ja) Image recognition apparatus
JP4456086B2 (ja) Vehicle periphery monitoring apparatus
EP3188156B1 (fr) Object recognition device and vehicle control system
EP2546779B1 (fr) Environment recognition device for a vehicle and vehicle control system using the same
WO2016163590A1 (fr) Infrared-image-based vehicle assistance method and device
JP2006184276A (ja) All-weather obstacle collision prevention device using visual detection, and method therefor
US20050276450A1 (en) Vehicle surroundings monitoring apparatus
US20150298621A1 (en) Object detection apparatus and driving assistance apparatus
KR20190019840A (ko) Driver assistance system and method for object detection and notification
Aytekin et al. Increasing driving safety with a multiple vehicle detection and tracking system using ongoing vehicle shadow information
JP4425642B2 (ja) Pedestrian extraction device
WO2020159076A1 (fr) Landmark location estimation device and method, and computer-readable recording medium storing a computer program programmed to perform the method
KR20060021922A (ко) Obstacle detection technique and apparatus using two cameras
WO2020171605A1 (fr) Driving information provision method, and vehicle map provision server and method therefor
WO2022255677A1 (fr) Method for determining the location of a fixed object using multi-observation information
WO2020246735A1 (fr) Vehicle control method, vehicle control device, and vehicle control system including same
JP2011103058A (ja) Misrecognition prevention device
JP2007087203A (ja) Collision determination system, collision determination method, and computer program
JP3227247B2 (ja) Traveling path detection device
KR100588563B1 (ко) Lane recognition method using images
JP2011191859A (ja) Vehicle periphery monitoring device
WO2022255678A1 (fr) Method for estimating traffic-light arrangement information using multiple observation information

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15888575

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15888575

Country of ref document: EP

Kind code of ref document: A1