US11373415B2 - Vehicle and method for avoiding a collision of a vehicle with one or more obstacles - Google Patents
- Publication number
- US11373415B2 (application US16/955,077)
- Authority
- US
- United States
- Prior art keywords
- vehicle
- image
- distance
- ground
- obstacles
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- B60W30/0956—Predicting travel path or likelihood of collision the prediction being responsive to traffic or environmental parameters
- B60C23/002—Devices for manually or automatically controlling or distributing tyre pressure whilst the vehicle is moving by monitoring conditions other than tyre pressure or deformation
- B60W30/09—Taking automatic action to avoid collision, e.g. braking and steering
- B60W50/14—Means for informing the driver, warning the driver or prompting a driver intervention
- B60W50/16—Tactile feedback to the driver, e.g. vibration or force feedback to the driver on the steering wheel or the accelerator pedal
- G06F18/22—Matching criteria, e.g. proximity measures
- G06K9/6215
- G06T7/60—Analysis of geometric attributes
- G06T7/70—Determining position or orientation of objects or cameras
- G06V20/58—Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
- B60W2050/143—Alarm means
- B60W2050/146—Display means
- B60W2420/403—Image sensing, e.g. optical camera
- B60W2420/42
- B60W2530/00—Input parameters relating to vehicle conditions or values, not covered by groups B60W2510/00 or B60W2520/00
- B60W2554/00—Input parameters relating to objects
- B60W2554/60—Traversable objects, e.g. speed bumps or curbs
- B60W2554/802—Longitudinal distance
- G06T2207/10048—Infrared image
- G06T2207/30261—Obstacle
Definitions
- Various aspects relate generally to a vehicle and a method for avoiding a collision of a vehicle with one or more obstacles.
- an emergency brake assist also referred to as brake assist (BA or BAS) may be implemented in the vehicle.
- the emergency brake assist may include a braking system that increases braking pressure in an emergency.
- the emergency may be a predicted collision of the vehicle with another vehicle or with a fixed object, as for example, a wall, a tree, etc.
- the vehicle may include one or more sensors and one or more processors that are configured to predict a frontal collision of the vehicle with an obstacle.
- a vehicle may include a parking assistance system, wherein parking sensors (e.g., proximity sensors) are used to sense obstacles in the vicinity of the vehicle while parking.
- one or more autonomous vehicle maneuvering systems may be implemented in a vehicle, e.g., to move the vehicle into a parking position, to more or less autonomously drive the vehicle, etc.
- FIG. 1 shows an exemplary vehicle including a collision avoidance system, according to some aspects
- FIG. 2A shows an exemplary vehicle including a collision avoidance system, according to some aspects
- FIG. 2B shows a sensor image associated with the collision avoidance system in a more detailed view, according to some aspects
- FIG. 3 shows an exemplary optical imaging system of the collision avoidance system, according to some aspects
- FIG. 4 shows a sensor image with superimposed calibrated camera lines, according to some aspects
- FIG. 5 shows an exemplary range imaging sensor of the collision avoidance system in a schematic view, according to some aspects
- FIG. 6A and FIG. 6B show an exemplary flow diagram of a method for collision avoidance, according to some aspects
- FIG. 7 shows an exemplary flow diagram of a method associated with operating a vehicle or collision avoidance, according to some aspects.
- FIG. 8 shows an exemplary flow diagram of a process associated with operating a vehicle or collision avoidance, according to some aspects.
- the terms “at least one” and “one or more” may be understood to include a numerical quantity greater than or equal to one (e.g., one, two, three, four, [ . . . ], etc.).
- the term “a plurality” may be understood to include a numerical quantity greater than or equal to two (e.g., two, three, four, five, [ . . . ], etc.).
- phrases “at least one of” with regard to a group of elements may be used herein to mean at least one element from the group consisting of the elements.
- the phrase “at least one of” with regard to a group of elements may be used herein to mean a selection of: one of the listed elements, a plurality of one of the listed elements, a plurality of individual listed elements, or a plurality of a multiple of listed elements.
- any phrase explicitly invoking the aforementioned words expressly refers to more than one of the said objects.
- data may be understood to include information in any suitable analog or digital form, e.g., provided as a file, a portion of a file, a set of files, a signal or stream, a portion of a signal or stream, a set of signals or streams, and the like. Further, the term “data” may also be used to mean a reference to information, e.g., in form of a pointer. The term data, however, is not limited to the aforementioned examples and may take various forms and represent any information as understood in the art.
- the term “processor” as, for example, used herein may be understood as any kind of entity that allows handling data.
- the data may be handled according to one or more specific functions executed by the processor.
- a processor as used herein may be understood as any kind of circuit, e.g., any kind of analog or digital circuit.
- the terms “handle” or “handling” as, for example, used herein with reference to data handling, file handling, or request handling may be understood as any kind of operation, e.g., an I/O operation, and/or any kind of logic operation.
- An I/O operation may include, for example, storing (also referred to as writing) and reading.
- a processor may thus be or include an analog circuit, digital circuit, mixed-signal circuit, logic circuit, microprocessor, Central Processing Unit (CPU), Graphics Processing Unit (GPU), Digital Signal Processor (DSP), Field Programmable Gate Array (FPGA), integrated circuit, Application Specific Integrated Circuit (ASIC), etc., or any combination thereof.
- a processor, controller, and/or circuit detailed herein may be implemented in software, hardware and/or as hybrid implementation including software and hardware.
- the term “system” (e.g., a computing system, a memory system, a storage system, etc.) detailed herein may be understood as a set of interacting elements; the elements can be, by way of example and not of limitation, one or more mechanical components, one or more electrical components, one or more instructions (e.g., encoded in storage media), and/or one or more processors, and the like.
- the term “mechanism” (e.g., a spring mechanism, etc.) may be understood in a similar way; the elements can be, by way of example and not of limitation, one or more mechanical components, one or more electrical components, one or more instructions, etc.
- memory may be understood as a non-transitory computer-readable medium in which data or information can be stored for retrieval.
- references to “memory” included herein may thus be understood as referring to volatile or non-volatile memory, including random access memory (RAM), read-only memory (ROM), flash memory, solid-state storage, magnetic tape, hard disk drive, optical drive, etc., or any combination thereof.
- registers, shift registers, processor registers, data buffers, etc. are also embraced herein by the term memory.
- a single component referred to as “memory” or “a memory” may be composed of more than one different type of memory, and thus may refer to a collective component including one or more types of memory. It is readily understood that any single memory component may be separated into multiple collectively equivalent memory components, and vice versa.
- the term “vehicle” as used herein may be understood as any suitable type of vehicle, e.g., a motor vehicle (also referred to as an automotive vehicle).
- a vehicle may be a car, also referred to as a motor car, a passenger car, etc.
- a vehicle may be a truck (also referred to as a motor truck), a van, etc.
- the term “distance from ground” as used herein may also be referred to as height above ground, above ground level, etc.
- the term “ground” as used herein may be understood as any type of solid infrastructure, e.g., a street, below the respective obstacle.
- the term “lane” or “driving lane” as used herein may be understood as any type of solid infrastructure (or a section thereof) on which a vehicle may drive.
- the term “information” (e.g., obstacle information) as used herein may be handled (e.g., processed, analyzed, stored, etc.) in any suitable form, e.g., data may represent the information and may be handled via a computing system.
- one or more range imaging sensors may be used for sensing objects in the vicinity of a vehicle.
- a range imaging sensor may allow associating range information (or in other words distance information or depth information) with an image, e.g., to provide a range image having range data associated with pixel data of the image. This allows, for example, providing a range image of the vicinity of the vehicle including range information about one or more objects depicted in the image.
- the range information may include, for example, one or more colors, one or more shadings associated with a relative distance from the range image sensor, etc.
- position data associated with positions of objects relative to the vehicle and/or relative to an assembly of the vehicle may be determined from the range information.
- a range image may be obtained, for example, by a stereo camera, e.g., calculated from two or more images having a different perspective. Three-dimensional coordinates of points on an object may be obtained, for example, by stereophotogrammetry, based on two or more photographic images taken from different positions.
- a range image may be generated based on images obtained via other types of cameras, e.g., based on time-of-flight (ToF) measurements, etc.
- a range image may be merged with additional sensor data, e.g., with sensor data of one or more radar sensors, etc.
- One or more aspects are related to a comparatively high vehicle, as for example, a truck, a lorry, a bus, a van, a special-purpose vehicle, etc., having for example a height of more than about 2 m.
- a vehicle itself may have a height of less than about 2 m but may include one or more attachment parts and/or may carry a cargo, etc. such that the effective height of the vehicle considering the one or more attachment parts and/or the cargo, etc., may be greater than 2 m.
- bridge-related accidents may be very common, mostly as a result of drivers who do not pay attention to signs, low bridges, tunnels, electricity wires of a tram or of a power grid, etc. Such accidents may occur mostly with trucks; however, they may also occur with cars carrying a load on top of their roofs. They may be caused by wrong and/or new markings due to paving or other reconstruction work, or simply by a lack of attention.
- Commonly used solutions may include a traffic sign detection system that may detect the height of a bridge. However, this may not be enough protection, since not all traffic signs may be captured correctly. Further, there may be many situations where no traffic sign is placed before a bridge at all, so that the height of the bridge cannot be detected via traffic sign detection.
- an over-height vehicle detection system may include several stationary installations that provide detection and alert the drivers. These systems are stationary and very costly, and therefore may be placed only in strategic places rather than everywhere. In addition, such stationary systems may provide only visual and/or sound feedback to the driver, which may not ensure that the driver receives the feedback, since the alert is usually issued outside of the vehicle, while the inside of the vehicle may be a noisy and/or unfocused environment.
- Various aspects relate to efficiently preventing over-height accidents, i.e., a collision of a vehicle with an obstacle whose clearance height is lower than the actual maximum height of the vehicle.
- a non-stationary system is provided that may be implemented into an on-board system of the vehicle.
- the collision avoidance system described herein may be connected with other safety systems within the vehicle, e.g., the vehicle may be stopped automatically in a safe manner to avoid the collision (e.g. as part of an autonomous vehicle control).
- a system may prevent damage during driving, e.g., to prevent a collision of an obstacle (e.g., of an overpass, a tunnel, a wire, a traffic light, a traffic sign, etc.) with the vehicle, with an attachment part of the vehicle (e.g., a bicycle carrier on the roof of the vehicle, a roof box, etc.), and/or with a cargo of the vehicle.
- a collision avoidance function may be implemented via one or more on-board components of the vehicle, such as a front camera, a processing unit, etc., in order to detect obstacles that may be higher than the defined height of the vehicle.
- a depth camera (or any other range image device) may be used, for example, aligned in forward driving direction to detect during driving when an obstacle may come too close and would cause a collision with the vehicle due to the height of the vehicle.
- at least one depth camera (or any other range image device) may be used, for example, that is aligned in rear driving direction to avoid a collision in the case that an obstacle approaches from this direction.
- one or more sensors and a computing system may be used to implement the collision avoidance functions described herein.
- the computing system may include, for example, one or more processors, one or more memories, etc.
- the computing system may be communicatively coupled to the one or more sensors of the vehicle to obtain and analyze sensor data generated by the one or more sensors.
- the one or more processors may be configured to generate depth images in real-time from the data received from one or more range imaging sensors.
- a vehicle may include an external structure and one or more assemblies attached to the external structure that may define a maximum height of the vehicle.
- a motor vehicle is illustrated and described exemplarily as the vehicle, wherein two side mirror assemblies of the motor vehicle may be illustrated and described exemplarily as the at least one assembly of the vehicle.
- other types of vehicles may be provided including the same or similar structures and functions as described exemplarily for the motor vehicle.
- the vehicle may include any other type of assembly that is configured in the same or similar way as described exemplarily for the respective side mirror assembly.
- FIG. 1 illustrates a vehicle 100 , e.g., a motor vehicle, in a schematic view, according to various aspects.
- vehicle 100 may be a truck, a lorry, a van, a bus, a car, or any other vehicle driving on the ground.
- the vehicle 100 may define a longitudinal axis 113 .
- the longitudinal axis 113 may be associated with a forward driving direction (e.g., illustratively in direction 103 as illustrated in FIG. 1 ) and/or a rear driving direction (e.g., illustratively opposite to the direction 103 ).
- the vehicle 100 may define a lateral axis 111 perpendicular to the longitudinal axis 113 .
- a height (e.g., a maximum height) of the vehicle 100 may be determined perpendicular to both the lateral axis 111 and the longitudinal axis 113 (e.g., illustratively perpendicular to the directions 101 , 103 illustrated in FIG. 1 ).
- the vehicle 100 may include one or more sensors 110 , in some aspects one or more image sensors (e.g. one or more cameras). Further, the vehicle 100 may include one or more processors 120 . The one or more processors 120 may be part of a computing system, e.g. of a head unit or a central computer of the vehicle 100 .
- the one or more sensors 110 may be configured to provide sensor image data 112 d to the one or more processors 120 .
- the sensor image data 112 d may represent an image (also referred to as sensor image or camera image) 112 i of a vicinity 130 of the vehicle 100 .
- the sensor image 112 i may correspond to a field of vision 130 v (also referred to as field of view) of the one or more sensors 110 .
- the one or more sensors 110 may be configured such that the field of vision 130 v has a lateral dimension (e.g. in a horizontal plane parallel to the lateral axis 111 and the longitudinal axis 113 ) and a vertical dimension (e.g. perpendicular to that horizontal plane).
- the one or more sensors 110 may be able to detect height information of one or more objects in the vicinity 130 of the vehicle 100 .
- the one or more sensors 110 may include, for example, one or more cameras (e.g., one or more depth cameras, one or more stereo cameras, etc.), one or more ultrasonic sensors, one or more radar (radio detection and ranging) sensors, one or more lidar (light detection and ranging) sensors, etc.
- the one or more sensors 110 may include, for example, any other suitable sensor that allows a detection of an object and corresponding height information associated with the object.
- the one or more processors 120 may be configured to determine one or more obstacles 132 from the sensor image data 112 d (or in other words from the sensor image 112 i ).
- the one or more obstacles 132 (in the real world) may correspond to one or more image objects 114 of the sensor image 112 i , see for example FIG. 2A .
- the obstacle detection may be based on the sensor image data 112 d .
- additional data from the one or more sensors 110 may be used to enhance the obstacle detection.
- one or more image objects may be correlated with the one or more obstacles 132 using data merging from at least two different types of sensors, e.g. from at least one image sensor and at least one range sensor, etc.
- the one or more processors 120 may be configured to determine a distance from ground for each of the one or more obstacles 132 based on its corresponding image object in the sensor image 112 i.
- the one or more processors 120 may be configured to trigger a safety operation when the distance from ground is equal to or less than a safety height associated with the vehicle.
- a monitoring area 250 located ahead of the vehicle 100 may be monitored to detect obstacles that are disposed above a predefined height level 255 , e.g. having a distance to ground of more than about two meters but less than five meters, which could be a possible threat for collision depending on the maximum height of the vehicle 100 (see FIG. 2A ).
- the safety height may be greater than a maximum height of the vehicle 100 and/or of a cargo 140 of the vehicle 100 .
- the one or more sensors 110 and the one or more processors 120 described herein may be a collision avoidance system or may be part of a collision avoidance system.
- FIG. 2A illustrates the vehicle 100 in a schematic side view, according to various aspects.
- the direction 105 may be aligned perpendicular to the directions 101 , 103 illustrated in FIG. 1 .
- FIG. 2B illustrates a more detailed view of the corresponding sensor image 112 i .
- a corresponding distance from ground 235 may be determined for an obstacle 132 , e.g. for an overpass (as for example a bridge), a traffic light, a traffic sign, a tunnel, a wire, etc.
- the distance from ground 235 may be determined in vertical direction (e.g. parallel to direction 105 illustrated in FIG. 2A ).
- the distance from ground 235 may be correlated with a clearance height associated with the obstacle 132 .
- a bridge or a tunnel with a clearance height of a value X may allow a vehicle 100 with a maximum height 115 of the value X or less than the value X to drive under the bridge or tunnel without a risk of a collision with the bridge due to the height.
- the clearance height may consider a minimal height over ground of the obstacle 132 relative to the driving lane 240 running under the obstacle 132 .
- the respective distance from ground 235 may be determined based on the sensor image data 112 d , or, in other words, based on the sensor image 112 i .
- various image processing methods may be used.
- auxiliary data from sensors other than the one or more image sensors may be used to correlate the image object 114 in the sensor image 112 i with the obstacles 132 in the real world, e.g. located in driving direction of the vehicle 100 . Therefore, an obstacle range 233 associated with a range from an obstacle 132 to the one or more sensors 110 may be determined to correlate a size of a corresponding image object, or a corresponding distance between two image objects (e.g. the image distance from ground), with the respective real-world dimension.
- the obstacle range 233 may be determined from the sensor image data 112 d or in other words from the sensor image 112 i using a calibrated optical imaging system, as illustrated exemplarily in FIG. 4 .
- the obstacle range 233 may be determined via one or more range sensors, e.g. based on radar, etc.
- alternatively, one or more range imaging sensors (e.g. a stereo camera, a time-of-flight camera, etc.) may be used, and the obstacle range 233 may be determined from the range information provided by the one or more range imaging sensors.
- the one or more sensors 110 may include or may be part of an optical imaging system.
- FIG. 3 illustrates a schematic view of an optical imaging system 310 and exemplary optical properties associated with the optical imaging system 310 .
- the optical imaging system 310 may include an image sensor 312 , e.g. a charge-coupled device (CCD) sensor, etc.
- the optical imaging system 310 may include at least one lens 314 having a focal length 314 f associated therewith. In the case that a plurality of lenses is used for imaging, the focal length 314 f may be an effective focal length associated with the plurality of lenses.
- a front focal point 314 p may be associated with the optical imaging system. In general, any ray that may pass through the front focal point 314 p may emerge from the at least one lens 314 parallel to a corresponding optical axis 314 a.
- the determination of the distance from ground 235 of the obstacle 132 may include determining the obstacle range 233 associated with a range of a respective obstacle 132 from the at least one lens 314 of the optical imaging system 310 . Based on the sensor image 112 i , an image distance from ground 225 may be determined for the image object 114 that corresponds to the obstacle 132 .
- the distance from ground 235 may be determined (e.g. estimated, calculated, etc.) based on the focal length 314 f , the image distance from ground 225 , and the obstacle range 233 .
- the optical properties of the at least one lens 314 may be estimated based on calculation, calibration (e.g. using test measurements), etc.
- the optical properties of the at least one lens 314 may be approximated via one or more lens equations and the distance from ground may be calculated using the at least one lens equation.
- a relationship between a focal length f, an object size W, an image size B, and an object distance Z may be as follows (similar triangles of the pinhole-camera model): W/Z = B/f, i.e., W = B·Z/f.
- the distance from ground 235 may be determined based on the determined obstacle range 233 , the determined image distance from ground 225 , and the focal length 314 f accordingly.
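- as a non-limiting illustration of this relation, the following Python sketch computes an estimated distance from ground from the focal length, the image distance from ground measured on the sensor, and the obstacle range; the function name, the units, and the numerical values are illustrative assumptions and are not taken from the patent.

```python
def distance_from_ground(focal_length_m: float,
                         image_distance_from_ground_m: float,
                         obstacle_range_m: float) -> float:
    """Pinhole-camera estimate of the real-world distance from ground.

    Uses the similar-triangle relation W / Z = B / f, i.e. W = B * Z / f,
    where B is the image distance from ground measured on the sensor,
    Z is the obstacle range and f is the focal length (all in meters).
    """
    if focal_length_m <= 0.0:
        raise ValueError("focal length must be positive")
    return image_distance_from_ground_m * obstacle_range_m / focal_length_m


# Example: 6 mm focal length, 0.9 mm measured on the sensor, obstacle 30 m ahead
# -> estimated distance from ground of about 4.5 m.
print(distance_from_ground(0.006, 0.0009, 30.0))
```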
- an on board camera of the vehicle 100 may be used to capture and detect obstacles in front of the vehicle 100 (or in a similar way in the back of the vehicle 100 ).
- various further inputs may be used (e.g. from maps, from a sign detection system, etc.) to increase confidence with identifying the one or more obstacles.
- determining a (e.g. estimated) bridge or tunnel clearance height, the current velocity of the vehicle 100 , the time to impact, etc. may enable the system to alert the driver.
- one or more other safety operations also referred to as preventive actions may be triggered and carried out.
- the safety operation that is triggered when a collision threat is predicted may include one or more of the following safety operations: stopping the vehicle, slowing the vehicle down, sending a signal to an external infrastructure, generating an acoustical alarm to the driver of the vehicle, generating an optical alarm to the driver of the vehicle, generating a vibration alarm to the driver of the vehicle, and/or reducing a maximum height of the vehicle.
- the collision threat may be predicted when the distance from ground 235 is equal to or less than a safety height associated with the vehicle 100 , as described exemplarily above. In an example, the collision threat may be predicted when the distance from ground 235 is equal to or less than the maximum height 115 of the vehicle 100 , as described exemplarily above.
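- a minimal sketch of this trigger condition is shown below; the 0.10 m default safety tolerance and the function name are illustrative assumptions (the description elsewhere mentions a tolerance of about 1 cm to about 20 cm).

```python
def is_collision_threat(distance_from_ground_m: float,
                        vehicle_max_height_m: float,
                        safety_tolerance_m: float = 0.10) -> bool:
    """Predict a collision threat when the obstacle's distance from ground
    is equal to or less than the safety height, i.e. the maximum vehicle
    height plus a safety tolerance."""
    safety_height_m = vehicle_max_height_m + safety_tolerance_m
    return distance_from_ground_m <= safety_height_m


# A 4.0 m high truck approaching a 3.9 m clearance is flagged as a threat,
# so a safety operation (alarm, braking, etc.) would be triggered.
print(is_collision_threat(3.9, 4.0))   # True
```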
- the safety operation may include reducing an inflation pressure of one or more tires 100 t of the vehicle 100 (see FIG. 2A ) to reduce the maximum height 115 of the vehicle.
- the vehicle 100 may include at least one inflation pressure control device configured to determine the inflation pressure of a respective tire of the vehicle 100 and to control (e.g. to open) a valve of the tire to reduce the inflation pressure to a predefined threshold.
- the one or more processors 120 may be further configured to determine the obstacle range 233 for each of the one or more obstacles 132 based on one or more calibrated camera lines superimposed onto the sensor image 112 i (e.g. onto the camera image from the one or more cameras used for obstacle detection).
- FIG. 4 illustrates a sensor image 112 i having two calibrated camera lines 414 superimposed (in other words overlaid) onto the sensor image 112 i . This may allow determining the obstacle range 233 from the sensor image 112 i itself.
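- the patent does not spell out how the calibrated camera lines are evaluated; one possible interpretation is sketched below, where each calibrated line maps an image row to a known range on the driving lane and the obstacle range is interpolated between the two enclosing lines. The calibration table, the interpolation scheme, and all numerical values are assumptions for illustration only.

```python
import bisect

# Hypothetical calibration: image row (pixels from the top of the sensor
# image) of each calibrated camera line and the known range (in meters)
# that this line corresponds to on the driving lane.
CALIBRATED_LINES = [(700, 5.0), (600, 10.0), (520, 20.0), (470, 40.0)]


def obstacle_range_from_row(image_row: float) -> float:
    """Estimate the obstacle range by linear interpolation between the two
    calibrated camera lines that enclose the given image row."""
    lines = sorted(CALIBRATED_LINES)              # ascending image row
    rows = [row for row, _ in lines]
    ranges = [rng for _, rng in lines]
    if image_row <= rows[0]:
        return ranges[0]                          # above the top line: farthest
    if image_row >= rows[-1]:
        return ranges[-1]                         # below the bottom line: closest
    i = bisect.bisect_left(rows, image_row)
    r0, r1 = rows[i - 1], rows[i]
    z0, z1 = ranges[i - 1], ranges[i]
    return z0 + (z1 - z0) * (image_row - r0) / (r1 - r0)


print(obstacle_range_from_row(560))   # about 15 m with the values above
```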
- FIG. 5 shows a range imaging sensor 510 in a schematic view, according to various aspects.
- the range imaging sensor 510 may be used to provide the sensor image data 112 d to the one or more processors 120 of the vehicle 100 , as described herein. Further, the range imaging sensor 510 may be configured to provide range information 506 to the one or more processors 120 . In other words, the range imaging sensor 510 may be configured to provide range image data to the one or more processors 120 , and the one or more processors 120 may be configured to determine the one or more obstacles 132 , the image distance from ground 225 , and the obstacle range 233 from the range image data.
- the range imaging sensor 510 may include at least two cameras 530 configured to generate at least two photographic images taken from different vantage points 530 a , 530 b to generate the sensor image 112 i having range information 506 associated therewith.
- the range information 506 may be associated with the one or more image objects 114 of the sensor image 112 i and therefore with the one or more obstacles 132 .
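- for such a stereo arrangement, the patent does not give a range formula; a minimal sketch using the standard stereophotogrammetry relation Z = f·b/d (focal length in pixels, baseline between the two vantage points, disparity of the same image object in the two images) is shown below. The function name and the numerical values are illustrative assumptions.

```python
def range_from_disparity(focal_length_px: float,
                         baseline_m: float,
                         disparity_px: float) -> float:
    """Standard stereo relation: range Z = f * b / d, with the focal length f
    in pixels, the baseline b between the two vantage points in meters, and
    the disparity d of the same image object in the two photographic images."""
    if disparity_px <= 0.0:
        raise ValueError("disparity must be positive for a finite range")
    return focal_length_px * baseline_m / disparity_px


# 1200 px focal length, 0.30 m baseline, 12 px disparity -> obstacle about 30 m ahead.
print(range_from_disparity(1200.0, 0.30, 12.0))
```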
- an infrared illumination device 532 may be used to at least partially illuminate a field of vision 530 v of the range imaging sensor 510 .
- the infrared illumination device 532 may be used to at least partially illuminate the field of vision 130 of the one or more sensors 110 described above, e.g. with reference to FIG. 1 .
- a line of vision 501 of the one or more image sensors or the one or more range imaging sensors 510 may be aligned in a forward driving direction and/or in a rear driving direction of the vehicle 100 .
- the vehicle 100 may include one or more cameras 110 configured to provide at least a camera image 112 i of a vicinity 130 of the vehicle 100 along a driving lane 240 , see for example FIG. 2A and FIG. 4 .
- the vehicle 100 may further include one or more processors 120 configured to determine, from the camera image 112 i , one or more obstacles 132 disposed over the driving lane 240 .
- the one or more obstacles 132 may not have a direct connection to the driving lane 240 so that the vehicle 100 may drive under the one or more obstacles 132 .
- the one or more processors 120 may be further configured to determine a clearance height for each of the one or more obstacles 132 based on one or more image objects 114 from the camera image 112 i associated with the one or more obstacles 132 .
- the determination of the clearance height may include to determine a minimal distance from ground of the one or more obstacles 132 with respect to the driving lane 240 .
- the safety height may be associated with the maximum height 115 of the vehicle 100 and may be greater than the maximum height 115 .
- the safety height may be a sum of the maximum height 115 of the vehicle 100 and a safety tolerance.
- the safety tolerance may be, for example, in the range from about 1 cm to about 20 cm.
- the one or more processors 120 may be configured to trigger any desired safety operation when the clearance height is equal to or less than the safety height.
- FIG. 6A and FIG. 6B illustrate a schematic flow diagram of a method 600 for collision avoidance, according to various aspects.
- a collision avoidance system may be used in a vehicle 100 , wherein the collision avoidance system is configured to carry out the method 600 .
- the method 600 may include, in 610 , detecting an obstacle.
- detecting an obstacle may include, for example, detecting a bridge, detecting a tunnel, or detecting wires running over a street.
- the obstacle may be detected via a camera of the vehicle 100 .
- the camera may be a front camera. Alternatively, the camera may be a back camera.
- the method 600 may include, in 620 , determining (e.g. estimating) the obstacle range, e.g. a substantially horizontal distance of the obstacle from the vehicle 100 (or from the front camera of the vehicle 100 ).
- the determination of the obstacle range may be based on one or more calibrated lines of the camera.
- the method 600 may include, in 630 , determining whether a clearance height associated with the obstacle is at least greater than the maximum height of the vehicle 100 .
- the clearance height may be calculated based on lens equations associated with imaging properties of the camera.
- a new frame (or a new cycle) may be started in 650 beginning with operation 610 .
- the method 600 may include, in 640 , checking whether the clearance height is greater than the safety height. When the clearance height is not greater than the safety height, a safety operation is triggered in 660 .
- the (current) maximum height of the vehicle 100 may be provided for the operation 640 . For this purpose, height data may be obtained from another sensor and/or algorithm. The height data may be, for example, stored in a memory and provided from the memory to the one or more processors 120 to carry out the operation 640 .
- the safety operation 660 that may be triggered may include: in 660 a , determining (e.g. calculating) an impact time based on the velocity of the vehicle 100 and the determined obstacle range; and, in 660 b , displaying the impact time to a driver of the vehicle 100 .
- the (e.g. current) velocity of the vehicle 100 may be provided for the operation 660 a . For this purpose, velocity data may be obtained from another sensor and/or algorithm.
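- a minimal sketch of the impact-time estimate of operation 660 a is given below; it assumes constant speed straight towards the obstacle, and the function name and numerical values are illustrative only.

```python
def impact_time_s(obstacle_range_m: float, vehicle_speed_mps: float) -> float:
    """Estimated time to impact (operation 660 a), assuming the vehicle keeps
    a constant speed straight towards the obstacle."""
    if vehicle_speed_mps <= 0.0:
        return float("inf")       # standing still (or reversing away): no impact
    return obstacle_range_m / vehicle_speed_mps


# 60 m to a low bridge at 72 km/h (20 m/s) leaves about 3 s to react.
print(impact_time_s(60.0, 20.0))   # 3.0
```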
- the operation for the respective frame may end and a new frame (or a new cycle) may be started in 690 beginning with operation 610 .
- FIG. 7 illustrates a schematic flow diagram of a method 700 for operating a vehicle, e.g. method for avoiding a collision of the vehicle with one or more obstacles, according to various aspects.
- the method 700 may include: in 710 , generating a sensor image of a vicinity of a vehicle; in 720 , determining the one or more obstacles from the sensor image, the one or more obstacles corresponding to one or more image objects of the sensor image, in 730 , determining a distance from ground for each of the one or more obstacles based on a corresponding image object of the one or more image objects, and, in 740 , triggering a safety operation when the distance from ground is equal to or less than a safety height associated with the vehicle.
- the method 700 may include similar or the same functions as described herein with respect to the vehicle 100 and/or the collision avoidance system.
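- a compact sketch of one cycle of method 700 is given below; `sensor` and `detector` are hypothetical interfaces standing in for the on-board camera and the image-processing stage, and the 0.10 m safety tolerance is an assumed value.

```python
def avoid_collision_step(sensor, detector, vehicle_max_height_m: float,
                         safety_tolerance_m: float = 0.10):
    """One cycle of method 700: generate a sensor image (710), determine the
    obstacles (720), determine their distance from ground (730), and trigger
    a safety operation when needed (740)."""
    sensor_image = sensor.capture()                              # 710
    obstacles = detector.detect_obstacles(sensor_image)          # 720
    safety_height_m = vehicle_max_height_m + safety_tolerance_m
    for obstacle in obstacles:
        clearance_m = detector.distance_from_ground(obstacle)    # 730
        if clearance_m <= safety_height_m:                       # 740
            return "trigger_safety_operation", obstacle
    return "no_threat", None
```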
- FIG. 8 illustrates a schematic flow diagram of a process 800 , according to various aspects.
- the process 800 may include, in 810 , checking whether a safety operation is triggered or not. In the case that a safety operation is triggered the clearance height may be less than the safety height, e.g. less than the maximum height of the vehicle 100 .
- the process 800 may further include, in 820 , checking whether the clearance height increases beyond the safety height, e.g. beyond the maximum height of the vehicle 100 , or not.
- the process 800 may include, in 830 , suspending or canceling the safety operation.
- the process 800 may include, in 840 , continuing the safety operation.
- operation 820 of the process 800 may be carried out again.
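- the continue/suspend logic of process 800 can be sketched as follows; the function name and the return convention are illustrative assumptions.

```python
def update_safety_operation(active: bool, clearance_m: float,
                            safety_height_m: float) -> bool:
    """Process 800: while a safety operation is active (810), keep it running
    as long as the clearance stays at or below the safety height (840), and
    suspend or cancel it once the clearance increases beyond the safety
    height (820/830). Returns the new 'active' state."""
    if not active:
        return False
    return clearance_m <= safety_height_m


# The operation continues at 3.9 m clearance (below the 4.1 m safety height)
# and is suspended or cancelled once the clearance increases to 4.3 m.
print(update_safety_operation(True, 3.9, 4.1))   # True  -> continue (840)
print(update_safety_operation(True, 4.3, 4.1))   # False -> suspend/cancel (830)
```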
- an obstacle may include any solid object that could harm the vehicle 100 in the case of a collision.
- the obstacle may be, for example, any solid object disposed vertically distanced from ground, e.g. from the street on which the vehicle 100 may drive.
- Example 1 is a vehicle 100 , including: one or more image sensors 110 configured to provide sensor image data 112 d representing a sensor image 112 i of a vicinity 130 of the vehicle 100 ; one or more processors 120 configured to determine one or more obstacles 132 from the sensor image data 112 d , the one or more obstacles 132 corresponding to one or more image objects 114 of the sensor image 112 i , determine a distance from ground 235 for each of the one or more obstacles 132 based on its corresponding image object 114 , and trigger a safety operation when the distance from ground 235 is equal to or less than a safety height associated with the vehicle 100 .
- the vehicle 100 of example 1 may further include that the safety height is greater than a maximum height 115 of at least one of the vehicle 100 or a cargo 140 of the vehicle 100 .
- the vehicle 100 of example 2 may further include that the vicinity of the vehicle includes a monitoring area 250 located ahead of the vehicle and disposed above a predefined height level 255 .
- the predefined height level 255 may be at least two meters above ground.
- the vehicle 100 of any one of examples 1 to 3 may further include that the one or more image sensors 110 are part of an optical imaging system 310 .
- the optical imaging system 310 may include at least one lens 314 and a focal length 314 f associated therewith.
- the vehicle 100 of example 4 may further include that the determination of the distance from ground includes: determining an obstacle range 233 associated with a range of a respective obstacle 132 from the at least one lens 314 ; determining, based on the sensor image 112 i , an image distance from ground 225 for the corresponding image object 114 ; and determining the distance from ground 235 based on the focal length 314 f , the image distance from ground 225 , and the obstacle range 233 .
- the vehicle 100 of example 5 may further include that determining the distance from ground 235 is based on a calculation using at least one lens equation.
- the vehicle 100 of example 5 or 6 may further include that determining the obstacle range 233 includes superimposing (also referred to as overlaying) one or more calibrated lines 414 onto the sensor image 112 i.
- the vehicle 100 of example 5 or 6 may further include: one or more range sensors configured to receive obstacle range information associated with the obstacle range 233 , wherein the determination of the obstacle range 233 is based on the obstacle range information.
- the vehicle 100 of example 5 or 6 may further include that the one or more image sensors 110 include one or more range imaging sensors 510 to generate the sensor image 112 i and obstacle range information associated with the one or more image objects 114 of the sensor image 112 i , wherein the determination of the obstacle range 233 is based on the obstacle range information.
- the vehicle 100 of example 9 may further include that the one or more range imaging sensors 510 include at least two cameras 530 configured to generate at least two photographic images taken from different vantage points 530 a , 530 b to generate the sensor image 112 i.
- the vehicle 100 of any one of examples 1 to 10 may further include: an infrared illumination device 532 configured to at least partially illuminate a field of vision 130 v of the one or more image sensors.
- the vehicle 100 of any one of examples 1 to 11 may further include that a line of vision 501 of the one or more image sensors 110 , 510 is aligned in a forward driving direction and/or in a rear driving direction.
- the vehicle 100 of any one of examples 1 to 12 may further include that the one or more processors 120 are further configured to trigger the safety operation only when the distance from ground 235 is equal to or less than the safety height for a time period greater than or equal to a predetermined time period.
- the vehicle 100 of example 13 may further include that the predetermined time period is greater than about 30 milliseconds.
- the predetermined time period may be in the range from about 30 milliseconds to about 1000 milliseconds.
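- a minimal sketch of such a time-gated trigger is shown below; the class name and the use of a monotonic clock are implementation assumptions, with the default hold time taken from the roughly 30 ms to 1000 ms range mentioned in examples 13 and 14.

```python
import time


class DebouncedTrigger:
    """Report a trigger only when the threat condition has been continuously
    true for at least `hold_time_s`; a brief single-frame detection is ignored."""

    def __init__(self, hold_time_s: float = 0.03):
        self.hold_time_s = hold_time_s
        self._since = None            # time at which the condition became true

    def update(self, threat: bool, now=None) -> bool:
        now = time.monotonic() if now is None else now
        if not threat:
            self._since = None        # condition dropped: reset the timer
            return False
        if self._since is None:
            self._since = now
        return (now - self._since) >= self.hold_time_s


trigger = DebouncedTrigger(hold_time_s=0.03)
print(trigger.update(True, now=0.000))   # False: condition just became true
print(trigger.update(True, now=0.050))   # True: held for 50 ms >= 30 ms
```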
- the vehicle 100 of any one of examples 1 to 14 may further include that, after the safety operation is triggered, the one or more processors 120 are further configured to continue the safety operation during a time period in which the distance from ground 235 is equal to or less than the safety height and to suspend or cancel the safety operation in the case that the distance from ground 235 increases beyond the safety height.
- the vehicle 100 of any one of examples 1 to 15 may further include that the safety operation includes at least one of the following safety operations: stopping the vehicle, slowing the vehicle down, sending a signal to an external infrastructure, generating an acoustical alarm to the driver of the vehicle, generating an optical alarm to the driver of the vehicle, generating a vibration alarm to the driver of the vehicle, and/or reducing a maximum height of the vehicle.
- the vehicle 100 of any one of examples 1 to 16 may further include that the safety operation includes reducing an inflation pressure of one or more tires 100 t of the vehicle 100 to reduce a maximum height 115 of the vehicle 100 .
- the vehicle 100 of any one of examples 1 to 17 may further include that determining the one or more obstacles includes determining an overpass (as an obstacle) extending over a path of travel of the vehicle (e.g. over the street or the driving lane on which the vehicle 100 may drive), wherein the distance from ground 235 is associated with a clearance height of the overpass.
- the vehicle 100 of example 18 may further include that the one or more processors 120 are further configured to determine the clearance height of the overpass over the entire width of a driving lane 240 .
- the vehicle 100 of example 19 may further include that the one or more processors 120 are further configured to determine a minimal clearance height of the overpass with respect to the width of the driving lane 240 , wherein the distance from ground 235 is associated with the minimal clearance height.
- Example 21 is a method for avoiding a collision of a vehicle with one or more obstacles, the method including: generating a sensor image of a vicinity of a vehicle; determining the one or more obstacles from the sensor image, the one or more obstacles corresponding to one or more image objects of the sensor image, determining a distance from ground for each of the one or more obstacles based on a corresponding image object of the one or more image objects, and triggering a safety operation when the distance from ground is equal to or less than a safety height associated with the vehicle.
- the method of example 21 may further include that the safety height is greater than a maximum height of the vehicle or of a load carried by the vehicle.
- the method of example 22 may further include that the sensor image displays a region ahead of the vehicle above a height of at least 2 m.
- the method of any one of examples 21 to 23 may further include that the sensor image is generated via an optical imaging system, the optical imaging system including at least one lens and a focal length associated therewith.
- the method of example 24 may further include that determining the distance from ground for a respective obstacle of the one or more obstacles includes: determining an obstacle range associated with a distance of the respective obstacle from the at least one lens; determining, for the corresponding image object of the sensor image, an image distance from ground associated with the distance from ground; and determining the distance from ground based on the focal length, the image distance from ground, and the obstacle range.
- the method of example 25 may further include that determining the distance from ground is based on a calculation using at least one lens equation.
- the method of example 25 or 26 may further include that determining the obstacle range includes overlaying one or more calibrated lines onto the sensor image.
- the method of example 26 or 27 may further include: generating range information associated with the obstacle range, wherein the determination of the obstacle range is based on the range information.
- the method of any one of examples 21 to 28 may further include that the sensor image is generated via one or more range imaging sensors.
- the method of example 29 may further include that the one or more range imaging sensors include at least two cameras configured to generate at least two photographic images taken from different vantage points to generate the sensor image having range information associated with the one or more image objects of the sensor image.
- the method of any one of examples 21 to 30 may further include: at least partially illuminating a field of vision of the one or more image sensors with infrared light.
- the method of any one of examples 21 to 31 may further include that a line of vision of the one or more image sensors is aligned in a forward driving direction and/or in a rear driving direction.
- the method of any one of examples 21 to 32 may further include that the safety operation is only triggered when the distance from ground is equal to or less than the safety height for a time period greater than or equal to a predetermined time period.
- the method of example 33 may further include that the predetermined time period is greater than about 30 milliseconds.
- the predetermined time period may be in the range from about 30 milliseconds to about 1000 milliseconds.
- the method of any one of examples 21 to 34 may further include: after triggering the safety operation, continuing the safety operation during a time period in which the distance from ground is equal to or less than the safety height; and suspending or canceling the safety operation when the distance from ground increases beyond the safety height.
- the method of any one of examples 21 to 35 may further include that the safety operation includes at least one of: stopping the vehicle, reducing speed of the vehicle, sending a signal to an external infrastructure, generating an acoustical alarm to the driver of the vehicle, generating an optical alarm to the driver of the vehicle, generating a vibration alarm to the driver of the vehicle, and/or reducing a maximum height of the vehicle.
- the method of any one of examples 21 to 36 may further include that the safety operation includes reducing the inflation pressure of one or more tires of the vehicle to reduce a maximum height of the vehicle.
- the method of any one of examples 21 to 37 may further include that determining the one or more obstacles includes determining an overpass extending over a path of travel of the vehicle, wherein the distance from ground is associated with the clearance height of the overpass.
- the method of example 38 may further include that the clearance height is determined over the entire width of a lane on which the vehicle is driving.
- the method of example 39 may further include: determining a minimal clearance height of the overpass with respect to the width of the lane, wherein the distance from ground is associated with the minimal clearance height.
- Example 41 is a vehicle 100 , including: one or more cameras 110 , 510 configured to provide a camera image 112 i of a vicinity 130 of the vehicle 100 along a driving lane 240 ; one or more processors 120 configured to determine, from the camera image 112 i , one or more obstacles 132 disposed over the driving lane 240 ; determine a clearance height for each of the one or more obstacles 132 based on one or more image objects 114 from the camera image 112 i associated with the one or more obstacles 132 , and trigger a safety operation when the clearance height is equal to or less than a safety height associated with the vehicle.
- the vehicle 100 of example 41 may further include that the one or more processors 120 are further configured to determine an impact time associated with an impact of the one or more obstacles 132 into the vehicle 100 when the clearance height is equal to or less than the safety height.
- the vehicle 100 of example 42 may further include that the safety operation includes displaying the impact time to a driver of the vehicle 100 .
- the vehicle 100 of any one of examples 41 to 43 may further include that the one or more processors 120 are further configured to determine an obstacle range 233 for each of the one or more obstacles 132 based on one or more calibrated camera lines 414 superimposed onto the camera image 112 i , wherein the obstacle range 233 is associated with a range of the respective obstacle (of the one or more obstacles 132 ) from the one or more cameras 110 , 510 .
- the vehicle 100 of example 44 may further include that determining the clearance height includes: calculating the clearance height from the camera image 112 i based on one or more lens equations and based on the obstacle range 233 (an illustrative pinhole-model sketch follows this list of examples).
- the vehicle 100 of any one of examples 41 to 45 may further include that the one or more processors 120 are further configured to receive height data corresponding to a maximum height 115 of the vehicle 100 , and to adapt the safety height based on the height data.
- Example 47 is a non-transient computer readable medium configured to cause one or more processors to perform a method for avoiding a collision of a vehicle with one or more obstacles, the method including: generating a sensor image of a vicinity of a vehicle; determining the one or more obstacles from the sensor image, the one or more obstacles corresponding to one or more image objects of the sensor image, determining a distance from ground for each of the one or more obstacles based on a corresponding image object of the one or more image objects, and triggering a safety operation when the distance from ground is equal to or less than a safety height associated with the vehicle.
- Example 48 is a non-transient computer readable medium configured to cause one or more processors to perform the method of any one of examples 21 to 40.
- Example 49 is a collision avoidance system, including: one or more image sensors 110 configured to provide sensor image data 112 d representing a sensor image 112 i of a vicinity 130 of a vehicle 100 ; one or more processors 120 configured to determine one or more obstacles 132 from the sensor image data 112 d , the one or more obstacles 132 corresponding to one or more image objects 114 of the sensor image 112 i , determine a distance from ground 235 for each of the one or more obstacles 132 based on its corresponding image object 114 , and trigger a safety operation when the distance from ground 235 is equal to or less than a safety height associated with the vehicle 100 .
- the collision avoidance system of example 49 may be configured in a similar way as described in any one of examples 2 to 20 with reference to the vehicle 100 .
- Example 50 is a collision avoidance system, including: one or more cameras 110 , 510 configured to provide a camera image 112 i of a vicinity 130 of a vehicle 100 along a driving lane 240 ; one or more processors 120 configured to determine, from the camera image 112 i , one or more obstacles 132 disposed over the driving lane 240 ; determine a clearance height for each of the one or more obstacles 132 based on one or more image objects 114 from the camera image 112 i associated with the one or more obstacles 132 , and trigger a safety operation when the clearance height is equal to or less than a safety height associated with the vehicle.
- the collision avoidance system of example 50 may be configured in a similar way as described in any one of examples 42 to 46 with reference to the vehicle 100 .
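The following Python sketches are illustrative only and use assumed parameter values throughout; they are not taken from the examples above. The first sketch shows the standard way two cameras with different vantage points (example 30) can yield range information: rectified stereo disparity is converted to range with the pinhole relation range = focal_length × baseline / disparity.

```python
def disparity_to_range(focal_length_px: float, baseline_m: float, disparity_px: float) -> float:
    """Pinhole-stereo relation: range = f * B / d (rectified cameras assumed)."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite range")
    return focal_length_px * baseline_m / disparity_px

# Hypothetical values: 1280 px focal length, 0.30 m baseline, 24 px disparity
print(disparity_to_range(1280.0, 0.30, 24.0))  # -> 16.0 m to the image object
```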
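Examples 33 to 35 describe timing behavior around the trigger: the safety operation fires only after the distance from ground has stayed at or below the safety height for a predetermined time, keeps running while the condition persists, and is suspended or canceled once the clearance recovers. A minimal sketch of that debounce-and-release logic, assuming a periodic per-frame update and a hypothetical 100 ms hold time:

```python
class SafetyTrigger:
    """Debounced trigger: activate only after the low-clearance condition has
    persisted for hold_time_s; clear it as soon as the clearance recovers."""

    def __init__(self, safety_height_m: float, hold_time_s: float = 0.1):
        self.safety_height_m = safety_height_m  # assumed: maximum vehicle height plus margin
        self.hold_time_s = hold_time_s          # e.g. 30 ms .. 1000 ms (examples 33 and 34)
        self._low_since = None                  # time at which the clearance first dropped
        self.active = False                     # safety operation currently running?

    def update(self, distance_from_ground_m: float, now_s: float) -> bool:
        if distance_from_ground_m <= self.safety_height_m:
            if self._low_since is None:
                self._low_since = now_s
            if not self.active and now_s - self._low_since >= self.hold_time_s:
                self.active = True              # trigger: e.g. warn the driver, reduce speed
        else:
            self._low_since = None
            self.active = False                 # suspend/cancel once the clearance opens up
        return self.active

trigger = SafetyTrigger(safety_height_m=4.0, hold_time_s=0.1)
for t, clearance in [(0.0, 4.5), (0.05, 3.8), (0.1, 3.8), (0.2, 3.8), (0.3, 4.6)]:
    print(t, trigger.update(clearance, t))  # False, False, False, True, False
```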
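For an overpass (examples 38 to 40), the relevant distance from ground is the clearance height evaluated over the entire width of the driving lane, with the minimal value governing the comparison against the safety height. A sketch of that reduction step, with made-up clearance samples:

```python
# Hypothetical clearance estimates (in meters) sampled across the lane width
# under a detected overpass, e.g. one estimate per image column inside the lane.
clearance_samples_m = [4.35, 4.31, 4.28, 4.25, 4.27, 4.33]

min_clearance_m = min(clearance_samples_m)  # minimal clearance height over the lane width
safety_height_m = 4.30                      # assumed: maximum vehicle height plus margin

if min_clearance_m <= safety_height_m:
    print(f"Trigger safety operation: minimal clearance {min_clearance_m:.2f} m "
          f"is at or below the safety height {safety_height_m:.2f} m")
```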
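Examples 42 and 43 add an impact time that can be shown to the driver once a too-low clearance has been detected. Under the simplifying assumption of a constant closing speed along the lane, the estimate reduces to range divided by speed; the numbers below are hypothetical.

```python
def impact_time_s(obstacle_range_m: float, vehicle_speed_mps: float) -> float:
    """Time until the obstacle is reached, assuming constant speed along the lane."""
    if vehicle_speed_mps <= 0:
        return float("inf")  # standing still or reversing: no impact expected
    return obstacle_range_m / vehicle_speed_mps

# Hypothetical: low overpass 50 m ahead, vehicle travelling at 14 m/s (about 50 km/h)
print(round(impact_time_s(50.0, 14.0), 1))  # -> 3.6 seconds until the overpass is reached
```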
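Examples 44 and 45 obtain the clearance height from the obstacle range (read off calibrated camera lines superimposed on the image) combined with simple lens geometry. A sketch of one such calculation, assuming a level, forward-facing camera and the pinhole relation height_above_camera = range × pixel_offset / focal_length; the helper name and all values are illustrative assumptions, not the patent's calibration procedure.

```python
def clearance_height_m(obstacle_range_m: float,
                       pixel_offset_above_horizon_px: float,
                       focal_length_px: float,
                       camera_mount_height_m: float) -> float:
    """Pinhole estimate of an overhead obstacle's height above the road surface.

    obstacle_range_m              : range to the obstacle, e.g. from calibrated camera lines
    pixel_offset_above_horizon_px : vertical pixel offset of the obstacle's lower edge
                                    above the optical horizon in the image
    focal_length_px               : camera focal length expressed in pixels
    camera_mount_height_m         : mounting height of the camera above the road
    """
    height_above_camera_m = obstacle_range_m * pixel_offset_above_horizon_px / focal_length_px
    return camera_mount_height_m + height_above_camera_m

# Hypothetical values: overpass edge 40 m ahead, 80 px above the horizon,
# 1200 px focal length, camera mounted 1.5 m above the road.
print(round(clearance_height_m(40.0, 80.0, 1200.0, 1.5), 2))  # -> 4.17 m clearance
```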
Landscapes
- Engineering & Computer Science (AREA)
- Automation & Control Theory (AREA)
- Mechanical Engineering (AREA)
- Transportation (AREA)
- Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Human Computer Interaction (AREA)
- Data Mining & Analysis (AREA)
- Geometry (AREA)
- Multimedia (AREA)
- Artificial Intelligence (AREA)
- Bioinformatics & Computational Biology (AREA)
- Evolutionary Biology (AREA)
- Evolutionary Computation (AREA)
- General Engineering & Computer Science (AREA)
- Bioinformatics & Cheminformatics (AREA)
- Life Sciences & Earth Sciences (AREA)
- Traffic Control Systems (AREA)
- Control Of Driving Devices And Active Controlling Of Vehicle (AREA)
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/CN2018/081486 WO2019183981A1 (fr) | 2018-03-31 | 2018-03-31 | Vehicle and method for avoiding a collision of a vehicle with one or more obstacles |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2018/081486 A-371-Of-International WO2019183981A1 (fr) | 2018-03-31 | 2018-03-31 | Vehicle and method for avoiding a collision of a vehicle with one or more obstacles |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/752,914 Continuation US20220358768A1 (en) | 2018-03-31 | 2022-05-25 | Vehicle and method for avoiding a collision of a vehicle with one or more obstacles |
Publications (2)
Publication Number | Publication Date |
---|---|
US20210224559A1 (en) | 2021-07-22 |
US11373415B2 (en) | 2022-06-28 |
Family
ID=68062106
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/955,077 Active US11373415B2 (en) | 2018-03-31 | 2018-03-31 | Vehicle and method for avoiding a collision of a vehicle with one or more obstacles |
US17/752,914 Abandoned US20220358768A1 (en) | 2018-03-31 | 2022-05-25 | Vehicle and method for avoiding a collision of a vehicle with one or more obstacles |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/752,914 Abandoned US20220358768A1 (en) | 2018-03-31 | 2022-05-25 | Vehicle and method for avoiding a collision of a vehicle with one or more obstacles |
Country Status (3)
Country | Link |
---|---|
US (2) | US11373415B2 (fr) |
EP (1) | EP3774473A4 (fr) |
WO (1) | WO2019183981A1 (fr) |
Families Citing this family (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102018216790A1 (de) * | 2018-09-28 | 2020-04-02 | Robert Bosch Gmbh | Method for evaluating an effect of an object in the surroundings of a means of transport on a driving maneuver of the means of transport |
CN113002548B (zh) * | 2019-12-19 | 2022-10-28 | 华为技术有限公司 | Height determination method and apparatus |
US20220146632A1 (en) * | 2020-11-06 | 2022-05-12 | Argo AI, LLC | System and method for operating a retractable sensor of an autonomous vehicle |
CN113433537A (zh) * | 2021-06-24 | 2021-09-24 | 东风汽车集团股份有限公司 | Image-based distance-measuring reversing radar and distance measurement method |
KR102566034B1 (ko) * | 2021-08-25 | 2023-08-11 | 주식회사 에어밴 | Roll trailer lifting system and control method therefor |
CN113994391B (zh) * | 2021-09-23 | 2023-06-09 | 深圳市锐明技术股份有限公司 | Vehicle passage reminding method and apparatus, and vehicle-mounted terminal |
US11597383B1 (en) * | 2021-11-04 | 2023-03-07 | STEER-Tech, LLC | Methods and systems for parking a vehicle |
JP2023179141A (ja) * | 2022-06-07 | 2023-12-19 | キヤノン株式会社 | Image processing system, image processing method, and computer program |
CN115092186B (zh) * | 2022-07-30 | 2024-07-23 | 重庆长安汽车股份有限公司 | Vehicle automated driving method and apparatus, electronic device, and storage medium |
CN117864133A (zh) * | 2023-03-17 | 2024-04-12 | 成都鑫动源乐信息技术有限公司 | Big-data-based safety control system |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9477894B1 (en) * | 2015-09-04 | 2016-10-25 | Ford Global Technologies, Llc | System and method for measuring object height for overhead clearance detection |
- 2018
- 2018-03-31 EP EP18911949.8A patent/EP3774473A4/fr not_active Withdrawn
- 2018-03-31 US US16/955,077 patent/US11373415B2/en active Active
- 2018-03-31 WO PCT/CN2018/081486 patent/WO2019183981A1/fr unknown
- 2022
- 2022-05-25 US US17/752,914 patent/US20220358768A1/en not_active Abandoned
Patent Citations (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20050080530A1 (en) * | 2003-08-16 | 2005-04-14 | Daimlerchrysler Ag | Motor vehicle with a pre-safe-system |
JP2006050451A (ja) * | 2004-08-06 | 2006-02-16 | Sumitomo Electric Ind Ltd | Obstacle warning system and image processing device |
US20060129292A1 (en) * | 2004-12-10 | 2006-06-15 | Hitomi Ohkubo | Vehicle driving support system |
US20080036576A1 (en) * | 2006-05-31 | 2008-02-14 | Mobileye Technologies Ltd. | Fusion of far infrared and visible images in enhanced obstacle detection in automotive applications |
CN101380927A (zh) | 2008-10-17 | 2009-03-11 | 奇瑞汽车股份有限公司 | Infrared automatic sensing device for preventing automobile-pedestrian collisions |
US20120081218A1 (en) * | 2009-06-23 | 2012-04-05 | Frank Nugent | Overhead obstacle avoidance system |
CN102490673A (zh) | 2011-12-13 | 2012-06-13 | 中科院微电子研究所昆山分所 | Internet-of-Vehicles-based automobile active safety control system and control method thereof |
US20130222592A1 (en) * | 2012-02-24 | 2013-08-29 | Magna Electronics Inc. | Vehicle top clearance alert system |
CN103253265A (zh) | 2013-05-27 | 2013-08-21 | 奇瑞汽车股份有限公司 | Active collision avoidance system and control method thereof |
DE102013209873A1 (de) * | 2013-05-28 | 2014-12-18 | Robert Bosch Gmbh | Device and method for collision avoidance for vehicle loads and superstructures |
US20160104379A1 (en) * | 2013-07-03 | 2016-04-14 | Deere & Company | Anti-collision system for an agricultural vehicle with automatic detection of the dimensions of a load |
DE202013103033U1 (de) | 2013-07-09 | 2013-07-18 | Georg Wöhrl | Pneumatic tire wheel with a system for adjusting the air pressure |
DE102015100719A1 (de) | 2015-01-20 | 2016-07-21 | Valeo Schalter Und Sensoren Gmbh | Method for operating a driver assistance system of a motor vehicle, driver assistance system, and motor vehicle |
EP3247606A1 (fr) | 2015-01-20 | 2017-11-29 | Valeo Schalter Und Sensoren Gmbh | Method for operating a driver assistance system of a motor vehicle, driver assistance system, and motor vehicle |
US20170039850A1 (en) * | 2015-08-06 | 2017-02-09 | Safer Technology Solutions LLC | Early warning intersection device |
EP3138707A1 (fr) | 2015-09-03 | 2017-03-08 | Lg Electronics Inc. | Driver assistance apparatus for a vehicle and vehicle comprising the same |
US10675928B2 (en) * | 2016-07-18 | 2020-06-09 | Pi System Automation | Method of measuring the inflation pressure of tires on a vehicle moving along a traffic route |
US20180162400A1 (en) * | 2016-12-08 | 2018-06-14 | Hassa M. Abdar | Controlling a motor vehicle based upon wind |
US10229596B1 (en) * | 2017-10-05 | 2019-03-12 | Analog Devices, Inc. | Systems and methods for measuring a bridge clearance |
Non-Patent Citations (2)
Title |
---|
International search report issued for the corresponding PCT application No. PCT/CN2018/081486, dated Jan. 14, 2019, 5 pages (for informational purposes only). |
Supplementary European Search Report issued for the corresponding Application No. EP 18 91 1949, dated Sep. 17, 2021, 1 page (for informational purposes only). |
Also Published As
Publication number | Publication date |
---|---|
WO2019183981A1 (fr) | 2019-10-03 |
EP3774473A4 (fr) | 2021-10-27 |
EP3774473A1 (fr) | 2021-02-17 |
US20220358768A1 (en) | 2022-11-10 |
US20210224559A1 (en) | 2021-07-22 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20220358768A1 (en) | Vehicle and method for avoiding a collision of a vehicle with one or more obstacles | |
US11023782B2 (en) | Object detection device, vehicle control system, object detection method, and non-transitory computer readable medium | |
KR101628503B1 (ko) | Driver assistance apparatus and method of operating the same | |
US20240046654A1 (en) | Image fusion for autonomous vehicle operation | |
US8952799B2 (en) | Method and system for warning a driver of a vehicle about potential obstacles behind the vehicle | |
US9321460B2 (en) | Railroad crossing barrier estimating apparatus and vehicle | |
US7447592B2 (en) | Path estimation and confidence level determination system for a vehicle | |
US9269263B2 (en) | Vehicle top clearance alert system | |
US10665108B2 (en) | Information processing apparatus and non-transitory computer-readable recording medium | |
US9020747B2 (en) | Method for recognizing a turn-off maneuver | |
US20050137774A1 (en) | Single vision sensor object detection system | |
KR20180030823A (ko) | Recognition of the brake lights of the vehicle ahead for adjusting the triggering of active safety devices | |
US9283958B2 (en) | Method and device for assisting in returning a vehicle after leaving a roadway | |
WO2018190037A1 (fr) | Obstacle detection and notification device, method, and program | |
JP2015507776A (ja) | Method of active warning and/or navigation assistance for avoiding a collision between a vehicle body part and/or a wheel and an object | |
US10275665B2 (en) | Device and method for detecting a curbstone in an environment of a vehicle and system for curbstone control for a vehicle | |
US20180174467A1 (en) | Driving support apparatus and driving support method | |
US20170259813A1 (en) | Method for avoiding a rear-end collision between a first vehicle and a second vehicle and control unit | |
JP2022502642A (ja) | Method for evaluating the influence of an object in the surroundings of a means of transport on a driving maneuver of the means of transport | |
WO2019207755A1 (fr) | On-board information device, and driving assistance method and system | |
US20210155257A1 (en) | Systems and methods of geometric vehicle collision evaluation | |
CN113053165A (zh) | Vehicle and collision recognition method, apparatus, and device therefor | |
JP6763080B2 (ja) | Vision system and method for a motor vehicle | |
US20240166176A1 (en) | Emergency evacuation device, mobile object control system, and emergency evacuation method | |
CN113022474B (zh) | Vehicle control system | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
FEPP | Fee payment procedure |
Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
AS | Assignment |
Owner name: INTEL CORPORATION, CALIFORNIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:RIDER, TOMER;NABWANI, AYOOB;YANG, WENLONG;SIGNING DATES FROM 20200703 TO 20200820;REEL/FRAME:054073/0431 |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: APPLICATION DISPATCHED FROM PREEXAM, NOT YET DOCKETED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS |
|
AS | Assignment |
Owner name: INTEL CORPORATION, CALIFORNIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:RIDER, TOMER;NABWANI, AYOOB;YANG, WENLONG;SIGNING DATES FROM 20200703 TO 20200820;REEL/FRAME:060047/0683 |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
AS | Assignment |
Owner name: MOBILEYE VISION TECHNOLOGIES LTD., ISRAEL Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:INTEL CORPORATION;REEL/FRAME:067435/0543 Effective date: 20221227 |