GB2623840A - System, device, and method for detecting an intention and an action associated with a vehicle - Google Patents

System, device, and method for detecting an intention and an action associated with a vehicle

Info

Publication number
GB2623840A
GB2623840A GB2218796.7A GB202218796A
Authority
GB
United Kingdom
Prior art keywords
vehicle
controller
intention
output signal
indicator
Prior art date
Legal status
Pending
Application number
GB2218796.7A
Other versions
GB202218796D0 (en)
Inventor
Sivan Neethu
Meghwani Hansa
Current Assignee
Continental Autonomous Mobility Germany GmbH
Original Assignee
Continental Autonomous Mobility Germany GmbH
Priority date
Filing date
Publication date
Application filed by Continental Autonomous Mobility Germany GmbH filed Critical Continental Autonomous Mobility Germany GmbH
Publication of GB202218796D0 publication Critical patent/GB202218796D0/en
Publication of GB2623840A publication Critical patent/GB2623840A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 Scenes; Scene-specific elements
    • G06V 20/50 Context or environment of the image
    • G06V 20/52 Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • G06V 20/54 Surveillance or monitoring of activities, e.g. for recognising suspicious objects, of traffic, e.g. cars on the road, trains or boats
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 Scenes; Scene-specific elements
    • G06V 20/50 Context or environment of the image
    • G06V 20/56 Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V 20/58 Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G 1/00 Traffic control systems for road vehicles
    • G08G 1/16 Anti-collision systems
    • G08G 1/166 Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G 1/00 Traffic control systems for road vehicles
    • G08G 1/16 Anti-collision systems
    • G08G 1/167 Driving aids for lane monitoring, lane changing, e.g. blind spot detection

Abstract

A system 100 for use with a first vehicle 101 to detect an intended action of a second vehicle 102 is described. The system comprises at least one image sensor 104, e.g. rear-view camera, configured to receive image data associated with the second vehicle, and a controller 110 configured to detect an intention indicator (112, Fig. 2) of the second vehicle in the image data, the intention indicator indicative of the intended action of the second vehicle. The controller is further configured to generate an output signal 116 based on the intention indicator and provide the output signal to the first vehicle for responding to the intended action of the second vehicle. The system may be installed as part of an anti-collision system on the first vehicle. The image sensor 104 may comprise at least one fisheye lens. The intention indicator may be a visual indicator of an intention to overtake. The output signal may be a deceleration, emergency brake or directional change/maintain control signal.

Description

SYSTEM, DEVICE, AND METHOD FOR DETECTING AN INTENTION AND AN ACTION ASSOCIATED WITH A VEHICLE
TECHNICAL FIELD
Various aspects of this disclosure relate to systems, devices, and methods for detecting intentions and actions, such as overtaking intentions and actions, associated with vehicles.
BACKGROUND
The following discussion of the background art is intended to facilitate an understanding of the present disclosure only. It should be appreciated that the discussion is not an acknowledgment or admission that any of the material referred to was published, known, or is part of the common general knowledge of the person skilled in the art in any jurisdiction as of the priority date of the disclosure.
There exist various anti-collision systems installable on a vehicle to detect other vehicles in the proximity of the vehicle for collision avoidance. A known anti-collision system uses one or more cameras installed at various locations on the vehicle to detect objects or other vehicles so as to avoid collisions.
However, existing anti-collision systems do not adequately take into account the intention of other vehicles to execute certain actions, such as an acceleration action or an overtaking action, before such actions are executed. In addition, existing detection systems may not satisfactorily predict the actions of such other vehicles after the intention of the other vehicles has been indicated or identified.
Accordingly, there exists a need for an improved device, system, and/or method for detecting vehicles that seeks to address at least one of the aforementioned issues.
SUMMARY
Various embodiments comprise a system, device, and method to improve the safety of one or more second vehicles while the one or more second vehicles are performing an action (e.g. an overtaking action) in relation to a first vehicle, particularly when the overtaking action is performed within a blind spot of the first vehicle.
According to an aspect of the present disclosure, there is provided a system for use with a first vehicle to detect an intended action of a second vehicle, the system comprising: at least one image sensor configured to receive image data associated with the second vehicle; a controller configured to detect an intention indicator of the second vehicle in the image data, the intention indicator indicative of the intended action of the second vehicle, wherein the controller is further configured to generate an output signal based on the intention indicator, and wherein the controller is further configured to provide the output signal to the first vehicle for responding to the intended action of the second vehicle.
The system of the present disclosure seeks to provide a relatively safer environment for a second vehicle when the second vehicle is performing an action, such as an overtaking action, within a blind spot of the first vehicle. The system of the present disclosure may also be used to prevent or minimise collision while the second vehicle is overtaking the first vehicle.
According to an embodiment which may be combined with any above-described embodiment or with any below described further embodiment, the at least one image sensor comprises at least one fisheye lens. In some embodiments, the at least one fisheye lens has an angle of view of up to 170 degrees, for example in the range from 150 to 170 degrees. The fisheye lens provides for a relatively wide angle of view for capturing images of the second vehicle.
According to an embodiment which may be combined with any above-described embodiment or with any below described further embodiment, the intention indicator of the second vehicle is a visual indicator of an intention to overtake.
According to an embodiment which may be combined with any above-described embodiment or with any below described further embodiment, the output signal comprises at least one of a deceleration control signal, an emergency brake control signal, a directional change signal, and/or a maintain direction control signal to the first vehicle. Such control signal(s) may provide for a range of possible actions the first vehicle may take in response to the action of the second vehicle.
According to an embodiment which may be combined with any above-described embodiment or with any below described further embodiment, the system comprises at least one detection sensor configured to receive detection signals associated with the second vehicle in a proximity of the first vehicle, wherein the controller is further configured to compute a distance measure between the first vehicle and the second vehicle based on the received detection signals, and wherein the controller is further configured to generate the output signal further based on the distance measure. The at least one detection sensor may be used to detect if the intended action of the second vehicle is performed and/or if the distance between the first vehicle and the second vehicle is within an acceptable threshold.
According to an embodiment which may be combined with any above-described embodiment or with any below described further embodiment, the detection sensor may comprise a radar sensor positioned on the first vehicle to send and receive detection signals indicative of a lateral distance between the first vehicle and the second vehicle. The signals may include electromagnetic radiation, such as radio waves. Radar sensors are relatively agnostic to changes in lighting conditions and may be complementary to the use of the image sensors to further ascertain whether the action is performed as/after the intention indicator of the second vehicle in the image data is detected.
According to an embodiment which may be combined with any above-described embodiment or with any below described further embodiment, the controller is configured to compute the lateral distance based on the detection signals, and wherein the controller is configured to issue a warning notification based on determining that the lateral distance is less than a threshold value.
According to an embodiment which may be combined with any above-described embodiment or with any below described further embodiment, the warning notification is in the form of a visual or audio warning to a user of the first vehicle to avoid a collision. In some embodiments, the warning notification further comprises another visual or audio warning to the second vehicle to avoid the collision. The warning notification may be useful where the first and/or second vehicles are driven by a driver.
According to another aspect of the present disclosure, there is provided a first vehicle, comprising the system as described, wherein the at least one image sensor comprises a rear-view image sensor positioned at a rear portion of the first vehicle, the first vehicle further comprising at least one detection sensor positioned at a side portion of the first vehicle, wherein the at least one detection sensor is configured to detect the distance between the second vehicle and the side portion of the first vehicle. According to an embodiment which may be combined with any above-described embodiment or with any below described further embodiment, the first vehicle is a container truck.
According to an embodiment which may be combined with any above-described embodiment or with any below described further embodiment, the at least one image sensor comprises at least one fisheye lens.
According to another aspect of the present disclosure, there is provided a computer-implemented method for detecting an intended action of a second vehicle in relation to a first vehicle, the method comprising: obtaining image data associated with the second vehicle; detecting an intention indicator of the second vehicle in the image data, the intention indicator indicative of the intended action of the second vehicle; determining, based on the intention indicator, an output signal for the first vehicle (101) to respond to the intended action of the second vehicle.
According to an embodiment which may be combined with any above-described embodiment or with any below described further embodiment, the method further comprises: obtaining detection signals associated with the second vehicle in a proximity of the first vehicle; computing a distance measure between the first vehicle and the second vehicle based on the detection signals; and determining the output signal further based on the distance measure.
According to an embodiment which may be combined with any above-described embodiment or with any below described further embodiment, the method further comprises: computing a lateral distance based on the detection signals, and comparing the lateral distance with a threshold value.
According to an embodiment which may be combined with any above-described embodiment or with any below described further embodiment, the method further comprises: issuing a warning notification to at least one of the first vehicle and the second vehicle based on a comparison result of the lateral distance and the threshold value. The comparison result may be a result where the lateral distance is less than the threshold value.
According to another aspect of the present disclosure, there is provided a controller for use with an autonomous driving control unit in a first vehicle for detection of a second vehicle, the controller comprising an input module configured to receive image data associated with the second vehicle and detection signals associated with a distance between the first vehicle and the second vehicle; and an analysis module arranged in data communication with the input module; characterised in that: the analysis module is configured to analyse the image data for identification of an intention indicator of the second vehicle, compute a distance measure between the first vehicle and the second vehicle based on the received detection signals, and determine an output signal based on the intention indicator and the distance measure, in response to the action of the second vehicle.
According to another aspect of the present disclosure, there is provided a non-transitory computer-readable medium storing computer executable code comprising instructions that cause a processor to carry out the method previously described.
BRIEF DESCRIPTION OF THE DRAWINGS
The present disclosure will be better understood with reference to the detailed description when considered in conjunction with the non-limiting examples and the accompanying drawings, in which:
FIG. 1 is a schematic diagram of a system for use with a first vehicle to detect an action associated with a second vehicle according to some embodiments.
FIG. 2 is a schematic diagram illustrating the system in operation, the system installed on a large vehicle (first vehicle) to detect an overtaking vehicle (second vehicle).
FIG. 3 is a schematic diagram of a controller suitable for use with an anti-collision system of a vehicle and/or an auto-driving control unit of an autonomous vehicle.
FIG. 4 is a flow chart depicting a method for detecting an action of a second vehicle in relation to a first vehicle.
FIG. 5 shows a flow chart depicting a specific use case of the method for detecting an overtaking action of a second vehicle in relation to a first vehicle.
DETAILED DESCRIPTION
The following detailed description refers to the accompanying drawings that show, by way of illustration, specific details and embodiments in which the disclosure may be practiced. These embodiments are described in sufficient detail to enable those skilled in the art to practice the disclosure. Other embodiments may be utilized, and structural and logical changes may be made, without departing from the scope of the disclosure. The various embodiments are not necessarily mutually exclusive, as some embodiments can be combined with one or more other embodiments to form new embodiments.
The embodiments described in the context of one of the devices, systems, or methods are analogously valid for the other devices, systems, or methods. Similarly, the embodiments described in the context of a device are analogously valid for a system or a method, and vice-versa.
Features that are described in the context of an embodiment may correspondingly be applicable to the same or similar features in the other embodiments. Features that are described in the context of an embodiment may correspondingly be applicable to the other embodiments, even if not explicitly described in these other embodiments. Furthermore, additions and/or combinations and/or alternatives as described for a feature in the context of an embodiment may correspondingly be applicable to the same or similar feature in the other embodiments.
In the context of the various embodiments, the articles "a", "an", and "the" as used with regard to a feature or element include a reference to one or more of the features or elements.
As used herein, the term "and/or" includes any and all combinations of one or more of the associated listed items.
While terms such as "first", "second" etc., may be used to describe various vehicles, such vehicles are not limited by the above terms. The above terms are used only to distinguish one vehicle from another, and do not define an order and/or significance of the vehicles.
The term "data" as used herein may be understood to include information in any suitable analog or digital form, e.g., provided as a file, a portion of a file, a set of files, a signal or stream, a portion of a signal or stream, a set of signals or streams, and the like. Further, the term "data" may also be used to mean a reference to information, e.g., in form of a pointer. The term "data", however, is not limited to the to aforementioned examples and may take various forms and represent any information as understood in the art. Any type of information, as described herein, may be handled for example via one or more processors in a suitable way, e.g. as data.
The terms "processor" or "controller" as, for example, used herein may be understood as any kind of entity that allows handling data. The data may be handled according to one or more specific functions executed by the processor or controller. Further, a processor or controller as used herein may be understood as any kind of circuit, e.g., any kind of analog or digital circuit. A processor or a controller may thus be or include an analog circuit, digital circuit, mixed-signal circuit, logic circuit, processor, microprocessor, Central Processing Unit (CPU), Graphics Processing Unit (GPU), Digital Signal Processor (DSP), Field Programmable Gate Array (FPGA), integrated circuit, Application Specific Integrated Circuit (ASIC), etc., or any combination thereof. Any other kind of implementation of the respective functions, which will be described below in further detail, may also be understood as a processor, controller, or logic circuit. It is understood that any two (or more) of the processors, controllers, or logic circuits detailed herein may be realized as a single entity with equivalent functionality or the like, and conversely that any single processor, controller, or logic circuit detailed herein may be realized as two (or more) separate entities with equivalent functionality or the like.
The term "memory" detailed herein may be understood to include any suitable type of memory or memory device, e.g., a hard disk drive (HDD), a solid-state drive (SSD), a flash memory, etc.
The term "module" detailed herein refers to, forms part of, or includes an Application Specific Integrated Circuit (ASIC); an electronic circuit; a combinational logic circuit; a field programmable gate array (FPGA); a processor (shared, dedicated, or group) that executes code; other suitable hardware components that provide the described functionality; or a combination of some or all of the above, such as in a system-on-chip. The term module may include memory (shared, dedicated, or group) that stores code executed by the processor.
Differences between software and hardware implemented data handling may blur. A processor, controller, and/or circuit detailed herein may be implemented in software, hardware, and/or as a hybrid implementation including software and hardware.
In the following, aspects or embodiments will be described in detail.
According to an aspect of the present disclosure, there is a system for use with a first vehicle to detect an action associated with a second vehicle. The system may be installed as part of an anti-collision system on the first vehicle, and/or as part of an autonomous driving control unit (ADCU) of the first vehicle, in the case where the first vehicle is an autonomous or semi-autonomous vehicle. In some embodiments, the system may be or form part of a driver assistance system of the first vehicle. Referring to FIG. 1, the system 100 for use with a first vehicle 101 to detect an action associated with a second vehicle 102 comprises at least one image sensor 104 configured to obtain image data 106 associated with the second vehicle 102. The at least one image sensor 104 may be a rear-view camera equipped with one or more fisheye lenses. The fisheye lens may have an angle of view of up to 170 degrees. In some embodiments, the angle of view of the fisheye lens may be in a range of 150 degrees to 170 degrees. In some embodiments, the image sensor 104 may be configured to obtain a video stream of the second vehicle 102 in a real-time or near real-time environment, the video stream comprising multiple images or image frames. The video stream may be converted to one or more suitable data formats and stored in a database (not shown).
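Purely by way of illustration, and not as part of the disclosed system, the following sketch shows one way such a video stream might be read frame by frame. The use of OpenCV, the device index, and the frame limit are assumptions introduced here for demonstration only.

```python
# Illustrative sketch (not part of the disclosure): reading successive frames
# from a rear-view camera with OpenCV. The device index and frame limit are
# assumed values for demonstration only.
import cv2

def read_rear_view_frames(device_index: int = 0, max_frames: int = 100):
    """Yield successive BGR frames from a rear-view image sensor."""
    capture = cv2.VideoCapture(device_index)
    try:
        for _ in range(max_frames):
            ok, frame = capture.read()
            if not ok:  # end of stream or camera error
                break
            yield frame
    finally:
        capture.release()
```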
One or more detection sensors 108, which may include a radar sensor, an infrared sensor, a sonar sensor, an ultrasound sensor, and/or other types of proximity sensors, may be configured to detect the second vehicle 102 in the proximity of the first vehicle 101. In some embodiments, at least one detection sensor 108 may be positioned at each side of the first vehicle 101 to detect one or more second vehicles 102 from either side of the first vehicle 101. In some embodiments, the detection sensor 108 may be a radar sensor configured to send and receive electromagnetic radiation, such as radio waves, within a frequency range. The radar sensor 108 may be configured to emit radio waves and receive reflected radio waves as an indication of the presence of one or more second vehicles 102 in the proximity of the first vehicle 101.
A controller 110, which may include one or more processors, may be arranged in data communication with the image sensor 104 and the detection sensor 108 to obtain or receive the image data 106 and detection signals, for example, electromagnetic radiation (radio wave) signals, respectively. The controller 110 may then be used to detect an intention indicator 112 (see FIG. 2) in the image data 106 of the second vehicle 102, and compute a distance measure 114 between the first vehicle 101 and the second vehicle 102 based on the radio signals received by the detection sensor 108. In some embodiments, the detection of the image data 106 of the second vehicle 102 may include analysing the image data 106 using one or more image processing algorithms to identify the intention indicator 112. An output signal 116 may then be produced by the controller 110. The output signal 116 may be in the form of one or more control signals and/or one or more warning notifications to the first vehicle 101. The control signal(s) may include at least one of a deceleration control signal, an emergency brake control signal, a directional change signal, and/or a maintain direction control signal. The one or more warning notification(s) may be in the form of an audio alert and/or a visual alert. The output signal 116 may be sent to one or more actuator units 118 to effect a corresponding action associated with the output signal 116. Examples of the one or more actuator units 118 include a braking system, a tyre directional control system, and/or a visual or audio warning system.
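The mapping from the detected intention indicator 112 and the distance measure 114 to the output signal 116 is not fixed by the disclosure. The following is a minimal, hypothetical sketch of such a mapping; the signal names and threshold values are illustrative assumptions.

```python
# Hypothetical decision logic combining an intention indicator result and a
# distance measure into output signals. Signal names and thresholds are
# assumptions for illustration; the disclosure does not fix them.
from enum import Enum

class OutputSignal(Enum):
    MAINTAIN_DIRECTION = "maintain_direction"
    DECELERATE = "decelerate"
    EMERGENCY_BRAKE = "emergency_brake"
    WARN_DRIVER = "warn_driver"

def determine_output_signal(intention_detected: bool,
                            distance_m: float,
                            warn_threshold_m: float = 2.0,
                            brake_threshold_m: float = 1.0) -> list:
    """Map the intention indicator and lateral distance to control/warning signals."""
    if not intention_detected:
        return [OutputSignal.MAINTAIN_DIRECTION]
    signals = [OutputSignal.WARN_DRIVER]            # second vehicle intends to overtake
    if distance_m <= brake_threshold_m:
        signals.append(OutputSignal.EMERGENCY_BRAKE)
    elif distance_m <= warn_threshold_m:
        signals.append(OutputSignal.DECELERATE)
    return signals
```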
In some embodiments, if the second vehicle 102 indicates an intention to perform an action (e.g. an overtaking action) but does not subsequently perform the action, which may be inferred based on the lack of motion/proximity signals received by the detection sensor 108 and/or the switching off of the intention indicator associated with the second vehicle 102, then the controller 110 may operate or be configured to revert to a default state, for example, an idle state, a background monitoring state, or a stand-by state.
In some embodiments, the controller 110 may be configured to detect the presence of the second vehicle 102 in the vicinity of the detection sensor 108 for a predetermined period (for example up to 30 seconds), and if the second vehicle 102 does not perform the overtaking action, the controller 110 reverts to a background monitoring state. In some embodiments, the controller 110, in the background monitoring state, may be configured to monitor an action of the first vehicle 101, for example a directional change of the first vehicle 101, which triggers the system 100 to be activated and start obtaining images of the second vehicle 102.
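A hedged sketch of this reversion behaviour follows. The 30-second period is taken from the description above, while the polling approach and the callable interfaces are assumptions made for illustration.

```python
# Sketch of reverting to a background monitoring state when no overtaking
# action follows the signalled intention within a predetermined period.
# The 30-second figure comes from the description; the polling logic is assumed.
import time

MONITORING, ACTIVE = "monitoring", "active"

def track_intention(intention_active, overtake_detected, timeout_s: float = 30.0) -> str:
    """Return the controller state after waiting for the intended action.

    intention_active / overtake_detected are zero-argument callables reporting
    the current indicator state and whether the overtaking manoeuvre has been
    observed by the detection sensor.
    """
    start = time.monotonic()
    while time.monotonic() - start < timeout_s:
        if not intention_active():        # indicator switched off: revert
            return MONITORING
        if overtake_detected():           # action performed: stay active
            return ACTIVE
        time.sleep(0.1)
    return MONITORING                     # timed out without an overtaking action
```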
In some embodiments, the computed distance measure 114 based on the received motion/proximity signals may be compared with a threshold. A computed distance measure 114 that is less than the threshold may indicate that the second vehicle 102 is too near to the first vehicle 101 and may trigger a warning notification and/or a control signal to decelerate and/or swerve the first vehicle 101 so as to avoid a collision. In some embodiments, the threshold may be pre-determined or adjustable as and when required.
FIG. 2 shows an exemplary operation scenario where the first vehicle 101 is in the form of a relatively big vehicle, for example, a container truck, and the second vehicle 102 is in the form of a relatively small vehicle, for example, a motorcycle, a motor car, or a non-motorized vehicle such as a bicycle or a tricycle. The system 100 may be installed in the first vehicle 101 to detect an intention of the second vehicle 102 to perform an action, such as an overtaking action.
The system 100 may be part of an anti-collision system associated with the first vehicle 101. In some embodiments, the system 100 may be activated when the first vehicle 101 is detected to be changing a lane on a road. The image sensor 104 may then be activated to capture successive images or videos 106 of one or more second vehicles 102 behind the first vehicle 101. The intention indicator 112 associated with the second vehicle 102 may be in the form of a visual indicator, for example, a left or right signal indicator of the second vehicle 102, that may be seen via the indicator display unit or front headlights of the second vehicle 102. The visual indicator may be captured as part of the image data 106 to be analysed.
Using the rear-view camera 104, the first vehicle 101, which may be positioned in front of the second vehicle 102, may obtain the image data 106 of the second vehicle 102 positioned behind the first vehicle 101. The first vehicle 101, which is in front, can capture this indication of the second vehicle 102's intention to overtake the first vehicle 101 through the rear-view camera 104, which may preferably be equipped with a fisheye lens. In this manner, the driver of the first vehicle 101 may be warned not to switch or change lanes until the second vehicle 102 has overtaken the first vehicle 101, until the second vehicle 102 is detected to be relatively safe from colliding with the first vehicle 101, and/or until the second vehicle 102 is no longer in the blind spot of the driver of the first vehicle 101. In other words, the identified intention indicator 112 may be used to assist the driver of the first vehicle 101 to avoid changing lanes at the same time as the second vehicle 102 is trying to overtake the first vehicle 101.
FIG. 3 shows an embodiment of the controller 110 in the form of a device, such as an autonomous driving control unit (ADCU) 200, for use in a first vehicle 101 for detection of a second vehicle 102. The ADCU 200 comprises an input module 204 configured to obtain image data 106 associated with the second vehicle 102 and a distance measure 114 associated with a distance between the first vehicle 101 and the second vehicle 102; and an analysis module 206 arranged in data communication with the input module 204. The analysis module 206 may be configured to detect an intention indicator in the image data 106. In some embodiments, the analysis module 206 is configured to identify an intention indicator 112 of the second vehicle 102, and to determine an output signal in response to the action of the second vehicle 102. The output signal may be determined by the controller 110 based on the detected intention indicator 112 and the distance measure 114. The ADCU 200 may comprise an output module 208 to send the output signal (e.g., at least one of the control signals and/or warning notifications mentioned above) in response to the action of the second vehicle 102.
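As a structural illustration only, the modules of the ADCU 200 might be composed as sketched below; the class and method names are assumptions and do not reflect any actual implementation of the controller 110 or ADCU 200.

```python
# Structural sketch of an input -> analysis -> output pipeline, loosely
# mirroring modules 204, 206, and 208. All names are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class AnalysisResult:
    intention_detected: bool
    lateral_distance_m: float

class AnalysisModule:
    """Wraps an image-processing helper (cf. 210) and a distance calculator (cf. 212)."""
    def __init__(self, image_processor, distance_calculator):
        self.image_processor = image_processor          # placeholder dependency
        self.distance_calculator = distance_calculator  # placeholder dependency

    def analyse(self, image_data, detection_signals) -> AnalysisResult:
        return AnalysisResult(
            intention_detected=self.image_processor.detect_indicator(image_data),
            lateral_distance_m=self.distance_calculator.lateral_distance(detection_signals),
        )

class ADCU:
    """Pipeline: read inputs, analyse them, and emit an output signal."""
    def __init__(self, input_module, analysis_module, output_module):
        self.input_module = input_module
        self.analysis_module = analysis_module
        self.output_module = output_module

    def step(self):
        image_data, detection_signals = self.input_module.read()
        result = self.analysis_module.analyse(image_data, detection_signals)
        self.output_module.send(result)
```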
The analysis module 206 may include an image processing module 210 and a distance calculator module 212. The image processing module 210 may include an image processing algorithm for the identification of the intention indicator 112 within the captured image data of the second vehicle 102. For example, in the case where the second vehicle 102 signals an intention to overtake the first vehicle via the right side, the right directional indicator 112 of the second vehicle 102 will be activated.
The intention indicator 112 will be captured as part of the image data 106 and processed. The image processing module 210 may then identify the intention indicator 112 from the captured image data to determine whether there is an intention to overtake. The outcome of the determination, which may be in the form of a binary one ("1") indicating a "yes", and a binary zero ("0") indicating a "no", may be sent to the output module 208.
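One hypothetical way the image processing module 210 could produce such a binary outcome is sketched below, assuming detection of a blinking amber indicator region in the rear-view frames. The colour bounds, region of interest, and blink test are illustrative assumptions, not part of the disclosure.

```python
# Illustrative indicator detection: look for a blinking amber region in a
# rear-view frame. The colour band, minimum lit area, and blink heuristic
# are assumptions for demonstration only.
import cv2
import numpy as np

def indicator_visible(frame_bgr: np.ndarray, roi=(slice(None), slice(None))) -> bool:
    """Return True if an amber indicator-like region appears lit in the ROI."""
    region = np.ascontiguousarray(frame_bgr[roi])
    hsv = cv2.cvtColor(region, cv2.COLOR_BGR2HSV)
    amber = cv2.inRange(hsv, (10, 100, 150), (30, 255, 255))  # rough amber band
    return int(np.count_nonzero(amber)) > 200                  # minimum lit area

def intention_to_overtake(frames) -> int:
    """Binary outcome: 1 if the indicator blinks across successive frames, else 0."""
    states = [indicator_visible(f) for f in frames]
    # a blinking indicator toggles on and off over the captured sequence
    return 1 if any(states) and not all(states) else 0
```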
In some embodiments, the image processing module 210 may include a machine learning algorithm trained to identify the intention indicator 112 from the image data 106 collected. In some embodiments, the image processing module 210 may include a machine vision module.
The distance calculator module 212 may be configured to receive detection signals (e.g. radio wave signals) from the detection sensor(s) 108 and compute the distance measure 114 between the first vehicle 101 and the second vehicle 102 based on the signals received from the radar sensor(s) 108. The computed distance may be a lateral distance between the first vehicle 101 and the second vehicle 102.
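A minimal sketch of such a computation is given below, assuming the radar return is represented by a round-trip time and an azimuth angle; this input representation is an assumption, and only the standard radar ranging geometry is used.

```python
# Sketch of a lateral-distance computation from a radar return (round-trip
# time and azimuth angle). The input representation is assumed.
import math

SPEED_OF_LIGHT = 299_792_458.0  # m/s

def lateral_distance(round_trip_time_s: float, azimuth_rad: float) -> float:
    """Lateral (side-to-side) distance between the two vehicles in metres."""
    slant_range = SPEED_OF_LIGHT * round_trip_time_s / 2.0  # one-way range
    return slant_range * math.sin(azimuth_rad)               # component across the lane
```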
FIG. 4 shows another embodiment of a method 400 for detecting an action of a second vehicle 102 in relation to a first vehicle 101. The method 400 may be implemented as executable code stored in a computer-readable medium, which may be stored in a memory of the controller 110, 200. The executable code comprises instructions for detecting an action associated with a second vehicle 102 with respect to a first vehicle 101. The method 400 may comprise the steps of:
Step 402: obtaining image data 106 associated with the second vehicle 102;
Step 404: obtaining detection signals associated with the second vehicle 102 in a proximity of the first vehicle 101;
Step 406: detecting an intention indicator of the second vehicle 102 in the image data 106;
Step 408: computing a distance measure 114 between the first vehicle 101 and the second vehicle 102 based on the detection signals; and
Step 410: determining, based on the detected intention indicator 112 and the distance measure 114, an output signal 116 in response to the action of the second vehicle 102.
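Composing the illustrative helpers sketched above, the flow of method 400 could be expressed as follows; this is a non-authoritative sketch rather than the claimed method, and all helper names are assumptions.

```python
# End-to-end sketch of steps 402-410, reusing the illustrative helpers
# intention_to_overtake, lateral_distance, and determine_output_signal
# defined in the earlier sketches (all assumed, not claimed APIs).
def method_400(frames, radar_return, warn_threshold_m: float = 2.0):
    # Steps 402/404: image data and detection signals are obtained (passed in here)
    # Step 406: detect the intention indicator in the image data
    intention = bool(intention_to_overtake(frames))
    # Step 408: compute the distance measure from the detection signals
    distance_m = lateral_distance(*radar_return)  # (round_trip_time_s, azimuth_rad)
    # Step 410: determine the output signal
    return determine_output_signal(intention, distance_m, warn_threshold_m)
```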
FIG. 5 shows another embodiment of a method 500 for detecting an action, specifically an overtaking action, of a second vehicle 102 in relation to or with respect to a first vehicle 101. The method 500 is assumed to be implemented as executable code stored in a computer-readable medium, which may be stored in a memory of the controller 110, 200. The executable code comprises instructions for detecting an action associated with a second vehicle 102 with respect to a first vehicle 101, and the method 500 may comprise the steps of:
Step 502: The controller 110 or 200 installed in the first vehicle 101 is activated and in a monitoring state, and the second vehicle 102 (which may be a relatively small vehicle) signals an intention to overtake the first vehicle 101 (which may be a container truck).
Step 504: The image sensor(s) 104, such as a rear-view camera of the first vehicle 101, captures image data (which may include video files) of the second vehicle 102 and its intention to overtake.
Step 506: A warning notification (which may be a first warning notification), which may be in the form of an alarm, is sent to inform the driver of the first vehicle 101 about the second vehicle 102's intention to overtake and to avoid a lane change.
Step 508: The radar sensor(s) 108 is used to estimate the lateral distance between the first vehicle 101 and the second vehicle 102 during overtaking.
Step 510: The lateral distance is compared with a threshold value. Based on a comparison result of the lateral distance and the threshold value, if the lateral distance is more than the threshold, no further action is required and/or the first vehicle 101 continues to maintain its course. If the lateral distance is less than or equal to the threshold, step 512 is performed.
Step 512: A warning notification (which may be a second warning notification) is issued to inform the driver of the first vehicle 101 that the second vehicle 102 may be too close to the first vehicle 101 during overtaking, and evasive action of the first vehicle 101 may be required.
Step 514: After the evasive action is performed, safe overtaking of the second vehicle 102 from the blind spot of the first vehicle 101 is determined.
In some embodiments, the method 500 may further include a step of issuing a warning notification to the second vehicle 102 to inform the second vehicle 102 that it is within the blind spot of the driver of the first vehicle 101 or is too close in distance to the first vehicle 101 during the performance of the overtaking action.
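The overall flow of method 500 could be sketched as follows, with callables standing in for the sensors and warning devices; the function names, return values, and warning messages are assumptions introduced for illustration only.

```python
# Hedged sketch of the overtaking use case (steps 502-514). The callables
# stand in for sensors and warning devices; names and messages are assumed.
def method_500(intention_detected, measure_lateral_distance,
               warn_driver, warn_second_vehicle, threshold_m: float):
    # Steps 502-504: intention to overtake detected in the rear-view image data
    if not intention_detected():
        return "no_action"
    # Step 506: first warning - advise the driver of vehicle 101 to avoid a lane change
    warn_driver("second vehicle intends to overtake - hold lane")
    # Step 508: estimate the lateral distance during the overtaking manoeuvre
    distance_m = measure_lateral_distance()
    # Step 510: compare with the threshold value
    if distance_m > threshold_m:
        return "maintain_course"
    # Step 512: second warning - the overtaking vehicle is too close
    warn_driver("overtaking vehicle too close - evasive action may be required")
    warn_second_vehicle("you are in the blind spot of the vehicle ahead")
    # Step 514: evasive action performed; safe overtaking from the blind spot
    return "evasive_action"
```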
While the disclosure has been particularly shown and described with reference to specific embodiments, it should be understood by those skilled in the art that various changes in form and detail may be made therein without departing from the scope of the invention as defined by the appended claims. The scope of the invention is thus indicated by the appended claims.
REFERENCE SIGNS
100: System
101: First vehicle
102: Second vehicle
104: Image sensor(s)
106: Image data
108: Detection sensor(s)
110: Controller
112: Intention indicator
114: Distance measure
116: Controller output signal
118: Actuator unit
200: Autonomous driving control unit (ADCU)
204: Input module
206: Analysis module
208: Output module
210: Image processing module
212: Distance calculator
400: Method
402-410: Method steps
500: Method
502-514: Method steps

Claims (17)

  1. A system (100) for use with a first vehicle (101) to detect an intended action of a second vehicle (102), the system (100) comprising: at least one image sensor (104) configured to receive image data (106) associated with the second vehicle (102); a controller (110) configured to detect an intention indicator (112) of the second vehicle (102) in the image data (106), the intention indicator (112) indicative of the intended action of the second vehicle (102), wherein the controller (110) is further configured to generate an output signal based on the intention indicator (112), and wherein the controller (110) is further configured to provide the output signal to the first vehicle (101) for responding to the intended action of the second vehicle (102).
  2. The system (100) according to claim 1, wherein the at least one image sensor (104) comprises at least one fisheye lens, and wherein the at least one fisheye lens has an angle of view of up to 170 degrees.
  3. The system (100) according to claim 1 or 2, wherein the intention indicator (112) of the second vehicle is a visual indicator of an intention to overtake.
  4. The system (100) according to any one of claims 1 to 3, wherein the output signal (116) comprises at least one of: a deceleration control signal, an emergency brake control signal, a directional change control signal and/or a maintain direction control signal.
  5. The system (100) according to any one of claims 1 to 4, further comprising: at least one detection sensor (108) configured to receive detection signals associated with the second vehicle (102) in a proximity of the first vehicle (101), wherein the controller (110) is further configured to compute a distance measure (114) between the first vehicle (101) and the second vehicle (102) based on the received detection signals, and wherein the controller (110) is further configured to generate the output signal further based on the distance measure (114).
  6. The system (100) according to claim 5, wherein the at least one detection sensor (108) comprises a radar sensor, wherein the radar sensor is positioned on the first vehicle (101) to receive detection signals indicative of a lateral distance between the first vehicle (101) and the second vehicle (102).
  7. The system (100) according to any one of claims 5 to 6, wherein the controller (110) is configured to compute the lateral distance based on the detection signals, and wherein the controller (110) is configured to issue a warning notification based on determining that the lateral distance is less than a threshold value.
  8. The system (100) according to claim 7, wherein the warning notification is in the form of a visual or audio warning to a user of the first vehicle (101) to avoid collision.
  9. The system (100) according to claim 7 or 8, wherein the warning notification further comprises another visual or audio warning to the second vehicle (102) to avoid collision.
  10. A first vehicle (101), comprising the system (100) according to any one of claims 1 to 9, wherein the at least one image sensor (104) comprises a rear-view image sensor positioned at a rear portion of the first vehicle (101), and further comprising at least one detection sensor (108) positioned at a side portion of the first vehicle (101), wherein the at least one detection sensor (108) is configured to detect the distance between the second vehicle (102) and the side portion of the first vehicle (101).
  11. The first vehicle (101) according to claim 10, wherein the first vehicle (101) is a container truck.
  12. The first vehicle (101) according to claim 10 or 11, wherein the at least one image sensor (104) comprises at least one fisheye lens.
  13. A computer-implemented method (400) for detecting an intended action of a second vehicle (102) in relation to a first vehicle (101), the method (400) comprising: obtaining (402) image data (106) associated with the second vehicle (102); detecting (406) an intention indicator (112) of the second vehicle (102) in the image data (106), the intention indicator (112) indicative of the intended action of the second vehicle (102); determining (410), based on the intention indicator (112), an output signal for the first vehicle (101) to respond to the intended action of the second vehicle (102).
  14. The method (400) of claim 13, further comprising: obtaining (404) detection signals associated with the second vehicle (102) in a proximity of the first vehicle (101); computing (408) a distance measure (114) between the first vehicle (101) and the second vehicle (102) based on the detection signals; and determining (410) the output signal further based on the distance measure (114).
  15. The method (400) of any one of claims 13 to 14, further comprising: computing a lateral distance based on the detection signals, and comparing the lateral distance with a threshold value.
  16. The method (400) of claim 14, further comprising: issuing a warning notification to at least one of the first vehicle (101) and the second vehicle (102) based on a comparison result of the lateral distance and the threshold value.
  17. A non-transitory computer-readable medium storing computer executable code comprising instructions that cause a processor to carry out the method (400) according to any one of claims 13 to 15.
GB2218796.7A 2022-10-31 2022-12-14 System, device, and method for detecting an intention and an action associated with a vehicle Pending GB2623840A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
IN202241061782 2022-10-31

Publications (2)

Publication Number Publication Date
GB202218796D0 GB202218796D0 (en) 2023-01-25
GB2623840A 2024-05-01

Family

ID=84974725

Family Applications (1)

Application Number Title Priority Date Filing Date
GB2218796.7A Pending GB2623840A (en) 2022-10-31 2022-12-14 System, device, and method for detecting an intention and an action associated with a vehicle

Country Status (1)

Country Link
GB (1) GB2623840A (en)


Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1024050A1 (en) * 1999-01-29 2000-08-02 Renault Drive aid device for a motor vehicle
US20110010094A1 (en) * 2008-02-26 2011-01-13 Stephan Simon Method for assisting a user of a vehicle, control device for a driver-assistance system of a vehicle and vehicle having such a control device
US9305223B1 (en) * 2013-06-26 2016-04-05 Google Inc. Vision-based indicator signal detection using spatiotemporal filtering
US20220262135A1 (en) * 2017-09-20 2022-08-18 Tusimple, Inc. System and method for vehicle taillight state recognition
DE102019008089A1 (en) * 2019-11-21 2020-08-20 Daimler Ag Method for detecting a change of lane of another motor vehicle by means of a detection device and detection device

Also Published As

Publication number Publication date
GB202218796D0 (en) 2023-01-25
