GB2613183A - Road monitoring device and method of monitoring road traffic - Google Patents


Info

Publication number
GB2613183A
GB2613183A (application GB2117062.6A / GB202117062A)
Authority
GB
United Kingdom
Prior art keywords
monitoring device
mirror
road monitoring
road
detection sensor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB2117062.6A
Other versions
GB202117062D0 (en)
Inventor
Higuchi Kazuhide
Itagaki Noriaki
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Continental Autonomous Mobility Germany GmbH
Original Assignee
Continental Autonomous Mobility Germany GmbH
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Continental Autonomous Mobility Germany GmbH filed Critical Continental Autonomous Mobility Germany GmbH
Priority to GB2117062.6A
Publication of GB202117062D0
Publication of GB2613183A
Legal status: Withdrawn

Classifications

    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/01 Detecting movement of traffic to be counted or controlled
    • G08G1/0104 Measuring and analyzing of parameters relative to traffic conditions
    • G08G1/0108 Measuring and analyzing of parameters relative to traffic conditions based on the source of data
    • G08G1/0112 Measuring and analyzing of parameters relative to traffic conditions based on the source of data from the vehicle, e.g. floating car data [FCD]
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/16 Anti-collision systems
    • G08G1/166 Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/56 Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/58 Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/01 Detecting movement of traffic to be counted or controlled
    • G08G1/0104 Measuring and analyzing of parameters relative to traffic conditions
    • G08G1/0125 Traffic data processing
    • G08G1/0133 Traffic data processing for classifying traffic situation
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/01 Detecting movement of traffic to be counted or controlled
    • G08G1/04 Detecting movement of traffic to be counted or controlled using optical or ultrasonic detectors

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Chemical & Material Sciences (AREA)
  • Analytical Chemistry (AREA)
  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Traffic Control Systems (AREA)

Abstract

A road monitoring device uses a mirror locator to determine positional information and size information of a mirror 130 located at an intersection 120 between two roads 122, 124. An object detection sensor is configured to detect an object 134 and further configured to determine a perceived position of the object 134. Determining the perceived position comprises emitting a sensing signal and receiving a reflection of the sensing signal. A processor is configured to determine that the object 134 and the road monitoring device are positioned on different roads 122, 124, based on the positional information and the size information of the mirror and the perceived position of the object 134. The sensor may also be configured to determine motion information of the object, such as speed and direction. The object detection sensor may be a LiDAR sensor. The mirror location may be determined by processing image data.

Description

ROAD MONITORING DEVICE AND METHOD OF MONITORING ROAD TRAFFIC
TECHNICAL FIELD
[0001] Various embodiments relate to road monitoring devices and methods of monitoring road traffic.
BACKGROUND
[0002] Traffic mirrors are often placed at road intersections, so that motorists can see vehicles on the intersecting road, also referred herein as crossing vehicles. This helps the motorists to be prepared for, and to avoid colliding with, the crossing vehicles which are approaching the road junction from the intersecting road. While the motorist can see the crossing vehicles through the mirror, it may be challenging for the motorist to estimate the speed of the crossing vehicles and to react in time to avoid a collision. In a known system, an automated warning system detects the crossing vehicle based on a visual image of the mirror which contains a reflection of the crossing vehicle, and thereby alerts the motorist. However, it is challenging to detect the crossing vehicle accurately through the visual image of the mirror as the reflection of the crossing vehicle on the mirror is very small. As a result, the detection range and detection accuracy may be limited.
[0003] In view of the above, there is a need for a better system to detect crossing vehicles that are approaching a road intersection.
SUMMARY
[0004] According to various embodiments, there is provided a road monitoring device. The road monitoring device includes a mirror locator, an object detection sensor and a processor. The mirror locator is configured to determine positional information and size information of a mirror. The mirror is located at an intersection between two roads. The object detection sensor is configured to detect an object and further configured to determine a perceived position of the object, wherein determining the perceived position comprises emitting a sensing signal and receiving a reflection of the sensing signal. The processor is configured to determine that the object and the road monitoring device are respectively positioned on different roads of the two roads, based on the positional information and the size information of the mirror and the perceived position of the object.
[0005] According to various embodiments, there is provided a method of monitoring road traffic. The method includes receiving positional information and size information of a mirror located at an intersection between two roads. The method further includes receiving a perceived position of an object from an object detection sensor that is configured to determine the perceived position by emitting a sensing signal and receiving a reflection of the sensing signal. The method further includes determining that the object and a road monitoring device are respectively positioned on different roads of the two roads, based on the received positional information and the size information of the mirror and the perceived position of the object, wherein the road monitoring device is preferably the abovementioned road monitoring device.
[0006] Additional features for advantageous embodiments are provided in the dependent claims.
BRIEF DESCRIPTION OF THE DRAWINGS
[0007] In the drawings, like reference characters generally refer to the same parts throughout the different views. The drawings are not necessarily to scale, emphasis instead generally being placed upon illustrating the principles of the invention. In the following description, various embodiments are described with reference to the following drawings, in which:
[0008] FIG. 1 shows an illustration of a traffic scenario at a road intersection between a first road and a second road.
[0009] FIG. 2 shows a conceptual diagram of a road monitoring device according to various embodiments.
[0010] FIG. 3 shows a simplified data flow diagram of a road monitoring device according to various embodiments.
[0011] FIG. 4 shows a flow diagram of a method of monitoring road traffic according to various embodiments.
[0012] FIGS. 5 and 6 illustrate the operation of the road monitoring device of FIGS. 2 and 3, using an example.
[0013] FIG. 7 shows an example of the output data of a mirror locator, according to various embodiments.
[0014] FIG. 8 shows an example of output data of an object detection sensor according to various embodiments.
[0015] FIG. 9 shows an example of fusion data resulting from combining the output data of the mirror locator as shown in FIG. 7 and the output data of the object detection sensor as shown in FIG. 8.
[0016] FIG. 10 shows a visual representation of the fusion data shown in FIG. 9.
DESCRIPTION
[0017] Embodiments described below in context of the devices are analogously valid for the respective methods, and vice versa. Furthermore, it will be understood that the embodiments described below may be combined, for example, a part of one embodiment may be combined with a part of another embodiment.
[0018] It will be understood that any property described herein for a specific device may also hold for any device described herein. It will be understood that any property described herein for a specific method may also hold for any method described herein. Furthermore, it will be understood that for any device or method described herein, not necessarily all the components or steps described must be enclosed in the device or method, but only some (but not all) components or steps may be enclosed.
[0019] The term "coupled" (or "connected") herein may be understood as electrically coupled or as mechanically coupled, for example attached or fixed, or just in contact without any fixation, and it will be understood that both direct coupling or indirect coupling (in other words: coupling without direct contact) may be provided.
[0020] The road monitoring device as described in this description may include a memory which is for example used in the processing carried out in the device. A memory used in the embodiments may be a volatile memory, for example a DRAM (Dynamic Random Access Memory) or a nonvolatile memory, for example a PROM (Programmable Read Only Memory), an EPROM (Erasable PROM), EEPROM (Electrically Erasable PROM), or a flash memory, e.g., a floating gate memory, a charge trapping memory, an MRAM (Magnetoresistive Random Access Memory) or a PCRAM (Phase Change Random Access Memory).
[0021] The term "road intersection" may also be referred to herein as "road junction".
[0022] The term "mirror" may also be referred to herein as "traffic mirror", "convex mirror", "curved mirror" or "road reflector".
[0023] In order that the invention may be readily understood and put into practical effect, various embodiments will now be described by way of examples and not limitations, and with reference to the figures.
[0024] FIG. 1 shows an illustration of a traffic scenario at a road intersection 120 between a first road 122 and a second road 124. The first road 122 may be transverse to the second road 124, i.e. meet the second road 124 at an acute angle, an obtuse angle, or a right angle. A first vehicle 132 may be travelling in a first direction 142 on the first road 122. A second vehicle 134 may be travelling in a second direction 144 on the second road 124. The first direction 142 may be transverse to the second direction 144. The second vehicle 134 may be referred to as a crossing vehicle with respect to the first vehicle 132. A mirror 130 may be positioned at the road intersection 120, so that the driver of the first vehicle 132 can see a reflected image of the second vehicle 134, i.e. the crossing vehicle, in the mirror 130, and vice-versa. In such a scenario, a prior art warning system as described in the background may attempt to detect a reflected image 136 of the second vehicle 134 by applying image recognition techniques to an image of the mirror 130. However, given the typical resolution of an onboard camera in vehicles, and the small size of the reflected image 136, the detection accuracy may be low. The detection accuracy may also be adversely affected under some lighting conditions, for example, when the sun is too bright, causing blinding reflections, or at night when the road is dimly lit. Also, the prior art warning system may not be able to determine a travelling direction and speed of the second vehicle 134 based on the reflected image 136.
[0025] According to various embodiments, a road monitoring device 100 may be configured to detect crossing vehicles at a road intersection, like in the traffic scenario shown in FIG. 1.
[0026] FIG. 2 shows a conceptual diagram of the road monitoring device 100 according to various embodiments. The road monitoring device 100 includes a mirror locator 102 that is configured to determine position information and size information of a mirror 130 located at an intersection (for example, road intersection 120) between two roads. The road monitoring device 100 further includes an object detection sensor 104 configured to detect an object and further configured to determine a perceived position of the object. The object detection sensor 104 determines the perceived position by emitting a sensing signal and receiving a reflection of the sensing signal. The road monitoring device 100 further includes a processor 106. The processor 106 is configured to determine that the object and the road monitoring device 100 are respectively positioned on different roads of the two roads, based on the positional information and the size information of the mirror and the perceived position of the object. The mirror locator 102, the object detection sensor 104 and the processor 106 may be coupled to one another, for example electrically and/or communicatively, by coupling lines 108.
[0027] The road monitoring device 100 may be coupleable to a vehicle, and may also be coupleable to sensors in the vehicle. For example, the road monitoring device 100 may be installed in the first vehicle 132. The road monitoring device 100 may alert the driver of the first vehicle 132 to approaching crossing vehicles, such as the second vehicle 134. The road monitoring device 100 may obtain information about the mirror 130 using the mirror locator 102. The road monitoring device 100 may further obtain information about the crossing vehicle, using an object detection sensor 104. The object detection sensor 104 may be an active emission sensor, such as a radar device or a LiDAR device. The object detection sensor 104 may be capable of detecting the speed and range of the crossing vehicle. The processor 106 may perform data fusion, to combine the information received from the mirror locator 102 and the information received from the object detection sensor. The processor 106 may determine whether a detected object is a crossing vehicle, based on the combined information.
[0028] FIG. 3 shows a simplified data flow diagram of the road monitoring device 100 according to various embodiments. The mirror locator 102 may be configured to transmit positional information 110 of the mirror 130, as well as size information 112 of the mirror 130, to the processor 106. The object detection sensor 104 may be configured to transmit a perceived position 114 of a detected object, to the processor 106. The processor 106 may be configured to generate an output 116 based on the information received from the mirror locator 102 and the object detection sensor 104. The output 116 may include information on whether the object and the road monitoring device 100 are positioned on different roads that meet at a road intersection. In other words, the output 116 may indicate whether the object is a crossing vehicle, i.e. moving on an intersecting road, relative to a vehicle on which the road monitoring device 100 is installed.
[0029] The road monitoring device 100 may detect crossing vehicles without being constrained by camera resolution and lighting conditions, and may achieve better accuracy in detecting crossing vehicles as compared to prior art warning systems.
[0030] According to an embodiment which may be combined with the above-described embodiment or with any below described further embodiment, the positional information 110 of the mirror 130 may include a distance between the mirror 130 and the road monitoring device 100, and an orientation of the mirror 130 relative to the road monitoring device 100. The processor 106 may use the information on the distance between the mirror 130 and the road monitoring device 100 to compute the time taken for a sensing signal to travel between the object detection sensor 104 and the mirror 130. The processor 106 may use the information on the orientation of the mirror 130 to determine an angular position of the object relative to the road monitoring device 100.
[0031] According to an embodiment which may be combined with any above-described embodiment or with any below described further embodiment, the size information 112 of the mirror 130 may include width of the mirror 130. The processor 106 may use the size information 112 to determine an area where the mirror 130 obstructs direct line-of-sight.
[0032] According to an embodiment which may be combined with any above-described embodiment or with any below described further embodiment, the processor 106 may be further configured to determine that the mirror 130 is obstructing a direct path between the road monitoring device 100 and the perceived position 114 of the object, based on the positional information 110 and the size information 112 of the mirror 130 and the perceived position 114 of the object. The processor 106 may be configured to determine that the object and the road monitoring device 100 are respectively positioned on different roads of the two roads of the road intersection, based on determining that the mirror 130 is obstructing the direct path between the road monitoring device 100 and the perceived position 114 of the object.
[0033] According to an embodiment which may be combined with any above-described embodiment or with any below described further embodiment, the object detection sensor 104 may be configured to detect objects in a first direction towards the mirror. The first direction may be towards a front of the vehicle that the road monitoring device 100 is coupled to.
[0034] According to an embodiment which may be combined with any above-described embodiment or with any below described further embodiment, the road monitoring device 100 may further include a further object detection sensor configured to detect objects in a second direction transverse to the first direction. The second direction may be lateral to the vehicle that the road monitoring device 100 is coupled to. Optionally, the further object detection sensor may be configured to determine a second perceived position of the object, and the processor 106 may be further configured to determine that the object and the road monitoring device 100 are respectively positioned on different roads of the two roads of the road intersection, further based on the second perceived position of the object. The further object detection sensor may transmit its further sensing signal to the object and the second perceived position may be determined based on a direct reflection of the further sensing signal reflected off the object. The processor 106 may verify its output 116 based on the second perceived position, and thereby improve the accuracy or confidence level of its output 116.
[0035] According to an embodiment which may be combined with any above-described embodiment or with any below described further embodiment, the road monitoring device 100 may be further configured to determine a true position of the object based on the positional information and the size information of the mirror 130 and the perceived position of the object. The output 116 generated by the processor 106 may include the true position of the object, for example, indicated in positional coordinates such as latitude and longitude, or in distance and angle relative to the road monitoring device 100. The processor 106 may further compare the true position of the object to the second perceived position received from the further object detection sensor.
[0036] According to an embodiment which may be combined with any above-described embodiment or with any below described further embodiment, the object detection sensor 104 may be further configured to determine motion information of the object. The object detection sensor 104 may transmit the motion information to the processor 106. The motion information may include travelling speed and perceived movement direction of the object. The processor 106 may use the motion information to determine whether a crossing vehicle is moving towards or away from the road intersection, to assess whether there is a danger of collision.
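The towards-or-away assessment described above can be sketched as a simple dot-product test. The function name, tuple arguments and planar coordinate convention are illustrative assumptions, not part of the patent:

```python
def approaching_intersection(obj_pos, obj_velocity, intersection_pos):
    """Return True if the object's velocity has a positive component
    along the vector from the object towards the intersection."""
    dx = intersection_pos[0] - obj_pos[0]
    dy = intersection_pos[1] - obj_pos[1]
    vx, vy = obj_velocity
    # Positive dot product: the object is closing on the intersection.
    return vx * dx + vy * dy > 0.0
```

A positive projection means the object's range to the intersection is shrinking; a collision-risk check would combine this with the crossing-road determination made by the processor 106.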
[0037] According to an embodiment which may be combined with any above-described embodiment or with any below described further embodiment, the road monitoring device 100 may be disposed on a travelling vehicle, and the processor 106 may be further configured to determine whether the travelling vehicle is on a collision path with the object based on the motion information of the object and the determination that the object and the road monitoring device are respectively positioned on different roads of the two roads.
[0038] According to an embodiment which may be combined with any above-described embodiment or with any below described further embodiment, the object detection sensor 104 may include at least one of a radar sensor and a LiDAR sensor. An active sensor such as a radar or LiDAR sensor may be capable of determining the velocity of the detected object based on the Doppler effect.
[0039] According to an embodiment which may be combined with any above-described embodiment or with any below described further embodiment, the object detection sensor 104 may include a transceiver and a determination module. The transceiver may be configured to emit the sensing signal and may be further configured to receive the reflection of the sensing signal. The determination module may be configured to determine the perceived position of the object based on a time of arrival of the reflection of the sensing signal.
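The time-of-arrival relation used by the determination module is the standard pulse-ranging formula. This sketch, with assumed names, converts a measured round-trip delay of the reflected sensing signal into a one-way perceived range:

```python
C = 299_792_458.0  # propagation speed of a radar/LiDAR sensing signal, m/s

def perceived_range(round_trip_delay_s):
    """One-way range derived from the round-trip time of flight:
    the echo covers the distance twice, so divide by two."""
    return C * round_trip_delay_s / 2.0
```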
[0040] According to an embodiment which may be combined with any above-described embodiment or with any below described further embodiment, the mirror locator 102 may be configured to determine the position of the mirror 130 based on processing image data. The image data may include an image of the road intersection 120. The mirror locator 102 may be configured to determine the position information of the mirror 130, as well as the size information of the mirror 130, by image recognition techniques applied to the visual image of the road intersection 120. The mirror locator 102 may include a camera configured to capture the image data.
[0041] FIG. 4 shows a flow diagram of a method 400 of monitoring road traffic according to various embodiments. The method 400 may include receiving positional information and size information of a mirror located at an intersection between two roads, in 402. The method 400 may further include receiving a perceived position of an object from an object detection sensor 104 configured to determine the perceived position by emitting a sensing signal and receiving a reflection of the sensing signal, in 404. The method 400 may further include determining that the object and a road monitoring device are respectively positioned on different roads of the two roads, based on the received positional information and the size information of the mirror 130 and the perceived position of the object. The road monitoring device may be the road monitoring device 100, and may be coupled to a vehicle.
[0042] According to an embodiment which may be combined with the above-described embodiment or with any below described further embodiment, the method 400 may further include determining the positional information and size information of the mirror 130 using a mirror locator 102.
[0043] According to an embodiment which may be combined with any above-described embodiment or with any below described further embodiment, the method 400 may further include detecting the object and determining the perceived position of the object, using the object detection sensor 104.
[0044] Various aspects of the method 400 described may also apply to the use of any of the above-described road monitoring devices 100.
[0045] FIGS. 5 and 6 illustrate the operation of the road monitoring device 100, using an example.
[0046] Referring to FIG. 5, the road monitoring device 100 may be coupled to a vehicle 532 travelling on a first road 522. The vehicle 532 may be approaching a road intersection 120. A crossing vehicle 534 may be travelling along an intersecting road 524 that joins the first road 522 at the road intersection 120.
[0047] The road monitoring device 100 may determine the position and size of the mirror 130 using its mirror locator 102. The mirror locator 102 may apply image processing techniques to recognize the mirror 130 in a visual image of the road intersection 120. The mirror locator 102 may determine the position and size of the mirror 130 based on the visual image. In an alternative embodiment, the mirror locator 102 may retrieve at least one of the position and size information of the mirror 130 from a database, based on detection of the mirror 130 in the visual image. The visual image may be captured by a camera onboard the vehicle 532.
[0048] The road monitoring device 100 may determine a perceived position of the crossing vehicle 534 using its object detection sensor 104. The object detection sensor 104 may emit a sensing signal 502. The sensing signal 502 may propagate towards the mirror 130, and may reflect off the mirror 130 to reach the crossing vehicle 534. The sensing signal 502 may reflect off the crossing vehicle 534 to reach the mirror 130, and then reflect off the mirror back to the vehicle 532.
[0049] Referring to FIG. 6, the sensing signal 502 may travel a first distance 602 (denoted as dl) along a path 610 from the vehicle 532 to the mirror 130. The sensing signal 502 may bounce off, i.e. be reflected by, the mirror 130, to travel a second distance 604 (denoted as d2) along a path 620 from the mirror 130 to the crossing vehicle 534. The sensing signal 502 may be at least partially reflected by the crossing vehicle 534. The reflected sensing signal may travel along a path 622 for the distance d2, from the crossing vehicle back to the mirror 130. The reflected sensing signal may reflect off the mirror 130, to travel the distance dl, from the mirror 130 back to the vehicle 532. The processor 106 may determine a perceived position of the crossing vehicle 534 based on the time delay between emitting the sensing signal 502 and receiving its reflection back at the vehicle 532, and optionally further based on an attenuation of the received reflected sensing signal as compared to the emitted sensing signal 502.
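The path geometry of FIG. 6 can be sketched numerically: the echo traverses d1 and d2 twice each, so with d1 known from the mirror locator, d2 follows from the measured delay. Function and variable names here are illustrative assumptions:

```python
C = 299_792_458.0  # propagation speed of the sensing signal, m/s

def mirror_to_object_distance(round_trip_delay_s, d1_m):
    """Recover d2 (the mirror-to-object leg) from the round-trip delay.

    The total path is d1 + d2 + d2 + d1, so the one-way path length
    is d1 + d2, and d2 is the remainder after subtracting the known d1.
    """
    one_way = C * round_trip_delay_s / 2.0  # = d1 + d2
    return one_way - d1_m
```

The perceived position then lies at distance d1 + d2 along the bearing of the mirror, while the true position lies at distance d2 from the mirror along the reflected ray.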
[0050] The strength of the received reflected sensing signal may be expressed as follows:

Pr = Pt · Gt · [1/(4π·d1²)] · σmc · [1/(4π·d2²)] · σc · [1/(4π·d2²)] · σcm · [1/(4π·d1²)] · Gr · λ²/(4π)

where Pr denotes the received power, Pt denotes the transmitted power, Gt denotes the transmitter antenna gain, Gr denotes the receiver antenna gain, d1 refers to the first distance 602, d2 refers to the second distance 604, σmc denotes the radar cross section of the mirror 130, σc denotes the radar cross section of the crossing vehicle 534, σcm denotes the bi-static radar cross section of the mirror 130, and λ denotes the wavelength of the emitted sensing signal 502.
[0051] As σcm = σmc, and Gr = Gt, the expression may be simplified to:

Pr = Pt · Gt² · λ² · σmc² · σc / ((4π)⁵ · d1⁴ · d2⁴)

[0052] The vehicle 532 may optionally carry a second object detection sensor that is configured to emit a second sensing signal. The second object detection sensor may be disposed on a side of the vehicle 532, or may be configured to emit the second sensing signal towards a lateral direction of the vehicle 532. The second sensing signal may travel directly from the vehicle 532 to the crossing vehicle 534, along a path 630 for a third distance 606 (denoted as d3), without being reflected by the mirror 130. The second sensing signal may be reflected by the crossing vehicle 534. The reflected second sensing signal may travel back to the vehicle 532 along a path 632.
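For illustration, the simplified received-power expression can be evaluated directly; this is a numeric transcription of the formula, with argument names assumed:

```python
import math

def received_power(pt, gt, lam, sigma_mc, sigma_c, d1, d2):
    """Pr = Pt * Gt^2 * lambda^2 * sigma_mc^2 * sigma_c
            / ((4*pi)^5 * d1^4 * d2^4),
    assuming sigma_cm == sigma_mc and Gr == Gt, as stated in the text."""
    return (pt * gt ** 2 * lam ** 2 * sigma_mc ** 2 * sigma_c) / (
        (4.0 * math.pi) ** 5 * d1 ** 4 * d2 ** 4)
```

Note the d1⁴·d2⁴ attenuation: the double bounce off the mirror makes the echo far weaker than a direct radar return, which is why the achievable detection range depends strongly on the mirror's radar cross section.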
[0053] FIG. 7 shows an example of the output data of the mirror locator 102 represented on a graph 700, according to various embodiments. The output data of the mirror locator 102 may include positional information 110 (shown in FIG. 3) and size information 112 (shown in FIG. 3) of a mirror, for example, the mirror 130 of FIGS. 5 and 6. The graph 700 includes a horizontal axis 702 representing lateral distance (denoted as x), and also includes a vertical axis 704 representing longitudinal distance (denoted as y). A circle 730 marked on the graph 700 represents the mirror 130. The position of the circle 730 on the graph 700 represents the position of the mirror 130. The size of the circle 730 on the graph 700 represents the size of the mirror 130, for example, the distance 732 on the horizontal axis indicates the width of the mirror 130. The origin 710 of the graph 700 may represent a position of the road monitoring device 100. The mirror locator 102 may further determine the orientation of the mirror 130 relative to the road monitoring device 100. The mirror locator 102 may determine a minimum angular position 902 and a maximum angular position 904 of the mirror 130, relative to the road monitoring device 100. The mirror locator 102 may determine the minimum angular position 902 and the maximum angular position 904 based on the determined orientation of the mirror 130. In an alternative embodiment, the processor 106 may determine the minimum angular position 902 and the maximum angular position 904.
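Treating the mirror as a segment of the detected width, the minimum and maximum angular positions can be sketched as follows. The coordinate convention (road monitoring device at the origin, mirror width along the x-axis) mirrors graph 700 but is an assumption for illustration:

```python
import math

def mirror_angular_extent(mirror_x, mirror_y, width):
    """Angles subtended by the two edges of the mirror, as seen from
    the road monitoring device at the origin of the x-y graph."""
    half = width / 2.0
    edge_angles = (math.atan2(mirror_y, mirror_x + half),
                   math.atan2(mirror_y, mirror_x - half))
    return min(edge_angles), max(edge_angles)  # (minimum, maximum)
```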
[0054] FIG. 8 shows an example of output data of the object detection sensor 104 represented on graphs 800A and 800B, according to various embodiments. In this example, the object detection sensor 104 detected the crossing vehicle 534 of FIGS. 5 and 6. The output data of the object detection sensor 104 may include a perceived position 114 (shown in FIG. 3) of the detected object, in this example the crossing vehicle 534. The graph 800A includes the same horizontal axis 702 representing lateral distance and vertical axis 704 representing longitudinal distance. A triangle 830 marked on the graph 800A represents a position where the radar reflection intensity is strong, i.e. the radar cross section of the crossing vehicle 534. The position of the triangle 830 on the graph 800A represents the perceived position of the crossing vehicle 534. The size of the triangle 830 on the graph 800A may also represent the size of the crossing vehicle 534. The object detection sensor 104 may determine a perceived angular position 906 of the crossing vehicle 534 relative to the road monitoring device 100 based on the perceived position. In an alternative embodiment, the processor 106 may determine the perceived angular position 906.
[0055] The output data of the object detection sensor 104 may further include motion information, for example, speed and movement direction, of the crossing vehicle. The graph 800B includes a horizontal axis 802 that represents speed (denoted as v) and also includes a vertical axis 804 that represents range (denoted as d). The graph 800B may present a range-Doppler map. A triangle 832 marked on the graph 800B represents Doppler information of the crossing vehicle 534. The vertical position of the triangle 832 may indicate a range of the crossing vehicle 534, in other words, the distance of the crossing vehicle 534 from the object detection sensor 104. The horizontal position of the triangle 832 may indicate a speed of the crossing vehicle 534.
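As an editorial sketch of how a cell of such a range-Doppler map translates into a distance and a radial speed: the patent does not specify any bin layout, so the linear mapping and the centred-zero Doppler convention below are only one common radar convention, not the disclosed implementation.

```python
def bin_to_range_speed(range_bin, doppler_bin, n_range_bins, n_doppler_bins,
                       max_range_m, max_speed_mps):
    """Map a (range bin, doppler bin) cell of a range-Doppler map to a
    distance and a signed radial speed.  Bin layouts and scale factors
    vary between radar front ends; this linear mapping is illustrative."""
    rng = range_bin * max_range_m / n_range_bins
    # Doppler bins are assumed centred: bin n/2 corresponds to 0 m/s,
    # negative speeds (approaching targets) fall below the centre bin.
    spd = (doppler_bin - n_doppler_bins / 2) * (2 * max_speed_mps / n_doppler_bins)
    return rng, spd
```

With 128 range bins over 100 m and 64 Doppler bins over ±30 m/s, the cell (64, 32) maps to 50 m at 0 m/s.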
[0056] FIG. 9 shows an example of fusion data resulting from combining the output data of the mirror locator 102 shown in FIG. 7 and the output data of the object detection sensor 104 shown in FIG. 8, represented on a graph 900. The processor 106 may perform the data fusion. The graph 900 includes a horizontal axis 702 that represents lateral distance, and a vertical axis 704 that represents longitudinal distance. The graph 800A is superimposed onto the graph 700, to show the perceived position of the crossing vehicle 534 relative to the position of the mirror 130. An arrow 930 indicates the perceived movement direction of the crossing vehicle 534, as determined by the object detection sensor 104. The processor 106 may determine whether the perceived angular position 906 lies between the minimum angular position 902 and the maximum angular position 904, which indicates that a direct path between the perceived position of the crossing vehicle 534 and the road monitoring device 100 may be obstructed by the mirror 130. The processor 106 may determine, based on the minimum angular position 902, the maximum angular position 904, and the perceived angular position 906, that the crossing vehicle 534 is on the intersecting road 524.
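The angular check described above can be sketched as follows. Again this is editorial illustration, not the disclosed implementation: the function name is invented, angles are assumed measured from the device at the origin, and wrap-around at ±π is ignored.

```python
import math

def behind_mirror(perceived_xy, min_angle, max_angle):
    """Return True when the perceived angular position of a detection lies
    inside the angular extent of the mirror, i.e. a direct path from the
    device to the perceived position would be obstructed by the mirror."""
    x, y = perceived_xy
    perceived_angle = math.atan2(y, x)
    return min_angle <= perceived_angle <= max_angle
```

A fuller check would also compare the perceived range against the distance to the mirror, since a detection at an angle inside the mirror's extent but closer than the mirror is in front of it, not behind it.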
[0057] FIG. 10 shows a visual representation of the fusion data shown in FIG. 9. By combining the perceived position 1010 received from the object detection sensor 104 and the positional information received from the mirror locator 102, the processor 106 may determine that the perceived position 1010 is "behind" the mirror 130. The true position of the crossing vehicle 534 cannot be behind the mirror 130 since the mirror 130 would then obstruct the propagation of the sensing signal to the crossing vehicle 534. The processor 106 may thereby determine that the crossing vehicle 534 is, in fact, on the intersecting road 524. The processor 106 may translate the perceived position 1010 to the true position on the intersecting road 524, based on the perceived position 1010 and the positional information of the mirror 130. The processor 106 may also translate the perceived movement direction 1012 of the crossing vehicle 534 to a true movement direction 1014 of the crossing vehicle, based on the perceived movement direction and the positional information of the mirror 130. The road monitoring device 100 may generate a warning message to alert the driver of the vehicle 532 that the crossing vehicle 534 is heading towards the road intersection 120, based on the determination that the crossing vehicle 534 is on the intersecting road 524. The road monitoring device 100 may further generate the warning message based on the true movement direction 1014 and true position of the crossing vehicle 534.
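The translation from perceived to true position and movement direction amounts geometrically to reflecting both across the mirror. The sketch below is an editorial illustration under a simplifying assumption the patent does not make: it treats the mirror as planar (real traffic mirrors are usually convex), and the function name and 2-D mirror-line parameterisation are invented.

```python
import math

def reflect_across_mirror(point, direction, mirror_point, mirror_angle_rad):
    """Translate a perceived position/direction 'behind' a planar mirror
    into a true position/direction by reflecting both across the mirror
    line, which passes through mirror_point at angle mirror_angle_rad."""
    px, py = point
    mx, my = mirror_point
    # Shift coordinates so the mirror line passes through the origin.
    rx, ry = px - mx, py - my
    # Reflection across a line at angle t uses the matrix
    # [[cos 2t, sin 2t], [sin 2t, -cos 2t]].
    c2 = math.cos(2 * mirror_angle_rad)
    s2 = math.sin(2 * mirror_angle_rad)
    true_pos = (c2 * rx + s2 * ry + mx, s2 * rx - c2 * ry + my)
    dx, dy = direction
    # Directions are free vectors: reflect without the translation.
    true_dir = (c2 * dx + s2 * dy, s2 * dx - c2 * dy)
    return true_pos, true_dir
```

For a mirror along the line y = 10, a perceived position (0, 15) five metres "behind" the mirror reflects to (0, 5), and a perceived direction heading away from the device flips towards it.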
[0058] The following examples describe further technical aspects of the devices, systems and methods described above and shall not be interpreted as claims.
[0059] The following examples can additionally be combined with any of the devices and methods as described above, and with any of the claims as originally filed.
[0060] Example 1 is a road monitoring device including: a mirror locator configured to determine positional information and size information of a mirror located at an intersection between two roads; an object detection sensor configured to detect an object and further configured to determine a perceived position of the object, wherein determining the perceived position includes emitting a sensing signal and receiving a reflection of the sensing signal; and a processor configured to determine that the object and the road monitoring device are respectively positioned on different roads of the two roads, based on the positional information and the size information of the mirror and the perceived position of the object.
[0061] In example 2, the subject-matter of example 1 can optionally include that the positional information of the mirror includes distance between the mirror and the road monitoring device, and an orientation of the mirror relative to the road monitoring device.
[0062] In example 3, the subject-matter of any one of the preceding examples can optionally include that the size information of the mirror includes width of the mirror.
[0063] In example 4, the subject-matter of any one of the preceding examples can optionally include that the processor is further configured to determine that the mirror is obstructing a direct path between the road monitoring device and the perceived position of the object, based on the positional information and the size information of the mirror and the perceived position of the object.
[0064] In example 5, the subject-matter of example 4 can optionally include that the processor is configured to determine that the object and the road monitoring device are respectively positioned on different roads of the two roads, based on determining that the mirror is obstructing the direct path between the road monitoring device and the perceived position of the object.
[0065] In example 5A, the subject-matter of any one of the preceding examples can optionally include that the road monitoring device is coupleable to a vehicle.
[0066] In example 6, the subject-matter of any one of the preceding examples can optionally include that the object detection sensor is configured to detect objects in a first direction towards the mirror.
[0067] In example 6A, the subject-matter of any one of the preceding examples can optionally include that the object detection sensor is configured to detect objects that are in front of the vehicle.
[0068] In example 7, the subject-matter of any one of the preceding examples can optionally include a further object detection sensor configured to detect objects in a second direction transverse to the first direction.
[0069] In example 7A, the subject-matter of example 7 can optionally include that the further object detection sensor is configured to detect objects that are lateral to the vehicle.
[0070] In example 8, the subject-matter of any one of examples 7 or 7A can optionally include that the further object detection sensor is further configured to determine a second perceived position of the object, and wherein the processor is further configured to determine that the object and the road monitoring device are respectively positioned on different roads of the two roads, further based on the second perceived position of the object.
[0071] In example 8A, the subject-matter of any one of the preceding examples can optionally include that the processor is further configured to determine a true position of the object based on the positional information and the size information of the mirror and the perceived position of the object.
[0072] In example 8B, the subject-matter of any one of the preceding examples can optionally include that the processor compares the second perceived position to the determined true position.

[0073] In example 9, the subject-matter of any one of the preceding examples can optionally include that the object detection sensor is further configured to determine motion information of the object.
[0074] In example 10, the subject-matter of example 9 can optionally include that the motion information of the object includes travelling speed and perceived movement direction of the object.

[0075] In example 11, the subject-matter of any one of examples 9 to 10 can optionally include that the road monitoring device is disposed on a travelling vehicle, and wherein the processor is further configured to determine whether the travelling vehicle is on a collision path with the object based on the motion information of the object and the determination that the object and the road monitoring device are respectively positioned on different roads of the two roads.
[0076] In example 11A, the subject-matter of any one of the preceding examples can optionally include that the object detection sensor includes a transceiver and a determination module, wherein the transceiver is configured to emit the sensing signal and further configured to receive the reflection of the sensing signal, and wherein the determination module is configured to determine the perceived position of the object based on a time of arrival of the reflection of the sensing signal.

[0077] In example 12, the subject-matter of any one of the preceding examples can optionally include that the object detection sensor includes at least one of a radar sensor and a LiDAR sensor.

[0078] In example 13, the subject-matter of any one of the preceding examples can optionally include that the mirror locator is configured to determine the position of the mirror based on processing image data.
[0079] Example 14 is a method of monitoring road traffic, the method including: receiving a positional information and size information of a mirror located at an intersection between two roads; receiving a perceived position of the object from an object detection sensor configured to determine the perceived position by emitting a sensing signal and receiving a reflection of the sensing signal; and determining that the object and a road monitoring device are respectively positioned on different roads of the two roads, based on the received positional information and the size information of the mirror and the perceived position of the object, wherein the road monitoring device is preferably the road monitoring device of any one of the preceding examples.

[0080] In example 14A, the subject-matter of example 14 can optionally include: determining the positional information and size information of the mirror using a mirror locator.
[0081] In example 14B, the subject-matter of any one of examples 14 and 14A can optionally include: detecting the object and determining the perceived position of the object, using the object detection sensor.
[0082] Example 15 is a non-transitory computer-readable medium including instructions which, when executed by a computer, cause the computer to carry out the method of any one of examples 14, 14A and 14B.
[0083] While embodiments of the invention have been particularly shown and described with reference to specific embodiments, it should be understood by those skilled in the art that various changes in form and detail may be made therein without departing from the spirit and scope of the invention as defined by the appended claims. The scope of the invention is thus indicated by the appended claims and all changes which come within the meaning and range of equivalency of the claims are therefore intended to be embraced. It will be appreciated that common numerals, used in the relevant drawings, refer to components that serve a similar or the same purpose.
[0084] It will be appreciated to a person skilled in the art that the terminology used herein is for the purpose of describing various embodiments only and is not intended to be limiting of the present invention. As used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms "comprises" and/or "comprising," when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
[0085] It is understood that the specific order or hierarchy of blocks in the processes / flowcharts disclosed is an illustration of exemplary approaches. Based upon design preferences, it is understood that the specific order or hierarchy of blocks in the processes / flowcharts may be rearranged. Further, some blocks may be combined or omitted. The accompanying method claims present elements of the various blocks in a sample order, and are not meant to be limited to the specific order or hierarchy presented.
[0086] The previous description is provided to enable any person skilled in the art to practice the various aspects described herein. Various modifications to these aspects will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other aspects. Thus, the claims are not intended to be limited to the aspects shown herein, but are to be accorded the full scope consistent with the language of the claims, wherein reference to an element in the singular is not intended to mean "one and only one" unless specifically so stated, but rather "one or more." The word "exemplary" is used herein to mean "serving as an example, instance, or illustration." Any aspect described herein as "exemplary" is not necessarily to be construed as preferred or advantageous over other aspects. Unless specifically stated otherwise, the term "some" refers to one or more. Combinations such as "at least one of A, B, or C," "one or more of A, B, or C," "at least one of A, B, and C," "one or more of A, B, and C," and "A, B, C, or any combination thereof" include any combination of A, B, and/or C, and may include multiples of A, multiples of B, or multiples of C. Specifically, combinations such as "at least one of A, B, or C," "one or more of A, B, or C," "at least one of A, B, and C," "one or more of A, B, and C," and "A, B, C, or any combination thereof" may be A only, B only, C only, A and B, A and C, B and C, or A and B and C, where any such combinations may contain one or more member or members of A, B, or C. All structural and functional equivalents to the elements of the various aspects described throughout this disclosure that are known or later come to be known to those of ordinary skill in the art are expressly incorporated herein by reference and are intended to be encompassed by the claims.

Claims (15)

CLAIMS

  1. A road monitoring device (100) comprising: a mirror locator (102) configured to determine positional information and size information of a mirror located at an intersection between two roads; an object detection sensor (104) configured to detect an object and further configured to determine a perceived position of the object, wherein determining the perceived position comprises emitting a sensing signal and receiving a reflection of the sensing signal; and a processor (106) configured to determine that the object and the road monitoring device (100) are respectively positioned on different roads of the two roads, based on the positional information and the size information of the mirror and the perceived position of the object.
  2. The road monitoring device (100) of claim 1, wherein the positional information of the mirror comprises distance between the mirror and the road monitoring device (100), and an orientation of the mirror relative to the road monitoring device (100).
  3. The road monitoring device (100) of any one of claims 1 to 2, wherein the size information of the mirror comprises width of the mirror.
  4. The road monitoring device (100) of any one of claims 1 to 3, wherein the processor (106) is further configured to determine that the mirror is obstructing a direct path between the road monitoring device (100) and the perceived position of the object, based on the positional information and the size information of the mirror and the perceived position of the object.
  5. The road monitoring device (100) of claim 4, wherein the processor (106) is configured to determine that the object and the road monitoring device (100) are respectively positioned on different roads of the two roads, based on determining that the mirror is obstructing the direct path between the road monitoring device (100) and the perceived position of the object.
  6. The road monitoring device (100) of any one of claims 1 to 5, wherein the object detection sensor (104) is configured to detect objects in a first direction towards the mirror.
  7. The road monitoring device (100) of claim 6, further comprising: a further object detection sensor configured to detect objects in a second direction transverse to the first direction.
  8. The road monitoring device (100) of claim 7, wherein the further object detection sensor is further configured to determine a second perceived position of the object, and wherein the processor (106) is further configured to determine that the object and the road monitoring device (100) are respectively positioned on different roads of the two roads, further based on the second perceived position of the object.
  9. The road monitoring device (100) of any one of claims 1 to 8, wherein the object detection sensor (104) is further configured to determine motion information of the object.
  10. The road monitoring device (100) of claim 9, wherein the motion information of the object comprises travelling speed and perceived movement direction of the object.
  11. The road monitoring device (100) of any one of claims 9 to 10, wherein the road monitoring device (100) is disposed on a travelling vehicle, and wherein the processor (106) is further configured to determine whether the travelling vehicle is on a collision path with the object based on the motion information of the object and the determination that the object and the road monitoring device (100) are respectively positioned on different roads of the two roads.
  12. The road monitoring device (100) of any one of claims 1 to 11, wherein the object detection sensor (104) comprises at least one of a radar sensor and a LiDAR sensor.
  13. The road monitoring device (100) of any one of claims 1 to 12, wherein the mirror locator (102) is configured to determine the position of the mirror based on processing image data.
  14. A method of monitoring road traffic, the method comprising: receiving a positional information and size information of a mirror located at an intersection between two roads (402); receiving a perceived position of the object from an object detection sensor configured to determine the perceived position by emitting a sensing signal and receiving a reflection of the sensing signal (404); and determining that the object and a road monitoring device are respectively positioned on different roads of the two roads, based on the received positional information and the size information of the mirror and the perceived position of the object (406), wherein the road monitoring device is preferably the road monitoring device of any one of claims 1 to 13.
  15. A non-transitory computer-readable medium comprising instructions which, when executed by a computer, cause the computer to carry out the method of claim 14.
GB2117062.6A 2021-11-26 2021-11-26 Road monitoring device and method of monitoring road traffic Withdrawn GB2613183A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB2117062.6A GB2613183A (en) 2021-11-26 2021-11-26 Road monitoring device and method of monitoring road traffic

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB2117062.6A GB2613183A (en) 2021-11-26 2021-11-26 Road monitoring device and method of monitoring road traffic

Publications (2)

Publication Number Publication Date
GB202117062D0 GB202117062D0 (en) 2022-01-12
GB2613183A true GB2613183A (en) 2023-05-31

Family

ID=80038594

Family Applications (1)

Application Number Title Priority Date Filing Date
GB2117062.6A Withdrawn GB2613183A (en) 2021-11-26 2021-11-26 Road monitoring device and method of monitoring road traffic

Country Status (1)

Country Link
GB (1) GB2613183A (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180018878A1 (en) * 2015-01-22 2018-01-18 Pioneer Corporation Driving assistance device and driving assistance method
US20180178800A1 (en) * 2016-12-27 2018-06-28 Panasonic Intellectual Property Corporation Of America Information processing apparatus, information processing method, and recording medium
US20180181824A1 (en) * 2016-12-27 2018-06-28 Panasonic Intellectual Property Corporation Of America Information processing apparatus, information processing method, and recording medium
US20190197324A1 (en) * 2017-12-27 2019-06-27 Hyundai Motor Company Vehicle blind spot detection system and control method thereof
US20200271761A1 (en) * 2016-08-10 2020-08-27 James Thomas O'Keeffe Distributed lidar with fiber optics and a field of view combiner

Also Published As

Publication number Publication date
GB202117062D0 (en) 2022-01-12

Similar Documents

Publication Publication Date Title
CN109712432B (en) System and method for projecting a trajectory of an autonomous vehicle onto a road surface
KR100803414B1 (en) Near object detection system
US8946990B1 (en) Vehicle headlight detection system
CN101326511B (en) Method for detecting or predicting vehicle cut-ins
US9174569B2 (en) Method for controlling a vehicle member
US11608055B2 (en) Enhanced autonomous systems with sound sensor arrays
US20160178802A1 (en) Road surface reflectivity detection by lidar sensor
US20090002222A1 (en) Method of estimating target elevation utilizing radar data fusion
US10906542B2 (en) Vehicle detection system which classifies valid or invalid vehicles
US10907962B2 (en) Driving assistance system mounted in vehicle
WO2019008716A1 (en) Non-visible measurement device and non-visible measurement method
US11999370B2 (en) Automated vehicle system
WO2020039840A1 (en) Radar processing device
CN111873906A (en) Vehicle side door opening angle early warning method, system, medium and vehicle-mounted terminal
JP2020197506A (en) Object detector for vehicles
JP7075244B2 (en) Peripheral monitoring device, peripheral monitoring system, and peripheral monitoring method
US11798417B2 (en) Driving assistance device
CN210617998U (en) Blind area detection equipment for freight transport and passenger transport vehicles
GB2613183A (en) Road monitoring device and method of monitoring road traffic
US11731622B2 (en) Prediction of dynamic objects at concealed areas
US20220017100A1 (en) Radar device for vehicle and method of controlling radar for vehicle
CN111332310A (en) Identification of objects by indirect signal reflection
CN112305544A (en) Motor vehicle having a radar sensor and method for operating a motor vehicle
CN117233767A (en) Method for operating a radar device of a vehicle, radar device and vehicle
JP2986567B2 (en) Inter-vehicle distance measuring device

Legal Events

Date Code Title Description
COOA Change in applicant's name or ownership of the application

Owner name: CONTINENTAL AUTONOMOUS MOBILITY GERMANY GMBH

Free format text: FORMER OWNER: CONTINENTAL AUTOMOTIVE GMBH

WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)