WO2022099224A1 - Radar signal filtering for removing noise due to transparent or translucent material located in front of sensor - Google Patents


Info

Publication number
WO2022099224A1
WO2022099224A1 · PCT/US2021/065167
Authority
WO
WIPO (PCT)
Prior art keywords
radar
data
radar sensor
transparent
translucent material
Prior art date
Application number
PCT/US2021/065167
Other languages
French (fr)
Inventor
Gaoyu XIAO
Zhebin ZHANG
Hongyu Sun
Jian Sun
Original Assignee
Innopeak Technology, Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Innopeak Technology, Inc. filed Critical Innopeak Technology, Inc.
Priority to PCT/US2021/065167 priority Critical patent/WO2022099224A1/en
Publication of WO2022099224A1 publication Critical patent/WO2022099224A1/en

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/02Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
    • G01S7/40Means for monitoring or calibrating
    • G01S7/4004Means for monitoring or calibrating of parts of a radar system
    • G01S7/4039Means for monitoring or calibrating of parts of a radar system of sensor or antenna obstruction, e.g. dirt- or ice-coating
    • G01S7/4043Means for monitoring or calibrating of parts of a radar system of sensor or antenna obstruction, e.g. dirt- or ice-coating including means to prevent or remove the obstruction
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88Radar or analogous systems specially adapted for specific applications
    • G01S13/93Radar or analogous systems specially adapted for specific applications for anti-collision purposes
    • G01S13/931Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • G01S2013/9327Sensor installation details
    • G01S2013/93276Sensor installation details in the windshield area

Definitions

  • the present disclosure relates, in general, to methods, systems, and apparatuses for implementing radar-based object detection (e.g., advanced driver assistance systems ("ADASs”), other radar-based object detection, or the like), and, more particularly, to methods, systems, and apparatuses for implementing radar signal filtering for removing noise due to transparent or translucent material located in front of sensor.
  • These input data can come from different types of sensors, such as a camera, radio detection and ranging (“Radar”), and/or light detection and ranging (“Lidar”).
  • Radar is often adopted for its insensitivity to bad weather and its low cost, even though Radar data comes with relatively low image resolution.
  • a Radar device can be installed in different locations in a vehicle. It can be placed outside a driver's compartment, such as right behind the bumper at the front of the vehicle for better field of view (“FOV”). It can also be placed behind the windshield inside the driver's compartment for easier installation, maintenance, and better portability.
  • the techniques of this disclosure generally relate to tools and techniques for implementing radar-based object detection (e.g., ADASs, or the like), and, more particularly, to methods, systems, and apparatuses for implementing radar signal filtering for removing noise due to transparent or translucent material located in front of sensor.
  • a method may comprise receiving, using a computing system, radio detection and ranging ("Radar”) data from a Radar sensor, the Radar sensor being disposed such that a transparent or translucent material is positioned between one or more objects and the Radar sensor; parsing, using the computing system, the Radar data to extract range or depth data; based on a determination that at least one extracted range or depth data corresponds to a distance between the Radar sensor and the transparent or translucent material, filtering out, using the computing system, at least one Radar data point corresponding to said at least one extracted range or depth data from the Radar data to produce filtered Radar data, and utilizing the filtered Radar data as input to an object detection system; and, based on a determination that no extracted range or depth data corresponds to the distance between the Radar sensor and the transparent or translucent material, utilizing the Radar data from the Radar sensor as input to the object detection system.
  • an apparatus might comprise at least one processor and a non-transitory computer readable medium communicatively coupled to the at least one processor.
  • the non-transitory computer readable medium might have stored thereon computer software comprising a set of instructions that, when executed by the at least one processor, causes the apparatus to: receive radio detection and ranging ("Radar") data from a Radar sensor, the Radar sensor being disposed such that a transparent or translucent material is positioned between one or more objects and the Radar sensor; parse the Radar data to extract range or depth data; based on a determination that at least one extracted range or depth data corresponds to a distance between the Radar sensor and the transparent or translucent material, filter out at least one Radar data point corresponding to said at least one extracted range or depth data from the Radar data to produce filtered Radar data, and utilize the filtered Radar data as input to an object detection system; and based on a determination that no extracted range or depth data corresponds to the distance between the Radar sensor and the transparent or translucent material, utilize the Radar data from the Radar sensor as input to the object detection system.
  • a system might comprise a computing system, which might comprise at least one first processor and a first non-transitory computer readable medium communicatively coupled to the at least one first processor.
  • the first non-transitory computer readable medium might have stored thereon computer software comprising a first set of instructions that, when executed by the at least one first processor, causes the computing system to: receive radio detection and ranging ("Radar") data from a Radar sensor, the Radar sensor being disposed such that a transparent or translucent material is positioned between one or more objects and the Radar sensor; parse the Radar data to extract range or depth data; based on a determination that at least one extracted range or depth data corresponds to a distance between the Radar sensor and the transparent or translucent material, filter out at least one Radar data point corresponding to said at least one extracted range or depth data from the Radar data to produce filtered Radar data, and utilize the filtered Radar data as input to an object detection system; and based on a determination that no extracted range or depth data corresponds to the distance between the Radar sensor and the transparent or translucent material, utilize the Radar data from the Radar sensor as input to the object detection system.
  • Fig. 1 is a schematic diagram illustrating a system for implementing radar signal filtering for removing noise due to transparent or translucent material located in front of sensor, in accordance with various embodiments.
  • Figs. 2A and 2B are schematic block flow diagrams illustrating a non-limiting example of object detection using radar signal filtering for removing noise due to transparent or translucent material located in front of sensor, in accordance with various embodiments.
  • Figs. 3A-3D are schematic diagrams illustrating a non-limiting example of the use of radar signal filtering for removing noise due to transparent or translucent material located in front of sensor during implementation of an advanced driver assistance system (“ADAS”), in accordance with various embodiments.
  • Figs. 4A and 4B are flow diagrams illustrating a method for implementing radar signal filtering for removing noise due to transparent or translucent material located in front of sensor, in accordance with various embodiments.
  • Fig. 5 is a block diagram illustrating an example of computer or system hardware architecture, in accordance with various embodiments.
  • Fig. 6 is a block diagram illustrating a networked system of computers, computing systems, or system hardware architecture, which can be used in accordance with various embodiments.
  • Various embodiments provide tools and techniques for implementing radar-based object detection (e.g., advanced driver assistance systems (“ADASs”), or the like), and, more particularly, to methods, systems, and apparatuses for implementing radar signal filtering for removing noise due to transparent or translucent material located in front of sensor.
  • a general framework is provided to filter out unwanted reflection from radar data, the unwanted reflection being caused by a translucent/transparent object in front of the radar sensor.
  • this translucent/transparent object is most likely a windshield of a vehicle. This noisy data can be removed so that the quality of the radar signal can be improved, which can help with subsequent object detection.
  • the framework comprises an online part and an offline part.
  • the offline part should be executed before the online part.
  • In the offline part, the distance between the windshield and the radar sensor is determined, which is then used in the online part to filter the radar data.
  • a computing system may receive radio detection and ranging (“Radar”) data from a Radar sensor, the Radar sensor being disposed such that a transparent or translucent material is positioned between one or more objects and the Radar sensor. The computing system may parse the Radar data to extract range or depth data.
  • the computing system may filter out at least one Radar data point corresponding to said at least one extracted range or depth data from the Radar data to produce filtered Radar data, and may utilize the filtered Radar data as input to an object detection system. Based on a determination that no extracted range or depth data corresponds to the distance between the Radar sensor and the transparent or translucent material, the computing system may utilize the Radar data from the Radar sensor as input to the object detection system.
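As a non-limiting illustration, the online filtering step described above might be sketched as follows. The function and parameter names, the default tolerance, and the use of NumPy are assumptions for the sketch, not a prescribed implementation:

```python
import numpy as np

def filter_windshield_returns(ranges_m, points, windshield_distance_m,
                              tolerance_m=0.05):
    """Remove Radar returns whose extracted range matches the stored
    sensor-to-windshield distance (within a tolerance); if no return
    matches, the Radar data is passed through unchanged."""
    ranges_m = np.asarray(ranges_m, dtype=float)
    points = np.asarray(points)
    # A return is treated as windshield noise when its range falls within
    # [distance - tolerance, distance + tolerance].
    is_noise = np.abs(ranges_m - windshield_distance_m) <= tolerance_m
    if not is_noise.any():
        return points          # no match: use the Radar data as-is
    return points[~is_noise]   # match: use the filtered Radar data
```

The returned array would then be handed to the object detection system in place of the raw Radar data.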
  • the computing system may comprise at least one of a radar data processing system, the object detection system, an object detection and ranging system, a positioning and mapping system, an advanced driver assistance system ("ADAS"), a processor on a user device, a server computer over a network, a cloud computing system, or a distributed computing system, and/or the like.
  • the Radar sensor may be part of a Radar system that comprises a two-dimensional ("2D") Radar transmitter that emits a 2D Radar signal along a plane that is orthogonal to the 2D Radar transmitter, wherein the transparent or translucent material intersects the plane and the 2D Radar signal.
  • the 2D Radar transmitter may comprise at least one antenna disposed on an integrated circuit ("IC") chip, wherein the at least one antenna comprises one of a single IC-based antenna disposed on the IC chip, a plurality of IC-based antennas arranged as a one-dimensional ("1D") line of antennas disposed on the IC chip, or a 2D array of IC-based antennas disposed on the IC chip, and/or the like.
  • the plane may be orthogonal to a surface of the IC chip on which the at least one antenna is disposed. In some cases, the plane may be parallel to a ground surface below the Radar sensor.
  • the Radar sensor may be part of a Radar system that comprises a three-dimensional (“3D") Radar transmitter that emits a 3D Radar signal that is orthogonal to the 3D Radar transmitter, wherein the transparent or translucent material intersects the 3D Radar signal.
  • the transparent or translucent material may comprise a windshield of a vehicle, wherein the object detection system may be one of a system integrated with an ADAS or a system separate from yet in communication with an ADAS, or the like.
  • the windshield may be a front windshield, and the one or more objects may be located in front of a front portion of the vehicle.
  • the windshield may be a rear windshield, and the one or more objects may be located behind a rear portion of the vehicle.
  • the distance between the Radar sensor and the transparent or translucent material may be a distance value that is stored in a data store prior to receiving the Radar data, the data store being accessible by the computing system.
  • the distance value may comprise one of: a first distance value obtained using a manual measurement between the Radar sensor and the transparent or translucent material, along a plane that is orthogonal to an emitter/receiver surface of the Radar sensor; a second distance value obtained using a Radar measurement using the radar sensor between the Radar sensor and the transparent or translucent material, along the plane that is orthogonal to the emitter/receiver surface of the Radar sensor; a third distance value comprising a default distance value corresponding to estimated or average distances between the transparent or translucent material and a typical mounting position of the Radar sensor; a fourth distance value comprising a range of values between zero and the first distance value; a fifth distance value comprising a range of values between zero and the second distance value; a sixth distance value comprising a range of values between zero and the third distance value; or one of the first through sixth distance values with a predetermined tolerance value; and/or the like.
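Two of the matching policies listed above — a stored value with a predetermined tolerance, and a range of values between zero and the stored value — might be expressed as follows. The mode names and the default tolerance are illustrative assumptions:

```python
def matches_windshield(range_m, distance_m, mode="tolerance",
                       tolerance_m=0.05):
    """Decide whether one extracted range corresponds to the stored
    sensor-to-material distance.

    mode="tolerance": match the stored value plus or minus a tolerance.
    mode="interval":  match any value between zero and the stored value,
                      i.e., treat everything at or closer than the
                      material as noise.
    """
    if mode == "tolerance":
        return abs(range_m - distance_m) <= tolerance_m
    if mode == "interval":
        return 0.0 <= range_m <= distance_m
    raise ValueError(f"unknown matching mode: {mode!r}")
```

The interval policy is the more aggressive of the two, since no real object of interest can sit between the sensor and the material.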
  • the manual measurement may be performed by a user using one of a ruler, a measuring tape, a combination of a string and a ruler, a combination of a string and a measuring tape, a combination of a laser-based measuring tool positioned beside the Radar sensor and a laser reflective material placed against a surface of the transparent or translucent material and oriented to reflect laser light back to the laser-based measuring tool, the laser-based measuring tool positioned with the transparent or translucent material positioned between the laser-based measuring tool and the Radar sensor and with the laser-based measuring tool aimed at the Radar sensor, or the laser-based measuring tool positioned with the laser-based measuring tool positioned between the transparent or translucent material and the Radar sensor and with the laser-based measuring tool aimed at the Radar sensor, and/or the like.
  • the Radar measurement may be performed using one of a combination of the Radar sensor and a Radar reflective material positioned between the transparent or translucent material and the Radar sensor and oriented or configured to reflect Radar signals back to the Radar sensor or a combination of the Radar sensor and a Radar reflective material that is positioned with the transparent or translucent material between the Radar reflective material and the Radar sensor and oriented or configured to reflect Radar signals back to the Radar sensor, wherein the Radar reflective material comprises a Radar signal corner reflector, and/or the like.
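The offline Radar measurement described above might be sketched as a simple peak search over a range profile: with a Radar reflective material (e.g., a corner reflector) placed at the transparent or translucent material, the strongest short-range return should sit at the sensor-to-material distance. The search window and the single-peak heuristic are assumptions for illustration, not a prescribed calibration procedure:

```python
import numpy as np

def estimate_windshield_distance(range_profile, range_resolution_m,
                                 max_search_m=1.0):
    """Estimate the sensor-to-material distance from a Radar range
    profile taken with a corner reflector at the material.

    range_profile:      per-range-bin return intensity.
    range_resolution_m: meters spanned by one range bin.
    max_search_m:       only search bins this close to the sensor,
                        since the material is nearby (e.g., a windshield
                        just in front of the sensor).
    """
    profile = np.asarray(range_profile, dtype=float)
    n = min(len(profile), int(round(max_search_m / range_resolution_m)) + 1)
    peak_bin = int(np.argmax(profile[:n]))  # strongest return in the window
    return peak_bin * range_resolution_m
```

The resulting distance value would then be stored in the data store for use by the online filtering part.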
  • the Radar data may comprise Radar point cloud data, and wherein the at least one Radar data point may comprise at least one point within the Radar point cloud data.
  • the Radar data may comprise one of a 2D Radar heat map or a 3D Radar heat map, and wherein the at least one Radar data point may comprise a corresponding one of at least one intensity point within the 2D Radar heat map or at least one peak within the 3D Radar heat map.
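For heat-map Radar data, the same idea might amount to suppressing the range bins that correspond to the material's distance. The axis order (range bins along axis 0 of a range-azimuth map) and the bin-to-meters mapping are assumptions made for this sketch:

```python
import numpy as np

def suppress_windshield_bins(heat_map, range_resolution_m,
                             windshield_distance_m, tolerance_m=0.05):
    """Zero out the intensity values of a range-azimuth Radar heat map
    in the range bins matching the sensor-to-material distance."""
    hm = np.array(heat_map, dtype=float)  # copy so the input is untouched
    bin_ranges = np.arange(hm.shape[0]) * range_resolution_m
    noisy = np.abs(bin_ranges - windshield_distance_m) <= tolerance_m
    hm[noisy, :] = 0.0  # suppress intensity peaks at the material's range
    return hm
```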
  • This new technique can effectively remove the unwanted noisy data that are caused by the radar reflection on the windshield of a vehicle, thus providing the user with cleaned-up radar data that can result in increased accuracy of radar-based detection. It is also easy to implement.
  • the user is provided with three options: (a) distance determination through physical measurement, (b) distance determination through radar signal processing, or (c) use of a default typical distance, or the like. There is no need for any additional software or hardware tool.
  • this new technique does not require sophisticated fabrication or installation for it to be put into a real application.
  • the whole procedure for radar signal filtering is also straightforward for a user to implement.
  • the new technique does not have a strict requirement on the types of radar sensor to be used.
  • it can be incorporated into various types of devices. These devices can be either compact (e.g., radar and other types of sensors enclosed in the same chamber, etc.) or distributed (e.g., radar and other types of sensors separately placed in different chambers, etc.), and/or the like.
  • some embodiments can improve the functioning of user equipment or systems themselves (e.g., Radar data processing systems, Radar-based object detection systems, object detection systems, driver assistance systems, etc.), for example, by receiving, using a computing system, Radar data from a Radar sensor, the Radar sensor being disposed such that a transparent or translucent material is positioned between one or more objects and the Radar sensor; parsing, using the computing system, the Radar data to extract range or depth data; based on a determination that at least one extracted range or depth data corresponds to a distance between the Radar sensor and the transparent or translucent material, filtering out, using the computing system, at least one Radar data point corresponding to said at least one extracted range or depth data from the Radar data to produce filtered Radar data, and utilizing the filtered Radar data as input to an object detection system; and based on a determination that no extracted range or depth data corresponds to the distance between the Radar sensor and the transparent or translucent material, utilizing the Radar data from the Radar sensor as input to
  • Figs. 1-6 illustrate some of the features of the method, system, and apparatus for implementing radar-based object detection (e.g., advanced driver assistance systems ("ADASs"), or the like), and, more particularly, to methods, systems, and apparatuses for implementing radar signal filtering for removing noise due to transparent or translucent material located in front of sensor, as referred to above.
  • the methods, systems, and apparatuses illustrated by Figs. 1-6 refer to examples of different embodiments that include various components and steps, which can be considered alternatives or which can be used in conjunction with one another in the various embodiments.
  • the description of the illustrated methods, systems, and apparatuses shown in Figs. 1-6 is provided for purposes of illustration and should not be considered to limit the scope of the different embodiments.
  • Fig. 1 is a schematic diagram illustrating a system 100 for implementing radar signal filtering for removing noise due to transparent or translucent material located in front of sensor, in accordance with various embodiments.
  • system 100 may comprise computing system(s) 105a, which may include, without limitation, at least one of one or more radio detection and ranging ("Radar") data processors 110 (including, but not limited to, Radar data parser 110a, Radar data filter 110b, and/or the like), one or more image data processors 115 (optional), an object detection system 120, or a data store 125, and/or the like.
  • System 100 may further comprise one or more Radar sensors 135 and one or more cameras 155 (optional) that may be used to detect one or more objects 140a-140n (collectively, "objects 140" or the like), in some cases, through a transparent or translucent material 145.
  • System 100 may further comprise remote computing system 105b (and corresponding database(s) 160), computing system 105b including, but not limited to, at least one of a remote or server-based radar data processing system, a remote or server-based object detection system, a remote or server-based object detection and ranging system, a remote or server-based positioning and mapping system, a remote or server-based ADAS, a server computer over a network, a cloud computing system, or a distributed computing system, and/or the like.
  • System 100 may further comprise network(s) 165, a (separate or standalone) ADAS 170 (optional), and one or more user devices 175a-175n (collectively, "user devices 175" or the like).
  • the computing system 105a may include, without limitation, at least one of a radar data processing system, the object detection system, an object detection and ranging system, a positioning and mapping system, an advanced driver assistance system ("ADAS”), or a processor on a user device, and/or the like.
  • the data store 125 may include, but is not limited to, at least one of read-only memory (“ROM”), programmable read-only memory (“PROM”), erasable programmable read-only memory (“EPROM”), electrically erasable programmable read-only memory (“EEPROM”), flash memory, other non-volatile memory devices, random-access memory (“RAM”), static random-access memory (“SRAM”), dynamic random-access memory (“DRAM”), synchronous dynamic random-access memory (“SDRAM”), virtual memory, a RAM disk, or other volatile memory devices, non-volatile RAM devices, and/or the like.
  • the Radar sensor(s) 135 may be part of a Radar system that includes a two-dimensional (“2D") Radar transmitter that emits a 2D Radar signal along a plane that is orthogonal to the 2D Radar transmitter, where the transparent or translucent material 145 may intersect the plane and the 2D Radar signal.
  • the 2D Radar transmitter may include at least one antenna disposed on an integrated circuit (“IC") chip, which may be part of a system on chip (“SoC”) platform or the like.
  • the at least one antenna may include, but is not limited to, one of a single IC-based antenna disposed on the IC chip, a plurality of IC-based antennas arranged as a one-dimensional ("1D") line of antennas disposed on the IC chip, or a 2D array of IC-based antennas disposed on the IC chip, and/or the like.
  • the plane may be orthogonal to a surface of the IC chip on which the at least one antenna is disposed. In some cases, the plane may be parallel to a ground surface below the Radar sensor(s) 135.
  • the Radar sensor(s) 135 may be part of a Radar system that includes a three-dimensional ("3D") Radar transmitter that emits a 3D Radar signal that is orthogonal to the 3D Radar transmitter, where the transparent or translucent material 145 may intersect the 3D Radar signal.
  • As used herein, a transparent material (including, but not limited to, clear glass, etc.) refers to a material through which all incident light passes, while a translucent material (including, but not limited to, frosted glass, some plastics, etc.) refers to a material through which some (but not all) of the incident light passes.
  • the transparent or translucent material 145 may include, but is not limited to, one of a vehicle front windshield with tinting and/or metallization layers, a vehicle front windshield without tinting or metallization layers, a vehicle rear windshield with tinting and/or metallization layers, a vehicle rear windshield without tinting or metallization layers, a building clear glass window, a building frosted glass window, a clear plastic protective material for a portable device, a frosted plastic protective material for a portable device, a clear plastic wall, a frosted plastic wall, and/or the like.
  • the user devices 175 may each include, without limitation, one of a smart phone, a tablet computer, a laptop computer, a desktop computer, or a server computer, and/or the like.
  • Some user devices may each include at least one external display screen or monitor (not shown) (which may be a non-touchscreen display device or a touchscreen display device, or the like) and at least one integrated audio playback device (not shown) (e.g., built-in speakers, etc.) and/or at least one external audio playback device (not shown) (e.g., external or peripheral speakers, wired earphones, wired earbuds, wired headphones, wireless earphones, wireless earbuds, wireless headphones, or the like).
  • the lightning bolt symbols are used to denote wireless communications between computing system(s) 105a and each of one or more of radar sensor(s) 135, camera(s) 155, ADAS 170, and/or user device(s) 175, between network(s) 165 (in some cases, via network access points or the like (not shown)) and each of one or more of computing system(s) 105a, remote computing system 105b, radar sensor(s) 135, camera(s) 155, ADAS 170, and/or user device(s) 175, and between computing system 105b and each of one or more of computing system(s) 105a, radar sensor(s) 135, camera(s) 155, ADAS 170, and/or user device(s) 175 via network(s) 165, and/or the like.
  • the wireless communications may include wireless communications using protocols including, but not limited to, at least one of Bluetooth™ communications protocol, WiFi communications protocol, or other 802.11 suite of communications protocols, ZigBee communications protocol, Z-wave communications protocol, or other 802.15.4 suite of communications protocols, cellular communications protocol (e.g., 3G, 4G, 4G LTE, 5G, etc.), or other suitable communications protocols, and/or the like.
  • the network(s) 165 may each include a local area network ("LAN"), including, without limitation, a fiber network, an Ethernet network, a Token-Ring™ network, and/or the like; a wide-area network ("WAN"); a wireless wide area network ("WWAN"); a virtual network, such as a virtual private network ("VPN"); the Internet; an intranet; an extranet; a public switched telephone network ("PSTN"); an infra-red network; a wireless network, including, without limitation, a network operating under any of the IEEE 802.11 suite of protocols, the Bluetooth™ protocol known in the art, and/or any other wireless protocol; and/or any combination of these and/or other networks.
  • the network(s) 165 might include an access network of the service provider (e.g., an Internet service provider ("ISP")).
  • the network(s) 165 may include a core network of the service provider, and/or the Internet.
  • computing system 105a or 105b may receive Radar data (e.g., Radar signal data 130, or the like) from a Radar sensor (e.g., Radar sensor(s) 135, or the like), the Radar sensor being disposed such that a transparent or translucent material (e.g., transparent or translucent material 145, or the like) is positioned between one or more objects (e.g., objects 140a-140n, or the like) and the Radar sensor.
  • the computing system may parse (e.g., using Radar data parser 110a, or the like) the Radar data to extract range or depth data (e.g., range or depth d1 - dn corresponding to the depth or distance between the Radar sensor(s) 135 and a corresponding one of objects 140a-140n, or the like).
  • the computing system may filter out (e.g., using Radar data filter 110b, or the like) at least one Radar data point corresponding to said at least one extracted range or depth data from the Radar data to produce filtered Radar data, and may utilize the filtered Radar data as input to an object detection system (e.g., object detection system 120, or the like) and/or an ADAS (e.g., ADAS 170, or the like; in the case of radar for use for assisting driving of vehicles, or the like).
  • the computing system may utilize the (unfiltered) Radar data from the Radar sensor as input to the object detection system.
  • In some cases, image data, if available (e.g., image data 150 from the camera(s) 155, or the like), may also be utilized as input to the object detection system.
  • Results or data from the object detection system (and/or ADAS) may subsequently be sent to a user device(s) (e.g., user device(s) 175, or the like) for display to a user(s), or the like.
  • the distance between the Radar sensor and the transparent or translucent material may be a distance value that is stored in a data store (e.g., data store 125 or database(s) 160, or the like) prior to receiving the Radar data, the data store being accessible by the computing system.
  • the distance value (D) may include, but is not limited to, one of: a first distance value obtained using a manual measurement between the Radar sensor and the transparent or translucent material, along a plane that is orthogonal to an emitter/receiver surface of the Radar sensor; a second distance value obtained using a Radar measurement using the radar sensor between the Radar sensor and the transparent or translucent material, along the plane that is orthogonal to the emitter/receiver surface of the Radar sensor; a third distance value comprising a default distance value corresponding to estimated or average distances between the transparent or translucent material and a typical mounting position of the Radar sensor; a fourth distance value comprising a range of values between zero and the first distance value; a fifth distance value comprising a range of values between zero and the second distance value; a sixth distance value comprising a range of values between zero and the third distance value; or one of the first through sixth distance values with a predetermined tolerance value; and/or the like.
  • the distance (D) may include any suitable separation distance between a Radar sensor and the transparent or translucent material, including, but not limited to, one of about 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49, or 50 cm, or a range between about 0.25 and about 50 cm, or the like.
  • the default distance value may be any suitable value, including, but not limited to, one of about 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49, or 50 cm, or a range between about 0.25 and about 50 cm, or the like (e.g., 30 cm or the like, where any values of d being 0 - 30 cm may be filtered out).
  • the predetermined tolerance may be any suitable tolerance value, including, but not limited to, one of about 1, 2, 3, 4, 5, 6, 7, 8, 9, or 10 cm, or a range between about 0.25 and about 10 cm, or the like.
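The distance-value variants above (an exact value matched within a predetermined tolerance, or a band of values from zero up to the stored distance) can be expressed as a small predicate. The sketch below is illustrative only; the function name, units (meters), and default tolerance are assumptions and not part of the described embodiments.

```python
def within_filter_band(d, D, tol=0.02, zero_to_D=False):
    """Decide whether a Radar point's depth d (meters) falls in the band
    associated with the stored sensor-to-material distance D (meters).

    - Default: match D within a predetermined tolerance (|d - D| <= tol).
    - zero_to_D=True: match any depth from zero up to D (plus tolerance),
      mirroring the "range of values between zero and the distance" variant.
    """
    if zero_to_D:
        return 0.0 <= d <= D + tol
    return abs(d - D) <= tol
```

For example, with a stored D of 0.30 m (30 cm), a point at 0.29 m would match either variant, while a point at 0.10 m would match only the zero-to-D variant.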
  • the depth or range (d) of typical Radar sensors may extend from about 0 cm to 100's of meters, or further.
  • the manual measurement may be performed by a user using one of a ruler, a measuring tape, a combination of a string and a ruler, a combination of a string and a measuring tape, a combination of a laser-based measuring tool positioned beside the Radar sensor and a laser reflective material placed against a surface of the transparent or translucent material and oriented to reflect laser light back to the laser-based measuring tool, the laser-based measuring tool positioned with the transparent or translucent material positioned between the laser-based measuring tool and the Radar sensor and with the laser-based measuring tool aimed at the Radar sensor, or the laser-based measuring tool positioned with the laser-based measuring tool positioned between the transparent or translucent material and the Radar sensor and with the laser-based measuring tool aimed at the Radar sensor, and/or the like.
  • the Radar measurement may be performed using one of a combination of the Radar sensor and a Radar reflective material positioned between the transparent or translucent material and the Radar sensor and oriented or configured to reflect Radar signals back to the Radar sensor or a combination of the Radar sensor and a Radar reflective material that is positioned with the transparent or translucent material between the Radar reflective material and the Radar sensor and oriented or configured to reflect Radar signals back to the Radar sensor, wherein the Radar reflective material comprises a Radar signal corner reflector, and/or the like.
  • the Radar data may include Radar point cloud data.
  • the at least one Radar data point may include at least one point within the Radar point cloud data.
  • the point cloud data may include coordinate data (e.g., x-axis coordinate data, y-axis coordinate data, and z-axis coordinate data, or corresponding polar coordinate data, or the like) and Radar reflection data (e.g., Radar reflection value or Radar signal intensity value, or the like), and/or the like.
  • the Radar data may include one of a 2D Radar heat map or a 3D Radar heat map.
  • the at least one Radar data point may include a corresponding one of at least one intensity point within the 2D Radar heat map or at least one peak within the 3D Radar heat map.
  • In some embodiments, the Radar sensor(s) may include a millimeter wave Radar (e.g., Texas Instruments' mmWave Radar or TI mmWave, or the like).
  • the data associated with each Radar point may include, without limitation, both the 3D coordinates (e.g., x, y, z coordinate data, polar coordinate data, or the like) and the intensity of Radar reflection (usually in the form of Radar cross section ("RCS”), or the like).
  • Because the intensity of Radar reflection may be determined not only by the distance between an object and the Radar sensor, but also by the material of the object as well as the angle of the reflective surface, the unwanted noise caused by a vehicle windshield (or other transparent or translucent material or object) cannot be removed by using RCS values alone. Therefore, the system and techniques of the various embodiments set out to remove Radar noise using the 3D coordinates of the Radar point cloud. In particular, the system and techniques of the various embodiments may rely on the range (or depth) information in the 3D coordinates to filter out the noise in the Radar data.
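Because the filtering relies on the range (depth) carried in the 3D coordinates rather than on RCS, each point's range can be recovered directly from its coordinates. A minimal sketch; the Cartesian coordinate convention with the sensor at the origin is an assumption, and many Radar SDKs report range directly:

```python
import math

def point_range(x, y, z):
    """Euclidean range of a Radar point cloud entry from the sensor origin."""
    return math.sqrt(x * x + y * y + z * z)

# A point with coordinates (3, 4, 0) lies at range 5 from the sensor,
# regardless of its RCS value.
```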
  • FIGs. 2A and 2B are schematic block flow diagrams illustrating a non-limiting example 200 of object detection using radar signal filtering for removing noise due to transparent or translucent material located in front of sensor, in accordance with various embodiments.
  • Some embodiments are directed to ongoing work on developing an ADAS with multiple sensors.
  • Two types of sensors may be adopted for ADAS - namely, a camera and a radar device.
  • the two types of data inputs are aligned through radar-camera calibration before being utilized for object detection in ADAS, as shown in the non-limiting example of Fig. 2A.
  • Radar signal data 210 from Radar sensor 205 may be processed and filtered using Radar signal filtering (at block 215), the filtered signal being used for data alignment (at block 230) together with image data 225 from camera 220.
  • object detection at block 235 may be performed to detect objects that are sensed or captured by Radar sensor 205 and camera 220.
  • the resultant object detection data may be sent to an ADAS 240 to enable driver assistance.
  • the multi-sensor data collection platform may be placed inside the driver's compartment behind the windshield.
  • this design will lead to the undesirable Radar reflection from the windshield, which could have a negative impact on the subsequent object detection task. Therefore, an additional stage of Radar signal filtering (at block 215) is introduced to remove this unwanted noise in the Radar signal data 210 from Radar sensor 205.
  • With reference to Fig. 2B, a non-limiting example of radar signal filtering is shown.
  • Some embodiments comprise two main parts: an offline part (or setup part or pre-sensing part, or the like; depicted on the left side of the vertical dashed line in Fig. 2B) and an online part (or Radar sensing part or object detection part, or the like; depicted on the right side of the vertical dashed line in Fig. 2B).
  • the offline (or setup or pre-sensing) part may include, without limitation, (physically, in some cases, manually) setting up the Radar sensor behind a windshield (or other transparent or translucent material or object, or the like) (at block 245), and determining or computing the distance (D) between the Radar sensor and the windshield (at block 250). This part is executed before the actual radar data collection (i.e., the online part).
  • the online (or Radar sensing, or object detection) part may include, but is not limited to, receiving Radar data (in some cases, in the form of a Radar point cloud, or the like) from the Radar sensor (at block 255) and parsing the Radar data to extract the range or depth (d) data for each point in the Radar point cloud (at block 260).
  • this depth value (d) may be compared with the previously obtained distance (D) (at block 265). If these two values are very close (or if depth value (d) is less than or equal to distance (D), or the like) for any particular Radar point, then this radar point may be considered as indicative of Radar reflection from the windshield, and hence may be removed (or filtered) from the final Radar signal output or Radar point cloud (at block 270).
  • the particular Radar point may be incorporated into the final output of radar point cloud.
  • the processes at blocks 265 and 270 may be repeated for each Radar data point (or at least for those Radar data points that are close to the distance (D)).
  • the unfiltered or incorporated Radar points may be collected, and the Radar Data may be sent to an object detection system (at block 280).
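The offline/online split described above can be sketched in a few lines of code. This is an illustrative sketch, not the disclosed implementation: the class and field names are hypothetical, and the distance D and matching tolerance are assumed example values.

```python
class WindshieldNoiseFilter:
    """Offline part: store the measured sensor-to-windshield distance D.
    Online part: drop any point whose depth d matches D, keep the rest."""

    def __init__(self, distance_to_windshield, tol=0.02):
        self.D = distance_to_windshield  # meters, obtained at setup time (block 250)
        self.tol = tol                   # meters, matching tolerance (assumed)

    def process(self, point_cloud):
        kept = []
        for point in point_cloud:        # compare each point (blocks 260-275)
            if abs(point["d"] - self.D) <= self.tol:
                continue                 # treated as windshield reflection; removed
            kept.append(point)
        return kept                      # collected and sent to object detection (block 280)

noise_filter = WindshieldNoiseFilter(distance_to_windshield=0.063)
cloud = [{"d": 0.060, "rcs": 15.0},   # depth near D -> filtered out
         {"d": 12.40, "rcs": 4.2}]    # real object -> kept
cleaned = noise_filter.process(cloud)
```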
  • Figs. 3A-3D are schematic diagrams illustrating a non-limiting example 300 of the use of radar signal filtering for removing noise due to transparent or translucent material located in front of sensor during implementation of an advanced driver assistance system ("ADAS"), in accordance with various embodiments.
  • Fig. 3A depicts relative positions between a Radar sensor(s) and a windshield of a vehicle
  • Fig. 3B depicts an example Radar range profile that is averaged across all angles of a Radar sensor(s)
  • Figs. 3C and 3D depict manual measurement (Fig. 3C) and Radar measurement (Fig. 3D), respectively, of distance (D) between the Radar sensor and the windshield.
  • the windshield may be a front windshield, and one or more objects that may be detected by the Radar sensor(s) may be located in front of a front portion of the vehicle.
  • the windshield may be a rear windshield, and the one or more objects that may be detected by the Radar sensor(s) may be located behind a rear portion of the vehicle.
  • a Radar sensor(s) 305 having a Radar emitter surface 310 and an antenna axis 315 may be placed or mounted within a vehicle interior compartment, behind a windshield 320 (at an approximate distance (D) between the Radar emitter surface 310 and the windshield 320, along the antenna axis 315).
  • the Radar sensor 305 may be part of a Radar system that comprises a two-dimensional ("2D") Radar transmitter that emits a 2D Radar signal along a plane that is orthogonal to the 2D Radar transmitter (i.e., a plane that is orthogonal to the Radar emitter surface 310, where the antenna axis 315 lies along the plane, or the like, the antenna axis 315 being orthogonal to the Radar emitter surface 310), wherein the windshield 320 (or other transparent or translucent material or object, or the like) intersects the plane and the 2D Radar signal.
  • the 2D Radar transmitter may include, without limitation, at least one antenna disposed on an integrated circuit ("IC") chip, the at least one antenna including, but not limited to, one of a single IC-based antenna disposed on the IC chip, a plurality of IC-based antennas arranged as a one-dimensional ("1D") line of antennas disposed on the IC chip, or a 2D array of IC-based antennas disposed on the IC chip, and/or the like.
  • the plane may be orthogonal to a surface of the IC chip on which the at least one antenna is disposed. In some cases, the plane may be parallel to a ground surface below the Radar sensor.
  • the Radar sensor 305 may be part of a Radar system that comprises a three-dimensional ("3D") Radar transmitter that emits a 3D Radar signal that is orthogonal to the 3D Radar transmitter (i.e., orthogonal to the Radar emitter surface 310, the antenna axis 315 being orthogonal to the Radar emitter surface 310, or the like), wherein the windshield 320 (or other transparent or translucent material or object, or the like) intersects the 3D Radar signal.
  • the Radar data may include, but is not limited to, Radar point cloud data, and the at least one Radar data point including, without limitation, at least one point within the Radar point cloud data.
  • the Radar data may include, but is not limited to, one of a 2D Radar heat map or a 3D Radar heat map, or the like, and the at least one Radar data point may include, without limitation, a corresponding one of at least one intensity point within the 2D Radar heat map or at least one peak within the 3D Radar heat map, and/or the like.
  • the Radar sensor 305 behind the windshield 320 may be placed either close to the top or bottom of the windshield 320 (depicted in Fig. 3A as being close to the top). In either case, it may be necessary to obtain the distance (D) between the Radar sensor 305 and the windshield 320.
  • the exact distance value can be determined either by pure physical measurement (as shown and described with respect to Fig. 3C), or by Radar signal processing (as shown and described with respect to Fig. 3D).
  • In Fig. 3B, an example Radar range profile that is averaged across all angles is shown, with each intensity peak (having amplitude measured in dB) representing an object(s) detected at a distance (d) that is measured in meters.
  • The angles refer to angles of a fan-shaped detection field corresponding to the Radar field of detection, expanding from a point at the Radar emitter surface 310, with the center of the fan-shaped detection field corresponding to the antenna axis 315.
  • the peaks 325 at about 2.5 inches (or about 6.3 cm) potentially correspond to the distance (D) between the Radar sensor(s) 305 and the windshield 320, and thus may be filtered out (or removed) from the Radar data, thereby increasing accuracy of Radar-based object detection, or the like.
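Working from a range profile like the one in Fig. 3B, the windshield peak can be suppressed by zeroing the bins whose range falls near the stored distance (D). The sketch below is illustrative only; the bin layout, bin width, and tolerance are assumptions:

```python
def suppress_windshield_bins(profile, bin_width_m, D, tol=0.01):
    """Zero out range-profile bins within tol of the windshield distance D.

    profile: list of amplitudes (e.g., dB) where bin i covers range i * bin_width_m.
    Returns a copy of the profile with the windshield bins set to zero.
    """
    out = list(profile)
    for i in range(len(out)):
        if abs(i * bin_width_m - D) <= tol:
            out[i] = 0.0
    return out

# With 1 cm bins, a strong return at bin 6 (~6 cm, matching D of about 6.3 cm)
# is removed, while a peak at bin 10 (~10 cm) survives.
profile = [0.0] * 11
profile[6], profile[10] = 9.0, 5.0
cleaned_profile = suppress_windshield_bins(profile, 0.01, 0.063)
```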
  • distance (D) may be determined by using a manual or physical measurement. After the Radar sensor 305 has been mounted behind the windshield 320, the distance (D) between them can be measured, e.g., using a ruler or measuring tape 335a (or a combination of a string to mark the distance and a ruler or measuring tape 335a to measure the marked portion of the string, or the like), or the like.
  • the ruler, measuring tape, and/or string needs to be orthogonal to the radar antenna plane (i.e., Radar emitter surface 310), or at least be approximately orthogonal to the radar antenna plane (i.e., Radar emitter surface 310), in some cases, along the antenna axis 315, or the like.
  • the Radar beam of Radar sensor 305 for ADAS usually has a limited elevation angle, which will result in inaccurate reading if the distance (D) is not measured along the orthogonal direction (i.e., along the antenna axis 315, or the like).
  • In an alternative (and, in some cases, more accurate) way, the distance (D) may be manually or physically measured by using a laser distance measurer or laser measuring tool 335b, although it is a more expensive approach.
  • In the case that the laser measuring tool 335b is positioned beside the Radar sensor and pointed toward the windshield 320, it may be necessary for the user to temporarily place an opaque shield or layer against the windshield 320 so as to prevent the laser beam from easily passing through the windshield 320, which would otherwise likely lead to inaccurate measurement.
  • the laser beam ought to be (at least approximately) orthogonal to the radar antenna plane (i.e., orthogonal to the Radar emitter surface 310 and/or aligned with the antenna axis 315, or the like).
  • the laser measuring tool 335b may be positioned with the windshield 320 between the laser measuring tool 335b and the Radar sensor 305, with the laser measuring tool 335b being aimed at the Radar emitter surface 310 and/or another portion of the Radar sensor, along the antenna axis 315 (where the Radar emitter surface 310 is used to reflect the laser beam back to the laser measuring tool 335b).
  • distance (D) may be determined by using a Radar measurement or using Radar signal processing, or the like. In this manner, the distance (D) between the Radar sensor 305 and the windshield 320 may be obtained without a physical measuring device. To determine or compute the distance value (D), one can simply use the Radar signal that is reflected from the windshield 320. As described above, the reflected radar signal, which is received by the radar sensor or receiver 305, may contain range or depth information (d) of the Radar point(s) in front of the Radar sensor(s) 305 (i.e., in front of the Radar emitter surface 310, or the like).
  • the user may temporarily place a Radar reflector 340 (e.g., a Radar corner reflector, or the like), which may preferably be made of metal with strong Radar reflection characteristics (i.e., reflecting Radar signals back to the Radar sensor(s) 305), close to the windshield 320.
  • This Radar reflector 340 may provide very strong reflection (i.e., high Radar cross section ("RCS") value) in the resultant Radar point cloud, so that a software program can easily detect these points, and subsequently extract the depth value accordingly.
  • the Radar reflector 340 should be placed at approximately the same elevation as the Radar sensor (i.e., along the antenna axis 315, the antenna axis 315 being orthogonal to the Radar emitter surface 310).
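The reflector-based measurement above amounts to picking out the highest-RCS points in the captured point cloud and reading off their depth as the distance (D). A hedged sketch; the function name and RCS threshold are assumptions (real RCS scales vary by sensor), not part of the disclosure:

```python
def estimate_windshield_distance(point_cloud, rcs_threshold=20.0):
    """Estimate D as the mean depth of the strong (corner reflector) returns."""
    strong = [p for p in point_cloud if p["rcs"] >= rcs_threshold]
    if not strong:
        raise ValueError("no high-RCS reflector points detected")
    return sum(p["d"] for p in strong) / len(strong)

# Two strong corner-reflector returns near the windshield dominate the weaker
# background point, yielding an estimate of D near 0.062 m.
cloud = [{"d": 0.060, "rcs": 30.0},
         {"d": 0.064, "rcs": 28.0},
         {"d": 5.000, "rcs": 3.0}]
D_estimate = estimate_windshield_distance(cloud)
```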
  • Figs. 4A and 4B are flow diagrams illustrating a method 400 for implementing radar signal filtering for removing noise due to transparent or translucent material located in front of sensor, in accordance with various embodiments.
  • While the techniques and procedures of Figs. 4A and 4B are depicted and/or described in a certain order for purposes of illustration, it should be appreciated that certain procedures may be reordered and/or omitted within the scope of various embodiments.
  • the method 400 illustrated by Figs. 4A and 4B can be implemented by or with (and, in some cases, is described below with respect to) the systems, examples, or embodiments 100, 200, and 300 of Figs. 1, 2, and 3, respectively.
  • method 400, at block 405, may comprise determining a distance between a radio detection and ranging ("Radar") sensor and a transparent or translucent material (a transparent material including, but not limited to, clear glass, etc.; a translucent material including, but not limited to, frosted glass, some plastics, etc.).
  • method 400 may comprise receiving, using a computing system, Radar data from the Radar sensor, the Radar sensor being disposed such that the transparent or translucent material is positioned between one or more objects and the Radar sensor.
  • the computing system may comprise at least one of a radar data processing system, the object detection system, an object detection and ranging system, a positioning and mapping system, an advanced driver assistance system ("ADAS"), a processor on a user device, a server computer over a network, a cloud computing system, or a distributed computing system, and/or the like.
  • the Radar sensor may be part of a Radar system that comprises a two-dimensional ("2D") Radar transmitter that emits a 2D Radar signal along a plane that is orthogonal to the 2D Radar transmitter, wherein the transparent or translucent material intersects the plane and the 2D Radar signal.
  • the 2D Radar transmitter may comprise at least one antenna disposed on an integrated circuit ("IC") chip, wherein the at least one antenna comprises one of a single IC-based antenna disposed on the IC chip, a plurality of IC-based antennas arranged as a one-dimensional ("1D") line of antennas disposed on the IC chip, or a 2D array of IC-based antennas disposed on the IC chip, and/or the like.
  • the plane may be orthogonal to a surface of the IC chip on which the at least one antenna is disposed. In some cases, the plane may be parallel to a ground surface below the Radar sensor.
  • the Radar sensor may be part of a Radar system that comprises a three-dimensional (“3D") Radar transmitter that emits a 3D Radar signal that is orthogonal to the 3D Radar transmitter, wherein the transparent or translucent material intersects the 3D Radar signal.
  • the transparent or translucent material may comprise a windshield of a vehicle, wherein the object detection system may be one of a system integrated with an advanced driver assistance system ("ADAS") or a system separate from yet in communication with an ADAS, or the like.
  • the windshield may be a front windshield, and the one or more objects may be located in front of a front portion of the vehicle.
  • the windshield may be a rear windshield, and the one or more objects may be located behind a rear portion of the vehicle.
  • Method 400 may further comprise parsing, using the computing system, the Radar data to extract range or depth data (block 415).
  • Method 400 may further comprise, at block 420, determining whether at least one extracted range or depth data corresponds to the distance between the Radar sensor and the transparent or translucent material. If so, method 400 may proceed to the process at block 425. If not, method 400 may proceed to the process at block 435.
  • method 400 may comprise, based on a determination that at least one extracted range or depth data corresponds to the distance between the Radar sensor and the transparent or translucent material, filtering out, using the computing system, at least one Radar data point corresponding to said at least one extracted range or depth data from the Radar data to produce filtered Radar data.
  • Method 400, at block 430, may comprise utilizing the filtered Radar data as input to an object detection system.
  • method 400 may comprise, based on a determination that no extracted range or depth data corresponds to the distance between the Radar sensor and the transparent or translucent material, utilizing the Radar data from the Radar sensor as input to the object detection system.
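The decision at blocks 420-435 can be summarized in a few lines. This is an illustrative sketch only, with a hypothetical tolerance standing in for "corresponds to the distance"; the function name and default values are assumptions:

```python
def process_frame(points, D, tol=0.02):
    """If any point's depth matches D (block 420), filter those points out and
    use the filtered data (blocks 425-430); otherwise, pass the Radar data
    through unfiltered (block 435)."""
    matches = [p for p in points if abs(p["d"] - D) <= tol]
    if matches:
        return [p for p in points if abs(p["d"] - D) > tol]
    return points
```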
  • determining the distance between the Radar sensor and the transparent or translucent material may comprise one of: (a) utilizing manual measurement (block 440); (b) utilizing Radar measurement (block 445); or (c) utilizing a default distance value(s) (block 450); or the like.
  • the distance between the Radar sensor and the transparent or translucent material may be a distance value that is stored in a data store prior to receiving the Radar data, the data store being accessible by the computing system.
  • the manual measurement may be performed by a user using one of (1) a ruler, (2) a measuring tape, (3) a combination of a string and a ruler, (4) a combination of a string and a measuring tape, (5) a combination of a laser-based measuring tool positioned beside the Radar sensor and a laser reflective material placed against a surface of the transparent or translucent material and oriented to reflect laser light back to the laser-based measuring tool, (6) the laser-based measuring tool positioned with the transparent or translucent material positioned between the laser-based measuring tool and the Radar sensor and with the laser-based measuring tool aimed at the Radar sensor, or (7) the laser-based measuring tool positioned with the laser-based measuring tool positioned between the transparent or translucent material and the Radar sensor and with the laser-based measuring tool aimed at the Radar sensor, and/or the like.
  • the Radar measurement may be performed using one of (i) a combination of the Radar sensor and a Radar reflective material positioned between the transparent or translucent material and the Radar sensor and oriented or configured to reflect Radar signals back to the Radar sensor or (ii) a combination of the Radar sensor and a Radar reflective material that is positioned with the transparent or translucent material between the Radar reflective material and the Radar sensor and oriented or configured to reflect Radar signals back to the Radar sensor, and/or the like.
  • the Radar reflective material may include, without limitation, a Radar signal corner reflector, and/or the like.
  • the distance value may include, without limitation, one of: a first distance value obtained using a manual measurement between the Radar sensor and the transparent or translucent material, along a plane that is orthogonal to an emitter/receiver surface of the Radar sensor; a second distance value obtained using a Radar measurement using the radar sensor between the Radar sensor and the transparent or translucent material, along the plane that is orthogonal to the emitter/receiver surface of the Radar sensor; a third distance value comprising a default distance value corresponding to estimated or average distances between the transparent or translucent material and a typical mounting position of the Radar sensor; a fourth distance value comprising a range of values between zero and the first distance value; a fifth distance value comprising a range of values between zero and the second distance value; a sixth distance value comprising a range of values between zero and the third distance value; or one of the first through sixth distance values with a predetermined tolerance value; and/or the like.
  • the distance (D) may include any suitable separation distance between a Radar sensor and the transparent or translucent material, including, but not limited to, one of about 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49, or 50 cm, or a range between about 0.25 and about 50 cm, or the like.
  • the default distance value may be any suitable value, including, but not limited to, one of about 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49, or 50 cm, or a range between about 0.25 and about 50 cm, or the like (e.g., 30 cm or the like, where any values of d between 0 and 30 cm may be filtered out).
  • the predetermined tolerance may be any suitable tolerance value, including, but not limited to, one of about 1, 2, 3, 4, 5, 6, 7, 8, 9, or 10 cm, or a range between about 0.25 and about 10 cm, or the like.
  • the depth or range (d) of typical Radar sensors may extend from about 0 cm to 100's of meters, or further.
  • the Radar data may include, without limitation, Radar point cloud data, where the at least one Radar data point may include, but is not limited to, at least one point within the Radar point cloud data.
  • the Radar data may include, without limitation, one of a 2D Radar heat map or a 3D Radar heat map, and/or the like, where the at least one Radar data point may include, but is not limited to, a corresponding one of at least one intensity point within the 2D Radar heat map or at least one peak within the 3D Radar heat map, and/or the like.
  • Fig. 5 is a block diagram illustrating an example of computer or system hardware architecture, in accordance with various embodiments.
  • Fig. 5 provides a schematic illustration of one embodiment of a computer system 500 of the service provider system hardware that can perform the methods provided by various other embodiments, as described herein, and/or can perform the functions of computer or hardware system (i.e., computing systems 105a and 105b, advanced driver assistance systems ("ADASs") 170 and 240, and user devices 175a-175n, etc.), as described above.
  • Fig. 5 is meant only to provide a generalized illustration of various components, of which one or more (or none) of each may be utilized as appropriate.
  • Fig. 5, therefore, broadly illustrates how individual system elements may be implemented in a relatively separated or relatively more integrated manner.
  • the computer or hardware system 500 - which might represent an embodiment of the computer or hardware system (i.e., computing systems 105a and 105b, ADASs 170 and 240, and user devices 175a-175n, etc.), described above with respect to Figs. 1-4 - is shown comprising hardware elements that can be electrically coupled via a bus 505 (or may otherwise be in communication, as appropriate).
  • the hardware elements may include one or more processors 510, including, without limitation, one or more general-purpose processors and/or one or more special-purpose processors (such as microprocessors, digital signal processing chips, graphics acceleration processors, and/or the like); one or more input devices 515, which can include, without limitation, a mouse, a keyboard, and/or the like; and one or more output devices 520, which can include, without limitation, a display device, a printer, and/or the like.
  • the computer or hardware system 500 may further include (and/or be in communication with) one or more storage devices 525, which can comprise, without limitation, local and/or network accessible storage, and/or can include, without limitation, a disk drive, a drive array, an optical storage device, a solid-state storage device such as a random access memory (“RAM”) and/or a read-only memory (“ROM”), which can be programmable, flash-updateable, and/or the like.
  • Such storage devices may be configured to implement any appropriate data stores, including, without limitation, various file systems, database structures, and/or the like.
  • the computer or hardware system 500 might also include a communications subsystem 530, which can include, without limitation, a modem, a network card (wireless or wired), an infra-red communication device, a wireless communication device and/or chipset (such as a Bluetooth™ device, an 802.11 device, a WiFi device, a WiMax device, a WWAN device, cellular communication facilities, etc.), and/or the like.
  • the communications subsystem 530 may permit data to be exchanged with a network (such as the network described below, to name one example), with other computer or hardware systems, and/or with any other devices described herein.
  • the computer or hardware system 500 will further comprise a working memory 535, which can include a RAM or ROM device, as described above.
  • the computer or hardware system 500 also may comprise software elements, shown as being currently located within the working memory 535, including an operating system 540, device drivers, executable libraries, and/or other code, such as one or more application programs 545, which may comprise computer programs provided by various embodiments (including, without limitation, hypervisors, VMs, and the like), and/or may be designed to implement methods, and/or configure systems, provided by other embodiments, as described herein.
  • one or more procedures described with respect to the method(s) discussed above might be implemented as code and/or instructions executable by a computer (and/or a processor within a computer); in an aspect, then, such code and/or instructions can be used to configure and/or adapt a general purpose computer (or other device) to perform one or more operations in accordance with the described methods.
  • a set of these instructions and/or code might be encoded and/or stored on a non-transitory computer readable storage medium, such as the storage device(s) 525 described above. In some cases, the storage medium might be incorporated within a computer system, such as the system 500.
  • the storage medium might be separate from a computer system (i.e., a removable medium, such as a compact disc, etc.), and/or provided in an installation package, such that the storage medium can be used to program, configure, and/or adapt a general purpose computer with the instructions/code stored thereon.
  • These instructions might take the form of executable code, which is executable by the computer or hardware system 500 and/or might take the form of source and/or installable code, which, upon compilation and/or installation on the computer or hardware system 500 (e.g., using any of a variety of generally available compilers, installation programs, compression/decompression utilities, etc.) then takes the form of executable code.
  • some embodiments may employ a computer or hardware system (such as the computer or hardware system 500) to perform methods in accordance with various embodiments of the invention.
  • some or all of the procedures of such methods are performed by the computer or hardware system 500 in response to processor 510 executing one or more sequences of one or more instructions (which might be incorporated into the operating system 540 and/or other code, such as an application program 545) contained in the working memory 535.
  • Such instructions may be read into the working memory 535 from another computer readable medium, such as one or more of the storage device(s) 525.
  • execution of the sequences of instructions contained in the working memory 535 might cause the processor(s) 510 to perform one or more procedures of the methods described herein.
  • "machine readable medium" and "computer readable medium," as used herein, refer to any medium that participates in providing data that causes a machine to operate in a specific fashion.
  • various computer readable media might be involved in providing instructions/code to processor(s) 510 for execution and/or might be used to store and/or carry such instructions/code (e.g., as signals).
  • a computer readable medium is a non-transitory, physical, and/or tangible storage medium.
  • a computer readable medium may take many forms, including, but not limited to, non-volatile media, volatile media, or the like.
  • Non-volatile media includes, for example, optical and/or magnetic disks, such as the storage device(s) 525.
  • Volatile media includes, without limitation, dynamic memory, such as the working memory 535.
  • a computer readable medium may take the form of transmission media, which includes, without limitation, coaxial cables, copper wire, and fiber optics, including the wires that comprise the bus 505, as well as the various components of the communication subsystem 530 (and/or the media by which the communications subsystem 530 provides communication with other devices).
  • transmission media can also take the form of waves (including without limitation radio, acoustic, and/or light waves, such as those generated during radio-wave and infra-red data communications).
  • Common forms of physical and/or tangible computer readable media include, for example, a floppy disk, a flexible disk, a hard disk, magnetic tape, or any other magnetic medium, a CD-ROM, any other optical medium, punch cards, paper tape, any other physical medium with patterns of holes, a RAM, a PROM, an EPROM, a FLASH-EPROM, any other memory chip or cartridge, a carrier wave as described hereinafter, or any other medium from which a computer can read instructions and/or code.
  • Various forms of computer readable media may be involved in carrying one or more sequences of one or more instructions to the processor(s) 510 for execution.
  • the instructions may initially be carried on a magnetic disk and/or optical disc of a remote computer.
  • a remote computer might load the instructions into its dynamic memory and send the instructions as signals over a transmission medium to be received and/or executed by the computer or hardware system 500.
  • These signals, which might be in the form of electromagnetic signals, acoustic signals, optical signals, and/or the like, are all examples of carrier waves on which instructions can be encoded, in accordance with various embodiments of the invention.
  • the communications subsystem 530 (and/or components thereof) generally will receive the signals, and the bus 505 then might carry the signals (and/or the data, instructions, etc. carried by the signals) to the working memory 535, from which the processor(s) 510 retrieves and executes the instructions.
  • the instructions received by the working memory 535 may optionally be stored on a storage device 525 either before or after execution by the processor(s) 510.
  • a set of embodiments comprises methods and systems for implementing radar-based object detection (e.g., advanced driver assistance systems ("ADASs"), or the like) and, more particularly, methods, systems, and apparatuses for implementing radar signal filtering for removing noise due to transparent or translucent material located in front of sensor.
  • Fig. 6 illustrates a schematic diagram of a system 600 that can be used in accordance with one set of embodiments.
  • the system 600 can include one or more user computers, user devices, or customer devices 605.
  • a user computer, user device, or customer device 605 can be a general purpose personal computer (including, merely by way of example, desktop computers, tablet computers, laptop computers, handheld computers, and the like, running any appropriate operating system, several of which are available from vendors such as Apple, Microsoft Corp., and the like), cloud computing devices, a server(s), and/or a workstation computer(s) running any of a variety of commercially-available UNIX™ or UNIX-like operating systems.
  • a user computer, user device, or customer device 605 can also have any of a variety of applications, including one or more applications configured to perform methods provided by various embodiments (as described above, for example), as well as one or more office applications, database client and/or server applications, and/or web browser applications.
  • a user computer, user device, or customer device 605 can be any other electronic device, such as a thin-client computer, Internet-enabled mobile telephone, and/or personal digital assistant, capable of communicating via a network (e.g., the network(s) 610 described below) and/or of displaying and navigating web pages or other types of electronic documents.
  • Although the system 600 is shown with two user computers, user devices, or customer devices 605, any number of user computers, user devices, or customer devices can be supported.
  • Some embodiments operate in a networked environment, which can include a network(s) 610.
  • the network(s) 610 can be any type of network familiar to those skilled in the art that can support data communications using any of a variety of commercially-available (and/or free or proprietary) protocols, including, without limitation, TCP/IP, SNA™, IPX™, AppleTalk™, and the like.
  • the network(s) 610 (similar to network(s) 165 of Fig. 1, or the like) can each include a local area network ("LAN"); a wide-area network ("WAN"); a wireless wide area network ("WWAN"); a virtual private network ("VPN"); the Internet; an intranet; an extranet; a public switched telephone network ("PSTN"); an infra-red network;
  • a wireless network including, without limitation, a network operating under any of the IEEE 802.11 suite of protocols, the Bluetooth™ protocol known in the art, and/or any other wireless protocol; and/or any combination of these and/or other networks.
  • the network might include an access network of the service provider (e.g., an Internet service provider (“ISP”)).
  • the network might include a core network of the service provider, and/or the Internet.
  • Embodiments can also include one or more server computers 615.
  • Each of the server computers 615 may be configured with an operating system, including, without limitation, any of those discussed above, as well as any commercially (or freely) available server operating systems.
  • Each of the servers 615 may also be running one or more applications, which can be configured to provide services to one or more clients 605 and/or other servers 615.
  • one of the servers 615 might be a data server, a web server, a cloud computing device(s), or the like, as described above.
  • the data server might include (or be in communication with) a web server, which can be used, merely by way of example, to process requests for web pages or other electronic documents from user computers 605.
  • the web server can also run a variety of server applications, including HTTP servers, FTP servers, CGI servers, database servers, Java servers, and the like.
  • the web server may be configured to serve web pages that can be operated within a web browser on one or more of the user computers 605 to perform methods of the invention.
  • the server computers 615 might include one or more application servers, which can be configured with one or more applications accessible by a client running on one or more of the client computers 605 and/or other servers 615.
  • the server(s) 615 can be one or more general purpose computers capable of executing programs or scripts in response to the user computers 605 and/or other servers 615, including, without limitation, web applications (which might, in some cases, be configured to perform methods provided by various embodiments).
  • a web application can be implemented as one or more scripts or programs written in any suitable programming language, such as Java™, C, C#™ or C++, and/or any scripting language, such as Perl, Python, or TCL, as well as combinations of any programming and/or scripting languages.
  • the application server(s) can also include database servers, including, without limitation, those commercially available from Oracle™, Microsoft™, Sybase™, IBM™, and the like, which can process requests from clients (including, depending on the configuration, dedicated database clients, API clients, web browsers, etc.) running on a user computer, user device, or customer device 605 and/or another server 615.
  • an application server can perform one or more of the processes for implementing radar-based object detection (e.g., ADASs, or the like) and, more particularly, for implementing radar signal filtering for removing noise due to transparent or translucent material located in front of sensor, as described in detail above.
  • Data provided by an application server may be formatted as one or more web pages (comprising HTML, JavaScript, etc., for example) and/or may be forwarded to a user computer 605 via a web server (as described above, for example).
  • a web server might receive web page requests and/or input data from a user computer 605 and/or forward the web page requests and/or input data to an application server.
  • a web server may be integrated with an application server.
  • one or more servers 615 can function as a file server and/or can include one or more of the files (e.g., application code, data files, etc.) necessary to implement various disclosed methods, incorporated by an application running on a user computer 605 and/or another server 615.
  • a file server can include all necessary files, allowing such an application to be invoked remotely by a user computer, user device, or customer device 605 and/or server 615.
  • the system can include one or more databases 620a-620n (collectively, "databases 620").
  • The location of each of the databases 620 is discretionary: merely by way of example, a database 620a might reside on a storage medium local to (and/or resident in) a server 615a (and/or a user computer, user device, or customer device 605).
  • a database 620n can be remote from any or all of the computers 605, 615, so long as it can be in communication (e.g., via the network 610) with one or more of these.
  • a database 620 can reside in a storage-area network ("SAN") familiar to those skilled in the art.
  • the database 620 can be a relational database, such as an Oracle database, that is adapted to store, update, and retrieve data in response to SQL-formatted commands.
  • the database might be controlled and/or maintained by a database server, as described above, for example.
  • system 600 might further comprise computing system(s) 625 (similar to computing system 105a of Fig. 1, or the like), which may include, without limitation, at least one of one or more radio detection and ranging (“Radar”) data processors 630 (similar to Radar data processor(s) 110 of Fig. 1, or the like), one or more image data processors 635 (optional; similar to image data processor(s) 115 of Fig. 1, or the like), an object detection system 640 (similar to object detection system 120 of Fig. 1, or the like), or a data store 645 (similar to data store 125 of Fig. 1, or the like), and/or the like.
  • Each Radar data processor 630 may include, but is not limited to, Radar data parser 630a, Radar data filter 630b, and/or the like (similar to Radar data parser 110a, Radar data filter 110b, and/or the like of Fig. 1, or the like).
  • System 600 may further comprise one or more Radar sensors 655 (similar to Radar sensor(s) 135 of Fig. 1, or the like) and one or more cameras 675 (optional; similar to camera(s) 155 of Fig. 1, or the like) that may be used to detect one or more objects 660a-660n (collectively, "objects 660" or the like; similar to objects 140a-140n of Fig. 1, or the like).
  • System 600 may further comprise remote computing system 625b and corresponding database(s) 680 (similar to remote computing system 105b and corresponding database(s) 160 of Fig. 1, or the like).
  • computing system 625a or 625b may receive Radar data (e.g., Radar signal data 650, or the like) from a Radar sensor (e.g., Radar sensor(s) 655, or the like), the Radar sensor being disposed such that a transparent or translucent material (e.g., transparent or translucent material 665, or the like) is positioned between one or more objects (e.g., objects 660a-660n, or the like) and the Radar sensor.
  • the computing system may parse (e.g., using Radar data parser 630a, or the like) the Radar data to extract range or depth data (e.g., range or depth d₁ - dₙ corresponding to the depth or distance between the Radar sensor(s) 655 and a corresponding one of objects 660a-660n, or the like).
  • the computing system may filter out (e.g., using Radar data filter 630b, or the like) at least one Radar data point corresponding to said at least one extracted range or depth data from the Radar data to produce filtered Radar data, and may utilize the filtered Radar data as input to an object detection system (e.g., object detection system 640, or the like).
  • the computing system may utilize the (unfiltered) Radar data from the Radar sensor, together with image data (if available; e.g., image data 670 from the camera(s) 675, or the like), as input to the object detection system.
  • Results or data from the object detection system may subsequently be sent to a user device(s) (e.g., user device(s) 605, or the like) for display to a user(s), or the like.
  • the distance between the Radar sensor and the transparent or translucent material may be a distance value that is stored in a data store (e.g., data store 645 or database(s) 680, or the like) prior to receiving the Radar data, the data store being accessible by the computing system.
  • the distance value (D) may include, but is not limited to, one of: a first distance value obtained using a manual measurement between the Radar sensor and the transparent or translucent material, along a plane that is orthogonal to an emitter/receiver surface of the Radar sensor; a second distance value obtained using a Radar measurement using the Radar sensor between the Radar sensor and the transparent or translucent material, along the plane that is orthogonal to the emitter/receiver surface of the Radar sensor; a third distance value comprising a default distance value corresponding to estimated or average distances between the transparent or translucent material and a typical mounting position of the Radar sensor; a fourth distance value comprising a range of values between zero and the first distance value; a fifth distance value comprising a range of values between zero and the second distance value; a sixth distance value comprising a range of values between zero and the third distance value; or one of the first through sixth distance values with a predetermined tolerance value; and/or the like.
  • the manual measurement may be performed by a user using one of a ruler, a measuring tape, a combination of a string and a ruler, a combination of a string and a measuring tape, a combination of a laser-based measuring tool positioned beside the Radar sensor and a laser reflective material placed against a surface of the transparent or translucent material and oriented to reflect laser light back to the laser-based measuring tool, the laser-based measuring tool positioned with the transparent or translucent material positioned between the laser-based measuring tool and the Radar sensor and with the laser-based measuring tool aimed at the Radar sensor, or the laser-based measuring tool positioned with the laser-based measuring tool positioned between the transparent or translucent material and the Radar sensor and with the laser-based measuring tool aimed at the Radar sensor, and/or the like.
  • the Radar measurement may be performed using one of a combination of the Radar sensor and a Radar reflective material positioned between the transparent or translucent material and the Radar sensor and oriented or configured to reflect Radar signals back to the Radar sensor or a combination of the Radar sensor and a Radar reflective material that is positioned with the transparent or translucent material between the Radar reflective material and the Radar sensor and oriented or configured to reflect Radar signals back to the Radar sensor, wherein the Radar reflective material comprises a Radar signal corner reflector, and/or the like.
  • the Radar data may include Radar point cloud data.
  • the at least one Radar data point may include at least one point within the Radar point cloud data.
  • the point cloud data may include coordinate data (e.g., x-axis coordinate data, y-axis coordinate data, and z-axis coordinate data, or corresponding polar coordinate data, or the like) and Radar reflection data (e.g., Radar reflection value or Radar signal intensity value, or the like), and/or the like.
  • the Radar data may include one of a 2D Radar heat map or a 3D Radar heat map.
  • the at least one Radar data point may include a corresponding one of at least one intensity point within the 2D Radar heat map or at least one peak within the 3D Radar heat map.
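The point-cloud case described above can be sketched in a few lines of Python. This is an illustrative sketch only, not the patented implementation: the stored distance D, the tolerance value, and the column layout of the point cloud are all assumptions made for this example.

```python
import numpy as np

# Illustrative sketch: filter Radar point cloud data by dropping points whose
# range matches the stored distance D between the sensor and the transparent
# or translucent material. D, the tolerance, and the column layout (x, y, z,
# intensity) are assumptions, not details taken from the patent.

D_M = 0.15          # assumed stored sensor-to-material distance (metres)
TOLERANCE_M = 0.02  # assumed predetermined tolerance

def point_ranges(points: np.ndarray) -> np.ndarray:
    """Range of each point computed from its x/y/z coordinates (columns 0-2)."""
    return np.linalg.norm(points[:, :3], axis=1)

def filter_material_reflections(points: np.ndarray) -> np.ndarray:
    """Drop points whose range falls within the tolerance band around D."""
    noise = np.abs(point_ranges(points) - D_M) <= TOLERANCE_M
    return points[~noise]

# Each row: x, y, z (metres) and a Radar reflection intensity value.
cloud = np.array([
    [0.10, 0.10, 0.05, 0.9],   # range 0.15 m: reflection from the material
    [5.00, 1.00, 0.20, 0.4],   # range ~5.1 m: a genuine object
])
filtered = filter_material_reflections(cloud)   # only the second point remains
```

The same idea extends to the heat-map variants: instead of dropping rows of a point cloud, one would zero out the intensity points or peaks in the range bin(s) corresponding to D.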

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Radar Systems Or Details Thereof (AREA)

Abstract

Novel tools and techniques are provided for implementing radar-based object detection (e.g., advanced driver assistance systems ("ADASs"), other radar-based object detection, etc.). In various embodiments, a computing system may receive Radar data from a Radar sensor, the Radar sensor being disposed such that a transparent or translucent material is positioned between one or more objects and the Radar sensor. The computing system may parse the Radar data to extract range or depth data, and if at least one extracted range or depth data corresponds to a distance between the Radar sensor and the transparent or translucent material, may filter out at least one Radar data point corresponding to said at least one extracted range or depth data from the Radar data to produce filtered Radar data, and may utilize the filtered Radar data as input to an object detection system.

Description

RADAR SIGNAL FILTERING FOR REMOVING NOISE DUE TO TRANSPARENT OR TRANSLUCENT MATERIAL LOCATED IN FRONT OF SENSOR
COPYRIGHT STATEMENT
[0001] A portion of the disclosure of this patent document contains material that is subject to copyright protection. The copyright owner has no objection to the facsimile reproduction by anyone of the patent document or the patent disclosure as it appears in the Patent and Trademark Office patent file or records, but otherwise reserves all copyright rights whatsoever.
FIELD
[0002] The present disclosure relates, in general, to methods, systems, and apparatuses for implementing radar-based object detection (e.g., advanced driver assistance systems ("ADASs"), other radar-based object detection, or the like), and, more particularly, to methods, systems, and apparatuses for implementing radar signal filtering for removing noise due to transparent or translucent material located in front of sensor.
BACKGROUND
[0003] With the rapid development of artificial intelligence ("Al") technologies during the past decade, more and more automobile manufacturers are equipping their new vehicle models with advanced driver assistance systems ("ADAS") devices, which rely on on-board graphics processing unit ("GPU")/central processing unit ("CPU") to process and analyze the input data regarding their surroundings. These input data can come from different types of sensors, such as a camera, radio detection and ranging ("Radar"), and/or light detection and ranging ("Lidar"). Among them, Radar is often adopted for its insensitivity to bad weather and its low cost, even though Radar data comes with relatively low image resolution. A Radar device can be installed in different locations in a vehicle. It can be placed outside a driver's compartment, such as right behind the bumper at the front of the vehicle for better field of view ("FOV"). It can also be placed behind the windshield inside the driver's compartment for easier installation, maintenance, and better portability.
[0004] In cases where the Radar device is placed behind the windshield, for some windshields that include a metallization layer to mitigate the infrared radiation through the shield, an opening is often created in the metallization layer to let the radar signal pass through. However, even a windshield without a metallization layer can introduce some undesirable radar reflection for the radar receiver. Yet, this side effect (i.e., signal noise) from the windshield is often overlooked. While some conventional systems and techniques are directed to increasing the Radar data resolution through radar signal filtering, none have been found to filter out the noisy data that are introduced by the translucent/transparent windshield.
[0005] Hence, there is a need for more robust and scalable solutions for implementing radar-based object detection (e.g., advanced driver assistance systems ("ADASs"), or the like).
SUMMARY
[0006] The techniques of this disclosure generally relate to tools and techniques for implementing radar-based object detection (e.g., ADASs, or the like), and, more particularly, to methods, systems, and apparatuses for implementing radar signal filtering for removing noise due to transparent or translucent material located in front of sensor.
[0007] In an aspect, a method may comprise receiving, using a computing system, radio detection and ranging ("Radar") data from a Radar sensor, the Radar sensor being disposed such that a transparent or translucent material is positioned between one or more objects and the Radar sensor; parsing, using the computing system, the Radar data to extract range or depth data; based on a determination that at least one extracted range or depth data corresponds to a distance between the Radar sensor and the transparent or translucent material, filtering out, using the computing system, at least one Radar data point corresponding to said at least one extracted range or depth data from the Radar data to produce filtered Radar data, and utilizing the filtered Radar data as input to an object detection system; and, based on a determination that no extracted range or depth data corresponds to the distance between the Radar sensor and the transparent or translucent material, utilizing the Radar data from the Radar sensor as input to the object detection system.
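The claimed flow (receive, parse, conditionally filter, detect) can be outlined as a short, non-authoritative Python sketch. All names here (process_frame, detect_objects), the distance D, and the tolerance are hypothetical placeholders; the claim does not prescribe any particular implementation.

```python
# Hedged sketch of the claimed flow: receive -> parse -> conditionally filter
# -> object detection. Names, D, and the tolerance are placeholders only.

def process_frame(radar_points, D, tolerance, detect_objects):
    """radar_points: list of (range_m, intensity) pairs from the Radar sensor."""
    # Parse the Radar data to extract range or depth data.
    ranges = [r for r, _ in radar_points]

    # Does any extracted range correspond to the sensor-to-material distance D?
    if any(abs(r - D) <= tolerance for r in ranges):
        # Filter out the matching Radar data points to produce filtered data,
        # and use the filtered Radar data as input to object detection.
        filtered = [(r, i) for r, i in radar_points if abs(r - D) > tolerance]
        return detect_objects(filtered)

    # Otherwise, use the (unfiltered) Radar data as input to object detection.
    return detect_objects(radar_points)

frame = [(0.15, 0.9), (5.1, 0.4)]
result = process_frame(frame, D=0.15, tolerance=0.02,
                       detect_objects=lambda pts: pts)  # stand-in detector
```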
[0008] In another aspect, an apparatus might comprise at least one processor and a non-transitory computer readable medium communicatively coupled to the at least one processor. The non-transitory computer readable medium might have stored thereon computer software comprising a set of instructions that, when executed by the at least one processor, causes the apparatus to: receive radio detection and ranging ("Radar") data from a Radar sensor, the Radar sensor being disposed such that a transparent or translucent material is positioned between one or more objects and the Radar sensor; parse the Radar data to extract range or depth data; based on a determination that at least one extracted range or depth data corresponds to a distance between the Radar sensor and the transparent or translucent material, filter out at least one Radar data point corresponding to said at least one extracted range or depth data from the Radar data to produce filtered Radar data, and utilize the filtered Radar data as input to an object detection system; and based on a determination that no extracted range or depth data corresponds to the distance between the Radar sensor and the transparent or translucent material, utilize the Radar data from the Radar sensor as input to the object detection system.
[0009] In yet another aspect, a system might comprise a computing system, which might comprise at least one first processor and a first non-transitory computer readable medium communicatively coupled to the at least one first processor. The first non-transitory computer readable medium might have stored thereon computer software comprising a first set of instructions that, when executed by the at least one first processor, causes the computing system to: receive radio detection and ranging ("Radar") data from a Radar sensor, the Radar sensor being disposed such that a transparent or translucent material is positioned between one or more objects and the Radar sensor; parse the Radar data to extract range or depth data; based on a determination that at least one extracted range or depth data corresponds to a distance between the Radar sensor and the transparent or translucent material, filter out at least one Radar data point corresponding to said at least one extracted range or depth data from the Radar data to produce filtered Radar data, and utilize the filtered Radar data as input to an object detection system; and based on a determination that no extracted range or depth data corresponds to the distance between the Radar sensor and the transparent or translucent material, utilize the Radar data from the Radar sensor as input to the object detection system.
[0010] Various modifications and additions can be made to the embodiments discussed without departing from the scope of the invention. For example, while the embodiments described above refer to particular features, the scope of this invention also includes embodiments having different combinations of features and embodiments that do not include all of the above-described features.
[0011] The details of one or more aspects of the disclosure are set forth in the accompanying drawings and the description below. Other features, objects, and advantages of the techniques described in this disclosure will be apparent from the description and drawings, and from the claims.
BRIEF DESCRIPTION OF THE DRAWINGS
[0012] A further understanding of the nature and advantages of particular embodiments may be realized by reference to the remaining portions of the specification and the drawings, in which like reference numerals are used to refer to similar components. In some instances, a sub-label is associated with a reference numeral to denote one of multiple similar components. When reference is made to a reference numeral without specification to an existing sub-label, it is intended to refer to all such multiple similar components.
[0013] Fig. 1 is a schematic diagram illustrating a system for implementing radar signal filtering for removing noise due to transparent or translucent material located in front of sensor, in accordance with various embodiments.
[0014] Figs. 2A and 2B are schematic block flow diagrams illustrating a non-limiting example of object detection using radar signal filtering for removing noise due to transparent or translucent material located in front of sensor, in accordance with various embodiments.
[0015] Figs. 3A-3D are schematic diagrams illustrating a non-limiting example of the use of radar signal filtering for removing noise due to transparent or translucent material located in front of sensor during implementation of an advanced driver assistance system ("ADAS"), in accordance with various embodiments.
[0016] Figs. 4A and 4B are flow diagrams illustrating a method for implementing radar signal filtering for removing noise due to transparent or translucent material located in front of sensor, in accordance with various embodiments.
[0017] Fig. 5 is a block diagram illustrating an example of computer or system hardware architecture, in accordance with various embodiments.
[0018] Fig. 6 is a block diagram illustrating a networked system of computers, computing systems, or system hardware architecture, which can be used in accordance with various embodiments.
DETAILED DESCRIPTION
[0019] Overview
[0020] Various embodiments provide tools and techniques for implementing radar-based object detection (e.g., advanced driver assistance systems ("ADASs"), or the like), and, more particularly, to methods, systems, and apparatuses for implementing radar signal filtering for removing noise due to transparent or translucent material located in front of sensor.
[0021] A general framework is provided to filter out unwanted reflection from radar data, the unwanted reflection being caused by a translucent/transparent object in front of the radar sensor. In the non-limiting application scenario of ADAS, this translucent/transparent object is most likely a windshield of a vehicle. This noisy data can be removed so that the quality of the radar signal can be improved, which can help with subsequent object detection.
[0022] In some embodiments, the framework comprises an online part and an offline part. The offline part should be executed before the online part. In the offline part, the distance between the windshield and the radar sensor is determined; this distance is then used in the online part to filter the radar data.
[0023] The various embodiments do not require sophisticated fabrication or installation and can also be adapted to various types of devices.
[0024] Although the most representative application scenario for this framework is ADAS, the various embodiments are not so limited, and the framework can find applications in many other areas or fields of technology where a radar signal is needed. In many such cases, for protective purposes, the radar sensor may be placed behind a translucent/transparent shield, which can introduce unwanted reflective noise in the radar data. The framework described herein can greatly help with this situation to obtain radar data with better quality.

[0025] In various embodiments, a computing system may receive radio detection and ranging ("Radar") data from a Radar sensor, the Radar sensor being disposed such that a transparent or translucent material is positioned between one or more objects and the Radar sensor. The computing system may parse the Radar data to extract range or depth data. Based on a determination that at least one extracted range or depth data corresponds to a distance between the Radar sensor and the transparent or translucent material, the computing system may filter out at least one Radar data point corresponding to said at least one extracted range or depth data from the Radar data to produce filtered Radar data, and may utilize the filtered Radar data as input to an object detection system. Based on a determination that no extracted range or depth data corresponds to the distance between the Radar sensor and the transparent or translucent material, the computing system may utilize the Radar data from the Radar sensor as input to the object detection system.
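By way of a non-limiting illustration only, the online filtering flow described above might be sketched as follows; the point structure, the function name, and the tolerance value are illustrative assumptions and not part of any particular embodiment:

```python
# Illustrative sketch (assumed names and structures) of the online filtering
# flow: Radar returns whose extracted range matches the stored sensor-to-material
# distance D are treated as reflection noise and removed; otherwise the
# Radar data pass through to the object detection system unchanged.

def filter_radar_frame(points, D, tol=0.05):
    """points: iterable of dicts with a 'range_m' key (extracted range, meters).
    D: sensor-to-material distance in meters (from the offline step).
    tol: tolerance band in meters around D (assumed value)."""
    noisy = [p for p in points if abs(p["range_m"] - D) <= tol]
    if not noisy:
        # No extracted range corresponds to D: pass the data through unfiltered.
        return list(points)
    # Otherwise, filter out the points at the sensor-to-material distance.
    return [p for p in points if abs(p["range_m"] - D) > tol]

frame = [{"range_m": 0.18}, {"range_m": 12.4}, {"range_m": 35.0}]
cleaned = filter_radar_frame(frame, D=0.18)
print(len(cleaned))  # the 0.18 m windshield return is filtered out
```

In this sketch, a frame whose extracted ranges contain no value near the stored distance passes through unchanged, mirroring the pass-through branch described above.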
[0026] In some embodiments, the computing system may comprise at least one of a radar data processing system, the object detection system, an object detection and ranging system, a positioning and mapping system, an advanced driver assistance system ("ADAS"), a processor on a user device, a server computer over a network, a cloud computing system, or a distributed computing system, and/or the like.
[0027] According to some embodiments, the Radar sensor may be part of a Radar system that comprises a two-dimensional ("2D") Radar transmitter that emits a 2D Radar signal along a plane that is orthogonal to the 2D Radar transmitter, wherein the transparent or translucent material intersects the plane and the 2D Radar signal. In some instances, the 2D Radar transmitter may comprise at least one antenna disposed on an integrated circuit ("IC") chip, wherein the at least one antenna comprises one of a single IC-based antenna disposed on the IC chip, a plurality of IC-based antennas arranged as a one-dimensional ("1D") line of antennas disposed on the IC chip, or a 2D array of IC-based antennas disposed on the IC chip, and/or the like. The plane may be orthogonal to a surface of the IC chip on which the at least one antenna is disposed. In some cases, the plane may be parallel to a ground surface below the Radar sensor.
[0028] Alternatively, the Radar sensor may be part of a Radar system that comprises a three-dimensional ("3D") Radar transmitter that emits a 3D Radar signal that is orthogonal to the 3D Radar transmitter, wherein the transparent or translucent material intersects the 3D Radar signal.
[0029] In some embodiments, the transparent or translucent material may comprise a windshield of a vehicle, wherein the object detection system may be one of a system integrated with an ADAS or a system separate from yet in communication with an ADAS, or the like. In some cases, the windshield may be a front windshield, and the one or more objects may be located in front of a front portion of the vehicle. Alternatively, the windshield may be a rear windshield, and the one or more objects may be located behind a rear portion of the vehicle.
[0030] According to some embodiments, the distance between the Radar sensor and the transparent or translucent material may be a distance value that is stored in a data store prior to receiving the Radar data, the data store being accessible by the computing system.
[0031] In some instances, the distance value may comprise one of: a first distance value obtained using a manual measurement between the Radar sensor and the transparent or translucent material, along a plane that is orthogonal to an emitter/receiver surface of the Radar sensor; a second distance value obtained using a Radar measurement using the radar sensor between the Radar sensor and the transparent or translucent material, along the plane that is orthogonal to the emitter/receiver surface of the Radar sensor; a third distance value comprising a default distance value corresponding to estimated or average distances between the transparent or translucent material and a typical mounting position of the Radar sensor; a fourth distance value comprising a range of values between zero and the first distance value; a fifth distance value comprising a range of values between zero and the second distance value; a sixth distance value comprising a range of values between zero and the third distance value; or one of the first through sixth distance values with a predetermined tolerance value; and/or the like.
[0032] The manual measurement may be performed by a user using one of a ruler, a measuring tape, a combination of a string and a ruler, a combination of a string and a measuring tape, a combination of a laser-based measuring tool positioned beside the Radar sensor and a laser reflective material placed against a surface of the transparent or translucent material and oriented to reflect laser light back to the laser-based measuring tool, the laser-based measuring tool positioned with the transparent or translucent material positioned between the laser-based measuring tool and the Radar sensor and with the laser-based measuring tool aimed at the Radar sensor, or the laser-based measuring tool positioned with the laser-based measuring tool positioned between the transparent or translucent material and the Radar sensor and with the laser-based measuring tool aimed at the Radar sensor, and/or the like. The Radar measurement may be performed using one of a combination of the Radar sensor and a Radar reflective material positioned between the transparent or translucent material and the Radar sensor and oriented or configured to reflect Radar signals back to the Radar sensor or a combination of the Radar sensor and a Radar reflective material that is positioned with the transparent or translucent material between the Radar reflective material and the Radar sensor and oriented or configured to reflect Radar signals back to the Radar sensor, wherein the Radar reflective material comprises a Radar signal corner reflector, and/or the like.
[0033] Merely by way of example, in some cases, the Radar data may comprise Radar point cloud data, and wherein the at least one Radar data point may comprise at least one point within the Radar point cloud data.
[0034] Alternatively, or additionally, the Radar data may comprise one of a 2D Radar heat map or a 3D Radar heat map, and wherein the at least one Radar data point may comprise a corresponding one of at least one intensity point within the 2D Radar heat map or at least one peak within the 3D Radar heat map.
[0035] In the various aspects described herein, radar signal filtering for removing noise due to transparent or translucent material located in front of sensor is provided. This new technique can effectively remove the unwanted noisy data that are caused by the radar reflection on the windshield of a vehicle, thus providing the user cleaned-up radar data that can result in increased accuracy of radar-based detection. It is also easy to implement. To obtain the distance between the radar sensor and the windshield, the user is provided with three options: (a) distance determination through physical measurement, (b) distance determination through radar signal processing, or (c) use of a default typical distance, or the like. There is no need for any additional software or hardware tool.
[0036] In addition, this new technique does not require sophisticated fabrication or installation for it to be put into a real application. The whole procedure for radar signal filtering is also straightforward for a user to implement. The new technique does not have a strict requirement on the types of radar sensor to be used. Moreover, it can be incorporated into various types of devices. These devices can be either compact (e.g., radar and other type of sensor that are enclosed in a same chamber, etc.) or distributed (e.g., radar and other type of sensor that are separately placed in different chambers, etc.), and/or the like.
[0037] These and other aspects of the system and method for radar signal filtering for removing noise due to transparent or translucent material located in front of sensor are described in greater detail with respect to the figures.
[0038] The following detailed description illustrates a few embodiments in further detail to enable one of skill in the art to practice such embodiments. The described examples are provided for illustrative purposes and are not intended to limit the scope of the invention.

[0039] In the following description, for the purposes of explanation, numerous details are set forth in order to provide a thorough understanding of the described embodiments. It will be apparent to one skilled in the art, however, that other embodiments of the present invention may be practiced without some of these details. In other instances, some structures and devices are shown in block diagram form. Several embodiments are described herein, and while various features are ascribed to different embodiments, it should be appreciated that the features described with respect to one embodiment may be incorporated with other embodiments as well. By the same token, however, no single feature or features of any described embodiment should be considered essential to every embodiment of the invention, as other embodiments of the invention may omit such features.
[0040] Unless otherwise indicated, all numbers used herein to express quantities, dimensions, and so forth should be understood as being modified in all instances by the term "about." In this application, the use of the singular includes the plural unless specifically stated otherwise, and use of the terms "and" and "or" means "and/or" unless otherwise indicated. Moreover, the use of the term "including," as well as other forms, such as "includes" and "included," should be considered non-exclusive. Also, terms such as "element" or "component" encompass both elements and components comprising one unit and elements and components that comprise more than one unit, unless specifically stated otherwise.
[0041] Various embodiments as described herein - while embodying (in some cases) software products, computer-performed methods, and/or computer systems - represent tangible, concrete improvements to existing technological areas, including, without limitation, Radar data processing technology, Radar-based object detection technology, object detection technology, driver assistance technology, and/or the like. In other aspects, some embodiments can improve the functioning of user equipment or systems themselves (e.g., Radar data processing systems, Radar-based object detection systems, object detection systems, driver assistance systems, etc.), for example, by receiving, using a computing system, Radar data from a Radar sensor, the Radar sensor being disposed such that a transparent or translucent material is positioned between one or more objects and the Radar sensor; parsing, using the computing system, the Radar data to extract range or depth data; based on a determination that at least one extracted range or depth data corresponds to a distance between the Radar sensor and the transparent or translucent material, filtering out, using the computing system, at least one Radar data point corresponding to said at least one extracted range or depth data from the Radar data to produce filtered Radar data, and utilizing the filtered Radar data as input to an object detection system; and based on a determination that no extracted range or depth data corresponds to the distance between the Radar sensor and the transparent or translucent material, utilizing the Radar data from the Radar sensor as input to the object detection system; and/or the like.
[0042] In particular, to the extent any abstract concepts are present in the various embodiments, those concepts can be implemented as described herein by devices, software, systems, and methods that involve novel functionality (e.g., steps or operations), such as, radar signal filtering for removing noise due to transparent or translucent material located in front of sensor (e.g., removing the unwanted noisy data that are caused by the radar reflection on the windshield of a vehicle, thus providing the user cleaned-up radar data that can result in increased accuracy of radar-based detection, for driver assistance applications, or the like) and/or the like, to name a few examples, that extend beyond mere conventional computer processing operations. These functionalities can produce tangible results outside of the implementing computer system, including, merely by way of example, optimized Radar data processing that takes into account unwanted noise caused by reflection from a transparent or translucent material or object between the Radar sensor(s) and objects in front of the Radar sensor(s), using a system that does not require sophisticated fabrication or installation for it to be put into a real application, is straightforward for a user to implement, does not have a strict requirement on the types of radar sensor to be used, and can be incorporated into various types of devices, and/or the like, at least some of which may be observed or measured by users (e.g., drivers, Radar technicians, etc.), developers, and/or Radar-based object detection system manufacturers.
[0043] Some Embodiments

[0044] We now turn to the embodiments as illustrated by the drawings. Figs. 1-6 illustrate some of the features of the method, system, and apparatus for implementing radar-based object detection (e.g., advanced driver assistance systems ("ADASs"), or the like), and, more particularly, to methods, systems, and apparatuses for implementing radar signal filtering for removing noise due to transparent or translucent material located in front of sensor, as referred to above. The methods, systems, and apparatuses illustrated by Figs. 1-6 refer to examples of different embodiments that include various components and steps, which can be considered alternatives or which can be used in conjunction with one another in the various embodiments. The description of the illustrated methods, systems, and apparatuses shown in Figs. 1-6 is provided for purposes of illustration and should not be considered to limit the scope of the different embodiments.
[0045] With reference to the figures, Fig. 1 is a schematic diagram illustrating a system 100 for implementing radar signal filtering for removing noise due to transparent or translucent material located in front of sensor, in accordance with various embodiments.

[0046] In the non-limiting example of Fig. 1, system 100 may comprise computing system(s) 105a, which may include, without limitation, at least one of one or more radio detection and ranging ("Radar") data processors 110 (including, but not limited to, Radar data parser 110a, Radar data filter 110b, and/or the like), one or more image data processors 115 (optional), an object detection system 120, or a data store 125, and/or the like. System 100 may further comprise one or more Radar sensors 135 and one or more cameras 155 (optional) that may be used to detect one or more objects 140a-140n (collectively, "objects 140" or the like), in some cases, through a transparent or translucent material 145.
[0047] System 100 may further comprise remote computing system 105b (and corresponding database(s) 160), computing system 105b including, but not limited to, at least one of a remote or server-based radar data processing system, a remote or server-based object detection system, a remote or server-based object detection and ranging system, a remote or server-based positioning and mapping system, a remote or server-based ADAS, a server computer over a network, a cloud computing system, or a distributed computing system, and/or the like. System 100 may further comprise network(s) 165, a (separate or standalone) ADAS 170 (optional), and one or more user devices 175a-175n (collectively, "user devices 175" or the like).
[0048] In some embodiments, the computing system 105a may include, without limitation, at least one of a radar data processing system, the object detection system, an object detection and ranging system, a positioning and mapping system, an advanced driver assistance system ("ADAS"), or a processor on a user device, and/or the like. The data store 125 may include, but is not limited to, at least one of read-only memory ("ROM"), programmable read-only memory ("PROM"), erasable programmable read-only memory ("EPROM"), electrically erasable programmable read-only memory ("EEPROM"), flash memory, other non-volatile memory devices, random-access memory ("RAM"), static random-access memory ("SRAM"), dynamic random-access memory ("DRAM"), synchronous dynamic random-access memory ("SDRAM"), virtual memory, a RAM disk, or other volatile memory devices, non-volatile RAM devices, and/or the like.
[0049] According to some embodiments, the Radar sensor(s) 135 may be part of a Radar system that includes a two-dimensional ("2D") Radar transmitter that emits a 2D Radar signal along a plane that is orthogonal to the 2D Radar transmitter, where the transparent or translucent material 145 may intersect the plane and the 2D Radar signal. In some instances, the 2D Radar transmitter may include at least one antenna disposed on an integrated circuit ("IC") chip, which may be part of a system on chip ("SoC") platform or the like. The at least one antenna may include, but is not limited to, one of a single IC-based antenna disposed on the IC chip, a plurality of IC-based antennas arranged as a one-dimensional ("1D") line of antennas disposed on the IC chip, or a 2D array of IC-based antennas disposed on the IC chip, and/or the like. The plane may be orthogonal to a surface of the IC chip on which the at least one antenna is disposed. In some cases, the plane may be parallel to a ground surface below the Radar sensor(s) 135. Alternatively, the Radar sensor(s) 135 may be part of a Radar system that includes a three-dimensional ("3D") Radar transmitter that emits a 3D Radar signal that is orthogonal to the 3D Radar transmitter, where the transparent or translucent material 145 may intersect the 3D Radar signal.
[0050] Herein, a transparent material (including, but not limited to, clear glass, etc.) refers to a material through which all incident light passes, while a translucent material (including, but not limited to, frosted glass, some plastics, etc.) refers to a material through which some (but not all) of the incident light passes. In some cases, the transparent or translucent material 145 may include, but is not limited to, one of a vehicle front windshield with tinting and/or metallization layers, a vehicle front windshield without tinting or metallization layers, a vehicle rear windshield with tinting and/or metallization layers, a vehicle rear windshield without tinting or metallization layers, a building clear glass window, a building frosted glass window, a clear plastic protective material for a portable device, a frosted plastic protective material for a portable device, a clear plastic wall, a frosted plastic wall, and/or the like.

[0051] According to some embodiments, the user devices 175 may each include, without limitation, one of a smart phone, a tablet computer, a laptop computer, a desktop computer, or a server computer, and/or the like. Some user devices (e.g., a smart phone, a tablet computer, a laptop computer, etc.) may each include at least one integrated display screen (not shown) (in some cases, including a non-touchscreen display screen(s), while, in other cases, including a touchscreen display screen(s), and, in still other cases, including a combination of at least one non-touchscreen display screen and at least one touchscreen display screen) and at least one integrated audio playback device (not shown) (e.g., built-in speakers or the like). 
Some user devices (e.g., a desktop computer, or a server computer, or the like) may each include at least one external display screen or monitor (not shown) (which may be a non-touchscreen display device or a touchscreen display device, or the like) and at least one integrated audio playback device (not shown) (e.g., built-in speakers, etc.) and/or at least one external audio playback device (not shown) (e.g., external or peripheral speakers, wired earphones, wired earbuds, wired headphones, wireless earphones, wireless earbuds, wireless headphones, or the like). Some user devices (e.g., some desktop computers, or some server computers, or the like) may have neither an integrated display screen nor an external display screen.
[0052] The lightning bolt symbols are used to denote wireless communications between computing system(s) 105a and each of one or more of radar sensor(s) 135, camera(s) 155, ADAS 170, and/or user device(s) 175, between network(s) 165 (in some cases, via network access points or the like (not shown)) and each of one or more of computing system(s) 105a, remote computing system 105b, radar sensor(s) 135, camera(s) 155, ADAS 170, and/or user device(s) 175, and between computing system 105b and each of one or more of computing system(s) 105a, radar sensor(s) 135, camera(s) 155, ADAS 170, and/or user device(s) 175 via network(s) 165, and/or the like. In some embodiments, the wireless communications may include wireless communications using protocols including, but not limited to, at least one of Bluetooth™ communications protocol, WiFi communications protocol, or other 802.11 suite of communications protocols, ZigBee communications protocol, Z-wave communications protocol, or other 802.15.4 suite of communications protocols, cellular communications protocol (e.g., 3G, 4G, 4G LTE, 5G, etc.), or other suitable communications protocols, and/or the like.
[0053] In some cases, the network(s) 165 may each include a local area network ("LAN"), including, without limitation, a fiber network, an Ethernet network, a Token-Ring™ network, and/or the like; a wide-area network ("WAN"); a wireless wide area network ("WWAN"); a virtual network, such as a virtual private network ("VPN"); the Internet; an intranet; an extranet; a public switched telephone network ("PSTN"); an infra-red network; a wireless network, including, without limitation, a network operating under any of the IEEE 802.11 suite of protocols, the Bluetooth™ protocol known in the art, and/or any other wireless protocol; and/or any combination of these and/or other networks. In a particular embodiment, the network(s) 165 might include an access network of the service provider (e.g., an Internet service provider ("ISP")). In another embodiment, the network(s) 165 may include a core network of the service provider, and/or the Internet.
[0054] In operation, computing system 105a or 105b (collectively, "computing system" or the like) may receive Radar data (e.g., Radar signal data 130, or the like) from a Radar sensor (e.g., Radar sensor(s) 135, or the like), the Radar sensor being disposed such that a transparent or translucent material (e.g., transparent or translucent material 145, or the like) is positioned between one or more objects (e.g., objects 140a-140n, or the like) and the Radar sensor. The computing system may parse (e.g., using Radar data parser 110a, or the like) the Radar data to extract range or depth data (e.g., range or depth d1 - dn corresponding to the depth or distance between the Radar sensor(s) 135 and a corresponding one of objects 140a-140n, or the like). Based on a determination that at least one extracted range or depth data (e.g., one of d1 - dn, or the like) corresponds to a distance (D) between the Radar sensor and the transparent or translucent material, the computing system may filter out (e.g., using Radar data filter 110b, or the like) at least one Radar data point corresponding to said at least one extracted range or depth data from the Radar data to produce filtered Radar data, and may utilize the filtered Radar data as input to an object detection system (e.g., object detection system 120, or the like) and/or an ADAS (e.g., ADAS 170, or the like; in the case of radar for use in assisting driving of vehicles, or the like). Based on a determination that no extracted range or depth data corresponds to the distance between the Radar sensor and the transparent or translucent material, the computing system may utilize the (unfiltered) Radar data from the Radar sensor as input to the object detection system. In some cases, image data (if available; e.g., image data 150 from the camera(s) 155, or the like) may also be utilized as additional input to the object detection system. 
Results or data from the object detection system (and/or ADAS) may subsequently be sent to a user device(s) (e.g., user device(s) 175, or the like) for display to a user(s), or the like.
[0055] According to some embodiments, the distance between the Radar sensor and the transparent or translucent material may be a distance value that is stored in a data store (e.g., data store 125 or database(s) 160, or the like) prior to receiving the Radar data, the data store being accessible by the computing system.
[0056] In some instances, the distance value (D) may include, but is not limited to, one of: a first distance value obtained using a manual measurement between the Radar sensor and the transparent or translucent material, along a plane that is orthogonal to an emitter/receiver surface of the Radar sensor; a second distance value obtained using a Radar measurement using the Radar sensor between the Radar sensor and the transparent or translucent material, along the plane that is orthogonal to the emitter/receiver surface of the Radar sensor; a third distance value comprising a default distance value corresponding to estimated or average distances between the transparent or translucent material and a typical mounting position of the Radar sensor; a fourth distance value comprising a range of values between zero and the first distance value; a fifth distance value comprising a range of values between zero and the second distance value; a sixth distance value comprising a range of values between zero and the third distance value; or one of the first through sixth distance values with a predetermined tolerance value; and/or the like. Merely by way of example, in some cases, the distance (D) may include any suitable separation distance between a Radar sensor and the transparent or translucent material, including, but not limited to, one of about 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49, or 50 cm, or a range between about 0.25 and about 50 cm, or the like. 
In some cases, the default distance value may be any suitable value, including, but not limited to, one of about 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49, or 50 cm, or a range between about 0.25 and about 50 cm, or the like (e.g., 30 cm or the like, where any values of d between 0 and 30 cm may be filtered out). In some instances, the predetermined tolerance may be any suitable tolerance value, including, but not limited to, one of about 1, 2, 3, 4, 5, 6, 7, 8, 9, or 10 cm, or a range between about 0.25 and about 10 cm, or the like. The depth or range (d) of typical Radar sensors may extend from about 0 cm to 100s of meters, or further.
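By way of a non-limiting illustration only, the distance-value variants above (a tolerance band around the stored distance, or a band from zero up to the stored distance) might be expressed as a simple filter band; the function name, units, and default tolerance below are illustrative assumptions:

```python
# Illustrative sketch (assumed names; all values in centimeters) of turning a
# stored distance value into a band of depths d to discard as reflection noise.

def filter_band(distance_cm, mode="tolerance", tol_cm=2.0):
    """Return (low, high) bounds of depths d to filter out.

    mode="tolerance": discard returns within distance_cm +/- tol_cm.
    mode="zero_to_d": discard everything from 0 up to distance_cm (e.g., a
    30 cm default distance discards all returns with d in 0-30 cm)."""
    if mode == "zero_to_d":
        return (0.0, distance_cm)
    return (distance_cm - tol_cm, distance_cm + tol_cm)

low, high = filter_band(30.0, mode="zero_to_d")
print(low, high)  # 0.0 30.0
```

Either variant yields the same downstream behavior: any extracted range falling inside the band is treated as corresponding to the transparent or translucent material and its data point is removed.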
[0057] The manual measurement may be performed by a user using one of a ruler, a measuring tape, a combination of a string and a ruler, a combination of a string and a measuring tape, a combination of a laser-based measuring tool positioned beside the Radar sensor and a laser reflective material placed against a surface of the transparent or translucent material and oriented to reflect laser light back to the laser-based measuring tool, the laser-based measuring tool positioned with the transparent or translucent material positioned between the laser-based measuring tool and the Radar sensor and with the laser-based measuring tool aimed at the Radar sensor, or the laser-based measuring tool positioned with the laser-based measuring tool positioned between the transparent or translucent material and the Radar sensor and with the laser-based measuring tool aimed at the Radar sensor, and/or the like. The Radar measurement may be performed using one of a combination of the Radar sensor and a Radar reflective material positioned between the transparent or translucent material and the Radar sensor and oriented or configured to reflect Radar signals back to the Radar sensor or a combination of the Radar sensor and a Radar reflective material that is positioned with the transparent or translucent material between the Radar reflective material and the Radar sensor and oriented or configured to reflect Radar signals back to the Radar sensor, wherein the Radar reflective material comprises a Radar signal corner reflector, and/or the like.
[0058] Merely by way of example, in some cases, the Radar data may include Radar point cloud data. In some instances, the at least one Radar data point may include at least one point within the Radar point cloud data. In some embodiments, the point cloud data may include coordinate data (e.g., x-axis coordinate data, y-axis coordinate data, and z-axis coordinate data, or corresponding polar coordinate data, or the like) and Radar reflection data (e.g., Radar reflection value or Radar signal intensity value, or the like), and/or the like.
[0059] Alternatively, or additionally, the Radar data may include one of a 2D Radar heat map or a 3D Radar heat map. In such cases, the at least one Radar data point may include a corresponding one of at least one intensity point within the 2D Radar heat map or at least one peak within the 3D Radar heat map.
[0060] In some aspects, millimeter wave Radar (e.g., Texas Instruments' mmWave Radar or TI mmWave, or the like) may be used as the Radar sensor (although not limited to such type of Radar system), which can provide Radar input in the form of Radar points, or the like. The data associated with each Radar point may include, without limitation, both the 3D coordinates (e.g., x, y, z coordinate data, polar coordinate data, or the like) and the intensity of Radar reflection (usually in the form of Radar cross section ("RCS"), or the like). As the intensity of Radar reflection (e.g., RCS) may be determined not only by the distance between an object and the Radar sensor, but also by the material of the object as well as the angle of the reflective surface, the unwanted noise caused by a vehicle windshield (or other transparent or translucent material or object) cannot be removed by using RCS values alone. Therefore, the system and techniques of the various embodiments set out to remove Radar noise using the 3D coordinates of the Radar point cloud. In particular, the system and techniques of the various embodiments may rely on the range (or depth) information in the 3D coordinates to filter out the noise in the Radar data.
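Merely by way of illustration, the range-based filtering described above might be sketched as follows. The point layout (an (x, y, z, RCS) tuple in meters and dBsm), the function names, and the tolerance value are assumptions for illustration only, not details from this disclosure:

```python
import math

# Each Radar point is assumed to be an (x, y, z, rcs) tuple;
# this layout is an illustrative stand-in for mmWave sensor output.

def point_range(point):
    """Euclidean range (depth) of a Radar point from the sensor origin."""
    x, y, z, _rcs = point
    return math.sqrt(x * x + y * y + z * z)

def filter_by_range(points, distance_d, tolerance=0.05):
    """Keep only points whose range is not close to the sensor-to-windshield
    distance D. RCS is deliberately ignored here, since windshield returns
    cannot be isolated by RCS values alone."""
    return [p for p in points if abs(point_range(p) - distance_d) > tolerance]
```

For example, with D = 0.063 m, a point at a range of roughly 0.064 m would be treated as windshield reflection and dropped, while a return at 10 m would be kept.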
[0061] These and other functions of the system 100 (and its components) are described in greater detail below with respect to Figs. 2-4.
[0062] Figs. 2A and 2B (collectively, "Fig. 2") are schematic block flow diagrams illustrating a non-limiting example 200 of object detection using radar signal filtering for removing noise due to transparent or translucent material located in front of sensor, in accordance with various embodiments.
[0063] Some embodiments are directed to ongoing work on developing an ADAS with multiple sensors. Two types of sensors may be adopted for ADAS - namely, a camera and a radar device. The two types of data inputs are aligned through radar-camera calibration before being utilized for object detection in ADAS, as shown in the non-limiting example of Fig. 2A.
[0064] Turning to Fig. 2A, Radar signal data 210 from Radar sensor 205 may be processed and filtered using Radar signal filtering (at block 215), the filtered signal being used for data alignment (at block 230) together with image data 225 from camera 220. After data alignment between the Radar sensor 205 and camera 220, object detection (at block 235) may be performed to detect objects that are sensed or captured by Radar sensor 205 and camera 220. The resultant object detection data may be sent to an ADAS 240 to enable driver assistance.
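The flow of Fig. 2A might be sketched as a simple composition of stages; the alignment and detection callables below are hypothetical placeholders standing in for the calibration-based data alignment (block 230) and object detection (block 235) stages, and the point representation is an assumption:

```python
def radar_camera_pipeline(radar_points, image, distance_d, align, detect,
                          tolerance=0.05):
    """Block 215: filter windshield noise from the Radar signal data;
    block 230: align filtered Radar data with camera image data;
    block 235: detect objects for use by the ADAS (block 240).
    Points are assumed to carry a 'depth' entry, in meters."""
    filtered = [p for p in radar_points
                if abs(p["depth"] - distance_d) > tolerance]
    fused = align(filtered, image)
    return detect(fused)
```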
[0065] To make the system an easy-to-use and easy-to-install product, the multi-sensor data collection platform may be placed inside the driver's compartment behind the windshield. However, this design will lead to the undesirable Radar reflection from the windshield, which could have a negative impact on the subsequent object detection task. Therefore, an additional stage of Radar signal filtering (at block 215) is introduced to remove this unwanted noise in the Radar signal data 210 from Radar sensor 205.
[0066] With reference to the non-limiting example of Fig. 2B, radar signal filtering is shown. Some embodiments comprise two main parts: an offline part (or setup part or pre-sensing part, or the like; depicted on the left side of the vertical dashed line in Fig. 2B) and an online part (or Radar sensing part or object detection part, or the like; depicted on the right side of the vertical dashed line in Fig. 2B). The offline (or setup or pre-sensing) part may include, without limitation, (physically, in some cases manually) setting up the Radar sensor behind a windshield (or other transparent or translucent material or object, or the like) (at block 245), and determining or computing distance (D) between the Radar sensor and the windshield (at block 250). This part is executed before the actual radar data collection (i.e., the online part). The online (or Radar sensing, or object detection) part may include, but is not limited to, receiving Radar data (in some cases, in the form of a Radar point cloud, or the like) from the Radar sensor (at block 255) and parsing the Radar data to extract the range or depth (d) data for each point in the Radar point cloud (at block 260). Next, for each Radar point, this depth value (d) may be compared with the previously obtained distance (D) (at block 265). If these two values are very close (or if depth value (d) is less than or equal to distance (D), or the like) for any particular Radar point, then this radar point may be considered as indicative of Radar reflection from the windshield, and hence may be removed (or filtered) from the final Radar signal output or Radar point cloud (at block 270).
Otherwise, the particular Radar point may be incorporated into the final output of the radar point cloud. The processes at blocks 265 and 270 may be repeated for each Radar data point (or at least for those Radar data points that are close to the distance (D)). At block 275, the unfiltered or incorporated Radar points may be collected, and the Radar data may be sent to an object detection system (at block 280).
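The per-point comparison at blocks 260-275 might be implemented as below, using the "d less than or equal to D" variant described above; the point representation and the tolerance value are illustrative assumptions:

```python
def online_filter(point_cloud, distance_d, tolerance=0.02):
    """Blocks 260-275: parse each point's depth d, treat any d at or below
    D (plus a small tolerance) as windshield reflection and remove it;
    incorporate every other point into the final output."""
    kept = []
    for point in point_cloud:
        d = point["depth"]                   # block 260: extract depth d
        if d <= distance_d + tolerance:      # block 265: compare d with D
            continue                         # block 270: filter out point
        kept.append(point)                   # otherwise incorporate point
    return kept                              # block 275: collected output
```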
[0067] These and other functions of the example 200 (and its components) are described in greater detail below with respect to Figs. 1, 3, and 4.
[0068] Figs. 3A-3D (collectively, "Fig. 3") are schematic diagrams illustrating a non-limiting example 300 of the use of radar signal filtering for removing noise due to transparent or translucent material located in front of sensor during implementation of an advanced driver assistance system ("ADAS"), in accordance with various embodiments. Fig. 3A depicts relative positions between a Radar sensor(s) and a windshield of a vehicle, while Fig. 3B depicts an example Radar range profile that is averaged across all angles of a Radar sensor(s), and Figs. 3C and 3D depict manual measurement (Fig. 3C) and Radar measurement (Fig. 3D), respectively, of distance (D) between the Radar sensor and the windshield. In some cases, the windshield may be a front windshield, and one or more objects that may be detected by the Radar sensor(s) may be located in front of a front portion of the vehicle. Alternatively, the windshield may be a rear windshield, and the one or more objects that may be detected by the Radar sensor(s) may be located behind a rear portion of the vehicle.
[0069] As shown in Fig. 3A, a Radar sensor(s) 305 having a Radar emitter surface 310 and an antenna axis 315 may be placed or mounted within a vehicle interior compartment, behind a windshield 320 (at an approximate distance (D) between the Radar emitter surface 310 and the windshield 320, along the antenna axis 315).
[0070] According to some embodiments, the Radar sensor 305 may be part of a Radar system that comprises a two-dimensional ("2D") Radar transmitter that emits a 2D Radar signal along a plane that is orthogonal to the 2D Radar transmitter (i.e., a plane that is orthogonal to the Radar emitter surface 310, where the antenna axis 315 lies along the plane, or the like, the antenna axis 315 being orthogonal to the radar emitter surface 310). In such a case, windshield 320 (or other transparent or translucent material or object, or the like) may intersect the plane and the 2D Radar signal. In some instances, the 2D Radar transmitter may include, without limitation, at least one antenna disposed on an integrated circuit ("IC") chip, the at least one antenna including, but not limited to, one of a single IC-based antenna disposed on the IC chip, a plurality of IC-based antennas arranged as a one-dimensional ("1D") line of antennas disposed on the IC chip, or a 2D array of IC-based antennas disposed on the IC chip, and/or the like. The plane may be orthogonal to a surface of the IC chip on which the at least one antenna is disposed. In some cases, the plane may be parallel to a ground surface below the Radar sensor.
[0071] Alternatively, the Radar sensor 305 may be part of a Radar system that comprises a three-dimensional ("3D") Radar transmitter that emits a 3D Radar signal that is orthogonal to the 3D Radar transmitter (i.e., orthogonal to the radar emitter surface 310, the antenna axis 315 being orthogonal to the radar emitter surface 310, or the like). In such a case, windshield 320 (or other transparent or translucent material or object, or the like) may intersect the 3D Radar signal.
[0072] Merely by way of example, in some cases, the Radar data may include, but is not limited to, Radar point cloud data, and the at least one Radar data point including, without limitation, at least one point within the Radar point cloud data. Alternatively, or additionally, the Radar data may include, but is not limited to, one of a 2D Radar heat map or a 3D Radar heat map, or the like, and the at least one Radar data point may include, without limitation, a corresponding one of at least one intensity point within the 2D Radar heat map or at least one peak within the 3D Radar heat map, and/or the like.
[0073] The Radar sensor 305 behind the windshield 320 may be placed either close to the top or bottom of the windshield 320 (depicted in Fig. 3 as being close to the top). In either case, it may be necessary to obtain the distance (D) between the Radar sensor 305 and the windshield 320. The exact distance value can be determined either by pure physical measurement (as shown and described with respect to Fig. 3C), or by Radar signal processing (as shown and described with respect to Fig. 3D).
[0074] Turning to Fig. 3B, an example Radar range profile that is averaged across all angles is shown, with each intensity peak (having amplitude measured in dB) representing an object(s) detected at a distance (d) that is measured in meters. Here, the angles refer to angles of a fan-shaped detection field corresponding to the Radar field of detection, expanding from a point at the radar emitter surface 310, with the center of the fan-shaped detection field corresponding to the antenna axis 315. As shown in Fig. 3B, the peaks 325 at about 2.5 inches (or about 6.3 cm) potentially correspond to distance (D) between the Radar sensor(s) 305 and the windshield 320, and thus may be filtered out (or removed) from the Radar data, thereby increasing accuracy of Radar-based object detection, or the like.
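Reading distance (D) off an angle-averaged range profile such as Fig. 3B might be sketched as follows; the profile representation (a list of amplitudes indexed by range bin), the bin width, and the search window are assumptions for illustration:

```python
def estimate_windshield_distance(profile, bin_width, max_search=0.5):
    """Estimate D from an angle-averaged range profile: take the
    strongest peak within `max_search` meters of the sensor, on the
    assumption that the nearest strong return is the windshield.
    `profile` holds amplitudes (e.g., in dB) indexed by range bin."""
    n_bins = min(len(profile), int(round(max_search / bin_width)))
    peak_bin = max(range(n_bins), key=lambda i: profile[i])
    return peak_bin * bin_width
```

Peaks beyond the search window (e.g., actual objects in front of the vehicle) are ignored when estimating D.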
[0075] With reference to the non-limiting example of Fig. 3C, distance (D) may be determined by using a manual or physical measurement. After the Radar sensor 305 has been mounted behind the windshield 320, the distance (D) between them can be measured, e.g., using a ruler or measuring tape 335a (or a combination of a string to mark the distance and a ruler or measuring tape 335a to measure the marked portion of the string, or the like), or the like. It should be noted that the ruler, measuring tape, and/or string need to be orthogonal to the radar antenna plane (i.e., Radar emitter surface 310), or at least be approximately orthogonal to the radar antenna plane (i.e., Radar emitter surface 310), in some cases, along the antenna axis 315, or the like. This is because the Radar beam of Radar sensor 305 for ADAS usually has a limited elevation angle, which will result in an inaccurate reading if the distance (D) is not measured along the orthogonal direction (i.e., along the antenna axis 315, or the like).
[0076] An alternative (and, in some cases, more accurate) way to manually or physically measure the distance (D) is by using a laser distance measurer or laser measuring tool 335b, although it is a more expensive option. In the case that the laser measuring tool 335b is positioned beside the radar sensor and pointed toward the windshield 320, it may be necessary for the user to temporarily place an opaque shield or layer against the windshield 320 so as to prevent the laser beam from easily passing through the windshield 320, which will likely lead to inaccurate measurement. Again, in this case, for the reason described in the previous paragraph, the laser beam ought to be (at least approximately) orthogonal to the radar antenna plane (i.e., orthogonal to the Radar emitter surface 310 and/or aligned with the antenna axis 315, or the like). Alternatively, such as shown in Fig. 3C, the laser measuring tool 335b may be positioned with the windshield 320 between the laser measuring tool 335b and the Radar sensor 305, with the laser measuring tool 335b being aimed at the Radar emitter surface 310 and/or another portion of the Radar sensor, along the antenna axis 315 (where the Radar emitter surface 310 is used to reflect the laser beam back to the laser measuring tool 335b).
[0077] Alternatively, as shown in Fig. 3D, distance (D) may be determined by using a Radar measurement or using Radar signal processing, or the like. In this manner, the distance (D) between the Radar sensor 305 and the windshield 320 may be obtained without a physical measuring device. To determine or compute the distance value (D), one can simply use the Radar signal that is reflected from the windshield 320. As described above, the reflected radar signal, which is received by the radar sensor or receiver 305, may contain range or depth information (d) of the Radar point(s) in front of the Radar sensor(s) 305 (i.e., in front of the Radar emitter surface 310, or the like).
By extracting the depth value (d) from the Radar point cloud, one can obtain the distance value (D). In some embodiments, the user may temporarily place a Radar reflector 340 (e.g., a Radar corner reflector, or the like), which may preferably be made of metal with strong radar reflection characteristics (i.e., back to the Radar sensor(s) 305), close to the windshield 320. This Radar reflector 340 may provide very strong reflection (i.e., high Radar cross section ("RCS") value) in the resultant Radar point cloud, so that a software program can easily detect these points, and subsequently extract the depth value accordingly. As described above, the radar reflector 340 should be placed at approximately the same elevation as the radar sensor (i.e., orthogonal to the Radar plane - that is, orthogonal to the Radar emitter surface 310, the antenna axis 315 being orthogonal to the radar emitter surface 310).
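The reflector-based Radar measurement of D described above might be sketched as follows; the point representation and the RCS threshold are illustrative assumptions rather than values from this disclosure:

```python
import statistics

def distance_from_reflector(points, rcs_threshold=20.0):
    """Radar-based measurement of D (Fig. 3D): a corner reflector placed
    close to the windshield yields points with very high RCS; select those
    points and take the median of their depths as the estimate of D.
    Points are assumed to be dicts with 'depth' and 'rcs' entries."""
    depths = [p["depth"] for p in points if p["rcs"] >= rcs_threshold]
    if not depths:
        raise ValueError("no high-RCS reflector points found")
    return statistics.median(depths)
```

Taking the median rather than a single point makes the estimate robust to the occasional spurious high-RCS return.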
[0078] These and other functions of the example 300 (and its components) are described in greater detail below with respect to Figs. 1, 2, and 4.
[0079] Figs. 4A and 4B (collectively, "Fig. 4") are flow diagrams illustrating a method 400 for implementing radar signal filtering for removing noise due to transparent or translucent material located in front of sensor, in accordance with various embodiments.
[0080] While the techniques and procedures are depicted and/or described in a certain order for purposes of illustration, it should be appreciated that certain procedures may be reordered and/or omitted within the scope of various embodiments. Moreover, while the method 400 illustrated by Fig. 4 can be implemented by or with (and, in some cases, are described below with respect to) the systems, examples, or embodiments 100, 200, and 300 of Figs. 1, 2, and 3, respectively (or components thereof), such methods may also be implemented using any suitable hardware (or software) implementation. Similarly, while each of the systems, examples, or embodiments 100, 200, and 300 of Figs. 1, 2, and 3, respectively (or components thereof), can operate according to the method 400 illustrated by Fig. 4 (e.g., by executing instructions embodied on a computer readable medium), the systems, examples, or embodiments 100, 200, and 300 of Figs. 1, 2, and 3 can each also operate according to other modes of operation and/or perform other suitable procedures.
[0081] In the non-limiting embodiment of Fig. 4A, method 400, at block 405, may comprise determining a distance between a radio detection and ranging ("Radar") sensor and a transparent or translucent material. As described above, a transparent material (including, but not limited to, clear glass, etc.) is one through which all incident light passes, while a translucent material (including, but not limited to, frosted glass, some plastics, etc.) is one through which some (but not all) of the incident light passes.
[0082] At block 410, method 400 may comprise receiving, using a computing system, Radar data from the Radar sensor, the Radar sensor being disposed such that the transparent or translucent material is positioned between one or more objects and the Radar sensor.
[0083] In some embodiments, the computing system may comprise at least one of a radar data processing system, the object detection system, an object detection and ranging system, a positioning and mapping system, an advanced driver assistance system ("ADAS"), a processor on a user device, a server computer over a network, a cloud computing system, or a distributed computing system, and/or the like.
[0084] According to some embodiments, the Radar sensor may be part of a Radar system that comprises a two-dimensional ("2D") Radar transmitter that emits a 2D Radar signal along a plane that is orthogonal to the 2D Radar transmitter, wherein the transparent or translucent material intersects the plane and the 2D Radar signal. In some instances, the 2D Radar transmitter may comprise at least one antenna disposed on an integrated circuit ("IC") chip, wherein the at least one antenna comprises one of a single IC-based antenna disposed on the IC chip, a plurality of IC-based antennas arranged as a one-dimensional ("1D") line of antennas disposed on the IC chip, or a 2D array of IC-based antennas disposed on the IC chip, and/or the like. The plane may be orthogonal to a surface of the IC chip on which the at least one antenna is disposed. In some cases, the plane may be parallel to a ground surface below the Radar sensor.
[0085] Alternatively, the Radar sensor may be part of a Radar system that comprises a three-dimensional ("3D") Radar transmitter that emits a 3D Radar signal that is orthogonal to the 3D Radar transmitter, wherein the transparent or translucent material intersects the 3D Radar signal.
[0086] In some embodiments, the transparent or translucent material may comprise a windshield of a vehicle, wherein the object detection system may be one of a system integrated with an advanced driver assistance system ("ADAS") or a system separate from yet in communication with an ADAS, or the like. In some cases, the windshield may be a front windshield, and the one or more objects may be located in front of a front portion of the vehicle. Alternatively, the windshield may be a rear windshield, and the one or more objects may be located behind a rear portion of the vehicle.
[0087] Method 400 may further comprise parsing, using the computing system, the Radar data to extract range or depth data (block 415). Method 400 may further comprise, at block 420, determining whether at least one extracted range or depth data corresponds to the distance between the Radar sensor and the transparent or translucent material. If so, method 400 may proceed to the process at block 425. If not, method 400 may proceed to the process at block 435.
[0088] At block 425, method 400 may comprise, based on a determination that at least one extracted range or depth data corresponds to the distance between the Radar sensor and the transparent or translucent material, filtering out, using the computing system, at least one Radar data point corresponding to said at least one extracted range or depth data from the Radar data to produce filtered Radar data. Method 400, at block 430, may comprise utilizing the filtered Radar data as input to an object detection system.
[0089] At block 435, method 400 may comprise, based on a determination that no extracted range or depth data corresponds to the distance between the Radar sensor and the transparent or translucent material, utilizing the Radar data from the Radar sensor as input to the object detection system.
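The branch at blocks 420, 425, and 435 might be condensed into a single function as below; the point representation and the tolerance used to decide "corresponds to" are illustrative assumptions:

```python
def method_400_filter(radar_points, distance_d, tolerance=0.02):
    """Return filtered Radar data when at least one extracted depth
    corresponds to the distance D (blocks 420 -> 425 -> 430); otherwise
    pass the Radar data through unchanged (blocks 420 -> 435)."""
    def corresponds(point):
        return abs(point["depth"] - distance_d) <= tolerance
    if any(corresponds(p) for p in radar_points):
        return [p for p in radar_points if not corresponds(p)]
    return radar_points
```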
[0090] With reference to Fig. 4B, determining the distance between the Radar sensor and the transparent or translucent material (at block 405) may comprise one of: (a) utilizing manual measurement (block 440); (b) utilizing Radar measurement (block 445); or (c) utilizing a default distance value(s) (block 450); or the like. In some cases, the distance between the Radar sensor and the transparent or translucent material may be a distance value that is stored in a data store prior to receiving the Radar data, the data store being accessible by the computing system.
[0091] In some embodiments, the manual measurement may be performed by a user using one of (1) a ruler, (2) a measuring tape, (3) a combination of a string and a ruler, (4) a combination of a string and a measuring tape, (5) a combination of a laser-based measuring tool positioned beside the Radar sensor and a laser reflective material placed against a surface of the transparent or translucent material and oriented to reflect laser light back to the laser-based measuring tool, (6) the laser-based measuring tool positioned with the transparent or translucent material positioned between the laser-based measuring tool and the Radar sensor and with the laser-based measuring tool aimed at the Radar sensor, or (7) the laser-based measuring tool positioned with the laser-based measuring tool positioned between the transparent or translucent material and the Radar sensor and with the laser-based measuring tool aimed at the Radar sensor, and/or the like.
[0092] According to some embodiments, the Radar measurement may be performed using one of (i) a combination of the Radar sensor and a Radar reflective material positioned between the transparent or translucent material and the Radar sensor and oriented or configured to reflect Radar signals back to the Radar sensor or (ii) a combination of the Radar sensor and a Radar reflective material that is positioned with the transparent or translucent material between the Radar reflective material and the Radar sensor and oriented or configured to reflect Radar signals back to the Radar sensor, and/or the like. In some instances, the Radar reflective material may include, without limitation, a Radar signal corner reflector, and/or the like.
[0093] In some instances, the distance value may include, without limitation, one of: a first distance value obtained using a manual measurement between the Radar sensor and the transparent or translucent material, along a plane that is orthogonal to an emitter/receiver surface of the Radar sensor; a second distance value obtained using a Radar measurement using the radar sensor between the Radar sensor and the transparent or translucent material, along the plane that is orthogonal to the emitter/receiver surface of the Radar sensor; a third distance value comprising a default distance value corresponding to estimated or average distances between the transparent or translucent material and a typical mounting position of the Radar sensor; a fourth distance value comprising a range of values between zero and the first distance value; a fifth distance value comprising a range of values between zero and the second distance value; a sixth distance value comprising a range of values between zero and the third distance value; or one of the first through sixth distance values with a predetermined tolerance value; and/or the like. Merely by way of example, in some cases, the distance (D) may include any suitable separation distance between a Radar sensor and the transparent or translucent material, including, but not limited to, one of about 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36,
37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49, or 50 cm, or a range between about 0.25 and about 50 cm, or the like. In some cases, the default distance value may be any suitable value, including, but not limited to, one of about 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17,
18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49, or 50 cm, or a range between about 0.25 and about 50 cm, or the like (e.g., 30 cm or the like, where any values of d being 0 - 30 cm may be filtered out). In some instances, the predetermined tolerance may be any suitable tolerance value, including, but not limited to, one of about 1, 2, 3, 4, 5, 6, 7, 8, 9, or 10 cm, or a range between about 0.25 and about 10 cm, or the like. The depth or range (d) of typical Radar sensors may extend from about 0 cm to hundreds of meters, or further.
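The several distance-value forms listed above reduce to a small predicate; the default values below (D = 30 cm, 5 cm tolerance) are only examples drawn from the ranges given and are not prescribed by this disclosure:

```python
def corresponds_to_distance(d, distance_d=0.30, tolerance=0.05):
    """True when a depth d 'corresponds to' the windshield distance D:
    either d lies in the range [0, D] (the fourth through sixth distance
    value forms above) or d is within a predetermined tolerance of D
    (the final form). All values are in meters."""
    return 0.0 <= d <= distance_d or abs(d - distance_d) <= tolerance
```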
[0094] Merely by way of example, in some cases, the Radar data may include, without limitation, Radar point cloud data, where the at least one Radar data point may include, but is not limited to, at least one point within the Radar point cloud data.
[0095] Alternatively, or additionally, the Radar data may include, without limitation, one of a 2D Radar heat map or a 3D Radar heat map, and/or the like, where the at least one Radar data point may include, but is not limited to, a corresponding one of at least one intensity point within the 2D Radar heat map or at least one peak within the 3D Radar heat map, and/or the like.
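For the heat map variant, filtering might amount to suppressing the intensity bins whose range corresponds to D. The layout below (rows as range bins, columns as angle bins) and the parameter values are assumptions for illustration, not from the source:

```python
def filter_heat_map(heat_map, distance_d, bin_width, tolerance=0.05):
    """Zero out the intensity bins of a 2D Radar heat map whose range
    corresponds to the sensor-to-windshield distance D. `heat_map` is a
    list of rows (range bins) of intensity values (angle bins); the
    original heat map is left unmodified."""
    filtered = [row[:] for row in heat_map]
    for i in range(len(filtered)):
        if abs(i * bin_width - distance_d) <= tolerance:
            filtered[i] = [0.0] * len(filtered[i])
    return filtered
```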
[0096] Examples of System and Hardware Implementation
[0097] Fig. 5 is a block diagram illustrating an example of computer or system hardware architecture, in accordance with various embodiments. Fig. 5 provides a schematic illustration of one embodiment of a computer system 500 of the service provider system hardware that can perform the methods provided by various other embodiments, as described herein, and/or can perform the functions of computer or hardware system (i.e., computing systems 105a and 105b, advanced driver assistance systems ("ADASs") 170 and 240, and user devices 175a-175n, etc.), as described above. It should be noted that Fig. 5 is meant only to provide a generalized illustration of various components, of which one or more (or none) of each may be utilized as appropriate. Fig. 5, therefore, broadly illustrates how individual system elements may be implemented in a relatively separated or relatively more integrated manner.
[0098] The computer or hardware system 500 - which might represent an embodiment of the computer or hardware system (i.e., computing systems 105a and 105b, ADASs 170 and 240, and user devices 175a-175n, etc.), described above with respect to Figs. 1-4 - is shown comprising hardware elements that can be electrically coupled via a bus 505 (or may otherwise be in communication, as appropriate). The hardware elements may include one or more processors 510, including, without limitation, one or more general-purpose processors and/or one or more special-purpose processors (such as microprocessors, digital signal processing chips, graphics acceleration processors, and/or the like); one or more input devices 515, which can include, without limitation, a mouse, a keyboard, and/or the like; and one or more output devices 520, which can include, without limitation, a display device, a printer, and/or the like.
[0099] The computer or hardware system 500 may further include (and/or be in communication with) one or more storage devices 525, which can comprise, without limitation, local and/or network accessible storage, and/or can include, without limitation, a disk drive, a drive array, an optical storage device, solid-state storage device such as a random access memory ("RAM") and/or a read-only memory ("ROM"), which can be programmable, flash-updateable, and/or the like. Such storage devices may be configured to implement any appropriate data stores, including, without limitation, various file systems, database structures, and/or the like.
[0100] The computer or hardware system 500 might also include a communications subsystem 530, which can include, without limitation, a modem, a network card (wireless or wired), an infra-red communication device, a wireless communication device and/or chipset (such as a Bluetooth™ device, an 802.11 device, a WiFi device, a WiMax device, a WWAN device, cellular communication facilities, etc.), and/or the like. The communications subsystem 530 may permit data to be exchanged with a network (such as the network described below, to name one example), with other computer or hardware systems, and/or with any other devices described herein. In many embodiments, the computer or hardware system 500 will further comprise a working memory 535, which can include a RAM or ROM device, as described above.
[0101] The computer or hardware system 500 also may comprise software elements, shown as being currently located within the working memory 535, including an operating system 540, device drivers, executable libraries, and/or other code, such as one or more application programs 545, which may comprise computer programs provided by various embodiments (including, without limitation, hypervisors, VMs, and the like), and/or may be designed to implement methods, and/or configure systems, provided by other embodiments, as described herein. Merely by way of example, one or more procedures described with respect to the method(s) discussed above might be implemented as code and/or instructions executable by a computer (and/or a processor within a computer); in an aspect, then, such code and/or instructions can be used to configure and/or adapt a general purpose computer (or other device) to perform one or more operations in accordance with the described methods.
[0102] A set of these instructions and/or code might be encoded and/or stored on a non-transitory computer readable storage medium, such as the storage device(s) 525 described above. In some cases, the storage medium might be incorporated within a computer system, such as the system 500. In other embodiments, the storage medium might be separate from a computer system (i.e., a removable medium, such as a compact disc, etc.), and/or provided in an installation package, such that the storage medium can be used to program, configure, and/or adapt a general purpose computer with the instructions/code stored thereon. These instructions might take the form of executable code, which is executable by the computer or hardware system 500 and/or might take the form of source and/or installable code, which, upon compilation and/or installation on the computer or hardware system 500 (e.g., using any of a variety of generally available compilers, installation programs, compression/decompression utilities, etc.)
then takes the form of executable code.
[0103] It will be apparent to those skilled in the art that substantial variations may be made in accordance with particular requirements. For example, customized hardware (such as programmable logic controllers, field-programmable gate arrays, application-specific integrated circuits, and/or the like) might also be used, and/or particular elements might be implemented in hardware, software (including portable software, such as applets, etc.), or both. Further, connection to other computing devices such as network input/output devices may be employed.
[0104] As mentioned above, in one aspect, some embodiments may employ a computer or hardware system (such as the computer or hardware system 500) to perform methods in accordance with various embodiments of the invention. According to a set of embodiments, some or all of the procedures of such methods are performed by the computer or hardware system 500 in response to processor 510 executing one or more sequences of one or more instructions (which might be incorporated into the operating system 540 and/or other code, such as an application program 545) contained in the working memory 535. Such instructions may be read into the working memory 535 from another computer readable medium, such as one or more of the storage device(s) 525. Merely by way of example, execution of the sequences of instructions contained in the working memory 535 might cause the processor(s) 510 to perform one or more procedures of the methods described herein.
[0105] The terms "machine readable medium" and "computer readable medium," as used herein, refer to any medium that participates in providing data that causes a machine to operate in some fashion. In an embodiment implemented using the computer or hardware system 500, various computer readable media might be involved in providing instructions/code to processor(s) 510 for execution and/or might be used to store and/or carry such instructions/code (e.g., as signals). In many implementations, a computer readable medium is a non-transitory, physical, and/or tangible storage medium. In some embodiments, a computer readable medium may take many forms, including, but not limited to, non-volatile media, volatile media, or the like. Non-volatile media includes, for example, optical and/or magnetic disks, such as the storage device(s) 525. Volatile media includes, without limitation, dynamic memory, such as the working memory 535. In some alternative embodiments, a computer readable medium may take the form of transmission media, which includes, without limitation, coaxial cables, copper wire, and fiber optics, including the wires that comprise the bus 505, as well as the various components of the communication subsystem 530 (and/or the media by which the communications subsystem 530 provides communication with other devices). In an alternative set of embodiments, transmission media can also take the form of waves (including without limitation radio, acoustic, and/or light waves, such as those generated during radio-wave and infra-red data communications).
[0106] Common forms of physical and/or tangible computer readable media include, for example, a floppy disk, a flexible disk, a hard disk, magnetic tape, or any other magnetic medium, a CD-ROM, any other optical medium, punch cards, paper tape, any other physical medium with patterns of holes, a RAM, a PROM, an EPROM, a FLASH-EPROM, any other memory chip or cartridge, a carrier wave as described hereinafter, or any other medium from which a computer can read instructions and/or code.
[0107] Various forms of computer readable media may be involved in carrying one or more sequences of one or more instructions to the processor(s) 510 for execution. Merely by way of example, the instructions may initially be carried on a magnetic disk and/or optical disc of a remote computer. A remote computer might load the instructions into its dynamic memory and send the instructions as signals over a transmission medium to be received and/or executed by the computer or hardware system 500. These signals, which might be in the form of electromagnetic signals, acoustic signals, optical signals, and/or the like, are all examples of carrier waves on which instructions can be encoded, in accordance with various embodiments of the invention.
[0108] The communications subsystem 530 (and/or components thereof) generally will receive the signals, and the bus 505 then might carry the signals (and/or the data, instructions, etc. carried by the signals) to the working memory 535, from which the processor(s) 510 retrieves and executes the instructions. The instructions received by the working memory 535 may optionally be stored on a storage device 525 either before or after execution by the processor(s) 510.
[0109] As noted above, a set of embodiments comprises methods and systems for implementing radar-based object detection (e.g., advanced driver assistance systems ("ADASs"), or the like), and, more particularly, to methods, systems, and apparatuses for implementing radar signal filtering for removing noise due to transparent or translucent material located in front of sensor. Fig. 6 illustrates a schematic diagram of a system 600 that can be used in accordance with one set of embodiments. The system 600 can include one or more user computers, user devices, or customer devices 605. A user computer, user device, or customer device 605 can be a general purpose personal computer (including, merely by way of example, desktop computers, tablet computers, laptop computers, handheld computers, and the like, running any appropriate operating system, several of which are available from vendors such as Apple, Microsoft Corp., and the like), cloud computing devices, a server(s), and/or a workstation computer(s) running any of a variety of commercially-available UNIX™ or UNIX-like operating systems. A user computer, user device, or customer device 605 can also have any of a variety of applications, including one or more applications configured to perform methods provided by various embodiments (as described above, for example), as well as one or more office applications, database client and/or server applications, and/or web browser applications. Alternatively, a user computer, user device, or customer device 605 can be any other electronic device, such as a thin-client computer, Internet-enabled mobile telephone, and/or personal digital assistant, capable of communicating via a network (e.g., the network(s) 610 described below) and/or of displaying and navigating web pages or other types of electronic documents. 
Although the system 600 is shown with two user computers, user devices, or customer devices 605, any number of user computers, user devices, or customer devices can be supported.
[0110] Some embodiments operate in a networked environment, which can include a network(s) 610. The network(s) 610 can be any type of network familiar to those skilled in the art that can support data communications using any of a variety of commercially-available (and/or free or proprietary) protocols, including, without limitation, TCP/IP, SNA™, IPX™, AppleTalk™, and the like. Merely by way of example, the network(s) 610 (similar to network(s) 165 of Fig. 1, or the like) can each include a local area network ("LAN"), including, without limitation, a fiber network, an Ethernet network, a Token-Ring™ network, and/or the like; a wide-area network ("WAN"); a wireless wide area network ("WWAN"); a virtual network, such as a virtual private network ("VPN"); the Internet; an intranet; an extranet; a public switched telephone network ("PSTN"); an infra-red network; a wireless network, including, without limitation, a network operating under any of the IEEE 802.11 suite of protocols, the Bluetooth™ protocol known in the art, and/or any other wireless protocol; and/or any combination of these and/or other networks. In a particular embodiment, the network might include an access network of the service provider (e.g., an Internet service provider ("ISP")). In another embodiment, the network might include a core network of the service provider, and/or the Internet.
[0111] Embodiments can also include one or more server computers 615. Each of the server computers 615 may be configured with an operating system, including, without limitation, any of those discussed above, as well as any commercially (or freely) available server operating systems. Each of the servers 615 may also be running one or more applications, which can be configured to provide services to one or more clients 605 and/or other servers 615.
[0112] Merely by way of example, one of the servers 615 might be a data server, a web server, a cloud computing device(s), or the like, as described above. The data server might include (or be in communication with) a web server, which can be used, merely by way of example, to process requests for web pages or other electronic documents from user computers 605. The web server can also run a variety of server applications, including HTTP servers, FTP servers, CGI servers, database servers, Java servers, and the like. In some embodiments of the invention, the web server may be configured to serve web pages that can be operated within a web browser on one or more of the user computers 605 to perform methods of the invention.
[0113] The server computers 615, in some embodiments, might include one or more application servers, which can be configured with one or more applications accessible by a client running on one or more of the client computers 605 and/or other servers 615. Merely by way of example, the server(s) 615 can be one or more general purpose computers capable of executing programs or scripts in response to the user computers 605 and/or other servers 615, including, without limitation, web applications (which might, in some cases, be configured to perform methods provided by various embodiments). Merely by way of example, a web application can be implemented as one or more scripts or programs written in any suitable programming language, such as Java™, C, C#™ or C++, and/or any scripting language, such as Perl, Python, or TCL, as well as combinations of any programming and/or scripting languages. The application server(s) can also include database servers, including, without limitation, those commercially available from Oracle™, Microsoft™, Sybase™, IBM™, and the like, which can process requests from clients (including, depending on the configuration, dedicated database clients, API clients, web browsers, etc.) running on a user computer, user device, or customer device 605 and/or another server 615. In some embodiments, an application server can perform one or more of the processes for implementing radar-based object detection (e.g., ADASs, or the like), and, more particularly, to methods, systems, and apparatuses for implementing radar signal filtering for removing noise due to transparent or translucent material located in front of sensor, as described in detail above. Data provided by an application server may be formatted as one or more web pages (comprising HTML, JavaScript, etc., for example) and/or may be forwarded to a user computer 605 via a web server (as described above, for example). 
Similarly, a web server might receive web page requests and/or input data from a user computer 605 and/or forward the web page requests and/or input data to an application server. In some cases, a web server may be integrated with an application server.
[0114] In accordance with further embodiments, one or more servers 615 can function as a file server and/or can include one or more of the files (e.g., application code, data files, etc.) necessary to implement various disclosed methods, incorporated by an application running on a user computer 605 and/or another server 615. Alternatively, as those skilled in the art will appreciate, a file server can include all necessary files, allowing such an application to be invoked remotely by a user computer, user device, or customer device 605 and/or server 615.
[0115] It should be noted that the functions described with respect to various servers herein (e.g., application server, database server, web server, file server, etc.) can be performed by a single server and/or a plurality of specialized servers, depending on implementation-specific needs and parameters.
[0116] In some embodiments, the system can include one or more databases 620a-620n (collectively, "databases 620"). The location of each of the databases 620 is discretionary: merely by way of example, a database 620a might reside on a storage medium local to (and/or resident in) a server 615a (and/or a user computer, user device, or customer device 605). Alternatively, a database 620n can be remote from any or all of the computers 605, 615, so long as it can be in communication (e.g., via the network 610) with one or more of these. In a particular set of embodiments, a database 620 can reside in a storage-area network ("SAN") familiar to those skilled in the art. (Likewise, any necessary files for performing the functions attributed to the computers 605, 615 can be stored locally on the respective computer and/or remotely, as appropriate.) In one set of embodiments, the database 620 can be a relational database, such as an Oracle database, that is adapted to store, update, and retrieve data in response to SQL-formatted commands. The database might be controlled and/or maintained by a database server, as described above, for example.
[0117] According to some embodiments, system 600 might further comprise computing system(s) 625 (similar to computing system 105a of Fig. 1, or the like), which may include, without limitation, at least one of one or more radio detection and ranging ("Radar") data processors 630 (similar to Radar data processor(s) 110 of Fig. 1, or the like), one or more image data processors 635 (optional; similar to image data processor(s) 115 of Fig. 1, or the like), an object detection system 640 (similar to object detection system 120 of Fig. 1, or the like), or a data store 645 (similar to data store 125 of Fig. 1, or the like), and/or the like. Each Radar data processor 630 may include, but is not limited to, Radar data parser 630a, Radar data filter 630b, and/or the like (similar to Radar data parser 110a, Radar data filter 110b, and/or the like of Fig. 1, or the like). System 600 may further comprise one or more Radar sensors 655 (similar to Radar sensor(s) 135 of Fig. 1, or the like) and one or more cameras 675 (optional; similar to camera(s) 155 of Fig. 1, or the like) that may be used to detect one or more objects 660a-660n (collectively, "objects 660" or the like; similar to objects 140a-140n of Fig. 1, or the like), in some cases, through a transparent or translucent material 665 (similar to transparent or translucent material 145 of Fig. 1, or the like). System 600 may further comprise remote computing system 625b and corresponding database(s) 680 (similar to remote computing system 105b and corresponding database(s) 160 of Fig. 1, or the like).
[0118] In operation, computing system 625a or 625b (collectively, "computing system" or the like) may receive Radar data (e.g., Radar signal data 650, or the like) from a Radar sensor (e.g., Radar sensor(s) 655, or the like), the Radar sensor being disposed such that a transparent or translucent material (e.g., transparent or translucent material 665, or the like) is positioned between one or more objects (e.g., objects 660a-660n, or the like) and the Radar sensor. The computing system may parse (e.g., using Radar data parser 630a, or the like) the Radar data to extract range or depth data (e.g., range or depth d1 - dn corresponding to the depth or distance between the Radar sensor(s) 655 and corresponding one of objects 660a-660n, or the like). Based on a determination that at least one extracted range or depth data (e.g., one of d1 - dn, or the like) corresponds to a distance (D) between the Radar sensor and the transparent or translucent material, the computing system may filter out (e.g., using Radar data filter 630b, or the like) at least one Radar data point corresponding to said at least one extracted range or depth data from the Radar data to produce filtered Radar data, and may utilize the filtered Radar data as input to an object detection system (e.g., object detection system 640, or the like). Based on a determination that no extracted range or depth data corresponds to the distance between the Radar sensor and the transparent or translucent material, the computing system may utilize the (unfiltered) Radar data from the Radar sensor as input to the object detection system. In some cases, image data (if available; e.g., image data 670 from the camera(s) 675, or the like) may also be utilized as additional input to the object detection system. Results or data from the object detection system may subsequently be sent to a user device(s) (e.g., user device(s) 605, or the like) for display to a user(s), or the like.
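The branching operation described in paragraph [0118] can be sketched in Python. The function name, the (range, intensity) data layout, and the tolerance value below are illustrative assumptions for the sketch only; they are not part of the disclosure:

```python
# Illustrative sketch of the filtering of paragraph [0118].
# The data layout and the default tolerance are hypothetical.

def filter_radar_points(points, sensor_to_material_distance, tolerance=0.05):
    """Remove Radar returns whose extracted range corresponds to the
    distance D between the Radar sensor and the transparent or
    translucent material.

    points: list of (range_m, intensity) tuples parsed from the Radar data.
    sensor_to_material_distance: stored distance D, in meters.
    tolerance: half-width of the rejection band around D, in meters.
    """
    return [
        (rng, intensity)
        for (rng, intensity) in points
        if abs(rng - sensor_to_material_distance) > tolerance
    ]

# Two returns near D = 0.12 m (the material) are dropped; real targets pass.
points = [(0.12, 9.1), (4.80, 3.2), (0.13, 7.5), (12.4, 1.1)]
clean = filter_radar_points(points, sensor_to_material_distance=0.12)
```

If no extracted range matches the stored distance D, every point survives the comparison, which corresponds to the branch in which the unfiltered Radar data is passed to the object detection system.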
[0119] According to some embodiments, the distance between the Radar sensor and the transparent or translucent material may be a distance value that is stored in a data store (e.g., data store 645 or database(s) 680, or the like) prior to receiving the Radar data, the data store being accessible by the computing system.
[0120] In some instances, the distance value (D) may include, but is not limited to, one of: a first distance value obtained using a manual measurement between the Radar sensor and the transparent or translucent material, along a plane that is orthogonal to an emitter/receiver surface of the Radar sensor; a second distance value obtained using a Radar measurement using the Radar sensor between the Radar sensor and the transparent or translucent material, along the plane that is orthogonal to the emitter/receiver surface of the Radar sensor; a third distance value comprising a default distance value corresponding to estimated or average distances between the transparent or translucent material and a typical mounting position of the Radar sensor; a fourth distance value comprising a range of values between zero and the first distance value; a fifth distance value comprising a range of values between zero and the second distance value; a sixth distance value comprising a range of values between zero and the third distance value; or one of the first through sixth distance values with a predetermined tolerance value; and/or the like.
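The distance-value options of paragraph [0120] reduce to two comparison styles: a scalar D (optionally with a predetermined tolerance) and a zero-to-D interval. A minimal sketch under that reading, with hypothetical names not taken from the disclosure:

```python
def matches_material_distance(rng, d_value, tolerance=0.0):
    """Return True if an extracted range corresponds to the stored
    distance value.

    d_value may be a scalar distance D (first through third options,
    optionally combined with a predetermined tolerance) or a
    (low, high) interval such as (0, D) (fourth through sixth options).
    """
    if isinstance(d_value, tuple):
        low, high = d_value
        return low <= rng <= high  # interval-style distance value
    return abs(rng - d_value) <= tolerance  # scalar with tolerance band
```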
[0121] The manual measurement may be performed by a user using one of a ruler, a measuring tape, a combination of a string and a ruler, a combination of a string and a measuring tape, a combination of a laser-based measuring tool positioned beside the Radar sensor and a laser reflective material placed against a surface of the transparent or translucent material and oriented to reflect laser light back to the laser-based measuring tool, the laser-based measuring tool positioned with the transparent or translucent material positioned between the laser-based measuring tool and the Radar sensor and with the laser-based measuring tool aimed at the Radar sensor, or the laser-based measuring tool positioned with the laser-based measuring tool positioned between the transparent or translucent material and the Radar sensor and with the laser-based measuring tool aimed at the Radar sensor, and/or the like. The Radar measurement may be performed using one of a combination of the Radar sensor and a Radar reflective material positioned between the transparent or translucent material and the Radar sensor and oriented or configured to reflect Radar signals back to the Radar sensor or a combination of the Radar sensor and a Radar reflective material that is positioned with the transparent or translucent material between the Radar reflective material and the Radar sensor and oriented or configured to reflect Radar signals back to the Radar sensor, wherein the Radar reflective material comprises a Radar signal corner reflector, and/or the like.
[0122] Merely by way of example, in some cases, the Radar data may include Radar point cloud data. In some instances, the at least one Radar data point may include at least one point within the Radar point cloud data. In some embodiments, the point cloud data may include coordinate data (e.g., x-axis coordinate data, y-axis coordinate data, and z-axis coordinate data, or corresponding polar coordinate data, or the like) and Radar reflection data (e.g., Radar reflection value or Radar signal intensity value, or the like), and/or the like.
[0123] Alternatively, or additionally, the Radar data may include one of a 2D Radar heat map or a 3D Radar heat map. In such cases, the at least one Radar data point may include a corresponding one of at least one intensity point within the 2D Radar heat map or at least one peak within the 3D Radar heat map.
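For point cloud data as described in paragraphs [0122] and [0123], the range of each point can be recovered from its Cartesian coordinates before the comparison against D. The following sketch assumes an (x, y, z, reflection) tuple layout with the Radar sensor at the origin, which is one plausible encoding rather than a layout mandated by the disclosure:

```python
import math

def filter_point_cloud(cloud, d, tolerance=0.05):
    """Drop point cloud returns whose Euclidean range matches the
    sensor-to-material distance d (within a tolerance band)."""
    kept = []
    for (x, y, z, refl) in cloud:
        # Each point is assumed to be (x, y, z, reflection_intensity),
        # with the Radar sensor at the coordinate origin.
        rng = math.sqrt(x * x + y * y + z * z)  # Euclidean range from sensor
        if abs(rng - d) > tolerance:  # keep points away from the material at d
            kept.append((x, y, z, refl))
    return kept
```

For a 2D or 3D Radar heat map, the same comparison would instead be applied to the range bin of each intensity point or peak.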
[0124] These and other functions of the system 600 (and its components) are described in greater detail above with respect to Figs. 1-4.
[0125] While particular features and aspects have been described with respect to some embodiments, one skilled in the art will recognize that numerous modifications are possible. For example, the methods and processes described herein may be implemented using hardware components, software components, and/or any combination thereof. Further, while various methods and processes described herein may be described with respect to particular structural and/or functional components for ease of description, methods provided by various embodiments are not limited to any particular structural and/or functional architecture but instead can be implemented on any suitable hardware, firmware and/or software configuration. Similarly, while particular functionality is ascribed to particular system components, unless the context dictates otherwise, this functionality need not be limited to such and can be distributed among various other system components in accordance with the several embodiments.
[0126] Moreover, while the procedures of the methods and processes described herein are described in a particular order for ease of description, unless the context dictates otherwise, various procedures may be reordered, added, and/or omitted in accordance with various embodiments. Moreover, the procedures described with respect to one method or process may be incorporated within other described methods or processes; likewise, system components described according to a particular structural architecture and/or with respect to one system may be organized in alternative structural architectures and/or incorporated within other described systems. Hence, while various embodiments are described with — or without — particular features for ease of description and to illustrate some aspects of those embodiments, the various components and/or features described herein with respect to a particular embodiment can be substituted, added and/or subtracted from among other described embodiments, unless the context dictates otherwise. Consequently, although several embodiments are described above, it will be appreciated that the invention is intended to cover all modifications and equivalents within the scope of the following claims.

Claims

WHAT IS CLAIMED IS:
1. A method, comprising: receiving, using a computing system, radio detection and ranging ("Radar") data from a Radar sensor, the Radar sensor being disposed such that a transparent or translucent material is positioned between one or more objects and the Radar sensor; parsing, using the computing system, the Radar data to extract range or depth data; based on a determination that at least one extracted range or depth data corresponds to a distance between the Radar sensor and the transparent or translucent material, filtering out, using the computing system, at least one Radar data point corresponding to said at least one extracted range or depth data from the Radar data to produce filtered Radar data, and utilizing the filtered Radar data as input to an object detection system; and based on a determination that no extracted range or depth data corresponds to the distance between the Radar sensor and the transparent or translucent material, utilizing the Radar data from the Radar sensor as input to the object detection system.
2. The method of claim 1, wherein the computing system comprises at least one of a radar data processing system, the object detection system, an object detection and ranging system, a positioning and mapping system, an advanced driver assistance system ("ADAS"), a processor on a user device, a server computer over a network, a cloud computing system, or a distributed computing system.
3. The method of claim 1 or 2, wherein the Radar sensor is part of a Radar system that comprises a two-dimensional ("2D") Radar transmitter that emits a 2D Radar signal along a plane that is orthogonal to the 2D Radar transmitter, wherein the transparent or translucent material intersects the plane and the 2D Radar signal.
4. The method of claim 3, wherein the 2D Radar transmitter comprises at least one antenna disposed on an integrated circuit ("IC") chip, wherein the at least one antenna comprises one of a single IC-based antenna disposed on the IC chip, a plurality of IC-based antennas arranged as a one-dimensional ("1D") line of antennas disposed on the IC chip, or a 2D array of IC-based antennas disposed on the IC chip, wherein the plane is orthogonal to a surface of the IC chip on which the at least one antenna is disposed.
5. The method of claim 3, wherein the plane is parallel to a ground surface below the Radar sensor.
6. The method of claim 1 or 2, wherein the Radar sensor is part of a Radar system that comprises a three-dimensional ("3D") Radar transmitter that emits a 3D Radar signal that is orthogonal to the 3D Radar transmitter, wherein the transparent or translucent material intersects the 3D Radar signal.
7. The method of any of claims 1-6, wherein the transparent or translucent material comprises a windshield of a vehicle, and wherein the object detection system is one of a system integrated with an advanced driver assistance system ("ADAS") or a system separate from yet in communication with an ADAS.
8. The method of claim 7, wherein the windshield is a front windshield, and the one or more objects are located in front of a front portion of the vehicle.
9. The method of claim 7, wherein the windshield is a rear windshield, and the one or more objects are located behind a rear portion of the vehicle.
10. The method of any of claims 1-8, wherein the distance between the Radar sensor and the transparent or translucent material is a distance value that is stored in a data store prior to receiving the Radar data, the data store being accessible by the computing system.
11. The method of claim 10, wherein the distance value comprises one of: a first distance value obtained using a manual measurement between the Radar sensor and the transparent or translucent material, along a plane that is orthogonal to an emitter/receiver surface of the Radar sensor, the manual measurement being performed by a user using one of a ruler, a measuring tape, a combination of a string and a ruler, a combination of a string and a measuring tape, a combination of a laser-based measuring tool positioned beside the Radar sensor and a laser reflective material placed against a surface of the transparent or translucent material and oriented to reflect laser light back to the laser-based measuring tool, the laser-based measuring tool positioned with the transparent or translucent material positioned between the laser-based measuring tool and the Radar sensor and with the laser-based measuring tool aimed at the Radar sensor, or the laser-based measuring tool positioned with the laser-based measuring tool positioned between the transparent or translucent material and the Radar sensor and with the laser-based measuring tool aimed at the Radar sensor;
a second distance value obtained using a Radar measurement using the Radar sensor between the Radar sensor and the transparent or translucent material, along the plane that is orthogonal to the emitter/receiver surface of the Radar sensor, the Radar measurement being performed using one of a combination of the Radar sensor and a Radar reflective material positioned between the transparent or translucent material and the Radar sensor and oriented or configured to reflect Radar signals back to the Radar sensor or a combination of the Radar sensor and a Radar reflective material that is positioned with the transparent or translucent material between the Radar reflective material and the Radar sensor and oriented or configured to reflect Radar signals back to the Radar sensor, wherein the Radar reflective material comprises a Radar signal corner reflector; a third distance value comprising a default distance value corresponding to estimated or average distances between the transparent or translucent material and a typical mounting position of the Radar sensor; a fourth distance value comprising a range of values between zero and the first distance value; a fifth distance value comprising a range of values between zero and the second distance value; a sixth distance value comprising a range of values between zero and the third distance value; or one of the first through sixth distance values with a predetermined tolerance value.
12. The method of any of claims 1-11, wherein the Radar data comprises Radar point cloud data, and wherein the at least one Radar data point comprises at least one point within the Radar point cloud data.
13. The method of any of claims 1-12, wherein the Radar data comprises one of a two-dimensional ("2D") Radar heat map or a three-dimensional ("3D") Radar heat map, and wherein the at least one Radar data point comprises a corresponding one of at least one intensity point within the 2D Radar heat map or at least one peak within the 3D Radar heat map.
14. An apparatus, comprising: at least one processor; and a non-transitory computer readable medium communicatively coupled to the at least one processor, the non-transitory computer readable medium having stored thereon computer software comprising a set of instructions that, when executed by the at least one processor, causes the apparatus to: receive radio detection and ranging ("Radar") data from a Radar sensor, the Radar sensor being disposed such that a transparent or translucent material is positioned between one or more objects and the Radar sensor; parse the Radar data to extract range or depth data; based on a determination that at least one extracted range or depth data corresponds to a distance between the Radar sensor and the transparent or translucent material, filter out at least one Radar data point corresponding to said at least one extracted range or depth data from the Radar data to produce filtered Radar data, and utilize the filtered Radar data as input to an object detection system; and based on a determination that no extracted range or depth data corresponds to the distance between the Radar sensor and the transparent or translucent material, utilize the Radar data from the Radar sensor as input to the object detection system.
15. A system, comprising: a computing system, comprising: at least one first processor; and a first non-transitory computer readable medium communicatively coupled to the at least one first processor, the first non-transitory computer readable medium having stored thereon computer software comprising a first set of instructions that, when executed by the at least one first processor, causes the computing system to: receive radio detection and ranging ("Radar") data from a Radar sensor, the Radar sensor being disposed such that a transparent or translucent material is positioned between one or more objects and the Radar sensor; parse the Radar data to extract range or depth data; based on a determination that at least one extracted range or depth data corresponds to a distance between the Radar sensor and the transparent or translucent material, filter out at least one Radar data point corresponding to said at least one extracted range or depth data from the Radar data to produce filtered Radar data, and utilize the filtered Radar data as input to an object detection system; and based on a determination that no extracted range or depth data corresponds to the distance between the Radar sensor and the transparent or translucent material, utilize the Radar data from the Radar sensor as input to the object detection system.
PCT/US2021/065167 2021-12-24 2021-12-24 Radar signal filtering for removing noise due to transparent or translucent material located in front of sensor WO2022099224A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/US2021/065167 WO2022099224A1 (en) 2021-12-24 2021-12-24 Radar signal filtering for removing noise due to transparent or translucent material located in front of sensor

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/US2021/065167 WO2022099224A1 (en) 2021-12-24 2021-12-24 Radar signal filtering for removing noise due to transparent or translucent material located in front of sensor

Publications (1)

Publication Number Publication Date
WO2022099224A1 true WO2022099224A1 (en) 2022-05-12

Family

ID=81456812

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2021/065167 WO2022099224A1 (en) 2021-12-24 2021-12-24 Radar signal filtering for removing noise due to transparent or translucent material located in front of sensor

Country Status (1)

Country Link
WO (1) WO2022099224A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130271310A1 (en) * 2012-04-12 2013-10-17 Honda Elesys Co., Ltd. On-board radar apparatus, detection method, and detection program
US20170205230A1 (en) * 2014-07-08 2017-07-20 Basf Se Detector for determining a position of at least one object
WO2020163385A1 (en) * 2019-02-06 2020-08-13 Metawave Corporation Method and apparatus for electromagnetic transmission attenuation control
US20200293753A1 (en) * 2019-03-15 2020-09-17 Samsung Electronics Co., Ltd. Millimeter wave radar and camera fusion based face authentication system
US20200363500A1 (en) * 2019-05-13 2020-11-19 Gm Cruise Holdings Llc Radar cross section compensation for calibration of vehicle radar


Similar Documents

Publication Publication Date Title
US11226200B2 (en) Method and apparatus for measuring distance using vehicle-mounted camera, storage medium, and electronic device
US11579307B2 (en) Method and apparatus for detecting obstacle
CN112014845B (en) Vehicle obstacle positioning method, device, equipment and storage medium
CN111324115B (en) Obstacle position detection fusion method, obstacle position detection fusion device, electronic equipment and storage medium
US20200081119A1 (en) Method and apparatus for determining relative pose, device and medium
WO2017057042A1 (en) Signal processing device, signal processing method, program, and object detection system
US20200293793A1 (en) Methods and systems for video surveillance
US11835622B2 (en) Method and device to process radar signal
EP3441906A1 (en) Information processing apparatus, moving object, information processing method, and computer-readble medium
US10759448B2 (en) Method and apparatus for early warning of vehicle offset
CN113887400B (en) Obstacle detection method, model training method and device and automatic driving vehicle
CA3069589A1 (en) Handheld three-dimensional ultrasound imaging system and method
CN110163900B (en) Method and device for adjusting point cloud data
JP2018037737A (en) Periphery monitoring device and periphery monitoring method
CN114488099A (en) Laser radar coefficient calibration method and device, electronic equipment and storage medium
CN112254902A (en) Method and device for generating three-dimensional laser point cloud picture based on laser and visible light scanning
JP6411933B2 (en) Vehicle state determination device
WO2022099224A1 (en) Radar signal filtering for removing noise due to transparent or translucent material located in front of sensor
KR20220109537A (en) Apparatus and method for calibration of sensor system of autonomous vehicle
CN112801024A (en) Detection information processing method and device
KR101658089B1 (en) Method for estimating a center lane for lkas control and apparatus threof
CN114167393A (en) Position calibration method and device for traffic radar, storage medium and electronic equipment
JP2022088496A (en) Method of controlling data collection, and device, electronic apparatus and medium thereof
CN111077527A (en) Obstacle distance determination method and device, vehicle-mounted equipment and storage medium
CN113470103A (en) Method and device for determining camera action distance in vehicle-road cooperation and road side equipment

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21890309

Country of ref document: EP

Kind code of ref document: A1