CN115366845A - System and method for providing safety to a vehicle - Google Patents


Info

Publication number
CN115366845A
CN115366845A (application CN202210462078.XA)
Authority
CN
China
Prior art keywords
points
vehicle
point
space
dynamic
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210462078.XA
Other languages
Chinese (zh)
Inventor
T·萨菲尔
S·哈里斯
J·马哈穆蒂亚齐奥卢
J·汉纳福德
E·哈纳克
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ford Global Technologies LLC
Original Assignee
Ford Global Technologies LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ford Global Technologies LLC filed Critical Ford Global Technologies LLC

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R25/00Fittings or systems for preventing or indicating unauthorised use or theft of vehicles
    • B60R25/30Detection related to theft or to other events relevant to anti-theft systems
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R25/00Fittings or systems for preventing or indicating unauthorised use or theft of vehicles
    • B60R25/10Fittings or systems for preventing or indicating unauthorised use or theft of vehicles actuating a signalling device
    • B60R25/102Fittings or systems for preventing or indicating unauthorised use or theft of vehicles actuating a signalling device a signal being sent to a remote location, e.g. a radio signal being transmitted to a police station, a security company or the owner
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R25/00Fittings or systems for preventing or indicating unauthorised use or theft of vehicles
    • B60R25/30Detection related to theft or to other events relevant to anti-theft systems
    • B60R25/305Detection related to theft or to other events relevant to anti-theft systems using a camera
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R25/00Fittings or systems for preventing or indicating unauthorised use or theft of vehicles
    • B60R25/30Detection related to theft or to other events relevant to anti-theft systems
    • B60R25/31Detection related to theft or to other events relevant to anti-theft systems of human presence inside or outside the vehicle
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R25/00Fittings or systems for preventing or indicating unauthorised use or theft of vehicles
    • B60R25/30Detection related to theft or to other events relevant to anti-theft systems
    • B60R25/34Detection related to theft or to other events relevant to anti-theft systems of conditions of vehicle components, e.g. of windows, door locks or gear selectors
    • GPHYSICS
    • G08SIGNALLING
    • G08BSIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B13/00Burglar, theft or intruder alarms
    • G08B13/22Electrical actuation
    • G08B13/24Electrical actuation by interference with electromagnetic field distribution
    • G08B13/2491Intrusion detection systems, i.e. where the body of an intruder causes the interference with the electromagnetic field
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R25/00Fittings or systems for preventing or indicating unauthorised use or theft of vehicles
    • B60R25/10Fittings or systems for preventing or indicating unauthorised use or theft of vehicles actuating a signalling device
    • B60R2025/1013Alarm systems characterised by the type of warning signal, e.g. visual, audible
    • B60R2025/1016Remote signals alerting owner or authorities, e.g. radio signals
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2325/00Indexing scheme relating to vehicle anti-theft devices
    • B60R2325/20Communication devices for vehicle anti-theft devices
    • B60R2325/205Mobile phones
    • GPHYSICS
    • G08SIGNALLING
    • G08BSIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B29/00Checking or monitoring of signalling or alarm systems; Prevention or correction of operating errors, e.g. preventing unauthorised operation
    • G08B29/18Prevention or correction of operating errors
    • G08B29/20Calibration, including self-calibrating arrangements

Landscapes

  • Engineering & Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • General Physics & Mathematics (AREA)
  • Traffic Control Systems (AREA)
  • Transportation (AREA)

Abstract

The systems and methods disclosed herein are configured to provide safety to a vehicle. A system includes a point measurement device located inside a space of a vehicle. The system identifies a threat if a point other than the calibrated set of points is located in a region of the space.

Description

System and method for providing safety to a vehicle
Technical Field
The present disclosure relates to systems and methods for providing safety to a vehicle.
Background
Equipment in a vehicle, such as a commercial vehicle, is a target of theft. Attacks on the vehicle can be carried out in an organized manner and follow an established procedure for breaking into the vehicle. The procedure may include steps such as attacking a body panel with a tool or peeling back the body panel at a perceived weak point. It is with respect to these and other considerations that the disclosure herein is set forth.
Disclosure of Invention
The systems and methods disclosed herein are configured to provide safety to a vehicle. The system includes a point measurement device located inside a space of the vehicle. The system identifies a threat if a point other than the calibrated set of points is located in a region of the space.
Referring to FIG. 1, system 100 includes a mobile device 110, a server 112, and a vehicle 120 connected by a network 114. The vehicle 120 includes a vehicle computer 122, a point measurement device 124, and a computer vision system 128.
The point measurement device may be a Frequency Modulated Continuous Wave (FMCW) LIDAR (light detection and ranging) radar 124. The FMCW radar 124 is located in an interior space 126 of the vehicle 120. The interior space 126 may be a loading space (e.g., equipment storage space or trunk space) of the vehicle 120.
The FMCW radar 124 may include a transmitter 130 (e.g., a Transmit (TX) antenna) that transmits a beam 132 including a frequency band (e.g., chirp) at an angle, and a receiver 134 (e.g., a Receive (RX) antenna) that receives the reflected beam 132.
FMCW radar 124 generates a point cloud (e.g., point 136). If the load space 126 is closed, the point cloud (e.g., point 136) represents the interior surface of the load space 126 (e.g., presence detection).
To generate a point cloud (e.g., point 136), the time difference between when beam 132 was transmitted and when beam 132 was received may be determined. The difference in time and frequency of the beam 132 may be used to determine the distance to a point 136 on the interior surface of the load space 126. Using the distance to the point 136 and the angle of the beam 132, the (x, y) coordinates of the point 136 on the coordinate system may be determined (e.g., located). Thus, a set of points 136 covering the interior surface of the loading space 126 may be determined.
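The range-and-angle computation described above can be sketched as follows. This is a minimal illustration, not the patent's implementation: the helper name `point_from_chirp` and the chirp parameters are assumptions, and the standard FMCW relation (beat frequency divided by chirp slope gives the round-trip time) is used.

```python
import math

C = 3.0e8  # speed of light in m/s

def point_from_chirp(beat_freq_hz, chirp_slope_hz_per_s, beam_angle_rad):
    """Convert one FMCW chirp measurement into an (x, y) point.

    The beat frequency (the frequency difference between the transmitted
    and received chirp) divided by the chirp slope gives the round-trip
    time; half the round-trip distance is the range to the point, and
    the beam angle places the point on the coordinate system.
    """
    round_trip_s = beat_freq_hz / chirp_slope_hz_per_s
    distance_m = C * round_trip_s / 2.0
    x = distance_m * math.cos(beam_angle_rad)
    y = distance_m * math.sin(beam_angle_rad)
    return (x, y)

# A 100 kHz beat frequency with a 10 MHz/us chirp slope corresponds to a
# point 1.5 m away; at a 30 degree beam angle this lands at roughly
# (1.30, 0.75) on the coordinate system.
x, y = point_from_chirp(100e3, 10e12, math.radians(30.0))
```

Sweeping the beam angle and repeating this computation yields the set of points 136 covering the interior surface.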
The FMCW radar 124 determines a point 136 on the interior of the loading space 126, for example, as a safety measure to identify when the loading space 126 is damaged or breached.
In some cases, the system 100 may be dormant and awakened based on sound or vibration measurements. For example, the system 100 may include a sensor array 140, the sensor array 140 including an accelerometer and/or a microphone to measure sound or vibration. If the sound or vibration measurement exceeds a certain threshold, the vehicle computer 122 may initiate a measurement with the FMCW radar 124.
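The wake-up decision described above amounts to a simple threshold test; both threshold values below are illustrative assumptions, not figures from the patent.

```python
def should_wake(sound_level_db, vibration_g,
                sound_threshold_db=60.0, vibration_threshold_g=0.5):
    """Wake the dormant system when either the sound or the vibration
    measurement from the sensor array exceeds its threshold; the FMCW
    measurement would then be initiated. Threshold values are
    illustrative, not taken from the patent."""
    return (sound_level_db > sound_threshold_db
            or vibration_g > vibration_threshold_g)
```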
Referring to fig. 2 and 3, a portion of a wall 138 of the loading space 126 is shown for teaching purposes. The wall 138 may comprise a panel of the vehicle 120.
The FMCW radar 124 may be calibrated by determining a set of points 136 on the interior of the load space 126 that includes a static reference point 150. The step of determining the static reference point 150 may be repeated until all points are determined to be static.
For example, if point 136 is determined to be at the same location (e.g., have the same distance at the same angle) at two different times, point 136 is a static reference point 150. The calibration provides a set of static reference points 150 as a basis for measuring dynamic points.
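The repeat-until-static calibration can be sketched as a loop over scans; `scan` below is a hypothetical stand-in for the radar, and representing a scan as a dict of beam angle to distance is an assumption made for illustration.

```python
def calibrate(scan, tolerance_m=0.01):
    """Repeat radar scans until every point is static, i.e. each beam
    angle reports the same distance on two consecutive scans (within a
    small tolerance). `scan` is a hypothetical stand-in for the FMCW
    radar: it returns a dict mapping beam angle to measured distance.
    """
    previous = scan()
    while True:
        current = scan()
        if all(abs(current[angle] - previous[angle]) <= tolerance_m
               for angle in current):
            return current  # the set of static reference points
        previous = current
```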
Since the load space 126 of the vehicle 120 may have several items or contents occupying it at any given time, the static reference point 150 may also represent static items in the load space 126.
To determine whether to notify a user or the mobile device 110 of access to or damage of the loading space 126, the system 100 determines whether the set of points 136 includes dynamic points 160. Referring to fig. 2 and 3, fig. 2 shows static reference points 150 (e.g., a side view) on the undamaged wall 138, and fig. 3 shows static reference points 150 and dynamic points 160 on the damaged wall 138. For the damaged wall 138 in FIG. 3, the set of points 136 includes static reference points 150 on the undamaged portion of the wall 138 and dynamic points 160 on the damaged portion of the wall 138.
The dynamic point 160 is a different point 136 than the static reference point 150 (e.g., the FMCW radar 124 determines a point 136 that has a different distance at the same angle as the static reference point 150).
According to an exemplary method, after calibration, FMCW radar 124 determines the locations of multiple points 136 and determines whether any of points 136 are dynamic points 160. System 100 ignores point 136 as static reference point 150 and focuses on point 136 as dynamic point 160 (e.g., different from static reference point 150).
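The static-versus-dynamic comparison above can be sketched by diffing a new scan against the calibrated reference; the dict representation and tolerance are illustrative assumptions.

```python
def find_dynamic_points(reference, current, tolerance_m=0.01):
    """Return the points whose distance at a given beam angle differs
    from the calibrated static reference; points matching the reference
    are ignored as static. Both dicts map beam angle to measured
    distance, and the tolerance is an illustrative assumption."""
    return {angle: dist for angle, dist in current.items()
            if abs(dist - reference.get(angle, dist)) > tolerance_m}
```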
The system 100 then determines whether the dynamic points 160 are in a localized region. For example, the system 100 may determine whether the dynamic points 160 are close together (e.g., a number of the dynamic points 160 are within a threshold distance of each other (e.g., 1-10 cm)).
If the dynamic points 160 are not close together, the dynamic points 160 are not located, and the system 100 returns to the step of determining the location of the point 136.
If the dynamic points 160 are close together, a cluster 162 is created from the dynamic points 160. The clusters 162 of dynamic points 160 may indicate local deformation of the wall 138 (e.g., a body panel) and thus may be separated from other smaller local damage areas caused by, for example, weather. In particular, weather damage may be sustained along the entire body panel rather than in localized areas, and the degree of damage may be low.
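The close-together test and cluster creation can be sketched as simple proximity grouping; single-linkage grouping and the default threshold are assumptions made for illustration, not the patent's stated algorithm.

```python
import math

def cluster_dynamic_points(points, threshold_m=0.05):
    """Group dynamic points that lie within a threshold distance of one
    another (the patent suggests on the order of 1-10 cm). Points are
    (x, y) tuples; a cluster of nearby points suggests localized
    deformation rather than diffuse weather damage."""
    clusters = []
    for p in points:
        for cluster in clusters:
            if any(math.dist(p, q) <= threshold_m for q in cluster):
                cluster.append(p)
                break
        else:
            clusters.append([p])  # point starts a new cluster
    return clusters
```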
The system 100 then determines whether a cluster 162 has been previously identified. If cluster 162 has not been previously identified, cluster 162 is assigned an identifier.
The system 100 may continue to determine points 136. If the next set of dynamic points 160 is the same as, or close to, those of the previously identified cluster 162 (e.g., indicating additional damage to the wall 138), the next set of dynamic points 160 may be added to, or may replace, the previous set of dynamic points 160 to track the cluster 162 over time. Alternatively, a new cluster may be identified.
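The add-or-replace tracking described above can be sketched with centroid matching; matching on centroids and the match distance are illustrative choices, not details given in the patent.

```python
import math

def update_clusters(tracked, new_points, match_dist_m=0.1):
    """Merge a newly observed group of dynamic points into a previously
    identified cluster when their centroids are close; otherwise assign
    a fresh identifier. `tracked` maps identifier -> list of (x, y)
    points. Centroid matching is an illustrative assumption."""
    def centroid(points):
        xs, ys = zip(*points)
        return (sum(xs) / len(xs), sum(ys) / len(ys))

    c_new = centroid(new_points)
    for ident, points in tracked.items():
        if math.dist(centroid(points), c_new) <= match_dist_m:
            points.extend(new_points)  # track growth of existing damage
            return ident
    ident = len(tracked)
    tracked[ident] = list(new_points)
    return ident
```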
The identified cluster 162 is an indicator of a threat or suspicious activity. In particular, the cluster 162 of dynamic points 160 may represent a deformation of the wall 138 (e.g., a body panel), an intrusion into the load space 126 (e.g., a person entering the load space 126 without the body panel being deformed), and/or another suspicious event that the system 100 will detect. An attack on the loading space 126 may deform the body panel, pierce or cut the body panel, pull the body panel rearward, combinations thereof, and the like.
The identified cluster 162 initiates an object detection method by the computer vision system 128. The computer vision system 128 may include a camera 170 and an object recognition algorithm that detects and tracks objects from the image data. For example, the computer vision system 128 may capture an image with the camera 170 and identify objects (e.g., people and/or tools) in the image to verify that the identified cluster 162 represents a threat or suspicious activity.
If the threat or suspicious activity is verified, the system 100 may notify the user on the mobile device 110 or notify the server 112. Thus, the system 100 uses an internal sensor (the FMCW radar 124) to detect external interactions with the vehicle 120. The internal sensor is not easily tampered with or disabled, because any attempt to reach it through the load space 126 allows the system 100 to notify a user before the sensor can be accessed. The vehicle 120 thus becomes aware of a potential attack earlier than with conventional vehicle security methods.
These and other advantages of the present disclosure are provided in greater detail herein.
Drawings
The detailed description explains the embodiments with reference to the drawings. The use of the same reference numbers may indicate similar or identical items. Various embodiments may utilize elements and/or components other than those shown in the figures, and some elements and/or components may not be present in various embodiments. Elements and/or components in the drawings have not necessarily been drawn to scale. Throughout this disclosure, singular and plural terms may be used interchangeably, depending on the context.
FIG. 1 depicts a system for providing safety to a vehicle according to the present disclosure.
Fig. 2 depicts a wall of a space of a vehicle according to the present disclosure.
Fig. 3 depicts a damaged version of the wall of fig. 2 according to the present disclosure.
Fig. 4 depicts a method according to the present disclosure.
Fig. 5 depicts the system of fig. 1 according to the present disclosure.
Detailed Description
The present disclosure now will be described more fully hereinafter with reference to the accompanying drawings, in which exemplary embodiments of the disclosure are shown, and which are not intended to be limiting.
Referring to fig. 1, system 100 includes a mobile device 110, a server 112, and a vehicle 120 connected by a network 114.
The vehicle 120 may take the form of a passenger or commercial automobile, such as, for example, a truck, a car, a sport utility vehicle, a cross-over vehicle, a van, a minivan, a taxi, a bus, etc., and may be configured to include various types of automotive drive systems. Exemplary drive systems may include various types of Internal Combustion Engine (ICE) powertrains having gasoline, diesel, or natural gas powered combustion engines with conventional drive components such as transmissions, drive shafts, differentials, and the like.
In another configuration, the vehicle 120 may be configured as an Electric Vehicle (EV). More specifically, the vehicle 120 may include a Battery EV (BEV) drive system. Vehicle 120 may be configured as a Hybrid EV (HEV) having a stand-alone on-board power plant or a plug-in HEV (PHEV) including an HEV powertrain connectable to an external power source (including a parallel or series hybrid powertrain having a combustion engine power plant and one or more EV drive systems). An HEV may include a battery and/or a bank of super capacitors for storage, a flywheel storage system, or other power generation and storage infrastructure.
The vehicle 120 may be further configured as a Fuel Cell Vehicle (FCV) that converts liquid or solid fuel into usable power using a fuel cell (e.g., a Hydrogen Fuel Cell Vehicle (HFCV) powertrain, etc.) and/or any combination of these drive systems and components.
Additionally, vehicle 120 may be a manually driven vehicle and/or configured to operate in a fully autonomous (e.g., unmanned) mode (e.g., level 5 autonomous) or in one or more partially autonomous modes. Examples of partially autonomous modes are broadly understood in the art as being level 1 to level 5 autonomous.
In some cases, the system 100 may be dormant and awakened based on sound or vibration measurements. For example, the system 100 includes a sensor array 140, the sensor array 140 including an accelerometer and/or a microphone to measure sound or vibration. If the sound or vibration measurement exceeds a certain threshold, the vehicle computer 122 may initiate a measurement with the FMCW radar 124.
The vehicle includes a point measurement device 124, such as a Frequency Modulated Continuous Wave (FMCW) LIDAR (light detection and ranging) radar 124. The FMCW radar 124 is located in an interior space 126 of the vehicle 120. The interior space 126 may be a loading space (e.g., a trunk space) of the vehicle 120.
The FMCW radar 124 includes a transmitter 130 (e.g., a Transmit (TX) antenna) that transmits a beam 132 including a frequency band (e.g., chirp) at an angle, and a receiver 134 (e.g., a Receive (RX) antenna) that receives the reflected beam 132.
FMCW radar 124 generates a point cloud (e.g., point 136). If the load space 126 is closed, the point cloud (e.g., point 136) represents an interior surface of the load space 126 (e.g., presence detection) or a static object in the load space 126.
To generate a point cloud (e.g., point 136), the time difference between when beam 132 was transmitted and when beam 132 was received may be determined. The difference in time and frequency of the beam 132 may be used to determine the distance to a point 136 on the interior surface of the load space 126. Using the distance to the point 136 and the angle of the beam 132, the (x, y) coordinates of the point 136 on the coordinate system may be determined (e.g., located). Thus, a set of points 136 covering the interior surface of the loadspace 126 or a static object in the loadspace 126 can be determined.
The FMCW radar 124 determines a point 136 on the interior of the loading space 126, for example, as a safety measure to identify when the loading space 126 is damaged or breached.
The FMCW radar 124 may be calibrated by determining a set of points 136 on an interior of the load space 126 (e.g., a wall 138 or object such as a device) that includes a static reference point 150. The step of determining the static reference point 150 may be repeated until all points are determined to be static.
For example, if point 136 is determined to be at the same location (e.g., have the same distance at the same angle) at two different times, point 136 is a static reference point 150. The calibration provides a set of static reference points 150 as a basis for measuring dynamic points 160.
To determine whether to notify a user or the mobile device 110 of access to or damage of the loading space 126, the system 100 determines whether the set of points 136 includes dynamic points 160. Referring to fig. 2 and 3, fig. 2 shows static reference points 150 (e.g., a side view) on the undamaged wall 138, and fig. 3 shows static reference points 150 and dynamic points 160 on the damaged wall 138. For a damaged wall 138, the set of points 136 includes static reference points 150 on an undamaged portion of the wall 138 and dynamic points 160 on a damaged portion of the wall 138.
The dynamic point 160 is a different point than the static reference point 150 (e.g., the FMCW radar 124 determines the point 136 having a different distance at the same angle as the static reference point 150).
The vehicle 120 includes a computer vision system 128. The computer vision system 128 includes a camera 170 and an object detection module that performs object localization and image classification functions on images from the camera 170.
Object detection may be performed using various methods for determining bounding boxes. For example, object detection may use Convolutional Neural Networks (CNN), Region-based Convolutional Neural Networks (R-CNN), Fast R-CNN, Faster R-CNN, You Only Look Once (YOLO), and so on.
The object detection module uses an object localization algorithm and an image classification algorithm. The object localization algorithm locates the presence of an object in the image and indicates the presence of the object with a bounding box. For example, the position of each bounding box is defined by points, and each bounding box has a width and a height. The image classification algorithm predicts the type or class of object (e.g., person or baggage) in the bounding box.
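The bounding-box output and the threat-verification step can be sketched as follows; the `Detection` record, the set of threat classes, and the confidence cutoff are illustrative assumptions, not the patent's specification.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    label: str         # predicted class, e.g. "person" or "baggage"
    x: float           # bounding-box position (point defining the box)
    y: float
    width: float       # bounding-box size
    height: float
    confidence: float

# Classes treated as threats; this set is an illustrative assumption.
THREAT_LABELS = {"person", "tool"}

def verify_threat(detections, min_confidence=0.5):
    """Treat an identified cluster as a verified threat when the object
    detector reports a threat-class object with sufficient confidence."""
    return any(d.label in THREAT_LABELS and d.confidence >= min_confidence
               for d in detections)
```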
Referring to fig. 4, according to a first step 210 of an exemplary method 200, the FMCW radar 124 determines a first set of points 136 on the interior surface of the space 126 of the vehicle 120 at a first time. The first set of points 136 includes static reference points 150. The first set of points 136 may be determined repeatedly until all of the first set of points 136 are static reference points 150.
According to a second step 220, the sensor array 140 measures sound or vibration, and if the sound or vibration exceeds a threshold, the method 200 proceeds to a third step 230.
According to a third step 230, the FMCW radar 124 determines a second set of points 136 on the interior surface of the space 126 of the vehicle 120 at a second time. The second set of points 136 includes a first set of dynamic points 160.
According to a fourth step 240, the vehicle computer 122 determines whether the first set of dynamic points 160 are within a threshold distance of each other.
According to a fifth step 250, if the first set of dynamic points 160 are within a threshold distance of each other, the vehicle computer 122 identifies the first set of dynamic points 160 as a cluster 162.
According to a sixth step 260, once the cluster 162 is identified or verified (e.g., by repeating steps 230, 240, 250), the computer vision system 128 performs an object detection method.
According to a seventh step 270, if a threat is verified by the object detection method (e.g., if an object is detected and identified as a threat), the system 100 generates a notification. The notification may be sent to the mobile device 110 or the server 112.
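The seven steps of method 200 can be sketched end to end; every callable below is a hypothetical stand-in for the corresponding subsystem, and the dict-based scan representation and tolerance are illustrative assumptions.

```python
def run_method_200(scan, wake_triggered, points_close, detect_threat, notify):
    """Sketch of method 200: `scan` stands in for the FMCW radar 124
    (returning a dict of beam angle -> distance), `wake_triggered` for
    the sensor array 140, `points_close` for the threshold-distance
    check, `detect_threat` for the computer vision system 128, and
    `notify` for the message to the mobile device 110 or server 112."""
    reference = scan()                            # step 210: static reference points
    if not wake_triggered():                      # step 220: stay dormant
        return "dormant"
    current = scan()                              # step 230: second set of points
    dynamic = [a for a in current
               if abs(current[a] - reference.get(a, current[a])) > 0.01]
    if not dynamic or not points_close(dynamic):  # step 240
        return "no cluster"
    # step 250: close-together dynamic points form a cluster 162
    if detect_threat():                           # steps 260-270: verify and notify
        notify("threat detected in loading space")
        return "notified"
    return "unverified"
```

For example, feeding the sketch two scans in which one beam angle changes distance, with every check stubbed to succeed, walks through all seven steps and emits the notification.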
Referring to FIG. 5, the system 100 is described in more detail.
The vehicle computer 122 includes components including a memory (e.g., memory 300) and a processor (e.g., processor 302). The mobile device 110 and the server 112 also include a memory and a processor. For purposes of teaching, the description of memory 300 and processor 302 applies to memories and processors of other elements.
The processor may be any suitable processing device or set of processing devices, such as but not limited to: a microprocessor, a microcontroller-based platform, suitable integrated circuitry, one or more Field Programmable Gate Arrays (FPGAs), and/or one or more Application Specific Integrated Circuits (ASICs).
The memory may be volatile memory (e.g., RAM, which may include non-volatile RAM, magnetic RAM, ferroelectric RAM, and any other suitable form); non-volatile memory (e.g., disk memory, flash memory, EPROM, EEPROM, memristor-based non-volatile solid-state memory, etc.); non-alterable memory (e.g., EPROM); a read-only memory; and/or high capacity storage devices (e.g., hard disk drives, solid state drives, etc.). In some examples, the memory includes a variety of memories, particularly volatile and non-volatile memories.
The memory is a computer-readable medium on which one or more sets of instructions (such as software for performing the methods of the present disclosure) may be embedded. The instructions may embody one or more of the methods or logic as described herein. The instructions may reside, completely or at least partially, within any one or more of a memory, a computer-readable medium, and/or within a processor during execution thereof.
The terms "non-transitory computer-readable medium" and "computer-readable medium" should be taken to include a single medium or multiple media (such as a centralized or distributed database that stores one or more sets of instructions, and/or associated caches and servers). The terms "non-transitory computer-readable medium" and "computer-readable medium" also include any tangible medium that is capable of storing, encoding or carrying a set of instructions for execution by a processor or that cause a system to perform any one or more of the methods or operations disclosed herein. As used herein, the term "computer-readable medium" is expressly defined to include any type of computer-readable storage and/or storage disk and to exclude propagating signals.
With continued reference to fig. 5, the Vehicle Control Unit (VCU) 304 includes a plurality of Electronic Control Units (ECUs) 310 disposed in communication with the vehicle computer 122. The VCU 304 may coordinate data between the vehicle systems, connected servers (e.g., server 112), and other vehicles operating as part of a fleet of vehicles. VCU 304 may control various aspects of connected vehicle 120 and implement one or more sets of instructions received from a vehicle system controller (such as vehicle computer 122).
VCU 304 may include or communicate with any combination of ECUs 310 such as, for example, a Body Control Module (BCM) 312, an Engine Control Module (ECM) 314, a Transmission Control Module (TCM) 316, a Telematics Control Unit (TCU) 318, a Restraint Control Module (RCM) 320, and the like. The TCU 318 may be configured to communicate with the ECU 310 via a Controller Area Network (CAN) bus 340. In some aspects, the TCU 318 may retrieve and send data as a node of the CAN bus 340.
The CAN bus 340 may be configured as a multi-master serial bus standard to connect two or more of the ECUs 310 as nodes using a message-based protocol that may be configured and/or programmed to allow the ECUs 310 to communicate with each other. The CAN bus 340 may be or include a high-speed CAN (which may have bit speeds of up to 1 Mb/s over CAN and up to 5 Mb/s over CAN Flexible Data-Rate (CAN FD)) and may include a low-speed or fault-tolerant CAN (up to 125 kbps), which may use a linear bus configuration in some configurations. In some aspects, the ECUs 310 may communicate with a host computer (e.g., the vehicle computer 122 and/or one or more servers 112, etc.), and may also communicate with each other without necessarily requiring a host computer.
The CAN bus 340 may connect the ECU 310 with the vehicle computer 122 so that the vehicle computer 122 may retrieve information from the ECU 310, send information to the ECU 310, and otherwise interact with the ECU 310 to perform the steps described in accordance with embodiments of the present disclosure. The CAN bus 340 may connect CAN bus nodes (e.g., ECUs 310) to each other over a two-wire bus, which may be a twisted pair wire with a nominal characteristic impedance. The CAN bus 340 may also be implemented using other communication protocol solutions, such as Media Oriented System Transport (MOST) or ethernet. In other aspects, the CAN bus 340 CAN be a wireless in-vehicle CAN bus.
VCU 304 may communicate via CAN bus 340 to directly control various loads or may implement such control in conjunction with BCM 312. The ECU 310 described with respect to VCU 304 is provided for exemplary purposes only and is not intended to be limiting or exclusive. Control of and/or communication with other control modules is possible and such control is contemplated.
The ECU 310 may use input from a human driver, input from a vehicle system controller, and/or wireless signal input received from other connected devices over one or more wireless channels to control various aspects of vehicle operation and communication. When configured as nodes in the CAN bus 340, the ECUs 310 may each include a Central Processing Unit (CPU), a CAN controller, and/or a transceiver.
The TCU 318 may be configured to provide vehicle connectivity to wireless computing systems on and off the vehicle 120, and may be configured for wireless communications between the vehicle 120 and other systems, computers, mobile devices 110, servers 112, and modules.
The TCU 318 may include a Navigation (NAV) system 330 for receiving and processing GPS signals from the GPS 332, a Bluetooth® Low-Energy Module (BLEM) 334, a Wi-Fi transceiver, an Ultra Wideband (UWB) transceiver, and/or a transceiver for transmitting data using a Near Field Communication (NFC) protocol, a Bluetooth® protocol, Wi-Fi, Ultra Wideband (UWB), and other possible data connection and sharing technologies, described in further detail below.
The TCU 318 may include wireless transmission and communication hardware that may be configured to communicate with one or more transceivers associated with telecommunications towers and other wireless telecommunications infrastructure. For example, BLEM 334 may be configured and/or programmed to receive messages from one or more cellular towers associated with a telecommunications provider and/or a telematics Service Delivery Network (SDN) associated with vehicle 120 and transmit the messages thereto for coordinating a fleet of vehicles.
The BLEM 334 may establish wireless communication using Bluetooth® and Bluetooth Low-Energy® communication protocols by broadcasting and/or listening for the broadcast of small advertising packets, and by establishing connections with responding devices configured according to embodiments described herein. For example, the BLEM 334 may include Generic Attribute Profile (GATT) device connectivity for client devices responding to or initiating GATT commands and requests.
The external server 112 may be communicatively coupled with the vehicle 120 via the one or more networks 114, which may communicate via one or more wireless channels 350, as depicted in fig. 5.
The mobile device 110 may connect with the vehicle 120 via direct communication (e.g., channel 354) using a Near Field Communication (NFC) protocol, a Bluetooth® protocol, Wi-Fi, Ultra Wideband (UWB), and other possible data connection and sharing technologies.
The one or more networks 114 illustrate an exemplary communication infrastructure in which the connected devices discussed in various embodiments of the present disclosure may communicate. The one or more networks 114 may be and/or include the Internet, a private network, a public network, or another configuration that operates using any one or more known communication protocols, such as, for example, Transmission Control Protocol/Internet Protocol (TCP/IP), Bluetooth®, Wi-Fi based on the Institute of Electrical and Electronics Engineers (IEEE) standard 802.11, WiMAX (IEEE 802.16m), Ultra Wideband (UWB), and cellular technologies such as Time Division Multiple Access (TDMA), Code Division Multiple Access (CDMA), High-Speed Packet Access (HSDPA), Long-Term Evolution (LTE), Global System for Mobile communications (GSM), fifth Generation (5G), Universal Mobile Telecommunications System (UMTS), and the like.
The BCM 312 generally includes an integration of sensors, vehicle performance indicators, and variable reactors associated with vehicle systems, and may include processor-based power distribution circuitry that can control functions associated with the vehicle body, such as vehicle lights, vehicle windows, security devices, the point measurement device 124, the computer vision system 128, the sensor array 140, and various comfort controls. The BCM 312 may also operate as a gateway for bus and network interfaces to interact with remote ECUs.
BCM 312 may coordinate any one or more of a variety of vehicle functionalities, including energy management systems, alarms, vehicle immobilizers, driver and rider access authorization systems, phone-as-a-key (PaaK) systems, driver assistance systems, autonomous Vehicle (AV) control systems, power windows, doors, actuators, and other functionalities, among others.
The BCM 312 may be configured for vehicle energy management, exterior lighting control, wiper functionality, power window and door functionality, heating, ventilation, and air conditioning (HVAC) systems, and driver integration systems. In other aspects, the BCM 312 may control auxiliary device functionality and/or be responsible for the integration of such functionality. In one aspect, a vehicle having a vehicle control system may integrate the system at least in part using the BCM 312.
In the foregoing disclosure, reference has been made to the accompanying drawings that form a part hereof, and in which is shown by way of illustration specific implementations in which the disclosure may be practiced. It is to be understood that other implementations may be utilized and structural changes may be made without departing from the scope of the present disclosure. References in the specification to "one embodiment," "an example embodiment," etc., indicate that the embodiment described may include a particular feature, structure, or characteristic, but every embodiment may not necessarily include the particular feature, structure, or characteristic. Moreover, such phrases are not necessarily referring to the same embodiment. Further, when a feature, structure, or characteristic is described in connection with an embodiment, it will be recognized by those skilled in the art that such feature, structure, or characteristic may be used in connection with other embodiments whether or not explicitly described.
It should also be understood that the word "exemplary" as used herein is intended to be non-exclusive and non-limiting in nature. More specifically, the word "exemplary" as used herein indicates one of several examples, and it should be understood that no undue emphasis or preference is being directed to the particular example being described.
A computer-readable medium (also referred to as a processor-readable medium) includes any non-transitory (e.g., tangible) medium that participates in providing data (e.g., instructions) that may be read by a computer (e.g., by a processor of a computer). Such a medium may take many forms, including but not limited to, non-volatile media and volatile media. The computing device may include computer-executable instructions, where the instructions are executable by one or more computing devices (such as those listed above) and stored on a computer-readable medium.
With respect to the processes, systems, methods, heuristics, etc. described herein, it should be understood that, although the steps of such processes, etc. have been described as occurring according to a certain ordered sequence, such processes could be practiced with the described steps performed in an order other than the order described herein. It is further understood that certain steps may be performed concurrently, that other steps may be added, or that certain steps described herein may be omitted. In other words, the description of processes herein is provided for the purpose of illustrating various embodiments and should in no way be construed as limiting the claims.
Accordingly, it is to be understood that the above description is intended to be illustrative, and not restrictive. Many embodiments and applications other than the examples provided would be apparent upon reading the above description. The scope should be determined, not with reference to the above description, but should instead be determined with reference to the appended claims, along with the full scope of equivalents to which such claims are entitled. It is anticipated and intended that the technology discussed herein will be developed in the future and that the disclosed systems and methods will be incorporated into such future embodiments. In summary, it should be understood that the present application is capable of modification and variation.
Unless explicitly indicated to the contrary herein, all terms used in the claims are intended to be given their ordinary meanings as understood by those skilled in the technologies described herein. In particular, the use of singular articles such as "a," "the," "said," etc., should be read to recite one or more of the indicated elements unless a claim recites an explicit limitation to the contrary. Conditional language such as, among others, "can," "could," "might," or "may," unless specifically stated otherwise or otherwise understood within the context as used, is generally intended to convey that certain embodiments could include certain features, elements, and/or steps, while other embodiments may not. Thus, such conditional language is not generally intended to imply that features, elements, and/or steps are in any way required for one or more embodiments.
According to one embodiment, the point measurement device is a radar comprising a transmitter and a receiver, wherein the transmitter transmits a beam with a frequency band at an angle and the receiver receives a reflected beam, wherein the point location comprises an angle and a distance and the first dynamic point has the same angle as the first static reference point and a different distance from the first static reference point.
According to one embodiment, the invention is further characterized by a sensor array; and instructions that, when executed by the processor, cause the processor to perform operations comprising: measuring sound or vibration with the sensor array and determining the second set of points if the sound or vibration exceeds a certain threshold.
According to one embodiment, the invention is further characterized by a mobile device or server; and instructions that, when executed by the processor, cause the processor to perform operations comprising: sending a notification to at least one of the mobile device and the server in response to a threat indicated by the cluster.
According to one embodiment, the invention is further characterized by a computer vision system; and instructions that, when executed by the processor, cause the processor to perform operations comprising: performing an object detection method with the computer vision system to verify the threat indicated by the cluster.
According to one embodiment, the invention is further characterized by instructions that, when executed by the processor, cause the processor to perform operations comprising: determining a third set of points on the interior surface of the space of the vehicle at a third time, wherein the third set of points includes a second set of dynamic points determined relative to the static reference point; and determining that the second set of dynamic points is associated with the cluster.
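The radar embodiment above locates each point by the angle of the transmitted beam and a measured distance, with the claims deriving that distance from the round-trip time difference and from the beam's frequency. A minimal sketch of both ranging relationships follows; the function names and parameter values are illustrative assumptions, not taken from the patent:

```python
# Speed of light in m/s; a radar beam travels to the surface and back,
# so the one-way range is half the round-trip path.
C = 299_792_458.0

def range_from_time_of_flight(dt_seconds: float) -> float:
    """Range from the time difference between transmitting and receiving a beam."""
    return C * dt_seconds / 2.0

def range_from_fmcw_beat(beat_hz: float, chirp_duration_s: float,
                         bandwidth_hz: float) -> float:
    """Range from beam frequency: for a frequency-modulated (FMCW) radar,
    the beat frequency between transmitted and received chirps scales with range."""
    return C * beat_hz * chirp_duration_s / (2.0 * bandwidth_hz)
```

For example, a reflection returning after about 6.7 nanoseconds corresponds to a surface roughly one metre away, a plausible scale for a vehicle's interior space.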

Claims (15)

1. A method, comprising:
determining a first set of points on an interior surface of a space of a vehicle at a first time, wherein the first set of points includes a static reference point;
determining a second set of points on the interior surface of the space of the vehicle at a second time, wherein the second set of points includes a first set of dynamic points determined relative to the static reference point; and
identifying the first set of dynamic points as a cluster based on the first set of dynamic points being within a threshold distance of each other.
2. The method of claim 1, wherein the point is determined by a radar comprising a transmitter and a receiver.
3. The method of claim 2, wherein the transmitter transmits a beam having a frequency band at an angle and the receiver receives a reflected beam.
4. The method of claim 3, wherein the point locations comprise an angle and a distance, and the first dynamic point has the same angle as the first static reference point and a different distance from the first static reference point.
5. The method of claim 3, wherein the transmitters transmit beams at different angles and the radar determines the distance at the angle to determine the location of a point.
6. The method of claim 5, wherein the distance is based on a time difference between when a beam is transmitted and when the beam is received.
7. The method of claim 6, wherein the distance is based on a frequency of the beam.
8. The method of claim 1, wherein the space in the vehicle is a loading space of the vehicle.
9. The method of claim 1, further comprising: measuring sound or vibration and determining the second set of points based on the sound or vibration exceeding a particular threshold.
10. The method of claim 1, further comprising: the first set of points is repeatedly determined until all points in the first set of points are static reference points.
11. The method of claim 1, wherein the threshold distance is in a range of one centimeter to ten centimeters.
12. The method of claim 1, wherein the cluster is an indication of damage to the vehicle.
13. The method of claim 1, further comprising: performing an object detection method with a computer vision system to verify a threat indicated by the cluster.
14. The method of claim 1, further comprising:
determining a third set of points on the interior surface of the space of the vehicle at a third time, wherein the third set of points includes a second set of dynamic points determined relative to the static reference point; and
determining that the second set of dynamic points is associated with the cluster.
15. A system, comprising:
a vehicle, the vehicle comprising:
a point measurement device located in a space of the vehicle;
a processor; and
a memory comprising instructions that, when executed by the processor, cause the processor to perform operations comprising:
determining a first set of points on an interior surface of the space of the vehicle at a first time using the point measurement device, wherein the first set of points includes a static reference point;
determining a second set of points on the interior surface of the space of the vehicle at a second time using the point measurement device, wherein the second set of points includes a first set of dynamic points determined relative to the static reference point; and
identifying the first set of dynamic points as a cluster if the first set of dynamic points are within a threshold distance of each other.
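The claimed method compares a later scan against a first scan of the interior surface, marks points whose distance changed at a given angle as dynamic, and groups dynamic points lying within a threshold distance of each other into a cluster. The following Python sketch shows that flow under stated assumptions: scans are keyed by beam angle, and grouping is simple single-linkage; the data structures and tolerances are hypothetical, not from the patent:

```python
import math

def find_dynamic_points(first_scan: dict, second_scan: dict,
                        tolerance: float = 0.02) -> list:
    """Points whose measured distance changed (beyond a tolerance, in metres)
    at the same beam angle between scans are 'dynamic'."""
    dynamic = []
    for angle, d2 in second_scan.items():
        d1 = first_scan.get(angle)
        if d1 is not None and abs(d2 - d1) > tolerance:
            dynamic.append((angle, d2))
    return dynamic

def to_xy(angle_deg: float, dist: float) -> tuple:
    """Convert a polar (angle, distance) point to Cartesian coordinates."""
    rad = math.radians(angle_deg)
    return (dist * math.cos(rad), dist * math.sin(rad))

def cluster(points: list, threshold: float = 0.10) -> list:
    """Group points within `threshold` metres of each other
    (single-linkage, breadth-first over the unvisited set)."""
    xy = [to_xy(a, d) for a, d in points]
    unvisited = set(range(len(points)))
    clusters = []
    while unvisited:
        seed = unvisited.pop()
        group, frontier = [seed], [seed]
        while frontier:
            i = frontier.pop()
            near = [j for j in unvisited
                    if math.dist(xy[i], xy[j]) <= threshold]
            for j in near:
                unvisited.remove(j)
                group.append(j)
                frontier.append(j)
        clusters.append([points[i] for i in group])
    return clusters
```

With the claimed one-to-ten-centimetre threshold, two adjacent dented points on a cargo-space wall would fall into one cluster, while an isolated measurement-noise point would form a cluster of its own and could be filtered by size.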
CN202210462078.XA 2021-05-18 2022-04-28 System and method for providing safety to a vehicle Pending CN115366845A (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US17/323,832 2021-05-18
US17/323,832 US11919479B2 (en) 2021-05-18 2021-05-18 Systems and methods for providing security to a vehicle

Publications (1)

Publication Number Publication Date
CN115366845A true CN115366845A (en) 2022-11-22

Family

ID=83899186

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210462078.XA Pending CN115366845A (en) 2021-05-18 2022-04-28 System and method for providing safety to a vehicle

Country Status (3)

Country Link
US (1) US11919479B2 (en)
CN (1) CN115366845A (en)
DE (1) DE102022111056A1 (en)

Family Cites Families (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2012125726A1 (en) 2011-03-14 2012-09-20 Intelligent Technologies International, Inc. Cargo theft prevention system and method
US7492303B1 (en) 2006-05-09 2009-02-17 Personnel Protection Technologies Llc Methods and apparatus for detecting threats using radar
US8461989B2 (en) * 2008-10-16 2013-06-11 Lawrence Livermore National Security, Llc. Smart container UWB sensor system for situational awareness of intrusion alarms
US9041545B2 (en) * 2011-05-02 2015-05-26 Eric Allen Zelepugas Audio awareness apparatus, system, and method of using the same
KR102204839B1 (en) * 2014-02-11 2021-01-19 한국전자통신연구원 Apparatus and method of detecting target using radar
US9802571B2 (en) * 2014-10-01 2017-10-31 Conduent Business Services, Llc Method and system for vandalism and/or loitering detection using video
US10139827B2 (en) 2016-06-28 2018-11-27 Ford Global Technologies, Llc Detecting physical threats approaching a vehicle
US10713839B1 (en) * 2017-10-24 2020-07-14 State Farm Mutual Automobile Insurance Company Virtual vehicle generation by multi-spectrum scanning
EP3499264B1 (en) * 2017-12-13 2020-07-01 Nxp B.V. Radar unit and method for cascading integrated circuits in a radar unit
US11460573B2 (en) * 2017-12-18 2022-10-04 Nec Corporation Synthetic aperture radar signal processing device and method
US10417911B2 (en) * 2017-12-18 2019-09-17 Ford Global Technologies, Llc Inter-vehicle cooperation for physical exterior damage detection
CN108647563A (en) * 2018-03-27 2018-10-12 阿里巴巴集团控股有限公司 A kind of method, apparatus and equipment of car damage identification
CN108594264B (en) * 2018-04-28 2021-10-22 诺亚机器人科技(上海)有限公司 Obstacle recognition method and system and robot with obstacle recognition function
US11049233B2 (en) * 2019-01-14 2021-06-29 Ford Global Technologies, Llc Systems and methods for detecting and reporting vehicle damage events
CN111497773A (en) 2019-01-30 2020-08-07 南京知行新能源汽车技术开发有限公司 Vehicle safety protection system and method
CN115768664A (en) * 2020-04-28 2023-03-07 沃伊亚影像有限公司 System and method for monitoring a vehicle cabin
CN112505142B (en) * 2020-10-27 2023-01-24 北京建筑大学 Method for detecting damage of road structure, autonomous mobile device and storage medium
US11807257B2 (en) * 2021-06-07 2023-11-07 Toyota Connected North America, Inc. Sensing interactions with unpermitted components within a vehicle
US20220388464A1 (en) * 2021-06-08 2022-12-08 Toyota Connected North America, Inc. Sensing the ingress of water into a vehicle

Also Published As

Publication number Publication date
US20220371545A1 (en) 2022-11-24
DE102022111056A1 (en) 2022-11-24
US11919479B2 (en) 2024-03-05

Similar Documents

Publication Publication Date Title
US11710358B2 (en) Time-of-flight vehicle user localization
CN110363899B (en) Method and device for detecting relay attack based on communication channel
US11733690B2 (en) Remote control system for a vehicle and trailer
US11535196B2 (en) Remote vehicle motive control with optimized mobile device localization
US11511576B2 (en) Remote trailer maneuver assist system
US11062582B1 (en) Pick-up cargo bed capacitive sensor systems and methods
US11218836B1 (en) Systems and methods for controlling a geo-fence
US11919479B2 (en) Systems and methods for providing security to a vehicle
US11757559B2 (en) Collaborative signal jamming detection
US11810457B2 (en) Systems and methods for locating a parking space
US11882500B2 (en) Systems and methods for tracking luggage in a vehicle
US11345312B1 (en) Systems and methods for unlocking a vehicle
US11699345B2 (en) Systems and methods for determining and improving a parking position
US11369017B2 (en) Systems and methods utilizing a vehicle for detecting and responding to a power outage
US11546689B2 (en) Systems and methods for audio processing
US11914064B2 (en) Time-of-flight vehicle user localization distance determination
US11611694B2 (en) Vehicle control systems and methods and optical tethering techniques
CN114954436A (en) Vehicle control system and method
US20230090051A1 (en) Remote vehicle motive control with optimized mobile device localization
US11915185B2 (en) Systems and methods for delivery to a vehicle
CN117998319A (en) Remote vehicle motion control with optimized mobile device positioning
CN115339925A (en) Autonomous vehicle cargo box loading

Legal Events

Date Code Title Description
PB01 Publication