US20230340806A1 - Systems and methods for an identification panel to measure hand static and/or dynamic characteristics - Google Patents
- Publication number
- US20230340806A1 (application US 17/724,775)
- Authority
- US
- United States
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- E—FIXED CONSTRUCTIONS
- E05—LOCKS; KEYS; WINDOW OR DOOR FITTINGS; SAFES
- E05B—LOCKS; ACCESSORIES THEREFOR; HANDCUFFS
- E05B81/00—Power-actuated vehicle locks
- E05B81/54—Electrical circuits
- E05B81/64—Monitoring or sensing, e.g. by using switches or sensors
- E05B81/76—Detection of handle operation; Detection of a user approaching a handle; Electrical switching actions performed by door handles
- E05B81/77—Detection of handle operation; Detection of a user approaching a handle; Electrical switching actions performed by door handles comprising sensors detecting the presence of the hand of a user
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R25/00—Fittings or systems for preventing or indicating unauthorised use or theft of vehicles
- B60R25/01—Fittings or systems for preventing or indicating unauthorised use or theft of vehicles operating on vehicle systems or fittings, e.g. on doors, seats or windscreens
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R25/00—Fittings or systems for preventing or indicating unauthorised use or theft of vehicles
- B60R25/20—Means to switch the anti-theft system on or off
- B60R25/25—Means to switch the anti-theft system on or off using biometry
- B60R25/252—Fingerprint recognition
-
- E—FIXED CONSTRUCTIONS
- E05—LOCKS; KEYS; WINDOW OR DOOR FITTINGS; SAFES
- E05B—LOCKS; ACCESSORIES THEREFOR; HANDCUFFS
- E05B85/00—Details of vehicle locks not provided for in groups E05B77/00 - E05B83/00
- E05B85/10—Handles
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/12—Fingerprints or palmprints
- G06V40/1365—Matching; Classification
Definitions
- the subject matter described herein relates, in general, to granting a user access to an area based upon static characteristics and/or dynamic characteristics that are proprioception-related, and, more particularly, to measuring static characteristics and/or dynamic characteristics of a hand of a user in order to determine whether to grant the user access to a cabin of a vehicle.
- Vehicle anti-theft protection is a significant focus of security research due to the emergence of wireless and keyless access technologies. For instance, an asymmetric cryptographic key transmitted via Bluetooth® can be stolen by a hacker through intervention into a radio channel. While some wireless access systems may utilize near field communication (NFC) which increases security and trust, keys can still be stolen and used by a hacker.
- Conventional keyless access systems for preventing vehicle theft may use fingerprint and/or facial recognition. However, such conventional keyless access systems may be compromised if a hacker has access to fingerprint scans and/or facial scans of an authorized user.
- Example systems and methods relating to a manner of improving access to an interior of an object, such as a cabin of a vehicle, by measuring static characteristics and/or dynamic characteristics of a hand of a user are described herein.
- a system obtains first measurements of static characteristics and/or dynamic characteristics of a hand of a user using sensors (e.g., piezoelectric sensors) as the hand of the user applies force to a handle on an exterior of the object (e.g., a handle on a door of a vehicle).
- the sensors may be located on or integrated into the handle.
- Static characteristics of the hand generally do not change over the first time period and may include a shape of the hand, an orientation of the hand on the handle, a position of the hand on the handle, and/or a gripping load distributed across the handle.
- Dynamic characteristics of the hand may change over the first time period and may include angles at which the handle is pulled by the hand of the user over the first time period, gripping pressures applied to different regions of the handle by different parts of the hand (e.g., different fingers, palm, etc.) over the first time period, pulling forces applied to different regions of the handle by the different parts of the hand over the first time period, and/or overall pulling forces applied to the handle by the hand over the first time period.
- the system identifies the user as being authorized to access the interior of the object based upon the first measurements of the static and/or dynamic characteristics and a stored profile, where the stored profile comprises second measurements of the static and/or dynamic characteristics of the hand generated during a second time period occurring prior to the first time period.
- the system trains a machine learning model based upon the second measurements and provides the first measurements as input to the machine learning model, where the machine learning model outputs a value based upon learned parameters of the machine learning model and the first measurements, where the value is indicative of whether or not the user is authorized to access the interior of the object.
- the value may be either a True/False Boolean variable or a confidence rate.
- the system grants the user access to the interior of the object based on the user being authorized. For instance, the system may unlock a door of a vehicle upon identifying that the user is authorized to access the vehicle or when the confidence rate is above a predefined threshold.
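The decision step described above can be sketched as follows. This is an illustrative sketch only: the function name `should_unlock` and the specific threshold value are assumptions, since the disclosure states only that the model outputs either a True/False Boolean variable or a confidence rate compared against a predefined threshold.

```python
# Hypothetical sketch of the access decision. `model_output` may be a
# Boolean or a confidence rate in [0, 1]; the threshold is an assumed value.
CONFIDENCE_THRESHOLD = 0.9  # assumed predefined threshold

def should_unlock(model_output) -> bool:
    """Return True when the door should be unlocked."""
    if isinstance(model_output, bool):
        return model_output            # True/False Boolean variable
    return model_output >= CONFIDENCE_THRESHOLD  # confidence rate

print(should_unlock(True))   # True
print(should_unlock(0.95))   # True
print(should_unlock(0.5))    # False
```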
- a system for controlling access to an interior of an object includes a processor and a memory communicably coupled to the processor.
- the memory stores instructions that when executed by the processor cause the processor to obtain first measurements that are based upon contact of a hand of a user with a handle located on an exterior of the object over a first time period as the hand applies first force to the handle.
- the instructions further cause the processor to identify the user as being authorized to access the interior of the object based upon the first measurements and a profile for the handle, wherein the profile comprises second measurements that are based upon contact of the hand of the user with the handle over a second time period as the hand applies second force to the handle, wherein the second time period occurs prior to the first time period.
- the instructions additionally cause the processor to grant the user access to the interior of the object based on the user being authorized.
- a non-transitory computer-readable medium for controlling access to an interior of an object and including instructions that when executed by a processor cause the processor to perform one or more functions.
- the instructions cause the processor to obtain first measurements that are based upon contact of a hand of a user with a handle located on an exterior of the object over a first time period as the hand applies first force to the handle.
- the instructions further cause the processor to identify the user as being authorized to access the interior of the object based upon the first measurements and a profile for the user, wherein the profile comprises second measurements that are based upon contact of the hand of the user with the handle over a second time period as the hand applies second force to the handle, wherein the second time period occurs prior to the first time period.
- the instructions additionally cause the processor to grant the user access to the interior of the object based on the user being authorized.
- a method for controlling access to an interior of an object includes obtaining first measurements that are based upon contact of a hand of a user with a handle located on an exterior of the object over a first time period as the hand applies first force to the handle. The method further includes identifying the user as being authorized to access the interior of the object based upon the first measurements and second measurements, wherein the second measurements are based upon contact of the hand of the user with the handle over a second time period as the hand applies second force to the handle, wherein the second time period occurs prior to the first time period. The method also includes granting the user access to the interior of the object based on the user being authorized.
- FIG. 1 illustrates one embodiment of a vehicle within which systems and methods disclosed herein may be implemented.
- FIG. 2 illustrates one embodiment of a vehicle access system that is associated with the vehicle illustrated in FIG. 1 .
- FIG. 3 illustrates one embodiment of a vehicle door that is associated with the vehicle illustrated in FIG. 1 .
- FIGS. 4 A-B illustrate differing views of one embodiment of a handle of a vehicle.
- FIG. 5 depicts an example graph of force applied over a time period to different sensors of the handle illustrated in FIGS. 4 A-B .
- FIGS. 6 A-B illustrate different views of one embodiment of a handle.
- FIGS. 7 A-B illustrate different views of one embodiment of a handle.
- FIGS. 8 A-B illustrate different views of one embodiment of a handle.
- FIGS. 9 A-B illustrate differing views of one embodiment of a handle.
- FIG. 10 illustrates one embodiment of a method that is associated with controlling access to an interior of an object.
- the object is a vehicle
- the interior is a cabin of the vehicle
- the handle is part of a door of the vehicle.
- the handle may have sensors integrated therein, such as force sensors.
- the sensors are tensiometric sensors or piezoelectric sensors.
- a hand of the user grips the handle a plurality of times while opening the door.
- the sensors generate electrical signals which are converted into first measurements.
- the system obtains the first measurements from the sensors.
- the first measurements include static characteristics and/or dynamic characteristics of the hand.
- the static characteristics do not change over time as the user applies the first force to the handle during the first time period.
- the static characteristics may include a shape of the hand, an orientation of the hand on the handle, a position of the hand on the handle, and/or a gripping load distributed across the handle.
- the dynamic characteristics may change over time as the user applies force to the handle during the first time period.
- Dynamic characteristics of the hand may include angles at which the handle is pulled by the hand of the user over the first time period, gripping pressures applied to different regions of the handle by different parts of the hand (e.g., different fingers, palm, etc.) over the first time period, pulling forces (or pushing forces) applied to the different regions of the handle by the hand over the first time period, and/or overall pulling forces (or pushing forces) applied to the handle by the hand over the first time period.
- the dynamic characteristics may also include ratios of measurements, such as a ratio of a first force applied by a first digit of the hand at a timestep to a second force applied by a second digit of the hand at the timestep.
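The ratio characteristic described above can be sketched as a per-timestep computation. The function name and the numeric values are hypothetical; the disclosure only states that a ratio of one digit's force to another digit's force at the same timestep may serve as a dynamic characteristic.

```python
# Illustrative sketch: per-timestep force ratios between two digits.
def digit_force_ratio(forces_a, forces_b):
    """Ratio of digit A's force to digit B's force at each timestep."""
    return [a / b for a, b in zip(forces_a, forces_b)]

index_forces = [2.0, 2.4, 2.6]   # e.g., index finger (units assumed)
middle_forces = [1.0, 1.2, 1.3]  # e.g., middle finger

print(digit_force_ratio(index_forces, middle_forces))  # [2.0, 2.0, 2.0]
```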
- the system generates a profile for the user and/or the handle based upon the first measurements.
- the profile includes average measurements generated by each sensor at each timestep as the user grips the handle the plurality of times while opening the door.
- the profile may also include acceptable deviations for each of the average measurements at each timestep.
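Profile construction from repeated calibration grips can be sketched as follows. The disclosure states that the profile holds average measurements per sensor per timestep plus acceptable deviations; using the sample standard deviation as the acceptable deviation, and the dictionary layout, are assumptions for illustration.

```python
from statistics import mean, stdev

def build_profile(calibration_grips):
    """Average and acceptable deviation per timestep across repeated grips.

    `calibration_grips` is a list of grips, each a list of one sensor's
    readings at successive timesteps (values are hypothetical).
    """
    per_timestep = list(zip(*calibration_grips))  # group readings by timestep
    return [
        {"average": mean(readings), "deviation": stdev(readings)}
        for readings in per_timestep
    ]

grips = [
    [1.0, 2.0, 3.0],  # grip 1: readings at timesteps t0..t2
    [1.2, 2.2, 3.2],  # grip 2
    [0.8, 1.8, 2.8],  # grip 3
]
profile = build_profile(grips)
print(profile[0]["average"])  # 1.0
```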
- the system trains a machine learning model (e.g., a neural network) based upon the first measurements, where the machine learning model includes learned parameters that are based upon the first measurements.
- the system stores the profile and/or the machine learning model in a data store within the vehicle and/or in a data store located in a cloud-computing environment.
- the hand of the user grips the handle of the vehicle while the user attempts to open the door of the vehicle.
- the sensors generate electrical signals which are converted into second measurements.
- the second measurements include static characteristics and/or dynamic characteristics as described above.
- the system obtains the second measurements from the sensors.
- the system identifies the user as being authorized to access the cabin of the vehicle based upon the first measurements and the second measurements. According to embodiments, the system compares the average measurements in the profile to corresponding measurements in the second measurements and identifies the user as being authorized to access the cabin of the vehicle when one or more of the second measurements are within a threshold range of one or more corresponding average measurements in the profile.
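The comparison against the stored profile can be sketched as below. Note a simplifying assumption: the disclosure says authorization may follow when one or more of the second measurements fall within a threshold range of the corresponding averages; this sketch takes the stricter variant where every measurement must match, and the profile layout and values are hypothetical.

```python
# Sketch of the profile comparison: a measurement matches when it falls
# within the acceptable deviation of the stored average measurement.
def is_authorized(profile, measurements):
    """True when every measurement is within range of its stored average
    (a strict variant of the one-or-more test in the text)."""
    return all(
        abs(m - entry["average"]) <= entry["deviation"]
        for entry, m in zip(profile, measurements)
    )

stored = [{"average": 2.0, "deviation": 0.3},
          {"average": 3.5, "deviation": 0.4}]
print(is_authorized(stored, [2.1, 3.3]))  # True: both within tolerance
print(is_authorized(stored, [2.9, 3.3]))  # False: first out of range
```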
- the system provides the second measurements as input to the machine learning model described above.
- the machine learning model outputs a value based upon the learned parameters of the machine learning model and the second measurements, where the value is indicative of whether or not the user is authorized to access the cabin of the vehicle.
- the system identifies the user as being authorized to access the cabin based upon the value.
- the system grants the user access to the cabin of the vehicle upon identifying the user as being authorized. For instance, the system may unlock a door of the vehicle.
- the above-described technologies present various advantages over conventional systems that control access to interiors of objects, such as systems in vehicles that control entry to a cabin (or a trunk) of the vehicle.
- the above-described technologies may also be implemented remotely from a vehicle in a cloud-computing environment.
- the decision as to whether or not the user is authorized to access the vehicle is made in the cloud-computing environment based upon static and/or dynamic characteristics of the user stored at the cloud-computing environment.
- security of the vehicle may be improved as the static characteristics and/or dynamic characteristics of the user are not stored on the vehicle and hence cannot be easily accessed by a hacker.
- the above-described technologies may be useful in shared-vehicle scenarios, such as with a taxi or ride-sharing service.
- the above-described technologies may be useful in preventing an impaired driver (e.g., an inebriated driver) from accessing a vehicle.
- a “vehicle” is any form of motorized transport.
- the vehicle 100 is an automobile. While arrangements will be described herein with respect to automobiles, it will be understood that embodiments are not limited to automobiles.
- the vehicle 100 may be any robotic device or form of motorized transport that, for example, includes sensors to perceive aspects of the surrounding environment, and thus benefits from the functionality discussed herein associated with granting access to the vehicle based upon static and/or dynamic characteristics of a hand of a user.
- the vehicle 100 also includes various elements. It will be understood that in various embodiments it may not be necessary for the vehicle 100 to have all of the elements shown in FIG. 1 .
- the vehicle 100 can have any combination of the various elements shown in FIG. 1 . Further, the vehicle 100 can have additional elements to those shown in FIG. 1 . In some arrangements, the vehicle 100 may be implemented without one or more of the elements shown in FIG. 1 . While the various elements are shown as being located within the vehicle 100 in FIG. 1 , it will be understood that one or more of these elements can be located external to the vehicle 100 . Further, the elements shown may be physically separated by large distances.
- one or more components of the disclosed system can be implemented within a vehicle while further components of the system are implemented within a cloud-computing environment or other system that is remote from the vehicle 100 .
- the vehicle access system 170 may be implemented within a cloud-computing environment or on a server.
- Some of the possible elements of the vehicle 100 are shown in FIG. 1 and will be described along with subsequent figures. However, a description of many of the elements in FIG. 1 will be provided after the discussion of FIGS. 2 - 10 for purposes of brevity of this description. Additionally, it will be appreciated that for simplicity and clarity of illustration, where appropriate, reference numerals have been repeated among the different figures to indicate corresponding or analogous elements. In addition, the discussion outlines numerous specific details to provide a thorough understanding of the embodiments described herein. Those of skill in the art, however, will understand that the embodiments described herein may be practiced using various combinations of these elements.
- the vehicle 100 includes a vehicle access system 170 as well as a vehicle door(s) 180 that are implemented to perform methods and other functions as disclosed herein relating to improving access to an interior of the vehicle by utilizing proprioception-related factors as a hand of a user grips a handle of a door of the vehicle 100 .
- the vehicle access system 170 in various embodiments, is implemented partially within the vehicle 100 , and as a cloud-based service. For example, in one approach, functionality associated with at least one module of the vehicle access system 170 is implemented within the vehicle 100 while further functionality is implemented within a cloud-based computing system.
- the vehicle access system 170 is shown as including a processor 110 from the vehicle 100 of FIG. 1 .
- the processor 110 may be a part of the vehicle access system 170 , the vehicle access system 170 may include a separate processor from the processor 110 of the vehicle 100 , or the vehicle access system 170 may access the processor 110 through a data bus or another communication path.
- the vehicle access system 170 includes a memory 210 that stores a measurement module 220 , a decision module 225 , and a vehicle access module 230 .
- the memory 210 is a random-access memory (RAM), read-only memory (ROM), a hard-disk drive, a flash memory, or other suitable memory for storing the measurement module 220 , the decision module 225 , and the vehicle access module 230 .
- the measurement module 220 , the decision module 225 , and the vehicle access module 230 are, for example, computer-readable instructions that when executed by the processor 110 cause the processor 110 to perform the various functions disclosed herein.
- the vehicle access system 170 as illustrated in FIG. 2 is generally an abstracted form of the vehicle access system 170 as may be implemented between the vehicle 100 and a cloud-computing environment.
- the measurement module 220 generally includes instructions that function to control the processor 110 to receive data inputs from one or more sensors of the vehicle 100 .
- the measurement module 220 is configured to obtain first measurements of static characteristics and/or dynamic characteristics of a hand of a user as the hand applies first force to a handle, located on an exterior of an object, that controls access to an interior of the object.
- the measurement module 220 is configured to obtain second measurements of static characteristics and/or dynamic characteristics of the hand of the user as the hand applies second force to the handle.
- the measurement module 220 may store the first measurements and/or the second measurements in the database 240 (as part of the measurements 245 ) and/or the memory 210 .
- the decision module 225 generally includes instructions that function to control the processor 110 to receive data inputs from the measurement module 220 . As will be explained in greater detail below, the decision module 225 determines whether or not a user is authorized to access an interior of an object based upon measurements and stored measurements. The decision module 225 also generates profiles for users and/or doors based upon measurements obtained during a calibration procedure. The decision module 225 may also generate and/or update machine learning models (described in greater detail below).
- the vehicle access module 230 generally includes instructions that function to control the processor 110 to receive data inputs from the decision module 225 .
- the vehicle access module 230 upon receiving an indication from the decision module 225 , is configured to grant a user access to an interior of an object, such as a cabin of the vehicle 100 .
- the vehicle access module 230 transmits a signal to a door that causes the door to unlock.
- the vehicle access system 170 includes a database 240 .
- the database 240 is, in one embodiment, an electronic data structure stored in the memory 210 or another data store and that is configured with routines that can be executed by the processor 110 for analyzing stored data, providing stored data, organizing stored data, and so on.
- the database 240 stores data used by the measurement module 220 , the decision module 225 , and/or the vehicle access module 230 in executing various functions.
- the database 240 includes measurements 245 .
- the measurements 245 may include first measurements generated as part of a calibration procedure for the vehicle access system 170 as well as second measurements generated during use of the vehicle access system 170 (explained in greater detail below).
- the database 240 includes a user profile 250 for a user.
- the user profile 250 for the user includes measurements of static characteristics and/or dynamic characteristics of a hand of the user (obtained during a calibration procedure) as the hand applies force to a handle that controls access to an interior of an object.
- the measurements are average measurements (described in greater detail below).
- the vehicle access system 170 may utilize the user profile 250 to determine whether or not to unlock the vehicle door 180 of the vehicle 100 .
- the database 240 includes a door profile 255 .
- the door profile 255 includes measurements of static characteristics and/or dynamic characteristics of hands of users that are authorized to open the vehicle door 180 .
- the door profile 255 includes the user profile 250 .
- the database 240 includes a machine learning model 260 .
- the machine learning model 260 includes learned parameters that are based upon measurements of static characteristics and/or dynamic characteristics of a hand of a user as the user applies force to the handle during a calibration procedure. The learned parameters may also be adjusted subsequent to the calibration procedure in order to improve performance of the machine learning model 260 .
- the machine learning model 260 is configured to take current measurements of static characteristics and/or dynamic characteristics of the hand of the user as the user applies force to the handle. Based upon the learned parameters and the current measurements, the machine learning model 260 is configured to output a value (or values) that is/are indicative of whether or not the user is an authorized user of the vehicle.
- the decision module 225 comprises the machine learning model 260 .
- the machine learning model 260 is a classifier model that is configured to classify static characteristics and/or dynamic characteristics of a hand as belonging or not belonging to an authorized user of the vehicle 100 .
- the machine learning model 260 is a neural network comprising nodes and edges connecting the nodes, where the edges are assigned learned weights that are based upon measurements of the static and/or dynamic characteristics of the hand of the user.
- the machine learning model 260 may be a neural network comprising an input layer comprising first node(s), at least one hidden layer comprising second node(s), and an output layer comprising third node(s), where the first node(s) are connected to the second node(s) via first edges and the second node(s) are connected to the third node(s) via second edges, and where the first edges and second edges have learned weights assigned thereto.
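The layer structure described above can be sketched as a minimal feed-forward pass. This is not the patented implementation: the activation functions, layer sizes, and placeholder weights are assumptions standing in for the learned parameters, and the sigmoid output is read as a confidence rate.

```python
import math

# Minimal sketch of the described shape: an input layer, one hidden
# layer, and a single output node, with weights on the connecting edges.
def forward(measurements, hidden_weights, output_weights):
    """One forward pass; the output in (0, 1) acts as a confidence rate."""
    hidden = [
        math.tanh(sum(w * x for w, x in zip(row, measurements)))
        for row in hidden_weights
    ]
    z = sum(w * h for w, h in zip(output_weights, hidden))
    return 1.0 / (1.0 + math.exp(-z))  # sigmoid

# Five inputs (e.g., one force reading per digit) and two hidden nodes;
# the weight values are placeholders, not learned parameters.
hidden_w = [[0.1] * 5, [-0.2] * 5]
output_w = [1.5, -1.0]
confidence = forward([2.0, 1.8, 1.5, 1.2, 0.9], hidden_w, output_w)
print(0.0 < confidence < 1.0)  # True
```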
- the machine learning model 260 comprises a plurality of neural networks, where each neural network is assigned to a particular sensor.
- the plurality of neural networks may include a first neural network that is assigned to a first pressure sensor that makes contact with a first finger (e.g., index) of a hand of the user when the hand grips the handle and a second neural network that is assigned to a second pressure sensor that makes contact with a second finger (e.g., middle) of the hand of the user when the hand grips the handle.
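A per-sensor ensemble like the one described can be sketched as below. The combination rule (averaging the per-sensor confidences) is purely an assumption; the disclosure states only that each neural network is assigned to a particular sensor, and the stand-in models here are hypothetical.

```python
# Sketch: one model per sensor, combined by averaging confidences.
def combined_confidence(per_sensor_models, per_sensor_readings):
    """Average the confidence each per-sensor model assigns to its readings."""
    scores = [
        model(readings)
        for model, readings in zip(per_sensor_models, per_sensor_readings)
    ]
    return sum(scores) / len(scores)

# Hypothetical stand-ins for trained per-digit models.
index_model = lambda readings: 0.75
middle_model = lambda readings: 0.25
print(combined_confidence([index_model, middle_model], [[2.0], [1.1]]))  # 0.5
```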
- the database 240 is illustrated in FIG. 2 as being part of the vehicle access system 170 and the measurements 245 , the user profile 250 , the door profile 255 , and the machine learning model 260 are illustrated in FIG. 2 as being stored in the database 240 , other possibilities are contemplated. According to embodiments, the measurements 245 , the user profile 250 , the door profile 255 , and the machine learning model 260 are stored in the memory 210 . According to embodiments, the database 240 is separate from the vehicle access system 170 .
- the database 240 may be part of a cloud computing environment or a remote server, where the cloud computing environment/remote server is in network communication with the vehicle access system 170 , where the vehicle access system 170 is located within the vehicle 100 .
- the vehicle door 180 is a physical barrier of the vehicle 100 that, when opened, provides access to a cabin of the vehicle 100 and that, when closed, prevents access to the cabin of the vehicle 100 .
- the vehicle door 180 includes a handle 310 .
- the handle 310 is located on an exterior of the vehicle door 180 and is configured to be gripped by a hand of a user as part of opening the vehicle door 180 .
- the user applies force (e.g., pulling force, pushing force, rotational force, etc.) to the handle 310 in order to open the vehicle door 180 .
- the handle 310 moves from a rest position to one or more extended positions in the direction in which the force is applied.
- the vehicle door 180 includes one or more door sensors 320 (referred to now herein as “the door sensors 320 ”) that are configured to generate measurements of static characteristics and/or dynamic characteristics of the hand of the user as the user applies force to the handle 310 as part of a door opening operation of the vehicle door 180 . At least some of the door sensors 320 are integrated into the handle 310 . According to embodiments, some or all of the door sensors 320 may be covered with an elastic material to hide such sensors from view.
- the door sensors 320 may include one or more force sensors 321 (referred to now herein as “the force sensors 321 ”) that are configured to generate measurements of force applied to one or more regions of the handle 310 by the hand of the user over a time period.
- the time period begins when the user begins to apply force to the handle 310 and ends when the handle 310 is extended to a predetermined position.
- the force sensors 321 comprise five force sensors, where each of the five force sensors is configured to make contact with a different digit (e.g., thumb, index finger, middle finger, ring finger, pinky finger) of a hand of the user as the user grips the handle 310 .
- At each timestep in a time period, the five force sensors generate five force measurements as digits of the user apply force to the five force sensors. In an example involving two timesteps, the five force sensors generate five force measurements at a first timestep and an additional five force measurements at a second timestep.
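The per-timestep readout just described amounts to a timesteps-by-five grid of measurements. The sketch below labels each timestep's five readings by digit; the helper name, units, and values are illustrative assumptions.

```python
# Sketch of the per-timestep readout: at each timestep the five digit
# sensors produce five force measurements.
DIGITS = ["thumb", "index", "middle", "ring", "pinky"]

def collect_measurements(raw_timesteps):
    """Label each timestep's five readings by digit."""
    return [dict(zip(DIGITS, readings)) for readings in raw_timesteps]

raw = [
    [1.1, 2.0, 1.8, 1.4, 0.9],  # timestep 1 (units assumed)
    [1.3, 2.2, 1.9, 1.5, 1.0],  # timestep 2
]
series = collect_measurements(raw)
print(series[0]["index"])  # 2.0
```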
- the force sensors 321 are piezoelectric or tensiometric force sensors.
- the door sensors 320 may include one or more pressure sensors 322 (referred to now as “the pressure sensors 322 ”) that are configured to generate measurements of pressure applied to one or more regions of the handle 310 by the hand of the user over the time period.
- the pressure sensors 322 comprise five pressure sensors, where each of the five pressure sensors is configured to make contact with a different digit (e.g., thumb, index finger, middle finger, ring finger, pinky finger) of the hand of the user as the user grips the handle 310 .
- the five pressure sensors generate five pressure measurements as digits of the user apply pressure to one of the five pressure sensors.
- the pressure sensors 322 are piezoelectric or tensiometric pressure sensors.
- the door sensors 320 may include one or more angle sensors 323 (referred to now as “the angle sensors 323 ”) that are configured to generate measurements of angles at which one or more regions of the handle 310 are pulled (or pushed) by the hand of the user over the time period. The measurements of the angles may be made with respect to a reference angle.
- the angle sensors 323 comprise five angle sensors, where each of the five angle sensors is configured to make contact with a different digit (e.g., thumb, index finger, middle finger, ring finger, pinky finger) of a hand of the user as the user grips the handle 310 .
- the five angle sensors generate five angle measurements as digits of the user apply force to one of the five angle sensors.
- pulling force vectors can be obtained by analyzing pressure measurements from high-precision pressure sensors.
- the door sensors 320 may include one or more rotation sensors 324 that are configured to measure angular rotations of the handle 310 during the time period as the hand of the user rotates the handle 310 .
- the one or more rotation sensors 324 may alternatively be referred to as one or more angular encoders or one or more angular sensors.
- the one or more rotation sensors 324 may be torque based or position based.
- the door sensors 320 may include one or more temperature sensors 325 (referred to now as “the temperature sensors 325 ”) that are configured to measure a temperature of one or more regions of the hand of the user as the hand of the user applies force to the handle 310 while gripping the handle 310 over the time period.
- the door sensors 320 may include one or more optical sensors 326 that are configured to determine a position, orientation, and/or shape of a hand of the user on the handle 310 as the user applies force to the handle 310 over the time period.
- the door sensors 320 may include one or more cameras 327 that are configured to capture images of the hand of the user as the user applies force to the handle 310 over the time period.
- the vehicle door 180 may include one or more miscellaneous sensors 328 (referred to herein now as “the miscellaneous sensors 328 ”) that are configured to generate measurements in addition to the measurements captured by the sensors 321 - 327 described above.
- the miscellaneous sensors 328 may include a fingerprint scanner that is configured to capture one or more fingerprints of the user as the user grips the handle 310 while applying force to the handle 310 .
- the miscellaneous sensors 328 may include a handprint scanner that is configured to capture a handprint of the user as the user grips the handle 310 while applying force to the handle 310 .
- the vehicle door 180 includes a lock/unlock mechanism 330 that is configured to lock/unlock the vehicle door 180 .
- the vehicle access system 170 communicates with the lock/unlock mechanism 330 in order to unlock the vehicle door 180 based upon identifying a user as an authorized user of the vehicle using proprioception-related factors.
- while the handle 310 , the door sensors 320 , and the lock/unlock mechanism 330 are described above as being part of the vehicle door 180 , other possibilities are contemplated.
- the handle 310 , the door sensors 320 , and the lock/unlock mechanism 330 may be part of a trunk of the vehicle 100 and as such, the handle 310 , the door sensors 320 , and the lock/unlock mechanism 330 may provide access to a trunk of the vehicle 100 .
- the handle 401 may be the handle 310 .
- a pinky finger 403 , a ring finger 404 , a middle finger 405 , an index finger 406 , and a thumb 407 of the hand 402 grip a first sensor 408 , a second sensor 409 , a third sensor 410 , a fourth sensor 411 , and a fifth sensor 412 , respectively, of the handle 401 .
- the sensors 408 - 412 comprise the force sensors 321 , the pressure sensors 322 , the angle sensors 323 , the temperature sensors 325 , the optical sensors 326 , and/or the miscellaneous sensors 328 .
- the rotation sensors 324 may be integrated into the handle 401 .
- a lens of the camera 327 may be oriented towards the hand 402 of the user.
- the first sensor 408 comprises one or more of a force sensor, a pressure sensor, a fingerprint sensor, an optical sensor, and/or a thermal sensor.
- each sensor in the sensors 408 - 412 may include multiple sensors of different types.
- FIG. 4 B illustrates a side view of the handle 401 depicted in FIG. 4 A .
- the y-axis represents force (e.g., measured in Newtons) and the x-axis represents time (e.g., measured in milliseconds).
- the y-axis may represent pressure (Pi), relative force (Fi/sum(Fi)), relative pressure (Pi/sum(Pi)), a derived pressure, or a derived force.
- the x-axis includes a plurality of timesteps (T 1 . . . T N ). Each line in the graph is indicative of a force applied to one of the sensors 408 - 412 over time.
- the vehicle access system 170 may utilize data in the graph in order to generate a profile for the vehicle door 180 and/or the user and/or to train the machine learning model 260 .
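The derived y-axis quantities mentioned above (e.g., relative force Fi/sum(Fi) or relative pressure Pi/sum(Pi)) can be sketched with a small helper; the function name is an assumption for illustration:

```python
def relative_values(samples):
    """Normalize a timestep's per-sensor readings so they sum to 1.0,
    yielding relative force Fi/sum(Fi) or relative pressure Pi/sum(Pi)."""
    total = sum(samples)
    if total == 0:
        return [0.0] * len(samples)
    return [s / total for s in samples]
```

For example, readings of 2.0, 2.0, 4.0, 1.0, and 1.0 N normalize to 0.2, 0.2, 0.4, 0.1, and 0.1.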
- In FIG. 6 A , an example handle 602 that is gripped by the hand 402 of a user is illustrated.
- the handle 602 may be the handle 310 .
- the pinky finger 403 , the ring finger 404 , the middle finger 405 , the index finger 406 , and the thumb 407 of the hand 402 grip a first sensor 603 , a second sensor 604 , a third sensor 605 , a fourth sensor 606 , and a fifth sensor 607 , respectively, of the handle 602 .
- Each of the sensors 603 - 607 is depressible by the user. A degree to which a sensor in the sensors 603 - 607 is depressed is indicative of a force applied to the sensor.
- FIG. 6 B illustrates a side view of the handle 602 depicted in FIG. 6 A .
- In FIG. 7 A , an example handle 702 that is gripped by the hand 402 of a user is illustrated.
- the handle 702 may be the handle 310 .
- the pinky finger 403 , the ring finger 404 , the middle finger 405 , and the index finger 406 grip a first sensor 703 and the thumb 407 grips a second sensor 704 .
- the first sensor 703 and the second sensor 704 may be included in the door sensors 320 .
- FIG. 7 B illustrates a side view of the handle 702 depicted in FIG. 7 A .
- In FIG. 8 A , an example handle 802 that is gripped by the hand 402 of a user is illustrated.
- the handle 802 may be the handle 310 .
- the pinky finger 403 , the ring finger 404 , the middle finger 405 , and the index finger 406 grip a first sensor 803 and the thumb 407 grips a second sensor 804 .
- the first sensor 803 and the second sensor 804 are configured to make contact with an entirety of the fingers 403 - 406 and the thumb 407 , respectively.
- the first sensor 803 and the second sensor 804 may be included in the door sensors 320 .
- FIG. 8 B illustrates a side view of the handle 802 depicted in FIG. 8 A .
- In FIG. 9 A , an example handle 902 that is gripped by the hand 402 of the user is illustrated.
- the handle 902 may be the handle 310 .
- the pinky finger 403 , the ring finger 404 , the middle finger 405 , the index finger 406 , and the thumb 407 of the hand 402 grip the handle 902 , where a plurality of sensors 904 (indicated by circles in FIG. 9 A ) are distributed throughout the handle 902 .
- the plurality of sensors 904 may be included in the door sensors 320 .
- FIG. 9 B illustrates a side view of the handle 902 depicted in FIG. 9 A .
- while the vehicle access system 170 is described below in the context of the vehicle 100 , it is to be understood that the vehicle access system 170 is operable in other contexts.
- the vehicle access system 170 may be utilized in the context of a door on an outside of a building (i.e., an object) or a door to a room (i.e., an object) within the building.
- the vehicle access system 170 may be utilized to control access to a relatively small area, such as a drawer, a cabinet, or a safe.
- while the vehicle access system 170 is described below as controlling access to an interior of an object, the vehicle access system 170 may control access to areas which are not enclosed overhead.
- the vehicle access system 170 may control access to a fenced-in area having a door that has a handle mounted thereon.
- a calibration procedure for the handle 310 is performed.
- the calibration procedure occurs at a time at which a user becomes authorized to access a vehicle, such as when the user purchases the vehicle 100 .
- the user grips the handle 310 at least one time and applies first force to the handle 310 in order to open the vehicle door 180 at least one time.
- the door sensors 320 generate first measurements of static characteristics and/or dynamic characteristics of the hand of the user.
- the static characteristics of the hand of the user do not change while the user applies the first force to the vehicle door 180 as part of opening the vehicle door 180 .
- the static characteristics may include a shape of the hand, a position of the hand on the handle 310 , an orientation of the hand on the handle 310 , or a gripping load distributed across digits and a palm of the hand.
- the dynamic characteristics of the hand of the user may change over time while the user applies the first force to the vehicle door 180 as part of opening the vehicle door 180 .
- the dynamic characteristics may include angles at which the handle 310 is pulled by the hand of the user over the first time period, gripping pressures applied to different regions of the handle 310 by different parts of the hand (e.g., different digits of the hand, palm, etc.) over the first time period, pulling forces applied to different regions of the handle 310 by the hand over the first time period, pushing forces applied to different regions of the handle 310 by the hand over the first time period, overall gripping pressures applied to the handle 310 over the first time period, and/or overall pulling forces applied to the handle 310 by the hand over the first time period.
- the dynamic characteristics may also include rotational measurements of the handle 310 over time when the handle 310 is able to be rotated (e.g., a knob).
- the first measurements of the static and/or dynamic characteristics may include direct measurements and derived measurements.
- Direct measurements are measurements that are generated directly by the door sensors 320 .
- direct measurements include a first measurement of force applied to a first region of the handle 310 by a first digit of a hand of the user and a second measurement of force applied to a second region of the handle 310 by a second digit of the hand of the user.
- Derived measurements are measurements that are generated from the direct measurements.
- derived measurements include a ratio of the first measurement to the second measurement.
- derived measurements include a sum of the first measurement and the second measurement.
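A minimal sketch of these derived measurements, assuming two direct force readings as input (the function and key names are illustrative, not from the patent):

```python
def derive_measurements(first, second):
    """Produce derived measurements from two direct measurements:
    their ratio and their sum."""
    return {"ratio": first / second, "sum": first + second}
```

With direct measurements of 6.0 N and 3.0 N, for instance, the derived ratio is 2.0 and the derived sum is 9.0 N.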
- the first measurements may be homogeneous or heterogeneous.
- the first measurements include pressure measurements.
- the first measurements include a combination of force measurements and pressure measurements.
- the first measurements include force measurements and measurements derived from the force measurements, such as ratios of the force measurements to one another.
- the door sensors 320 generate the first measurements over the first time period.
- the first measurements are generated at a plurality of timesteps.
- the plurality of timesteps are 5-200 ms apart, such as 50 ms apart.
- the first measurements include a first measurement taken at a first timestep, a second measurement taken at a second timestep, and a third measurement taken at a third timestep, where the first timestep, the second timestep, and the third timestep occur chronologically.
- the door sensors 320 begin to generate the first measurements when the hand of the user makes contact with the handle 310 .
- the door sensors 320 finish generating the first measurements when the handle 310 reaches a position that would open the vehicle door 180 if the vehicle door 180 were unlocked.
- the door sensors 320 generate the first measurement when the hand makes contact with the handle 310 , the second measurement when the user is pulling the handle 310 , and the third measurement when the handle 310 reaches the position that would open the vehicle door 180 if the vehicle door 180 were unlocked.
- a number of timesteps may be dynamic and based upon how long the user takes to open the vehicle door 180 .
- the number of timesteps may be predetermined. When the number of timesteps is predetermined and the user exceeds the predetermined number of timesteps while opening the vehicle door 180 , the measurement module 220 may truncate measurements occurring at timesteps that go beyond the predetermined number of timesteps.
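The truncation behavior described above can be sketched as follows (the function name and list layout are assumptions):

```python
def truncate_timesteps(measurement_subsets, predetermined_count):
    """Keep only the subsets from the first predetermined_count
    timesteps, discarding measurements occurring beyond that number."""
    return measurement_subsets[:predetermined_count]
```

A sequence of five per-timestep subsets truncated to a predetermined count of three keeps only the first three; a shorter sequence is left unchanged.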
- the user may open the vehicle door 180 several times during the calibration procedure.
- the first measurements may include a plurality of measurement sets, where each set includes measurements from a different instance of the user opening the vehicle door 180 as part of the calibration procedure.
- the first measurements may include a first measurement set that includes measurements generated when the vehicle door 180 is opened a first time and a second measurement set that includes measurements generated when the vehicle door 180 is opened a second time.
- a measurement set comprises a plurality of measurement subsets, where each of the subsets corresponds to a different timestep while the vehicle door 180 is being opened, and where each measurement in a measurement subset is generated at the same timestep.
- a first measurement set comprises a first measurement subset for the first timestep and a second measurement subset for the second timestep.
- the first measurement subset comprises a first measurement generated by the first sensor at the first timestep and a second measurement generated by the second sensor at the first timestep.
- the second measurement subset comprises a third measurement generated by the first sensor at the second timestep and a fourth measurement generated by the second sensor at the second timestep.
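The set/subset layout described above can be sketched as nested Python data; sensor names and values are hypothetical:

```python
# A measurement set: one subset per timestep, each subset holding one
# reading per sensor generated at that timestep.
first_measurement_set = [
    {"sensor_1": 3.1, "sensor_2": 2.4},  # subset for the first timestep
    {"sensor_1": 3.6, "sensor_2": 2.9},  # subset for the second timestep
]

def subset_at(measurement_set, timestep_index):
    """Return the measurement subset generated at a given timestep."""
    return measurement_set[timestep_index]
```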
- the measurement module 220 of the vehicle access system 170 obtains the first measurements from the door sensors 320 .
- the decision module 225 may obtain the first measurements from the measurement module 220 and may generate the user profile 250 for the user based upon the first measurements.
- the decision module 225 determines average measurements for each sensor at each timestep based upon the first measurements (more specifically, measurement sets comprised by the first measurements) and determines acceptable deviations from the average measurements based upon the first measurements.
- the decision module 225 computes a first average measurement and a second average measurement for the first sensor and the second sensor, respectively, where the first average measurement is an average of measurements generated by the first sensor and the second average measurement is an average of measurements generated by the second sensor.
- the acceptable deviations may be standard deviations or manually defined deviations.
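One way to sketch the per-sensor, per-timestep averages and deviations, assuming measurement sets are indexed as sets[set][timestep][sensor] (an assumed layout, not the patent's specification):

```python
from statistics import mean, stdev

def profile_from_sets(measurement_sets):
    """For every (timestep, sensor) pair, compute the average
    measurement across calibration sets and a standard deviation to
    serve as the acceptable deviation."""
    n_timesteps = len(measurement_sets[0])
    n_sensors = len(measurement_sets[0][0])
    profile = []
    for t in range(n_timesteps):
        per_sensor = []
        for s in range(n_sensors):
            values = [m_set[t][s] for m_set in measurement_sets]
            per_sensor.append((mean(values), stdev(values)))
        profile.append(per_sensor)
    return profile
```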
- the decision module 225 may generate the door profile 255 based upon the first measurements.
- the door profile 255 may include the user profile 250 , as well as profiles for other users that are authorized to access the vehicle 100 .
- the decision module 225 of the vehicle access system 170 trains the machine learning model 260 based upon the first measurements.
- the first measurements may include a plurality of measurement sets, where each of the sets corresponds to a different instance of the user opening the vehicle door 180 as part of the calibration procedure.
- the plurality of measurement sets may serve as training data for the machine learning model 260 (e.g., as part of a supervised learning procedure).
- the decision module 225 may also utilize other data as training data (e.g., measurements of other users applying force to handles).
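Assembling such training data might be sketched as follows; the labeling scheme (1 for the authorized user, 0 for other users) and all names are assumptions for illustration, not the patent's specification:

```python
def flatten(measurement_set):
    """Flatten a per-timestep list of sensor readings into one
    feature vector."""
    return [value for subset in measurement_set for value in subset]

def build_training_data(user_sets, other_sets):
    """Label the authorized user's calibration sets 1 and other
    users' sets 0, yielding supervised training examples."""
    features = [flatten(s) for s in user_sets + other_sets]
    labels = [1] * len(user_sets) + [0] * len(other_sets)
    return features, labels
```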
- the door sensors 320 of the handle 310 include a first pressure sensor, a second pressure sensor, a third pressure sensor, a fourth pressure sensor, and a fifth pressure sensor.
- the first pressure sensor, the second pressure sensor, the third pressure sensor, the fourth pressure sensor, and the fifth pressure sensor are depressed by respective digits of the hand of the user.
- the depressions of the first through fifth sensors cause electrical signals to be generated which are indicative of a first pressure (P 1 ), a second pressure (P 2 ), a third pressure (P 3 ), a fourth pressure (P 4 ), and a fifth pressure (P 5 ), where each of the pressures is generated based upon a depression from one of the digits.
- P 1 through P 5 represent maximum pressures obtained during the first time period.
- the decision module 225 generates an array that stores P 1 through P 5 and stores the array as part of user profile 250 and/or the door profile 255 in the database 240 .
- the decision module 225 may sum P 1 through P 5 to obtain an overall pressure and store the overall pressure as part of the array.
- the decision module 225 may also compute relative pressures between the pressures P 1 through P 5 and store the relative pressures as part of the array.
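A sketch of the array described above, assuming the five maximum pressures as input (names and values are illustrative):

```python
def build_pressure_array(max_pressures):
    """Store P1..P5, the overall pressure (their sum), and the
    relative pressures Pi/sum(Pi) in one flat array."""
    overall = sum(max_pressures)
    relative = [p / overall for p in max_pressures]
    return list(max_pressures) + [overall] + relative
```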
- the door sensors 320 of the handle 310 include a first pressure sensor, a second pressure sensor, a third pressure sensor, a fourth pressure sensor, and a fifth pressure sensor.
- the first pressure sensor, the second pressure sensor, the third pressure sensor, the fourth pressure sensor, and the fifth pressure sensor are depressed by respective digits of the hand of the user.
- the depressions of the first through fifth sensors cause electrical signals to be generated which are indicative of a first pressure (P 1 ), a second pressure (P 2 ), a third pressure (P 3 ), a fourth pressure (P 4 ) and a fifth pressure (P 5 ), where each of the pressures is generated based upon a depression from one of the digits.
- the decision module 225 generates an N by M array, where N is a number of sensors and M is a number of timesteps at which measurements are generated by the sensors.
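A sketch of the N by M arrangement, assuming the raw readings arrive as per-timestep lists of N sensor values (an assumed input layout):

```python
def build_measurement_matrix(readings_by_timestep):
    """Arrange readings into an N x M array: N sensors (rows) by M
    timesteps (columns), from per-timestep lists of sensor readings."""
    n_sensors = len(readings_by_timestep[0])
    return [
        [timestep[s] for timestep in readings_by_timestep]
        for s in range(n_sensors)
    ]
```

Three timesteps of two sensor readings each, for example, become a 2 by 3 array with one row per sensor.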
- the user wishes to open the vehicle door 180 .
- the hand of the user applies second force to the handle 310 as the user attempts to open the vehicle door 180 .
- the door sensors 320 generate second measurements based upon contact of the hand with the handle 310 as the hand applies the second force.
- Types of the second measurements generally correspond to types of the first measurements.
- the first measurements and the second measurements are pressure measurements.
- the measurement module 220 obtains the second measurements.
- the decision module 225 obtains the second measurements from the measurement module 220 and identifies the user as being authorized (or unauthorized) to access the cabin of the vehicle 100 based upon the first measurements and the second measurements. According to one embodiment, the decision module 225 accesses the machine learning model 260 . For instance, the decision module 225 may load the machine learning model 260 into the memory 210 from the database 240 . The decision module 225 provides the second measurements as input to the machine learning model 260 .
- the machine learning model 260 outputs at least one value based upon the learned parameters and the second measurements, where the at least one value is indicative of whether static characteristics and/or dynamic characteristics of the hand of the user measured during the second time period match (or are substantially similar to) static characteristics and/or dynamic characteristics of the hand as measured during the first time period as part of the calibration procedure.
- the decision module 225 identifies the user as being authorized to access the vehicle 100 based upon the at least one value.
- the decision module 225 compares each of the second measurements to corresponding measurements within the user profile 250 .
- the decision module 225 may compare a first measurement in the second measurements with a corresponding measurement in the user profile 250 , where the first measurement and the corresponding measurement may be generated by the same sensor of the handle 310 at corresponding timesteps in the first time period and the second time period.
- the decision module 225 may identify the user as being authorized to access the vehicle 100 .
- the decision module 225 may compare each of the second measurements with corresponding measurements in the user profile 250 in order to identify the user as being authorized.
- the decision module 225 identifies the user as being authorized when a threshold number of measurements are within corresponding threshold ranges of measurements in the user profile 250 .
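The threshold-based matching rule might be sketched as follows, assuming a flat list of measurements and a profile of (average, acceptable deviation) pairs (an assumed representation, not the patent's):

```python
def is_authorized(second_measurements, profile, threshold_count):
    """Authorize when at least threshold_count measurements fall
    within the profile's acceptable range (average +/- deviation)."""
    matches = sum(
        1
        for value, (average, deviation) in zip(second_measurements, profile)
        if abs(value - average) <= deviation
    )
    return matches >= threshold_count
```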
- Upon receiving an indication from the decision module 225 , the vehicle access module 230 grants the user access to the cabin (or the trunk) of the vehicle based upon identifying the user as being authorized. In an example, the vehicle access module 230 transmits a signal to the lock/unlock mechanism 330 which causes the vehicle door 180 to be unlocked. As the vehicle door 180 is now unlocked, the user may apply further force to pull open the door. The user may then enter the cabin of the vehicle 100 and/or access the trunk. In another example, the vehicle access module 230 transmits a signal to a motor within the vehicle door 180 which causes the motor to generate force which opens the vehicle door 180 (without requiring the user to apply further force).
- the measurement module 220 stores the second measurements in the memory 210 and/or the database 240 .
- the decision module 225 may update the machine learning model 260 using the second measurements. For instance, decision module 225 may retrain the machine learning model 260 based upon the second measurements.
- the first measurements are generated by the door sensors 320 of the vehicle as described above.
- the measurement module 220 transmits the first measurements over a network connection (e.g., a wireless connection) to a server computing device, such as a server in a cloud-computing environment.
- the server computing device (which may comprise the decision module 225 ) generates the user profile 250 , the door profile 255 , and/or the machine learning model 260 as described above.
- the second measurements are generated by the door sensors 320 of the vehicle 100 as described above.
- the measurement module 220 transmits the second measurements over the network connection to the server computing device.
- the server computing device (which may comprise the decision module 225 ) identifies the user as being authorized (or unauthorized) to access the vehicle 100 using the above-described processes. Upon identifying the user as being authorized, the server computing device transmits a message over the network connection to the vehicle access module 230 . Upon receiving the message, the vehicle access module 230 grants the user access to the cabin and/or trunk of the vehicle 100 (e.g., by unlocking the vehicle door 180 ).
- the decision module 225 of the vehicle access system 170 performs the above-described processes to generate a second user profile for a second user of the vehicle 100 (e.g., a family member of the user, an employee who works at the same organization as the user, etc.).
- the decision module 225 may store the second user profile as part of the door profile 255 .
- the decision module 225 may generate a second machine learning model for the second user using the above-described processes.
- the decision module 225 may utilize the second user profile and/or the second machine learning model to identify the second user as an authorized user of the vehicle 100 . In this manner, multiple users may employ the above-described technologies in order to access the vehicle 100 .
- the vehicle 100 is part of a fleet of shared vehicles, such as a taxi service.
- the decision module 225 generates the user profile 250 and/or the machine learning model 260 as described above and transmits the user profile 250 and/or the machine learning model 260 to a server computing device, such as a server in a cloud-computing environment.
- the server computing device generates the user profile 250 and/or the machine learning model 260 using the first measurements as described above.
- the user profile 250 and/or the machine learning model 260 are stored within a datastore of the server computing device.
- the hand of the user makes contact with a second handle of a second vehicle, where the user is authorized to enter the second vehicle.
- the second vehicle is part of the fleet of shared vehicles and is of a similar make as the first vehicle.
- Sensor systems of the second vehicle generate second measurements as described above.
- the second vehicle transmits the second measurements to the server computing device.
- the server computing device identifies the user as being authorized (or unauthorized) to access the second vehicle using the above-described processes.
- Upon identifying the user as being authorized, the server computing device transmits a message over the network connection to the second vehicle.
- the second vehicle grants the user access to the cabin and/or trunk of the second vehicle (e.g., by unlocking a vehicle door of the second vehicle).
- the door sensors 320 begin to generate the second measurements when the handle 310 is moved from a rest position (i.e., a default position where the handle 310 rests when a hand of the user is not in contact with the handle 310 ) while being gripped by the hand of the user.
- the decision module 225 identifies the user as being authorized to access the vehicle 100 using the processes described above.
- the first position is located in a direction of a pulling force applied to the handle 310 by the user.
- the vehicle access module 230 transmits a signal to the lock/unlock mechanism 330 causing the vehicle door 180 to unlock.
- the second position is located in the direction of the pulling force applied by the user to the handle 310 . In this manner, the vehicle access system 170 is able to provide seamless entry to the vehicle 100 from the user perspective.
- the handle 310 includes a fingerprint scanner.
- the fingerprint scanner generates a first fingerprint scan of digits of the user.
- the measurement module 220 may store the first fingerprint scan as part of the user profile 250 .
- the decision module 225 may train the machine learning model 260 further based upon the first fingerprint scan. For instance, the decision module 225 may convert the first fingerprint scan into first values that can be used to train the machine learning model 260 (in addition to the first measurements described above).
- the fingerprint scanner generates a second fingerprint scan of the digits of the user.
- the decision module 225 may utilize the second fingerprint scan (in addition to the second measurements described above) to identify the user as being authorized to access the vehicle 100 . For instance, the decision module 225 may convert the second fingerprint scan into second values that are provided as input to the machine learning model 260 (in addition to the second measurements described above).
- the vehicle access system 170 may utilize additional vehicle entry means in addition to those described above.
- if the decision module 225 mistakenly fails to identify the user as being authorized, the decision module 225 may rely on conventional approaches to grant the user access to the vehicle 100 , such as wireless entry through a key fob, a mechanical key, and/or a passcode.
- the decision module 225 may utilize data obtained from the rotation sensors 324 , the temperature sensors 325 , the optical sensors 326 , the camera 327 and/or the miscellaneous sensors 328 in determining whether or not the user is authorized to access the vehicle 100 .
- FIG. 10 illustrates a flowchart of a method 1000 that is associated with controlling access to an interior of an object.
- the method 1000 will be discussed from the perspective of the vehicle access system 170 of FIGS. 1 and 2 . While the method 1000 is discussed in combination with the vehicle access system 170 , it should be appreciated that the method 1000 is not limited to being implemented within the vehicle access system 170 ; rather, the vehicle access system 170 is one example of a system that may implement the method 1000 .
- the vehicle access system 170 obtains first measurements that are based upon contact of a hand of a user with a handle located on an exterior (e.g., a door) of an object (e.g., a vehicle) over a first time period as the hand applies first force to the handle.
- the measurements may include static characteristics and/or dynamic characteristics as described above.
- the vehicle access system 170 identifies the user as being authorized to access an interior (e.g., a cabin or a trunk) of the vehicle based upon second measurements.
- the second measurements are based upon contact of the hand of the user with the handle over a second time period as the hand applies second force to the handle.
- the second measurements may include the static characteristics and/or dynamic characteristics (taken during the second time period) as described above. The second time period occurs prior to the first time period.
- the vehicle access system 170 grants the user access to the interior (e.g., the cabin or the trunk) of the object (e.g., the vehicle) based on the user being authorized. For instance, the vehicle access system 170 may unlock the door of the vehicle such that the user may enter the vehicle 100 . In an example, the vehicle access system 170 grants the user access to the interior of the object responsive to the user being authorized. In another example, the vehicle access system 170 grants the user access to the interior of the object after a period of time elapses.
- FIG. 1 will now be discussed in full detail as an example environment within which the system and methods disclosed herein may operate.
- the vehicle 100 is configured to switch selectively between an autonomous mode, one or more semi-autonomous operational modes, and/or a manual mode. Such switching can be implemented in a suitable manner, now known or later developed.
- “Manual mode” means that all of or a majority of the navigation and/or maneuvering of the vehicle is performed according to inputs received from a user (e.g., human driver).
- the vehicle 100 can be a conventional vehicle that is configured to operate in only a manual mode.
- the vehicle 100 is an autonomous vehicle.
- “autonomous vehicle” refers to a vehicle that operates in an autonomous mode.
- Autonomous mode refers to navigating and/or maneuvering the vehicle 100 along a travel route using one or more computing systems to control the vehicle 100 with minimal or no input from a human driver.
- the vehicle 100 is highly automated or completely automated.
- the vehicle 100 is configured with one or more semi-autonomous operational modes in which one or more computing systems perform a portion of the navigation and/or maneuvering of the vehicle along a travel route, and a vehicle operator (i.e., driver) provides inputs to the vehicle to perform a portion of the navigation and/or maneuvering of the vehicle 100 along a travel route.
- the vehicle 100 can include the sensor system 120 .
- the sensor system 120 can include one or more sensors.
- “Sensor” means any device, component, and/or system that can detect and/or sense something.
- the one or more sensors can be configured to detect, and/or sense in real-time.
- real-time means a level of processing responsiveness that a user or system senses as sufficiently immediate for a particular process or determination to be made, or that enables the processor to keep up with some external process.
- the sensors can work independently from each other.
- two or more of the sensors can work in combination with each other.
- the two or more sensors can form a sensor network.
- the sensor system 120 and/or the one or more sensors can be operatively connected to the processor(s) 110 and/or another element of the vehicle 100 (including any of the elements shown in FIG. 1 ).
- the sensor system 120 can acquire data of at least a portion of the external environment of the vehicle 100 (e.g., nearby vehicles).
- the sensor system 120 can include any suitable type of sensor. Various examples of different types of sensors will be described herein. However, it will be understood that the embodiments are not limited to the particular sensors described.
- the sensor system 120 can include one or more vehicle sensors 121 .
- the vehicle sensor(s) 121 can detect, determine, and/or sense information about the vehicle 100 itself. In one or more arrangements, the vehicle sensor(s) 121 can be configured to detect, and/or sense position and orientation changes of the vehicle 100 , such as, for example, based on inertial acceleration.
- the vehicle sensor(s) 121 can include one or more accelerometers, one or more gyroscopes, an inertial measurement unit (IMU), a dead-reckoning system, a global navigation satellite system (GNSS), a global positioning system (GPS), a navigation system 147 , and/or other suitable sensors.
- the vehicle sensor(s) 121 can be configured to detect, and/or sense one or more characteristics of the vehicle 100 .
- the vehicle sensor(s) 121 can include a speedometer to determine a current speed of the vehicle 100 .
- the sensor system 120 can include one or more environment sensors 122 configured to acquire, and/or sense driving environment data.
- Driving environment data includes data or information about the external environment in which an autonomous vehicle is located or one or more portions thereof.
- the one or more environment sensors 122 can be configured to detect, quantify and/or sense obstacles in at least a portion of the external environment of the vehicle 100 and/or information/data about such obstacles. Such obstacles may be stationary objects and/or dynamic objects.
- the one or more environment sensors 122 can be configured to detect, measure, quantify and/or sense other things in the external environment of the vehicle 100 , such as, for example, lane markers, signs, traffic lights, traffic signs, lane lines, crosswalks, curbs proximate the vehicle 100 , off-road objects, etc.
- sensors of the sensor system 120 will be described herein.
- the example sensors may be part of the one or more environment sensors 122 and/or the one or more vehicle sensors 121 . However, it will be understood that the embodiments are not limited to the particular sensors described.
- the sensor system 120 can include one or more radar sensors 123 , one or more LIDAR sensors 124 , one or more sonar sensors 125 , and/or one or more cameras 126 .
- the one or more cameras 126 can be high dynamic range (HDR) cameras or infrared (IR) cameras.
- the vehicle 100 can include an input system 130 .
- An “input system” includes any device, component, system, element or arrangement or groups thereof that enable information/data to be entered into a machine.
- the input system 130 can receive an input from a vehicle passenger (e.g., a driver or a passenger).
- the vehicle 100 can include an output system 135 .
- An “output system” includes any device, component, or arrangement or groups thereof that enable information/data to be presented to a vehicle passenger (e.g., a person, a vehicle passenger, etc.).
- the vehicle 100 can include one or more vehicle systems 140 .
- Various examples of the one or more vehicle systems 140 are shown in FIG. 1 .
- the vehicle 100 can include more, fewer, or different vehicle systems. It should be appreciated that although particular vehicle systems are separately defined, each or any of the systems or portions thereof may be otherwise combined or segregated via hardware and/or software within the vehicle 100 .
- the vehicle 100 can include a propulsion system 141 , a braking system 142 , a steering system 143 , throttle system 144 , a transmission system 145 , a signaling system 146 , and/or a navigation system 147 .
- Each of these systems can include one or more devices, components, and/or a combination thereof, now known or later developed.
- the navigation system 147 can include one or more devices, applications, and/or combinations thereof, now known or later developed, configured to determine the geographic location of the vehicle 100 and/or to determine a travel route for the vehicle 100 .
- the navigation system 147 can include one or more mapping applications to determine a travel route for the vehicle 100 .
- the navigation system 147 can include a global positioning system, a local positioning system or a geolocation system.
- the vehicle 100 can include one or more actuators 150 .
- the actuators 150 can be any element or combination of elements operable to modify, adjust and/or alter one or more of the vehicle systems 140 or components thereof responsive to receiving signals or other inputs from the processor(s) 110 and/or the measurement module 220 , the decision module 225 , or the vehicle access module 230 . Any suitable actuator can be used.
- the one or more actuators 150 can include motors, pneumatic actuators, hydraulic pistons, relays, solenoids, and/or piezoelectric actuators, just to name a few possibilities.
- the vehicle 100 can include one or more modules, at least some of which are described herein.
- the modules can be implemented as computer-readable program code that, when executed by a processor 110 , implement one or more of the various processes described herein.
- One or more of the modules can be a component of the processor(s) 110 , or one or more of the modules can be executed on and/or distributed among other processing systems to which the processor(s) 110 is operatively connected.
- the modules can include instructions (e.g., program logic) executable by one or more processor(s) 110 .
- one or more of the modules described herein can include artificial or computational intelligence elements, e.g., neural network, fuzzy logic or other machine learning algorithms. Further, in one or more arrangements, one or more of the modules can be distributed among a plurality of the modules described herein. In one or more arrangements, two or more of the modules described herein can be combined into a single module.
- each block in the flowcharts or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s).
- the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved.
- the systems, components and/or processes described above can be realized in hardware or a combination of hardware and software and can be realized in a centralized fashion in one processing system or in a distributed fashion where different elements are spread across several interconnected processing systems. Any kind of processing system or another apparatus adapted for carrying out the methods described herein is suited.
- a typical combination of hardware and software can be a processing system with computer-usable program code that, when being loaded and executed, controls the processing system such that it carries out the methods described herein.
- the systems, components and/or processes also can be embedded in a computer-readable storage, such as a computer program product or other data programs storage device, readable by a machine, tangibly embodying a program of instructions executable by the machine to perform methods and processes described herein. These elements also can be embedded in an application product which comprises all the features enabling the implementation of the methods described herein and, which when loaded in a processing system, is able to carry out these methods.
- arrangements described herein may take the form of a computer program product embodied in one or more computer-readable media having computer-readable program code embodied, e.g., stored, thereon. Any combination of one or more computer-readable media may be utilized.
- the computer-readable medium may be a computer-readable signal medium or a computer-readable storage medium.
- the phrase “computer-readable storage medium” means a non-transitory storage medium.
- a computer-readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing.
- a computer-readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
- modules as used herein include routines, programs, objects, components, data structures, and so on that perform particular tasks or implement particular data types.
- a memory generally stores the noted modules.
- the memory associated with a module may be a buffer or cache embedded within a processor, a RAM, a ROM, a flash memory, or another suitable electronic storage medium.
- a module as envisioned by the present disclosure is implemented as an application-specific integrated circuit (ASIC), a hardware component of a system on a chip (SoC), as a programmable logic array (PLA), or as another suitable hardware component that is embedded with a defined configuration set (e.g., instructions) for performing the disclosed functions.
- Program code embodied on a computer-readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber, cable, RF, etc., or any suitable combination of the foregoing.
- Computer program code for carrying out operations for aspects of the present arrangements may be written in any combination of one or more programming languages, including an object-oriented programming language such as Java™, Python, Smalltalk, C++ or the like and conventional procedural programming languages, such as the “C” programming language or similar programming languages.
- the program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server.
- the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider).
- the terms “a” and “an,” as used herein, are defined as one or more than one.
- the term “plurality,” as used herein, is defined as two or more than two.
- the term “another,” as used herein, is defined as at least a second or more.
- the terms “including” and/or “having,” as used herein, are defined as comprising (i.e., open language).
- the phrase “at least one of . . . and . . . ” as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items.
- the phrase “at least one of A, B, and C” includes A only, B only, C only, or any combination thereof (e.g., AB, AC, BC or ABC).
Landscapes
- Engineering & Computer Science (AREA)
- Human Computer Interaction (AREA)
- Mechanical Engineering (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Theoretical Computer Science (AREA)
- Lock And Its Accessories (AREA)
Abstract
Description
- The subject matter described herein relates, in general, to granting a user access to an area based upon static characteristics and/or dynamic characteristics that are proprioception-related, and, more particularly, to measuring static characteristics and/or dynamic characteristics of a hand of a user in order to determine whether to grant the user access to a cabin of a vehicle.
- Vehicle anti-theft protection is a significant focus of security research due to the emergence of wireless and keyless access technologies. For instance, an asymmetric cryptographic key transmitted via Bluetooth® can be stolen by a hacker through intervention into a radio channel. While some wireless access systems may utilize near field communication (NFC) which increases security and trust, keys can still be stolen and used by a hacker. Conventional keyless access systems for preventing vehicle theft may use fingerprint and/or facial recognition. However, such conventional keyless access systems may be compromised if a hacker has access to fingerprint scans and/or facial scans of an authorized user.
- Example systems and methods relating to a manner of improving access to an interior of an object, such as a cabin of a vehicle, by measuring static characteristics and/or dynamic characteristics of a hand of a user are described herein. In one embodiment, during a first time period, a system obtains first measurements of static characteristics and/or dynamic characteristics of a hand of a user using sensors (e.g., piezoelectric sensors) as the hand of the user applies force to a handle on an exterior of the object (e.g., a handle on a door of a vehicle). The sensors may be located on or integrated into the handle. Static characteristics of the hand generally do not change over the first time period and may include a shape of the hand, an orientation of the hand on the handle, a position of the hand on the handle, and/or a gripping load distributed across the handle. Dynamic characteristics of the hand may change over the first time period and may include angles at which the handle is pulled by the hand of the user over the first time period, gripping pressures applied to different regions of the handle by different parts of the hand (e.g., different fingers, palm, etc.) over the first time period, pulling forces applied to different regions of the handle by the different parts of the hand over the first time period, and/or overall pulling forces applied to the handle by the hand over the first time period. The system identifies the user as being authorized to access the interior of the object based upon the first measurements of the static and/or dynamic characteristics and a stored profile, where the stored profile comprises second measurements of the static and/or dynamic characteristics of the hand generated during a second time period occurring prior to the first time period. 
In an example, the system trains a machine learning model based upon the second measurements and provides the first measurements as input to the machine learning model, where the machine learning model outputs a value based upon learned parameters of the machine learning model and the first measurements, where the value is indicative of whether or not the user is authorized to access the interior of the object. The value may be either a True/False Boolean variable or a confidence rate. The system grants the user access to the interior of the object based on the user being authorized. For instance, the system may unlock a door of a vehicle upon identifying that the user is authorized to access the vehicle or if the confidence rate is above a predefined threshold.
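The decision step described above can be sketched as follows. This is a hedged illustration only — the function name and the threshold value are assumptions, not part of the disclosure; it simply shows how a model output that is either a True/False Boolean or a confidence rate could be compared against a predefined threshold:

```python
# Illustrative sketch only: interprets a machine learning model's output as
# either a Boolean decision or a confidence rate, as described above.
# The function name and threshold value are assumptions for illustration.

CONFIDENCE_THRESHOLD = 0.9  # assumed predefined threshold


def is_authorized(model_output) -> bool:
    """Return True when the model output indicates an authorized user.

    The output may be a bool (True/False decision) or a float
    (a confidence rate compared against the predefined threshold).
    """
    if isinstance(model_output, bool):
        return model_output
    return float(model_output) >= CONFIDENCE_THRESHOLD
```

For example, `is_authorized(0.95)` grants access under the assumed threshold, while `is_authorized(0.5)` does not.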
- In one embodiment, a system for controlling access to an interior of an object is disclosed. The system includes a processor and a memory communicably coupled to the processor. The memory stores instructions that when executed by the processor cause the processor to obtain first measurements that are based upon contact of a hand of a user with a handle located on an exterior of the object over a first time period as the hand applies first force to the handle. The instructions further cause the processor to identify the user as being authorized to access the interior of the object based upon the first measurements and a profile for the handle, wherein the profile comprises second measurements that are based upon contact of the hand of the user with the handle over a second time period as the hand applies second force to the handle, wherein the second time period occurs prior to the first time period. The instructions additionally cause the processor to grant the user access to the interior of the object based on the user being authorized.
- In one embodiment, a non-transitory computer-readable medium for controlling access to an interior of an object and including instructions that when executed by a processor cause the processor to perform one or more functions is disclosed. The instructions cause the processor to obtain first measurements that are based upon contact of a hand of a user with a handle located on an exterior of the object over a first time period as the hand applies first force to the handle. The instructions further cause the processor to identify the user as being authorized to access the interior of the object based upon the first measurements and a profile for the user, wherein the profile comprises second measurements that are based upon contact of the hand of the user with the handle over a second time period as the hand applies second force to the handle, wherein the second time period occurs prior to the first time period. The instructions additionally cause the processor to grant the user access to the interior of the object based on the user being authorized.
- In one embodiment, a method for controlling access to an interior of an object is disclosed. In one embodiment, the method includes obtaining first measurements that are based upon contact of a hand of a user with a handle located on an exterior of an object over a first time period as the hand applies first force to the handle. The method further includes identifying the user as being authorized to access an interior of the object based upon the first measurements and second measurements, wherein the second measurements are based upon contact of the hand of the user with the handle over a second time period as the hand applies second force to the handle, wherein the second time period occurs prior to the first time period. The method also includes granting the user access to the interior of the object based on the user being authorized.
- The accompanying drawings, which are incorporated in and constitute a part of the specification, illustrate various systems, methods, and other embodiments of the disclosure. It will be appreciated that the illustrated element boundaries (e.g., boxes, groups of boxes, or other shapes) in the figures represent one embodiment of the boundaries. In some embodiments, one element may be designed as multiple elements or multiple elements may be designed as one element. In some embodiments, an element shown as an internal component of another element may be implemented as an external component and vice versa. Furthermore, elements may not be drawn to scale.
-
FIG. 1 illustrates one embodiment of a vehicle within which systems and methods disclosed herein may be implemented. -
FIG. 2 illustrates one embodiment of a vehicle access system that is associated with the vehicle illustrated inFIG. 1 . -
FIG. 3 illustrates one embodiment of a vehicle door that is associated with the vehicle illustrated inFIG. 1 . -
FIGS. 4A-B illustrate differing views of one embodiment of a handle of a vehicle. -
FIG. 5 depicts an example graph of force applied over a time period to different sensors of the handle illustrated inFIGS. 4A-B . -
FIGS. 6A-B illustrate different views of one embodiment of a handle. -
FIGS. 7A-B illustrate different views of one embodiment of a handle. -
FIGS. 8A-B illustrate different views of one embodiment of a handle. -
FIGS. 9A-B illustrate differing views of one embodiment of a handle. -
FIG. 10 illustrates one embodiment of a method that is associated with controlling access to an interior of an object.
- Systems, methods, and other embodiments associated with improving access to an interior of an object, such as a cabin of a vehicle, are disclosed herein. As noted above, conventional vehicle wireless access systems suffer from security vulnerabilities whereby a hacker may gain access to a vehicle through intervention into a radio channel. While keyless access systems that utilize fingerprint scans and/or facial scans may be used by a vehicle to improve security, such keyless access systems may be vulnerable if a hacker obtains access to the facial scans and/or the fingerprint scans.
- To address these issues, a system for controlling access to an interior of an object based upon proprioception-related factors as a hand of a user applies force to a handle on an exterior of the object is described herein. In an example, the object is a vehicle, the interior is a cabin of the vehicle, and the handle is comprised by a door of the vehicle. The handle may have sensors integrated therein, such as force sensors. In an example, the sensors are tensiometric sensors or piezoelectric sensors. During a first time period, a hand of the user grips the handle a plurality of times while opening the door. As the hand of the user applies first force to the handle while opening the door during the first time period, the sensors generate electrical signals which are converted into first measurements. The system obtains the first measurements from the sensors. The first measurements include static characteristics and/or dynamic characteristics of the hand. In general, the static characteristics do not change over time as the user applies the first force to the handle during the first time period. The static characteristics may include a shape of the hand, an orientation of the hand on the handle, a position of the hand on the handle, and/or a gripping load distributed across the handle. In general, the dynamic characteristics may change over time as the user applies force to the handle during the first time period. Dynamic characteristics of the hand may include angles at which the handle is pulled by the hand of the user over the first time period, gripping pressures applied to different regions of the handle by different parts of the hand (e.g., different fingers, palm, etc.) over the first time period, pulling forces (or pushing forces) applied to the different regions of the handle by the hand over the first time period, and/or overall pulling forces (or pushing forces) applied to the handle by the hand over the first time period. 
The dynamic characteristics may also include ratios of measurements, such as a ratio of a first force applied by a first digit of the hand at a timestep to a second force applied by a second digit of the hand at the timestep.
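The dynamic characteristics above can be computed from per-digit force time series. The following is a hedged sketch under assumed names and made-up data (nothing here is prescribed by the disclosure) showing the overall pulling force per timestep and the digit-to-digit force ratio just described:

```python
# Illustrative sketch (assumed names and data): derive dynamic
# characteristics from per-digit force time series, including the ratio of
# the force applied by one digit to the force applied by another digit at
# each timestep.

def overall_pulling_force(per_digit_forces):
    """Sum the forces applied by each digit at every timestep."""
    return [sum(forces) for forces in zip(*per_digit_forces.values())]


def digit_force_ratios(per_digit_forces, first, second, eps=1e-9):
    """Ratio of the first digit's force to the second digit's force at
    each timestep (eps guards against division by zero)."""
    return [a / (b + eps)
            for a, b in zip(per_digit_forces[first], per_digit_forces[second])]


# Made-up readings (newtons) at timesteps t0..t2:
forces = {
    "thumb": [2.0, 4.0, 6.0],
    "index": [1.0, 2.0, 3.0],
}
```

With this sample data, the overall pulling force is the per-timestep sum across digits, and the thumb-to-index ratio stays near 2.0 at every timestep.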
- According to embodiments, the system generates a profile for the user and/or the handle based upon the first measurements. In an example, the profile includes average measurements generated by each sensor at each timestep as the user grips the handle the plurality of times while opening the door. The profile may also include acceptable deviations for each of the average measurements at each timestep. According to embodiments, the system trains a machine learning model (e.g., a neural network) based upon the first measurements, where the machine learning model includes learned parameters that are based upon the first measurements. The system stores the profile and/or the machine learning model in a data store within the vehicle and/or in a data store located in a cloud-computing environment.
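Profile generation of the kind described above can be sketched as follows. This is an illustration under assumed names and data structures, not the disclosed implementation: each sensor's readings are averaged per timestep across the plurality of grips, and the acceptable deviation is taken here as a multiple of the standard deviation (one plausible choice among many):

```python
# Illustrative sketch (assumed names/structures): build a profile holding,
# for each sensor and timestep, the average measurement across several grips
# and an acceptable deviation (here k times the population standard deviation).

from statistics import mean, pstdev


def build_profile(grips, k=3.0):
    """grips: list of grips; each grip maps sensor id -> readings per timestep.

    Returns sensor id -> list of (average, acceptable_deviation) per timestep.
    """
    profile = {}
    for sensor in grips[0]:
        per_timestep = zip(*(grip[sensor] for grip in grips))
        profile[sensor] = [(mean(vals), k * pstdev(vals))
                           for vals in per_timestep]
    return profile
```

For two grips of one sensor, `build_profile([{"s1": [1.0, 2.0]}, {"s1": [3.0, 4.0]}])` yields an average of 2.0 at the first timestep and 3.0 at the second, each with an acceptable deviation of 3.0 under the assumed k.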
- During a second time period that occurs subsequent to the first time period, the hand of the user grips the handle of the vehicle while the user attempts to open the door of the vehicle. As the hand of the user applies second force to the handle during the second time period, the sensors generate electrical signals which are converted into second measurements. The second measurements include static characteristics and/or dynamic characteristics as described above. The system obtains the second measurements from the sensors. The system identifies the user as being authorized to access the cabin of the vehicle based upon the first measurements and the second measurements. According to embodiments, the system compares the average measurements in the profile to corresponding measurements in the second measurements and identifies the user as being authorized to access the cabin of the vehicle when one or more of the second measurements are within a threshold range of one or more corresponding average measurements in the profile. According to embodiments, the system provides the second measurements as input to the machine learning model described above. The machine learning model outputs a value based upon the learned parameters of the machine learning model and the second measurements, where the value is indicative of whether or not the user is authorized to access the cabin of the vehicle. The system identifies the user as being authorized to access the cabin based upon the value. The system grants the user access to the cabin of the vehicle upon identifying the user as being authorized. For instance, the system may unlock a door of the vehicle.
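The profile-comparison path above can be sketched as a simple range check. This illustration (assumed names and structures) requires every measurement to fall within the acceptable deviation of its corresponding profile average; the disclosure also permits looser rules, such as authorizing when one or more measurements are within range:

```python
# Illustrative sketch (assumed names/structures): authorize the user when
# each new measurement falls within the acceptable deviation of the
# corresponding profile average for every sensor and timestep.

def matches_profile(measurements, profile):
    """measurements: sensor id -> reading per timestep.
    profile: sensor id -> list of (average, acceptable_deviation) per timestep.
    """
    for sensor, readings in measurements.items():
        for reading, (avg, dev) in zip(readings, profile[sensor]):
            if abs(reading - avg) > dev:
                return False
    return True
```

For example, against a profile of `{"s1": [(2.0, 1.0), (3.0, 1.0)]}`, readings of `[2.5, 3.5]` match while `[4.0, 3.0]` does not.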
- The above-described technologies present various advantages over conventional systems that control access to interiors of objects, such as systems in vehicles that control entry to a cabin (or a trunk) of the vehicle. First, unlike systems that transmit keys wirelessly, the above-described technologies are not vulnerable to techniques employed by hackers which intercept keys through intervention into a radio channel. Second, even if a hacker had access to the static characteristics and/or dynamic characteristics of a user as described above, it would be difficult, if not impossible, for the hacker to precisely replicate such characteristics. Third, according to embodiments, the above-described technologies may also be implemented remotely from a vehicle in a cloud-computing environment. Then, according to the embodiments, the decision as to whether or not the user is authorized to access the vehicle is made in the cloud-computing environment based upon static and/or dynamic characteristics of the user stored at the cloud-computing environment. As such, security of the vehicle may be improved as the static characteristics and/or dynamic characteristics of the user are not stored on the vehicle and hence cannot be easily accessed by a hacker. Fourth, the above-described technologies may be useful in shared-vehicle scenarios, such as with a taxi or ride-sharing service. Fifth, the above-described technologies may be useful in preventing an impaired driver (e.g., an inebriated driver) from accessing a vehicle.
- Referring to
FIG. 1 , an example of a vehicle 100 is illustrated. As used herein, a “vehicle” is any form of motorized transport. In one or more implementations, the vehicle 100 is an automobile. While arrangements will be described herein with respect to automobiles, it will be understood that embodiments are not limited to automobiles. In some implementations, the vehicle 100 may be any robotic device or form of motorized transport that, for example, includes sensors to perceive aspects of the surrounding environment, and thus benefits from the functionality discussed herein associated with granting access to the vehicle based upon static and/or dynamic characteristics of a hand of a user. - The
vehicle 100 also includes various elements. It will be understood that in various embodiments it may not be necessary for the vehicle 100 to have all of the elements shown in FIG. 1 . The vehicle 100 can have any combination of the various elements shown in FIG. 1 . Further, the vehicle 100 can have additional elements to those shown in FIG. 1 . In some arrangements, the vehicle 100 may be implemented without one or more of the elements shown in FIG. 1 . While the various elements are shown as being located within the vehicle 100 in FIG. 1 , it will be understood that one or more of these elements can be located external to the vehicle 100 . Further, the elements shown may be physically separated by large distances. For example, as discussed, one or more components of the disclosed system can be implemented within a vehicle while further components of the system are implemented within a cloud-computing environment or other system that is remote from the vehicle 100 . In an example, the vehicle access system 170 may be implemented within a cloud-computing environment or on a server. - Some of the possible elements of the
vehicle 100 are shown in FIG. 1 and will be described along with subsequent figures. However, a description of many of the elements in FIG. 1 will be provided after the discussion of FIGS. 2-10 for purposes of brevity of this description. Additionally, it will be appreciated that for simplicity and clarity of illustration, where appropriate, reference numerals have been repeated among the different figures to indicate corresponding or analogous elements. In addition, the discussion outlines numerous specific details to provide a thorough understanding of the embodiments described herein. Those of skill in the art, however, will understand that the embodiments described herein may be practiced using various combinations of these elements. In either case, the vehicle 100 includes a vehicle access system 170 as well as a vehicle door(s) 180 that are implemented to perform methods and other functions as disclosed herein relating to improving access to an interior of the vehicle by utilizing proprioception-related factors as a hand of a user grips a handle of a door of the vehicle 100 . As will be discussed in greater detail subsequently, the vehicle access system 170 , in various embodiments, is implemented partially within the vehicle 100 , and as a cloud-based service. For example, in one approach, functionality associated with at least one module of the vehicle access system 170 is implemented within the vehicle 100 while further functionality is implemented within a cloud-based computing system. - With reference to
FIG. 2 , one embodiment of the vehicle access system 170 of FIG. 1 is further illustrated. The vehicle access system 170 is shown as including a processor 110 from the vehicle 100 of FIG. 1 . Accordingly, the processor 110 may be a part of the vehicle access system 170 , the vehicle access system 170 may include a separate processor from the processor 110 of the vehicle 100 , or the vehicle access system 170 may access the processor 110 through a data bus or another communication path. In one embodiment, the vehicle access system 170 includes a memory 210 that stores a measurement module 220 , a decision module 225 , and a vehicle access module 230 . The memory 210 is a random-access memory (RAM), read-only memory (ROM), a hard-disk drive, a flash memory, or other suitable memory for storing the measurement module 220 , the decision module 225 , and the vehicle access module 230 . The measurement module 220 , the decision module 225 , and the vehicle access module 230 are, for example, computer-readable instructions that when executed by the processor 110 cause the processor 110 to perform the various functions disclosed herein. - The
vehicle access system 170 as illustrated inFIG. 2 is generally an abstracted form of thevehicle access system 170 as may be implemented between thevehicle 100 and a cloud-computing environment. - With reference to
FIG. 2, the measurement module 220 generally includes instructions that function to control the processor 110 to receive data inputs from one or more sensors of the vehicle 100. In general, during a first time period (e.g., during a calibration procedure), the measurement module 220 is configured to obtain first measurements of static characteristics and/or dynamic characteristics of a hand of a user as the hand applies first force to a handle on an exterior of an object that controls access to an interior of the object. During a second time period that occurs subsequent to the first time period, the measurement module 220 is configured to obtain second measurements of static characteristics and/or dynamic characteristics of the hand of the user as the hand applies second force to the handle. The measurement module 220 may store the first measurements and/or the second measurements in the database 240 (as part of the measurements 245) and/or the memory 210. - The
decision module 225 generally includes instructions that function to control theprocessor 110 to receive data inputs from themeasurement module 220. As will be explained in greater detail below, thedecision module 225 determines whether or not a user is authorized to access an interior of an object based upon measurements and stored measurements. Thedecision module 225 also generates profiles for users and/or doors based upon measurements obtained during a calibration procedure. Thedecision module 225 may also generate and/or update machine learning models (described in greater detail below). - The
vehicle access module 230 generally includes instructions that function to control the processor 110 to receive data inputs from the decision module 225. In general, upon receiving an indication from the decision module 225, the vehicle access module 230 is configured to grant a user access to an interior of an object, such as a cabin of the vehicle 100. In an example, the vehicle access module 230 transmits a signal to a door that causes the door to unlock. - Moreover, in one embodiment, the
vehicle access system 170 includes adatabase 240. Thedatabase 240 is, in one embodiment, an electronic data structure stored in thememory 210 or another data store and that is configured with routines that can be executed by theprocessor 110 for analyzing stored data, providing stored data, organizing stored data, and so on. Thus, in one embodiment, thedatabase 240 stores data used by themeasurement module 220, thedecision module 225, and/or thevehicle access module 230 in executing various functions. - In one embodiment, the
database 240 includesmeasurements 245. Themeasurements 245 may include first measurements generated as part of a calibration procedure for thevehicle access system 170 as well as second measurements generated during use of the vehicle access system 170 (explained in greater detail below). - In one embodiment, the
database 240 includes auser profile 250 for a user. Theuser profile 250 for the user includes measurements of static characteristics and/or dynamic characteristics of a hand of the user (obtained during a calibration procedure) as the hand applies force to a handle that controls access to an interior of an object. In an example, the measurements are average measurements (described in greater detail below). Thevehicle access system 170 may utilize theuser profile 250 to determine whether or not to unlock thevehicle door 180 of thevehicle 100. - In one embodiment, the
database 240 includes a door profile 255. The door profile 255 includes measurements of static characteristics and/or dynamic characteristics of hands of users that are authorized to open the vehicle door 180. In an example, the door profile 255 includes the user profile 250. - In one embodiment, the
database 240 includes a machine learning model 260. In an example, the machine learning model 260 includes learned parameters that are based upon measurements of static characteristics and/or dynamic characteristics of a hand of a user as the user applies force to the handle during a calibration procedure. The learned parameters may also be adjusted subsequent to the calibration procedure in order to improve performance of the machine learning model 260. In general, the machine learning model 260 is configured to take, as input, current measurements of static characteristics and/or dynamic characteristics of the hand of the user as the user applies force to the handle. Based upon the learned parameters and the current measurements, the machine learning model 260 is configured to output a value (or values) that is/are indicative of whether or not the user is an authorized user of the vehicle. According to embodiments, the decision module 225 comprises the machine learning model 260. - According to embodiments, the machine learning model 260 is a classifier model that is configured to classify static characteristics and/or dynamic characteristics of a hand as belonging or not belonging to an authorized user of the
vehicle 100. - According to embodiments, the machine learning model 260 is a neural network comprising nodes and edges connecting the nodes, where the edges are assigned learned weights that are based upon measurements of the static and/or dynamic characteristics of the hand of the user. For instance, the machine learning model 260 may be a neural network comprising an input layer comprising first node(s), at least one hidden layer comprising second node(s), and an output layer comprising third node(s), where the first node(s) are connected to the second node(s) via first edges and the second node(s) are connected to the third node(s) via second edges, and where the first edges and second edges have learned weights assigned thereto. The learned weights are based upon the measurements of static and/or dynamic characteristics of the hand of the user as the user applies force to the handle during the calibration procedure (or subsequent to the calibration procedure). According to embodiments, the machine learning model 260 comprises a plurality of neural networks, where each neural network is assigned to a particular sensor. For instance, the plurality of neural networks may include a first neural network that is assigned to a first pressure sensor that makes contact with a first finger (e.g., index) of a hand of the user when the hand grips the handle and a second neural network that is assigned to a second pressure sensor that makes contact with a second finger (e.g., middle) of the hand of the user when the hand grips the handle.
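The patent does not specify a concrete network architecture or weight values. As a rough sketch of the forward pass such a network performs, the following Python example maps a few sensor readings to a score in (0, 1); the layer sizes and the untrained weights are illustrative assumptions, not values from the patent.

```python
import math

def forward(x, w1, w2):
    """Single-hidden-layer forward pass: inputs -> hidden (sigmoid) -> score (sigmoid).

    x is a list of sensor readings, w1 holds one weight row per hidden node,
    and w2 holds one output weight per hidden node. Returns a score in (0, 1);
    values near 1 would indicate an authorized user.
    """
    sigmoid = lambda z: 1.0 / (1.0 + math.exp(-z))
    hidden = [sigmoid(sum(w * xi for w, xi in zip(row, x))) for row in w1]
    return sigmoid(sum(w * h for w, h in zip(w2, hidden)))

# Illustrative, untrained weights for a 3-input, 2-hidden-node network.
w1 = [[0.5, -0.2, 0.1], [0.3, 0.8, -0.5]]
w2 = [1.0, -1.0]
score = forward([0.9, 0.4, 0.7], w1, w2)  # three normalized sensor readings
```

In practice the weights would be learned from the calibration measurement sets rather than fixed by hand, and one such network could be maintained per sensor as described above.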
- Although the
database 240 is illustrated inFIG. 2 as being part of thevehicle access system 170 and themeasurements 245, theuser profile 250, thedoor profile 255, and the machine learning model 260 are illustrated inFIG. 2 as being stored in thedatabase 240, other possibilities are contemplated. According to embodiments, themeasurements 245, theuser profile 250, thedoor profile 255, and the machine learning model 260 are stored in thememory 210. According to embodiments, thedatabase 240 is separate from thevehicle access system 170. For instance, according to embodiments, thedatabase 240 may be part of a cloud computing environment or a remote server, where the cloud computing environment/remote server is in network communication with thevehicle access system 170, where thevehicle access system 170 is located within thevehicle 100. - With reference now to
FIG. 3 , an example of thevehicle door 180 is depicted. In general, thevehicle door 180 is a physical barrier of thevehicle 100 that, when opened, provides access to a cabin of thevehicle 100 and that, when closed, prevents access to the cabin of thevehicle 100. - The
vehicle door 180 includes ahandle 310. In general, thehandle 310 is located on an exterior of thevehicle door 180 and is configured to be gripped by a hand of a user as part of opening thevehicle door 180. The user applies force (e.g., pulling force, pushing force, rotational force, etc.) to thehandle 310 in order to open thevehicle door 180. In an example, as the force is applied to thehandle 310, thehandle 310 moves from a rest position to one or more positions located in a direction from thehandle 310 that extends in a direction in which the force is being applied. - The
vehicle door 180 includes one or more door sensors 320 (referred to now herein as “thedoor sensors 320”) that are configured to generate measurements of static characteristics and/or dynamic characteristics of the hand of the user as the user applies force to thehandle 310 as part of a door opening operation of thevehicle door 180. At least some of thedoor sensors 320 are integrated into thehandle 310. According to embodiments, some or all of thedoor sensors 320 may be covered with an elastic material to hide such sensors from view. - The
door sensors 320 may include one or more force sensors 321 (referred to herein as "the force sensors 321") that are configured to generate measurements of force applied to one or more regions of the handle 310 by the hand of the user over a time period. In an example, the time period begins when the user begins to apply force to the handle 310 and ends when the handle 310 is extended to a predetermined position. According to embodiments, the force sensors 321 comprise five force sensors, where each of the five force sensors is configured to make contact with a different digit (e.g., thumb, index finger, middle finger, ring finger, pinky finger) of a hand of the user as the user grips the handle 310. At each timestep in a time period, the five force sensors generate five force measurements as digits of the user apply force to the five force sensors. In an example involving two timesteps, the five force sensors generate five force measurements at a first timestep and an additional five force measurements at a second timestep. According to embodiments, the force sensors 321 are piezoelectric or tensiometric force sensors. - The
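above per-digit, per-timestep sampling can be illustrated with a short Python sketch. The `read_sensor` callback stands in for a hypothetical sensor-driver interface, and the sensor count, timestep spacing, and simulated readings are illustrative choices, not values taken from the patent.

```python
def sample_grip(read_sensor, n_sensors=5, timesteps=(0, 50, 100)):
    """Collect one measurement set: a list with one subset per timestep,
    each subset holding one reading per force sensor.

    read_sensor(sensor_idx, t_ms) is a hypothetical driver callback that
    returns the reading of sensor sensor_idx at time t_ms.
    """
    return [[read_sensor(i, t) for i in range(n_sensors)] for t in timesteps]

# Simulated readings: force grows over time and differs per digit.
readings = sample_grip(lambda i, t: round(1.0 + 0.1 * i + 0.01 * t, 2))
```

With the simulated callback above, `readings` holds three subsets (one per timestep) of five readings each (one per digit).

- The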
door sensors 320 may include one or more pressure sensors 322 (referred to herein as "the pressure sensors 322") that are configured to generate measurements of pressure applied to one or more regions of the handle 310 by the hand of the user over the time period. According to embodiments, the pressure sensors 322 comprise five pressure sensors, where each of the five pressure sensors is configured to make contact with a different digit (e.g., thumb, index finger, middle finger, ring finger, pinky finger) of the hand of the user as the user grips the handle 310. The five pressure sensors generate five pressure measurements as digits of the user apply pressure to the five pressure sensors. According to embodiments, the pressure sensors 322 are piezoelectric or tensiometric pressure sensors. - The
door sensors 320 may include one or more angle sensors 323 (referred to herein as "the angle sensors 323") that are configured to generate measurements of angles at which one or more regions of the handle 310 are pulled (or pushed) by the hand of the user over the time period. The measurements of the angles may be made with respect to a reference angle. According to embodiments, the angle sensors 323 comprise five angle sensors, where each of the five angle sensors is configured to make contact with a different digit (e.g., thumb, index finger, middle finger, ring finger, pinky finger) of a hand of the user as the user grips the handle 310. The five angle sensors generate five angle measurements as digits of the user apply force to the five angle sensors. Alternatively, pulling force vectors can be obtained by analyzing pressure measurements from high-precision pressure sensors. - The
door sensors 320 may include one ormore rotation sensors 324 that are configured to measure angular rotations of thehandle 310 during the time period as the hand of the user rotates thehandle 310. The one ormore rotation sensors 324 may alternatively be referred to as one or more angular encoders or one or more angular sensors. The one ormore rotation sensors 324 may be torque based or position based. - The
door sensors 320 may include one or more temperature sensors 325 (referred to now as “thetemperature sensors 325”) that are configured to measure a temperature of one or more regions of the hand of the user as the hand of the user applies force to thehandle 310 while gripping thehandle 310 over the time period. - The
door sensors 320 may include one or moreoptical sensors 326 that are configured to determine a position, orientation, and/or shape of a hand of the user on thehandle 310 as the user applies force to thehandle 310 over the time period. - The
door sensors 320 may include one ormore cameras 327 that are configured to capture images of the hand of the user as the user applies force to thehandle 310 over the time period. - The
vehicle door 180 may include one or more miscellaneous sensors 328 (referred to herein now as “themiscellaneous sensors 328”) that are configured to generate measurements in addition to the measurements captured by the sensors 321-327 described above. For instance, themiscellaneous sensors 328 may include a fingerprint scanner that is configured to capture one or more fingerprints of the user as the user grips thehandle 310 while applying force to thehandle 310. Themiscellaneous sensors 328 may include a handprint scanner that is configured to capture a handprint of the user as the user grips thehandle 310 while applying force to thehandle 310. - The
vehicle door 180 includes a lock/unlock mechanism 330 that is configured to lock/unlock thevehicle door 180. In an example, thevehicle access system 170 communicates with the lock/unlock mechanism 330 in order to unlock thevehicle door 180 based upon identifying a user as an authorized user of the vehicle using proprioception-related factors. - Although the
handle 310, thedoor sensors 320, and the lock/unlock mechanism 330 are described above as being part of thevehicle door 180, other possibilities are contemplated. For instance, thehandle 310, thedoor sensors 320, and the lock/unlock mechanism 330 may be part of a trunk of thevehicle 100 and as such, thehandle 310, thedoor sensors 320, and the lock/unlock mechanism 330 may provide access to a trunk of thevehicle 100. - Referring now to
FIG. 4A , anexample handle 401 that is gripped by ahand 402 of a user is illustrated. Thehandle 401 may be thehandle 310. As depicted inFIG. 4A , apinky finger 403, aring finger 404, amiddle finger 405, anindex finger 406, and athumb 407 of thehand 402 grip afirst sensor 408, asecond sensor 409, athird sensor 410, afourth sensor 411, and afifth sensor 412, respectively, of thehandle 401. In an example, the sensors 408-412 comprise theforce sensors 321, thepressure sensors 322, theangle sensors 323, thetemperature sensors 325, theoptical sensor 326, and/or themiscellaneous sensors 328. Therotation sensors 324 may be integrated into thehandle 401. A lens of thecamera 327 may be oriented towards thehand 402 of the user. In an example, thefirst sensor 408 comprises one or more of a force sensor, a pressure sensor, a fingerprint sensor, an optical sensor, and/or a thermal sensor. Thus, it is to be understood that each sensor in the sensors 408-412 may include multiple sensors of different types.FIG. 4B illustrates a side view of thehandle 401 depicted inFIG. 4A . - Referring now to
FIG. 5, an example graph of force applied by the hand 402 of a user to the handle 401 over time is illustrated. In the example shown in FIG. 5, the y-axis represents force (e.g., measured in Newtons) and the x-axis represents time (e.g., measured in milliseconds). Alternatively, the y-axis may represent pressure (Pi), relative force (Fi/sum(Fi)), relative pressure (Pi/sum(Pi)), a derived pressure, or a derived force. The x-axis includes a plurality of timesteps (T1 . . . TN). Each line in the graph is indicative of a force applied to one of the sensors 408-412 over time. As will be discussed in greater detail below, the vehicle access system 170 may utilize data in the graph in order to generate a profile for the vehicle door 180 and/or the user and/or to train the machine learning model 260. - Referring now to
FIG. 6A, an example handle 602 that is gripped by the hand 402 of a user is illustrated. The handle 602 may be the handle 310. As depicted in FIG. 6A, the pinky finger 403, the ring finger 404, the middle finger 405, the index finger 406, and the thumb 407 of the hand 402 grip a first sensor 603, a second sensor 604, a third sensor 605, a fourth sensor 606, and a fifth sensor 607, respectively, of the handle 602. Each of the sensors 603-607 is depressible by the user. A degree to which a sensor in the sensors 603-607 is depressed is indicative of a force applied to the sensor. FIG. 6B illustrates a side view of the handle 602 depicted in FIG. 6A. - Referring now to
FIG. 7A , anexample handle 702 that is gripped by thehand 402 of a user is illustrated. Thehandle 702 may be thehandle 310. As depicted inFIG. 7A , thepinky finger 403, thering finger 404, themiddle finger 405, and theindex finger 406 grip afirst sensor 703 and thethumb 407 grips asecond sensor 704. Thefirst sensor 703 and thesecond sensor 704 may be included in thedoor sensors 320.FIG. 7B illustrates a side view of thehandle 702 depicted inFIG. 7A . - Referring now to
FIG. 8A , anexample handle 802 that is gripped by thehand 402 of a user is illustrated. Thehandle 802 may be thehandle 310. As depicted inFIG. 8A , thepinky finger 403, thering finger 404, themiddle finger 405, and theindex finger 406 grip afirst sensor 803 and thethumb 407 grips asecond sensor 804. In contrast to thehandle 702 depicted inFIGS. 7A-B , thefirst sensor 803 and thesecond sensor 804 are configured to make contact with an entirety of the fingers 403-406 and thethumb 407, respectively. Thefirst sensor 803 and thesecond sensor 804 may be included in thedoor sensors 320.FIG. 8B illustrates a side view of thehandle 802 depicted inFIG. 8A . - Referring now to
FIG. 9A, an example handle 902 that is gripped by the hand 402 of the user is illustrated. The handle 902 may be the handle 310. As depicted in FIG. 9A, the pinky finger 403, the ring finger 404, the middle finger 405, the index finger 406, and the thumb 407 of the hand 402 grip the handle 902, where a plurality of sensors 904 (indicated by circles in FIG. 9A) are distributed throughout the handle 902. The plurality of sensors 904 may be included in the door sensors 320. FIG. 9B illustrates a side view of the handle 902 depicted in FIG. 9A. - Operation of the
vehicle access system 170 is now set forth. Although thevehicle access system 170 is described below in the context of thevehicle 100, it is to be understood that thevehicle access system 170 is operable in other contexts. For instance, thevehicle access system 170 may be utilized in the context of a door on an outside of a building (i.e., an object) or a door to a room (i.e., an object) within the building. Additionally, thevehicle access system 170 may be utilized to control access to a relatively small area, such as a drawer, a cabinet, or a safe. Furthermore, although thevehicle access system 170 is described below as controlling access to an interior of an object, thevehicle access system 170 may control access to areas which are not enclosed overhead. For instance, thevehicle access system 170 may control access to a fenced-in area having a door that has a handle mounted thereon. - During a first time period, a calibration procedure for the
handle 310 is performed. In an example, the calibration procedure occurs at a time at which a user becomes authorized to access a vehicle, such as when the user purchases thevehicle 100. During the calibration procedure, the user grips thehandle 310 at least one time and applies first force to thehandle 310 in order to open thevehicle door 180 at least one time. As the user applies the first force to thehandle 310 during the first time period, thedoor sensors 320 generate first measurements of static characteristics and/or dynamic characteristics of the hand of the user. - In general, the static characteristics of the hand of the user do not change while the user applies the first force to the
vehicle door 180 as part of opening thevehicle door 180. For instance, the static characteristics may include a shape of the hand, a position of the hand on thehandle 310, an orientation of the hand on thehandle 310, or a gripping load distributed across digits and a palm of the hand. - In general, the dynamic characteristics of the hand of the user may change over time while the user applies the first force to the
vehicle door 180 as part of opening thevehicle door 180. The dynamic characteristics may include angles at which thehandle 310 is pulled by the hand of the user over the first time period, gripping pressures applied to different regions of thehandle 310 by different parts of the hand (e.g., different digits of the hand, palm, etc.) over the first time period, pulling forces applied to different regions of thehandle 310 by the hand over the first time period, pushing forces applied to different regions of thehandle 310 by the hand over the first time period, overall gripping pressures applied to thehandle 310 over the first time period, and/or overall pulling forces applied to thehandle 310 by the hand over the first time period. The dynamic characteristics may also include rotational measurements of thehandle 310 over time when thehandle 310 is able to be rotated (e.g., a knob). - The first measurements of the static and/or dynamic characteristics may include direct measurements and derived measurements. Direct measurements are measurements that are generated directly by the
door sensors 320. In an example, direct measurements include a first measurement of force applied to a first region of the handle 310 by a first digit of a hand of the user and a second measurement of force applied to a second region of the handle 310 by a second digit of the hand of the user. Derived measurements are measurements that are generated from the direct measurements. In an example, derived measurements include a ratio of the first measurement to the second measurement. In another example, derived measurements include a sum of the first measurement and the second measurement. - The first measurements may be homogeneous or heterogeneous. In one example, the first measurements include pressure measurements. In another example, the first measurements include a combination of force measurements and pressure measurements. In a further example, the first measurements include force measurements and measurements derived from the force measurements, such as ratios of the force measurements to one another.
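As a concrete illustration of direct versus derived measurements, the short Python sketch below computes a ratio and a sum from two direct force readings; the dictionary layout is an assumption made for illustration, not a structure prescribed by the patent.

```python
def derive(direct):
    """Build example derived measurements from direct sensor readings:
    the ratio of the first reading to the second, and their sum."""
    first, second = direct[0], direct[1]
    return {"ratio": first / second, "sum": first + second}

derived = derive([6.0, 3.0])  # two direct force readings, in Newtons
```

Here the direct readings 6.0 N and 3.0 N yield a derived ratio of 2.0 and a derived sum of 9.0 N, which could be stored alongside the direct readings.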
- As noted above, the
door sensors 320 generate the first measurements over the first time period. As such, the first measurements are generated at a plurality of timesteps. In an example, the plurality of timesteps are 5-200 ms apart, such as 50 ms apart. In an example, the first measurements include a first measurement taken at a first timestep, a second measurement taken at a second timestep, and a third measurement taken at a third timestep, where the first timestep, the second timestep, and the third timestep occur chronologically. In an example, thedoor sensors 320 begin to generate the first measurements when the hand of the user makes contact with thehandle 310. In the example, thedoor sensors 320 finish generating the first measurements when thehandle 310 reaches a position that would open thevehicle door 180 if thevehicle door 180 were unlocked. Following the example above, thedoor sensors 320 generate the first measurement when the hand makes contact with thehandle 310, the second measurement when the user is pulling thehandle 310, and the third measurement when thehandle 310 reaches the position that would open thevehicle door 180 if thevehicle door 180 were unlocked. It is to be understood that a number of timesteps may be dynamic and based upon how long the user takes to open thevehicle door 180. Alternatively, the number of timesteps may be predetermined. When the number of timesteps is predetermined and the user exceeds the predetermined number of timesteps while opening thevehicle door 180, themeasurement module 220 may truncate measurements occurring at timesteps that go beyond the predetermined number of timesteps. - It is contemplated that the user may open the
vehicle door 180 several times during the calibration procedure. As such, the first measurements may include a plurality of measurement sets, where each set includes measurements from a different instance of the user opening thevehicle door 180 as part of the calibration procedure. For instance, the first measurements may include a first measurement set that includes measurements generated when thevehicle door 180 is opened a first time and a second measurement set that includes measurements generated when thevehicle door 180 is opened a second time. - In general, a measurement set comprises a plurality of measurement subsets, where each of the subsets corresponds to a different timestep while the
vehicle door 180 is being opened, and where each measurement in a measurement subset is generated at the same timestep. For instance, in an example involving a first sensor and a second sensor and a first timestep and a second timestep, a first measurement set comprises a first measurement subset for the first timestep and a second measurement subset for the second timestep. The first measurement subset comprises a first measurement generated by the first sensor at the first timestep and a second measurement generated by the second sensor at the first timestep. The second measurement subset comprises a third measurement generated by the first sensor at the second timestep and a fourth measurement generated by the second sensor at the second timestep. - The
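nested set/subset layout above can be sketched in Python, together with the per-sensor, per-timestep averages and acceptable deviations that a calibration profile might store. The indexing scheme and the use of the sample standard deviation are illustrative assumptions rather than requirements from the patent.

```python
import statistics

def build_profile(measurement_sets):
    """Build per-timestep, per-sensor statistics from several calibration grips.

    measurement_sets[k][t][s] is the reading from sensor s at timestep t
    during the k-th door opening. Returns, for each (t, s), a tuple of the
    mean reading and the sample standard deviation used as the acceptable
    deviation.
    """
    n_timesteps = len(measurement_sets[0])
    n_sensors = len(measurement_sets[0][0])
    profile = []
    for t in range(n_timesteps):
        row = []
        for s in range(n_sensors):
            vals = [ms[t][s] for ms in measurement_sets]
            row.append((statistics.mean(vals), statistics.stdev(vals)))
        profile.append(row)
    return profile

# Three calibration grips, each with two timesteps and two sensors.
sets = [
    [[1.0, 2.0], [1.5, 2.5]],
    [[1.2, 2.2], [1.7, 2.7]],
    [[1.1, 2.1], [1.6, 2.6]],
]
profile = build_profile(sets)
```

Each entry of `profile` pairs an average measurement with its deviation, so profile[0][0] summarizes the first sensor at the first timestep across all three grips.

- The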
measurement module 220 of the vehicle access system 170 obtains the first measurements from the door sensors 320. The decision module 225 may obtain the first measurements from the measurement module 220 and may generate the user profile 250 for the user based upon the first measurements. In an example, the decision module 225 determines average measurements for each sensor at each timestep based upon the first measurements (more specifically, the measurement sets comprised by the first measurements) and determines acceptable deviations from the average measurements based upon the first measurements. In an example involving a first sensor and a second sensor (such as force sensors), for each timestep, the decision module 225 computes a first average measurement and a second average measurement for the first sensor and the second sensor, respectively, where the first average measurement is an average of measurements generated by the first sensor and the second average measurement is an average of measurements generated by the second sensor. The acceptable deviations may be standard deviations or manually defined deviations. - Additionally or alternatively, the
decision module 225 may generate thedoor profile 255 based upon the first measurements. Thedoor profile 255 may include theuser profile 250, as well as profiles for other users that are authorized to access thevehicle 100. - According to embodiments, the
decision module 225 of thevehicle access system 170 trains the machine learning model 260 based upon the first measurements. For instance, as noted above, the first measurements may include a plurality of measurement sets, where each of the sets corresponds to a different instance of the user opening thevehicle door 180 as part of the calibration procedure. As such, the plurality of measurement sets may serve as training data for the machine learning model 260 (e.g., as part of a supervised learning procedure). Thedecision module 225 may also utilize other data as training data (e.g., measurements of other users applying force to handles). - According to a first embodiment, the
door sensors 320 of thehandle 310 include a first pressure sensor, a second pressure sensor, a third pressure sensor, a fourth pressure sensor, and a fifth pressure sensor. When thehandle 310 is grabbed by the user, the first pressure sensor, the second pressure sensor, the third pressure sensor, the fourth pressure sensor, and the fifth pressure sensor are depressed by respective digits of the hand of the user. The depressions of the first through fifth sensors cause electrical signals to be generated which are indicative of a first pressure (P1), a second pressure (P2), a third pressure (P3), a fourth pressure (P4), and a fifth pressure (P5), where each of the pressures is generated based upon a depression from one of the digits. In the example, P1 through P5 represent maximum pressures obtained during the first time period. Thedecision module 225 generates an array that stores P1 through P5 and stores the array as part ofuser profile 250 and/or thedoor profile 255 in thedatabase 240. Thedecision module 225 may sum P1 through P5 to obtain an overall pressure and store the overall pressure as part of the array. Thedecision module 225 may also compute relative pressures between the pressures P1 through P5 and store the relative pressures as part of the array. In an example, the array may include entries such as K1=P1/P2, K2=P2/P3, K3=P3/P4, and so forth. - According to a second embodiment, the
door sensors 320 of thehandle 310 include a first pressure sensor, a second pressure sensor, a third pressure sensor, a fourth pressure sensor, and a fifth pressure sensor. When thehandle 310 is grabbed by the user, the first pressure sensor, the second pressure sensor, the third pressure sensor, the fourth pressure sensor, and the fifth pressure sensor are depressed by respective digits of the hand of the user. The depressions of the first through fifth sensors cause electrical signals to be generated which are indicative of a first pressure (P1), a second pressure (P2), a third pressure (P3), a fourth pressure (P4) and a fifth pressure (P5), where each of the pressures is generated based upon a depression from one of the digits. Thedecision module 225 generates an N by M array, where N is a number of sensors and M is a number of timesteps at which measurements are generated by the sensors. - Subsequent to completion of the calibration procedure, it is contemplated that the user wishes to open the
vehicle door 180. During a second time period that occurs subsequent to the first time period, the hand of the user applies second force to the handle 310 as the user attempts to open the vehicle door 180. The door sensors 320 generate second measurements based upon contact of the hand with the handle 310 as the hand applies the second force. Types of the second measurements generally correspond to types of the first measurements. In an example, the first measurements and the second measurements are pressure measurements. The measurement module 220 obtains the second measurements. - The
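second measurements can then be compared against stored calibration statistics. As one illustrative (not authoritative) Python sketch, the check below counts how many readings fall within a multiple of the stored acceptable deviation of the stored average and authorizes the user when enough readings match; the multiplier `k` and the all-but-one match threshold are invented for illustration.

```python
def is_authorized(attempt, stored, k=2.0, min_matches=None):
    """Authorize when enough attempt readings fall within k acceptable
    deviations of the stored per-timestep, per-sensor averages.

    attempt[t][s] is a reading from sensor s at timestep t;
    stored[t][s] is a (mean, deviation) pair for that sensor and timestep.
    k and the all-but-one default threshold are illustrative choices.
    """
    flat_attempt = [m for subset in attempt for m in subset]
    flat_stored = [entry for row in stored for entry in row]
    if min_matches is None:
        min_matches = len(flat_stored) - 1
    matches = sum(
        1 for m, (mean, dev) in zip(flat_attempt, flat_stored)
        if abs(m - mean) <= k * dev
    )
    return matches >= min_matches

# Stored profile: two timesteps x two sensors, (mean, deviation) each.
stored = [[(1.1, 0.1), (2.1, 0.1)], [(1.6, 0.1), (2.6, 0.1)]]
ok = is_authorized([[1.15, 2.05], [1.55, 9.0]], stored)  # one outlier reading
```

In this example three of four readings fall within tolerance, so the single outlier does not block authorization, while a grip that deviates everywhere would be rejected.

- The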
decision module 225 obtains the second measurements from themeasurement module 220 and identifies the user as being authorized (or unauthorized) to access the cabin of thevehicle 100 based upon the first measurements and the second measurements. According to one embodiment, thedecision module 225 accesses the machine learning model 260. For instance, thedecision module 225 may load the machine learning model 260 into thememory 210 from thedatabase 240. Thedecision module 225 provides the second measurements as input to the machine learning model 260. The machine learning model 260 outputs at least one value based upon the learned parameters and the second measurements, where the at least one value is indicative of whether static characteristics and/or dynamic characteristics of the hand of the user measured during the second time period match (or are substantially similar to) static characteristics and/or dynamic characteristics of the hand as measured during the first time period as part of the calibration procedure. Thedecision module 225 identifies the user as being authorized to access thevehicle 100 based upon the at least one value. - According to another embodiment, the
decision module 225 compares each of the second measurements to corresponding measurements within the user profile 250. For instance, the decision module 225 may compare a first measurement in the second measurements with a corresponding measurement in the user profile 250, where the first measurement and the corresponding measurement may be generated by the same sensor of the handle 310 at corresponding timesteps in the first time period and the second time period. When the first measurement is within a threshold range (specified in the user profile 250) of the corresponding measurement, the decision module 225 may identify the user as being authorized to access the vehicle 100. Similarly, the decision module 225 may compare each of the second measurements with corresponding measurements in the user profile 250 in order to identify the user as being authorized. In an example, the decision module 225 identifies the user as being authorized when a threshold number of measurements are within corresponding threshold ranges of measurements in the user profile 250. - Upon receiving an indication from the
decision module 225, the vehicle access module 230 grants the user access to the cabin (or the trunk) of the vehicle based upon identifying the user as being authorized. In an example, the vehicle access module 230 transmits a signal to the lock/unlock mechanism 330 which causes the vehicle door 180 to be unlocked. As the vehicle door 180 is now unlocked, the user may apply further force to pull open the door. The user may then enter the cabin of the vehicle 100 and/or access the trunk. In another example, the vehicle access module 230 transmits a signal to a motor within the vehicle door 180 which causes the motor to generate force which opens the vehicle door 180 (without requiring the user to apply further force). - According to embodiments, the
measurement module 220 stores the second measurements in the memory 210 and/or the database 240. The decision module 225 may update the machine learning model 260 using the second measurements. For instance, the decision module 225 may retrain the machine learning model 260 based upon the second measurements. - Although the above-described technologies have been described as being performed by systems of the
vehicle 100, other possibilities are contemplated. According to the embodiments, the first measurements are generated by the door sensors 320 of the vehicle as described above. However, according to the embodiments, the measurement module 220 transmits the first measurements over a network connection (e.g., a wireless connection) to a server computing device, such as a server in a cloud-computing environment. In the embodiments, the server computing device (which may comprise the decision module 225) generates the user profile 250, the door profile 255, and/or the machine learning model 260 as described above. According to the embodiments, the second measurements are generated by the door sensors 320 of the vehicle 100 as described above. However, according to the embodiments, the measurement module 220 transmits the second measurements over the network connection to the server computing device. The server computing device (which may comprise the decision module 225) identifies the user as being authorized (or unauthorized) to access the vehicle 100 using the above-described processes. Upon identifying the user as being authorized, the server computing device transmits a message over the network connection to the vehicle access module 230. Upon receiving the message, the vehicle access module 230 grants the user access to the cabin and/or trunk of the vehicle 100 (e.g., by unlocking the vehicle door 180). - Although the above-described technologies have been described above in the context of a single user, other possibilities are contemplated. According to embodiments, the
decision module 225 of the vehicle access system 170 performs the above-described processes to generate a second user profile for a second user of the vehicle 100 (e.g., a family member of the user, an employee who works at the same organization as the user, etc.). The decision module 225 may store the second user profile as part of the door profile 255. Additionally or alternatively, the decision module 225 may generate a second machine learning model for the second user using the above-described processes. The decision module 225 may utilize the second user profile and/or the second machine learning model to identify the second user as an authorized user of the vehicle 100. In this manner, multiple users may employ the above-described technologies in order to access the vehicle 100. - Although the above-described technologies have been described in a context of a single handle of a single vehicle, other possibilities are contemplated. According to embodiments, the
vehicle 100 is part of a fleet of shared vehicles, such as a taxi service. According to embodiments, the decision module 225 generates the user profile 250 and/or the machine learning model 260 as described above and transmits the user profile 250 and/or the machine learning model 260 to a server computing device, such as a server in a cloud-computing environment. Alternatively, the server computing device generates the user profile 250 and/or the machine learning model 260 using the first measurements as described above. In either case, the user profile 250 and/or the machine learning model 260 are stored within a datastore of the server computing device. According to the embodiments, during the second time period, the hand of the user makes contact with a second handle of a second vehicle, where the user is authorized to enter the second vehicle. In an example, the second vehicle is part of the fleet of shared vehicles and is of a similar make as the first vehicle. Sensor systems of the second vehicle generate second measurements as described above. The second vehicle transmits the second measurements to the server computing device. The server computing device identifies the user as being authorized (or unauthorized) to access the second vehicle using the above-described processes. Upon identifying the user as being authorized, the server computing device transmits a message over the network connection to the second vehicle. Upon receiving the message, the second vehicle grants the user access to the cabin and/or trunk of the second vehicle (e.g., by unlocking a vehicle door of the second vehicle). - According to embodiments, the
door sensors 320 begin to generate the second measurements when the handle 310 is moved from a rest position (i.e., a default position where the handle 310 rests when a hand of the user is not in contact with the handle 310) while being gripped by the hand of the user. When the handle 310 reaches a first position while being gripped by the hand of the user, the decision module 225 identifies the user as being authorized to access the vehicle 100 using the processes described above. In an example, the first position is located in a direction of a pulling force applied to the handle 310 by the user. When the handle 310 reaches a second position while being gripped by the hand of the user, the vehicle access module 230 transmits a signal to the lock/unlock mechanism 330 causing the vehicle door 180 to unlock. In an example, the second position is located in the direction of the pulling force applied by the user to the handle 310. In this manner, the vehicle access system 170 is able to provide seamless entry to the vehicle 100 from the user's perspective. - According to embodiments, the
handle 310 includes a fingerprint scanner. During the first time period, the fingerprint scanner generates a first fingerprint scan of digits of the user. The measurement module 220 may store the first fingerprint scan as part of the user profile 250. Additionally or alternatively, the decision module 225 may train the machine learning model 260 further based upon the first fingerprint scan. For instance, the decision module 225 may convert the first fingerprint scan into first values that can be used to train the machine learning model 260 (in addition to the first measurements described above). During the second time period, the fingerprint scanner generates a second fingerprint scan of the digits of the user. The decision module 225 may utilize the second fingerprint scan (in addition to the second measurements described above) to identify the user as being authorized to access the vehicle 100. For instance, the decision module 225 may convert the second fingerprint scan into second values that are provided as input to the machine learning model 260 (in addition to the second measurements described above). - It is contemplated that the
vehicle access system 170 may utilize other vehicle entry means in addition to those described above. In an example in which the decision module 225 mistakenly fails to identify the user as being authorized, the decision module 225 may rely on conventional approaches to grant the user access to the vehicle 100, such as wireless entry through a key fob, a mechanical key, and/or a passcode. - According to embodiments, the
decision module 225 may utilize data obtained from the rotation sensors 324, the temperature sensors 325, the optical sensors 326, the camera 327, and/or the miscellaneous sensors 328 in determining whether or not the user is authorized to access the vehicle 100. - Additional aspects of controlling access to an interior of an object, such as a cabin of a vehicle, based upon proprioception-related factors will be discussed in relation to
FIG. 10. FIG. 10 illustrates a flowchart of a method 1000 that is associated with controlling access to an interior of an object. The method 1000 will be discussed from the perspective of the vehicle access system 170 of FIGS. 1 and 2. While the method 1000 is discussed in combination with the vehicle access system 170, it should be appreciated that the method 1000 is not limited to being implemented within the vehicle access system 170; rather, the vehicle access system 170 is one example of a system that may implement the method 1000. - At 1010, the
vehicle access system 170 obtains first measurements that are based upon contact of a hand of a user with a handle located on an exterior (e.g., a door) of an object (e.g., a vehicle) over a first time period as the hand applies first force to the handle. The measurements may include static characteristics and/or dynamic characteristics as described above. - At 1020, the
vehicle access system 170 identifies the user as being authorized to access an interior (e.g., a cabin or a trunk) of the vehicle based upon second measurements. The second measurements are based upon contact of the hand of the user with the handle over a second time period as the hand applies second force to the handle. The second measurements may include the static characteristics and/or dynamic characteristics (taken during the second time period) as described above. The second time period occurs prior to the first time period. - At 1030, the
vehicle access system 170 grants the user access to the interior (e.g., the cabin or the trunk) of the object (e.g., the vehicle) based on the user being authorized. For instance, the vehicle access system 170 may unlock the door of the vehicle such that the user may enter the vehicle 100. In an example, the vehicle access system 170 grants the user access to the interior of the object responsive to the user being authorized. In another example, the vehicle access system 170 grants the user access to the interior of the object after a period of time elapses. -
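The three blocks of the method 1000 described above can be summarized in a short sketch. This is an illustrative outline only, not the claimed implementation; the helper names and stub callbacks below are hypothetical stand-ins for the sensor and decision logic:

```python
def method_1000(obtain_measurements, identify_user, grant_access):
    """Illustrative outline of method 1000 (hypothetical helper names).

    1010: obtain measurements based upon contact of the hand with the handle.
    1020: identify the user as authorized based upon the measurements.
    1030: grant access to the interior when the user is authorized.
    """
    measurements = obtain_measurements()      # block 1010
    authorized = identify_user(measurements)  # block 1020
    if authorized:                            # block 1030
        return grant_access()
    return "access denied"

# Toy usage with stub callbacks standing in for real sensor/decision logic
result = method_1000(
    obtain_measurements=lambda: [2.5, 1.8, 1.6, 1.1, 0.7],  # five digit pressures
    identify_user=lambda m: len(m) == 5,                    # stub decision check
    grant_access=lambda: "door unlocked",
)
print(result)  # prints "door unlocked"
```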
FIG. 1 will now be discussed in full detail as an example environment within which the system and methods disclosed herein may operate. In some instances, the vehicle 100 is configured to switch selectively between an autonomous mode, one or more semi-autonomous operational modes, and/or a manual mode. Such switching can be implemented in a suitable manner, now known or later developed. "Manual mode" means that all of or a majority of the navigation and/or maneuvering of the vehicle is performed according to inputs received from a user (e.g., human driver). In one or more arrangements, the vehicle 100 can be a conventional vehicle that is configured to operate in only a manual mode. - In one or more embodiments, the
vehicle 100 is an autonomous vehicle. As used herein, "autonomous vehicle" refers to a vehicle that operates in an autonomous mode. "Autonomous mode" refers to navigating and/or maneuvering the vehicle 100 along a travel route using one or more computing systems to control the vehicle 100 with minimal or no input from a human driver. In one or more embodiments, the vehicle 100 is highly automated or completely automated. In one embodiment, the vehicle 100 is configured with one or more semi-autonomous operational modes in which one or more computing systems perform a portion of the navigation and/or maneuvering of the vehicle along a travel route, and a vehicle operator (i.e., driver) provides inputs to the vehicle to perform a portion of the navigation and/or maneuvering of the vehicle 100 along a travel route. - As noted above, the
vehicle 100 can include the sensor system 120. The sensor system 120 can include one or more sensors. "Sensor" means any device, component and/or system that can detect and/or sense something. The one or more sensors can be configured to detect and/or sense in real-time. As used herein, the term "real-time" means a level of processing responsiveness that a user or system senses as sufficiently immediate for a particular process or determination to be made, or that enables the processor to keep up with some external process. - In arrangements in which the
sensor system 120 includes a plurality of sensors, the sensors can work independently from each other. Alternatively, two or more of the sensors can work in combination with each other. In such case, the two or more sensors can form a sensor network. The sensor system 120 and/or the one or more sensors can be operatively connected to the processor(s) 110 and/or another element of the vehicle 100 (including any of the elements shown in FIG. 1). The sensor system 120 can acquire data of at least a portion of the external environment of the vehicle 100 (e.g., nearby vehicles). - The
sensor system 120 can include any suitable type of sensor. Various examples of different types of sensors will be described herein. However, it will be understood that the embodiments are not limited to the particular sensors described. The sensor system 120 can include one or more vehicle sensors 121. The vehicle sensor(s) 121 can detect, determine, and/or sense information about the vehicle 100 itself. In one or more arrangements, the vehicle sensor(s) 121 can be configured to detect and/or sense position and orientation changes of the vehicle 100, such as, for example, based on inertial acceleration. In one or more arrangements, the vehicle sensor(s) 121 can include one or more accelerometers, one or more gyroscopes, an inertial measurement unit (IMU), a dead-reckoning system, a global navigation satellite system (GNSS), a global positioning system (GPS), a navigation system 147, and/or other suitable sensors. The vehicle sensor(s) 121 can be configured to detect and/or sense one or more characteristics of the vehicle 100. In one or more arrangements, the vehicle sensor(s) 121 can include a speedometer to determine a current speed of the vehicle 100. - Alternatively, or in addition, the
sensor system 120 can include one or more environment sensors 122 configured to acquire and/or sense driving environment data. "Driving environment data" includes data or information about the external environment in which an autonomous vehicle is located or one or more portions thereof. For example, the one or more environment sensors 122 can be configured to detect, quantify and/or sense obstacles in at least a portion of the external environment of the vehicle 100 and/or information/data about such obstacles. Such obstacles may be stationary objects and/or dynamic objects. The one or more environment sensors 122 can be configured to detect, measure, quantify and/or sense other things in the external environment of the vehicle 100, such as, for example, lane markers, signs, traffic lights, traffic signs, lane lines, crosswalks, curbs proximate the vehicle 100, off-road objects, etc. - Various examples of sensors of the
sensor system 120 will be described herein. The example sensors may be part of the one or more environment sensors 122 and/or the one or more vehicle sensors 121. However, it will be understood that the embodiments are not limited to the particular sensors described. - As an example, in one or more arrangements, the
sensor system 120 can include one or more radar sensors 123, one or more LIDAR sensors 124, one or more sonar sensors 125, and/or one or more cameras 126. In one or more arrangements, the one or more cameras 126 can be high dynamic range (HDR) cameras or infrared (IR) cameras. - The
vehicle 100 can include an input system 130. An "input system" includes any device, component, system, element or arrangement or groups thereof that enable information/data to be entered into a machine. The input system 130 can receive an input from a vehicle passenger (e.g., a driver or a passenger). The vehicle 100 can include an output system 135. An "output system" includes any device, component, or arrangement or groups thereof that enable information/data to be presented to a vehicle passenger (e.g., a person, a vehicle passenger, etc.). - The
vehicle 100 can include one or more vehicle systems 140. Various examples of the one or more vehicle systems 140 are shown in FIG. 1. However, the vehicle 100 can include more, fewer, or different vehicle systems. It should be appreciated that although particular vehicle systems are separately defined, each or any of the systems or portions thereof may be otherwise combined or segregated via hardware and/or software within the vehicle 100. The vehicle 100 can include a propulsion system 141, a braking system 142, a steering system 143, a throttle system 144, a transmission system 145, a signaling system 146, and/or a navigation system 147. Each of these systems can include one or more devices, components, and/or a combination thereof, now known or later developed. - The
navigation system 147 can include one or more devices, applications, and/or combinations thereof, now known or later developed, configured to determine the geographic location of the vehicle 100 and/or to determine a travel route for the vehicle 100. The navigation system 147 can include one or more mapping applications to determine a travel route for the vehicle 100. The navigation system 147 can include a global positioning system, a local positioning system or a geolocation system. - The
vehicle 100 can include one or more actuators 150. The actuators 150 can be any element or combination of elements operable to modify, adjust and/or alter one or more of the vehicle systems 140 or components thereof responsive to receiving signals or other inputs from the processor(s) 110 and/or the measurement module 220, the decision module 225, or the vehicle access module 230. Any suitable actuator can be used. For instance, the one or more actuators 150 can include motors, pneumatic actuators, hydraulic pistons, relays, solenoids, and/or piezoelectric actuators, just to name a few possibilities. - The
vehicle 100 can include one or more modules, at least some of which are described herein. The modules can be implemented as computer-readable program code that, when executed by a processor 110, implement one or more of the various processes described herein. One or more of the modules can be a component of the processor(s) 110, or one or more of the modules can be executed on and/or distributed among other processing systems to which the processor(s) 110 is operatively connected. The modules can include instructions (e.g., program logic) executable by one or more processor(s) 110. - In one or more arrangements, one or more of the modules described herein can include artificial or computational intelligence elements, e.g., neural network, fuzzy logic or other machine learning algorithms. Further, in one or more arrangements, one or more of the modules can be distributed among a plurality of the modules described herein. In one or more arrangements, two or more of the modules described herein can be combined into a single module.
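As a concrete, purely illustrative sketch of the module logic described earlier — arranging the per-digit pressure readings into an N by M array and identifying the user as authorized when a threshold number of measurements fall within the threshold ranges of the user profile 250 — consider the following. The function names, tolerance, match threshold, and sample values are assumptions for illustration, not part of the disclosure:

```python
import numpy as np

def build_measurement_array(per_sensor_readings):
    """Arrange readings into an N x M array: N sensors (one pressure
    sensor per digit of the hand), M timesteps sampled during the grip."""
    return np.array(per_sensor_readings, dtype=float)

def is_authorized(second, profile, tolerance, min_matches):
    """Count measurements within the threshold range of the corresponding
    profile measurement; authorize when the count reaches min_matches."""
    matches = np.sum(np.abs(second - profile) <= tolerance)
    return matches >= min_matches

# Calibration (first measurements) and access attempt (second measurements):
# 5 sensors x 3 timesteps; values are made up for illustration.
profile = build_measurement_array([
    [0.0, 1.2, 2.6],   # P1: thumb
    [0.0, 0.9, 1.9],   # P2: index finger
    [0.0, 0.8, 1.7],   # P3: middle finger
    [0.0, 0.6, 1.2],   # P4: ring finger
    [0.0, 0.4, 0.8],   # P5: little finger
])
second = profile + 0.05  # a close, but not identical, grip
print(is_authorized(second, profile, tolerance=0.2, min_matches=13))
```

A per-sensor tolerance array (instead of a single scalar) would model a profile that stores a distinct threshold range for each sensor and timestep; NumPy broadcasting handles both cases unchanged.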
- Detailed embodiments are disclosed herein. However, it is to be understood that the disclosed embodiments are intended only as examples. Therefore, specific structural and functional details disclosed herein are not to be interpreted as limiting, but merely as a basis for the claims and as a representative basis for teaching one skilled in the art to variously employ the aspects herein in virtually any appropriately detailed structure. Further, the terms and phrases used herein are not intended to be limiting but rather to provide an understandable description of possible implementations. Various embodiments are shown in
FIGS. 1-10, but the embodiments are not limited to the illustrated structure or application. - The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods, and computer program products according to various embodiments. In this regard, each block in the flowcharts or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved.
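As one concrete example of the logic behind such a flowchart block, the machine-learning path described earlier — providing the second measurements to a trained model and reading out at least one value indicating a match — might be sketched as follows. The disclosure does not fix a model form, so the logistic scorer, toy weights, and decision threshold below are illustrative assumptions only:

```python
import math

def model_output(weights, bias, measurements):
    """Hypothetical trained model: a single logistic unit over the
    flattened measurements, returning a value in (0, 1) where higher
    values indicate a closer match to calibration-time characteristics."""
    z = sum(w * m for w, m in zip(weights, measurements)) + bias
    return 1.0 / (1.0 + math.exp(-z))

# Toy "learned parameters" and an access-time measurement vector
# (5 sensors x 2 timesteps, flattened to length 10)
weights = [0.2] * 10
bias = -1.0
second_measurements = [1.0] * 10

value = model_output(weights, bias, second_measurements)
authorized = value >= 0.5  # illustrative decision threshold
```

A real deployment would learn the parameters from the calibration measurements (e.g., with a neural network, which the description names as one possibility) rather than fixing them by hand.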
- The systems, components and/or processes described above can be realized in hardware or a combination of hardware and software and can be realized in a centralized fashion in one processing system or in a distributed fashion where different elements are spread across several interconnected processing systems. Any kind of processing system or another apparatus adapted for carrying out the methods described herein is suited. A typical combination of hardware and software can be a processing system with computer-usable program code that, when being loaded and executed, controls the processing system such that it carries out the methods described herein. The systems, components and/or processes also can be embedded in a computer-readable storage, such as a computer program product or other data programs storage device, readable by a machine, tangibly embodying a program of instructions executable by the machine to perform methods and processes described herein. These elements also can be embedded in an application product which comprises all the features enabling the implementation of the methods described herein and, which when loaded in a processing system, is able to carry out these methods.
- Furthermore, arrangements described herein may take the form of a computer program product embodied in one or more computer-readable media having computer-readable program code embodied, e.g., stored, thereon. Any combination of one or more computer-readable media may be utilized. The computer-readable medium may be a computer-readable signal medium or a computer-readable storage medium. The phrase “computer-readable storage medium” means a non-transitory storage medium. A computer-readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer-readable storage medium would include the following: a portable computer diskette, a hard disk drive (HDD), a solid-state drive (SSD), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), a portable compact disc read-only memory (CD-ROM), a digital versatile disc (DVD), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer-readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
- Generally, modules as used herein include routines, programs, objects, components, data structures, and so on that perform particular tasks or implement particular data types. In further aspects, a memory generally stores the noted modules. The memory associated with a module may be a buffer or cache embedded within a processor, a RAM, a ROM, a flash memory, or another suitable electronic storage medium. In still further aspects, a module as envisioned by the present disclosure is implemented as an application-specific integrated circuit (ASIC), a hardware component of a system on a chip (SoC), as a programmable logic array (PLA), or as another suitable hardware component that is embedded with a defined configuration set (e.g., instructions) for performing the disclosed functions.
Program code embodied on a computer-readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber, cable, RF, etc., or any suitable combination of the foregoing. Computer program code for carrying out operations for aspects of the present arrangements may be written in any combination of one or more programming languages, including an object-oriented programming language such as Java™, Python, Smalltalk, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider).
- The terms “a” and “an,” as used herein, are defined as one or more than one. The term “plurality,” as used herein, is defined as two or more than two. The term “another,” as used herein, is defined as at least a second or more. The terms “including” and/or “having,” as used herein, are defined as comprising (i.e., open language). The phrase “at least one of . . . and . . . ” as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items. As an example, the phrase “at least one of A, B, and C” includes A only, B only, C only, or any combination thereof (e.g., AB, AC, BC or ABC).
- Aspects herein can be embodied in other forms without departing from the spirit or essential attributes thereof. Accordingly, reference should be made to the following claims, rather than to the foregoing specification, as indicating the scope hereof.
Claims (20)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US17/724,775 US11802426B1 (en) | 2022-04-20 | 2022-04-20 | Systems and methods for an identification panel to measure hand static and/or dynamic characteristics |
Publications (2)
Publication Number | Publication Date |
---|---|
US20230340806A1 true US20230340806A1 (en) | 2023-10-26 |
US11802426B1 US11802426B1 (en) | 2023-10-31 |
Family
ID=88416203
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/724,775 Active US11802426B1 (en) | 2022-04-20 | 2022-04-20 | Systems and methods for an identification panel to measure hand static and/or dynamic characteristics |
Country Status (1)
Country | Link |
---|---|
US (1) | US11802426B1 (en) |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20100109838A1 (en) * | 2008-09-08 | 2010-05-06 | Sherard Fisher | Fingerprint unlocking system |
US9460575B2 (en) * | 2013-12-05 | 2016-10-04 | Lg Electronics Inc. | Vehicle control apparatus and method thereof |
US20180196988A1 (en) * | 2017-01-06 | 2018-07-12 | Qualcomm Incorporated | Progressive multiple fingerprint enrollment and matching, and dynamic user account transitions |
US10967837B1 (en) * | 2018-12-21 | 2021-04-06 | United Services Automobile Association (Usaa) | Security device using sequences of fingerprints |
Family Cites Families (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5920640A (en) | 1997-05-16 | 1999-07-06 | Harris Corporation | Fingerprint sensor and token reader and associated methods |
JP3900893B2 (en) | 2001-11-02 | 2007-04-04 | ソニー株式会社 | Steering device, driver authentication method, automobile |
US7102507B1 (en) | 2004-07-21 | 2006-09-05 | Givi Lauren | Keyless entry system |
WO2011004372A1 (en) | 2009-07-07 | 2011-01-13 | Tracktec Ltd | Driver profiling |
JP4900460B2 (en) | 2009-12-14 | 2012-03-21 | 株式会社デンソー | Wheel grip detection device for steering device, program |
US8881347B2 (en) | 2012-08-24 | 2014-11-11 | Feinstein Patents Llc | Vibration and pressure damping device for gripping handles and steering mechanisms |
US8909428B1 (en) * | 2013-01-09 | 2014-12-09 | Google Inc. | Detecting driver grip on steering wheel |
AU2015250829B2 (en) * | 2014-04-23 | 2020-11-05 | Novomatic Ag | Arrangement and method for identifying fingerprints |
US9688271B2 (en) * | 2015-03-11 | 2017-06-27 | Elwha Llc | Occupant based vehicle control |
EP3106594A1 (en) * | 2015-06-16 | 2016-12-21 | U-Shin Italia S.p.A. | Handle for a vehicle door |
FR3047219B1 (en) | 2016-01-29 | 2022-03-11 | Daniel Moulene | AUTOMATIC TRANSPORT SYSTEM |
US10316966B2 (en) * | 2016-12-15 | 2019-06-11 | Dura Operating, Llc | Biometric shifter for a vehicle |
US10846391B1 (en) * | 2017-04-24 | 2020-11-24 | Architecture Technology Corporation | Secure authentication using fast authentication factors |
US10229309B2 (en) | 2017-05-12 | 2019-03-12 | Dura Operations, Llc | Biometric control member |
DE102017124568A1 (en) * | 2017-10-20 | 2019-04-25 | Dr. Ing. H.C. F. Porsche Aktiengesellschaft | Outside door handle assembly for a motor vehicle |
DE102017130029A1 (en) * | 2017-12-14 | 2019-06-19 | Huf Hülsbeck & Fürst Gmbh & Co. Kg | Door handle arrangement of a vehicle door |
CN208686236U (en) | 2018-07-12 | 2019-04-02 | 河南森源重工有限公司 | A kind of door handle for vehicle and car door assembly and vehicle with physiological characteristic identification |
Also Published As
Publication number | Publication date |
---|---|
US11802426B1 (en) | 2023-10-31 |
Similar Documents
Publication | Title |
---|---|
US11052874B2 (en) | Recognizing authorized vehicle user with movement data |
US10990099B2 (en) | Motion planning methods and systems for autonomous vehicle |
US10586254B2 (en) | Method and system for adaptive vehicle control in autonomous vehicles |
US10369966B1 (en) | Controlling access to a vehicle using wireless access devices |
US10970747B2 (en) | Access and control for driving of autonomous vehicle |
US9963106B1 (en) | Method and system for authentication in autonomous vehicles |
JP4459735B2 (en) | Product explanation robot |
JP2019040465A (en) | Behavior recognition device, learning device, and method and program |
US11054818B2 (en) | Vehicle control arbitration |
US20200070777A1 (en) | Systems and methods for a digital key |
CN108216241A (en) | Determine the control characteristic of autonomous land vehicle |
US20170185763A1 (en) | Camera-based detection of objects proximate to a vehicle |
US20200159233A1 (en) | Memory-Based Optimal Motion Planning With Dynamic Model For Automated Vehicle |
US11572039B2 (en) | Confirmed automated access to portions of vehicles |
JP4611675B2 (en) | Customer service robot |
Haid et al. | Inertial-based gesture recognition for artificial intelligent cockpit control using hidden Markov models |
US11802426B1 (en) | Systems and methods for an identification panel to measure hand static and/or dynamic characteristics |
CN107134191A (en) | Vehicle operator training system |
US11752974B2 (en) | Systems and methods for head position interpolation for user tracking |
US11603119B2 (en) | Method and apparatus for out-of-distribution detection |
US20240095317A1 (en) | Method and System For Device Access Control |
US20240054838A1 (en) | Method and System for Device Access Control |
US11912234B2 (en) | Enhanced biometric authorization |
US20230177900A1 (en) | Enhanced biometric authorization |
US12010114B2 (en) | Delayed biometric authorization |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| FEPP | Fee payment procedure | Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
| AS | Assignment | Owner name: WOVEN PLANET NORTH AMERICA, INC., CALIFORNIA; Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:BABAEV, ISLAM;REEL/FRAME:059677/0850; Effective date: 20220419 |
| AS | Assignment | Owner name: WOVEN BY TOYOTA, U.S., INC., CALIFORNIA; Free format text: CHANGE OF NAME;ASSIGNOR:WOVEN PLANET NORTH AMERICA, INC.;REEL/FRAME:064065/0601; Effective date: 20230322 |
| STCF | Information on status: patent grant | Free format text: PATENTED CASE |