DE102017101343A1 - Systems and methods for vehicle system control based on physiological characteristics - Google Patents

Systems and methods for vehicle system control based on physiological characteristics

Info

Publication number
DE102017101343A1
Authority
DE
Germany
Prior art keywords
system
physiological
vehicle
data
user
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
DE102017101343.1A
Other languages
German (de)
Inventor
Peng Lu
Xiaosong Huang
Joseph F. Szczerba
Tricia E. Neiiendam
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
GM Global Technology Operations LLC
Original Assignee
GM Global Technology Operations LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority to US62/287,422 priority Critical
Priority to US15/410,582 priority patent/US10137777B2/en
Application filed by GM Global Technology Operations LLC filed Critical GM Global Technology Operations LLC
Publication of DE102017101343A1 publication Critical patent/DE102017101343A1/en

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R21/00Arrangements or fittings on vehicles for protecting or preventing injuries to occupants or pedestrians in case of accidents or other traffic risks
    • B60R21/01Electrical circuits for triggering passive safety arrangements, e.g. airbags, safety belt tighteners, in case of vehicle accidents or impending vehicle accidents
    • B60R21/015Electrical circuits for triggering passive safety arrangements, e.g. airbags, safety belt tighteners, in case of vehicle accidents or impending vehicle accidents including means for detecting the presence or position of passengers, passenger seats or child seats, and the related safety parameters therefor, e.g. speed or timing of airbag inflation in relation to occupant position or seat belt use
    • B60R21/01512Passenger detection systems
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R16/00Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for
    • B60R16/02Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for electric constitutive elements
    • B60R16/023Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for electric constitutive elements for transmission of signals between vehicle parts or subsystems
    • B60R16/0231Circuits relating to the driving or the functioning of the vehicle
    • B60R16/0232Circuits relating to the driving or the functioning of the vehicle for measuring vehicle parameters and indicating critical, abnormal or dangerous conditions
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Detecting, measuring or recording for diagnostic purposes; Identification of persons
    • A61B5/16Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
    • A61B5/18Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state for vehicle drivers or machine operators

Abstract

Systems and methods are provided for controlling a vehicle based on a physiological feature. The method comprises: receiving physiological data from one or more physiological sensors; processing the received physiological data by a processor to determine one or more physiological conditions; and outputting, based on the determined physiological condition, one or more control signals to a vehicle system to control operation of the vehicle system.

Description

  • CROSS-REFERENCE TO RELATED APPLICATIONS
  • The present application claims the benefit of US Provisional Application No. 62/287,422, filed January 26, 2016, and is a continuation-in-part of US Application No. 15/342,451, filed November 3, 2016, which claims the benefit of US Provisional Application No. 62/250,180, filed November 3, 2015. Each of the above-referenced applications is incorporated by reference into the present application.
  • TECHNICAL FIELD
  • The present disclosure relates generally to vehicles, and more particularly, to systems and methods for controlling vehicle systems based on one or more physiological features of a user.
  • BACKGROUND
  • An autonomous vehicle is a vehicle that can sense its environment and navigate with little or no user input. An autonomous vehicle senses its environment using sensing devices such as radar, lidar, image sensors, and the like. The autonomous vehicle system also uses information from global positioning system (GPS) technology, navigation systems, vehicle-to-vehicle communication, vehicle-to-infrastructure technology, and/or drive-by-wire systems to navigate the vehicle.
  • Vehicle automation has been divided into numerical vehicle levels ranging from zero, corresponding to no automation with full human control, to five, corresponding to complete automation without human control. Various automated driver assistance systems, such as cruise control, adaptive cruise control, and park assist systems, correspond to lower levels of automation, whereas true "driverless" vehicles correspond to higher levels of automation.
  • A vehicle may include one or more automated driver assistance systems of a lower level of automation that require user input for operation. In certain cases, however, the user cannot provide direct input, for example as a result of the current driving situation or the user's state of health. In other cases, for convenience, the user may simply want to operate the vehicle or a vehicle system indirectly, without intentional user interaction.
  • It is therefore desirable to provide systems and methods for controlling a vehicle system without intentional user input. Further, it is desirable to provide systems and methods for controlling a vehicle system based on one or more physiological features of a user. In addition, other desirable features and characteristics of the present disclosure will become apparent from the following detailed description and the appended claims, taken in conjunction with the accompanying drawings and the foregoing technical field and background.
  • SUMMARY
  • The object of the present invention is achieved by the subject-matter of the independent claims; further embodiments are given in the dependent claims.
  • Systems and methods for controlling a vehicle are provided. In one embodiment, a method for controlling a vehicle based on a physiological feature comprises: receiving physiological data from one or more physiological sensors; processing the received physiological data by a processor to determine one or more physiological conditions; and outputting, based on the determined physiological condition, one or more control signals to a vehicle system for controlling an operation of the vehicle system.
  • In various embodiments, the method further comprises: processing the received physiological data by the processor to generate a baseline for a physiological measurement; processing subsequently received physiological data by the processor to determine a physiological change; and outputting, based on the determined physiological change, one or more control signals to the vehicle system for controlling the operation of the vehicle system. The method further comprises retrieving a setting associated with the determined physiological change, wherein the output of the one or more control signals to the vehicle system is based on the setting. The setting is a custom setting or a predefined default setting. The method further comprises issuing a message to an external authority system based on the determined physiological condition. The method further comprises outputting at least one of a message and a GPS location of the vehicle based on the determined physiological condition. The method includes issuing a prompt based on the determined physiological condition, and determining, by the processor, whether a user response has been received, wherein the output of the one or more control signals to the vehicle system is based on the user response. The determination by the processor as to whether the user response has been received further comprises: receiving sensor data from one or more reaction sensors; processing the sensor data to determine whether the user made a gesture; and determining that the response has been received based on the detection of the gesture.
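The baseline-and-change logic described above can be sketched as follows. This is an illustrative sketch only, not the patent's implementation; the class name, window size, threshold, and the example settings table are all invented for illustration.

```python
from collections import deque

class PhysiologicalChangeDetector:
    """Illustrative sketch: build a baseline from an initial window of
    physiological samples, then flag later samples that deviate from it."""

    def __init__(self, baseline_window=60, threshold_ratio=0.2):
        self.samples = deque(maxlen=baseline_window)
        self.baseline = None
        self.threshold_ratio = threshold_ratio

    def add_sample(self, value):
        """Feed one sensor reading; return a detected change or None."""
        if self.baseline is None:
            self.samples.append(value)
            if len(self.samples) == self.samples.maxlen:
                # Baseline is the mean of the first full window of samples.
                self.baseline = sum(self.samples) / len(self.samples)
            return None
        deviation = (value - self.baseline) / self.baseline
        if abs(deviation) >= self.threshold_ratio:
            return {"baseline": self.baseline, "value": value,
                    "deviation": deviation}
        return None

# A detected change maps to a stored setting (custom or default), which in
# turn drives the control signals sent to a vehicle system (values invented).
SETTINGS = {"elevated": {"hvac_temp_c": 20}, "depressed": {"hvac_temp_c": 24}}

def setting_for(change):
    return SETTINGS["elevated" if change["deviation"] > 0 else "depressed"]
```

A change exceeding the threshold would then be looked up in the settings table, and the associated setting would determine the control signals output to the vehicle system.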
  • In one embodiment, a system for controlling a vehicle based on a physiological feature is provided. The system includes a source of physiological data pertaining to a user of the vehicle. The system further includes a control module with a processor that processes the physiological data and outputs one or more control signals to a vehicle control system to operate the vehicle autonomously based on the physiological data, and/or outputs one or more control signals to the vehicle control system to issue commands to an HVAC system and/or a seating system and/or an infotainment system and/or a locking system and/or a lighting system and/or a window system and/or an alarm system based on the physiological data.
  • The source of physiological data is a personal device associated with the user. The processor processes the physiological data to generate a baseline of a physiological indication, processes subsequently received physiological data to determine a physiological change, and outputs the one or more control signals to the vehicle control system based on the determined physiological change. The processor retrieves a setting associated with the determined physiological change and outputs the one or more control signals to the vehicle control system based on the setting. The setting is a user-defined setting or a predetermined default setting. The processor processes the received physiological data to determine one or more physiological conditions, and outputs one or more control signals to the vehicle control system based on the determined physiological condition. The processor issues a message to an external authority system based on the determined physiological condition. The processor outputs at least one of a message and a GPS location of the vehicle based on the determined physiological condition. The processor issues a prompt based on the determined physiological condition and determines whether a user response has been received, and the control module outputs the one or more control signals to the at least one vehicle system based on the user response. The processor determines whether the user response has been received based on sensor data received from one or more reaction sensors, and processes the sensor data to determine whether the user made a gesture. The processor determines that the user response has been received based on the detection of the gesture.
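The prompt-and-response flow described above (issue a prompt, watch the reaction sensors for a gesture, act on the result) can be sketched as follows; `read_gesture`, the gesture labels, and the timeout are hypothetical, not from the patent.

```python
import time

def confirm_via_gesture(prompt, read_gesture, timeout_s=5.0, poll_s=0.5):
    """Illustrative sketch of the prompt/response flow: issue a prompt,
    poll a reaction sensor for a recognized gesture, and report whether
    the user responded. `read_gesture` is a hypothetical callable that
    returns a gesture label (e.g. "nod", "wave") or None."""
    print(prompt)
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        gesture = read_gesture()
        if gesture is not None:
            # A recognized gesture counts as a received user response.
            return gesture
        time.sleep(poll_s)
    return None  # no response: the caller falls back to a default action
```

The returned gesture (or its absence) would then decide which control signals the control module sends to the vehicle system.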
  • In one embodiment, a portable physiology device is provided. The portable physiology device has at least one physiological sensor that observes at least one physiological condition associated with a wearer of the physiology device and generates sensor signals based thereon. The portable physiology device further includes a control module with a processor that processes the sensor signals and outputs them to a system associated with the vehicle.
  • The portable physiology device is a portable electronic device selected from the following group: a wristwatch, a ring, an earring, a bracelet, a cufflink, a necklace, a tie, glasses, a chest band, smart clothing, and combinations thereof.
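A minimal sketch of such a wearable's data path follows, assuming a simple callable transport (a real device would use a radio link such as Bluetooth LE); `WearableDevice` and the message format are invented for illustration.

```python
class WearableDevice:
    """Illustrative sketch of the portable physiology device: sample an
    attached physiological sensor and forward the readings to a
    vehicle-side consumer. `sensor` is a callable returning one reading;
    `transport` is a hypothetical send callable."""

    def __init__(self, sensor, transport):
        self.sensor = sensor
        self.transport = transport

    def run(self, n_samples):
        # Forward each reading as a small typed message the vehicle-side
        # physiological control system can process.
        for _ in range(n_samples):
            reading = self.sensor()
            self.transport({"type": "heart_rate", "bpm": reading})
```

On the vehicle side, these messages would feed the physiological control system as the "source of physiological data" described above.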
  • DESCRIPTION OF THE DRAWINGS
  • The exemplary embodiments will now be described in conjunction with the following drawing figures, in which like numerals denote like elements, wherein:
  • FIG. 1 is a functional block diagram illustrating an autonomous vehicle having a physiological control system, according to various embodiments;
  • FIG. 2 is a functional block diagram illustrating a transportation system having one or more autonomous vehicles of FIG. 1, according to various embodiments;
  • FIG. 3 is a data flow diagram illustrating an autonomous driving system having the physiological control system of the autonomous vehicle, according to various embodiments;
  • FIG. 4 is a data flow diagram illustrating a control system of the physiological control system of the autonomous vehicle of FIG. 1, according to various embodiments;
  • FIG. 5 is a flowchart illustrating a control method of the physiological control system of FIG. 1, according to various embodiments;
  • FIG. 6 is a flowchart illustrating a control method of the physiological control system of FIG. 1, according to various embodiments;
  • FIG. 7 is a flowchart illustrating a control method of the physiological control system of FIG. 1, according to various embodiments;
  • FIG. 8 schematically illustrates an example of a computer architecture, according to various embodiments;
  • FIG. 9 illustrates examples of memory components of the computer architecture of FIG. 8, according to various embodiments;
  • FIG. 10 shows an example of a portable device worn by a user and examples of user movements, according to various embodiments;
  • FIG. 11 illustrates an example of a method, according to various embodiments; and
  • FIG. 12 illustrates examples of system inputs and outputs, according to various embodiments.
  • DETAILED DESCRIPTION
  • The following detailed description is merely exemplary in nature and is not intended to limit the application and its uses. Furthermore, there is no intention to be bound by any expressed or implied theory presented in the preceding technical field, background, brief summary, or the following detailed description. The term module as used herein refers to any hardware, software, firmware, electronic control component, processing logic, and/or processor device, individually or in any combination, including, without limitation, an application specific integrated circuit (ASIC), an electronic circuit, a processor (shared, dedicated, or group) and memory that executes one or more software or firmware programs, a combinational logic circuit, and/or other suitable components that provide the described functionality.
  • Embodiments of the present disclosure may be described herein by way of functional and / or logical block components and various processing steps. It should be understood that such block components may be implemented by any number of hardware, software and / or firmware components configured to perform the specified functions. For example, one embodiment of the present disclosure may employ various integrated circuit components, such as memory elements, digital signal processing elements, logic elements, lookup tables, and the like, that may perform various functions controlled by one or more microprocessors or other controllers. Further, those skilled in the art will recognize that embodiments of the present disclosure may be practiced in conjunction with any number of systems, and that the vehicle system described herein represents only one exemplary embodiment of the present disclosure.
  • For the sake of brevity, conventional techniques relating to signal processing, data transmission, signaling, control, and other functional aspects of the systems (and the individual operating components of the systems) may not be described in detail herein. Furthermore, the connecting lines shown in the various figures are intended to represent examples of functional relationships and/or physical couplings between the various elements. It should be understood that numerous alternative or additional functional relationships or physical connections may be present in an embodiment of the present disclosure.
  • In FIG. 1, a physiological control system, generally designated 100, is associated with a vehicle 10 according to various embodiments. In general, the physiological control system 100 receives an input from a physiology monitor 102 and intelligently controls the vehicle 10 on this basis.
  • As shown in FIG. 1, the vehicle 10 generally includes a chassis 12, a body 14, front wheels 16, and rear wheels 18. The body 14 is arranged on the chassis 12 and substantially encloses components of the vehicle 10. The body 14 and the chassis 12 may jointly form a frame. The wheels 16-18 are each rotatably coupled to the chassis 12 near a respective corner of the body 14.
  • In various embodiments, the vehicle 10 is an autonomous vehicle, and the physiological control system 100 is incorporated into or associated with the autonomous vehicle 10 (hereinafter referred to as the autonomous vehicle 10). The autonomous vehicle 10 is, for example, a vehicle that is automatically controlled to carry passengers from one place to another. The autonomous vehicle 10 is depicted in the illustrated embodiment as a passenger vehicle; however, it should be appreciated that any other vehicle may be employed, including motorcycles, trucks, sport utility vehicles (SUVs), recreational vehicles (RVs), watercraft, aircraft, etc. In an exemplary embodiment, the autonomous vehicle is a so-called level three, level four, or level five automation system. A level three system indicates "conditional automation," referring to the driving-mode-specific performance by an automated driving system of all aspects of the dynamic driving task, with the expectation that the human driver will respond appropriately to a request to intervene. A level four system indicates "high automation," referring to the driving-mode-specific performance by an automated driving system of all aspects of the dynamic driving task, even if the human driver does not respond appropriately to a request to intervene. A level five system indicates "full automation," referring to the full-time performance by an automated driving system of all aspects of the dynamic driving task under all roadway and environmental conditions that can be managed by a human driver. It is noted, however, that the physiological control system 100 may also, if desired, be coupled to or built into a lower-level automation system.
  • As shown, the vehicle 10 generally includes a drive system 20, a transmission system 22, a steering system 24, a braking system 26, a sensor system 28, an actuator system 30, at least one data storage device 32, at least one controller 34, and a communication system 36. The drive system 20 may, in various embodiments, include an internal combustion engine, an electric machine such as a traction motor, and/or a fuel cell propulsion system. The transmission system 22 is configured to transmit power from the drive system 20 to the wheels 16-18 according to selectable gear ratios. In various embodiments, the transmission system 22 may include a step-ratio automatic transmission, a continuously variable transmission, or any other suitable transmission. The braking system 26 is configured to provide braking torque to the wheels 16-18 and/or the transmission system 22. The braking system 26 may, in various embodiments, include friction brakes, brake-by-wire, a regenerative braking system such as an electric machine, and/or other suitable braking systems. The steering system 24 influences the course of the autonomous vehicle 10, for example by adjusting a position of the wheels 16-18. While a steering wheel is depicted for purposes of illustration, in some embodiments within the scope of the present disclosure the steering system 24 may not include a steering wheel.
  • The vehicle 10 further includes one or more systems for user comfort and convenience, including, but not limited to, a heating, ventilation, and air conditioning (HVAC) system 21, a seating system 23, an infotainment system 25, a locking system 27, a lighting system 29, a window system 31, an alarm system 33, etc. In various embodiments, the HVAC system 21 includes a motor coupled to a blower, the motor being operated to drive the blower to direct air through a condenser (for cooled air) and/or a heater (for heated air) so that a desired ambient air temperature is achieved in the vehicle 10. The condenser and heater may also be operated to cool or heat the airflow from the blower. In various embodiments, the seating system 23 is associated with one or more seat surfaces or seats of the vehicle 10, and includes one or more motors that are operated to move the respective seat in various directions, including, but not limited to, forward and rearward, up and down, tilting, etc. The seating system 23 may further include a motor that is actuated to provide increased or decreased lumbar support, and may also include a heater coil operated to supply heat to one or more seats (e.g., seat heating). In various embodiments, the seating system 23 may also include a cooling control valve operated to supply cooled air from the HVAC system 21 to one or more seats, so that the seat surface(s) are cooled (e.g., seat cooling).
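The blower/condenser/heater behavior described for the HVAC system 21 amounts to a simple closed-loop temperature control. A minimal bang-bang sketch follows; the function name, deadband, and mode labels are illustrative, not from the patent.

```python
def hvac_step(cabin_temp_c, target_temp_c, deadband_c=0.5):
    """One control step of an illustrative bang-bang HVAC loop: decide
    whether blower air should be routed through the cooling path, the
    heating path, or left unchanged."""
    if cabin_temp_c > target_temp_c + deadband_c:
        return "cool"   # route blower air through the condenser path
    if cabin_temp_c < target_temp_c - deadband_c:
        return "heat"   # route blower air through the heater path
    return "hold"       # within the deadband: no change
```

In the context of the disclosure, the target temperature could itself be the setting retrieved for a determined physiological condition or change.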
  • In various embodiments, the infotainment system 25 includes a display 25a and one or more input devices 25b. The infotainment system 25 may also include other user entertainment options, including, but not limited to, a radio, a DVD player, etc. The infotainment system 25 communicates with the physiological control system 100 and provides input about the interaction of a user with the one or more input devices 25b to the physiological control system 100. The display 25a generally includes any display that can be implemented in a dashboard of the vehicle 10, such as a flat panel display, a curved or otherwise shaped display, a projection display, a virtual 3D display, etc. The display 25a may use any suitable technique for displaying information, including, but not limited to, a liquid crystal display (LCD), an organic light emitting diode (OLED) display, a plasma display, or a cathode ray tube (CRT). The input device 25b includes any device for receiving input and/or commands from the user (e.g., a portable device, a gesture detection device, a microphone, buttons, a keyboard, etc.), and the input device 25b may include a touch screen layer associated with the display 25a.
  • In various embodiments, the locking system 27 includes one or more lock actuators operable to lock or unlock one or more doors provided on the vehicle 10. It should be noted that the locking system 27 may also include one or more lock actuators operable to lock or unlock a tailgate or trunk access door provided on the vehicle 10. In certain embodiments, the locking system 27 may also include one or more lock actuators operable to lock or unlock a respective one or more doors, liftgates, or trunk latches provided on the vehicle 10.
  • The lighting system 29 includes one or more light emitting devices within a passenger compartment of the vehicle 10 or outside the passenger compartment of the vehicle 10 (for example, headlights and taillights). In various embodiments, the lighting system 29 may be operated to turn on one or more lamps outside the passenger compartment of the vehicle 10, or to flash the lights outside the passenger compartment of the vehicle 10. Furthermore, the lighting system 29 may be operated to change the color of the light illuminating the passenger compartment of the vehicle 10 or its surroundings. In various embodiments, the window system 31 includes one or more motors, each of which is associated with a drive system of a respective window of the vehicle 10. The one or more motors are operated to move the respective window between an open position and a closed position. In various embodiments, the alarm system 33 includes a siren or other output device operated to produce a loud noise.
  • The sensor system 28 includes one or more sensors or transducer devices 40a, 40b, ..., 40n that detect observable conditions of the exterior environment, as well as of the interior environment and/or the operating state of the autonomous vehicle 10. The transducer devices 40a, 40b, ..., 40n may include, but are not limited to, radars, lidars, global positioning systems, optical cameras, thermal cameras, ultrasonic sensors and/or other sensors, communication connectors, or GPS antennas. The actuator system 30 includes one or more actuator devices 42a, 42b, ..., 42n that control one or more vehicle features, components, systems, and/or functions, including, but not limited to, the drive system 20, the transmission system 22, the steering system 24, the braking system 26, the HVAC system 21, the seating system 23, the infotainment system 25, the locking system 27, the lighting system 29, the window system 31, and the alarm system 33.
  • The data storage device 32 stores data for use in the automatic control of the autonomous vehicle 10. In various embodiments, the data storage device 32 stores defined maps of the navigable environment. In various embodiments, the defined maps may be predefined and obtained from a remote system (described in further detail with reference to FIG. 2). For example, the defined maps may be collected by the remote system and communicated to the autonomous vehicle 10 (wirelessly and/or by wire) and stored in the data storage device 32. In this context, the data storage device 32 may be part of the controller 34, provided separately from the controller 34, or part of the controller 34 and part of a separate system.
  • The controller 34 includes at least one processor 44 and a computer-readable storage device or medium 46. The processor 44 can be any custom-made or commercially available processor, a central processing unit (CPU), a graphics processing unit (GPU), an auxiliary processor among several processors associated with the controller 34, a semiconductor-based microprocessor (in the form of a microchip or chip set), a macroprocessor, any combination thereof, or generally any device for executing instructions. The computer-readable storage device or medium 46 may include volatile and nonvolatile storage in read-only memory (ROM), random-access memory (RAM), and keep-alive memory (KAM), these being examples only. KAM is a persistent or nonvolatile memory that may be used to store various operating variables while the processor 44 is powered down. The computer-readable storage device or medium 46 may be implemented using any of a number of known memory devices such as PROMs (programmable read-only memory), EPROMs (electrically PROM), EEPROMs (electrically erasable PROM), flash memory, or any other electric, magnetic, optical, or combination memory devices capable of storing data, some of which represent executable instructions used by the controller 34 in controlling the autonomous vehicle 10.
  • The instructions may include one or more separate programs, each of which comprises an ordered list of executable instructions for implementing logical functions. When executed by the processor 44, the instructions receive and process signals from the sensor system 28, perform logic, calculations, methods and/or algorithms for automatically controlling the components of the autonomous vehicle 10, and generate control signals to the actuator system 30 to automatically control the components of the autonomous vehicle 10 based on the logic, calculations, methods, and/or algorithms. Although only one controller 34 is shown in FIG. 1, embodiments of the autonomous vehicle 10 may include any number of controllers 34 that communicate via any suitable communication medium or combination of communication media and cooperate to process the sensor signals, perform logic, calculations, methods, and/or algorithms, and generate control signals for automatically controlling features of the autonomous vehicle 10.
  • In various embodiments, one or more instructions of the controller 34 are embodied in the physiological control system 100 and, when executed by the processor 44, cause the processor 44 to process an input received from the physiological monitor 102 and to output one or more control signals to the actuator system 30 to control one or more of the vehicle features, components, systems, and/or functions of the vehicle 10 based on the input received from the physiological monitor 102. For example, the processor 44 may process the input received from the physiological monitor 102 and output one or more control signals to the actuator system 30 to operate the vehicle 10 autonomously based on one or more sensor signals or inputs received from the physiological monitor 102. The processor 44 may also process the input received from the physiological monitor 102 and output one or more control signals to the actuator system 30 to control one or more of the HVAC system 21, the seating system 23, the infotainment system 25, the locking system 27, the lighting system 29, the window system 31, and the alarm system 33.
  • The communication system 36 is configured to wirelessly communicate information to and from other entities 48, for example other vehicles ("V2V" communication), infrastructure ("V2I" communication), remote systems, and/or personal devices (described in more detail with reference to FIG. 2). In an exemplary embodiment, the communication system 36 is a wireless communication system configured to communicate over a wireless local area network (WLAN) using standards such as IEEE 802.11, Bluetooth®, or using cellular data communication. However, additional or alternative communication methods, such as a dedicated short-range communications (DSRC) channel, are also intended to be included within the scope of the present disclosure. DSRC channels are one-way or two-way short- to medium-range wireless communication channels designed specifically for automotive use, together with a corresponding set of protocols and standards. The communication system 36 may also be configured to encode data or generate encoded data. The encoded data generated by the communication system 36 may be encrypted. A security key may be used to decrypt and decode the encoded data, as understood by those skilled in the art. The security key may be a "password" or any other arrangement of data, a fingerprint, an eye print, face recognition, or DNA recognition that allows the encoded data to be decrypted.
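The encode/security-key flow described above can be illustrated with a keyed message check. This sketch shows only keyed verification using HMAC from Python's standard library; a real system would additionally encrypt the message body with a vetted cipher (e.g., AES-GCM), which the standard library does not provide. All names are illustrative, not from the patent.

```python
import hashlib
import hmac
import json

def encode_message(payload: dict, security_key: bytes) -> bytes:
    """Serialize a message and prepend a keyed MAC so the receiver can
    verify it with the shared security key (illustrative sketch only)."""
    body = json.dumps(payload, sort_keys=True).encode()
    tag = hmac.new(security_key, body, hashlib.sha256).digest()
    return tag + body

def decode_message(blob: bytes, security_key: bytes) -> dict:
    """Verify the MAC with the security key, then decode the body."""
    tag, body = blob[:32], blob[32:]
    expected = hmac.new(security_key, body, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):
        raise ValueError("security key mismatch or tampered message")
    return json.loads(body)
```

Only a holder of the correct security key can successfully verify and decode the message, mirroring the keyed decryption described above.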
• The vehicle 10 also includes one or more reaction sensors 39. The one or more reaction sensors 39 are in communication with the physiological control system 100 embedded in the controller 34. The one or more reaction sensors 39 observe a user of the vehicle 10 and generate sensor signals based thereon. In various embodiments, the one or more reaction sensors 39 observe a reaction or gesture of the user and generate sensor signals based thereon. The one or more reaction sensors 39 include, but are not limited to, a camera, a radar, wired gloves, and wearable devices, including, but not limited to, wearable electronic devices in the form of wristwatches, rings, earrings, bracelets, cufflinks, necklaces, neckties, eyeglasses, ribbons, smart apparel, etc., that are configured to observe one or more gestures or movements made by the user. Based on the sensor signals from the reaction sensors 39, the physiological control system 100 controls one or more systems of the vehicle 10, such as the HVAC system 21, the seat system 23, the infotainment system 25, the locking system 27, the lighting system 29, the window system 31, the alarm system 33, and an autonomous driving system 200. In other words, one or more movements or gestures of the user of the vehicle 10 are observed and processed by the physiological control system 100 to control one or more of the vehicle systems or the operation of the vehicle 10 itself.
• In some embodiments, the reaction sensors 39 include at least one camera 39' and at least one range sensor. The range sensor may include a short-range radar (SRR), an ultrasonic sensor, a long-range radar, or a light detection and ranging (LiDAR) sensor. The camera 39' and the range sensor may comprise one of the sensors 40a, 40b, ..., 40n of the sensor system 28. The schematically illustrated camera 39' represents one or more cameras arranged at any desired or suitable location of the vehicle 10, for example on or near the side mirrors of the vehicle, on the door handles, on a trunk lid, adjacent the headlights and/or taillights of the vehicle, within a passenger compartment of the vehicle 10, etc. Each camera 39' is configured to detect the presence of a user and, in some embodiments, user movement. Each camera is movable, for example moved automatically by an actuator controlled by a computer-based system 700 (FIG. 8) or by the controller 34, to track a user moving in the vicinity of the vehicle. The cameras may be used together with other sensors, such as laser motion detection sensors, to detect a user's gestures. Sensors that detect movement of a user, including gestures, may be oriented in any of various directions without departing from the scope of the present disclosure.
• In one embodiment, the reaction sensor 39 is the user device 54. In this embodiment, the user device 54 is provided with at least one sensor, for example a radar-based motion detector, to detect user movements. The user device 54 may include any suitable components for detecting gestures or movements of a user, such as camera components or an inertial measurement unit (IMU) having one or more accelerometers, and transmits the detected gestures or movements of the user to the vehicle 10 over the communication network 56.
• Referring now to FIG. 2, in various embodiments, the autonomous vehicle 10 described with regard to FIG. 1 may be suitable for use in the context of a taxi or shuttle system in a certain geographic area (for example, a city, a school or business campus, a shopping center, an amusement park, an event center, or the like), or it may simply be managed by a remote system. For example, the autonomous vehicle 10 may be associated with an autonomous-vehicle-based remote transport system. FIG. 2 shows an exemplary embodiment of an operating environment, shown generally at 50, that includes an autonomous-vehicle-based remote transport system 52 associated with one or more autonomous vehicles 10a, 10b, ..., 10n, as described with regard to FIG. 1. In various embodiments, the operating environment 50 further includes one or more user devices 54, a remote physiological processing system 66, one or more physiological monitoring devices 102, and one or more external authority systems 80 that communicate with the autonomous vehicle 10 and/or the remote transport system 52 over a communication network 56.
• The communication network 56 supports communication as needed between devices, systems, and components supported by the operating environment 50 (for example, via tangible communication links and/or wireless communication links). For example, the communication network 56 may include a wireless carrier system 60, such as a cellular telephone system that includes a plurality of cell towers (not shown), one or more mobile switching centers (MSCs) (not shown), as well as any other networking components required to connect the wireless carrier system 60 with a land communication system. Each cell tower includes sending and receiving antennas and a base station, with the base stations of different cell towers being connected to the MSC either directly or via intermediary equipment, such as a base station controller. The wireless carrier system 60 can implement any suitable communication technology, including, for example, digital technologies such as CDMA (for example, CDMA2000), LTE (for example, 4G LTE or 5G LTE), GSM/GPRS, or other current and emerging wireless technologies. Other cell tower/base station/MSC arrangements are possible and could be used with the wireless carrier system 60. For example, the base station and cell tower could be co-located at the same site or they could be remotely located from one another, each base station could be responsible for a single cell tower, a single base station could serve various cell towers, or various base stations could be coupled to a single MSC, to name but a few of the possible arrangements.
• Apart from including the wireless carrier system 60, a second wireless carrier system in the form of a satellite communication system 64 can be included to provide unidirectional or bidirectional communication with the autonomous vehicles 10a, 10b, ..., 10n. This can be done using one or more communication satellites (not shown) and an uplink transmitting station (not shown). Unidirectional communication can include, for example, satellite radio services, wherein programming content (news, music, etc.) is received by the transmitting station, packaged for upload, and then sent to the satellite, which broadcasts the programming to subscribers. Bidirectional communication can include, for example, satellite telephony services using the satellite as a relay station for telephone communications between the autonomous vehicle 10 and the station. Satellite telephony can be utilized either in addition to or in lieu of the wireless carrier system 60.
• A land communication system 62 can further be included, that is, a conventional land-based telecommunications network connected to one or more landline telephones that connects the wireless carrier system 60 with the remote transport system 52. For example, the land communication system 62 can include a public switched telephone network (PSTN), such as that used to provide hardwired telephony, packet-switched data communications, and the Internet infrastructure. One or more segments of the land communication system 62 can be implemented through the use of a standard wired network, a fiber or other optical network, a cable network, power lines, other wireless networks such as wireless local area networks (WLANs), networks providing broadband wireless access (BWA), or any combination thereof. Furthermore, the remote transport system 52 need not be connected via the land communication system 62, but can instead include wireless telephony equipment so that it can communicate directly with a wireless network, such as the wireless carrier system 60.
• Although only one user device 54 is shown in FIG. 2, embodiments of the operating environment 50 can support any number of user devices 54, including multiple user devices 54 owned, operated, or otherwise used by one person. Each user device 54 supported by the operating environment 50 can be implemented using any suitable hardware platform. In this regard, the user device 54 can be realized in any common form factor, including, but not limited to: a desktop computer; a mobile computer (for example, a tablet computer, a laptop computer, or a netbook computer); a smartphone; a video game device; a digital media player; a piece of home entertainment equipment; a digital camera or video camera; a wearable computing device (for example, a smart watch, smart glasses, or smart clothing); or the like. Accordingly, the user device 54 may be a portable device. Each user device 54 supported by the operating environment 50 is realized as a computer-implemented or computer-based device having the hardware, software, firmware, and/or processing logic needed to carry out the various techniques and methodologies described herein. For example, the user device 54 includes a microprocessor in the form of a programmable device that includes one or more instructions stored in an internal memory structure and applied to receive binary input to create binary output. In some embodiments, the user device 54 includes a GPS module capable of receiving GPS satellite signals and generating GPS coordinates based on those signals. In other embodiments, the user device 54 includes cellular communications functionality such that the device carries out voice and/or data communications over the communication network 56 using one or more cellular communications protocols, as discussed herein. In various embodiments, the user device 54 includes a visual display, such as a touch-screen graphical display, or another display.
• In general, the communication network 56 enables communication between the user device 54 and the autonomous vehicle 10, such that the communication system 36 receives one or more activation requests and/or user preferences from the user device 54, which are transmitted to the physiological control system 100. The activation requests and/or user preferences may be provided via a user input device associated with the user device 54 (for example, a keyboard, a speech recognition system, etc.). Additionally, the activation requests and/or user preferences may be provided via an application (e.g., an "app") running on the user device 54.
• The communication network 56 also enables communication between the physiological monitoring device 102 and the autonomous vehicle 10. In general, the communication system 36 is in communication with the physiological monitoring device 102, such that the controller 34 receives one or more physiological sensor signals or physiological sensor data transmitted to the physiological control system 100. Furthermore, the physiological monitoring device 102 may transmit the one or more physiological sensor signals or physiological sensor data to a remote physiological processing system 66 and/or an external authority system 80 over the communication network 56.
• Although only one physiological monitoring device 102 is shown in FIG. 2, embodiments of the operating environment 50 can support any number of physiological monitoring devices 102, including multiple physiological monitoring devices 102 owned, operated, or otherwise used by one person, which may, for example, monitor different physiological characteristics. The physiological monitoring device 102 is in communication with the controller 34 to transmit physiological sensor data or sensor signals to the physiological control system 100 over the communication network 56. Each physiological monitoring device 102 supported by the operating environment 50 can be implemented using any suitable hardware platform. Furthermore, each physiological monitoring device 102 supported by the operating environment 50 is realized as a computer-implemented or computer-based device having the hardware, software, firmware, and/or processing logic needed to carry out the various techniques and methodologies described herein. For example, the physiological monitoring device 102 includes a microprocessor in the form of a programmable device that includes one or more instructions stored in an internal memory structure and applied to receive binary input to create binary output.
• In various embodiments, the physiological monitoring device 102 includes one or more physiological sensors 104 and a physiological communication system 106. The physiological monitoring device 102 also includes a physiological control module 108. In one example, the physiological sensors 104, the physiological communication system 106, and the physiological control module 108 are contained within a housing 110 that is coupled to the user (i.e., driver, operator, and/or passenger(s)) of the autonomous vehicle 10. Thus, in one example, the physiological monitoring device 102 may be a wearable device that can be placed on, and coupled to, the body of the user, such as a ring, a chest strap, a bracelet, and other wearable devices, including, but not limited to, wearable electronic devices in the form of wristwatches, rings, earrings, bracelets, cufflinks, necklaces, neckties, eyeglasses, ribbons, smart apparel, etc. Therefore, in certain embodiments, the physiological monitoring device 102 is a personal device that is associated with the operator of the autonomous vehicle 10, that can be worn by the operator, and that is brought along with the operator into the autonomous vehicle 10 when the autonomous vehicle 10 is used. It should be noted, however, that one or more physiological monitoring devices 102 may be integrally coupled to the autonomous vehicle 10, and therefore need not be worn by the user.
• The physiological sensors 104 observe one or more physiological characteristics of the user and generate sensor signals based thereon. In various embodiments, the physiological sensors 104 observe a physiological characteristic, such as blood pressure, pulse rate, body temperature, respiratory rate, blood sugar, alcohol level (e.g., blood alcohol content), and body movement of the user, and generate sensor signals based thereon. It should be noted that the physiological sensors 104 may observe other characteristics or properties of the user, and that the above list is intended merely as an example. The physiological control module 108 receives the sensor signals from the physiological sensors 104 and controls the physiological communication system 106 such that the sensor signals from the physiological sensors 104 are transmitted over the communication network 56 to the controller 34 associated with the vehicle 10 and/or to the remote physiological processing system 66. In various embodiments, the physiological control module 108 further controls the physiological communication system 106 such that the sensor signals from the physiological sensors 104 are transmitted to the user device 54 and/or the external authority systems 80 over the communication network 56.
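The packaging of sensor signals for transmission can be sketched as follows. This is a minimal illustration only; the `PhysiologicalReading` record, its field names, and its units are assumptions for the example and are not specified by the disclosure.

```python
import time
from dataclasses import dataclass, asdict

@dataclass
class PhysiologicalReading:
    # Characteristics named in the disclosure; units are assumed for illustration.
    blood_pressure_systolic: float   # mmHg
    pulse_rate: float                # beats per minute
    body_temperature: float          # degrees Celsius
    respiratory_rate: float          # breaths per minute
    blood_glucose: float             # mg/dL
    blood_alcohol: float             # percent BAC
    timestamp: float                 # seconds since the epoch

def package_reading(raw: dict) -> PhysiologicalReading:
    """Wrap raw sensor values into a record suitable for transmission to the
    controller 34, the remote physiological processing system 66, the user
    device 54, and/or the external authority systems 80."""
    return PhysiologicalReading(timestamp=time.time(), **raw)

reading = package_reading({
    "blood_pressure_systolic": 118.0,
    "pulse_rate": 72.0,
    "body_temperature": 36.8,
    "respiratory_rate": 14.0,
    "blood_glucose": 95.0,
    "blood_alcohol": 0.0,
})
payload = asdict(reading)  # dictionary ready for serialization and transmission
```

A real implementation would serialize `payload` for whichever transceiver (Bluetooth®, cellular, Wi-Fi) the physiological communication system 106 uses.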
• The sensor signals from the physiological sensors 104 are transmitted by the physiological communication system 106 to the communication system 36, the remote physiological processing system 66, the user device 54, and/or the external authority systems 80 over the communication network 56. The physiological communication system 106 is in communication with the communication system 36, the remote physiological processing system 66, the user device 54, and/or the external authority systems 80 via any suitable communication protocol supported by the operating environment 50, and thus the physiological communication system 106 may include, but is not limited to, a Bluetooth® transceiver, a radio transceiver, a cellular transceiver, a 2G/3G/4G LTE transceiver, and/or a Wi-Fi transceiver. The physiological communication system 106 may also include a one-way transmitter. In addition, the physiological communication system 106 may be in wired communication with the controller 34 to provide input to the physiological control system 100.
• The remote transport system 52 includes one or more back-end server systems, which may be cloud-based, network-based, or resident at the particular campus or geographic location served by the remote transport system 52. The remote transport system 52 can be staffed by a live advisor, an automated advisor, or a combination of both. The remote transport system 52 can communicate with the user devices 54 and the autonomous vehicles 10a, 10b, ..., 10n to schedule rides, dispatch autonomous vehicles 10a, 10b, ..., 10n, and the like. In various embodiments, the remote transport system 52 stores account information such as subscriber authentication information, vehicle identifiers, profile records, behavioral patterns, and other pertinent subscriber information.
• In accordance with a typical use-case workflow, a registered user of the remote transport system 52 can create a ride request via the user device 54. The ride request typically indicates the passenger's desired pickup location (or current GPS location), the desired destination location (which may identify a predefined vehicle stop and/or a user-specified passenger destination), and a pickup time. The remote transport system 52 receives the ride request, processes the request, and dispatches a selected one of the autonomous vehicles 10a-10n (when and if one is available) to pick up the passenger at the designated pickup location and at the appropriate time. The remote transport system 52 may also generate and send a suitably configured confirmation message or notification to the user device 54 to let the passenger know that a vehicle is on the way.
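The ride-request workflow above can be sketched as a simple record plus a dispatch step. The field names and the first-available selection policy are assumptions for illustration; the disclosure does not specify the selection logic of the remote transport system 52.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class RideRequest:
    # Fields described in the workflow; names are illustrative assumptions.
    pickup_location: Tuple[float, float]   # (latitude, longitude)
    destination: Tuple[float, float]       # predefined stop or user-specified destination
    pickup_time: str                       # e.g. an ISO 8601 timestamp

def dispatch(request: RideRequest, available_vehicles: list) -> Optional[str]:
    """Select a vehicle for the request, or return None if none is available.
    A trivial first-available policy stands in for the real selection logic."""
    if not available_vehicles:
        return None
    return available_vehicles[0]

req = RideRequest((42.33, -83.05), (42.36, -83.07), "2017-01-20T08:30:00")
vehicle = dispatch(req, ["10a", "10b"])
# Confirmation notification sent back to the user device 54.
confirmation = None if vehicle is None else f"Vehicle {vehicle} is on the way"
```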
• In various embodiments, the remote transport system 52 includes a remote physiological processing system 66 that is responsive to a remote notification transmitted from the physiological control system 100 of the autonomous vehicle 10 over the communication network 56. In general, the remote physiological processing system 66 includes a remote physiological control module 68 and a remote communication system 70. The remote communication system 70 is in communication with the communication system 36 of the autonomous vehicle 10, and is in communication with one or more external authority systems 80 over the communication network 56. In various embodiments, the remote communication system 70 includes a Bluetooth® transceiver, a radio transceiver, a cellular transceiver, a 2G/3G/4G LTE transceiver, and/or a Wi-Fi transceiver, and may also be configured to encode data or generate encoded data. The remote communication system 70 transmits data received from the remote physiological control module 68 to the external authority systems 80, and receives the remote notification from the physiological control system 100 of the autonomous vehicle 10.
• The remote physiological control module 68 receives the remote notification and, based on this notification, outputs data to the external authority systems 80. In various embodiments, the remote physiological control module 68 accesses a remote data store 72 and retrieves a user profile based on the receipt of the remote notification. The remote data store 72 stores one or more user profiles. In one example, each user profile includes, but is not limited to, health records and/or medical records, emergency contact information, location information, and so on, associated with the user of the physiological monitoring device 102. The user profile may be set up by the user via a web-based application, or via the application on the user device 54. Furthermore, the user profile may be set up based on data received from the physiological monitoring device 102, which may also be in communication with the remote physiological processing system 66 to set up the respective user profile in the remote data store 72. The data output by the remote physiological control module 68 to the external authority systems 80 includes the user profile, a current location of the vehicle 10, and physiological data received from the physiological monitoring device 102.
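The notification-driven lookup described above can be sketched as follows. The profile fields, the store layout, and the key names are assumptions for illustration only.

```python
# Hypothetical remote data store 72: user profiles keyed by a user ID.
REMOTE_DATA_STORE = {
    "user-001": {
        "medical_records": ["hypertension"],
        "emergency_contact": "+1-555-0100",
    },
}

def handle_remote_notification(notification: dict) -> dict:
    """On receipt of a remote notification, retrieve the user profile from
    the remote data store 72 and assemble the data output to the external
    authority systems 80: the profile, the current vehicle location, and
    the physiological data."""
    profile = REMOTE_DATA_STORE.get(notification["user_id"], {})
    return {
        "user_profile": profile,
        "vehicle_location": notification["vehicle_location"],
        "physiological_data": notification["physiological_data"],
    }

out = handle_remote_notification({
    "user_id": "user-001",
    "vehicle_location": (42.33, -83.05),
    "physiological_data": {"pulse_rate": 135.0},
})
```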
• In various embodiments, the external authority systems 80 include, but are not limited to, a remote vehicle or user assistance system, a police dispatch system, an emergency medical alert system, a healthcare provider, an insurance agent or insurance company, one or more emergency contacts associated with the user based on the retrieved user profile, a taxi dispatch system, and/or a rental car dispatch system. The remote transport system 52 can also be an external authority system. The external authority systems 80 each include a communication component, such as a wireless communication component, a Bluetooth® transceiver, a radio transceiver, a cellular transceiver, a 2G/3G/4G LTE transceiver, and/or a Wi-Fi transceiver, to receive the data transmitted by the remote communication system 70. The remote communication system 70 can also communicate with the external authority systems 80 directly via e-mail over the Internet or via a web-based application, over the communication network 56. The communication system 36 associated with the vehicle 10 can likewise output one or more messages (e-mail, text, etc.) directly to the external authority systems 80 over the communication network 56.
• It will be appreciated that the subject matter disclosed herein provides certain enhanced features and functionality relative to what may be considered a standard or baseline autonomous vehicle 10 and/or an autonomous-vehicle-based remote transport system 52. To this end, an autonomous vehicle and an autonomous-vehicle-based remote transport system can be modified, enhanced, or otherwise supplemented to provide the additional features that are described in more detail below.
• As shown in FIG. 3, and with continued reference to FIG. 1, in various embodiments, the controller 34 implements the autonomous driving system (ADS) 200. That is, suitable software and/or hardware components of the controller 34 (for example, the processor 44 and the computer-readable storage device 46) are utilized to provide an autonomous driving system 200 that is used in conjunction with the vehicle 10.
• In various embodiments, the instructions of the autonomous driving system 200 may be organized by function or system. For example, as shown in FIG. 3, the autonomous driving system 200 can include a sensor fusion system 124, a positioning system 126, a guidance system 128, and a vehicle control system 130. As can be appreciated, in various embodiments, the instructions may be organized into any number of systems (for example, combined, further partitioned, etc.), as the disclosure is not limited to the present examples.
• In various embodiments, the sensor fusion system 124 synthesizes and processes sensor data and predicts the presence, location, classification, and/or path of objects and features of the environment of the vehicle 10. In various embodiments, the sensor fusion system 124 can incorporate information from multiple sensors, including, but not limited to, cameras, lidars, radars, and/or any number of other types of sensors.
• The positioning system 126 processes sensor data along with other data to determine a position (e.g., a local position relative to a map, an exact position relative to a lane of a road, the vehicle heading or speed, etc.) of the vehicle 10 relative to the environment. The guidance system 128 processes sensor data along with other data to determine a path for the vehicle 10 to follow. The vehicle control system 130 generates control signals for controlling the vehicle 10 according to the determined path. In various embodiments, the guidance system 128 also processes the exact position of the vehicle 10 and the map data from the data storage device 32 to determine one or more geographic locations for receiving treatment, for stopping, and/or for parking the vehicle 10, including, but not limited to, one or more nearby hotels, hospitals, emergency rooms, clinics, parking lots, gas stations, pharmacies, etc.
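One way the guidance system 128 might rank nearby stopping locations from map data is a simple nearest-neighbor search over candidate locations, using a great-circle (haversine) distance. This is a sketch under stated assumptions; the map entries, function names, and filtering by location kind are illustrative and not part of the disclosure.

```python
import math

def haversine_km(a, b):
    """Great-circle distance in kilometers between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    dlat, dlon = lat2 - lat1, lon2 - lon1
    h = math.sin(dlat / 2) ** 2 + math.cos(lat1) * math.cos(lat2) * math.sin(dlon / 2) ** 2
    return 2 * 6371.0 * math.asin(math.sqrt(h))

# Illustrative map data: candidate stopping locations near the vehicle.
MAP_LOCATIONS = [
    ("hospital", (42.340, -83.060)),
    ("clinic", (42.500, -83.200)),
    ("parking lot", (42.331, -83.051)),
]

def nearest_location(vehicle_pos, kind=None):
    """Return the closest map location, optionally filtered by kind
    (e.g., 'hospital' when a medical stop is required)."""
    candidates = [loc for loc in MAP_LOCATIONS if kind is None or loc[0] == kind]
    return min(candidates, key=lambda loc: haversine_km(vehicle_pos, loc[1]))

stop = nearest_location((42.330, -83.050), kind="hospital")
```

Filtering by kind mirrors the idea that the destination type (hospital, parking lot, pharmacy, etc.) depends on why the vehicle needs to stop.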
• In various embodiments, the controller 34 implements machine learning techniques to assist the functionality of the controller 34, such as feature detection/classification, obstacle mitigation, route traversal, mapping, sensor integration, ground-truth determination, and the like.
• In various embodiments, the vehicle control system 130 receives an output 199 from a physiological control module 202 of the physiological control system 100. Based thereon, the vehicle control system 130 controls the operation of the vehicle 10. In one example, the vehicle control system 130 receives the output 199 from the physiological control system 100 and generates at least one of a vehicle control output 180 and a second vehicle control output 182. The vehicle control output 180 includes a set of actuator commands for controlling the operation of the vehicle 10 along a determined path, including, but not limited to, a steering command, a shift command, a throttle command, and a brake command. The second vehicle control output 182 includes a set of actuator commands for providing a default comfort or convenience setting, and includes, but is not limited to, an HVAC command, a seat command, an infotainment command, a lock command, a lighting command, a window command, and an alarm command.
• The vehicle control output 180 and the second vehicle control output 182 are communicated to the actuator system 30. In an exemplary embodiment, the actuators 42 include a steering control, a shifter control, a throttle control, a brake control, an HVAC control, a seat control, an infotainment control, a lock control, a lighting control, a window control, and an alarm control. The steering control may, for example, control a steering system 24 as shown in FIG. 1. The shifter control may, for example, control a transmission system 22 as shown in FIG. 1. The throttle control may, for example, control a propulsion system 20 as shown in FIG. 1. The brake control may, for example, control a wheel brake system 26 as shown in FIG. 1. The HVAC control may, for example, control the HVAC system 21 as shown in FIG. 1. The seat control may, for example, control the seat system 23 as shown in FIG. 1. The infotainment control may, for example, control the infotainment system 25 as shown in FIG. 1. The lock control may, for example, control the locking system 27 as shown in FIG. 1. The lighting control may, for example, control the lighting system 29 as shown in FIG. 1. The window control may, for example, control the window system 31 as shown in FIG. 1. The alarm control may, for example, control the alarm system 33 as shown in FIG. 1.
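The routing of the two control outputs to the individual actuator controls can be sketched as a simple dispatch table. The command names, handler signatures, and string return values are assumptions made for this illustration.

```python
# Hypothetical handlers standing in for a subset of the actuator controls 42.
def apply_steering(value): return f"steering={value}"
def apply_brake(value): return f"brake={value}"
def apply_hvac(value): return f"hvac={value}"
def apply_seat(value): return f"seat={value}"

# Vehicle control output 180 carries path-following commands;
# second vehicle control output 182 carries comfort/convenience commands.
DISPATCH = {
    "steering": apply_steering,
    "brake": apply_brake,
    "hvac": apply_hvac,
    "seat": apply_seat,
}

def send_to_actuator_system(output: dict) -> list:
    """Route each actuator command in a control output to its control."""
    return [DISPATCH[name](value) for name, value in output.items()]

applied = send_to_actuator_system({"steering": 2.5, "brake": 0.0})
comfort = send_to_actuator_system({"hvac": 21, "seat": "heat_low"})
```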
• As briefly mentioned above, the physiological control system 100 of FIG. 1 is included within the autonomous driving system 200, for example, to generate the vehicle control output 180 and the second vehicle control output 182 through the vehicle control system 130. As will be discussed in more detail, the physiological control system 100 generates the output 199, which is used by the vehicle control system 130 to generate the vehicle control output 180 and/or the second vehicle control output 182 that is communicated to the actuator system 30. Thus, the physiological control system 100 generates the output 199 that is used by the vehicle control system 130 to control one or more of the actuators 42 of the actuator system 30 based on the systems and methods of the present disclosure.
• For example, as shown in more detail with regard to FIG. 4, and with continued reference to FIG. 3, the physiological control system 100 includes the physiological control module 202. In various embodiments, the physiological control module 202 provides the output 199 to the vehicle control system 130 based on the sensor signals from the physiological monitoring device 102, the sensor signals from the reaction sensors 39, the input from the user device 54, the input from the input device 25b, and based on the vehicle control systems and methods of the present disclosure. The physiological control module 202 outputs a remote notification, physiological data, and a location of the vehicle 10 to the remote physiological processing system 66 based on the sensor signals from the physiological monitoring device 102, the sensor signals from the reaction sensors 39, the input from the user device 54, the input from the input device 25b, and based on the vehicle control systems and methods of the present disclosure. In various embodiments, the physiological control module 202 also provides a notification to one or more of the external authority systems 80 based on the sensor signals from the physiological monitoring device 102, the sensor signals from the reaction sensors 39, the input from the user device 54, the input from the input device 25b, and based on the vehicle control systems and methods of the present disclosure. The physiological control module 202 also outputs an interface for display on the display 25a based on the sensor signals from the physiological monitoring device 102 and based on the vehicle control systems and methods of the present disclosure.
• Referring to FIG. 4, and with continued reference to FIGS. 1-3, a dataflow diagram illustrates various embodiments of a control system 300 for the physiological control system 100, which may be embedded within the physiological control module 202. Various embodiments of the control system 300 according to the present disclosure can include any number of sub-modules embedded within the physiological control module 202. As can be appreciated, the sub-modules shown in FIG. 4 can be combined and/or further partitioned to similarly generate the output 199 for the vehicle control system 130, to send data to the remote physiological processing system 66 and/or the external authority systems 80, and to output the interface for display on the display 25a. Inputs to the control system 300 may be received from the physiological monitoring device 102 (FIG. 1), received from the input device 25b of the infotainment system 25 (FIG. 1), received from the user device 54 (FIG. 1), received from the reaction sensors 39 of the vehicle 10, received from other control modules (not shown) associated with the vehicle 10, and/or determined or modeled by other sub-modules (not shown) within the physiological control module 202. In various embodiments, the physiological control module 202 includes a value data store 302, a physiological monitoring module 304, a reaction data store 306, a reaction determination module 308, a settings data store 310, a state control module 312, a communication control module 314, and a user interface (UI) control module 316.
• The value data store 302 stores one or more tables (e.g., lookup tables) that indicate one or more acceptable values for one or more physiological characteristics, and further stores a baseline physiological reading. In other words, the value data store 302 stores one or more tables containing one or more predefined acceptable values 318 for the physiological characteristics monitored and measured by the physiological sensors 104, and further stores an initial or baseline reading 318a from the physiological sensors 104. In various embodiments, the tables may be interpolation tables that are defined by one or more indices. The one or more values 318 provided by at least one of the tables generally indicate a normal or acceptable range for the respective physiological characteristic. For example, the one or more values 318 include an acceptable range for blood pressure, an acceptable range for pulse rate, an acceptable range for body temperature, an acceptable range for respiratory rate, an acceptable range for blood sugar, an acceptable range for blood alcohol content, and an acceptable range for an amount of movement. As an example, one or more tables can be indexed by various parameters, such as, but not limited to, blood pressure, pulse rate, body temperature, respiratory rate, blood sugar, blood alcohol content, or movement, to provide the one or more values 318. The baseline reading 318a includes a blood pressure, a pulse rate, a body temperature, a respiratory rate, a blood sugar level, and/or a blood alcohol content initially monitored and measured by the physiological sensors 104 and stored in the value data store 302 by the physiological monitoring module 304.
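The value data store 302 can be pictured as a table of acceptable ranges keyed by characteristic. The specific range values below are illustrative placeholders, not values from the disclosure.

```python
# Hypothetical acceptable ranges (min, max) per physiological characteristic,
# standing in for the predefined acceptable values 318.
ACCEPTABLE_VALUES = {
    "blood_pressure_systolic": (90.0, 140.0),   # mmHg
    "pulse_rate": (50.0, 110.0),                # bpm
    "body_temperature": (36.0, 37.8),           # deg C
    "respiratory_rate": (10.0, 24.0),           # breaths/min
    "blood_glucose": (70.0, 140.0),             # mg/dL
    "blood_alcohol": (0.0, 0.02),               # percent BAC
}

def lookup_acceptable_range(characteristic: str):
    """Index the table by characteristic to retrieve the values 318."""
    return ACCEPTABLE_VALUES[characteristic]

lo, hi = lookup_acceptable_range("pulse_rate")
```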
  • The physiological monitoring module 304 receives as input physiological sensor data 320. The physiological sensor data 320 include the sensor signals from the physiological sensors 104 associated with the physiological monitoring device 102. The physiological monitoring module 304 processes the physiological sensor data 320 and generates the physiological baseline reading 318a based on an initial reception of the physiological sensor data 320 (for example, after starting the vehicle 10). The physiological monitoring module 304 stores the physiological baseline reading 318a in the value data store 302. Based on a sampling rate associated with the physiological sensors 104 and the reception of subsequently sampled physiological sensor data 320 (i.e., physiological sensor data 320 received after the original baseline reading), the physiological monitoring module 304 compares the subsequent physiological sensor data 320 with the physiological baseline reading 318a and determines whether one or more physiological features have changed. Based on this determination, the physiological monitoring module 304 sets a physiological change 321 for the state control module 312. Examples of physiological changes include, but are not limited to, an increase/decrease in blood pressure, an increase/decrease in pulse rate, an increase/decrease in body temperature, an increase/decrease in respiratory rate, an increase/decrease in blood glucose, an increase/decrease in movement, etc.
  • The physiological monitoring module 304 also processes the physiological sensor data 320 and retrieves the one or more values 318 from the value data store 302. Based on the one or more retrieved values 318, the physiological monitoring module 304 determines whether the physiological sensor data 320 lie within an acceptable range. If the physiological sensor data 320 are outside the acceptable range, the physiological monitoring module 304 sets a physiological condition 322 for the communication control module 314 and the UI control module 316. The physiological condition 322 indicates that the physiological sensor data 320 are not normal, or lie outside the one or more acceptable values 318.
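  • The two determinations described above — comparing later samples against the baseline reading 318a and checking them against the acceptable values 318 — can be sketched together. The tolerance threshold, field names and return values below are hypothetical assumptions, not the patent's implementation.

```python
# Hypothetical sketch of the physiological monitoring module (304).
# It reports a "change" when a sample deviates from the baseline reading
# beyond a tolerance, and a "condition" when a sample leaves its acceptable
# range. All names and thresholds are illustrative assumptions.
def monitor(sample: dict, baseline: dict, acceptable: dict, tolerance: float = 0.10):
    changes = {}      # physiological change (321), keyed by feature
    conditions = []   # physiological condition (322): out-of-range features
    for feature, reading in sample.items():
        base = baseline[feature]
        # Change detection: relative deviation from the baseline reading.
        if base != 0 and abs(reading - base) / abs(base) > tolerance:
            changes[feature] = "increase" if reading > base else "decrease"
        # Range check against the stored acceptable values (318).
        low, high = acceptable[feature]
        if not (low <= reading <= high):
            conditions.append(feature)
    return changes, conditions
```

  • A pulse rate that rises from a baseline of 70 to 120 would thus yield both a physiological change ("increase") and an out-of-range physiological condition.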
  • The physiological monitoring module 304 also receives as input a response 324 from the reaction detection module 308. The response 324 is a response from the user to ignore a prompt 330 that is output by the UI control module 316 based on the reception of the physiological condition 322. Based on the response 324, the physiological monitoring module 304 sets the physiological data 326, which may contain the determined physiological condition and the received physiological sensor data 320, for the communication control module 314.
  • The response data store 306 stores one or more response settings 328. In other words, the response data store 306 stores one or more settings for responses of the vehicle 10 to gestures made by the user. For example, the response data store 306 stores one or more response settings 328 that correspond to a gesture made by the user to respond to the prompt 330 from the UI control module 316. In other words, the response data store 306 stores one or more predetermined vehicle responses (response settings) for a particular gesture made by the user. In various embodiments, the response settings 328 are configured by the user, via inputs received from the user device 54 and/or inputs received from the input device 28b, as will be explained in more detail below. In other embodiments, the response settings are default settings or manufacturer settings. In various embodiments, the response data store 306 stores the response setting 328 that assigns a particular user gesture to a refusal of the prompt 330, and stores the response setting 328 that assigns a particular user gesture to an activation command for the state control module 312.
  • The reaction detection module 308 receives as input reaction sensor data 332. The reaction sensor data 332 include the sensor signals from the reaction sensors 39. The reaction detection module 308 processes the reaction sensor data 332 and accesses the response data store 306 to retrieve the response setting 328 that corresponds to the gesture observed by the reaction sensors 39. Based on the response setting 328, the reaction detection module 308 determines the desired vehicle response for the gesture observed by the reaction sensors 39. In various embodiments, based on the response setting 328, the reaction detection module 308 sets the response 324 for the physiological monitoring module 304, which indicates the refusal of the prompt 330 by the user via an observed gesture. Based on the response setting 328, the reaction detection module 308 further sets a gesture activation 365 for the state control module 312. The gesture activation 365 indicates the reception of a gesture from the user to activate the state control module 312.
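  • The gesture look-up described above can be sketched as a small mapping from an observed gesture to its configured response setting. The gesture names and response labels below are hypothetical assumptions for illustration, not the patent's vocabulary.

```python
# Hypothetical sketch of the response data store (306) and the reaction
# detection module (308): a table maps an observed gesture to a predetermined
# vehicle response (response setting 328). Gesture names are assumptions.
from typing import Optional

RESPONSE_SETTINGS = {
    "head_shake": "ignore_prompt",             # refusal of the prompt (330)
    "double_wrist_tap": "gesture_activation",  # activate the state control module (312)
}

def detect_reaction(observed_gesture: str) -> Optional[str]:
    """Look up the vehicle response configured for an observed gesture."""
    return RESPONSE_SETTINGS.get(observed_gesture)
```

  • An unrecognized gesture simply yields no configured response, leaving the prompt active.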
  • The settings data store 310 stores one or more of the settings that control the systems of the vehicle 10, including, but not limited to, the vehicle control system 130, the HVAC system 21, the seating system 23, the infotainment system 25, the locking system 27, the lighting system 29, the window system 31 and the alarm system 33, based on the determined physiological change 321 and the determined physiological condition 322. In various embodiments, the settings data store 310 stores one or more tables (e.g., look-up tables) indicating a predetermined control command for the HVAC system 21, the seating system 23, the infotainment system 25, the locking system 27, the lighting system 29, the window system 31 and the alarm system 33, or for several of these, based on the physiological change 321 and the physiological condition 322. In other words, the settings data store 310 stores one or more tables that provide one or more settings 334 associated with the detected physiological change 321, and provides one or more settings 334 based on the detected physiological condition 322. In various embodiments, the tables are set by the user based on inputs received from the user device 54 and/or inputs received from the input device 28b. In other embodiments, the tables are predefined based on default values and based on preferences defined by the user. As an example, one or more of the tables may be indexed by various parameters, such as, but not limited to, the particular system of the vehicle 10, the physiological change 321 or the physiological condition 322, to provide the one or more settings 334.
  • The state control module 312 receives as input the physiological change 321 and the physiological condition 322. The state control module 312 processes the physiological change 321 and the physiological condition 322, and retrieves the appropriate settings 334. The state control module 312 generates the output 199 for the vehicle control system 130, which contains HVAC data 336, seat data 338, vehicle control data 340, infotainment data 342, lighting data 344, locking data 346, window control data 347 and alarm data 349, based on the retrieved settings 334. In one example, the HVAC data 336 include one or more control signals for the vehicle control system 130 to instruct the actuator system 30 to control the HVAC system 21 so as to increase or decrease a temperature of the interior of the vehicle 10 (for example, one or more control signals to the motor, the condenser and/or the heater). The seat data 338 comprise one or more control signals for the vehicle control system 130 to instruct the actuator system 30 to control the seating system 23 so as to control seat cooling, seat heating, and/or one or more of the actuators associated with a respective seat to move the seat. The vehicle control data 340 comprise one or more control signals for the vehicle control system 130 for autonomous or semi-autonomous control of the operation of the vehicle 10 via one or more commands to the actuator system 30 to control the steering system 24, the transmission system 22, the drive system 20 and/or the wheel brake system 26. The vehicle control data 340 may further comprise one or more control signals for the vehicle control system 130 to instruct the actuator system 30 to control the wheel brake system 26 and/or the drive system 20 so that the speed of the vehicle 10 is limited.
  • The infotainment data 342 include one or more control signals for the vehicle control system 130 to instruct the actuator system 30 to direct the infotainment system 25 to change a music station, play a predetermined playlist, recommend a conversation or activity, etc. The lighting data 344 comprise one or more control signals for the vehicle control system 130 to instruct the actuator system 30 to direct the lighting system 29 to control a light source of the vehicle 10, for example to turn one or more interior lights on/off, turn one or more exterior lights on/off, change a color associated with one or more interior lights, etc. The locking data 346 comprise one or more control signals for the vehicle control system 130 to instruct the actuator system 30 to direct the locking system 27 to lock or unlock one or more latches associated with one or more doors and/or luggage compartment lids of the vehicle 10, including, but not limited to, unlocking or locking one of the doors or luggage compartment lids, unlocking or locking all doors or luggage compartment lids, etc. The window control data 347 comprise one or more control signals for the vehicle control system 130 to instruct the actuator system 30 to direct the window system 31 to move one or more windows of the vehicle 10, including, but not limited to, opening and/or closing a single window, opening and/or closing all windows, opening and/or closing a window on the driver's side, opening and/or closing a passenger window, and/or opening or closing a sunroof and/or a convertible top of the vehicle 10. The alarm data 349 comprise one or more control signals for the vehicle control system 130 to instruct the actuator system 30 to issue a command to the alarm system 33 to enable or disable the alarm.
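  • The settings look-up and output generation described in the two paragraphs above can be sketched as a table indexed by vehicle system and physiological change, from which the control commands that make up the output 199 are collected. System names and command strings below are illustrative assumptions.

```python
# Hypothetical sketch of the settings data store (310) and state control
# module (312): tables indexed by (vehicle system, physiological change)
# yield a predetermined control command; collecting them forms the
# output (199). All keys and commands are illustrative assumptions.
SETTINGS = {
    ("hvac", "body_temperature:increase"): "lower_cabin_temperature",
    ("window", "body_temperature:increase"): "open_driver_window",
    ("infotainment", "pulse_rate:increase"): "play_calming_playlist",
}

def build_output(change_key: str) -> dict:
    """Collect the control commands configured for a detected change."""
    return {
        system: command
        for (system, key), command in SETTINGS.items()
        if key == change_key
    }
```

  • A detected rise in body temperature would thus produce both an HVAC command and a window command in a single output.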
  • The state control module 312 also receives navigation data 348 as input. The navigation data 348 include information regarding places available for receiving treatment, parking and/or stopping the vehicle 10, based on a current geographical location of the vehicle 10 that is received by the autonomous driving system 200, for example from the guidance system 128. The navigation data 348 can also be received directly from the remote transport system 52. The state control module 312 processes the navigation data 348, and based on the navigation data 348 and the physiological condition 322, the state control module 312 provides location options 350 to the UI control module 316. The location options 350 include one or more nearby locations for receiving treatment, stopping and/or parking the vehicle 10, including, but not limited to, one or more nearby hotels, hospitals, emergency clinics, clinics, car parks, gas stations, pharmacies, etc. A nearest location for treatment may also be output to the vehicle control system 130 to autonomously or semi-autonomously steer the operation of the vehicle 10 to that location for treatment.
  • The communication control module 314 receives as input the physiological data 326 and the physiological condition 322 from the physiological monitoring module 304. The communication control module 314 processes the physiological data 326 and the physiological condition 322, and outputs physiological status data 352 for transmission by the communication system 36 to the remote physiological processing system 66. The physiological status data 352 include the physiological data 326 and the physiological condition 322 received from the physiological monitoring module 304, and can contain an identification of the user of the physiological monitoring device 102. The communication control module 314 further provides GPS data 354 and a remote message 356 for transmission through the communication system 36 to the remote physiological processing system 66. The GPS data 354 include the current geographical location of the vehicle 10 as received by the autonomous driving system 200, for example from the positioning system 126, and the remote message 356 includes a message that the physiological sensor data 320 are outside the acceptable range, as indicated by the physiological condition 322.
  • Based on the physiological data 326 and the physiological condition 322, the communication control module 314 further outputs a message 358. In various embodiments, the message 358 is output to the communication system 36 for transmission to the external authority systems 80. For example, the message 358 comprises an e-mail or text message to a user device assigned to a contact 360, for example an emergency contact, that is received from the UI control module 316. The emergency contact may also be received as input from the user device 54 associated with the vehicle 10.
  • The communication control module 314 further receives as input portable device data 362 from the user device 54. The portable device data 362 include inputs from the user device 54, which include, but are not limited to, preferences related to vehicle responses to user gestures, such as gestures to respond to prompts, preferences for one or more settings associated with one or more physiological conditions, and one or more emergency contacts. The communication control module 314 processes the portable device data 362 and sets device preferences 364 for the response data store 306 and the settings data store 310. The communication control module 314 further processes the portable device data 362 for the one or more contacts, and in various embodiments, the communication control module 314 transmits the message 358 to one or more contacts in the portable device data 362. In various embodiments, the communication control module 314 further processes the portable device data 362 and determines whether an activation request was received, so that the output 199 is issued to the vehicle control system 130. When the activation request is received, the communication control module 314 may set activation data 363 for the state control module 312. Based on the receipt of the activation data 363, the state control module 312 issues the output 199, which includes one or more of the following: HVAC data 336, seat data 338, vehicle control data 340, infotainment data 342, lighting data 344, and locking data 346, based on the detected physiological change 321 and/or the physiological condition 322.
  • In various embodiments, one or more of the communication control module 314 and the state control module 312 may be enabled based on one or more sensor signals received from the reaction sensors 39. In other words, the reaction detection module 308 may process the reaction sensor data 332 by retrieving the response setting 328 associated with the identified gesture and determining that the vehicle response represents a gesture activation 365. Based on this determination, the reaction detection module 308 sets the gesture activation 365 for the communication control module 314 and/or the state control module 312. Based on the gesture activation 365, the communication control module 314 can receive and process the portable device data 362. Based on the gesture activation 365, the state control module 312 can receive and process the physiological condition 322 and the physiological change 321. Therefore, in various embodiments, a user's movement or gesture observed by the reaction sensors 39 may provide an activation request for the state control module 312 and/or the communication control module 314.
  • The UI control module 316 receives input data 366. The input data 366 include inputs received from the input device 28b. The UI control module 316 processes the input data 366, sets preferences 368 for the response data store 306 and the settings data store 310, and sets the contact 360 for the communication control module 314. The preferences 368 include one or more user-defined preferences with respect to gestures in response to the prompt 330 (response setting 328), one or more user-defined gesture preferences for activating the state control module 312 (response setting 328), and one or more user preferences for settings 334 associated with one or more physiological conditions or physiological changes. The input data 366 may also include an input received at the input device 28b in response to the prompt 330.
  • The UI control module 316 furthermore receives as input the physiological condition 322 and the location options 350. The UI control module 316 processes the physiological condition 322, and based on the physiological condition 322, the UI control module 316 outputs the prompt 330. The prompt 330 can be part of a user interface 370. The prompt 330 includes a request for a gesture, a request for physical exercise, or a graphical, textual and/or verbal request to the user that provides a warning regarding the detected physiological condition 322 and optionally provides suggestions for action regarding the detected physiological condition 322. For example, the prompt 330 may include a request to place the vehicle 10 into an autonomous driving mode (so that the vehicle 10 is operated autonomously) or into a semi-autonomous driving mode. Furthermore, the prompt may include a request to deviate from a planned route to take a break or to seek treatment. In specific examples, the prompt 330 also contains one or more places for the user to take a rest, to seek treatment, or to stop the vehicle 10, based on the location options 350 received from the state control module 312.
  • In FIG. 5, while continuing to refer to FIGS. 1-2, a flowchart illustrates a control method 400 that can be performed by the physiological control module 202 of FIGS. 1-4 according to the present disclosure. As will be apparent from the disclosure, the order of operations in the method is not limited to the sequence shown in FIG. 5, but may be performed in one or more different orders, as appropriate, and in accordance with the present disclosure.
  • In various embodiments, the method may be scheduled to run based on predetermined events, based on the receipt of the physiological sensor data 320, and/or it may run continuously during operation of the autonomous vehicle 10. The method begins at 402. At 404, the method receives the physiological sensor data 320 from the physiological monitoring device 102 and establishes a baseline for a physiological reading. The baseline for the physiological reading is generally established upon initialization of the physiological control system 100 and comprises the original sensor signals or sensor data received from the physiological monitoring device 102. The method stores this physiological baseline reading 318a in the value data store 302. At 406, the method processes the physiological sensor data 320 to determine the physiological change 321 and the physiological condition 322, and retrieves the one or more values 318. At 408, the method determines whether the received physiological sensor data 320 lie outside the acceptable values 318. If the physiological sensor data 320 are outside the one or more acceptable ranges based on the one or more values 318, the method proceeds to A in FIG. 6.
  • Otherwise, at 410, the method determines whether the received physiological sensor data 320 have changed compared to the baseline for the physiological reading. If the physiological sensor data 320 have not changed relative to the physiological baseline reading 318a, or if the physiological sensor data 320 are within a predetermined tolerance range for variances from the physiological baseline reading 318a, the method proceeds to 412. Otherwise, the method establishes the physiological change 321 and proceeds to 414. At 412, the method determines whether the vehicle 10 is operating, which may be determined based on data received from the sensor system 28. If the vehicle 10 is in operation, the method loops back to 404. Otherwise, the method ends at 416.
  • At 414, the method retrieves the setting 334 based on the established physiological change 321, and issues the output 199, which contains one or more control signals for the actuator system 30 to control one or more systems of the vehicle 10, such as the vehicle control system 130, the HVAC system 21, the seating system 23, the infotainment system 25, the locking system 27, the lighting system 29, the window system 31 or the alarm system 33, based on the setting 334 associated with the detected physiological change 321. In various embodiments, the method issues the output 199 based on the setting 334 associated with the detected physiological change 321.
  • Optionally, the method proceeds to 418 and determines whether the activation data 363 were received from the user device 54. If so, the method proceeds to 414 and issues the output 199, which contains one or more of the HVAC data 336, the seat data 338, the vehicle control data 340, the infotainment data 342, the lighting data 344, the locking data 346, the window control data 347 and the alarm data 349, based on the activation request received from the user device 54 and the detected physiological change 321. Otherwise, the method proceeds to 420 and determines, based on the sensor signals or sensor data from the reaction sensors 39, whether the gesture activation 365 was received. If so, the method proceeds to 414 and issues the output 199, which contains one or more of the HVAC data 336, the seat data 338, the vehicle control data 340, the infotainment data 342, the lighting data 344, the locking data 346, the window control data 347 and the alarm data 349, based on the received activation request and the detected physiological change 321. Otherwise, the method loops back to 418. From 414, the method proceeds to 412.
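  • The flow of method 400 (steps 402-416) can be condensed into a simple loop. The helper callables below are placeholders standing in for the steps described in the text; they are not real APIs, and the optional activation branches (418/420) are omitted for brevity.

```python
# Condensed, hypothetical sketch of control method 400. The helper callables
# (read_sensors, out_of_range, changed_vs_baseline, apply_settings,
# vehicle_operating, handle_out_of_range) are placeholders for the steps
# described in the text, not real APIs.
def control_method_400(read_sensors, out_of_range, changed_vs_baseline,
                       apply_settings, vehicle_operating, handle_out_of_range):
    baseline = read_sensors()                        # 404: establish baseline reading
    while True:
        sample = read_sensors()                      # 404: later sampled sensor data
        if out_of_range(sample):                     # 408: outside acceptable values?
            handle_out_of_range(sample)              # -> A: method 500 (FIG. 6)
        elif changed_vs_baseline(sample, baseline):  # 410: changed vs. baseline?
            apply_settings(sample)                   # 414: output 199 per setting 334
        if not vehicle_operating():                  # 412: stop when vehicle is off
            break                                    # 416: end
```

  • In use, the caller would supply functions backed by the physiological monitoring module 304 and the state control module 312.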
  • In FIG. 6, while continuing to refer to FIGS. 1-4, a flowchart illustrates a control method 500 that can be performed by the physiological control module 202 of FIGS. 1-3 according to the present disclosure. As can be seen from the disclosure, the order of operations in the method is not limited to the sequence shown in FIG. 6, but may be performed in one or more different orders depending on the application, and in accordance with the present disclosure.
  • In one example, the method begins at A. At 502, the method outputs the prompt 330 based on the physiological condition 322. At 504, the method determines whether a response or gesture of a user has been observed based on the sensor signals from the reaction sensors 39. If a response or gesture of a user is observed, then at 506 the method retrieves the response setting 328 from the response data store 306 and determines whether the gesture means that the prompt 330 should be ignored. If the gesture means that the prompt 330 is to be ignored, the method proceeds to B in FIG. 5.
  • Otherwise, at 508, the method retrieves the setting 334 based on the detected physiological condition 322, and issues the output 199 to the vehicle control system 130, which contains the vehicle control data 340. Optionally, the method issues the output 199 to the vehicle control system 130 based on the activation request received from the user device 54. At 510, the method outputs the message 358 to the external authority systems 80. At 512, the method outputs the remote message 356, the physiological status data 352 and the GPS data 354 to the remote physiological processing system 66. The method ends at 514.
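  • Method 500 — prompting the user and escalating when the prompt is not dismissed — can be sketched in the same condensed style. The gesture names, response labels and return values are hypothetical assumptions.

```python
# Condensed, hypothetical sketch of control method 500 (FIG. 6): issue the
# prompt (330); if the user's gesture means "ignore", return to method 400
# (point B), otherwise take over vehicle control and notify remote systems.
# All callables are placeholders for the steps described in the text.
def control_method_500(issue_prompt, observe_gesture, response_settings,
                       take_control, notify_authorities, send_status):
    issue_prompt()                                         # 502: prompt 330
    gesture = observe_gesture()                            # 504: reaction sensors 39
    if response_settings.get(gesture) == "ignore_prompt":  # 506: response setting 328
        return "resume"                                    # -> B: back to method 400
    take_control()                                         # 508: vehicle control data 340
    notify_authorities()                                   # 510: message 358
    send_status()                                          # 512: remote message 356
    return "escalated"                                     # 514: end
```

  • A dismissing gesture short-circuits the escalation; any other observation results in vehicle control, authority notification and a status transmission.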
  • In FIG. 7, with continuing reference to FIGS. 1-4, a flowchart illustrates a control method 600 that can be performed by the remote physiological control module 68 of FIGS. 1 and 2 according to the present disclosure. As will be apparent from the disclosure, the order of operations in the method is not limited to the sequence shown in FIG. 7, but may be performed in one or more different orders, as appropriate, and in accordance with the present disclosure.
  • The method begins at 602. At 604, the method receives the remote message 356, the GPS data 354 and the physiological status data 352 from the vehicle 10. At 606, the method processes the received data and retrieves the user profile from the remote data store 72, based on the data received from the vehicle 10. At 608, the method outputs the GPS data 354, the user profile and the physiological status data 352 and transmits them to the one or more external authority systems 80. The method ends at 610.
  • It should be noted that the systems (for example, the HVAC system 21, the seating system 23, the infotainment system 25, the locking system 27, the lighting system 29, the window system 31 and the alarm system 33) and the vehicle control system 130 do not have to be activated based only on a physiological input. Instead, one or more of the systems and/or the vehicle control system 130 may also be activated such that one or more control signals are output to the systems (for example, the HVAC system 21, the seating system 23, the infotainment system 25, the locking system 27, the lighting system 29, the window system 31 and the alarm system 33) and/or the vehicle control system 130 based on the receipt of inputs from the user device 54, based on vehicle-to-vehicle (V2V) communication, etc.
  • Vehicle-user interaction based on gestures
  • With reference to FIG. 8, the interaction of the user with the vehicle 10 by means of one or more gestures is further explained. In general, the vehicle 10 can also be in communication with a portable device worn by the user, which can activate one or more functions of the vehicle based on a body gesture. In various embodiments, the portable device is the physiological monitoring device 102 worn by the user, which communicates with the vehicle 10 to initiate various vehicle functions and, in addition, communicates sensor signals or sensor data from the one or more physiological sensors 104. Therefore, in various embodiments, the physiological monitoring device 102 is designed to send various signals to the vehicle based on movements of the user that affect the physiological monitoring device 102.
  • The reaction sensors 39 of the vehicle 10 may include any selected sensors and communication receivers for receiving user inputs, specially programmed computer components for determining vehicle functions in accordance with the user inputs, and output components for enabling or initiating the determined vehicle functions. Portable devices, such as the physiological monitoring device 102, are in various embodiments configured to generate and transmit signals for reception by the vehicle 10, based on a movement of the user. For example, when a user initiates an emergency call or a text message by means of a gesture, whether by movement of a body-worn device or simply by a body movement, the vehicle 10 uses the communication system 36 to make the call or to send the text message.
  • FIG. 8 illustrates the computer-based system 700. In one embodiment, one or all components of the computing system 700 are arranged at a remote call or control center, such as the remote transport system 52 (FIG. 2).
  • The computer-based system 700 of FIG. 8 may also serve as a model for other electronic systems of existing technology, such as a wearable device, for example a smart bracelet, a smart ring, one or more smart cufflinks, a smart belt buckle, a smart shoe or boot (footwear) with accessories, smart legwear, smart armwear, smart clothing, smart headphones, smart headphone and microphone sets, a smart hat or other smart headgear, a smart wristwatch, smart glasses, smart sunglasses, smart earrings, etc., as described in further detail below, also in connection with FIG. 10. The computer-based system 700 is part of a primary computing unit of the vehicle 10, for example the controller 34 of the vehicle 10 (FIG. 1). The system and its components may be hardware-based. The computer-based system 700 includes a computer-readable storage medium or data storage device 704, and further includes a processing hardware unit 706 that is connected or connectable to the computer-readable storage device 704 using a communication link 708, for example a computer bus. In various embodiments, the processing hardware unit 706 is the processor 44.
  • The processing hardware unit 706 may include one or more processors, which may include distributed processors or parallel processors in a single machine or in multiple machines. The processing hardware unit may be used in supporting a virtual processing environment. The processing hardware unit may include a state machine, an application-specific integrated circuit (ASIC), or a programmable gate array (PGA) including a field PGA (FPGA). References herein to the processing hardware unit executing code or instructions to perform operations, tasks, functions, steps, or the like may mean that the processing hardware unit performs the operations directly and/or by facilitating, directing, or cooperating with another device or other components to perform the operations.
  • In various embodiments, the data storage device 704 is any volatile medium, non-volatile medium, removable medium or non-removable medium. The term computer-readable media and its variants, as used in the specification and claims, refers to tangible storage media. The media can be a device and can be non-transitory. In some embodiments, the storage media include volatile and/or non-volatile, removable and/or non-removable media, such as random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), solid-state memory or other storage technology, CD-ROM, DVD, BLU-RAY or other optical disk storage, magnetic tape, magnetic disk storage or other magnetic storage devices. The data storage device 704 includes one or more memory modules that store computer-readable instructions that can be executed by the processing hardware unit 706 to perform the functions of the computer-based system 700 described herein. For example, the data storage device 704 can comprise gesture-based vehicle function modules 710.
  • In various embodiments, the reaction sensors 39 provide information to the computer-based system 700, including information indicating the presence and movement of a user near the vehicle as well as movement of the user within the vehicle 10. Furthermore, the reaction sensors 39 may be a part of the sensor system 28.
  • FIG. 9 shows further details of the data storage device 704 of FIG. 8. The components of the data storage device 704 will now be further described with reference to the figure. The data storage device 704 comprises one or more modules 710. Furthermore, the memory can also comprise auxiliary components 712, for example additional software and/or data supporting performance of the methods according to the present disclosure.
  • The auxiliary components 712 may, for example, contain one or more user profiles. The profiles may include settings, default settings and/or factory settings for one or more users (e.g., drivers) of the vehicle. These and other data components are described elsewhere herein, also in the context of the operating method 1000 below. The technology can be personalized in these ways and adapted to the application. In various embodiments, the auxiliary components 712 include the response data store 306.
  • The modules 710 may include at least three (3) modules 802, 804, 806, which will be described in more detail in the next section. In one embodiment, the modules 710 include one or more additional modules. Some instructions may be part of more than one module, and functions described herein may be performed by execution, by the processor, of the corresponding module of the plurality of modules.
  • Functions described herein, but not explicitly associated with any of the three modules 802, 804, 806, may be part of one of the three modules and/or part of one or more additional support modules 808. The support module or support modules 808 may include, for example, a user identification module, a passenger identification module, a learning module (for example, to learn how a user gesticulates, or the nature of the user's natural movements or gestures, to improve the efficiency, performance, or interaction of user and system), and/or a recommendation, suggestion, or teaching module (for example, to advise a user on how a gesture should be performed to trigger selected vehicle functions, to improve the efficiency, performance, or interaction of user and system).
  • Each of the modules may be designated in one of several ways, such as by a term or designation indicating its function. The modules 802, 804, 806 of the computer-based system 700 may be referred to, for example, as: a user gesture determination module 802; a vehicle function identification module 804; and a vehicle function activation module 806; or the like.
  • The processing hardware unit 706, executing the user gesture determination module 802, determines what gesture a user has made based on user input data. The user input data may include one or more data components. The user input data is received by the processing hardware unit 706, executing the module 802, from one or more different data sources. Examples of data sources include one or more sensors of the physiological monitor 102 or of another portable device worn by the user, and one or more other sensors, such as sensors of the vehicle 10, which are configured and arranged to detect movement of one or more user body parts, such as a user's arm, wrist, head, etc. The other wearable devices include a smart wristband, a smart ring, one or more smart cufflinks, one or more smart belt buckles, a part of a shoe or boot (smart footwear), smart legwear, smart armwear, smart clothing, smart headphones, smart headphone-and-microphone sets, a smart hat or other headgear, smart earrings, smart sunglasses, or a smart wristwatch, to name just a few examples.
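  • The gesture determination performed by module 802 can be illustrated with a minimal sketch. The disclosure does not specify a classification algorithm; the function name, thresholds, and the accelerometer-trace representation below are illustrative assumptions only (a real system might use a trained model over the sensor signals).

```python
def classify_gesture(accel_trace):
    """Map a list of (x, y, z) wrist accelerations (m/s^2) to a
    coarse gesture label, as a stand-in for module 802."""
    if not accel_trace:
        return "none"
    # Peak magnitude separates noise from deliberate movement; trace
    # length separates a brief tap from a sustained swipe.
    mags = [(x * x + y * y + z * z) ** 0.5 for x, y, z in accel_trace]
    if max(mags) < 2.0:
        return "none"          # below assumed noise floor
    if len(accel_trace) <= 5:
        return "click"         # short, sharp movement
    return "swipe"             # longer, sustained movement

print(classify_gesture([(0.1, 0.0, 0.2)]))        # none
print(classify_gesture([(3.0, 0.0, 0.0)] * 3))    # click
print(classify_gesture([(2.5, 0.5, 0.0)] * 12))   # swipe
```

The same interface could be backed by camera-based detection via the reaction sensors 39 without changing the downstream modules.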
  • An example of a wearable smart-bracelet device is shown in FIG. 10, designated with reference number 900. The device 900 may be a computerized or electronic device having components analogous to those described in connection with FIGS. 8 and 9, for example, a memory unit having executable instructions and a processing device for executing the instructions. Furthermore, as explained, the device 900 may be the physiological monitor 102, or may have one or more features of the physiological monitoring device 102, and may therefore include one or more physiological sensors 104.
  • In various embodiments, the device 900 includes at least one transmitter or transceiver component for transmitting at least signals or messages to the vehicle, such as signals or messages corresponding to user gestures and the physiological sensor signals. The transmitter/transceiver may have any of the properties described above with respect to the communication components of the physiological monitor 102, or other properties. The transmitter/transceiver may, for example, be configured to communicate according to any one of many different protocols, including Bluetooth®, infrared, Infrared Data Association (IrDA), Near Field Communication (NFC), and the like, or improvements thereof.
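  • However such a message is carried (Bluetooth®, NFC, etc.), it must be serialized on the device 900 and parsed at the vehicle. The length-prefixed JSON framing below is purely an assumed wire format for illustration; the disclosure does not define one.

```python
import json
import struct

def frame_message(kind, payload):
    """Serialize a message as a 2-byte big-endian length prefix
    followed by a UTF-8 JSON body (assumed format)."""
    body = json.dumps({"kind": kind, "payload": payload}).encode("utf-8")
    return struct.pack(">H", len(body)) + body

def parse_message(frame):
    """Recover the message dict from a frame produced above."""
    (length,) = struct.unpack(">H", frame[:2])
    return json.loads(frame[2 : 2 + length].decode("utf-8"))

msg = frame_message("gesture", {"name": "swipe", "confidence": 0.9})
print(parse_message(msg)["payload"]["name"])  # swipe
```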
  • Furthermore, as described, in some embodiments the system or systems are configured to capture user gestures, such as through one or more vehicle sensors, when the user is wearing the device 900. The provided data source includes one or more sensors configured to detect movement of a user's body part, such as a wrist, head, arm, or hand. In FIG. 10, an arm, a wrist, and a hand of a user are shown. The sensors may include, but are not limited to, those described above in connection with the reaction sensors 39 of the physiological control system 100 of FIGS. 1 and 2, for example, at least one camera 39' and/or detection devices 40a, 40b, ..., 40n of the sensor system 28 of the vehicle 10 (FIG. 1).
  • In various embodiments, the vehicle 10 and/or the device 900 is configured to determine whether the user is present at or in the vicinity of the vehicle, for example, by determining that the portable device is located near the vehicle 10. The vehicle 10 may identify or authenticate the presence of the user for this purpose in any of various ways, together with or in addition to determining the proximity of the user's mobile device, such as by voice authentication, facial authentication, retinal scanning, etc. In various embodiments, the device 900 and/or the vehicle 10 detects and/or responds to user gestures only after the determination of the presence or proximity of the device 900 and/or the vehicle 10 has been made.
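  • The gating described above can be sketched as follows. The planar distance check, the 10 m radius, and the function names are illustrative assumptions; the disclosure leaves the proximity and authentication mechanisms open.

```python
import math

def within_range(vehicle_pos, device_pos, max_distance_m=10.0):
    """Planar distance check as a minimal stand-in for proximity detection."""
    dx = vehicle_pos[0] - device_pos[0]
    dy = vehicle_pos[1] - device_pos[1]
    return math.hypot(dx, dy) <= max_distance_m

def accept_gesture(gesture, vehicle_pos, device_pos, authenticated):
    """Pass a gesture through only once identity and proximity are confirmed."""
    if not authenticated:
        return None            # e.g., voice/facial/retinal check failed
    if not within_range(vehicle_pos, device_pos):
        return None            # user (or wearable) is not near the vehicle
    return gesture

print(accept_gesture("swipe", (0, 0), (3, 4), True))    # swipe
print(accept_gesture("swipe", (0, 0), (30, 40), True))  # None
```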
  • The processing hardware unit 706, executing the vehicle function activation module 806, performs the function or functions identified by the processing hardware unit 706 executing the previous modules 802, 804. Examples of functions include initiating a call to 911, locking or unlocking doors, etc.
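  • The identification (module 804) and activation (module 806) steps amount to a lookup and dispatch. The gesture-to-function mappings below are hypothetical examples, not mappings defined by the disclosure.

```python
# Hypothetical gesture-to-function table; the mappings are illustrative only.
GESTURE_ACTIONS = {
    "swipe": "unlock_doors",
    "click": "flash_lights",
    "raise_arm": "call_emergency",  # e.g., initiate a 911 call
}

def activate(gesture, actions=GESTURE_ACTIONS):
    """Stand-in for modules 804/806: identify and 'perform' the function."""
    action = actions.get(gesture)
    if action is None:
        return "no_action"     # unrecognized gestures change nothing
    return action

print(activate("swipe"))   # unlock_doors
print(activate("shrug"))   # no_action
```

In practice the table would be populated from the user's stored reaction settings rather than hard-coded.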
  • FIG. 11 shows example methods 1000 according to embodiments of the present technology. It should be noted that the steps of the methods 1000 need not be performed in any particular order, and that performance of some or all of the steps in an alternative order is possible and contemplated. Furthermore, it should be noted that the illustrated methods 1000 can be stopped at any time.
  • In certain embodiments, some or all steps of the method or methods 1000, and/or substantially equivalent steps, are performed by a processor, such as a computer processor, executing computer-executable instructions stored or contained in a computer-readable medium, such as the data storage device 704 of the computer-based system 700 described above.
  • The flow of the methods 1000 is divided into four sections as an example: a user personalization and input section 1010; a comfort/convenience vehicle function section 1020; a local alarm vehicle function section 1030; and a remote communication or alarm section 1040.
  • At 1011, a user puts on or wears the physiological monitor 102, as explained by the example of the device 900 of FIG. 10, or another mobile device, such as a smartphone. At 1012, one or more sensors and one or more computer systems of the mobile device and/or of the vehicle teach and/or learn user movements, such as gestures, and associated desired vehicle functions. One or more of these learned movements may be stored as a reaction setting in the reaction data store 306. In various embodiments, the system may include a standard organization of gestures available for use, and/or the user may organize the gestures, for example, by establishing interaction levels in the system, for example, a first level for comfort gestures and a second level for emergency situations. The teaching may include suggesting gestures for the user to use to trigger corresponding vehicle functions. The suggestions may be communicated to the user from the vehicle through a user device or through the display 25a (FIG. 1). The suggestions may include standard gestures that are already associated with corresponding vehicle functions. The user selects one or more of the suggestions as preferences 368, which can be stored in the reaction data store 306.
  • At 1013, the method sets default gesture controls or personalized gesture controls based on user input, standard programming, commands, or updates from the remote transport system 52. At 1014, the method determines the state of the user. This may include, for example, determining that the user is approaching the vehicle 10, is near the vehicle, is in the vehicle, or is leaving the vehicle 10.
  • At 1015, the method detects a gesture or movement of the user and identifies the gesture based on signals received from the reaction sensors 39. The method determines a vehicle function or vehicle response according to the user movement. In various embodiments, the method retrieves the reaction setting 328 based on the identified gesture, and initiates the response or function of the vehicle based on the retrieved reaction setting 328.
  • At 1021, the method implements local comfort or convenience functions that were determined in the previous operation 1015. Examples of functions in this section 1020 include, but are not limited to, illuminating or flashing vehicle exterior lights (headlamps, taillights, turn signals), illuminating under-body and/or interior lights, locking/unlocking doors, opening/closing a door, lid, or trunk, activating the state control module 312, overriding the request 330, etc.
  • At 1031, the method implements local alarm or emergency functions that were determined in the previous operation 1015. Examples of local functions include operating the vehicle's horn, flashing exterior or interior lights, etc. In some embodiments, the functions include the vehicle recording audio and/or video, for example, for recording a potentially criminal situation involving or occurring near the user. At 1041, the method implements additional vehicle-related functions determined in the previous operation 1015. Examples of such functions include initiating a telephone call or text message, sending a GPS location, or sending a video, such as one recorded at 1031. The telephone call may go to 911, may be an automatic call in which the vehicle delivers a message to the recipient, or may be a user call in which live audio is transmitted. In one embodiment, the functions include any nearby user mobile device or recording device, such as parking-lot infrastructure, recording audio and/or video, for example, to record a potentially criminal situation involving or occurring near the user.
  • The methods 1000 may end, or one or more operations of the methods 1000 may be performed again.
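  • The operations 1014 through 1041 can be condensed into a short sketch. The control structure, user states, and settings format below are assumptions made for illustration; the disclosure describes the sections only at the level of FIG. 11.

```python
def run_method_1000(user_state, gesture, settings):
    """Condensed sketch of sections 1010-1040: gate on user state (1014),
    look up the reaction setting (1015), then dispatch by section."""
    events = []
    if user_state not in {"approaching", "near", "inside", "leaving"}:
        return events                      # no recognized user state
    setting = settings.get(gesture)        # 1015: retrieve reaction setting
    if setting is None:
        return events
    section, function = setting
    if section == "comfort":               # 1021: local comfort functions
        events.append(("local_comfort", function))
    elif section == "alarm":               # 1031: local alarm functions
        events.append(("local_alarm", function))
        events.append(("remote", "send_gps_location"))  # 1041 follow-up
    return events

settings = {"swipe": ("comfort", "unlock_doors"),
            "raise_arm": ("alarm", "sound_horn")}
print(run_method_1000("near", "swipe", settings))
# [('local_comfort', 'unlock_doors')]
```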
  • FIG. 12 shows an arrangement 1100 of example system inputs 1110 and outputs 1150, separated by a gesture recognition system 1160, according to embodiments of the present technology. The inputs 1110 can be divided into three primary types: user gestures 1120, inputs 1130 from outside (outside the vehicle), and inputs 1140 on board (aboard the vehicle).
  • Examples of user gestures 1120 include any of those described above, for example, a user body rotation 1121, pointing or moving straight 1122, a swipe 1123, and a click 1124. Examples of on-board inputs 1140 include inputs from one or more vehicle cameras 1141, one or more other vehicle sensors 1142, a Bluetooth input 1143 to the vehicle 10, a remote input 1144 to the vehicle 10, for example, from the remote transport system 52, input from an application 1145 of the vehicle or of a mobile device, such as an app for navigation or wearable location, vehicle or vehicle-related control or function input 1146, for example, from a user touchpad, vehicle lighting, a key fob, or a lock/unlock button, and vehicle location input 1147. Examples of inputs 1130 from outside include location information (e.g., GPS) or other data input from a satellite 1131, from a cellular device 1132, via V2X 1133 (V2V, V2I, etc.), or data over the Internet 1134, each connected in any suitable manner.
  • The gesture recognition system 1160, in various embodiments, includes any of the components described above that are associated with gesture recognition functions, such as a user mobile device or vehicle sensors and computer systems.
  • The output functions 1150 include, but are not limited to, any of those described above, such as illuminating vehicle lights 1151, locking/unlocking vehicle door locks 1152, operating the vehicle horn 1153, initiating a communication 1154, such as a call or text message, or sending 1155 the mobile-device or vehicle location and/or audio or video recorded by the mobile device, the vehicle, or a nearby arrangement, such as a parking-lot camera. In various embodiments, the output functions 1150 further include setting the gesture activation 365 for the condition control module 312 and adjusting the reaction 324 for the physiological monitoring module 304.
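  • The mapping from recognized gestures 1120 to output functions 1150 can be sketched as a routing table. The specific pairings below are illustrative assumptions; the reference-numeral suffixes merely echo FIG. 12 and carry no meaning to the code.

```python
def route_outputs(recognized, in_emergency=False):
    """Map a recognized gesture to a list of output-function names,
    loosely following the 1150-series labels of FIG. 12 (assumed mapping)."""
    table = {
        "body_rotation": ["light_1151"],
        "point": ["locks_1152"],
        "swipe": ["locks_1152", "light_1151"],
        # The same gesture may mean something stronger in an emergency level.
        "click": ["horn_1153"] if in_emergency else ["light_1151"],
    }
    return table.get(recognized, [])

print(route_outputs("swipe"))                     # ['locks_1152', 'light_1151']
print(route_outputs("click", in_emergency=True))  # ['horn_1153']
```

Splitting comfort and emergency behavior on a flag mirrors the interaction levels described at operation 1012.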
  • Furthermore, the following examples are also provided, numbered for ease of reference:
    • 1. A method for controlling a vehicle based on a physiological feature, comprising: receiving physiological data from one or more physiological sensors; processing the received physiological data by a processor to determine one or more physiological conditions; and outputting, based on the determined physiological condition, one or more control signals to a vehicle system for controlling an operation of the vehicle system.
    • 2. The method of Example 1, further comprising: processing the received physiological data by the processor to generate a baseline of a physiological reading; processing subsequently received physiological data by the processor to determine a physiological change; and outputting, based on the determined physiological change, one or more control signals to the vehicle system for controlling the operation of the vehicle system.
    • 3. The method of Example 2, further comprising: Retrieving a setting associated with the determined physiological change, wherein the output of the one or more control signals to the vehicle system is based on the setting.
    • 4. The method of Example 3, wherein the setting is a custom setting or a predetermined default setting.
    • 5. The method of any one of Examples 1 to 4, further comprising: issuing a notification to an external authority system based on the determined physiological condition.
    • 6. The method of any one of examples 1 to 5, further comprising: outputting at least one of a message and a GPS location of the vehicle based on the detected physiological condition.
    • 7. The method of any one of Examples 1 to 6, further comprising: issuing a prompt based on the determined physiological condition; and determining, by the processor, whether a user response has been received; wherein the output of one or more control signals to the vehicle system is based on the user response.
    • 8. The method of Example 7, wherein the determining, by the processor, whether the user response has been received further comprises: receiving sensor data from one or more reaction sensors; Processing the sensor data to determine if the user made a gesture; and determining that the response has been received based on the determination of the gesture.
    • 9. A system for controlling a vehicle based on a physiological feature, comprising: a source of physiological data relating to a user of the vehicle; and a control module having a processor that processes the physiological data, outputs at least one or more control signals to a vehicle control system to autonomously operate the vehicle based on the physiological data, and outputs one or more control signals to the vehicle control system to deliver commands to one or more of the following systems: an HVAC system, a seating system, an infotainment system, a locking system, a lighting system, a window system, and an alarm system, based on the physiological data.
    • 10. The system of Example 9, wherein the source of physiological data is a personal device associated with the user.
    • 11. The system of Example 9 or 10, wherein the processor processes the physiological data to generate a baseline of a physiological reading, processes subsequently received physiological data to determine a physiological change, and outputs one or more control signals to the vehicle control system based on the determined physiological change.
    • 12. The system of Example 11, wherein the processor retrieves a setting associated with the detected physiological change and outputs the one or more control signals to the vehicle control system based on the setting.
    • 13. The system of Example 12, wherein the setting is a user-defined setting or a predetermined default setting.
    • 14. The system of any one of Examples 9 to 13, wherein the processor processes the received physiological data and determines one or more physiological conditions, and outputs one or more control signals to the vehicle control system based on the detected physiological condition.
    • 15. The system of Example 14, wherein the processor outputs a message to an external authority system based on the detected physiological condition.
    • 16. The system of Example 14 or 15, wherein the processor outputs at least one of a message and a GPS location of the vehicle based on the detected physiological condition.
    • 17. The system of Example 14, 15, or 16, wherein the processor issues a prompt based on the determined physiological condition and determines whether a user response has been received, and the control module outputs the one or more control signals to the at least one vehicle system based on the user response.
    • 18. The system of Example 17, wherein the processor determines whether the user response has been received based on sensor data received from one or more reaction sensors, processes the sensor data to determine whether a gesture has been made by the user, and determines that the user response has been received based on the determination of the gesture.
    • 19. A portable physiological apparatus comprising: at least one physiological sensor that observes at least one physiological condition associated with a wearer of the physiological apparatus and generates sensor signals based thereon; and a control module having a processor that processes the sensor signals and outputs the sensor signals to a system associated with a vehicle.
    • 20. The portable physiological device of Example 19, wherein the portable physiological device is a portable electronic device selected from the group consisting of a wristwatch, a ring, an earring, a bracelet, a cufflink, a necklace, a tie, glasses, a chest band, smart clothes, and combinations thereof.
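    • The baseline-and-change logic of Examples 2 and 11 can be sketched as follows. The deviation rule (a multiple of the baseline's standard deviation), the heart-rate sample values, and the function name are illustrative assumptions; the disclosure does not prescribe how a "physiological change" is computed.

```python
from statistics import mean, stdev

def detect_change(baseline_samples, new_value, k=3.0):
    """Flag a physiological change when a new reading deviates more than
    k standard deviations from the baseline of earlier readings."""
    mu = mean(baseline_samples)
    sigma = stdev(baseline_samples)
    return abs(new_value - mu) > k * sigma

resting_hr = [62, 64, 63, 61, 65, 63]   # beats per minute (example data)
print(detect_change(resting_hr, 64))    # False: within normal variation
print(detect_change(resting_hr, 110))   # True: marked elevation
```

A detected change would then drive the retrieval of the associated setting (Examples 3 and 12) and the output of control signals to the vehicle system.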
  • While at least one exemplary embodiment has been presented in the foregoing detailed description, it should be understood that a large number of variations exist. It should also be understood that the exemplary embodiment or embodiments are only examples, and are not intended to limit the scope, applicability, or configuration of the disclosure in any way. Instead, the foregoing detailed description provides those skilled in the art with a convenient roadmap for implementing the exemplary embodiment(s). It should be understood that various changes in the function and arrangement of elements may be made without departing from the scope of the disclosure as set forth in the appended claims and their legal equivalents.
  • REFERENCES CITED IN THE DESCRIPTION
  • This list of the documents cited by the applicant has been generated automatically and is included solely for the better information of the reader. The list is not part of the German patent or utility model application. The DPMA assumes no liability for any errors or omissions.
  • Cited non-patent literature
    • IEEE 802.11 [0043]

Claims (10)

  1. A method of controlling a vehicle based on a physiological feature, comprising: receiving physiological data from one or more physiological sensors; processing the received physiological data by a processor to determine one or more physiological conditions; and outputting, based on the determined physiological condition, one or more control signals to a vehicle system for controlling an operation of the vehicle system.
  2. The method of claim 1, further comprising: processing the received physiological data by the processor to generate a baseline of a physiological reading; processing subsequently received physiological data by the processor to determine a physiological change; and outputting, based on the determined physiological change, one or more control signals to the vehicle system for controlling the operation of the vehicle system.
  3. The method of claim 2, further comprising: retrieving a setting associated with the determined physiological change; wherein the output of the one or more control signals to the vehicle system is based on the setting.
  4. The method of claim 3, wherein the setting is a user-defined setting or a predetermined default setting.
  5. The method of any of claims 1 to 4, further comprising: Output a message to an external authority system based on the detected physiological condition.
  6. The method of any one of claims 1 to 5, further comprising: Issuing a request based on the detected physiological condition; and Determining by the processor whether a user response has been received; wherein the output of the one or more control signals to the vehicle system is based on the user response.
  7. The method of claim 6, wherein the determining, by the processor, whether the user response has been received further comprises: receiving sensor data from one or more reaction sensors; processing the sensor data to determine whether a gesture has been made by the user; and determining that the response has been received based on the determination of the gesture.
  8. A system for controlling a vehicle based on a physiological feature, comprising: a source of physiological data relating to a user of the vehicle; and a control module having a processor that processes the physiological data, outputs at least one or more control signals to a vehicle control system to autonomously operate the vehicle based on the physiological data, and outputs one or more control signals to the vehicle control system to deliver commands to one or more of the following systems: an HVAC system, a seating system, an infotainment system, a locking system, a lighting system, a window system, and an alarm system, based on the physiological data.
  9. The system of claim 8, wherein the source of physiological data is a personal device associated with the user.
  10. A portable physiological device comprising: at least one physiological sensor that observes at least one physiological condition associated with a wearer of the physiological device and generates sensor signals based thereon; and a control module having a processor that processes the sensor signals and outputs the sensor signals to a system associated with a vehicle.
DE102017101343.1A 2015-11-03 2017-01-25 Systems and methods for vehicle system control based on physiological characteristics Pending DE102017101343A1 (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
US201662287422P true 2016-01-26 2016-01-26
US62/287,422 2016-01-26
US15/410,582 US10137777B2 (en) 2015-11-03 2017-01-19 Systems and methods for vehicle system control based on physiological traits
US15/410,582 2017-01-19

Publications (1)

Publication Number Publication Date
DE102017101343A1 true DE102017101343A1 (en) 2017-07-27

Family

ID=59295917

Family Applications (1)

Application Number Title Priority Date Filing Date
DE102017101343.1A Pending DE102017101343A1 (en) 2015-11-03 2017-01-25 Systems and methods for vehicle system control based on physiological characteristics

Country Status (2)

Country Link
CN (1) CN107121952A (en)
DE (1) DE102017101343A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102017130628A1 (en) 2017-12-20 2019-07-11 Autoliv Development Ab System, method and computer program product for increasing driving safety in traffic

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8509982B2 (en) * 2010-10-05 2013-08-13 Google Inc. Zone driving
EP2876620B1 (en) * 2012-07-17 2019-08-14 Nissan Motor Company, Limited Driving assistance system and driving assistance method
US10088844B2 (en) * 2013-11-22 2018-10-02 Ford Global Technologies, Llc Wearable computer in an autonomous vehicle
US9539999B2 (en) * 2014-02-28 2017-01-10 Ford Global Technologies, Llc Vehicle operator monitoring and operations adjustments
US20150302718A1 (en) * 2014-04-22 2015-10-22 GM Global Technology Operations LLC Systems and methods for interpreting driver physiological data based on vehicle events
CN104216514A (en) * 2014-07-08 2014-12-17 深圳市华宝电子科技有限公司 Method and device for controlling vehicle-mounted device, and vehicle
CN105117147A (en) * 2015-07-24 2015-12-02 上海修源网络科技有限公司 Method and apparatus for manipulating vehicle-mounted operating system based on gesture and vehicle-mounted device
CN105138125A (en) * 2015-08-25 2015-12-09 华南理工大学 Intelligent vehicle-mounted system based on Leapmotion gesture recognition

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
IEEE 802.11


Also Published As

Publication number Publication date
CN107121952A (en) 2017-09-01
