US20130145401A1 - Music streaming - Google Patents

Info

Publication number
US20130145401A1
Application US13/679,875 (US201213679875A)
Authority
US
United States
Prior art keywords
vehicle
available
multimedia
device
communication system
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US13/679,875
Inventor
Christopher P. Ricci
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
AutoConnect Holdings LLC
Original Assignee
Flextronics AP LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority to US201161560509P
Priority to US201261637164P
Priority to US201261646747P
Priority to US201261653264P
Priority to US201261653275P
Priority to US201261653563P
Priority to US201261663335P
Priority to US201261672483P
Priority to US201261714016P
Priority to US201261715699P
Application filed by Flextronics AP LLC
Priority to US13/679,875
Priority claimed from US13/679,842 (US8979159B2)
Priority claimed from US13/829,718 (US9043073B2)
Priority claimed from US13/829,505 (US9088572B2)
Priority claimed from US13/828,651 (US9055022B2)
Priority claimed from US13/830,003 (US9008906B2)
Priority claimed from US13/828,960 (US9173100B2)
Priority claimed from US13/830,133 (US9081653B2)
Priority claimed from US13/828,513 (US9116786B2)
Assigned to FLEXTRONICS AP, LLC (assignment of assignors interest; assignor: RICCI, CHRISTOPHER P.)
Publication of US20130145401A1
Priority claimed from US13/963,728 (US9098367B2)
Priority claimed from US14/253,006 (US9384609B2)
Priority claimed from US14/253,204 (US9147296B2)
Priority claimed from US14/253,506 (US9082239B2)
Priority claimed from US14/253,706 (US9147298B2)
Priority claimed from US14/253,405 (US9082238B2)
Priority claimed from US14/253,838 (US9373207B2)
Priority claimed from US14/252,978 (US9378601B2)
Priority claimed from US14/543,535 (US9412273B2)
Priority claimed from US14/831,696 (US9545930B2)
Assigned to AUTOCONNECT HOLDINGS LLC (assignment of assignors interest; assignor: FLEXTRONICS AP, LLC)
Priority claimed from US14/847,849 (US20160070527A1)
Priority claimed from US14/863,361 (US20160086391A1)
Priority claimed from US14/875,472 (US20160114745A1)
Priority claimed from US14/978,185 (US20160185222A1)
Priority claimed from US14/979,272 (US20160189544A1)
Priority claimed from US14/991,236 (US20160196745A1)
Priority claimed from US15/073,955 (US20160306766A1)
Priority claimed from US15/099,413 (US20160247377A1)
Priority claimed from US15/133,793 (US20160255575A1)
Priority claimed from US15/400,947 (US20170247000A1)
Application status: Abandoned

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
        • B60 VEHICLES IN GENERAL
            • B60K ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
                • B60K35/00 Arrangement of adaptations of instruments
                • B60K37/00 Dashboards
                    • B60K37/02 Arrangement of instruments
                    • B60K37/04 Arrangement of fittings on dashboard
                        • B60K37/06 of controls, e.g. controls knobs
                • B60K2370/00 Details of arrangements or adaptations of instruments specially adapted for vehicles, not covered by groups B60K35/00, B60K37/00
                    • B60K2370/10 Input devices or features thereof
                        • B60K2370/11 Graphical user interfaces or menu aspects
                        • B60K2370/12 Input devices or input features
                            • B60K2370/143 Touch sensitive input devices
                                • B60K2370/1438 Touch screens
                            • B60K2370/146 Input by gesture
                                • B60K2370/1464 3D-gesture
                                • B60K2370/1468 Touch gesture
            • B60R VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
                • B60R7/00 Stowing or holding appliances inside vehicle primarily intended for personal property smaller than suit-cases, e.g. travelling articles, or maps
                    • B60R7/04 in driver or passenger space, e.g. using racks
                • B60R16/00 Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for
                    • B60R16/02 electric constitutive elements
                        • B60R16/037 for occupant comfort, e.g. for automatic adjustment of appliances according to personal settings, e.g. seats, mirrors, steering wheel
                • B60R21/00 Arrangements or fittings on vehicles for protecting or preventing injuries to occupants or pedestrians in case of accidents or other traffic risks
                    • B60R21/01 Electrical circuits for triggering passive safety arrangements, e.g. airbags, safety belt tighteners, in case of vehicle accidents or impending vehicle accidents
                        • B60R21/015 including means for detecting the presence or position of passengers, passenger seats or child seats, and the related safety parameters therefor, e.g. speed or timing of airbag inflation in relation to occupant position or seat belt use
                            • B60R21/01512 Passenger detection systems
            • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
                • B60W30/00 Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units, or advanced driver assistance systems for ensuring comfort, stability and safety or drive control systems for propelling or retarding the vehicle
                    • B60W30/18 Propelling the vehicle
                        • B60W30/182 Selecting between different operative modes, e.g. comfort and performance modes
                • B60W40/00 Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
                    • B60W40/02 related to ambient conditions
                        • B60W40/04 Traffic conditions
                    • B60W40/08 related to drivers or passengers
                        • B60W40/09 Driving style or behaviour
                • B60W50/00 Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
                    • B60W50/0098 Details of control systems ensuring comfort, safety or stability not otherwise provided for
                    • B60W50/08 Interaction between the driver and the control system
    • G PHYSICS
        • G01 MEASURING; TESTING
            • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
                • G01C21/00 Navigation; Navigational instruments not provided for in preceding groups G01C1/00-G01C19/00
                    • G01C21/20 Instruments for performing navigational calculations
            • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
                • G01S19/00 Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
                    • G01S19/01 Satellite radio beacon positioning systems transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
                        • G01S19/13 Receivers
        • G02 OPTICS
            • G02B OPTICAL ELEMENTS, SYSTEMS, OR APPARATUS
                • G02B27/00 Other optical systems; Other optical apparatus
                    • G02B27/01 Head-up displays
                        • G02B27/0101 characterised by optical features
        • G06 COMPUTING; CALCULATING; COUNTING
            • G06F ELECTRIC DIGITAL DATA PROCESSING
                • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
                    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
                        • G06F3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
                        • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
                            • G06F3/0481 based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
                                • G06F3/04817 using icons
                                • G06F3/0482 interaction with lists of selectable items, e.g. menus
                            • G06F3/0484 for the control of specific functions or operations, e.g. selecting or manipulating an object or an image, setting a parameter value or selecting a range
                                • G06F3/04842 Selection of a displayed object
                                • G06F3/04847 Interaction techniques to control parameter settings, e.g. interaction with sliders, dials
                                • G06F3/0486 Drag-and-drop
                            • G06F3/0487 using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
                                • G06F3/0488 using a touch-screen or digitiser, e.g. input of commands through traced gestures
                                    • G06F3/04883 for entering handwritten data, e.g. gestures, text
                • G06F8/00 Arrangements for software engineering
                    • G06F8/60 Software deployment
                        • G06F8/61 Installation
                • G06F9/00 Arrangements for program control, e.g. control units
                    • G06F9/06 using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
                        • G06F9/44 Arrangements for executing specific programs
                            • G06F9/445 Program loading or initiating
                                • G06F9/44505 Configuring for program initiating, e.g. using registry, configuration files
                        • G06F9/46 Multiprogramming arrangements
                            • G06F9/54 Interprogram communication
                • G06F11/00 Error detection; Error correction; Monitoring
                    • G06F11/07 Responding to the occurrence of a fault, e.g. fault tolerance
                        • G06F11/16 Error detection or correction of the data by redundancy in hardware
                            • G06F11/20 using active fault-masking, e.g. by switching out faulty elements or by switching in spare elements
                                • G06F11/202 where processing functionality is redundant
                                    • G06F11/2023 Failover techniques
                    • G06F11/30 Monitoring
                        • G06F11/3003 Monitoring arrangements specially adapted to the computing system or computing system component being monitored
                            • G06F11/3013 where the computing system is an embedded system, i.e. a combination of hardware and software dedicated to perform a certain function in mobile devices, printers, automotive or aircraft systems
                        • G06F11/3065 Monitoring arrangements determined by the means or processing involved in reporting the monitored data
                        • G06F11/32 Monitoring with visual or acoustical indication of the functioning of the machine
                            • G06F11/324 Display of status information
                                • G06F11/328 Computer systems status display
                • G06F13/00 Interconnection of, or transfer of information or other signals between, memories, input/output devices or central processing units
                    • G06F13/14 Handling requests for interconnection or transfer
                        • G06F13/36 for access to common bus or bus system
                            • G06F13/362 with centralised access control
                                • G06F13/364 using independent requests or grants, e.g. using separated request and grant lines
                • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
                    • G06F16/20 of structured data, e.g. relational data
                        • G06F16/24 Querying
                        • G06F16/29 Geographical information databases
                • G06F17/00 Digital computing or data processing equipment or methods, specially adapted for specific functions
                • G06F21/00 Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
                    • G06F21/10 Protecting distributed programs or content, e.g. vending or licensing of copyrighted material
                        • G06F21/12 Protecting executable software
                            • G06F21/121 Restricting unauthorised execution of programs
                    • G06F21/30 Authentication, i.e. establishing the identity or authorisation of security principals
                        • G06F21/31 User authentication
                            • G06F21/33 using certificates
                                • G06F21/335 for accessing specific resources, e.g. using Kerberos tickets
                    • G06F21/60 Protecting data
                        • G06F21/62 Protecting access to data via a platform, e.g. using keys or access control rules
                            • G06F21/629 to features or functions of an application
            • G06K RECOGNITION OF DATA; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
                • G06K9/00 Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
                    • G06K9/00335 Recognising movements or behaviour, e.g. recognition of gestures, dynamic facial expressions; Lip-reading
                        • G06K9/00355 Recognition of hand or arm movements, e.g. recognition of deaf sign language
                    • G06K9/00362 Recognising human body or animal bodies, e.g. vehicle occupant, pedestrian; Recognising body parts, e.g. hand
                    • G06K9/00624 Recognising scenes, i.e. recognition of a whole field of perception; recognising scene-specific objects
                        • G06K9/00832 Recognising scenes inside a vehicle, e.g. related to occupancy, driver state, inner lighting conditions
            • G06N COMPUTER SYSTEMS BASED ON SPECIFIC COMPUTATIONAL MODELS
                • G06N5/00 Computer systems using knowledge-based models
                    • G06N5/02 Knowledge representation
            • G06Q DATA PROCESSING SYSTEMS OR METHODS, SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL, SUPERVISORY OR FORECASTING PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL, SUPERVISORY OR FORECASTING PURPOSES, NOT OTHERWISE PROVIDED FOR
                • G06Q30/00 Commerce, e.g. shopping or e-commerce
                    • G06Q30/02 Marketing, e.g. market research and analysis, surveying, promotions, advertising, buyer profiling, customer management or rewards; Price estimation or determination
                        • G06Q30/0241 Advertisement
                            • G06Q30/0251 Targeted advertisement
                                • G06Q30/0265 Vehicular advertisement
                • G06Q40/00 Finance; Insurance; Tax strategies; Processing of corporate or income taxes
                    • G06Q40/08 Insurance, e.g. risk analysis or pensions
                • G06Q50/00 Systems or methods specially adapted for specific business sectors, e.g. utilities or tourism
                    • G06Q50/10 Services
                        • G06Q50/26 Government or public services
            • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
                • G06T19/00 Manipulating 3D models or images for computer graphics
                    • G06T19/006 Mixed reality
        • G07 CHECKING-DEVICES
            • G07C TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
                • G07C5/00 Registering or indicating the working of vehicles
                    • G07C5/006 Indicating maintenance
                    • G07C5/008 communicating information to a remotely located station
                    • G07C5/02 Registering or indicating driving, working, idle, or waiting time only
                    • G07C5/08 Registering or indicating performance data other than driving, working, idle, or waiting time, with or without registering driving, working, idle or waiting time
                        • G07C5/0816 Indicating performance data, e.g. occurrence of a malfunction
                            • G07C5/0825 using optical means
                            • G07C5/0833 using audio means
                        • G07C5/0841 Registering performance data
                            • G07C5/085 using electronic data carriers
        • G08 SIGNALLING
            • G08C TRANSMISSION SYSTEMS FOR MEASURED VALUES, CONTROL OR SIMILAR SIGNALS
                • G08C19/00 Electric signal transmission systems
            • G08G TRAFFIC CONTROL SYSTEMS
                • G08G1/00 Traffic control systems for road vehicles
                    • G08G1/01 Detecting movement of traffic to be counted or controlled
                        • G08G1/0104 Measuring and analyzing of parameters relative to traffic conditions
                            • G08G1/0108 based on the source of data
                                • G08G1/0112 from the vehicle, e.g. floating car data [FCD]
                                • G08G1/0116 from roadside infrastructure, e.g. beacons
                                • G08G1/012 from other sources than vehicle or roadside beacons, e.g. mobile networks
                            • G08G1/0125 Traffic data processing
                                • G08G1/0129 for creating historical data or processing based on historical data
                                • G08G1/0133 for classifying traffic situation
                            • G08G1/0137 for specific applications
                                • G08G1/0141 for traffic information dissemination
                        • G08G1/017 identifying vehicles
                    • G08G1/09 Arrangements for giving variable traffic instructions
                        • G08G1/0962 having an indicator mounted inside the vehicle, e.g. giving voice messages
                            • G08G1/0967 Systems involving transmission of highway information, e.g. weather, speed limits
                                • G08G1/096708 where the received information might be used to generate an automatic action on the vehicle control
                                    • G08G1/096716 where the received information does not generate an automatic action on the vehicle control
                                    • G08G1/096725 where the received information generates an automatic action on the vehicle control
                                • G08G1/096766 where the system is characterised by the origin of the information transmission
                                    • G08G1/096775 where the origin of the information is a central station
                                    • G08G1/096783 where the origin of the information is a roadside individual element
                                    • G08G1/096791 where the origin of the information is another vehicle
                            • G08G1/0968 Systems involving transmission of navigation instructions to the vehicle
                                • G08G1/096805 where the transmitted instructions are used to compute a route
                                    • G08G1/096827 where the route is computed onboard
                                • G08G1/096833 where different aspects are considered when computing the route
                                    • G08G1/096844 where the complete route is dynamically recomputed based on new data
                    • G08G1/14 Traffic control systems for road vehicles indicating individual free spaces in parking areas
                        • G08G1/141 with means giving the indication of available parking spaces
                            • G08G1/143 inside the vehicles
                    • G08G1/16 Anti-collision systems
    • H ELECTRICITY
        • H04 ELECTRIC COMMUNICATION TECHNIQUE
            • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
                • H04L41/00 Arrangements for maintenance or administration or management of packet switching networks
                    • H04L41/14 involving network analysis or design, e.g. simulation, network model or planning
                • H04L51/00 Arrangements for user-to-user messaging in packet-switching networks, e.g. e-mail or instant messages
                    • H04L51/36 Unified messaging, e.g. interactions between instant messaging, e-mail or other types of messages such as converged IP messaging [CPM]
                • H04L63/00 Network architectures or network communication protocols for network security
                    • H04L63/10 for controlling access to network resources
                • H04L67/00 Network-specific arrangements or communication protocols supporting networked applications
                    • H04L67/02 involving the use of web-based technology, e.g. hyper text transfer protocol [HTTP]
                    • H04L67/12 adapted for proprietary or special purpose networking environments, e.g. medical networks, sensor networks, networks in a car or remote metering networks
                        • H04L67/125 involving the control of end-device applications over a network
            • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
                • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
                    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
                        • H04N21/41 Structure of client; Structure of client peripherals
                            • H04N21/414 Specialised client platforms, e.g. receiver in car or embedded in a mobile appliance
                                • H04N21/41422 located in transportation means, e.g. personal vehicle
                        • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network, synchronizing decoder's clock; Client middleware
                            • H04N21/433 Content storage operation, e.g. storage operation in response to a pause request, caching operations
                                • H04N21/4331 Caching operations, e.g. of an advertisement for later insertion during playback
                        • H04N21/47 End-user applications
                            • H04N21/482 End-user interface for program selection
            • H04W WIRELESS COMMUNICATION NETWORKS
                • H04W4/00 Services specially adapted for wireless communication networks; Facilities therefor
                    • H04W4/30 Services specially adapted for particular environments, situations or purposes
                        • H04W4/40 for vehicles, e.g. vehicle-to-pedestrians [V2P]
                            • H04W4/48 for in-vehicle communication
                    • H04W4/50 Service provisioning or reconfiguring
                    • H04W4/80 Services using short range communication, e.g. near-field communication [NFC], radio-frequency identification [RFID] or low energy communication
                    • H04W4/90 Services for handling of emergency or hazardous situations, e.g. earthquake and tsunami warning systems [ETWS]
                • H04W8/00 Network data management
                    • H04W8/22 Processing or transfer of terminal data, e.g. status or physical capabilities
                • H04W84/00 Network topologies
                    • H04W84/005 Moving wireless networks
    • B60K2370/1472Multi-touch gesture
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K2370/00Details of arrangements or adaptations of instruments specially adapted for vehicles, not covered by groups B60K35/00, B60K37/00
    • B60K2370/10Input devices or features thereof
    • B60K2370/12Input devices or input features
    • B60K2370/146Input by gesture
    • B60K2370/1468Touch gesture
    • B60K2370/1476Handwriting
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K2370/00Details of arrangements or adaptations of instruments specially adapted for vehicles, not covered by groups B60K35/00, B60K37/00
    • B60K2370/16Type of information
    • B60K2370/164Infotainment
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K2370/00Details of arrangements or adaptations of instruments specially adapted for vehicles, not covered by groups B60K35/00, B60K37/00
    • B60K2370/16Type of information
    • B60K2370/166Navigation
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K2370/00Details of arrangements or adaptations of instruments specially adapted for vehicles, not covered by groups B60K35/00, B60K37/00
    • B60K2370/20Optical features of instruments
    • B60K2370/21Optical features of instruments using cameras
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K2370/00Details of arrangements or adaptations of instruments specially adapted for vehicles, not covered by groups B60K35/00, B60K37/00
    • B60K2370/50Control arrangements; Data network features
    • B60K2370/55Remote controls
    • B60K2370/56Remote controls using mobile devices
    • B60K2370/566Mobile devices displaying vehicle information
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K2370/00Details of arrangements or adaptations of instruments specially adapted for vehicles, not covered by groups B60K35/00, B60K37/00
    • B60K2370/50Control arrangements; Data network features
    • B60K2370/58Data transfers
    • B60K2370/589Wireless
    • B60K2370/5894SIM cards
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K2370/00Details of arrangements or adaptations of instruments specially adapted for vehicles, not covered by groups B60K35/00, B60K37/00
    • B60K2370/50Control arrangements; Data network features
    • B60K2370/58Data transfers
    • B60K2370/589Wireless
    • B60K2370/5899Internet
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K2370/00Details of arrangements or adaptations of instruments specially adapted for vehicles, not covered by groups B60K35/00, B60K37/00
    • B60K2370/60Structural details of dashboards or instruments
    • B60K2370/68Features of instruments
    • B60K2370/691Housings
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K2370/00Details of arrangements or adaptations of instruments specially adapted for vehicles, not covered by groups B60K35/00, B60K37/00
    • B60K2370/80Mounting or fastening arrangements; Mounting or fastening processes
    • B60K2370/81Fastening of instruments, e.g. to dashboard
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/08Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to drivers or passengers
    • B60W2040/0872Driver physiology
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/08Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to drivers or passengers
    • B60W2040/0881Seat occupation; Driver or passenger presence
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W2050/0062Adapting control system settings
    • B60W2050/0075Automatic parameter input, automatic initialising or calibrating means
    • B60W2050/0095Automatic control mode change
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2540/00Input parameters relating to the driver
    • B60W2540/22Psychological state; Stress level or workload
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2540/00Input parameters relating to the driver
    • B60W2540/26Incapacity of driver
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2540/00Input parameters relating to the driver
    • B60W2540/28Identity of driver
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2550/00Input parameters relating to exterior conditions
    • B60W2550/20Traffic related input parameters
    • B60W2550/22Traffic rules, e.g. traffic signs
    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048Indexing scheme relating to G06F3/048
    • G06F2203/04808Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously; e.g. using several fingers or a combination of fingers and pen
    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2221/00Indexing scheme relating to security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F2221/07Indexing scheme relating to G06F21/10, protecting distributed programs or content
    • G06F2221/0722Content
    • G06F2221/0724Editing
    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06KRECOGNITION OF DATA; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K9/00Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
    • G06K9/00885Biometric patterns not provided for under G06K9/00006, G06K9/00154, G06K9/00335, G06K9/00362, G06K9/00597; Biometric specific functions not specific to the kind of biometric
    • G06K2009/00939Biometric patterns based on physiological signals, e.g. heartbeat, blood flow
    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06NCOMPUTER SYSTEMS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N5/00Computer systems using knowledge-based models
    • G06N5/04Inference methods or devices
    • G06N5/048Fuzzy inferencing
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W84/00Network topologies
    • H04W84/02Hierarchically pre-organised networks, e.g. paging networks, cellular networks, WLAN [Wireless Local Area Network] or WLL [Wireless Local Loop]
    • H04W84/10Small scale networks; Flat hierarchical networks
    • H04W84/12WLAN [Wireless Local Area Networks]

Abstract

Methods and systems for a complete vehicle ecosystem are provided. Specifically, the disclosed systems, taken alone or together, provide an individual or group of individuals with an intuitive and comfortable vehicular environment. The present disclosure includes a system to generate a vehicle communication system. The vehicle communication system can determine which devices are within the vehicle. From this determination, the vehicle communication system may create a universal bus and hotspot where applications, data, multimedia information, and resources can be shared both with the vehicle and with the other devices in the vehicle.

Description

    CROSS REFERENCE TO RELATED APPLICATIONS
  • The present application claims the benefits of and priority, under 35 U.S.C. §119(e), to U.S. Provisional Application Ser. No. 61/560,509, filed on Nov. 26, 2011, entitled “Complete Vehicle Ecosystem;” Ser. No. 61/637,164, filed on Apr. 23, 2012, entitled “Complete Vehicle Ecosystem;” Ser. No. 61/646,747, filed on May 14, 2012, entitled “Branding of Electrically Propelled Vehicles Via the Generation of Specific Operating Sounds;” Ser. No. 61/653,275, filed on May 30, 2012, entitled “Vehicle Application Store for Console;” Ser. No. 61/653,264, filed on May 30, 2012, entitled “Control of Device Features Based on Vehicle State;” Ser. No. 61/653,563, filed on May 31, 2012, entitled “Complete Vehicle Ecosystem;” Ser. No. 61/663,335, filed on Jun. 22, 2012, entitled “Complete Vehicle Ecosystem;” Ser. No. 61/672,483, filed on Jul. 17, 2012, entitled “Vehicle Climate Control;” and Ser. No. 61/714,016, filed on Oct. 15, 2012, entitled “Vehicle Middleware.” The entire disclosures of the applications listed above are hereby incorporated by reference, in their entirety, for all that they teach and for all purposes.
  • This application is also related to U.S. patent application Ser. No. 13/420,236, filed on Mar. 14, 2012, entitled, “Configurable Vehicle Console;” Ser. No. 13/420,240, filed on Mar. 14, 2012, entitled “Removable, Configurable Vehicle Console;” Ser. No. 13/462,593, filed on May 2, 2012; Ser. No. 13/462,596, filed on May 2, 2012, entitled “Configurable Heads-Up Dash Display;” Ser. No. ______, filed on Nov. 16, 2012, entitled “Implementation of Conquest Functionality in Automotive Consule” (Attorney Docket No. 6583-228); Ser. No. ______, filed on Nov. 16, 2012, entitled “Car Application Store for Console” (Attorney Docket No. 6583-230); Ser. No. ______, filed on Nov. 16, 2012, entitled “Sharing Applications/Media Between Car and Phone (Hydroid)” (Attorney Docket No. 6583-231); Ser. No. ______, filed on Nov. 16, 2012, entitled “In-Cloud Connection for Car Multimedia” (Attorney Docket No. 6583-232); Ser. No. ______, filed on Nov. 16, 2012, entitled “Music Streaming” (Attorney Docket No. 6583-233); Ser. No. ______, filed on Nov. 16, 2012, entitled “Activate/Deactivate Features of Console/Car Based on Location (State Law Compliance)” (Attorney Docket No. 6583-234); Ser. No. ______, filed on Nov. 16, 2012, entitled “Insurance Tracking” (Attorney Docket No. 6583-235); Ser. No. ______, filed on Nov. 16, 2012, entitled “Law Breaking/Behavior Sensor” (Attorney Docket No. 6583-236); Ser. No. ______, filed on Nov. 16, 2012, entitled “Etiquette Suggestion” (Attorney Docket No. 6583-237); Ser. No. ______, filed on Nov. 16, 2012, entitled “Parking Space Finder Based on Parking Meter Data” (Attorney Docket No. 6583-238); Ser. No. ______, filed on Nov. 16, 2012, entitled “Parking Meter Expired Alert” (Attorney Docket No. 6583-239); Ser. No. ______, filed on Nov. 16, 2012, entitled “Object Sensing (Pedestrian Avoidance/Accident Avoidance)” (Attorney Docket No. 6583-240); Ser. No. ______, filed on Nov. 16, 2012, entitled “Proximity Warning Relative to Other Cars” (Attorney Docket No. 6583-241); Ser. No. ______, filed on Nov. 16, 2012, entitled “Street Side Sensors” (Attorney Docket No. 6583-242); Ser. No. ______, filed on Nov. 16, 2012, entitled “Car Location” (Attorney Docket No. 6583-263); Ser. No. ______, filed on Nov. 16, 2012, entitled “Universal Bus in the Car” (Attorney Docket No. 6583-244); Ser. No. ______, filed on Nov. 16, 2012, entitled “Mobile Hot Spot/Router/Application Share site or Network” (Attorney Docket No. 6583-245); Ser. No. ______, filed on Nov. 16, 2012, entitled “Universal Console Chassis for the Car” (Attorney Docket No. 6583-246); Ser. No. ______, filed on Nov. 16, 2012, entitled “Middleware” (Attorney Docket No. 6583-247); Ser. No. ______, filed on Nov. 16, 2012, entitled “Real Time Traffic” (Attorney Docket No. 6583-248); Ser. No. ______, filed on Nov. 16, 2012, entitled “Map Updating” (Attorney Docket No. 6583-249); Ser. No. ______, filed on Nov. 16, 2012, entitled “Indications Call Mechanic” (Attorney Docket No. 6583-250); Ser. No. ______, filed on Nov. 16, 2012, entitled “Felon Identifier” (Attorney Docket No. 6583-251); Ser. No. ______, filed on Nov. 16, 2012, entitled “Behavioral Tracking and Vehicle Applications” (Attorney Docket No. 6583-252); Ser. No. ______, filed on Nov. 16, 2012, entitled “Improvements to Controller Area Network Bus” (Attorney Docket No. 6583-314); Ser. No. ______, filed on Nov. 16, 2012, entitled “Location Information Exchange Between Vehicle and Device” (Attorney Docket No. 6583-315); Ser. No. ______, filed on Nov. 
16, 2012, entitled “In Car Communication Between Devices” (Attorney Docket No. 6583-316); Ser. No. ______, filed on Nov. 16, 2012, entitled “Configurable Hardware Unit for Car Systems” (Attorney Docket No. 6583-317); Ser. No. ______, filed on Nov. 16, 2012, entitled “Feature Recognition for Configuring a Vehicle Console and Associated Devices” (Attorney Docket No. 6583-318); Ser. No. ______, filed on Nov. 16, 2012, entitled “Configurable Vehicle Console” (Attorney Docket No. 6583-412); Ser. No. ______, filed on Nov. 16, 2012, entitled “Configurable Dash Display” (Attorney Docket No. 6583-413); Ser. No. ______, filed on Nov. 16, 2012, entitled “Configurable Heads-Up Dash Display” (Attorney Docket No. 6583-414); and Ser. No. ______, filed on Nov. 16, 2012, entitled “Removable, Configurable Vehicle Console” (Attorney Docket No. 6583-415). The entire disclosures of the applications listed above are hereby incorporated by reference, in their entirety, for all that they teach and for all purposes.
  • BACKGROUND
  • Whether using private, commercial, or public transport, the movement of people and/or cargo has become a major industry. In today's interconnected world daily travel is essential to engaging in commerce. Commuting to and from work can account for a large portion of a traveler's day. As a result, vehicle manufacturers have begun to focus on making this commute, and other journeys, more enjoyable and easier.
  • Currently, vehicle manufacturers attempt to entice travelers to use a specific conveyance based on any number of features. Most of these features focus on vehicle safety or efficiency. From the addition of safety-restraints, air-bags, and warning systems to more efficient engines, motors, and designs, the vehicle industry has worked to appease the supposed needs of the traveler. Recently, however, vehicle manufacturers have shifted their focus to user and passenger comfort as a primary concern. Making an individual more comfortable while traveling instills confidence and pleasure in using a given vehicle and increases an individual's preference for a given manufacturer and/or vehicle type.
  • One way to instill comfort in a vehicle is to create an environment within the vehicle similar to that of an individual's home. Integrating features in a vehicle that are associated with comfort found in an individual's home can ease a traveler's transition from home to vehicle. Several manufacturers have added comfort features in vehicles such as the following: leather seats, adaptive and/or personal climate control systems, music and media players, ergonomic controls, Internet connectivity, etc. However, because these manufacturers have added features to a conveyance, they have built comfort around a vehicle and failed to build a vehicle around comfort. Thus, the vehicle as an ecosystem has not been fully considered.
  • SUMMARY
  • There is a need for a vehicle ecosystem that can integrate both physical and mental comforts while seamlessly operating with current electronic devices to result in a totally intuitive and immersive user experience. These and other needs are addressed by the various aspects, embodiments, and/or configurations of the present disclosure. Also, while the disclosure is presented in terms of exemplary embodiments, it should be appreciated that individual aspects of the disclosure can be separately claimed.
  • The present disclosure can provide a number of advantages depending on the particular aspect, embodiment, and/or configuration. Currently, the vehicle industry is dominated by conveyances offering a separate comfort experience from a home, work, or other aspect of a traveler's life. Unfortunately, current vehicles include a series of separate devices that work together while an individual or individuals are associated with the vehicle. Technology areas and devices such as user interfaces, applications, tracking capabilities, hardware, and/or location-based communications, could be combined together, or used separately, to form a complete vehicle ecosystem. This ecosystem can provide a connected and intuitive user experience for any traveler.
  • A series of devices associated with a vehicle along with other devices can form a complete and familiar user experience. In particular, the devices, applications, interfaces, hardware, and software may combine to form a user-friendly environment while traveling or otherwise moving from one location to another and/or when a vehicle is at rest. Moreover, aspects of the present disclosure may provide communication between the vehicle and a user at any given time. Specifically, communication between a vehicle and another device may also relay information to an individual and/or group of individuals. This communication between a vehicle and at least one other device may include, but is not limited to, communication between a vehicle and: 1) at least one mobile device, 2) at least one other vehicle, 3) another system/group of devices, 4) a non-mobile device, and 5) combinations thereof. These and other advantages will be apparent from the disclosure.
  • Methods and systems for a complete vehicle ecosystem are provided. Specifically, the disclosed systems, taken alone or together, provide an individual or group of individuals with an intuitive and comfortable vehicular environment. The present disclosure includes a system to generate a vehicle communication system. The vehicle communication system can determine which devices are within the vehicle. From this determination, the vehicle communication system may create a universal bus and hotspot where applications, data, multimedia information, and resources can be shared both with the vehicle and with the other devices in the vehicle.
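  • The following is a minimal, illustrative sketch of the device-detection and resource-pooling behavior summarized above; all class, field, and resource names are assumptions made for the example and do not correspond to any API actually defined in this disclosure.

```python
# Hypothetical sketch: devices detected inside the vehicle join a shared pool
# and contribute resources (applications, data, multimedia) to the vehicle.
from dataclasses import dataclass, field


@dataclass
class Device:
    device_id: str
    inside_vehicle: bool                 # e.g., decided by signal strength or geo-fencing
    shared_resources: set = field(default_factory=set)


class VehicleCommunicationSystem:
    """Tracks devices determined to be in the vehicle and pools what they share."""

    def __init__(self):
        self.devices = {}

    def register(self, device: Device) -> bool:
        # Only devices determined to be inside the vehicle join the universal bus.
        if device.inside_vehicle:
            self.devices[device.device_id] = device
            return True
        return False

    def available_resources(self) -> set:
        # Union of everything the joined devices offer to share with the vehicle
        # and with each other.
        pooled = set()
        for device in self.devices.values():
            pooled |= device.shared_resources
        return pooled


if __name__ == "__main__":
    vcs = VehicleCommunicationSystem()
    vcs.register(Device("phone-1", True, {"music-library", "navigation-app"}))
    vcs.register(Device("tablet-7", False, {"video"}))   # outside the vehicle, ignored
    print(vcs.available_resources())
```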
  • The term “ecosystem” or “vehicle ecosystem,” as used herein, refers to a community of person(s) in conjunction with the vehicle or other device components of their environment (for example, climate control, safety features, mobile devices, multimedia sources etc.), interacting as a system.
  • The term “environment” or “vehicle environment,” as used herein, refers to the surroundings or conditions in which a person operates a vehicle.
  • The term “sensor,” as used herein, refers to a converter or instrument that measures a physical quantity or quality and converts the measurement into a signal which can be read, observed, stored, and/or understood by an observer or by another instrument.
  • The term “stimulus,” as used herein, refers to some event or something external that influences an activity.
  • The term “automotive navigation system,” as used herein, refers to a satellite navigation system designed for use in automobiles. It typically uses a GPS navigation device to acquire position data to locate the user on a road in the unit's map database. Using the road database, the unit can give directions to other locations along roads also in its database. Dead reckoning, using distance data from sensors attached to the drivetrain, a gyroscope, and an accelerometer, can be used for greater reliability, as GPS signal loss and/or multipath can occur due to urban canyons or tunnels.
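  • As a purely illustrative aid to the dead-reckoning idea mentioned above, the short sketch below advances a last-known position by a travelled distance along a heading; the coordinate convention and function name are assumptions for the example, not part of the disclosure.

```python
import math

def dead_reckon(x, y, heading_deg, distance):
    """One dead-reckoning step: advance a last-known (x, y) position by a
    travelled distance along the current heading (0 deg = +y, 90 deg = +x).
    Units are arbitrary but must be consistent (e.g., metres)."""
    heading = math.radians(heading_deg)
    return x + distance * math.sin(heading), y + distance * math.cos(heading)

# Example: GPS is lost in a tunnel; drivetrain sensors report 50 m travelled on
# a heading of 90 degrees, so the position estimate moves 50 m along +x.
print(dead_reckon(0.0, 0.0, 90.0, 50.0))
```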
  • The term “bus” and variations thereof, as used herein, refers to a subsystem that transfers information and/or data between various components. A bus generally refers to the collection of communication hardware interfaces, interconnects, bus architecture, and/or protocol defining the communication scheme for a communication system and/or communication network. A bus may also specifically refer to a part of the communication hardware that interfaces the communication hardware with the interconnects that connect to other components of the corresponding communication network. The bus may be for a wired network, such as a physical bus, or a wireless network, such as part of an antenna or hardware that couples the communication hardware with the antenna. A bus architecture supports a defined format in which information and/or data is arranged when sent and received through a communication network. A protocol may define the format and rules of communication of a bus architecture.
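  • The toy example below illustrates the idea of a defined frame format that a bus architecture might impose on data; the field layout is loosely inspired by CAN-style frames but is an assumption made for the sketch and does not reproduce any actual vehicle-bus protocol.

```python
import struct

# Illustrative frame layout: 16-bit message ID, 1-byte payload length, and up
# to 8 payload bytes, packed big-endian.
FRAME = struct.Struct(">HB8s")

def encode(msg_id: int, payload: bytes) -> bytes:
    assert len(payload) <= 8, "payload limited to 8 bytes in this toy format"
    return FRAME.pack(msg_id, len(payload), payload.ljust(8, b"\x00"))

def decode(frame: bytes):
    msg_id, length, padded = FRAME.unpack(frame)
    return msg_id, padded[:length]

frame = encode(0x1A2, b"\x05\xff")
print(decode(frame))   # (418, b'\x05\xff')
```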
  • The terms “communication device,” “smartphone,” and “mobile device,” and variations thereof, as used herein, are used interchangeably and include any type of device capable of communicating with one or more of another device and/or across a communications network, via a communications protocol, and the like. Exemplary communication devices may include but are not limited to smartphones, handheld computers, laptops, netbooks, notebook computers, subnotebooks, tablet computers, scanners, portable gaming devices, phones, pagers, GPS modules, portable music players, and other Internet-enabled and/or network-connected devices.
  • The terms “head unit,” “dash,” “dashboard,” and variations thereof, as used herein, are used interchangeably and include any panel and/or area of a vehicle disposed adjacent to an operator, user, and/or passenger. Typical dashboards may include but are not limited to one or more control panel, instrument housing, head unit, indicator, gauge, meter, light, audio equipment, computer, screen, display, HUD unit, and graphical user interface.
  • The term “electronic address” refers to any contactable address, including a telephone number, instant message handle, e-mail address, Universal Resource Locator (“URL”), Universal Resource Identifier (“URI”), Address of Record (“AOR”), electronic alias in a database, like addresses, and combinations thereof.
  • The term “communication system” or “communication network” and variations thereof, as used herein, refers to a collection of communication components capable of one or more of transmitting, relaying, interconnecting, controlling, or otherwise manipulating information or data from at least one transmitter to at least one receiver. As such, the communication may include a range of systems supporting anything from point-to-point transmission to broadcasting of the information or data. A communication system may refer to the collection of individual communication hardware as well as the interconnects associated with and connecting the individual communication hardware. Communication hardware may refer to dedicated communication hardware or may refer to a processor coupled with a communication means (i.e., an antenna) and running software capable of using the communication means to send a signal within the communication system. Interconnect refers to some type of wired or wireless communication link that connects various components, such as communication hardware, within a communication system. A communication network may refer to a specific setup of a communication system with the collection of individual communication hardware and interconnects having some definable network topography. A communication network may include a wired and/or wireless network having a pre-set or an ad hoc network structure.
  • The phrases “at least one”, “one or more”, and “and/or” are open-ended expressions that are both conjunctive and disjunctive in operation. For example, each of the expressions “at least one of A, B and C”, “at least one of A, B, or C”, “one or more of A, B, and C”, “one or more of A, B, or C” and “A, B, and/or C” means A alone, B alone, C alone, A and B together, A and C together, B and C together, or A, B and C together.
  • The term “a” or “an” entity, as used herein, refers to one or more of that entity. As such, the terms “a” (or “an”), “one or more” and “at least one” can be used interchangeably herein. It is also to be noted that the terms “comprising”, “including”, and “having” can be used interchangeably.
  • The term “automatic” and variations thereof, as used herein, refers to any process or operation done without material human input when the process or operation is performed. However, a process or operation can be automatic, even though performance of the process or operation uses material or immaterial human input, if the input is received before performance of the process or operation. Human input is deemed to be material if such input influences how the process or operation will be performed. Human input that consents to the performance of the process or operation is not deemed to be “material”.
  • The term “computer-readable medium,” as used herein, refers to any tangible storage and/or transmission medium that participates in providing instructions to a processor for execution. Such a medium may take many forms, including but not limited to, non-volatile media, volatile media, and transmission media. Non-volatile media includes, for example, NVRAM, or magnetic or optical disks. Volatile media includes dynamic memory, such as main memory. Common forms of computer-readable media include, for example, a floppy disk, a flexible disk, hard disk, magnetic tape, or any other magnetic medium, magneto-optical medium, a CD-ROM, any other optical medium, punch cards, paper tape, any other physical medium with patterns of holes, a RAM, a PROM, an EPROM, a FLASH-EPROM, a solid state medium like a memory card, any other memory chip or cartridge, a carrier wave as described hereinafter, or any other medium from which a computer can read. A digital file attachment to e-mail or other self-contained information archive or set of archives is considered a distribution medium equivalent to a tangible storage medium. When the computer-readable media is configured as a database, it is to be understood that the database may be any type of database, such as relational, hierarchical, object-oriented, and/or the like. Accordingly, the disclosure is considered to include a tangible storage medium or distribution medium and prior art-recognized equivalents and successor media, in which the software implementations of the present disclosure are stored.
  • The term “desktop,” as used herein, refers to a metaphor used to portray systems. A desktop is generally considered a “surface” that typically includes pictures, called icons, widgets, folders, etc., that can activate and/or show applications, windows, cabinets, files, folders, documents, and other graphical items. The icons are generally selectable to initiate a task through user interface interaction to allow a user to execute applications or conduct other operations.
  • The term “display,” as used herein, refers to a portion of a screen used to display the output of a computer to a user.
  • The term “displayed image,” as used herein, refers to an image produced on the display. A typical displayed image is a window or desktop. The displayed image may occupy all or a portion of the display.
  • The term “display orientation,” as used herein, refers to the way in which a rectangular display is oriented by a user for viewing. The two most common types of display orientation are portrait and landscape. In landscape mode, the display is oriented such that the width of the display is greater than the height of the display (such as a 4:3 ratio, which is 4 units wide and 3 units tall, or a 16:9 ratio, which is 16 units wide and 9 units tall). Stated differently, the longer dimension of the display is oriented substantially horizontal in landscape mode while the shorter dimension of the display is oriented substantially vertical. In the portrait mode, by contrast, the display is oriented such that the width of the display is less than the height of the display. Stated differently, the shorter dimension of the display is oriented substantially horizontal in the portrait mode while the longer dimension of the display is oriented substantially vertical. The multi-screen display can have one composite display that encompasses all the screens. The composite display can have different display characteristics based on the various orientations of the device.
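  • The one-line check below illustrates the landscape/portrait distinction defined above; treating a square display as portrait is an arbitrary tie-break chosen only for the example.

```python
def display_orientation(width: int, height: int) -> str:
    """Classify a display as landscape (wider than tall) or portrait, per the
    definition above; a square display is treated as portrait for this sketch."""
    return "landscape" if width > height else "portrait"

print(display_orientation(16, 9))   # landscape (16:9)
print(display_orientation(3, 4))    # portrait
```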
  • The term “gesture,” as used herein, refers to a user action that expresses an intended idea, action, meaning, result, and/or outcome. The user action can include manipulating a device (e.g., opening or closing a device, changing a device orientation, moving a trackball or wheel, etc.), movement of a body part in relation to the device, movement of an implement or tool in relation to the device, audio inputs, etc. A gesture may be made on a device (such as on the screen) or with the device to interact with the device.
  • The term “module,” as used herein, refers to any known or later developed hardware, software, firmware, artificial intelligence, fuzzy logic, or combination of hardware and software that is capable of performing the functionality associated with that element.
  • The term “gesture recognition” or “gesture capture,” as used herein, refers to sensing or otherwise detecting an instance and/or type of user gesture. The gesture capture can occur in one or more areas of the screen. A gesture region can be on the display, where it may be referred to as a touch sensitive display, or off the display, where it may be referred to as a gesture capture area.
  • A “multi-screen application,” as used herein, refers to an application that is capable of producing one or more windows that may simultaneously occupy multiple screens. A multi-screen application commonly can operate in single-screen mode in which one or more windows of the application are displayed only on one screen or in multi-screen mode in which one or more windows are displayed simultaneously on multiple screens.
  • A “single-screen application,” as used herein, refers to an application that is capable of producing one or more windows that may occupy only a single screen at a time.
  • The term “screen,” “touch screen,” or “touchscreen,” as used herein, refers to a physical structure that enables the user to interact with the computer by touching areas on the screen and provides information to a user through a display. The touch screen may sense user contact in a number of different ways, such as by a change in an electrical parameter (e.g., resistance or capacitance), acoustic wave variations, infrared radiation proximity detection, light variation detection, and the like. In a resistive touch screen, for example, normally separated conductive and resistive metallic layers in the screen pass an electrical current. When a user touches the screen, the two layers make contact in the contacted location, whereby a change in electrical field is noted and the coordinates of the contacted location calculated. In a capacitive touch screen, a capacitive layer stores electrical charge, which is discharged to the user upon contact with the touch screen, causing a decrease in the charge of the capacitive layer. The decrease is measured, and the contacted location coordinates determined. In a surface acoustic wave touch screen, an acoustic wave is transmitted through the screen, and the acoustic wave is disturbed by user contact. A receiving transducer detects the user contact instance and determines the contacted location coordinates.
  • The term “window” refers to a, typically rectangular, displayed image on at least part of a display that contains or provides content different from the rest of the screen. The window may obscure the desktop.
  • The terms “determine”, “calculate” and “compute,” and variations thereof, as used herein, are used interchangeably and include any type of methodology, process, mathematical operation or technique.
  • It shall be understood that the term “means” as used herein shall be given its broadest possible interpretation in accordance with 35 U.S.C., Section 112, Paragraph 6. Accordingly, a claim incorporating the term “means” shall cover all structures, materials, or acts set forth herein, and all of the equivalents thereof. Further, the structures, materials or acts and the equivalents thereof shall include all those described in the summary of the invention, brief description of the drawings, detailed description, abstract, and claims themselves.
  • The term “vehicle” as used herein includes any conveyance, or model of a conveyance, where the conveyance was originally designed for the purpose of moving one or more tangible objects, such as people, animals, cargo, and the like. The term “vehicle” does not require that a conveyance moves or is capable of movement. Typical vehicles may include but are in no way limited to cars, trucks, motorcycles, busses, automobiles, trains, railed conveyances, boats, ships, marine conveyances, submarine conveyances, airplanes, space craft, flying machines, human-powered conveyances, and the like.
  • The preceding is a simplified summary of the disclosure to provide an understanding of some aspects of the disclosure. This summary is neither an extensive nor exhaustive overview of the disclosure and its various aspects, embodiments, and/or configurations. It is intended neither to identify key or critical elements of the disclosure nor to delineate the scope of the disclosure but to present selected concepts of the disclosure in a simplified form as an introduction to the more detailed description presented below. As will be appreciated, other aspects, embodiments, and/or configurations of the disclosure are possible utilizing, alone or in combination, one or more of the features set forth above or described in detail below.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 depicts an embodiment of a vehicle operating environment;
  • FIG. 2 is a block diagram of an embodiment of a vehicle system;
  • FIG. 3 is a block diagram of an embodiment of a vehicle interior environment separated into areas and/or zones;
  • FIG. 4 depicts an embodiment of a sensor configuration for a vehicle;
  • FIG. 5 is a block diagram of an embodiment of a vehicle control system;
  • FIG. 6 is another block diagram of an embodiment of a vehicle control system;
  • FIG. 7A is a graphical representation of an embodiment of a gesture that a user may perform to provide input to a vehicle control system;
  • FIG. 7B is a graphical representation of an embodiment of a gesture that a user may perform to provide input to a vehicle control system;
  • FIG. 7C is a graphical representation of an embodiment of a gesture that a user may perform to provide input to a vehicle control system;
  • FIG. 7D is a graphical representation of an embodiment of a gesture that a user may perform to provide input to a vehicle control system;
  • FIG. 7E is a graphical representation of an embodiment of a gesture that a user may perform to provide input to a vehicle control system;
  • FIG. 7F is a graphical representation of an embodiment of a gesture that a user may perform to provide input to a vehicle control system;
  • FIG. 7G is a graphical representation of an embodiment of a gesture that a user may perform to provide input to a vehicle control system;
  • FIG. 7H is a graphical representation of an embodiment of a gesture that a user may perform to provide input to a vehicle control system;
  • FIG. 7I is a graphical representation of an embodiment of a gesture that a user may perform to provide input to a vehicle control system;
  • FIG. 7J is a graphical representation of an embodiment of a gesture that a user may perform to provide input to a vehicle control system;
  • FIG. 7K is a graphical representation of an embodiment of a gesture that a user may perform to provide input to a vehicle control system;
  • FIG. 8 is a diagram of an embodiment of a data structure for storing information about a user of a vehicle;
  • FIG. 9 is a representation of a vehicle interior that shows an embodiment of an antenna placement configuration;
  • FIG. 10 is a block diagram of an embodiment of a communication system;
  • FIG. 11 is a block diagram of another embodiment of a communication system;
  • FIG. 12A is a flow diagram of a method for creating a universal bus;
  • FIG. 12B is a flow diagram of a method for determining a signal originates inside a vehicle;
  • FIG. 13 is a flow diagram of a method for providing a network hot spot;
  • FIG. 14 is a flow diagram of a method for communicating between devices;
  • FIG. 15 is a flow diagram of a method for sharing an application from a device;
  • FIG. 16 is a flow diagram of a method for managing data stored in the cloud;
  • FIG. 17 is a flow diagram of a method for streaming media in a vehicle.
  • In the appended figures, similar components and/or features may have the same reference label. Further, various components of the same type may be distinguished by following the reference label by a letter that distinguishes among the similar components. If only the first reference label is used in the specification, the description is applicable to any one of the similar components having the same first reference label irrespective of the second reference label.
  • DETAILED DESCRIPTION
  • Presented herein are embodiments of a complete vehicle ecosystem. The ecosystem can comprise single devices or a compilation of devices. This device, or these devices, may be capable of communicating with other devices and/or to an individual or group of individuals. Further, this device, or these devices, can receive user input in unique ways. The overall design and functionality of each device provides for an enhanced user experience making the device more useful and more efficient. As described herein, the device(s) may be electrical, mechanical, electro-mechanical, software-based, and/or combinations thereof.
  • A vehicle environment 100 that may contain a vehicle ecosystem is shown in FIG. 1. The vehicle environment 100 can contain areas associated with a vehicle or conveyance 104. The vehicle 104 is shown as a police car but can be any type of conveyance. The environment 100 can include at least three zones. A first zone 108 may be inside a vehicle 104. The zone 108 includes any interior space, trunk space, engine compartment, or other associated space within or associated with the vehicle 104. The interior environment 108 can be defined by one or more techniques, for example, geo-fencing.
  • A second zone 112 may be delineated by line 120. The zone 112 is created by a range of one or more sensors associated with the vehicle 104. Thus, the area 112 is exemplary of the range of those sensors and what can be detected by those sensors associated with the vehicle 104. The rest of the environment includes all space beyond the range of the sensors and is represented by 116. Thus, the environment 100 may have an area 116 that includes all areas beyond the sensor range 112. The area 116 may include locations of travel that the vehicle 104 may proceed to in the future.
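  • The small sketch below illustrates one way the three zones of FIG. 1 could be assigned to a detected object; the 30-meter sensor range and the function name are assumptions made for the example only.

```python
def classify_zone(distance_m: float, inside_vehicle: bool, sensor_range_m: float = 30.0) -> int:
    """Map a detected object to the zones of FIG. 1: 108 (inside the vehicle),
    112 (within sensor range), or 116 (beyond sensor range). The 30 m default
    range is an arbitrary illustrative value."""
    if inside_vehicle:
        return 108
    return 112 if distance_m <= sensor_range_m else 116

print(classify_zone(0.5, True))     # 108
print(classify_zone(12.0, False))   # 112
print(classify_zone(80.0, False))   # 116
```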
  • An embodiment of a vehicle system 200 is shown in FIG. 2. The vehicle system 200 may consist of hardware and/or software that conduct various operations for or with the vehicle 104. The operations can include, but are not limited to providing information to the user, receiving input from the user, and controlling the functions or operation of the vehicle 104, etc. The vehicle system 200 can include a vehicle control system 204. The vehicle control system 204 can be any type of computing system operable to conduct the operations as described herein.
  • The vehicle control system 204 may interact with a memory or storage system 208 that stores system data. System data 208 may be any type of data needed for the vehicle control system 204 to control effectively the vehicle 104. An example of some of the data that may be stored by the vehicle control system 204 may be as described in conjunction with FIG. 8. The system data 208 can represent any type of database or other storage system. Thus, the system data 208 can be a flat file data system, an object-oriented data system, or some other data system that may interface with the vehicle control system 204.
  • The vehicle control system 204 may communicate with a device or user interface 212. The user interface 212 may be as described in conjunction with FIG. 5. The user interface 212 may be operable to receive user input either through touch input, on one or more user interface buttons, or through a graphical user interface that may include a gesture capture region, as described in conjunction with FIG. 5. Further, the symbol 212 can represent a device that is located or associated with the vehicle 104. The device 212 can be a mobile device, including, but not limited to, a mobile telephone, a mobile computer, or other type of computing system or device that is either permanently located in or temporarily associated with the automobile 104. Thus, the vehicle control system 204 can interface with the device 212 and leverage the device's computing capability to provide one or more of the features or functions as described herein.
  • The device or user interface 212 can receive input or provide information to a user 216. The user 216 may thus interact with the vehicle control system 204 through the interface or device 212. Further, the device 212 may include or have access to device data 220. The device data 220 can be any type of data that is used in conjunction with the device 212, including, but not limited to, multimedia data, preferences data, bioinformatics, data associated with the user 216, or other types of data. The data may be stored in a device data 220 as a storage system similar to that described in conjunction with system data 208.
  • The vehicle control system 204 may also communicate with or through a communication network 224. The communication network 224 can represent any type of wireless or wired communication system that may be included within the vehicle 104 or operable to communicate outside the vehicle 104. Thus, the communication network 224 can include a local area communication capability and a wide area communication capability. For example, the communication network 224 can include a BLUETOOTH™ wireless system, an 802.11G or 802.11N wireless system, a CAN bus, an Ethernet network within the vehicle 104, or other types of communication networks that may function with or be associated with the vehicle 104. Further, the communication network 224 can also include wide area communication capabilities, including one or more of, but not limited to, a cellular communication capability, satellite telephone communication capability, a wireless wide area network communication capability, or other types of communication capabilities that allow for the vehicle control system 204 to communicate outside the vehicle 104.
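  • As an illustration of the split between local-area and wide-area capabilities described above for communication network 224, the sketch below picks a link based on whether traffic stays inside the vehicle; the capability names and the preference order are assumptions made for this example.

```python
from typing import Optional

# Hypothetical capability pools for the local-area and wide-area sides of the
# vehicle's communication network.
LOCAL = {"bluetooth", "802.11n", "can-bus", "ethernet"}
WIDE_AREA = {"cellular", "satellite", "wwan"}

def pick_link(destination_inside_vehicle: bool, available: set) -> Optional[str]:
    """Prefer a local link for in-vehicle traffic and a wide-area link for
    traffic leaving the vehicle; return None if nothing suitable is available."""
    pool = LOCAL if destination_inside_vehicle else WIDE_AREA
    usable = sorted(pool & available)
    return usable[0] if usable else None

print(pick_link(True, {"bluetooth", "cellular"}))    # -> 'bluetooth'
print(pick_link(False, {"bluetooth", "cellular"}))   # -> 'cellular'
```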
  • The vehicle control system 204 may communicate through the communication network 224 to a server 228 that may be located in a facility that is not within physical proximity to the vehicle 104. Thus, the server 228 may represent a cloud computing system or cloud storage that allows the vehicle control system 204 to either gain access to further computing capabilities or to storage in a location outside of the vehicle 104. The server 228 can include a computer processor and memory and be similar to any computing system as understood to one skilled in the art.
  • Further, the server 228 may be associated with stored data 232. The stored data 232 may be stored in any system or by any method, as described in conjunction with system data 208 and/or device data 220. The stored data 232 can include information that may be associated with one or more users 216 or associated with one or more vehicles 104. The stored data 232, being stored in a cloud or in a distant facility, may be exchanged among vehicles 104 or may be used by a user 216 in different locations or with different vehicles 104.
  • The vehicle control system 204 may also communicate with one or more sensors 236/242, which are either associated with the vehicle 104 or communicate with the vehicle 104. Vehicle sensors 242 may include one or more sensors for providing information to the vehicle control system 204 that determine or provide information about the environment 100 in which the vehicle 104 is operating. Embodiments of these sensors may be as described in conjunction with FIG. 4. Non-vehicle sensor 236 can be any type of sensor that isn't currently associated with the vehicle 104. For example, non-vehicle sensor 236 can be sensors in a traffic system operated by a third party that provides data to the vehicle control system 204. Further, the non-vehicle sensor 236 can be other types of sensors which provide information about the distant environment 116 or other information about the vehicle 104 or the environment 100. These non-vehicle sensors 236 may be operated by third parties but provide information to the vehicle control system 204. Examples of information that may be used by the vehicle control system 204 may include weather tracking data, user health tracking data, vehicle maintenance data, or other types of data, which may provide environmental or other data to the vehicle control system 204.
  • An arrangement or configuration for sensors within a vehicle 104 is as shown in FIG. 3. The sensor arrangement 300 can include one or more areas 308 within the vehicle. An area can be a larger part of the environment inside or outside of the vehicle 104. Thus, area one 308A may include the area within the trunk space or engine space of the vehicle 104 and/or the front passenger compartment. Area two 308B may include a portion of the interior space 108 of the vehicle 104. The area N, 308N, may include the trunk space or rear compartment area, when included within the vehicle 104. The interior space 108 may also be divided into areas. Thus, one area may be associated with the front passenger's and driver's seats, a second area may be associated with the middle passengers' seats, and a third area may be associated with a rear passenger's seat. Each area 308 may include one or more sensors that are positioned or operate to provide environmental information about that area 308.
  • Each area 308 may be further separated into one or more zones 312 within the area 308. For example, area 1 308A may be separated into zone A, 312 a, and zone B, 312 b. Each zone 312 may be associated with a particular portion of the interior occupied by a passenger. For example, zone A, 312 a, may be associated with a driver. Zone B, 312 b, may be associated with a front passenger. Each zone 312 may include one or more sensors that are positioned or configured to collect information about the environment or ecosystem associated with that zone or person.
  • A passenger area 308 b may include more than two zones as described in conjunction with area 308 a. For example, area 308 b may include three zones, 312 c, 312 d, and 312 e. These three separate zones 312 c, 312 d, and 312 e may be associated with three passenger seats typically found in the rear passenger area of an automobile 104. An area 308N may include a single zone 312N, as there may be no separate passenger areas but rather a single trunk area within the vehicle 104. The number of zones 312 is unlimited within the areas, as the areas are also unlimited inside the vehicle 104. Further, it should be noted that there may be one or more areas 308 or zones 312 located outside the vehicle 104 that may have a specific set of sensors associated therewith.
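  • The nested mapping below is one possible, purely illustrative way to represent the area/zone layout of FIG. 3 in software; the sensor names are placeholders and are not sensors enumerated by the disclosure.

```python
# Each area 308 holds one or more zones 312, and each zone lists the sensors
# assigned to it. Names and sensor types are placeholders for the sketch.
vehicle_layout = {
    "308A": {"312a": ["seat_weight", "cabin_temperature"],     # driver
             "312b": ["seat_weight"]},                          # front passenger
    "308B": {"312c": ["seat_weight"],
             "312d": ["seat_weight"],
             "312e": ["seat_weight"]},                          # rear passengers
    "308N": {"312N": ["cargo_presence"]},                       # trunk / rear compartment
}

def sensors_for(area: str, zone: str) -> list:
    """Return the sensors registered for a given area/zone, or an empty list."""
    return vehicle_layout.get(area, {}).get(zone, [])

print(sensors_for("308A", "312a"))
```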
  • A set of sensors or vehicle components 400 associated with the vehicle 104 may be as shown in FIG. 4. The vehicle 104 includes, among many other components common to vehicles, wheels 407, a power source 409 (such as an engine, motor, or energy storage system (e.g., battery or capacitive energy storage system)), a manual or automatic transmission 412, a manual or automatic transmission gear controller 416, a power controller 420 (such as a throttle), a vehicle control system 204, the display device 212, a braking system 436, a steering wheel 440, a power source activation/deactivation switch 444 (e.g., an ignition), an occupant seating system 448, a wireless signal receiver 453 to receive wireless signals from signal sources such as roadside beacons and other electronic roadside devices, and a satellite positioning system receiver 457 (e.g., a Global Positioning System (“GPS”) (US), GLONASS (Russia), Galileo positioning system (EU), Compass navigation system (China), and Regional Navigational Satellite System (India) receiver).
  • The vehicle 104 includes a number of sensors in wireless or wired communication with the vehicle control system 204 and/or display device 212 to collect sensed information regarding the vehicle state, configuration, and/or operation. Exemplary sensors include a wheel state sensor 460 to sense one or more of vehicle speed, acceleration, deceleration, wheel rotation, wheel speed (e.g., wheel revolutions-per-minute), wheel slip, and the like, a power source energy output sensor 464 to sense a power output of the power source 409 by measuring one or more of current engine speed (e.g., revolutions-per-minute), energy input and/or output (e.g., voltage, current, fuel consumption, and torque) (e.g., turbine speed sensor, input speed sensor, crankshaft position sensor, manifold absolute pressure sensor, mass flow sensor, and the like), and the like, a switch state sensor 468 to determine a current activation or deactivation state of the power source activation/deactivation switch 444, a transmission setting sensor 470 to determine a current setting of the transmission (e.g., gear selection or setting), a gear controller sensor 472 to determine a current setting of the gear controller 416, a power controller sensor 474 to determine a current setting of the power controller 420, a brake sensor 476 to determine a current state (braking or non-braking) of the braking system 436, a seating system sensor 478 to determine a seat setting and current weight of a seated occupant (if any) in a selected seat of the seating system 448, and exterior and interior sound receivers 490 and 492 (e.g., a microphone or other type of acoustic-to-electric transducer or sensor) to receive and convert sound waves into an equivalent analog or digital signal. Examples of other sensors (not shown) that may be employed include safety system state sensors to determine a current state of a vehicular safety system (e.g., air bag setting (deployed or undeployed) and/or seat belt setting (engaged or not engaged)), light setting sensor (e.g., current headlight, emergency light, brake light, parking light, fog light, interior or passenger compartment light, and/or tail light state (on or off)), brake control (e.g., pedal) setting sensor, accelerator pedal setting or angle sensor, clutch pedal setting sensor, emergency brake pedal setting sensor, door setting (e.g., open, closed, locked or unlocked) sensor, engine temperature sensor, passenger compartment or cabin temperature sensor, window setting (open or closed) sensor, one or more cameras or other imaging sensors (which commonly convert an optical image into an electronic signal but may include other devices for detecting objects, such as an electromagnetic radiation emitter/receiver that emits electromagnetic radiation and receives electromagnetic waves reflected by the object) to sense objects, such as other vehicles and pedestrians, and optionally determine the distance, trajectory, and speed of such objects, in the vicinity or path of the vehicle, odometer reading sensor, trip mileage reading sensor, wind speed sensor, radar transmitter/receiver output, brake wear sensor, steering/torque sensor, oxygen sensor, ambient lighting sensor, vision system sensor, ranging sensor, parking sensor, heating, venting, and air conditioning (HVAC) sensor, water sensor, air-fuel ratio meter, blind spot monitor, hall effect sensor, microphone, radio frequency (RF) sensor, infrared (IR) sensor, vehicle control system sensors, wireless network sensor (e.g., Wi-Fi and/or BLUETOOTH™ sensor), cellular data sensor, and other sensors known to those of skill in the vehicle art.
  • In the depicted vehicle embodiment, the various sensors are in communication with the display device 212 and vehicle control system 204 via signal carrier network 480. As noted, the signal carrier network 480 can be a network of signal conductors, a wireless network (e.g., a radio frequency, microwave, or infrared communication system using a communications protocol, such as Wi-Fi), or a combination thereof.
  • In one implementation, the vehicle control system 204 receives and reads sensor signals, such as wheel and engine speed signals, as a digital input comprising, for example, a pulse width modulated (PWM) signal. The processor 504 can be configured, for example, to read each of the signals into a port configured as a counter or configured to generate an interrupt on receipt of a pulse, such that the processor 504 can determine, for example, the engine speed in revolutions per minute (RPM) and the speed of the vehicle in miles per hour (MPH). One skilled in the art will recognize that the two signals can be received from existing sensors in a vehicle comprising a tachometer and a speedometer, respectively. Alternatively, the current engine speed and vehicle speed can be received in a communication packet as numeric values from a conventional dashboard subsystem comprising a tachometer and a speedometer. The transmission speed sensor signal can be similarly received as a digital input comprising a signal coupled to a counter or interrupt signal of the processor 504, or received as a value in a communication packet on the network or port interface 552 from an existing subsystem of the vehicle. The ignition sensor signal can be configured as a digital input, wherein a HIGH value represents that the ignition is ON and a LOW value represents that the ignition is OFF. Three bits of the port interface 552 can be configured as a digital input to receive the gear shift position signal, representing eight possible gear shift positions. Alternatively, the gear shift position signal can be received in a communication packet as a numeric value on the port interface 552. The throttle position signal can be received as an analog input value, typically in the range of 0-5 volts. Alternatively, the throttle position signal can be received in a communication packet as a numeric value on the port interface 552. The output of other sensors can be processed in a similar fashion.
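  • A minimal sketch of the decoding just described, assuming illustrative pulse counts and scaling constants (none of these values come from the specification), might look like the following.

```python
# Hypothetical sketch of the signal decoding described above; the constants and
# helper names are assumptions, not values from the specification.

PULSES_PER_ENGINE_REV = 2          # assumed pulses per crankshaft revolution
PULSES_PER_WHEEL_REV = 4           # assumed pulses per wheel revolution
WHEEL_CIRCUMFERENCE_FEET = 6.5     # assumed tire circumference
FEET_PER_MILE = 5280.0


def engine_rpm(pulse_count: int, window_seconds: float) -> float:
    """Convert pulses counted over a sampling window into revolutions per minute."""
    revs = pulse_count / PULSES_PER_ENGINE_REV
    return revs * (60.0 / window_seconds)


def vehicle_mph(pulse_count: int, window_seconds: float) -> float:
    """Convert wheel-speed pulses counted over a window into miles per hour."""
    revs = pulse_count / PULSES_PER_WHEEL_REV
    miles = revs * WHEEL_CIRCUMFERENCE_FEET / FEET_PER_MILE
    return miles * (3600.0 / window_seconds)


def ignition_on(digital_input: int) -> bool:
    """HIGH (1) means the ignition is ON; LOW (0) means it is OFF."""
    return digital_input == 1


def gear_position(three_bits: int) -> int:
    """Three digital inputs encode one of eight possible gear shift positions."""
    return three_bits & 0b111


def throttle_percent(analog_volts: float) -> float:
    """Throttle position arrives as a 0-5 V analog value; map it to 0-100 %."""
    return max(0.0, min(analog_volts, 5.0)) / 5.0 * 100.0
```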
  • Other sensors may be included and positioned in the interior space 108 of the vehicle 104. Generally, these interior sensors obtain data about the health of the driver and/or passenger(s), data about the safety of the driver and/or passenger(s), and/or data about the comfort of the driver and/or passenger(s). The health data sensors can include sensors in the steering wheel that can measure various health telemetry for the person (e.g., heart rate, temperature, blood pressure, blood presence, blood composition, etc.). Sensors in the seats may also provide for health telemetry (e.g., presence of liquid, weight, weight shifts, etc.). Infrared sensors could detect a person's temperature; optical sensors can determine a person's position and whether the person has become unconscious. Other health sensors are possible and contemplated as if included herein.
  • Safety sensors can measure whether the person is acting safely. Optical sensors can determine a person's position and focus. If the person stops looking at the road ahead, the optical sensor can detect the lack of focus. Sensors in the seats may detect if a person is leaning forward or may be injured by a seat belt in a collision. Other sensors can detect that the driver has at least one hand on a steering wheel. Other safety sensors are possible and contemplated as if included herein.
  • Comfort sensors can collect information about a person's comfort. Temperature sensors may detect a temperature of the interior cabin. Moisture sensors can determine a relative humidity. Audio sensors can detect loud sounds or other distractions. Audio sensors may also receive input from a person through voice data. Other comfort sensors are possible and contemplated as if included herein.
  • An embodiment of a vehicle control system 204 and its associated components may be as shown in FIG. 5. In general, the device 212 includes a front screen 212 with a touch sensitive display 568. The front screen 212 may be disabled and/or enabled by a suitable command. Moreover, the front screen 212 can be touch sensitive and can include different operative areas. For example, a first operative area, within the touch sensitive screen 212, may comprise a touch sensitive display 568. In general, the touch sensitive display 568 may comprise a full color, touch sensitive display. A second area within each touch sensitive screen 568 may comprise a gesture capture region 572. The gesture capture region 572 may comprise one or more areas or regions that are outside of the touch sensitive display 568 area or screen area 212, and that are capable of receiving input, for example in the form of gestures provided by a user. However, the one or more gesture capture regions 572 do not include pixels that can perform a display function or capability.
  • It is further anticipated that a third region of the touch sensitive screen 568 may comprise one or more configurable areas. The configurable area is capable of receiving input and has display or limited display capabilities. As can be appreciated, the configurable area may occupy any part of the touch sensitive screen 568 not allocated to a gesture capture region 572 or touch sensitive display 568. In embodiments, the configurable area may present different input options to the user. For example, the configurable area may display buttons or other relatable items. Moreover, the identity of displayed buttons, or whether any buttons are displayed at all within the configurable area of the touch sensitive screen 568 may be determined from the context in which the device 212 is used and/or operated. In an exemplary embodiment, the touch sensitive screen 568 comprises liquid crystal display devices extending across at least the region of the touch sensitive screen 568 that is capable of providing visual output to a user, and a resistive and/or capacitive input matrix over the regions of the touch sensitive screen 568 that are capable of receiving input from the user.
  • One or more display controllers 516 may be provided for controlling the operation of the touch sensitive screen 568, including input (touch sensing) and output (display) functions. In the exemplary embodiment illustrated in FIG. 5, a touch screen controller 516 is provided for the touch screen 568. In accordance with some embodiments, the functions of a touch screen controller 516 may be incorporated into other components, such as a processor 504.
  • The processor 504 may comprise a general purpose programmable processor or controller for executing application programming or instructions. In accordance with at least some embodiments, the processor 504 may include multiple processor cores, and/or implement multiple virtual processors. In accordance with still other embodiments, the processor 504 may include multiple physical processors. As a particular example, the processor 504 may comprise a specially configured application specific integrated circuit (ASIC) or other integrated circuit, a digital signal processor, a controller, a hardwired electronic or logic circuit, a programmable logic device or gate array, a special purpose computer, or the like. The processor 504 generally functions to run programming code or instructions implementing various functions of the device 212.
  • A device 212 may also include memory 508 for use in connection with the execution of application programming or instructions by the processor 504, and for the temporary or long term storage of program instructions and/or data. As examples, the memory 508 may comprise RAM, DRAM, SDRAM, or other solid state memory. Alternatively or in addition, data storage 512 may be provided. Like the memory 508, the data storage 512 may comprise a solid state memory device or devices. Alternatively or in addition, the data storage 512 may comprise a hard disk drive or other random access memory.
  • In support of communications functions or capabilities, the device 212 can include a cellular telephony module 528. As examples, the cellular telephony module 528 can comprise a GSM, CDMA, FDMA and/or analog cellular telephony transceiver capable of supporting voice, multimedia and/or data transfers over a cellular network. Alternatively or in addition, the device 212 can include an additional or other wireless communications module 532. As examples, the other wireless communications module 532 can comprise a Wi-Fi, BLUETOOTH™, WiMax, infrared, or other wireless communications link. The cellular telephony module 528 and the other wireless communications module 532 can each be associated with a shared or a dedicated antenna 524.
  • A port interface 552 may be included. The port interface 552 may include proprietary or universal ports to support the interconnection of the device 212 to other devices or components, such as a dock, which may include additional or different capabilities from those integral to the device 212. In addition to supporting an exchange of communication signals between the device 212 and another device or component, the docking port (not shown) and/or port interface 552 can support the supply of power to or from the device 212. The port interface 552 also comprises an intelligent element that comprises a docking module for controlling communications or other interactions between the device 212 and a connected device or component.
  • An input/output module 548 and associated ports may be included to support communications over wired networks or links, for example with other communication devices, server devices, and/or peripheral devices. Examples of an input/output module 548 include an Ethernet port, a Universal Serial Bus (USB) port, an Institute of Electrical and Electronics Engineers (IEEE) 1394 port, or other interface.
  • An audio input/output interface/device(s) 544 can be included to provide analog audio to an interconnected speaker or other device, and to receive analog audio input from a connected microphone or other device. As an example, the audio input/output interface/device(s) 544 may comprise an associated amplifier and analog to digital converter. Alternatively or in addition, the device 212 can include an integrated audio input/output device 556 and/or an audio jack for interconnecting an external speaker or microphone. For example, an integrated speaker and an integrated microphone can be provided, to support near talk or speaker phone operations.
  • Hardware buttons 280 can be included, for example, for use in connection with certain control operations. Examples include a master power switch, volume control, etc., as described in conjunction with FIG. 2. One or more image capture interfaces/devices, such as a camera, can be included for capturing still and/or video images. Alternatively or in addition, an image capture interface/device can include a scanner or code reader. An image capture interface/device can include or be associated with additional elements, such as a flash or other light source.
  • The device 212 can also include a global positioning system (GPS) receiver 536. In accordance with embodiments of the present invention, the GPS receiver 536 may further comprise a GPS module that is capable of providing absolute location information to other components of the device 212. Other sensors 242 may also be included. For example, an accelerometer(s)/gyroscope(s) may also be included. For example, in connection with the display of information to a user and/or other functions, a signal from the accelerometer/gyroscope can be used to determine an orientation and/or format in which to display that information to the user. In some embodiments, the accelerometer/gyroscope may comprise at least one accelerometer and at least one gyroscope.
  • Embodiments of the present invention can also include one or more magnetic sensing features. The magnetic sensing feature can be configured to provide a signal indicating the position of the device relative to a vehicle-mounted position. This information can be provided as an input, for example to a user interface application, to determine an operating mode, characteristics of the touch sensitive display 568, and/or other device 212 operations. As examples, a magnetic sensing feature can comprise one or more of Hall-effect sensors, a multiple position switch, an optical switch, a Wheatstone bridge, a potentiometer, or other arrangement capable of providing a signal indicating which of multiple relative positions the device is in. Alternatively, the magnetic sensing feature may comprise one or more metallic elements used by other sensors associated with the console and/or vehicle to determine whether the device 212 is in a vehicle-mounted position. These metallic elements may include but are not limited to rare-earth magnets, electromagnets, ferrite and/or ferrite alloys, and/or other material capable of being detected by a range of sensors.
  • Communications between various components of the device 212 can be carried by one or more buses 520. In addition, power can be supplied to the components of the device 212 from a power source and/or power control module 560. The power control module 560 can, for example, include a battery, an AC to DC converter, power control logic, and/or ports for interconnecting the device 212 to an external source of power.
  • An embodiment of one or more software modules that may be associated with the vehicle control system 204 may be as shown in FIG. 6. The memory 508 may store and the processor 504 may execute one or more software components. These components can include at least one operating system (OS) 616, an application manager 662, a console desktop 666, and/or one or more applications 664 a and/or 664 b from an application store 660. The OS 616 can include a framework 620, one or more frame buffers 648, one or more drivers 612, and/or a kernel 618. The OS 616 can be any software, consisting of programs and data, which manages computer hardware resources and provides common services for the execution of various applications 664. The OS 616 can be any operating system and, at least in some embodiments, dedicated to mobile devices, including, but not limited to, Linux, ANDROID™, iPhone OS (IOS™), WINDOWS PHONE 7™, etc. The OS 616 is operable to provide functionality to the device 212 by executing one or more operations, as described herein.
  • The applications 664 can be any higher level software that executes particular console functionality for the user. Applications 664 can include programs such as vehicle control applications, email clients, web browsers, texting applications, games, media players, office suites, etc. The applications 664 can be stored in an application store 660, which may represent any memory or data storage, and the management software associated therewith, for storing the applications 664. Once executed, the applications 664 may be run in a different area of memory 508.
  • The framework 620 may be any software or data that allows the multiple tasks running on the device to interact. In embodiments, at least portions of the framework 620 and the discrete components described hereinafter may be considered part of the OS 616 or an application 664. However, these portions will be described as part of the framework 620, although those components are not so limited. The framework 620 can include, but is not limited to, a Surface Cache module 628, a Window Management module 632, an Input Management module 636, an Application Model Manager 642, a Display Controller 644, one or more frame buffers 648, and/or an event buffer 656.
  • The Surface Cache module 628 includes any memory or storage and the software associated therewith to store or cache one or more images of applications, windows, and/or console screens. A series of active and/or non-active windows (or other display objects, such as a desktop display) can be associated with each display. An active window (or other display object) is currently displayed. A non-active window (or other display object) was opened and, at some time, displayed but is now not displayed. To enhance the user experience, before a window transitions from an active state to an inactive state, a “screen shot” of a last generated image of the window (or other display object) can be stored. The Surface Cache module 628 may be operable to store a bitmap of the last active image of a window (or other display object) not currently displayed. Thus, the Surface Cache module 628 stores the images of non-active windows (or other display objects) in a data store.
  • In embodiments, the Window Management module 632 is operable to manage the windows (or other display objects) that are active or not active on each of the displays. The Window Management module 632, based on information from the OS 616, or other components, determines when a window (or other display object) is visible or not active. The Window Management module 632 may then put a non-visible window (or other display object) in a “not active state” and, in conjunction with the Task Management module 640, suspend the application's operation. Further, the Window Management module 632 may assign a display identifier to the window (or other display object) or manage one or more other items of data associated with the window (or other display object). The Window Management module 632 may also provide the stored information to the application 664, or other components interacting with or associated with the window (or other display object). The Window Management module 632 can also associate an input task with a window based on window focus and display coordinates within the motion space.
  • The Input Management module 636 is operable to manage events that occur with the device. An event is any input into the window environment, for example, a user interface interaction with a user. The Input Management module 636 receives the events and logically stores the events in an event buffer 656. Events can include such user interface interactions as a “down event,” which occurs when the screen 212 receives a touch signal from a user, a “move event,” which occurs when the screen 212 determines that a user's finger is moving across a screen(s), an “up event,” which occurs when the device 212 determines that the user has stopped touching the screen 568, etc. These events are received, stored, and forwarded to other modules by the Input Management module 636. The Input Management module 636 may also map screen inputs to a motion space, which is the culmination of all physical and virtual displays available on the device.
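  • As an illustration only, an input manager of the kind described above might buffer “down,” “move,” and “up” events and map them into a shared motion space as sketched below; the class and field names are assumptions, not the module's actual interface.

```python
# Hypothetical sketch of the event handling described above; the event names mirror
# the "down"/"move"/"up" events in the text, but the classes and fields are assumptions.
from collections import deque
from dataclasses import dataclass
import time


@dataclass
class InputEvent:
    kind: str          # "down", "move", or "up"
    x: int             # screen-local coordinate
    y: int
    display_id: int    # which physical or virtual display produced the touch
    timestamp: float


class InputManager:
    def __init__(self, buffer_size: int = 256):
        # The event buffer holds events until other modules consume them.
        self.event_buffer = deque(maxlen=buffer_size)
        # Offsets map each display's local coordinates into one shared motion space.
        self.display_offsets = {0: 0, 1: 1280}

    def receive(self, kind: str, x: int, y: int, display_id: int) -> InputEvent:
        event = InputEvent(kind, x, y, display_id, time.time())
        self.event_buffer.append(event)
        return event

    def to_motion_space(self, event: InputEvent) -> tuple:
        """Translate a display-local touch into the combined motion space."""
        return (event.x + self.display_offsets.get(event.display_id, 0), event.y)
```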
  • The frame buffer 648 is a logical structure(s) used to render the user interface. The frame buffer 648 can be created and destroyed by the OS kernel 618. However, the Display Controller 644 can write the image data, for the visible windows, into the frame buffer 648. A frame buffer 648 can be associated with one screen or multiple screens. The association of a frame buffer 648 with a screen can be controlled dynamically by interaction with the OS kernel 618. A composite display may be created by associating multiple screens with a single frame buffer 648. Graphical data used to render an application's window user interface may then be written to the single frame buffer 648, for the composite display, which is output to the multiple screens 204. The Display Controller 644 can direct an application's user interface to a portion of the frame buffer 648 that is mapped to a particular display 208, thus, displaying the user interface on only one screen 212. The Display Controller 644 can extend the control over user interfaces to multiple applications, controlling the user interfaces for as many displays as are associated with a frame buffer 648 or a portion thereof. This approach compensates for the physical screen 212 and any other console screens that are in use by the software component above the Display Controller 644.
  • The Application Manager 662 is an application that provides a presentation layer for the window environment. Thus, the Application Manager 662 provides the graphical model for rendering. Likewise, the Desktop 666 provides the presentation layer for the Application Store 660. Thus, the desktop provides a graphical model of a surface having selectable application icons for the Applications 664 in the Application Store 660 that can be provided to the Window Management Module 632 for rendering.
  • Further, the framework can include an Application Model Manager (AMM) 642. The Application Manager 662 may interface with the AMM 642. In embodiments, the AMM 642 receives state change information from the device 212 regarding the state of applications (which are running or suspended). The AMM 642 can associate bit map images from the Surface Cache Module 628 to the applications that are alive (running or suspended). Further, the AMM 642 may provide a list of executing applications to the Application Manager 662.
  • One or more gestures used to interface with the vehicle control system 204 may be as described in conjunction with FIGS. 7A through 7K. FIGS. 7A through 7H depict various graphical representations of gesture inputs that may be recognized by the screen(s) 212. The gestures may be performed not only by a user's body part, such as a digit, but also by other devices, such as a stylus, that may be sensed by the contact sensing portion(s) of a screen 212. In general, gestures are interpreted differently, based on where the gestures are performed (either directly on the display 568 or in the gesture capture region 572). For example, gestures in the display 568 may be directed to a desktop or application, and gestures in the gesture capture region 572 may be interpreted as being directed to the system.
  • With reference to FIGS. 7A-7H, a first type of gesture, a touch gesture 720, is substantially stationary on the screen 212 for a selected length of time. A circle 728 represents a touch or other contact type received at a particular location of a contact sensing portion of the screen. The circle 728 may include a border 732, the thickness of which indicates a length of time that the contact is held substantially stationary at the contact location. For instance, a tap 720 (or short press) has a thinner border 732 a than the border 732 b for a long press 724 (or a normal press). The long press 724 may involve a contact that remains substantially stationary on the screen for a longer time period than that of a tap 720. As will be appreciated, differently defined gestures may be registered depending upon the length of time that the touch remains stationary prior to contact cessation or movement on the screen.
  • With reference to FIG. 7C, a drag gesture 700 on the screen 212 is an initial contact (represented by circle 728) with contact movement 736 in a selected direction. The initial contact 728 may remain stationary on the screen 212 for a certain amount of time represented by the border 732. The drag gesture typically requires the user to contact an icon, window, or other displayed image at a first location followed by movement of the contact in a drag direction to a new second location desired for the selected displayed image. The contact movement need not be in a straight line but may have any path of movement so long as the contact is substantially continuous from the first to the second location.
  • With reference to FIG. 7D, a flick gesture 704 on the screen 212 is an initial contact (represented by circle 728) with truncated contact movement 736 (relative to a drag gesture) in a selected direction. In embodiments, a flick has a higher exit velocity for the last movement in the gesture compared to the drag gesture. The flick gesture can, for instance, be a finger snap following initial contact. Compared to a drag gesture, a flick gesture generally does not require continual contact with the screen 212 from the first location of a displayed image to a predetermined second location. The contacted displayed image is moved by the flick gesture in the direction of the flick gesture to the predetermined second location. Although both gestures commonly can move a displayed image from a first location to a second location, the temporal duration and distance of travel of the contact on the screen is generally less for a flick than for a drag gesture.
  • With reference to FIG. 7E, a pinch gesture 708 on the screen 212 is depicted. The pinch gesture 708 may be initiated by a first contact 728 a to the screen 212 by, for example, a first digit and a second contact 728 b to the screen 212 by, for example, a second digit. The first and second contacts 728 a,b may be detected by a common contact sensing portion of a common screen 212, by different contact sensing portions of a common screen 212, or by different contact sensing portions of different screens 212. The first contact 728 a is held for a first amount of time, as represented by the border 732 a, and the second contact 728 b is held for a second amount of time, as represented by the border 732 b. The first and second amounts of time are generally substantially the same, and the first and second contacts 728 a,b generally occur substantially simultaneously. The first and second contacts 728 a,b generally also include corresponding first and second contact movements 736 a,b, respectively. The first and second contact movements 736 a,b are generally in opposing directions. Stated another way, the first contact movement 736 a is towards the second contact 728 b, and the second contact movement 736 b is towards the first contact 728 a. More simply stated, the pinch gesture 708 may be accomplished by a user's digits touching the screen 212 in a pinching motion.
  • With reference to FIG. 7F, a spread gesture 710 on the screen 212 is depicted. The spread gesture 710 may be initiated by a first contact 728 a to the screen 212 by, for example, a first digit and a second contact 728 b to the screen 212 by, for example, a second digit. The first and second contacts 728 a,b may be detected by a common contact sensing portion of a common screen 212, by different contact sensing portions of a common screen 212, or by different contact sensing portions of different screens 212. The first contact 728 a is held for a first amount of time, as represented by the border 732 a, and the second contact 728 b is held for a second amount of time, as represented by the border 732 b. The first and second amounts of time are generally substantially the same, and the first and second contacts 728 a,b generally occur substantially simultaneously. The first and second contacts 728 a,b generally also include corresponding first and second contact movements 736 a,b, respectively. The first and second contact movements 736 a,b are generally in a common direction. Stated another way, the first and second contact movements 736 a,b are away from the first and second contacts 728 a,b. More simply stated, the spread gesture 710 may be accomplished by a user's digits touching the screen 212 in a spreading motion.
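  • The single- and two-contact gestures described in conjunction with FIGS. 7A through 7F could, purely as an illustration, be distinguished by contact duration, travel distance, exit velocity, and the relative motion of two contacts. The thresholds in the following sketch are assumptions, not values from the specification.

```python
# Hypothetical sketch of how the gestures above might be distinguished; the thresholds
# are illustrative assumptions only.
import math

LONG_PRESS_SECONDS = 0.8
DRAG_MIN_PIXELS = 20
FLICK_MIN_EXIT_VELOCITY = 800.0   # pixels per second


def classify_single_touch(duration_s, distance_px, exit_velocity_px_s):
    """Classify one contact as a tap, long press, drag, or flick."""
    if distance_px < DRAG_MIN_PIXELS:
        return "long_press" if duration_s >= LONG_PRESS_SECONDS else "tap"
    return "flick" if exit_velocity_px_s >= FLICK_MIN_EXIT_VELOCITY else "drag"


def classify_two_touch(start_a, end_a, start_b, end_b):
    """Pinch if the contacts move toward each other, spread if they move apart."""
    def dist(p, q):
        return math.hypot(p[0] - q[0], p[1] - q[1])
    return "pinch" if dist(end_a, end_b) < dist(start_a, start_b) else "spread"


if __name__ == "__main__":
    print(classify_single_touch(0.2, 5, 0))                        # tap
    print(classify_single_touch(0.3, 150, 1200))                   # flick
    print(classify_two_touch((0, 0), (40, 0), (100, 0), (60, 0)))  # pinch
```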
  • The above gestures may be combined in any manner, such as those shown by FIGS. 7G and 7H, to produce a determined functional result. For example, in FIG. 7G a tap gesture 720 is combined with a drag or flick gesture 712 in a direction away from the tap gesture 720. In FIG. 7H, a tap gesture 720 is combined with a drag or flick gesture 712 in a direction towards the tap gesture 720.
  • The functional result of receiving a gesture can vary depending on a number of factors, including a state of the vehicle 104, display 568, or screen 212, a context associated with the gesture, or sensed location of the gesture. The state of the vehicle commonly refers to one or more of a configuration of the vehicle 104, a display orientation, and user and other inputs received by the vehicle 104. Context commonly refers to one or more of the particular application(s) selected by the gesture and the portion(s) of the application currently executing, whether the application is a single- or multi-screen application, and whether the application is a multi-screen application displaying one or more windows. Sensed location of the gesture commonly refers to whether the sensed set(s) of gesture location coordinates are on a touch sensitive display 568 or a gesture capture region 572, whether the sensed set(s) of gesture location coordinates are associated with a common or different display or screen 212, and/or what portion of the gesture capture region contains the sensed set(s) of gesture location coordinates.
  • A tap, when received by a touch sensitive display 568, can be used, for instance, to select an icon to initiate or terminate execution of a corresponding application, to maximize or minimize a window, to reorder windows in a stack, and to provide user input such as by keyboard display or other displayed image. A drag, when received by a touch sensitive display 568, can be used, for instance, to relocate an icon or window to a desired location within a display, to reorder a stack on a display, or to span both displays (such that the selected window occupies a portion of each display simultaneously). A flick, when received by a touch sensitive display 568 or a gesture capture region 572, can be used to relocate a window from a first display to a second display or to span both displays (such that the selected window occupies a portion of each display simultaneously). Unlike the drag gesture, however, the flick gesture is generally not used to move the displayed image to a specific user-selected location but to a default location that is not configurable by the user.
  • The pinch gesture, when received by a touch sensitive display 568 or a gesture capture region 572, can be used to minimize or otherwise decrease the displayed area or size of a window (typically when received entirely by a common display), to switch windows displayed at the top of the stack on each display to the top of the stack of the other display (typically when received by different displays or screens), or to display an application manager (a “pop-up window” that displays the windows in the stack). The spread gesture, when received by a touch sensitive display 568 or a gesture capture region 572, can be used to maximize or otherwise increase the displayed area or size of a window, to switch windows displayed at the top of the stack on each display to the top of the stack of the other display (typically when received by different displays or screens), or to display an application manager (typically when received by an off-screen gesture capture region on the same or different screens).
  • The combined gestures of FIG. 7G, when received by a common display capture region in a common display or screen 212, can be used to hold a first window location constant for a display receiving the gesture while reordering a second window location to include a window in the display receiving the gesture. The combined gestures of FIG. 7H, when received by different display capture regions in a common display or screen 212 or in different displays or screens, can be used to hold a first window location for a display receiving the tap part of the gesture while reordering a second window location to include a window in the display receiving the flick or drag gesture. Although specific gestures and gesture capture regions in the preceding examples have been associated with corresponding sets of functional results, it is to be appreciated that these associations can be redefined in any manner to produce differing associations between gestures and/or gesture capture regions and/or functional results.
  • Gestures that may be completed in three-dimensional space and not on a touch sensitive screen 568 or gesture capture region 572 may be as shown in FIGS. 7I through 7K. The gestures may be completed in an area where a sensor 242, such as an optical sensor, infrared sensor, or other type of sensor, may detect the gesture. For example, for the gesture 740 in FIG. 7I, a person may open their hand 764 and move their hand in a back and forth direction 748 as a gesture 740 to complete some function with the vehicle 104. For example, gesture 740 may change the station of the radio in the vehicle 104. The sensors 242 may both determine the configuration of the hand and the vector of the movement. The vector and hand configuration can be interpreted to mean certain things to the vehicle control system 204 and produce different results.
  • In another example of a gesture 752 in FIG. 7J, a user may configure their hand 764 to extend two fingers and move the hand in an up and down operation 756. This gesture 752 may control the volume of the radio or some other function. Again, the sensors 242 may determine how the person has configured their hand gesture, and the vector of the movement. In another example of a gesture 760 shown in FIG. 7K, a user may extend their middle three fingers at an angle of 45° from vertical and circle the hand in a counter-clockwise motion 764. This gesture 760 may cause the automobile to change the heat or do some other function. As can be understood by one skilled in the art, the configurations of the hand and the types of movement are variable. Thus, the user may configure the hand 764 in any way imaginable and may also move that hand 764 in any direction with any vector in three-dimensional space.
  • The gestures 740, 752, 760, as shown in FIGS. 7I through 7K, may occur in a predetermined volume of space within the vehicle 104. For example, a sensor 242 may be configured to identify such gestures 740, 752, 760 between the front passenger's and front driver's seats over a console area within the passenger compartment of the automobile 104. The gestures 740, 752, 760 may be made within area 1 308 a between zones A 312 a and B 312 b. However, there may be other areas 308 where a user may use certain gestures, where sensors 242 may be able to determine a certain function is desired. Gestures that may be similar but used in different areas within the vehicle 104 may cause different functions to be performed. For example, the gesture 740 in FIG. 7I, if used in zone E 312 e, may change the heat provided in zone E 312 e, but may change the station of a radio if used in zone A 312 a. Further, the gestures may be made with other body parts or, for example, different expressions of a person's face may be used to control functions in the vehicle 104. Also, the user may use two hands in some circumstances or do other types of physical movements that can cause different reactions in the vehicle 104.
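  • One hypothetical way to realize the zone-dependent interpretation of three-dimensional gestures described above is a lookup keyed by hand configuration, motion, and zone; the table entries below merely echo the examples given and are not prescribed by the specification.

```python
# Hypothetical sketch: map a sensed hand configuration, motion vector, and zone to a
# vehicle function. The keys and function names are illustrative assumptions.

# (hand configuration, motion, zone) -> function
GESTURE_MAP = {
    ("open_hand", "back_and_forth", "312a"): "change_radio_station",
    ("open_hand", "back_and_forth", "312e"): "change_zone_heat",
    ("two_fingers", "up_down", "312a"): "adjust_volume",
    ("three_fingers_45deg", "counter_clockwise_circle", "312a"): "adjust_climate",
}


def resolve_gesture(hand_config: str, motion: str, zone_id: str):
    """Return the function bound to this gesture in this zone, if any."""
    return GESTURE_MAP.get((hand_config, motion, zone_id))


if __name__ == "__main__":
    print(resolve_gesture("open_hand", "back_and_forth", "312a"))  # change_radio_station
    print(resolve_gesture("open_hand", "back_and_forth", "312e"))  # change_zone_heat
```

Keying on the zone is what lets the same physical motion change the radio for the driver but adjust the heat for a rear passenger.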
  • An embodiment of a data structure 800 to store different settings is shown in FIG. 8. The data structure 800 may include one or more data files or data objects 804. Thus, the data structure 800 may represent different types of databases or data storage, for example, object-oriented databases, flat file data structures, relational databases, or other types of data storage arrangements. The data file 804 may include several portions 808-836 representing different types of data. Each of these types of data may be associated with a user, as shown in portion 808.
  • There may be one or more user records 840 and associated data stored within the data file 804. The user can be any person that uses or rides within the vehicle or conveyance 104. The user may be identified in portion 812. For the vehicle 104, the user may include a set of one or more features that may identify the user. These features may be the physical characteristics of the person that may be identified by facial recognition or some other type of system. In other embodiments, the user may provide a unique code to the vehicle control system 204 or provide some other type of data that allows the vehicle control system 204 to identify the user. The features or characteristics of the user are then stored in portion 812.
  • Each user identified in portion 808 may have a different set of settings for each area 308 and/or each zone 312 within the vehicle 104. Thus, each set of settings may also be associated with a predetermined zone 312 or area 308. The zone 312 is stored in portion 820 and the area 308 is stored in portion 816.
  • One or more settings may be stored in portion 824. These settings 824 may be the configurations of different functions within the vehicle 104 that are specified by or for that user. For example, the settings 824 may be the position of a seat, the position of a steering wheel, a heating/cooling setting, a radio setting, a cruise control setting, or some other type of setting associated with the vehicle 104. Further, in vehicles adapted to have a configurable console or a configurable dash or heads-up display, the settings 824 may also provide for how that heads-up display, dash, or console are configured for this particular user. Each setting 824 may be associated with a different area 308 or zone 312. Thus, there may be more settings 824 for when the user is the driver and in zone A, 312A, of area 1, 308A. However, there may be similar settings 824 among the different zones 312 or areas 308 as shown in portion 824. For example, the heating or radio settings for the user may be similar in every zone 312.
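  • As an illustrative sketch only, the per-user, per-area, per-zone settings of portion 824 might be stored and retrieved as follows; the field names and sample values are assumptions rather than contents of the data structure 800.

```python
# Hypothetical sketch of a settings store keyed by user, area, and zone.
# Field names and sample values are illustrative assumptions.
settings_store = {
    # user (portion 812) -> (area 816, zone 820) -> settings 824
    "user_1": {
        ("308A", "312a"): {"seat_position": 7, "steering_wheel_tilt": 3,
                           "cabin_temp_c": 21.5, "radio_preset": "98.5 FM"},
        ("308A", "312b"): {"seat_position": 4, "cabin_temp_c": 21.5,
                           "radio_preset": "98.5 FM"},
    },
}


def settings_for(user_id: str, area_id: str, zone_id: str) -> dict:
    """Look up the stored settings for a user seated in a given area and zone."""
    return settings_store.get(user_id, {}).get((area_id, zone_id), {})


if __name__ == "__main__":
    print(settings_for("user_1", "308A", "312a"))
```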
  • The sensors 242 within the vehicle 104 may be able to either obtain or track health data in portion 828. Health data 828 may include any type of physical characteristic associated with the user. For example, a heart rate, a blood pressure, a temperature, or other types of health data may be obtained and stored in portion 828. The user may have this health data tracked over a period of time to allow for statistical analysis of the user's health while operating the vehicle 104. In this way, if some aspect of the user's health deviates from a norm, the vehicle 104 may be able to determine there is a problem with the person and react to that data.
  • One or more gestures may be stored in portion 832. Thus, the gestures used and described in conjunction with FIGS. 7A through 7K may be configurable. These gestures may be determined or created by the user and stored in portion 832. A user may have different gestures for each zone 312 or area 308 within the vehicle. The gestures that do certain things while driving may do other things while in a different area 308 of the vehicle 104. Thus, the user may use a first set of gestures while driving and a second set while a passenger. Further, one or more users may share gestures as shown in portion 832. Each driver may have a common set of gestures that they use in zone A, 312 a. Each of these gestures may be determined or captured and then stored with their average characteristics (e.g., vector, position of gesture, etc.) in portion 832.
  • One or more sets of safety parameters may be stored in portion 836. Safety parameters 836 may be common operating characteristics for this driver/passenger or for all drivers/passengers that, if deviated from, may indicate there is a problem with the driver/passenger or the vehicle 104. For example, a certain route may be taken repeatedly and an average speed or mean speed may be determined. If the mean speed deviates by some number of standard deviations, a problem with the vehicle 104 or the user may be determined. In another example, the health characteristics or driving experience of the user may be determined. If the user drives in a certain position where their head occupies a certain portion of three-dimensional space within the vehicle 104, the vehicle control system 204 may determine that the safety parameter includes the user's face or head being within this certain portion of the vehicle interior space. If the user's head deviates from that interior space for some amount of time, the vehicle control system 204 can determine that something is wrong with the driver and change the function or operation of the vehicle 104 to assist the driver. This may happen, for example, when a user falls asleep at the wheel. If the user's head droops and no longer occupies a certain three-dimensional space, the vehicle control system 204 can determine that the driver has fallen asleep and may take control of the operation of the vehicle 104 and steer the vehicle 104 to the side of the road. In other examples, if the user's reaction time is too slow or some other safety parameter is not nominal, the vehicle control system 204 may determine that the user is inebriated or having some other medical problem. The vehicle control system 204 may then assume control of the vehicle to ensure that the driver is safe.
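  • The mean-speed example above amounts to a simple statistical outlier test. A minimal sketch, assuming a deviation threshold measured in standard deviations, is shown below; the threshold and speed values are illustrative only and do not come from the specification.

```python
# Hypothetical sketch of the deviation check described above: compare the current
# speed on a familiar route against its historical mean and flag a problem when it
# falls outside an assumed number of standard deviations.
import statistics

DEVIATION_LIMIT = 3.0   # assumed threshold in standard deviations


def speed_is_anomalous(historical_speeds, current_speed, limit=DEVIATION_LIMIT):
    """Return True when the current speed deviates too far from the route's history."""
    if len(historical_speeds) < 2:
        return False                        # not enough history to judge
    mean = statistics.mean(historical_speeds)
    stdev = statistics.stdev(historical_speeds)
    if stdev == 0:
        return current_speed != mean
    return abs(current_speed - mean) > limit * stdev


if __name__ == "__main__":
    history = [63, 65, 64, 66, 62, 65, 64]
    print(speed_is_anomalous(history, 64))   # False: within the normal band
    print(speed_is_anomalous(history, 20))   # True: likely a problem on this route
```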
  • An embodiment of a communication system configuration 900 is shown in FIG. 9. Here the vehicle 104 is shown with an interior cabin. The interior cabin may include two or more different communication transceivers 904. The communication transceivers 904 may be positioned within the vehicle cabin so as to provide for signal identification and location. For example, the communication transceivers 904 a, 904 b, 904 c, and 904 d are currently located at the corners or extents of the interior cabin of the vehicle 104. In this way, signals received by the communication transceivers 904 may be studied or analyzed to identify the origin location of the signal. As such, the communication transceivers 904 can create a geo-fence (i.e., a virtual perimeter for a real-world geographic area) around the vehicle 104 that allows the communication system to determine whether received signals are currently originating inside the vehicle 104.
  • An embodiment of a communication system or network 1000 is shown in FIG. 10. The communication system 1000 can include two or more BLUETOOTH™ transceivers 1004. Each BLUETOOTH™ transceiver 1004 may be paired with a single user device 1008. Thus, BLUETOOTH™ transceiver 1 1004A may communicate with user 1 device 1008A. There may be any number of different BLUETOOTH™ transceivers as represented by ellipses 1032. The BLUETOOTH™ transceivers 1004 may conduct communications using the BLUETOOTH™ protocol with the user device 1008. Information received from and/or sent to the user device 1008 may originate from the communication system 1028. An embodiment of a communication system is as provided in FIG. 11.
  • The communication system 1000 may also include other communication components that can communicate with different protocols. For example, a communication module 1012 can communicate using an 802.11, 802.11g, or other wireless LAN protocol. The wireless LAN router/antenna 1012 may communicate with another user device 1008 c or other components 1024. Thus, those users or components not able to communicate through the array of BLUETOOTH™ transceivers 1004 may still communicate with the vehicle communication system 1028. Other types of communication devices or components may include an Ethernet LAN 1016. The Ethernet LAN 1016 may include one or more hard-wired ports that may be connected within the vehicle 104. There may be other types of protocols or systems used to communicate with the communication system 1028, as represented by ellipses 1020. The components within the communication system 1000 may be hardware and/or software and may operate as understood in the art as associated with these communication protocols.
  • An embodiment of a communication system 1028 is shown in FIG. 11. The communication system 1028 may include two or more communication modules 1104. Each communication module may communicate with a particular type of communication component, for example, the BLUETOOTH™ transceivers 1004, the 802.11 router 1012, or other types of communication systems. The system communication module 1104 may be operable to interface with a single type of communication component, but provide those signals to a common signal processor 1108. A translation module 1120 may be operable to translate the received or sent signals into a common format for the signal processor 1108. The translation module 1120 thus may make the signals system or protocol agnostic for the signal processor 1108, but also allow the use of different and varying communication modules 1104.
  • The signal processor 1108 may be operable to analyze signal characteristics, relay messages, or do other types of processing for the communication system 1028. A signal processor 1108 can receive signal data from the communication modules 1104. This data may include time stamps, signal attenuation characteristics, Doppler shift characteristics, and other types of characteristics about the signal. The signal data may then be analyzed with the signal processor 1108 to determine the location of the source of the signal. This location determination may then be used to determine whether a user is provided access to the communication system 1028.
  • If access is granted, the address module 1112 may provide an address to the device to provide for inter-device communication or communication from the vehicle 104 to the device. The address module 1112 may be a domain name server (DNS) or other type of addressing system. The signal processor 1108 may also store data about the signal, the device associated with the signal, the user associated with the signal, or other data in a signal data database 1116. The database 1116 may be any type of data structure or data-storage system, for example, an object-oriented database, flat file database, or other types of databases. The data may include any data received and/or processed by the signal processor 1108 and used to identify the source location of the signals. The information in the database 1116 may be accessed, stored, or managed by the signal processor 1108. The signals received by the signal processor 1108 may be sent from or sent to the processor 504 in the vehicle control system 204.
  • An embodiment of a method 1200/1236 for creating a universal bus for the vehicle system is shown in FIGS. 12A and 12B. A general order for the steps of the method 1200/1236 is shown in FIGS. 12A and 12B. Generally, the method 1200/1236 starts with a start operation 1204/1238 and ends with an end operation 1232/1264. The method 1200/1236 can include more or fewer steps or can arrange the order of the steps differently than those shown in FIGS. 12A and 12B. The method 1200/1236 can be executed as a set of computer-executable instructions executed by a computer system and encoded or stored on a computer readable medium. Hereinafter, the method 1200/1236 shall be explained with reference to the systems, components, modules, software, data structures, user interfaces, etc. described in conjunction with FIGS. 1-11.
  • The communication system 1028 may receive a signal from a device, in step 1208. The signal can be received by one of the receivers 904, which may include a BLUETOOTH™ transceiver 1004, an 802.11 transceiver 1012, or some other receiver. The signal may then be transferred to the communications system 1028.
  • In step 1212, the communication system 1028 may determine if the signal originated from inside the vehicle. Various analyses may be performed on the signal or on signal information contained in the signal. Some of this analysis may be as described in conjunction with FIG. 12B. If the signal is determined to originate outside the vehicle, the method 1200 may proceed NO to step 1216 where the communications system 1028 can reject the receipt of the signal. If the signal is determined to have originated from inside the vehicle, the method 1200 can proceed YES to step 1220, where the communication system 1028 may make a connection to the device 1008.
  • The communication system 1028, in step 1220, can provide an Internet Protocol (IP) address or other type of access such that signals coming from the device 1008 thereafter are not rejected. Other types of wireless or wired connections may also be made. If the connection is with a BLUETOOTH™ capable device 1008, the communication system 1028 can pair the device 1008 with a BLUETOOTH™ transceiver 1004, in step 1224. The communication system 1028 may make several pairings as there may be two or more BLUETOOTH™ transceivers 1004 available. Upon making the connection or pairing, the communication system 1028 can provide access to the communication bus, in step 1228, such that signals to and from devices 1008 are relayed to the processor 504 of the vehicle control system 204, which may be accessed by the device 1008 sending the signals. In this way, a communication bus is established through wireless or wired connections.
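  • Purely as an illustration of the flow in FIG. 12A, the receive/verify/connect sequence might be sketched as follows; the address pool, field names, and the placeholder inside/outside test are assumptions, with the actual determination described in conjunction with FIG. 12B.

```python
# Hypothetical sketch of the connection flow in FIG. 12A; names are assumptions, and
# originated_inside() stands in for the analysis of FIG. 12B.
import itertools

_ip_pool = (f"10.0.0.{n}" for n in itertools.count(2))
connected_devices = {}


def originated_inside(signal) -> bool:
    """Placeholder for the inside/outside determination described in FIG. 12B."""
    return signal.get("inside", False)


def handle_signal(signal):
    """Reject outside signals; otherwise connect, optionally pair, and grant bus access."""
    if not originated_inside(signal):
        return None                                  # step 1216: reject the signal
    address = next(_ip_pool)                         # step 1220: assign an address
    device = {"device_id": signal["device_id"], "address": address,
              "paired": signal.get("protocol") == "bluetooth"}   # step 1224: pair if possible
    connected_devices[signal["device_id"]] = device  # step 1228: bus access granted
    return device


if __name__ == "__main__":
    print(handle_signal({"device_id": "phone_a", "protocol": "bluetooth", "inside": True}))
    print(handle_signal({"device_id": "laptop_b", "protocol": "wifi", "inside": False}))
```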
  • An embodiment of the analysis used to determine whether a signal originates inside a vehicle is shown in FIG. 12B. The signal processor 1108 of the communication system 1028 may analyze signal characteristics, in step 1240. Analyzing signal characteristics can include one or more of, but is not limited to: analyzing signal attenuation, where a signal with shrinking or increasing strength may be determined to be moving in relative proximity or position to the vehicle 104; analyzing any Doppler shift in the frequency, which may indicate movement relative to the vehicle 104; and analyzing any delay between receiving the same signal at the various transceivers 904A through 904D. A difference in the time of receipt can be used to triangulate the location from which the signal originated and to determine whether that location is outside or inside the vehicle.
  • The signal processor 1108 can also analyze location information, in step 1244. Beyond the signal characteristics, the signal processor 1108 may receive information from sensors 242 to determine a location of the vehicle 104. If the location of the vehicle 104 is in an area where there is not a likelihood of signal congestion, for example, in the driveway of someone's home, then all received signals may be determined to have originated from inside the vehicle. Thus, as signals are received and if the location has changed, the signal processor 1108 may determine whether the current location is an area where more signals may be received from outside the vehicle or whether the signals received have changed.
  • Analysis of the person sending the signal may also be used, in step 1248. Thus, the signal processor 1108 may access signal data 1116 to determine if the signals have been received from this device or from this person before. Thus, the signal may identify a person documented in the signal data 1116, and the signal processor 1108 may determine if that person has used or connected with the signal processor 1108 previously. Further, the signal processor 1108 can determine if there is movement of the vehicle, in step 1252. If a signal remains within the car after the vehicle 104 moves, then that signal can be determined to be inside the vehicle 104. For example, if the signal is received at the beginning of a route and then at some time thereafter the signal continues to be received, then it may be determined that the signal originates inside the vehicle 104.
  • Further, sensor data may be analyzed, in step 1256. Sensor data may include such things as whether there are people within the vehicle and, if so, how many. Thus, if there are three people in the vehicle 104 and three signals are received, all three signals may be determined to be inside the vehicle. Further, it may be possible for the sensors 242 to determine if a device is currently being used inside the vehicle. For example, if an optical sensor can view a device 1008 within its field of vision and/or if an electromagnetic field sensor determines that there is EMF radiation emanating from a location in the vehicle 104, then the signal processor 1108 can determine that the signal is originating inside the vehicle 104.
  • The signal processor 1108 may receive one or more of these analyses and resolve the information, in step 1260. Thus, the signal processor 1108 can cross-correlate information from different analyses to determine if the signal is within the vehicle 104. Different weight may be given to different analyses to make a determination about where the signal originates. In this way, a more robust decision is made as to whether or not the signal originates in the vehicle 104 and should be allowed to connect to the universal bus or the routing system of the vehicle 104.
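  • The resolution step described above can be pictured as a weighted combination of the individual analyses. The weights, scores, and threshold in the following sketch are assumptions chosen only to illustrate the idea; the specification does not fix any particular weighting.

```python
# Hypothetical sketch of the weighted resolution step: each analysis returns a score
# between 0 and 1 indicating how strongly it suggests the signal is inside the
# vehicle, and the scores are combined with assumed weights.
ANALYSIS_WEIGHTS = {
    "signal_characteristics": 0.35,   # attenuation, Doppler, arrival-time differences
    "vehicle_location": 0.15,         # low-congestion locations favor "inside"
    "known_person": 0.15,             # device/person seen in the signal data before
    "persists_while_moving": 0.20,    # still received after the vehicle moves
    "interior_sensors": 0.15,         # occupant count, camera or EMF detection
}
INSIDE_THRESHOLD = 0.6                # assumed decision threshold


def signal_is_inside(scores: dict) -> bool:
    """Combine the per-analysis scores into a single inside/outside decision."""
    total = sum(ANALYSIS_WEIGHTS[name] * scores.get(name, 0.0)
                for name in ANALYSIS_WEIGHTS)
    return total >= INSIDE_THRESHOLD


if __name__ == "__main__":
    print(signal_is_inside({"signal_characteristics": 0.9,
                            "persists_while_moving": 1.0,
                            "interior_sensors": 0.8}))     # True
    print(signal_is_inside({"signal_characteristics": 0.3}))  # False
```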
  • An embodiment of a method 1300 for providing a hot spot in the vehicle 104 may be as provided in FIG. 13. A general order for the steps of the method 1300 is shown in FIG. 13. Generally, the method 1300 starts with a start operation 1304 and ends with an end operation 1332. The method 1300 can include more or fewer steps or can arrange the order of the steps differently than those shown in FIG. 13. The method 1300 can be executed as a set of computer-executable instructions executed by a computer system and encoded or stored on a computer readable medium. Hereinafter, the method 1300 shall be explained with reference to the systems, components, modules, software, data structures, user interfaces, etc. described in conjunction with FIGS. 1-12.
  • A user using a device 1008 may request a function, in step 1308. The function can be any type of feature or application offered by the vehicle 104, for example, the playing of multimedia data, provision of access to the internet, e-mail, or other types of functions. The request may be received by the communication system 1028 and processed by the signal processor 1108.
  • The signal processor 1108 may then determine the available devices or sources for the function, in step 1312. The signal processor 1108 can determine if one of the devices 1008 already connected within the communication system 1028 can provide the function. For example, if the function is the provision of a multimedia stream, one of the devices 1008 within the vehicle 104 may be able to connect to a source to provide the multimedia stream. The device 1008 can be a computing system with a processor and memory. As such, the signal processor 1108 may determine which available resources to use to provide the function to the requestor.
  • The signal processor 1108 may also determine the load on each of the devices, in step 1316. Thus, not only will the signal processor 1108 look for an available source, but it can also look for the source that has the least load. The load may be determined by the number of input/output messages sent between a source or device 1008 and the signal processor 1108, by information sent to the signal processor 1108, or by some other means.
  • Based on the available sources and the load, the signal processor 1108 can determine a device 1008 or source to select to provide the function, in step 1320. Upon selecting the device 1008 or source, the signal processor 1108 can send a command or directive to the device 1008 or source, in step 1324, to provide the function. The device 1008 or source may then provide the function, and the requesting device 1008 may receive that function, in step 1328.
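  • The following sketch is offered for illustration only and is not drawn from the specification; it shows one plausible implementation of steps 1312-1324, in which capable sources are filtered and the least-loaded one is selected, with load approximated by counted input/output messages. The class and function names are hypothetical.

```python
# Hypothetical load-balanced source selection for a requested function.
from dataclasses import dataclass

@dataclass
class Source:
    name: str
    functions: set
    io_messages: int = 0          # proxy for current load, as in step 1316

    def provide(self, function: str) -> str:
        self.io_messages += 1
        return f"{self.name} providing {function}"

def select_source(sources, function: str):
    """Filter to sources able to provide the function, then pick the least-loaded one."""
    capable = [s for s in sources if function in s.functions]
    return min(capable, key=lambda s: s.io_messages) if capable else None

sources = [
    Source("phone-A", {"multimedia", "internet"}, io_messages=12),
    Source("tablet-B", {"multimedia", "email"}, io_messages=3),
]
chosen = select_source(sources, "multimedia")
if chosen is not None:
    print(chosen.provide("multimedia"))   # tablet-B, the least-loaded capable source
```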
  • An embodiment of a method 1400 for communication between devices inside a vehicle 104 is shown in FIG. 14. A general order for the steps of the method 1400 is shown in FIG. 14. Generally, the method 1400 starts with a start operation 1404 and ends with an end operation 1420. The method 1400 can include more or fewer steps or can arrange the order of the steps differently than those shown in FIG. 14. The method 1400 can be executed as a set of computer-executable instructions executed by a computer system and encoded or stored on a computer readable medium. Hereinafter, the method 1400 shall be explained with reference to the systems, components, modules, software, data structures, user interfaces, etc. described in conjunction with FIGS. 1-13.
  • A communication system 1028 may receive a message from a device 1008, in step 1408. This message may be directed to another device 1008 within the vehicle 104. Thus, the communication system 1028 may access signal data 1116 or address information from an address module 1112 to determine an address or location for the receiving device 1008. This information may be used to verify that the message is from a device inside the vehicle 104, in step 1412. Thus, the address or other information from the sender may be compared to the signal data 1116 and used to verify that the message is authentic. If the message is verified, the communication system 1028 may provide the message to the recipient address, in step 1416. Using the communication system 1028, devices 1008 can communicate without communicating directly with each other or through some other communication network. The devices 1008 within the vehicle 104 may use the communication system 1028 to relay messages between each other. In this way, the universal bus for the devices 1008 provides a more efficient communication system 1100 that is more secure and contained within the vehicle 104.
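  • For illustration only, a minimal sketch of the relay and verification described in method 1400 follows; the signal-data table, addresses, and function names are hypothetical and not taken from the specification.

```python
# Hypothetical in-vehicle relay: verify the sender against recorded signal
# data, then forward the message to the recipient device.
signal_data = {             # device id -> address recorded when it connected
    "device-1": "aa:bb:cc:dd:ee:01",
    "device-2": "aa:bb:cc:dd:ee:02",
}

def deliver(recipient: str, body: str) -> None:
    print(f"delivering to {recipient}: {body}")

def relay(message: dict) -> bool:
    sender, recipient = message["from"], message["to"]
    known_addr = signal_data.get(sender)
    if known_addr is None or known_addr != message["addr"]:
        return False                      # not verified as originating inside the vehicle
    if recipient not in signal_data:
        return False                      # recipient unknown to the communication system
    deliver(recipient, message["body"])
    return True

ok = relay({"from": "device-1", "addr": "aa:bb:cc:dd:ee:01",
            "to": "device-2", "body": "hello from the back seat"})
print("verified and relayed" if ok else "rejected")
```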
  • An embodiment of a method 1500 for sharing an application between a device 1008 and the vehicle system 200 may be as shown in FIG. 15. A general order for the steps of the method 1500 is shown in FIG. 15. Generally, the method 1500 starts with a start operation 1504 and ends with an end operation 1532. The method 1500 can include more or fewer steps or can arrange the order of the steps differently than those shown in FIG. 15. The method 1500 can be executed as a set of computer-executable instructions executed by a computer system and encoded or stored on a computer readable medium. Hereinafter, the method 1500 shall be explained with reference to the systems, components, modules, software, data structures, user interfaces, etc. described in conjunction with FIGS. 1-14.
  • The processor 504 may identify available application sources or devices 1008, in step 1508. The available application sources or devices 1008 may be determined by sending requests through the communication system 1028 to the devices 1008/sources. The responses may be provided back to the processor 504, which may then identify those applications, sources, and/or devices 1008 on a user's touch sensitive display 568. An interaction with the touch sensitive display 568 may then activate a function in an application, in step 1512. Thus, the processor 504 can receive the selection of a function or application through the touch sensitive display 568. Based on this selection, the processor 504 may determine which function was activated based on what was displayed on the touch sensitive display 568.
  • The processor 504 may then send that selection to a device 1008. Upon receiving the selection, the device 1008 may then execute the application or function and command the vehicle system 200 to display information based on the function, in step 1516. Thus, the processing of the function or application occurs in the device 1008 but is displayed on the touch sensitive display 568. The device 1008 sends application user interface information or data to the vehicle system 200, in step 1520. This information may be displayed on the touch sensitive display 568.
  • The processor 504 may thereafter receive an input from the device 1008, in step 1524. Any command or information sent to the processor 504 may then be executed as an application function on the device 1008 but displayed on the touch sensitive display 568 or other display within the vehicle 104, in step 1528. In this way, the device 1008 actually executes the application while the vehicle 104 displays the user interface information. The execution of the application appears to occur in the vehicle system 200, but the processing does not actually occur in the vehicle processor 504.
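  • The split of work described in method 1500 may be illustrated with the following hypothetical sketch, in which the device executes a trivial application and the vehicle only renders the user interface data and forwards input; the classes shown are not part of the specification.

```python
# Hypothetical application-sharing split: the device holds all application
# state and logic; the vehicle display only renders UI data and relays input.

class Device:
    """Executes the application; holds all application state."""
    def __init__(self):
        self.counter = 0

    def render_ui(self) -> str:
        return f"[Counter app] value = {self.counter}  (tap to increment)"

    def handle_input(self, event: str) -> str:
        if event == "tap":
            self.counter += 1
        return self.render_ui()

class VehicleDisplay:
    """Displays UI data it receives; performs no application processing."""
    def show(self, ui_data: str) -> None:
        print("touch display:", ui_data)

device, display = Device(), VehicleDisplay()
display.show(device.render_ui())          # step 1520: UI data sent to the vehicle
display.show(device.handle_input("tap"))  # steps 1524-1528: input executed on the device
```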
  • An embodiment of a method 1600 for providing data or functions through cloud-based storage or applications is shown in FIG. 16. A general order for the steps of the method 1600 is shown in FIG. 16. Generally, the method 1600 starts with a start operation 1604 and ends with an end operation 1632. The method 1600 can include more or fewer steps or can arrange the order of the steps differently than those shown in FIG. 16. The method 1600 can be executed as a set of computer-executable instructions executed by a computer system and encoded or stored on a computer readable medium. Hereinafter, the method 1600 shall be explained with reference to the systems, components, modules, software, data structures, user interfaces, etc. described in conjunction with FIGS. 1-15.
  • A processor 504 can determine the devices 1008 connected to the communication system 1028, in step 1608. Here, the communication system 1028 may send information to the processor 504 from the signal data 1116. The signal data 1116 may be sent to the processor 504 from the signal processor 1108 and may identify available resources, available devices, and available sources. The processor 504 may also determine cloud sources for applications or data, in step 1612. The signal processor 1108 may determine connections from the devices 1008 to cloud sources. The connection information may then be sent from the signal processor 1108 to the processor 504. Further, the processor 504 may determine what cloud sources are available through the server 228. The cloud source information may then be presented in a user interface on the touch sensitive display 568.
  • From this touch sensitive display 568, the processor 504 can receive an input on the vehicle user interface, in step 1616. The input may be a selection of a cloud source for an application or for data. The selection may be analyzed to determine the correct source for that cloud information. Thus, the processor 504 may send a command to the signal processor 1108 to route the input to the appropriate device 1008 or to the server 228 that can provide that cloud application/data, in step 1620. The processor 504 can leverage the access of other devices 1008 to obtain cloud data. The cloud source may be provided from two or more different devices 1008 or from a device 1008 and the server 228. In this way, one of the devices 1008 or the server 228 may be selected based on load-balancing principles.
  • The data may be received through the device 1008 or the server 228 from the cloud and can be provided to the signal processor 1108, in step 1624. The information may be received from two or more sources, with one source cached, so that a seamless transition between the sources may occur should one of the sources become unavailable unexpectedly. The data received from the cloud source may then be provided to the vehicle touch sensitive display 568, in step 1628. In this way, cloud sources may be leveraged by the vehicle 104, although the access to those cloud sources may be through a device 1008.
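  • For illustration only, the following sketch assumes one way steps 1608-1628 could be realized, with a primary path and a cached backup path to the cloud source; the path names and functions are hypothetical.

```python
# Hypothetical cloud access through a connected device, with a cached backup
# path so the vehicle can fail over if the primary path becomes unavailable.
import random

def fetch_via(path: str, resource: str):
    """Stand-in for retrieving cloud data through a device or the server."""
    if random.random() < 0.2:              # simulate a source dropping out unexpectedly
        return None
    return f"{resource} (via {path})"

def get_cloud_data(resource: str, primary: str, backup: str):
    cached_backup = fetch_via(backup, resource)      # second source kept cached
    data = fetch_via(primary, resource)              # preferred source
    return data if data is not None else cached_backup

result = get_cloud_data("playlist.json", primary="phone-A", backup="server-228")
print(result if result is not None else "both sources unavailable")
```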
  • A method 1700 for streaming multimedia data is shown in FIG. 17. A general order for the steps of the method 1700 is shown in FIG. 17. Generally, the method 1700 starts with a start operation 1704 and ends with an end operation 1740. The method 1700 can include more or fewer steps or can arrange the order of the steps differently than those shown in FIG. 17. The method 1700 can be executed as a set of computer-executable instructions executed by a computer system and encoded or stored on a computer readable medium. Hereinafter, the method 1700 shall be explained with reference to the systems, components, modules, software, data structures, user interfaces, etc. described in conjunction with FIGS. 1-16.
  • A signal processor 1108 can determine devices or sources for multimedia data, in step 1708. The determination may be made by canvassing (e.g., polling each source for what multimedia the source can provide) the different devices 1008 or the server 228 for information about multimedia data. The received information may be consolidated and presented on the vehicle touch sensitive display 568, in step 1712. Thus, the user may select from multimedia data in its consolidated form from the user interface 568.
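  • A hypothetical sketch of the canvassing and consolidation of steps 1708-1712 follows; the source names and catalogs are illustrative only and not taken from the specification.

```python
# Hypothetical canvassing: poll each source for the multimedia it can provide,
# then consolidate the answers into one listing keyed by title for the UI.
catalogs = {
    "phone-A": ["Song 1", "Song 2", "Podcast X"],
    "tablet-B": ["Song 2", "Movie Y"],
    "server-228": ["Song 1", "Movie Y", "Radio Z"],
}

def canvass(catalogs: dict) -> dict:
    consolidated = {}
    for source, titles in catalogs.items():       # poll each device or server
        for title in titles:
            consolidated.setdefault(title, []).append(source)
    return consolidated

for title, sources in canvass(catalogs).items():
    print(f"{title}: available from {', '.join(sources)}")
```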
  • The user may select, from the vehicle user interface 568, a request for multimedia, in step 1716. The touch sensitive display 568 can receive the request and send the request to the processor 504. The processor 504 can determine the source for the multimedia selected and request that media from that source, in step 1720. In embodiments, the processor 504 may request the multimedia from two or more sources. The request can be sent from the processor 504 to the signal processor 1108. There, the signal processor 1108 can send the request to the device 1008 or to the server 228.
  • The multimedia received from the source may be cached, in step 1724. If multimedia is being received from two or more sources, each source may be cached. One of the sources may be used to present the multimedia data to the user while the other stream of data remains cached. In this way, a fail-safe or fail-over may be provided should one of the sources stop providing data or some malfunction occur.
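  • The caching described in step 1724 might, for example, be realized as in the following hypothetical sketch, in which two stream caches are kept and presentation fails over from the active cache to the standby cache; the class and chunk names are illustrative only.

```python
# Hypothetical dual-stream caching with fail-over for multimedia presentation.
from collections import deque

class StreamCache:
    """Bounded cache for one multimedia source."""
    def __init__(self, name: str):
        self.name = name
        self.buffer = deque(maxlen=256)

    def push(self, chunk: bytes) -> None:
        self.buffer.append(chunk)

    def alive(self) -> bool:
        return bool(self.buffer)

def next_chunk(active: StreamCache, standby: StreamCache):
    """Present from the active cache, failing over to the standby if it is empty."""
    source = active if active.alive() else standby
    return source.buffer.popleft(), source

a, b = StreamCache("phone-A"), StreamCache("server-228")
a.push(b"chunk-1")
b.push(b"chunk-1")                       # both streams cached, per step 1724
chunk, used = next_chunk(a, b)
print(f"presenting {chunk!r} from {used.name}")
```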
  • The processor may then determine if safety parameters are met, in step 1728. The provision of certain types of multimedia data, for example video data, may not be allowed during certain vehicle operations or to certain passengers within the vehicle 104. For example, a driver cannot watch a movie while driving. Thus, the processor 504 can determine if the safety parameters for the particular person desiring the multimedia data are met.
  • If those safety parameters are not met, the method 1700 proceeds NO to step 1732. In step 1732, the processor 504 may record the data to local storage. Thus, the data may be provided at a later time from this local storage. Likewise, the processor 504 can send a signal to another data storage element that can record the data. Thus, the processor 504 can coordinate the recording of the data for the user in a different source that allows the user to view it after they are done operating the vehicle 104.
  • If the safety parameters are met, the method 1700 proceeds YES to step 1736. Here, the data may then be presented in the vehicle 104 on a user interface 568. This user interface 568 may provide video data, internet data, audio data, or other types of data. The presentation of the data may happen in one or more locations within the vehicle 104 and on one or more user interfaces.
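  • For illustration only, the safety gate of steps 1728-1736 could resemble the following hypothetical sketch; the parameters checked (media type, seat position, vehicle motion) are assumptions and not an exhaustive list from the specification.

```python
# Hypothetical safety check: present the stream only when allowed, otherwise
# record it to local storage for later viewing (steps 1728-1736).

def safety_parameters_met(media_type: str, seat: str, vehicle_moving: bool) -> bool:
    if media_type == "video" and seat == "driver" and vehicle_moving:
        return False                       # a driver cannot watch video while driving
    return True

def handle_stream(media_type: str, seat: str, vehicle_moving: bool) -> str:
    if safety_parameters_met(media_type, seat, vehicle_moving):
        return f"presenting {media_type} at {seat} display"       # step 1736
    return f"recording {media_type} to local storage for later"   # step 1732

print(handle_stream("video", "driver", vehicle_moving=True))
print(handle_stream("video", "rear-left", vehicle_moving=True))
```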
  • Furthermore, while the exemplary aspects, embodiments, and/or configurations illustrated herein show the various components of the system collocated, certain components of the system can be located remotely, at distant portions of a distributed network, such as a LAN and/or the Internet, or within a dedicated system. Thus, it should be appreciated that the components of the system can be combined into one or more devices, such as a tablet-like device, or collocated on a particular node of a distributed network, such as an analog and/or digital telecommunications network, a packet-switched network, or a circuit-switched network. It will be appreciated from the preceding description, and for reasons of computational efficiency, that the components of the system can be arranged at any location within a distributed network of components without affecting the operation of the system. For example, the various components can be located in a switch such as a PBX and media server, gateway, in one or more communications devices, at one or more users' premises, or some combination thereof. Similarly, one or more functional portions of the system could be distributed between a telecommunications device(s) and an associated computing device.
  • Furthermore, it should be appreciated that the various links connecting the elements can be wired or wireless links, or any combination thereof, or any other known or later developed element(s) that is capable of supplying and/or communicating data to and from the connected elements. These wired or wireless links can also be secure links and may be capable of communicating encrypted information. Transmission media used as links, for example, can be any suitable carrier for electrical signals, including coaxial cables, copper wire and fiber optics, and may take the form of acoustic or light waves, such as those generated during radio-wave and infra-red data communications.
  • Also, while the flowcharts have been discussed and illustrated in relation to a particular sequence of events, it should be appreciated that changes, additions, and omissions to this sequence can occur without materially affecting the operation of the disclosed embodiments, configuration, and aspects.
  • In yet another embodiment, the systems and methods of this disclosure can be implemented in conjunction with a special purpose computer, a programmed microprocessor or microcontroller and peripheral integrated circuit element(s), an ASIC or other integrated circuit, a digital signal processor, a hard-wired electronic or logic circuit such as discrete element circuit, a programmable logic device or gate array such as PLD, PLA, FPGA, PAL, special purpose computer, any comparable means, or the like. In general, any device(s) or means capable of implementing the methodology illustrated herein can be used to implement the various aspects of this disclosure. Exemplary hardware that can be used for the disclosed embodiments, configurations and aspects includes computers, handheld devices, telephones (e.g., cellular, Internet enabled, digital, analog, hybrids, and others), and other hardware known in the art. Some of these devices include processors (e.g., a single or multiple microprocessors), memory, nonvolatile storage, input devices, and output devices. Furthermore, alternative software implementations including, but not limited to, distributed processing or component/object distributed processing, parallel processing, or virtual machine processing can also be constructed to implement the methods described herein.
  • In yet another embodiment, the disclosed methods may be readily implemented in conjunction with software using object or object-oriented software development environments that provide portable source code that can be used on a variety of computer or workstation platforms. Alternatively, the disclosed system may be implemented partially or fully in hardware using standard logic circuits or VLSI design. Whether software or hardware is used to implement the systems in accordance with this disclosure is dependent on the speed and/or efficiency requirements of the system, the particular function, and the particular software or hardware systems or microprocessor or microcomputer systems being utilized.
  • In yet another embodiment, the disclosed methods may be partially implemented in software that can be stored on a storage medium and executed on a programmed general-purpose computer with the cooperation of a controller and memory, a special purpose computer, a microprocessor, or the like. In these instances, the systems and methods of this disclosure can be implemented as a program embedded on a personal computer such as an applet, JAVA® or CGI script, as a resource residing on a server or computer workstation, as a routine embedded in a dedicated measurement system, system component, or the like. The system can also be implemented by physically incorporating the system and/or method into a software and/or hardware system.
  • Although the present disclosure describes components and functions implemented in the aspects, embodiments, and/or configurations with reference to particular standards and protocols, the aspects, embodiments, and/or configurations are not limited to such standards and protocols. Other similar standards and protocols not mentioned herein are in existence and are considered to be included in the present disclosure. Moreover, the standards and protocols mentioned herein and other similar standards and protocols not mentioned herein are periodically superseded by faster or more effective equivalents having essentially the same functions. Such replacement standards and protocols having the same functions are considered equivalents included in the present disclosure.
  • The present disclosure, in various aspects, embodiments, and/or configurations, includes components, methods, processes, systems and/or apparatus substantially as depicted and described herein, including various aspects, embodiments, configurations, subcombinations, and/or subsets thereof. Those of skill in the art will understand how to make and use the disclosed aspects, embodiments, and/or configurations after understanding the present disclosure. The present disclosure, in various aspects, embodiments, and/or configurations, includes providing devices and processes in the absence of items not depicted and/or described herein or in various aspects, embodiments, and/or configurations hereof, including in the absence of such items as may have been used in previous devices or processes, e.g., for improving performance, achieving ease, and/or reducing cost of implementation.
  • The foregoing discussion has been presented for purposes of illustration and description. The foregoing is not intended to limit the disclosure to the form or forms disclosed herein. In the foregoing Detailed Description for example, various features of the disclosure are grouped together in one or more aspects, embodiments, and/or configurations for the purpose of streamlining the disclosure. The features of the aspects, embodiments, and/or configurations of the disclosure may be combined in alternate aspects, embodiments, and/or configurations other than those discussed above. This method of disclosure is not to be interpreted as reflecting an intention that the claims require more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive aspects lie in less than all features of a single foregoing disclosed aspect, embodiment, and/or configuration. Thus, the following claims are hereby incorporated into this Detailed Description, with each claim standing on its own as a separate preferred embodiment of the disclosure.
  • Moreover, though the description has included description of one or more aspects, embodiments, and/or configurations and certain variations and modifications, other variations, combinations, and modifications are within the scope of the disclosure, e.g., as may be within the skill and knowledge of those in the art, after understanding the present disclosure. It is intended to obtain rights which include alternative aspects, embodiments, and/or configurations to the extent permitted, including alternate, interchangeable and/or equivalent structures, functions, ranges or steps to those claimed, whether or not such alternate, interchangeable and/or equivalent structures, functions, ranges or steps are disclosed herein, and without intending to publicly dedicate any patentable subject matter.
  • The embodiments can include a method, comprising receiving a signal at a vehicle communication system from a device; automatically determining if the signal originates from inside the vehicle; if the signal does not originate from inside the vehicle, rejecting the signal; if the signal originates from inside the vehicle, communicatively connecting the vehicle communication system to the device; and providing access to a vehicle bus.
  • The vehicle communication system can include a communication system in communication with two or more transceivers, wherein two or more of the transceivers are BLUETOOTH transceivers, and wherein two or more antennas for the vehicle communication system are separated but positioned within the vehicle.
  • The communication system can analyze the signal. The analysis includes one or more of: determining signal characteristics; determining location information; determining person information; determining movement information; and determining sensor information. The communication system may resolve the analysis to determine if the signal originated inside the vehicle, wherein the analysis may be weighted to resolve the determination of if the signal originated inside the vehicle. The vehicle communication system can include at least one wireless LAN component, wherein the vehicle communication system includes at least one wireless LAN component, and wherein the device communicates with the at least one wireless LAN component.
  • The embodiments can also include a vehicle communication system, comprising: two or more BLUETOOTH transceivers; a communication system comprising: two or more communication modules; a signal processor; wherein the signal processor is operable to: receive a signal at the vehicle communication system from a device; automatically determine if the signal originates from inside the vehicle; if the signal does not originate from inside the vehicle, reject the signal; if the signal originates from inside the vehicle, communicatively connect the vehicle communication system to the device; and provide access to a vehicle bus.
  • The vehicle communication system can include at least one wireless LAN component, and wherein at least one device connects through the wireless LAN component, wherein the communication system further comprises a signal data database and/or a translator. The signal processor may be further operable to: determine signal characteristics; determine location information; determine person information; determine movement information; determine sensor information; and resolve the signal characteristics, location information, person information, movement information, and sensor information to determine if the signal originated inside the vehicle.
  • The embodiments can also include computer-executable instructions comprising: instructions to receive a signal at a vehicle communication system from a device; instructions to automatically determine if the signal originates from inside the vehicle; if the signal does not originate from inside the vehicle, instructions to reject the signal; if the signal originates from inside the vehicle, instructions to communicatively connect the vehicle communication system to the device; and instructions to provide access to a vehicle bus.
  • The vehicle communication system can include: at least one wireless LAN component, and wherein at least one device connects through the wireless LAN component; two or more BLUETOOTH transceivers; a communication system comprising: two or more communication modules; a signal processor; a signal data database; and a translator.
  • The signal processor can execute instructions to: determine signal characteristics; determine location information; determine person information; determine movement information; determine sensor information; and resolve the signal characteristics, location information, person information, movement information, and sensor information to determine if the signal originated inside the vehicle, wherein the signal characteristics, location information, person information, movement information, and sensor information is weighted to resolve the determination of if the signal originated inside the vehicle.
  • The embodiments can also include a method, comprising: receiving a signal at a vehicle communication system from a device; automatically determining if the signal originates from inside the vehicle; if the signal does not originate from inside the vehicle, rejecting the signal; if the signal originates from inside the vehicle, communicatively connecting the vehicle communication system to the device; receiving a request for a function; determining one or more of available resources, available devices, and available sources; determining load; selecting a device or source to provide the function; commanding the device or source to provide the function; and receiving the function.
  • The vehicle communication system can include a communication system in communication with one or more devices, wherein the device is a computing system. The function may be one of playing of multimedia data, provisioning of access to the internet, or providing access to e-mail. Determining the load may comprise one of: determining a number of input/output messages sent between a device and vehicle communication system; or analyzing information sent to the vehicle communication system by the device, wherein the selection of the device or source is based both on availability and load. A first device may request the function, and a second device provides the function. A vehicle communication system can request the function, and a device provides the function for the vehicle communication system. A device may request the function, and the vehicle communication system provides the function for the device.
  • The embodiments can also include a vehicle communication system, comprising: two or more BLUETOOTH transceivers; a communication system comprising: two or more communication modules; a signal processor; wherein the signal processor is operable to: receive a request for a function; determine one or more of available resources, available devices, and available sources; determine load on the one or more available resources, available devices, and available sources; select a device or source to provide the function; command the device or source to provide the function; and receive the function, wherein the function is one of playing of multimedia data, provisioning of access to the internet, or providing access to e-mail.
  • The embodiments can also include computer-executable instructions comprising: instructions to receive a request for a function; instructions to determine one or more of available resources, available devices, and available sources; instructions to determine load on the one or more available resources, available devices, and available sources; instructions to select a device or source to provide the function; instructions to command the device or source to provide the function; and instructions to receive the function. The vehicle communication system can include at least one wireless LAN component, and wherein at least one device connects through the wireless LAN component; two or more BLUETOOTH transceivers; a communication system comprising: two or more communication modules; a signal processor; a signal data database; and a translator. The instructions to determine the load comprise one of: instructions to determine a number of input/output messages sent between a device and vehicle communication system; or instructions to analyze information sent to the vehicle communication system by the device, wherein the selection of the device or source is based both on availability and load.
  • The embodiments can also include a method, comprising: receiving a signal at a vehicle communication system from a device; automatically determining if the signal originates from inside the vehicle; if the signal does not originate from inside the vehicle, rejecting the signal; if the signal originates from inside the vehicle, communicatively connecting the vehicle communication system to the device; receiving a message from a first device; verifying that the message originated from inside the vehicle; and providing the message to a second device. The message may be directed to the second device. The method may also comprise: accessing signal data in a signal data database to retrieve address information from the second device; accessing signal data in a signal data database to retrieve address information for the first device; comparing an address for the first device in the message with the retrieved address information, wherein if the address for the first device in the message is the same as the retrieved address information, the message is verified.
  • The message may be sent through the vehicle communication system to the second device. Verifying the message can comprise: determining signal characteristics; determining location information; determining person information; determining movement information; determining sensor information; and resolving the signal characteristics, location information, person information, movement information, and sensor information to determine if the message originated inside the vehicle. The vehicle communication system can include at least one wireless LAN component, wherein the first device communicates to the second device through the at least one wireless LAN component.
  • The embodiments can also include a vehicle communication system, comprising: two or more BLUETOOTH transceivers; a communication system comprising: two or more communication modules; a signal processor; wherein the signal processor is operable to: receiving a message from a first device; verifying that the message originated from inside the vehicle; and providing the message to a second device, wherein the message is directed to the second device. The vehicle communication system can access signal data in a signal data database to retrieve address information from the second device, and access signal data in a signal data database to retrieve address information for the first device. The vehicle communication system can also compare an address for the first device in the message with the retrieved address information, wherein if the address for the first device in the message is the same as the retrieved address information, the message is verified, wherein the message is sent through the vehicle communication system to the second device.
  • The embodiments can also include computer-executable instructions comprising: instructions to receive a message from a first device; instructions to verify that the message originated from inside the vehicle; and instructions to provide the message to a second device.
  • The vehicle communication system can include at least one wireless LAN component, and wherein at least one device connects through the wireless LAN component; two or more BLUETOOTH transceivers; a communication system comprising: two or more communication modules; a signal processor, wherein the communication system further comprises: a signal data database; and a translator. The instructions can also include instructions to access signal data in a signal data database to retrieve address information from the second device, instructions to access signal data in a signal data database to retrieve address information for the first device; and instructions to compare an address for the first device in the message with the retrieved address information, wherein if the address for the first device in the message is the same as the retrieved address information, the message is verified, wherein the vehicle communication system includes at least one wireless LAN component, and wherein the first device communicates to the second device through the at least one wireless LAN component.
  • The embodiments can also include a method, comprising: receiving a signal at a vehicle communication system from a device; automatically determining if the signal originates from inside the vehicle; if the signal does not originate from inside the vehicle, rejecting the signal; if the signal originates from inside the vehicle, communicatively connecting the vehicle communication system to the device; receiving a request for an application; determining one or more of available resources, available devices, and available sources; determining load; activating a function on a device or source to execute the application; receiving, from the device, application user interface information or data at the vehicle to be displayed on a display associated with the vehicle; displaying the application on the display; commanding the vehicle communication system to perform functions for the application; receiving input on the display; based on the input received on the display, executing an application function on the device, wherein the vehicle communication system includes a communication system in communication with one or more devices.
  • The device is a computing system, and wherein determining the load comprises one of: determining a number of input/output messages sent between a device and vehicle communication system; or analyzing information sent to the vehicle communication system by the device, wherein the selection of the device or source is based both on availability and load. The request for the application may be received by the vehicle communication system, wherein a device provides the application for the vehicle communication system.
  • The embodiments can also include a vehicle communication system, comprising: two or more BLUETOOTH transceivers; a communication system comprising: two or more communication modules; a signal processor; wherein the signal processor is operable to: receive a request for an application; determine one or more of available resources, available devices, and available sources; determine load; activate a function on a device or source to execute the application; receive, from the device, application user interface information or data at the vehicle to be displayed on a display associated with the vehicle; and display the application on the display, wherein the signal processor is further operable to receive a command for the vehicle communication system to perform functions for the application, receive input on the display; and based on the input received on the display, execute an application function on the device, wherein determining the load comprises one of: determining a number of input/output messages sent between a device and vehicle communication system; or analyzing information sent to the vehicle communication system by the device. The request for the application may be received by the vehicle communication system, and a device provides the application for the vehicle communication system.
  • The embodiments can also include computer-executable instructions comprising: instructions to receive a request for an application; instructions to determine one or more of available resources, available devices, and available sources; instructions to determine load; instructions to activate a function on a device or source to execute the application; instructions to receive, from the device, application user interface information or data at the vehicle to be displayed on a display associated with the vehicle; and instructions to display the application on the display, wherein the vehicle communication system includes: at least one wireless LAN component, and wherein at least one device connects through the wireless LAN component; two or more BLUETOOTH transceivers; a communication system comprising: two or more communication modules; a signal processor. The communication system can further comprise: a signal data database; and a translator. The instructions can further comprise: instructions to receive an input on the display; and based on the input received on the display, instructions to execute an application function on the device, wherein the request for the application is received by the vehicle communication system, and wherein a device provides the application for the vehicle communication system.
  • The embodiments can also include a method, comprising: receiving a signal at a vehicle communication system from a device; automatically determining if the signal originates from inside the vehicle; if the signal does not originate from inside the vehicle, rejecting the signal; if the signal originates from inside the vehicle, communicatively connecting the vehicle communication system to the device; determining one or more of available resources, available devices, and available sources; determining cloud services connected to the available resources, available devices, and available sources; receiving input on a user interface of the vehicle, wherein the input requests a cloud service; routing the input to a device; receiving, from the device, data from the cloud service; and providing the data on the display of the vehicle.
  • A vehicle communication system can send signal data to a processor to determine the available resources, available devices, and available sources and the cloud services and determine which cloud sources are available through the device, wherein the processor determines which cloud sources are available through a server. The processor may also present the cloud services information on a touch sensitive display, wherein the device is a computing system, and wherein the cloud service is for the provision of data or for the execution of an application. The input can be received on the touch sensitive display, wherein the input is analyzed to determine a correct source for the cloud service, and wherein, based on the analysis, the vehicle communication system routes the input to the correct device.
  • The embodiments can also include a vehicle communication system, comprising: two or more BLUETOOTH transceivers; a communication system comprising: two or more communication modules; a signal processor; wherein the signal processor is operable to: determine one or more of available resources, available devices, and available sources; determine cloud services connected to the available resources, available devices, and available sources; receive input on a user interface of the vehicle, wherein the input requests a cloud service; route the input to a device; receive, from the device, data from the cloud service; and provide the data on the display of the vehicle, wherein the signal processor determines which cloud sources are available through the device, and wherein a processor determines which cloud sources are available through a server. The processor can be further operable to present the cloud services information on a touch sensitive display, wherein the input is received on the touch sensitive display, wherein the input is analyzed to determine a correct source for the cloud service, and wherein, based on the analysis, the signal processor routes the input to the correct device, and wherein the cloud service is for the provision of data or for the execution of an application.
  • The embodiments can also include computer-executable instructions comprising: instructions to determine one or more of available resources, available devices, and available sources; instructions to determine cloud services connected to the available resources, available devices, and available sources; instructions to receive input on a user interface of the vehicle, wherein the input requests a cloud service; instructions to route the input to a device; instructions to receive, from the device, data from the cloud service; and instructions to provide the data on the display of the vehicle, wherein the vehicle communication system includes: at least one wireless LAN component, and wherein at least one device connects through the wireless LAN component; two or more BLUETOOTH transceivers; a communication system comprising: two or more communication modules; a signal processor, and wherein the communication system further comprises: a signal data database; and a translator.
  • The signal processor can determine which cloud sources are available through the device, and wherein a processor determines which cloud sources are available through a server, wherein the cloud service is for the provision of data or for the execution of an application.
  • The embodiments can also include a method, comprising: receiving a signal at a vehicle communication system from a device; automatically determining if the signal originates from inside the vehicle; if the signal does not originate from inside the vehicle, rejecting the signal; if the signal originates from inside the vehicle, communicatively connecting the vehicle communication system to the device; determining one or more of available resources, available devices, and available sources; determining multimedia sources associated with the available resources, available devices, and available sources; receiving input on a user interface of the vehicle, wherein the input requests multimedia; routing the input to an available resource, available device, and available source; receiving a multimedia stream from an available resource, available device, and available source; and presenting the multimedia stream on an output of the vehicle.
  • The method can further comprise caching the multimedia stream, wherein two or more multimedia streams are received from two or more sources, and wherein all of the two or more multimedia streams are cached. Determining multimedia sources associated with the available resources, available devices, and available sources can comprise canvassing the available resources, available devices, and available sources for multimedia available at those available resources, available devices, and available sources. The method can further comprise consolidating the multimedia; and presenting the multimedia on a vehicle touch sensitive display; determining if safety parameters are met for presentation of the multimedia stream; if safety parameters are met for presentation of the multimedia stream, presenting the multimedia stream; if safety parameters are not met for presentation of the multimedia stream, recording the multimedia stream, wherein the multimedia stream is recorded in local storage.
  • The embodiments can also include a vehicle communication system, comprising: two or more BLUETOOTH transceivers; a communication system comprising: two or more communication modules; a signal processor; wherein the signal processor is operable to: determine one or more of available resources, available devices, and available sources; determine multimedia sources associated with the available resources, available devices, and available sources; receive input on a user interface of the vehicle, wherein the input requests multimedia; route the input to an available resource, available device, and available source; receive a multimedia stream from an available resource, available device, and available source; and present the multimedia stream on an output of the vehicle, wherein two or more multimedia streams are received from two or more sources, and wherein all of the two or more multimedia streams are cached.
  • The processor can be further operable to: consolidate the multimedia; present the multimedia on a vehicle touch sensitive display; determine if safety parameters are met for presentation of the multimedia stream; if safety parameters are met for presentation of the multimedia stream, present the multimedia stream; and if safety parameters are not met for presentation of the multimedia stream, record the multimedia stream, wherein the multimedia stream is recorded in local storage in the vehicle or on another storage element not associated with the vehicle.
  • The embodiments can also include computer-executable instructions comprising: instructions to determine one or more of available resources, available devices, and available sources; instructions to determine multimedia sources associated with the available resources, available devices, and available sources; instructions to receive input on a user interface of the vehicle, wherein the input requests multimedia; instructions to route the input to an available resource, available device, and available source; instructions to receive a multimedia stream from an available resource, available device, and available source; and instructions to present the multimedia stream on an output of the vehicle, wherein the vehicle communication system includes: at least one wireless LAN component, and wherein at least one device connects through the wireless LAN component; two or more BLUETOOTH transceivers; a communication system comprising: two or more communication modules; a signal processor, wherein the communication system further comprises: a signal data database; and a translator.
  • The computer executable instructions can further comprise instructions to determine if safety parameters are met for presentation of the multimedia stream; if safety parameters are met for presentation of the multimedia stream, instructions to present the multimedia stream; and if safety parameters are not met for presentation of the multimedia stream, instructions to record the multimedia stream, wherein the multimedia stream is recorded in local storage in the vehicle or on another storage element not associated with the vehicle.

Claims (20)

What is claimed is:
1. A method, comprising:
receiving a signal at a vehicle communication system from a device;
automatically determining if the signal originates from inside the vehicle;
if the signal does not originate from inside the vehicle, rejecting the signal;
if the signal originates from inside the vehicle, communicatively connecting the vehicle communication system to the device;
determining one or more of available resources, available devices, and available sources;
determining multimedia sources associated with the available resources, available devices, and available sources;
receiving input on a user interface of the vehicle, wherein the input requests multimedia;
routing the input to an available resource, available device, and available source;
receiving a multimedia stream from an available resource, available device, and available source; and
presenting the multimedia stream on an output of the vehicle.
2. The method of claim 1, further comprising caching the multimedia stream.
3. The method of claim 2, wherein two or more multimedia streams are received from two or more sources.
4. The method of claim 3, wherein all of the two or more multimedia streams are cached.
5. The method of claim 4, wherein determining multimedia sources associated with the available resources, available devices, and available sources comprises canvassing the available resources, available devices, and available sources for multimedia available at those available resources, available devices, and available sources.
6. The method of claim 5, further comprising:
consolidating the multimedia; and
presenting the multimedia on a vehicle touch sensitive display.
7. The method of claim 6, further comprising determining if safety parameters are met for presentation of the multimedia stream.
8. The method of claim 7, further comprising if safety parameters are met for presentation of the multimedia stream, presenting the multimedia stream.
9. The method of claim 7, further comprising if safety parameters are not met for presentation of the multimedia stream, recording the multimedia stream.
10. The method of claim 9, wherein the multimedia stream is recorded in local storage.
11. A vehicle communication system, comprising:
two or more BLUETOOTH transceivers;
a communication system comprising:
two or more communication modules;
a signal processor;
wherein the signal processor is operable to:
determine one or more of available resources, available devices, and available sources;
determine multimedia sources associated with the available resources, available devices, and available sources;
receive input on a user interface of the vehicle, wherein the input requests multimedia;
route the input to an available resource, available device, and available source;
receive a multimedia stream from an available resource, available device, and available source; and
present the multimedia stream on an output of the vehicle.
12. The vehicle communication system of claim 11, wherein two or more multimedia streams are received from two or more sources, and wherein all of the two or more multimedia streams are cached.
13. The vehicle communication system of claim 11, wherein the processor is further operable to:
consolidate the multimedia; and
present the multimedia on a vehicle touch sensitive display.
14. The vehicle communication system of claim 11, wherein the processor is further operable to:
determine if safety parameters are met for presentation of the multimedia stream;
if safety parameters are met for presentation of the multimedia stream, present the multimedia stream; and
if safety parameters are not met for presentation of the multimedia stream, record the multimedia stream.
15. The vehicle communication system of claim 14, wherein the multimedia stream is recorded in local storage in the vehicle or on another storage element not associated with the vehicle.
16. A computer readable medium having stored thereon computer-executable instructions, the computer executable instructions causing a processor to execute a method for providing a universal bus, the computer-executable instructions comprising:
instructions to determine one or more of available resources, available devices, and available sources;
instructions to determine multimedia sources associated with the available resources, available devices, and available sources;
instructions to receive input on a user interface of the vehicle, wherein the input requests multimedia;
instructions to route the input to an available resource, available device, and available source;
instructions to receive a multimedia stream from an available resource, available device, and available source; and
instructions to present the multimedia stream on an output of the vehicle.
17. The computer readable medium of claim 16, wherein the vehicle communication system includes:
at least one wireless LAN component, and wherein at least one device connects through the wireless LAN component;
two or more BLUETOOTH transceivers;
a communication system comprising:
two or more communication modules;
a signal processor.
18. The computer readable medium of claim 17, wherein the communication system further comprises:
a signal data database; and
a translator.
19. The computer readable medium of claim 18, further comprising:
instructions to determine if safety parameters are met for presentation of the multimedia stream;
if safety parameters are met for presentation of the multimedia stream, instructions to present the multimedia stream; and
if safety parameters are not met for presentation of the multimedia stream, instructions to record the multimedia stream.
20. The computer readable medium of claim 19, wherein the multimedia stream is recorded in local storage in the vehicle or on another storage element not associated with the vehicle.
US13/679,875 2011-11-16 2012-11-16 Music streaming Abandoned US20130145401A1 (en)

Priority Applications (11)

Application Number Priority Date Filing Date Title
US201161560509P true 2011-11-16 2011-11-16
US201261637164P true 2012-04-23 2012-04-23
US201261646747P true 2012-05-14 2012-05-14
US201261653264P true 2012-05-30 2012-05-30
US201261653275P true 2012-05-30 2012-05-30
US201261653563P true 2012-05-31 2012-05-31
US201261663335P true 2012-06-22 2012-06-22
US201261672483P true 2012-07-17 2012-07-17
US201261714016P true 2012-10-15 2012-10-15
US201261715699P true 2012-10-18 2012-10-18
US13/679,875 US20130145401A1 (en) 2011-11-16 2012-11-16 Music streaming

Applications Claiming Priority (116)

Application Number Priority Date Filing Date Title
US13/679,441 US8983718B2 (en) 2011-11-16 2012-11-16 Universal bus in the car
US13/679,857 US9020491B2 (en) 2011-11-16 2012-11-16 Sharing applications/media between car and phone (hydroid)
US13/678,726 US9043130B2 (en) 2011-11-16 2012-11-16 Object sensing (pedestrian avoidance/accident avoidance)
US13/679,887 US8995982B2 (en) 2011-11-16 2012-11-16 In-car communication between devices
US13/678,762 US9296299B2 (en) 2011-11-16 2012-11-16 Behavioral tracking and vehicle applications
US13/678,745 US9014911B2 (en) 2011-11-16 2012-11-16 Street side sensors
US13/679,878 US9140560B2 (en) 2011-11-16 2012-11-16 In-cloud connection for car multimedia
US13/679,400 US9159232B2 (en) 2011-11-16 2012-11-16 Vehicle climate control
US13/679,875 US20130145401A1 (en) 2011-11-16 2012-11-16 Music streaming
US13/679,369 US9176924B2 (en) 2011-11-16 2012-11-16 Method and system for vehicle data collection
US13/678,753 US9105051B2 (en) 2011-11-16 2012-11-16 Car location
US13/679,459 US9324234B2 (en) 2010-10-01 2012-11-16 Vehicle comprising multi-operating system
US13/678,735 US9046374B2 (en) 2011-11-16 2012-11-16 Proximity warning relative to other cars
US13/679,443 US9240018B2 (en) 2011-11-16 2012-11-16 Method and system for maintaining and reporting vehicle occupant information
US13/679,864 US9079497B2 (en) 2011-11-16 2012-11-16 Mobile hot spot/router/application share site or network
US13/679,350 US9008856B2 (en) 2011-11-16 2012-11-16 Configurable vehicle console
US13/678,710 US9123058B2 (en) 2011-11-16 2012-11-16 Parking space finder based on parking meter data
US13/678,699 US9330567B2 (en) 2011-11-16 2012-11-16 Etiquette suggestion
US13/679,842 US8979159B2 (en) 2011-11-16 2012-11-16 Configurable hardware unit for car systems
US13/829,505 US9088572B2 (en) 2011-11-16 2013-03-14 On board vehicle media controller
US13/828,651 US9055022B2 (en) 2011-11-16 2013-03-14 On board vehicle networking module
US13/830,003 US9008906B2 (en) 2011-11-16 2013-03-14 Occupant sharing of displayed content in vehicles
US13/828,960 US9173100B2 (en) 2011-11-16 2013-03-14 On board vehicle network security
US13/830,133 US9081653B2 (en) 2011-11-16 2013-03-14 Duplicated processing in vehicles
US13/828,513 US9116786B2 (en) 2011-11-16 2013-03-14 On board vehicle networking module
US13/829,718 US9043073B2 (en) 2011-11-16 2013-03-14 On board vehicle diagnostic module
US13/963,728 US9098367B2 (en) 2012-03-14 2013-08-09 Self-configuring vehicle console application store
US14/252,978 US9378601B2 (en) 2012-03-14 2014-04-15 Providing home automation information via communication with a vehicle
US14/253,006 US9384609B2 (en) 2012-03-14 2014-04-15 Vehicle to vehicle safety and traffic communications
US14/253,204 US9147296B2 (en) 2012-03-14 2014-04-15 Customization of vehicle controls and settings based on user profile data
US14/253,312 US9020697B2 (en) 2012-03-14 2014-04-15 Vehicle-based multimode discovery
US14/253,506 US9082239B2 (en) 2012-03-14 2014-04-15 Intelligent vehicle for assisting vehicle occupants
US14/253,840 US9378602B2 (en) 2012-03-14 2014-04-15 Traffic consolidation based on vehicle destination
US14/253,486 US9536361B2 (en) 2012-03-14 2014-04-15 Universal vehicle notification system
US14/253,424 US9305411B2 (en) 2012-03-14 2014-04-15 Automatic device and vehicle pairing via detected emitted signals
US14/253,376 US9317983B2 (en) 2012-03-14 2014-04-15 Automatic communication of damage and health in detected vehicle incidents
US14/253,416 US9142071B2 (en) 2012-03-14 2014-04-15 Vehicle zone-based intelligent console display settings
US14/253,334 US9235941B2 (en) 2012-03-14 2014-04-15 Simultaneous video streaming across multiple channels
US14/253,729 US9183685B2 (en) 2012-03-14 2014-04-15 Travel itinerary based on user profile data
US14/253,755 US9230379B2 (en) 2012-03-14 2014-04-15 Communication of automatically generated shopping list to vehicles and associated devices
US14/253,058 US9058703B2 (en) 2012-03-14 2014-04-15 Shared navigational information between vehicles
US14/253,251 US9147297B2 (en) 2012-03-14 2014-04-15 Infotainment system based on user profile
US14/253,464 US9142072B2 (en) 2012-03-14 2014-04-15 Information shared between a vehicle and user devices
US14/253,766 US9135764B2 (en) 2012-03-14 2014-04-15 Shopping cost and travel optimization application
US14/253,706 US9147298B2 (en) 2012-03-14 2014-04-15 Behavior modification via altered map routes based on user profile information
US14/253,406 US9117318B2 (en) 2012-03-14 2014-04-15 Vehicle diagnostic detection through sensitive vehicle skin
US14/253,405 US9082238B2 (en) 2012-03-14 2014-04-15 Synchronization between vehicle and user device calendar
US14/253,838 US9373207B2 (en) 2012-03-14 2014-04-15 Central network for the automated control of vehicular traffic
US14/253,743 US9153084B2 (en) 2012-03-14 2014-04-15 Destination and travel information application
US14/253,371 US9123186B2 (en) 2012-03-14 2014-04-15 Remote control of associated vehicle devices
US14/253,048 US9349234B2 (en) 2012-03-14 2014-04-15 Vehicle to vehicle social and business communications
US14/253,446 US9646439B2 (en) 2012-03-14 2014-04-15 Multi-vehicle shared communications network and bandwidth
US14/253,330 US9218698B2 (en) 2012-03-14 2014-04-15 Vehicle damage detection and indication
US14/253,078 US9524597B2 (en) 2012-03-14 2014-04-15 Radar sensing and emergency response vehicle detection
US14/468,055 US9240019B2 (en) 2011-11-16 2014-08-25 Location information exchange between vehicle and device
US14/527,209 US9542085B2 (en) 2011-11-16 2014-10-29 Universal console chassis for the car
US14/543,535 US9412273B2 (en) 2012-03-14 2014-11-17 Radar sensing and emergency response vehicle detection
US14/557,427 US9449516B2 (en) 2011-11-16 2014-12-01 Gesture recognition for on-board display
US14/657,934 US9338170B2 (en) 2011-11-16 2015-03-13 On board vehicle media controller
US14/657,829 US9417834B2 (en) 2011-11-16 2015-03-13 Occupant sharing of displayed content in vehicles
US14/659,255 US9297662B2 (en) 2011-11-16 2015-03-16 Universal bus in the car
US14/684,856 US9290153B2 (en) 2012-03-14 2015-04-13 Vehicle-based multimode discovery
US14/822,855 US20160040998A1 (en) 2012-03-14 2015-08-10 Automatic camera image retrieval based on route traffic and conditions
US14/822,840 US20160039430A1 (en) 2012-03-14 2015-08-10 Providing gesture control of associated vehicle functions across vehicle zones
US14/824,886 US20160041820A1 (en) 2012-03-14 2015-08-12 Vehicle and device software updates propagated via a viral communication contact
US14/825,998 US9466161B2 (en) 2012-03-14 2015-08-13 Driver facts behavior information storage system
US14/827,944 US20160047662A1 (en) 2012-03-14 2015-08-17 Proactive machine learning in a vehicular environment
US14/831,696 US9545930B2 (en) 2012-03-14 2015-08-20 Parental control over vehicle features and child alert system
US14/836,677 US20160055747A1 (en) 2011-11-16 2015-08-26 Law breaking/behavior sensor
US14/836,668 US20160062583A1 (en) 2011-11-16 2015-08-26 Removable, configurable vehicle console
US14/847,849 US20160070527A1 (en) 2012-03-14 2015-09-08 Network connected vehicle and associated controls
US14/863,257 US20160082839A1 (en) 2012-03-14 2015-09-23 Configurable dash display based on detected location and preferences
US14/863,361 US20160086391A1 (en) 2012-03-14 2015-09-23 Fleetwide vehicle telematics systems and methods
US14/875,472 US20160114745A1 (en) 2011-11-16 2015-10-05 On board vehicle remote control module
US14/875,411 US20160103980A1 (en) 2011-11-16 2015-10-05 Vehicle middleware
US14/927,196 US20160140776A1 (en) 2011-11-16 2015-10-29 Communications based on vehicle diagnostics and indications
US14/930,197 US20160127887A1 (en) 2011-11-16 2015-11-02 Control of device features based on vehicle state
US14/941,304 US20160155326A1 (en) 2012-03-14 2015-11-13 Relay and exchange protocol in an automated zone-based vehicular traffic control environment
US14/958,371 US20160163133A1 (en) 2012-03-14 2015-12-03 Automatic vehicle diagnostic detection and communication
US14/976,722 US20160188190A1 (en) 2011-11-16 2015-12-21 Configurable dash display
US14/978,185 US20160185222A1 (en) 2011-11-16 2015-12-22 On board vehicle media controller
US14/979,272 US20160189544A1 (en) 2011-11-16 2015-12-22 Method and system for vehicle data collection regarding traffic
US14/991,236 US20160196745A1 (en) 2011-11-16 2016-01-08 On board vehicle presence reporting module
US14/992,950 US20160205419A1 (en) 2012-03-14 2016-01-11 Simultaneous video streaming across multiple channels
US15/014,695 US20160246526A1 (en) 2012-03-14 2016-02-03 Global standard template creation, storage, and modification
US15/014,653 US20160223347A1 (en) 2012-03-14 2016-02-03 Travel route alteration based on user profile and business
US15/014,590 US20160244011A1 (en) 2012-03-14 2016-02-03 User interface and virtual personality presentation based on user profile
US15/058,010 US10079733B2 (en) 2011-11-16 2016-03-01 Automatic and adaptive selection of multimedia sources
US15/064,297 US20160249853A1 (en) 2012-03-14 2016-03-08 In-vehicle infant health monitoring system
US15/066,148 US20160250985A1 (en) 2012-03-14 2016-03-10 Universal vehicle voice command system
US15/073,955 US20160306766A1 (en) 2011-11-16 2016-03-18 Controller area network bus
US15/085,946 US20160321848A1 (en) 2012-03-14 2016-03-30 Control of vehicle features based on user recognition and identification
US15/091,470 US20160318524A1 (en) 2012-03-14 2016-04-05 Storing user gestures in a user profile data template
US15/091,461 US10013878B2 (en) 2012-03-14 2016-04-05 Vehicle registration to enter automated control of vehicular traffic
US15/099,413 US20160247377A1 (en) 2012-03-14 2016-04-14 Guest vehicle user reporting
US15/099,375 US20160306615A1 (en) 2011-11-16 2016-04-14 Vehicle application store for console
US15/133,793 US20160255575A1 (en) 2011-11-16 2016-04-20 Network selector in a vehicle infotainment system
US15/138,108 US9994229B2 (en) 2012-03-14 2016-04-25 Facial recognition database created from social networking sites
US15/138,642 US20160314538A1 (en) 2011-11-16 2016-04-26 Insurance tracking
US15/143,831 US20160318467A1 (en) 2012-03-14 2016-05-02 Building profiles associated with vehicle users
US15/143,856 US20160318468A1 (en) 2012-03-14 2016-05-02 Health statistics and communications of associated vehicle users
US15/269,434 US20170066406A1 (en) 2012-03-14 2016-09-19 Vehicle intruder alert detection and indication
US15/269,079 US20170067747A1 (en) 2012-03-14 2016-09-19 Automatic alert sent to user based on host location information
US15/269,617 US9977593B2 (en) 2011-11-16 2016-09-19 Gesture recognition for on-board display
US15/275,242 US20170078472A1 (en) 2011-11-16 2016-09-23 On board vehicle presence reporting module
US15/274,755 US20170078223A1 (en) 2012-03-14 2016-09-23 Vehicle initiated communications with third parties via virtual personality
US15/274,642 US20170075701A1 (en) 2012-03-14 2016-09-23 Configuration of haptic feedback and visual preferences in vehicle user interfaces
US15/277,412 US20170082447A1 (en) 2012-03-14 2016-09-27 Proactive machine learning in a vehicular environment
US15/287,219 US10020995B2 (en) 2011-11-16 2016-10-06 Vehicle middleware
US15/288,244 US20170099295A1 (en) 2012-03-14 2016-10-07 Access and portability of user profiles stored as templates
US15/289,317 US10275959B2 (en) 2012-03-14 2016-10-10 Driver facts behavior information storage system
US15/337,146 US9952680B2 (en) 2012-03-14 2016-10-28 Positional based movements and accessibility of features associated with a vehicle
US15/347,909 US20170131712A1 (en) 2012-03-14 2016-11-10 Relay and exchange protocol in an automated zone-based vehicular traffic control environment
US15/377,887 US20170132917A1 (en) 2011-11-16 2016-12-13 Law breaking/behavior sensor
US15/395,730 US10023117B2 (en) 2012-03-14 2016-12-30 Universal vehicle notification system
US15/400,947 US20170247000A1 (en) 2012-03-14 2017-01-06 User interface and virtual personality presentation based on user profile

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US15/058,010 Continuation US10079733B2 (en) 2010-10-01 2016-03-01 Automatic and adaptive selection of multimedia sources

Publications (1)

Publication Number Publication Date
US20130145401A1 true US20130145401A1 (en) 2013-06-06

Family

ID=48430173

Family Applications (56)

Application Number Title Priority Date Filing Date
US13/678,726 Active US9043130B2 (en) 2010-10-01 2012-11-16 Object sensing (pedestrian avoidance/accident avoidance)
US13/679,368 Abandoned US20130145279A1 (en) 2011-11-16 2012-11-16 Removable, configurable vehicle console
US13/679,815 Active US8919848B2 (en) 2011-11-16 2012-11-16 Universal console chassis for the car
US13/678,762 Active 2033-02-05 US9296299B2 (en) 2010-10-01 2012-11-16 Behavioral tracking and vehicle applications
US13/679,363 Abandoned US20130145297A1 (en) 2011-11-16 2012-11-16 Configurable heads-up dash display
US13/678,699 Active 2033-02-05 US9330567B2 (en) 2010-10-01 2012-11-16 Etiquette suggestion
US13/679,875 Abandoned US20130145401A1 (en) 2011-11-16 2012-11-16 Music streaming
US13/679,441 Active US8983718B2 (en) 2010-10-01 2012-11-16 Universal bus in the car
US13/678,735 Active 2032-12-04 US9046374B2 (en) 2010-10-01 2012-11-16 Proximity warning relative to other cars
US13/679,878 Expired - Fee Related US9140560B2 (en) 2010-10-01 2012-11-16 In-cloud connection for car multimedia
US13/679,292 Active US8862299B2 (en) 2011-11-16 2012-11-16 Branding of electrically propelled vehicles via the generation of specific operating output
US13/678,753 Active 2033-04-09 US9105051B2 (en) 2010-10-01 2012-11-16 Car location
US13/679,412 Abandoned US20130145360A1 (en) 2011-11-16 2012-11-16 Vehicle application store for console
US13/679,864 Active 2033-04-29 US9079497B2 (en) 2010-10-01 2012-11-16 Mobile hot spot/router/application share site or network
US13/678,773 Active US8818725B2 (en) 2011-11-16 2012-11-16 Location information exchange between vehicle and device
US13/678,691 Abandoned US20130144459A1 (en) 2011-11-16 2012-11-16 Law breaking/behavior sensor
US13/679,680 Abandoned US20130151065A1 (en) 2011-11-16 2012-11-16 Communications based on vehicle diagnostics and indications
US13/679,857 Active 2033-02-23 US9020491B2 (en) 2010-10-01 2012-11-16 Sharing applications/media between car and phone (hydroid)
US13/679,443 Active 2033-08-08 US9240018B2 (en) 2010-10-01 2012-11-16 Method and system for maintaining and reporting vehicle occupant information
US13/678,673 Abandoned US20130144657A1 (en) 2011-11-16 2012-11-16 Insurance tracking
US13/678,722 Active 2032-12-26 US8922393B2 (en) 2011-11-16 2012-11-16 Parking meter expired alert
US13/679,358 Abandoned US20130152003A1 (en) 2011-11-16 2012-11-16 Configurable dash display
US13/679,476 Abandoned US20130145482A1 (en) 2011-11-16 2012-11-16 Vehicle middleware
US13/679,459 Active 2032-04-11 US9324234B2 (en) 2010-10-01 2012-11-16 Vehicle comprising multi-operating system
US13/679,887 Active 2033-03-18 US8995982B2 (en) 2010-10-01 2012-11-16 In-car communication between devices
US13/679,369 Active 2033-03-04 US9176924B2 (en) 2010-10-01 2012-11-16 Method and system for vehicle data collection
US13/679,350 Active US9008856B2 (en) 2010-10-01 2012-11-16 Configurable vehicle console
US13/679,676 Abandoned US20130145065A1 (en) 2011-11-16 2012-11-16 Control of device features based on vehicle state
US13/679,234 Active US8831826B2 (en) 2011-11-16 2012-11-16 Gesture recognition for on-board display
US13/679,204 Active US8793034B2 (en) 2011-11-16 2012-11-16 Feature recognition for configuring a vehicle console and associated devices
US13/679,400 Expired - Fee Related US9159232B2 (en) 2010-10-01 2012-11-16 Vehicle climate control
US13/678,710 Expired - Fee Related US9123058B2 (en) 2010-10-01 2012-11-16 Parking space finder based on parking meter data
US13/678,745 Active US9014911B2 (en) 2010-10-01 2012-11-16 Street side sensors
US13/843,011 Abandoned US20130231800A1 (en) 2011-11-16 2013-03-15 Vehicle occupant health data gathering and monitoring
US14/468,055 Active US9240019B2 (en) 2010-10-01 2014-08-25 Location information exchange between vehicle and device
US14/527,209 Active US9542085B2 (en) 2010-10-01 2014-10-29 Universal console chassis for the car
US14/557,427 Active US9449516B2 (en) 2010-10-01 2014-12-01 Gesture recognition for on-board display
US14/659,255 Active US9297662B2 (en) 2010-10-01 2015-03-16 Universal bus in the car
US14/832,815 Abandoned US20160070456A1 (en) 2011-11-16 2015-08-21 Configurable heads-up dash display
US14/836,668 Abandoned US20160062583A1 (en) 2010-10-01 2015-08-26 Removable, configurable vehicle console
US14/836,677 Abandoned US20160055747A1 (en) 2010-10-01 2015-08-26 Law breaking/behavior sensor
US14/875,411 Abandoned US20160103980A1 (en) 2010-10-01 2015-10-05 Vehicle middleware
US14/927,196 Abandoned US20160140776A1 (en) 2010-10-01 2015-10-29 Communications based on vehicle diagnostics and indications
US14/930,197 Abandoned US20160127887A1 (en) 2010-10-01 2015-11-02 Control of device features based on vehicle state
US14/976,722 Abandoned US20160188190A1 (en) 2010-10-01 2015-12-21 Configurable dash display
US15/058,010 Active US10079733B2 (en) 2010-10-01 2016-03-01 Automatic and adaptive selection of multimedia sources
US15/099,375 Abandoned US20160306615A1 (en) 2010-10-01 2016-04-14 Vehicle application store for console
US15/138,642 Abandoned US20160314538A1 (en) 2010-10-01 2016-04-26 Insurance tracking
US15/269,617 Active US9977593B2 (en) 2010-10-01 2016-09-19 Gesture recognition for on-board display
US15/287,219 Active US10020995B2 (en) 2010-10-01 2016-10-06 Vehicle middleware
US15/377,887 Abandoned US20170132917A1 (en) 2010-10-01 2016-12-13 Law breaking/behavior sensor
US15/401,719 Active US10177986B2 (en) 2011-11-16 2017-01-09 Universal console chassis for the car
US16/031,931 Pending US20190222484A1 (en) 2011-11-16 2018-07-10 Vehicle middleware
US16/360,018 Pending US20190288916A1 (en) 2011-11-16 2019-03-21 System and method for a vehicle zone-determined reconfigurable display
US16/365,619 Pending US20190288917A1 (en) 2011-11-16 2019-03-26 Insurance Tracking
US16/386,032 Pending US20190311611A1 (en) 2011-11-16 2019-04-16 System and Method for Dynamic Map Updating in a Conveyance.

Family Applications Before (6)

Application Number Title Priority Date Filing Date
US13/678,726 Active US9043130B2 (en) 2010-10-01 2012-11-16 Object sensing (pedestrian avoidance/accident avoidance)
US13/679,368 Abandoned US20130145279A1 (en) 2011-11-16 2012-11-16 Removable, configurable vehicle console
US13/679,815 Active US8919848B2 (en) 2011-11-16 2012-11-16 Universal console chassis for the car
US13/678,762 Active 2033-02-05 US9296299B2 (en) 2010-10-01 2012-11-16 Behavioral tracking and vehicle applications
US13/679,363 Abandoned US20130145297A1 (en) 2011-11-16 2012-11-16 Configurable heads-up dash display
US13/678,699 Active 2033-02-05 US9330567B2 (en) 2010-10-01 2012-11-16 Etiquette suggestion

Family Applications After (49)

Application Number Title Priority Date Filing Date
US13/679,441 Active US8983718B2 (en) 2010-10-01 2012-11-16 Universal bus in the car
US13/678,735 Active 2032-12-04 US9046374B2 (en) 2010-10-01 2012-11-16 Proximity warning relative to other cars
US13/679,878 Expired - Fee Related US9140560B2 (en) 2010-10-01 2012-11-16 In-cloud connection for car multimedia
US13/679,292 Active US8862299B2 (en) 2011-11-16 2012-11-16 Branding of electrically propelled vehicles via the generation of specific operating output
US13/678,753 Active 2033-04-09 US9105051B2 (en) 2010-10-01 2012-11-16 Car location
US13/679,412 Abandoned US20130145360A1 (en) 2011-11-16 2012-11-16 Vehicle application store for console
US13/679,864 Active 2033-04-29 US9079497B2 (en) 2010-10-01 2012-11-16 Mobile hot spot/router/application share site or network
US13/678,773 Active US8818725B2 (en) 2011-11-16 2012-11-16 Location information exchange between vehicle and device
US13/678,691 Abandoned US20130144459A1 (en) 2011-11-16 2012-11-16 Law breaking/behavior sensor
US13/679,680 Abandoned US20130151065A1 (en) 2011-11-16 2012-11-16 Communications based on vehicle diagnostics and indications
US13/679,857 Active 2033-02-23 US9020491B2 (en) 2010-10-01 2012-11-16 Sharing applications/media between car and phone (hydroid)
US13/679,443 Active 2033-08-08 US9240018B2 (en) 2010-10-01 2012-11-16 Method and system for maintaining and reporting vehicle occupant information
US13/678,673 Abandoned US20130144657A1 (en) 2011-11-16 2012-11-16 Insurance tracking
US13/678,722 Active 2032-12-26 US8922393B2 (en) 2011-11-16 2012-11-16 Parking meter expired alert
US13/679,358 Abandoned US20130152003A1 (en) 2011-11-16 2012-11-16 Configurable dash display
US13/679,476 Abandoned US20130145482A1 (en) 2011-11-16 2012-11-16 Vehicle middleware
US13/679,459 Active 2032-04-11 US9324234B2 (en) 2010-10-01 2012-11-16 Vehicle comprising multi-operating system
US13/679,887 Active 2033-03-18 US8995982B2 (en) 2010-10-01 2012-11-16 In-car communication between devices
US13/679,369 Active 2033-03-04 US9176924B2 (en) 2010-10-01 2012-11-16 Method and system for vehicle data collection
US13/679,350 Active US9008856B2 (en) 2010-10-01 2012-11-16 Configurable vehicle console
US13/679,676 Abandoned US20130145065A1 (en) 2011-11-16 2012-11-16 Control of device features based on vehicle state
US13/679,234 Active US8831826B2 (en) 2011-11-16 2012-11-16 Gesture recognition for on-board display
US13/679,204 Active US8793034B2 (en) 2011-11-16 2012-11-16 Feature recognition for configuring a vehicle console and associated devices
US13/679,400 Expired - Fee Related US9159232B2 (en) 2010-10-01 2012-11-16 Vehicle climate control
US13/678,710 Expired - Fee Related US9123058B2 (en) 2010-10-01 2012-11-16 Parking space finder based on parking meter data
US13/678,745 Active US9014911B2 (en) 2010-10-01 2012-11-16 Street side sensors
US13/843,011 Abandoned US20130231800A1 (en) 2011-11-16 2013-03-15 Vehicle occupant health data gathering and monitoring
US14/468,055 Active US9240019B2 (en) 2010-10-01 2014-08-25 Location information exchange between vehicle and device
US14/527,209 Active US9542085B2 (en) 2010-10-01 2014-10-29 Universal console chassis for the car
US14/557,427 Active US9449516B2 (en) 2010-10-01 2014-12-01 Gesture recognition for on-board display
US14/659,255 Active US9297662B2 (en) 2010-10-01 2015-03-16 Universal bus in the car
US14/832,815 Abandoned US20160070456A1 (en) 2011-11-16 2015-08-21 Configurable heads-up dash display
US14/836,668 Abandoned US20160062583A1 (en) 2010-10-01 2015-08-26 Removable, configurable vehicle console
US14/836,677 Abandoned US20160055747A1 (en) 2010-10-01 2015-08-26 Law breaking/behavior sensor
US14/875,411 Abandoned US20160103980A1 (en) 2010-10-01 2015-10-05 Vehicle middleware
US14/927,196 Abandoned US20160140776A1 (en) 2010-10-01 2015-10-29 Communications based on vehicle diagnostics and indications
US14/930,197 Abandoned US20160127887A1 (en) 2010-10-01 2015-11-02 Control of device features based on vehicle state
US14/976,722 Abandoned US20160188190A1 (en) 2010-10-01 2015-12-21 Configurable dash display
US15/058,010 Active US10079733B2 (en) 2010-10-01 2016-03-01 Automatic and adaptive selection of multimedia sources
US15/099,375 Abandoned US20160306615A1 (en) 2010-10-01 2016-04-14 Vehicle application store for console
US15/138,642 Abandoned US20160314538A1 (en) 2010-10-01 2016-04-26 Insurance tracking
US15/269,617 Active US9977593B2 (en) 2010-10-01 2016-09-19 Gesture recognition for on-board display
US15/287,219 Active US10020995B2 (en) 2010-10-01 2016-10-06 Vehicle middleware
US15/377,887 Abandoned US20170132917A1 (en) 2010-10-01 2016-12-13 Law breaking/behavior sensor
US15/401,719 Active US10177986B2 (en) 2011-11-16 2017-01-09 Universal console chassis for the car
US16/031,931 Pending US20190222484A1 (en) 2011-11-16 2018-07-10 Vehicle middleware
US16/360,018 Pending US20190288916A1 (en) 2011-11-16 2019-03-21 System and method for a vehicle zone-determined reconfigurable display
US16/365,619 Pending US20190288917A1 (en) 2011-11-16 2019-03-26 Insurance Tracking
US16/386,032 Pending US20190311611A1 (en) 2011-11-16 2019-04-16 System and Method for Dynamic Map Updating in a Conveyance.

Country Status (3)

Country Link
US (56) US9043130B2 (en)
DE (10) DE112012004770T5 (en)
WO (10) WO2013075005A1 (en)

Cited By (35)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020195832A1 (en) * 2001-06-12 2002-12-26 Honda Giken Kogyo Kabushiki Kaisha Vehicle occupant side crash protection system
US8793034B2 (en) 2011-11-16 2014-07-29 Flextronics Ap, Llc Feature recognition for configuring a vehicle console and associated devices
US8949823B2 (en) 2011-11-16 2015-02-03 Flextronics Ap, Llc On board vehicle installation supervisor
US9008906B2 (en) 2011-11-16 2015-04-14 Flextronics Ap, Llc Occupant sharing of displayed content in vehicles
US9020697B2 (en) 2012-03-14 2015-04-28 Flextronics Ap, Llc Vehicle-based multimode discovery
US9043073B2 (en) 2011-11-16 2015-05-26 Flextronics Ap, Llc On board vehicle diagnostic module
US9055022B2 (en) 2011-11-16 2015-06-09 Flextronics Ap, Llc On board vehicle networking module
US9082239B2 (en) 2012-03-14 2015-07-14 Flextronics Ap, Llc Intelligent vehicle for assisting vehicle occupants
US9081653B2 (en) 2011-11-16 2015-07-14 Flextronics Ap, Llc Duplicated processing in vehicles
US9082238B2 (en) 2012-03-14 2015-07-14 Flextronics Ap, Llc Synchronization between vehicle and user device calendar
US9088572B2 (en) 2011-11-16 2015-07-21 Flextronics Ap, Llc On board vehicle media controller
US9098367B2 (en) 2012-03-14 2015-08-04 Flextronics Ap, Llc Self-configuring vehicle console application store
US9116786B2 (en) 2011-11-16 2015-08-25 Flextronics Ap, Llc On board vehicle networking module
US9147298B2 (en) 2012-03-14 2015-09-29 Flextronics Ap, Llc Behavior modification via altered map routes based on user profile information
US9173100B2 (en) 2011-11-16 2015-10-27 Autoconnect Holdings Llc On board vehicle network security
US9373207B2 (en) 2012-03-14 2016-06-21 Autoconnect Holdings Llc Central network for the automated control of vehicular traffic
US9378601B2 (en) 2012-03-14 2016-06-28 Autoconnect Holdings Llc Providing home automation information via communication with a vehicle
US9384609B2 (en) 2012-03-14 2016-07-05 Autoconnect Holdings Llc Vehicle to vehicle safety and traffic communications
US9412273B2 (en) 2012-03-14 2016-08-09 Autoconnect Holdings Llc Radar sensing and emergency response vehicle detection
US9758116B2 (en) 2014-01-10 2017-09-12 Sony Corporation Apparatus and method for use in configuring an environment of an automobile
US9928734B2 (en) 2016-08-02 2018-03-27 Nio Usa, Inc. Vehicle-to-pedestrian communication systems
US9946906B2 (en) 2016-07-07 2018-04-17 Nio Usa, Inc. Vehicle with a soft-touch antenna for communicating sensitive information
US9963106B1 (en) 2016-11-07 2018-05-08 Nio Usa, Inc. Method and system for authentication in autonomous vehicles
US9984572B1 (en) 2017-01-16 2018-05-29 Nio Usa, Inc. Method and system for sharing parking space availability among autonomous vehicles
US10031521B1 (en) 2017-01-16 2018-07-24 Nio Usa, Inc. Method and system for using weather information in operation of autonomous vehicles
US10074223B2 (en) 2017-01-13 2018-09-11 Nio Usa, Inc. Secured vehicle for user use only
US10234302B2 (en) 2017-06-27 2019-03-19 Nio Usa, Inc. Adaptive route and motion planning based on learned external and internal vehicle environment
US10249104B2 (en) 2016-12-06 2019-04-02 Nio Usa, Inc. Lease observation and event recording
US10286915B2 (en) 2017-01-17 2019-05-14 Nio Usa, Inc. Machine learning for personalized driving
US10369966B1 (en) 2018-05-23 2019-08-06 Nio Usa, Inc. Controlling access to a vehicle using wireless access devices
US10369974B2 (en) 2017-07-14 2019-08-06 Nio Usa, Inc. Control and coordination of driverless fuel replenishment for autonomous vehicles
US10410064B2 (en) 2016-11-11 2019-09-10 Nio Usa, Inc. System for tracking and identifying vehicles and pedestrians
US10410250B2 (en) 2016-11-21 2019-09-10 Nio Usa, Inc. Vehicle autonomy level selection based on user context
US10464530B2 (en) 2017-01-17 2019-11-05 Nio Usa, Inc. Voice biometric pre-purchase enrollment for autonomous vehicles
US10471829B2 (en) 2017-01-16 2019-11-12 Nio Usa, Inc. Self-destruct zone and autonomous vehicle navigation

Families Citing this family (672)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7650509B1 (en) 2004-01-28 2010-01-19 Gordon & Howard Associates, Inc. Encoding data in a password
US9026267B2 (en) 2007-03-09 2015-05-05 Gordon*Howard Associates, Inc. Methods and systems of selectively enabling a vehicle by way of a portable wireless device
US10157422B2 (en) 2007-05-10 2018-12-18 Allstate Insurance Company Road segment safety rating
US8606512B1 (en) 2007-05-10 2013-12-10 Allstate Insurance Company Route risk mitigation
US9932033B2 (en) 2007-05-10 2018-04-03 Allstate Insurance Company Route risk mitigation
US10096038B2 (en) 2007-05-10 2018-10-09 Allstate Insurance Company Road segment safety rating system
US8626230B2 (en) * 2008-03-04 2014-01-07 Dish Network Corporation Method and system for using routine driving information in mobile interactive satellite services
RU2011134067A (en) * 2009-01-14 2013-03-10 TomTom International B.V. Navigation device and method
US8295546B2 (en) 2009-01-30 2012-10-23 Microsoft Corporation Pose tracking pipeline
DE102010016043A1 (en) * 2009-03-25 2010-11-04 Denso Corporation, Kariya-City Display device and measuring device for a vehicle
JP5287746B2 (en) * 2009-05-21 2013-09-11 Nissan Motor Co., Ltd. Driving support device and driving support method
US9016627B2 (en) * 2009-10-02 2015-04-28 Panasonic Avionics Corporation System and method for providing an integrated user interface system at a seat
US20110102348A1 (en) * 2009-11-02 2011-05-05 Modu Ltd. Dual wireless communicator and human interface device
US8823556B2 (en) * 2010-09-02 2014-09-02 Honda Motor Co., Ltd. Method of estimating intersection control
US8966379B2 (en) 2010-10-01 2015-02-24 Z124 Dynamic cross-environment application configuration/orientation in an active user environment
US20120084736A1 (en) 2010-10-01 2012-04-05 Flextronics Id, Llc Gesture controlled screen repositioning for one or more displays
WO2012044558A2 (en) 2010-10-01 2012-04-05 Imerj, Llc Cross-environment communication framework
US9026709B2 (en) 2010-10-01 2015-05-05 Z124 Auto-waking of a suspended OS in a dockable system
US9047102B2 (en) 2010-10-01 2015-06-02 Z124 Instant remote rendering
US8726294B2 (en) 2010-10-01 2014-05-13 Z124 Cross-environment communication using application space API
US8933949B2 (en) 2010-10-01 2015-01-13 Z124 User interaction across cross-environment applications through an extended graphics context
US8819705B2 (en) 2010-10-01 2014-08-26 Z124 User interaction support across cross-environment applications
US8509982B2 (en) 2010-10-05 2013-08-13 Google Inc. Zone driving
US8761831B2 (en) 2010-10-15 2014-06-24 Z124 Mirrored remote peripheral interface
US20120115413A1 (en) * 2010-11-10 2012-05-10 Ipcomm Llc Method for Suspending Transmission and Reception of Text Messages and Phone Calls while Driving
US8989950B2 (en) * 2011-02-15 2015-03-24 Bosch Automotive Service Solutions Llc Diagnostic tool with smart camera
US9659099B2 (en) 2011-03-14 2017-05-23 Amgine Technologies (Us), Inc. Translation of user requests into itinerary solutions
US10078855B2 (en) 2011-03-14 2018-09-18 Amgine Technologies (Us), Inc. Managing an exchange that fulfills natural language travel requests
US8762944B2 (en) * 2011-03-23 2014-06-24 International Business Machines Corporation Build process management system
DE102011007914A1 (en) * 2011-04-21 2012-10-25 Deere & Company Data communication interface for an agricultural utility vehicle
US9230440B1 (en) 2011-04-22 2016-01-05 Angel A. Penilla Methods and systems for locating public parking and receiving security ratings for parking locations and generating notifications to vehicle user accounts regarding alerts and cloud access to security information
US9536197B1 (en) 2011-04-22 2017-01-03 Angel A. Penilla Methods and systems for processing data streams from data producing objects of vehicle and home entities and generating recommendations and settings
US9189900B1 (en) 2011-04-22 2015-11-17 Angel A. Penilla Methods and systems for assigning e-keys to users to access and drive vehicles
US9697503B1 (en) 2011-04-22 2017-07-04 Angel A. Penilla Methods and systems for providing recommendations to vehicle users to handle alerts associated with the vehicle and a bidding market place for handling alerts/service of the vehicle
US9171268B1 (en) 2011-04-22 2015-10-27 Angel A. Penilla Methods and systems for setting and transferring user profiles to vehicles and temporary sharing of user profiles to shared-use vehicles
US9581997B1 (en) 2011-04-22 2017-02-28 Angel A. Penilla Method and system for cloud-based communication for automatic driverless movement
US9818088B2 (en) 2011-04-22 2017-11-14 Emerging Automotive, Llc Vehicles and cloud systems for providing recommendations to vehicle users to handle alerts associated with the vehicle
US9809196B1 (en) 2011-04-22 2017-11-07 Emerging Automotive, Llc Methods and systems for vehicle security and remote access and safety control interfaces and notifications
US10286919B2 (en) 2011-04-22 2019-05-14 Emerging Automotive, Llc Valet mode for restricted operation of a vehicle and cloud access of a history of use made during valet mode use
US9229905B1 (en) 2011-04-22 2016-01-05 Angel A. Penilla Methods and systems for defining vehicle user profiles and managing user profiles via cloud systems and applying learned settings to user profiles
US9139091B1 (en) 2011-04-22 2015-09-22 Angel A. Penilla Methods and systems for setting and/or assigning advisor accounts to entities for specific vehicle aspects and cloud management of advisor accounts
US9348492B1 (en) 2011-04-22 2016-05-24 Angel A. Penilla Methods and systems for providing access to specific vehicle controls, functions, environment and applications to guests/passengers via personal mobile devices
US9365188B1 (en) 2011-04-22 2016-06-14 Angel A. Penilla Methods and systems for using cloud services to assign e-keys to access vehicles
US10289288B2 (en) 2011-04-22 2019-05-14 Emerging Automotive, Llc Vehicle systems for providing access to vehicle controls, functions, environment and applications to guests/passengers via mobile devices
US9346365B1 (en) 2011-04-22 2016-05-24 Angel A. Penilla Methods and systems for electric vehicle (EV) charging, charging unit (CU) interfaces, auxiliary batteries, and remote access and user notifications
US9123035B2 (en) 2011-04-22 2015-09-01 Angel A. Penilla Electric vehicle (EV) range extending charge systems, distributed networks of charge kiosks, and charge locating mobile apps
US9648107B1 (en) 2011-04-22 2017-05-09 Angel A. Penilla Methods and cloud systems for using connected object state data for informing and alerting connected vehicle drivers of state changes
US9180783B1 (en) 2011-04-22 2015-11-10 Penilla Angel A Methods and systems for electric vehicle (EV) charge location color-coded charge state indicators, cloud applications and user notifications
US9285944B1 (en) 2011-04-22 2016-03-15 Angel A. Penilla Methods and systems for defining custom vehicle user interface configurations and cloud services for managing applications for the user interface and learned setting functions
US9215274B2 (en) 2011-04-22 2015-12-15 Angel A. Penilla Methods and systems for generating recommendations to make settings at vehicles via cloud systems
US9104537B1 (en) 2011-04-22 2015-08-11 Angel A. Penilla Methods and systems for generating setting recommendation to user accounts for registered vehicles via cloud systems and remotely applying settings
US9288270B1 (en) 2011-04-22 2016-03-15 Angel A. Penilla Systems for learning user preferences and generating recommendations to make settings at connected vehicles and interfacing with cloud systems
US9371007B1 (en) 2011-04-22 2016-06-21 Angel A. Penilla Methods and systems for automatic electric vehicle identification and charging via wireless charging pads
US9493130B2 (en) 2011-04-22 2016-11-15 Angel A. Penilla Methods and systems for communicating content to connected vehicle users based detected tone/mood in voice input
US9963145B2 (en) 2012-04-22 2018-05-08 Emerging Automotive, Llc Connected vehicle communication with processing alerts related to traffic lights and cloud systems
DE102011077486B3 (en) * 2011-06-14 2012-10-18 Robert Bosch Gmbh Device and method for triggering an occupant protection device, triggering system and vehicle
KR101933450B1 (en) * 2011-07-05 2018-12-31 Samsung Electronics Co., Ltd. Method for dynamically changing contents displayed on vehicular head unit and a mobile terminal therefor
US8924970B2 (en) * 2011-08-05 2014-12-30 Vmware, Inc. Sharing work environment information sources with personal environment applications
US8824166B2 (en) 2011-08-31 2014-09-02 Apple Inc. Magnetic stand for tablet device
US8842057B2 (en) 2011-09-27 2014-09-23 Z124 Detail on triggers: transitional states
US9021049B2 (en) * 2011-10-21 2015-04-28 GM Global Technology Operations LLC Method and apparatus for augmenting smartphone-centric in-car infotainment system using vehicle Wi-Fi/DSRC
US20130132434A1 (en) * 2011-11-22 2013-05-23 Inrix, Inc. User-assisted identification of location conditions
TWI447039B (en) * 2011-11-25 2014-08-01 Driving behavior analysis and warning system and method thereof
KR101327057B1 (en) * 2011-12-05 2013-11-07 Hyundai Motor Company System for inducing economic driving for vehicle
US9197720B2 (en) 2011-12-07 2015-11-24 Yahoo! Inc. Deployment and hosting of platform independent applications
US9946526B2 (en) 2011-12-07 2018-04-17 Excalibur Ip, Llc Development and hosting for platform independent applications
US20130151595A1 (en) * 2011-12-07 2013-06-13 Bruno Fernandez-Ruiz Deployment and hosting of platform independent applications
US9158520B2 (en) 2011-12-07 2015-10-13 Yahoo! Inc. Development of platform independent applications
US9268546B2 (en) 2011-12-07 2016-02-23 Yahoo! Inc. Deployment and hosting of platform independent applications
KR20130066347A (en) * 2011-12-12 2013-06-20 Hyundai Motor Company Apparatus and method for max opening angle setting of power tail gate
US8811938B2 (en) 2011-12-16 2014-08-19 Microsoft Corporation Providing a user interface experience based on inferred vehicle state
US9824064B2 (en) * 2011-12-21 2017-11-21 Scope Technologies Holdings Limited System and method for use of pattern recognition in assessing or monitoring vehicle status or operator driving behavior
US8892385B2 (en) 2011-12-21 2014-11-18 Scope Technologies Holdings Limited System and method for use with an accelerometer to determine a frame of reference
DE102011089496A1 (en) 2011-12-21 2013-06-27 Continental Automotive Gmbh System and method for transmission of transmissions
US9518830B1 (en) 2011-12-28 2016-12-13 Intelligent Technologies International, Inc. Vehicular navigation system updating based on object presence
CN104080658B (en) * 2012-01-25 2016-08-24 Toyota Motor Corporation Vehicle remote operation information providing device, on-board remote operation information acquisition device, and vehicle remote operation system comprising these devices
US20130218604A1 (en) * 2012-02-21 2013-08-22 Elwha Llc Systems and methods for insurance based upon monitored characteristics of a collision detection system
US20130245882A1 (en) * 2012-03-14 2013-09-19 Christopher P. Ricci Removable, configurable vehicle console
US10289108B2 (en) * 2012-03-15 2019-05-14 General Electric Company Methods and apparatus for monitoring operation of a system asset
US8744771B2 (en) * 2012-03-26 2014-06-03 Navteq B.V. Reverse natural guidance
US9191442B2 (en) * 2012-04-03 2015-11-17 Accenture Global Services Limited Adaptive sensor data selection and sampling based on current and future context
US10217160B2 (en) 2012-04-22 2019-02-26 Emerging Automotive, Llc Methods and systems for processing charge availability and route paths for obtaining charge for electric vehicles
DE102012206691A1 (en) * 2012-04-24 2013-10-24 Zumtobel Lighting Gmbh Road and path lighting system
US9696884B2 (en) * 2012-04-25 2017-07-04 Nokia Technologies Oy Method and apparatus for generating personalized media streams
US9002554B2 (en) * 2012-05-09 2015-04-07 Innova Electronics, Inc. Smart phone app-based remote vehicle diagnostic system and method
US9020876B2 (en) * 2012-06-07 2015-04-28 International Business Machines Corporation On-demand suggestion for vehicle driving
US9007229B1 (en) 2012-06-20 2015-04-14 Amazon Technologies, Inc. Sensor based recommendations
US10408857B2 (en) * 2012-09-12 2019-09-10 Alpinereplay, Inc. Use of gyro sensors for identifying athletic maneuvers
US20130346571A1 (en) * 2012-06-24 2013-12-26 Sergei MAKAVEEV Computer and method of operation of its network
DE102012012697A1 (en) * 2012-06-26 2014-01-02 Leopold Kostal Gmbh & Co. Kg Operating system for a motor vehicle
US9000903B2 (en) 2012-07-09 2015-04-07 Elwha Llc Systems and methods for vehicle monitoring
US9165469B2 (en) 2012-07-09 2015-10-20 Elwha Llc Systems and methods for coordinating sensor operation for collision detection
US9558667B2 (en) 2012-07-09 2017-01-31 Elwha Llc Systems and methods for cooperative collision detection
KR101493360B1 (en) * 2012-07-30 2015-02-23 KT Corporation Method of vehicle driving managing through detection state change of around cars and system for it
US9704189B2 (en) * 2012-09-05 2017-07-11 Rakuten Kobo, Inc. System and method for a graphical user interface having recommendations
US9292257B2 (en) * 2012-09-12 2016-03-22 Sap Se Accurate range calculation for vehicles, computed outside of the vehicle
US10200256B2 (en) * 2012-09-17 2019-02-05 Box, Inc. System and method of a manipulative handle in an interactive mobile user interface
DE102012216689B4 (en) * 2012-09-18 2017-05-04 Continental Automotive Gmbh Method for monitoring an Ethernet-based communication network in a motor vehicle
DE102012216827B3 (en) * 2012-09-19 2014-02-20 Continental Automotive Gmbh Method and device for vehicle communication
TW201412583A (en) * 2012-09-20 2014-04-01 Wistron Corp Vehicle control system with parameter setting function
WO2014052903A1 (en) * 2012-09-28 2014-04-03 Stubhub, Inc. Three-dimensional interactive seat map
US20140092249A1 (en) * 2012-09-28 2014-04-03 Ford Global Technologies, Llc Vehicle perimeter detection system
DE112013004924T9 (en) 2012-10-08 2015-07-23 Fisher-Rosemount Systems, Inc. Dynamically reusable classes
US9280859B2 (en) * 2012-10-08 2016-03-08 Toyota Motor Engineering & Manufacturing North America, Inc. Enhanced vehicle onboard diagnostic system and method
US9486070B2 (en) 2012-10-10 2016-11-08 Stirworks Inc. Height-adjustable support surface and system for encouraging human movement and promoting wellness
US20140108939A1 (en) * 2012-10-15 2014-04-17 Nokia Corporation Method and apparatus for managing online content collections using a single programming tool
JP6052865B2 (en) * 2012-10-29 2016-12-27 Alpine Electronics, Inc. In-vehicle display control device and in-vehicle display control method
US8949970B2 (en) * 2012-10-31 2015-02-03 Rockwell Automation Technologies, Inc. Automation system access control system and method
DE102012220228A1 (en) * 2012-11-07 2014-06-12 Bayerische Motoren Werke Aktiengesellschaft Method and monitoring system for monitoring the use of customer functions in a vehicle
US9031710B2 (en) 2012-11-07 2015-05-12 Cloudcar, Inc. Cloud-based vehicle information and control system
JP2014094615A (en) * 2012-11-08 2014-05-22 Honda Motor Co Ltd Vehicle display device
US8874653B2 (en) * 2012-11-12 2014-10-28 Maximilian A. Chang Vehicle security and customization
US8838321B1 (en) 2012-11-15 2014-09-16 Google Inc. Modifying a vehicle state based on the presence of a special-purpose vehicle
US8849557B1 (en) 2012-11-15 2014-09-30 Google Inc. Leveraging of behavior of vehicles to detect likely presence of an emergency vehicle
US9410613B2 (en) * 2012-11-27 2016-08-09 Continental Automotive Systems, Inc. On-screen gear selector for automatic transmission
JP6182855B2 (en) * 2012-12-04 2017-08-23 株式会社リコー Image processing system and information synchronization method
EP2741290A1 (en) * 2012-12-06 2014-06-11 Harman Becker Automotive Systems GmbH Vehicle multimedia system and vehicle
US9224289B2 (en) 2012-12-10 2015-12-29 Ford Global Technologies, Llc System and method of determining occupant location using connected devices
US20140163771A1 (en) * 2012-12-10 2014-06-12 Ford Global Technologies, Llc Occupant interaction with vehicle system using brought-in devices
JP6094798B2 (en) * 2012-12-12 2017-03-15 Nippon Seiki Co., Ltd. Vehicle information providing device
US8981942B2 (en) 2012-12-17 2015-03-17 State Farm Mutual Automobile Insurance Company System and method to monitor and reduce vehicle operator impairment
US8930269B2 (en) 2012-12-17 2015-01-06 State Farm Mutual Automobile Insurance Company System and method to adjust insurance rate based on real-time data about potential vehicle operator impairment
JP6113491B2 (en) * 2012-12-18 2017-04-12 Mitsubishi Heavy Industries Mechatronics Systems, Ltd. OBE, communication method and program
DE102012224149A1 (en) * 2012-12-21 2014-06-26 Continental Automotive Gmbh System for parking time management
US10081370B2 (en) * 2012-12-21 2018-09-25 Harman Becker Automotive Systems Gmbh System for a vehicle
US20140184428A1 (en) * 2012-12-27 2014-07-03 Jennifer A. Healey Interactive management of a parked vehicle
US9008641B2 (en) * 2012-12-27 2015-04-14 Intel Corporation Detecting a user-to-wireless device association in a vehicle
WO2014107513A2 (en) * 2013-01-04 2014-07-10 Johnson Controls Technology Company Context-based vehicle user interface reconfiguration
US9665997B2 (en) 2013-01-08 2017-05-30 Gordon*Howard Associates, Inc. Method and system for providing feedback based on driving behavior
JP5942864B2 (en) * 2013-01-18 2016-06-29 Sony Corporation Terminal device, content transmission method, content transmission program, and content reproduction system
USD734276S1 (en) * 2013-01-22 2015-07-14 Continental Automotive Gmbh Control device for a vehicle having a display
US20150356469A1 (en) * 2013-01-23 2015-12-10 Ying-Tsun Su System and method for management parking spaces
US9332028B2 (en) 2013-01-25 2016-05-03 REMTCS Inc. System, method, and apparatus for providing network security
US9525700B1 (en) * 2013-01-25 2016-12-20 REMTCS Inc. System and method for detecting malicious activity and harmful hardware/software modifications to a vehicle
JP6322364B2 (en) * 2013-01-29 2018-05-09 Yazaki Corporation Electronic control unit
JP5835242B2 (en) * 2013-02-01 2015-12-24 Denso Corporation Vehicle safety control system
US9092309B2 (en) * 2013-02-14 2015-07-28 Ford Global Technologies, Llc Method and system for selecting driver preferences
US9169115B2 (en) * 2013-02-18 2015-10-27 Ford Global Technologies, Llc Method and device for reducing the likelihood of theft at a gas station or a charging station for motor vehicles
US9688246B2 (en) * 2013-02-25 2017-06-27 Ford Global Technologies, Llc Method and apparatus for in-vehicle alarm activation and response handling
JP5838983B2 (en) * 2013-02-25 2016-01-06 Toyota Motor Corporation Information processing apparatus and information processing method
US9247373B2 (en) * 2013-03-01 2016-01-26 Denso International America, Inc. Method of determining user intent to use services based on proximity
US9633485B2 (en) * 2013-03-04 2017-04-25 Pedro David GONZÁLEZ VERA System and method for the access to information contained in motor vehicles
US9020482B2 (en) * 2013-03-06 2015-04-28 Qualcomm Incorporated Preventing driver distraction
US9779458B2 (en) 2013-03-10 2017-10-03 State Farm Mutual Automobile Insurance Company Systems and methods for generating vehicle insurance policy data based on empirical vehicle related data
US20140256305A1 (en) * 2013-03-11 2014-09-11 Roman Ginis Methods and systems for mode scheduling in mobile devices
US9736669B2 (en) * 2013-03-11 2017-08-15 General Motors Llc Interface device for providing vehicle services using a vehicle and a mobile communications device
US9035756B2 (en) 2013-03-14 2015-05-19 Gordon*Howard Associates, Inc. Methods and systems related to remote tamper detection
US9378480B2 (en) * 2013-03-14 2016-06-28 Gordon*Howard Associates, Inc. Methods and systems related to asset identification triggered geofencing
US9840229B2 (en) 2013-03-14 2017-12-12 Gordon*Howard Associates, Inc. Methods and systems related to a remote tamper detection
US8928471B2 (en) 2013-03-14 2015-01-06 Gordon*Howard Associates, Inc. Methods and systems related to remote tamper detection
US9751534B2 (en) 2013-03-15 2017-09-05 Honda Motor Co., Ltd. System and method for responding to driver state
US10445758B1 (en) 2013-03-15 2019-10-15 Allstate Insurance Company Providing rewards based on driving behaviors detected by a mobile computing device
US9469261B1 (en) * 2013-03-15 2016-10-18 Intermotive, Inc. CAN network spoofing
US8738523B1 (en) * 2013-03-15 2014-05-27 State Farm Mutual Automobile Insurance Company Systems and methods to identify and profile a vehicle operator
US8876535B2 (en) 2013-03-15 2014-11-04 State Farm Mutual Automobile Insurance Company Real-time driver observation and scoring for driver's education
US9348555B2 (en) * 2013-03-15 2016-05-24 Volkswagen Ag In-vehicle access of mobile device functions
US9842091B2 (en) * 2013-03-15 2017-12-12 Google Llc Switching to and from native web applications
US9224293B2 (en) * 2013-03-16 2015-12-29 Donald Warren Taylor Apparatus and system for monitoring and managing traffic flow
KR101410664B1 (en) * 2013-03-19 2014-06-24 Hyundai Motor Company Smart Touch Type Electronic Auto Shift Lever
JP6182923B2 (en) * 2013-03-21 2017-08-23 Denso Corporation Radio communication device, vehicle unit, and display device
WO2014147828A1 (en) * 2013-03-22 2014-09-25 Toyota Motor Corporation Driving assistance device, driving assistance method, information-providing device, information-providing method, navigation device and navigation method
US9547417B2 (en) * 2013-03-29 2017-01-17 Deere & Company Retracting shortcut bars, status shortcuts and edit run page sets
US20140300494A1 (en) * 2013-04-03 2014-10-09 Ford Global Technologies, Llc Location based feature usage prediction for contextual hmi
KR102009745B1 (en) * 2013-04-05 2019-08-13 Samsung Electronics Co., Ltd. Apparatus and method for communicating device to device in a wireless network
DE102013207113A1 (en) * 2013-04-19 2014-10-23 Continental Teves Ag & Co. Ohg A method and system for avoiding a launching of a follower vehicle to an immediate preceding vehicle and use of the system
US20140318989A1 (en) * 2013-04-24 2014-10-30 Rajiv Mohan Dhas System and method for monitoring and oxygenating an automobile cabin
KR20140128832A (en) * 2013-04-29 2014-11-06 Thinkware Corporation Image-processing Apparatus for Car and Method of Sharing Data Using The Same
KR20140131085A (en) * 2013-05-03 2014-11-12 Samsung Electronics Co., Ltd. Method for controlling status information and an electronic device thereof
DE102013208098A1 (en) * 2013-05-03 2014-11-06 Eberspächer Exhaust Technology GmbH & Co. KG Road vehicle
JP6011452B2 (en) * 2013-05-14 2016-10-19 Denso Corporation Display control apparatus and program
US9225799B1 (en) * 2013-05-21 2015-12-29 Trend Micro Incorporated Client-side rendering for virtual mobile infrastructure
CN104182431A (en) * 2013-05-28 2014-12-03 Inventec Technology Co., Ltd. Media searching method
US9147353B1 (en) 2013-05-29 2015-09-29 Allstate Insurance Company Driving analysis using vehicle-to-vehicle communication
CN105283356B (en) * 2013-06-12 2017-12-12 Honda Motor Co., Ltd. Application control method and information terminal
US20140372551A1 (en) 2013-06-13 2014-12-18 Rod G. Fleck Providing storage and security services with a smart personal gateway device
JP6136627B2 (en) * 2013-06-24 2017-05-31 Mazda Motor Corporation Vehicle Information Display Device
US9013333B2 (en) 2013-06-24 2015-04-21 Gordon*Howard Associates, Inc. Methods and systems related to time triggered geofencing
KR20160028453A (en) 2013-07-02 2016-03-11 Semiconductor Energy Laboratory Co., Ltd. Data processing device
IN2013MU02326A (en) * 2013-07-10 2015-06-19 Tata Consultancy Services Limited Driving behavior analysis system and method
US9523984B1 (en) * 2013-07-12 2016-12-20 Google Inc. Methods and systems for determining instructions for pulling over an autonomous vehicle
DE102013214383A1 (en) * 2013-07-23 2015-01-29 Robert Bosch Gmbh Method and device for providing a collision signal with regard to a vehicle collision, method and device for managing collision data regarding vehicle collisions, and method and device for controlling at least one collision protection device of a vehicle
DE102013214554A1 (en) * 2013-07-25 2015-01-29 Bayerische Motoren Werke Aktiengesellschaft Method for heating the interior of a vehicle
US9554328B2 (en) 2013-07-26 2017-01-24 Intel Corporation Vehicle-based small cell base stations
GB2516698B (en) * 2013-07-30 2017-03-22 Jaguar Land Rover Ltd Vehicle distributed network providing feedback to a user
CN106458103A (en) 2013-07-31 2017-02-22 感知驾驶员技术有限责任公司 Vehicle use portable heads-up display
DE102013012777A1 (en) * 2013-07-31 2015-02-05 Valeo Schalter Und Sensoren Gmbh Method for using a communication terminal in a motor vehicle when activated autopilot and motor vehicle
US9230442B2 (en) 2013-07-31 2016-01-05 Elwha Llc Systems and methods for adaptive vehicle sensing systems
US9776632B2 (en) 2013-07-31 2017-10-03 Elwha Llc Systems and methods for adaptive vehicle sensing systems
US9269268B2 (en) 2013-07-31 2016-02-23 Elwha Llc Systems and methods for adaptive vehicle sensing systems
KR101573766B1 (en) * 2013-08-05 2015-12-02 Hyundai Mobis Co., Ltd. Simplification device of connecting wireless communication and sharing data, and the method thereof
DE202013007158U1 (en) * 2013-08-12 2014-11-17 GM Global Technology Operations LLC (under the laws of the State of Delaware) Panoramic projection device with housing for a motor vehicle
DE202013007159U1 (en) * 2013-08-12 2014-11-17 GM Global Technology Operations LLC (under the laws of the State of Delaware) Panoramic projection device for a motor vehicle
US9135756B2 (en) * 2013-08-14 2015-09-15 Hti Ip, L.L.C. Providing communications between a vehicle control device and a user device via a head unit
US8966654B1 (en) * 2013-08-15 2015-02-24 TrueLite Trace, Inc. Privacy control-adjustable vehicle monitoring system with a wild card mode
US9248794B2 (en) * 2013-08-26 2016-02-02 Intel Corporation Configuring user customizable operational features of a vehicle
US20150066345A1 (en) * 2013-08-28 2015-03-05 Elwha Llc Vehicle collision management system responsive to user-selected preferences
US20150063329A1 (en) * 2013-08-28 2015-03-05 General Motors Llc Selective vehicle wi-fi access
TW201509151A (en) * 2013-08-30 2015-03-01 IBM A method and computer program product for providing a remote diagnosis with a secure connection for an appliance and an appliance performing the method
CA2922228A1 (en) * 2013-08-30 2015-03-05 Honda Motor Co., Ltd. In-vehicle unit, communication system, communication method, and program
US9807196B2 (en) 2013-09-17 2017-10-31 Toyota Motor Sales, U.S.A. Automated social network interaction system for a vehicle
US9387824B2 (en) 2013-09-17 2016-07-12 Toyota Motor Engineering & Manufacturing North America, Inc. Interactive vehicle window display system with user identification and image recording
US9760698B2 (en) 2013-09-17 2017-09-12 Toyota Motor Sales, U.S.A., Inc. Integrated wearable article for interactive vehicle control system
US9902266B2 (en) 2013-09-17 2018-02-27 Toyota Motor Engineering & Manufacturing North America, Inc. Interactive vehicle window display system with personal convenience reminders
US9400564B2 (en) 2013-09-17 2016-07-26 Toyota Motor Engineering & Manufacturing North America, Inc. Interactive vehicle window display system with a safe driving reminder system
US9340155B2 (en) * 2013-09-17 2016-05-17 Toyota Motor Sales, U.S.A., Inc. Interactive vehicle window display system with user identification
WO2016032990A1 (en) * 2014-08-26 2016-03-03 Toyota Motor Sales, U.S.A., Inc. Integrated wearable article for interactive vehicle control system
US9958289B2 (en) * 2013-09-26 2018-05-01 Google Llc Controlling navigation software on a portable device from the head unit of a vehicle
US10054463B2 (en) 2013-09-26 2018-08-21 Google Llc Systems and methods for providing navigation data to a vehicle
US9109917B2 (en) 2013-09-26 2015-08-18 Google Inc. Systems and methods for providing input suggestions via the head unit of a vehicle
US9807349B1 (en) * 2013-09-27 2017-10-31 Isaac S. Daniel Covert recording alarm apparatus for vehicles
US10437376B2 (en) * 2013-09-27 2019-10-08 Volkswagen Aktiengesellschaft User interface and method for assisting a user in the operation of an operator control unit
US20150094929A1 (en) * 2013-09-30 2015-04-02 Ford Global Technologies, Llc Vehicle diagnostic and prognostic systems and methods
US9602624B2 (en) 2013-09-30 2017-03-21 AT&T Intellectual Property I, L.L.P. Facilitating content management based on profiles of members in an environment
US9734694B2 (en) 2013-10-04 2017-08-15 Sol Mingso Li Systems and methods for programming, controlling and monitoring wireless networks
US9821713B2 (en) * 2013-10-07 2017-11-21 Jet Optoelectronics Co., Ltd. In-vehicle lighting device and operating method
US10102755B1 (en) 2013-10-07 2018-10-16 Satcom Direct, Inc. Method and system for aircraft positioning—automated tracking using onboard global voice and high-speed data
US9008868B1 (en) * 2013-10-09 2015-04-14 Satcom Direct, Inc. Cloud based management of aircraft avionics
US9553658B1 (en) 2013-10-09 2017-01-24 Satcom Direct, Inc. Router for aircraft communications with simultaneous satellite connections
US9565618B1 (en) 2013-10-09 2017-02-07 Satcom Direct, Inc. Air to ground management of multiple communication paths
US9577742B1 (en) 2013-10-10 2017-02-21 Satcom Direct, Inc. Data compression and acceleration for air to ground communications
DE102013220523A1 (en) * 2013-10-11 2015-04-16 Continental Automotive Gmbh Method for updating an operating function of a sensor
WO2015057979A1 (en) * 2013-10-16 2015-04-23 REMTCS Inc. System and method for detecting malicious activity and harmful hardware/software modifications to a vehicle
US10075460B2 (en) 2013-10-16 2018-09-11 REMTCS Inc. Power grid universal detection and countermeasure overlay intelligence ultra-low latency hypervisor
US9401923B2 (en) * 2013-10-23 2016-07-26 Christopher Valasek Electronic system for detecting and preventing compromise of vehicle electrical and control systems
US20150116079A1 (en) * 2013-10-24 2015-04-30 GM Global Technology Operations LLC Enhanced vehicle key fob
US20150116200A1 (en) * 2013-10-25 2015-04-30 Honda Motor Co., Ltd. System and method for gestural control of vehicle systems
US20150118959A1 (en) * 2013-10-28 2015-04-30 Nicolas Jean Petit Platform framework for wireless media device simulation and design
US9051890B2 (en) 2013-10-28 2015-06-09 Ford Global Technologies, Llc Method for estimating charge air cooler condensation storage with an intake oxygen sensor
US9177429B2 (en) * 2013-10-29 2015-11-03 Telefonaktiebolaget L M Ericsson (Publ) Method and apparatus for assigning profile data to one or more vehicle sub-systems of a vehicle
US9783137B2 (en) * 2013-10-30 2017-10-10 Powervoice Co., Ltd. Sound QR system for vehicular services
WO2015064745A1 (en) * 2013-10-31 2015-05-07 Honda Motor Co., Ltd. Information notification device, information notification system, information notification method, and information notification program
US9817521B2 (en) 2013-11-02 2017-11-14 At&T Intellectual Property I, L.P. Gesture detection
US20150127253A1 (en) * 2013-11-06 2015-05-07 Ney Jose Torres Hurtado Mileage Tracker
DE102013222473A1 (en) * 2013-11-06 2015-05-07 Conti Temic Microelectronic Gmbh Method for transmission slip control and clutch transmission
DE102013222586A1 (en) * 2013-11-07 2015-05-07 Robert Bosch Gmbh Method for avoiding a collision of a motor vehicle with a wrong-way vehicle, and control and detection device for a motor vehicle for avoiding such a collision
US10025431B2 (en) 2013-11-13 2018-07-17 At&T Intellectual Property I, L.P. Gesture detection
US10391403B2 (en) * 2013-11-14 2019-08-27 Sony Interactive Entertainment LLC Game extensions in a gaming environment
CN104645632B (en) * 2013-11-15 2016-11-23 付志勇 Line-of-sight interaction method and device for a visual wireless remote-controlled toy vehicle
CN103684963B (en) * 2013-11-18 2017-05-24 Chongqing University of Posts and Telecommunications Framework system and implementation method of middleware for the Internet of Vehicles
US9401056B2 (en) 2013-11-19 2016-07-26 At&T Intellectual Property I, L.P. Vehicular simulation
US20150143451A1 (en) * 2013-11-19 2015-05-21 Cisco Technology Inc. Safety in Downloadable Applications for Onboard Computers
KR101710317B1 (en) 2013-11-22 2017-02-24 Qualcomm Incorporated System and method for configuring an interior of a vehicle based on preferences provided with multiple mobile computing devices within the vehicle
US20150154247A1 (en) * 2013-12-03 2015-06-04 Caterpillar Inc. System and method for surface data management at worksite
US9120365B2 (en) * 2013-12-09 2015-09-01 Ford Global Technologies, Llc Automatic temperature override pattern recognition system
US8738723B1 (en) * 2013-12-10 2014-05-27 Google Inc. Predictive forwarding of notification data
KR101569020B1 (en) * 2013-12-12 2015-11-13 LG Electronics Inc. Holder device for portable terminal
KR101583885B1 (en) * 2013-12-18 2016-01-08 Hyundai Motor Company Heat management system and method for engine
US9613459B2 (en) * 2013-12-19 2017-04-04 Honda Motor Co., Ltd. System and method for in-vehicle interaction
US9210549B2 (en) 2013-12-19 2015-12-08 International Business Machines Corporation Tracking a mobile unit in a housing facility for mobile units
GB2521415A (en) 2013-12-19 2015-06-24 Here Global Bv An apparatus, method and computer program for controlling a vehicle
AU2014277738A1 (en) 2013-12-19 2015-07-09 The Raymond Corporation Integrated touch screen display with multi-mode functionality
US9238467B1 (en) * 2013-12-20 2016-01-19 Lytx, Inc. Automatic engagement of a driver assistance system
EP3083337A4 (en) 2013-12-20 2018-02-21 SenseDriver Technologies, LLC Method and apparatus for in-vehicular communications
US10109119B2 (en) 2013-12-23 2018-10-23 Robert Bosch Gmbh System and method for automotive diagnostic tool data collection and analysis
KR101548953B1 (en) * 2013-12-24 2015-09-01 Hyundai Motor Company Method and apparatus for updating information for vehicle
US9915091B2 (en) * 2013-12-27 2018-03-13 Lenovo (Singapore) Pte. Ltd. Low power environment management for an automobile
US10134091B2 (en) 2013-12-31 2018-11-20 Hartford Fire Insurance Company System and method for determining driver signatures
US10023114B2 (en) * 2013-12-31 2018-07-17 Hartford Fire Insurance Company Electronics for remotely monitoring and controlling a vehicle
US10065562B2 (en) 2013-12-31 2018-09-04 International Business Machines Corporation Vehicle collision avoidance
US10126823B2 (en) * 2014-01-03 2018-11-13 Harman International Industries, Incorporated In-vehicle gesture interactive spatial audio system
JP6523298B2 (en) * 2014-01-06 2019-05-29 Johnson Controls Technology Company Computer system and vehicle interface system
US20160328272A1 (en) * 2014-01-06 2016-11-10 Johnson Controls Technology Company Vehicle with multiple user interface operating domains
US9098957B1 (en) * 2014-01-16 2015-08-04 GM Global Technology Operations LLC Remote control of vehicular wireless router settings
DE102014200993A1 (en) 2014-01-21 2015-07-23 Volkswagen Aktiengesellschaft User interface and method for adapting a view on a display unit
KR20150087985A (en) * 2014-01-23 2015-07-31 Electronics and Telecommunications Research Institute Apparatus and method for providing safe driving information
US9390451B1 (en) 2014-01-24 2016-07-12 Allstate Insurance Company Insurance system related to a vehicle-to-vehicle communication system
US10096067B1 (en) 2014-01-24 2018-10-09 Allstate Insurance Company Reward system related to a vehicle-to-vehicle communication system
US9355423B1 (en) * 2014-01-24 2016-05-31 Allstate Insurance Company Reward system related to a vehicle-to-vehicle communication system
US9467179B2 (en) * 2014-01-27 2016-10-11 General Motors LLC Vehicle head unit priority
DE102014001182A1 (en) 2014-01-30 2015-07-30 Audi Ag System for operating an instrument cluster of a vehicle and a mobile electronic device that can be releasably held by a vehicle-mounted bracket
GB201401873D0 (en) * 2014-02-04 2014-03-19 Sudak Menachem M Monitoring system and method
US10038952B2 (en) 2014-02-04 2018-07-31 Steelcase Inc. Sound management systems for improving workplace efficiency
WO2015118325A1 (en) * 2014-02-04 2015-08-13 Sudak Menachem Monitoring system and method
US20150228127A1 (en) * 2014-02-12 2015-08-13 Uwe Ross OBD Interface Device Having Processor Running Diagnostics Web Server to Provide Platform Independent Diagnostics
GB201402627D0 (en) * 2014-02-14 2014-04-02 New Dawn Innovations Ltd Digital radio receiver system
US9940676B1 (en) 2014-02-19 2018-04-10 Allstate Insurance Company Insurance system for analysis of autonomous driving
US9442527B2 (en) 2014-02-20 2016-09-13 Audi Ag Docking and undocking mechanism for remote devices
WO2015130970A1 (en) * 2014-02-26 2015-09-03 Analog Devices, Inc. Systems for providing intelligent vehicular systems and services
JP5967116B2 (en) * 2014-02-27 2016-08-10 Denso Corporation In-vehicle system, information processing apparatus, and program
US10049508B2 (en) 2014-02-27 2018-08-14 Satcom Direct, Inc. Automated flight operations system
US9958178B2 (en) * 2014-03-06 2018-05-01 Dell Products, Lp System and method for providing a server rack management controller
US9635115B2 (en) 2014-03-07 2017-04-25 International Business Machines Corporation Unused location discriminator
US9734685B2 (en) 2014-03-07 2017-08-15 State Farm Mutual Automobile Insurance Company Vehicle operator emotion management system and method
US9996878B1 (en) 2014-03-11 2018-06-12 Liberty Mutual Insurance Company In-vehicle infotainment insurance applications
KR101550055B1 (en) * 2014-03-18 2015-09-04 Obigo Inc. Method, apparatus and computer-readable recording media for providing an application connector using a template-based UI
US20150266356A1 (en) * 2014-03-19 2015-09-24 Ford Global Technologies, Llc Method and system to enable commands on a vehicle computer based on user created rules
US10140118B2 (en) * 2014-03-19 2018-11-27 Huawei Device (Dongguan) Co., Ltd. Application data synchronization method and apparatus
DE102014205653A1 (en) * 2014-03-26 2015-10-01 Continental Automotive Gmbh Control system
FR3019414B1 (en) * 2014-03-31 2017-09-08 Sagem Defense Securite Method for in-flight transmission of black-box-type data
US10282797B2 (en) * 2014-04-01 2019-05-07 Amgine Technologies (Us), Inc. Inference model for traveler classification
US9342797B2 (en) 2014-04-03 2016-05-17 Honda Motor Co., Ltd. Systems and methods for the detection of implicit gestures
JP2015196495A (en) * 2014-04-03 2015-11-09 Denso Corporation Input device for vehicle
US10466657B2 (en) 2014-04-03 2019-11-05 Honda Motor Co., Ltd. Systems and methods for global adaptation of an implicit gesture control system
US10409382B2 (en) * 2014-04-03 2019-09-10 Honda Motor Co., Ltd. Smart tutorial for gesture control system
EP3129848A4 (en) 2014-04-09 2017-04-19 Microsoft Technology Licensing, LLC Hinged cover for computing device
WO2015154276A1 (en) 2014-04-10 2015-10-15 Microsoft Technology Licensing, Llc Slider cover for computing device
US10182118B2 (en) 2014-04-12 2019-01-15 Gregor Z. Hanuschak Method and apparatus for interacting with a personal computing device such as a smart phone using portable and self-contained hardware that is adapted for use in a motor vehicle
US9135803B1 (en) 2014-04-17 2015-09-15 State Farm Mutual Automobile Insurance Company Advanced vehicle operator intelligence system
DE102014207422A1 (en) * 2014-04-17 2015-10-22 Robert Bosch Gmbh Bus interface unit and method of operation therefor
US20150300828A1 (en) * 2014-04-17 2015-10-22 Ford Global Technologies, Llc Cooperative learning method for road infrastructure detection and characterization
US9226099B2 (en) 2014-04-18 2015-12-29 Lenovo Enterprise Solutions (Singapore) Pte. Ltd. Communicating with an owner of an object without the owner's contact information
US20150307048A1 (en) * 2014-04-23 2015-10-29 Creative Innovation Services, LLC Automobile alert information system, methods, and apparatus
US9712339B2 (en) * 2014-04-24 2017-07-18 Infineon Technologies Ag Bus architecture and access method for plastic waveguide
US10282787B1 (en) 2014-04-25 2019-05-07 State Farm Mutual Automobile Insurance Company Systems and methods for determining cause of loss to a property
EP3142288B1 (en) * 2014-05-08 2018-12-26 Panasonic Intellectual Property Corporation of America In-car network system, electronic control unit and update processing method
US9648399B2 (en) 2014-05-08 2017-05-09 Infineon Technologies Ag System having plastic waveguides
CN103995511B (en) * 2014-05-16 2016-10-05 航天新长征电动汽车技术有限公司 Intelligent bus CAN body control system
US10185999B1 (en) 2014-05-20 2019-01-22 State Farm Mutual Automobile Insurance Company Autonomous feature use monitoring and telematics
US10026130B1 (en) 2014-05-20 2018-07-17 State Farm Mutual Automobile Insurance Company Autonomous vehicle collision risk assessment
US10373259B1 (en) 2014-05-20 2019-08-06 State Farm Mutual Automobile Insurance Company Fully autonomous vehicle insurance pricing
US10185998B1 (en) 2014-05-20 2019-01-22 State Farm Mutual Automobile Insurance Company Accident fault determination for autonomous vehicles
US9972054B1 (en) 2014-05-20 2018-05-15 State Farm Mutual Automobile Insurance Company Accident fault determination for autonomous vehicles
US10319039B1 (en) 2014-05-20 2019-06-11 State Farm Mutual Automobile Insurance Company Accident fault determination for autonomous vehicles
US9863336B2 (en) 2014-05-23 2018-01-09 Ford Global Technologies, Llc System and method for estimating ambient humidity
US9286738B2 (en) * 2014-05-23 2016-03-15 Immortal Data, Inc. Distributed data storage and recovery
JP6476595B2 (en) * 2014-05-27 2019-03-06 Denso Corporation Vehicle heating system
US20150348460A1 (en) * 2014-05-29 2015-12-03 Claude Lano Cox Method and system for monitor brightness control using an ambient light sensor on a mobile device
USD823858S1 (en) * 2014-06-02 2018-07-24 Mitsubishi Electric Corporation Information display for a vehicle with a graphical user interface
US20150346700A1 (en) * 2014-06-02 2015-12-03 Rovio Entertainment Ltd Control of a computer program
US20150356879A1 (en) * 2014-06-09 2015-12-10 Timothy Best GPS Based Instructional Driving Simulation Device
KR20150142298A (en) * 2014-06-11 2015-12-22 Hyundai Motor Company Vehicle, vehicle control method, and vehicle driving sound control apparatus
US10366370B1 (en) * 2014-06-12 2019-07-30 State Farm Mutual Automobile Insurance Company Systems and methods for managing and communicating vehicle notifications for various circumstances
US9569105B2 (en) * 2014-06-18 2017-02-14 Mediatek Inc. Method for managing virtual control interface of an electronic device, and associated apparatus and associated computer program product
DE102014212006A1 (en) * 2014-06-23 2015-12-24 Zf Friedrichshafen Ag Vehicle transmission system
JP6079705B2 (en) * 2014-06-23 2017-02-15 トヨタ自動車株式会社 Emergency call device for vehicles
EP3165993A4 (en) * 2014-06-30 2018-01-10 Clarion Co., Ltd. Non-contact operation detection device
US20150379408A1 (en) * 2014-06-30 2015-12-31 Microsoft Corporation Using Sensor Information for Inferring and Forecasting Large-Scale Phenomena
EP2963619A1 (en)