US20210097851A1 - Systems, Methods, And Devices For Remotely Controlling Functionalities Of Vehicles - Google Patents
- Publication number
- US20210097851A1 (application US16/589,771)
- Authority
- US
- United States
- Prior art keywords
- control device
- remote control
- vehicle
- functionality
- computer
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W4/00—Services specially adapted for wireless communication networks; Facilities therefor
- H04W4/30—Services specially adapted for particular environments, situations or purposes
- H04W4/38—Services specially adapted for particular environments, situations or purposes for collecting sensor information
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
- B60K35/10—Input arrangements, i.e. from user to vehicle, associated with vehicle functions or specially adapted therefor
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
- B60K35/80—Arrangements for controlling instruments
-
- B60K37/06—
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60N—SEATS SPECIALLY ADAPTED FOR VEHICLES; VEHICLE PASSENGER ACCOMMODATION NOT OTHERWISE PROVIDED FOR
- B60N2/00—Seats specially adapted for vehicles; Arrangement or mounting of seats in vehicles
- B60N2/02—Seats specially adapted for vehicles; Arrangement or mounting of seats in vehicles the seat or part thereof being movable, e.g. adjustable
- B60N2/0224—Non-manual adjustments, e.g. with electrical operation
- B60N2/0226—User interfaces specially adapted for seat adjustment
- B60N2/0229—User interfaces specially adapted for seat adjustment characterised by the shape, e.g. switches having cushion or backrest shape
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60N—SEATS SPECIALLY ADAPTED FOR VEHICLES; VEHICLE PASSENGER ACCOMMODATION NOT OTHERWISE PROVIDED FOR
- B60N2/00—Seats specially adapted for vehicles; Arrangement or mounting of seats in vehicles
- B60N2/02—Seats specially adapted for vehicles; Arrangement or mounting of seats in vehicles the seat or part thereof being movable, e.g. adjustable
- B60N2/0224—Non-manual adjustments, e.g. with electrical operation
- B60N2/0226—User interfaces specially adapted for seat adjustment
- B60N2/0239—User interfaces specially adapted for seat adjustment using movement detection or gesture recognition
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R16/00—Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for
- B60R16/02—Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for electric constitutive elements
- B60R16/023—Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for electric constitutive elements for transmission of signals between vehicle parts or subsystems
- B60R16/0231—Circuits relating to the driving or the functioning of the vehicle
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/08—Interaction between the driver and the control system
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0338—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of limited linear or angular displacement of an operating part of the device from a neutral position, e.g. isotonic or isometric joysticks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/038—Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/0482—Interaction with lists of selectable items, e.g. menus
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04842—Selection of displayed objects or displayed text elements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
-
- G—PHYSICS
- G08—SIGNALLING
- G08C—TRANSMISSION SYSTEMS FOR MEASURED VALUES, CONTROL OR SIMILAR SIGNALS
- G08C17/00—Arrangements for transmitting signals characterised by the use of a wireless electrical link
- G08C17/02—Arrangements for transmitting signals characterised by the use of a wireless electrical link using a radio link
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W4/00—Services specially adapted for wireless communication networks; Facilities therefor
- H04W4/02—Services making use of location information
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W4/00—Services specially adapted for wireless communication networks; Facilities therefor
- H04W4/30—Services specially adapted for particular environments, situations or purposes
- H04W4/40—Services specially adapted for particular environments, situations or purposes for vehicles, e.g. vehicle-to-pedestrians [V2P]
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W4/00—Services specially adapted for wireless communication networks; Facilities therefor
- H04W4/30—Services specially adapted for particular environments, situations or purposes
- H04W4/40—Services specially adapted for particular environments, situations or purposes for vehicles, e.g. vehicle-to-pedestrians [V2P]
- H04W4/48—Services specially adapted for particular environments, situations or purposes for vehicles, e.g. vehicle-to-pedestrians [V2P] for in-vehicle communication
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60H—ARRANGEMENTS OF HEATING, COOLING, VENTILATING OR OTHER AIR-TREATING DEVICES SPECIALLY ADAPTED FOR PASSENGER OR GOODS SPACES OF VEHICLES
- B60H1/00—Heating, cooling or ventilating [HVAC] devices
- B60H1/00642—Control systems or circuits; Control members or indication devices for heating, cooling or ventilating devices
- B60H1/0065—Control members, e.g. levers or knobs
- B60H1/00657—Remote control devices
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K2360/00—Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
- B60K2360/122—Instrument input devices with reconfigurable control functions, e.g. reconfigurable menus
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K2360/00—Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
- B60K2360/143—Touch sensitive instrument input devices
- B60K2360/1438—Touch screens
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K2360/00—Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
- B60K2360/55—Remote control arrangements
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K2360/00—Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
- B60K2360/55—Remote control arrangements
- B60K2360/56—Remote control arrangements using mobile devices
- B60K2360/573—Mobile devices controlling vehicle functions
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60N—SEATS SPECIALLY ADAPTED FOR VEHICLES; VEHICLE PASSENGER ACCOMMODATION NOT OTHERWISE PROVIDED FOR
- B60N2220/00—Computerised treatment of data for controlling of seats
- B60N2220/20—Computerised treatment of data for controlling of seats using a deterministic algorithm
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R16/00—Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for
- B60R16/02—Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for electric constitutive elements
- B60R16/037—Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for electric constitutive elements for occupant comfort, e.g. for automatic adjustment of appliances according to personal settings, e.g. seats, mirrors, steering wheel
-
- E—FIXED CONSTRUCTIONS
- E05—LOCKS; KEYS; WINDOW OR DOOR FITTINGS; SAFES
- E05B—LOCKS; ACCESSORIES THEREFOR; HANDCUFFS
- E05B81/00—Power-actuated vehicle locks
-
- E—FIXED CONSTRUCTIONS
- E05—LOCKS; KEYS; WINDOW OR DOOR FITTINGS; SAFES
- E05F—DEVICES FOR MOVING WINGS INTO OPEN OR CLOSED POSITION; CHECKS FOR WINGS; WING FITTINGS NOT OTHERWISE PROVIDED FOR, CONCERNED WITH THE FUNCTIONING OF THE WING
- E05F15/00—Power-operated mechanisms for wings
- E05F15/70—Power-operated mechanisms for wings with automatic actuation
- E05F15/77—Power-operated mechanisms for wings with automatic actuation using wireless control
-
- E—FIXED CONSTRUCTIONS
- E05—LOCKS; KEYS; WINDOW OR DOOR FITTINGS; SAFES
- E05Y—INDEXING SCHEME ASSOCIATED WITH SUBCLASSES E05D AND E05F, RELATING TO CONSTRUCTION ELEMENTS, ELECTRIC CONTROL, POWER SUPPLY, POWER SIGNAL OR TRANSMISSION, USER INTERFACES, MOUNTING OR COUPLING, DETAILS, ACCESSORIES, AUXILIARY OPERATIONS NOT OTHERWISE PROVIDED FOR, APPLICATION THEREOF
- E05Y2400/00—Electronic control; Electrical power; Power supply; Power or signal transmission; User interfaces
- E05Y2400/80—User interfaces
- E05Y2400/85—User input means
-
- E—FIXED CONSTRUCTIONS
- E05—LOCKS; KEYS; WINDOW OR DOOR FITTINGS; SAFES
- E05Y—INDEXING SCHEME ASSOCIATED WITH SUBCLASSES E05D AND E05F, RELATING TO CONSTRUCTION ELEMENTS, ELECTRIC CONTROL, POWER SUPPLY, POWER SIGNAL OR TRANSMISSION, USER INTERFACES, MOUNTING OR COUPLING, DETAILS, ACCESSORIES, AUXILIARY OPERATIONS NOT OTHERWISE PROVIDED FOR, APPLICATION THEREOF
- E05Y2900/00—Application of doors, windows, wings or fittings thereof
- E05Y2900/50—Application of doors, windows, wings or fittings thereof for vehicles
- E05Y2900/53—Type of wing
- E05Y2900/55—Windows
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/04817—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
-
- G—PHYSICS
- G08—SIGNALLING
- G08C—TRANSMISSION SYSTEMS FOR MEASURED VALUES, CONTROL OR SIMILAR SIGNALS
- G08C2201/00—Transmission systems of control signals via wireless link
- G08C2201/30—User interface
- G08C2201/32—Remote control based on movements, attitude of remote control device
-
- G—PHYSICS
- G08—SIGNALLING
- G08C—TRANSMISSION SYSTEMS FOR MEASURED VALUES, CONTROL OR SIMILAR SIGNALS
- G08C2201/00—Transmission systems of control signals via wireless link
- G08C2201/30—User interface
- G08C2201/34—Context aware guidance
Definitions
- the present disclosure relates to remotely controlling functionalities of vehicles. Some embodiments are directed to systems, methods and devices for remotely controlling functionalities of vehicles.
- a primary interface for vehicle systems and functions includes one or more knobs and buttons, and one or more vehicle interior controls for controlling heating, ventilation, and air conditioning (HVAC), windows, lighting, music, audio head, climate head, shifters, etc.
- a vehicle interior control typically performs one function for a product lifetime.
- the vehicle interior controls are fixed at respective locations and are accessible only to a driver and/or a front passenger of the vehicle. Thus, the vehicle interior controls are not accessible to backseat passengers of the vehicle. Further, packaging restrictions make it costly and difficult to add new features to the vehicle interior controls once a cockpit design of the vehicle is complete.
- FIG. 1 depicts a schematic illustration of an example implementation for remotely controlling functionalities of a vehicle in accordance with one or more embodiments of the disclosure.
- FIG. 2 depicts a schematic illustration of an example implementation for remotely controlling functionalities of a vehicle based on positions relative to the vehicle in accordance with one or more embodiments of the disclosure.
- FIG. 3 depicts an example process flow for remotely controlling functionalities of a vehicle based on positions relative to a remote control device in accordance with one or more embodiments of the disclosure.
- FIG. 4 depicts an example process flow for remotely controlling functionalities of vehicles in accordance with one or more embodiments of the disclosure.
- FIG. 5 depicts an example process flow for remotely controlling functionalities of vehicles in accordance with one or more embodiments of the disclosure.
- FIG. 6 depicts an example process flow for remotely controlling functionalities of vehicles in accordance with one or more embodiments of the disclosure.
- FIG. 7 depicts an illustrative architecture in which techniques and structures for providing the systems and methods disclosed herein may be implemented.
- FIG. 8 depicts a schematic exploded illustration of an example remote control device in accordance with one or more embodiments of the disclosure.
- the systems, methods and devices disclosed herein are configured to remotely control one or more functionalities, from vehicle interior controls to third-party services, based on a position of a remote control device.
- a functionality describes a controllable and/or selectable function associated with a vehicle, e.g., driving information review and/or search, user comfort settings control, vehicle controls, entertainment, wayfinding, lock control, window switches control, seat adjustment control, one or more software applications, and the like.
- the remote control device may be passed between vehicle inhabitants such that the functionalities are accessible to different occupants, e.g., drivers, front passengers, and/or backseat passengers.
- the functionalities may be added, removed, and/or updated via a software update (e.g., an over-the-air update).
- the systems, methods and devices may apply Sensor Fusion, Machine Learning and/or Artificial Intelligence to learn and adapt to various occupants. For example, usage data, device sensor data such as position and biometrics, and vehicle data such as weather, vehicle inertial data, and location are used to understand the occupants' behaviors better and offer the occupants access to their desired content more effectively.
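The patent does not specify how usage data would be turned into adapted content; as one illustrative (hypothetical) approach in that spirit, a simple frequency ranking over an occupant's usage log could surface the most-used content first:

```python
from collections import Counter

# Hypothetical sketch, NOT the patent's method: rank content items by how
# often an occupant has opened them, most-used first, so the device can
# offer desired content more quickly. Input names are illustrative.
def rank_content(usage_log):
    """usage_log: iterable of content names; returns names most-used first."""
    return [name for name, _ in Counter(usage_log).most_common()]
```

In practice the disclosure contemplates richer signals (biometrics, vehicle inertial data, weather, location) feeding machine learning models; this sketch only illustrates the simplest usage-frequency case.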
- An occupant may interact with the remote control device to execute one or more functionalities associated with various positions relative to the vehicle. Different positions relative to the vehicle may be associated with different functionalities. For instance, a remote control device is placed at a first position to execute functionalities associated with the first position. A different remote control device placed at a second position may execute different functionalities associated with the second position. If the different remote control device is placed at the same position as the remote control device, the different remote control device may execute the same functionalities as the remote control device. Examples are further described with respect to FIG. 1 and FIG. 2 .
- the occupant may interact with the remote control device to execute one or more functionalities associated with various positions relative to the remote control device.
- the remote control device may control different functionalities based on different positions relative to the remote control device, even if a position relative to the vehicle of the remote control device is changed.
- a position change relative to the remote control device may be caused by tapping or swiping a screen, rotating the entire remote control device, tilting the remote control device, picking the remote device up to hold the remote control device, speaking to the remote control device, and the like. Examples are further described with respect to FIG. 3 .
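The interactions listed above (tapping, swiping, rotating, tilting, lifting, speaking) can be thought of as events dispatched to device actions. A minimal sketch, with event and action names that are assumptions rather than anything named in the patent:

```python
# Hypothetical dispatch table from device-relative interactions to actions.
# Both the event names and the action names are illustrative assumptions.
GESTURE_ACTIONS = {
    "tap": "select_item",
    "swipe": "next_page",
    "rotate": "adjust_value",
    "tilt": "scroll",
    "lift": "wake_display",
    "voice": "start_listening",
}

def handle_interaction(event: str) -> str:
    """Map a detected interaction to a device action; unknown events no-op."""
    return GESTURE_ACTIONS.get(event, "noop")
```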
- the remote control device may function inside and/or outside a vehicle.
- the remote control device includes a high resolution touch display, a microprocessor, and a series of internal sensors in a round, hand-sized form factor. Examples are further described with respect to FIG. 7 and FIG. 8 .
- FIG. 1 depicts a schematic illustration of an example implementation 100 for remotely controlling functionalities 108 A- 108 D of a vehicle 102 based on positions relative to the vehicle 102 in accordance with one or more embodiments of the disclosure.
- An occupant 105 may be a driver, a front passenger or a backseat passenger in the vehicle 102 .
- the vehicle 102 may include one or more remote control devices 110 .
- a remote control device 110 may remotely control one or more functionalities 108 A- 108 D of the vehicle 102 based on positions 104 A- 104 D where the occupant 105 places the remote control device 110 .
- a first position 104 A is associated with functionalities 108 A, such as charging, window controls, and interior lighting controls.
- a second position 104 B is associated with functionalities 108 B, such as one or more user preferred entertainment applications.
- a third position 104 C is associated with functionalities 108 C, such as one or more wayfinding applications.
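The association described above between placement positions and functionality sets amounts to a lookup table. A minimal sketch, using the reference numerals from the figures as illustrative keys (the patent does not prescribe a data structure):

```python
# Hypothetical mapping of dock positions to functionality sets, following
# the 104A-104C examples above. Keys and functionality names are
# illustrative; the audio position 104D is handled separately in the text.
POSITION_FUNCTIONALITIES = {
    "104A": ["charging", "window_controls", "interior_lighting"],
    "104B": ["entertainment_apps"],
    "104C": ["wayfinding_apps"],
}

def functionalities_for(position: str) -> list:
    """Return the functionality set bound to a position (empty if unknown)."""
    return POSITION_FUNCTIONALITIES.get(position, [])
```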
- the vehicle 102 may determine where the remote control device 110 is located.
- the vehicle 102 may establish a communication connection between the remote control device 110 and one or more functionalities (e.g., the functionalities 108 A, the functionalities 108 B, or the functionalities 108 C) associated with the position based on a machine readable medium associated with the position.
- the machine readable medium may include one or more radio frequency identification (RFID) tags, one or more media using a Bluetooth connectivity, one or more media using a Bluetooth low energy (BLE) connectivity, one or more wireless antennas, or some combination thereof.
- the vehicle 102 may cause a user interface on the remote control device 110 to present one or more settings associated with the one or more functionalities.
- the vehicle 102 may receive an audio input and generate an audio and/or visual output 108 D.
- the remote control device 110 of the vehicle 102 is placed at a fourth position 104 D to execute one or more functionalities associated with the audio input and audio/visual output 108 D.
- the remote control device 110 may be placed at a location (e.g., the first position 104 A, the second position 104 B, the third position 104 C or any other position) other than the fourth position to receive an audio command from the occupant 105 .
- the remote control device 110 may generate an audio and/or visual output 108 D, such as audio/visual feedback in response to the audio command, an audio/visual output indicating a status of a currently executed functionality, and/or an audio/visual output indicating a status of the remote control device 110 for receiving the audio command.
- the vehicle 102 may further include one or more memories including computer-executable instructions, and one or more computing processors configured to access the one or more memories and execute the computer-executable instructions. For instance, the vehicle 102 may execute the computer-executable instructions to determine that the remote control device 110 is located at the position, to establish a communication connection between the remote control device 110 and functionalities associated with the position, and to cause the user interface on the remote control device 110 to present one or more settings associated with the one or more functionalities.
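The three steps the instructions perform (determine the device's position, establish a communication connection to that position's functionalities, present matching settings on the device's user interface) can be sketched as a small controller. All class, method, and data names here are hypothetical stand-ins, not from the patent:

```python
# Hypothetical sketch of the detect -> connect -> present flow described
# above. "Connection" is modeled as an in-memory binding; a real system
# would use RFID/BLE/Wi-Fi transport as the disclosure suggests.
class VehicleController:
    def __init__(self, position_map):
        self.position_map = position_map  # position -> functionality list
        self.active = {}                  # device_id -> (position, functionalities)

    def on_device_detected(self, device_id, position):
        """Bind a detected device to its position's functionalities."""
        functionalities = self.position_map.get(position, [])
        self.active[device_id] = (position, functionalities)
        return self.present_settings(functionalities)

    def present_settings(self, functionalities):
        # Stand-in for rendering a settings UI on the remote control device.
        return ["settings:" + f for f in functionalities]
```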
- the one or more memories including computer-executable instructions, and one or more computing processors configured to access the one or more memories and execute the computer-executable instructions may be included in the remote control device 110 .
- the vehicle 102 may establish a communication connection between the remote control device 110 and functionalities associated with a position based on one or more machine readable media that are included at the position of the vehicle 102 .
- a first machine readable medium (e.g., an RFID tag, or the like) may be located at the first position 104 A of the vehicle 102 .
- the vehicle 102 may determine that the remote control device 110 is located at the first position 104 A based on the first machine readable medium.
- the vehicle 102 may transmit a signal indicative of the first position 104 A by the first machine readable medium located at the first position 104 A of the vehicle 102 .
- the vehicle 102 may cause a communication between the first machine readable medium located at the first position 104 A and the vehicle 102 based on the signal indicative of the first position 104 A.
- the vehicle 102 may establish a communication connection between the remote control device 110 and the functionalities 108 A based on the communication.
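The tag-based sequence above (a machine readable medium at a vehicle position signals that position, and the vehicle then binds the device to that position's functionalities) can be sketched as a simple lookup. This is a hypothetical illustration only: the tag identifiers, position names, functionality lists, and function names below are assumptions, not part of the disclosure.

```python
# Illustrative sketch: map an RFID tag read (machine readable medium at a
# vehicle position) to that position, then "connect" the device to the
# functionalities associated with the position. All IDs/names are assumed.

POSITION_TAGS = {
    "tag-104A": "first_position",
    "tag-104B": "second_position",
}

POSITION_FUNCTIONALITIES = {
    "first_position": ["charging", "window_controls", "interior_lighting"],
    "second_position": ["entertainment_apps"],
}

def resolve_position(tag_id):
    # The signal indicative of the position is modeled as the tag ID itself.
    return POSITION_TAGS.get(tag_id)

def establish_connection(tag_id):
    # Returns the functionalities the device is connected to, or [] when the
    # tag does not identify a known position.
    position = resolve_position(tag_id)
    if position is None:
        return []
    return POSITION_FUNCTIONALITIES[position]

print(establish_connection("tag-104A"))  # ['charging', 'window_controls', 'interior_lighting']
```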
- the vehicle 102 may establish a communication connection between the remote control device 110 and functionalities associated with a position based on one or more machine readable media that are included in the remote control device 110 .
- a second machine readable medium (e.g., an RFID tag, or the like) may be included in the remote control device 110 .
- the vehicle 102 may receive sensor data from one or more sensors of the remote control device 110 .
- the sensor data indicates that the remote control device 110 is placed at the first position 104 A as an example.
- a sensor may include an inertial measurement unit (IMU), a magnetometer (compass) module, a near field/far field motion sensor, an ambient light sensor, or some combination thereof.
- the vehicle 102 may cause a communication between a second machine readable medium of the remote control device 110 and the vehicle 102 .
- the vehicle 102 may establish a communication connection between the remote control device 110 and the functionalities 108 A.
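A coarse version of the sensor-data-based placement detection described above might look like the following. The sensor fields, thresholds, and position labels here are illustrative assumptions, not values from the disclosure.

```python
# Hypothetical sketch: infer the device's placement from on-device sensor
# data (an ambient light level plus a tilt angle from the IMU). Thresholds
# and position labels are assumed for illustration.

def infer_position(sensor_data):
    # A dock at the first position might shade the ambient light sensor
    # and hold the device flat (near-zero tilt); an open tray at the
    # second position leaves the sensor exposed but still flat.
    if sensor_data["ambient_light"] < 10 and abs(sensor_data["tilt_deg"]) < 5:
        return "first_position"
    if sensor_data["ambient_light"] >= 10 and abs(sensor_data["tilt_deg"]) < 5:
        return "second_position"
    return "unknown"

reading = {"ambient_light": 4, "tilt_deg": 1.5}
print(infer_position(reading))  # first_position
```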
- the occupant 105 may interact with the remote control device 110 to execute functionalities associated with positions relative to the vehicle 102 .
- the remote control device 110 may be moved from the first position 104 A to the second position 104 B to execute the functionalities 108 B.
- the vehicle 102 may determine a movement change of the remote control device 110 moving from the first position 104 A to the second position 104 B.
- the vehicle 102 may disestablish a first communication connection between the remote control device 110 and the functionalities 108 A based on the movement change.
- the vehicle 102 may establish a second communication connection between the remote control device 110 and the functionalities 108 B based on one or more machine readable media associated with the second position 104 B.
- the machine readable media may be included at the second position 104 B of the vehicle 102 , or included in the remote control device 110 .
- the vehicle 102 may cause the user interface on the remote control device 110 to present one or more settings associated with the functionalities 108 B.
- the movement change may be detected by one or more sensors of the vehicle 102 and/or by the sensors of the remote control device 110 .
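The hand-off described above (disestablish the first connection on a movement change, then establish the second) can be sketched as a small connection manager. The class and the position/functionality names are illustrative assumptions, not part of the disclosure.

```python
# Hypothetical sketch: on a detected movement change, tear down the
# connection bound to the old position and bring up the connection bound
# to the new one. Position labels and functionality names are assumed.

class ConnectionManager:
    def __init__(self, position_functionalities):
        self.position_functionalities = position_functionalities
        self.active_position = None
        self.active_functionalities = []

    def on_movement_change(self, new_position):
        # Disestablish the connection for the previous position, if any.
        self.active_functionalities = []
        # Establish the connection for the new position.
        self.active_position = new_position
        self.active_functionalities = self.position_functionalities.get(
            new_position, [])
        return self.active_functionalities

mgr = ConnectionManager({
    "104A": ["charging", "window_controls"],
    "104B": ["entertainment_apps"],
})
mgr.on_movement_change("104A")
print(mgr.on_movement_change("104B"))  # ['entertainment_apps']
```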
- the vehicle 102 may receive a user input based on the user interaction with the remote control device 110 to modify the one or more settings associated with the functionalities.
- the user interaction may include tapping or swiping a screen, rotating the entire remote control device, tilting the remote control device, picking the remote control device up to hold it, speaking to the remote control device, and the like.
- the remote control device 110 may include a touch display, one or more sensors to generate sensor data, the one or more memories including computer-executable instructions, and the one or more computing processors to access the one or more memories and execute the computer-executable instructions.
- FIG. 2 depicts a schematic illustration of an example implementation 200 for remotely controlling functionalities 220 A- 220 F of the vehicle 102 based on positions 210 A- 210 F relative to the vehicle 102 in accordance with one or more embodiments of the disclosure.
- Different functionalities 220 A- 220 F may be associated with respective positions 210 A- 210 F.
- a single remote control device 110 may be placed at a particular position of the positions 210 A- 210 F to execute functionalities associated with the particular position.
- the single remote control device 110 may be passed from one occupant (e.g., a driver, or a front passenger) to another occupant (e.g., a backseat passenger) to be placed at a position proximate the latter occupant to execute functionalities associated with the position.
- the vehicle 102 may include multiple remote control devices 110 .
- Each of the multiple remote control devices 110 may be placed at a respective position (e.g., positions 210 A- 210 F).
- Each of the multiple remote control devices 110 may execute respective functionalities 220 A- 220 F associated with the positions 210 A- 210 F.
- FIG. 3 depicts a schematic illustration of an example implementation 300 for remotely controlling functionalities 330 A- 330 C of the vehicle 102 based on positions 332 - 336 relative to a remote control device 310 in accordance with one or more embodiments of the disclosure.
- the remote control device 310 may be one embodiment of the remote control device 110 .
- Multiple functionalities (e.g., a lock control 320 A, a window control 320 B, and a seat adjustment control 320 C) of a typical vehicle may be executed by the single remote control device 310 based on various positions relative to the remote control device 310 , even if the position of the remote control device 310 relative to the vehicle 102 is changed.
- the occupant 105 may twist 332 the remote control device 310 to execute a lock control 330 A, slide 334 the remote control device 310 to execute a window control 330 B, and jog and tilt 336 the remote control device 310 to execute a seat adjustment control 330 C.
- the vehicle 102 may determine a movement change of the remote control device 310 moving from a twisting position 332 to a sliding position 334 relative to the remote control device 310 .
- the vehicle 102 may disestablish a first communication connection between the remote control device 310 and the lock control 330 A based on the movement change.
- the vehicle 102 may establish a second communication connection between the remote control device 310 and the window control 330 B.
- the vehicle 102 may cause a user interface on the remote control device 310 to present one or more settings associated with the window control 330 B.
- the movement change may be detected by one or more sensors of the remote control device 310 .
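The gesture-to-control mapping described above (twist, slide, jog and tilt) can be sketched as a dispatch table. The gesture names and control labels are illustrative assumptions, not identifiers from the disclosure.

```python
# Hypothetical sketch: map a gesture detected by the device's sensors to the
# control to connect (twist -> lock control, slide -> window control,
# jog-and-tilt -> seat adjustment). All names are assumed for illustration.

GESTURE_TO_CONTROL = {
    "twist": "lock_control_330A",
    "slide": "window_control_330B",
    "jog_tilt": "seat_adjustment_330C",
}

def dispatch_gesture(gesture):
    # Returns the control for a recognized gesture, or None otherwise.
    return GESTURE_TO_CONTROL.get(gesture)

print(dispatch_gesture("slide"))  # window_control_330B
```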
- the vehicle 102 may determine a movement change of the remote control device 310 moving from the first position 104 A to the second position 104 B relative to the vehicle 102 .
- the vehicle 102 may maintain the functionalities associated with positions relative to the remote control device 310 .
- each occupant may have a remote control device 310 placed proximate to a respective occupant.
- the occupant may pivot the remote control device 310 to pick it up and hold it for a user input.
- the vehicle 102 may receive a user input based on the user interaction with the remote control device 310 to modify the one or more settings associated with the functionalities.
- FIG. 4 depicts an example process flow 400 for remotely controlling functionalities of vehicles in accordance with one or more embodiments of the disclosure.
- one or more computer processors of a vehicle and/or a remote control device may execute computer-executable instructions stored on memory to determine that the remote control device is located at a position, the position associated with a functionality.
- the position (e.g., the positions 104 A- 104 D in FIG. 1 , or the positions 210 A- 210 F in FIG. 2 ) may be relative to the vehicle.
- the position (e.g., the positions 332 - 336 in FIG. 3 ) may be relative to the remote control device.
- the position may be determined based on sensor data of the remote control device.
- the position may be determined based on one or more machine readable media associated with the position.
- the machine readable media may be included at the position of the vehicle. In some embodiments, the machine readable media may be included in the remote control device.
- one or more computer processors of a vehicle and/or a remote control device may execute computer-executable instructions stored on memory to establish a communication connection between the remote control device and the functionality based on a machine readable medium associated with the position.
- the machine readable medium is included at the position of the vehicle.
- the vehicle may transmit a signal indicative of the position by the machine readable medium.
- the vehicle may cause a communication between the machine readable medium and the vehicle based on the signal.
- the vehicle may establish the communication connection between the remote control device and the functionality based on the communication.
- the machine readable medium is included in the remote control device.
- the vehicle may receive sensor data from one or more sensors of the remote control device.
- the sensor data indicates that the remote control device is placed at the position.
- the vehicle may cause a communication between the machine readable medium and the vehicle.
- the vehicle may establish a communication connection between the remote control device and the functionality based on the communication.
- one or more computer processors of a vehicle and/or a remote control device may execute computer-executable instructions stored on memory to cause a user interface on the first remote control device to present one or more settings associated with the functionality.
- the vehicle may receive a user input based on a user interaction with the remote control device to modify the one or more settings associated with the functionality.
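Process flow 400 (determine the device position, establish a connection to the position's functionality, and present that functionality's settings on the device UI) can be summarized in a few lines. Everything below, including position labels, functionality names, and settings values, is an illustrative assumption rather than content from the disclosure.

```python
# Hypothetical end-to-end sketch of process flow 400. "Presenting" settings
# on the user interface is modeled as returning them for display.

FUNCTIONALITY_SETTINGS = {
    "window_controls": {"front_left": "closed", "front_right": "closed"},
    "climate": {"temperature_c": 21},
}

POSITION_TO_FUNCTIONALITY = {
    "104A": "window_controls",
    "104D": "climate",
}

def process_flow_400(position):
    # Step 1: the position has already been determined (input here).
    # Step 2: establish a connection to the functionality for that position.
    functionality = POSITION_TO_FUNCTIONALITY.get(position)
    if functionality is None:
        return None
    # Step 3: present the functionality's settings on the device UI.
    return dict(FUNCTIONALITY_SETTINGS[functionality])

print(process_flow_400("104D"))  # {'temperature_c': 21}
```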
- FIG. 5 depicts an example process flow 500 for remotely controlling functionalities of vehicles in accordance with one or more embodiments of the disclosure.
- Steps 510 - 590 may be performed by one or more computer processors of a vehicle and/or a remote control device.
- one or more computer processors of a vehicle and/or a remote control device may execute computer-executable instructions stored on memory to execute a driving information application based on an RFID tag, or a Bluetooth/BLE connectivity that is associated with a first position, and/or to execute a user comfort application based on an RFID tag, or a Bluetooth/BLE connectivity that is associated with a second position.
- FIG. 6 depicts an example process flow 600 for remotely controlling functionalities of vehicles in accordance with one or more embodiments of the disclosure.
- Steps 610 - 682 may be performed by one or more computer processors of a vehicle and/or a remote control device.
- one or more computer processors of a vehicle and/or a remote control device may execute computer-executable instructions stored on memory to execute a vehicle control application based on an RFID tag or a Bluetooth/BLE connectivity that is associated with a first position, to execute a user preferred entertainment application based on an RFID tag or a Bluetooth/BLE connectivity that is associated with a second position, to execute a wayfinding application based on an RFID tag or a Bluetooth/BLE connectivity that is associated with a third position, and/or to execute an audio-command based application based on an RFID tag or a Bluetooth/BLE connectivity that is associated with a fourth position.
- FIG. 7 depicts an illustrative architecture 700 in which techniques and structures for providing the systems and methods disclosed herein may be implemented.
- the illustrative architecture 700 may include vehicle 102 , one or more remote control devices 110 ( 1 ), 110 ( 2 ), 110 ( 3 ), 110 (S), and a network 760 .
- the network 760 may include any one or a combination of multiple different types of networks, such as cable networks, the Internet, wireless networks, and other private and/or public networks.
- the network 760 may include cellular, Wi-Fi, or Wi-Fi direct.
- the vehicle 102 generally includes a vehicle controller 720 and a functionality assembly 730 .
- the vehicle controller 720 includes one or more processors 722 and one or more memories 724 .
- the memory 724 stores instructions that may be executed by the processor 722 to perform various functions or operations disclosed herein, and/or to instruct the remote control device 110 to remotely control the assembly 730 in accordance with the present disclosure.
- the vehicle controller 720 may also include a communications interface 728 that allows the vehicle controller 720 to communicate with the remote control device 110 over the network 760 .
- the vehicle controller 720 may also include a position detection module 726 that may determine where the remote control device 110 is located.
- the position detection module 726 may include one or more machine readable media 727 . In some embodiments, the position detection module 726 can be included in the remote control device 110 .
- the functionality assembly 730 may include any one or more of an entertainment or infotainment system 732 , one or more lighting elements 734 , a voice command system 736 , one or more seat sensors and/or components 737 , a climate control system 738 , one or more software applications 740 , a lock control system 742 , a window control system 744 , a seat adjustment system 746 , and other functionalities 748 .
- the entertainment or infotainment system 732 is configured to provide visual and/or auditory output for the occupant 105 such as music, videos, or other media.
- the lighting elements 734 include any lighting devices that are located within the cabin of the vehicle 102 . Some of these lighting elements 734 have selectable luminance and/or hue. Thus, the vehicle controller 720 may instruct the remote control device 110 to selectively alter or change the luminance and/or hue of the one or more lights.
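Since luminance and hue are selectable settings of a lighting element, a change request can be modeled as a small state update with range handling. This is a hypothetical sketch; the LightingElement class, its value ranges, and the clamp/wrap behavior are assumptions, not part of the disclosure.

```python
# Hypothetical sketch: a lighting element whose luminance (fraction of full
# brightness, clamped to [0, 1]) and hue (degrees, wrapped to [0, 360)) can
# be altered on request, as a controller might instruct via the device.

class LightingElement:
    def __init__(self):
        self.luminance = 0.5   # fraction of full brightness
        self.hue = 0           # degrees on a color wheel

    def set_state(self, luminance=None, hue=None):
        if luminance is not None:
            self.luminance = min(max(luminance, 0.0), 1.0)  # clamp to [0, 1]
        if hue is not None:
            self.hue = hue % 360  # wrap to [0, 360)
        return self.luminance, self.hue

cabin_light = LightingElement()
print(cabin_light.set_state(luminance=1.4, hue=380))  # (1.0, 20)
```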
- the voice command system 736 may include any automated voice controlled system that allows the occupant 105 to interact with the vehicle controller 720 and/or the remote control device 110 using words, phrases, or natural language input.
- the voice command system 736 may be used to instruct a user in utilizing the functionalities of the vehicle 102 .
- the climate control system 738 allows a user to select a temperature within the vehicle 102 as well as control other aspects of climate such as seat heating or cooling.
- the vehicle controller 720 may also cause the climate control system 738 to activate heaters in a seat where the user is located using one or more seat sensors.
- a software application 740 may be a computer program designed to perform a group of coordinated functions, tasks, or activities for the benefit of the occupant 105 .
- Examples of the software applications 740 can include a driving information application, a user comfort application, a vehicle controls application, a user preferred entertainment application, a wayfinding application, and/or any suitable computer program that performs tasks for the benefit of the occupant 105 .
- the lock control system 742 controls one or more locks of the vehicle 102 .
- the window control system 744 controls one or more windows of the vehicle 102 .
- the seat adjustment system 746 adjusts seat position for one or more seats of the vehicle.
- the other functionalities 748 may include a scent dispenser that is configured to output scents, one or more biometric sensors that collect biometric data such as heart rate, pulse, body temperature, or other similar biometric data, and/or any other suitable functionality associated with the vehicle 102 .
- the remote control device 110 may be configured to remotely control the functionality assembly 730 .
- the remote control device 110 generally includes one or more processors 772 , one or more memories 774 , one or more I/O interfaces 776 , a communication interface 778 , a sensor module 780 , the position detection module 726 , a charging module 784 , a haptic module 786 , and one or more I/O components 790 .
- the memory 774 stores instructions that may be executed by the processor 772 to perform various functions or operations disclosed herein.
- the processor 772 may execute the instructions stored in the memory 774 to remotely control the functionality assembly 730 .
- the processor 772 may also execute the instructions stored in the memory 774 to receive one or more inputs and to generate one or more outputs via the I/O interfaces 776 and I/O components 790 .
- the communications interface 778 allows the remote control device 110 to communicate with the vehicle controller 720 over the network 760 .
- the remote control device 110 may also include the position detection module 726 .
- the one or more I/O interfaces 776 allow for the coupling of I/O components 790 to the remote control device 110 .
- the I/O interfaces 776 may include inter-integrated circuit (“I2C”), serial peripheral interface bus (“SPI”), universal serial bus (“USB”), RS-232, RS-432, and so forth.
- the I/O components 790 may include one or more displays 792 , one or more touch sensors 794 , one or more audio input/output components 796 , and other suitable I/O components 798 .
- the display 792 may be a high resolution display, a touchscreen, or liquid crystal or electrophoretic display elements.
- the touch sensor 794 may include interpolating force sensing resistor (“IFSR”) arrays, capacitive sensors, optical touch sensors, and so forth.
- the audio input/output components 796 may include speakers and microphones.
- the other I/O components 798 may include external memory devices and global positioning system receivers that may be coupled to the remote control device 110 using the I/O interfaces 776 .
- the sensor module 780 may determine a movement relative to the remote control device 110 and a movement relative to the vehicle 102 .
- the sensor module 780 may include an IMU module 782 , one or more magnetometer (compass) modules 784 , one or more sensors 785 , and one or more machine readable media 727 .
- the one or more sensors 785 can include a near field motion sensor, a far field motion sensor, a near/far field motion sensor, an ambient light sensor, any other suitable sensor, or some combination thereof.
- the sensor module 780 may determine the movements illustrated in FIG. 3 , such as one or more twisting positions, one or more sliding positions, and one or more jogging and tilting positions relative to the remote control device 110 .
- the sensor module 780 may determine the movement of the remote control device 110 relative to the vehicle 102 .
- the charging module 784 may include one or more batteries, and/or one or more charging antennas.
- the haptic module 786 may generate haptic motions associated with actions performed by the occupant 105 , and/or associated with one or more functionalities in the functionality assembly 730 that are currently executed by the remote control device 110 .
- FIG. 8 depicts a schematic illustration of an example remote control device 800 in accordance with one or more embodiments of the disclosure.
- the remote control device 800 is one embodiment of the remote control device 110 .
- the remote control device 800 includes a lens 810 , a touch display 812 , a microphone 814 , a circuit board 816 , a battery 818 , a haptic motor 820 , a charging antenna 824 , and a magnetic reactive PC enclosure 826 .
- the circuit board 816 may include a microprocessor, an IMU module, a magnetometer module, one or more near/far field motion sensors, one or more ambient light sensors, and one or more machine readable media.
- Implementations of the systems, apparatuses, devices, and methods disclosed herein may comprise or utilize a special purpose or general-purpose computer including computer hardware, such as, for example, one or more processors and system memory, as discussed herein. Implementations within the scope of the present disclosure may also include physical and other computer-readable media for carrying or storing computer-executable instructions and/or data structures. Such computer-readable media may be any available media that may be accessed by a general purpose or special purpose computer system. Computer-readable media that stores computer-executable instructions is computer storage media (devices). Computer-readable media that carries computer-executable instructions is transmission media. Thus, by way of example, and not limitation, implementations of the present disclosure may comprise at least two distinctly different kinds of computer-readable media: computer storage media (devices) and transmission media.
- Computer storage media includes RAM, ROM, EEPROM, CD-ROM, solid state drives (SSDs) (e.g., based on RAM), flash memory, phase-change memory (PCM), other types of memory, other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium which may be used to store desired program code means in the form of computer-executable instructions or data structures and which may be accessed by a general purpose or special purpose computer.
- An implementation of the devices, systems, and methods disclosed herein may communicate over a computer network.
- a “network” is defined as one or more data links that enable the transport of electronic data between computer systems and/or modules and/or other electronic devices.
- Transmission media may include a network and/or data links, which may be used to carry desired program code means in the form of computer-executable instructions or data structures and which may be accessed by a general purpose or special purpose computer. Combinations of the above should also be included within the scope of computer-readable media.
- Computer-executable instructions comprise, for example, instructions and data which, when executed at a processor, cause a general purpose computer, special purpose computer, or special purpose processing device to perform a certain function or group of functions.
- the computer-executable instructions may be, for example, binaries, intermediate format instructions such as assembly language, or even source code.
- the present disclosure may be practiced in network computing environments with many types of computer system configurations, including in-dash vehicle computers, personal computers, desktop computers, laptop computers, message processors, handheld devices, multi-processor systems, microprocessor-based or programmable consumer electronics, network PCs, minicomputers, mainframe computers, mobile telephones, PDAs, tablets, pagers, routers, switches, various storage devices, and the like.
- the disclosure may also be practiced in distributed system environments where local and remote computer systems, which are linked (either by hardwired data links, wireless data links, or by any combination of hardwired and wireless data links) through a network, both perform tasks.
- program modules may be located in both the local and remote memory storage devices.
- one or more ASICs (application specific integrated circuits) may be programmed to carry out one or more of the systems and procedures described herein.
- a sensor may include computer code configured to be executed in one or more processors and may include hardware logic/electrical circuitry controlled by the computer code.
- processors may include hardware logic/electrical circuitry controlled by the computer code.
- At least some embodiments of the present disclosure have been directed to computer program products comprising such logic (e.g., in the form of software) stored on any computer-usable medium.
- Such software when executed in one or more data processing devices, causes a device to operate as described herein.
- any or all of the aforementioned alternate implementations may be used in any combination desired to form additional hybrid implementations of the present disclosure.
- any of the functionality described with respect to a particular device or component may be performed by another device or component.
- embodiments of the disclosure may relate to numerous other device characteristics.
- Although embodiments have been described in language specific to structural features and/or methodological acts, it is to be understood that the disclosure is not necessarily limited to the specific features or acts described. Rather, the specific features and acts are disclosed as illustrative forms of implementing the embodiments.
Landscapes
- Engineering & Computer Science (AREA)
- Human Computer Interaction (AREA)
- Mechanical Engineering (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Transportation (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Computer Networks & Wireless Communication (AREA)
- Aviation & Aerospace Engineering (AREA)
- Signal Processing (AREA)
- Automation & Control Theory (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Chemical & Material Sciences (AREA)
- Combustion & Propulsion (AREA)
- Selective Calling Equipment (AREA)
- Telephonic Communication Services (AREA)
Abstract
Systems, methods, and devices for remotely controlling functionalities of vehicles are disclosed herein. Example systems, methods and devices may include determining that a remote control device is located at a position associated with a functionality, establishing a communication connection between the remote control device and the functionality based on a machine readable medium associated with the position, and causing a user interface on the remote control device to present one or more settings associated with the functionality.
Description
- The present disclosure relates to remotely controlling functionalities of vehicles. Some embodiments are directed to systems, methods and devices for remotely controlling functionalities of vehicles.
- A primary interface for vehicle systems and functions includes one or more knobs and buttons, and one or more vehicle interior controls for controlling heating, ventilation, and air conditioning (HVAC), windows, lighting, music, audio head, climate head, and shifters, etc. Typically, a vehicle interior control only performs one function for a product lifetime. The vehicle interior controls are fixed at respective locations and are accessible only to a driver and/or a front passenger of the vehicle. Thus, the vehicle interior controls are not accessible to backseat passengers, and some are not accessible to the front passenger of the vehicle. Further, packaging restrictions make it costly and difficult to add new features to the vehicle interior controls once a cockpit design of the vehicle is complete.
- The detailed description is set forth with reference to the accompanying drawings. The use of the same reference numerals may indicate similar or identical items. Various embodiments may utilize elements and/or components other than those illustrated in the drawings, and some elements and/or components may not be present in various embodiments. Elements and/or components in the figures are not necessarily drawn to scale. Throughout this disclosure, depending on the context, singular and plural terminology may be used interchangeably.
FIG. 1 depicts a schematic illustration of an example implementation for remotely controlling functionalities of a vehicle in accordance with one or more embodiments of the disclosure.
FIG. 2 depicts a schematic illustration of an example implementation for remotely controlling functionalities of a vehicle based on positions relative to the vehicle in accordance with one or more embodiments of the disclosure.
FIG. 3 depicts a schematic illustration of an example implementation for remotely controlling functionalities of a vehicle based on positions relative to a remote control device in accordance with one or more embodiments of the disclosure.
FIG. 4 depicts an example process flow for remotely controlling functionalities of vehicles in accordance with one or more embodiments of the disclosure.
FIG. 5 depicts an example process flow for remotely controlling functionalities of vehicles in accordance with one or more embodiments of the disclosure.
FIG. 6 depicts an example process flow for remotely controlling functionalities of vehicles in accordance with one or more embodiments of the disclosure.
FIG. 7 depicts an illustrative architecture in which techniques and structures for providing the systems and methods disclosed herein may be implemented.
FIG. 8 depicts a schematic exploded illustration of an example remote control device in accordance with one or more embodiments of the disclosure.
- The systems, methods and devices disclosed herein are configured to remotely control one or more functionalities, from vehicle interior controls to third-party services, based on a position of a remote control device. A functionality describes a controllable and/or selectable function associated with a vehicle, e.g., driving information review and/or search, user comfort settings control, vehicle controls, entertainment, wayfinding, lock control, window switches control, seat adjustment control, one or more software applications, and the like. By changing the position associated with the remote control device, different functionalities are provided. The remote control device may be passed between vehicle occupants such that the functionalities are accessible to different occupants, e.g., drivers, front passengers, and/or backseat passengers. The functionalities may be added, removed, and/or updated via a software update (e.g., an over-the-air update). In some instances, the systems, methods and devices may apply sensor fusion, machine learning, and/or artificial intelligence to learn and adapt to various occupants. For example, usage data, device sensor data such as position and biometrics, and vehicle data such as weather, vehicle inertial data, and location may be used to understand the occupants' behaviors better and offer the occupants access to their desired content more effectively.
- An occupant may interact with the remote control device to execute one or more functionalities associated with various positions relative to the vehicle. Different positions relative to the vehicle may be associated with different functionalities. For instance, a remote control device is placed at a first position to execute functionalities associated with the first position. A different remote control device placed at a second position may execute different functionalities associated with the second position. If the different remote control device is placed at the same position as the remote control device, the different remote control device may execute the same functionalities as the remote control device. Examples are further described with respect to FIG. 1 and FIG. 2 .
- The occupant may interact with the remote control device to execute one or more functionalities associated with various positions relative to the remote control device. The remote control device may control different functionalities based on different positions relative to the remote control device, even if the position of the remote control device relative to the vehicle is changed. A position change relative to the remote control device may be caused by tapping or swiping a screen, rotating the entire remote control device, tilting the remote control device, picking the remote control device up to hold it, speaking to the remote control device, and the like. Examples are further described with respect to FIG. 3 .
- The remote control device may function inside and/or outside a vehicle. The remote control device includes a high resolution touch display, a microprocessor, and a series of internal sensors in a round, hand-sized form factor. Examples are further described with respect to FIG. 7 and FIG. 8 .
- Turning now to the drawings,
FIG. 1 depicts a schematic illustration of an example implementation 100 for remotely controlling functionalities 108 A- 108 D of a vehicle 102 based on positions relative to the vehicle 102 in accordance with one or more embodiments of the disclosure. An occupant 105 may be a driver, a front passenger or a backseat passenger in the vehicle 102 . - The
vehicle 102 may include one or more remote control devices 110. A remote control device 110 may remotely control one or more functionalities 108A-108D of the vehicle 102 based on positions 104A-104D where the occupant 105 places the remote control device 110. A first position 104A is associated with functionalities 108A, such as charging, window controls, and interior lighting controls. A second position 104B is associated with functionalities 108B, such as one or more user preferred entertainment applications. A third position 104C is associated with functionalities 108C, such as one or more wayfinding applications. If the occupant 105 places the remote control device 110 at a position (e.g., the first position 104A, the second position 104B, or the third position 104C), the vehicle 102 may determine where the remote control device 110 is located. The vehicle 102 may establish a communication connection between the remote control device 110 and one or more functionalities (e.g., the functionalities 108A, the functionalities 108B, or the functionalities 108C) associated with the position based on a machine readable medium associated with the position. The machine readable medium may include one or more radio frequency identification (RFID) tags, one or more media using Bluetooth connectivity, one or more media using Bluetooth low energy (BLE) connectivity, one or more wireless antennas, or some combination thereof. The vehicle 102 may cause a user interface on the remote control device 110 to present one or more settings associated with the one or more functionalities. - As shown in
FIG. 1, the vehicle 102 may receive an audio input and generate an audio and/or visual output 108D. For instance, the remote control device 110 of the vehicle 102 may be placed at a fourth position 104D to execute one or more functionalities associated with the audio input and the audio/visual output 108D. However, it should be understood that the remote control device 110 may be placed at a location (e.g., the first position 104A, the second position 104B, the third position 104C, or any other position) other than the fourth position 104D to receive an audio command from the occupant 105. The remote control device 110 may generate an audio and/or visual output 108D, such as audio/visual feedback in response to the audio command, an audio/visual output indicating a status of a currently executed functionality, and/or an audio/visual output indicating a status of the remote control device 110 for receiving the audio command. - The
vehicle 102 may further include one or more memories including computer-executable instructions, and one or more computing processors configured to access the one or more memories and execute the computer-executable instructions. For instance, the vehicle 102 may execute the computer-executable instructions to determine that the remote control device 110 is located at a position, to establish a communication connection between the remote control device 110 and the functionalities associated with the position, and to cause the user interface on the remote control device 110 to present one or more settings associated with the one or more functionalities. In some embodiments, the one or more memories including the computer-executable instructions, and the one or more computing processors configured to access the one or more memories and execute the computer-executable instructions, may be included in the remote control device 110. - The
vehicle 102 may establish a communication connection between the remote control device 110 and functionalities associated with a position based on one or more machine readable media that are included at the position of the vehicle 102. For instance, a first machine readable medium (e.g., an RFID tag or the like) may be included at the first position 104A. The vehicle 102 may determine that the remote control device 110 is located at the first position 104A based on the first machine readable medium. The vehicle 102 may transmit a signal indicative of the first position 104A by the first machine readable medium located at the first position 104A of the vehicle 102. The vehicle 102 may cause a communication between the first machine readable medium located at the first position 104A and the remote control device 110 based on the signal indicative of the first position 104A. The vehicle 102 may establish a communication connection between the remote control device 110 and the functionalities 108A based on the communication. - The
vehicle 102 may establish a communication connection between the remote control device 110 and functionalities associated with a position based on one or more machine readable media that are included in the remote control device 110. For instance, a second machine readable medium (e.g., an RFID tag or the like) may be included in the remote control device 110. The vehicle 102 may receive sensor data from one or more sensors of the remote control device 110. The sensor data indicates that the remote control device 110 is placed at the first position 104A, as an example. A sensor may include an inertial measurement unit (IMU), a magnetometer (compass) module, a near field/far field motion sensor, an ambient light sensor, or some combination thereof. The vehicle 102 may cause a communication between the second machine readable medium of the remote control device 110 and the vehicle 102. The vehicle 102 may establish a communication connection between the remote control device 110 and the functionalities 108A. - The
occupant 105 may interact with the remote control device 110 to execute functionalities associated with positions relative to the vehicle 102. For instance, the remote control device 110 may be moved from the first position 104A to the second position 104B to execute the functionalities 108B. The vehicle 102 may determine a movement change of the remote control device 110 moving from the first position 104A to the second position 104B. The vehicle 102 may disestablish a first communication connection between the remote control device 110 and the functionalities 108A based at least in part on the movement change. The vehicle 102 may establish a second communication connection between the remote control device 110 and the functionalities 108B based on one or more machine readable media associated with the second position 104B. The machine readable media may be included at the second position 104B of the vehicle 102, or included in the remote control device 110. The vehicle 102 may cause the user interface on the remote control device 110 to present one or more settings associated with the functionalities 108B. - The movement change may be detected by one or more sensors of the
vehicle 102 and/or by the sensors of the remote control device 110. - The
vehicle 102 may receive a user input based on the user interaction with the remote control device 110 to modify the one or more settings associated with the functionalities. The user interaction may include tapping or swiping a screen, rotating the entire remote control device, tilting the remote control device, picking up the remote control device to hold it, speaking to the remote control device, and the like. - The
remote control device 110 may include a touch display, one or more sensors to generate sensor data, the one or more memories including computer-executable instructions, and the one or more computing processors to access the one or more memories and execute the computer-executable instructions.
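The two connection-establishment paths described above might be sketched as follows; the position keys, functionality names, and data shapes are assumptions for illustration and are not part of the disclosure:

```python
# Illustrative sketch (not from the disclosure): the two paths by which
# the vehicle may connect a remote control device to a position's
# functionalities. Position keys and functionality names are invented.
FUNCTIONALITIES = {
    "104A": ("charging", "window controls", "interior lighting"),
    "104B": ("entertainment applications",),
    "104C": ("wayfinding applications",),
}

def connect_via_position_medium(connections, device_id, tag_signal):
    """Path 1: a machine readable medium (e.g., an RFID tag) fixed at the
    position transmits a signal indicative of that position."""
    position = tag_signal["position"]
    connections[device_id] = FUNCTIONALITIES[position]
    return connections[device_id]

def connect_via_device_sensors(connections, device_id, sensor_data):
    """Path 2: the device's own sensors (IMU, magnetometer, light sensor)
    report where the device has been placed."""
    position = sensor_data["position"]
    connections[device_id] = FUNCTIONALITIES[position]
    return connections[device_id]
```

Either path resolves to the same binding: the device is connected to whatever functionalities the identified position exposes.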
- FIG. 2 depicts a schematic illustration of an example implementation 200 for remotely controlling functionalities 220A-220F of the vehicle 102 based on positions 210A-210F relative to the vehicle 102 in accordance with one or more embodiments of the disclosure. Different functionalities 220A-220F may be associated with respective positions 210A-210F. In some embodiments, a single remote control device 110 may be placed at a particular position of the positions 210A-210F to execute functionalities associated with the particular position. The single remote control device 110 may be passed from one occupant (e.g., a driver or a front passenger) to another occupant (e.g., a backseat passenger) and placed at a position proximate the latter occupant to execute functionalities associated with that position. In other instances, the vehicle 102 may include multiple remote control devices 110. Each of the multiple remote control devices 110 may be placed at a respective position (e.g., positions 210A-210F). Each of the multiple remote control devices 110 may execute respective functionalities 220A-220F associated with the positions 210A-210F.
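The FIG. 2 arrangement reduces to a lookup keyed by position rather than by device, which is one way to read the statement that any device placed at a given position executes that position's functionalities. The identifiers below echo the reference numerals, but the code itself is purely illustrative:

```python
# Hypothetical FIG. 2 lookup: each position 210A-210F exposes its own
# functionality group, keyed by position rather than by device, so any
# device placed at a position resolves to the same functionalities.
POSITION_MAP = {"210" + c: "functionalities 220" + c for c in "ABCDEF"}

def functionalities_at(position):
    """Return the functionality group for `position`, or None if the
    position is not one of the mapped in-cabin positions."""
    return POSITION_MAP.get(position)
```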
- FIG. 3 depicts a schematic illustration of an example implementation 300 for remotely controlling functionalities 330A-330C of the vehicle 102 based on positions 332-336 relative to a remote control device 310 in accordance with one or more embodiments of the disclosure. The remote control device 310 may be an embodiment of the remote control device 110. Multiple functionalities (e.g., a lock control 320A, a window control 320B, and a seat adjustment control 320C) in a typical vehicle may be executed by the single remote control device 310 based on various positions relative to the remote control device 310, even if the position of the remote control device 310 relative to the vehicle 102 is changed. For instance, the occupant 105 may twist 332 the remote control device 310 to execute a lock control 330A, slide 334 the remote control device 310 to execute a window control 330B, or jog and tilt 336 the remote control device 310 to execute a seat adjustment control 330C. The vehicle 102 may determine a movement change of the remote control device 310 moving from a twisting position 332 to a sliding position 334 relative to the remote control device 310. The vehicle 102 may disestablish a first communication connection between the remote control device 310 and the lock control 330A based on the movement change. The vehicle 102 may establish a second communication connection between the remote control device 310 and the window control 330B. The vehicle 102 may cause a user interface on the remote control device 310 to present one or more settings associated with the window control 330B. The movement change may be detected by one or more sensors of the remote control device 310. The vehicle 102 may determine a movement change of the remote control device 310 moving from the first position 104A to the second position 104B relative to the vehicle 102. The vehicle 102 may maintain the functionalities associated with positions relative to the remote control device 310. - In some embodiments, each occupant may have a
remote control device 310 placed proximate to a respective occupant. The occupant may pivot the remote control device 310 to pick it up and hold it for a user input. The vehicle 102 may receive a user input based on the user interaction with the remote control device 310 to modify the one or more settings associated with the functionalities.
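The device-relative control selection of FIG. 3 (twist for locks, slide for windows, jog-and-tilt for seats) can be pictured as a gesture-to-control table; the gesture labels below are invented, and only the control names follow the figure:

```python
# Illustrative mapping for FIG. 3: the same device selects different
# controls based on how it is moved, regardless of where it sits.
GESTURE_TO_CONTROL = {
    "twist": "lock control 330A",                    # twist 332
    "slide": "window control 330B",                  # slide 334
    "jog_and_tilt": "seat adjustment control 330C",  # jog and tilt 336
}

def select_control(current_control, gesture):
    """Drop the current control's connection and return the control
    mapped to `gesture`; unrecognized gestures keep the current one."""
    return GESTURE_TO_CONTROL.get(gesture, current_control)
```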
- FIG. 4 depicts an example process flow 400 for remotely controlling functionalities of vehicles in accordance with one or more embodiments of the disclosure. - At
block 410 of the process flow 400, one or more computer processors of a vehicle and/or a remote control device may execute computer-executable instructions stored on memory to determine that the remote control device is located at a position, the position associated with a functionality. The position (e.g., the positions 104A-104D in FIG. 1, or the positions 210A-210F in FIG. 2) may be relative to the vehicle. The position (e.g., the positions 332-336 in FIG. 3) may also be relative to the remote control device. The position may be determined based on sensor data of the remote control device. The position may be determined based on one or more machine readable media associated with the position. The machine readable media may be included at the position of the vehicle. In some embodiments, the machine readable media may be included in the remote control device. - At
block 420 of the process flow 400, one or more computer processors of a vehicle and/or a remote control device may execute computer-executable instructions stored on memory to establish a communication connection between the remote control device and the functionality based on a machine readable medium associated with the position. In one example, the machine readable medium is included at the position of the vehicle. The vehicle may transmit a signal indicative of the position by the machine readable medium. The vehicle may cause a communication between the machine readable medium and the remote control device based on the signal. The vehicle may establish the communication connection between the remote control device and the functionality based on the communication. As another example, the machine readable medium is included in the remote control device. The vehicle may receive sensor data from one or more sensors of the remote control device. The sensor data indicates that the remote control device is placed at the position. The vehicle may cause a communication between the machine readable medium and the vehicle. The vehicle may establish a communication connection between the remote control device and the functionality based on the communication. - At
block 430 of the process flow 400, one or more computer processors of a vehicle and/or a remote control device may execute computer-executable instructions stored on memory to cause a user interface on the remote control device to present one or more settings associated with the functionality. The vehicle may receive a user input based on a user interaction with the remote control device to modify the one or more settings associated with the functionality.
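Blocks 410-430 might be composed as a single routine: determine the position, establish the connection, then surface the settings user interface. Every name and the returned payload shape below are assumptions, not part of the disclosure:

```python
def process_flow_400(device_id, detected_position, functionality_map):
    """Illustrative composition of blocks 410-430 (all names invented):
    determine the position, establish a connection, present settings."""
    # Block 410: determine the device's position and its functionality.
    functionality = functionality_map.get(detected_position)
    if functionality is None:
        raise ValueError("no functionality at position " + detected_position)
    # Block 420: establish the device-to-functionality connection.
    connection = {"device": device_id, "functionality": functionality}
    # Block 430: cause the UI to present the functionality's settings.
    connection["ui"] = "settings for " + functionality
    return connection
```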
- FIG. 5 depicts an example process flow 500 for remotely controlling functionalities of vehicles in accordance with one or more embodiments of the disclosure. Steps 510-590 may be performed by one or more computer processors of a vehicle and/or a remote control device. For instance, one or more computer processors of a vehicle and/or a remote control device may execute computer-executable instructions stored on memory to execute a driving information application based on an RFID tag or Bluetooth/BLE connectivity that is associated with a first position, and/or to execute a user comfort application based on an RFID tag or Bluetooth/BLE connectivity that is associated with a second position. -
FIG. 6 depicts an example process flow 600 for remotely controlling functionalities of vehicles in accordance with one or more embodiments of the disclosure. Steps 610-682 may be performed by one or more computer processors of a vehicle and/or a remote control device. For instance, one or more computer processors of a vehicle and/or a remote control device may execute computer-executable instructions stored on memory to execute a vehicle control application based on an RFID tag or Bluetooth/BLE connectivity that is associated with a first position, to execute a user preferred entertainment application based on an RFID tag or Bluetooth/BLE connectivity that is associated with a second position, to execute a wayfinding application based on an RFID tag or Bluetooth/BLE connectivity that is associated with a third position, and/or to execute an audio-command based application based on an RFID tag or Bluetooth/BLE connectivity that is associated with a fourth position.
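Both the FIG. 5 and FIG. 6 flows reduce to launching a position-keyed application once an RFID tag or Bluetooth/BLE link identifies the position; a minimal sketch follows, using the application names from the text but invented position keys:

```python
# Sketch of the FIG. 6 flow: a tag or BLE read identifying a position
# selects the application bound to that position. Keys are invented.
POSITION_APPLICATIONS = {
    "first": "vehicle control application",
    "second": "user preferred entertainment application",
    "third": "wayfinding application",
    "fourth": "audio-command based application",
}

def launch_for_position(position_key):
    """Return the application an RFID/BLE read at `position_key` selects."""
    if position_key not in POSITION_APPLICATIONS:
        raise KeyError("no application bound to position " + position_key)
    return POSITION_APPLICATIONS[position_key]
```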
- FIG. 7 depicts an illustrative architecture 700 in which techniques and structures for providing the systems and methods disclosed herein may be implemented. The illustrative architecture 700 may include the vehicle 102, one or more remote control devices 110(1), 110(2), 110(3), 110(S), and a network 760. The network 760 may include any one or a combination of multiple different types of networks, such as cable networks, the Internet, wireless networks, and other private and/or public networks. In some instances, the network 760 may include cellular, Wi-Fi, or Wi-Fi Direct. - The
vehicle 102 generally includes a vehicle controller 720 and a functionality assembly 730. Generally described, the vehicle controller 720 includes one or more processors 722 and one or more memories 724. The memory 724 stores instructions that may be executed by the processor 722 to perform various functions or operations disclosed herein, and/or to instruct the remote control device 110 to remotely control the assembly 730 in accordance with the present disclosure. The vehicle controller 720 may also include a communications interface 728 that allows the vehicle controller 720 to communicate with the remote control device 110 over the network 760. The vehicle controller 720 may also include a position detection module 726 that may determine where the remote control device 110 is located. The position detection module 726 may include one or more machine readable media 727. In some embodiments, the position detection module 726 can be included in the remote control device 110. - The
functionality assembly 730 may include any one or more of an entertainment or infotainment system 732, one or more lighting elements 734, a voice command system 736, one or more seat sensors and/or components 737, a climate control system 738, one or more software applications 740, a lock control system 742, a window control system 744, a seat adjustment system 746, and other functionalities 748. - The entertainment or
infotainment system 732 is configured to provide visual and/or auditory output for the occupant 105, such as music, videos, or other media. The lighting elements 734 include any lighting devices that are located within the cabin of the vehicle 102. Some of these lighting elements 734 have selectable luminance and/or hue. Thus, the vehicle controller 720 may instruct the remote control device 110 to selectively alter or change the luminance and/or hue of the one or more lights. - The
voice command system 736 may include any automated voice controlled system that allows the occupant 105 to interact with the vehicle controller 720 and/or the remote control device 110 using words, phrases, or natural language input. For example, the voice command system 736 may be used to instruct a user in utilizing the functionalities of the vehicle 102. - The
climate control system 738 allows a user to select a temperature within the vehicle 102 as well as control other aspects of climate such as seat heating or cooling. The vehicle controller 720 may also cause the climate control system 738 to activate heaters in a seat where the user is located using one or more seat sensors. - A
software application 740 may be a computer program designed to perform a group of coordinated functions, tasks, or activities for the benefit of the occupant 105. Examples of the software application 740 include a driving information application, a user comfort application, a vehicle controls application, a user preferred entertainment application, a wayfinding application, and/or any suitable computer program that performs tasks for the benefit of the occupant 105. The lock control system 742 controls one or more locks of the vehicle 102. The window control system 744 controls one or more windows of the vehicle 102. The seat adjustment system 746 adjusts seat position for one or more seats of the vehicle. The other functionalities 748 may include a scent dispenser that is configured to output scents, one or more biometric sensors that collect biometric data such as heart rate, pulse, body temperature, or other similar biometric data, and/or any other suitable functionality associated with the vehicle 102. - The
remote control device 110 may be configured to remotely control the functionality assembly 730. The remote control device 110 generally includes one or more processors 772, one or more memories 774, one or more I/O interfaces 776, a communication interface 778, a sensor module 780, the position detection module 726, a charging module 784, a haptic module 786, and one or more I/O components 790. - The memory 774 stores instructions that may be executed by the
processor 772 to perform various functions or operations disclosed herein. In general, the processor 772 may execute the instructions stored in the memory 774 to remotely control the functionality assembly 730. The processor 772 may also execute the instructions stored in the memory 774 to receive one or more inputs and to generate one or more outputs via the I/O interfaces 776 and the I/O components 790. The communications interface 778 allows the remote control device 110 to communicate with the vehicle controller 720 over the network 760. In some embodiments, the remote control device 110 may also include the position detection module 726. - The one or more I/O interfaces 776 allow for the coupling of I/O components 790 to the remote control device 110. The I/O interfaces 776 may include inter-integrated circuit (“I2C”), serial peripheral interface bus (“SPI”), universal serial bus (“USB”), RS-232, RS-432, and so forth. - The I/O components 790 may include one or more displays 792, one or more touch sensors 794, one or more audio input/output components 796, and other suitable I/O components 798. The display 792 may be a high resolution display, a touchscreen, or a liquid crystal or electrophoretic display element. The touch sensor 794 may include interpolating force sensing resistor (“IFSR”) arrays, capacitive sensors, optical touch sensors, and so forth. The audio input/output components 796 may include speakers and microphones. The other I/O components 798 may include external memory devices and global positioning system receivers that may be coupled to the remote control device 110 using the I/O interfaces 776. - The
sensor module 780 may determine a movement relative to the remote control device 110 and a movement relative to the vehicle 102. The sensor module 780 may include an IMU module 782, one or more magnetometer (compass) modules 784, one or more sensors 785, and one or more machine readable media 727. The one or more sensors 785 can include a near field motion sensor, a far field motion sensor, a near/far field motion sensor, an ambient light sensor, any other suitable sensor, or some combination thereof. The sensor module 780 may determine movements illustrated in FIG. 3, such as one or more twisting positions, one or more sliding positions, and one or more jogging and tilting positions relative to the remote control device 110. The sensor module 780 may determine the movement of the remote control device 110 relative to the vehicle 102. - The
charging module 784 may include one or more batteries and/or one or more charging antennas. The haptic module 786 may generate haptic motions associated with actions performed by the occupant 105, and/or associated with one or more functionalities in the functionality assembly 730 that are currently executed by the remote control device 110.
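One way the sensor module 780 might distinguish a movement relative to the device (twist, slide, tilt) from a movement to a new in-cabin position is to weigh rotation against displacement; the thresholds and signal shapes below are invented for illustration and are not described in the disclosure:

```python
def classify_movement(rotation_deg, displacement_m,
                      rotation_min=20.0, travel_min=0.3):
    """Rough, invented heuristic: sustained travel suggests the device
    moved to a new position relative to the vehicle, while rotation with
    little travel suggests a gesture relative to the device itself."""
    if displacement_m >= travel_min:
        return "relative to vehicle"
    if abs(rotation_deg) >= rotation_min:
        return "relative to device"
    return "stationary"
```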
- FIG. 8 depicts a schematic illustration of an example remote control device 800 in accordance with one or more embodiments of the disclosure. The remote control device 800 is an embodiment of the remote control device 110. The remote control device 800 includes a lens 810, a touch display 812, a microphone 814, a circuit board 816, a battery 818, a haptic motor 820, a charging antenna 824, and a magnetic reactive PC enclosure 826. The circuit board 816 may include a microprocessor, an IMU module, a magnetometer module, one or more near/far field motion sensors, one or more ambient light sensors, and one or more machine readable media. - In the above disclosure, reference has been made to the accompanying drawings, which form a part hereof, which illustrate specific implementations in which the present disclosure may be practiced. It is understood that other implementations may be utilized, and structural changes may be made without departing from the scope of the present disclosure. References in the specification to "one embodiment," "an embodiment," "an example embodiment," etc., indicate that the embodiment described may include a particular feature, structure, or characteristic, but every embodiment may not necessarily include the particular feature, structure, or characteristic. Moreover, such phrases are not necessarily referring to the same embodiment. Further, when a particular feature, structure, or characteristic is described in connection with an embodiment, one skilled in the art will recognize such feature, structure, or characteristic in connection with other embodiments whether or not explicitly described.
- Implementations of the systems, apparatuses, devices, and methods disclosed herein may comprise or utilize a special purpose or general-purpose computer including computer hardware, such as, for example, one or more processors and system memory, as discussed herein. Implementations within the scope of the present disclosure may also include physical and other computer-readable media for carrying or storing computer-executable instructions and/or data structures. Such computer-readable media may be any available media that may be accessed by a general purpose or special purpose computer system. Computer-readable media that stores computer-executable instructions is computer storage media (devices). Computer-readable media that carries computer-executable instructions is transmission media. Thus, by way of example, and not limitation, implementations of the present disclosure may comprise at least two distinctly different kinds of computer-readable media: computer storage media (devices) and transmission media.
- Computer storage media (devices) includes RAM, ROM, EEPROM, CD-ROM, solid state drives (SSDs) (e.g., based on RAM), flash memory, phase-change memory (PCM), other types of memory, other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium which may be used to store desired program code means in the form of computer-executable instructions or data structures and which may be accessed by a general purpose or special purpose computer.
- An implementation of the devices, systems, and methods disclosed herein may communicate over a computer network. A “network” is defined as one or more data links that enable the transport of electronic data between computer systems and/or modules and/or other electronic devices. When information is transferred or provided over a network or another communications connection (either hardwired, wireless, or any combination of hardwired or wireless) to a computer, the computer properly views the connection as a transmission medium. Transmission media may include a network and/or data links, which may be used to carry desired program code means in the form of computer-executable instructions or data structures and which may be accessed by a general purpose or special purpose computer. Combinations of the above should also be included within the scope of computer-readable media.
- Computer-executable instructions comprise, for example, instructions and data which, when executed at a processor, cause a general purpose computer, special purpose computer, or special purpose processing device to perform a certain function or group of functions. The computer-executable instructions may be, for example, binaries, intermediate format instructions such as assembly language, or even source code. Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the features or acts described above. Rather, the described features and acts are disclosed as example forms of implementing the claims.
- Those skilled in the art will appreciate that the present disclosure may be practiced in network computing environments with many types of computer system configurations, including in-dash vehicle computers, personal computers, desktop computers, laptop computers, message processors, handheld devices, multi-processor systems, microprocessor-based or programmable consumer electronics, network PCs, minicomputers, mainframe computers, mobile telephones, PDAs, tablets, pagers, routers, switches, various storage devices, and the like. The disclosure may also be practiced in distributed system environments where local and remote computer systems, which are linked (either by hardwired data links, wireless data links, or by any combination of hardwired and wireless data links) through a network, both perform tasks. In a distributed system environment, program modules may be located in both the local and remote memory storage devices.
- Further, where appropriate, the functions described herein may be performed in one or more of hardware, software, firmware, digital components, or analog components. For example, one or more application specific integrated circuits (ASICs) may be programmed to carry out one or more of the systems and procedures described herein. Certain terms are used throughout the description and claims to refer to particular system components. As one skilled in the art will appreciate, components may be referred to by different names. This document does not intend to distinguish between components that differ in name, but not function.
- It should be noted that the sensor embodiments discussed above may comprise computer hardware, software, firmware, or any combination thereof to perform at least a portion of their functions. For example, a sensor may include computer code configured to be executed in one or more processors and may include hardware logic/electrical circuitry controlled by the computer code. These example devices are provided herein for purposes of illustration and are not intended to be limiting. Embodiments of the present disclosure may be implemented in further types of devices, as would be known to persons skilled in the relevant art(s).
- At least some embodiments of the present disclosure have been directed to computer program products comprising such logic (e.g., in the form of software) stored on any computer-usable medium. Such software, when executed in one or more data processing devices, causes a device to operate as described herein.
- While various embodiments of the present disclosure have been described above, it should be understood that they have been presented by way of example only, and not limitation. It will be apparent to persons skilled in the relevant art that various changes in form and detail may be made therein without departing from the spirit and scope of the present disclosure. Thus, the breadth and scope of the present disclosure should not be limited by any of the above-described exemplary embodiments but should be defined only in accordance with the following claims and their equivalents. The foregoing description has been presented for the purposes of illustration and description. It is not intended to be exhaustive or to limit the present disclosure to the precise form disclosed. Many modifications and variations are possible in light of the above teaching. Further, it should be noted that any or all of the aforementioned alternate implementations may be used in any combination desired to form additional hybrid implementations of the present disclosure. For example, any of the functionality described with respect to a particular device or component may be performed by another device or component. Further, while specific device characteristics have been described, embodiments of the disclosure may relate to numerous other device characteristics. Further, although embodiments have been described in language specific to structural features and/or methodological acts, it is to be understood that the disclosure is not necessarily limited to the specific features or acts described. Rather, the specific features and acts are disclosed as illustrative forms of implementing the embodiments. 
Conditional language, such as, among others, "can," "could," "might," or "may," unless specifically stated otherwise, or otherwise understood within the context as used, is generally intended to convey that certain embodiments could include, while other embodiments may not include, certain features, elements, and/or steps. Thus, such conditional language is not generally intended to imply that features, elements, and/or steps are in any way required for one or more embodiments.
Claims (20)
1. A vehicle comprising:
a first remote control device configured to remotely control one or more functionalities of the vehicle;
at least one memory comprising computer-executable instructions; and
one or more computing processors configured to access the at least one memory and execute the computer-executable instructions to:
determine that the first remote control device is located at a first position within the vehicle, the first position associated with a first functionality of the vehicle;
establish a first communication connection between the first remote control device and the first functionality based at least in part on a first machine readable medium associated with the first position; and
cause a user interface on the first remote control device to present one or more settings associated with the first functionality.
2. The vehicle of claim 1, wherein establishing the first communication connection between the first remote control device and the first functionality based at least in part on the first machine readable medium associated with the first position comprises:
transmitting a signal indicative of the first position by the first machine readable medium located at the first position of the vehicle;
causing, based at least in part on the signal indicative of the first position, a communication between the first machine readable medium located at the first position and the first remote control device; and
establishing the first communication connection between the first remote control device and the first functionality of the vehicle based at least in part on the communication.
3. The vehicle of claim 1, wherein establishing the first communication connection between the first remote control device and the first functionality based at least in part on the first machine readable medium associated with the first position comprises:
receiving sensor data from one or more sensors of the first remote control device, the sensor data indicative of the first position;
causing, based at least in part on the sensor data indicative of the first position, a communication between the first machine readable medium of the first remote control device and the vehicle; and
establishing the first communication connection between the first remote control device and the first functionality based at least in part on the communication.
4. The vehicle of claim 1, wherein the one or more computing processors are further configured to access the at least one memory and execute the computer-executable instructions to:
determine a movement change of the first remote control device moving from the first position to a second position;
disestablish the first communication connection between the first remote control device and the first functionality based at least in part on the movement change;
establish a second communication connection between the first remote control device and a second functionality of the one or more functionalities based at least in part on a second machine readable medium associated with the second position; and
cause the user interface on the first remote control device to present one or more settings associated with the second functionality.
5. The vehicle of claim 1, further comprising a second remote control device, wherein the one or more computing processors are further configured to access the at least one memory and execute the computer-executable instructions to:
determine that the second remote control device is located at the first position of the vehicle;
establish a second communication connection between the second remote control device and the first functionality based at least in part on a second machine readable medium associated with the first position; and
cause a user interface on the second remote control device to present the one or more settings associated with the first functionality.
6. The vehicle of claim 1, further comprising a second remote control device, wherein the one or more computing processors are further configured to access the at least one memory and execute the computer-executable instructions to:
determine that the second remote control device is located at a second position of the vehicle;
establish a second communication connection between the second remote control device and a second functionality based at least in part on a second machine readable medium associated with the second position; and
cause a user interface on the second remote control device to present one or more settings associated with the second functionality.
7. The vehicle of claim 1, wherein the one or more computing processors are further configured to access the at least one memory and execute the computer-executable instructions to:
determine a movement change of the first remote control device moving from the first position to a second position; and
maintain the first communication connection at the second position.
8. The vehicle of claim 1, wherein the one or more computing processors are further configured to access the at least one memory and execute the computer-executable instructions to:
receive a user input based at least in part on one or more user interactions between the first remote control device and at least one passenger of the vehicle; and
modify the one or more settings associated with the first functionality based at least in part on the user input.
9. A remote control device comprising:
a touch display;
one or more sensors configured to generate sensor data;
one or more machine readable media configured to communicate with a vehicle based at least in part on the sensor data;
at least one memory comprising computer-executable instructions; and
one or more computing processors configured to access the at least one memory and execute the computer-executable instructions to:
receive the sensor data from the one or more sensors, the sensor data indicating that the remote control device is located at a first position within the vehicle;
cause, based at least in part on the sensor data indicative of the first position, a first communication connection between the one or more machine readable media of the remote control device and the vehicle;
establish a second communication connection between the remote control device and a first functionality of the vehicle based at least in part on the first communication connection, the first functionality associated with the first position; and
cause a user interface on the remote control device to present one or more settings associated with the first functionality.
10. The remote control device of claim 9, wherein the one or more computing processors are further configured to access the at least one memory and execute the computer-executable instructions to:
determine a movement change of the remote control device moving from the first position to a second position based at least in part on the sensor data;
disestablish the second communication connection between the remote control device and the first functionality of the vehicle based at least in part on the movement change;
cause, based at least in part on the sensor data indicative of the second position, a third communication connection between the one or more machine readable media of the remote control device and the vehicle;
establish a fourth communication connection between the remote control device and a second functionality of the vehicle based at least in part on the third communication connection; and
cause the user interface on the touch display to present one or more settings associated with the second functionality.
11. The remote control device of claim 9, wherein the one or more computing processors are further configured to access the at least one memory and execute the computer-executable instructions to:
determine a movement change of the remote control device moving from the first position to a second position; and
maintain the first communication connection at the second position.
12. The remote control device of claim 9, wherein the one or more computing processors are further configured to access the at least one memory and execute the computer-executable instructions to:
receive a user input based at least in part on one or more user interactions between the user interface and at least one passenger of the vehicle; and
modify the one or more settings associated with the first functionality based at least in part on the user input.
13. A system, comprising:
a vehicle;
a first remote control device configured to remotely control one or more functionalities of the vehicle;
at least one memory comprising computer-executable instructions; and
one or more computing processors configured to access the at least one memory and execute the computer-executable instructions to:
determine that the first remote control device is located at a first position within the vehicle, the first position associated with a first functionality;
establish a first communication connection between the first remote control device and the first functionality based at least in part on a first machine readable medium associated with the first position; and
cause a user interface on the first remote control device to present one or more settings associated with the first functionality.
14. The system of claim 13, wherein establishing the first communication connection between the first remote control device and the first functionality based at least in part on the first machine readable medium associated with the first position comprises:
transmitting a signal indicative of the first position by the first machine readable medium located at the first position of the vehicle;
causing, based at least in part on the signal indicative of the first position, a communication between the first machine readable medium located at the first position and the first remote control device; and
establishing the first communication connection between the first remote control device and the first functionality of the vehicle based at least in part on the communication.
15. The system of claim 13, wherein establishing the first communication connection between the first remote control device and the first functionality based at least in part on the first machine readable medium associated with the first position comprises:
receiving sensor data from one or more sensors of the first remote control device, the sensor data indicative of the first position;
causing, based at least in part on the sensor data indicative of the first position, a communication between the first machine readable medium of the first remote control device and the vehicle; and
establishing the first communication connection between the first remote control device and the first functionality based at least in part on the communication.
16. The system of claim 13, wherein the one or more computing processors are further configured to access the at least one memory and execute the computer-executable instructions to:
determine a movement change of the first remote control device moving from the first position to a second position;
disestablish the first communication connection between the first remote control device and the first functionality based at least in part on the movement change;
establish a second communication connection between the first remote control device and a second functionality of the one or more functionalities based at least in part on a second machine readable medium associated with the second position; and
cause the user interface on the first remote control device to present one or more settings associated with the second functionality.
17. The system of claim 13, further comprising a second remote control device, wherein the one or more computing processors are further configured to access the at least one memory and execute the computer-executable instructions to:
determine that the second remote control device is located at the first position of the vehicle;
establish a second communication connection between the second remote control device and the first functionality based at least in part on a second machine readable medium associated with the first position; and
cause a user interface on the second remote control device to present the one or more settings associated with the first functionality.
18. The system of claim 13, further comprising a second remote control device, wherein the one or more computing processors are further configured to access the at least one memory and execute the computer-executable instructions to:
determine that the second remote control device is located at a second position of the vehicle;
establish a second communication connection between the second remote control device and a second functionality based at least in part on a second machine readable medium associated with the second position; and
cause a user interface on the second remote control device to present one or more settings associated with the second functionality.
19. The system of claim 13, wherein the one or more computing processors are further configured to access the at least one memory and execute the computer-executable instructions to:
determine a movement change of the first remote control device moving from the first position to a second position; and
maintain the first communication connection at the second position.
20. The system of claim 13, wherein the one or more computing processors are further configured to access the at least one memory and execute the computer-executable instructions to:
receive a user input based at least in part on one or more user interactions between the first remote control device and at least one passenger of the vehicle; and
modify the one or more settings associated with the first functionality based at least in part on the user input.
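The position-based pairing flow recited in claims 1, 4, and 10 can be illustrated with a short simulation. This sketch is purely illustrative and not part of the claims; every class, method, and functionality name below (Vehicle, RemoteControlDevice, seat_heating, window_shade) is a hypothetical stand-in, assuming each position's machine readable medium maps to one functionality and its presentable settings.

```python
# Illustrative sketch (not part of the claims): a remote control device pairs
# with the functionality associated with its current position in the vehicle.
# All names are hypothetical.

class Vehicle:
    def __init__(self):
        # Each position's machine readable medium is modeled as a mapping from
        # position to (functionality, settings presented on the device UI).
        self.position_media = {
            "rear-left": ("seat_heating", {"level": 0}),
            "rear-right": ("window_shade", {"opacity": 100}),
        }

class RemoteControlDevice:
    def __init__(self, vehicle):
        self.vehicle = vehicle
        self.connected = None  # currently connected functionality, if any

    def move_to(self, position):
        # Per claims 4 and 10: a movement change first disestablishes the
        # existing connection before a new one is established.
        self.connected = None
        entry = self.vehicle.position_media.get(position)
        if entry is None:
            return None  # no machine readable medium at this position
        functionality, settings = entry
        self.connected = functionality  # establish the new connection
        return settings  # per claim 1: the UI presents these settings

device = RemoteControlDevice(Vehicle())
print(device.move_to("rear-left"))   # {'level': 0}
print(device.connected)              # seat_heating
print(device.move_to("rear-right"))  # {'opacity': 100}
```

Note that claims 7, 11, and 19 instead recite maintaining the first connection after a movement change; that alternative would simply skip the disestablish step above.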
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16/589,771 US10970998B1 (en) | 2019-10-01 | 2019-10-01 | Systems, methods, and devices for remotely controlling functionalities of vehicles |
DE102020125763.5A DE102020125763A1 (en) | 2019-10-01 | 2020-10-01 | SYSTEMS, PROCEDURES AND DEVICES FOR REMOTE CONTROL OF FUNCTIONALITIES OF VEHICLES |
CN202011072613.8A CN112601189A (en) | 2019-10-01 | 2020-10-09 | System, method and apparatus for remotely controlling vehicle functionality |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16/589,771 US10970998B1 (en) | 2019-10-01 | 2019-10-01 | Systems, methods, and devices for remotely controlling functionalities of vehicles |
Publications (2)
Publication Number | Publication Date |
---|---|
US20210097851A1 true US20210097851A1 (en) | 2021-04-01 |
US10970998B1 US10970998B1 (en) | 2021-04-06 |
Family
ID=74872711
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/589,771 Active US10970998B1 (en) | 2019-10-01 | 2019-10-01 | Systems, methods, and devices for remotely controlling functionalities of vehicles |
Country Status (3)
Country | Link |
---|---|
US (1) | US10970998B1 (en) |
CN (1) | CN112601189A (en) |
DE (1) | DE102020125763A1 (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20220381081A1 (en) * | 2021-05-31 | 2022-12-01 | Honda Motor Co., Ltd. | Mobile body control system, mobile body control method, and recording medium |
US20240153718A1 (en) * | 2022-11-07 | 2024-05-09 | Pass & Seymour, Inc. | Switch assembly with pivotable switch cover |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11472558B2 (en) | 2018-11-13 | 2022-10-18 | Textron Innovations, Inc. | Aircraft seat |
US12061750B1 (en) | 2023-02-08 | 2024-08-13 | Ford Global Technologies, Llc | Modular vehicle HMI |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9530265B2 (en) * | 2014-05-14 | 2016-12-27 | Lg Electronics Inc. | Mobile terminal and vehicle control |
- 2019-10-01: US application US16/589,771 filed; granted as US10970998B1 (Active)
- 2020-10-01: DE application DE102020125763.5A filed (Pending)
- 2020-10-09: CN application CN202011072613.8A filed (Pending)
Also Published As
Publication number | Publication date |
---|---|
DE102020125763A1 (en) | 2021-04-01 |
US10970998B1 (en) | 2021-04-06 |
CN112601189A (en) | 2021-04-02 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10970998B1 (en) | Systems, methods, and devices for remotely controlling functionalities of vehicles | |
US11243613B2 (en) | Smart tutorial for gesture control system | |
EP3712001B1 (en) | Method and device for controlling display on basis of driving context | |
EP3346364A1 (en) | Adaptive polyhedral display device | |
US9493125B2 (en) | Apparatus and method for controlling of vehicle using wearable device | |
KR101927170B1 (en) | System and method for vehicular and mobile communication device connectivity | |
IL262373A (en) | System and method for promoting connectivity between a mobile communication device and a vehicle touch screen | |
CN107117114A (en) | In-car add-on module is integrated to driver's user interface | |
EP3714356B1 (en) | Contextual and aware button-free screen articulation | |
JP2015039160A (en) | Control device and control method for controlling vehicle function | |
US9936065B2 (en) | Selectively limiting a non-vehicle user input source of a handheld mobile device | |
KR102686009B1 (en) | Terminal device, vehicle having the same and method for controlling the same | |
KR102390623B1 (en) | Vehicle display device for controlling a vehicle interface using a portable terminal and method for controlling the same | |
JP2017215949A (en) | Intelligent tutorial for gesture | |
US10369943B2 (en) | In-vehicle infotainment control systems and methods | |
EP3126934B1 (en) | Systems and methods for the detection of implicit gestures | |
WO2021019876A1 (en) | Information processing device, driver specifying device, and learning model | |
CN107831825B (en) | Flexible modular screen apparatus for mounting to a participating vehicle and transferring user profiles therebetween | |
US20190286282A1 (en) | Methods and systems for adjusting a layout of applications displayed on a screen | |
US11661007B2 (en) | Wireless mirror control device | |
KR20140128807A (en) | An method for configuring of a vehichle and an appratus using it | |
KR102246498B1 (en) | Cockpit module for autonomous vehicle and method for controlling display module position | |
CN114115529A (en) | Sensing method and device, storage medium and vehicle | |
KR20140128808A (en) | An method for configuring of a vehichle and an appratus using it |
Legal Events

Date | Code | Title | Description
---|---|---|---
| FEPP | Fee payment procedure | Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY
| STCF | Information on status: patent grant | Free format text: PATENTED CASE
| MAFP | Maintenance fee payment | Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY. Year of fee payment: 4