WO2023235434A2 - Synchronized Rendering - Google Patents

Synchronized Rendering

Info

Publication number
WO2023235434A2
Authority
WO
WIPO (PCT)
Prior art keywords
frame
layout
vehicle
examples
content
Prior art date
Application number
PCT/US2023/024064
Other languages
English (en)
Other versions
WO2023235434A3 (fr)
Inventor
Vikrant Kasarabada
Andre M. BOULE
Bartosz Ciechanowski
Eldad Eilam
Michael L. Knippers
Sylvain P. Rebaud
Gennadiy Shekhtman
Mark J. VAN BELLEGHEM
Christopher R. Whitney
Francesco ZULIANI
Richard R. Dellinger
Emily C. Schubert
Stephen Chick
Tanya G. KANCHEVA
Original Assignee
Apple Inc.
Priority date
Filing date
Publication date
Priority claimed from US17/952,143 (published as US20230391190A1)
Application filed by Apple Inc.
Publication of WO2023235434A2
Publication of WO2023235434A3

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14Digital output to display device ; Cooperation and interconnection of the display device with other functional units
    • G06F3/1454Digital output to display device ; Cooperation and interconnection of the display device with other functional units involving copying of the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/14Display of multiple viewports
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/36Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/414Specialised client platforms, e.g. receiver in car or embedded in a mobile appliance
    • H04N21/41422Specialised client platforms, e.g. receiver in car or embedded in a mobile appliance located in transportation means, e.g. personal vehicle
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2340/00Aspects of display data processing
    • G09G2340/10Mixing of images, i.e. displayed pixel being the result of an operation, e.g. adding, on the corresponding input pixels
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2354/00Aspects of interface with display user
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2370/00Aspects of data communication
    • G09G2370/04Exchange of auxiliary data, i.e. other than image data, between monitor and graphics controller
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2370/00Aspects of data communication
    • G09G2370/04Exchange of auxiliary data, i.e. other than image data, between monitor and graphics controller
    • G09G2370/042Exchange of auxiliary data, i.e. other than image data, between monitor and graphics controller for monitor identification
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2370/00Aspects of data communication
    • G09G2370/20Details of the management of multiple sources of image data
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2380/00Specific applications
    • G09G2380/10Automotive applications
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/431Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H04N21/4312Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations

Definitions

  • FIG. 1 is a block diagram illustrating a compute system.
  • FIG. 2 is a block diagram illustrating a device with interconnected subsystems.
  • FIG. 3 is a block diagram illustrating a vehicle connected to a user device via a transport.
  • FIGS. 4A-4H are block diagrams illustrating content being displayed on a display of a vehicle.
  • FIGS. 5A-5H are flow diagrams illustrating different operations performed by a vehicle and a user device.
  • FIG. 6 is a flow diagram illustrating a method for establishing a layout on multiple devices for synchronized rendering.
  • FIG. 7 is a flow diagram illustrating a method for time-based rendering synchronization.
  • FIG. 8 is a flow diagram illustrating a method for controlling rendering by another device.
  • FIG. 9 is a flow diagram illustrating a method for rendering an animation across multiple devices.
  • FIG. 10 is a flow diagram illustrating a method for customizing vehicle controls when connecting to a user device.
  • FIG. 11 is a flow diagram illustrating a method for changing layouts used during synchronized rendering in case of a connection loss.
  • a system or computer readable medium contains instructions for performing the contingent operations based on the satisfaction of the corresponding one or more conditions and thus is capable of determining whether the contingency has or has not been satisfied without explicitly repeating steps of a method until all of the conditions upon which steps in the method are contingent have been met.
  • a system or computer readable storage medium can repeat the steps of a method as many times as are needed to ensure that all of the contingent steps have been performed.
  • although the description uses the terms “first,” “second,” etc. to describe various elements, these elements should not be limited by the terms. In some examples, these terms are used to distinguish one element from another. For example, a first device could be termed a second device, and, similarly, a second device could be termed a first device, without departing from the scope of the various described examples. In some examples, the first device and the second device are two separate references to the same device. In some examples, the first device and the second device are both devices, but they are not the same device or the same type of device.
  • the term “if” is, optionally, construed to mean “when” or “upon” or “in response to determining” or “in response to detecting,” depending on the context.
  • the phrase “if it is determined” or “if [a stated condition or event] is detected” is, optionally, construed to mean “upon determining” or “in response to determining” or “upon detecting [the stated condition or event]” or “in response to detecting [the stated condition or event],” depending on the context.
  • Compute system 100 is a non-limiting example of a compute system that may be used to perform functionality described herein. It should be recognized that other computer architectures of a compute system may be used to perform functionality described herein.
  • compute system 100 includes processor subsystem 110 coupled (e.g., wired or wirelessly) to memory 120 (e.g., a system memory) and I/O interface 130 via interconnect 150 (e.g., a system bus, one or more memory locations, or other communication channel for connecting multiple components of compute system 100).
  • I/O interface 130 is coupled (e.g., wired or wirelessly) to I/O device 140.
  • I/O interface 130 is included with I/O device 140 such that the two are a single component. It should be recognized that there may be one or more I/O interfaces, with each I/O interface coupled to one or more I/O devices.
  • multiple instances of processor subsystem 110 may be coupled to interconnect 150.
  • Compute system 100 may be any of various types of devices, including, but not limited to, a system on a chip, a server system, a personal computer system (e.g., an iPhone, iPad, or MacBook), a sensor, or the like.
  • compute system 100 is included with or coupled to a physical component for the purpose of modifying the physical component in response to an instruction (e.g., compute system 100 receives an instruction to modify a physical component and, in response to the instruction, causes the physical component to be modified (e.g., through an actuator)).
  • Examples of such physical components include an acceleration control, a brake, a gear box, a motor, a pump, a refrigeration system, a suspension system, a steering control, a vacuum system, and a valve.
  • a sensor includes one or more hardware components that detect information about a physical environment in proximity to (e.g., surrounding) the sensor.
  • a hardware component of a sensor includes a sensing component (e.g., an image sensor or temperature sensor), a transmitting component (e.g., a laser or radio transmitter), a receiving component (e.g., a laser or radio receiver), or any combination thereof.
  • sensors include an angle sensor, a chemical sensor, a brake pressure sensor, a contact sensor, a non-contact sensor, an electrical sensor, a flow sensor, a force sensor, a gas sensor, a humidity sensor, an image sensor (e.g., a camera), an inertial measurement unit, a leak sensor, a level sensor, a light detection and ranging system, a metal sensor, a motion sensor, a particle sensor, a photoelectric sensor, a position sensor (e.g., a global positioning system), a precipitation sensor, a pressure sensor, a proximity sensor, a radio detection and ranging system, a radiation sensor, a speed sensor (e.g., measures the speed of an object), a temperature sensor, a time-of-flight sensor, a torque sensor, and an ultrasonic sensor.
  • compute system 100 may also be implemented as two or more compute systems operating together.
  • processor subsystem 110 includes one or more processors or processing units configured to execute program instructions to perform functionality described herein.
  • processor subsystem 110 may execute an operating system, a middleware system, one or more applications, or any combination thereof.
  • the operating system manages resources of compute system 100.
  • Examples of types of operating systems covered herein include batch operating systems (e.g., Multiple Virtual Storage (MVS)), time-sharing operating systems (e.g., Unix), distributed operating systems (e.g., Advanced Interactive executive (AIX)), network operating systems (e.g., Microsoft Windows Server), and real-time operating systems (e.g., QNX).
  • the operating system includes various procedures, sets of instructions, software components, and/or drivers for controlling and managing general system tasks (e.g., memory management, storage device control, power management, or the like) and for facilitating communication between various hardware and software components.
  • the operating system uses a priority-based scheduler that assigns a priority to different tasks that are to be executed by processor subsystem 110.
  • the priority assigned to a task is used to identify a next task to execute.
  • the priority-based scheduler identifies a next task to execute when a previous task finishes executing (e.g., the highest priority task runs to completion unless another higher priority task is made ready).
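The run-to-completion behavior described above can be made concrete with a short sketch. The following Swift code is illustrative only; the type and method names (ScheduledTask, PriorityScheduler) are assumptions, not anything defined by this application.

```swift
// Minimal sketch of the priority-based scheduling behavior described above.
struct ScheduledTask {
    let name: String
    let priority: Int          // higher value = higher priority
    let work: () -> Void
}

final class PriorityScheduler {
    private var ready: [ScheduledTask] = []

    func submit(_ task: ScheduledTask) {
        ready.append(task)
    }

    // Each task runs to completion; only when it finishes is the next
    // (highest-priority) ready task selected, mirroring the text above.
    func run() {
        while !ready.isEmpty {
            ready.sort { $0.priority > $1.priority }
            let next = ready.removeFirst()
            next.work()
        }
    }
}
```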
  • the middleware system provides one or more services and/or capabilities to applications (e.g., the one or more applications running on processor subsystem 110) outside of what is offered by the operating system (e.g., data management, application services, messaging, authentication, API management, or the like).
  • the middleware system is designed for a heterogeneous computer cluster, to provide hardware abstraction, low-level device control, implementation of commonly used functionality, message-passing between processes, package management, or any combination thereof. Examples of middleware systems include Lightweight Communications and Marshalling (LCM), PX4, Robot Operating System (ROS), and ZeroMQ.
  • the middleware system represents processes and/or operations using a graph architecture, where processing takes place in nodes that may receive, post, and multiplex sensor data, control, state, planning, actuator, and other messages.
  • in some examples, an application (e.g., an application executing on processor subsystem 110, as described above) is represented as one or more nodes in the graph architecture.
  • a message sent from a first node in a graph architecture to a second node in the graph architecture is performed using a publish-subscribe model, where the first node publishes data on a channel in which the second node is able to subscribe.
  • the first node may store data in memory (e.g., memory 120 or some local memory of processor subsystem 110) and notify the second node that the data has been stored in the memory.
  • the first node notifies the second node that the data has been stored in the memory by sending a pointer (e.g., a memory pointer, such as an identification of a memory location) to the second node so that the second node can access the data from where the first node stored the data.
  • the first node would send the data directly to the second node so that the second node would not need to access a memory based on data received from the first node.
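The pointer-passing variant of the publish-subscribe model described above can be sketched as follows. This is a minimal illustration under assumed names (Channel, publish, subscribe); an array index stands in for a memory pointer.

```swift
// Sketch: a node stores data in shared memory and notifies subscribers
// with only a reference to where the data was stored.
final class Channel<Message> {
    private var storage: [Message] = []           // stands in for shared memory
    private var subscribers: [(Int) -> Void] = [] // each notified with a "pointer"

    func subscribe(_ onNotify: @escaping (Int) -> Void) {
        subscribers.append(onNotify)
    }

    // Publish stores the message, then sends subscribers only the location.
    func publish(_ message: Message) {
        storage.append(message)
        let location = storage.count - 1
        subscribers.forEach { $0(location) }
    }

    // Subscribers dereference the location to read the stored data.
    func read(at location: Int) -> Message {
        storage[location]
    }
}

// Usage: the second node fetches the data from where the first node stored it.
let channel = Channel<String>()
channel.subscribe { location in
    print("received:", channel.read(at: location))
}
channel.publish("sensor frame 42")
```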
  • Memory 120 may include a computer readable medium (e.g., non-transitory or transitory computer readable medium) usable to store program instructions executable by processor subsystem 110 to cause compute system 100 to perform various operations described herein.
  • memory 120 may store program instructions to implement the functionality associated with any or all of the flows described in FIGS. 4A-4H, FIGS. 5A-5H, and FIGS. 6-11.
  • Memory 120 may be implemented using different physical, non-transitory memory media, such as hard disk storage, floppy disk storage, removable disk storage, flash memory, random access memory (RAM, such as SRAM, EDO RAM, SDRAM, DDR SDRAM, RAMBUS RAM, or the like), read only memory (PROM, EEPROM, or the like), or the like.
  • Memory in compute system 100 is not limited to primary storage such as memory 120. Rather, compute system 100 may also include other forms of storage such as cache memory in processor subsystem 110 and secondary storage on I/O device 140 (e.g., a hard drive, storage array, etc.). In some examples, these other forms of storage may also store program instructions executable by processor subsystem 110 to perform operations described herein.
  • processor subsystem 110 (or each processor within processor subsystem 110) contains a cache or other form of on-board memory.
  • I/O interface 130 may be any of various types of interfaces configured to couple to and communicate with other devices.
  • I/O interface 130 includes a bridge chip (e.g., Southbridge) from a front-side bus to one or more back-side buses.
  • I/O interface 130 may be coupled to one or more I/O devices (e.g., I/O device 140) via one or more corresponding buses or other interfaces.
  • Examples of I/O devices include storage devices (a hard drive, optical drive, removable flash drive, storage array, SAN, or their associated controller), network interface devices (e.g., to a local or wide-area network), sensor devices (e.g., camera, radar, LiDAR, ultrasonic sensor, GPS, inertial measurement device, or the like), and auditory or visual output devices (e.g., speaker, light, screen, projector, or the like).
  • compute system 100 is coupled to a network via a network interface device (e.g., configured to communicate over Wi-Fi, Bluetooth, Ethernet, or the like).
  • FIG. 2 depicts a block diagram of device 200 with interconnected subsystems.
  • device 200 includes three different subsystems (i.e., first subsystem 210, second subsystem 220, and third subsystem 230) coupled (e.g., wired or wirelessly) to each other.
  • An example of a possible computer architecture of a subsystem as included in FIG. 2 is described in FIG. 1 (i.e., compute system 100).
  • device 200 may include more or fewer subsystems.
  • some subsystems are not connected to another subsystem (e.g., first subsystem 210 may be connected to second subsystem 220 and third subsystem 230 but second subsystem 220 may not be connected to third subsystem 230).
  • some subsystems are connected via one or more wires while other subsystems are wirelessly connected.
  • one or more subsystems are wirelessly connected to one or more compute systems outside of device 200, such as a server system. In such examples, the subsystem may be configured to communicate wirelessly to the one or more compute systems outside of device 200.
  • device 200 includes a housing that fully or partially encloses subsystems 210-230.
  • Examples of device 200 include a home-appliance device (e.g., a refrigerator or an air conditioning system), a robot (e.g., a robotic arm or a robotic vacuum), and a vehicle.
  • device 200 is configured to navigate device 200 (with or without direct user input) in a physical environment.
  • one or more subsystems of device 200 are used to control, manage, and/or receive data from one or more other subsystems of device 200 and/or one or more compute systems remote from device 200.
  • first subsystem 210 and second subsystem 220 may each be a camera that is capturing images for third subsystem 230 to use to make a decision.
  • at least a portion of device 200 functions as a distributed compute system. For example, a task may be split into different portions, where a first portion is executed by first subsystem 210 and a second portion is executed by second subsystem 220.
  • FIG. 3 is a block diagram illustrating vehicle 302 connected to user device 320 via transport 330.
  • Such a configuration may allow for a unified experience, bringing together user interface elements from both vehicle 302 and user device 320.
  • user device 320 may be able to drive an experience through and integrate with vehicle 302.
  • vehicle 302 includes vehicle process 304, vehicle renderer 306, integration process 308, integration renderer 310, output device 312, vehicle sensor 314, and virtual assistant subsystem 316.
  • the components of vehicle 302 are meant for explanatory purposes and not intended to be limiting.
  • Vehicle 302 may include more or fewer components, including the combination of depicted components or other components described for compute system 100 or device 200.
  • vehicle 302 includes vehicle process 304.
  • vehicle process 304 is a software program (e.g., one or more instructions executing by one or more processors) of vehicle 302 that is configured to manage operations performed by vehicle 302.
  • vehicle process 304 may be isolated from one or more other processes of vehicle 302 (e.g., integration process 308) such that at least some of its associated memory may only be accessed by vehicle process 304 and communications to and/or from vehicle process 304 are through a structured process of interfaces (e.g., application programming interfaces (APIs)) defined for vehicle process 304.
  • Vehicle 302 further includes vehicle renderer 306.
  • vehicle renderer 306 is any hardware or software of vehicle 302 used to generate (sometimes referred to as render) visual content (e.g., an image (sometimes referred to as a frame) or a video) from a model and/or one or more instructions.
  • vehicle renderer 306 may be configured to only be used by vehicle process 304 to generate visual content from data detected and/or determined by vehicle 302.
  • vehicle renderer 306 is configured to render content associated with an ecosystem of vehicle 302, such as content only stored locally by vehicle 302.
  • vehicle renderer 306 may render content associated with a first set of vehicle instruments (e.g., a speed of the vehicle in a heads-up display).
  • the first set of vehicle instruments may be those that do not interact with content rendered remote from vehicle renderer 306 (e.g., remote from vehicle 302), such as content that is visually independent and always appears in a fixed position (e.g., turn signal indicators and check engine indicator).
  • vehicle renderer 306 renders content from processes executing on vehicle 302, such as a driver assistance system of vehicle 302 (e.g., a video from a backup camera).
  • Vehicle 302 further includes integration process 308.
  • integration process 308 is a software program (e.g., one or more instructions executing by one or more processors) of vehicle 302 that is configured to manage operations based on data received from devices separate from vehicle 302 (e.g., user device 320).
  • integration process 308 may be isolated from one or more other processes of vehicle 302 (e.g., vehicle process 304) such that at least some of its associated memory may only be accessed by integration process 308 and communications to and/or from integration process 308 are through a structured process of interfaces (e.g., APIs) defined for integration process 308.
  • Vehicle 302 further includes integration renderer 310.
  • integration renderer 310 is any hardware or software of vehicle 302 used to generate (sometimes referred to as render) visual content (e.g., an image or a video) from a model and/or one or more instructions.
  • integration renderer 310 may be configured to be used by integration process 308 to generate and/or combine visual content from (1) data detected, determined, and/or generated by vehicle 302 (e.g., by vehicle renderer 306 or integration renderer 310), (2) data detected by, determined by, and/or received from user device 320 (e.g., by user device renderer 322), or (3) any combination thereof.
  • integration renderer 310 renders content associated with a second set of vehicle instruments, different from the first set of vehicle instruments rendered by vehicle renderer 306.
  • the second set of vehicle instruments may be those that interact with content rendered remote from vehicle renderer 306 (e.g., remote from vehicle 302), such as content that is visually integrated or closely associated with content rendered by user device renderer 322 (e.g., a speedometer, a gear position, or a cruise control indicator in a main display of vehicle 302).
  • integration renderer 310 renders notifications received from processes executing on vehicle 302 (e.g., vehicle process 304); such notifications may be a first set (e.g., a first type) of notifications associated with vehicle 302 (e.g., check control messages).
  • vehicle 302 includes a system for verifying information included with content not rendered by vehicle renderer 306 (e.g., content rendered by integration renderer 310 or user device renderer 322) to make sure what is to be displayed is correct.
  • the system may compare one or more values included in such content with data detected by a sensor of vehicle 302 (e.g., vehicle sensor 314).
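As a rough illustration of the verification idea above, the following sketch compares a value carried with incoming content against the vehicle's own sensor reading before allowing display. All names and the tolerance value are assumptions, not anything specified by this application.

```swift
// Hedged sketch: reject content whose embedded value disagrees with
// what the vehicle's sensor actually reports.
struct IncomingContent {
    let claimedSpeedMPH: Double   // value carried with the rendered content
}

func verify(_ content: IncomingContent,
            sensorSpeedMPH: Double,
            tolerance: Double = 1.0) -> Bool {
    // Display only if the claimed value matches the sensor within tolerance.
    abs(content.claimedSpeedMPH - sensorSpeedMPH) <= tolerance
}
```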
  • Vehicle 302 further includes output device 312.
  • output device 312 is any hardware or software of vehicle 302 used to output (e.g., send, display, emit, or produce) data (e.g., visual, audio, or haptic) from vehicle 302.
  • Examples of output device 312 include a display screen, a touch-sensitive surface, a projector, and a speaker.
  • output device 312 is a display screen that displays content rendered by each of vehicle renderer 306, integration renderer 310, and user device renderer 322.
  • Vehicle 302 further includes vehicle sensor 314.
  • vehicle sensor 314 is any hardware or software of vehicle 302 used to detect data about a physical environment in proximity to (e.g., surrounding) vehicle sensor 314, similar to as discussed above for compute system 100. Examples of vehicle sensor 314 include a rotary knob, a steering wheel button, a touch-sensitive surface, a camera, a microphone, and any other sensor discussed with respect to compute system 100.
  • vehicle sensor 314 detects user input. In such examples, user input detected by vehicle sensor 314 is sent to vehicle process 304 and/or integration process 308, as further discussed below.
  • the user input may be sent to vehicle process 304 when the user input corresponds to content rendered by vehicle renderer 306 or relates to a process of vehicle 302 (e.g., cruise control, driver assistance system, or volume control).
  • vehicle process 304 may determine the result of the user input and instruct a change in display through vehicle renderer 306 or integration process 308.
  • the user input may not be sent to integration process 308 and instead vehicle process 304 notifies integration process 308 of any state (e.g., display) changes resulting from the user input being detected.
  • the user input may be sent to integration process 308 when the user input relates to content rendered by integration renderer 310 or user device renderer 322 (e.g., voice recognition activation, instrument cluster user interface controls, media, and actions related to a telephone call).
  • integration process 308 may send the user input to (1) user device 320 to determine how to respond to the user input or (2) vehicle process 304.
  • the user input is not sent to vehicle process 304 at all when the user input is sent to integration process 308, and any state changes resulting from the user input are also not sent to vehicle process 304.
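The routing of user input between vehicle process 304 and integration process 308 described above might be modeled as follows. The input categories come from the examples in the preceding paragraphs; the Swift names are illustrative assumptions.

```swift
// Illustrative routing of user input between the two processes.
enum InputTarget { case vehicleProcess, integrationProcess }

enum InputKind {
    case cruiseControl, volume, driverAssistance          // vehicle-owned
    case voiceActivation, clusterUI, media, telephony     // integration-owned
}

func route(_ kind: InputKind) -> InputTarget {
    switch kind {
    case .cruiseControl, .volume, .driverAssistance:
        return .vehicleProcess
    case .voiceActivation, .clusterUI, .media, .telephony:
        return .integrationProcess
    }
}
```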
  • Vehicle 302 further includes virtual assistant subsystem 316 (sometimes referred to as an artificial intelligence assistant or digital assistant).
  • virtual assistant subsystem 316 is a software program that performs one or more operations based on data detected by a sensor, such as a natural language voice command detected by a microphone of vehicle 302 or user device 320.
  • vehicle 302 may include the software program (i.e., the software program is executing on one or more processors of vehicle 302) or an interface to the software program (e.g., the interface allows for communication with one or more remote devices executing the software program).
  • vehicle 302 does not include virtual assistant subsystem 316.
  • audio detected by a microphone of vehicle 302 may be sent, or transcribed and sent, to user device 320 to be handled by a virtual assistant subsystem (e.g., virtual assistant subsystem 326).
  • user device 320 is depicted as including user device renderer 322, user device sensor 324, and virtual assistant subsystem 326.
  • the components of user device 320 are meant for explanatory purposes and not intended to be limiting.
  • User device 320 may include more or fewer components, including the combination of depicted components or other components described for compute system 100 or device 200.
  • user device 320 includes user device renderer 322.
  • user device renderer 322 is any hardware or software of user device 320 used to generate (sometimes referred to as render) visual content (e.g., an image or a video) from a model and/or one or more instructions.
  • user device renderer 322 may be configured to generate visual content from data detected and/or determined by vehicle 302, user device 320, or any combination thereof for display by vehicle 302 or user device 320.
  • User device renderer 322 may also be configured to generate visual content for display by user device 320 and not vehicle 302.
  • user device renderer 322 renders content associated with applications executing on user device 320 (e.g., a map from a maps application for a main display of vehicle 302 or map routing instructions for a heads-up display of vehicle 302).
  • user device renderer 322 renders a third set (e.g., a different type) of vehicle instruments (different from the first set rendered by vehicle renderer 306 and the second set rendered by integration renderer 310).
  • user device renderer 322 renders notifications associated with user device 320 (such as notifications issued by an operating system of user device 320 or applications executing on user device 320) and a second set of notifications associated with vehicle 302 (different from the first set of notifications rendered by integration renderer 310, such as notifications received by user device 320 from vehicle 302 (e.g., low tire pressure)).
  • a notification received by user device 320 from vehicle 302 includes content for user device 320 to use when rendering a representation of the notification (e.g., a notification message, an icon, and optional parameters that may be associated with a notification, such as a format for presenting a number of miles (“%d miles”)).
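A notification payload of the kind described above, with an optional format parameter filled in at render time, might look like the following sketch. The struct and field names are assumptions for illustration.

```swift
import Foundation

// Sketch of a vehicle-to-user-device notification payload.
struct VehicleNotification {
    let message: String          // e.g., "Low tire pressure"
    let iconName: String         // e.g., "tire.warning"
    let distanceFormat: String?  // e.g., "%d miles"
}

// The user device fills in the format parameter when rendering.
func renderedText(for note: VehicleNotification, milesRemaining: Int?) -> String {
    guard let format = note.distanceFormat, let miles = milesRemaining else {
        return note.message
    }
    return note.message + " - " + String(format: format, miles)
}

// e.g., renderedText(for: lowTirePressure, milesRemaining: 12)
// -> "Low tire pressure - 12 miles"
```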
  • User device 320 further includes user device sensor 324.
  • user device sensor 324 is any hardware or software of user device 320 used to detect data about a physical environment in proximity to (e.g., surrounding) user device sensor 324, similar to as discussed above for compute system 100. Examples of user device sensor 324 include a touch-sensitive surface, a camera, a microphone, and any other sensor discussed with respect to compute system 100.
  • user device sensor 324 detects user input. In such examples, user input detected by user device sensor 324 is received by a process executing on one or more processors of user device 320 that determines an operation to perform, such as what content to render and send for display by vehicle 302.
  • User device 320 further includes virtual assistant subsystem 326 (sometimes referred to as an artificial intelligence assistant or digital assistant).
  • virtual assistant subsystem 326 is a software program that performs one or more operations based on data detected by a sensor, such as a natural language voice command detected by a microphone of user device 320 or vehicle 302.
  • user device 320 may include the software program (i.e., the software program is executing on one or more processors of user device 320) or an interface to the software program (e.g., the interface allows for communication with one or more remote devices executing the software program).
  • virtual assistant subsystem 326 receives audio and/or transcribed content from vehicle 302 to act upon, such as when vehicle 302 does not include a virtual assistant subsystem (e.g., virtual assistant subsystem 316).
  • virtual assistant subsystem 326 of user device 320 works in tandem (e.g., in concert or together) with virtual assistant subsystem 316 of vehicle 302 such that some operations are handled by virtual assistant subsystem 316 of vehicle 302 (e.g., operations based on data from vehicle 302, operations that are more time-sensitive, or operations that require less processing) and other operations or portions of operations are handled by virtual assistant subsystem 326 of user device 320 (such as operations based on data from user device 320 or data from a device connected to user device 320 other than vehicle 302).
  • transport 330 is a communication channel between two or more devices to convey data between the devices.
  • Examples of transport 330 include wired channels (e.g., physically connected via one or more cables, such as a USB or Lightning cable) and wireless channels (e.g., an Internet connection, a WiFi connection, a cellular connection, short-range communication, a radio signal, or any other wireless data connection or network) that connect, for example, vehicle 302 and user device 320.
  • Transport 330 may enable (1) vehicle 302 to communicate information to user device 320 to be used by user device 320 to render content and (2) user device 320 to communicate such rendered content, information, layout packages, or other data to vehicle 302.
  • a first communication channel may stream content (e.g., images or video) from user device 320 to be displayed by vehicle 302 (e.g., the content is encrypted by user device 320 and decrypted by vehicle 302); a second communication channel may send metadata and/or control information related to the streaming content (in some examples, the metadata and/or control information is sent via the first communication channel, embedded in or along with the content); a third communication channel may send vehicle information to user device 320 (e.g., vehicle information related to data detected by a sensor of vehicle 302); and a fourth communication channel may send data and information to set up vehicle 302 for displaying content received from user device 320 (e.g., a layout package with layouts used by user device 320 and rendered content that is preinstalled on vehicle 302 and may be modified by vehicle 302 when needed to be displayed by vehicle 302).
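The four communication channels described above can be summarized in a small model. The enum cases map one-to-one onto the channels listed in the preceding paragraph; the names and the stub send function are assumptions.

```swift
// Minimal model of the four logical transport channels described above.
enum TransportChannel {
    case contentStream        // encrypted frames from user device to vehicle
    case contentMetadata      // metadata/control info for the streamed content
    case vehicleInformation   // sensor-derived data from vehicle to user device
    case setup                // layout packages and preinstalled rendered content
}

func send(_ bytes: [UInt8], over channel: TransportChannel) {
    // A real transport would encrypt, frame, and transmit here; this stub
    // only records which logical channel the payload belongs to.
    print("sending \(bytes.count) bytes over \(channel)")
}
```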
  • vehicle 302 is paired to user device 320 via transport 330.
  • vehicle 302 may be paired to user device 320 when establishing a key on user device 320 to control (e.g., unlock, lock, or start) vehicle 302.
  • vehicle 302 can be paired to user device 320 in any suitable manner.
  • the pairing may be performed before establishing the key and the key is established in response to the pairing.
  • the pairing may be performed without or after establishing the key on user device 320, such as when the key is established through a connection between user device 320 and a device other than vehicle 302.
  • establishing the key includes a pairing process that is different from a pairing process for the integration features described herein.
  • the two pairing processes are used to establish secure communications between vehicle 302 and user device 320 using different key material and may be performed in any order (e.g., key pairing may occur before integration pairing).
  • the key pairing and the integration pairing are included in a single pairing.
  • vehicle 302 is paired to user device 320 without establishing a key on user device 320.
  • the pairing may allow for vehicle 302 to identify user device 320 before establishing a wireless connection between the two devices (e.g., through a Bluetooth beacon, through a key fob, or some data transmitted by user device 320 before establishing a wireless connection with vehicle 302).
  • vehicle 302 may display content either (1) received by user device 320 during a previous connection or (2) based on instructions received by user device 320 during a previous connection.
  • vehicle 302 defaults to a particular frame and/or layout based on a previous connection.
  • vehicle 302 prioritizes establishing a first connection with a first wireless technology (e.g., Bluetooth) so that communication may occur more quickly and then uses the first connection to establish a second connection with a second wireless technology (e.g., WiFi) to increase bandwidth for communicating.
  • the second wireless technology may have more bandwidth and/or use more power than the first wireless technology.
  • vehicle 302 may perform one or more operations using the first connection, before or while the second connection is established; for example, vehicle 302 may receive an instruction from user device 320 through the first connection to display content already stored and/or rendered by vehicle 302 and/or to start an engine of vehicle 302 when detecting that a door has opened or closed.
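A minimal sketch of the two-stage connection strategy described above, with the pairing and handshake details deliberately stubbed out; all types are illustrative assumptions.

```swift
// Establish a quick, low-bandwidth link first, then use it to bootstrap
// a higher-bandwidth link for content streaming.
enum Link { case bluetooth, wifi }

struct Connection {
    let link: Link
}

func connect(_ link: Link) -> Connection {
    // Placeholder for real pairing/handshake logic.
    Connection(link: link)
}

func establishTransport() -> (fast: Connection, wide: Connection) {
    let fast = connect(.bluetooth)  // available quickly; can already carry small
                                    // instructions (e.g., show a stored frame)
    let wide = connect(.wifi)       // negotiated over the first link to gain
                                    // bandwidth for streaming content
    return (fast, wide)
}
```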
  • FIGs. 4A-4H are block diagrams illustrating exemplary user interfaces in accordance with some examples described herein.
  • the user interfaces in these figures are used to illustrate the processes described below, including the processes in FIGs. 5A-5H and 6-11.
  • FIGs. 4A-4H depict a frame (i.e., frames 400a-400h) that is displayed on a display (e.g., a touch-sensitive display, a heads-up display, or a display screen) of a vehicle (e.g., vehicle 302).
  • the frame may include zero or more user interface elements rendered by the vehicle (e.g., some user interface elements rendered by vehicle renderer 306 and some user interface elements rendered by integration renderer 310) and zero or more user interface elements rendered by one or more devices other than the vehicle (e.g., a user device (such as by user device renderer 322), a server, or any device separate from the vehicle). It should be understood that more or fewer user interface elements may be included with the frames depicted in FIGs. 4A-4H.
  • locations and/or characteristics of user interface elements, and/or what content is included in a frame, are based on a layout (e.g., a definition including a location, such as an initial location, of user interface elements within the frame).
  • a device rendering at least a portion of the frame (e.g., rendering one or more user interface elements or combining already-rendered user interface elements) uses the layout to determine where content is placed within the frame.
  • the vehicle stores one or more layouts and selects between the one or more layouts based on information known by the vehicle.
  • in some examples, another device (e.g., a user device) selects a layout and communicates the selected layout to the vehicle.
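A "layout" as used above, i.e., a definition giving each user interface element an initial location within the frame, might be modeled as follows. The types and field names are illustrative assumptions, not anything defined by this application.

```swift
// Rough model of a layout: each element gets an (initial) placement.
struct ElementPlacement {
    let elementID: String        // e.g., "speedometer" or "fuel-gauge"
    let x: Double, y: Double     // initial location within the frame
    let width: Double, height: Double
}

struct Layout {
    let id: String                       // e.g., "first-layout"
    let placements: [ElementPlacement]

    func placement(for elementID: String) -> ElementPlacement? {
        placements.first { $0.elementID == elementID }
    }
}
```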
  • FIGs. 4A-4B illustrate a process of transitioning from a frame rendered without input from a user device to a frame rendered with input from the user device.
  • frame 400a is displayed at a first time, before the vehicle is connected to the user device (e.g., user device 320).
  • frame 400a is rendered (e.g., different user interface elements are rendered in particular locations and/or different rendered user interface elements are combined to create frame 400a) by the vehicle (e.g., by vehicle renderer 306 or integration renderer 310).
  • frame 400a includes multiple user interface elements, including current time 402, speedometer 404, and fuel gauge 406.
  • user interface elements may be individually rendered by the vehicle, the user device, another device (e.g., a different user device, a device associated with manufacture of the vehicle, a server, or any device separate from the vehicle), or some combination thereof.
  • components of speedometer 404 (e.g., the numbers and the hand) may have been rendered by a previously-connected user device and stored by the vehicle so that the vehicle may construct speedometer 404 to reflect current data detected by the vehicle (e.g., that the speed is 0).
  • the vehicle would use the individual components to render a current state of speedometer 404 to reflect that the speed is 0 and then combine speedometer 404 with other user interface elements included within frame 400a to produce frame 400a.
  • frame 400a may include more or fewer user interface elements, including, for example, a background or other content.
  • Frame 400a is in accordance with a first layout such that a position of current time 402, speedometer 404, and fuel gauge 406 within frame 400a is determined using the first layout.
  • the first layout is selected to be used by the vehicle, such as based on what layout was most recently used or a current context of the vehicle.
  • the first layout may be configured to be used when starting up the vehicle and the decision to use the first layout is based on information installed on the vehicle before connecting to any user device.
  • frame 400a includes current time 402, indicating a current time as determined by a software or hardware component of the vehicle.
  • Current time 402 is displayed at a particular size in a digital format and updates as time passes. It should be understood that current time 402 could indicate the current time using a different format, such as analog.
  • Frame 400a further includes multiple vehicle instruments, including speedometer 404 and fuel gauge 406.
  • a vehicle instrument is a user interface element reflecting data detected by a sensor (e.g., a sensor of the vehicle).
  • Speedometer 404 indicates a current speed of the vehicle and is depicted in an analog form with a gauge that includes a hand pointing to the current speed. It should be understood that speedometer 404 could indicate the current speed using a different format, such as digital with numbers indicating the current speed rather than a hand.
  • Fuel gauge 406 indicates a current amount of fuel remaining for the vehicle and is depicted in an analog form with a hand pointing to the current amount. It should be understood that fuel gauge 406 could indicate the current amount of fuel using a different format, such as digital with numbers indicating a percentage remaining rather than a hand.
  • FIG. 4B depicts frame 400b and is displayed after the first time of FIG. 4A (i.e., a second time).
  • frame 400b is displayed after a user device (e.g., user device 320) connects with the vehicle.
  • frame 400b may be displayed in response to the user device connecting to the vehicle, such that no user input is received by the user device or the vehicle after connecting and before displaying frame 400b.
  • frame 400b is rendered (e.g., different user interface elements are rendered in particular locations and/or different rendered user interface elements are combined to create frame 400b) by the vehicle.
  • different user interface elements may have been received by the vehicle from the user device and then combined with other user interface elements by the vehicle to generate frame 400b, as further discussed below.
  • Frame 400b is in accordance with a different layout than the first layout (i.e., a second layout).
  • the second layout, unlike the first layout, is selected by the user device.
  • the second layout may be selected based on a state of the vehicle that was communicated from the vehicle to the user device.
  • the second layout may be selected based on a previous layout used by the user device (e.g., a previous layout used by the user device with the vehicle or another vehicle).
  • the second layout includes two areas: main area 408 and side area 410.
  • the second layout may only include main area 408 (not illustrated), with one or more user interface elements rendered by the vehicle and one or more user interface elements rendered by the user device.
  • the second layout may correspond to an instrument cluster of the vehicle, with a user interface element that displays data that is more critical (such as the current speed of the vehicle) being rendered by the vehicle and a user interface element that displays data that is less critical (such as the current time) being rendered by the user device.
  • the second layout may correspond to a center console of the vehicle, with a user interface element that displays data detected by a sensor of the vehicle (such as a current gas level of the vehicle) and a user interface element that displays data detected by a sensor of the user device (such as a signal level of the user device).
  • main area 408 includes speedometer 404 and fuel gauge 406.
  • speedometer 404 in frame 400b is in a digital format (as opposed to an analog format) at the same location and fuel gauge 406 is in the same format but at a different location.
  • FIG. 4B depicts fuel gauge 406 as the same size as in FIG. 4A. It should be understood that the format of either speedometer 404 or fuel gauge 406 may be different than depicted in FIG. 4B (e.g., speedometer 404 may have still been in an analog format) and the size, opacity, or location of either could be different than depicted in FIG. 4B.
  • FIG. 4B depicts side area 410 including current time 402.
  • Current time 402 in frame 400b is depicted at a smaller font size than current time 402 in FIG. 4A. It should be recognized that there may be other differences, such as a different type of font (e.g., Times New Roman or Arial) or a different format (e.g., a 24-hour clock rather than a 12-hour clock).
  • current time 402 may be rendered by the vehicle or the user device.
  • content in a frame is in a language specified by the user device, such as a language used for content displayed via a display of the user device.
  • content in a frame is in a language specified for such content and may be different from a language used by the user device for displaying content on a display of the user device.
  • the differences in current time 402 may be based on a preference associated with an application executing on the user device, such as a preference selected by a user of the user device. The preference may be provided to the vehicle with or separate from the second layout.
  • Side area 410 of frame 400b further includes multiple user interface elements, including signal affordance 412, multiple application affordances corresponding to different applications of the user device (i.e., maps affordance 414, music affordance 416, phone affordance 418), and dashboard affordance 420. It should be recognized that more or fewer user interface elements may be included in side area 410.
  • Signal affordance 412 indicates a communication technology (i.e., LTE) used by the user device and a signal strength (i.e., 2 of 3 bars) of the user device for the communication technology. It should be understood that different ways to represent such information may be used and that, instead of or in addition to signal affordance 412, side area 410 may include a representation of a connection between the vehicle and the user device (e.g., wired, WiFi, or BTLE).
  • side area 410 includes multiple application affordances corresponding to different applications of the user device.
  • an application affordance is configured to, when selected, cause display of a user interface associated with a corresponding application.
  • the selection may cause the application to be executed by the user device when the application is not already executing.
  • maps affordance 414 may correspond to a maps application of the user device.
  • the maps application may chart physical locations in a representation of at least a portion of the world for identification and navigation. In such an example, selection of maps affordance 414 may cause a map to be displayed.
  • music affordance 416 may correspond to a music application of the user device for searching and playing audio files
  • phone affordance 418 may correspond to a phone application of the user device for searching contacts of the user device, initiating communication sessions with other devices, reviewing messages from contacts, or any combination thereof. It should be understood that such functionality of the applications may be different and that other applications may be represented in side area 410.
  • side area 410 may include one or more application affordances corresponding to different applications of the vehicle (not illustrated). Such application affordances may operate similarly to the application affordances associated with the user device except that the application is executing by the vehicle rather than the user device.
  • side area 410 may be configured by a user to include particular application affordances corresponding to particular applications. In such examples, the particular affordances may be selected by the user using the vehicle or the user device.
  • side area 410 also includes dashboard affordance 420.
  • Dashboard affordance 420 may be configured to, when selected, cause display of a different user interface, such as a dashboard associated with the user device or the vehicle.
  • the dashboard may include affordances for other applications not included in side area 410.
  • dashboard affordance 420 is configured to, when selected, exit out of a user interface corresponding to a particular application and allow navigation to a different application.
  • the content of main area 408 and side area 410 is rendered by the user device and sent to the vehicle for the vehicle to display.
  • some user interface elements of frame 400b might not have been included in what was sent from the user device to the vehicle and instead are rendered by the vehicle and combined with the content received from the user device (e.g., rendered on top of what was received from the user device).
  • speedometer 404 and fuel gauge 406 may have been rendered by the vehicle and the application affordances may have been rendered by the user device.
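The compositing step described above, where the vehicle draws elements it rendered itself on top of content received from the user device, can be illustrated with a deliberately simplified raw-pixel sketch; a real renderer would not operate on nested arrays, and all names here are assumptions.

```swift
// Toy compositing sketch: overlay vehicle-rendered elements (e.g.,
// speedometer 404, fuel gauge 406) on a frame from the user device.
struct RawFrame {
    var pixels: [[UInt32]]   // row-major ARGB pixel values
}

func composite(base: RawFrame, overlay: RawFrame, atX x: Int, atY y: Int) -> RawFrame {
    var result = base
    for (rowOffset, row) in overlay.pixels.enumerated() {
        for (colOffset, pixel) in row.enumerated() {
            let targetRow = y + rowOffset
            let targetCol = x + colOffset
            // Skip fully transparent pixels and anything outside the base frame.
            guard pixel >> 24 > 0,
                  result.pixels.indices.contains(targetRow),
                  result.pixels[targetRow].indices.contains(targetCol) else { continue }
            result.pixels[targetRow][targetCol] = pixel
        }
    }
    return result
}
```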
  • FIG. 4B depicts user input 415 at a location corresponding to maps affordance 414 (e.g., user input 415 corresponding to selection of maps affordance 414).
  • the user input 415 may include any type of user input, including a tap on a location of a touch-sensitive display corresponding to maps affordance 414, a push of a physical button of the vehicle while maps affordance 414 is focused upon, a speech request by a user, or any other type of user input signaling selection of maps affordance 414.
  • a signal indicating user input 415 may be sent to the user device.
  • the user device may determine an animation to be displayed by the vehicle to transition from what is displayed in frame 400b to a user interface for a maps application (e.g., main area 408 in frame 400d of FIG. 4D).
  • the animation may include modifications to a current layout, causing changes to what is being displayed, as depicted in FIG. 4C and discussed below.
  • no animation may be used and frame 400d of FIG. 4D is displayed in response to user input 415, such as after sending the signal and receiving a frame corresponding to frame 400d from the user device.
  • FIG. 4C depicts frame 400c and is displayed after the second time of FIG. 4B (i.e., a third time).
  • Frame 400c is displayed after receiving user selection of maps affordance 414.
  • Frame 400c maintains the layout from frame 400b of FIG. 4B (i.e., the second layout) and still includes main area 408 and side area 410.
  • main area 408 and side area 410 in frame 400c are located in the same locations as in frame 400b.
  • side area 410 in frame 400c still includes the other user interface elements described in FIG. 4B. It should be understood that more changes could occur to side area 410 in response to user selection of maps affordance 414. Such changes may be determined by the vehicle and/or the user device.
  • main area 408 still includes speedometer 404 and fuel gauge 406, though speedometer 404 has been modified.
  • speedometer 404 is at a different location with a different size.
  • speedometer 404 has moved down and to the left and become smaller.
  • the different location is a change to the second layout, in that the second layout identifies the previous location as where speedometer 404 is located.
  • the second layout defines that speedometer 404 can be in either the location as depicted in FIG. 4B or the location as depicted in FIG. 4C, and the content received from the user device identifies the location for the third time (i.e., FIG. 4C). It should be recognized that other types of movement, size, and/or other changes may have been applied to speedometer 404 between frame 400b and frame 400c.
  • the parameters of speedometer 404 were determined by the user device and at least the changes were sent to the vehicle for rendering by the vehicle.
  • the changes may be sent to the vehicle before, with, or after sending a frame (e.g., a frame corresponding to frame 400c, the frame without speedometer 404 and fuel gauge 406) from the user device to the vehicle.
  • the vehicle may render speedometer 404 and place speedometer 404 at the location depicted in frame 400c.
  • the vehicle may not receive a new frame to be displayed at the third time. Instead, the vehicle receives instructions for how to modify a previously received frame and performs such modifications locally without needing to receive a new frame from the user device.
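The "modify locally" path described above might be sketched as follows: rather than a whole new frame, the vehicle receives small change instructions and applies them to elements it renders itself. All names and values are illustrative assumptions.

```swift
// Sketch: the user device sends change instructions; the vehicle applies
// them to its local placement state for an element.
struct Placement {
    var x: Double, y: Double
    var scale: Double
}

enum LayoutChange {
    case move(dx: Double, dy: Double)
    case rescale(factor: Double)
}

func apply(_ change: LayoutChange, to placement: inout Placement) {
    switch change {
    case let .move(dx, dy):
        placement.x += dx
        placement.y += dy
    case let .rescale(factor):
        placement.scale *= factor
    }
}

// E.g., the speedometer between FIG. 4B and FIG. 4C: moved down-left, smaller.
var speedometer = Placement(x: 200, y: 100, scale: 1.0)
apply(.move(dx: -60, dy: 40), to: &speedometer)
apply(.rescale(factor: 0.7), to: &speedometer)
```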
  • FIG. 4D depicts frame 400d and is displayed after the second time of FIG. 4B and, optionally, the third time of FIG. 4C (i.e., a fourth time).
  • Frame 400d is displayed after receiving user selection of maps affordance 414 and optionally after one or more frames included in an animation between frame 400b and frame 400d (e.g., frame 400d may be displayed after frame 400b when frame 400c is not displayed).
  • Frame 400d is in accordance with a different layout than the second layout (i.e., a third layout).
  • the third layout, similar to the second layout, is selected by the user device.
  • the third layout, similar to the second layout, includes two areas: main area 408 and side area 410.
  • side area 410 includes the same user interface elements as side area 410 in the second layout.
  • the only change to side area 410 is that current time 402 has been updated based on time passing (i.e., from 10:02 to 10:03).
  • Side area 410 in frame 400d still includes the other user interface elements described in FIG. 4B and FIG. 4C.
  • main area 408 includes a map with current location indicator 422.
  • the map and current location indicator 422 are determined and rendered by the user device, and then sent to the vehicle.
  • the map and current location indicator 422 may be determined by a maps application executing on the user device.
  • main area 408 of frame 400d still includes speedometer 404 and fuel gauge 406, though speedometer 404 has again been modified.
  • speedometer 404 in frame 400d is at a different location with a different size.
  • speedometer 404 has moved down and to the left and become smaller.
  • the different location is defined in the third layout, in that the third layout identifies the location of speedometer 404 in frame 400d as where speedometer 404 is located. It should be recognized that other types of movement, size, and/or other changes may have been applied to speedometer 404 between frame 400c and frame 400d.
  • the parameters of speedometer 404 were determined by the user device and at least the changes were sent to the vehicle for rendering by the vehicle.
  • both speedometer 404 and fuel gauge 406 are overlapping the map in main area 408 and, in some examples, were rendered on top of the map by the vehicle.
  • FIG. 4E depicts frame 400e and is displayed after the fourth time of FIG. 4D (i.e., a fifth time).
  • Frame 400e is displayed after the vehicle detects an error in the connection between the vehicle and the user device.
  • Frame 400e maintains the layout from frame 400d of FIG. 4D (i.e., the third layout) and still includes main area 408 with speedometer 404 and fuel gauge 406 and side area 410 with current time 402, the application affordances, and dashboard affordance 420.
  • such user interface elements in frame 400e are located in the same locations as in frame 400d.
  • some changes to main area 408 include that the map and current location indicator 422 are no longer displayed.
  • removal of the map and current location indicator 422 indicates that the connection between the vehicle and the user device is not working.
  • the map and current location indicator 422 are maintained in frame 400e because frame 400d is reused for the fifth time when the connection is not working.
  • none of the content rendered by the user device would be displayed when there is an error in the connection between the vehicle and the user device.
  • such content would not be displayed because there would not be content from the user device that is designated to be displayed at the current time (i.e., the fifth time).
  • the application affordances and dashboard affordance 420 in side area 410 may no longer be displayed.
  • only user interface elements that are updating at a certain rate (e.g., a predefined rate) or that are a predefined type of user interface element may no longer be displayed.
  • the map and current location indicator 422 may no longer be displayed but the application affordances and dashboard affordance 420 in side area 410 may still be displayed.
  • current time 402 has been updated based on time passing (i.e., from 10:03 to 10:04), speedometer 404 has been updated based on a speed of the vehicle (i.e., from 10 to 15 MPH), and signal affordance 412 has been replaced with error affordance 424 (e.g., error affordance 424 is displayed). Error affordance 424 indicates that there is a connection error.
  • Such updates may have been performed by the vehicle, such that the vehicle re-rendered current time 402 and speedometer 404 to reflect current data determined by the vehicle.
  • Side area 410 in frame 400e still includes the other user interface elements described in FIG. 4D. It should be understood that more changes could occur to side area 410 in response to losing connection. Such changes may be determined by the vehicle after detecting that the connection is not working or before the detecting by the vehicle or the user device.
  • FIG. 4F depicts frame 400f and is displayed after the fifth time of FIG. 4E (i.e., a sixth time).
  • Frame 400f is displayed after the vehicle reconnects to the user device.
  • Frame 400f maintains the layout from frame 400d of FIG. 4D (i.e., the third layout) and still includes main area 408 with the map, current location indicator 422, speedometer 404, and fuel gauge 406 and side area 410 with current time 402, the application affordances, and dashboard affordance 420.
  • such user interface elements in frame 400f are located in the same locations as in frame 400d.
  • the map and current location indicator 422 are displayed again, indicating that the map and current location indicator 422 have been received for the current time from the user device.
  • the map has been updated to reflect a current location of the vehicle.
  • current time 402 has been updated based on time passing (i.e., from 10:04 to 10:05), speedometer 404 has been updated based on a speed of the vehicle (i.e., from 15 to 20 MPH), and error affordance 424 has been replaced with signal affordance 412 (e.g., signal affordance 412 is displayed again).
  • user interface elements such as current time 402 and/or signal affordance 412 may be rendered by the user device at the sixth time (i.e., FIG. 4F), even if current time 402 was re-rendered by the vehicle at the fifth time while there was no connection (i.e., FIG. 4E).
  • Side area 410 in frame 400f still includes the other user interface elements described in FIG. 4D. It should be understood that more changes could occur to side area 410 in response to regaining connection. Such changes may be determined by the vehicle and/or the user device after detecting that the connection is working or before the detecting by the vehicle or the user device.
  • FIG. 4G depicts frame 400g and is displayed after the sixth time of FIG. 4F (i.e., a seventh time).
  • Frame 400g is displayed after the vehicle shifts into reverse and while there is an error in the connection between the vehicle and the user device.
  • Frame 400g does not maintain the layout from frame 400f of FIG. 4F (i.e., the third layout) but instead changes to a new layout (i.e., a fourth layout).
  • the fourth layout is selected by the vehicle due to the error in the connection between the vehicle and the user device.
  • a new layout is selected by the user device.
  • the new layout might be the same layout selected by the vehicle or a different one. The same layout may be selected because both the vehicle and the user device are selecting from the same set of layouts and the input causing the change in layout (e.g., the vehicle shifting into reverse) results in a particular layout.
  • the fourth layout still includes main area 408 with speedometer 404 and fuel gauge 406 (in the same location, size, and font) and side area 410 with current time 402, the application affordances, and dashboard affordance 420.
  • one change to main area 408 is that the map and current location indicator 422 are no longer displayed and instead feed 426 (e.g., an image or a frame of a video) from a rear-view camera is displayed.
  • the rear-view camera generates feed 426 and sends it to a process of the vehicle for display, such that no attempt is made to send feed 426 to the user device.
  • feed 426 may never be sent to the user device and instead maintained locally on the vehicle even when the connection with the user device is working. It should be recognized that feed 426 is not displayed as a result of the error in the connection between the vehicle and the user device.
  • feed 426 is displayed as a result of the vehicle shifting into reverse.
  • feed 426 is displayed whenever the vehicle shifts into reverse regardless of a connection status between the vehicle and the user device.
  • feed 426 would be displayed when (1) the vehicle is connected to the user device and receiving content from the user device to display and (2) the vehicle is not connected to the user device and is displaying content without receiving content from the user device.
  • current time 402 has again been updated based on time passing (i.e., from 10:05 to 10:06), speedometer 404 has been updated based on a speed of the vehicle (i.e., from 20 to 0 MPH), and signal affordance 412 has been replaced with error affordance 424 (e.g., error affordance 424 is displayed).
  • Side area 410 in frame 400g still includes the other user interface elements described in FIG. 4F. It should be understood that more changes could occur to side area 410 in response to shifting into reverse and/or losing connection. Such changes may be determined by the vehicle after detecting that the connection is not working or before the detecting by the vehicle or the user device.
  • FIG. 4H depicts frame 400h and is displayed after the seventh time of FIG. 4G (i.e., an eighth time).
  • Frame 400h is displayed after the vehicle reconnects to the user device.
  • Frame 400h maintains the layout from frame 400g of FIG. 4G (i.e., the fourth layout) and still includes main area 408 with feed 426, speedometer 404, and fuel gauge 406 and side area 410 with current time 402, the application affordances, and dashboard affordance 420.
  • current time 402 has been updated based on time passing (i.e., from 10:06 to 10:07)
  • speedometer 404 has been updated based on a speed of the vehicle (i.e., from 0 to 2 MPH)
  • feed 426 has been updated based on what is received from the rearview camera
  • error affordance 424 has been replaced with signal affordance 412 (e.g., signal affordance 412 is displayed).
  • FIGs. 5A-5H are flow diagrams illustrating different operations by vehicle 500 (e.g., vehicle 302) and user device 502 (e.g., user device 320). The operations are illustrative of what occurs on the various devices to result in the examples described in FIGs. 4A-4H.
  • FIGs. 5A-5H include operations performed by vehicle 500 on the left side of the vertical line and operations performed by user device 502 on the right side of the vertical line. Operations on both sides of the vertical line may be performed by either or both devices. Operations in boxes of dotted lines represent optional operations. Such optional operations are not performed in some examples. It should be recognized that other operations may be optional and some operations may be performed by different devices than depicted.
  • vehicle 500 is any means in or by which a person travels or an object is carried or conveyed.
  • Examples of vehicle 500 include a motor vehicle (e.g., a motorcycle, a car, a truck, a bus, a plane, a boat, etc.) and a railed vehicle (e.g., a train or a tram).
  • user device 502 is an electronic device owned and/or operated by a user. Examples of user device 502 include a mobile or other handheld device (e.g., a smart phone, a tablet, a laptop, or a smart accessory (e.g., a smart watch)).
  • FIG. 5A is a flow diagram illustrating method 504 for establishing layout packages on both vehicle 500 and user device 502. Some operations in method 504 are, optionally, combined, the orders of some operations are, optionally, changed, and some operations are, optionally, omitted.
  • method 504 includes vehicle 500 determining first content to display.
  • 504a occurs after vehicle 500 turns on and before vehicle 500 connects to user device 502.
  • 504a occurs before any communications (e.g., pairing and/or discovery) between vehicle 500 and user device 502.
  • the first content that vehicle 500 determines to display is determined based on information accessible by vehicle 500 (e.g., not information received from user device 502).
  • the first content is depicted in FIG. 4A, including current time 402, speedometer 404, and fuel gauge 406.
  • vehicle 500 may determine what time to display in current time 402, what speed to display in speedometer 404, and what fuel to display in fuel gauge 406.
  • the determined first content also includes a layout to use to display the first content, the layout identifying which user interface elements to display and how to display those user interface elements (e.g., locations of the user interface elements within a frame).
  • the layout is stored by vehicle 500, such as stored at manufacture time or through an update at a time after manufacture.
  • method 504 includes vehicle 500 rendering (e.g., the process of generating an image from a 2D or 3D model by means of a software process) the first content.
  • the rendering is performed by a renderer executing on a computer system of vehicle 500 (e.g., vehicle renderer 306 or integration renderer 310).
  • An example of the rendered first content is frame 400a, as depicted in FIG. 4A.
  • method 504 includes vehicle 500 displaying the first content.
  • the displaying is on a display of vehicle 500, such as a touch-sensitive display, a heads-up display, a surface through a projector, or a screen.
  • vehicle 500 displays different content on different displays of vehicle 500.
  • method 504 includes establishing a connection between vehicle 500 and user device 502.
  • the connection is initiated by vehicle 500 or user device 502.
  • the connection may be hard wired (e.g., through one or more wires) or wireless (e.g., through a wireless communication channel, such as Bluetooth or WiFi).
  • vehicle 500 supports a hard wired connection using a port of vehicle 500
  • the connection may be established by plugging one side of a cord into the port of vehicle 500 and another side of the cord into a port of user device 502.
  • vehicle 500 supports a wireless connection
  • the connection may be established by turning on a wireless network on both vehicle 500 and user device 502 and navigating to a user interface on either vehicle 500 or user device 502 to select the other device for connecting.
  • the connecting may include pairing the two devices together to establish one or more secure connections for sending data between vehicle 500 and user device 502. Such pairing would be performed the first time that the devices connect and would not be necessary on subsequent connections.
  • method 504 includes vehicle 500 sending an identification of vehicle 500 to user device 502.
  • the identification is a unique identifier specific to vehicle 500 (e.g., a vehicle identification number (VIN)) or specific to a component of vehicle 500 (e.g., an identifier for a display of vehicle 500) or a non-unique identifier specific to vehicle 500 (e.g., a make and/or model of vehicle 500) or specific to a component of vehicle 500 (e.g., a brand or model number of the component).
  • the identification of vehicle 500 may be sent while establishing the connection at 504d or via the connection established at 504d (i.e., after the connection is established, using the established connection).
  • the identification of vehicle 500 is sent via a first connection (e.g., using a first communication technology, such as Bluetooth) and subsequent communications of data (e.g., receiving a layout package at 504j) are sent via a second connection (e.g., using a second communication technology, such as WiFi) that is established using the first connection.
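A minimal Swift sketch of the two-connection arrangement just described; the Channel and ConnectionManager names are hypothetical, as the specification does not prescribe any particular API. Identification and control messages travel over the first, lower-bandwidth connection, while layout packages and frames travel over the second connection established from it.

```swift
import Foundation

// Hypothetical two-channel setup: a low-bandwidth control channel (e.g.,
// Bluetooth) used to bootstrap a higher-bandwidth data channel (e.g., WiFi).
enum Channel {
    case control   // identification, layout selection, rendering information
    case data      // layout packages, streamed frames
}

struct ConnectionManager {
    // The vehicle's identification is sent over the control channel first.
    func sendIdentification(_ vehicleID: String) {
        send(vehicleID, over: .control)
    }

    // Larger payloads such as layout packages use the data channel,
    // which is established using information exchanged over control.
    func sendLayoutPackage(_ payload: Data) {
        send(payload, over: .data)
    }

    private func send(_ payload: Any, over channel: Channel) {
        print("sending \(payload) over \(channel) channel")
    }
}
```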
  • method 504 includes user device 502 obtaining a layout package for vehicle 500.
  • the layout package includes definitions of one or more layouts for vehicle 500.
  • a layout defines an initial location for one or more user interface elements within a frame.
  • the layout may be used to identify where to render particular user interface elements within a frame.
  • the initial location for a user interface element may be modified, though the initial location provides a starting point and/or expected location of the user interface element.
  • the layout package may further include one or more rendered user interface elements and/or scripts for rendering user interface elements.
  • some of the rendered user interface elements in the layout package are rendered and added to the layout package by user device 502 such that those rendered user interface elements are not included in the layout package received by user device 502.
  • the layout package is obtained using the identification of vehicle 500.
  • user device 502 may send a request for a current layout package for vehicle 500 to a remote device, the request including the identification of vehicle 500.
  • the remote device may then send the layout package to user device 502.
  • method 504 includes user device 502 storing the layout package received from the remote device.
  • the storage location of the layout package is local to user device 502 such that user device 502 is able to access the layout package when not able to communicate with the remote device.
  • user device 502 may already store one or more layout packages and, using the identification of vehicle 500, identify which layout package to use with respect to vehicle 500.
  • method 504 includes user device 502 sending the layout package to vehicle 500.
  • the layout package is sent via the connection established at 504d.
  • method 504 includes vehicle 500 receiving the layout package and, at 504k, storing the layout package.
  • the storage location of the layout package is local to vehicle 500 such that vehicle 500 is able to access the layout package when not connected to user device 502.
  • both devices are able to identify where user interface elements are to be placed in a frame and identify where the other device may place user interface elements.
  • less data needs to be communicated between the devices when attempting to display content, and vehicle 500 is able to continue to operate and display content even when a connection with user device 502 is not working.
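As a rough illustration of the layout package that both devices end up storing, the following Swift sketch models it as Codable values keyed by the vehicle's identification; all names (LayoutPackage, ElementSlot, etc.) are hypothetical and stand in for whatever concrete format an implementation would use.

```swift
import Foundation

// Hypothetical Codable model of a layout package: named layouts, each
// giving initial locations for user interface elements within a frame.
struct ElementSlot: Codable {
    let elementID: String   // e.g., "speedometer"
    let x: Double, y: Double, width: Double, height: Double
}

struct Layout: Codable {
    let layoutID: String    // e.g., "dashboard", "maps"
    let slots: [ElementSlot]
}

struct LayoutPackage: Codable {
    let vehicleID: String   // identification received from the vehicle
    let version: Int        // used later when checking for updates
    let layouts: [Layout]
}

// User-device side: pick a stored package matching the vehicle's
// identification; if none is found, the user device would request the
// package from a remote device using vehicleID and store the response.
func packageFor(vehicleID: String, stored: [LayoutPackage]) -> LayoutPackage? {
    stored.first { $0.vehicleID == vehicleID }
}
```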
  • FIG. 5B is a flow diagram illustrating method 506 for vehicle 500 displaying a frame including content rendered by both vehicle 500 and user device 502. Some operations in method 506 are, optionally, combined, the orders of some operations are, optionally, changed, and some operations are, optionally, omitted. In some examples, method 506 occurs after method 504 of FIG. 5A.
  • method 506 includes user device 502 determining a layout to use for vehicle 500.
  • the layout is from the layout package obtained by user device 502 at 504g of method 504.
  • the layout may be determined based on a context of vehicle 500 and/or user device 502, such as what is currently being displayed by vehicle 500 and/or what is to be displayed by vehicle 500 in the future.
  • method 506 includes user device 502 sending an identification of the layout to vehicle 500.
  • the identification of the layout may be sent via the connection established at 504d of method 504 or a subsequent connection.
  • the identification of the layout may be sent via a connection configured for sending metadata and control information while a different connection is configured to stream content between the devices (e.g., the rendered first frame sent at 506g).
  • method 506 includes vehicle 500 receiving the identification of the layout and, at 506d, storing the identification of the layout.
  • the storage location of the identification of the layout is local to vehicle 500 such that vehicle 500 is able to access the identification of the layout when not connected to user device 502. By storing the identification of the layout, vehicle 500 is able to determine how to combine frames received from user device 502 with user interface elements rendered by vehicle 500.
  • the identification of the layout may be sent along with any content sent as metadata to the content.
  • method 506 includes user device 502 determining a first frame to be displayed by vehicle 500.
  • the determining is based on a context of vehicle 500 and/or user device 502, such as what is currently being displayed by vehicle 500 and/or what is to be displayed by vehicle 500 in the future.
  • the determining may be performed by an application executing on user device 502 that is pushing content to be displayed by vehicle 500.
  • method 506 includes user device 502 rendering the first frame and, at 506g, sending the rendered first frame and first rendering information to vehicle 500.
  • the rendered first frame and the first rendering information are sent to vehicle 500 separately, such as through different communication channels.
  • the rendered first frame may be sent through a streaming connection for sending frames and the first rendering information may be sent outside of the streaming connection in a message addressed to vehicle 500.
  • the first rendering information may include data to assist vehicle 500 in combining user interface elements with the first frame and/or in displaying the first frame (e.g., a time when to display the first frame).
  • the first rendering information includes instructions to modify an appearance of a user interface element rendered by vehicle 500 and/or a location of where to include the user interface element within the first frame (i.e., different from the layout being used for the first frame).
  • the method includes vehicle 500 receiving the rendered first frame and the first rendering information and, at 506i, rendering a first combined frame.
  • the first combined frame is rendered by combining the rendered first frame with one or more user interface elements stored and/or rendered (e.g., previously rendered before receiving the rendered first frame or rendered on top of the rendered first frame) by vehicle 500.
  • the combination may be based on the first rendering information, such as modifying how the combination is performed in accordance with one or more instructions included with the first rendering information.
  • the method includes vehicle 500 displaying the first combined frame.
  • An example of the first combined frame is frame 400b, depicted in FIG. 4B.
  • the rendered first frame includes all of the user interface elements in frame 400b except for speedometer 404 and fuel gauge 406, both of which may be rendered by vehicle 500 based on data detected by sensors of vehicle 500.
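The combination step at 506i might look roughly like the following Swift sketch, in which the hypothetical LocalElement and RenderingInfo types stand in for the vehicle's stored layout entries and the first rendering information; positions default to the layout-defined initial locations unless the rendering information overrides them.

```swift
// Hypothetical vehicle-side combination step (506i): overlay locally
// rendered elements onto the frame received from the user device, honoring
// any per-element position overrides carried in the rendering information.
struct LocalElement {
    let id: String        // e.g., "speedometer"
    let defaultX: Double  // initial location from the stored layout
    let defaultY: Double
}

struct RenderingInfo {
    // element ID -> position override (different from the layout)
    let overrides: [String: (x: Double, y: Double)]
}

func combine(frameDescription: String,
             elements: [LocalElement],
             info: RenderingInfo) -> String {
    var combined = frameDescription
    for element in elements {
        let pos = info.overrides[element.id]
            ?? (x: element.defaultX, y: element.defaultY)
        combined += " + \(element.id)@(\(pos.x), \(pos.y))"
    }
    return combined
}
```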
  • FIG. 5C is a flow diagram illustrating method 508 for user device 502 orchestrating display of an animation by vehicle 500. Some operations in method 508 are, optionally, combined, the orders of some operations are, optionally, changed, and some operations are, optionally, omitted. In some examples, method 508 occurs after method 506 of FIG. 5B.
  • method 508 includes user device 502 determining an animation to display on vehicle 500.
  • the animation is determined based on a context of vehicle 500 and/or user device 502, such as what is currently being displayed by vehicle 500 and/or what is to be displayed by vehicle 500 in the future.
  • the determining may be performed by an application executing on user device 502 that is pushing content to be displayed by vehicle 500.
  • the animation may define what is to be displayed in multiple frames by vehicle 500, including, for example, modifications to a layout over time.
  • method 508 includes user device 502 determining a frame based on the animation.
  • the frame is further based on a context of vehicle 500 and/or user device 502, such as what is currently being displayed by vehicle 500 and/or what is to be displayed by vehicle 500 in the future.
  • method 508 includes user device 502 rendering the frame and, at 508d, sending the frame and rendering information to vehicle 500 (similar to 506f and 506g of FIG. 5B).
  • Method 508 may continue to perform 508b, 508c, and 508d until the animation is complete (e.g., multiple frames may be determined based on the animation, rendered, and sent to vehicle 500).
  • the method 508 includes vehicle 500 receiving the frame and the rendering information and, at 508f, rendering a combined frame (similar to 506h and 506i of FIG. 5B).
  • the method 508 then includes, at 508g, vehicle 500 displaying the combined frame (similar to 506j of FIG. 5B).
  • Method 508 may continue to perform 508e, 508f, and 508g for each frame received from user device 502.
  • Examples of different frames of an animation displayed by vehicle 500 are frames 400b and 400c, with speedometer 404 moving from the center of main area 408 to the bottom left, as depicted in FIGs. 4B-4C.
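A minimal Swift sketch of the per-frame animation loop on the user device (508b-508d repeated per frame), with a hypothetical send callback standing in for the streaming connection to the vehicle:

```swift
import Foundation

// Hypothetical user-device loop for an animation: each step determines a
// frame, "renders" it, and sends it until the animation completes.
struct AnimationStep {
    let progress: Double   // 0.0 ... 1.0
}

func runAnimation(frameCount: Int, send: (String) -> Void) {
    for index in 0..<frameCount {
        let step = AnimationStep(progress: Double(index) / Double(max(frameCount - 1, 1)))
        // Determine and render a frame for this step (e.g., the
        // speedometer interpolated toward its final position).
        let frame = "frame(progress: \(step.progress))"
        send(frame)   // the vehicle combines and displays each one in turn
    }
}

runAnimation(frameCount: 5) { frame in print("sent \(frame)") }
```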
  • FIG. 5D is a flow diagram illustrating method 510 for vehicle 500 disconnecting from user device 502. Some operations in method 510 are, optionally, combined, the orders of some operations are, optionally, changed, and some operations are, optionally, omitted. In some examples, method 510 occurs after method 508 of FIG. 5C, method 506 of FIG. 5B, and/or method 504 of FIG. 5A.
  • At 510a, method 510 includes vehicle 500 disconnecting from user device 502. In some examples, the disconnection occurs as a result of a cord being unplugged from vehicle 500 and/or user device 502. In other examples, the disconnection occurs as a result of a loss of a wireless connection between vehicle 500 and user device 502, either intentionally or unintentionally.
  • Method 510 includes vehicle 500 determining second content to display at 510b (similar to 504a of FIG. 5A), rendering the second content at 510c (similar to 504b of FIG. 5A), and displaying the rendered second content at 510d (similar to 504c of FIG. 5A).
  • An example of the second content is frame 400e, depicted in FIG. 4E.
  • One difference between when vehicle 500 and user device 502 are connected (e.g., method 506 of FIG. 5B and method 508 of FIG. 5C) and when they are disconnected (e.g., method 510 of FIG. 5D) is which device is setting up what to be displayed.
  • When vehicle 500 and user device 502 are connected, user device 502 determines a frame to be displayed (e.g., 506e in FIG. 5B and 508b in FIG. 5C), such as identifying multiple user interface elements and where to place those user interface elements.
  • When vehicle 500 and user device 502 are disconnected, vehicle 500 determines a frame to be displayed (e.g., 510b in FIG. 5D). This difference is because vehicle 500 no longer has input from user device 502, causing vehicle 500 to make at least one decision (e.g., where, what, or how to display a particular user interface element within a frame) that user device 502 would make if the connection was working.
  • the second content is based on a layout used by vehicle 500 before (e.g., immediately before) disconnecting from user device 502.
  • the second content is based on a layout determined by vehicle 500 in response to detecting the loss of connection from user device 502.
  • the layout may be based on a context of vehicle 500, such as what vehicle 500 is about to display.
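The fallback decision described in method 510 could be summarized by the Swift sketch below; ContentSource and the parameter names are hypothetical, and the sketch simply encodes that the vehicle uses a streamed frame while connected and a locally determined layout otherwise.

```swift
// Hypothetical vehicle-side decision after a disconnect: fall back to a
// locally determined layout and content instead of waiting on the user device.
enum ContentSource {
    case userDevice(frame: String)   // connected: frame streamed from the device
    case local(layoutID: String)     // disconnected: vehicle decides what to show
}

func contentToDisplay(isConnected: Bool,
                      latestFrame: String?,
                      lastUsedLayout: String) -> ContentSource {
    if isConnected, let frame = latestFrame {
        return .userDevice(frame: frame)
    }
    // The layout may be the one used immediately before disconnecting,
    // or one chosen in response to detecting the loss of connection.
    return .local(layoutID: lastUsedLayout)
}
```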
  • FIG. 5E is a flow diagram illustrating method 512 for vehicle 500 reconnecting with user device 502. Some operations in method 512 are, optionally, combined, the orders of some operations are, optionally, changed, and some operations are, optionally, omitted. In some examples, method 512 occurs after method 510 of FIG. 5D.
  • method 512 includes vehicle 500 reconnecting with user device 502.
  • the reconnection may be initiated by vehicle 500 or user device 502.
  • the reconnection may be hard wired (e.g., through one or more wires) or wireless (e.g., through a wireless communication channel, such as Bluetooth or WiFi), as described above at 504d of FIG. 5A.
  • method 512 includes vehicle 500 sending an identification of vehicle 500 (similar to the identification sent in 504e in FIG. 5A) and an identification of a version of a stored layout package to user device 502.
  • the stored layout package may be the layout package stored at 504h in FIG. 5A and the version may be information included with the stored layout package.
  • the identification of vehicle 500 and the identification of the version of the stored layout package are sent via the connection established at 512a.
  • method 512 includes user device 502 receiving the identification of vehicle 500 and the identification of the version.
  • user device 502 may determine whether the version is the current version for vehicle 500. If the version is out of date (i.e., not the current version for vehicle 500), method 512 proceeds to 512d. If the version is up to date (i.e., the current version for vehicle 500), method 512 proceeds to 512i.
  • method 512 includes user device 502 obtaining a new layout package for vehicle 500.
  • obtaining the new layout package may include sending a request for the new layout package or accessing the new layout package already stored by user device 502.
  • the new layout package may include at least one difference from the layout package stored by vehicle 500.
  • the new layout package includes differences from the layout package stored by vehicle 500 such that only the differences are transmitted to vehicle 500 and not the entire layout package.
  • the new layout package is obtained using the identification of vehicle 500 and/or the identification of the version.
  • user device 502 may send a request for a current layout package for vehicle 500 to a remote device, the request including the identification of vehicle 500 and/or the identification of the version.
  • the remote device may then send the new layout package to user device 502.
  • method 512 includes user device 502 storing the new layout package received from the remote device.
  • the storage location of the new layout package is local to user device 502 such that user device 502 is able to access the new layout package when not able to communicate with the remote device.
  • method 512 includes user device 502 sending the new layout package to vehicle 500.
  • the new layout package is sent via the connection established at 512a.
  • method 512 includes vehicle 500 receiving the new layout package and, at 512h, storing the new layout package.
  • the storage location of the new layout package is local to vehicle 500 such that vehicle 500 is able to access the new layout package when not connected to user device 502.
  • after vehicle 500 stores the new layout package, method 512 proceeds to 512i. In other examples, after (e.g., in response to) user device 502 sending the new layout package, method 512 proceeds to 512i.
  • method 512 includes user device 502 determining to use a second layout.
  • the second layout is optionally different from a layout being used before reconnecting at 512a or being used immediately before most-recently disconnecting.
  • the second layout may be determined based on a context of vehicle 500 and/or user device 502, such as what is currently being displayed by vehicle 500 and/or what is to be displayed by vehicle 500 in the future.
  • method 512 includes user device 502 sending an identification of the second layout to vehicle 500.
  • the identification of the second layout is sent via the connection established at 512a (or a subsequent connection).
  • the second layout may be sent with or separate from the new layout package.
  • method 512 includes vehicle 500 receiving and storing the identification of the second layout.
  • the storage location of the identification of the second layout is local to vehicle 500 such that vehicle 500 is able to access the identification of the second layout when not connected to user device 502.
  • By storing the identification of the second layout, vehicle 500 is able to determine how to combine frames received from user device 502 with user interface elements rendered by vehicle 500.
  • method 512 may proceed to 512m.
  • method 512 includes user device 502 determining a second frame to be displayed by vehicle 500.
  • the determining is based on a context of vehicle 500 and/or user device 502, such as what is currently being displayed by vehicle 500 and/or what is to be displayed by vehicle 500 in the future. In such examples, the determining may be performed by an application executing on user device 502 that is pushing content to be displayed by vehicle 500.
  • method 512 includes user device 502 rendering the second frame and, at 512o, sending the rendered second frame and second rendering information to vehicle 500.
  • the method includes vehicle 500 receiving the rendered second frame and the second rendering information and, at 512q, rendering a second combined frame.
  • the second combined frame is rendered by combining the rendered second frame with one or more user interface elements stored by vehicle 500.
  • the combination may be based on the second rendering information, such as modifying how the combination is performed in accordance with one or more instructions included with the second rendering information.
  • the method includes vehicle 500 displaying the second combined frame.
  • An example of the second combined frame is frame 400f, depicted in FIG. 4F.
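A rough Swift sketch of the version check at 512c-512h, using a hypothetical VersionReport type and callback parameters: the user device sends a new package only when the vehicle's stored version is out of date, then proceeds to identify the layout to use.

```swift
// Hypothetical reconnect handshake: the vehicle reports its stored layout
// package version; the user device sends a new package only when out of date.
struct VersionReport {
    let vehicleID: String
    let storedVersion: Int
}

func handleReconnect(report: VersionReport,
                     currentVersion: Int,
                     sendPackage: (Int) -> Void,
                     sendLayoutID: (String) -> Void) {
    if report.storedVersion < currentVersion {
        // Out of date: send the new package (or just the differences).
        sendPackage(currentVersion)
    }
    // Either way, proceed to select and identify the layout to use next.
    sendLayoutID("dashboard")
}
```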
  • FIG. 5F is a flow diagram illustrating method 514 for responding to user input detected by vehicle 500 or user device 502. Some operations in method 514 are, optionally, combined, the orders of some operations are, optionally, changed, and some operations are, optionally, omitted. In some examples, method 514 occurs after method 512 of FIG. 5E, method 508 of FIG. 5C, method 506 of FIG. 5B, or method 504 of FIG. 5A.
  • method 514 includes vehicle 500 detecting first user input.
  • the first user input is detected by a component of vehicle 500, such as a sensor of vehicle 500.
  • Examples of the component include a physical button, a touch-sensitive surface, a camera, a microphone, and any other component of vehicle 500 able to detect user input.
  • method 514 proceeds to 514b.
  • vehicle 500 sends an indication of the first user input to user device 502 and, at 514c, user device 502 receives the indication of the first user input.
  • user device 502 may detect second user input at 514d.
  • the second user input is used to determine to change to the third layout without any communication with vehicle 500 (e.g., vehicle 500 does not detect a user input and does not send an indication of the user input to user device 502).
  • the second user input is detected by a component of user device 502, such as a sensor of user device 502. Examples of the component include a physical button, a touch-sensitive surface, a camera, a microphone, and any other component of user device 502 able to detect user input.
  • The remaining operations of method 514 (i.e., 514f-514n) are similar to 506b-506j in method 506 of FIG. 5B.
  • An example of the third combined frame of 514n is frame 400d, depicted in FIG. 4D.
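The two input paths of method 514 can be sketched in Swift as follows (InputOrigin and the callback are hypothetical names): input detected by the vehicle is forwarded to the user device (514b), while input detected on the user device is handled without involving the vehicle (514d).

```swift
// Hypothetical routing of user input between the two devices.
enum InputOrigin {
    case vehicle(indication: String)     // e.g., a physical button press
    case userDevice(indication: String)  // e.g., a touch on the device screen
}

func route(_ input: InputOrigin, forwardToUserDevice: (String) -> Void) {
    switch input {
    case .vehicle(let indication):
        // 514b: the vehicle sends an indication of the input.
        forwardToUserDevice(indication)
    case .userDevice(let indication):
        // 514d: handled locally; the user device may change the layout
        // without any communication with the vehicle.
        print("user device handles \(indication) directly")
    }
}
```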
  • FIG. 5G is a flow diagram illustrating method 516 for recovering from an issue with communication between vehicle 500 and user device 502. Some operations in method 516 are, optionally, combined, the orders of some operations are, optionally, changed, and some operations are, optionally, omitted. In some examples, method 516 occurs after method 514 of FIG. 5F, method 512 of FIG. 5E, method 508 of FIG. 5C, method 506 of FIG. 5B, or method 504 of FIG. 5A.
  • method 516 includes vehicle 500 detecting third user input and, at 516b, attempting to send an indication of the third user input to user device 502.
  • the third user input may be similar to the first user input discussed above at 514a in FIG. 5F.
  • vehicle 500 attempts to send the indication via the connection established at 512a in FIG. 5E, when sending after FIG. 5E, or via the connection established at 504d in FIG. 5A, when sending after FIG. 5A, 5B, or 5C.
  • method 516 includes vehicle 500 determining that user device 502 failed to respond to the indication. In some examples, such determining is based on determining that a connection to send the indication to user device 502 is not working. In other examples, such determining is based on determining that a predefined amount of time has expired after attempting to send or sending the indication of the third user input. In other examples, such determining is based on determining that a remaining time until content is to be displayed has reached a threshold such that vehicle 500 can no longer wait for user device 502.
  • method 516 includes vehicle 500 determining to change to a fourth layout based on the third user input. In some examples, the determining occurs after determining that user device 502 failed to respond to the indication.
  • method 516 includes vehicle 500 determining a fourth frame to display on vehicle 500.
  • the fourth frame is determined without input from user device 502.
  • vehicle 500 attempted to receive input from user device 502 and, when the input was not received in time, vehicle 500 determined what to display (similar to as described above with respect to method 510 of FIG. 5D). If user device 502 had responded, vehicle 500 would have used a frame received from user device 502.
  • method 516 includes vehicle 500 rendering the fourth frame and displaying the rendered fourth frame.
  • An example of the fourth frame is frame 400g, depicted in FIG. 4G.
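A minimal Swift sketch of the timeout behavior in method 516, assuming a hypothetical polling interface: the vehicle waits for a response from the user device only until a display deadline, then falls back to determining and rendering a frame itself.

```swift
import Foundation

// Hypothetical vehicle-side failover: wait for the user device to respond
// to a forwarded input, but only until a display deadline is reached.
func frameToDisplay(responseArrived: () -> String?,
                    deadline: Date) -> String {
    while Date() < deadline {
        if let frame = responseArrived() {
            return frame            // user device responded in time
        }
        Thread.sleep(forTimeInterval: 0.01)   // brief poll interval
    }
    // 516d-516f: deadline passed; determine and render a frame locally
    // (e.g., the reverse-camera layout chosen by the vehicle itself).
    return "locally determined frame"
}
```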
  • FIG. 5H is a flow diagram illustrating method 518 for when vehicle 500 reconnects to user device 502 after recovering from an issue with communication. Some operations in method 518 are, optionally, combined, the orders of some operations are, optionally, changed, and some operations are, optionally, omitted. In some examples, method 518 occurs after method 516 of FIG. 5G.
  • method 518 includes vehicle 500 reconnecting with user device 502.
  • the reconnection is initiated by vehicle 500 or user device 502.
  • the reconnection may be hard wired (e.g., through one or more wires) or wireless (e.g., through a wireless communication channel, such as Bluetooth or WiFi), as described above at 504d of FIG. 5A.
  • method 518 includes vehicle 500 sending an identification of a current state of vehicle 500 and, at 518c, user device 502 receiving the identification.
  • the identification of the current state may include information to help user device 502 determine what to cause to be displayed by vehicle 500.
  • the identification may include an identification of a layout being used by vehicle 500, an indication of an input signal detected by vehicle 500 (e.g., the indication of the third user input from 516a in FIG. 5G), an indication of one or more user interface elements included in a frame displayed by vehicle 500, or any combination thereof.
  • vehicle 500 also sends an identification of vehicle 500 and an identification of a version of a stored layout package to user device 502 (similar to the identifications sent at 512b in FIG. 5E).
  • method 518 may include operations 512c-512h if the stored layout package is out of date.
  • method 518 includes user device 502 determining to use a fifth layout based on the current state of vehicle 500.
  • the fifth layout may be the same or different from a current layout being used by vehicle 500.
  • user device 502 sends an identification of the fifth layout to vehicle 500 (at 518e) and vehicle 500 receives and stores the identification of the fifth layout (at 518f and 518g).
  • FIG. 6 is a flow diagram illustrating method 600 for establishing a layout on multiple devices for synchronized rendering. Some operations in method 600 are, optionally, combined, the orders of some operations are, optionally, changed, and some operations are, optionally, omitted.
  • method 600 is performed at a first device (e.g., compute system 100, device 200, user device 320, or user device 502) (in some examples, the first device is a user device, such as a portable electronic device; in some examples, the first device is logged into a user account that the second device is or is not logged into; in some examples, any combination of an operating system module and/or an application module may perform the steps of method 600).
  • method 600 includes connecting, via a first connection (e.g., 504d), to a second device (e.g., compute system 100, device 200, vehicle 302, or vehicle 500) (in some examples, the second device is a vehicle, such as a computer system configured to display content on a display of the vehicle) different (e.g., separate) from the first device (in some examples, the connecting is included in a pairing process between the first device and the second device; in some examples, the connecting occurs after a pairing process; in some examples, the connecting is via a wired or wireless connection).
  • method 600 includes receiving, via the first connection, an identification associated with the second device (e.g., 504f) (in some examples, the identification refers to a display or a type of the display of the second device; in some examples, the identification refers to a type of the second device; in some examples, the identification refers to a set of one or more layouts compatible with a display of the second device).
  • method 600 includes, after receiving the identification associated with the second device, obtaining, using the identification, a set of one or more layouts for a display (e.g., 504g) (e.g., a screen or other visual output device) of the second device (in some examples, the obtaining is through a device other than the second device; in some examples, a layout is not displayed by the display but instead used to identify a location of particular content; in some examples, a layout includes one or more dimensions of the display; in some examples, a layout includes a resolution of the display).
  • the set of one or more layouts includes a plurality of layouts (e.g., a plurality of different layouts).
  • method 600 includes storing (e.g., in a memory of the first device) the set of one or more layouts (e.g., 504h).
  • method 600 includes sending (in some examples, the sending is via the first connection), to the second device, the set of one or more layouts for use with the display of the second device (e.g., 504i).
  • method 600 includes, after sending the set, determining, based on a layout of the set of one or more layouts stored at the first device (in some examples, the layout is determined by the first device), content for displaying via the display of the second device (e.g., 506e, 508b, 512m, 514i, or 518h) (in some examples, the determining includes rendering (e.g., locally rendering) the content on the first device (e.g., 506f, 508c, 512n, 514j, or 518i); in some examples, the determining includes obtaining rendered content from a remote device).
  • the layout includes a definition of an initial location of at least one user interface element (in some examples, the at least one user interface element is rendered by the first device; in some examples, the at least one user interface element is rendered by the second device).
  • method 600 includes sending (in some examples, the sending is via the first connection), to the second device, a message corresponding to the content (e.g., 506g, 508d, 512o, 514k, or 518j) (in some examples, the message includes the content; in some examples, the content includes a portion (e.g., a placeholder) intended for the second device to render a user interface element and add to the portion; in some examples, the message includes an indication that is used by the second device to obtain the content, such as stored locally on the second device or a device remote from the second device; in some examples, the message includes data used to generate content on the second device).
  • method 600 further includes, while the first device is connected to the second device (in some examples, while the first device is connected to the second device via the first connection or a different (e.g., subsequent) connection): receiving an indication of a user input (e.g., 514c, 514d, or 518c) (in some examples, the indication of the user input is an indication of a virtual assistant (e.g., an indication provided by the virtual assistant in response to the virtual assistant receiving an indication from a user; in some examples, the virtual assistant is hosted by the first device or the second device); in response to receiving the indication of the user input, determining to change a layout being used by the second device to a new layout (e.g., 514e or 518d) (in some examples, the method further comprises, at the first device, determining that the user input corresponds to a request to change a layout (e.g., a current layout)); and sending, to the second device, a message indicating the new layout.
  • the user input corresponds to activation of a physical button of the second device (in some examples, the physical button is embedded in the second device). In some examples, the user input corresponds to a touch input detected via a touch-sensitive display of the second device (in some examples, the touch input corresponds to selection of an affordance (e.g., a user interface element, such as a button) displayed by the touch-sensitive display). In some examples, the user input corresponds to user input detected via a sensor of the first device (in some examples, the sensor includes a microphone (e.g., through a virtual assistant), a camera (e.g., through a virtual assistant), a touch-sensitive display, or a sensor detecting activation of a physical button of the first device).
  • the user input corresponds to voice input detected via a microphone
  • the voice input corresponds to an audible request to change the layout
  • the voice input relates to a virtual assistant
  • the voice input causes another application (e.g., a virtual assistant application) to execute and return control to changing the layout after the other application determines an output (e.g., the user input).
  • the user input corresponds to a gesture detected via a camera. In some examples, the microphone is of the first device or the second device.
  • the set of one or more layouts is a first version (in some examples, an identification of the first version was sent to the second device with (or separately from) the set of one or more layouts).
  • method 600 further includes, after the first connection is disconnected: connecting, via a second connection (e.g., 512a or 518a) (in some examples, the second connection is the same as or different from the first connection), to the second device; receiving, via the second connection, a second identification (e.g., 512c) (in some examples, the second identification is the same as the identification) associated with the second device (in some examples, the second identification refers to a display or a type of the display of the second device; in some examples, the second identification refers to a type of the second device; in some examples, the second identification refers to a set of one or more layouts compatible with a display of the second device); and receiving, via the second connection, an identification of a current version associated with the set of one or more layouts.
  • method 600 further includes, in accordance with a determination that the current version is different from the first version, sending, to the second device, a new set of one or more layouts (e.g., 512f), wherein the new set is the current version, and wherein the new set is different from the set of one or more layouts (e.g., at least one layout from the new set is different from the set of one or more layouts).
  • the particular layout is a last-used layout (e.g., the last-used layout is used during a previous connection between the first device and the second device) by the first device for the second device.
  • method 600 further includes, in addition to sending the set of one or more layouts, sending, to the second device, a script (e.g., a render script) for rendering a user interface element (e.g., 504i or 512f)
  • the script includes one or more instructions that, when executed, render the user interface element (e.g., an image, including one or more pixel values); in some examples, the script is sent in a message with the set of one or more layouts; in some examples, the script is sent in a different message from a message with the set of one or more layouts.
  • method 600 further includes, in addition to sending the set of one or more layouts, sending, to the second device, rendered content (e.g., 504i or 512f) (e.g., a bitmap or an image, sometimes referred to as a render) (in some examples, the rendered content is sent in a message with the set of one or more layouts; in some examples, the rendered content is sent in a different message from a message with the set of one or more layouts).
  • method 600 optionally includes one or more of the characteristics of the various methods described below with reference to method 700; for example, the message sent at 670 of method 600 may be the rendered frame received at 710 of method 700.
  • method 600 optionally includes one or more of the characteristics of the various methods described below with reference to method 800; for example, the message sent at 670 of method 600 may be the rendered frame received at 810 of method 800.
  • method 600 optionally includes one or more of the characteristics of the various methods described below with reference to method 900; for example, the message sent at 670 of method 600 may be the first frame sent at 940 of method 900.
  • method 600 optionally includes one or more of the characteristics of the various methods described below with reference to method 1000; for example, the connection at 610 of method 600 may be the connection at 1020 of method 1000.
  • method 600 optionally includes one or more of the characteristics of the various methods described below with reference to method 1100; for example, the message sent at 670 of method 600 may be the second message received from the device at 1140 of method 1100.
  • FIG. 7 is a flow diagram illustrating method 700 for time-based rendering synchronization. Some operations in method 700 are, optionally, combined, the orders of some operations are, optionally, changed, and some operations are, optionally, omitted.
  • method 700 is performed at a first device (e.g., compute system 100, device 200, vehicle 302, or vehicle 500) (in some examples, the first device is a vehicle, such as a computer system configured to display content on a display of the vehicle; in some examples, any combination of an operating system module and/or an application module may perform the steps of method 700).
  • method 700 includes receiving, from a second device (in some examples, the second device is a user device, such as a portable electronic device; in some examples, the second device is logged into a user account that the first device is or is not logged into) different from the first device, a rendered frame (e.g., 506h, 508e, 512p, 514l, or 518k) (e.g., a photo, an image, a portion of a video, or pixel information) (in some examples, the first device is wired or wirelessly connected to the second device, such as via Bluetooth or WiFi, and the rendered frame is received through the connection; in some examples, before receiving the rendered frame, the first device and the second device establish a streaming connection (e.g., 504d, 512a, 518a), where the streaming connection is configured to send data from at least one device (e.g., the second device) to another device (e.g., the first device); in some examples, the rendered frame is rendered (e.g., locally rendered) by the second device).
  • the rendered frame received from the second device includes a placeholder portion (in some examples, the placeholder portion does not include content rendered by the second device; in some examples, the placeholder portion includes (in some examples, only includes) background content; in some examples, the placeholder portion does not include a user interface element rendered by the second device; in some examples, the placeholder portion does not include a user interface element other than background content rendered by the second device; in some examples, the placeholder portion does not include a user interface element unique to the placeholder portion as compared to the rest of the rendered frame (e.g., other than other placeholder portions); in some examples, the placeholder portion is only used by the second device to generate the message sent to the first device and no indication of the placeholder portion is sent to the first device separate from a layout), and wherein the combined frame includes the user interface element at a location corresponding to the placeholder portion (e.g., the location is the placeholder portion).
  • method 700 further includes, at a first time, receiving, from the second device, a message including a second time (e.g., 506h, 508e, 512p, 514l, 518k) (e.g., a future time, a time after a current time, or a time in the future) (in some examples, the message includes an indication other than time in which the combined frame should be displayed (e.g., a next possible time)), wherein the second time is after the first time (in some examples, the message is received via the same channel as the rendered frame; in some examples, the message is received via a different channel than the rendered frame, such as a channel configured to send smaller amounts of data (e.g., Bluetooth compared to WiFi); in some examples, the rendered frame is received after the first time but before the second time such that the second device sends the message before sending the rendered frame).
  • the rendered frame is received at the first device after (e.g., separate from) the message is received at the first device (e.g., the rendered frame is included in a different message from the message).
  • the message includes an identification of a version of the user interface element (in some examples, multiple versions of the user interface element are stored on the first device, such as a light and a dark version of the user interface element).
  • the message includes a modification (in some examples, the modification includes a change in font, size, color, or opacity, such as to make it more readable to a user; in some examples, the modification is determined by the user device based on settings of the user device (e.g., accessibility settings), settings of the application, etc.; in some examples, text and font size are specified by the layout rather than in the message), other than to a location within the rendered frame (e.g., other than to where the user interface element is to be placed within the user interface element), to the user interface element. In some examples, the message includes a location of the user interface element within the rendered frame (in some examples, the location refers to where the user interface element is to be placed with respect to the rendered frame).
  • method 700 further includes rendering (e.g., locally rendering) a user interface element (e.g., 506i, 508f, 512q, 514m, or 518l)
  • rendering includes executing a computer program to generate an image from a 2D or 3D model.
  • rendering includes modifying content stored by the first device (e.g., pre-rendered content); in some examples, the rendered content is a bitmap; in some examples, the content was stored by the first device before receiving the message and/or the rendered frame; in some examples, rendering includes accessing a script for the user interface element and executing the script; in some examples, rendering is performed by a graphics processing unit (GPU) of the first device.
  • method 700 further includes, before the second time, generating a combined frame by combining the user interface element with the rendered frame (e.g., 506i, 508f, 512q, 514m, or 518l) (in some examples, the combined frame is generated in response to receiving the message (in some examples, in response to refers to occurring without any further user input); in some examples, generating the combined frame includes rendering (e.g., locally rendering) the user interface element, such that the rendering is not separate from the combining).
  • combining the user interface element with the rendered frame includes placing (e.g., underlying or overlaying) the user interface element on (e.g., on bottom of (e.g., under) or on top of) the rendered frame (in some examples, the user interface element is placed to appear in front of content included in the rendered frame; in some examples, the rendered frame includes an area without content for where the user interface element is placed; in some examples, the rendered frame includes a portion that includes a higher opacity than another portion of the rendered frame such that the user interface element is placed behind the rendered frame in line with the portion so as to be visible with the rendered frame).
  • method 700 further includes outputting (e.g., sending to another component or device or displaying) the combined frame for display at the second time (e.g., 506j, 508g, 512r, 514n, or 518m).
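The flow in the preceding method 700 bullets can be pictured with a short sketch. The following Swift snippet is a minimal illustration, not part of the patent: a rendered frame arrives with a message naming a future display time, a user interface element is rendered locally, the two are combined before that time, and the combined frame is output when the time arrives. All type and function names are hypothetical.

```swift
import Foundation
import Dispatch

// Hypothetical wire types; the patent does not define a concrete schema.
struct RenderedFrame { var pixels: [UInt8]; var width: Int; var height: Int }
struct PresentMessage { var displayTime: TimeInterval }  // the "second time"

struct CombinedFrame { var base: RenderedFrame; var overlay: [UInt8] }

// Locally render a user interface element (e.g., a speedometer) from
// device-side state; a stand-in for a real bitmap render.
func renderLocalElement(speed: Double) -> [UInt8] {
    Array("speedometer(\(Int(speed)))".utf8)
}

// Combine before the scheduled time, then output at that time.
func present(frame: RenderedFrame,
             message: PresentMessage,
             now: TimeInterval,
             speed: Double,
             output: @escaping (CombinedFrame) -> Void) {
    let combined = CombinedFrame(base: frame,
                                 overlay: renderLocalElement(speed: speed))
    let delay = max(0, message.displayTime - now)
    DispatchQueue.main.asyncAfter(deadline: .now() + delay) {
        output(combined)  // displayed at the requested "second time"
    }
}
```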
  • method 700 further includes receiving rendered content (e.g., a bitmap or an image, sometimes referred to as a render) corresponding to the user interface element (e.g., 504j or 512g) (e.g., the rendered content is rendered by a device other than the first device, such as the second device) (in some examples, the rendered content is a version of the user interface element, the version different from the user interface element rendered at the first device), wherein the rendered content is received at the first device before the message is received at the first device (in some examples, the rendered content is received in response to the first device connecting with the second device; in some examples, the rendered content is received at the first device via a previous connection between the first device and the second device, the previous connection occurring before a connection used to receive the rendered frame; in some examples, the rendered content is received via a different connection (e.g., a different connection with the second device or a different connection with a device other than the second device) than a connection used to receive the rendered frame).
  • method 700 further includes receiving a script (e.g., 504j or 512g) (e.g., a render script) for rendering the user interface element (in some examples, the script includes one or more instructions that, when executed, render the user interface element (e.g., an image, including one or more pixel values)), wherein the script is received at the first device before the message is received at the first device (in some examples, the script is received in response to the first device connecting with the second device; in some examples, the script is received at the first device via a previous connection between the first device and the second device, the previous connection occurring before a connection used to receive the rendered frame; in some examples, the script is received via a different connection (e.g., a different connection with the second device or a different connection with a device other than the second device, such as provisioned on the first device during manufacture, received from a server via a firmware update or an over-the-air update, etc.) than a connection used to receive the rendered frame; in some examples, rendering the user interface element includes executing the script at the first device).
  • method 700 further includes displaying, at the second time, the combined frame (e.g., 506j, 508g, 512r, 514n, or 518m).
  • method 700 further includes detecting, via a sensor (in some examples, the sensor is a speedometer, tachometer, odometer, trip odometer, oil pressure gauge, coolant temperature gauge, battery/charging system sensor, low oil pressure sensor, airbag sensor, coolant overheat sensor, hand-brake sensor, door ajar sensor, high beam sensor, on-board diagnosis indicator (e.g., check engine sensor), fuel gauge, low fuel sensor, hand brake indicator, turn light, engine service indicator, seat belt indicator, or a camera) of the first device, first data (in some examples, the first data includes a location, speed, distance, oil pressure, coolant temperature, an amount of oil pressure, whether an airbag is active, whether coolant is overheating, whether a sensor is on or off, whether a sensor is active, an amount of fuel, whether a particular turn light is active, whether a seat belt is engaged, or an image), wherein rendering the user interface element is based on the first data.
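As a rough illustration of how first-device sensor data can feed the locally rendered element described above, here is a minimal Swift sketch with hypothetical names; the patent does not tie rendering to any particular data model.

```swift
import Foundation

// Hypothetical sensor reading feeding the locally rendered instrument.
enum SensorReading {
    case speed(kph: Double)
    case fuelLevel(fraction: Double)
    case seatBelt(engaged: Bool)
}

// The element is rendered from first-device data, so it stays current even
// though the surrounding frame was rendered on the second device.
func renderInstrument(from reading: SensorReading) -> String {
    switch reading {
    case .speed(let kph):        return "speed: \(Int(kph)) km/h"
    case .fuelLevel(let f):      return "fuel: \(Int(f * 100))%"
    case .seatBelt(let engaged): return engaged ? "belt: ok" : "belt: warn"
    }
}
```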
  • method 700 further includes after receiving the rendered frame, receiving, from the second device, a second rendered frame (e.g., 508e) (in some examples, the second rendered frame is the same or different from the rendered frame; in some examples, the second rendered frame is not received but rather an identification to repeat a previous frame (e.g., the rendered frame) received from the second device); at a third time, receiving, from the second device, a second message including a fourth time (e.g., a future time, a time after a current time, or a time in the future) (in some examples, the message includes an indication, other than a time, of when the combined frame should be displayed (e.g., a next possible time); in some examples, the third time is after the first and/or second time; in some examples, the fourth time is after the first and/or second time), wherein the fourth time is after the third time; and outputting (e.g., sending to another component or device or displaying) a frame corresponding to the second rendered frame.
  • method 700 optionally includes one or more of the characteristics of the various methods described above with reference to method 600, such as the combined frame generated at 740 of method 700 may be based on the layout from 660 of method 600.
  • method 700 optionally includes one or more of the characteristics of the various methods described below with reference to method 800, such as the second time received at 720 of method 700 may be when the combined frame is output at 860 of method 800.
  • method 700 optionally includes one or more of the characteristics of the various methods described below with reference to method 900, such as the message received at 720 of method 700 may include the first frame and/or the indication of the location sent at 940 of method 900.
  • method 700 optionally includes one or more of the characteristics of the various methods described below with reference to method 1000, such as the message received at 720 of method 700 may be the user-defined preference received at 1020 of method 1000.
  • method 700 optionally includes one or more of the characteristics of the various methods described below with reference to method 1100, such as the combined frame output at 750 of method 700 may be the content displayed at 1140 of method 1100.
  • FIG. 8 is a flow diagram illustrating method 800 for controlling rendering by another device. Some operations in method 800 are, optionally, combined, the orders of some operations are, optionally, changed, and some operations are, optionally, omitted. In some examples, method 800 is performed at a first device (e.g., compute system 100, device 200, vehicle 302, or vehicle 500) (in some examples, the first device is a vehicle, such as a computer system configured to display content on a display of the vehicle; in some examples, any combination of an operating system module and/or an application module may perform the steps of method 800).
  • method 800 includes receiving, from a second device (in some examples, the second device is a user device, such as a portable electronic device; in some examples, the second device is logged into a user account that the first device is or is not logged into) different from the first device, a rendered frame (e.g., 506h, 508e, 512p, 514l, or 518k) (e.g., a photo, an image, a portion of a video, or pixel information) (in some examples, the first device is wired or wirelessly connected to the second device, such as via Bluetooth or WiFi, and the rendered frame is received through the connection; in some examples, before receiving the rendered frame, the first device and the second device establish a streaming connection (e.g., 504d, 512a, or 518a), where the streaming connection is configured to send data from at least one device (e.g., the second device) to another device (e.g., the first device); in some examples, the rendered frame is rendered (e.g., locally rendered) by the second device).
  • the rendered frame received from the second device includes a placeholder portion (in some examples, the placeholder portion does not include content rendered by the second device; in some examples, the placeholder portion includes (in some examples, only includes) background content; in some examples, the placeholder portion does not include a user interface element rendered by the second device; in some examples, the placeholder portion does not include a user interface element other than background content rendered by the second device; in some examples, the placeholder portion does not include a user interface element unique to the placeholder portion as compared to the rest of the rendered frame (e.g., other than other placeholder portions)), and wherein the combined frame includes the user interface element at a location corresponding to the placeholder portion (e.g., the location is the placeholder portion).
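The placeholder portion can be pictured as a rectangle the sender leaves without content and the receiver fills with its locally rendered element. A minimal sketch, assuming row-major pixel buffers and an element buffer sized to the placeholder rectangle:

```swift
import Foundation

// Hypothetical layout math: the sender leaves a placeholder rectangle empty,
// and the receiver draws its locally rendered element into that region.
struct Rect { var x: Int; var y: Int; var w: Int; var h: Int }

struct Frame {
    var width: Int, height: Int
    var pixels: [UInt32]  // row-major ARGB, count == width * height

    // Assumes `r` lies fully inside the frame and `element` holds r.w * r.h
    // pixels; a real compositor would validate both.
    mutating func blit(_ element: [UInt32], into r: Rect) {
        for row in 0..<r.h {
            for col in 0..<r.w {
                pixels[(r.y + row) * width + (r.x + col)] = element[row * r.w + col]
            }
        }
    }
}
```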
  • method 800 further includes receiving, from the second device, a message including data (e.g., 506h, 508e, 512p, 514l, or 518k) (e.g., an instruction) (in some examples, the message is received via the same channel as the rendered frame; in some examples, the message is received via a different channel than the rendered frame, such as a channel configured to send smaller amounts of data (e.g., Bluetooth compared to WiFi); in some examples, the rendered frame is received after the first time but before the second time such that the second device sends the message before sending the rendered frame; in some examples, the data indicates a location). In some examples, the message is received at the first device before the rendered frame is received at the first device.
  • the data includes an indication (e.g., an identification) of a size (e.g., a text size) of the user interface element (in some examples, the indication is a change of the size).
  • the data includes an indication (e.g., an identification) of a location within the rendered frame (in some examples, the location corresponds to the user interface element, such that the location is where the first device is to place the user interface element on the rendered frame).
  • the data includes an indication (e.g., an identification) of an opacity (in some examples, the indication is a change of the opacity; in some examples, the opacity is for the user interface element).
  • the data includes an indication (e.g., an identification) of a color (in some examples, the indication is a change of the color; in some examples, the color is for the user interface element).
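The four bullets above list size, location, opacity, and color indications carried in the message data. One purely illustrative encoding (the claims do not prescribe any wire format) might look like this:

```swift
import Foundation

// One possible shape for the message data; all fields are optional because
// the claims only require that each indication can be carried.
struct ElementDirective: Codable {
    var elementID: String
    var textSize: Double?     // indication of a size
    var origin: [Double]?     // indication of a location within the frame
    var opacity: Double?      // indication of an opacity
    var colorRGBA: [Double]?  // indication of a color
}

let payload = #"{"elementID":"speedometer","textSize":24,"opacity":0.9}"#
let directive = try? JSONDecoder().decode(ElementDirective.self,
                                          from: Data(payload.utf8))
// Missing keys simply decode to nil, leaving those attributes unchanged.
```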
  • method 800 further includes determining, based on the data, a modification with respect to a user interface element (e.g., 506i, 508f, 512q, 514m, or 518l) (in some examples, the determining does not relate to when to render the user interface element; in some examples, the determining includes determining, for the user interface element, a size, a color, an opacity, or any combination thereof; in some examples, the determining includes determining that there will be no change to how the user interface element will be rendered and instead a location of the user interface element within the rendered frame will be changed based on the data; in some examples, the modification includes a change in text size or font).
  • method 800 further includes, in accordance with the determining, rendering (e.g., locally rendering) the user interface element (e.g., 506i, 508f, 512q, 514m, or 518l) (in some examples, the rendering is performed in response to receiving the message; in some examples, rendering includes executing a computer program to generate an image from a 2D or 3D model; in some examples, rendering includes modifying content stored by the first device (e.g., pre-rendered content); in some examples, the rendered content is a bitmap; in some examples, the content was stored by the first device before receiving the message and/or the rendered frame; in some examples, rendering includes accessing a script for the user interface element and executing the script; in some examples, rendering is performed by a graphics processing unit (GPU) of the first device).
  • method 800 further includes generating a combined frame by combining the user interface element with the rendered frame (e.g., 506i, 508f, 512q, 514m, or 518l) (in some examples, the combined frame is generated at the second time instead of before the second time; in some examples, generating the combined frame includes rendering (e.g., locally rendering) the user interface element, such that the rendering is not separate from the combining).
  • method 800 further includes outputting (e.g., sending to another component or device or displaying) the combined frame for display (e.g., 506j, 508g, 512r, 514n, or 518m).
  • method 800 further includes receiving rendered content (e.g., 504j or 512g) (e.g., a bitmap or an image, sometimes referred to as a render) corresponding to the user interface element (e.g., the rendered content is rendered by a device other than the first device, such as the second device) (in some examples, the rendered content is a version of the user interface element, the version different from the user interface element rendered at the first device), wherein the rendered content is received at the first device before the message is received at the first device (in some examples, the rendered content is received in response to the first device connecting with the second device; in some examples, the rendered content is received at the first device via a previous connection between the first device and the second device, the previous connection occurring before a connection used to receive the rendered frame; in some examples, the rendered content is received via a different connection (e.g., a different connection with the second device or a different connection with a device other than the second device) than a connection used to receive the rendered frame).
  • method 800 further includes receiving a script (e.g., 504j or 512g) (e.g., a render script) for rendering the user interface element (in some examples, the script includes one or more instructions that, when executed, render the user interface element (e.g., an image, including one or more pixel values)), wherein the script is received at the first device before the message is received at the first device (in some examples, the script is received in response to the first device connecting with the second device; in some examples, the script is received at the first device via a previous connection between the first device and the second device, the previous connection occurring before a connection used to receive the rendered frame; in some examples, the script is received via a different connection (e.g., a different connection with the second device or a different connection with a device other than the second device) than a connection used to receive the rendered frame; in some examples, rendering the user interface element includes executing the script at the first device).
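A render script received ahead of the per-frame messages can be modeled, loosely, as a stored procedure mapping current state to pixels; per-frame rendering is then just executing the stored script. A sketch under that assumption, with hypothetical names:

```swift
import Foundation

// A render "script" sketched as a stored closure; in practice it could be
// bytecode or a shader received over a prior connection, as the bullets note.
typealias RenderScript = (_ state: [String: Double]) -> [UInt8]

var scriptStore: [String: RenderScript] = [:]

// Received before the per-frame messages, e.g., on first connection.
scriptStore["tachometer"] = { state in
    let rpm = state["rpm"] ?? 0
    return Array("tach:\(Int(rpm))".utf8)  // stand-in for pixel output
}

// Later, rendering the element is just executing the stored script
// against current first-device state.
let pixels = scriptStore["tachometer"]?(["rpm": 3200])
```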
  • method 800 further includes, after receiving the rendered frame, receiving, from the second device, a second rendered frame (e.g., 508e) (in some examples, the second rendered frame is the same or different from the rendered frame; in some examples, the second rendered frame is not received but rather an identification to repeat a previous frame (e.g., the rendered frame) received from the second device); at a third time, receiving, from the second device, a second message including a fourth time (e.g., 508e) (e.g., a future time, a time after a current time, or a time in the future) (in some examples, the message includes an indication, other than a time, of when the combined frame should be displayed (e.g., a next possible time); in some examples, the third time is after the first and/or second time; in some examples, the fourth time is after the first and/or second time), wherein the fourth time is after the third time; and outputting (e.g., sending to another component or device or displaying) a frame corresponding to the second rendered frame.
  • method 800 optionally includes one or more of the characteristics of the various methods described above with reference to method 600, such as the modification determined at 830 of method 800 may be to the layout from 660 of method 600.
  • method 800 optionally includes one or more of the characteristics of the various methods described above with reference to method 700, such as the data received at 820 of method 800 may further include the second time referred to at 720 of method 700.
  • method 800 optionally includes one or more of the characteristics of the various methods described below with reference to method 900, such as the modification determined at 830 of method 800 may be to a location corresponding to the indication of the location sent at 940 of method 900.
  • method 800 optionally includes one or more of the characteristics of the various methods described below with reference to method 1000, such as the combined frame output at 860 of method 800 may be the second frame displayed at 1030 of method 1000.
  • method 800 optionally includes one or more of the characteristics of the various methods described below with reference to method 1100, such as the combined frame output at 860 of method 800 may be the content displayed at 1140 of method 1100.
  • FIG. 9 is a flow diagram illustrating method 900 for rendering an animation across multiple devices. Some operations in method 900 are, optionally, combined, the orders of some operations are, optionally, changed, and some operations are, optionally, omitted.
  • method 900 is performed at a first device (e.g., compute system 100, device 200, user device 320, or user device 502) (in some examples, the first device is a user device, such as a portable electronic device; in some examples, the first device is logged into a user account that the second device is or is not logged into; in some examples, any combination of an operating system module, an application module, a remote device or system (e.g., through an application programming interface (API) call), or the like may perform the steps of method 900).
  • method 900 includes determining an animation (e.g., 508a) (in some examples, the animation is across at least three frames) to be displayed on a second device (e.g., compute system 100, device 200, vehicle 302, or vehicle 500) different from the first device (in some examples, the animation is determined after sending one or more frames to the second device (e.g., 506g); in some examples, the animation is determined after establishing a connection between the first device and the second device (e.g., 504d); in some examples, the animation is determined based on content that the first device determined to display on the second device (e.g., 508a)).
  • the first device is a user device and the second device is a vehicle.
  • method 900 further includes, in accordance with the animation, rendering (e.g., locally rendering) a first frame (e.g., 508c) (in some examples, the rendering is performed in response to determining the animation; in some examples, rendering includes executing a computer program to generate an image from a 2D or 3D model; in some examples, rendering includes modifying content stored by the first device (e.g., pre-rendered content); in some examples, the rendered content is a bitmap; in some examples, rendering includes accessing a script for the user interface element and executing the script; in some examples, rendering is performed by a graphics processing unit (GPU) of the first device).
  • the first frame includes a placeholder image at the location.
  • method 900 further includes determining, based on the animation, a location within the first frame to be updated with a user interface element (e.g., 508b) (in some examples, the user interface element is a vehicle instrument) rendered by the second device (in some examples, the animation is determined after sending a frame to be displayed by the second device; in some examples, based on the animation: the placeholder is in (1) a first location in the first frame and (2) a second location in a second frame, wherein the second location is different from the first location, and wherein the second frame is configured to be displayed after (e.g., subsequent to, such as immediately after) the first frame; in some examples, the location is determined before rendering the first frame; in some examples, the location is used to render the first frame; in some examples, the location is metadata of the first frame).
  • method 900 further includes sending, to the second device, the first frame and an indication of the location (e.g., 508d) (in some examples, the indication is metadata of the first frame; in some examples, the indication is separate from the first frame; in some examples, the method is performed by an operating system of the first device; in some examples, the method is performed by an application (e.g., an application downloaded to the first device), other than an operating system, executing on the first device; in such examples, an operating system of the first device or the application may determine the animation; in some examples, some of the steps of the method are performed by an application executing on the first device calling one or more operating system APIs (e.g., the application may call a single API to perform the determining and rendering steps) (e.g., the application may call a first API for the determining and a second API for the rendering); in some examples, the application executing on the first device calls a different application for determining the location; in some examples, the application itself determines the location).
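On the sending side of method 900, each animation frame pairs a background render with per-frame placeholder metadata telling the receiver where to place its element. A rough Swift sketch under those assumptions:

```swift
import Foundation

// Sender-side sketch: for each animation frame, render the background,
// compute where the receiver must place its element, and send both.
struct OutgoingFrame { var image: [UInt8]; var placeholder: (x: Int, y: Int) }

func renderBackground(progress: Double) -> [UInt8] {
    Array("bg@\(progress)".utf8)  // stand-in for a real render
}

func framesForAnimation(frameCount: Int, send: (OutgoingFrame) -> Void) {
    for i in 0..<frameCount {
        let t = Double(i) / Double(max(frameCount - 1, 1))
        // The placeholder moves across frames, so the indication of the
        // location is per-frame metadata rather than a fixed constant.
        let frame = OutgoingFrame(image: renderBackground(progress: t),
                                  placeholder: (x: Int(t * 200), y: 40))
        send(frame)
    }
}
```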
  • method 900 further includes determining, based on a characteristic associated with the second device, a time to display the first frame; and sending, to the second device, an indication of the time (e.g., 508d).
  • method 900 further includes determining a current layout of a display of the second device, wherein determining the animation is based on the current layout (e.g., 508a).
  • method 900 further includes determining a modification to a user interface to be rendered by the second device at the location (e.g., 508a or 508b); and sending, to the second device, an indication of the modification (e.g., 508d).
  • the modification includes a change to a characteristic of the user interface element selected from the group consisting of location, opacity, color, font, size, and shape.
  • method 900 further includes, before determining the animation, establishing a streaming connection with the second device to be used to send multiple frames corresponding to the animation (e.g., 504d).
  • method 900 further includes, in accordance with the animation, rendering (e.g., locally rendering) a second frame different from the first frame (e.g., 508c) (in some examples, the rendering is performed in response to determining the animation; in some examples, rendering includes executing a computer program to generate an image from a 2D or 3D model; in some examples, rendering includes modifying content stored by the first device (e.g., pre-rendered content); in some examples, the rendered content is a bitmap; in some examples, rendering includes accessing a script for the user interface element and executing the script; in some examples, rendering is performed by a graphics processing unit (GPU) of the first device); and sending, to the second device, the second frame.
  • method 900 further includes determining, based on the animation, a second location within the first frame to be updated with a second user interface element (e.g., 508a or 508b) (in some examples, the user interface element is a vehicle instrument) rendered by the second device (in some examples, the animation is determined after sending a frame to be displayed by the second device; in some examples, based on the animation: the placeholder is in (1) a first location in the first frame and (2) a second location in a second frame, wherein the second location is different from the first location, and wherein the second frame is configured to be displayed after (e.g., subsequent to, such as immediately after) the first frame; in some examples, the location is determined before rendering the first frame; in some examples, the location is used to render the first frame; in some examples, the location is metadata of the first frame), wherein the second location is different from the first location, and wherein the second user interface element is different from the user interface element; and sending, to the second device, an indication of the second user interface element.
  • method 900 optionally includes one or more of the characteristics of the various methods described above with reference to method 600, such as the first frame sent at 940 of method 900 may be the content sent at 670 of method 600.
  • method 900 optionally includes one or more of the characteristics of the various methods described above with reference to method 700, such as the animation determined at 920 of method 900 may be used to determine the second time received at 720 of method 700.
  • method 900 optionally includes one or more of the characteristics of the various methods described above with reference to method 800, such as the animation determined at 920 of method 900 may be used to generate the data received at 820 of method 800.
  • method 900 optionally includes one or more of the characteristics of the various methods described below with reference to method 1000, such as the animation determined at 920 of method 900 may be used to determine the user-defined preference that is received at 1020 of method 1000.
  • method 900 optionally includes one or more of the characteristics of the various methods described below with reference to method 1100, such as a frame rendered in accordance with the animation at 920 of method 900 may be the content in the first layout displayed at 1110 of method 1100.
  • FIG. 10 is a flow diagram illustrating method 1000 for customizing vehicle controls when connecting to a user device (e.g., in response to connecting, sometimes referred to as on connection). Some operations in method 1000 are, optionally, combined, the orders of some operations are, optionally, changed, and some operations are, optionally, omitted.
  • method 1000 is performed at a computer system of a vehicle (e.g., compute system 100, device 200, vehicle 302, or vehicle 500), the computer system in communication with a display component (in some examples, any combination of an operating system module and/or an application module may perform the steps of method 1000).
  • method 1000 includes displaying, via the display component, a first frame (e.g., 504c) (e.g., a display frame) (in some examples, the first frame is an image; in some examples, the first frame is a frame in a series of animation frames) including a first version of a vehicle instrument (in some examples, the vehicle instrument is a speedometer, tachometer, odometer, trip odometer, oil pressure gauge, coolant temperature gauge, battery/charging system sensor, low oil pressure sensor, airbag sensor, coolant overheat sensor, hand-brake sensor, door ajar sensor, high beam sensor, on-board diagnosis indicator (e.g., check engine sensor), fuel gauge, low fuel sensor, hand brake indicator, turn light, engine service indicator, or seat belt indicator; in some examples, the vehicle instrument is a user interface element indicating a state of a component of the vehicle; in some examples, the vehicle instrument is a user interface element indicating data detected by a sensor of the vehicle), wherein the first version has a first appearance in the first frame.
  • the first version is digital or analog
  • the second version is not the same version (e.g., digital or analog version) as the first version (in some examples, the first version is digital and the second version is analog; in some examples, the first version is analog and the second version is digital; in some examples, the second version is defined in the user-defined preference).
  • method 1000 further includes, while displaying the first version: connecting to a user device (e.g., 504d) (e.g., establishing a first connection between the user device and the vehicle) (in some examples, the connecting is via a wired (e.g., a cable connecting to a USB port of the vehicle and a lightning port of the user device) or wireless (e.g., Bluetooth or WiFi) channel); and without further user input after connecting to the user device, receiving, from the user device, a user-defined preference for display of the vehicle instrument (e.g., 504j, 506c, 506h, 512g, 512k, 514g, or 518f) (in some examples, the user device sends the user-defined preference to the vehicle in response to connecting to the vehicle; in some examples, the user-defined preference includes a text size or font).
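A minimal sketch of this on-connection hand-off, assuming a hypothetical JSON encoding of the user-defined preference (the patent does not specify one):

```swift
import Foundation

// Hypothetical preference pushed by the user device on connection, with no
// further user input, as the bullet above describes.
struct InstrumentPreference: Codable {
    var instrument: String  // e.g., "speedometer"
    var style: String       // e.g., "analog" or "digital"
    var font: String?
    var textSize: Double?
}

func onConnect(receivedPreference data: Data,
               apply: (InstrumentPreference) -> Void) {
    if let pref = try? JSONDecoder().decode(InstrumentPreference.self,
                                            from: data) {
        apply(pref)  // triggers rendering the second version
    }
}
```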
  • method 1000 further includes, in accordance with the user-defined preference: rendering a second version of the vehicle instrument (e.g., 506i, 508f, 512q, 514m, or 518l) (in some examples, the rendering is based on the user-defined preference; in some examples, the second version is the same as the first version; in some examples, the second version is different from the first version; in some examples, the rendering is not based on the user-defined preference; in some examples, the second version is rendered based on data received from a sensor of the vehicle, such as a sensor to detect a speed of the vehicle); and displaying, via the display component, a second frame including the second version, wherein the second version has a second appearance in the second frame, and wherein the second appearance is different from the first appearance (in some examples, the second frame is rendered based on the user-defined preference; in some examples, rendering the second version and displaying the second frame are performed without any additional user input after connecting to the user device).
  • the second version includes a different color as compared to the first version (in some examples, the different color is defined in the user-defined preference). In some examples, the second version is located at a different location within a frame from the first version (in some examples, a location of the second version is defined in the user-defined preference).
  • method 1000 further includes, before rendering the second version, sending, to the user device, an identification of a set of one or more layouts stored by the vehicle (e.g., 512b) (in some examples, the set of one or more layouts was received by the vehicle from the user device while the vehicle and the user device were previously connected (e.g., 504j); in some examples, the set of one or more layouts are separately stored by both the vehicle and the user device); in accordance with a determination that the set of one or more layouts is out of date (in some examples, the determination that the set of one or more layouts is out of date is made by the user device), receiving, from the user device, a new set of one or more layouts (e.g., 512g) (in some examples, the new set includes at least one different layout from the set), wherein rendering the second version (e.g., 512q, 514m, 516f, or 518l) and displaying the second frame (e.g., 512b) occur after receiving the new set.
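The layout synchronization described above amounts to a small version handshake. A sketch, with a hypothetical version number standing in for whatever staleness check the devices actually use:

```swift
import Foundation

// Sketch of the layout hand-off: the vehicle reports what it has stored and
// the user device replies with a fresh set only if that copy is out of date.
struct LayoutSet: Codable { var version: Int; var layouts: [String] }

// Performed on the user-device side in this sketch; the determination that
// the set is out of date is made there, per the bullet above.
func reconcile(stored: LayoutSet, latest: LayoutSet) -> LayoutSet {
    stored.version < latest.version ? latest : stored
}
```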
  • method 1000 further includes detecting disconnection of the user device (e.g., 510a or 516b) (e.g., disconnection of a connection between the user device and the vehicle), wherein a third version of the vehicle instrument is being displayed via the display component immediately before detecting disconnection of the user device (in some examples, the third version is the second version); and after detecting disconnection of the user device, displaying a fourth version of the vehicle instrument (e.g., 510d or 516g) (in some examples, the fourth version is displayed in response to detecting disconnection of the user device), wherein the fourth version is different from the third version (in some examples, the fourth version is the first version).
  • method 1000 further includes, while connected to the user device (e.g., while the vehicle is connected to the user device), detecting, by a sensor of the vehicle, user input (e.g., 514a); in response to detecting the user input, sending, to the user device, an indication of the user input (e.g., 514b); after sending the indication of the user input, receiving, from the user device, an indication of a modification corresponding to the vehicle instrument (e.g., 514g or 514l), wherein the modification is determined based on the indication of the user input (in some examples, the modification is determined by the user device; in some examples, the modification causes modification of the vehicle instrument; in some examples, the modification causes a different placement of the vehicle instrument within a displayed frame); rendering, based on the indication of the modification, a third frame including the vehicle instrument (e.g., 514m); and displaying, via the display component, the third frame (e.g., 514n).
  • the sensor is a physical button (in some examples, the physical button is embedded in the vehicle), and wherein the user input corresponds to activation of the physical button.
  • the sensor is a touch-sensitive display, and wherein the user input corresponds to a touch input detected via the touch-sensitive display (in some examples, the touch input corresponds to selection of an affordance (e.g., a user interface element, such as a button) displayed by the touch-sensitive display).
  • the sensor is a microphone, and the user input corresponds to voice input detected via the microphone (in some examples, the voice input corresponds to an audible request to change the layout; in some examples, the voice input relates to a virtual assistant; in some examples, the voice input causes another application (e.g., a virtual assistant application) to execute and return control to changing the layout after the other application determines an output (e.g., the user input); in some examples, the user input corresponds to a gesture detected via a camera).
  • method 1000 optionally includes one or more of the characteristics of the various methods described above with reference to method 600, such as the second version rendered at 1030 of method 1000 may be placed at a particular location in a frame based on a layout sent at 650 of method 600.
  • method 1000 optionally includes one or more of the characteristics of the various methods described above with reference to method 700, such as the second version rendered at 1030 of method 1000 may correspond to the user interface element rendered at 730 of method 700.
  • method 1000 optionally includes one or more of the characteristics of the various methods described above with reference to method 800, such as the second version rendered at 1030 of method 1000 may correspond to the user interface element rendered at 840 of method 800.
  • method 1000 optionally includes one or more of the characteristics of the various methods described above with reference to method 900, such as the user-defined preference received at 1020 of method 1000 may be the indication of the location sent at 940 of method 900.
  • method 1000 optionally includes one or more of the characteristics of the various methods described below with reference to method 1100, such as the second frame displayed at 1030 of method 1000 may be the content in the first layout displayed at 1110 of method 1100.
  • FIG. 11 is a flow diagram illustrating method 1100 for changing layouts used during synchronized rendering in case of a connection loss. Some operations in method 1100 are, optionally, combined, the orders of some operations are, optionally, changed, and some operations are, optionally, omitted.
  • method 1100 is performed at a first device (e.g., compute system 100, device 200, vehicle 302, or vehicle 500) (in some examples, the first device is a vehicle, such as a computer system configured to display content on a display of the vehicle; in some examples, any combination of an operating system module and/or an application module may perform the steps of method 1100).
  • method 1100 includes, while displaying content in a first layout (e.g., 514n) (in some examples, a layout represents where one or more elements are placed in a frame displayed by the first device; in some examples, the displaying is on a display component of the first device), receiving an input signal (e.g., 516a) (in some examples, the input signal is a message sent by a component of the first device; in some examples, the input signal represents an indication of user input with respect to a component of the first device; in some examples, the input signal is received from a different device remote from the first device, such as a server), wherein the first layout is selected by a second device (e.g., compute system 100, device 200, user device 320, or user device 502) (in some examples, the second device is a user device, such as a portable electronic device; in some examples, the second device is logged into a user account that the first device is or is not logged into) different (e.g., separate) from the first device.
  • method 1100 further includes, in response to receiving the input signal, attempting to send, to the second device, a first message indicative of the input signal (e.g., 516b) (in some examples, the first message is attempted to be sent via a first channel established before displaying the content in the first layout (e.g., 504d or 512a); in some examples, the first message includes an indication of the input signal; in some examples, the first message includes an identification of a component that detected the input signal).
  • method 1100 further includes, after attempting to send the first message: in accordance with a determination that the second device failed to respond to the first message (e.g., 516c) (in some examples, the determination that the second device failed to respond to the first message includes a determination that the first device did not receive an acknowledgement message from the second device, the acknowledgement message indicating that the second device received the first message; in some examples, the determination that the second device failed to respond to the first message includes a determination that a predefined amount of time has passed since attempting to send the first message without receiving a response from the second device; in some examples, the determination that the second device failed to respond to the first message includes a determination that a channel for sending messages between the first device and the second device is no longer connected): determining, based on the input signal, to change from the first layout to a second layout (e.g., 516d) (in some examples, determining to change to the second layout is not based on the first message; in some examples, the second layout is from a set of layouts stored by the first device); and displaying content in the second layout.
  • method 1100 further includes, after attempting to send the first message: in accordance with a determination that sending the first message was successful (e.g., 518a or 518f): determining, based on a second message received from the second device (e.g., 518f), to change from the first layout to a third layout (in some examples, the second message includes an indication of the third layout; in some examples, determining to change to the third layout is not based on the input signal; in some examples, the third layout is from the set of layouts); and displaying content in the third layout (e.g., 518m) (in some examples, the content in the third layout includes different content from the content in the first layout; in some examples, the content in the third layout includes the same content as the content in the second layout; in some examples, the content in the third layout is a combination of content rendered by the first device and content rendered by the second device (e.g., 518l); in some examples, the content in the third layout includes content rendered by the first device).
  • method 1100 further includes, in accordance with a determination that the second device failed to respond to the first message, the content in the second layout includes placeholder content and does not include content received by the first device after the first device sent the first message (in some examples, the content in the second layout includes content received from the second device before attempting to send the first message, such as content received when receiving the second layout or when connecting to the second device (e.g., 504j, 512g, or 5141); in some examples, the placeholder content is included at a first location), in accordance with a determination that sending the first message was successful, the content in the third layout includes content received by the first device after the first device sent the first message (in some examples, the content received by the first device after the first device sent the first message is included at the first location, where the placeholder content is located when sending the first message failed; in some examples, the content in the third layout does not include the placeholder content), and the third layout is the same as the second layout.
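The two branches above reduce to one rule: follow the second device's layout decision when it responds in time, and otherwise fall back to a locally determined layout. A compact sketch of that decision, with hypothetical types:

```swift
import Foundation

enum LayoutChoice { case local(String), remote(String) }

// If the user device acknowledges the input message in time, follow its
// layout decision; otherwise fall back to a locally chosen layout.
func resolveLayout(ackReceived: Bool,
                   remoteLayout: String?,
                   localFallback: String) -> LayoutChoice {
    if ackReceived, let remote = remoteLayout {
        return .remote(remote)    // third layout, chosen by the second device
    }
    return .local(localFallback)  // second layout, chosen without the second device
}
```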
  • the content in the second layout includes media (e.g., an image or a video) captured by a camera of the first device
  • the content in the third layout includes the media
  • the content in the third layout further includes additional content (in some examples, the additional content is received and/or rendered by the second device)
  • the content in the second layout does not include the additional content.
  • method 1100 further includes, at a first time (in some examples, the first time is after displaying the content in the third layout): in accordance with a determination that the first device received first content from the second device for display at the first time, displaying the first content (in some examples, the first content is included in a message that includes an indication of the first time); and in accordance with a determination that the first device did not receive content from the second device for display at the first time, displaying second content (e.g., old or previous content) received from the second device for display at a second time, wherein the second time is before the first time (in some examples, the second content is different from the first content; in some examples, the second content is included in a message that includes a first indication of when to display the second content, wherein the first indication includes an indication of the second time and does not include an indication corresponding to the first time (e.g., the first indication does not refer to a time frame that includes the first time)).
  • method 1100 further includes, at a third time (in some examples, the first time is after displaying the content in the third layout; in some examples, the third time is after the first time): in accordance with a determination that the first device received third content from the second device for display at the third time, displaying the third content (in some examples, the third content is included in a message that includes an indication of the third time); and in accordance with a determination that the first device did not receive content from the second device for display at the third time, displaying fourth content (e.g., placeholder content) configured to be displayed when a connection between the first device and the second device is not working (in some examples, the fourth content is rendered by the first device or the second device).
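Together, the two bullets above describe a per-tick frame-selection policy: show a frame scheduled for the current time if one arrived, otherwise the most recent earlier frame, otherwise placeholder content. A sketch:

```swift
import Foundation

struct TimedFrame { var displayTime: TimeInterval; var image: [UInt8] }

// Frame selection at display time t: prefer a frame scheduled for t, then
// the most recent earlier frame, then device-side placeholder content.
func frameToShow(at t: TimeInterval,
                 received: [TimedFrame],
                 placeholder: [UInt8]) -> [UInt8] {
    if let exact = received.first(where: { $0.displayTime == t }) {
        return exact.image
    }
    if let previous = received.filter({ $0.displayTime < t })
                              .max(by: { $0.displayTime < $1.displayTime }) {
        return previous.image
    }
    return placeholder  // shown when the connection is not working
}
```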
  • method 1100 further includes, after receiving the input signal: displaying a user interface element (in some examples, the user interface element is a vehicle instrument); after displaying the user interface element, receiving, from the second device, fifth content (in some examples, the fifth content is in a particular layout); generating combined content by combining the fifth content with the user interface element (in some examples, the combined content is generated using the particular layout); and displaying the combined content (in some examples, the combined content replaces display of the user interface element).
  • method 1100 further includes initiating rendering (e.g., locally rendering) of the user interface element (in some examples, rendering includes executing a computer program to generate an image from a 2D or 3D model; in some examples, rendering includes modifying content stored by the first device (e.g., pre-rendered content); in some examples, the rendered content is a bitmap; in some examples, rendering includes accessing a script for the user interface element and executing the script; in some examples, rendering is performed by a graphics processing unit (GPU) of the first device); after initiating rendering of the user interface element (in some examples, after rendering the user interface element; in some examples, before finishing rendering the user interface element), receiving, from the second device, sixth content; generating a combined frame by combining the user interface element with the sixth content; and outputting (e.g., sending to another component or device or displaying) the combined frame for display.
  • method 1100 optionally includes one or more of the characteristics of the various methods described above with reference to method 600, such as the first and second layouts at 1130 of method 1100 may be included in the set of one or more layouts sent at 640 of method 600.
  • method 1100 optionally includes one or more of the characteristics of the various methods described above with reference to method 700, such as the content in the first layout at 1110 of method 1100 may correspond to the combined frame output at 750 of method 700.
  • method 1100 optionally includes one or more of the characteristics of the various methods described above with reference to method 800, such as the content in the first layout at 1110 of method 1100 may correspond to the combined frame output at 860 of method 800.
  • method 1100 optionally includes one or more of the characteristics of the various methods described above with reference to method 900, such as the content in the first layout at 1110 of method 1100 may be the first frame sent at 940 of method 900.
  • method 1100 optionally includes one or more of the characteristics of the various methods described above with reference to method 1000, such as the content in the first layout at 1110 of method 1100 may be the second frame displayed at 1030 of method 1000.
  • frames sent from one device to another do not include a placeholder portion; rather, the placeholder portion is used locally by a sending device to render the frames.
  • one or more layouts or assets with a layout are provisioned (1) on a device during manufacture (e.g., at a factory), (2) as part of a firmware update, (3) as part of an over-the-air (OTA) update, or (4) by another device.
  • the one or more layouts or the assets with a layout may be generated by a manufacturer of either device (e.g., in accordance with a standard).
  • voice input to cause a change in a layout includes particular requests, such as “set to sport mode,” “I cannot read the instruments,” “show my fuel gauge,” or “let me know when I need to exit the freeway to charge the car.”
  • a change in a layout is caused when a sensor of a device detects a particular state (e.g., fuel level is low and a vehicle changes to navigate to a charge station).
  • a virtual assistant is running on a user device and/or a vehicle, so that if the user device is disconnected, some virtual assistant intelligence/functionality can still operate (e.g., navigate me to a charging station).
  • this gathered data may include personal information data that uniquely identifies or can be used to identify a specific person and/or a specific location.
  • personal information data can include preferences of a person, data stored on a personal device, an image of a person, an image of a location, a reference to a current location of a person, or any other identifying or personal information.
  • the present disclosure contemplates that the entities responsible for the collection, analysis, disclosure, transfer, storage, or other use of such personal information data will comply with well-established privacy policies and/or privacy practices.
  • such entities should implement and consistently use privacy policies and practices that are generally recognized as meeting or exceeding industry or governmental requirements for maintaining personal information data private and secure.
  • Such policies should be easily accessible by users, and should be updated as the collection and/or use of data changes.
  • Personal information from users should be collected for legitimate and reasonable uses of the entity and not shared or sold outside of those legitimate uses.
  • policies and practices should be adapted for the particular types of personal information data being collected and/or accessed and adapted to applicable laws and standards, including jurisdiction-specific considerations. Hence different privacy practices may be maintained for different personal data types in each country.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • Human Computer Interaction (AREA)
  • General Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Image Generation (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Current techniques for rendering content using data across multiple devices are generally ineffective and/or inefficient. The present disclosure provides more effective and/or efficient techniques for rendering such content. The techniques optionally complement or replace other methods of rendering content.
PCT/US2023/024064 2022-06-04 2023-05-31 Rendu synchronisé WO2023235434A2 (fr)

Applications Claiming Priority (8)

Application Number Priority Date Filing Date Title
US202263349063P 2022-06-04 2022-06-04
US63/349,063 2022-06-04
US17/952,143 2022-09-23
US17/952,143 US20230391190A1 (en) 2022-06-04 2022-09-23 Synchronized rendering
US17/952,055 2022-09-23
US17/952,055 US20230391189A1 (en) 2022-06-04 2022-09-23 Synchronized rendering
US17/952,060 2022-09-23
US17/952,060 US20230393801A1 (en) 2022-06-04 2022-09-23 Synchronized rendering

Publications (2)

Publication Number Publication Date
WO2023235434A2 true WO2023235434A2 (fr) 2023-12-07
WO2023235434A3 WO2023235434A3 (fr) 2024-01-11

Family

ID=87158054

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2023/024064 WO2023235434A2 (fr) 2022-06-04 2023-05-31 Rendu synchronisé

Country Status (1)

Country Link
WO (1) WO2023235434A2 (fr)

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10824330B2 (en) * 2011-04-22 2020-11-03 Emerging Automotive, Llc Methods and systems for vehicle display data integration with mobile device data
WO2014197340A1 (fr) * 2013-06-08 2014-12-11 Apple Inc. Dispositif et procédé pour générer des interfaces utilisateur à partir d'un modèle
US11068643B2 (en) * 2019-02-08 2021-07-20 Oracle International Corporation Client-side customization and rendering of web content

Also Published As

Publication number Publication date
WO2023235434A3 (fr) 2024-01-11

Similar Documents

Publication Publication Date Title
US11034362B2 (en) Portable personalization
US9699617B2 (en) Sharing location information among devices
US9057624B2 (en) System and method for vehicle navigation with multiple abstraction layers
EP3049892B1 (fr) Systèmes et procédés permettant de fournir des données de navigation à un véhicule
US10225392B2 (en) Allocation of head unit resources to a portable device in an automotive environment
CN102039898A (zh) 情绪咨询系统
EP3446070B1 (fr) Téléchargement de carte en fonction de l'emplacement futur d'un utilisateur
WO2020146136A1 (fr) Interface utilisateur multimodale pour un véhicule
EP2972095A1 (fr) Système et procédé pour réglage contextuel du niveau de détail pour les cartes et systèmes de navigation
US11610342B2 (en) Integrated augmented reality system for sharing of augmented reality content between vehicle occupants
US20240086476A1 (en) Information recommendation method and related device
KR102639605B1 (ko) 인터페이스에서의 공간 객체들의 상대적 표현 및 명확화를 위한 시스템들 및 방법들
CN114882464B (zh) 多任务模型训练方法、多任务处理方法、装置及车辆
KR102384518B1 (ko) 메시지 처리 방법 및 이를 지원하는 전자 장치
KR20110011637A (ko) 맵 디스플레이 이미지의 생성
CN113811851A (zh) 用户界面耦合
CN105450763B (zh) 一种车联网处理系统
WO2023235434A2 (fr) Rendu synchronisé
US20230391189A1 (en) Synchronized rendering
CN114527923A (zh) 一种车内信息显示方法、装置及电子设备
CN109669898B (zh) 聚合来自信息娱乐应用程序附件的车辆数据的系统和方法
Betancur et al. Head-up and head-down displays integration in automobiles
US20190158629A1 (en) Systems and methods to aggregate vehicle data from infotainment application accessories
EP4009251B1 (fr) Dispositif de sortie d'informations, et procédé de sortie d'informations
CN115946530A (zh) 车载设备、交互控制方法、车辆、以及计算机程序产品

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23738938

Country of ref document: EP

Kind code of ref document: A2