WO2023078143A1 - Interface interaction method, electronic device, medium, and program product - Google Patents

Interface interaction method, electronic device, medium, and program product

Info

Publication number
WO2023078143A1
WO2023078143A1 (PCT/CN2022/127800)
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
interaction
information
interface
terminal device
Prior art date
Application number
PCT/CN2022/127800
Other languages
English (en)
French (fr)
Inventor
龚伟辉
宁维赛
卢冬
Original Assignee
Huawei Technologies Co., Ltd. (华为技术有限公司)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co., Ltd.
Publication of WO2023078143A1

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from the processing unit to the output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481: based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/0484: for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04842: Selection of displayed objects or displayed text elements
    • G06F 9/00: Arrangements for program control, e.g. control units
    • G06F 9/06: using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44: Arrangements for executing specific programs
    • G06F 9/451: Execution arrangements for user interfaces
    • G06F 9/452: Remote windowing, e.g. X-Window System, desktop virtualisation

Definitions

  • the present disclosure generally relates to the field of information technology, and more particularly relates to an interface interaction method, an electronic device, a computer-readable storage medium, and a computer program product.
  • Embodiments of the present disclosure relate to a scheme for interconnection between a terminal device and a vehicle-machine device to realize remote interaction for a vehicle interface, and specifically provide an interface interaction method, an electronic device, a computer-readable storage medium, and a computer program product.
  • an interface interaction method is provided.
  • the terminal device receives image information from the vehicle-machine device based on the connection with the vehicle-machine device.
  • the vehicle-machine device is associated with a group of vehicle-mounted display devices.
  • the image information is generated based on a set of in-vehicle interfaces presented by the group of vehicle-mounted display devices.
  • the terminal device presents a target interface based on the image information, and the target interface includes a group of windows corresponding to a group of vehicle interfaces.
  • in response to receiving an interaction operation associated with the group of windows, the terminal device sends interaction information corresponding to the interaction operation to the vehicle-machine device.
  • the embodiments of the present disclosure can quickly control the interface of the vehicle-mounted display device through the terminal device, thereby improving the convenience of interaction and ensuring the safety of interaction during driving.
  • the method further includes: receiving updated image information from the vehicle-machine device, the updated image information corresponding to a set of updated in-vehicle interfaces, at least one of which is updated in response to the interaction information; and presenting an updated interface based on the updated image information.
  • the embodiments of the present disclosure can make the user interact as if directly on the corresponding vehicle display device, thereby improving the user's interaction experience.
  • presenting the target interface based on the image information includes: generating, based on the image information, a group of images corresponding to the group of vehicle-mounted interfaces; generating, based on the group of images, a group of windows corresponding to the group of vehicle-mounted interfaces; and presenting a target interface that includes the generated group of windows. Therefore, the embodiments of the present disclosure can allow the user to remotely view the current interface of one or more vehicle-mounted display devices, thereby facilitating the management of the vehicle-mounted display devices.
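  • By way of a non-limiting sketch (in Python, with all names hypothetical rather than taken from the disclosure), the flow of generating a group of windows from the received image information and composing the target interface might look like:

```python
from dataclasses import dataclass

@dataclass
class Window:
    display_id: str   # which vehicle-mounted display this window mirrors
    x: int            # window position within the target interface
    y: int
    width: int
    height: int
    image: bytes      # latest decoded frame of that in-vehicle interface

def build_target_interface(image_info: dict) -> list:
    """Generate one window per vehicle-mounted display from image information.

    image_info maps a display identifier (e.g. "central_control") to the
    encoded frame of the interface currently presented on that display.
    """
    windows = []
    for i, (display_id, frame) in enumerate(sorted(image_info.items())):
        # Simple vertical stack; a real implementation would honour the
        # terminal's screen geometry and each display's aspect ratio.
        windows.append(Window(display_id, x=0, y=i * 360,
                              width=640, height=360, image=frame))
    return windows
```

  Presenting the target interface then amounts to drawing each window's image in its display area, optionally together with a window identifier naming the corresponding vehicle-mounted display device.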
  • the target interface further includes a group of window identifiers corresponding to a group of windows, where the window identifiers are used to indicate the vehicle-mounted display device corresponding to the corresponding window. Therefore, the embodiments of the present disclosure can allow the user to more conveniently know which vehicle-mounted display device the window corresponds to.
  • the method further includes: receiving an interaction operation associated with the group of windows; determining position information corresponding to the interaction operation, the position information indicating where the interaction operation occurs on the terminal device; determining, from the group of windows, a target window corresponding to the position information; and generating interaction information based on the position information and the window display area of the target window in the terminal device. Therefore, the embodiments of the present disclosure can improve the accuracy of user interaction and reduce interaction delay.
  • the position information is first position information, and generating the interaction information based on the first position information and the window display area of the target window in the terminal device includes: converting the first position information into second position information based on the window display area, wherein the first position information indicates a first set of coordinates in a first coordinate system associated with the terminal device, and the second position information indicates a second set of coordinates in a second coordinate system associated with the target window; and generating the interaction information based on the second position information, the interaction information further including device information indicating the target vehicle-mounted display device corresponding to the target window.
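  • As a non-limiting illustration of the described coordinate conversion (all names hypothetical; the disclosure does not prescribe any particular data layout), a hit test plus translation from terminal coordinates (first position information) to window-local coordinates (second position information) could be sketched as:

```python
def to_window_coordinates(windows, touch_x, touch_y):
    """Map a touch point given in the terminal's coordinate system to the
    coordinate system of the window it falls in.

    windows: list of dicts with keys "device", "x", "y", "width", "height"
    describing each mirrored window's display area on the terminal screen.
    Returns interaction information (device info plus window-local
    coordinates), or None if the touch missed every mirrored window.
    """
    for w in windows:
        inside_x = w["x"] <= touch_x < w["x"] + w["width"]
        inside_y = w["y"] <= touch_y < w["y"] + w["height"]
        if inside_x and inside_y:
            # Translate the origin to the window's top-left corner; if the
            # window is a scaled mirror, a further rescale to the display's
            # native resolution would follow here.
            return {"device": w["device"],
                    "x": touch_x - w["x"],
                    "y": touch_y - w["y"]}
    return None
```

  The returned dictionary plays the role of the interaction information: window-local coordinates plus the device information identifying the target vehicle-mounted display device.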
  • the embodiments of the present disclosure can allow the corresponding terminal display device to be quickly operated, thereby improving the convenience of interaction.
  • the interaction operation is a first interaction operation, the interaction information is first interaction information, and the target interface further includes an interface element associated with a local application of the terminal device. The method further includes: receiving a second interaction operation associated with the interface element, the second interaction operation indicating use of a target I/O device associated with the vehicle-machine device; and sending second interaction information corresponding to the second interaction operation to the vehicle-machine device, so that the target I/O device is enabled.
  • the embodiments of the present disclosure can allow the terminal device to invoke vehicle-mounted resources to perform interaction, thereby enriching interaction scenarios and improving convenience.
  • the method further includes: sending a disabling request to the vehicle-machine device, so that the screen interaction function of at least one vehicle-mounted display device in a group of vehicle-mounted display devices is disabled. Therefore, the embodiments of the present disclosure can further strengthen the control and management of the vehicle display device.
  • the method further includes: establishing a connection with the in-vehicle device, so that the in-vehicle resource associated with the in-vehicle device is virtualized as a virtual resource of the terminal device, and the in-vehicle resource includes a group of in-vehicle display devices.
  • the in-vehicle resource further includes an input/output (I/O) device associated with the vehicle-machine device, and the I/O device includes at least one of the following: a camera, a microphone, or a speaker.
  • the group of vehicle-mounted display devices includes at least one of the following: a central control display device, a co-pilot display device, or a rear-seat display device.
  • an interface interaction method is provided.
  • the vehicle-machine device sends image information to the terminal device based on the connection with the terminal device, and the image information is generated based on a group of vehicle-mounted interfaces presented by a group of vehicle-mounted display devices.
  • the vehicle-machine device receives interaction information from the terminal device, the interaction information being generated based on an interaction operation associated with a group of windows in a target interface of the terminal device, where the target interface is generated based on the image information and includes the group of windows corresponding to the group of vehicle-mounted interfaces.
  • the in-vehicle device performs an interaction action corresponding to the interaction information.
  • the embodiments of the present disclosure can quickly control the interface of the vehicle-mounted display device through the terminal device, thereby improving the convenience of interaction and ensuring the safety of interaction during driving.
  • the method further includes: generating updated image information based on the execution of the interaction action, the updated image information corresponding to a set of updated vehicle-mounted interfaces, at least one of which is updated in response to the interaction action; and sending the updated image information to the terminal device.
  • the embodiments of the present disclosure can make the user interact as if directly on the corresponding vehicle display device, thereby improving the user's interaction experience.
  • the method further includes: determining, based on the device information in the interaction information, the target vehicle-mounted display device corresponding to the interaction information; and determining, based on the target vehicle-mounted display device and the position information in the interaction information, the interaction action corresponding to the interaction information. Therefore, the embodiments of the present disclosure can improve the accuracy of user interaction and reduce interaction delay.
  • executing the interaction action corresponding to the interaction information includes at least one of the following: sending a corresponding control instruction to a vehicle component connected to the vehicle-machine device, the control instruction being generated based on the interaction information; or causing at least one in-vehicle interface in the group of in-vehicle interfaces to be updated in response to the interaction information. Therefore, the embodiments of the present disclosure can provide richer remote interaction capabilities.
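  • A rough, hypothetical sketch of the vehicle-side dispatch described above (the transport and input-injection layers are stand-in stubs, not part of the disclosure):

```python
# Stand-in transport and input-injection layers, recorded for illustration.
control_log = []
touch_log = []

def send_control_instruction(instruction):
    control_log.append(instruction)      # e.g. forward onto the vehicle bus

def inject_touch(device_id, x, y):
    touch_log.append((device_id, x, y))  # e.g. feed the display's input driver

def execute_interaction(interaction):
    """Dispatch interaction information received from the terminal device."""
    if interaction.get("kind") == "control":
        # The interaction maps to a vehicle component (e.g. air conditioning):
        # generate and send the corresponding control instruction.
        send_control_instruction(interaction)
    else:
        # Otherwise inject the operation into the target display so that the
        # corresponding in-vehicle interface is updated in response.
        inject_touch(interaction["device"], interaction["x"], interaction["y"])
```

  After either branch, the vehicle side would regenerate image information from the updated interfaces and send it back to the terminal device, as described above.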
  • the interaction operation is a first interaction operation, the interaction information is first interaction information, and the target interface further includes an interface element associated with a local application of the terminal device. The method further includes: receiving second interaction information from the terminal device, the second interaction information being generated based on a second interaction operation associated with the interface element, the second interaction operation indicating use of a target I/O device associated with the vehicle-machine device; and enabling the target I/O device for use by the terminal device.
  • the embodiments of the present disclosure can allow the terminal device to invoke vehicle-mounted resources to perform interaction, thereby enriching interaction scenarios and improving convenience.
  • the method further includes: in response to a disabling request received from the terminal device, disabling the screen interaction function of at least one vehicle-mounted display device in a group of vehicle-mounted display devices. Therefore, the embodiments of the present disclosure can further strengthen the control and management of the vehicle display device.
  • the method further includes: establishing a connection with the terminal device, so that the in-vehicle resources associated with the vehicle-machine device are virtualized as virtual resources of the terminal device, the in-vehicle resources including the group of vehicle-mounted display devices.
  • the in-vehicle resource further includes an input/output (I/O) device associated with the vehicle-machine device, and the I/O device includes at least one of the following: a camera, a microphone, or a speaker.
  • the group of vehicle-mounted display devices includes at least one of the following: a central control display device, a co-pilot display device, or a rear-seat display device.
  • in a third aspect of the present disclosure, a terminal device is provided, including a processor and a memory storing instructions. When executed by the processor, the instructions cause the terminal device to perform any method according to the first aspect and its implementations.
  • in a further aspect, an in-vehicle device is provided, including a processor and a memory storing instructions. When executed by the processor, the instructions cause the vehicle-machine device to perform any method according to the second aspect and its implementations.
  • a computer-readable storage medium stores instructions that, when executed by an electronic device, cause the electronic device to perform any method of the first aspect and its implementations.
  • a computer program product includes instructions that, when executed by an electronic device, cause the electronic device to perform any method of the first aspect and its implementations.
  • FIG. 1 shows a schematic diagram of a hardware structure of an electronic device that can implement an embodiment of the present disclosure.
  • FIG. 2 shows a schematic diagram of an example environment in which embodiments according to the present disclosure may be implemented.
  • FIGS. 3A-3C illustrate schematic diagrams of example interfaces according to embodiments of the present disclosure.
  • FIGS. 4A to 4C show schematic diagrams of example interaction processes according to embodiments of the present disclosure.
  • FIG. 5 shows an example process of an interaction method according to an embodiment of the present disclosure.
  • FIG. 6 shows a schematic architecture diagram of an interaction system according to an embodiment of the present disclosure.
  • FIG. 7 shows an example interaction process of the interaction system according to an embodiment of the present disclosure.
  • FIG. 8 shows a flowchart of an example process of an interface interaction method according to some embodiments of the present disclosure.
  • FIG. 9 shows a flowchart of an example process of an interface interaction method according to still other embodiments of the present disclosure.
  • the term “comprising” and its analogs should be interpreted as an open inclusion, i.e., “including but not limited to”.
  • the term “based on” should be understood as “based at least in part on”.
  • the term “one embodiment” or “the embodiment” should be read as “at least one embodiment”.
  • the terms “first”, “second”, etc. may refer to different or the same objects, and are used only to distinguish the referred objects, without implying a particular spatial order, temporal order, order of importance, and so on.
  • where values, processes, selected items, determined items, equipment, devices, means, components, etc. are referred to as “best”, “lowest”, “highest”, “minimum”, “maximum”, and so on, it will be understood that such descriptions do not imply that embodiments must possess those qualities; they merely describe particular embodiments among many possible ones.
  • the term “determining” can encompass a wide variety of actions. For example, “determining” may include computing, calculating, processing, deriving, investigating, looking up (e.g., in a table, database, or another data structure), ascertaining, and the like. Also, “determining” may include receiving (e.g., receiving information), accessing (e.g., accessing data in a memory), and the like. Furthermore, “determining” may include resolving, selecting, choosing, establishing, and the like.
  • FIG. 1 shows a schematic diagram of a hardware structure of an electronic device 100 that can implement an embodiment of the present disclosure.
  • the electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, antenna 1, antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone jack 170D, a sensor module 180, a button 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a subscriber identification module (SIM) card interface 195, etc.
  • the sensor module 180 may include a pressure sensor 180A, a gyro sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, bone conduction sensor 180M, etc.
  • the structures shown in the embodiments of the present disclosure do not constitute a specific limitation on the electronic device 100 .
  • the electronic device 100 may include more or fewer components than shown, or combine some components, or separate some components, or arrange different components.
  • the illustrated components can be realized in hardware, software or a combination of software and hardware.
  • the processor 110 may include one or more processing units, for example: the processor 110 may include an application processor (application processor, AP), a modem processor, a graphics processing unit (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), controller, video codec, digital signal processor (digital signal processor, DSP), baseband processor, and/or neural network processor (neural-network processing unit, NPU), etc. Wherein, different processing units may be independent devices, or may be integrated in one or more processors.
  • the controller can generate an operation control signal according to the instruction opcode and timing signal, and controls instruction fetching and execution.
  • a memory may also be provided in the processor 110 for storing instructions and data.
  • the memory in processor 110 is a cache memory.
  • the memory may hold instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs to use the instructions or data again, it can call them directly from this memory, which avoids repeated access and reduces the waiting time of the processor 110, thereby improving system efficiency.
  • processor 110 may include one or more interfaces.
  • the interface may include an integrated circuit (inter-integrated circuit, I2C) interface, an integrated circuit built-in audio (inter-integrated circuit sound, I2S) interface, a pulse code modulation (pulse code modulation, PCM) interface, a universal asynchronous transmitter (universal asynchronous receiver/transmitter, UART) interface, mobile industry processor interface (mobile industry processor interface, MIPI), general-purpose input and output (general-purpose input/output, GPIO) interface, subscriber identity module (subscriber identity module, SIM) interface, and /or universal serial bus (universal serial bus, USB) interface, etc.
  • the I2C interface is a bidirectional synchronous serial bus, including a serial data line (SDA) and a serial clock line (SCL).
  • processor 110 may include multiple sets of I2C buses.
  • the processor 110 may be respectively coupled to the touch sensor 180K, the charger, the flash, the camera 193, and the like through different I2C bus interfaces.
  • the processor 110 may be coupled to the touch sensor 180K through an I2C interface, so that the processor 110 communicates with the touch sensor 180K through the I2C bus interface to realize the touch function of the electronic device 100 .
  • the I2S interface can be used for audio communication.
  • processor 110 may include multiple sets of I2S buses.
  • the processor 110 may be coupled to the audio module 170 through an I2S bus to implement communication between the processor 110 and the audio module 170 .
  • the audio module 170 can transmit audio signals to the wireless communication module 160 through the I2S interface, so as to realize the function of answering calls through the Bluetooth headset.
  • the PCM interface can also be used for audio communication, sampling, quantizing and encoding the analog signal.
  • the audio module 170 and the wireless communication module 160 may be coupled through a PCM bus interface.
  • the audio module 170 can also transmit audio signals to the wireless communication module 160 through the PCM interface, so as to realize the function of answering calls through the Bluetooth headset. Both the I2S interface and the PCM interface can be used for audio communication.
  • the UART interface is a universal serial data bus used for asynchronous communication.
  • the bus can be a bidirectional communication bus that converts the data to be transmitted between serial and parallel forms.
  • a UART interface is generally used to connect the processor 110 and the wireless communication module 160 .
  • the processor 110 communicates with the Bluetooth module in the wireless communication module 160 through the UART interface to realize the Bluetooth function.
  • the audio module 170 can transmit audio signals to the wireless communication module 160 through the UART interface, so as to realize the function of playing music through the Bluetooth headset.
  • the MIPI interface can be used to connect the processor 110 with peripheral devices such as the display screen 194 and the camera 193 .
  • MIPI interface includes camera serial interface (camera serial interface, CSI), display serial interface (display serial interface, DSI), etc.
  • the processor 110 communicates with the camera 193 through the CSI interface to realize the shooting function of the electronic device 100 .
  • the processor 110 communicates with the display screen 194 through the DSI interface to realize the display function of the electronic device 100 .
  • the GPIO interface can be configured by software.
  • the GPIO interface can be configured as a control signal or as a data signal.
  • the GPIO interface can be used to connect the processor 110 with the camera 193 , the display screen 194 , the wireless communication module 160 , the audio module 170 , the sensor module 180 and so on.
  • the GPIO interface can also be configured as an I2C interface, I2S interface, UART interface, MIPI interface, etc.
  • the USB interface 130 is an interface conforming to the USB standard specification, specifically, it may be a Mini USB interface, a Micro USB interface, a USB Type C interface, and the like.
  • the USB interface 130 can be used to connect a charger to charge the electronic device 100 , and can also be used to transmit data between the electronic device 100 and peripheral devices. It can also be used to connect headphones and play audio through them. This interface can also be used to connect other electronic devices, such as AR devices.
  • the interface connection relationship among the modules shown in the embodiments of the present disclosure is only for schematic illustration, and does not constitute a structural limitation of the electronic device 100 .
  • the electronic device 100 may also adopt different interface connection manners in the foregoing embodiments, or a combination of multiple interface connection manners.
  • the charging management module 140 is configured to receive a charging input from a charger.
  • the charger may be a wireless charger or a wired charger.
  • the charging management module 140 can receive charging input from the wired charger through the USB interface 130 .
  • the charging management module 140 may receive a wireless charging input through a wireless charging coil of the electronic device 100 . While the charging management module 140 is charging the battery 142 , it can also supply power to the electronic device 100 through the power management module 141 .
  • the power management module 141 is used for connecting the battery 142 , the charging management module 140 and the processor 110 .
  • the power management module 141 receives the input of the battery 142 and/or the charging management module 140 to supply power for the processor 110 , the internal memory 121 , the display screen 194 , the camera 193 , and the wireless communication module 160 .
  • the power management module 141 can also be used to monitor parameters such as battery capacity, battery cycle times, and battery health status (leakage, impedance).
  • the power management module 141 may also be disposed in the processor 110 .
  • the power management module 141 and the charging management module 140 may also be set in the same device.
  • the wireless communication function of the electronic device 100 may be realized by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
  • Antenna 1 and Antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in electronic device 100 may be used to cover single or multiple communication frequency bands. Different antennas can also be multiplexed to improve the utilization of the antennas.
  • antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
  • the mobile communication module 150 can provide wireless communication solutions including 2G/3G/4G/5G/6G applied on the electronic device 100 .
  • the mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA) and the like.
  • the mobile communication module 150 can receive electromagnetic waves through the antenna 1, filter and amplify the received electromagnetic waves, and send them to the modem processor for demodulation.
  • the mobile communication module 150 can also amplify the signals modulated by the modem processor, and convert them into electromagnetic waves through the antenna 1 for radiation.
  • at least part of the functional modules of the mobile communication module 150 may be set in the processor 110 .
  • at least part of the functional modules of the mobile communication module 150 and at least part of the modules of the processor 110 may be set in the same device.
  • a modem processor may include a modulator and a demodulator.
  • the modulator is used for modulating the low-frequency baseband signal to be transmitted into a medium-high frequency signal.
  • the demodulator is used to demodulate the received electromagnetic wave signal into a low frequency baseband signal. Then the demodulator sends the demodulated low-frequency baseband signal to the baseband processor for processing.
  • the low-frequency baseband signal is passed to the application processor after being processed by the baseband processor.
  • the application processor outputs a sound signal through an audio device (not limited to a speaker 170A, a receiver 170B, etc.), or displays an image or video through a display screen 194 .
  • the modem processor may be a stand-alone device. In some other embodiments, the modem processor may be independent from the processor 110, and be set in the same device as the mobile communication module 150 or other functional modules.
  • the wireless communication module 160 can provide wireless communication solutions applied to the electronic device 100, including wireless local area network (WLAN) (such as wireless fidelity (Wi-Fi) networks), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), and infrared (IR) technologies.
  • the wireless communication module 160 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 160 receives electromagnetic waves via the antenna 2 , frequency-modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 110 .
  • the wireless communication module 160 can also receive the signal to be sent from the processor 110 , frequency-modulate it, amplify it, and convert it into electromagnetic waves through the antenna 2 for radiation.
  • the antenna 1 of the electronic device 100 is coupled to the mobile communication module 150, and the antenna 2 is coupled to the wireless communication module 160, so that the electronic device 100 can communicate with the network and other devices through wireless communication technology.
  • the wireless communication technology may include Global System for Mobile Communications (GSM), General Packet Radio Service (GPRS), Code Division Multiple Access (CDMA), Wideband Code Division Multiple Access (WCDMA), Time Division-Synchronous Code Division Multiple Access (TD-SCDMA), Long Term Evolution (LTE), 5G and subsequent evolution standards, BT, GNSS, WLAN, NFC, FM, and/or IR technologies, etc.
  • the GNSS may include Global Positioning System (GPS), Global Navigation Satellite System (GLONASS), BeiDou Satellite Navigation System (BDS), Quasi-Zenith Satellite System (QZSS) and/or Satellite-Based Augmentation System (SBAS).
  • the electronic device 100 realizes the display function through the GPU, the display screen 194 , and the application processor.
  • the GPU is a microprocessor for image processing, and is connected to the display screen 194 and the application processor. GPUs are used to perform mathematical and geometric calculations for graphics rendering.
  • Processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
  • the display screen 194 is used to display images, videos and the like.
  • the display screen 194 includes a display panel.
  • the electronic device 100 may include 1 or N display screens 194 , where N is a positive integer greater than 1.
  • the electronic device 100 can realize the shooting function through the ISP, the camera 193 , the video codec, the GPU, the display screen 194 , and the application processor.
  • the ISP is used for processing the data fed back by the camera 193 .
  • light is transmitted through the lens to the photosensitive element of the camera, where the light signal is converted into an electrical signal; the photosensitive element then transmits the electrical signal to the ISP, which processes it and converts it into an image visible to the naked eye.
  • the ISP can also perform algorithm optimization on image noise, brightness, and skin color, and can optimize parameters such as the exposure and color temperature of the shooting scene.
  • the ISP may be located in the camera 193 .
  • Camera 193 is used to capture still images or video.
  • the object generates an optical image through the lens and projects it to the photosensitive element.
  • the photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor.
  • the photosensitive element converts the light signal into an electrical signal, and then transmits the electrical signal to the ISP to convert it into a digital image signal.
  • the ISP outputs the digital image signal to the DSP for processing.
  • the DSP converts the digital image signal into an image signal in a standard format such as RGB or YUV.
  • the electronic device 100 may include 1 or N cameras 193 , where N is a positive integer greater than 1.
  • Digital signal processors are used to process digital signals. In addition to digital image signals, they can also process other digital signals. For example, when the electronic device 100 selects a frequency point, the digital signal processor is used to perform Fourier transform on the energy of the frequency point.
  • Video codecs are used to compress or decompress digital video.
  • the electronic device 100 may support one or more video codecs.
  • the electronic device 100 can play or record videos in various encoding formats, for example, moving picture experts group (MPEG)-1, MPEG-2, MPEG-3, MPEG-4, etc.
  • the NPU is a neural-network (NN) computing processor.
  • Applications such as intelligent cognition of the electronic device 100 can be implemented through the NPU, such as image recognition, face recognition, speech recognition, text understanding, and the like.
  • the external memory interface 120 can be used to connect an external memory card, such as a Micro SD card, so as to expand the storage capacity of the electronic device 100.
  • the external memory card communicates with the processor 110 through the external memory interface 120 to realize the data storage function, for example, saving music, video, and other files in the external memory card.
  • the internal memory 121 may be used to store computer-executable program codes including instructions.
  • the internal memory 121 may include a program storage area and a data storage area.
  • the program storage area can store an operating system, at least one application program required by a function (such as a sound playing function or an image playing function), and the like.
  • the data storage area can store data created during the use of the electronic device 100 (such as audio data, a phonebook, etc.) and the like.
  • the internal memory 121 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, flash memory device, universal flash storage (universal flash storage, UFS) and the like.
  • the processor 110 executes various functional applications and data processing of the electronic device 100 by executing instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor.
  • the electronic device 100 can implement audio functions, such as music playback and recording, through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the earphone interface 170D, and the application processor.
  • the SIM card interface 195 is used for connecting a SIM card.
  • the SIM card can be connected to or separated from the electronic device 100 by inserting it into, or pulling it out of, the SIM card interface 195.
  • the electronic device 100 may support 1 or N SIM card interfaces, where N is a positive integer greater than 1.
  • the SIM card interface 195 can support a Nano SIM card, a Micro SIM card, a SIM card, and so on. Multiple cards can be inserted into the same SIM card interface 195 at the same time, and the types of the multiple cards may be the same or different.
  • the SIM card interface 195 is also compatible with different types of SIM cards.
  • the SIM card interface 195 is also compatible with external memory cards.
  • the electronic device 100 interacts with the network through the SIM card to implement functions such as calling and data communication.
  • the electronic device 100 adopts an eSIM, that is, an embedded SIM card.
  • the eSIM card can be embedded in the electronic device 100 and cannot be separated from the electronic device 100 .
  • the software system of the electronic device 100 may adopt a layered architecture, an event-driven architecture, a micro-kernel architecture, a micro-service architecture, or a cloud architecture.
  • the embodiments of the present disclosure take a mobile operating system with a layered architecture as an example to illustrate the software structure of the electronic device 100 .
  • vehicles are playing an increasingly important role in people's daily lives.
  • Various electronic technologies are gradually being applied to vehicles.
  • a significant trend is that vehicles are providing drivers and passengers with richer information through various types of screens, so as to assist the driver or enrich the passengers' journey.
  • some vehicles can also provide passengers in the vehicle with information such as news, weather, navigation, audio, and video.
  • such an in-vehicle interaction method may also cause various unexpected problems. For example, some child seats may only be installed in the rear row, so a child sitting alone in the rear row may not know how to interact with the rear-row display device to obtain desired content, or the driver or co-pilot passenger may not want the child to use the rear-row display device to obtain certain content.
  • the vehicle may only include the driver and a child passenger in the rear row, and the child passenger may not be able to grasp the interaction method of the rear-row display device, which may require the driver to go to the rear row to assist.
  • a child passenger may have specific needs while the vehicle is in motion, which may force the driver to temporarily stop the vehicle.
  • the passenger in the co-pilot seat may be the parent of a rear-row passenger and may not want the rear-row passenger to play games on the rear-row display device while the vehicle is in motion.
  • parents may need to turn around frequently to control the rear display devices.
  • the passenger in the co-pilot seat may not be familiar with the vehicle interaction system and may not be able to quickly use the co-pilot display device to obtain desired services; the passenger may therefore seek help from the driver, which may lead to dangerous behavior by the driver during driving.
  • a terminal device may receive image information from the vehicle-machine device based on the connection with the vehicle-machine device.
  • the vehicle-machine device may be associated with a group of vehicle-mounted display devices (for example, a vehicle-mounted central control display device, a passenger seat display device, a rear row display device, etc.).
  • the terminal device may present a target interface based on the image information, and the target interface includes a group of windows corresponding to a group of vehicle interfaces presented by the group of vehicle display devices.
  • the terminal device may send interaction information corresponding to the interaction operation to the vehicle-machine device, so that the vehicle-machine device performs an action corresponding to the interaction operation.
  • the embodiments of the present disclosure can present the interface of the vehicle-mounted display device on the terminal device, so as to facilitate understanding the states of different vehicle-mounted display devices.
  • the embodiments of the present disclosure can also allow remote control of the corresponding vehicle-mounted display device based on interactive operations on the terminal device. Therefore, the embodiments of the present disclosure can improve the convenience of interaction and ensure the safety of interaction during driving.
  • FIG. 2 shows a schematic diagram of an example environment 200 in which various embodiments of the present disclosure can be implemented.
  • the example environment 200 may include a terminal device 210 and an in-vehicle device 240 communicatively connected to the terminal device 210 .
  • the terminal device 210 can be connected to the vehicle-machine device 240 through any suitable connection method, examples of which can include but are not limited to: a wired connection (for example, via a USB cable) or a wireless connection (for example, Bluetooth, Wi-Fi, or a cellular network).
  • the terminal device 210 may include any suitable type of electronic device, for example, a smart phone, a tablet computer, a notebook computer, a smart watch, smart glasses, a smart projector, and the like. As shown in FIG. 2, such a terminal device 210 can use a corresponding display screen or projection device to present a graphical user interface 220.
  • the in-vehicle device 240 may be associated with one or more in-vehicle display devices. As shown in FIG. 2, the in-vehicle device 240 can, for example, be used to control a plurality of vehicle display devices 250-1 and 250-2 (individually or collectively referred to as the in-vehicle display device 250). It should be understood that although two vehicle-mounted display devices 250 are shown in FIG. 2, vehicle-mounted display devices can be deployed in any appropriate position in the vehicle, for example, as rear-row display devices.
  • such a vehicle-mounted display device 250 may present a corresponding vehicle-mounted interface, such as a vehicle-mounted interface 260-1 and a vehicle-mounted interface 260-2 (individually or collectively referred to as the vehicle-mounted interface 260).
  • the central control in-vehicle display device 250-1 may present an in-vehicle interface 260-1, which may include interface elements for controlling different vehicle components (eg, air conditioner, lighting, sunroof, etc.).
  • the co-pilot display device 250-2 may, for example, present an in-vehicle interface 260-2, which may include, for example, interface elements associated with entertainment functions.
  • the terminal device 210 may receive image information from the vehicle-machine device 240 for presenting the interface 220 .
  • the interface 220 may include, for example, a first window 230-1 corresponding to the vehicle interface 260-1 of the central control display device 250-1.
  • the interface 220 may further include a second window 230-2 corresponding to the vehicle interface 260-2 of the co-pilot display device 250-2.
  • the first window 230-1 and the second window 230-2 may be called windows 230 individually or collectively.
  • the specific implementation of the generation interface 220 will be described in detail below in conjunction with FIG. 5 , and will not be described in detail here.
  • the embodiments of the present disclosure enable drivers, passengers or other appropriate users (not necessarily inside the vehicle) to quickly view the interfaces currently presented by one or more vehicle-mounted display devices through the terminal device.
  • the embodiments of the present disclosure also allow the user to remotely interact with the vehicle interface through the terminal device 210 .
  • the user can directly operate the elements in the window 230 through the touch screen provided by the terminal device 210, as if the user were directly manipulating the corresponding vehicle display device 250.
  • the user may click an interface element in the interface 220 presented by the terminal device 210 to, for example, adjust the temperature of the vehicle air conditioner.
  • the specific implementation of the remote interaction will be described in detail below in conjunction with FIG. 5 , and will not be described in detail here.
  • the embodiments of the present disclosure also enable the driver, passengers or other appropriate users (not necessarily inside the vehicle) to quickly operate the interface currently presented by one or more vehicle-mounted display devices through the terminal device.
  • FIG. 3A shows a schematic diagram 300A of an example interface according to some embodiments of the present disclosure.
  • the terminal device 210 may present a first interface 310 , for example.
  • the first interface 310 may include, for example, a vehicle information display area 302 , which may, for example, present information related to a group of vehicle display devices 250 .
  • the in-vehicle information display area 302 may include, for example, identifications of currently enabled in-vehicle display devices (for example, "central control display device" and "co-pilot display device").
  • the in-vehicle information display area 302 may also include information of an in-vehicle display device that is associated with the in-vehicle device 240 but is currently not activated (for example, identified as "rear row display device").
  • the in-vehicle information display area 302 may include a real-time image corresponding to the in-vehicle interface 260, which may be presented at a smaller resolution, for example.
  • the user can intuitively know the real-time status of the corresponding vehicle-mounted display device 250 without affecting the user's normal operation of the current terminal device 210 .
  • the terminal device 210 can also present an interface of a local video application through the display area 304 .
  • the image corresponding to the vehicle interface 260 in the vehicle information display area 302 may be refreshed at a lower frequency, for example.
  • the terminal device 210 may receive images of the vehicle interface 260 at a frequency of 24 frames per second, but update the image in the vehicle information display area 302 at a frequency of only 4 frames per second. Therefore, the embodiments of the present disclosure can both satisfy the user's need to view the real-time status of the corresponding vehicle display device 250 and reduce the terminal device's image rendering cost.
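  • the throttling described above can be sketched as follows (a minimal illustration only; the class and field names are assumptions, not taken from the patent):

```python
import time

class FrameThrottler:
    """Decouple the incoming frame rate from the display refresh rate.

    Frames may arrive at, e.g., 24 fps, but the preview area is redrawn
    at most `display_fps` times per second; frames received in between
    are simply dropped.
    """

    def __init__(self, display_fps=4, clock=time.monotonic):
        self.min_interval = 1.0 / display_fps  # seconds between redraws
        self.clock = clock
        self.last_redraw = float("-inf")

    def should_render(self):
        """Return True if enough time has passed to redraw the preview."""
        now = self.clock()
        if now - self.last_redraw >= self.min_interval:
            self.last_redraw = now
            return True
        return False
```

With a 24 fps input stream, `should_render` returns True roughly 4 times per second, so the terminal only pays the rendering cost for a quarter-second cadence while still showing a near-real-time preview.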
  • the in-vehicle information display area 302 may not present images corresponding to the in-vehicle interface 260, but instead provide a window corresponding to the in-vehicle interface 260 only when the user initiates a request to view it.
  • the terminal device 210 may, for example, present the second interface 320 .
  • the user may, for example, click the text "central control display device" in the first interface 310, or the image or video above it, through the touch screen of the terminal device 210 to initiate a request to view the real-time interface of the central control display device 250-1.
  • the second interface 320 may include, for example, a window 312 corresponding to the vehicle interface 260-1 of the central control display device 250-1, so as to present an image corresponding to the vehicle interface 260-1 through a larger display area.
  • the image in window 312 may be updated in real time based on image information received from vehicle-machine device 240 .
  • the refresh frequency of the images in the window 312 may be higher than the refresh frequency of the images in the vehicle information display area 302 , for example.
  • as shown in FIG. 3B, when the image corresponding to the central control display device 250-1 is presented through the window 312, its corresponding information may be hidden from the vehicle information display area 302.
  • the terminal device 210 may also adjust the display size of the window 312 according to the user's operation. For example, when the user clicks the maximize button of the window 312, the terminal device 210 can display the window 312 in full screen. In some embodiments, when the user clicks the minimize button of the window 312, the terminal device 210 can close the window 312 and re-display the image corresponding to the vehicle interface in the vehicle information display area 302 as shown in FIG. 3A.
  • the user can also realize real-time control of the corresponding central control display device 250-1 by operating the interface elements in the window 312.
  • the terminal device 210 may also present an interface 220 (also referred to as a third interface).
  • the terminal device 210 may, for example, present the third interface 220 in response to the user's request to view the “central control display device” and the “co-pilot display device” in the vehicle information display area 302 .
  • the third interface 220 may include a first window 230-1 corresponding to the central control display device 250-1, and a second window 230-2 corresponding to the co-pilot display device 250-2.
  • the third interface 220 may further include, for example, window identifiers 332-1 and 332-2 corresponding to the windows 230-1 and 230-2 (individually or collectively referred to as window identifiers 332).
  • the window identifier can be used, for example, to indicate the vehicle-mounted display device corresponding to the corresponding window.
  • the window identifier of the first window 230-1 is "CAR DESKTOP 1", which can be used to indicate that the image presented by the first window 230-1 corresponds to the vehicle interface 260-1 of the central control display device 250-1, for example.
  • the window identifier of the second window 230-2 is "CAR DESKTOP 2", which can be used to indicate, for example, that the image presented by the second window 230-2 corresponds to the vehicle interface 260-2 of the co-pilot display device 250-2.
  • the first window 230-1 may include interface elements corresponding to the controls in the vehicle interface 260-1, such as an interface element 334-1 and an interface element 334-2 (individually or collectively referred to as interface elements 334).
  • the terminal device 210 may receive an interactive operation on the interface element 334, and generate interactive information accordingly so that the user can feel as if he is directly operating the corresponding vehicle display device. Details about interaction scenarios and interaction implementation will be described in detail below.
  • the interface 220 may further include interface elements associated with local applications of the terminal device 210 .
  • interface 220 may include interface element 336 associated with a "camera application,” interface element 338 associated with a "phone application,” and so forth.
  • the terminal device 210 may also receive an interactive operation on the interface element 336 or the interface element 338 , and may utilize an input/output (I/O) device associated with the in-vehicle device 240 to complete the corresponding interaction.
  • the embodiments of the present disclosure can improve the convenience of management of the vehicle display device by providing the terminal device with an image related to the vehicle interface of the vehicle display device.
  • an example of using the terminal device 210 to implement remote interaction with the vehicle interface 260 of the vehicle-mounted display device 250 will be described below with reference to FIGS. 4A to 4C.
  • the user can, for example, control the vehicle through a window presented in the terminal device 210 .
  • the first window 230 - 1 may include, for example, interface elements for controlling vehicle components.
  • the interface element 410 may correspond to a control for controlling the temperature of the air conditioner in the driver's area in the vehicle interface 260-1.
  • the terminal device 210 may receive a user's click operation on the interface element 410, and generate interaction information corresponding to the click operation. Specifically, the terminal device 210 may, for example, determine the click position corresponding to the click operation. For example, the click position may indicate the first coordinate in the interface 220 of the terminal device 210 .
  • the terminal device 210 may further determine based on the first coordinate that the click operation occurs within the window display area of the first window 230-1. Further, the terminal device 210 may convert the first coordinates into the second coordinates in the coordinate system corresponding to the first window 230-1 based on the window display area of the first window 230-1. Thus, the terminal device 210 can convert the coordinates of the clicked position in the interface 220 into the actual coordinates of the corresponding control in the vehicle interface 260-1.
  • the terminal device 210 may generate interaction information based on the second coordinates, to indicate that a click operation is performed on the corresponding coordinates in the vehicle interface 260-1.
  • the interaction information may also include device information for indicating the central control vehicle-mounted display device 250-1 corresponding to the first window 230-1.
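  • the coordinate conversion described above can be sketched as follows (a minimal illustration only; the function signature and the window field names are assumptions, not taken from the patent):

```python
def click_to_interaction(click, window):
    """Map a click in the terminal interface's coordinates to the
    coordinate system of the vehicle interface shown inside `window`,
    and package the result as interaction information.

    `window` is assumed to carry the window's display area (x, y, w, h)
    within the terminal interface, the native resolution of the
    corresponding vehicle interface, and a device identifier.
    """
    x, y = click
    # Ignore clicks that fall outside the window's display area.
    if not (window["x"] <= x < window["x"] + window["w"]
            and window["y"] <= y < window["y"] + window["h"]):
        return None
    # Translate into window-local coordinates, then scale up to the
    # native resolution of the vehicle interface.
    local_x = (x - window["x"]) * window["native_w"] / window["w"]
    local_y = (y - window["y"]) * window["native_h"] / window["h"]
    return {
        "device": window["device"],  # e.g. the central control display
        "event": "click",
        "pos": (local_x, local_y),
    }
```

The returned dictionary plays the role of the interaction information: a device identifier plus the click position expressed in the vehicle interface's own coordinates.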
  • the terminal device 210 may send the interaction information to the vehicle-machine device 240 via the connection with the vehicle-machine device 240 .
  • the in-vehicle device 240 may analyze the interaction information to perform an operation corresponding to the interaction information. Specifically, the vehicle-machine device 240 may first determine that the interaction information corresponds to the central control display device 250-1 based on the device information in the interaction information. Further, the vehicle-machine device 240 may also determine the interaction action corresponding to the interaction information based on the determined central control display device 250-1 and the position information in the interaction information.
  • the in-vehicle device 240 may determine based on the received interaction information that the corresponding interaction action is a click operation performed on a specific coordinate in the in-vehicle interface 260-1 of the central control display device 250-1. Correspondingly, the in-vehicle device 240 executes a response action to the click operation, as if the click operation actually occurred at the central control display device 250-1.
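  • on the vehicle side, the dispatch step can be sketched as follows (a minimal illustration only; the `inject` method and the mapping structure are assumptions, not taken from the patent):

```python
def dispatch_interaction(info, displays):
    """Route incoming interaction information to the on-vehicle display
    device it targets and replay the operation there.

    `displays` maps device identifiers to display objects that expose an
    `inject(event, pos)` method for replaying input events.
    """
    display = displays.get(info["device"])
    if display is None:
        return False  # unknown (or currently disabled) display device
    # Replay the operation as if it had occurred on the display itself.
    display.inject(info["event"], info["pos"])
    return True
```

This mirrors the two-step resolution in the text: first the device information selects the target display, then the position information determines the interaction action to replay.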
  • the vehicle-machine device 240 can send corresponding control instructions to the vehicle components (for example, air-conditioning components) connected to the vehicle-machine device 240 based on the interaction information, so as to adjust the temperature of the air conditioner in the driver's zone in response to the interaction operation.
  • the in-vehicle device 240 may update the in-vehicle interface 260-1, and send an updated image corresponding to the updated in-vehicle interface 260-1 to the terminal device 210 through its connection with the terminal device 210.
  • the terminal device 210 may correspondingly update the display in the first window 230-1 based on the updated image. For example, the updated image may show that the driver's zone air conditioning temperature has been adjusted from "26°" to "25°".
  • the embodiments of the present disclosure can allow the user to realize quick control of the vehicle through the terminal device.
  • Such a control method is like actually operating the corresponding vehicle-mounted display device without changing the usage habits of the user.
  • the click operation described above is only an example of an interactive operation.
  • the terminal device may allow the user to perform any appropriate interactive operation, examples of which may include but are not limited to: touch-screen-based clicks, long presses, drags, or slides; appropriate somatosensory interaction methods (such as eye movement interaction); and interaction based on other input devices (e.g., keyboard, mouse, etc.).
  • the format of the interaction information generated by the terminal device 210 may be similar to the format of the interaction information generated by the interaction on the vehicle-mounted display device.
  • the function of controlling the air-conditioning components described in FIG. 4A is only exemplary, and embodiments of the present disclosure can also allow quick control of other vehicle components, such as lights, wipers, sunroofs, and so on, based on the window provided in the terminal device.
  • the user can realize convenient interface interaction through, for example, a window presented in the terminal device 210 .
  • the first window 230-1 may include interface elements for displaying information.
  • interface element 330 may correspond to a control in vehicle interface 260-1 for presenting particular contact information, for example.
  • the terminal device 210 may receive a user's right swipe operation on the interface element 330, and generate interaction information corresponding to the swipe operation. Specifically, the terminal device 210 may, for example, determine a group of screen positions corresponding to the sliding operation. For example, the set of screen positions may indicate a first set of coordinates in the interface 220 of the terminal device 210 .
  • the terminal device 210 may further determine based on the first set of coordinates that the sliding operation occurs in the window display area of the first window 230-1. Further, the terminal device 210 may convert the first set of coordinates into the second set of coordinates in the coordinate system corresponding to the first window 230-1 based on the window display area of the first window 230-1. Thus, the terminal device 210 can convert the coordinates of the sliding position in the interface 220 into the actual coordinates of the corresponding controls in the vehicle interface 260-1.
  • the terminal device 210 may generate interaction information based on the second set of coordinates, to indicate that a sliding operation is performed on the corresponding coordinates in the vehicle interface 260-1.
  • the interaction information may also include device information for indicating the central control vehicle-mounted display device 250-1 corresponding to the first window 230-1.
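  • the same per-point conversion extends naturally to gestures: each sampled screen position is translated and scaled into the vehicle interface's coordinates, and the whole path is packaged as one piece of interaction information. A minimal sketch (the function signature and window field names are assumptions, not taken from the patent):

```python
def swipe_to_interaction(points, window):
    """Build interaction information for a swipe from a list of sampled
    screen positions.

    `window` holds the window's display area (x, y, w, h) within the
    terminal interface, the native resolution of the vehicle interface,
    and a device identifier.
    """
    sx = window["native_w"] / window["w"]
    sy = window["native_h"] / window["h"]
    path = []
    for (x, y) in points:
        # Drop samples that fall outside the window's display area.
        if not (window["x"] <= x < window["x"] + window["w"]
                and window["y"] <= y < window["y"] + window["h"]):
            continue
        path.append(((x - window["x"]) * sx, (y - window["y"]) * sy))
    return {"device": window["device"], "event": "swipe", "path": path}
```

On the vehicle side, replaying the converted path in order reproduces the swipe as if it had been performed directly on the on-vehicle display.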
  • the terminal device 210 may send the interaction information to the vehicle-machine device 240 via the connection with the vehicle-machine device 240 .
  • the in-vehicle device 240 may analyze the interaction information to perform an operation corresponding to the interaction information. Specifically, the vehicle-machine device 240 may first determine that the interaction information corresponds to the central control display device 250-1 based on the device information in the interaction information. Further, the vehicle-machine device 240 may also determine the interaction action corresponding to the interaction information based on the determined central control display device 250-1 and the position information in the interaction information.
  • the in-vehicle device 240 may determine, based on the received interaction information, that the corresponding interaction action is to perform a right swipe operation corresponding to the second set of coordinates on the in-vehicle interface 260-1 of the central control display device 250-1. Correspondingly, the in-vehicle device 240 executes a response action to the right swipe operation, as if the right swipe operation actually occurred at the central control display device 250-1.
  • the in-vehicle device 240 may present a display area corresponding to the contact details on the in-vehicle interface 260-1 based on the right swipe operation. Additionally, the in-vehicle device 240 may update the in-vehicle interface 260-1, for example, and send an updated image corresponding to the updated in-vehicle interface 260-1 to the terminal device 210 through its connection with the terminal device 210. Correspondingly, the terminal device 210 may update the display in the first window 230-1 based on the updated image. For example, the first window 230-1 may include a new display area 420 to present detailed information of the selected contact.
  • the embodiments of the present disclosure can allow the user to implement shortcut interaction for the vehicle display device through the terminal device.
  • Such a control method is like actually operating the corresponding vehicle-mounted display device without changing the usage habits of the user.
  • the passenger in the co-pilot seat can, for example, quickly manipulate the interface of the rear-row display device through a terminal device (for example, a smart phone) connected to the vehicle-machine device, so as to provide passengers in the rear row with specific content (e.g., video).
  • when the terminal device is used to remotely control the vehicle display device, users may also wish to disable direct interaction with that vehicle display device. For example, in the above example of controlling the rear-row display device to play specific content, the user may not want children seated in the rear row to perform undesired operations on the screen (for example, accidental touches).
  • the terminal device 210 may, for example, allow the user to disable the screen interaction function of a specific vehicle display device through an interface.
  • the user may, for example, long press the window bar of the second window 230-2 to initiate a request to disable the interactive function of the corresponding co-pilot display device 250-2.
  • the terminal device 210 may send a disabling request to the in-vehicle device 240 after obtaining confirmation from the user, so that the screen interaction function of the co-pilot display device 250-2 is disabled.
  • the disabling request may also be automatically sent by the terminal device 210, for example.
  • the user can configure the interaction mode of the group of in-vehicle display devices after the terminal device is connected to the in-vehicle device.
  • the interaction mode may include, for example, only allowing interaction through a terminal device, only allowing interaction through a vehicle-mounted display device, allowing interaction through both the terminal device/vehicle display device, and the like.
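As an illustrative sketch (not part of the disclosed implementation), the per-device interaction modes described above could be modeled as follows; the `InteractionMode` names and the `disabling_requests` helper are hypothetical:

```python
from enum import Enum

class InteractionMode(Enum):
    TERMINAL_ONLY = "terminal_only"  # only allow interaction through the terminal device
    VEHICLE_ONLY = "vehicle_only"    # only allow interaction through the vehicle display
    BOTH = "both"                    # allow interaction through either device

def disabling_requests(config: dict) -> list:
    """Return the display-device IDs whose on-screen interaction should be
    disabled, i.e. those configured as TERMINAL_ONLY."""
    return [device_id for device_id, mode in config.items()
            if mode is InteractionMode.TERMINAL_ONLY]
```

Under this sketch, a configuration such as `{"center": BOTH, "rear": TERMINAL_ONLY}` would yield a disabling request for the rear display only.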
  • the terminal device can, for example, automatically send a disabling request based on the configuration information, so that the screen interaction function of any on-board display device configured in the "only allow interaction through the terminal device" mode is disabled.
  • the embodiments of the present disclosure can further facilitate the user to manage multiple vehicle-mounted display devices.
  • the terminal device 210 may also allow the user to use an input/output (I/O) device associated with the in-vehicle device 240 to perform input or output.
  • the terminal device 210 may, for example, present a camera interface 400C as shown in FIG. 4C.
  • the camera interface 400C may include a camera selection control 430 , for example.
  • a floating window 440 may appear in the camera interface 400C to allow the user to confirm the camera currently used for photographing.
  • the vehicle-mounted resources associated with the vehicle-machine device 240 can be virtualized as virtual resources of the terminal device 210 .
  • Such on-vehicle resources may include, but are not limited to: an on-vehicle display device 250, various types of sensors, various types of I/O devices, and the like.
  • the user can, for example, select “car camera (A Car)” as the camera device to be used by the camera application.
  • the terminal device 210 may send interactive information to the car-machine device 240, so that the "car camera (A Car)” is enabled for the camera application.
  • the vehicle-machine device 240 can occupy the "vehicle camera (A Car)" hardware so as to enable the "vehicle camera (A Car)” to acquire real-time images. Further, the real-time image may be further sent back to the terminal device 210 via the connection with the terminal device 210, for example, to be used in the photographing interface 400C shown in FIG. 4C.
  • the embodiments of the present disclosure can allow the terminal device to further utilize vehicle resources to perform richer types of interactions.
  • the "camera” described in the example in FIG. 4C is only exemplary, and the user can also click the interface element 338 of the "telephone application” shown in FIG. 3C to use the vehicle microphone and speaker to perform a phone call process.
  • FIG. 5 shows an example interface interaction process 500 according to an embodiment of the disclosure.
  • Process 500 may involve terminal devices and in-vehicle devices.
  • the terminal device may include, for example, the terminal device 210 in FIG. 2
  • the vehicle-machine device may include, for example, the vehicle-machine device 240 in FIG. 2 .
  • the process 500 may also be applicable to any other suitable terminal devices and in-vehicle devices.
  • the terminal device 210 establishes a connection with the in-vehicle device 240 .
  • the terminal device 210 can be connected to the vehicle-machine device 240 through any suitable connection method, examples of which can include, but are not limited to: a wired connection (e.g., via a USB cable) or a wireless connection (e.g., Bluetooth, Wi-Fi, or a cellular network).
  • the terminal device 210 can, for example, establish a connection with the vehicle-machine device 240 via a vehicle-linked tool (eg, HiCar, CarPlay, CarLife, etc.).
  • the in-vehicle device generates image information based on the set of in-vehicle interfaces 260 presented by the set of in-vehicle display devices 250 .
  • image information may include, for example, corresponding images of the set of vehicle interfaces 260 .
  • the image information may include, for example, a group of images compressed using an appropriate compression algorithm.
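As a minimal sketch of this idea, assuming zlib compression and a JSON envelope (the disclosure does not specify a particular compression algorithm or message format), the image information for a group of in-vehicle interfaces might be bundled like this:

```python
import base64
import json
import zlib

def pack_image_info(frames: dict) -> bytes:
    """Compress each interface's raw frame bytes and bundle them, keyed by
    the originating display-device ID, into one image-information message."""
    payload = {
        device_id: base64.b64encode(zlib.compress(raw)).decode("ascii")
        for device_id, raw in frames.items()
    }
    return json.dumps(payload).encode("utf-8")

def unpack_image_info(blob: bytes) -> dict:
    """Inverse of pack_image_info: recover the per-device raw frames."""
    payload = json.loads(blob.decode("utf-8"))
    return {device_id: zlib.decompress(base64.b64decode(data))
            for device_id, data in payload.items()}
```

The round trip `unpack_image_info(pack_image_info(frames))` returns the original frames, which is the property the terminal side relies on when rendering the windows.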
  • the terminal device 210 receives image information from the in-vehicle device 240 .
  • the terminal device 210 presents the target interface 220 based on the image information, and the target interface 220 includes a group of windows 230 corresponding to a group of vehicle interfaces 260.
  • the target interface 220 may present multiple windows corresponding to multiple vehicle interfaces 260 at the same time, for example.
  • one or more windows in set of windows 230 may be hidden or expanded in response to user manipulation.
  • the terminal device 210 receives an interaction operation associated with a group of windows 230 and generates interaction information corresponding to the interaction operation.
  • Such interactive operations may include any appropriate type of interface operations using any interactive tools, examples of which may include but are not limited to: touch screen-based clicks, long presses, drags or slides, etc.; appropriate somatosensory interaction methods (such as eye movement interaction); interaction based on other input devices (eg, keyboard, mouse, etc.).
  • the terminal device 210 sends the generated interaction information to the vehicle-machine device 240 .
  • the in-vehicle device 240 performs an action corresponding to the interaction information.
  • the vehicle-machine device 240 may send corresponding control instructions to the vehicle components connected to the vehicle-machine device in response to the interaction information, wherein the control instructions are generated based on the interaction information.
  • the vehicle-machine device 240 may cause at least one vehicle-mounted interface in a group of vehicle-mounted interfaces 260 to be updated in response to the interaction information.
  • the embodiments of the present disclosure can quickly control the interface of the vehicle-mounted display device through the terminal device, thereby improving the convenience of interaction and ensuring the safety of interaction during driving.
  • the vehicle-machine device 240 generates updated image information based on the execution of the interaction action, where the updated image information corresponds to a group of updated vehicle interfaces, and at least one vehicle interface in the group of updated interfaces is updated in response to the interaction action.
  • the in-vehicle device 240 sends updated image information to the terminal device 210 .
  • the terminal device 210 presents an updated interface based on the updated image information.
  • the embodiments of the present disclosure can allow the user to realize quick manipulation of the vehicle-mounted display device through the terminal device.
  • Such a control method feels like directly operating the corresponding vehicle-mounted display device, without changing the user's usage habits.
  • FIG. 6 shows a schematic diagram of a system framework 600 for interface interaction according to an embodiment of the present disclosure.
  • the system framework can be realized based on the HiCar tool of the Internet of Vehicles.
  • the terminal device 210 may include multiple components: a hardware part; a module for ensuring the security of the data connection, such as HiChain, which can provide HiCar scene authentication capabilities; a module for scheduling application processes, such as an AMS (Activity Manager Service) module; a module for multi-screen management, such as a multi-screen framework; a module for multi-screen interaction, such as Airsharing; a module for providing reverse control capabilities, such as a Hisight module; a module for providing vehicle-machine virtualization capabilities, such as DMSDP (Distributed Multi-mode Sensing Data Platform); a module for short-distance vehicle-machine interconnection, such as a NEARBY module; and upper-layer applications, such as HiCar APK and iConnect APK, providing the HiCar and iConnect applications respectively.
  • the in-vehicle device 240 may include multiple components: a hardware part; a module for multi-screen management, such as a multi-screen framework; a module for multi-screen interaction, such as Airsharing; a module for providing reverse control capabilities, such as Hisight; a module for providing vehicle-machine virtualization capabilities, such as DMSDP (Distributed Multi-mode Sensing Data Platform); a module for short-distance vehicle-machine interconnection, such as a NEARBY module; and an upper-layer application, such as a manufacturer's APK, which can implement, for example, an application Car APK for interconnection and can include a module for authentication, such as an Authgent module.
  • the module for authentication may include a module for ensuring the security of data connection, such as HiChain.
  • process 700 may include a device discovery phase 710 .
  • the discovery phase includes: at 702, the user may trigger a discovery broadcast between hardware by operating on the car-machine side.
  • for a wireless connection, the user can operate on the car machine to send out a Bluetooth broadcast; for a USB connection, the user can use a USB cable to connect the terminal device to the car-machine device by wire.
  • iConnect listens for the hardware broadcast, and judges whether the car is of a type that allows access according to the car ID.
  • iConnect passes the discovered devices to the HiCar APP.
  • the HiCar APP initiates the connection to the in-vehicle device.
  • Process 700 may also include a wireless connection back phase 720 .
  • the hardware of the in-vehicle device may detect pairing with the terminal device.
  • the car-machine device can initiate a connection between the car-machine and the terminal device through the Car SDK.
  • Process 700 may also include a connection authentication phase 730 .
  • a connection channel between the in-vehicle device and the terminal device is established.
  • the HiCar APP receives the vehicle information and current connection information transmitted by iConnect, and actively initiates a wireless/wired connection at the terminal device.
  • the terminal device can accept the connection request.
  • the terminal device and the in-vehicle device perform connection authentication.
  • the HiCar APP can exchange messages with the Car SDK to transmit information such as the necessary software version and vehicle capabilities.
  • HiCar APP can call the HiChain interface to start authentication, where HiCar APP can undertake the authentication process arrangement and authentication data transmission part, while HiChain is responsible for the authentication algorithm part.
  • Process 700 may also include a device virtualization stage 740. Specifically, at 731, a connection between the terminal device and the vehicle-machine device is established. For example, after the authentication succeeds, the HiCar APP notifies DMSDP to connect to the vehicle-machine device.
  • device virtualization is performed. Specifically, after DMSDP is successfully connected to the vehicle-machine device, DMSDP can obtain the on-vehicle resources, virtualize them into virtual resources (such as peripherals) that can be used by the mobile phone, and then report the virtual resources to the HiCar APP. At 734 and 735, the virtual device reported by DMSDP may be enabled. At 736, a data transmission channel is established based on device virtualization. At 737 and 738, DMSDP may notify Airsharing/Hisight to add the virtual device to the terminal device and map the keys on the vehicle to the keys of the terminal device.
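A toy illustration of the virtualization step, under the assumption that the in-vehicle side reports its resources as simple records (the actual DMSDP resource format is not specified here, and `virtualize_resources` is a hypothetical name):

```python
def virtualize_resources(reported: list) -> dict:
    """Build a registry of virtual peripherals from the resource records
    reported by the in-vehicle side after the connection succeeds.

    Each reported record is assumed to carry at least a 'kind' (camera,
    display, microphone, ...) and a 'name'; the registry keys resources
    by (kind, name) and marks them as remote so local code can tell a
    virtualized peripheral apart from built-in hardware."""
    registry = {}
    for res in reported:
        registry[(res["kind"], res["name"])] = {"remote": True, **res}
    return registry
```

After this step, an application on the terminal could look up, say, `("camera", "car_cam")` in the registry and use it like a local peripheral, with the data actually flowing over the established channel.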
  • Process 700 may also include a screen casting stage 750 .
  • Airsharing/Hisight may notify the multi-screen framework of the display device added with the HiCar service.
  • the multi-screen framework notifies the Car SDK on the car-machine device side, which transmits the information to the multi-screen control program of the terminal device, and the Car SDK of the car-machine device starts screen projection.
  • HiCar-based system implementation framework is only exemplary, and the embodiments of the present disclosure may also use any other appropriate connection, authentication, and communication methods to implement the interface interaction solution according to the embodiments of the present disclosure.
  • FIG. 8 shows a flowchart of an example process 800 of an interface interaction method according to an embodiment of the disclosure.
  • Process 800 may be implemented, for example, by a terminal device (eg, terminal device 210 discussed with reference to FIG. 2).
  • the terminal device 210 receives image information from the vehicle-machine device based on the connection with the vehicle-machine device.
  • the vehicle-machine device is associated with a group of vehicle-mounted display devices.
  • a set of vehicle-mounted interfaces presented by a set of vehicle-mounted display devices is generated.
  • the terminal device 210 presents a target interface including a group of windows corresponding to a group of vehicle interfaces based on the image information.
  • the terminal device 210, in response to receiving an interaction operation associated with a group of windows, sends interaction information corresponding to the interaction operation to the in-vehicle device.
  • the process 800 further includes: establishing a connection with the in-vehicle device, so that the in-vehicle resources associated with the in-vehicle device are virtualized as virtual resources of the terminal device, and the in-vehicle resources include a group of in-vehicle display devices.
  • the on-vehicle resource further includes an input/output (I/O) device associated with the in-vehicle device, and the I/O device includes at least one of the following: a camera, a microphone or a speaker.
  • the process 800 further includes: receiving updated image information from the in-vehicle device, the updated image information corresponding to a set of updated in-vehicle interfaces, at least one interface in the set of updated in-vehicle interfaces being updated in response to the interaction information; and presenting an updated interface based on the updated image information.
  • presenting the target interface based on the image information includes: generating, based on the image information, a group of images corresponding to a group of vehicle interfaces; generating, based on the group of images, a group of windows corresponding to the group of vehicle interfaces; and presenting a target interface including the generated group of windows.
  • the target interface further includes a group of window identifiers corresponding to a group of windows, where the window identifiers are used to indicate the vehicle-mounted display device corresponding to the corresponding window.
  • the process 800 further includes: receiving an interactive operation associated with a group of windows; determining location information corresponding to the interactive operation, where the location information indicates the location where the interactive operation occurs on the terminal device; determining a target window corresponding to the position information; and generating interaction information based on the position information and the window display area of the target window in the terminal device.
  • the position information is the first position information
  • generating the interaction information based on the first position information and the window display area of the target window in the terminal device includes: transforming, based on the window display area, the first position information into second position information, wherein the first position information indicates a first set of coordinates in a first coordinate system associated with the terminal device, and the second position information indicates a second set of coordinates in a second coordinate system corresponding to the target window; and generating the interaction information based on the second position information, the interaction information further including device information used to indicate the target vehicle-mounted display device corresponding to the target window.
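A minimal sketch of this coordinate transformation, assuming each window records its display area on the terminal screen and the native resolution of the vehicle display it mirrors (all names here are illustrative, not the disclosed implementation):

```python
from dataclasses import dataclass

@dataclass
class WindowArea:
    device_id: str              # vehicle display device this window mirrors
    x: int; y: int              # top-left of the window on the terminal screen
    width: int; height: int     # window display area on the terminal
    screen_w: int; screen_h: int  # native resolution of the vehicle display

def make_interaction_info(windows, tx, ty):
    """Map a touch at (tx, ty) in the terminal's coordinate system (the
    'first set of coordinates') into the coordinate system of the window
    it hit (the 'second set of coordinates'), scaled to the vehicle
    display's native resolution, plus device information for dispatch."""
    for w in windows:
        if w.x <= tx < w.x + w.width and w.y <= ty < w.y + w.height:
            sx = (tx - w.x) * w.screen_w / w.width
            sy = (ty - w.y) * w.screen_h / w.height
            return {"device": w.device_id, "x": round(sx), "y": round(sy)}
    return None  # touch fell outside every mirrored window
```

For example, a touch at (200, 150) inside a 400×300 window mirroring an 800×600 vehicle display maps to (400, 300) on that display.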
  • the interactive operation is the first interactive operation
  • the interactive information is the first interactive information
  • the target interface further includes interface elements associated with local applications of the terminal device
  • the process 800 further includes: receiving a second interactive operation associated with the interface element, the second interactive operation indicating use of a target I/O device associated with the vehicle-machine device; and sending second interaction information corresponding to the second interactive operation to the vehicle-machine device, so that the target I/O device is started.
  • the process 800 further includes: sending a disabling request to the vehicle-machine device, so that the screen interaction function of at least one vehicle-mounted display device in a group of vehicle-mounted display devices is disabled.
  • a group of vehicle display devices includes at least one of the following: a central control display device, a co-pilot display device or a rear row display device.
  • FIG. 9 shows a flowchart of an example process 900 of an interface interaction method according to an embodiment of the disclosure.
  • Process 900 may be implemented, for example, by an on-board device (eg, on-board device 240 discussed with reference to FIG. 2 ).
  • the in-vehicle device 240 sends image information to the terminal device based on the connection with the terminal device, and the image information is generated based on a group of vehicle interfaces presented by a group of vehicle display devices.
  • the in-vehicle device 240 receives interaction information from the terminal device, the interaction information being generated based on an interaction operation associated with a group of windows in the target interface of the terminal device, the target interface being generated based on the image information and including a group of windows corresponding to the group of vehicle interfaces.
  • the in-vehicle device 240 performs an interaction action corresponding to the interaction information.
  • the process 900 further includes: establishing a connection with the terminal device, so that the on-board resources associated with the in-vehicle device are virtualized as virtual resources of the terminal device, and the on-board resources include a group of on-board display devices.
  • the on-vehicle resource further includes an input/output (I/O) device associated with the in-vehicle device, and the I/O device includes at least one of the following: a camera, a microphone or a speaker.
  • the process 900 further includes: generating, based on the execution of the interaction action, updated image information, the updated image information corresponding to a set of updated vehicle interfaces, at least one vehicle interface in the set of updated interfaces being updated in response to the interaction action; and sending the updated image information to the terminal device.
  • the process 900 further includes: determining, based on the device information in the interaction information, the target vehicle-mounted display device corresponding to the interaction information; and determining, based on the target vehicle-mounted display device and the position information in the interaction information, the interaction action corresponding to the interaction information.
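On the in-vehicle side, the dispatch step just described can be sketched as follows; `InVehicleDisplay` and `dispatch_interaction` are hypothetical stand-ins, not the actual vehicle-machine API:

```python
class InVehicleDisplay:
    """Stand-in for one vehicle-mounted display; records injected touches."""
    def __init__(self, device_id):
        self.device_id = device_id
        self.events = []

    def inject_touch(self, x, y):
        # In a real system this would replay the touch into the display's
        # input pipeline; here we just record it.
        self.events.append(("touch", x, y))

def dispatch_interaction(info, displays):
    """Determine the target display from the device information carried in
    the interaction info, then replay the interaction at the carried
    position. Returns False when the device ID is unknown."""
    target = displays.get(info["device"])
    if target is None:
        return False  # unknown device: ignore the message
    target.inject_touch(info["x"], info["y"])
    return True
```

Because the interaction info already carries both the device information and window-local coordinates, the dispatch reduces to a lookup plus an input injection.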
  • executing the interaction action corresponding to the interaction information includes at least one of the following: sending a corresponding control instruction to a vehicle component connected to the vehicle-machine device, the control instruction being generated based on the interaction information; or causing at least one in-vehicle interface in the group of vehicle interfaces to be updated in response to the interaction information.
  • the interactive operation is the first interactive operation
  • the interactive information is the first interactive information
  • the target interface further includes interface elements associated with the local application of the terminal device
  • the method further includes: receiving second interaction information from the terminal device, the second interaction information being generated based on a second interaction operation associated with the interface element, the second interaction operation indicating use of a target I/O device associated with the vehicle-machine device; and enabling the target I/O device for use by the terminal device.
  • the process 900 further includes: in response to the disabling request received from the terminal device, disabling the screen interaction function of at least one vehicle-mounted display device in a group of vehicle-mounted display devices.
  • a group of vehicle display devices includes at least one of the following: a central control display device, a co-pilot display device or a rear row display device.
  • the embodiments of the present disclosure can quickly control the interface of the vehicle-mounted display device through the terminal device, thereby improving the convenience of interaction and ensuring the safety of interaction during driving.
  • the present disclosure may be a method, apparatus, system and/or computer program product.
  • a computer program product may include a computer-readable storage medium having computer-readable program instructions thereon for carrying out various aspects of the present disclosure.
  • a computer readable storage medium may be a tangible device that can retain and store instructions for use by an instruction execution device.
  • a computer readable storage medium includes, but is not limited to, electrical storage devices, magnetic storage devices, optical storage devices, electromagnetic storage devices, semiconductor storage devices, or any suitable combination of the foregoing.
  • Computer-readable storage media include: portable computer diskettes, hard disks, random access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM or flash memory), static random access memory (SRAM), compact disc read-only memory (CD-ROM), digital versatile discs (DVD), memory sticks, floppy disks, mechanically encoded devices such as punch cards or raised structures in a groove having instructions recorded thereon, and any suitable combination of the above.
  • computer-readable storage media are not to be construed as transient signals per se, such as radio waves or other freely propagating electromagnetic waves, electromagnetic waves propagating through waveguides or other transmission media (e.g., pulses of light through fiber optic cables), or electrical signals transmitted through a wire.
  • Computer-readable program instructions described herein may be downloaded from a computer-readable storage medium to a respective computing/processing device, or downloaded to an external computer or external storage device over a network, such as the Internet, a local area network, a wide area network, and/or a wireless network.
  • the network may include copper transmission cables, fiber optic transmission, wireless transmission, routers, firewalls, switches, gateway computers, and/or edge servers.
  • a network adapter card or a network interface in each computing/processing device receives computer-readable program instructions from the network and forwards the computer-readable program instructions for storage in a computer-readable storage medium in each computing/processing device .
  • Computer program instructions for performing the operations of the present disclosure may be assembly instructions, instruction set architecture (ISA) instructions, machine instructions, machine-dependent instructions, microcode, firmware instructions, state-setting data, or source code or object code written in any combination of one or more programming languages, including object-oriented programming languages such as Smalltalk and C++, and conventional procedural programming languages such as the "C" language or similar programming languages.
  • Computer-readable program instructions may execute entirely on the user's computer, partly on the user's computer as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server.
  • the remote computer can be connected to the user's computer through any kind of network, including a local area network (LAN) or a wide area network (WAN), or it can be connected to an external computer (for example, through the Internet using an Internet service provider).
  • in some embodiments, an electronic circuit, such as a programmable logic circuit, a field programmable gate array (FPGA), or a programmable logic array (PLA), can be personalized by utilizing state information of the computer-readable program instructions, and the electronic circuit can execute the computer-readable program instructions to implement various aspects of the present disclosure.
  • These computer-readable program instructions may be provided to a processing unit of a general-purpose computer, special-purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, when executed by the processing unit of the computer or other programmable data processing apparatus, produce an apparatus for implementing the functions/actions specified in one or more blocks of the flowcharts and/or block diagrams.
  • These computer-readable program instructions may also be stored in a computer-readable storage medium, and these instructions cause computers, programmable data processing apparatuses and/or other devices to work in a specific way, so that the computer-readable medium storing the instructions comprises an article of manufacture that includes instructions for implementing various aspects of the functions/actions specified in one or more blocks of the flowcharts and/or block diagrams.
  • each block in a flowchart or block diagram may represent a module, a program segment, or a portion of instructions that contains one or more executable instructions for implementing the specified logical functions.
  • the functions noted in the block may occur out of the order noted in the figures. For example, two blocks in succession may, in fact, be executed substantially concurrently, or they may sometimes be executed in the reverse order, depending upon the functionality involved.
  • each block of the block diagrams and/or flowchart illustrations, and combinations of blocks in the block diagrams and/or flowchart illustrations, can be implemented by a dedicated hardware-based system that performs the specified functions or actions, or may be implemented by a combination of dedicated hardware and computer instructions.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Embodiments of the present disclosure provide an interface interaction method, an electronic device, a storage medium, and a program product. In the method, a terminal device (for example, a mobile device) may receive image information from an in-vehicle device based on a connection with the in-vehicle device. The in-vehicle device may be associated with a group of vehicle-mounted display devices (for example, a central control display device, a co-pilot display device, a rear-row display device, and the like). Further, the terminal device may present a target interface based on the image information, the target interface including a group of windows corresponding to a group of in-vehicle interfaces presented by the group of vehicle-mounted display devices. Upon receiving an interaction operation associated with the group of windows, the terminal device may send interaction information corresponding to the interaction operation to the in-vehicle device, so that the in-vehicle device performs an action corresponding to the interaction operation. In this manner, embodiments of the present disclosure enable quick manipulation of the interfaces of vehicle-mounted display devices through a terminal device, thereby improving the convenience of interaction and ensuring interaction safety during driving.

Description

Interface interaction method, electronic device, medium, and program product
This application claims priority to Chinese Patent Application No. 202111295106.5, filed with the Chinese Patent Office on November 3, 2021 and entitled "Interface interaction method, electronic device, medium, and program product", the entire contents of which are incorporated herein by reference.
Technical Field
The present disclosure relates generally to the field of information technology, and more particularly to an interface interaction method, an electronic device, a computer-readable storage medium, and a computer program product.
Background
With the rapid development of electronic technology, vehicles are becoming increasingly intelligent. Various types of interaction methods are gradually being integrated into vehicles to provide drivers and passengers with a more convenient and safe interaction experience. In recent years, multi-screen and linked-screen setups have also become a new trend in vehicle cockpit displays. For example, passengers may use an entertainment screen at the co-pilot seat or in the rear row to engage in various types of entertainment activities while the vehicle is in motion.
Summary
Embodiments of the present disclosure relate to a solution for interconnecting a terminal device and an in-vehicle device to implement remote interaction with in-vehicle interfaces, and specifically provide an interface interaction method, an electronic device, a computer-readable storage medium, and a computer program product.
In a first aspect of the present disclosure, an interface interaction method is provided. In the method, a terminal device receives image information from an in-vehicle device based on a connection with the in-vehicle device, the in-vehicle device being associated with a group of vehicle-mounted display devices, and the image information being generated by the in-vehicle device based on a group of in-vehicle interfaces presented by the group of vehicle-mounted display devices. Further, the terminal device presents a target interface based on the image information, the target interface including a group of windows corresponding to the group of in-vehicle interfaces. Accordingly, in response to receiving an interaction operation associated with the group of windows, the terminal device sends interaction information corresponding to the interaction operation to the in-vehicle device.
In this manner, embodiments of the present disclosure enable quick manipulation of the interfaces of vehicle-mounted display devices through a terminal device, thereby improving the convenience of interaction and ensuring interaction safety during driving.
In some embodiments, the method further includes: receiving updated image information from the in-vehicle device, the updated image information corresponding to a group of updated in-vehicle interfaces, at least one interface in the group of updated in-vehicle interfaces being updated in response to the interaction information; and presenting an updated interface based on the updated image information.
In this way, embodiments of the present disclosure can make the user feel as if interacting directly on the corresponding vehicle-mounted display device, thereby improving the user's interaction experience.
In some embodiments, presenting the target interface based on the image information includes: generating, based on the image information, a group of images corresponding to the group of in-vehicle interfaces; generating, based on the group of images, a group of windows corresponding to the group of in-vehicle interfaces; and presenting a target interface including the generated group of windows. In this way, embodiments of the present disclosure allow a user to remotely view the current interfaces of one or more vehicle-mounted display devices, facilitating management of the vehicle-mounted display devices.
In some embodiments, the target interface further includes a group of window identifiers corresponding to the group of windows, where a window identifier is used to indicate the vehicle-mounted display device corresponding to the respective window. In this way, embodiments of the present disclosure allow the user to more conveniently know which vehicle-mounted display device a window corresponds to.
In some embodiments, the method further includes: receiving an interaction operation associated with the group of windows; determining position information corresponding to the interaction operation, the position information indicating a position at which the interaction operation occurs on the terminal device; determining, from the group of windows, a target window corresponding to the position information; and generating the interaction information based on the position information and a window display area of the target window on the terminal device. In this way, embodiments of the present disclosure can improve the accuracy of user interaction and reduce interaction latency.
In some embodiments, the position information is first position information, and generating the interaction information based on the first position information and the window display area of the target window on the terminal device includes: transforming, based on the window display area, the first position information into second position information, where the first position information indicates a first set of coordinates in a first coordinate system associated with the terminal device, and the second position information indicates a second set of coordinates in a second coordinate system corresponding to the target window; and generating the interaction information based on the second position information, the interaction information further including device information used to indicate a target vehicle-mounted display device corresponding to the target window.
In this way, embodiments of the present disclosure allow the corresponding display device to be operated quickly, improving the convenience of interaction.
In some embodiments, the interaction operation is a first interaction operation, the interaction information is first interaction information, the target interface further includes an interface element associated with a local application of the terminal device, and the method further includes: receiving a second interaction operation associated with the interface element, the second interaction operation indicating use of a target I/O device associated with the in-vehicle device; and sending second interaction information corresponding to the second interaction operation to the in-vehicle device, so that the target I/O device is started.
In this way, embodiments of the present disclosure allow the terminal device to invoke in-vehicle resources to perform interaction, thereby enriching the interaction scenarios and improving convenience.
In some embodiments, the method further includes: sending a disabling request to the in-vehicle device, so that a screen interaction function of at least one vehicle-mounted display device in the group of vehicle-mounted display devices is disabled. In this way, embodiments of the present disclosure can further strengthen the control and management of vehicle-mounted display devices.
In some embodiments, the method further includes: establishing a connection with the in-vehicle device, so that in-vehicle resources associated with the in-vehicle device are virtualized as virtual resources of the terminal device, the in-vehicle resources including the group of vehicle-mounted display devices. In some embodiments, the in-vehicle resources further include an input/output (I/O) device associated with the in-vehicle device, the I/O device including at least one of: a camera, a microphone, or a speaker. In this way, embodiments of the present disclosure can efficiently achieve synchronization and communication between devices.
In some embodiments, the group of vehicle-mounted display devices includes at least one of: a central control display device, a co-pilot display device, or a rear-row display device.
在本公开的第二方面,提供了一种界面交互方法。在该方法中,车机设备基于与终端设备的连接向终端设备发送图像信息,图像信息基于由一组车载显示设备所呈现的一组车载界面而生成。进一步地,车机设备从终端设备接收交互信息,交互信息是基于与终端设备的目标界面中的一组窗口相关联的交互操作而被生成,目标界面基于图像信息而被生成并且包括与一组车载界面对应的一组窗口。相应地,车机设备执行与交互信息对应的交互动作。
基于这样的方式,本公开的实施例能够通过终端设备来对车载显示设备的界面进行快捷的操控,从而能够提高交互的便捷性,并且保证驾驶过程中的交互安全性。
在一些实施例中,方法还包括:基于交互动作的执行,生成更新图像信息,更新图像信息对应于一组更新的车载界面,一组更新界面中的至少一个车载界面响应于交互动作而被更新;以及向终端设备发送更新图像信息。
由此,本公开的实施例能够使得用户如同直接在对应的车载显示设备上进行交互,进而提升用户的交互体验。
在一些实施例中,方法还包括:基于交互信息中的设备信息,确定与交互信息对应的目标车载显示设备;以及基于目标车载显示设备和由交互信息中的位置信息,确定与交互信息 对应的交互动作。由此,本公开的实施例能够提升用户交互的准确性,并降低交互的延迟。
在一些实施例中,执行与交互信息对应的交互动作包括以下至少一项:向与车机设备连接的车辆组件发送对应的控制指令,控制指令基于交互信息而被生成;或者使一组车载界面中的至少一个车载界面响应于交互信息而被更新。由此,本公开的实施例能够提供更加丰富的远程交互能力。
在一些实施例中,交互操作为第一交互操作,交互信息为第一交互信息,目标界面还包括与终端设备的本地应用相关联的界面元素,方法还包括:从终端设备接收第二交互信息,第二交互信息是基于与界面元素相关联的第二交互操作而被生成,第二交互操作指示对与车机设备相关联的目标I/O设备的使用;以及启用目标I/O设备,以供终端设备使用。
由此,本公开的实施例能够允许终端设备调用车载资源来执行交互,从而提升了丰富了交互场景,并提升了便利性。
在一些实施例中,方法还包括:响应于从终端设备接收的禁用请求,使一组车载显示设备中的至少一个车载显示设备的屏幕交互功能被禁用。由此,本公开的实施例能够进一步加强对于车载显示设备的控制和管理。
在一些实施例中,方法还包括:建立与终端的连接,使得与车机设备相关联的车载资源被虚拟化为终端设备的虚拟资源,车载资源包括一组车载显示设备。在一些实施例中,车载资源还包括与车机设备相关联的输入输出I/O设备,输出输入设备包括以下至少一项:相机、麦克风或扬声器。由此,本公开的实施例能够高效地实现设备之间的同步和通信。
在一些实施例中,一组车载显示设备包括以下至少一项:中控显示设备、副驾驶显示设备或后排显示设备。
在本公开的第三方面,提供了一种终端设备。终端设备包括处理器以及存储有指令的存储器。指令在被处理器执行时使得终端设备执行根据第一方面及其实现方式的任一方法。
在本公开的第四方面,提供了一种车机设备。车机设备包括处理器以及存储有指令的存储器。指令在被处理器执行时使得车机设备执行根据第二方面及其实现方式的任一方法。
在本公开的第五方面,提供了一种计算机可读存储介质。计算机可读存储介质存储有指令,指令在被电子设备执行时使得电子设备执行第一方面及其实现方式的任一方法。
在本公开的第六方面,提供了一种计算机程序产品。计算机程序产品包括指令,指令在被电子设备执行时使得电子设备执行第一方面及其实现方式的任一方法。
应当理解,发明内容部分中所描述的内容并非旨在限定本公开的关键或重要特征,亦非用于限制本公开的范围。本公开的其他特征通过以下的描述将变得容易理解。
附图说明
通过参考附图阅读下文的详细描述,本公开的实施例的上述以及其他目的、特征和优点将变得容易理解。在附图中,以示例性而非限制性的方式示出了本公开的若干实施例。
图1示出了可以实现本公开的实施例的一种电子设备的硬件结构的示意图。
图2示出了可以实施根据本公开的实施例的示例环境的示意图。
图3A至图3C示出了根据本公开的实施例的示例界面的示意图。
图4A至图4C示出了根据本公开实施例的示例交互过程的示意图。
图5示出了根据本公开的实施例的交互方法的示例过程。
图6示出了根据本公开的实施例的交互系统的示意架构图。
图7示出了根据本公开的实施例的交互系统的示例交互过程。
图8示出了根据本公开的一些实施例的界面交互方法的示例过程的流程图。
图9示出了根据本公开的又一些实施例的界面交互方法的示例过程的流程图。
贯穿所有附图,相同或者相似的参考标号被用来表示相同或者相似的组件。
具体实施方式
下文将参考附图中示出的若干示例性实施例来描述本公开的原理和精神。应当理解,描述这些具体的实施例仅是为了使本领域的技术人员能够更好地理解并实现本公开,而并非以任何方式限制本公开的范围。在以下描述和权利要求中,除非另有定义,否则本文中使用的所有技术和科学术语具有与所属领域的普通技术人员通常所理解的含义。
如本文所使用的,术语“包括”及其类似用语应当理解为开放性包含,即“包括但不限于”。术语“基于”应当理解为“至少部分地基于”。术语“一个实施例”或“该实施例”应当理解为“至少一个实施例”。术语“第一”、“第二”等等可以指代不同的或相同的对象,并且仅用于区分所指代的对象,而不暗示所指代的对象的特定空间顺序、时间顺序、重要性顺序,等等。在一些实施例中,取值、过程、所选择的项目、所确定的项目、设备、装置、手段、部件、组件等被称为“最佳”、“最低”、“最高”、“最小”、“最大”,等等。应当理解,这样的描述旨在指示可以在许多可使用的功能选择中进行选择,并且这样的选择不需要在另外的方面或所有方面比其他选择更好、更低、更高、更小、更大或者以其他方式优选。如本文所使用的,术语“确定”可以涵盖各种各样的动作。例如,“确定”可以包括运算、计算、处理、导出、调查、查找(例如,在表格、数据库或另一数据结构中查找)、查明等。此外,“确定”可以包括接收(例如,接收信息)、访问(例如,访问存储器中的数据)等。再者,“确定”可以包括解析、选择、选取、建立等。
示例设备
图1示出了可以实施本公开的实施例的一种电子设备100的硬件结构的示意图。如图1所示,电子设备100可以包括处理器110、外部存储器接口120、内部存储器121、通用串行总线(universal serial bus,USB)接口130、充电管理模块140、电源管理模块141、电池142、天线1、天线2、移动通信模块150、无线通信模块160、音频模块170、扬声器170A、受话器170B、麦克风170C、耳机接口170D、传感器模块180、按键190、马达191、指示器192、摄像头193、显示屏194、以及用户标识模块(subscriber identification module,SIM)卡接口195等。传感器模块180可以包括压力传感器180A、陀螺仪传感器180B、气压传感器180C、磁传感器180D、加速度传感器180E、距离传感器180F、接近光传感器180G、指纹传感器180H、温度传感器180J、触摸传感器180K、环境光传感器180L、骨传导传感器180M等。
应当理解,本公开的实施例所示意的结构并不构成对电子设备100的具体限定。在本公开的另一些实施例中,电子设备100可以包括比图示更多或更少的部件,或者组合某些部件,或者拆分某些部件,或者不同的部件布置。图示的部件可以以硬件,软件或软件和硬件的组合实现。
处理器110可以包括一个或多个处理单元,例如:处理器110可以包括应用处理器(application processor,AP)、调制解调处理器、图形处理器(graphics processing unit,GPU)、图像信号处理器(image signal processor,ISP)、控制器、视频编解码器、数字信号处理器(digital  signal processor,DSP)、基带处理器、和/或神经网络处理器(neural-network processing unit,NPU)等。其中,不同的处理单元可以是独立的器件,也可以集成在一个或多个处理器中。控制器可以根据指令操作码和时序信号,产生操作控制信号,完成取指令和执行指令的控制。
处理器110中还可以设置存储器,用于存储指令和数据。在一些实施例中,处理器110中的存储器为高速缓冲存储器。该存储器可以保存处理器110刚用过或循环使用的指令或数据。如果处理器110需要再次使用该指令或数据,可从所述存储器中直接调用。避免了重复存取,减少了处理器110的等待时间,因而提高了系统的效率。
在一些实施例中,处理器110可以包括一个或多个接口。接口可以包括集成电路(inter-integrated circuit,I2C)接口、集成电路内置音频(inter-integrated circuit sound,I2S)接口、脉冲编码调制(pulse code modulation,PCM)接口、通用异步收发传输器(universal asynchronous receiver/transmitter,UART)接口、移动产业处理器接口(mobile industry processor interface,MIPI)、通用输入输出(general-purpose input/output,GPIO)接口、用户标识模块(subscriber identity module,SIM)接口、和/或通用串行总线(universal serial bus,USB)接口等。
I2C接口是一种双向同步串行总线,包括一根串行数据线(serial data line,SDA)和一根串行时钟线(serial clock line,SCL)。在一些实施例中,处理器110可以包含多组I2C总线。处理器110可以通过不同的I2C总线接口分别耦合触摸传感器180K、充电器、闪光灯、摄像头193等。例如,处理器110可以通过I2C接口耦合触摸传感器180K,使处理器110与触摸传感器180K通过I2C总线接口通信,实现电子设备100的触摸功能。
I2S接口可以用于音频通信。在一些实施例中,处理器110可以包含多组I2S总线。处理器110可以通过I2S总线与音频模块170耦合,实现处理器110与音频模块170之间的通信。在一些实施例中,音频模块170可以通过I2S接口向无线通信模块160传递音频信号,实现通过蓝牙耳机接听电话的功能。
PCM接口也可以用于音频通信,将模拟信号抽样、量化和编码。在一些实施例中,音频模块170与无线通信模块160可以通过PCM总线接口耦合。在一些实施例中,音频模块170也可以通过PCM接口向无线通信模块160传递音频信号,实现通过蓝牙耳机接听电话的功能。所述I2S接口和所述PCM接口都可以用于音频通信。
UART接口是一种通用串行数据总线,用于异步通信。该总线可以为双向通信总线。它将要传输的数据在串行通信与并行通信之间转换。在一些实施例中,UART接口通常被用于连接处理器110与无线通信模块160。例如,处理器110通过UART接口与无线通信模块160中的蓝牙模块通信,实现蓝牙功能。在一些实施例中,音频模块170可以通过UART接口向无线通信模块160传递音频信号,实现通过蓝牙耳机播放音乐的功能。
MIPI接口可以被用于连接处理器110与显示屏194、摄像头193等外围器件。MIPI接口包括摄像头串行接口(camera serial interface,CSI)、显示屏串行接口(display serial interface,DSI)等。在一些实施例中,处理器110和摄像头193通过CSI接口通信,实现电子设备100的拍摄功能。处理器110和显示屏194通过DSI接口通信,实现电子设备100的显示功能。
GPIO接口可以通过软件配置。GPIO接口可以被配置为控制信号,也可被配置为数据信号。在一些实施例中,GPIO接口可以用于连接处理器110与摄像头193、显示屏194、无线通信模块160、音频模块170、传感器模块180等。GPIO接口还可以被配置为I2C接口、I2S接口、UART接口、MIPI接口等。
USB接口130是符合USB标准规范的接口,具体可以是Mini USB接口、Micro USB接口、USB Type C接口等。USB接口130可以用于连接充电器为电子设备100充电,也可以用于电子设备100与外围设备之间传输数据。也可以用于连接耳机,通过耳机播放音频。该接口还可以用于连接其他电子设备,例如AR设备等。
可以理解的是,本公开的实施例示意的各模块间的接口连接关系,只是示意性说明,并不构成对电子设备100的结构限定。在本公开的另一些实施例中,电子设备100也可以采用上述实施例中不同的接口连接方式,或多种接口连接方式的组合。
充电管理模块140用于从充电器接收充电输入。其中,充电器可以是无线充电器,也可以是有线充电器。在一些有线充电的实施例中,充电管理模块140可以通过USB接口130接收有线充电器的充电输入。在一些无线充电的实施例中,充电管理模块140可以通过电子设备100的无线充电线圈接收无线充电输入。充电管理模块140为电池142充电的同时,还可以通过电源管理模块141为电子设备100供电。
电源管理模块141用于连接电池142、充电管理模块140与处理器110。电源管理模块141接收电池142和/或充电管理模块140的输入,为处理器110、内部存储器121、显示屏194、摄像头193、和无线通信模块160等供电。电源管理模块141还可以用于监测电池容量,电池循环次数,电池健康状态(漏电,阻抗)等参数。在其他一些实施例中,电源管理模块141也可以设置于处理器110中。在另一些实施例中,电源管理模块141和充电管理模块140也可以设置于同一个器件中。
电子设备100的无线通信功能可以通过天线1、天线2、移动通信模块150、无线通信模块160、调制解调处理器以及基带处理器等实现。天线1和天线2用于发射和接收电磁波信号。电子设备100中的每个天线可用于覆盖单个或多个通信频带。不同的天线还可以复用,以提高天线的利用率。例如,可以将天线1复用为无线局域网的分集天线。在另外一些实施例中,天线可以和调谐开关结合使用。
移动通信模块150可以提供应用在电子设备100上的包括2G/3G/4G/5G/6G等无线通信的解决方案。移动通信模块150可以包括至少一个滤波器、开关、功率放大器、低噪声放大器(low noise amplifier,LNA)等。移动通信模块150可以由天线1接收电磁波,并对接收的电磁波进行滤波,放大等处理,传送至调制解调处理器进行解调。移动通信模块150还可以对经调制解调处理器调制后的信号放大,经天线1转为电磁波辐射出去。在一些实施例中,移动通信模块150的至少部分功能模块可以被设置于处理器110中。在一些实施例中,移动通信模块150的至少部分功能模块可以与处理器110的至少部分模块被设置在同一个器件中。
调制解调处理器可以包括调制器和解调器。其中,调制器用于将待发送的低频基带信号调制成中高频信号。解调器用于将接收的电磁波信号解调为低频基带信号。随后解调器将解调得到的低频基带信号传送至基带处理器处理。低频基带信号经基带处理器处理后,被传递给应用处理器。应用处理器通过音频设备(不限于扬声器170A、受话器170B等)输出声音信号,或通过显示屏194显示图像或视频。在一些实施例中,调制解调处理器可以是独立的器件。在另一些实施例中,调制解调处理器可以独立于处理器110,与移动通信模块150或其他功能模块设置在同一个器件中。
无线通信模块160可以提供应用在电子设备100上的包括无线局域网(wireless local area networks,WLAN)(如无线保真(wireless fidelity,Wi-Fi)网络)、蓝牙(Bluetooth,BT)、全球导航卫星系统(global navigation satellite system,GNSS)、调频(frequency modulation, FM)、近距离无线通信技术(near field communication,NFC)、红外技术(infrared,IR)等无线通信的解决方案。无线通信模块160可以是集成至少一个通信处理模块的一个或多个器件。无线通信模块160经由天线2接收电磁波,将电磁波信号调频以及滤波处理,将处理后的信号发送到处理器110。无线通信模块160还可以从处理器110接收待发送的信号,对其进行调频,放大,经天线2转为电磁波辐射出去。
在一些实施例中,电子设备100的天线1和移动通信模块150耦合,天线2和无线通信模块160耦合,使得电子设备100可以通过无线通信技术与网络以及其他设备通信。所述无线通信技术可以包括全球移动通讯系统(GSM)、通用分组无线服务(GPRS)、码分多址接入(CDMA)、宽带码分多址(WCDMA)、时分码分多址(TD-SCDMA)、长期演进(long term evolution,LTE)、5G以及后续演进标准、BT、GNSS、WLAN、NFC、FM、和/或IR技术等。其中GNSS可以包括全球卫星定位系统(GPS)、全球导航卫星系统(GLONASS)、北斗卫星导航系统(BDS)、准天顶卫星系统(QZSS)和/或星基增强系统(SBAS)。
电子设备100通过GPU、显示屏194、以及应用处理器等实现显示功能。GPU为图像处理的微处理器,连接显示屏194和应用处理器。GPU用于执行数学和几何计算,用于图形渲染。处理器110可以包括一个或多个GPU,其执行程序指令以生成或改变显示信息。
显示屏194用于显示图像,视频等。显示屏194包括显示面板。在一些实施例中,电子设备100可以包括1个或N个显示屏194,N为大于1的正整数。
电子设备100可以通过ISP、摄像头193、视频编解码器、GPU、显示屏194以及应用处理器等实现拍摄功能。ISP用于处理摄像头193反馈的数据。例如,在拍照时,打开快门,光线通过镜头被传递到摄像头感光元件上,光信号转换为电信号,摄像头感光元件将所述电信号传递给ISP处理,转化为肉眼可见的图像。ISP还可以对图像的噪点,亮度,肤色进行算法优化。ISP还可以对拍摄场景的曝光,色温等参数优化。在一些实施例中,ISP可以设置在摄像头193中。
摄像头193用于捕获静态图像或视频。物体通过镜头生成光学图像投射到感光元件。感光元件可以是电荷耦合器件(charge coupled device,CCD)或互补金属氧化物半导体(complementary metal-oxide-semiconductor,CMOS)光电晶体管。感光元件把光信号转换成电信号,之后将电信号传递给ISP转换成数字图像信号。ISP将数字图像信号输出到DSP加工处理。DSP将数字图像信号转换成标准的RGB,YUV等格式的图像信号。在一些实施例中,电子设备100可以包括1个或N个摄像头193,N为大于1的正整数。
数字信号处理器用于处理数字信号,除了可以处理数字图像信号,还可以处理其他数字信号。例如,当电子设备100在频点选择时,数字信号处理器用于对频点能量进行傅里叶变换等。
视频编解码器用于对数字视频压缩或解压缩。电子设备100可以支持一种或多种视频编解码器。这样,电子设备100可以播放或录制多种编码格式的视频,例如,动态图像专家组(moving picture experts group,MPEG)1、MPEG2、MPEG3、MPEG4等。
NPU为神经网络(neural-network,NN)计算处理器,通过借鉴生物神经网络结构,例如借鉴人脑神经元之间传递模式,对输入信息快速处理,还可以不断的自学习。通过NPU可以实现电子设备100的智能认知等应用,例如图像识别、人脸识别、语音识别、文本理解等。
外部存储器接口120可以用于连接外部存储卡,例如Micro SD卡,实现扩展电子设备100的存储能力。外部存储卡通过外部存储器接口120与处理器110通信,实现数据存储功能。例如将音乐、视频等文件保存在外部存储卡中。
内部存储器121可以用于存储计算机可执行程序代码,所述可执行程序代码包括指令。内部存储器121可以包括存储程序区和存储数据区。其中,存储程序区可存储操作系统,至少一个功能所需的应用程序(比如声音播放功能,图像播放功能等)等。存储数据区可存储电子设备100使用过程中所创建的数据(比如音频数据,电话本等)等。此外,内部存储器121可以包括高速随机存取存储器,还可以包括非易失性存储器,例如至少一个磁盘存储器件,闪存器件,通用闪存存储器(universal flash storage,UFS)等。处理器110通过运行存储在内部存储器121的指令、和/或存储在设置于处理器中的存储器的指令,执行电子设备100的各种功能应用以及数据处理。
电子设备100可以通过音频模块170、扬声器170A、受话器170B、麦克风170C、耳机接口170D、以及应用处理器等实现音频功能。例如音乐播放,录音等。
SIM卡接口195用于连接SIM卡。SIM卡可以通过插入SIM卡接口195,或从SIM卡接口195拔出,实现和电子设备100的接触和分离。电子设备100可以支持1个或N个SIM卡接口,N为大于1的正整数。SIM卡接口195可以支持Nano SIM卡、Micro SIM卡、SIM卡等。同一个SIM卡接口195可以同时插入多张卡。所述多张卡的类型可以相同,也可以不同。SIM卡接口195也可以兼容不同类型的SIM卡。SIM卡接口195也可以兼容外部存储卡。电子设备100通过SIM卡和网络交互,实现通话以及数据通信等功能。在一些实施例中,电子设备100采用eSIM,即嵌入式SIM卡。eSIM卡可以嵌在电子设备100中,不能和电子设备100分离。
电子设备100的软件系统可以采用分层架构,事件驱动架构,微核架构,微服务架构,或云架构。本公开的实施例以分层架构的一种移动操作系统为例,示例性说明电子设备100的软件结构。
基本原理
如上文所讨论的,车辆在人们的日常生活中正扮演越来越重要的角色。各种电子化技术也逐渐地被应用到车辆中。目前,一个显著的趋势是,车辆正通过各种类型的屏幕来为驾驶者或乘客提供更加丰富的信息,以方便驾驶员的驾驶或者使乘客的行程更加丰富。
示例性地,除传统的车辆中控屏幕外,一些车辆还可以通过部署副驾驶显示设备、后排显示设备或者被安装在其他适当位置的显示设备来为车内的乘客提供诸如新闻、天气、导航、影音等内容。
然而,这样的车内交互方式也可能会引发各种不期望的问题。例如,一些儿童座椅可能只能被安装在后排座位,这可能会导致后排的儿童在单独乘坐时不知晓如何与后排显示设备交互来获取期望的内容,或者前排的驾驶员或副驾驶乘客可能不期望儿童利用后排显示设备来获取某些特定内容。
例如,在一个场景中,车辆可能仅包括驾驶员和后排的儿童乘客两人,儿童乘客可能无法掌握后排显示设备的交互方法,这导致可能需要驾驶员到后排来进行协助。或者,在驾驶过程中,儿童乘客可能会有一些特定的需求,这导致驾驶员可能不得不临时停靠车辆。
在另一个场景中,副驾驶位的乘客可能是后排乘客的家长,其可能不期望后排乘客在车辆行驶途中通过后排显示设备来进行游戏。这导致家长可能需要频繁转身去控制后排显示设备。
在又一个场景中,副驾驶位的乘客可能不熟悉车辆交互系统,其可能无法快速地利用副驾驶显示设备来获取想要的服务。这使得其可能会寻求驾驶员的服务,从而导致驾驶员在驾驶过程中可能出现一些危险行为。
能够看到,虽然车辆内的屏幕系统能够为乘客或驾驶员带来更加丰富的内容,但目前其也可能会带来一些不便利问题,甚至是安全隐患。
有鉴于此,根据本公开的实施例,提供了一种界面交互方案。具体地,终端设备(例如,移动设备)可以基于与车机设备的连接,来从车机设备接收图像信息。车机设备可以与一组车载显示设备(例如,车载中控显示设备、副驾驶座显示设备、后排显示设备等)相关联。进一步地,终端设备可以基于图像信息来呈现目标界面,目标界面包括与该组车载显示设备所呈现的一组车载界面对应的一组窗口。当接收到与该组窗口相关联的交互操作时,终端设备可以向车机设备发送与交互操作对应的交互信息,以使得车机设备执行与该交互操作对应的动作。
由此,本公开的实施例能够在终端设备呈现车载显示设备的界面,以方便了解不同车载显示设备的状态。此外,本公开的实施例还能够允许基于终端设备上的交互操作来远程地控制对应的车载显示设备。由此,本公开的实施例能够提高交互的便捷性,并且保证驾驶过程中的交互安全性。
以下将结合附图来描述根据本公开实施例的界面交互方案。
示例环境
图2示出了本公开的多个实施例能够在其中实现的示例环境200的示意图。该示例环境200可以包括终端设备210以及与终端设备210通信连接的车机设备240。在一些实施例中,终端设备210可以通过任何适当的连接方式来与车机设备240连接,其示例可以包括但不限于:诸如经由USB线缆的有线连接、诸如基于蓝牙、Wi-Fi、蜂窝网络的无线连接等。
在一些实施例中,终端设备210可以包括任何适当类型的电子设备,例如,智能手机、平板电脑、笔记本电脑、智能手表、智能眼镜、智能投影等。如图2所示,这样的终端设备210可以利用相应的显示屏或投影设备来呈现如220所示的图形界面(UI)。
在一些实施例中,车机设备240可以与一个或多个车载显示设备相关联。如图2所示,车机设备240例如可以用于控制多个车载显示设备250-1和250-2(单独或统一称为车载显示设备250)。应当理解,虽然车载显示设备250在图2中示出为包括中控显示设备250-1和副驾驶显示设备250-2,但在保证车辆安全性的情况下,这样的车载显示设备250例如还可以被部署在车内任何的适当位置,例如后排显示设备等。
示例性地,这样的车载显示设备250可以呈现对应的车载界面,例如车载界面260-1和车载界面260-2(单独或统一称为车载界面260)。示例性地,中控车载显示设备250-1例如可以呈现车载界面260-1,其例如可以包括用于控制不同车辆组件(例如,空调、灯光、天窗等)的界面元素。副驾驶显示设备250-2例如可以呈现车载界面260-2,其例如可以包括与娱乐功能相关联的界面元素。
根据本公开的界面交互方案,基于终端设备210和车机设备240之间的连接,终端设备210可以从车机设备240接收图像信息,以用于呈现界面220。如图2所示,界面220例如可以包括与中控显示设备250-1的车载界面260-1所对应的第一窗口230-1。界面220例如还可以包括与副驾驶显示设备250-2的车载界面260-2所对应的第二窗口230-2。第一窗口230-1和第二窗口230-2例如可以单独或统一称为窗口230。关于生成界面220的具体实现将在下文结合图5详细描述,在此暂不详叙。
基于这样的方式,本公开的实施例使得驾驶员、乘客或者其他适当的用户(不一定在车辆内部)能够通过终端设备来快捷地查看一个或多个车载显示设备当前呈现的界面。
在一些实施例中,本公开的实施例还允许用户通过终端设备210来远程地与车载界面进行交互。示例性地,用户例如可以通过终端设备210所提供的触摸屏幕来直接对窗口230中的元素进行操作,这使得用户如同直接在操控对应的车载显示设备250。例如,用户可以通过点击终端设备210所呈现的界面220中的界面元素来例如调整车辆空调的温度。关于远程交互的具体实现将在下文结合图5详细描述,在此暂不详叙。
基于这样的方式,本公开的实施例还使得驾驶员、乘客或者其他适当的用户(不一定在车辆内部)能够通过终端设备来快捷地操作一个或多个车载显示设备当前呈现的界面。
示例界面
以下将参考图3A至图3C来描述利用终端设备210所呈现的界面来查看车载显示设备250的车载界面260的示例。
图3A示出了根据本公开的一些实施例的示例界面的示意图300A。如图3A所示,在终端设备210与车机设备240建立通信连接后,终端设备210例如可以呈现第一界面310。如图3A所示,第一界面310例如可以包括车载信息显示区域302,其例如可以呈现与一组车载显示设备250有关的信息。
在一些实施例中,车载信息显示区域302例如可以包括当前已经启用的车载显示设备的标识(例如,“中控显示设备”和“副驾驶显示设备”)。备选地,车载信息显示区域302也可以包括与车机设备240相关联的、但当前未被启用的车载显示设备的信息(例如,标识“后排显示设备”)。
在一些实施例中,车载信息显示区域302可以包括与车载界面260对应的实时图像,其例如可以通过较小的分辨率被呈现。由此,用户可以直观地了解到对应车载显示设备250的实时状态,而又不会影响到用户对于当前终端设备210的正常操作。例如,终端设备210还可以通过显示区域304来呈现本地视频应用的界面。
在一些实施例中,车载信息显示区域302中与车载界面260对应的图像例如可以以较低的频率被刷新。例如,终端设备210例如可以按24帧每秒的频率来接收车载界面260的图像,但其例如可以按照每秒4帧的频率来更新车载信息显示区域302中的图像。由此,本公开的实施例既能够满足用户查看对应车载显示设备250的实时状态的需求,又能够降低终端设备渲染图像的开销。
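上述"按较高频率接收、按较低频率刷新"的思路可以用如下极简的节流器来示意。该代码是一个假设性的草图,类名ThrottledPreview与参数target_fps均为本文之外的假设,并非本公开实现的一部分:

```python
class ThrottledPreview:
    """按较低频率刷新预览图像的简单节流器(示意)。

    接收端可能以例如24帧/秒收到车载界面图像,但车载信息显示
    区域只按target_fps(例如4)刷新,以降低渲染开销。
    """

    def __init__(self, target_fps: float):
        self.min_interval = 1.0 / target_fps  # 两次渲染之间的最小间隔(秒)
        self.last_render = float("-inf")
        self.rendered = 0

    def on_frame(self, frame, now: float) -> bool:
        """收到一帧;仅当距上次渲染超过最小间隔时才渲染该帧。"""
        if now - self.last_render >= self.min_interval:
            self.last_render = now
            self.rendered += 1
            return True
        return False
```

例如,在1秒内以24帧/秒送入帧、目标刷新率为4帧/秒时,只有约4帧会被实际渲染,其余帧被丢弃。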
在一些实施例中,车载信息显示区域302例如可以不呈现与车载界面260对应的图像,而只有在用户发起查看相应车载界面260的请求时,才提供与车载界面260对应的窗口。
如图3B所示,当用户发起了查看中控显示设备的车载界面的详情的请求时,终端设备210例如可以呈现第二界面320。示例性地,用户例如可以通过终端设备210的触摸屏幕来点击第一界面310中的文字“中控显示设备”或上方的图像或视频,以发起查看中控显示设备250-1的实时界面的请求。
如图3B所示,第二界面320例如可以包括与中控显示设备250-1的车载界面260-1对应的窗口312,以通过更大的显示区域来呈现车载界面260-1对应的图像。
在一些实施例中,窗口312中的图像可以基于从车机设备240接收的图像信息来实时地更新。在一些实施例中,窗口312中图像被刷新的频率例如可以高于车载信息显示区域302中图像被刷新的频率。在一些实施例中,如图3B所示,当中控显示设备250-1对应的图像通过窗口312被呈现时,其对应的信息可以从车载信息显示区域302中被隐藏。
在一些实施例中,终端设备210还可以根据用户的操作来调整窗口312的显示尺寸。例如,当用户例如点击最大化窗口312的按钮时,终端设备210例如可以使得该窗口312被全屏显示。在一些实施例中,当用户例如点击最小化窗口312的按钮时,终端设备210例如可以关闭该窗口312,并重新如图3A所示在车载信息显示区域302中来显示车载界面所对应的图像。
在一些实施例中,如下文将详细介绍的,用户还可以通过操作窗口312内的界面元素来实现对于对应的车载界面260-1的实时控制。
如图3C所示,终端设备210例如还可以呈现界面220(也称为第三界面)。例如,终端设备210例如可以响应于用户对于车载信息显示区域302中“中控显示设备”和“副驾驶显示设备”的查看请求,而呈现第三界面220。
如图3C所示,第三界面220可以包括与中控显示设备250-1对应的第一窗口230-1,和与副驾驶显示设备250-2对应的第二窗口230-2。在一些实施例中,第三界面220例如还可以包括与窗口230-1和窗口230-2对应的窗口标识332-1和332-2(单独或统一称为窗口标识332)。该窗口标识例如可以用于指示与相应窗口对应的车载显示设备。
例如,第一窗口230-1的窗口标识为“CAR DESKTOP 1”,其例如可以用于指示第一窗口230-1所呈现的图像对应于中控显示设备250-1的车载界面260-1。第二窗口230-2的窗口标识为“CAR DESKTOP 2”,其例如可以用于指示第二窗口230-2所呈现的图像对应于副驾驶显示设备250-2的车载界面260-2。
在一些实施例中,如图3C所示,第一窗口230-1中例如可以包括与车载界面260-1中的控件所对应的界面元素,例如界面元素334-1和界面元素334-2(单独或统一称为界面元素334)。在一些实施例中,终端设备210可以接收针对界面元素334的交互操作,并相应地生成交互信息以使得用户能够感觉如同直接在操作对应的车载显示设备。关于交互场景和交互实现的细节将在下文详细描述。
在一些实施例中,如图3C所示,界面220还可以包括与终端设备210的本地应用相关联的界面元素。例如,界面220可以包括与“相机应用”相关联的界面元素336、与“电话应用”相关联的界面元素338等。
在一些实施例中,终端设备210也可以接收针对界面元素336或界面元素338的交互操作,并可以利用与车机设备240相关联的输入输出(I/O)设备来完成相应的交互。关于利用车载I/O设备来完成终端设备210的本地交互的过程将在下文详细描述。
能够看到,通过在终端设备处提供与车载显示设备的车载界面有关的图像,本公开的实施例能够提高车载显示设备管理的便捷性。
示例交互场景1
以下将参考图4A至图4C来描述利用终端设备210来实现针对车载显示设备250的车载界面260的远程交互的示例。
在一些实施例中,用户例如可以通过终端设备210中呈现的窗口来实现针对车辆的控制。 如图4A所示,第一窗口230-1中例如可以包括用于控制车辆组件的界面元素。例如,界面元素410例如可以对应于车载界面260-1中用于控制驾驶员区域的空调温度的控件。
在一些实施例中,终端设备210可以接收用户对于界面元素410的点击操作,并生成与该点击操作对应的交互信息。具体地,终端设备210例如可以确定与该点击操作对应的点击位置。例如,该点击位置可以指示在终端设备210的界面220中的第一坐标。
在一些实施例中,终端设备210可以进一步基于该第一坐标来确定该点击操作发生在第一窗口230-1的窗口显示区域内。进一步地,终端设备210可以基于第一窗口230-1的窗口显示区域,而将第一坐标转换为与第一窗口230-1对应的坐标系中的第二坐标。由此,终端设备210可以将界面220中点击位置的坐标转换为在对应控件在车载界面260-1中的实际坐标。
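上述坐标变换过程(命中检测、平移、缩放)可以用如下示意代码来说明。该代码仅为基于上文描述的假设性草图,其中win字典的字段名均为假设,并非本公开约定的数据结构:

```python
def to_window_coords(x, y, win):
    """将终端界面坐标(第一坐标系)换算为目标窗口所对应的
    车载界面坐标(第二坐标系)的示意实现。

    win 描述目标窗口:在终端界面中的位置(x, y)、显示尺寸
    (w, h),以及对应车载界面的原始分辨率(native_w, native_h)。
    """
    # 1) 命中检测:交互位置是否落在窗口显示区域内
    if not (win["x"] <= x < win["x"] + win["w"] and
            win["y"] <= y < win["y"] + win["h"]):
        return None
    # 2) 平移:换算为相对窗口左上角的坐标
    rel_x, rel_y = x - win["x"], y - win["y"]
    # 3) 缩放:窗口显示尺寸 -> 车载界面原始分辨率
    sx = win["native_w"] / win["w"]
    sy = win["native_h"] / win["h"]
    return (rel_x * sx, rel_y * sy)
```

例如,当窗口以400×240的尺寸显示一个1600×960的车载界面时,终端界面上落在窗口内的点击会被等比例放大4倍后映射到车载界面坐标。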
进一步地,终端设备210可以基于第二坐标来生成交互信息,以指示对车载界面260-1中的对应坐标执行了点击操作。在一些实施例中,交互信息还可以包括用于指示与目标窗口(即第一窗口230-1)对应的中控车载显示设备250-1的设备信息。相应地,该终端设备210可以经由与车机设备240的连接,而将该交互信息发送至车机设备240。
在一些实施例中,在接收到交互信息后,车机设备240可以对交互信息进行解析,以执行与交互信息对应的操作。具体地,车机设备240可以首先基于交互信息中的设备信息,确定该交互信息对应于中控显示设备250-1。进一步地,车机设备240还可以基于所确定的中控显示设备250-1和由交互信息中的位置信息来确定与交互信息对应的交互动作。
示例性地,车机设备240可以基于所接收的交互信息确定所对应的交互动作是对中控显示设备250-1的车载界面260-1中的特定坐标处执行了点击操作。相应地,车机设备240对该点击操作执行响应动作,如同该点击操作真实发生在中控显示设备250-1处。
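车机侧"先按设备信息定位目标显示设备、再结合位置信息确定交互动作"的解析流程可以示意如下。其中交互信息的字段device、action、pos均为假设的消息格式,并非本公开约定的协议:

```python
def resolve_action(interaction, displays):
    """车机侧解析交互信息的示意流程。

    interaction: 假设的交互信息,含设备信息与位置信息。
    displays: 设备信息 -> 车载显示设备 的映射。
    """
    # 先基于设备信息确定目标车载显示设备
    target = displays.get(interaction["device"])
    if target is None:
        raise KeyError("未知的目标车载显示设备: %s" % interaction["device"])
    # 再结合位置信息确定交互动作:如同该操作真实发生在
    # 目标显示设备的车载界面的给定坐标处
    return {
        "display": target,
        "event": interaction["action"],
        "pos": interaction["pos"],
    }
```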
在图4A的示例中,车机设备240可以基于该交互信息来向与车机设备240连接的车辆组件(例如,空调组件)发送对应的控制指令,以响应于该交互操作来调整驾驶员区域空调的温度。
附加地,由于该响应动作的执行,车机设备240例如可以更新车载界面260-1,并通过与终端设备210之间的连接,将与更新后的车载界面260-1对应的更新图像发送至终端设备210。相应地,终端设备210可以基于更新图像来更新第一窗口230-1中的显示。例如,更新图像可以显示驾驶员区域空调温度从“26°”被调整至“25°”。
基于以上的方式,本公开的实施例能够允许用户通过终端设备来实现针对车辆的快捷操控。这样的操控方式如同真实地操作对应的车载显示设备,而没有改变用户的使用习惯。
应当理解,以上所描述的点击操作仅是交互操作的一种示例。基于对应车载显示设备的交互能力,终端设备可以允许用户执行任何适当的交互操作,其示例可以包括但不限于:基于触摸屏的点击、长按、拖拽或滑动等;适当的体感交互方式(例如,眼动交互);基于其他输入设备(例如,键盘、鼠标等)的交互。由终端设备210所生成的交互信息的格式例如可以类似于在车载显示设备发生交互所产生的交互信息的格式。
此外,应当理解,图4A中描述的控制空调组件的功能仅是示例性的,本公开的实施例还能够允许基于终端设备中提供的窗口来快捷地控制其他车辆组件,例如灯光、雨刷或天窗等。
示例交互场景2
在一些实施例中,用户例如可以通过终端设备210中呈现的窗口来实现便捷的界面交互。如图4B所示,第一窗口230-1中例如可以包括用于信息显示的界面元素。例如,界面元素330例如可以对应于车载界面260-1中用于呈现特定联系人信息的控件。
在一些实施例中,终端设备210可以接收用户对于界面元素330的右滑操作,并生成与该滑动操作对应的交互信息。具体地,终端设备210例如可以确定与该滑动操作对应的一组屏幕位置。例如,该组屏幕位置可以指示在终端设备210的界面220中的第一组坐标。
在一些实施例中,终端设备210可以进一步基于该第一组坐标来确定该滑动操作发生在第一窗口230-1的窗口显示区域内。进一步地,终端设备210可以基于第一窗口230-1的窗口显示区域,而将第一组坐标转换为与第一窗口230-1对应的坐标系中的第二组坐标。由此,终端设备210可以将界面220中滑动位置的坐标转换为在对应控件在车载界面260-1中的实际坐标。
进一步地,终端设备210可以基于第二组坐标来生成交互信息,以指示对车载界面260-1中的对应坐标执行了滑动操作。在一些实施例中,交互信息还可以包括用于指示与目标窗口(即第一窗口230-1)对应的中控车载显示设备250-1的设备信息。相应地,该终端设备210可以经由与车机设备240的连接,而将该交互信息发送至车机设备240。
在一些实施例中,在接收到交互信息后,车机设备240可以对交互信息进行解析,以执行与交互信息对应的操作。具体地,车机设备240可以首先基于交互信息中的设备信息,确定该交互信息对应于中控显示设备250-1。进一步地,车机设备240还可以基于所确定的中控显示设备250-1和由交互信息中的位置信息来确定与交互信息对应的交互动作。
示例性地,车机设备240可以基于所接收的交互信息确定所对应的交互动作是对中控显示设备250-1的车载界面260-1执行了与第二组坐标对应的右滑操作。相应地,车机设备240对该右滑操作执行响应动作,如同该右滑操作真实发生在中控显示设备250-1处。
在图4B的示例中,车机设备240可以基于该右滑操作来在车载界面260-1呈现与该联系人详情对应的显示区域。附加地,车机设备240例如可以更新车载界面260-1,并通过与终端设备210之间的连接,将与更新后的车载界面260-1对应的更新图像发送至终端设备210。相应地,终端设备210可以基于更新图像来更新第一窗口230-1中的显示。例如,第一窗口230-1可以包括新的显示区域420,以呈现所选择联系人的详细信息。
基于以上的方式,本公开的实施例能够允许用户通过终端设备来实现针对车载显示设备的快捷交互。这样的操控方式如同真实地操作对应的车载显示设备,而没有改变用户的使用习惯。
在一个示例场景中,位于副驾驶座的乘客例如可以通过与车机设备连接的终端设备(例如,智能手机)来快捷地操控后排显示设备的界面,以例如可以向后排位置的乘客提供特定的内容(例如,视频)。由此方式,可以提高驾驶过程的安全性。
在一些实施例中,当利用终端设备远程控制车载显示设备时,人们可能还期望禁用对应车载显示设备的交互能力。例如,在以上控制后排显示设备以播放特定内容的示例中,用户可能不期望后排就座的儿童对屏幕发生不期望的操作(例如,误触)。
在一些实施例中,终端设备210例如可以允许用户通过界面来禁用特定车载显示设备的屏幕交互功能。示例性地,用户例如可以通过长按第二窗口230-2的窗口栏,以发起禁用对应的副驾驶显示设备250-2的交互功能的请求。终端设备210例如可以在获得用户的确认后,向车机设备240发送禁用请求,以使得副驾驶显示设备250-2的屏幕交互功能被禁用。
在一些实施例中,禁用请求例如也可以是由终端设备210所自动发出的。例如,用户可以配置在终端设备连接到车机设备后,该组车载显示设备的交互模式。例如,交互模式例如可以包括:仅允许通过终端设备交互、仅允许通过车载显示设备交互、允许通过终端设备/车载显示设备两者进行交互等。在这样的配置被确定后,在每次相同用户的终端设备连入时,终端设备例如可以基于该配置信息来自动地发送禁用请求,以使得被配置为“仅允许通过终端设备交互”这一交互模式的车载显示设备的屏幕交互功能被禁用。
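按交互模式配置自动生成禁用请求的逻辑可以示意如下。配置结构与模式字符串均为假设,仅用于说明"连接后依据配置自动禁用部分车载显示设备的屏幕交互功能"这一思路:

```python
# 假设的交互模式常量,对应正文中列举的三种模式之一
PHONE_ONLY = "仅允许通过终端设备交互"

def disable_requests(mode_config):
    """根据交互模式配置生成应被禁用屏幕交互功能的显示设备列表(示意)。

    mode_config: 车载显示设备名称 -> 交互模式字符串。
    返回终端设备在连入车机设备后应针对其发送禁用请求的设备。
    """
    return [dev for dev, mode in mode_config.items() if mode == PHONE_ONLY]
```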
基于这样的方式,本公开的实施例能够进一步方便用户管理多个车载显示设备。
示例交互场景3
在一些实施例中,终端设备210还可以允许用户利用与车机设备240相关联的输入输出(I/O)设备来进行输入或输出。
示例性地,当终端设备210接收到对于“相机应用”的界面元素336的点击操作时,终端设备210例如可以呈现如图4C所示的拍照界面400C。在一些实施例中,与传统的拍照界面不同,拍照界面400C例如可以包括相机选择控件430。当接收到针对相机选择控件430的选择时,拍照界面400C中可以呈现浮窗440,以使用户确认当前拍照所要使用的相机。
如下文将参考图5至图7详细介绍的,在终端设备210建立与车机设备240的连接后,与车机设备240相关联的车载资源可以被虚拟化为终端设备210的虚拟资源。这样的车载资源例如可以包括但不限于:车载显示设备250、各种类型的传感器、各种类型的I/O设备等。
如图4C所示,通过浮窗440,用户例如可以选择“车载相机(A Car)”作为拍照应用所要使用的相机设备。响应于对“车载相机(A Car)”的选择,终端设备210可以向车机设备240发送交互信息,以使得“车载相机(A Car)”被启用,以用于拍照应用。
相应地,车机设备240在接收到该交互信息后,可以占用该“车载相机(A Car)”硬件从而启用“车载相机(A Car)”以获取实时图像。进一步地,该实时图像可以经由与终端设备210的连接而被发送回终端设备210,以例如用于如图4C所示的拍照界面400C。
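终端侧请求启用车载I/O设备、车机侧响应并启用的往返过程可以示意如下。消息格式(type、device字段)为假设,并非本公开约定的协议:

```python
def build_io_request(device_name):
    """终端侧针对目标车载I/O设备生成第二交互信息的示意。"""
    return {"type": "io_enable", "device": device_name}

def handle_io_request(msg, io_devices):
    """车机侧启用目标I/O设备以供终端设备使用的示意。

    io_devices: 设备名称 -> 设备状态字典(含假设的enabled标志)。
    """
    dev = io_devices[msg["device"]]
    dev["enabled"] = True  # 占用并启用该硬件,例如开始采集实时图像
    return dev
```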
基于这样的方式,本公开的实施例能够允许终端设备进一步利用车载资源来执行更加丰富类型的交互。
应当理解,图4C中示例所描述的“相机”仅是示例性的,用户例如还可以通过点击如图3C所示的“电话应用”的界面元素338,以使用车载麦克风和扬声器来执行电话呼叫过程。
界面交互流程
以下将参考图5来描述实现本公开的界面交互方案的具体过程。图5示出了根据本公开实施例的示例界面交互过程500。过程500可能涉及终端设备和车机设备。终端设备例如可以包括图2中的终端设备210,车机设备例如可以包括图2中的车机设备240。应当理解,过程500也可以适用于其他任何适当的终端设备和车机设备。
如图5所示,在502,终端设备210建立与车机设备240的连接。在一些实施例中,终端设备210可以通过任何适当的连接方式来与车机设备240连接,其示例可以包括但不限于:诸如经由USB线缆的有线连接、诸如基于蓝牙、Wi-Fi、蜂窝网络的无线连接等。
在一些实施例中,终端设备210例如可以经由车联工具(例如,HiCar、CarPlay和CarLife等)来建立与车机设备240的连接。关于使用HiCar来建立连接的具体过程将结合图6和图7来详细描述。
在504,车机设备240基于由一组车载显示设备250所呈现的一组车载界面260来生成图像信息。在一些实施例中,这样的图像信息例如可以包括该组车载界面260的对应图像。在一些实施例中,为了降低网络传输的开销,图像信息例如可以包括利用适当压缩算法所压缩过的一组图像。
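"基于一组车载界面生成(经压缩的)图像信息"这一步可以示意如下。此处以zlib无损压缩代替实际系统中可能采用的视频编码,仅用于说明压缩可降低传输开销;函数名与数据结构均为假设:

```python
import zlib

def build_image_info(interfaces):
    """车机侧基于一组车载界面生成图像信息的示意:逐屏压缩。

    interfaces: 车载显示设备标识 -> 该屏车载界面的原始帧数据(bytes)。
    """
    return {dev: zlib.compress(frame) for dev, frame in interfaces.items()}

def parse_image_info(info):
    """终端侧解压图像信息,恢复每个窗口要呈现的图像数据。"""
    return {dev: zlib.decompress(blob) for dev, blob in info.items()}
```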
在506,终端设备210从车机设备240接收图像信息。在508,终端设备210基于图像信息呈现目标界面220,该目标界面220包括与一组车载界面260对应的一组窗口230。
在一些实施例中,目标界面220例如可以同时呈现与多个车载界面260所对应的多个窗口。备选地或附加地,一组窗口230中的一个或多个窗口可以响应于用户操作而被隐藏或展开。
在510,终端设备210接收与一组窗口230相关联的交互操作,并生成与该交互操作对应的交互信息。这样的交互操作可以包括利用任何交互工具的任何适当类型的界面操作,其示例可以包括但不限于:基于触摸屏的点击、长按、拖拽或滑动等;适当的体感交互方式(例如,眼动交互);基于其他输入设备(例如,键盘、鼠标等)的交互。
在512,终端设备210向车机设备240发送所生成的交互信息。在514,车机设备240执行与该交互信息对应的动作。在一些实施例中,车机设备240可以响应于交互信息而向与车机设备连接的车辆组件发送对应的控制指令,其中控制指令基于交互信息而被生成。备选地,车机设备240可以使一组车载界面260中的至少一个车载界面响应于交互信息而被更新。
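车机侧执行交互动作的两类结果(向车辆组件下发控制指令、使车载界面被更新)可以示意如下。send_command回调与action的字段均为假设的接口/消息格式:

```python
def perform_action(action, send_command, interfaces):
    """执行与交互信息对应的交互动作的示意。

    action: 假设的动作描述;send_command: 向车辆组件下发
    控制指令的回调;interfaces: 记录各车载界面当前状态的字典。
    """
    results = []
    if "component" in action:
        # 例如:调整空调温度,向对应车辆组件发送控制指令
        send_command({"target": action["component"], "value": action["value"]})
        results.append("control")
    if action.get("refresh_ui"):
        # 使至少一个车载界面响应于交互信息而被更新
        interfaces[action["display"]] = action["new_state"]
        results.append("ui_update")
    return results
```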
基于这样的方式,本公开的实施例能够通过终端设备来对车载显示设备的界面进行快捷的操控,从而能够提高交互的便捷性,并且保证驾驶过程中的交互安全性。
在一些实施例中,在514,车机设备240基于交互动作的执行而生成更新图像信息,其中更新图像信息对应于一组更新的车载界面,一组更新界面中的至少一个车载界面响应于交互动作而被更新。在516,车机设备240向终端设备210发送更新图像信息。在518,终端设备210基于更新图像信息来呈现更新界面。
基于以上的方式,本公开的实施例能够允许用户通过终端设备来实现针对车载显示设备的快捷操控。这样的操控方式如同真实地操作对应的车载显示设备,而没有改变用户的使用习惯。
系统实现
图6示出了根据本公开的实施例的用于界面交互的系统框架600的示意图。如图6所示,该系统框架可以基于车联工具HiCar而被实现。示例性地,终端设备210例如可以包括多个组件:硬件部分;用于保障数据连接的安全性的模块,例如HiChain,其例如可以提供HiCar场景鉴权能力;用于调度应用进程的模块,例如AMS(Activity Manager Service)模块;用于多屏管理的模块,例如多屏框架;用于多屏互动的模块,例如Airsharing;用于提供反向控制能力的模块,例如Hisight模块;用于提供车机虚拟化能力的模块,例如DMSDP(Distributed Multi-mode Sensing Data Platform,分布式多节点感知数据平台);用于近距离车机互联的模块,例如NEARBY模块;上层应用,例如HiCar APK和iConnect APK,分别提供了HiCar和iConnect应用。
如图6所示,车机设备240可以包括多个组件:硬件部分;用于多屏管理的模块,例如多屏框架;用于多屏互动的模块,例如Airsharing;用于提供反向控制能力的模块,例如Hisight;用于提供车机虚拟化能力的模块,例如DMSDP(Distributed Multi-mode Sensing Data Platform,分布式多节点感知数据平台);用于近距离车机互联的模块,例如NEARBY模块;上层应用,例如厂家APK,其例如可以实现用于互联的Car APK,Car APK可以包括用于认证的模块,例如Authgent模块。进一步地,用于认证的模块可以包括用于保障数据连接的安全性的模块,例如HiChain。
以下将进一步参考图7来描述基于HiCar连接而实施的屏幕共享过程700。如图7所示,过程700可以包括设备发现阶段710。具体地,发现阶段包括:在702,用户可以在车机侧操作触发硬件之间的发现广播。例如,无线用户可以在车上操作以发出蓝牙广播,或者USB用户可以利用USB线缆将终端设备与车机设备有线连接。
在704,iConnect接收到硬件广播,并根据汽车ID判断是否是允许接入的汽车类型。在706,iConnect将发现的设备传递给HiCar APP。在708,由HiCar发起到车机设备的连接。
过程700还可以包括无线回连阶段720。具体地,在712,车机设备的硬件可以检测到与终端设备的配对。在714,车机设备可以通过Car SDK发起车机到终端设备的连接。
过程700还可以包括连接鉴权阶段730。具体地,在721,建立车机设备和终端设备之间的连接通道。例如,HiCar APP接收到iConnect传递的车辆信息和当前连接信息,在终端设备处主动发起无线/有线连接。或者,在HiCar APP收到车机设备主动发出的连接请求、确定是可信设备后,终端设备可以接受连接请求。
在722和723,终端设备和车机设备执行连接鉴权。具体地,HiCar APP可以和Car SDK交换消息,传递必要的软件版本、车机能力等信息。HiCar APP可以调用HiChain接口开始认证,其中HiCar APP承担认证的流程编排以及认证数据的传输部分,而HiChain承担认证算法部分。在724和725,在鉴权通过后,可以相应地确定认证完成。
过程700还可以包括设备虚拟化阶段740。具体地,在731,建立终端设备和车机设备的连接。例如,认证成功后,HiCar APP通知DMSDP连接车机设备。
在732和733,执行设备虚拟化。具体地,DMSDP连接车机设备成功后,DMSDP可以获取车辆上的车载资源,并将车载资源虚拟成手机可使用的虚拟资源(例如,外设),再将虚拟资源上报给HiCar APP。在734和735,可以使能DMSDP上报的虚拟设备。在736,基于设备虚拟化来建立数据传输通道。在737和738,DMSDP可以通知Airsharing/Hisight将虚拟设备添加到终端设备,并将车辆上的关键码与终端设备的关键码映射。
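"将车载资源虚拟化为终端可用的虚拟资源"这一步可以示意如下。资源描述结构(name、kind字段与kind取值)为假设,仅用于说明按类型归类并上报的思路:

```python
def virtualize(car_resources):
    """将车机侧上报的车载资源虚拟化为终端可用虚拟资源的示意。

    car_resources: 资源描述列表,每项含名称与类型
    (display / camera / microphone / speaker)。
    """
    virtual = {"displays": [], "io": []}
    for res in car_resources:
        # 标记为远程(虚拟)资源,供终端侧如同本地外设般使用
        entry = {"name": res["name"], "kind": res["kind"], "remote": True}
        if res["kind"] == "display":
            virtual["displays"].append(entry)  # 一组车载显示设备
        else:
            virtual["io"].append(entry)        # 相机、麦克风、扬声器等I/O设备
    return virtual
```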
过程700还可以包括投屏阶段750。具体地,在732,Airsharing/Hisight可以通知多屏框架添加了HiCar业务的显示设备。在734,多屏框架通知车机设备侧Car SDK并传入终端设备的多屏控制程序,车机设备的Car SDK开始投屏。
应当理解,以上基于HiCar的系统实现框架仅是示例性的,本公开的实施例还可以利用其他任何适当的连接、鉴权、通信方式来实现根据本公开实施例的界面交互方案。
示例过程
图8示出了根据本公开实施例的图形界面方法的示例过程800的流程图。过程800例如可以由终端设备(例如,参考图2所讨论的终端设备210)来实施。
如图8所示,在框810,终端设备210基于与车机设备的连接,从车机设备接收图像信息,车机设备与一组车载显示设备相关联,图像信息是车机设备基于由一组车载显示设备所呈现的一组车载界面而生成。
在框820,终端设备210基于图像信息呈现目标界面,目标界面包括与一组车载界面对 应的一组窗口。
在框830,响应于接收到与一组窗口相关联的交互操作,终端设备210向车机设备发送与交互操作对应的交互信息。
在一些实施例中,过程800还包括:建立与车机设备的连接,使得与车机设备相关联的车载资源被虚拟化为终端设备的虚拟资源,车载资源包括一组车载显示设备。
在一些实施例中,车载资源还包括与车机设备相关联的输入输出I/O设备,该I/O设备包括以下至少一项:相机、麦克风或扬声器。
在一些实施例中,过程800还包括:从车机设备接收更新图像信息,更新图像信息对应于一组更新的车载界面,一组更新的车载界面中的至少一个界面响应于交互信息而被更新;以及基于更新图像信息,呈现更新界面。
在一些实施例中,基于图像信息呈现目标界面包括:基于图像信息,生成与一组车载界面对应的一组图像;基于一组图像,生成与一组车载界面对应的一组窗口;以及呈现包括所生成的一组窗口的目标界面。
在一些实施例中,目标界面还包括与一组窗口对应的一组窗口标识,其中窗口标识用于指示与相应窗口对应的车载显示设备。
在一些实施例中,过程800还包括:接收与一组窗口相关联的交互操作;确定与交互操作所对应的位置信息,位置信息指示交互操作在终端设备上发生的位置;从一组窗口中确定与位置信息对应的目标窗口;以及基于位置信息和目标窗口在终端设备中的窗口显示区域,生成交互信息。
在一些实施例中,位置信息为第一位置信息,并且基于第一位置信息和目标窗口在终端设备中的窗口显示区域生成交互信息包括:基于窗口显示区域,将第一位置信息变换为第二位置信息,其中第一位置信息指示在与终端设备相关联的第一坐标系中的第一组坐标,第二位置信息指示与目标窗口对应的第二坐标系中的第二组坐标;以及基于第二位置信息,生成交互信息,交互信息还包括用于指示与目标窗口对应的目标车载显示设备的设备信息。
在一些实施例中,交互操作为第一交互操作,交互信息为第一交互信息,目标界面还包括与终端设备的本地应用相关联的界面元素,过程800还包括:接收与界面元素相关联的第二交互操作,第二交互操作指示对与车机设备相关联的目标I/O设备的使用;以及向车机设备发送与第二交互操作对应的第二交互信息,以使目标I/O设备被启动。
在一些实施例中,过程800还包括:向车机设备发送禁用请求,以使一组车载显示设备中的至少一个车载显示设备的屏幕交互功能被禁用。
在一些实施例中,一组车载显示设备包括以下至少一项:中控显示设备、副驾驶显示设备或后排显示设备。
图9示出了根据本公开实施例的图形界面方法的示例过程900的流程图。过程900例如可以由车机设备(例如,参考图2所讨论的车机设备240)来实施。
在910,车机设备240基于与终端设备的连接向终端设备发送图像信息,图像信息基于由一组车载显示设备所呈现的一组车载界面而生成。
在920,车机设备240从终端设备接收交互信息,交互信息是基于与终端设备的目标界面中的一组窗口相关联的交互操作而被生成,目标界面基于图像信息而被生成并且包括与一组车载界面对应的一组窗口。
在930,车机设备240执行与交互信息对应的交互动作。
在一些实施例中,过程900还包括:建立与终端的连接,使得与车机设备相关联的车载资源被虚拟化为终端设备的虚拟资源,车载资源包括一组车载显示设备。
在一些实施例中,车载资源还包括与车机设备相关联的输入输出I/O设备,该I/O设备包括以下至少一项:相机、麦克风或扬声器。
在一些实施例中,过程900还包括:基于交互动作的执行,生成更新图像信息,更新图像信息对应于一组更新的车载界面,一组更新界面中的至少一个车载界面响应于交互动作而被更新;以及向终端设备发送更新图像信息。
在一些实施例中,过程900还包括:基于交互信息中的设备信息,确定与交互信息对应的目标车载显示设备;以及基于目标车载显示设备和交互信息中的位置信息,确定与交互信息对应的交互动作。
在一些实施例中,执行与交互信息对应的交互动作包括以下至少一项:向与车机设备连接的车辆组件发送对应的控制指令,控制指令基于交互信息而被生成;或者使一组车载界面中的至少一个车载界面响应于交互信息而被更新。
在一些实施例中,交互操作为第一交互操作,交互信息为第一交互信息,目标界面还包括与终端设备的本地应用相关联的界面元素,方法还包括:从终端设备接收第二交互信息,第二交互信息是基于与界面元素相关联的第二交互操作而被生成,第二交互操作指示对与车机设备相关联的目标I/O设备的使用;以及启用目标I/O设备,以供终端设备使用。
在一些实施例中,过程900还包括:响应于从终端设备接收的禁用请求,使一组车载显示设备中的至少一个车载显示设备的屏幕交互功能被禁用。
在一些实施例中,一组车载显示设备包括以下至少一项:中控显示设备、副驾驶显示设备或后排显示设备。
基于以上所讨论的过程,本公开的实施例能够通过终端设备来对车载显示设备的界面进行快捷的操控,从而能够提高交互的便捷性,并且保证驾驶过程中的交互安全性。
本公开可以是方法、装置、系统和/或计算机程序产品。计算机程序产品可以包括计算机可读存储介质,其上载有用于执行本公开的各个方面的计算机可读程序指令。
计算机可读存储介质可以是可以保持和存储由指令执行设备使用的指令的有形设备。计算机可读存储介质包括但不限于电存储设备、磁存储设备、光存储设备、电磁存储设备、半导体存储设备或者上述的任意合适的组合。计算机可读存储介质的更具体的例子(非穷举的列表)包括:便携式计算机盘、硬盘、随机存取存储器(RAM)、只读存储器(ROM)、可擦式可编程只读存储器(EPROM或闪存)、静态随机存取存储器(SRAM)、便携式压缩盘只读存储器(CD-ROM)、数字多功能盘(DVD)、记忆棒、软盘、机械编码设备、例如其上存储有指令的打孔卡或凹槽内凸起结构、以及上述的任意合适的组合。这里所使用的计算机可读存储介质不被解释为瞬时信号本身,诸如无线电波或者其他自由传播的电磁波、通过波导或其他传输媒介传播的电磁波(例如,通过光纤电缆的光脉冲)、或者通过电线传输的电信号。
这里所描述的计算机可读程序指令可以从计算机可读存储介质下载到各个计算/处理设备,或者通过网络、例如因特网、局域网、广域网和/或无线网下载到外部计算机或外部存储设备。网络可以包括铜传输电缆、光纤传输、无线传输、路由器、防火墙、交换机、网关计算机和/或边缘服务器。每个计算/处理设备中的网络适配卡或者网络接口从网络接收计算机可读程序指令,并转发该计算机可读程序指令,以供存储在各个计算/处理设备中的计算机可读存储介质中。
用于执行本公开操作的计算机程序指令可以是汇编指令、指令集架构(ISA)指令、机器指令、机器相关指令、微代码、固件指令、状态设置数据、或者以一种或多种编程语言的任意组合编写的源代码或目标代码,所述编程语言包括面向对象的编程语言—诸如Smalltalk、C++等,以及常规的过程式编程语言—诸如“C”语言或类似的编程语言。计算机可读程序指令可以完全地在用户计算机上执行、部分地在用户计算机上执行、作为一个独立的软件包执行、部分在用户计算机上部分在远程计算机上执行、或者完全在远程计算机或服务器上执行。在涉及远程计算机的情形中,远程计算机可以通过任意种类的网络—包括局域网(LAN)或广域网(WAN)—连接到用户计算机,或者,可以连接到外部计算机(例如利用因特网服务提供商来通过因特网连接)。在一些实施例中,通过利用计算机可读程序指令的状态信息来个性化定制电子电路,例如可编程逻辑电路、现场可编程门阵列(FPGA)或可编程逻辑阵列(PLA),该电子电路可以执行计算机可读程序指令,从而实现本公开的各个方面。
这里参照根据本公开实施例的方法、装置(系统)和计算机程序产品的流程图和/或框图描述了本公开的各个方面。应当理解,流程图和/或框图的每个方框以及流程图和/或框图中各方框的组合,都可以由计算机可读程序指令实现。
这些计算机可读程序指令可以提供给通用计算机、专用计算机或其它可编程数据处理装置的处理单元,从而生产出一种机器,使得这些指令在通过计算机或其它可编程数据处理装置的处理单元执行时,产生了实现流程图和/或框图中的一个或多个方框中规定的功能/动作的装置。也可以把这些计算机可读程序指令存储在计算机可读存储介质中,这些指令使得计算机、可编程数据处理装置和/或其他设备以特定方式工作,从而,存储有指令的计算机可读介质则包括一个制造品,其包括实现流程图和/或框图中的一个或多个方框中规定的功能/动作的各个方面的指令。
也可以把计算机可读程序指令加载到计算机、其它可编程数据处理装置、或其它设备上,使得在计算机、其它可编程数据处理装置或其它设备上执行一系列操作步骤,以产生计算机实现的过程,从而使得在计算机、其它可编程数据处理装置、或其它设备上执行的指令实现流程图和/或框图中的一个或多个方框中规定的功能/动作。
附图中的流程图和框图显示了根据本公开的多个实施例的系统、方法和计算机程序产品的可能实现的体系架构、功能和操作。在这点上,流程图或框图中的每个方框可以代表一个模块、程序段或指令的一部分,所述模块、程序段或指令的一部分包含一个或多个用于实现规定的逻辑功能的可执行指令。在有些作为替换的实现中,方框中所标注的功能也可以以不同于附图中所标注的顺序发生。例如,两个连续的方框实际上可以基本并行地执行,它们有时也可以按相反的顺序执行,这依所涉及的功能而定。也要注意的是,框图和/或流程图中的每个方框、以及框图和/或流程图中的方框的组合,可以用执行规定的功能或动作的专用的基于硬件的系统来实现,或者可以用专用硬件与计算机指令的组合来实现。

Claims (24)

  1. 一种界面交互方法,用于终端设备,其特征在于,所述方法包括:
    所述终端设备基于与车机设备的连接,从所述车机设备接收图像信息,所述车机设备与一组车载显示设备相关联,所述图像信息是所述车机设备基于由所述一组车载显示设备所呈现的一组车载界面而生成;
    所述终端设备基于所述图像信息呈现目标界面,所述目标界面包括与所述一组车载界面对应的一组窗口;以及
    响应于接收到与所述一组窗口相关联的交互操作,所述终端设备向所述车机设备发送与所述交互操作对应的交互信息。
  2. 根据权利要求1所述的方法,还包括:
    从所述车机设备接收更新图像信息,所述更新图像信息对应于一组更新的车载界面,所述一组更新的车载界面中的至少一个界面响应于所述交互信息而被更新;以及
    基于所述更新图像信息,呈现更新界面。
  3. 根据权利要求1所述的方法,其中基于所述图像信息呈现所述目标界面包括:
    基于所述图像信息,生成与所述一组车载界面对应的一组图像;
    基于所述一组图像,生成与所述一组车载界面对应的一组窗口;以及
    呈现包括所生成的所述一组窗口的所述目标界面。
  4. 根据权利要求1所述的方法,其中所述目标界面还包括与所述一组窗口对应的一组窗口标识,其中窗口标识用于指示与相应窗口对应的车载显示设备。
  5. 根据权利要求1所述的方法,还包括:
    接收与所述一组窗口相关联的所述交互操作;
    确定与所述交互操作所对应的位置信息,所述位置信息指示所述交互操作在所述终端设备上发生的位置;
    从所述一组窗口中确定与所述位置信息对应的目标窗口;以及
    基于所述位置信息和所述目标窗口在所述终端设备中的窗口显示区域,生成所述交互信息。
  6. 根据权利要求5所述的方法,其中所述位置信息为第一位置信息,并且基于所述第一位置信息和所述目标窗口在所述终端设备中的窗口显示区域生成所述交互信息包括:
    基于所述窗口显示区域,将所述第一位置信息变换为第二位置信息,其中所述第一位置信息指示在与所述终端设备相关联的第一坐标系中的第一组坐标,所述第二位置信息指示与所述目标窗口对应的第二坐标系中的第二组坐标;以及
    基于所述第二位置信息,生成所述交互信息,所述交互信息还包括用于指示与所述目标窗口对应的目标车载显示设备的设备信息。
  7. 根据权利要求1所述的方法,其中所述交互操作为第一交互操作,所述交互信息为第一交互信息,所述目标界面还包括与所述终端设备的本地应用相关联的界面元素,所述方法还包括:
    接收与所述界面元素相关联的第二交互操作,所述第二交互操作指示对与所述车机设备相关联的目标输入输出I/O设备的使用;以及
    向所述车机设备发送与所述第二交互操作对应的第二交互信息,以使所述目标I/O设备 被启动。
  8. 根据权利要求1所述的方法,还包括:
    向所述车机设备发送禁用请求,以使所述一组车载显示设备中的至少一个车载显示设备的屏幕交互功能被禁用。
  9. 根据权利要求1所述的方法,还包括:
    建立与所述车机设备的所述连接,使得与所述车机设备相关联的车载资源被虚拟化为所述终端设备的虚拟资源,所述车载资源包括所述一组车载显示设备。
  10. 根据权利要求9所述的方法,其中所述车载资源还包括与所述车机设备相关联的I/O设备,所述I/O设备包括以下至少一项:相机、麦克风或扬声器。
  11. 根据权利要求1所述的方法,其中所述一组车载显示设备包括以下至少一项:中控显示设备、副驾驶显示设备或后排显示设备。
  12. 一种界面交互方法,用于车机设备,所述车机设备与一组车载显示设备相关联,其特征在于,所述方法包括:
    所述车机设备基于与终端设备的连接向所述终端设备发送图像信息,所述图像信息基于由所述一组车载显示设备所呈现的一组车载界面而生成;
    所述车机设备从所述终端设备接收交互信息,所述交互信息是基于与所述终端设备的目标界面中的一组窗口相关联的交互操作而被生成,所述目标界面基于所述图像信息而被生成并且包括与所述一组车载界面对应的所述一组窗口;以及
    所述车机设备执行与所述交互信息对应的交互动作。
  13. 根据权利要求12所述的方法,还包括:
    基于所述交互动作的执行,生成更新图像信息,所述更新图像信息对应于一组更新的车载界面,所述一组更新界面中的至少一个车载界面响应于所述交互动作而被更新;以及
    向所述终端设备发送所述更新图像信息。
  14. 根据权利要求12所述的方法,还包括:
    基于所述交互信息中的设备信息,确定与所述交互信息对应的目标车载显示设备;以及
    基于所述目标车载显示设备和所述交互信息中的位置信息,确定与所述交互信息对应的所述交互动作。
  15. 根据权利要求12所述的方法,其中执行与所述交互信息对应的交互动作包括以下至少一项:
    向与所述车机设备连接的车辆组件发送对应的控制指令,所述控制指令基于所述交互信息而被生成;或者
    使所述一组车载界面中的至少一个车载界面响应于所述交互信息而被更新。
  16. 根据权利要求12所述的方法,其中所述交互操作为第一交互操作,所述交互信息为第一交互信息,所述目标界面还包括与所述终端设备的本地应用相关联的界面元素,所述方法还包括:
    从所述终端设备接收第二交互信息,所述第二交互信息是基于与所述界面元素相关联的第二交互操作而被生成,所述第二交互操作指示对与所述车机设备相关联的目标输入输出I/O设备的使用;以及
    启用所述目标I/O设备,以供所述终端设备使用。
  17. 根据权利要求12所述的方法,还包括:
    响应于从所述终端设备接收的禁用请求,使所述一组车载显示设备中的至少一个车载显示设备的屏幕交互功能被禁用。
  18. 根据权利要求12所述的方法,还包括:
    建立与所述终端的所述连接,使得与所述车机设备相关联的车载资源被虚拟化为所述终端设备的虚拟资源,所述车载资源包括所述一组车载显示设备。
  19. 根据权利要求18所述的方法,其中所述车载资源还包括与所述车机设备相关联的I/O设备,所述I/O设备包括以下至少一项:相机、麦克风或扬声器。
  20. 根据权利要求12所述的方法,其中所述一组车载显示设备包括以下至少一项:中控显示设备、副驾驶显示设备或后排显示设备。
  21. 一种终端设备,包括:处理器、以及存储有指令的存储器,所述指令在被所述处理器执行时使得所述终端设备执行根据权利要求1至11中任一项所述的方法。
  22. 一种车机设备,包括:处理器、以及存储有指令的存储器,所述指令在被所述处理器执行时使得所述车机设备执行根据权利要求12至20中任一项所述的方法。
  23. 一种计算机可读存储介质,所述计算机可读存储介质存储有指令,所述指令在被电子设备执行时使得所述电子设备执行根据权利要求1至20中任一项所述的方法。
  24. 一种计算机程序产品,所述计算机程序产品包括指令,所述指令在被电子设备执行时使得所述电子设备执行根据权利要求1至20中任一项所述的方法。
PCT/CN2022/127800 2021-11-03 2022-10-26 界面交互方法、电子设备、介质以及程序产品 WO2023078143A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202111295106.5A CN116069199A (zh) 2021-11-03 2021-11-03 界面交互方法、电子设备、介质以及程序产品
CN202111295106.5 2021-11-03

Publications (1)

Publication Number Publication Date
WO2023078143A1 true WO2023078143A1 (zh) 2023-05-11

Family

ID=86179159

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/127800 WO2023078143A1 (zh) 2021-11-03 2022-10-26 界面交互方法、电子设备、介质以及程序产品

Country Status (2)

Country Link
CN (1) CN116069199A (zh)
WO (1) WO2023078143A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117261925A (zh) * 2023-11-23 2023-12-22 润芯微科技(江苏)有限公司 一种基于单屏场景的副驾便捷智能交互系统及方法

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007253648A (ja) * 2006-03-20 2007-10-04 Toyota Motor Corp 入力支援システムおよびそのシステムを構成する車載端末装置
CN107747949A (zh) * 2017-09-28 2018-03-02 惠州Tcl移动通信有限公司 导航时车载终端画面投射的方法、移动终端及存储介质
CN108431753A (zh) * 2015-10-30 2018-08-21 法拉第未来公司 内容共享系统及其方法
CN112230840A (zh) * 2020-12-21 2021-01-15 智道网联科技(北京)有限公司 对车载电脑远程控制的方法、装置、电子设备及存储介质
CN113068153A (zh) * 2021-03-15 2021-07-02 深圳乐播科技有限公司 车载中控的遥控方法、装置、设备及存储介质


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117261925A (zh) * 2023-11-23 2023-12-22 润芯微科技(江苏)有限公司 一种基于单屏场景的副驾便捷智能交互系统及方法
CN117261925B (zh) * 2023-11-23 2024-01-23 润芯微科技(江苏)有限公司 一种基于单屏场景的副驾便捷智能交互系统及方法

Also Published As

Publication number Publication date
CN116069199A (zh) 2023-05-05


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22889168

Country of ref document: EP

Kind code of ref document: A1