WO2024152765A1 - Application management and control method, vehicle-mounted apparatus, vehicle-mounted device, vehicle and readable medium - Google Patents


Info

Publication number
WO2024152765A1
WO 2024152765 A1 (application PCT/CN2023/135801)
Authority
WO
WIPO (PCT)
Prior art keywords
application
interface
vehicle
input
user
Application number
PCT/CN2023/135801
Other languages
English (en)
Chinese (zh)
Inventor
胡晚成
崔威风
张亚男
Original Assignee
华为技术有限公司 (Huawei Technologies Co., Ltd.)
Application filed by 华为技术有限公司 (Huawei Technologies Co., Ltd.)
Publication of WO2024152765A1

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60R: VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R16/00: Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for
    • B60R16/02: Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for electric constitutive elements
    • B60R16/023: Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for electric constitutive elements for transmission of signals between vehicle parts or subsystems
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481: Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04817: Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, using icons
    • G06F3/0487: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F9/00: Arrangements for program control, e.g. control units
    • G06F9/06: Arrangements for program control using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44: Arrangements for executing specific programs
    • G06F9/451: Execution arrangements for user interfaces

Definitions

  • the embodiments of the present application relate to the field of intelligent automobile technology, and in particular to an application management and control method, a vehicle-mounted apparatus, a vehicle-mounted device, a vehicle, and a readable medium.
  • smart cockpits provide various applications for drivers and passengers, thereby providing people with a variety of services, such as navigation services, audio and video services, and multimedia.
  • the inventor found that, while the vehicle is being driven, there are situations where the interface display of an application is unreasonable, which creates safety hazards.
  • for example, the main driver operates a video application on the central control screen to play a video.
  • the playing video distracts the main driver, making it impossible for the main driver to concentrate on driving, and thereby affects driving safety.
  • the present application provides an application management and control method, a vehicle-mounted apparatus, a vehicle-mounted device, a vehicle, a readable medium and a program product to ensure driving safety.
  • in a first aspect, the present application provides an application management method, the method comprising: obtaining a first input, wherein the first input is used to indicate that a first interface of a first application is displayed on a display screen of a vehicle.
  • in response to obtaining the first input during vehicle driving, obtaining a first user who inputs the first input; determining, based on the first user, whether to manage the interface display of the first application; and, when the determination result is yes, managing the display of the first interface of the first application.
  • the above-mentioned application management and control method can be executed by an on-board device on the vehicle or by a cloud server, and the on-board device includes a vehicle computer, etc.
  • the application control method of the present application is illustrated by taking the vehicle computer as the execution body. The vehicle computer obtains the first input, and the first input indicates that the first interface of the first application is displayed on a display screen of the vehicle (such as the display screen of the vehicle computer).
  • in response to obtaining the first input during the driving of the vehicle, the vehicle computer first determines, based on driving safety, whether the interface display of the first application is reasonable. If reasonable, the first interface of the first application is displayed on the display screen in response to the first input. If unreasonable, the interface display of the first application is controlled. This avoids the unreasonable interface display that would result from directly responding to the first input, reduces the negative impact on the main driver, and ensures driving safety.
  • when judging whether the interface display of the first application is reasonable from the perspective of safe driving, the vehicle computer fully considers the role played in the smart cockpit by the first user who inputs the first input, and judges the rationality of displaying the interface of the first application in the smart cockpit during vehicle driving in a more intelligent and humane way.
  • This can not only urge the main driver to comply with the safe driving behavior norms stipulated in the Road Traffic Safety Law to ensure driving safety, but also meet the use requirements of users in the smart cockpit (such as people other than the main driver) for the application while ensuring driving safety, thereby improving the user experience.
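The overall flow described above (obtain the first input, identify the first user, decide, then control) can be sketched as follows. This is a minimal illustration: every name here is an assumption for the sketch, not an API defined by the patent.

```python
# Minimal sketch of the management flow; the callback-style parameters are
# illustrative assumptions, not interfaces from the patent.
def manage_application(first_input, vehicle_driving, identify_user,
                       should_control, control_display, show_interface):
    if not vehicle_driving:
        return show_interface(first_input)        # no restriction when parked
    first_user = identify_user(first_input)       # who made the input?
    if should_control(first_input, first_user):   # is the display unreasonable?
        return control_display(first_input)       # manage the interface display
    return show_interface(first_input)


# Example wiring: a video request from the main driver while driving is
# controlled; the same request while parked is simply displayed.
result = manage_application(
    "video",
    vehicle_driving=True,
    identify_user=lambda i: "main_driver",
    should_control=lambda i, u: u == "main_driver",
    control_display=lambda i: "controlled",
    show_interface=lambda i: "displayed",
)
print(result)  # controlled
```

The callbacks are placeholders for the concrete steps elaborated in the aspects below (user identification, condition checks, and control operations).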
  • determining whether to control the interface display of the first application according to the first user includes: when determining that a first condition is met, controlling the interface display of the first application, the first condition includes that the first user is in the main driving seat.
  • the first condition also includes: the first application belongs to a preset first list, or the first application does not belong to a preset second list, or the type of the first application belongs to a preset first type list, or the type of the first application does not belong to a preset second type list; the first list is used to record the applications to be controlled, the first type list is used to record the types of applications to be controlled, the second list is used to record the applications not to be controlled, and the second type list is used to record the types of applications not to be controlled.
  • control is performed through the corresponding list to avoid the first application (such as a video application) that affects safe driving from affecting the user in the main driving seat when the interface is displayed on the vehicle, thereby ensuring driving safety.
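The four alternative list checks in the first condition can be sketched as below. The list names mirror the description; the function shape and its defaults are assumptions for illustration.

```python
# Sketch of the first-condition list checks; only one list is consulted at a
# time, matching the "or" alternatives in the description.
def app_needs_control(app, app_type,
                      first_list=None, second_list=None,
                      first_type_list=None, second_type_list=None):
    if first_list is not None:
        return app in first_list                  # applications to be controlled
    if second_list is not None:
        return app not in second_list             # applications not to be controlled
    if first_type_list is not None:
        return app_type in first_type_list        # types to be controlled
    if second_type_list is not None:
        return app_type not in second_type_list   # types not to be controlled
    return False

print(app_needs_control("video_app", "video", first_list={"video_app"}))    # True
print(app_needs_control("nav_app", "navigation", second_list={"nav_app"}))  # False
```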
  • determining whether to control the interface display of the first application according to the first user includes: when determining that the second condition is met, controlling the interface display of the first application, the second condition includes that the first user is not in the main driving seat and the first display area of the first application is close to the main driving seat. Control is performed through the first display area of the first application to avoid the first display area of the first application being close to the main driving seat and distracting the main driver's attention, thereby ensuring driving safety and meeting the needs of the first user who is not in the main driving seat to use the application.
  • the first display area of the first application is close to the main driver's seat includes: the first display area is an area on the display screen close to the main driver's seat; or the vehicle includes N display screens, and the first display area is displayed on a display screen close to the main driver's seat among the N display screens, where N is an integer greater than or equal to 2. Control is performed based on the first display area being close to the main driver's seat to avoid distracting the main driver's attention due to the first display area being close to the main driver's seat, thereby ensuring driving safety and satisfying the need of the first user who is not in the main driver's seat to use the application.
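The "close to the main driver's seat" test used in the second condition can be sketched as follows. The left-hand-drive convention, the half-screen split, and the screen indices are assumptions for illustration.

```python
# Sketch of the proximity test: on a multi-screen vehicle the first display
# area is "close" if it sits on the screen nearest the main driver's seat;
# on a single screen, if it lies on the driver's half.
def area_near_main_driver(area_x, screen_width,
                          n_screens=1, screen_index=0, driver_screen_index=0,
                          driver_side="left"):
    if n_screens >= 2:
        return screen_index == driver_screen_index
    if driver_side == "left":
        return area_x < screen_width / 2
    return area_x >= screen_width / 2

print(area_near_main_driver(100, 1920))                             # True
print(area_near_main_driver(1500, 1920))                            # False
print(area_near_main_driver(0, 1920, n_screens=3, screen_index=1))  # False
```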
  • the method further includes: setting the first display area of the first application according to the first user, so that the set first display area is close to the first user. Based on the need of the first user to use the first application, the first display area for displaying the interface of the first application (such as the first interface) is brought close to the first user to improve the first user's experience of using the first application.
  • setting the first display area of the first application according to the first user includes: when the first input triggers the split screen, using the split screen window close to the first user as the first display area of the first application, thereby improving the first user's experience of using the first application.
  • for example, when the vehicle is using a navigation service, the first input triggers a split screen, thereby improving the first user's experience of using the first application while meeting driving needs.
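The split-screen placement rule (use the split-screen window close to the first user as the first display area) can be sketched as below; the window-to-seat mapping is a hypothetical representation.

```python
# Hypothetical placement rule: pick the split-screen window closest to the
# seat of the first user who made the input.
def pick_split_window(first_user_seat, window_to_seat):
    """window_to_seat: dict mapping window id -> the seat it is closest to."""
    for window, seat in window_to_seat.items():
        if seat == first_user_seat:
            return window
    return next(iter(window_to_seat))  # fallback: any window

windows = {"left_window": "main_driver", "right_window": "co_driver"}
print(pick_split_window("co_driver", windows))  # right_window
```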
  • controlling the display of the first interface of the first application includes one or more of the following operations: prohibiting the first application from running, prohibiting the display of the first interface, adding a mask to the first interface, outputting a safety instruction, outputting a safety guide, and setting the first display area of the first application so that the set first display area is away from the user in the main driving seat, so as to avoid distracting the main driver's attention, ensure that the main driver concentrates on driving, and ensure driving safety.
  • outputting the safety guidance includes: displaying a control corresponding to the safety guidance on a display screen, and the method further includes: obtaining a second input to the control; in response to the second input, exiting the first application, and/or, when the first display area of the first application is close to the user in the main driving seat, setting the first display area of the first application so that the set first display area is away from the user in the main driving seat.
  • the interface display of the first application is controlled according to the needs of the user.
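The control operations listed above can be represented as a simple dispatcher. The operation names paraphrase the description, and the state dictionary is an assumed stand-in for the application's display state.

```python
# Illustrative dispatcher over the listed control operations; names and the
# state representation are assumptions, not a defined API.
def apply_controls(operations, state):
    for op in operations:
        if op == "prohibit_running":
            state["running"] = False                  # prohibit the app from running
        elif op == "prohibit_display":
            state["visible"] = False                  # prohibit the first interface
        elif op == "add_mask":
            state["masked"] = True                    # add a mask to the interface
        elif op in ("safety_instruction", "safety_guide"):
            state.setdefault("messages", []).append(op)
        elif op == "move_area_away_from_driver":
            state["display_area"] = "away_from_main_driver"
    return state

state = apply_controls(["add_mask", "safety_guide"],
                       {"running": True, "visible": True, "masked": False})
print(state["masked"], state["messages"])  # True ['safety_guide']
```

One or more operations may be combined, matching the "one or more of the following operations" wording of the description.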
  • the method further includes: obtaining a third input, wherein the third input is used to instruct to swap the display area of the first interface on the display screen with the display area of the second interface on the display screen, so that the display area of the first interface is close to the user in the main driving seat, wherein the second interface is not the interface of the first application; when the judgment result is yes, in response to the third input, one or more of the following operations are performed: outputting a safety instruction, outputting a safety guide, and prohibiting the swap.
  • the interface display of the first application is continuously controlled to ensure driving safety.
  • the method further includes: obtaining a fourth input, wherein the fourth input is used to instruct the first application to play audio at a first volume value; when the first volume value is greater than or equal to a preset threshold, performing one or more of the following operations: outputting a safety instruction, outputting a safety guide, and prohibiting the first application from playing audio at the first volume value.
  • the audio of the first application is controlled to avoid excessive volume that distracts the driver's attention, thereby ensuring driving safety.
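The fourth-input volume check can be sketched as below; the threshold value of 60 is an assumed placeholder for the preset threshold.

```python
# Minimal sketch of the volume check: at or above the preset threshold the
# request is refused and a safety instruction is output.
PRESET_VOLUME_THRESHOLD = 60  # assumed placeholder value

def handle_volume_request(first_volume, threshold=PRESET_VOLUME_THRESHOLD):
    if first_volume >= threshold:
        return {"allowed": False, "actions": ["safety_instruction"]}
    return {"allowed": True, "actions": []}

print(handle_volume_request(80)["allowed"])  # False
print(handle_volume_request(30)["allowed"])  # True
```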
  • obtaining the first user who inputs the first input includes: obtaining a first time when the first input is input; obtaining first data in the smart cockpit of the vehicle, wherein the acquisition time of the first data is associated with the first time, and the first data includes one or more of the following data: voice data, image data, and sensor data; and obtaining the first user who inputs the first input according to the first data.
  • the first user can be identified based on multiple data sources to ensure the accuracy of the first user identification.
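The identification of the first user from time-associated cockpit data (voice, image, and sensor data) can be sketched as a nearest-in-time vote over seat observations. The voting scheme and the one-second window are assumptions for illustration.

```python
# Hypothetical fusion of time-stamped cockpit observations: the first user
# is taken to be the seat observed most often near the time of the input.
def identify_first_user(first_time, records, window=1.0):
    """records: list of (timestamp, seat) observations from voice/image/sensor
    data. Returns the seat observed most often within `window` seconds of the
    first input, or None if nothing matches."""
    votes = {}
    for timestamp, seat in records:
        if abs(timestamp - first_time) <= window:
            votes[seat] = votes.get(seat, 0) + 1
    return max(votes, key=votes.get) if votes else None

observations = [(10.2, "co_driver"), (10.4, "co_driver"),
                (10.9, "main_driver"), (15.0, "rear_left")]
print(identify_first_user(10.5, observations))  # co_driver
```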
  • in a second aspect, the present application provides a vehicle-mounted device, which includes: an input acquisition module, used to obtain a first input, wherein the first input is used to indicate that a first interface of a first application is displayed on a display screen of the vehicle; a first user acquisition module, used to obtain a first user who inputs the first input in response to obtaining the first input during driving of the vehicle; a judgment module, used to judge whether to control the interface display of the first application based on the first user; and a control module, used to control the display of the first interface of the first application when the judgment result is yes.
  • in a third aspect, the present application provides a vehicle-mounted device, including at least one processor and a memory, wherein the at least one processor is coupled to the memory and is used to read and execute instructions in the memory to execute any of the above application management methods.
  • in a fourth aspect, the present application provides a vehicle, comprising the above-mentioned vehicle-mounted device.
  • in a fifth aspect, the present application provides a computer-readable storage medium that stores program code; when the program code runs on a computer, the computer executes any of the above application management and control methods.
  • in a sixth aspect, the present application provides a computer program product, which, when executed on a computer, enables the computer to execute the application management and control method as described above.
  • FIG. 1 is a schematic diagram of an application scenario of the application management method disclosed in an embodiment of the present application.
  • FIG. 2 is a schematic diagram of an interface display of an existing smart cockpit in response to application startup.
  • FIG. 3 is a schematic diagram of the structure of a vehicle provided in an embodiment of the present application.
  • FIG. 4 is a schematic diagram of the deployment, in a smart cockpit, of the acquisition module provided in an embodiment of the present application.
  • FIG. 5 is a flow chart of an application management method provided in an embodiment of the present application.
  • FIG. 6 is a schematic diagram of obtaining a first user who inputs a first input, provided in an embodiment of the present application.
  • FIG. 7 is a schematic diagram of an interface for application processing provided in an embodiment of the present application.
  • FIG. 8 is a flow chart of another application management method provided in an embodiment of the present application.
  • FIGS. 9A to 9F are schematic diagrams of another interface for application processing provided in an embodiment of the present application.
  • FIG. 10 is a schematic diagram of an interface in response to a third input provided in an embodiment of the present application.
  • FIG. 11 is a schematic diagram of the structure of a vehicle-mounted device provided in an embodiment of the present application.
  • FIG. 12 is a schematic diagram of the structure of another vehicle-mounted device provided in an embodiment of the present application.
  • the term “plurality” refers to two or more.
  • the terms “first”, “second”, etc. are only used for the purpose of distinguishing descriptions, and cannot be understood as indicating or implying relative importance, nor can they be understood as indicating or implying an order.
  • words such as “exemplary” or “for example” are used to indicate examples, illustrations or descriptions. Any embodiment or design described as “exemplary” or “for example” in the embodiments of the present application should not be interpreted as being more preferred or more advantageous than other embodiments or designs. Specifically, the use of words such as “exemplary” or “for example” is intended to present related concepts in a specific way.
  • please refer to FIG. 1 for an exemplary introduction to the application scenario 100 of the present application.
  • the vehicle 10 is traveling on a road 30.
  • the smart cockpit 20 of the vehicle 10 is equipped with a display screen 21, and navigation information is displayed on the display screen 21.
  • the user in the smart cockpit 20 inputs an operation of playing a video to the display screen 21.
  • the vehicle computer in an existing smart cockpit responds to the operation of playing a video by displaying the video interface and the navigation interface on the display screen 21 in a split screen.
  • the presentation of the video interface will interfere with the main driver's acquisition of navigation information, and the picture and sound of the video playing will distract the main driver, thereby affecting driving safety.
  • please refer to FIG. 2 (b) together.
  • another vehicle computer in an existing smart cockpit responds to the operation of playing a video by displaying the video interface on the display screen 21 and displaying the navigation information as a card. Displaying the navigation interface as a card hinders the main driver from obtaining navigation information, and fully presenting the navigation interface requires an additional operation by the main driver, interfering with the main driver's driving.
  • in the above scenarios, the interface display of the application (such as the video playback interface) distracts the main driver and brings potential safety hazards, so the interface display of the application is unreasonable.
  • the present application provides an application management and control method, which is applicable to vehicle driving scenarios, reduces the impact on the main driver, avoids distraction of the main driver, and ensures driving safety.
  • the basic principle of the present application is as follows: during the driving of the vehicle, an application that is about to run or is already running on the vehicle is managed and controlled to ensure driving safety. Specifically, the vehicle computer obtains a first input, and the first input indicates that the first interface of the first application is displayed on the display screen of the vehicle. In response to obtaining the first input during the driving of the vehicle, the vehicle computer first determines, based on driving safety, whether the interface display of the first application is reasonable. If reasonable, the first interface of the first application is displayed on the display screen in response to the first input.
  • if unreasonable, the interface display of the first application is to be managed and controlled. This avoids the unreasonable display of the application interface that would result from directly responding to the first input, reduces the negative impact on the main driver, and ensures driving safety. Furthermore, when judging whether the interface display of the first application is reasonable from the perspective of safe driving, the vehicle computer fully considers the role played in the smart cockpit by the first user who inputs the first input, and judges the rationality of displaying the interface of the first application in the smart cockpit in a more intelligent and humane way.
  • FIG. 3 is a schematic diagram of the structure of a vehicle 10 provided in an embodiment of the present application.
  • the vehicle 10 includes an input module 101, an acquisition module 102, a processing module 103 and an output module 104.
  • the input module 101 is used to obtain a first input from a user in a smart cockpit of the vehicle 10 , wherein the first input is used to instruct to display a first interface of a first application on a display screen of the vehicle.
  • the application (application, app) involved in the embodiments of the present application refers to a software program that can realize one or more specific functions.
  • multiple applications can be installed in an electronic device (such as a vehicle).
  • for example, an application with a navigation function, an instant messaging application, etc.
  • the input module 101 includes but is not limited to: a touch screen, a voice acquisition device and a communication module.
  • the touch screen can be used to receive the user's touch input
  • the voice acquisition device can be used to receive the user's voice input
  • the communication module can be used to receive input from other devices, which are not vehicle-mounted devices.
  • the first input includes but is not limited to the above-mentioned touch input, voice input and input from other devices.
  • the other device may include a terminal, such as a mobile phone, a watch, an iPad, or a laptop computer.
  • the vehicle 10 and other devices have communication modules, and the vehicle 10 can receive input from other devices based on the communication between the communication modules.
  • the acquisition module 102 is used to collect relevant data (such as biometric data) of users in the smart cockpit of the vehicle 10.
  • the acquisition module 102 includes but is not limited to: a voice acquisition device, an image acquisition device, and a first sensor.
  • the voice acquisition device can be used to collect voice data of users in the smart cockpit.
  • the image acquisition device can be used to collect image data of users in the smart cockpit.
  • the first sensor can be used to collect sensor data of users in the smart cockpit.
  • the voice acquisition device includes but is not limited to a microphone
  • the image acquisition device includes but is not limited to a camera
  • the first sensor is used to detect whether there is a living body at its location.
  • the first sensor can be implemented as a pressure sensor, a photoelectric sensor, an infrared sensor, etc., and this application does not make specific restrictions on this.
  • the first sensor can be set close to the user in the smart cockpit, such as being set on a seat in the smart cockpit.
  • FIG. 4 illustrates an exemplary deployment of the acquisition module provided in the smart cockpit 20 according to an embodiment of the present application.
  • the smart cockpit 20 includes a display screen 21, a steering wheel 22, a main driver's seat 23 (a main driving position), a co-driver's seat 24 (not a main driving position) and the above acquisition module; the main driver's seat 23 is located in front of the steering wheel 22, and the position between the main driver's seat 23 and the co-driver's seat 24 is taken as the origin.
  • the first microphone 411 and the first camera 421 are set in the upper left corner of the smart cockpit 20, the second microphone 412 and the second camera 422 are set in the upper front of the smart cockpit 20, and the third microphone 413 and the third camera 423 are set in the upper right corner of the smart cockpit 20.
  • a first infrared sensor 431 is set on the main driver's seat 23, and a second infrared sensor 432 is set on the co-driver's seat 24.
  • the smart cockpit 20 provided in the present application may also include positions for rear passengers.
  • FIG. 4 shows one display screen 21 by way of example, which does not constitute a limitation on the present application.
  • the smart cockpit 20 provided in the present application may include N display screens 21, where N is an integer greater than or equal to 2.
  • the device type, quantity, and deployment position of the acquisition module in FIG4 are merely illustrative and do not constitute a limitation on the present application.
  • the device type, deployment position, and quantity of the acquisition module can be set according to actual conditions to ensure accurate collection of relevant data of users in the smart cockpit 20.
  • the processing module 103 is used to obtain the first input obtained by the input module 101 and/or the biometric data collected by the acquisition module 102, and then process the obtained first input and/or biometric data.
  • the processing module 103 may include one or more processors, for example, M processors, where M is an integer greater than or equal to 1.
  • the processor is a circuit having the ability to process signals.
  • the processor may be a circuit having the ability to read and execute instructions, such as a central processing unit (CPU), a microprocessor, a graphics processing unit (GPU) (which can be understood as a microprocessor), or a digital signal processor (DSP); in another implementation, the processor may implement certain functions through the logical relationship of a hardware circuit, and the logical relationship of the hardware circuit is fixed or reconfigurable, such as a hardware circuit implemented by an application-specific integrated circuit (ASIC) or a programmable logic device (PLD), such as a field programmable gate array (FPGA).
  • the process of the processor loading a configuration document to implement the hardware circuit configuration can be understood as the process of the processor loading instructions to implement the functions of some or all of the above units.
  • it can also be a hardware circuit designed for artificial intelligence, which can be understood as an ASIC, such as a neural network processing unit (NPU), a tensor processing unit (TPU), a deep learning processing unit (DPU), etc.
  • the processing module 103 can also include a memory, the memory is used to store instructions, and some or all of the M processors can call the instructions in the memory and execute the instructions to implement the corresponding functions.
  • the present application provides a safe driving service, which includes determining whether to control the interface display of a first application in response to obtaining a first input during vehicle driving, such as determining whether to control the interface display of the first application based on a first user who inputs the first input.
  • the processing module 103 can be used to provide the above-mentioned safe driving service.
  • the processing module 103 can be used to execute the application management method provided by the present application, and specifically refer to the embodiments shown in the following Figures 5 and 8.
  • the processing module 103 has an operating system (operating system, OS), which may be the Hongmeng operating system or another possible operating system.
  • the embodiment of the present application does not specifically limit it.
  • the Hongmeng vehicle OS is used as an example for exemplary description. Developers can develop, based on the system architecture of the operating system, a software program that implements the application control method (safe driving service) provided in the embodiments of the present application, so that the application control method can run on the operating system. That is, the processing module 103 or the processor can implement the application control method provided in the embodiments of the present application by running the software program in the operating system.
  • the processing module 103 may be installed with an operating system program, which includes a computer program that can implement the application management method provided in the embodiments of the present application, so that after the processor reads the operating system program and runs the operating system, the operating system may have the application management method provided in the embodiments of the present application.
  • the processing module 103 being installed with an operating system program can be realized as follows: the processor on the processing module 103 is installed with the operating system program. Alternatively, the processing module 103 also includes a memory, and data other than the computer program is stored in the memory. The other data may include data generated after the above-mentioned operating system or application is run, etc., and includes system data (such as configuration parameters of the operating system) and user data.
  • the memory generally includes internal memory and external memory.
  • the internal memory may be a random access memory (RAM), a read-only memory (ROM), and a cache (CACHE), etc.
  • the external memory may be a hard disk, an optical disk, a USB disk, a floppy disk, or a tape drive, etc.
  • the computer program is usually stored in the external memory, and the processor will load the computer program from the external memory to the internal memory before executing the processing.
  • the processing module 103 processes the first input and/or the biometric data, which includes preprocessing the first input and/or the biometric data.
  • the processing module 103 preprocesses the collected image data, including but not limited to bad pixel correction processing, time domain noise reduction processing, 3D noise reduction processing, linearization processing, and black level correction processing, and accordingly obtains a preprocessed image.
  • the processing module 103 may be implemented as a component on a vehicle or as an onboard device on a vehicle.
  • any device placed or installed on a vehicle can be considered as an on-board device.
  • On-board devices can include devices that are factory-installed on the vehicle by the vehicle manufacturer before the vehicle leaves the factory, as well as devices that are installed or placed in the vehicle by the user after the vehicle is sold.
  • for example, on-board devices include: a telematics box (T-BOX), a vehicle computer (for example, Huawei HiCar), a smart rearview mirror, a vehicle microphone, a vehicle speaker, an electronic control unit (ECU), etc.
  • the telematics box (T-BOX) is mainly used to communicate with the background system and with the application (APP) on another device (such as a mobile phone), so as to realize vehicle information display and vehicle control through the mobile phone APP.
  • the background will send a monitoring request instruction to the vehicle-mounted T-BOX.
  • after the vehicle obtains the control command, it sends a control message through the controller area network (CAN) bus to realize control of the vehicle.
  • the operation result is fed back to the user's mobile phone APP.
  • the background can also be called a server, a background server, etc., which can be used to remotely activate and start the vehicle and perform corresponding authentication.
  • the server can be a cloud server.
  • the car computer is a device used for information exchange between people and vehicles. It can be installed on the center console of the vehicle.
  • "car computer" is short for the in-vehicle infotainment product installed in the car.
  • the car computer can realize information communication between people and cars, and between cars and the outside world (cars).
  • the ECU (electronic control unit) is also known as the "driving computer", "on-board computer", etc. It should be understood that a vehicle may include multiple ECUs.
  • in some embodiments, a cloud server provides the above-mentioned safe driving service.
  • the cloud server communicates with the processing module 103, and the cloud server executes the application control method provided in this application.
  • the cloud server obtains the first input and/or biometric data received by the processing module 103, or, after the processing module 103 pre-processes the first input and/or biometric data, the cloud server obtains the pre-processed first input and/or biometric data from the processing module 103, and then executes the application control method provided in this application according to the pre-processed first input and/or biometric data.
  • the output module 104 is used to output the processing result output by the processing module 103 or the cloud server, which is the relevant processing result obtained after the execution subject (the processing module 103 or the cloud server) executes the application management method provided by this application.
  • the output module 104 includes but is not limited to a display device and an audio output device.
  • display devices fall into two categories: the first category is the vehicle display screen; the second category is the projection display screen, such as the head-up display (HUD).
  • the vehicle display screen is a physical display screen and an important part of the vehicle infotainment system.
  • the head-up display is also known as a head-up display system.
  • HUD includes, for example, a combined head-up display (C-HUD) system, a windshield head-up display (W-HUD) system, and an augmented reality head-up display (ARHUD) system.
  • audio output devices include smart speakers, etc.
  • a certain device on the vehicle 10 can be used to implement different functions.
  • a voice acquisition device can be used to obtain a first input, and it can also be used to collect biometric data of a user.
  • the vehicle 10 may also include more or fewer modules, which is not specifically limited in this application.
  • the vehicle 10 is connected to the cloud server through the communication module, and the first input obtained by the input module 101 and the biometric data collected by the collection module 102 are transmitted to the cloud server through the communication module for processing.
  • the execution subject of the application control method provided by the present application may be the above-mentioned processing module or cloud server.
  • the application control method provided by the present application may be jointly executed by multiple execution subjects. When executed by multiple execution subjects, for example, some steps are executed by the first execution subject, and some steps are executed by the second execution subject. The third execution subject obtains the results output by the first execution subject and/or the second execution subject, and executes another step accordingly.
  • the execution subject of the present application can be implemented by software and/or hardware, and can be integrated in various general-purpose computer devices, such as the above-mentioned vehicle-mounted device (vehicle computer), cloud server, etc.
  • FIG5 exemplarily introduces an application management method provided in an embodiment of the present application.
  • Step S501 the vehicle computer obtains a first input, wherein the first input is used to instruct to run a first application.
  • "running the first application", as indicated by the first input, includes: starting the first application so as to run it, or running a first application that has already been started.
  • the device that runs the first application is referred to as the first device, and the first device is an on-board device on a vehicle, such as a vehicle computer.
  • the first input is used to instruct the first device to start the first application to run the first application, or to instruct the first device to run the first application that has been started.
  • the first application may be an application installed on the vehicle, such as an original vehicle application provided by the car manufacturer, that is, the first application is installed on the first device.
  • the first application may also be an application installed on other devices (not the first device), and the other device communicates with the first device.
  • the first input instructs the first device to run the first application, including: the first input instructs the first device to start the first application installed on its operating system, or the first input instructs the first device to run the first application that has been started (such as the video application has been started, and the first input indicates to open TV series A).
  • that the first input instructs the first device to run the first application means: the first device provides the user with the application service from the application of the other device.
  • the car computer obtains the interface of the application from the other device, and the output device of the car computer (such as a display screen) outputs the interface.
  • the car computer receives application A from other devices, and application A is not started.
  • the vehicle computer can display the icon of application A.
  • the vehicle computer receives a first input, such as clicking on the icon of application A.
  • the vehicle computer transmits the first input to other devices, and the other devices start application A.
  • the vehicle computer then receives and displays the interface of application A after the other devices start application A.
  • the technical implementation of the application service provided by the first device to the user from the application program of other devices includes: the operating system of other devices and the operating system on the first device constitute a distributed operating system (distributed operating systems).
  • the distributed operating system is a part of the distributed software system (distributed software systems) and needs to be installed in the entire distributed system. It is mainly responsible for managing the resources of the distributed processing system and controlling the operation of distributed programs.
  • the basic functions of the distributed operating system include: providing communication means so that processes running on different computers (such as mobile phones and car computers) can exchange data through communication; providing the function of accessing the resources of other machines so that users can access the resources located on other machines; providing a certain programming language so that users can write distributed programs that can run in parallel on multiple nodes in the system; efficient control and network resource management, which is transparent to users.
  • the first device can provide users with application services from applications of other devices, which can also be achieved through screen projection technology, etc., and this application does not make specific restrictions on this.
  • the vehicle computer that executes the method of the present application may include the above-mentioned output device and/or input device.
  • the vehicle computer is in communication connection with the above-mentioned output device and/or input device, and the vehicle computer can obtain the first input through the above-mentioned input device.
  • the vehicle computer can output relevant information through the above-mentioned output device, including the information received from other devices and the processing results.
  • the vehicle computer can obtain the first input input by the first user through the above-mentioned input module.
  • the first input can be voice input, touch input, or input from other devices, etc.
  • a voice assistant is installed on the vehicle computer, and the first user inputs a voice "Please open the video application" to the voice assistant, and the sound collection device on the vehicle computer collects the voice input by the first user and transmits the collected voice to the vehicle computer.
  • the vehicle computer obtains the first input (i.e., voice input), and the voice input is used to instruct the vehicle computer to start the video application.
  • the mobile phone is connected to the vehicle computer in communication.
  • for example, the first user connects the mobile phone to the vehicle computer through Universal Serial Bus (USB) or Wireless Fidelity (Wi-Fi), starts the projection function through an in-vehicle application desktop on the mobile phone such as carplay or carlife (an application that supports mapping applications installed on other devices to the vehicle computer), and projects the in-vehicle services on the mobile phone (such as a video application) to the display screen of the vehicle computer for display, so that the user can use the video application on the display screen.
  • the user can project the started or unstarted application to the vehicle computer, and the vehicle computer receives the first input, which is used to instruct the display screen of the vehicle computer to display the interface of the video application.
  • the vehicle computer is connected to a display device (such as a display screen), and please refer to (a) in FIG. 7 .
  • the display screen 21 displays a main interface 71, and the main interface 71 includes an icon 72 of a map application and an icon 73 of a video application.
  • the first user touches the icon 73 of the video application to start the video application.
  • the display screen 21 detects the touch of the first user and transmits the detected touch to the vehicle computer.
  • the vehicle computer obtains the first input (touch input), and the touch input is used to indicate the start of the video application.
  • the first input is used to indicate the running of the first application program, specifically including: the first input is used to indicate the display of the first interface of the first application program on the display screen of the vehicle.
  • the display screen on the vehicle may be the above-mentioned vehicle display screen or projection display screen, etc., and the present application does not specifically limit the type of display screen and the number of display screens.
  • the first interface of the first application may be an interface for launching the first application, such as a startup page or a main interface of the first application.
  • the first interface may also be another interface of the first application after launching the first application, such as a playback interface of a video in a video application, or a playback interface of a video in a comprehensive application after launching the comprehensive application, wherein the comprehensive application is an application that can provide multiple application services (such as video services, news services, etc.), such as a headline application.
  • the first input indicates that the first interface of the first application is displayed on the display screen of the vehicle, including but not limited to the following situations:
  • Scenario 1 A video application icon is displayed on a display screen of a first device, and a user clicks the icon of the video application to start the video application. The display screen receives the user's click, which is the first input and is used to instruct the display screen to display the main interface of the video application.
  • Scenario 2 The display screen of the first device displays the main interface of the comprehensive application, which may include multiple videos.
  • the first user operates video a on the main interface of the comprehensive application, and the display screen obtains the user's operation on video a.
  • the operation on video a is the first input, which is used to instruct the display screen to display the playback interface of video a.
  • Scenario 3 After the user establishes a communication connection between the other device and the first device, the user projects a video application that is not started on the other device to the display screen of the first device, and the display screen displays the icon of the video application. The user operates the icon of the video application to start the video application. The operation performed on the icon is the first input, which is used to instruct the display screen to display the startup page or main interface of the video application.
  • Scenario 4 After the user establishes a communication connection between the other device and the first device, a video interface is being played on the other device.
  • the user performs a projection operation to project the video interface being played on the other device to the display screen of the first device.
  • the projection operation is the first input, which is used to indicate that the video interface from the other device is displayed on the display screen.
  • Scenario 5 After the user establishes a communication connection between other devices and the vehicle computer, the user projects the launched comprehensive application to the display screen of the vehicle computer, and the display screen displays the launched interface (such as the main interface) of the comprehensive application.
  • the user operates video A on the comprehensive application, and the operation on video A on the comprehensive application is the first input, which is used to instruct the display screen to display the playback interface of video A on the comprehensive application.
  • in other embodiments, the execution entity of the present application (such as a cloud server) is communicatively connected to the first device (vehicle computer), and the first device can transmit the received first input to the execution entity through the above-mentioned input module.
  • Step S502 In response to obtaining a first input during vehicle driving, the vehicle computer obtains a first user who inputs the first input.
  • the vehicle computer can determine whether the vehicle is in motion based on the vehicle using navigation services, vehicle computer startup, engine startup, vehicle gear parameters, vehicle geographic location data, vehicle motion data, etc.
  • the vehicle may also include a positioning module, and the vehicle computer may obtain geographic location data and motion data based on the positioning module (such as a global positioning system (GPS), assisted GPS (A-GPS), or a similar geographic positioning unit), and then determine whether the vehicle is moving based on the geographic location data and motion data.
  • the gear parameters of the vehicle when the vehicle is an automatic transmission vehicle, include: P gear, D gear, N gear, R gear, etc.
  • the gear parameters of the vehicle when the vehicle is a manual transmission vehicle, include: 1st gear, 2nd gear, 3rd gear, 4th gear, 5th gear, R gear, etc.
  • when the vehicle computer detects that the vehicle is in gear D, gear N, or gear R, it can be considered that the vehicle is in motion.
  • alternatively, when the vehicle is detected to be in gear D or gear R, it can be considered that the vehicle is in motion.
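The gear-based judgment above can be sketched as follows. This is a minimal illustration, not part of the disclosed implementation; the function name, gear labels, and transmission identifiers are assumptions for illustration only.

```python
# Illustrative sketch of the gear-based driving-state check described above.
# Gear labels and transmission names are assumptions, not from the application.
MOVING_GEARS_AUTOMATIC = {"D", "N", "R"}              # automatic transmission
MOVING_GEARS_MANUAL = {"1", "2", "3", "4", "5", "R"}  # manual transmission

def vehicle_in_motion(gear: str, transmission: str) -> bool:
    """Return True when the current gear suggests the vehicle is driving."""
    gears = MOVING_GEARS_AUTOMATIC if transmission == "automatic" else MOVING_GEARS_MANUAL
    return gear.upper() in gears

print(vehicle_in_motion("D", "automatic"))  # True
print(vehicle_in_motion("P", "automatic"))  # False
```

In practice this check would be combined with the other signals mentioned above (navigation use, geographic location data, motion data) rather than used alone.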
  • the vehicle computer determines a first user according to relevant information when the first input is input, wherein the first user may be a main driver, a co-driver, or a rear passenger.
  • the relevant information when the first input is input includes the relevant data collected by the above-mentioned collection module.
  • the acquisition module may be started when the vehicle is detected to be moving. In other embodiments, the acquisition module may be started when the first input is obtained. In other embodiments, the acquisition module may be started after the vehicle computer is started. As long as it can be ensured that the acquisition module can obtain relevant data in the smart cockpit in real time during the vehicle driving process, this application does not specifically limit the timing of starting the acquisition module.
  • the vehicle computer obtains the first time when the first input is input, then obtains the first data related to the first time from the acquisition module, and finally obtains the first user who inputs the first input according to the first data related to the first time.
  • "the first data related to the first time" means first data whose collection time is associated with the first time, for example, the collection time is the same as the first time, or the collection time covers the first time, that is, the collection time includes the first time and times adjacent to the first time, etc.
  • the first data may include one or more of the following data: voice data, image data, and sensor data.
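The time-window association described above can be sketched as follows. The function name, sample format, and window width are illustrative assumptions, not defined by the present application.

```python
# Illustrative sketch: select collected data whose timestamp is associated
# with the first time T1 (same time, or within a small window around it).
def data_related_to_first_time(samples, t1, window=0.5):
    """samples: list of (timestamp, payload); return payloads covering T1."""
    return [payload for (ts, payload) in samples if abs(ts - t1) <= window]

samples = [(9.8, "voice-a"), (10.0, "voice-b"), (12.0, "voice-c")]
print(data_related_to_first_time(samples, 10.0))  # ['voice-a', 'voice-b']
```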
  • the method of obtaining the first user who inputs the first input according to the first data in the present application includes but is not limited to the following:
  • Method 1 The voice collection device in the vehicle collects voice data and marks the timestamp when the voice data is collected.
  • the vehicle computer obtains the voice data associated with the collection time and the first time, and then locates the first user who outputs the voice data based on the sound source localization method of the microphone array in the prior art.
  • the sound source localization method based on the microphone array includes but is not limited to the following: controllable beamforming technology based on maximum output power, high-resolution spectrogram estimation technology, and sound source localization technology based on time-delay estimation (TDE).
  • for example, suppose the first input input by the user is a voice input.
  • the vehicle computer obtains the first input at time T1 (first time).
  • the vehicle computer obtains voice data with a timestamp of time adjacent to time T1, such as obtaining voice data at time T1 and after time T1.
  • the voice data is voice data collected by multiple sound collection devices. As shown in Figure 4, the signal strength of the voice data obtained by the three microphones at time T1 and after is combined to determine the location of the sound source.
  • according to the position of the sound source in the smart cockpit, the first user is determined to be the main driver (in the main driving seat), the co-driver (not in the main driving seat), or a rear passenger (not in the main driving seat). As shown in Figure 4, if the sound source is close to the first microphone 411, which is set close to the main driver, the first user is determined to be the main driver.
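A coarse version of the signal-strength comparison in Method 1 can be sketched as follows: the microphone receiving the strongest signal at the first time indicates the seat of the speaker. The microphone-to-seat mapping and identifiers are illustrative assumptions; a real implementation would use the array-based localization techniques named above (beamforming, TDE, etc.).

```python
# Illustrative sketch: pick the seat closest to the loudest microphone.
# Mapping of microphone ids to seats is an assumption for illustration.
MIC_SEAT = {"mic_411": "main driver", "mic_412": "co-driver", "mic_413": "rear passenger"}

def locate_first_user(strengths: dict) -> str:
    """strengths: microphone id -> signal strength at/after the first time."""
    loudest_mic = max(strengths, key=strengths.get)
    return MIC_SEAT[loudest_mic]

print(locate_first_user({"mic_411": 0.9, "mic_412": 0.4, "mic_413": 0.2}))  # main driver
```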
  • Method 2 The voiceprints of the main driver and/or the co-driver and/or the rear passengers can be pre-stored.
  • the vehicle computer receives voice input, the vehicle computer obtains the voiceprint corresponding to the voice input and matches it with the pre-stored voiceprint to determine the first user.
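Method 2 can be sketched as a similarity match between the voiceprint of the voice input and the pre-stored voiceprints. The embedding vectors, threshold, and names below are illustrative assumptions; the application does not specify a particular voiceprint representation.

```python
import math

# Illustrative sketch of Method 2: match the voiceprint of the voice input
# against pre-stored occupant voiceprints. Embeddings here are toy values.
STORED = {"main driver": [0.9, 0.1, 0.3], "co-driver": [0.1, 0.8, 0.4]}

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b)))

def match_voiceprint(query, threshold=0.8):
    """Return the best-matching occupant, or None if no match is close enough."""
    best = max(STORED, key=lambda who: cosine(query, STORED[who]))
    return best if cosine(query, STORED[best]) >= threshold else None

print(match_voiceprint([0.88, 0.12, 0.31]))  # main driver
```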
  • Method 3 The camera acquisition device in the vehicle acquires image data and marks the timestamp when the image data is acquired.
  • the vehicle computer acquires the image data associated with the acquisition time and the first time, and then performs image processing to determine the first user.
  • for example, the vehicle computer obtains the first input at time T1 (the first time), obtains the image data whose acquisition time is associated with the first time, and then determines the first user according to the image data. If it is determined from the image data that only the main driver is in the smart cockpit, the first user can be determined to be the main driver. If it is determined from the image data that both the main driver and the co-driver are in the smart cockpit, the actions of the main driver and the co-driver can be identified to determine the first user. For example, when the first input is a voice input, the lip-shape changes of the main driver and the co-driver at the first time are determined according to the image data, and the user whose lip shape changes is determined as the first user.
  • the user's hand movement is determined according to the image data, and the user who detects the hand operation of the display screen is determined as the first user.
  • for another example, when the first input is an input from another device, the user who is using the other device at the first time, as determined according to the image data, is determined as the first user.
  • the identity of the first user is determined according to his position in the smart cockpit, such as the main driver, the deputy driver or the rear passenger.
  • corresponding image features can be set in advance, such as image features indicating speaking, image features indicating operating the in-vehicle display screen, and image features indicating operating other smart terminals, etc., so that when the image data is subsequently processed and the above-mentioned image features are identified, the user corresponding to the image features can be determined as the first user.
  • Method 4 Obtain the sensor data collected by a first sensor, and determine, based on the sensor data, whether a living body is located at the position of the first sensor at the first time. For example, if it is determined that no living body is located at the co-driver position at the first time, it can be excluded that the first user is the co-driver.
  • Method 5 When the first input is input from another device, the first user can be determined according to the device identification of the other device. For example, when the vehicle computer receives input from a first terminal device, the vehicle computer records the user corresponding to the first terminal device as the vehicle owner, and the vehicle owner is assumed to be the main driver. Then, when input from the first terminal device is received, the first user can be determined to be the main driver.
  • the user can set his terminal device according to the actual situation, such as setting the terminal device of the car owner as the main driving device and setting the terminal device of other users as the co-pilot device.
  • the corresponding first user can be determined according to the terminal device.
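Method 5 can be sketched as a lookup from device identifier to seat. The registry below stands in for the user-configured device settings described above; the identifiers and names are illustrative assumptions.

```python
# Illustrative sketch of Method 5: resolve the first user from the device
# identifier of the other device that sent the input. The registry mirrors
# the user-configured "main driving device" / "co-pilot device" settings.
DEVICE_SEAT = {"phone-owner-001": "main driver", "phone-guest-002": "co-driver"}

def first_user_from_device(device_id: str) -> str:
    """Return the seat registered for this device, or 'unknown'."""
    return DEVICE_SEAT.get(device_id, "unknown")

print(first_user_from_device("phone-owner-001"))  # main driver
```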
  • one or more of the above-mentioned methods 1, 2, 3, 4, and 5 can be used in combination to obtain the first user. For example, taking the case where the first input of the user is a voice input: the vehicle computer obtains the first input at time T1 (the first time), obtains the voice data whose acquisition time is associated with the first time, and then determines the first user based on the signal strength of the voice data; at the same time, the vehicle computer combines the image data collected by the camera acquisition device, for example judging the first user who inputs the voice input from facial features and mouth shape.
  • the first user can be determined based on the preset stored priority.
  • the priority of image data is greater than that of sensor data
  • the priority of sensor data is greater than that of voice data, that is, when the above data conflict, the first user determined by the image data shall prevail, and when the first users determined by the sensor data and the voice data conflict, the first user determined by the sensor data shall prevail.
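The priority rule above (image data over sensor data over voice data) can be sketched as follows. The source names and dictionary format are illustrative assumptions.

```python
# Illustrative sketch of the priority rule: when determinations conflict,
# the highest-priority data source that produced a result prevails.
PRIORITY = ["image", "sensor", "voice"]  # highest priority first

def resolve_first_user(determinations: dict):
    """determinations: data-source name -> determined user (or None)."""
    for source in PRIORITY:
        user = determinations.get(source)
        if user is not None:
            return user
    return None

# Image data and voice data conflict: the image determination prevails.
print(resolve_first_user({"image": "main driver", "voice": "co-driver"}))  # main driver
```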
  • for example, if the co-driver uses the terminal device of the main driver to communicate with the vehicle computer, and it is determined, by combining one or more of the above-mentioned methods 1, 2, 3, and 4, that the co-driver inputs the first input, then the first user is determined to be the co-driver.
  • in some embodiments, when it is determined that the first user is a rear passenger, the first input may be determined to be an invalid input and the process is terminated. For example, when the voice input indicates starting a video application on the central control screen and the voice input comes from a rear passenger, the voice input may be determined to be an invalid input.
  • the vehicle computer may start a safe driving service in response to obtaining a first input during vehicle driving.
  • the safe driving service is implemented by obtaining a first user who inputs the first input.
  • Step S503 The vehicle computer determines the first attribute of the first application according to the first user, wherein the first attribute of the first application includes a danger attribute and a safety attribute.
  • specifically, the vehicle computer determines whether to control the interface display of the first application according to the first user, and outputs the judgment result. If the judgment result is yes, the first attribute of the first application is determined to be the danger attribute; if the judgment result is no, the first attribute of the first application is determined to be the safety attribute.
  • that is, when the vehicle computer determines, according to the first user, that the interface display of the first application is to be controlled, the first attribute of the first application is determined to be the danger attribute; when the vehicle computer determines, according to the first user, that the interface display of the first application does not need to be controlled, the first attribute of the first application is determined to be the safety attribute.
  • in other words, the vehicle computer judges whether the interface display of the application on the vehicle is reasonable. If it is reasonable, there is no need to control the interface display of the application, and the first attribute of the first application is determined to be the safety attribute; if it is unreasonable, the interface display of the application needs to be controlled, and the first attribute of the first application is determined to be the danger attribute.
  • the danger attribute indicates that when the above-mentioned first device (vehicle computer), in response to the first input, runs the first application or starts the first application to run it, a traffic accident is likely to be caused. That is, displaying the interface of the first application (including the first interface) on the display screen of the vehicle in response to the first input has a direct or indirect impact on the main driver, and the impact negatively affects the main driver's driving (such as stopping distance and lane control), which is likely to cause a traffic accident. For example, the vehicle computer playing a video projected onto its screen, or the vehicle computer starting and running a game application, will attract the driver's attention.
  • the safety attribute indicates that when the first device, in response to the first input, runs the first application or starts the first application to run it, a traffic accident is not likely to be caused. That is, displaying the interface of the first application (including the first interface) on the display screen of the vehicle in response to the first input does not directly or indirectly affect the main driver, or its impact on the main driver is small, so that the main driver's driving is not negatively affected, or the negative impact on driving is small.
  • the danger attribute indicates that there are safety risks when the first device runs the first application or starts the first application to run the first application, and displays the interface of the first application (including the first interface) on the display screen of the vehicle, such as distracting the main driver, resulting in driving that does not comply with the behavioral norms of safe driving.
  • the safety attribute indicates that the first device runs the first application or starts the first application to run the first application, and displays the interface of the first application (including the first interface) on the display screen of the vehicle, which will not distract the main driver's attention and ensure that the driving complies with the behavioral norms of safe driving.
• the first attribute of the same application may differ in different situations, and the control method for the application differs accordingly.
• when the first user who instructs running the first application differs, since different first users play different roles while the vehicle is driving, the determined first attribute may also differ, thereby ensuring driving safety while meeting the application-use needs of users in the smart cockpit (such as people other than the main driver) and improving the user experience.
  • the vehicle computer determines whether to control the interface display of the first application based on the first user, including: when it is determined that the first condition is met, the interface display of the first application is controlled, and the first condition includes that the first user is in the main driving seat.
• the first user being in the main driving seat means that the first user is the main driver.
• the main driver is the user who drives the vehicle (e.g., operates the steering wheel) in the smart cockpit.
  • the first condition further includes: the first application belongs to a preset first list, or the first application does not belong to a preset second list, or the type of the first application belongs to a preset first type list, or the type of the first application does not belong to a preset second type list.
  • the first list is used to record preset applications to be controlled
  • the first type list is used to record the types of preset applications to be controlled
  • the second list is used to record preset applications not to be controlled
  • the second type list is used to record the types of preset applications not to be controlled.
  • the application to be controlled or the type of application to be controlled can be pre-set according to the function of the application or the actual situation.
  • the pre-set types of applications to be controlled may include games, social, audio and video, etc. Other types may also be included.
• game applications provide online game services for users; social applications provide social-networking functions, such as dating software.
• social applications also include applications that disseminate information to users, such as news applications; audio and video applications provide audio and video services, such as Huawei Video.
  • the remaining applications that are not to be controlled can be added to the second list, or the remaining types of applications that are not to be controlled can be added to the second type list.
  • the types of applications that are not in any of the game, social, and audio and video categories can be added to the second type list.
  • the first list is used to store the pre-set application identifiers (generally referred to as package names) of the applications to be controlled.
  • the second list is used to store the pre-set application identifiers of applications that are not to be controlled.
  • the first type list is used to store the pre-set types of applications to be controlled, such as: games, social, and audio and video.
• the second type list is used to store the types of applications that are not to be controlled, such as navigation, music, etc. That is, the first list and the first type list act as control lists (blacklists), and the second list and the second type list act as exemption lists (whitelists).
  • the first list may be implemented to store the types to be managed and the identifiers of the applications corresponding to the types, such as storing the application identifiers of applications belonging to the game category, storing the application identifiers of applications belonging to the social category, and storing the application identifiers of applications belonging to the audio and video category.
• the first list is, for example, as shown in Table 1:

Table 1
Game: application identifier 1, ..., application identifier M
Social: application identifier 1, ..., application identifier J
Audio and video: application identifier 1, ..., application identifier K

• M, J, and K are all integers greater than or equal to 2, and the number of application identifiers stored in the preset list is not specifically limited.
• that is, the types of applications to be controlled are maintained first, and the identifiers of the applications to be controlled are then stored under the corresponding types.
  • the first application belongs to the preset first list means that the application identifiers stored in the first list include the application identifier of the first application.
  • the first application does not belong to the preset second list means that the application identifiers stored in the second list do not include the application identifier of the first application.
  • the type of the first application belongs to the preset first type list means that the types stored in the first type list include the types corresponding to the first application.
  • the type of the first application does not belong to the preset second type list means that the types stored in the second type list do not include the types corresponding to the first application.
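The membership checks described in the four bullets above can be sketched in plain Java. The class name, method names, and example package names below are assumptions for illustration, not from the patent.

```java
import java.util.Set;

// Illustrative sketch of the first-list / second-list membership checks.
class ControlListChecker {
    // First list: application identifiers (package names) of applications to be controlled.
    static final Set<String> FIRST_LIST = Set.of("com.example.game", "com.huawei.himovie");
    // Second list: application identifiers of applications not to be controlled.
    static final Set<String> SECOND_LIST = Set.of("com.example.maps");

    // "The first application belongs to the preset first list" means the
    // identifiers stored in the first list include the app's identifier.
    static boolean belongsToFirstList(String packageName) {
        return FIRST_LIST.contains(packageName);
    }

    // "The first application does not belong to the preset second list" means
    // the identifiers stored in the second list do not include the identifier.
    static boolean notInSecondList(String packageName) {
        return !SECOND_LIST.contains(packageName);
    }
}
```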
• the first application belonging to the first list includes: the first list includes the application identifier of the first application, or the types stored in the first list include the type corresponding to the first application, or the types stored in the first list include the type of the interface of the first application. For example, if the types stored in Table 1 include game, social, and audio and video, then when the first interface of the first application is a game interface, a social interface (such as a chat interface), or an audio and video interface (such as a video playback interface), the first application belongs to the first list.
  • the application identifier recorded in the first list includes the application identifier of Huawei Video
  • the types recorded in the first type list include games, social networking, and audio and video.
• the vehicle computer obtains the type of Huawei Video, determines that the type of Huawei Video is audio and video, and then determines that the interface display of Huawei Video needs to be controlled.
• the list maintained by the vehicle computer includes application types and the application identifiers corresponding to each type (such as Table 1). Taking the Toutiao application as the first application as an example, the type of the Toutiao application is comprehensive.
• if the first interface indicated by the first input is the main interface of the Toutiao application, it is determined that the first application does not belong to Table 1, and there is no need to control the interface display of the Toutiao application.
• if the first interface indicated by the first input is a video interface in the Toutiao application, and the audio and video category in Table 1 includes the type of the video interface, it is determined that the first application belongs to Table 1, and the interface display of the Toutiao application needs to be controlled.
• there is no limitation on the method of obtaining the type of an application. For example, the application name can be identified, a keyword extracted from it, and the corresponding application type determined from the keyword; for instance, from the keyword "video" in the application name "Huawei Video", the type can be determined to be audio and video.
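The keyword-based inference just described can be sketched as follows; the keyword table and return values are assumptions for the sketch, since the patent does not fix them.

```java
// Illustrative keyword-based type inference: the keyword "video" in the
// application name "Huawei Video" maps to the audio-and-video type.
class AppTypeClassifier {
    static String inferType(String appName) {
        String name = appName.toLowerCase();
        if (name.contains("video")) return "audio_video"; // e.g. "Huawei Video"
        if (name.contains("game"))  return "game";
        if (name.contains("chat") || name.contains("social")) return "social";
        return "unknown"; // type could not be inferred from the name alone
    }
}
```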
  • the above list may be pre-stored in a corresponding memory or database, etc.
  • the above list may be updated, such as maintaining the above list in a cloud server, and when the above list is updated, the updated list is transmitted to the vehicle computer.
  • the user's input operation may be received, and the above list may be updated according to the input operation.
  • the user's customized operation may also be received, and the application identifier of the application corresponding to the customized operation may be stored in the above first list or the second list.
• when the first user is not in the main driving seat, it can be directly determined that there is no need to control the interface display of the first application. In this case, the first user may be a user in the co-pilot seat of the vehicle (the co-driver) or a rear passenger.
• when there is no need to control the interface display of the first application, the first device, in response to the first input, starts or runs the first application, and outputs the result of starting or running the first application through an output device, including outputting sound or presenting an interface, such as displaying the first interface of the first application on a display screen.
  • determining whether to control the interface display of the first application based on the first user includes: controlling the interface display of the first application when it is determined that a second condition is met, and the second condition includes that the first user is not in the main driving seat and the first display area of the first application is close to the main driving seat.
  • the first display area is an area on the display screen of the vehicle for displaying the interface of the first application (such as the first interface), and the interface of the first application can be an interface when the first application is started, or an interface when the first application is running. If the first application is a video application, the first interface can be the main interface or video interface of the video application.
  • the first display area of the first application may be set by default on the vehicle computer.
  • the vehicle computer sets the first display area of the application by default on the left side of the central control screen, or by default on the right side of the central control screen, etc.
  • the vehicle computer sets the first display area of the first interface by default on the display screen near the co-driver's seat.
  • the vehicle computer obtains the first display area of the first application program, including obtaining the display screen displayed by the first display area and the position area of the first display area on the display screen.
  • the first display area of the first application program is on the left side of the central control screen.
• when the first display area is close to the main driver's seat, it is determined that the interface display of the first application needs to be controlled.
  • the first display area being close to the main driver's seat includes one or more of the following situations: the first display area is an area on the vehicle's display screen (such as the central control screen) close to the main driver's seat. Or, the vehicle includes N display screens, and the first display area is displayed on the display screen close to the main driver's seat among the N display screens, where N is an integer greater than or equal to 2.
  • the positional relationship between each display area on the vehicle and the user in the smart cockpit can be pre-stored, and then whether the first display area is close to the main driver's seat can be determined based on the pre-stored positional relationship.
  • the display area close to the main driver's seat on the vehicle can be pre-stored.
• the positional relationship includes: the central control screen is set between the main driver's seat and the co-driver's seat, the main driver's seat is on the left side of the central control screen, and the co-driver's seat is on the right side. In this case, the left area of the central control screen can be stored as being close to the main driver's seat.
  • the vehicle includes a first display screen and a second display screen, and the positional relationship includes: the first display screen is located in front of the main driver's seat, and the second display screen is located in front of the co-driver's seat.
• in this case, the first display screen can be stored as being close to the main driver's seat.
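The pre-stored positional relationship can be sketched as a simple lookup. The enum constants follow the example layouts above (main driver on the left of the central control screen, first display screen in front of the main driving seat), but all names are illustrative.

```java
import java.util.EnumSet;
import java.util.Set;

// Sketch of the pre-stored positional relationship between display areas
// and the main driving seat.
class SeatLayout {
    enum Area { CENTER_SCREEN_LEFT, CENTER_SCREEN_RIGHT, FIRST_SCREEN, SECOND_SCREEN }

    // Areas stored as being close to the main driver's seat, per the examples.
    static final Set<Area> NEAR_MAIN_DRIVER =
            EnumSet.of(Area.CENTER_SCREEN_LEFT, Area.FIRST_SCREEN);

    static boolean closeToMainDriver(Area firstDisplayArea) {
        return NEAR_MAIN_DRIVER.contains(firstDisplayArea);
    }
}
```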
  • the role played by the first user in the smart cockpit is fully considered to achieve a more intelligent and humane judgment of the rationality of the startup or operation of the first application, to ensure that the first attribute of the first application is reasonably and accurately determined, and then corresponding control can be performed according to the first attribute to ensure driving safety.
• when the first user is in the main driving seat, it is necessary to control the interface display of the first application.
  • the first attribute of the first application is determined according to the above list to avoid the user in the main driving seat from using the application that needs to be controlled during driving, and to urge the user in the main driving seat to comply with the behavioral norms of the main driver stipulated in the Road Traffic Safety Law to ensure driving safety.
• when the first user is not in the main driving seat, the first attribute can be directly determined as a safety attribute, and the vehicle computer can directly provide the first user with the service corresponding to the first application to meet the service needs of the non-main-driving user in the smart cockpit.
  • the vehicle computer determines the first attribute of the first application based on the first display area to prevent the startup or operation of the first application from affecting the main driving, so as to provide the service corresponding to the first application while ensuring driving safety.
  • the safe driving service is implemented to determine the first attribute of the first application (i.e., to determine whether to control the interface display of the application). After the vehicle computer starts the safe driving service, the safe driving service can determine the first attribute by referring to the contents of the above-mentioned first and second conditions.
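The first and second conditions above can be summarized as a single decision, sketched here in plain Java; the method and parameter names are illustrative, not from the patent.

```java
// Sketch combining the first and second conditions into one decision on
// whether the interface display must be controlled (i.e., whether the
// first attribute is the dangerous attribute).
class SafeDrivingDecision {
    static boolean needsControl(boolean firstUserInMainSeat,
                                boolean appToBeControlled,
                                boolean displayAreaNearMainSeat) {
        if (firstUserInMainSeat) {
            // First condition: the main driver requests an application that
            // belongs to the first list (or whose type is in the first type list).
            return appToBeControlled;
        }
        // Second condition: another occupant requests the application, but
        // its first display area is close to the main driving seat.
        return displayAreaNearMainSeat;
    }
}
```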
  • Step S504 When the first attribute is a dangerous attribute, the vehicle computer controls the first application.
• the vehicle computer controls the interface display of the first application. That is, in response to obtaining the first input during vehicle driving, when it is determined from the first user that the interface display of the first application needs to be controlled, the display of the first interface of the first application is controlled to reduce or avoid direct or indirect impact on the main driver, and to reduce or avoid a negative impact on the main driver's driving.
  • controlling the display of the first interface of the first application includes one or more of the following operations: prohibiting the running of the first application, prohibiting the display of the first interface, processing the first interface to reduce its appeal to users (such as adding a mask to the first interface), outputting safety prompts, outputting safety guidance, and setting the first display area of the first application so that the set first display area is away from the user in the main driving seat, thereby achieving the presentation of the first application away from the user in the main driving seat.
  • the interface of the first application is not displayed on the display screen (i.e., the first application exits), and/or the first interface is not displayed, and/or the first interface and the mask on the first interface are displayed, and/or the display of safety instructions is controlled, and/or the controls corresponding to the safety instructions are displayed, and/or the first interface is displayed on the set first display area.
  • the vehicle computer obtains a first input during vehicle driving, and the first input indicates starting a video application.
• the vehicle computer suspends the running of the video application, adds a mask 74 to the interface (first interface) of the video application, displays a safety instruction 75 "affects safe driving, not recommended to start" on the mask 74, and displays a control 76 "Exit" corresponding to the safety instruction.
• in response to an input on the control 76, the mask, the safety prompt, and the video application are exited, and the display screen can display the main interface 71.
  • the vehicle computer executes steps S501 and S502, and the vehicle computer transmits the obtained first input and the determined first user to the cloud server.
  • the cloud server can refer to the relevant content of the above step S503 to determine the first attribute (determine whether to control the interface display of the application).
  • the cloud server can refer to the content of the above step S504 to control the interface display of the first application.
  • the processing results that can be output after the cloud server controls may include one or more of the following: a first instruction, a second instruction, a third instruction, a fourth instruction, a fifth instruction, and a sixth instruction, wherein the first instruction is used to instruct to suspend the operation of the first application, the second instruction is used to instruct to add a mask on the first interface, the third instruction is used to instruct to output a safety instruction, the fourth instruction is used to instruct to output a safety guide, the fifth instruction is used to instruct to set the first display area as a display area away from the main driving seat, and the sixth instruction is used to instruct to prohibit the display of the first interface.
• in response to the first instruction, the vehicle computer suspends the operation of the first application and controls the output device to suspend outputting the interface or audio of the first application.
• in response to the second instruction, the vehicle computer adds a mask on the first interface and controls the display screen to display the first interface and the mask.
• in response to the third instruction, the vehicle computer controls the display screen to output a safety instruction.
• in response to the fourth instruction, the vehicle computer controls the display screen to output a control corresponding to the safety guidance.
• in response to the fifth instruction, the vehicle computer sets the first display area away from the main driver's seat.
• in response to the sixth instruction, the vehicle computer prohibits displaying the first interface.
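The vehicle computer's handling of the six cloud-server instructions can be sketched as a dispatch; the numeric codes and action strings below are assumptions for the sketch.

```java
// Illustrative dispatch of the six instructions the cloud server may output.
class InstructionDispatcher {
    static String dispatch(int instruction) {
        switch (instruction) {
            case 1: return "suspend the operation of the first application";
            case 2: return "add a mask on the first interface";
            case 3: return "output a safety instruction";
            case 4: return "output a safety guide";
            case 5: return "set the first display area away from the main driving seat";
            case 6: return "prohibit the display of the first interface";
            default: return "no operation"; // unknown instruction code
        }
    }
}
```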
  • the vehicle computer executes steps S501 and S502, the vehicle computer transmits the obtained first input and the determined first user to the cloud server, the cloud server executes step S503, the cloud server sends the first attribute of the first application to the vehicle computer, and the vehicle computer executes step S504.
• after the vehicle computer obtains, in step S502, the first user who inputs the first input, the vehicle computer sets the first display area of the first application according to the first user and the proximity display principle, so that the set first display area is close to the first user.
  • the proximity display principle is used to set the first display area of the first application program close to the first user who inputs the first input.
  • the vehicle computer can set the first display area in the area of the central control screen close to the co-driver according to the pre-stored position relationship.
  • the vehicle computer sets the first display area of the first application according to the first user and the proximity display principle, so that the set first display area is close to the first user.
  • the safe driving service is implemented to control the first application when the first attribute is a dangerous attribute.
  • the safe driving service can refer to the above step S504 content to control the display of the first interface of the first application.
• rather than directly providing the corresponding application service in response to an application service request during vehicle driving, this application first determines the first attribute of the first application and then performs corresponding processing according to that attribute. If the first attribute is a dangerous attribute, the first application is handled safely, that is, the display of the first interface is controlled to reduce or avoid direct or indirect impact on the main driver, thereby ensuring driving safety. If the first attribute is a safety attribute, the application service corresponding to the first application can be provided. Furthermore, the first attribute is determined according to the first user who inputs the first input, so that services can be provided to the corresponding users in a more intelligent and humane way: if the user is in the main driving seat, driving safety is given priority; if the user is not in the main driving seat, the corresponding application service is provided without affecting driving safety.
• please refer to FIG. 8 for an exemplary introduction to another application management method provided in an embodiment of the present application.
  • Step S101 the vehicle computer obtains a first input, where the first input is used to instruct to run a first application.
  • step S101 may refer to the above step S501 and will not be described in detail here.
  • Step S102 In response to obtaining a first input during vehicle driving, the vehicle computer obtains a first user who inputs the first input.
  • step S102 may refer to the above step S502 and will not be described in detail here.
  • Step S103 When the first input triggers the split screen, the vehicle computer sets the first display area of the first application according to the first user and the proximity display principle, so that the first display area is close to the first user.
• the situation in which the first input triggers the split screen includes, but is not limited to, the following: interface A is being displayed on the display screen, and interface A is not an interface of the first application (for example, it is an interface of another application); if the first interface of the first application indicated by the first input will also be displayed on the display screen, then the first input triggers the split screen.
  • the vehicle computer triggers the split screen in response to obtaining the first input when the vehicle uses the navigation service.
  • the vehicle's use of the navigation service includes displaying a navigation interface on the display screen of the vehicle computer, or running an application that provides the navigation service in the background of the vehicle computer, displaying other interfaces on the display screen (central control screen) of the vehicle computer, and the other interfaces may be interfaces or main interfaces of other applications, etc.
  • the audio output device of the vehicle computer broadcasts the navigation audio
  • the first input indicates that the interface of the first application is also presented on the central control screen, triggering the split screen.
• when the output device of the vehicle outputs navigation information, such as broadcasting navigation voice and/or presenting a navigation interface, it is determined that the vehicle is using the navigation service. For example, if the navigation interface is being presented on the display screen and/or navigation voice is broadcast through the audio output device, it is determined that the vehicle is using the navigation service.
  • the vehicle computer can determine whether the running application is in the navigation state, and if so, the vehicle uses the navigation service. Exemplarily, the vehicle computer communicates with the application through an interface, and determines whether the application is in the navigation state through the IsNavigating() function.
  • a map application is running on the vehicle computer, and a navigation interface is displayed on the display screen.
• the first user clicks the icon of the first application on the display screen, and the vehicle computer calls startActivity() to start the first application. Whether the map application running on the vehicle computer is in the navigation state is determined by calling the IsNavigating() function; if not, the first application is started in full-screen mode, and if so, the first application is started in split-screen mode.
  • launching the first application in full-screen mode means that the first interface of the first application is displayed in full screen on the display screen.
  • Split-screen mode means that the display screen is split and two split-screen windows are presented, such as two left and right split-screen windows.
  • Launching the first application in split-screen mode means that the first interface of the first application is split and displayed in a split-screen window on the display screen, and the other split-screen window on the display screen is used to present the interface displayed on the display screen before the split-screen (i.e., before receiving the first input).
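The launch-mode decision described above can be sketched as follows. IsNavigating() follows the name used in the text; MapApp is a stand-in interface, since the actual vehicle-computer API is not shown in the patent.

```java
// Sketch of the launch decision: if the running map application reports it is
// navigating, launch the first application in split-screen mode, otherwise
// in full-screen mode.
class LaunchModeChooser {
    interface MapApp { boolean IsNavigating(); }

    static String chooseLaunchMode(MapApp map) {
        return map.IsNavigating() ? "split_screen" : "full_screen";
    }
}
```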
  • the first display area of the first application is set according to the first user and the principle of nearest display. For example, when the central control screen is displaying the navigation interface, the first input of the first user is received, and the first input triggers the split screen.
  • the area on the central control screen close to the first user is the left side of the central control screen, then the central control screen is divided into two split-screen windows on the left and right, and the interface of the first application is displayed in the split-screen window close to the first user, that is, the first display area position of the first application is set to the left side of the central control screen (the split-screen window close to the first user), such as HW_MULTI_WINDOWING_MODE_LEFT.
  • the navigation interface is displayed in another split-screen window away from the first user, and the window position of the navigation interface of the map application is set to the right side of the central control screen, such as HW_MULTI_WINDOWING_MODE_RIGHT.
• the first application and the map application enter split-screen guidance; ActivityManagerEx.setStartToSplit() is used to set whether an application supports split screen, isStartToSplit is used to determine whether the application is in the split-screen state, and MainActivity's onMultiWindowModeChanged(boolean isInMultiWindowMode) is used for monitoring.
• an example of determining whether the application is in the split-screen state through isStartToSplit is as follows: public boolean isStartToSplit() { return mIsStartToSplit; }
• if the return result mIsStartToSplit is true, the screen is in the split-screen state; if the return result is false, the screen is not in the split-screen state.
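The flag behind isStartToSplit() can be sketched as a minimal plain-Java state holder, with a hook mirroring onMultiWindowModeChanged(boolean); this is a self-contained stand-in, not the actual Android/HarmonyOS classes.

```java
// Minimal sketch of the split-screen state queried by isStartToSplit().
class SplitScreenState {
    private boolean mIsStartToSplit;

    // Called when the multi-window (split-screen) mode changes.
    void onMultiWindowModeChanged(boolean isInMultiWindowMode) {
        mIsStartToSplit = isInMultiWindowMode;
    }

    // Returns true while the screen is in the split-screen state.
    public boolean isStartToSplit() {
        return mIsStartToSplit;
    }
}
```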
  • the specific implementation of launching the first application in full screen mode can refer to the prior art and will not be repeated here.
  • the specific implementation of launching the first application in split screen mode can refer to the prior art and will not be repeated here.
  • Step S104 The vehicle computer determines a first attribute of the first application.
• in the case where the first input triggers the split screen and the first display area of the first application is close to the first user, the first attribute can be determined by referring to the above first condition or second condition.
  • the first interface is set to be displayed on the left side of the central control screen
  • the navigation interface is set to be displayed on the right side of the central control screen.
  • the vehicle computer determines the first attribute of the first application according to the first user.
• if the first user is in the main driving seat, the vehicle computer can refer to the content of the above first condition to determine the first attribute; if the first user is not in the main driving seat, the vehicle computer can refer to the content of the above second condition to determine the first attribute. Neither is repeated here.
  • Step S105 When the first attribute is a dangerous attribute, the vehicle computer controls the first application.
  • Step S105 may refer to the content of the above-mentioned step S504, which will not be repeated here.
  • the display screen 21 is used as the central control screen
  • the first user is the main driver
  • the vehicle computer pre-stores the position relationship including the main driver being located on the left side of the central control screen.
  • the navigation interface 111 and the main interface icon 112 are displayed on the central control screen.
• the main driver clicks the main interface icon 112, and the main interface 71 shown in FIG. 9B is displayed.
  • the icon 73 of the video application is displayed on the main interface 71.
  • the central control screen detects the first input (clicking the icon of the video application) and transmits the first input to the vehicle computer.
  • the vehicle computer triggers the split screen in response to obtaining the first input when the vehicle uses the navigation service.
  • the vehicle computer sets the first display area according to the principle of the main driver and the nearest display, and sets the first display area to the left side of the central control screen based on the position relationship between the main driver and the central control screen, so that the first display area is close to the main driver.
  • the vehicle computer determines that the first attribute of the first application is a dangerous attribute by referring to the content of the first condition or the second condition above, then the vehicle computer suspends the operation of the first application, sets a mask on the first interface, and outputs safety prompts and safety guidance.
  • the right area of the central control screen displays the interface before the split screen (main interface 71), and the first display area is located in the left area of the central control screen.
  • the first interface and the mask 113 are displayed on the first display area.
  • the safety prompt 114 "Affects safe driving, not recommended to start” and the control 115 "Exit” and control 116 "Interface swap” corresponding to the safety guidance are displayed on the mask 113.
  • the vehicle computer responds to the user clicking on the interface swap control 116 by swapping the display areas of the first interface and the main interface 71.
  • the main interface 71 is displayed on the left area of the central control screen, and the first display area is located in the right area of the central control screen.
  • the first interface 117 is displayed on the first display area, and the first interface 117 is unmasked.
• the navigation interface 111 is displayed on the central control screen, and the vehicle computer receives a first input, the first input being the main driver casting the video interface of a mobile phone to the central control screen of the vehicle computer.
  • the vehicle computer triggers the split screen in response to obtaining the first input when the vehicle uses the navigation service.
  • the vehicle computer sets the first display area to be located on the left side of the central control screen according to the positional relationship between the main driver and the central control screen, so that the first display area is close to the main driver.
  • the vehicle computer determines, with reference to the content of the above first condition or second condition, that the first attribute of the first application is a dangerous attribute; the vehicle computer then suspends the operation of the first application, sets a mask on the first interface, and outputs a safety prompt and safety guidance.
  • the right area of the central control screen displays the interface before the split screen (navigation interface 111).
  • the first display area is located in the left area of the central control screen.
  • the first interface and the mask 113 are displayed on the first display area.
  • the safety prompt 114 "Affects safe driving, not recommended to start" and the control 115 "Exit" and the control 116 "Interface swap" corresponding to the safety guidance are displayed on the mask 113.
  • the navigation interface 111 is displayed in the left area of the central control screen, and the first display area is located in the right area of the central control screen, and the first interface 117 (video interface) is displayed on the first display area.
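The split-screen scenarios above follow one flow: place the first display area on the side nearest the main driver, then, if the first application's attribute is dangerous, suspend it, mask its interface, and attach the safety prompt and guidance controls. A minimal sketch of that flow follows; all names (`place_split_screen`, `apply_safety_controls`, the `"dangerous"` attribute value) are illustrative assumptions, not APIs from the disclosure.

```python
# Hypothetical sketch of the split-screen placement and masking flow
# described above; names and values are illustrative assumptions.

def place_split_screen(driver_side: str) -> dict:
    """Place the first display area on the side nearest the main driver;
    the pre-split interface stays on the opposite side."""
    other = "right" if driver_side == "left" else "left"
    return {"first_display_area": driver_side, "previous_interface": other}

def apply_safety_controls(layout: dict, first_attribute: str) -> dict:
    """If the first application's attribute is dangerous, suspend it, mask
    its interface, and attach the safety prompt and guidance controls."""
    if first_attribute == "dangerous":
        layout = dict(layout, masked=True, suspended=True,
                      prompt="Affects safe driving, not recommended to start",
                      controls=["Exit", "Interface swap"])
    return layout

layout = apply_safety_controls(place_split_screen("left"), "dangerous")
```

With a main driver on the left, the first display area lands on the left of the central control screen and carries the mask, prompt, and the "Exit" / "Interface swap" controls, matching the figures described above.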
  • the control corresponding to the safety guidance output by the vehicle computer is the control 76 shown in (b) of FIG. 7, or the control 115 and the control 116 shown in FIG. 9C and FIG. 9E.
  • when the user operates the control corresponding to the safety guidance, the vehicle computer obtains a second input on the control; in response to the second input, the vehicle computer exits the first application, and/or, when the first display area of the first application is close to the user in the main driving seat, moves the first display area away from the user in the main driving seat.
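The handling of the second input described above can be sketched as follows; the control names and state fields are assumptions made for illustration only.

```python
# Illustrative handling of the "second input" on a safety-guidance control:
# either exit the first application, or move its display area away from the
# main driver. State fields and control names are assumptions.

def handle_second_input(control: str, state: dict) -> dict:
    state = dict(state)  # do not mutate the caller's state
    if control == "Exit":
        state["first_app_running"] = False
    elif control == "Interface swap":
        # Only move the first display area if it currently sits on the
        # driver's side; otherwise leave the layout unchanged.
        if state["first_display_area"] == state["driver_side"]:
            state["first_display_area"] = (
                "right" if state["driver_side"] == "left" else "left")
    return state

initial = {"first_app_running": True,
           "first_display_area": "left", "driver_side": "left"}
moved = handle_second_input("Interface swap", initial)
exited = handle_second_input("Exit", initial)
```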
  • after the vehicle computer starts the safe driving service, the present application also monitors the running process of the application.
  • the application management method shown in Figure 5 or Figure 8 above may also include the following steps: the vehicle computer obtains a third input, wherein the third input is used to indicate that the display area of the first interface on the display screen and the display area of the second interface on the display screen are interchanged, so that the display area of the first interface is close to the user in the main driving seat, wherein the first interface is the interface of the first application, and the second interface is not the interface of the first application.
  • the vehicle computer obtains the first attribute of the first application, and when the first attribute is a dangerous attribute, it is determined that the interface of the first application is to be managed.
  • in response to the third input, the vehicle computer performs one or more of the following operations: outputting a safety indication (used to indicate that the operation of the third input will affect driving safety), outputting a safety guide (used to guide the user to exit the operation of the third input), and prohibiting the interchange.
  • a safety indication used to indicate that the operation of the third input will affect driving safety
  • a safety guide used to guide the user to exit the operation of the third input
  • the navigation interface 121 is displayed on the left area of the display screen 21 (the side close to the main driver's seat).
  • the navigation interface 121 is not the interface of the video application.
  • the first interface 122 of the video application is displayed on the right area of the display screen 21 (the side away from the main driver's seat).
  • the user applies a third input 123 to the display screen 21, such as a three-finger swap gesture, to indicate that the content of the first interface 122 should be displayed on the side close to the main driver's seat.
  • after the vehicle computer detects the third input 123, based on the first attribute of the video application being a dangerous attribute (i.e., it has been determined that the interface display of the video application needs to be controlled), the vehicle computer outputs a safety instruction, as shown in (b) in FIG. 10, and presents the safety prompt 124 "It affects safe driving, and swapping is not recommended."
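The third-input guard described above amounts to a single check: if the application whose interface would move to the driver's side has a dangerous attribute, block the swap and show the prompt. A minimal sketch, with function and field names assumed for illustration:

```python
# Hypothetical guard for the "third input" (interchanging interface areas).
# The prompt text mirrors the one quoted above; names are assumptions.

def handle_third_input(first_attribute: str) -> dict:
    """Prohibit swapping a dangerous application's interface to the
    driver's side, and output the safety indication instead."""
    if first_attribute == "dangerous":
        return {"swap_allowed": False,
                "prompt": "It affects safe driving, and swapping is not recommended."}
    return {"swap_allowed": True, "prompt": None}

blocked = handle_third_input("dangerous")
permitted = handle_third_input("safe")
```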
  • the application management method shown in Figure 5 or Figure 8 above may also include the following steps: the vehicle computer obtains a fourth input, wherein the fourth input is used to instruct the first application to play audio at a first volume value; when the first volume value is greater than or equal to a preset threshold, one or more of the following operations are performed: outputting a safety indication (used to indicate that the operation of the fourth input will affect driving safety), outputting safety guidance (used to guide the user to exit the operation of the fourth input), and prohibiting the first application from playing audio at the first volume value.
  • a safety indication used to indicate that the operation of the fourth input will affect driving safety
  • safety guidance used to guide the user to exit the operation of the fourth input
  • the prompt form and text corresponding to the safety instructions involved in this application can be set according to actual conditions; for example, the safety instruction output by the vehicle computer in response to the third input can be "Affects safe driving, swap is not recommended", and the safety instruction output in response to the fourth input can be "Affects safe driving, not recommended to increase the volume". Accordingly, the guidance content and controls corresponding to the safety instructions involved in this application can be set according to actual conditions, and this application does not specifically limit this.
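The fourth-input volume check is a simple threshold comparison. The sketch below assumes a concrete threshold value for illustration; the patent leaves the preset threshold unspecified.

```python
# Hypothetical check for the "fourth input" (raising playback volume).
# VOLUME_THRESHOLD is an assumed value; the disclosure does not fix one.

VOLUME_THRESHOLD = 20

def handle_fourth_input(requested_volume: int,
                        threshold: int = VOLUME_THRESHOLD) -> dict:
    """Prohibit playing at the first volume value when it reaches or
    exceeds the preset threshold, and output the safety indication."""
    if requested_volume >= threshold:
        return {"allowed": False,
                "prompt": "Affects safe driving, not recommended to increase the volume"}
    return {"allowed": True, "prompt": None}
```

Note the inclusive comparison: the bullets above say "greater than or equal to a preset threshold", so a request exactly at the threshold is also refused.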
  • the vehicle-mounted device 130 can be used to execute the above-mentioned application management method of the present application.
  • the vehicle-mounted device 130 includes an input acquisition module 131, a first user acquisition module 132, a judgment module 133 and a control module 134.
  • the above-mentioned input acquisition module 131, the first user acquisition module 132, the judgment module 133 and the control module 134 can be deployed in corresponding devices according to actual conditions, such as all the above-mentioned modules are deployed on the vehicle computer or all are deployed on the cloud server, or, the above-mentioned input acquisition module 131 and the first user acquisition module 132 are deployed on the vehicle computer, and the judgment module 133 and the control module 134 are deployed on the cloud server.
  • This application does not make any specific limitations.
  • the input acquisition module 131 is used to acquire a first input, wherein the first input is used to indicate that a first interface of a first application is displayed on a display screen of the vehicle.
  • the first user acquisition module 132 is used to acquire a first user who inputs the first input in response to obtaining the first input during the driving of the vehicle.
  • the judgment module 133 is used to judge whether to control the interface display of the first application according to the first user.
  • the control module 134 is used to control the display of the first interface of the first application when the judgment result is yes.
  • the judgment module includes: a first judgment unit, used to control the interface display of the first application when it is determined that the first condition is met, and the first condition includes that the first user is in the main driving seat.
  • the first condition also includes: the first application belongs to a preset first list, or the first application does not belong to a preset second list, or the type of the first application belongs to a preset first type list, or the type of the first application does not belong to a preset second type list; the first list is used to record the applications to be controlled, the first type list is used to record the types of applications to be controlled, the second list is used to record applications that are not to be controlled, and the second type list is used to record the types of applications that are not to be controlled.
  • the judgment module includes: a second judgment unit, used to control the interface display of the first application when it is determined that a second condition is met, and the second condition includes that the first user is not in the main driving seat and the first display area of the first application is close to the main driving seat.
  • the management and control module performs management and control including one or more of the following operations:
  • prohibiting the running of the first application, prohibiting the display of the first interface, adding a mask to the first interface, outputting safety instructions, outputting safety guidance, and setting the first display area of the first application so that the set first display area is away from the user in the main driving seat.
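The judgment module's first and second conditions described above can be sketched as one decision function. This is a minimal illustration under stated assumptions: the function name, parameters, and the way the four lists combine are inferred from the bullets, not taken from the disclosure.

```python
# Hypothetical decision function for the judgment module (133): does the
# interface display of the first application need to be controlled?
# All names and the list-combination logic are illustrative assumptions.

def should_control(first_user_in_driver_seat: bool,
                   app: str, app_type: str,
                   first_list=None, second_list=None,
                   first_type_list=None, second_type_list=None,
                   display_area_near_driver: bool = False) -> bool:
    # First condition: the first user is in the main driving seat AND the
    # application (or its type) falls in the to-be-controlled scope:
    # it belongs to the first list / first type list, or it is absent from
    # the second list / second type list.
    in_scope = (
        (first_list is not None and app in first_list)
        or (second_list is not None and app not in second_list)
        or (first_type_list is not None and app_type in first_type_list)
        or (second_type_list is not None and app_type not in second_type_list)
    )
    if first_user_in_driver_seat and in_scope:
        return True
    # Second condition: the first user is not in the main driving seat,
    # but the first display area is close to it.
    if (not first_user_in_driver_seat) and display_area_near_driver:
        return True
    return False
```

When the result is true, the control module (134) performs one or more of the operations listed above (prohibiting running, masking, safety prompts, relocating the display area).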
  • the vehicle-mounted device 140 can be a car computer, etc.
  • the vehicle-mounted device 140 may include: a display screen 1401, one or more processors 1402, one or more memories 1403, one or more computer programs 1404, a communication module 1405, an input device 1407, and an output device 1408; the above-mentioned devices may be connected via one or more communication buses 1406.
  • the display screen 1401 can be used to display the first interface, or the interface of an application from other devices.
  • the processor 1402 can be used to implement the content of the above-mentioned processing module.
  • the input device 1407 can be used to implement the content of the above-mentioned input module.
  • the output device 1408 can be used to implement the content of the above-mentioned output module.
  • one or more computer programs 1404 are stored in the above-mentioned memory 1403 and are configured to be executed by the one or more processors 1402.
  • the one or more computer programs 1404 include instructions, and the above-mentioned instructions can be used to execute the various steps in Figures 5 and 8 and the corresponding embodiments.
  • the embodiments of the present application do not specifically limit the specific structure of the execution subject of the method provided in the embodiments of the present application, as long as it can communicate according to the method provided in the embodiments of the present application by running a program that records the code of the method provided in the embodiments of the present application.
  • the execution subject of the method provided in the embodiments of the present application may be an on-board device, or a functional module in the on-board device that can call and execute a program.
  • An embodiment of the present application provides a terminal, including a processor, the processor is coupled to a memory, and the processor is used to execute a computer program or instruction stored in the memory, so that the terminal implements the application management method as described above.
  • An embodiment of the present application provides a cloud platform, including a processor, which is coupled to a memory, and the processor is used to execute a computer program or instruction stored in the memory, so that the cloud platform implements the application management and control method as described above.
  • the embodiment of the present application also provides a computer program product.
  • when the computer program product is run on a computer, the computer is caused to execute the above-mentioned related steps to implement the application management and control method in the above-mentioned method embodiments.
  • An embodiment of the present application also provides a computer storage medium, including computer instructions.
  • when the computer instructions are executed on an electronic device, the electronic device executes the application management method as described in the above embodiments.
  • the electronic device, computer storage medium, computer program product or chip system provided in the embodiments of the present application are all used to execute the corresponding methods provided above. Therefore, the beneficial effects that can be achieved can refer to the beneficial effects in the corresponding methods provided above and will not be repeated here.
  • the disclosed devices and methods can be implemented in other ways.
  • the device embodiments described above are only schematic.
  • the division of the modules or units is only a logical function division. There may be other division methods in actual implementation, such as multiple units or components can be combined or integrated into another device, or some features can be ignored or not executed.
  • in addition, the mutual coupling, direct coupling, or communication connection shown or discussed may be implemented through some interfaces; the indirect coupling or communication connection between devices or units may be in electrical, mechanical, or other forms.
  • the unit described as a separate component may or may not be physically separated, and the component shown as a unit may be one physical unit or multiple physical units, that is, it may be located in one place or distributed in multiple different places. Some or all of the units may be selected according to actual needs to achieve the purpose of the present embodiment.
  • each functional unit in each embodiment of the present application may be integrated into one processing unit, or each unit may exist physically separately, or two or more units may be integrated into one unit.
  • the above-mentioned integrated unit may be implemented in the form of hardware or in the form of software functional units.
  • if the integrated unit is implemented in the form of a software functional unit and sold or used as an independent product, it can be stored in a readable storage medium.
  • the technical solution of the embodiments of the present application, in essence, or the part that contributes to the prior art, or all or part of the technical solution, can be embodied in the form of a software product; the software product is stored in a storage medium and includes several instructions to enable a device (which can be a single-chip microcomputer, a chip, etc.) or a processor to execute all or part of the steps of the methods of the various embodiments of the present application.
  • the aforementioned storage medium includes: a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disc, and other media that can store program code.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • Mechanical Engineering (AREA)
  • Fittings On The Vehicle Exterior For Carrying Loads, And Devices For Holding Or Mounting Articles (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

This application, which belongs to the technical field of intelligent automobiles, discloses an application management and control method, a vehicle-mounted apparatus, a vehicle-mounted device, a vehicle, a readable medium, and a program product. The method comprises: acquiring a first input, the first input being used to instruct that a first interface of a first application be displayed on a display screen of a vehicle; in response to acquiring the first input while the vehicle is driving, acquiring a first user who enters the first input; and determining, according to the first user, whether to manage and control the interface display of the first application and, if so, managing and controlling the display of the first interface of the first application. The present application ensures driving safety.
PCT/CN2023/135801 2023-01-17 2023-12-01 Procédé de gestion et de commande d'application, appareil monté sur véhicule, dispositif monté sur véhicule, véhicule et support lisible WO2024152765A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202310089404.1 2023-01-17
CN202310089404.1A CN118363689A (zh) 2023-01-17 2023-01-17 应用管控方法、车载装置、车载设备、车辆及可读介质

Publications (1)

Publication Number Publication Date
WO2024152765A1 true WO2024152765A1 (fr) 2024-07-25

Family

ID=91878852

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2023/135801 WO2024152765A1 (fr) 2023-01-17 2023-12-01 Procédé de gestion et de commande d'application, appareil monté sur véhicule, dispositif monté sur véhicule, véhicule et support lisible

Country Status (2)

Country Link
CN (1) CN118363689A (fr)
WO (1) WO2024152765A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2015079676A1 (fr) * 2013-11-29 2015-06-04 Denso Corporation Procédé pour appareil véhiculaire, appareil véhiculaire et véhicule
CN111385410A (zh) * 2018-12-29 2020-07-07 华为技术有限公司 终端设备的控制方法、装置及存储介质
CN111873800A (zh) * 2020-07-31 2020-11-03 科大讯飞股份有限公司 一种基于车载输入法的驾驶安全提示方法、装置以及设备
CN112416280A (zh) * 2020-11-20 2021-02-26 湖北亿咖通科技有限公司 车载终端的多显示屏控制方法
CN115291960A (zh) * 2021-04-19 2022-11-04 华为技术有限公司 一种车载电子设备的控制方法及车载电子设备


Also Published As

Publication number Publication date
CN118363689A (zh) 2024-07-19

Similar Documents

Publication Publication Date Title
KR100950008B1 (ko) 화상 표시 제어 장치
JP5592473B2 (ja) 携帯機器と連携して動作可能な車載機器
KR20190076731A (ko) 탑승자 단말 및 디스트랙션 확인을 통한 컨텐츠 표시 방법
WO2022000448A1 (fr) Procédé d'interaction de geste d'air dans un véhicule, dispositif électronique et système
CN107835398A (zh) 一种基于投屏的定制化导航信息显示方法、装置
US20220197457A1 (en) Coupling of User Interfaces
WO2024104045A1 (fr) Procédé d'acquisition d'instruction d'opération sur la base d'une zone de compartiment, procédé d'affichage et dispositif associé
EP3167360B1 (fr) Accélération du démarrage d'un système d'exploitation
CN114312797B (zh) 智能体装置、智能体方法以及记录介质
CN115309285A (zh) 控制显示的方法、装置和移动载体
EP3796159A1 (fr) Accélération de démarrage de système d'exploitation
CN105946747B (zh) 车辆中的控制方法和车辆主板
CN113851126A (zh) 车内语音交互方法及系统
WO2024022437A1 (fr) Procédé d'affichage, appareil et support mobile
WO2024152765A1 (fr) Procédé de gestion et de commande d'application, appareil monté sur véhicule, dispositif monté sur véhicule, véhicule et support lisible
CN116155988A (zh) 车载信息推送方法、装置、设备及存储介质
KR20230050535A (ko) 전기버스의 자율주행 안전성 향상을 위한 디스플레이 시스템 및 그 방법
US11853232B2 (en) Device, method and computer program
US20240239265A1 (en) Rear display enhancements
US20230419751A1 (en) Vehicle display device, vehicle display system, and vehicle display method
US20230382409A1 (en) System for enabling a vehicle to perform a circuit mode
CN112738447B (zh) 一种基于智能座舱的视频会议方法和智能座舱
WO2024027550A1 (fr) Procédé de commande d'application pour dispositif de commande central de véhicule, et appareil associé
EP4290486A1 (fr) Dispositif de diagnostic de conduite, système de diagnostic de conduite et procédé de diagnostic de conduite
CN118151381A (zh) 一种用于基于hud的功能控件可视化的方法和系统

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23917233

Country of ref document: EP

Kind code of ref document: A1