CN115904283A - Distributed display method, readable medium and vehicle-mounted device - Google Patents


Info

Publication number
CN115904283A
Authority
CN
China
Prior art keywords
vehicle
application
display screen
user
display
Prior art date
Legal status
Pending
Application number
CN202111005585.2A
Other languages
Chinese (zh)
Inventor
宁维赛
卢冬
刘磊
Current Assignee
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd
Priority to CN202111005585.2A
Publication of CN115904283A
Legal status: Pending

Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

The present application relates to the technical field of intelligent vehicles and discloses a distributed display method, a readable medium, and a vehicle-mounted device. The distributed display method is used by at least one vehicle-mounted device in a vehicle, where the at least one vehicle-mounted device controls a plurality of display screens in the vehicle. The vehicle-mounted device can acquire biometric information of a user, determine the user's position in the cabin, and, based on that biometric information and position, determine on which display screen of the vehicle an application to be started should be displayed. The application can thus be displayed on the display screen the user expects without additional operations, improving the user experience.

Description

Distributed display method, readable medium and vehicle-mounted device
Technical Field
The application relates to the technical field of intelligent automobiles, in particular to a distributed display method, a readable medium and vehicle-mounted equipment.
Background
With the rise of the Internet of Vehicles, demand for in-vehicle entertainment keeps growing, and more and more passenger cars support multi-screen display to provide a rich audiovisual experience. Multi-screen vehicles may also support a distributed display function: for example, an application displayed on display screen A may be started and displayed on display screen B.
For example, as shown in FIG. 1, the vehicle 11 supports three display screens: display screen 100-1, display screen 100-2, and display screen 100-3. The display screen 100-1 may be the center control screen of the vehicle 11, the display screen 100-2 may be a display screen at the front passenger seat of the vehicle 11, and the display screen 100-3 may be a display screen at a rear seat of the vehicle 11.
For example, an application displayed on display screen 100-1 may be launched and displayed on display screen 100-2. As shown in FIG. 2A, the driver may tap the "navigate" icon on display interface 101 of display screen 100-1; a small window 102 then pops up on display interface 101, in which the driver can select the display screen on which the "navigation" application is launched and displayed. For example, as shown in FIG. 2B, the driver taps the "secondary screen" button, i.e., chooses to launch and display the "navigation" application on display screen 100-2. As shown in FIG. 2C, the "navigation" application is then launched and displayed on display screen 100-2.
Although the driver can, through manual operation, start an application on display screen 100-1 and display it on display screen 100-2, a driver who wants to start a navigation or multimedia application while the vehicle is moving must operate the screen while driving, which easily distracts the driver and poses a significant safety hazard. How to support multi-screen interaction while reducing driver operations and avoiding distraction is therefore an urgent problem for multi-screen interactive display in vehicles.
Disclosure of Invention
The embodiment of the application provides a distributed display method, a readable medium and vehicle-mounted equipment.
In a first aspect, an embodiment of the present application provides a distributed display method for at least one vehicle-mounted device in a vehicle, where the at least one vehicle-mounted device controls a plurality of display screens in the vehicle. The method includes: detecting a start instruction from a user for a first application; acquiring first biometric information associated with the first application, and acquiring at least one piece of second biometric information of a user at at least one of the plurality of display screens; and, when the at least one piece of second biometric information contains second biometric information matching the first biometric information, displaying the started first application on the display screen where the user of that matching second biometric information is located.
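The first-aspect matching step can be sketched as follows. The `Screen` type, the equality-based match, and the fallback to a user prompt are illustrative assumptions, not the patent's actual implementation:

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Screen:
    screen_id: str
    user_biometric: Optional[str]  # "second biometric info" captured at this screen, if any

def pick_screen(first_biometric: str, screens: List[Screen]) -> Optional[Screen]:
    """Return the screen whose occupant's biometric matches the biometric
    associated with the application being launched, or None if no match."""
    for screen in screens:
        if screen.user_biometric == first_biometric:  # simplistic equality "match"
            return screen
    return None  # no match: fall back to prompting the user (cf. FIG. 2B)

screens = [Screen("100-1", "driver_face"), Screen("100-2", "passenger_face")]
assert pick_screen("passenger_face", screens).screen_id == "100-2"
assert pick_screen("unknown_face", screens) is None
```

A `None` result corresponds to the no-match case of the next implementation, where the user is asked to choose a screen.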
In a possible implementation of the first aspect, the method further includes: when no second biometric information matching the first biometric information exists among the at least one piece of second biometric information, displaying the started first application, in response to a detected operation instruction from the user, on at least one display screen indicated by that operation instruction.
For example, when no second biometric information matching the first biometric information exists among the at least one piece of second biometric information, the user may select, from the options provided by the small window 102 on the display interface shown in FIG. 2B, at least one display screen on which the first application should be displayed. The vehicle-mounted device can detect the user's operation instruction and, in response, display the first application on the at least one display screen the user selected.

In a possible implementation of the first aspect, the at least one vehicle-mounted device includes a first vehicle-mounted device and a second vehicle-mounted device, and the plurality of display screens include a first display screen controlled by the first vehicle-mounted device and a second display screen controlled by the second vehicle-mounted device; the start instruction is generated by the first vehicle-mounted device in response to the user's selection of the first application's icon on the first display screen; and the display screen where the user of the matching second biometric information is located is the second display screen.
For example, the first vehicle-mounted device may be the in-vehicle device 10-1 described below, the second vehicle-mounted device may be the in-vehicle device 10-2, the first display screen may be the display screen 100-1, and the second display screen may be the display screen 100-2; the first biometric information may be biometric information, saved by the in-vehicle device 10-1, of a user who used the first application during a past period.
In a possible implementation of the first aspect, acquiring at least one piece of second biometric information of a user at at least one of the plurality of display screens includes: the first vehicle-mounted device obtaining, from the second vehicle-mounted device, the second biometric information of the user at the second display screen collected by the second vehicle-mounted device.
Taking face information as an example, the camera that collects image data in the cabin may be a camera on the in-vehicle device 10-1, mounted on the display screen 100-1; it may also be a camera on the in-vehicle device 10-2 or the in-vehicle device 10-3, in which case that device sends the image data collected by its camera to the in-vehicle device 10-1. The image data contains the face information.
In a possible implementation of the first aspect, displaying the started first application on the display screen where the user of the matching second biometric information is located includes: the first vehicle-mounted device sending information of the first application to the second vehicle-mounted device; and the second vehicle-mounted device displaying the started first application on the second display screen according to that information.
For example, the information of the first application may be a start instruction for the first application generated by the first vehicle-mounted device; the second vehicle-mounted device then displays the started first application on the second display screen according to that start instruction.
Alternatively, the information of the first application may be image data of the first application's display interface rendered by the first vehicle-mounted device; the first vehicle-mounted device sends this image data to the second vehicle-mounted device, which displays the started first application on the second display screen from the received image data.
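The two variants of "information of the first application" described above (a start instruction, or image data of the rendered display interface) can be sketched as a simple message dispatch on the receiving device; the message shapes are hypothetical, not the patent's actual protocol:

```python
# Two hypothetical message shapes for "information of the first application":
# (1) a start instruction forwarded to the second device, which launches the
# app itself; (2) image data of the rendered interface, which it just draws.
def handle_app_info(message: dict) -> str:
    if message["type"] == "launch_instruction":
        return f"launch {message['app']} locally and render it on this screen"
    if message["type"] == "display_image_data":
        return f"draw the received frames of {message['app']} on this screen"
    raise ValueError(f"unknown message type: {message['type']}")

assert "launch" in handle_app_info({"type": "launch_instruction", "app": "navigation"})
assert "frames" in handle_app_info({"type": "display_image_data", "app": "navigation"})
```

The design trade-off is the usual one: forwarding a start instruction requires the application to be installed on the second device, while streaming image data only requires the second device to draw frames.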
In a possible implementation of the first aspect, the at least one vehicle-mounted device includes a third vehicle-mounted device, and the plurality of display screens include a third display screen and a fourth display screen both controlled by the third vehicle-mounted device; the start instruction is generated by the third vehicle-mounted device in response to the user's selection of the first application's icon on the third display screen; and the display screen where the user of the matching second biometric information is located is the fourth display screen.
In a possible implementation of the first aspect, acquiring the at least one piece of second biometric information includes: the third vehicle-mounted device collecting the second biometric information of the user at the fourth display screen.
In a possible implementation of the first aspect, displaying the started first application on the display screen where the user of the matching second biometric information is located includes: the third vehicle-mounted device displaying the started first application on the fourth display screen.
In a possible implementation of the first aspect, the first biometric information and the second biometric information each include at least one of: face information, voiceprint information, fingerprint information, and iris information.
In a second aspect, an embodiment of the present application provides a readable medium having instructions stored thereon that, when executed on an electronic device, cause the electronic device to perform the method of the first aspect or any of its possible implementations.
In a third aspect, an embodiment of the present application provides a vehicle-mounted device, including:
a memory for storing instructions to be executed by one or more processors of the vehicle-mounted device; and
a processor, which is one of the processors of the vehicle-mounted device and is configured to perform the distributed display method of the first aspect or any of its possible implementations.
Drawings
FIG. 1 illustrates a schematic view of the location of multiple display screens in a vehicle, according to some embodiments of the present application;
FIGS. 2A-2C illustrate schematic diagrams of a process in which an in-vehicle application is displayed on a display screen in a vehicle, according to some embodiments of the present application;
FIGS. 3A-3B illustrate schematic diagrams of another process in which an in-vehicle application is displayed on a display screen in a vehicle, according to some embodiments of the present application;
FIG. 4 illustrates a scene schematic of a distributed display, according to some embodiments of the present application;
FIG. 5 illustrates a flow diagram of a distributed display, according to some embodiments of the present application;
FIG. 6 illustrates a flow diagram of another distributed display, according to some embodiments of the present application;
FIG. 7 illustrates a flow diagram of another distributed display, according to some embodiments of the present application;
FIG. 8 illustrates a composition diagram of a static rule set, according to some embodiments of the present application;
FIG. 9 illustrates a schematic diagram of a static rule set created by an in-vehicle device for an application on a display screen, in accordance with some embodiments of the present application;
FIGS. 10A-10C illustrate a user interface, provided on an electronic device, for setting an association between an application and a display screen, according to some embodiments of the present application;
FIGS. 11A-11C illustrate a user interface, provided on an in-vehicle device, for setting an association between an application and a display screen, according to some embodiments of the present application;
FIG. 12 illustrates a schematic block diagram of a software architecture interaction based on a distributed operating system in accordance with some embodiments of the present application;
FIG. 13 illustrates a scene schematic of another distributed display, in accordance with some embodiments of the present application;
FIG. 14 illustrates a schematic diagram of an in-vehicle device, according to some embodiments of the present application.
Detailed Description
Illustrative embodiments of the present application include, but are not limited to, methods, readable media, and electronic devices.
In order to solve the above technical problem, the present application provides a distributed display method applied to an in-vehicle device: the in-vehicle device can detect the position of a user in the vehicle cabin and, according to that position, determine on which display screen of the vehicle 11 an application to be started is displayed.
Further, the in-vehicle device may also acquire the user's biometric information, determine the user's position in the cabin, and decide on which display screen of the vehicle 11 the application to be started is displayed according to the biometric information, the position, and the number of times the user corresponding to that biometric information used the application during a past period.
As an example, as shown in FIG. 3A, the in-vehicle device detects the user's start operation on the "navigation" application on the display screen 100-1 and determines that the user's position in the cabin is the front passenger seat; according to that position, the in-vehicle device displays the display interface of the "navigation" application on the display screen 100-2 at the passenger seat (for example, as shown in FIG. 3B).
In order to make the objects, technical solutions, and advantages of the present application clearer, the technical solutions of the present application are described in detail below with reference to FIGS. 1 to 12.
FIG. 4 is a schematic diagram of a distributed display scenario according to an embodiment of the present application. The scenario includes a vehicle 11 equipped with a plurality of in-vehicle devices: the in-vehicle device 10-1, the in-vehicle device 10-2, ..., and the in-vehicle device 10-n.
As shown in FIG. 4, each in-vehicle device includes a display screen: the in-vehicle device 10-1 includes the display screen 100-1, the in-vehicle device 10-2 includes the display screen 100-2, ..., and the in-vehicle device 10-n includes the display screen 100-n. The display interfaces of the display screens 100-1 through 100-n may show the same in-vehicle application or different in-vehicle applications.
In the embodiments of the present application, the operating system running on the in-vehicle device 10-1, the in-vehicle device 10-2, ..., and the in-vehicle device 10-n may be any of the following: the Hongmeng operating system (HarmonyOS), other operating systems [names shown only as trademark images in the original filing], a MAC operating system, etc. The operating systems running on the in-vehicle device 10-1, the in-vehicle device 10-2, ..., and the in-vehicle device 10-n may be the same or different; according to actual application requirements, the present application does not specifically limit whether they are the same.
In some embodiments, distributed display is implemented through communication connections among the multiple in-vehicle devices; for example, the devices may be communicatively connected via a wireless local area network (WLAN) (e.g., a wireless fidelity (Wi-Fi) network), Bluetooth (BT), near-field communication (NFC), or other wireless communication networks.
For example, the three display screens of the vehicle 11 shown in FIG. 1 may each belong to a different in-vehicle device, with distributed display realized through communication among the three devices. An application shown on the display interface of any in-vehicle device's screen can be displayed on that device itself or on another in-vehicle device: for example, an application shown on the display screen of the in-vehicle device 10-1 may be displayed on its own screen, on the display screen of the in-vehicle device 10-2, or on the display screen of the in-vehicle device 10-n.
The following description takes as an example a vehicle 11 equipped with three in-vehicle devices (the in-vehicle device 10-1, the in-vehicle device 10-2, and the in-vehicle device 10-3), where the display content of an application on the in-vehicle device 10-1 is displayed on the display screen 100-2.
FIG. 5 is a flow diagram illustrating a distributed display according to an embodiment of the present application; as shown in fig. 5, the distributed display process includes:
S501: The in-vehicle device 10-1 detects an operation of starting the "navigation" application, where the "navigation" application icon is displayed on the display screen 100-1 of the in-vehicle device 10-1.
In some embodiments, the in-vehicle device 10-1 may detect operation of a user to launch a "navigation" application on the display screen 100-1, where the "navigation" application is displayed on the display screen of the in-vehicle device 10-1.
For example, the user may start an application displayed on the display screen 100-1 of the vehicle 11 by touch, gesture, voice, or the like. As shown in FIG. 3A, the user may tap the "navigate" icon on the display interface of the display screen 100-1, at which point the in-vehicle device 10-1 detects the operation of starting the "navigation" application.
It is understood that the user may also start the application displayed on the display screen 100-2 of the in-vehicle device 10-2 by touching, gesture, voice, or the like, and the user may also start the application displayed on the display screen 100-3 of the in-vehicle device 10-3 by touching, gesture, voice, or the like. The following description will be given taking as an example an operation in which the in-vehicle apparatus 10-1 detects that the user starts the "navigation" application displayed on the display screen 100-1 of the in-vehicle apparatus 10-1.
S502: the in-vehicle device 10-1 acquires the position of the user in the vehicle compartment.
In some embodiments, the in-vehicle device 10-1 may determine the position of the user in the vehicle cabin based on the acquired data collected by the sensors disposed in the vehicle cabin. The sensor may be an image sensor (e.g., a camera), an infrared sensor, an ultrasonic sensor, a laser radar sensor, or the like.
For example, the in-vehicle device 10-1 may determine that the user is in the driver's seat based on data collected by a sensor on the device itself. It may also determine that the user is in the front passenger seat or a rear seat based on data collected by the sensors of the in-vehicle device 10-2 or the in-vehicle device 10-3 and transmitted to it by those devices.

In other embodiments, the in-vehicle device 10-2 or the in-vehicle device 10-3 may itself determine, from data collected by its own sensor, that the user is in the front passenger seat or a rear seat; it then transmits that position information to the in-vehicle device 10-1, which thereby acquires the user's position in the cabin.
In some other embodiments of the present application, the in-vehicle device 10-1 may determine the user's position in the cabin from the user's sound-source information collected by a microphone arranged in the cabin. For how the in-vehicle device 10-1 obtains this sound-source information, refer to the description above of how it obtains data collected by sensors arranged in the cabin; details are not repeated here.
S503: The in-vehicle device 10-1 determines, according to the user's position in the cabin, on which display screen the display content of the "navigation" application is displayed.
For example, if the in-vehicle device 10-1 determines that the user's position in the cabin is the front passenger seat, then upon detecting that the user starts the "navigation" application, the in-vehicle device 10-1 determines that the display content of the "navigation" application is displayed on the display screen 100-2.
For example, the in-vehicle device 10-1 may also determine that there are users at the driver's seat, the front passenger seat, and the rear seat, in which case the in-vehicle device 10-1 determines that the display content of the "navigation" application is displayed on the display screen 100-1, the display screen 100-2, and the display screen 100-3.
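The seat-to-screen decision of step S503 can be sketched as a lookup over detected occupant positions; the table itself is an assumption based on the screen layout of FIG. 1:

```python
# Hypothetical seat-to-screen table matching the layout of FIG. 1.
SEAT_TO_SCREEN = {
    "driver": "100-1",     # center control screen
    "passenger": "100-2",  # front passenger screen
    "rear": "100-3",       # rear-seat screen
}

def screens_for_positions(positions):
    """Map each detected occupant position to the screen at that seat."""
    return [SEAT_TO_SCREEN[p] for p in positions if p in SEAT_TO_SCREEN]

assert screens_for_positions(["passenger"]) == ["100-2"]
assert screens_for_positions(["driver", "passenger", "rear"]) == ["100-1", "100-2", "100-3"]
```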
S504: the in-vehicle apparatus 10-1 displays the display interface of the "navigation" application on the display screen 100-2.
For example, as shown in FIG. 3B, the in-vehicle apparatus 10-1 may launch and display a "navigation" application on the display screen 100-2 of the in-vehicle apparatus 10-2. The specific content of the "navigation" application started and displayed on the display screen 100-2 of the vehicle-mounted device 10-2 by the vehicle-mounted device 10-1 is described in detail below with reference to software architectures of the vehicle-mounted device 10-1 and the vehicle-mounted device 10-2, and is not described herein again.
It is understood that the in-vehicle device 10-2 or the in-vehicle device 10-3 may also acquire the position of the user in the vehicle compartment, and determine on which display screen in the vehicle compartment the application on the in-vehicle device 10-2 or the in-vehicle device 10-3 is displayed. For details, reference is made to the description shown in fig. 5, which is not repeated herein.
As the process of FIG. 5 shows, the in-vehicle device 10-1 can acquire the user's position in the cabin, and upon detecting that the user starts an application on the in-vehicle device 10-1, it determines on which display screen in the cabin that application is displayed. The display content of the "navigation" application can thus be shown on the display screen 100-2 without the user performing a selection operation on the current display screen (for example, the operation shown in FIG. 2B), reducing redundant user operations and improving the user experience.
In other embodiments of the present application, the vehicle-mounted device may further obtain biometric information of the user in the vehicle cabin, and determine on which display screen in the vehicle cabin the "navigation" application is displayed according to the position of the user in the vehicle cabin and the biometric information of the user in the vehicle cabin.
FIG. 6 is a flow diagram illustrating another distributed display according to an embodiment of the present application; as shown in fig. 6, the distributed display process includes:
S601: The in-vehicle device 10-1 detects an operation of starting the "navigation" application, where the "navigation" application icon is displayed on the display screen 100-1 of the in-vehicle device 10-1. For details, refer to step S501 in FIG. 5; they are not repeated here.
S602: the in-vehicle device 10-1 acquires the position of the user in the vehicle compartment. For details, refer to step S502 of fig. 5, which is not described herein.
S603: the in-vehicle device 10-1 acquires biometric information of the user in the vehicle compartment, wherein the in-vehicle device 10-1 can identify the user in the vehicle compartment based on the biometric information of the user.
In some embodiments, the biometric information of the user includes face information, voiceprint information, fingerprint information, iris information, and the like. The in-vehicle device 10-1 may recognize whether the user in the vehicle compartment is the user a or the user B based on the biometric information of the user.
In some embodiments, the in-vehicle device 10-1 may first perform step S602 and then acquire the biometric information of the user at the determined position, thereby identifying the user at that position.
Taking face information as an example, the in-vehicle device 10-1 may obtain, via a camera arranged in the cabin, image data of a user sitting in the driver's seat or another seat, extract face information from the image data, and identify that user. The in-vehicle device 10-1 may also receive, from the in-vehicle device 10-2 or the in-vehicle device 10-3, the face information of the user in the driver's seat or another seat, and identify that user.
It is understood that the camera that collects the image data in the vehicle compartment may be a camera on the in-vehicle device 10-1, which may be mounted on the display screen 100-1; the camera for collecting the image data in the compartment may also be a camera on the vehicle-mounted device 10-2 or the vehicle-mounted device 10-3, and the vehicle-mounted device 10-2 or the vehicle-mounted device 10-3 may send the image data collected by the camera to the vehicle-mounted device 10-1. According to practical applications, the present application does not specifically limit the location of the data acquisition device (e.g., a camera, a microphone, a fingerprint sensor) for the biometric information of the user in the vehicle compartment, which is acquired by the in-vehicle device 10-1.
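A minimal sketch of the face-identification step of S603, assuming a nearest-neighbor comparison of feature vectors; the vectors, distance threshold, and matching rule are illustrative, and a real system would use a trained face-recognition model:

```python
# Compare a captured face feature vector against enrolled users; return the
# nearest enrolled user, or None if no one is within the distance threshold.
# All numeric values here are illustrative assumptions.
def identify(captured, enrolled, threshold=1.0):
    best_user, best_dist = None, threshold
    for user, feature in enrolled.items():
        # Euclidean distance between feature vectors
        dist = sum((a - b) ** 2 for a, b in zip(captured, feature)) ** 0.5
        if dist < best_dist:
            best_user, best_dist = user, dist
    return best_user

enrolled = {"user_a": (0.1, 0.9), "user_b": (0.8, 0.2)}
assert identify((0.12, 0.88), enrolled) == "user_a"
assert identify((5.0, 5.0), enrolled) is None  # nobody close enough
```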
S604: the in-vehicle apparatus 10-1 determines the display screen on which the display content of the "navigation" application is displayed, based on the position of the user in the vehicle compartment, the biometric information of the user, and the number of times the user has used the "navigation" application corresponding to the biometric information of the user in the past period of time.
In some embodiments, the in-vehicle device 10-1 may save the number of uses of each application on the in-vehicle device 10-1 by each user for a past period of time, and determine on which display screen in the vehicle compartment the "navigation" application is displayed, based on the number of uses, the position of the user in the vehicle compartment, and the biometric information of the user.
For example, suppose user A used the "navigation" application 100 times in the past 10 days and user B used it 50 times, and the in-vehicle device 10-1 determines that user A is at the front passenger seat and user B is at the rear seat. When the in-vehicle device 10-1 detects that a user starts the application, it may display the display content of the "navigation" application on the display screen 100-2, because user A used the application more often over the past 10 days.
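The decision in S604 can be sketched as follows, using the usage counts from the example above; the data layout is hypothetical:

```python
# Hypothetical records: "navigation" launches per user over the past 10 days,
# detected seat per user, and the seat-to-screen layout of FIG. 1.
usage_counts = {"user_a": 100, "user_b": 50}
user_position = {"user_a": "passenger", "user_b": "rear"}
seat_to_screen = {"driver": "100-1", "passenger": "100-2", "rear": "100-3"}

top_user = max(usage_counts, key=usage_counts.get)   # most frequent recent user
target_screen = seat_to_screen[user_position[top_user]]
assert target_screen == "100-2"  # user A used the app more, so screen 100-2
```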
S605: The in-vehicle device 10-1 displays the display interface of the "navigation" application on the display screen 100-2. For details, refer to the description of step S504; they are not repeated here.
It is understood that the in-vehicle device 10-2 or the in-vehicle device 10-3 may also acquire the position of the user in the vehicle compartment and the biometric information of the user, and determine on which display screen in the vehicle compartment the application on the in-vehicle device 10-2 or the in-vehicle device 10-3 is displayed. For details, reference is made to the description shown in fig. 6, which is not repeated herein.
As is apparent from the process described with reference to fig. 6, the in-vehicle device 10-1 may acquire the position of the user in the vehicle compartment and the biometric information of the user, and determine on which display screen in the vehicle compartment the "navigation" application is displayed based on that position and biometric information. It is understood that, with the identity and position of the user in the vehicle compartment determined, the in-vehicle device 10-1 may display an application frequently used by the user on the display screen at the seat in which the user is located, according to the user's habit of using the application. Therefore, the application can be displayed on the display screen expected by the user without any extra operation by the user, which improves the user experience.
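The selection logic of steps S601 to S604 can be sketched as follows. This is an illustrative sketch only; all identifiers (user, seat, and screen names) are assumptions for illustration and do not come from the embodiment. Among the users detected in the cabin, the application is routed to the display screen at the seat of the user who used it most often in the past period.

```python
# Usage counts per (user, application) over the past period (e.g. 10 days),
# as maintained by the in-vehicle device in some embodiments.
usage_counts = {
    ("user_a", "navigation"): 100,
    ("user_b", "navigation"): 50,
}

# Mapping from a detected seat position to the display screen at that seat.
seat_to_screen = {
    "passenger_seat": "display_100_2",
    "rear_seat": "display_100_3",
}

def choose_screen(app, detected_users, default_screen="display_100_1"):
    """detected_users: {user_id: seat}; returns the target display screen."""
    best_user, best_count = None, -1
    for user in detected_users:
        count = usage_counts.get((user, app), 0)
        if count > best_count:
            best_user, best_count = user, count
    if best_user is None:
        return default_screen  # no user detected: keep the launching screen
    return seat_to_screen.get(detected_users[best_user], default_screen)

# With user A's higher count, the passenger-seat screen is chosen.
target = choose_screen("navigation",
                       {"user_a": "passenger_seat", "user_b": "rear_seat"})
```
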
FIG. 7 is a flow diagram illustrating another distributed display according to an embodiment of the present application; as shown in fig. 7, the distributed display process includes:
S701: the in-vehicle apparatus 10-1 detects an operation of starting the "navigation" application, where the "navigation" application icon is displayed on the display screen of the in-vehicle apparatus 10-1. For details, refer to step S501 in fig. 5, which is not described herein again.
S702: the in-vehicle device 10-1 determines the display screen 100-2 on which the display content of the "navigation" application is displayed, according to the association relationship between the preset "navigation" application and the display screen 100-2.
For example, the in-vehicle device 10-1 may display a display interface of the "navigation" application on the display screen 100-2 according to a preset association relationship between the "navigation" application and the display screen 100-2.
In some embodiments, the vehicle-mounted device may create an association relationship between the application and the display screen according to an association operation performed on the application and the display screen by the user on the vehicle-mounted device; for example, the user may perform an association operation of the "navigation" application with the display screen 100-2 in a "setup" application of the in-vehicle device, and the in-vehicle device generates an association relationship of the "navigation" application with the display screen 100-2 in response to the association operation.
In other embodiments, the electronic device establishes a communication connection with the vehicle-mounted device, and an application for managing the association relationship between the vehicle-mounted application and the display screen can be installed on the electronic device; the vehicle-mounted equipment can establish the association relation between the application and the display screen according to the association operation of the user on the electronic equipment; for example, an APP (application program) may be installed on the electronic device, and the user may perform an association operation on the display screen 100-2 and the navigation application in the APP of the electronic device, and the in-vehicle device generates an association relationship between the navigation application and the display screen 100-2 in response to the association operation.
It is understood that the "navigation" application may also be associated with the display screen 100-3, and the "navigation" application may also be associated with the display screen 100-1; the association relationship between the application on the in-vehicle device 10-1 and the display screen in the vehicle 11 is set by the user, and this is not particularly limited in this application.
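The preset-association flow of S702 can be sketched as below, with all function and identifier names assumed for illustration: the vehicle-mounted device records the association created by the user's association operation, and consults it when an application is started, falling back to the screen on which the icon was tapped when no association exists.

```python
# User-configured application -> display screen associations.
associations = {}

def associate(app, screen):
    # Called in response to the user's association operation, e.g. in the
    # "setup" application or a companion APP on a connected electronic device.
    associations[app] = screen

def screen_for(app, launching_screen):
    # Use the preset association if one exists; otherwise display the
    # application on the screen where its icon was tapped.
    return associations.get(app, launching_screen)

# The user associates "navigation" with display screen 100-2, so starting
# it from screen 100-1 still routes its display content to screen 100-2.
associate("navigation", "display_100_2")
```
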
S703: the in-vehicle apparatus 10-1 displays the display interface of the "navigation" application on the display screen 100-2. For details, refer to step S504 of fig. 5, which is not described herein.
It is to be understood that the in-vehicle device 10-2 or the in-vehicle device 10-3 may also determine on which display screen in the vehicle compartment the application on the in-vehicle device 10-2 or the in-vehicle device 10-3 is displayed according to the association relationship between the application on the in-vehicle device 10-2 or the in-vehicle device 10-3 and the display screen in the vehicle compartment, which is set by the user in advance. For details, reference is made to the description shown in fig. 5, which is not repeated herein.
As is apparent from the process described with reference to FIG. 7, upon detecting that the user starts an application on the in-vehicle device 10-1, the in-vehicle device 10-1 may determine on which display screen in the vehicle compartment the application is displayed, based on the association relationship, set in advance by the user, between the application on the in-vehicle device 10-1 and the display screens in the vehicle compartment. Therefore, without any extra operation by the user, the application on the in-vehicle device 10-1 is displayed on the display screen on which the user expects it to be displayed, which improves the user experience.
The process of the in-vehicle device 10-1 creating an association of an application with a display screen in step S702 of fig. 7 will be described in detail below.
In some embodiments, the associations of the applications on the in-vehicle device 10-1 with the display screens within the vehicle cabin may be maintained as a static rule set.
For example, as shown in FIG. 8, a static rule set may include associations of single applications with a display screen, and associations of application sets with a display screen. An application set represents a set containing a plurality of functionally similar applications; for example, an "instant messaging" application set may include an "information" application and other messaging applications (identified by icons in the original filing). If the association relation between the "instant messaging" application set and a display screen is established, the association relation between each application in the "instant messaging" application set and that display screen is established.
In some embodiments, the static rule set of the application contains an association of the application or the application set with a display screen, and the association may be such that when the in-vehicle device 10-1 detects that the application in the static rule set is started, the in-vehicle device 10-1 may start and display the application on the display screen having the association with the application.
For example, as shown in FIG. 3A, the applications displayed on the display screen 100-1 include: the "navigation" application, the "music" application, the "video" application, the "listening to a book" application, the "instant messaging" application set, and the like, wherein the "instant messaging" application set includes an "information" application and other messaging applications (identified by icons in the original filing).
FIG. 9 shows a schematic diagram of a static rule set created by the in-vehicle device 10-1 for an application on the display screen 100-1.
As shown in FIG. 9, the static rule set pre-created by the application on display screen 100-1 may include: the "navigation" application establishes an association with the display screen 100-2; the music application establishes an association with the display screen 100-2; the association relation between the book listening application and the display screen 100-1 is established; the browser application establishes an association with the display screen 100-2; the video application establishes an association with the display screen 100-3; the "instant messaging" application set establishes an association with the display screen 100-2.
It is to be understood that, as shown in fig. 9, the "navigation" application has an association relationship with the display screen 100-2, and the display contents of the "navigation" application may be displayed on the display screen 100-2 when the "navigation" application is started.
In some embodiments of the present application, the in-vehicle device 10-1 may create the static rule set shown in fig. 9 according to a user operation. The in-vehicle device 10-1 may determine on which display screen on the vehicle 11 the application to be started is started according to a static rule set created in advance.
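Under the assumptions that the rule set is stored as a plain mapping and that application-set members inherit the set's rule (the storage structure is assumed; the screen assignments are taken from FIG. 9, and the set members identified only by icons in the original are given placeholder names), the static rule set and its lookup might look like:

```python
# Members of each application set; "messenger_app_1/2" are placeholder
# names for the icon-identified applications in the original filing.
app_sets = {
    "instant_messaging": {"information", "messenger_app_1", "messenger_app_2"},
}

# Static rule set pre-created for the applications on display screen 100-1,
# per FIG. 9. A rule for a set name applies to every member of that set.
static_rules = {
    "navigation": "display_100_2",
    "music": "display_100_2",
    "listen_to_books": "display_100_1",
    "browser": "display_100_2",
    "video": "display_100_3",
    "instant_messaging": "display_100_2",
}

def resolve(app):
    # A direct rule for the application takes priority; otherwise the
    # application inherits the rule of any set that contains it.
    if app in static_rules:
        return static_rules[app]
    for set_name, members in app_sets.items():
        if app in members and set_name in static_rules:
            return static_rules[set_name]
    return None  # no rule: caller falls back to the launching screen
```
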
For example, the user may set the association relationships of the applications displayed on the display screen 100-1 with the display screen 100-1, the display screen 100-2, and the display screen 100-3 on a web page on the computer side, or on an application (APP) installed for controlling the vehicle 11. The user may also set these association relationships on an APP installed on the vehicle-mounted device 10-1. It is understood that the user may operate on the APP, and the in-vehicle device 10-1 creates the static rule set in response to the user operation.
FIGS. 10A-10C illustrate a user interface provided on an electronic device (e.g., a cell phone) for setting an association of an application with a display screen.
Fig. 10A illustrates a user interface 300 displayed by the electronic device. As shown in fig. 10A, the user interface 300 has displayed therein: status bar, tray with common application icons, page indicator, other application icons, and the like. The user interface 300 exemplarily illustrated in fig. 10A may be a Home screen (Home screen).
As shown in fig. 10A, other application icons may include, for example, an application for managing vehicles (e.g., "vehicle on") icon 301 or the like.
As shown in fig. 10A, the electronic device may detect a user operation (e.g., a click operation, a touch operation, etc.) acting on an icon 301, and in response to the user operation, launch an application program (e.g., "vehicle on") for managing the vehicle corresponding to the icon 301 and display a user interface 400 provided by the application program.
FIG. 10B illustrates one implementation of a user interface 400. The user interface 400 is used for electronic equipment to assist in managing the vehicle.
As shown in fig. 10B, the user interface 400 has displayed therein: a menu bar 401 and an in-vehicle application setting bar 402.
The in-vehicle application settings bar 402 may be used for a user to select a display screen associated with an in-vehicle application.
The application bar 403 may be used to indicate an Application (APP) displayed on the display screen 100-1. For example, as shown in FIG. 10B, the user may set up an associated display screen for a "navigation" application, a "music" application, a "listening to books" application, a "set of instant messaging" applications, and so forth.
The designated display screen bar 404 may be used to monitor a user operation, and in response to the user operation, the electronic device may send an association relationship between the APP and the display screen displaying the APP content to the in-vehicle device 10-1, and generate the APP display screen association relationship.
As shown in fig. 10B, the user may select a display screen associated with the in-vehicle application within the designated display screen column 404, for example, the electronic device may detect a user operation (e.g., a click operation, a touch operation, etc.) acting on the control 404, and in response to the user operation, open an option corresponding to the control 404 for selecting a display screen associated with the "navigation" application, and display a widget 500 provided by the option.
As shown in FIG. 10C, the user may select a display screen associated with the "navigation" application on the window 500. For example, the user may associate the "navigation" application with the display screen 100-2 by clicking the button 501 next to the display screen 100-2 on the window 500.
In other embodiments, the user can also set the association relationship between the application displayed on the display screen 100-1 and the display screens 100-1, 100-2 and 100-3 on the APP installed on the vehicle-mounted device 10-1.
FIGS. 11A-11C illustrate a user interface provided on the in-vehicle apparatus 10-1 for setting an association of an application with a display screen.
FIG. 11A illustrates a user interface 300 displayed by the display screen 100-1.
As shown in fig. 11A, the in-vehicle device 10-1 may detect a user operation (e.g., a click operation, a touch operation, a gesture operation, etc.) acting on an icon 901, and in response to the user operation, launch the application (e.g., "setup") corresponding to the icon 901 for managing the applications of the in-vehicle device 10-1, and display a user interface 950 provided by the application.
FIG. 11B illustrates one implementation of a user interface 950 displayed by display screen 100-1. The user interface 950 is used to assist a user in controlling the vehicle 11.
As shown in fig. 11B, the user interface 950 has displayed therein: the in-vehicle application settings bar 951.
The in-vehicle application settings bar 951 may be used by a user to select a display screen associated with an in-vehicle application.
The application column 953 may be used to indicate an Application (APP) displayed on the display screen 100-1. For example, as shown in FIG. 11B, the user may set up an associated display screen for a "navigation" application, a "music" application, a "listening to books" application, a "set of instant messaging" applications, and so forth.
The designated display screen bar 952 may be used to detect a user operation, and in response to the user operation, the in-vehicle device 10-1 may generate the association relationship between the APP and the display screen displaying the APP content.
As shown in fig. 11B, the user may select a display screen associated with the in-vehicle application within the designated display screen bar 952. For example, the in-vehicle device 10-1 may detect a user operation (e.g., a click operation, a touch operation, etc.) acting on the control 954, and in response to the user operation, open an option corresponding to the control 954 for selecting a display screen associated with the "navigation" application, and display a small window 960 provided by the option.
As shown in FIG. 11C, the user may select a display screen associated with the "navigate" application on a widget 960. For example, a user may associate a "navigation" application with display screen 100-1 by clicking on button 961 next to display screen 100-1 on widget 960.
FIG. 12 shows a schematic block diagram of a software architecture interaction based on a distributed operating system.
As shown in FIG. 12, the distributed operating systems respectively installed on the in-vehicle device 10-1 and the in-vehicle device 10-2 adopt a hierarchical architecture. The layered architecture divides the distributed operating system into several layers, each layer having a clear role and division of labor. The layers communicate with each other through a software interface.
Illustratively, the in-vehicle device 10-1 divides the distributed operating system into three layers, namely an application layer 41, a framework layer 42, and a service layer 43, from top to bottom. In other embodiments, the distributed operating system may be divided into other numbers of layers, which is not limited herein.
The application layer 41 may include a series of application programs, such as system applications and extended applications (or third-party applications). The distributed implementation method for multiple applications provided by the present application is suitable for the distributed implementation of applications (including system applications and extended applications) in the application layer 41. The system applications include a desktop, settings, a camera, a wireless local area network (WLAN), bluetooth, navigation, and the like; the extended applications include software applications developed by third parties, such as photo applications (e.g., Meitu, Polarr, etc.), navigation applications 411 (e.g., Amap, Baidu Maps, etc.), and music applications (e.g., Kugou Music, NetEase Cloud Music, etc.).
The framework layer 42 includes a graphic framework 421, an application framework 422, and the like for the application layer.
The service layer 43 is the core of the distributed operating system, and provides services to the application programs in the application layer through the framework layer.
It is understood that the distributed scheduling module 431 of the present application may be deployed in the framework layer 42 or the service layer 43, so as to implement uniform scheduling of the process of calling the peripheral component by each application, independently from each application program of the application layer.
It is to be understood that, in the distributed operating system, the UI framework, the user program framework, and the like in the framework layer and the distributed scheduling module 431 in the system service layer may together form a system basic capability subsystem set, which is not limited herein.
It can be understood that, for specific content of the software architecture of the distributed operating system of the in-vehicle device 10-2, reference is made to the distributed operating system of the in-vehicle device 10-1, and details are not described herein.
For example, when the in-vehicle device 10-1 detects an operation of the user on the "navigation" application displayed on the display screen 100-1, the navigation application 411 of the application layer 41 may send a start instruction for the "navigation" application to the graphic framework 421 of the framework layer 42; the graphic framework 421 sends the start instruction to the application framework 422; the application framework 422 sends the start instruction to the distributed scheduling module 431 of the service layer 43; the distributed scheduling module 431 sends the start instruction to the distributed scheduling module 531; and the distributed scheduling module 531 starts the "navigation" application of the in-vehicle device 10-2 on the display screen 100-2 of the in-vehicle device 10-2 based on the start instruction.
In some other embodiments of the present application, the distributed scheduling module 431 may instead start the "navigation" application on a virtual screen according to the received instruction to start the "navigation" application; the distributed scheduling module 431 then transmits the image data of the "navigation" application displayed on the virtual screen to the distributed scheduling module 531, and the distributed scheduling module 531 may display the image data of the "navigation" application on the display screen 100-2 of the vehicle-mounted device 10-2.
In some other embodiments of the present application, the distributed scheduling module 431 may also send the data packet of the "navigation" application to the distributed scheduling module 531 according to the received instruction to start the "navigation" application, and the distributed scheduling module 531 starts and displays the "navigation" application on the display screen 100-2 of the vehicle-mounted device 10-2 according to the data packet of the "navigation" application.
In some embodiments, the distributed scheduling module 431 includes, but is not limited to, a distributed soft bus. The distributed soft bus may build an intangible bus between the vehicle-mounted device 10-1 and the vehicle-mounted device 10-2, and has characteristics such as automatic discovery, instant connection and use, ad hoc networking (networking of heterogeneous networks), high bandwidth, low latency, and high reliability. In addition, the distributed soft bus can also transmit data or instructions (for example, an instruction to start the "navigation" application, a data packet of the "navigation" application, image data, and the like) between heterogeneous networks such as Bluetooth and wireless fidelity (Wi-Fi).
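The cross-device launch path of FIG. 12 (the distributed scheduling module 431 forwarding a start instruction over the soft bus to the distributed scheduling module 531) can be sketched as below. The class and method names are assumptions for illustration, and the in-process method call stands in for the actual soft-bus transfer over Bluetooth or Wi-Fi.

```python
class DistributedScheduler:
    """Stand-in for a distributed scheduling module (431 or 531)."""
    def __init__(self, device_id, screen_id):
        self.device_id = device_id
        self.screen_id = screen_id   # the display screen this device drives
        self.peer = None             # remote scheduler reached via soft bus
        self.started = []            # record of (app, screen) launches

    def send_start(self, app):
        # Stand-in for sending the start instruction over the soft bus.
        self.peer.on_start_instruction(app)

    def on_start_instruction(self, app):
        # The receiving module starts the application on its own screen.
        self.started.append((app, self.screen_id))

sched_431 = DistributedScheduler("10-1", "display_100_1")
sched_531 = DistributedScheduler("10-2", "display_100_2")
sched_431.peer = sched_531

# Device 10-1 detects the launch and forwards the start instruction, so the
# "navigation" application is started on screen 100-2 of device 10-2.
sched_431.send_start("navigation")
```
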
In the distributed display scenario illustrated in fig. 4, the three display screens on the vehicle 11 may be display screens respectively disposed on the in-vehicle device 10-1, the in-vehicle device 10-2, and the in-vehicle device 10-3. In other embodiments of the present application, the three display screens on the vehicle 11 may be display screens disposed on one vehicle-mounted device.
Fig. 13 is a schematic diagram illustrating another scenario of distributed display according to an embodiment of the present application, where the scenario includes: a vehicle 11; the vehicle 11 is equipped with an in-vehicle device 20, where the in-vehicle device 20 includes n display screens, namely a display screen 200-1, a display screen 200-2, …, and a display screen 200-n. The display screens 200-1 to 200-n may display the same vehicle-mounted application, or may display different vehicle-mounted applications.
In the embodiment of the present application, the operating system running on the in-vehicle apparatus 20 may be any one of the following operating systems: the Hongmeng operating system (Harmony operating system), other operating systems (identified by trademarks in the original filing), a MAC operating system, and the like. For the specific content of the software architecture of the operating system of the vehicle-mounted device 20, reference is made to the distributed operating system of the vehicle-mounted device 10-1, which is not described herein again.
It can be understood that the in-vehicle device 20 may obtain the biometric information of the user and determine the position of the user in the vehicle cabin, and determine on which display screen of the vehicle the application to be started is displayed according to the biometric information of the user and the position of the user in the vehicle cabin, so that the application can be displayed on the display screen desired by the user without additional operation by the user, and user experience is improved. Fig. 14 shows a schematic configuration diagram of the in-vehicle apparatus 10 or the in-vehicle apparatus 20.
The in-vehicle device 10 or the in-vehicle device 20 may include a processor 110, an internal memory 121, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, a sensor module 180, a key 190, a camera 193, a display screen 100, a Subscriber Identification Module (SIM) card interface 195, and the like. Wherein the sensor module 180 may include a fingerprint sensor 180E, a touch sensor 180K, and the like.
It is to be understood that the configuration illustrated in the embodiment of the invention does not constitute a specific limitation to the in-vehicle apparatus 10 or the in-vehicle apparatus 20. In other embodiments of the present application, the in-vehicle apparatus 10 or the in-vehicle apparatus 20 may include more or fewer components than those shown, or combine some components, or split some components, or have a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 110 may include one or more processing units, such as: the processor 110 may include an Application Processor (AP), a modem processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a neural-Network Processing Unit (NPU), etc. The different processing units may be separate devices or may be integrated into one or more processors.
In some embodiments, processor 110 may include one or more interfaces. The interface may include an integrated circuit (I2C) interface, an integrated circuit built-in audio (I2S) interface, a Pulse Code Modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a Mobile Industry Processor Interface (MIPI), a general-purpose input/output (GPIO) interface, a Subscriber Identity Module (SIM) interface, and the like.
The I2C interface is a bidirectional synchronous serial bus including a serial data line (SDA) and a Serial Clock Line (SCL). In some embodiments, processor 110 may include multiple sets of I2C buses. The processor 110 may be coupled to the touch sensor 180K, the camera 193, the display screen 100, etc. through different I2C bus interfaces, respectively. For example: the processor 110 may be coupled to the touch sensor 180K through an I2C interface, so that the processor 110 and the touch sensor 180K communicate through an I2C bus interface to implement the touch function of the in-vehicle device 10 or the in-vehicle device 20.
The I2S interface may be used for audio communication. In some embodiments, processor 110 may include multiple sets of I2S buses. The processor 110 may be coupled to the audio module 170 through an I2S bus to enable communication between the processor 110 and the audio module 170. In some embodiments, the audio module 170 may transmit the audio signal to the wireless communication module 160 through the I2S interface to implement a function of answering a call.
The MIPI interface may be used to connect the processor 110 with peripheral devices such as the display screen 100, the camera 193, and the like. The MIPI interface includes a Camera Serial Interface (CSI), a Display Serial Interface (DSI), and the like. In some embodiments, the processor 110 and the camera 193 communicate through a CSI interface to implement the photographing function of the in-vehicle apparatus 10 or the in-vehicle apparatus 20. The processor 110 and the display screen 100 communicate through the DSI interface to implement the display function of the in-vehicle device 10 or the in-vehicle device 20.
For example, in a case where the processor 110 of the in-vehicle apparatus 20 determines that the "navigation" application on the display screen 100-1 is started and displays the display screen 100-2, the processor 110 of the in-vehicle apparatus 20 may communicate with the display screen 100-2 through the DSI interface to display an image of a display window of the "navigation" application on the display screen 100-2.
The GPIO interface may be configured by software. The GPIO interface may be configured as a control signal and may also be configured as a data signal. In some embodiments, a GPIO interface may be used to connect the processor 110 with the camera 193, the display screen 100, the wireless communication module 160, the audio module 170, the sensor module 180, and the like. The GPIO interface may also be configured as an I2C interface, I2S interface, UART interface, MIPI interface, and the like.
It should be understood that the interface connection relationship between the modules according to the embodiment of the present invention is only illustrative, and does not constitute a structural limitation on the in-vehicle device 10 or the in-vehicle device 20. In other embodiments of the present application, the in-vehicle device 10 or the in-vehicle device 20 may also adopt different interface connection manners or a combination of multiple interface connection manners in the above embodiments.
The wireless communication function of the in-vehicle device 10 or the in-vehicle device 20 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the in-vehicle device 10 or the in-vehicle device 20 may be used to cover a single or a plurality of communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution including wireless communication of 2G/3G/4G/5G, etc. applied to the in-vehicle device 10 or the in-vehicle device 20. The mobile communication module 150 may include at least one filter, a switch, a power amplifier, a Low Noise Amplifier (LNA), and the like. The mobile communication module 150 may receive the electromagnetic wave from the antenna 1, filter, amplify, etc. the received electromagnetic wave, and transmit the electromagnetic wave to the modem processor for demodulation. The mobile communication module 150 may also amplify the signal modulated by the modem processor, and convert the signal into electromagnetic wave through the antenna 1 to radiate the electromagnetic wave. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be provided in the same device as at least some of the modules of the processor 110.
The wireless communication module 160 may provide a solution for wireless communication including Wireless Local Area Networks (WLANs) (e.g., wireless fidelity (Wi-Fi) networks), bluetooth (bluetooth, BT), global Navigation Satellite System (GNSS), frequency Modulation (FM), near Field Communication (NFC), infrared (IR), and the like, which is applied to the in-vehicle device 10 or the in-vehicle device 20. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering processing on electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, perform frequency modulation and amplification on the signal, and convert the signal into electromagnetic waves via the antenna 2 to radiate the electromagnetic waves.
In some embodiments, the antenna 1 of the in-vehicle device 10 or the in-vehicle device 20 is coupled with the mobile communication module 150, and the antenna 2 is coupled with the wireless communication module 160, so that the in-vehicle device 10 or the in-vehicle device 20 can communicate with a network and other devices through a wireless communication technology. The wireless communication technology may include global system for mobile communications (GSM), general Packet Radio Service (GPRS), code division multiple access (code division multiple access, CDMA), wideband Code Division Multiple Access (WCDMA), time-division code division multiple access (time-division code division multiple access, TD-SCDMA), long Term Evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technologies, etc. The GNSS may include a Global Positioning System (GPS), a global navigation satellite system (GLONASS), a beidou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS), and/or a Satellite Based Augmentation System (SBAS).
The display screen 100 is used to display images, video, and the like. The display screen 100 includes a display panel. The display panel may be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the in-vehicle apparatus 10 or the in-vehicle apparatus 20 may include 1 or N display screens 100, N being a positive integer greater than 1. For example, the display screen 100 may display a display window of an application to be launched.
The camera 193 is used to capture still images or video. An object generates an optical image through the lens, which is projected onto the photosensitive element. The photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal and passes it to the ISP, which converts it into a digital image signal. The ISP outputs the digital image signal to the DSP for processing, and the DSP converts the digital image signal into an image signal in a standard format such as RGB or YUV. In some embodiments, the in-vehicle device 10 or the in-vehicle device 20 may include 1 or N cameras 193, N being a positive integer greater than 1. For example, the camera 193 may capture facial information of a user in the vehicle 11.
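The YUV-to-RGB step performed by the DSP can be illustrated with the common BT.601 full-range conversion. This is a generic sketch of that kind of format conversion; the actual coefficients used by any given ISP/DSP may differ, and nothing here is taken from this application:

```python
# Hedged sketch: BT.601 full-range YUV -> RGB conversion for one pixel.
# Illustrates the format conversion described above; coefficients are the
# common BT.601 values, not necessarily those used by the device here.

def yuv_to_rgb(y, u, v):
    r = y + 1.402 * (v - 128)
    g = y - 0.344136 * (u - 128) - 0.714136 * (v - 128)
    b = y + 1.772 * (u - 128)
    clamp = lambda x: max(0, min(255, round(x)))  # keep values in 8-bit range
    return clamp(r), clamp(g), clamp(b)

print(yuv_to_rgb(128, 128, 128))  # mid-gray -> (128, 128, 128)
```

With neutral chroma (U = V = 128) the conversion reduces to R = G = B = Y, which is a quick sanity check on the coefficients.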
The internal memory 121 may be used to store computer-executable program code, which includes instructions. The internal memory 121 may include a program storage area and a data storage area. The program storage area may store an operating system and an application program required by at least one function (such as a sound playing function or an image playing function). The data storage area may store data (such as audio data or a phone book) created during use of the in-vehicle device 10 or the in-vehicle device 20. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, or a universal flash storage (UFS). The processor 110 performs various functional applications and data processing of the in-vehicle device 10 or the in-vehicle device 20 by executing instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor.
The in-vehicle device 10 or the in-vehicle device 20 may implement audio functions, such as music playing and recording, through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the headset interface 170D, the application processor, and the like.
The audio module 170 is used to convert digital audio information into an analog audio signal for output, and to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or some functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also called a "horn", is used to convert the audio electrical signal into an acoustic signal. The in-vehicle apparatus 10 or the in-vehicle apparatus 20 may listen to music or listen to a handsfree call through the speaker 170A.
The microphone 170C, also referred to as a "mike," is used to convert sound signals into electrical signals. When making a call or sending voice information, the user can input a sound signal by speaking close to the microphone 170C. The in-vehicle device 10 or the in-vehicle device 20 may be provided with at least one microphone 170C. In other embodiments, the in-vehicle device 10 or the in-vehicle device 20 may be provided with two microphones 170C, which may implement a noise reduction function in addition to collecting sound signals. In still other embodiments, three, four, or more microphones 170C may be disposed on the in-vehicle device 10 or the in-vehicle device 20 to collect sound signals, reduce noise, identify sound sources, implement directional recording functions, and so on.
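Sound-source identification with multiple microphones is commonly based on the time difference of arrival (TDOA) between microphone pairs. The following is a minimal illustrative sketch with assumed parameters (48 kHz sample rate, 10 cm microphone spacing, impulse test signals); it is a generic technique, not the method used by the device in this application:

```python
import math

# Minimal TDOA sketch for two microphones: find the lag maximizing
# cross-correlation, then convert it to a direction-of-arrival angle.
# All parameters below are assumed, illustrative values.

def estimate_lag(a, b, max_lag):
    """Return the lag of b relative to a (in samples) that maximizes
    the cross-correlation of the two signals."""
    best_score, best_lag = float("-inf"), 0
    for lag in range(-max_lag, max_lag + 1):
        score = sum(a[i] * b[i + lag]
                    for i in range(len(a)) if 0 <= i + lag < len(b))
        if score > best_score:
            best_score, best_lag = score, lag
    return best_lag

fs, spacing, c = 48000, 0.1, 343.0    # Hz, metres, m/s (assumed)
mic1 = [0.0] * 200; mic1[100] = 1.0   # impulse reaches mic1 first
mic2 = [0.0] * 200; mic2[103] = 1.0   # ...and mic2 three samples later
lag = estimate_lag(mic1, mic2, 10)
angle = math.degrees(math.asin(c * lag / fs / spacing))
print(lag)  # -> 3
```

A 3-sample lag at 48 kHz over a 10 cm baseline corresponds to a source roughly 12 degrees off the array's broadside axis, which is the kind of estimate used to steer directional recording.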
The fingerprint sensor 180E is used to collect a fingerprint. The vehicle-mounted device 10 or the vehicle-mounted device 20 can utilize the collected fingerprint characteristics to realize fingerprint unlocking, access to an application lock, fingerprint photographing, fingerprint incoming call answering and the like.
The touch sensor 180K is also called a "touch panel." The touch sensor 180K may be disposed on the display screen 100, and together they form a touchscreen. The touch sensor 180K is used to detect a touch operation applied on or near it. The touch sensor may pass the detected touch operation to the application processor to determine the touch event type. Visual output related to the touch operation may be provided through the display screen 100. In other embodiments, the touch sensor 180K may be disposed on the surface of the in-vehicle device 10 or the in-vehicle device 20, at a position different from that of the display screen 100.
The keys 190 include a power key, a volume key, and the like. The keys 190 may be mechanical keys or touch keys. The in-vehicle device 10 or the in-vehicle device 20 may receive a key input and generate a key signal input related to user settings and function control of the in-vehicle device 10 or the in-vehicle device 20.
The SIM card interface 195 is used to connect a SIM card. A SIM card can be inserted into or removed from the SIM card interface 195 to connect to or disconnect from the in-vehicle device 10 or the in-vehicle device 20. The in-vehicle device 10 or the in-vehicle device 20 may support 1 or N SIM card interfaces, N being a positive integer greater than 1. The SIM card interface 195 may support a Nano SIM card, a Micro SIM card, a standard SIM card, and the like. Multiple cards may be inserted into the same SIM card interface 195 at the same time, and the types of the cards may be the same or different. The SIM card interface 195 is also compatible with different types of SIM cards and with external memory cards. The in-vehicle device 10 or the in-vehicle device 20 interacts with the network through the SIM card to implement functions such as calls and data communication. In some embodiments, the in-vehicle device 10 or the in-vehicle device 20 uses an eSIM, that is, an embedded SIM card. The eSIM card may be embedded in the in-vehicle device 10 or the in-vehicle device 20 and cannot be separated from it.
Embodiments of the mechanisms disclosed herein may be implemented in hardware, software, firmware, or a combination of these implementations. Embodiments of the application may be implemented as computer programs or program code executing on programmable systems comprising at least one processor, a storage system (including volatile and non-volatile memory and/or storage elements), at least one input device, and at least one output device.
Program code may be applied to input instructions to perform the functions described herein and generate output information. The output information may be applied to one or more output devices in a known manner. For purposes of this application, a processing system includes any system having a processor such as, for example, a digital signal processor (DSP), a microcontroller, an application-specific integrated circuit (ASIC), or a microprocessor.
The program code may be implemented in a high-level procedural or object-oriented programming language to communicate with a processing system. Program code may also be implemented in assembly or machine language, if desired. Indeed, the mechanisms described in this application are not limited in scope to any particular programming language. In any case, the language may be a compiled or interpreted language.
In some cases, the disclosed embodiments may be implemented in hardware, firmware, software, or any combination thereof. The disclosed embodiments may also be implemented as instructions carried by or stored on one or more transitory or non-transitory machine-readable (e.g., computer-readable) storage media, which may be read and executed by one or more processors. For example, the instructions may be distributed via a network or via other computer-readable media. Thus, a machine-readable medium may include any mechanism for storing or transmitting information in a form readable by a machine (e.g., a computer), including, but not limited to, floppy diskettes, optical disks, compact disc read-only memories (CD-ROMs), magneto-optical disks, read-only memories (ROMs), random access memories (RAMs), erasable programmable read-only memories (EPROMs), electrically erasable programmable read-only memories (EEPROMs), magnetic or optical cards, flash memory, or a tangible machine-readable memory used to transmit information over the Internet in an electrical, optical, acoustical, or other form of propagated signal (e.g., carrier waves, infrared signals, or digital signals). Thus, a machine-readable medium includes any type of machine-readable medium suitable for storing or transmitting electronic instructions or information in a form readable by a machine (e.g., a computer).
In the drawings, some features of the structures or methods may be shown in a particular arrangement and/or order. However, it is to be understood that such specific arrangement and/or ordering may not be required. Rather, in some embodiments, the features may be arranged in a manner and/or order different from that shown in the illustrative figures. In addition, the inclusion of a structural or methodical feature in a particular figure is not meant to imply that such feature is required in all embodiments, and in some embodiments, may not be included or may be combined with other features.
It should be noted that, in the apparatus embodiments of the present application, each unit/module is a logical unit/module. Physically, one logical unit/module may be one physical unit/module, may be a part of one physical unit/module, or may be implemented by a combination of multiple physical units/modules; the physical implementation of the logical units/modules themselves is not what matters most, and the combination of functions implemented by these logical units/modules is the key to solving the technical problem addressed by the present application. Furthermore, in order to highlight the innovative part of the present application, the above apparatus embodiments do not introduce units/modules that are less closely related to solving the technical problem presented in this application; this does not mean that no other units/modules exist in the above apparatus embodiments.
It is noted that, in the examples and descriptions of this patent, relational terms such as first and second are used solely to distinguish one entity or action from another, and do not necessarily require or imply any actual relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a" does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
While the present application has been shown and described with reference to certain preferred embodiments thereof, it will be understood by those of ordinary skill in the art that various changes in form and details may be made therein without departing from the spirit and scope of the present application.

Claims (11)

1. A distributed display method for at least one in-vehicle device in a vehicle, characterized in that the at least one in-vehicle device controls a plurality of display screens in the vehicle; and
The method comprises the following steps:
detecting a starting instruction of a user for a first application;
obtaining first biometric information associated with the first application, and obtaining at least one piece of second biometric information of users at at least one of the plurality of display screens; and
in a case where second biometric information matching the first biometric information exists in the at least one piece of second biometric information, displaying the launched first application on the display screen at which the user of the second biometric information matching the first biometric information is located.
2. The method of claim 1, further comprising:
in a case where no second biometric information matching the first biometric information exists in the at least one piece of second biometric information, displaying the launched first application, according to a detected operation instruction of a user, on at least one display screen responding to the operation instruction.
3. The method of claim 1, wherein the at least one in-vehicle device comprises a first in-vehicle device and a second in-vehicle device, and the plurality of display screens comprise a first display screen controlled by the first in-vehicle device and a second display screen controlled by the second in-vehicle device;
the starting instruction is generated by the first in-vehicle device in response to the user's selection of a first application icon of the first application on the first display screen; and
the display screen at which the user of the second biometric information matching the first biometric information is located is the second display screen.
4. The method of claim 3, wherein obtaining at least one piece of second biometric information of users at at least one of the plurality of display screens comprises:
the first in-vehicle device obtains, from the second in-vehicle device, second biometric information of the user at the second display screen acquired by the second in-vehicle device.
5. The method according to claim 3, wherein displaying the launched first application on the display screen at which the user of the second biometric information matching the first biometric information is located comprises:
the first vehicle-mounted device sends information of the first application to the second vehicle-mounted device;
and the second vehicle-mounted equipment displays the started first application on the second display screen according to the information of the first application.
6. The method of claim 1, wherein the at least one in-vehicle device comprises a third in-vehicle device, and the plurality of display screens comprise a third display screen and a fourth display screen controlled by the third in-vehicle device;
the starting instruction is generated by the third in-vehicle device in response to the user's selection of a first application icon of the first application on the third display screen; and
the display screen at which the user of the second biometric information matching the first biometric information is located is the fourth display screen.
7. The method of claim 6, wherein obtaining at least one piece of second biometric information of users at at least one of the plurality of display screens comprises:
the third in-vehicle device acquiring the second biometric information of the user at the fourth display screen.
8. The method according to claim 6, wherein displaying the launched first application on the display screen at which the user of the second biometric information matching the first biometric information is located comprises:
the third in-vehicle device displaying the launched first application on the fourth display screen.
9. The method of claim 1, wherein the first biometric information and the second biometric information comprise at least one of: face information, voiceprint information, fingerprint information, and iris information.
10. A readable medium, characterized in that the readable medium has stored thereon instructions that, when executed on an in-vehicle device, cause the in-vehicle device to perform the distributed display method according to any one of claims 1 to 9.
11. An in-vehicle device, characterized by comprising:
a memory for storing instructions to be executed by one or more processors of the in-vehicle device; and
a processor, being one of the processors of the in-vehicle device, for performing the distributed display method of any one of claims 1 to 9.
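The selection logic of claims 1 and 2 can be sketched as follows. This is a hypothetical illustration only: the function and variable names are invented, and biometric matching is reduced to plain equality, whereas a real implementation would compare face, voiceprint, fingerprint, or iris features as listed in claim 9:

```python
# Illustrative, non-limiting sketch of the display-selection logic of
# claims 1 and 2. All names are hypothetical; equality stands in for a
# real biometric-matching step.

def select_display(first_bio, screen_bios, fallback_screen):
    """Return the screen whose user's biometric matches first_bio.

    screen_bios maps screen id -> second biometric info captured at that
    screen; fallback_screen is the screen chosen by an explicit user
    operation (claim 2) when no match exists.
    """
    for screen_id, second_bio in screen_bios.items():
        if second_bio == first_bio:   # claim 1: matching user found
            return screen_id
    return fallback_screen            # claim 2: no match, follow user input

# Example: the user who launched the app is seated at the rear screen.
screens = {"driver": "face_A", "rear_left": "face_B"}
print(select_display("face_B", screens, "driver"))  # -> rear_left
```

The point of the sketch is the control flow: the application follows the matching user's seat automatically, and only falls back to an explicit user operation when no biometric match is found.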
CN202111005585.2A 2021-08-30 2021-08-30 Distributed display method, readable medium and vehicle-mounted device Pending CN115904283A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111005585.2A CN115904283A (en) 2021-08-30 2021-08-30 Distributed display method, readable medium and vehicle-mounted device


Publications (1)

Publication Number Publication Date
CN115904283A true CN115904283A (en) 2023-04-04

Family

ID=86476597

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111005585.2A Pending CN115904283A (en) 2021-08-30 2021-08-30 Distributed display method, readable medium and vehicle-mounted device

Country Status (1)

Country Link
CN (1) CN115904283A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination