CN117075834A - Multisystem fusion display method applied to automobile - Google Patents

Multisystem fusion display method applied to automobile

Info

Publication number
CN117075834A
Authority
CN
China
Prior art keywords
operating system
display
screen
layer
touch
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311118051.XA
Other languages
Chinese (zh)
Inventor
肖文平
刘进朝
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Hinge Electronic Technologies Co Ltd
Original Assignee
Shanghai Hinge Electronic Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Hinge Electronic Technologies Co Ltd filed Critical Shanghai Hinge Electronic Technologies Co Ltd
Priority to CN202311118051.XA priority Critical patent/CN117075834A/en
Publication of CN117075834A publication Critical patent/CN117075834A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14Digital output to display device ; Cooperation and interconnection of the display device with other functional units
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04886Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus

Abstract

The application provides a multi-system fusion display method applied to an automobile, which at least comprises: a first operating system and a second operating system, wherein the first operating system serves as the main system and the second operating system runs on the first operating system in a virtualization mode; the first operating system is a real-time operating system with a high security level; the second operating system sends second data to be displayed to the first operating system through a frame buffer; and the first operating system performs fusion processing on first data to be displayed and the second data from the second operating system, and outputs the result to the display screen through an I/O interface for display. The technical scheme provided by the application can display a plurality of systems on a single display, saving hardware cost. The displays of different systems serve as different layers, and the UI displays of the different systems are spliced together through superposition of the layers and adjustment of the layer sizes, so that the multiple systems share the same screen.

Description

Multisystem fusion display method applied to automobile
Technical Field
The application relates to the field of intelligent automobile displays, in particular to a multi-system fusion display method applied to an automobile.
Background
With the continuous development of automobile technology in recent years, users' demands for the intelligence, automation and networking of automobiles keep increasing. Undoubtedly, automobiles with a good user experience will win the favor of purchasing users, forcing automobile manufacturers to continually add new services to meet user needs while also ensuring safety.
In existing automobiles, safety must come first, so the operating system installed in the vehicle must have good stability and real-time performance. However, the underlying design of such a safety-oriented system is usually rather rigid and its low-level permissions are restricted, so a user cannot install entertainment software in it, which degrades the user experience. To solve this problem, the prior art adopts a dual-system configuration during cockpit development, but the two systems are independent of each other and are connected to different display screens. Because the two systems are independent, interaction becomes very inconvenient whenever the vehicle's instrument system needs to interact with the entertainment system, and the required hardware and software costs increase.
Disclosure of Invention
In view of one of the defects existing in the prior art, the application provides a multi-system fusion display method applied to an automobile, which at least comprises:
a first operating system and a second operating system, wherein the first operating system serves as the main system, and the second operating system runs on the first operating system in a virtualization mode;
the first operating system is a real-time operating system with high security level;
the second operating system sends the second data to be displayed to the first operating system through the frame buffer;
the first operating system outputs the first data to be displayed and the second data from the second operating system to the display screen for display through the I/O interface after fusion processing.
The multi-system fusion display method applied to the automobile is further characterized in that, optionally, a screen layer management module is arranged in the first operating system, and the layers are managed and controlled through the screen layer management module;
the management and control of the layers by the screen layer management module specifically comprises: starting a first screen process through a first screen file; the first screen process connects to a screen management library, the screen management library loads the display configuration file of the system, calls the underlying driver according to the configuration file to create the corresponding display windows, and manages the created windows.
The method for multi-system fusion display applied to the automobile further comprises the optional step of setting a layer switching module in the first operating system, wherein the layer switching module is used for monitoring a screen layer management module, acquiring state information of each layer in the first operating system and/or the second operating system, and then calling a screen layer management module interface to complete layer switching according to switching requests from application programs in the first operating system and/or the second operating system, and the layer display sequence depends on the allocated layer numbers.
The multi-system fusion display method applied to the automobile is further characterized in that a plurality of different layers are arranged in a first operating system and a second operating system respectively, and images to be displayed are spliced, fused and output to the same display screen for display through superposition of the layers and adjustment of the sizes of the layers.
The multi-system fusion display method applied to the automobile is further characterized in that, optionally, according to an operation event transmitted by a keyboard, a mouse or a touch screen, the information of the layer currently displayed on top, or the display area in which that layer is located, is judged, and the operation event is then transmitted to the corresponding operating system for processing.
The multi-system fusion display method applied to the automobile is further characterized in that, optionally, when the display screen is a touch screen, the touch screen driver runs in the first operating system; a corresponding file node is created on a preset path through the touch screen driver for upper-layer applications to access, and the touch information is obtained through the file node, the touch information at least comprising screen coordinate information and the touch type.
The multi-system fusion display method applied to the automobile is further characterized in that, optionally, when the first operating system and the second operating system are fused on the display interface, the layer interface of the first operating system is displayed full screen, and the second operating system displays only the interface portion of a container storing application programs; the layer formed when an application program is displayed in the second operating system covers the layer formed by the first operating system, and the boundary of the layer of the first operating system is larger than the boundary of the layer of the second operating system.
Further optionally, the touch event processing specifically includes:
monitoring file nodes created by the touch driver;
when a touch event occurs, judging whether the touch range exceeds the interface range of the second operating system;
if the touch range is not beyond the interface range of the second operating system, judging whether the interface where the touch range is located is a container interface of the second operating system, if the touch range is not located in the container interface of the second operating system, handing the touch event to the second operating system for processing, and if the touch range is located in the container interface of the second operating system, judging the touch type of the current touch event, and calling different processes for processing according to the touch type;
if the touch event exceeds the interface range of the second operating system, judging whether the first operating system and/or the second operating system are/is needed to process at present, and if so, sending the touch event to the corresponding operating system to process.
Further optionally, the display mode of the display screen includes: a first display mode, the composition_on_guest mode, in which the data generated by the UI in the second operating system is composed by the rendering process at the second operating system end and then sent to the first operating system for display;
or a second display mode, the composition_on_host mode, in which the second operating system hands the UI-generated data to the first operating system, which renders it into the final display.
Further optionally, the first operating system is a QNX system, the second operating system is an Android system, the virtualization mode is a Hypervisor, the Hypervisor operates on the QNX system, and the Android system operates on the Hypervisor.
Further optionally, the communication between the QNX system and the Android system relies on a CM module; the underlying layer communicates through the TCP/UDP protocol, the physical transmission is based on the virtual network card created by the QVM, and subsequent key events are also transmitted through the CM module.
The beneficial effects are that:
the technical scheme provided by the application has the following advantages:
and the display of a plurality of systems can be realized by only one display physically, so that the hardware cost is saved. The display of different systems serves as different layers, and the UI displays of different systems can be spliced together through superposition of the layers and adjustment of the sizes of the layers, so that the aim that multiple systems share the same screen is fulfilled. The touch event transmitted from the touch screen can determine to transmit the current touch event to the system for processing by judging the information of the current display layer at the uppermost layer or the display area where the layer is located, so that a plurality of systems share one set of touch screen and touch drive. The real-time system with high security level is used as Host to coexist with the non-real-time system with strong function but weak stability as Guest, if the Guest system exits due to abnormality, the Host system can still operate normally and be displayed on the screen. Compared with the traditional scheme that each system monopolizes one screen, a plurality of systems are fused into one screen, even some businesses can be fused into one UI interface, the operation is more flexible and convenient for users, and the fusion of the UI saves a great deal of repeated development work (for example, two systems can share status bars, navigation bars, system settings and the like).
Drawings
The following drawings are only illustrative of the application and do not limit the scope of the application.
FIG. 1 is a diagram of the software and hardware architecture used in dual-system fusion according to an embodiment of the present application.
Fig. 2 is a software and hardware architecture used in the three-system fusion according to an embodiment of the present application.
Fig. 3 is a schematic structural diagram of each system module in dual-system fusion according to an embodiment of the present application.
FIG. 4 is a block diagram of a dual system shared display screen according to an embodiment of the present application.
Fig. 5 is a diagram illustrating a logical structure of layer switching implementation in a dual system fusion process according to an embodiment of the present application.
FIG. 6 is a flow chart illustrating a touch event processing procedure in dual system fusion according to an embodiment of the present application.
Detailed Description
For a clearer understanding of the technical features, objects and effects herein, a detailed description of the present application will now be made with reference to the accompanying drawings in which like reference numerals refer to like parts throughout the various views. For simplicity of the drawing, the figures schematically show portions relevant to the present application and do not represent the actual structure thereof as a product. In addition, for simplicity and ease of understanding, components having the same structure or function in some of the figures are shown schematically only one of them, or only one of them is labeled.
With respect to control systems, functional blocks, applications (APP), etc. are well known to those skilled in the art and may take any suitable form, either hardware or software, as well as a plurality of functional blocks disposed discretely, or as a plurality of functional units integrated into one piece of hardware. In its simplest form, the control system may be a controller, such as a combinational logic controller, a micro-programmed controller, etc., provided that the described operations of the present application can be implemented. Of course, the control system may also be integrated as a different module into one physical device, without departing from the basic principle and scope of the application.
"connected" in the present application may include a direct connection, or may include an indirect connection, a communication connection, or an electrical connection, unless specifically indicated otherwise.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the disclosure. As used herein, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms "comprises" and/or "comprising," when used in this specification, specify the presence of stated features, values, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, values, steps, operations, elements, components, and/or groups thereof. As used herein, the term "and/or" includes any and all combinations of one or more of the associated listed items
It should be understood that the term "vehicle" or "vehicular" or other similar terms as used herein generally include motor vehicles, such as passenger automobiles including Sport Utility Vehicles (SUVs), buses, trucks, various commercial vehicles, watercraft including various boats, ships, aircraft, etc., and include hybrid vehicles, electric vehicles, plug-in hybrid electric vehicles, hydrogen-powered vehicles, and other alternative fuel vehicles (e.g., fuels derived from non-petroleum sources of energy). As referred to herein, a hybrid vehicle is a vehicle having two or more power sources, such as a vehicle that is both gasoline powered and electric powered.
Furthermore, the controller of the present disclosure may be embodied as a non-transitory computer readable medium containing executable program instructions for execution by a processor, a controller, or the like. Examples of computer readable media include, but are not limited to, ROM, RAM, Compact Disc (CD)-ROM, magnetic tape, floppy disks, flash drives, smart cards, and optical data storage devices. The computer readable recording medium can also be distributed over network-coupled computer systems so that it is stored and executed in a distributed fashion, for example, by a telematics server or a Controller Area Network (CAN).
The application provides a multisystem fusion display method applied to an automobile, which is applied to an intelligent cabin, and particularly refers to fig. 1 to 6.
Specifically, in an existing intelligent cabin, the central control unit and the instrument cluster use two different systems. Because the instrument system relates to driving safety, such as control of the accelerator, brake and acceleration, a safe real-time system is used at the instrument end, for example a customized Linux system or a QNX system. At the central control end, human-computer interaction and in-vehicle entertainment are needed, so more optional systems such as Apple CarPlay or an Android system can be used. However, the two systems are independent of each other and their displays are also independent, being shown on different display screens, so interaction between the two systems is difficult or requires additional software and hardware cost, and the user experience is poor. For driving safety, the navigation of the central control screen sometimes needs to be projected onto the instrument end for display, or the instrument end is used to display some important information from the central control screen; this is obviously difficult to realize with two independent systems and independent display screens, and picture delay and stuttering may occur, affecting the user experience.
In particular, the present application proposes a solution that allows two systems to share one output display screen and to implement independent operation of the two systems on one set of hardware.
Specifically, as shown in fig. 1 and fig. 2, a first operating system is run on the underlying hardware, and in the first operating system, a virtualization module is run, such as: hypervisor;
a second operating system, or more operating systems, is run on the virtualization module.
For example, the first operating system can be a QNX real-time operating system; QNX has good real-time performance, safety and stability, and is widely applied to instrument systems;
taking the QNX system as the main system, a virtualization module program such as a Hypervisor is installed in the main system, and a second operating system, which can be an Android system, is run in the virtualization module program;
the second operating system and the first operating system communicate in a shared memory mode;
the Android system has more ecological applications, but has lower safety, and when a user crashes or collapses the second operating system due to upgrading or improper operation, the operation of the first operating system with safe design is not affected.
Specifically, the display fusion method provided in this embodiment is applicable to not only dual systems but also multiple systems.
For example, the third operating system may be CarPlay, which is suitable for human-machine interaction with a user's mobile terminal running an Apple system.
Specifically, the multi-system fusion display method applied to the automobile at least comprises the following specific steps:
a first operating system and a second operating system are provided, wherein the first operating system serves as the main system, and the second operating system runs on the first operating system in a virtualization mode;
the first operating system is a real-time operating system with high security level;
the second operating system sends the second data to be displayed to the first operating system through the frame buffer;
the first operating system performs fusion processing on the first data to be displayed and the second data from the second operating system, and outputs the result to the display screen through the I/O interface for display.
Specifically, as shown in fig. 3, the first operating system is a QNX main system, the second operating system is an Android slave system, and the whole scheme is based on the QNX Hypervisor, wherein QNX serves as the main-system end, and Android is started through a QVM process of the main system and serves as the slave-system end.
Display hardware, a virtual display driver and a screen layer management module are arranged in the QNX main system, together with a second display processing process that handles the display data from the Android slave system and a first display processing process that handles the data to be displayed by the QNX main system;
in order to adapt the Android system to the QNX system, display data generated by an Android APP is called and processed through the DRM framework API in the system, then sent to the Virtual Display Driver, processed by the Virtual Display Driver, written into a created Linux FB frame buffer, and then sent to the second display processing process in QNX.
Interaction of display commands and display data is achieved between the first operating system and the second operating system through a shared memory technology.
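As an illustrative sketch of this path (a sketch only, assuming the standard Linux fbdev interface; the node name /dev/fb0 and the clearing of the buffer are assumptions, not the actual virtual display driver of this application), the guest side can write a composed frame into such a frame buffer as follows:

#include <fcntl.h>
#include <linux/fb.h>
#include <stdint.h>
#include <stdio.h>
#include <string.h>
#include <sys/ioctl.h>
#include <sys/mman.h>
#include <unistd.h>

int main(void)
{
    int fd = open("/dev/fb0", O_RDWR);   /* node name is an assumption */
    if (fd < 0) { perror("open"); return 1; }

    /* query the frame buffer geometry exported by the display driver */
    struct fb_var_screeninfo vinfo;
    if (ioctl(fd, FBIOGET_VSCREENINFO, &vinfo) < 0) { perror("ioctl"); return 1; }

    size_t bytes = (size_t)vinfo.xres_virtual * vinfo.yres_virtual * vinfo.bits_per_pixel / 8;
    uint8_t *fb = mmap(NULL, bytes, PROT_READ | PROT_WRITE, MAP_SHARED, fd, 0);
    if (fb == MAP_FAILED) { perror("mmap"); close(fd); return 1; }

    memset(fb, 0x00, bytes);   /* a real renderer would copy the composed UI frame here */

    munmap(fb, bytes);
    close(fd);
    return 0;
}

The essential point is that the frame buffer is a memory-mapped region, which matches the shared-memory interaction described above: the host-side display processing process consumes the frames the guest writes into it.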
Specifically, in order to achieve image fusion from different independently running operating systems, the present embodiment divides the display area in fusion, displays the UI graphical interface of the first operating system in the top area 100 and the bottom area 106 of the display screen, displays the image interface from the UI of the second operating system in the middle area 102 of the display screen, and divides a block area 104 in the middle area 102 for displaying a plurality of application programs installed in the second operating system using a dock container.
Specifically, the display screen fusion scheme comprises the steps of setting a plurality of different layers in a first operating system and a second operating system respectively, and carrying out splicing fusion on images to be displayed and outputting the images to the same display screen for display through superposition of the layers and adjustment of the sizes of the layers.
Specifically, a screen layer management module is arranged in the first operating system, and the layers are managed and controlled through the screen layer management module;
the management and control of the layers by the screen layer management module specifically comprises: starting a first screen process through a first screen file; the first screen process connects to a screen management library, the screen management library loads the display configuration file of the system, calls the underlying driver according to the configuration file to create the corresponding display windows, and manages the created windows.
When the first operating system is QNX, all layers are managed by the screen layer management module, the Screen Manager, at the QNX end;
this Screen Manager corresponds to the libscreen library of the QNX Hypervisor system,
and a screen process is started through the screen file; the screen process connects to the libscreen library of the system, the libscreen library loads the display configuration file of the system, and the underlying driver is called according to the configuration file to create the corresponding display windows.
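A minimal sketch of the kind of work this step performs, assuming the QNX Screen API, is shown below; the window size, format, z-order value and buffer count are illustrative, and attaching the window to a specific display ID is omitted:

#include <screen/screen.h>
#include <stdio.h>

int main(void)
{
    screen_context_t ctx;
    screen_window_t win;

    /* a window-manager context is used because the module manages other windows */
    if (screen_create_context(&ctx, SCREEN_WINDOW_MANAGER_CONTEXT) != 0) {
        perror("screen_create_context");
        return 1;
    }
    screen_create_window(&win, ctx);

    int fmt = SCREEN_FORMAT_RGBA8888;     /* color mode rgba8888 */
    int size[2] = { 1920, 1080 };         /* illustrative main-system layer size */
    int pos[2] = { 0, 0 };
    int zorder = 5;                       /* illustrative layer number */

    screen_set_window_property_iv(win, SCREEN_PROPERTY_FORMAT, &fmt);
    screen_set_window_property_iv(win, SCREEN_PROPERTY_SIZE, size);
    screen_set_window_property_iv(win, SCREEN_PROPERTY_POSITION, pos);
    screen_set_window_property_iv(win, SCREEN_PROPERTY_ZORDER, &zorder);

    screen_create_window_buffers(win, 2); /* double buffering */
    screen_flush_context(ctx, SCREEN_WAIT_IDLE);

    screen_destroy_window(win);
    screen_destroy_context(ctx);
    return 0;
}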
In the image composition, when the first display data and the second display data come from the first operating system and the second operating system respectively, the display positions of the image interfaces are configured as follows:
for the image of the main system end, at least the resolution, display ID, color mode and refresh rate of the main-system layer are set;
in the subsystem (such as the second operating system), the same resolution and display ID as the main-system layer are configured, together with the starting coordinates of the displayed position and the rendering mode.
The display modes of the display screen include: a first display mode (composition_on_guest), in which the data generated from the UI in the second operating system is composed by the rendering process at the second operating system end and then sent to the first operating system for display;
or a second display mode (composition_on_host), in which the second operating system hands the UI-generated data to the first operating system, which renders it into the final display.
Specific examples are:
and (3) configuring a main system side layer:
the display screen has a split rate of 1920x1080, a refresh frequency of 60Hz, a corresponding display ID number of 3, a color mode of rgba8888, and the following codes need to be newly added in graphics.
After the configuration is completed, the following command is executed to make the configuration take effect:
screen -c graphics.conf
specifically, after the configuration of the main system side is completed, the layer configuration is performed at the second operating system side:
in this scheme Android is displayed in the above configured display, so ID is 3;
only the middle portion of the screen is occupied, so the starting position needs to be calculated from the height of the status bar and the end position from the height of the navigation bar.
In this scheme, the status bar height is 20 pixels and the navigation bar height is 40 pixels, so the starting position is (x, y) = (0, 20) and the display height is 1080-20-40=1020.
The refresh rate is kept at 60 Hz, the same as the main system end, and the display mode adopts the composition_on_host mode, so composition_on_guest is set to 0.
Regarding the display mode, which mainly concerns the image composition mode, two modes are currently supported. One is the composition_on_guest mode, in which the image data is composed by the guest end itself and the composition result is then presented to the host for display.
The other is the composition_on_host mode, in which the guest hands the UI-generated data to the host, which renders it into the final display. In the window configuration, the rendering pipeline and the layer number must be distinguished from those of the host; here the default layer number is 6 and is then dynamically adjusted according to business requirements. The corresponding configuration needs to be added to the vdev_wfd_android_h3.conf file.
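As a hypothetical illustration only (the key names below are assumptions; the actual vdev-wfd configuration syntax depends on the hypervisor release and is not reproduced here), the parameters described above might be expressed as:

# display ID configured on the main system side
display = 3
# occupy only the middle of the screen: start below the 20-pixel status bar
position = 0,20
# width x height, leaving room for the 40-pixel navigation bar
size = 1920,1020
refresh = 60
# use the composition_on_host mode
composition_on_guest = 0
# default layer number, adjusted dynamically according to business requirements
layer = 6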
This configuration is loaded by the vdev-wfd executable of the QNX Hypervisor; the following command is executed to make the configuration take effect:
vdev-wfd -c vdev_wfd_android_h3.conf -n dis-adr
referring to fig. 5 specifically, a layer switching module is set in the first operating system, where the layer switching module is configured to monitor a screen layer management module, obtain status information of each layer in the first operating system and/or the second operating system, and then call a screen layer management module interface to complete layer switching according to a switching request from an application program in the first operating system and/or the second operating system, where a layer display order depends on an allocated layer number.
When the first operating system and the second operating system are fused on the display interface, the layer interface of the first operating system is displayed full screen, and the second operating system displays only the interface portion of a container storing application programs; the layer formed when an application program is displayed in the second operating system covers the layer formed by the first operating system, and the boundary of the layer of the first operating system is larger than the boundary of the layer of the second operating system.
Specifically, in assigning layer numbers, the implementation may make the layer numbers of the first operating system all larger than those of the second operating system, or vice versa; a layer with a larger number can be set to cover a layer with a smaller number, or the reverse, and through such coverage different regions of different layers can be exposed and fused for display.
As shown in fig. 4, after an application switching request from the main-system QNX application or from the second operating system is sent to the layer switching module, the layer switching module invokes the screen layer management module to set the layers, and sends a feedback signal after the setting is completed.
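A minimal sketch of the switching step, again assuming the QNX Screen API, is given below; the window handles are obtained elsewhere from the screen layer management module, and the layer numbers are illustrative:

#include <screen/screen.h>

static int set_zorder(screen_window_t win, int z)
{
    return screen_set_window_property_iv(win, SCREEN_PROPERTY_ZORDER, &z);
}

/* Raise `front` above `back`, then flush so the new order takes effect. */
int switch_layers(screen_context_t ctx, screen_window_t front, screen_window_t back)
{
    if (set_zorder(front, 10) != 0) return -1;  /* illustrative layer numbers */
    if (set_zorder(back, 5) != 0) return -1;
    return screen_flush_context(ctx, SCREEN_WAIT_IDLE);
}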
After the systems are fused, this embodiment provides the following processing scheme for operation events on the fused interface:
according to an operation event transmitted by the keyboard, the mouse or the touch screen, the information of the layer currently displayed on top, or the display area in which that layer is located, is judged, and it is determined to which operating system the current operation event is transmitted for processing;
when the display screen is a touch screen, the touch screen driver runs in the first operating system; a corresponding file node is created on a preset path through the touch screen driver for upper-layer applications to access, and the touch information is obtained through the file node, the touch information at least comprising screen coordinate information and the touch type;
specific examples are:
Touch driver access:
the touch screen driver operates at the QNX host end and is started through the following commands:
mtouch -r
after the touch driver is started, a corresponding file node is created at the following path for upper-layer applications to access:
/dev/mtouch/touch_display0
the touch information acquired by this node includes:
1) Coordinate information (x, y), range (1920, 1080);
2) Touch type: press (TOUCH), slide (MOVE), release (RELEASE).
Touch event processing flow:
The following describes the event processing flow, taking a page that fuses QNX and Android as an example. In this page, QNX is displayed full screen, and Android displays only the interface portion of the container storing the apps. Because the Android layer number is larger than the QNX layer number, Android can cover the QNX layer, and because the display boundary of the Android layer is smaller than that of the QNX layer, the QNX status bar and navigation bar remain visible, achieving the goal of Android and QNX sharing one UI interface.
When the operation event is triggered by the touch screen, referring to fig. 6, the formed touch event processing specifically includes:
monitoring file nodes created by the touch driver;
when a touch event occurs, judging whether the touch range exceeds the interface range of the second operating system;
if the touch range is not beyond the interface range of the second operating system, judging whether the interface where the touch range is located is a container interface of the second operating system, if the touch range is not located in the container interface of the second operating system, handing the touch event to the second operating system for processing, and if the touch range is located in the container interface of the second operating system, judging the touch type of the current touch event, and calling different processes for processing according to the touch type;
specifically, when the touch type is a sliding event, a switching command is transmitted to a layer switching module to switch the layers; and when the touch type is a touch event, triggering the starting of the APP in the container.
If the touch event exceeds the interface range of the second operating system, judging whether the first operating system and/or the second operating system are/is needed to process at present, and if so, sending the touch event to the corresponding operating system to process.
If QNX processing is needed, the event is sent to QNX; if Android processing is needed, it is sent to Android; and if both are needed, it is sent to both QNX and Android.
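A minimal, self-contained sketch of this area-based dispatch is given below; the region boundaries, the dock rectangle and the forwarding hooks are assumptions for illustration, not the actual implementation:

#include <stdbool.h>
#include <stdio.h>

enum touch_type { TOUCH_PRESS, TOUCH_MOVE, TOUCH_RELEASE };
struct touch_event { int x, y; enum touch_type type; };

/* Assumed layout: 20 px QNX status bar on top, 40 px QNX navigation bar at
 * the bottom, Android in between, dock container in the lower part of the
 * Android area. */
enum { STATUS_H = 20, NAV_H = 40, SCREEN_H = 1080, DOCK_TOP = 900 };

static bool in_android_area(const struct touch_event *e)
{ return e->y >= STATUS_H && e->y < SCREEN_H - NAV_H; }

static bool in_dock_container(const struct touch_event *e)
{ return e->y >= DOCK_TOP && e->y < SCREEN_H - NAV_H; }

/* Hypothetical forwarding hooks, stubbed out for illustration. */
static void forward_to_android(const struct touch_event *e)
{ printf("-> Android (%d,%d)\n", e->x, e->y); }
static void forward_to_qnx(const struct touch_event *e)
{ printf("-> QNX (%d,%d)\n", e->x, e->y); }
static void request_layer_switch(void) { printf("-> layer switch\n"); }
static void launch_container_app(const struct touch_event *e)
{ printf("-> launch app at (%d,%d)\n", e->x, e->y); }

static void dispatch_touch(const struct touch_event *e)
{
    if (in_android_area(e)) {
        if (!in_dock_container(e))
            forward_to_android(e);        /* ordinary Android UI area */
        else if (e->type == TOUCH_MOVE)
            request_layer_switch();       /* slide inside the dock */
        else if (e->type == TOUCH_PRESS)
            launch_container_app(e);      /* tap inside the dock */
    } else {
        forward_to_qnx(e);                /* QNX status/navigation bars */
    }
}

int main(void)
{
    struct touch_event samples[] = {
        { 960, 10,  TOUCH_PRESS },   /* status bar  -> QNX */
        { 960, 500, TOUCH_PRESS },   /* Android area -> Android */
        { 960, 950, TOUCH_MOVE  },   /* dock slide  -> layer switch */
    };
    for (unsigned i = 0; i < sizeof samples / sizeof samples[0]; i++)
        dispatch_touch(&samples[i]);
    return 0;
}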
Specifically, for operations on the navigation bar, for example, clicking the home key or a related key such as light control requires QNX processing;
clicking a navigation shortcut key, wherein the navigation APP is integrated into the second operating system and is responsible for processing by the second operating system;
the QNX system is used for processing the UI display when entering the setting page, but certain equipment specific control is finished by corresponding android, so that the two parties are required to process respectively.
The communication between the QNX system and the Android system relies on the CM module; the underlying layer communicates through the TCP/UDP protocol, the physical transmission is based on the virtual network card created by the QVM, and subsequent key events are also transmitted through the CM module;
the CM module is developed according to the AUTOSAR standard communication module.
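As an illustrative sketch of such a transport (the IP address of the virtual network card, the port and the payload layout are all assumptions), a key event could be forwarded over UDP as follows:

#include <arpa/inet.h>
#include <netinet/in.h>
#include <stdint.h>
#include <stdio.h>
#include <string.h>
#include <sys/socket.h>
#include <unistd.h>

struct key_event { uint32_t code; uint32_t value; };    /* hypothetical payload layout */

int main(void)
{
    int s = socket(AF_INET, SOCK_DGRAM, 0);
    if (s < 0) { perror("socket"); return 1; }

    struct sockaddr_in peer;
    memset(&peer, 0, sizeof peer);
    peer.sin_family = AF_INET;
    peer.sin_port = htons(15000);                        /* assumed port */
    inet_pton(AF_INET, "192.168.1.2", &peer.sin_addr);   /* assumed address of the peer system on the virtual NIC */

    struct key_event ev = { 1, 1 };                      /* hypothetical key code and press value */
    if (sendto(s, &ev, sizeof ev, 0, (struct sockaddr *)&peer, sizeof peer) < 0)
        perror("sendto");

    close(s);
    return 0;
}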
The above is only a preferred embodiment of the present application, and the present application is not limited to the above examples. It will be clear to a person skilled in the art that neither the form nor the manner of adjustment of this embodiment is limited thereto. It will be appreciated that other modifications and variations that can be directly derived or conceived by those skilled in the art without departing from the essential concept of the application are deemed to be within the scope of the present application.

Claims (11)

1. A multi-system fusion display method applied to an automobile, comprising:
the system comprises a first operating system and a second operating system, wherein the first operating system is used as a main system, and the second operating system runs on the first operating system in a virtualization mode;
the first operating system is a real-time operating system with high security level;
the second operating system sends the second data to be displayed to the first operating system through the frame buffer;
the first operating system outputs the first data to be displayed and the second data from the second operating system to the display screen for display through the I/O interface after fusion processing.
2. The multi-system fusion display method applied to the automobile as claimed in claim 1, wherein a screen layer management module is arranged in the first operating system, and the layers are managed and controlled through the screen layer management module;
the management and control of the layers by the screen layer management module specifically comprises the following steps: and starting a first screen process through the first screen file, connecting the first screen process to a screen management file library, loading a display configuration file of a system by the screen management file library, calling a bottom layer driver according to the configuration file to create a corresponding display window, and managing the created window.
3. The method for multi-system fusion display of an automobile according to claim 1, wherein a layer switching module is provided in a first operating system, the layer switching module is configured to monitor a screen layer management module, obtain status information of each layer in the first operating system and/or a second operating system, and then call a screen layer management module interface to complete layer switching according to a switching request from an application program in the first operating system and/or the second operating system, where a layer display order depends on an allocated layer number.
4. The multi-system fusion display method applied to the automobile according to claim 1, wherein a plurality of different layers are respectively arranged in the first operating system and the second operating system, and images to be displayed are spliced, fused and output to the same display screen for display through superposition of the layers and adjustment of the sizes of the layers.
5. The method of claim 1, wherein the information of the layer currently displayed on top, or the display area in which that layer is located, is determined according to the operation event transmitted from the keyboard, the mouse or the touch screen, and it is determined to which operating system the current operation event is transmitted for processing.
6. The method for multi-system fusion display applied to an automobile according to claim 5, wherein when the display screen adopts a touch screen, the touch screen driver operates in the first operating system, a corresponding file node is created on a preset path through the touch screen driver for upper-layer applications to access, and the touch information is obtained through the file node, wherein the touch information at least comprises screen coordinate information and the touch type.
7. The multi-system fusion display method for an automobile according to claim 1, wherein when the first operating system and the second operating system are fused, the layer interface of the first operating system is displayed full screen, the interface part of the container storing the application program is displayed in the second operating system, wherein the layer formed after the application program is displayed in the second operating system covers the layer formed by the first operating system for displaying the application program, and the boundary corresponding to the layer in the first operating system is larger than the boundary corresponding to the layer in the second operating system.
8. The method for multi-system fusion display for an automobile according to claim 5, wherein the touch event processing comprises:
monitoring file nodes created by the touch driver;
when a touch event occurs, judging whether the touch range exceeds the interface range of the second operating system;
if the touch range is not beyond the interface range of the second operating system, judging whether the interface where the touch range is located is a container interface of the second operating system, if the touch range is not located in the container interface of the second operating system, handing the touch event to the second operating system for processing, and if the touch range is located in the container interface of the second operating system, judging the touch type of the current touch event and calling different modules for processing according to the touch type;
if the touch event exceeds the interface range of the second operating system, judging whether the first operating system and/or the second operating system are/is needed to process at present, and if so, sending the touch event to the corresponding operating system to process.
9. The multi-system fusion display method applied to an automobile according to claim 5, wherein the display mode of the display screen comprises: in the first display mode, the data generated by the UI in the second operating system is rendered, processed and synthesized by the second operating system end, and then sent to the first operating system for display;
or a second display mode, in which the second operating system renders the UI-generated data to the first operating system for rendering into a final display.
10. The multi-system fusion display method applied to an automobile according to claim 1, wherein the first operating system is a QNX system, the second operating system is an Android system, the virtualization mode is a Hypervisor, the Hypervisor operates on the QNX system, and the Android system operates on the Hypervisor.
11. The multi-system fusion display method applied to an automobile according to claim 10, wherein the communication between the QNX system and the Android system depends on a CM module, the bottom layer communicates through tcp/udp protocols, the physical transmission is based on a virtual network card created by QVM, and the transmission of subsequent key events depends on the CM module.
CN202311118051.XA 2023-08-31 2023-08-31 Multisystem fusion display method applied to automobile Pending CN117075834A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311118051.XA CN117075834A (en) 2023-08-31 2023-08-31 Multisystem fusion display method applied to automobile

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311118051.XA CN117075834A (en) 2023-08-31 2023-08-31 Multisystem fusion display method applied to automobile

Publications (1)

Publication Number Publication Date
CN117075834A (en) 2023-11-17

Family

ID=88707880

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311118051.XA Pending CN117075834A (en) 2023-08-31 2023-08-31 Multisystem fusion display method applied to automobile

Country Status (1)

Country Link
CN (1) CN117075834A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination