CN115934405B - Detecting multi-system display synchronicity on a display device - Google Patents

Publication number: CN115934405B
Application number: CN202310043119.6A
Authority: CN (China)
Legal status: Active (an assumption, not a legal conclusion; Google has not performed a legal analysis)
Prior art keywords: pixel value, operating system, layer, timestamp
Inventors: 雷金亮, 吴成贵
Original and current assignee: Weilai Automobile Technology Anhui Co Ltd (listed assignees may be inaccurate)
Other versions: CN115934405A (earlier publication, Chinese (zh))
Application filed by Weilai Automobile Technology Anhui Co Ltd; priority to CN202310043119.6A; application granted; publication of CN115934405B.
Abstract

The application relates to detecting the display synchronicity of multiple systems on a display device, the multiple systems comprising a first operating system and a second operating system, where the first operating system provides a virtual machine monitor and the second operating system runs in a virtualized environment provided by that monitor. The method of detecting the display synchronicity of the multiple systems on the display device comprises: determining a timestamp for verification; generating, under the first operating system and the second operating system respectively, a first layer and a second layer for display on the display device and displaying them according to a blending rule, wherein the first pixel value of the first layer and the second pixel value of the second layer at a target coordinate are generated based on the timestamp; intercepting the image displayed on the display device and determining a third pixel value of the intercepted image at the target coordinate; and determining whether the third pixel value is the result of blending, according to the blending rule, the first pixel value and the second pixel value generated based on the timestamp.

Description

Detecting multi-system display synchronicity on a display device
Technical Field
The present application relates to the field of vehicle cabin systems, and in particular to a method of detecting the display synchronicity of multiple systems on a display device, a vehicle cabin system, a vehicle comprising the same, and a computer readable storage medium.
Background
Most existing intelligent cabins adopt a QNX + Android architecture, i.e. a multi-system structure with QNX as the underlying system and Android running as a virtual machine. However, when the Android system serves as the carrier of central-control applications, the display device often suffers problems such as black screens or frozen screens caused by accidental touches or performance issues. Once these problems occur, the system can hardly detect the state and recover automatically unless it is restarted through manual intervention. Such display problems degrade the user experience.
In view of this, there is a need for an improved mechanism for detecting the synchronicity of the display of multiple systems on a display device.
Disclosure of Invention
Embodiments of the present application provide a method for detecting the display synchronicity of multiple systems on a display device, a vehicle cabin system, a vehicle comprising the same, and a computer readable storage medium, for detecting in real time whether multiple systems are displayed on a single screen as expected.
According to one aspect of the present application, a method of detecting display synchronicity of a multi-system on a display device is provided. The multi-system includes a first operating system providing a virtual machine monitor and a second operating system running in a virtualized environment provided by the virtual machine monitor, wherein the method includes the steps of: determining a timestamp for verification; generating a first layer for display on the display device under the first operating system and displaying according to a blending rule, wherein a first pixel value of the first layer at a target coordinate is generated based on the timestamp; generating a second layer for display on the display device under the second operating system and displaying according to the blending rule, wherein a second pixel value of the second layer at the target coordinate is generated based on the timestamp; intercepting an image displayed on the display device and determining a third pixel value of the intercepted image at the target coordinates; and determining whether the third pixel value is a result of the first pixel value and the second pixel value generated based on the timestamp being mixed according to the mixing rule.
In some embodiments of the present application, optionally, the value of the timestamp satisfies T ≤ 2^(3N), where T is the timestamp and N is the bit depth of the first layer and the second layer.
In some embodiments of the present application, optionally, the first pixel value and the second pixel value are generated according to the following formulas: R = (T / 2^N / 2^N) % 2^N, G = (T / 2^N) % 2^N, B = T % 2^N, where R represents the red component value of the first and second pixel values, G represents the green component value of the first and second pixel values, and B represents the blue component value of the first and second pixel values.
In some embodiments of the present application, optionally, the bit depth N is 8.
In some embodiments of the present application, optionally, the blending rule satisfies: {R0, G0, B0} = {R1, G1, B1} × A1 + {R2, G2, B2} × A2, wherein A1 represents the transparency of the first layer and A2 represents the transparency of the second layer; {R1, G1, B1} represents the pixel value at the target coordinate in the first layer, where R1 represents the red component value, G1 the green component value, and B1 the blue component value; {R2, G2, B2} represents the pixel value at the target coordinate in the second layer, where R2 represents the red component value, G2 the green component value, and B2 the blue component value; and {R0, G0, B0} represents the pixel value at the target coordinate after blending, where R0 represents the red component value, G0 the green component value, and B0 the blue component value.
In some embodiments of the present application, optionally, the timestamp is generated under the first operating system according to a timestamp of the processor.
In some embodiments of the present application, optionally, the method further comprises: and after the first layer is generated under the first operating system, the timestamp is sent to the second operating system.
In some embodiments of the present application, optionally, the method further comprises: and generating the second image layer under the second operating system, and then sending a ready signal to the first operating system, wherein the image displayed on the display device is intercepted under the first operating system according to the ready signal.
In some embodiments of the present application, optionally, determining whether the third pixel value is a result of the first pixel value and the second pixel value generated based on the timestamp being mixed according to the mixing rule includes: deriving original pixel values of the third pixel value in each layer prior to blending based on the blending rule; and determining whether the original pixel value was generated based on the timestamp.
In some embodiments of the present application, optionally, the first operating system is QNX and the second operating system is Android.
In some embodiments of the present application, optionally, the method further comprises: restarting the second operating system if it is determined that the third pixel value is not a result of blending according to the blending rule.
According to another aspect of the present application, there is provided a vehicle cabin system, the system comprising: a memory configured to store instructions; and a processor configured to execute the instructions to perform any of the methods as described above.
According to another aspect of the present application there is provided a vehicle comprising any one of the vehicle cabin systems as described above.
According to another aspect of the present application, there is provided a computer readable storage medium having instructions stored therein, which when executed by a processor, cause the processor to perform any one of the methods as described above.
Some embodiments of the present application provide a method of detecting the display synchronicity of multiple systems on a display device, a vehicle cabin system, a vehicle comprising the same, and a computer readable storage medium, which can be used to detect in real time whether multiple systems are displayed on a single screen and to restart an operating system when a failure is detected. Specifically, by means of a time synchronization and image color processing mechanism, the system can detect conditions such as abnormal display and display desynchronization. Actively discovering an abnormal system state and restoring it to normal helps improve system stability.
Drawings
The foregoing and other objects and advantages of the application will be apparent from the following detailed description taken in conjunction with the accompanying drawings in which like or similar elements are designated by the same reference numerals.
FIG. 1 illustrates a method of detecting a multi-system display synchronicity on a display device according to one embodiment of the present application;
FIG. 2 illustrates a method of detecting a multi-system display synchronicity on a display device according to one embodiment of the present application;
fig. 3 illustrates layer blending according to one embodiment of the present application.
Detailed Description
For the purposes of brevity and explanation, the principles of the present application are described herein primarily with reference to exemplary embodiments thereof. However, those skilled in the art will readily recognize that the same principles are equally applicable to, and can be implemented in, all types of methods of detecting the synchronicity of multiple systems on a display device, vehicle cabin systems and vehicles including the same, computer readable storage media, and that any such variations do not depart from the true spirit and scope of the present application.
According to one aspect of the present application, a method of detecting display synchronicity of a multi-system on a display device is provided. The multiple systems share a set of hardware, particularly a display device. For example, the multiple systems may be present on a vehicle cabin system and each be used to implement a different function of the vehicle cabin system. Specifically, the multi-system may include a first operating system and a second operating system, where the first operating system provides a virtual machine monitor and the second operating system runs in a virtualized environment provided by the virtual machine monitor. In some examples, the first operating system may be QNX and the second operating system may be Android.
As shown in FIG. 1, a method 10 for detecting the display synchronicity of multiple systems on a display device (hereinafter, method 10) includes the following steps: in step S102, determining a timestamp for verification; in step S104, generating under the first operating system a first layer for display on the display device and displaying it according to a blending rule, wherein a first pixel value of the first layer at a target coordinate is generated based on the timestamp; in step S106, generating under the second operating system a second layer for display on the display device and displaying it according to the blending rule, wherein a second pixel value of the second layer at the target coordinate is generated based on the timestamp; in step S108, intercepting the image displayed on the display device and determining a third pixel value of the intercepted image at the target coordinate; and in step S110, determining whether the third pixel value is the result of blending, according to the blending rule, the first pixel value and the second pixel value generated based on the timestamp.
Because the first operating system and the second operating system may need to display their respective layers on the display device at the same time to present a complete interface for a given application, synchronization of the two displays is particularly important. Through the above steps, the method 10 can detect whether the displays of two different systems on the same display device are synchronized as expected, preventing display desynchronization caused by problems such as a freeze (hang) of the operating system running in the virtual machine (the second operating system) from affecting the user experience. The operating principle of the individual steps of the method 10 is described in detail below.
The method 10 determines a timestamp for verification in step S102. Using the timestamp as the basis for generating pixel values introduces an element of randomness into each layer's pixels, and thus into the blended pixel. This makes the subsequent verification more robust, preventing false verification results due to coincidence (e.g., both systems are hung yet the pixel under verification happens to satisfy the blending rule).
In some examples, the timestamp is generated under the first operating system from a timestamp of the processor. The first operating system does not run in a virtual machine and is therefore, in theory, more efficient. Moreover, owing to its own characteristics, the operating efficiency of a real-time system such as QNX is in theory superior to that of a general-purpose system such as Android. By implementing in the first operating system those functions of the method 10 that need not run in the second operating system, the efficiency of the verification process can be improved significantly, and the verification is prevented from failing to run because the second operating system has hung.
The method 10 generates a first layer for display on a display device under a first operating system and displays according to a blending rule in step S104, wherein the first layer is generated based on a time stamp for a first pixel value at a target coordinate. Similarly, the method 10 generates a second layer for display on the display device under a second operating system and displays according to a blending rule in step S106, wherein the second layer is generated based on the time stamp for a second pixel value at the target coordinate.
In theory, if the first operating system and the second operating system both work properly, the first pixel value and the second pixel value generated should be identical; the terms "first" and "second" before "pixel value" herein merely distinguish the generating subject. Although it is not specified how the pixel values of the first layer and the second layer other than those at the target coordinate are generated, in some simple examples they may be copies of the first pixel value and the second pixel value, respectively.
In other examples, the first pixel value and the second pixel value may be used only to replace the pixels at the target coordinate in the pictures that the first operating system and the second operating system were originally going to display (e.g., triggered by applications other than the verification application), so as not to interfere with the intended presentation of those pictures. In other words, the test pixel values only replace the pixel value at the target coordinate in the picture to be presented, so the detection process does not disturb the normal work of the first and second operating systems. A user browsing the first layer and the second layer generated in this way will hardly perceive the verification process running in the background.
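The pixel-substitution idea above can be sketched in a few lines of Python. This is an illustrative sketch, not code from the patent; the function name `inject_test_pixel` and the list-of-rows frame representation are hypothetical:

```python
def inject_test_pixel(frame, target_xy, test_pixel):
    """Overwrite only the pixel at the target coordinate so the
    verification pixel rides along with the normal UI frame."""
    x, y = target_xy
    patched = [row[:] for row in frame]   # copy rows; leave the original frame intact
    patched[y][x] = test_pixel
    return patched

frame = [[(0, 0, 0)] * 4 for _ in range(3)]          # a dummy 4x3 all-black frame
out = inject_test_pixel(frame, (2, 1), (18, 52, 86))
print(out[1][2], frame[1][2])                        # (18, 52, 86) (0, 0, 0)
```

Only a single pixel differs from the picture the user expects to see, which is why the background verification is effectively invisible.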
The method 10 intercepts the image displayed on the display device and determines a third pixel value of the intercepted image at the target coordinate in step S108. The intercepted image is used to determine whether the first pixel value and the second pixel value generated by the first and second operating systems were blended as expected. In theory, if both operating systems work normally and generate the first layer and the second layer as planned, the third pixel value will be the result of blending the first pixel value and the second pixel value. Conversely, if either operating system fails to function properly (typically, the second operating system hangs), the first layer and the second layer may not be generated as planned, and the third pixel value in the screenshot may not be the result of the first pixel value and the second pixel value being blended according to the blending rule.
The method 10 determines in step S110 whether the third pixel value is the result of blending, according to the blending rule, the first pixel value and the second pixel value generated based on the timestamp. Specifically, the blending rule can be inverted to deduce whether the blend was performed from the first and second pixel values. For example, in some examples this can be determined as follows: first, the original pixel values of the third pixel value in the respective layers prior to blending are derived based on the blending rule (assuming the first layer and the second layer both produced the same pixel value at the target coordinate, as expected); second, it is determined whether that original pixel value was generated based on the timestamp. This inverse process is the reverse of the blending process, which is described in detail below.
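The back-derivation of step S110 can be illustrated with a minimal Python sketch. It assumes the blending rule {R0, G0, B0} = {R1, G1, B1} × A1 + {R2, G2, B2} × A2 given elsewhere in this application, and that both layers drew the same timestamp-derived pixel (so the pre-blend value is p0 / (A1 + A2) per channel). The function name `verify_pixel` and the rounding tolerance are hypothetical:

```python
def verify_pixel(captured, timestamp, a1, a2, n=8, tol=1):
    """Step S110 sketch: invert the blend, assuming both layers drew the
    same timestamp-derived pixel, then compare with the expected value."""
    base = 1 << n                              # 2^N
    expected = ((timestamp // base // base) % base,
                (timestamp // base) % base,
                timestamp % base)
    # Invert {R0,...} = p*A1 + p*A2  ->  p = p0 / (A1 + A2)
    recovered = tuple(round(c / (a1 + a2)) for c in captured)
    # Allow a small tolerance for rounding introduced during composition
    return all(abs(r - e) <= tol for r, e in zip(recovered, expected))

print(verify_pixel((18, 52, 86), 0x123456, 0.5, 0.5))  # True
print(verify_pixel((0, 0, 0), 0x123456, 0.5, 0.5))     # False
```

A mismatch indicates that at least one layer was not drawn from the current timestamp, i.e. the displays are out of sync.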
In some embodiments of the present application, in order to simplify the generation algorithm of the first pixel value and the second pixel value, the value of the timestamp satisfies T ≤ 2^(3N), where T is the timestamp and N is the bit depth of the first layer and the second layer, which may generally be a bit depth the display device can support. Here 2^(3N) is the maximum value that a pixel of the first layer or the second layer can store. For example, if the bit depth N is 8, then the timestamp satisfies T ≤ 2^24, and the maximum value a pixel of the first layer or the second layer can store is 2^24.
In some embodiments of the present application, the first pixel value and the second pixel value may be generated in steps S104 and S106 according to the following formulas: R = (T / 2^N / 2^N) % 2^N, G = (T / 2^N) % 2^N, B = T % 2^N, where "%" is the modulo (remainder) operator and the division is integer division; R represents the red component value of the first and second pixel values, G the green component value, and B the blue component value. In this way the first pixel value and the second pixel value at the target coordinate can be generated quickly, and the generation algorithm can be implemented with little resource overhead.
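The component formulas above amount to splitting the timestamp into three N-bit fields. A minimal Python sketch with N = 8 follows; the function name is hypothetical:

```python
def encode_timestamp_rgb(t: int, n: int = 8) -> tuple:
    """Split a timestamp T (T <= 2^(3N)) into R, G, B components,
    mirroring R = (T/2^N/2^N) % 2^N, G = (T/2^N) % 2^N, B = T % 2^N."""
    base = 1 << n                     # 2^N, e.g. 256 for 8-bit layers
    r = (t // base // base) % base    # highest N bits
    g = (t // base) % base            # middle N bits
    b = t % base                      # lowest N bits
    return r, g, b

# Example: T = 0x123456 -> R = 0x12, G = 0x34, B = 0x56
print(encode_timestamp_rgb(0x123456))  # (18, 52, 86)
```

Because the mapping is a pure bit-field split, it is trivially invertible, which is what makes the back-derivation in step S110 possible.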
In some embodiments of the present application, the above blending rule satisfies: {R0, G0, B0} = {R1, G1, B1} × A1 + {R2, G2, B2} × A2, wherein A1 represents the transparency of the first layer and A2 represents the transparency of the second layer; {R1, G1, B1} represents the pixel value at the target coordinate in the first layer, where R1 represents the red component value, G1 the green component value, and B1 the blue component value; {R2, G2, B2} represents the pixel value at the target coordinate in the second layer, where R2 represents the red component value, G2 the green component value, and B2 the blue component value; and {R0, G0, B0} represents the pixel value at the target coordinate after blending, where R0 represents the red component value, G0 the green component value, and B0 the blue component value.
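Applied per channel, the blending rule can be sketched as below. This is only an illustration (the name `blend` is hypothetical), and real compositors may differ in details such as premultiplied alpha; note that when A1 + A2 = 1 and both layers carry the same test pixel, the blend returns that pixel unchanged:

```python
def blend(p1, p2, a1, a2):
    """Blend two layers' pixels at the target coordinate:
    {R0,G0,B0} = {R1,G1,B1}*A1 + {R2,G2,B2}*A2, per channel."""
    return tuple(round(c1 * a1 + c2 * a2) for c1, c2 in zip(p1, p2))

# With A1 = A2 = 0.5 and identical test pixels, the blended value equals the test pixel:
print(blend((18, 52, 86), (18, 52, 86), 0.5, 0.5))  # (18, 52, 86)
```

This invariance is what the verification exploits: a stale or missing layer breaks the equality.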
Turning to FIG. 3, a first operating system 31 provides a virtual machine monitor (not shown), and a second operating system 32 may run in a virtualized environment provided by the virtual machine monitor. As described above, the first operating system 31 generates the first layer 311, whose first pixel value at the target coordinate (shown as a black dot on the layer) is generated based on the timestamp. Similarly, the second operating system 32 generates a second layer 321, whose second pixel value at the target coordinate (shown as a black dot on the layer) is also generated based on the timestamp. If the first pixel value is {R1, G1, B1}, the second pixel value is {R2, G2, B2}, and the blending rule presets the transparency of the first layer as A1 and that of the second layer as A2, then the blended pixel value is {R0, G0, B0} = {R1, G1, B1} × A1 + {R2, G2, B2} × A2. This blending rule also shows how the back-derivation in step S110 is implemented.
In some embodiments of the present application, the method 10 further includes the following steps (not shown in the figures): the time stamp is sent to the second operating system after the first layer is generated under the first operating system. In this way, the first operating system will act as the dominant operating system for the detection process, and the detection operation process of the second operating system is triggered by the first operating system sending a timestamp to the second operating system.
In some embodiments of the present application, the method 10 further includes the following steps (not shown in the figures): and generating a second image layer under the second operating system, and then sending a ready signal to the first operating system, wherein the image displayed on the display device is intercepted under the first operating system according to the ready signal.
In some embodiments of the present application, the method 10 further includes the following steps (not shown in the figures): in the event that it is determined that the third pixel value is not a result of blending according to the blending rule, the second operating system is restarted. Based on this, a failure of the second operating system, such as a false death, will be repaired in time.
For purposes of more clearly explaining the principles of the present application, FIG. 2 illustrates in a more complete form a method of detecting the display synchronicity of multiple systems on a display device, it being understood that the example of FIG. 2 should not be considered as an additional limitation to other examples herein (e.g., the corresponding embodiment of FIG. 1).
The method 20 for detecting the display synchronicity of multiple systems on a display device (hereinafter, method 20) comprises the following steps; the detection flow starts at step S201 and ends at step S212. In step S202, the QNX system acquires a timestamp for the detection process. In step S203, the QNX system generates a first pixel value (at the target coordinate) from the timestamp and generates and draws a first layer based on the blending rule. Next, in step S204, the QNX system sends the acquired timestamp to the Android system. In step S205, the Android system determines whether the difference between the timestamp and the current time exceeds a preset value; if it does, the flow returns to step S202 to reacquire a timestamp; if not, the flow proceeds to step S206, where the Android system generates a second pixel value (at the target coordinate) from the timestamp and generates and draws a second layer based on the blending rule. In step S207, the Android system notifies the QNX system that its preparation is complete. In step S208, the QNX system determines whether the difference between the timestamp and the current time exceeds the preset value; if so, the flow returns to step S202 to reacquire a timestamp; if not, it proceeds to step S209, where the QNX system intercepts the image currently displayed on the display device and processes it, including extracting the third pixel value at the target coordinate. In step S210, the QNX system determines, according to the color blending formula (blending rule), whether the third pixel value is the result of blending pixel values that the QNX system and the Android system generated from the timestamp acquired in step S202.
If the determination in step S210 indicates that the third pixel value is the result of blending pixel values that the QNX system and the Android system generated from the timestamp acquired in step S202, the displays are synchronous, and the flow proceeds to step S212 and ends. If the determination indicates otherwise (the pixel values were not blended as expected and/or were not generated from that timestamp), the displays of the two operating systems are out of sync; the flow proceeds to step S211 to trigger the recovery mechanism of the Android system, and then to step S212 to end the detection process.
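The S202–S211 flow can be simulated end to end in a few lines. This is a self-contained Python sketch, not the patent's implementation: a hung Android side is modeled as leaving a stale all-black pixel, and the constants (A1 = A2 = 0.5, N = 8) are illustrative assumptions:

```python
import time

N = 8
BASE = 1 << N          # 2^N
A1, A2 = 0.5, 0.5      # assumed layer transparencies

def encode(t):
    """Timestamp -> (R, G, B), as in steps S203/S206."""
    return ((t // BASE // BASE) % BASE, (t // BASE) % BASE, t % BASE)

def run_check(t, android_alive=True):
    first = encode(t)                                   # S203: QNX test pixel
    second = encode(t) if android_alive else (0, 0, 0)  # S206: Android pixel (or a stale, hung frame)
    captured = tuple(round(c1 * A1 + c2 * A2)           # S209: "screenshot" at the target coordinate
                     for c1, c2 in zip(first, second))
    expected = tuple(round(c * (A1 + A2)) for c in encode(t))
    return captured == expected                         # S210: synchronous?

t = int(time.time()) % (1 << (3 * N))   # S202: timestamp bounded by 2^(3N)
print(run_check(t))                               # True  -> displays are in sync
print(run_check(0x123456, android_alive=False))   # False -> trigger Android recovery (S211)
```

The staleness checks of S205/S208 (comparing the timestamp against the current time) are omitted here for brevity.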
According to another aspect of the present application, there is provided a vehicle cabin system, the system comprising: a memory configured to store instructions; and a processor configured to execute the instructions to perform any of the methods as described above.
According to another aspect of the present application there is provided a vehicle comprising any one of the vehicle cabin systems as described above.
According to another aspect of the present application, there is provided a computer readable storage medium having stored therein instructions which, when executed by a processor, cause the processor to perform any of the methods of detecting the display synchronicity of multiple systems on a display device as described above. Computer-readable media, as referred to in this application, include any type of computer storage media which can be accessed by a general purpose or special purpose computer. For example, the computer-readable medium may include RAM, ROM, EPROM, E²PROM (EEPROM), registers, a hard disk, a removable disk, a CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other transitory or non-transitory medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a general purpose or special purpose computer or processor. A disk, as used herein, usually reproduces data magnetically, while a disc reproduces data optically with lasers. Combinations of the above should also be included within the scope of computer-readable media. An exemplary storage medium is coupled to the processor such that the processor can read information from, and write information to, the storage medium. In the alternative, the storage medium may be integral to the processor. The processor and the storage medium may reside in an ASIC. The ASIC may reside in a user terminal. In the alternative, the processor and the storage medium may reside as discrete components in a user terminal.
The foregoing is merely a specific embodiment of the present application, and the scope of the present application is not limited thereto. Other variations or substitutions that occur to those skilled in the art from the teachings disclosed herein are intended to fall within the scope of the present application. Where no conflict arises, the embodiments of the present application and the features of the embodiments may be combined with each other. The protection scope of the present application is governed by the claims.

Claims (14)

1. A method of detecting display synchronicity of a multisystem on a display device, the multisystem comprising a first operating system and a second operating system, the first operating system providing a virtual machine monitor and the second operating system running in a virtualized environment provided by the virtual machine monitor, wherein the method comprises:
determining a timestamp for verification;
generating a first layer for display on the display device under the first operating system and displaying according to a blending rule, wherein a first pixel value of the first layer at a target coordinate is generated based on the timestamp;
generating a second layer for display on the display device under the second operating system and displaying according to the blending rule, wherein a second pixel value of the second layer at the target coordinate is generated based on the timestamp;
intercepting an image displayed on the display device and determining a third pixel value of the intercepted image at the target coordinates; and
determining whether the third pixel value is a result of the first pixel value and the second pixel value generated based on the timestamp being mixed according to the mixing rule.
2. The method of claim 1, wherein the timestamp has a value that satisfies:
T ≤ 2^(3N), wherein
T is the timestamp and N is the bit depth of the first layer and the second layer.
3. The method of claim 2, wherein the first pixel value and the second pixel value are generated according to:
R = (T / 2^N / 2^N) % 2^N,
G = (T / 2^N) % 2^N,
B = T % 2^N, wherein
R represents a red component value of the first pixel value and the second pixel value, G represents a green component value of the first pixel value and the second pixel value, and B represents a blue component value of the first pixel value and the second pixel value.
4. The method of claim 2, wherein the bit depth N is 8.
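The component computation of claim 3, combined with the value constraint of claim 2 and the bit depth N = 8 of claim 4, can be sketched as follows (an illustrative rendering only; the function name and Python form are not part of the patent text):

```python
def timestamp_to_rgb(t: int, n: int = 8) -> tuple[int, int, int]:
    """Split timestamp t into base-2**n digits: R (high), G (mid), B (low)."""
    base = 2 ** n                     # 256 when n == 8
    assert t <= base ** 3             # claim 2: T <= 2^(N*3)
    r = (t // base // base) % base    # R = (T / 2^N / 2^N) % 2^N
    g = (t // base) % base            # G = (T / 2^N) % 2^N
    b = t % base                      # B = T % 2^N
    return r, g, b
```

For example, with N = 8 the timestamp 0x123456 yields the pixel components (0x12, 0x34, 0x56), so each byte of the timestamp occupies one color channel.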
5. The method of claim 1, wherein the blending rule satisfies:
{R0, G0, B0} = {R1, G1, B1} × A1 + {R2, G2, B2} × A2, wherein
A1 represents the transparency of the first layer and A2 represents the transparency of the second layer;
{R1, G1, B1} represents the pixel value at the target coordinate in the first layer, where R1 represents the red component value, G1 the green component value, and B1 the blue component value;
{R2, G2, B2} represents the pixel value at the target coordinate in the second layer, where R2 represents the red component value, G2 the green component value, and B2 the blue component value; and
{R0, G0, B0} represents the pixel value at the target coordinate after blending, where R0 represents the red component value, G0 the green component value, and B0 the blue component value.
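The per-channel blend of claim 5 can be sketched as (an illustration under the assumption of integer 8-bit channels; the function name is not from the patent):

```python
def blend_pixels(p1, p2, a1: float, a2: float) -> tuple[int, ...]:
    """Claim-5 blending rule: {R0,G0,B0} = {R1,G1,B1}*A1 + {R2,G2,B2}*A2,
    applied channel by channel and rounded to integer component values."""
    return tuple(round(c1 * a1 + c2 * a2) for c1, c2 in zip(p1, p2))
```

With A1 = A2 = 0.5, blending (100, 100, 100) with (200, 200, 200) gives (150, 150, 150).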
6. The method of claim 1, wherein the timestamp is generated under the first operating system from a timestamp of a processor.
7. The method of claim 6, wherein the method further comprises: sending the timestamp to the second operating system after the first layer is generated under the first operating system.
8. The method of claim 7, wherein the method further comprises: sending a ready signal to the first operating system after the second layer is generated under the second operating system, wherein the image displayed on the display device is intercepted under the first operating system in response to the ready signal.
9. The method of claim 1, wherein determining whether the third pixel value is a result of the first pixel value and the second pixel value generated based on the timestamp being blended according to the blending rule comprises:
deriving, based on the blending rule, the original pixel values of the third pixel value in each layer prior to blending; and
determining whether those original pixel values were generated based on the timestamp.
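Assuming the transparencies A1 and A2 and the first layer's pixel are known to the verifier, the derivation of claim 9 amounts to inverting the claim-5 blend and decoding the recovered components back into a timestamp. A minimal sketch (function names illustrative, not from the patent; rounding may require a tolerance in practice):

```python
def recover_second_layer(p0, p1, a1: float, a2: float) -> tuple[int, ...]:
    """Invert the blend: solve {R2,G2,B2} = ({R0,G0,B0} - {R1,G1,B1}*A1) / A2
    given the blended pixel p0 and the first layer's pixel p1."""
    return tuple(round((c0 - c1 * a1) / a2) for c0, c1 in zip(p0, p1))

def rgb_to_timestamp(rgb, n: int = 8) -> int:
    """Reassemble the timestamp from its base-2**n digits (inverse of claim 3)."""
    r, g, b = rgb
    return (r * 2 ** n + g) * 2 ** n + b
```

If the decoded value matches the timestamp chosen for verification, both layers were on screen for the same frame; a mismatch indicates the second operating system's layer is stale or missing.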
10. The method of claim 1, wherein the first operating system is QNX and the second operating system is Android.
11. The method of claim 1, wherein the method further comprises: restarting the second operating system if it is determined that the third pixel value is not a result of blending according to the blending rule.
12. A vehicle cabin system, the system comprising:
a memory configured to store instructions; and
a processor configured to execute the instructions to perform the method of any one of claims 1-11.
13. A vehicle comprising the vehicle cabin system of claim 12.
14. A computer readable storage medium having instructions stored therein, which when executed by a processor, cause the processor to perform the method of any of claims 1-11.
CN202310043119.6A 2023-01-29 2023-01-29 Detecting multi-system display synchronicity on a display device Active CN115934405B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310043119.6A CN115934405B (en) 2023-01-29 2023-01-29 Detecting multi-system display synchronicity on a display device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310043119.6A CN115934405B (en) 2023-01-29 2023-01-29 Detecting multi-system display synchronicity on a display device

Publications (2)

Publication Number Publication Date
CN115934405A CN115934405A (en) 2023-04-07
CN115934405B (en) 2023-07-21

Family

ID=86699247

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310043119.6A Active CN115934405B (en) 2023-01-29 2023-01-29 Detecting multi-system display synchronicity on a display device

Country Status (1)

Country Link
CN (1) CN115934405B (en)

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1498721A1 (en) * 2003-07-15 2005-01-19 ELMOS Semiconductor AG Device for recognition of fog, especially for a vehicle
CN105830428A (en) * 2013-12-19 2016-08-03 株式会社理光 Object detection apparatus, moving body device control system and program thereof
WO2016187692A1 (en) * 2015-05-27 2016-12-01 Idk Interactive Inc. Display systems using facial recognition for viewership monitoring purposes
EP3355188A1 (en) * 2017-01-31 2018-08-01 OpenSynergy GmbH Instrument display on a car dashboard by checking frames of a gui by a realtime os
CN109324903A (en) * 2018-09-21 2019-02-12 深圳前海达闼云端智能科技有限公司 Display resource regulating method and device for embedded system
WO2019030763A1 (en) * 2017-08-10 2019-02-14 Argus Cyber Security Ltd. System and method for detecting exploitation of a component connected to an in-vehicle network
CN112805690A (en) * 2018-11-16 2021-05-14 深圳市欢太科技有限公司 Display screen detection method and device, electronic equipment and computer readable storage medium
CN113296672A (en) * 2021-05-20 2021-08-24 前海七剑科技(深圳)有限公司 Interface display method and system
CN115237518A (en) * 2022-07-05 2022-10-25 南京中科创达软件科技有限公司 Screen interface display processing method and device, electronic equipment and medium
CN115339288A (en) * 2022-08-30 2022-11-15 浙江吉利控股集团有限公司 Air conditioner interaction interface display control method and device and vehicle

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10169066B2 (en) * 2015-08-06 2019-01-01 Ionroad Technologies Ltd. System and method for enhancing advanced driver assistance system (ADAS) as a system on a chip (SOC)
CN113870756B (en) * 2020-06-30 2023-12-08 京东方科技集团股份有限公司 Correction method, system and device of display equipment
US20230023347A1 (en) * 2021-07-23 2023-01-26 Ford Global Technologies, Llc Vehicle using full-velocity determination with radar


Also Published As

Publication number Publication date
CN115934405A (en) 2023-04-07

Similar Documents

Publication Publication Date Title
US8314805B2 (en) Control method and computer system for switching display between OSs
US8898577B2 (en) Application sharing with occlusion removal
US20160292892A1 (en) Methods and Devices for Interface Display
US7392403B1 (en) Systems, methods and computer program products for high availability enhancements of virtual security module servers
US20090328180A1 (en) Granting Least Privilege Access For Computing Processes
CN109739937B (en) Method, device, equipment and storage medium for displaying path in knowledge graph
JP2008077264A (en) Recovery method using cdp
US20190318090A1 (en) Malicious software detection based on api trust
WO2018210179A1 (en) Application page processing method and device and storage medium
US20150379038A1 (en) Data replication in site recovery environment
CN108228308B (en) Monitoring method and device for virtual machine
US20200057843A1 (en) Secure file sharing using semantic watermarking
US20200065498A1 (en) System and method for security analysis
CN115934405B (en) Detecting multi-system display synchronicity on a display device
US20170115866A1 (en) Method, device and terminal apparatus for recognizing multi-finger sliding gesture
KR100979092B1 (en) method of judging whether an image-enhanced gamehack is used, and intercepting an image-enhanced gamehack
CN109916551B (en) Brake performance detection method and device and electronic equipment
CN113922975A (en) Security control method, server, terminal, system and storage medium
WO2020119232A1 (en) Virtual desktop-based watermark addition method and device
CN107168774A (en) It is a kind of based on the virtual machine migration method being locally stored and system
CN108681494B (en) Backup data restoration method and device, user equipment and storage medium
US11449405B2 (en) Information processing apparatus, control method, and program
CN103650459A (en) Information presentation method and equipment
CN113419801B (en) Form rendering method, terminal, device and computer-readable storage medium
US20090089702A1 (en) Interactive analysis of network adjustment results

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant