AU2020265721A1 - Device for a validatable output of images
- Publication number
- AU2020265721A1
- Authority
- AU
- Australia
- Prior art keywords
- screen
- signature
- image data
- images
- cameras
- Prior art date
- Legal status (assumed, not a legal conclusion)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/14—Digital output to display device ; Cooperation and interconnection of the display device with other functional units
- G06F3/147—Digital output to display device ; Cooperation and interconnection of the display device with other functional units using display panels
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
- H04N7/181—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2310/00—Command of the display device
- G09G2310/08—Details of timing specific for flat panels, other than clock recovery
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2320/00—Control of display operating conditions
- G09G2320/04—Maintaining the quality of display appearance
- G09G2320/041—Temperature compensation
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2330/00—Aspects of power supply; Aspects of display protection and defect management
- G09G2330/12—Test circuits or failure detection circuits included in a display system, as permanent part thereof
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Closed-Circuit Television Systems (AREA)
- Collating Specific Patterns (AREA)
- Image Input (AREA)
Abstract
The invention relates to a device for a validatable output of images, having a screen (20) for outputting images on the basis of image data (BD1), a number N1 of cameras (31, 32), where N1 > 1, each of the cameras being designed to provide image data (BD1, BD2) on the basis of captured camera images, an integration unit (43) for outputting expanded image data (eBD1) on the basis of the integration of a signature, which is visible in the images output by the screen, into the image data provided by one of the cameras, a number N2 of photosensors (51, 52) arranged in front of the screen, where N2 > 1, said N2 photosensors being designed to detect the integrated signature in the images output by the screen, and a validation unit (44) which is coupled to the N2 photosensors and which is designed to validate the detected signature and initiate a safety function on the basis of the validated signature.
Description
The present invention relates to a device for validatable output of images from one or more cameras on a screen. The present invention further relates to a vehicle comprising such a device for validatable output of images from one or more cameras on a screen.
For example, in military vehicles such as tanks, devices for outputting images are used that have cameras arranged on the outer skin of the vehicle and at least one screen arranged in the interior of the vehicle for outputting the images recorded by the cameras. A multiplexer is coupled between the cameras and the screen and provides the screen with the images of a camera selected from the number of coupled cameras.
In such a setup, it is possible that a camera arranged on the outer skin of the vehicle delivers a still image, that is to say a frozen image, and the user inside the vehicle has no means of noticing this.
A particularly high risk arises if the camera is linked, for example, to a weapon station and the image displayed on the screen is used for aiming and firing.
It is worth noting here that the image delay time for displaying a video image, for example on an LCD display or on a TFT display, changes with the technology used and with the environmental conditions, in particular the temperature. While a video image is displayed very quickly and smoothly at room temperature, sub-zero temperatures result in a much more sluggish display, up to and including smearing of moving images. Besides the still image, further examples of an error pattern or error image are a delayed image and a wrong view, i.e., the display of the wrong camera.
Against this background, it is an object of the present invention to provide a solution for the validatable output of images.
Accordingly, a device for validatable output of images is proposed which has: a screen for outputting images based on image data; a number N1 of cameras, with N1 > 1, wherein each of the cameras is adapted to provide image data based on a recording of camera images; an integration unit for outputting extended image data based on an integration, into the image data provided by one of the cameras, of a signature that is visible in the images displayed by the screen; a number N2 of photo sensors arranged in front of the screen, with N2 > 1, wherein the photo sensors are adapted to detect the integrated signature in the images output by the screen; and a validation unit coupled with the photo sensors, the validation unit being adapted to validate the detected signature and, responsive to the validated signature, trigger a safety function.
Responsive to the validated signature, detected by at least one of the photo sensors and validated by the validation unit, the present device, a part of the device, e.g., the screen, and/or an apparatus coupled to the device, for example a driver's sight or a weapon station, can be put into a safe state by means of a safety function.
For example, if the delay time of the device, in particular the difference between the detection of the signature by the photo sensor and the imprinting or integration of the signature into the provided image data, is greater than a predetermined threshold value, for example due to a very low ambient temperature, a safety function can be induced or triggered by the validation unit.
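Purely as an illustration and not part of the original disclosure, the following Python sketch shows such a delay-time check; the threshold value and all names (MAX_DELAY_S, trigger_safety_function, validate_delay) are hypothetical:

```python
import time

# Assumed maximum permissible delay between integrating the signature into
# the image data (first point in time) and detecting it on the screen
# (second point in time); 50 ms is an illustrative value only.
MAX_DELAY_S = 0.050

def trigger_safety_function(reason: str) -> None:
    # Placeholder: a real device might blank the screen or lock a weapon station.
    print(f"SAFETY FUNCTION TRIGGERED: {reason}")

def validate_delay(t_integration: float, t_detection: float) -> bool:
    """Return True if the display delay is within the permissible limit."""
    delay = t_detection - t_integration
    if delay > MAX_DELAY_S:
        trigger_safety_function(f"delayed image: {delay * 1000:.1f} ms")
        return False
    return True

# Example: signature integrated at t1, detected by a photo sensor 80 ms later.
t1 = time.monotonic()
validate_delay(t1, t1 + 0.080)  # triggers the safety function
```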
Optical detection of the signature by the number of photo sensors allows errors in the display on the screen itself to be detected.
The signature may also be referred to as stimulus or embedded image information, embedded in the images to be displayed.
Monitoring of the additionally embedded image information can be used for configuring and designing a respective safety function for all three safety-critical aspects: still image, delayed image and wrong view. In particular, the error patterns of the delayed image, the still image and the wrong view can be sensed by means of the integrated signature or embedded image information. In the event that such an error pattern is sensed, a safety function can be induced.
Examples for such a safety function comprise outputting a message to a user (visually and/or acoustically and/or haptically), switching-off the screen and/or locking a weapon station or weapon coupled with the device.
The device may also be referred to as a sight system. The respective camera may be embodied as a video camera, a daylight camera, a thermal imaging camera, an infrared camera or a photo camera. In particular, the cameras may comprise different camera types. The photo sensor may also be referred to as a photo detector, an image sensor or an optical detector.
The screen may be embodied, for example, as a flat screen arranged in a vehicle interior of a military vehicle. The screen may also be referred to as monitor or display. The display may be an LCD display or a TFT display or an optical display consisting, for example, of one or several LEDs.
In particular, an image processing unit or a control unit is provided, which is adapted to control the integration unit as well as the screen. The image processing unit comprises, for example, electronics such as a microprocessor, a CPLD (Complex Programmable Logic Device) or an FPGA.
According to an embodiment, the validation unit is configured to determine a time difference between a second point in time of the detection of the signature by the N2 photo sensors and a first point in time of the integration of the signature into the provided image data, and to induce the safety function if the determined time difference is greater than a predetermined threshold value.
Different threshold values may also be used to differentiate different error images such as a delayed image or a still image. Responsive to the validation of the detected signature and, in particular, based on the different threshold values used, the validation unit can also induce different safety functions.
According to another embodiment, the validation unit is adapted to detect and/or differentiate different errors or error patterns based on the determined time difference, in particular by using different threshold values. The different error patterns comprise, for example, "delayed image", "still image" and "wrong view".
For example, upon reaching a maximum permissible delay time, a safety function may take effect that signals the exceedance of the limit for the error event "delayed image" to the user and reliably puts the device into a safe state, at least until the current delay time is again below the maximum permissible delay time.
In addition, a trigger indicating the first point in time of the integration of the signature may also be transmitted directly from the integration unit to the validation unit. The signature itself may also be used as the trigger. If, after the validation unit has received the trigger, no expected change on the screen is detected by the photo sensors, the error event "still image" must be assumed, and the system can be put into the safe state.
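A sketch of how the trigger could be used to distinguish a delayed image from a still image; the limits and names (DELAY_LIMIT_S, STILL_TIMEOUT_S, classify) are assumptions for illustration:

```python
from enum import Enum
from typing import Optional

class ErrorPattern(Enum):
    OK = "ok"
    DELAYED_IMAGE = "delayed image"
    STILL_IMAGE = "still image"

# Illustrative limits: above DELAY_LIMIT_S the image counts as delayed; if no
# signature change is detected within STILL_TIMEOUT_S of the integration
# trigger, the screen is assumed to be frozen.
DELAY_LIMIT_S = 0.050
STILL_TIMEOUT_S = 0.500

def classify(detection_delay: Optional[float]) -> ErrorPattern:
    """Classify the error pattern from the trigger-to-detection delay.

    detection_delay is None if the photo sensors detected no change at all
    within STILL_TIMEOUT_S after the trigger was received.
    """
    if detection_delay is None or detection_delay >= STILL_TIMEOUT_S:
        return ErrorPattern.STILL_IMAGE
    if detection_delay > DELAY_LIMIT_S:
        return ErrorPattern.DELAYED_IMAGE
    return ErrorPattern.OK

print(classify(0.020))  # ErrorPattern.OK
print(classify(0.120))  # ErrorPattern.DELAYED_IMAGE
print(classify(None))   # ErrorPattern.STILL_IMAGE
```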
According to another embodiment, the signature is embodied as a coded signature comprising a time stamp that is specific for indicating the first point in time of the integration of the signature into the image data provided.
Thus, the time stamp indicates the first point in time of the integration of the signature into the image data provided.
According to another embodiment, the signature is embodied as a coded signature comprising a time stamp that is specific for indicating the first point in time of the integration of the signature into the image data provided and source information that is specific for the camera providing the image data.
The source information as part of the coded signature and thus as part of the extended image data is suitable for specifying the camera providing the image data. The source information may also be referred to as identification information or ID or source ID. The source information is preferably generated based on a hardware ID of the camera.
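One conceivable byte layout for such a coded signature, packing the time stamp of the first point in time together with the source ID; the field sizes and names are assumptions for illustration only:

```python
import struct
import time
from typing import Tuple

def encode_signature(camera_id: int, t_integration: float) -> bytes:
    """Pack the first point in time and the camera source ID into 10 bytes."""
    # Assumed layout: 8-byte integration timestamp in microseconds,
    # followed by a 2-byte camera ID (big-endian).
    return struct.pack(">QH", int(t_integration * 1_000_000), camera_id)

def decode_signature(raw: bytes) -> Tuple[int, float]:
    """Recover the camera ID and the first point in time in seconds."""
    timestamp_us, camera_id = struct.unpack(">QH", raw)
    return camera_id, timestamp_us / 1_000_000

sig = encode_signature(camera_id=31, t_integration=time.time())
print(decode_signature(sig))  # -> (31, <first point in time in seconds>)
```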
According to another embodiment, the respective one of the photo sensors is adapted to monitor a predetermined output area of the screen with an MxN pixel area for detecting the signature.
M refers to the number of rows and N to the number of columns of the output area, or of the section of the output area of the screen, on which the signature is displayed. The output area is in particular located at the perimeter or at a corner of the screen, so that the center region of the screen remains available for the output to the user.
According to another embodiment, the pixels of the MxN pixel area of the output area have a defined brightness for forming the visible signature, in particular a defined and identical brightness.
According to another embodiment, the pixels of the MxN pixel area of the output area form a predetermined pattern for forming the visible signature.
The predetermined pattern may also be referred to as a design. Various information may be coded in the pattern; this information is evaluated by the validation unit and used for the validation and/or for selecting the suitable safety function.
The pattern may also be embodied as a video pattern. In the example of using four different cameras, a respective camera-specific video pattern may be assigned to the respective camera and location-specifically displayed and correspondingly detected, e.g., in the four corners of the screen. This also allows for the error event "wrong view" to be detected by means of the four photo sensors and the validation unit arranged downstream.
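A minimal sketch of embedding a camera-specific pattern into an MxN pixel area in one corner of a frame; the grayscale format, the 8x8 area and the brightness values per camera are all assumptions:

```python
import numpy as np

M, N = 8, 8  # assumed size of the monitored MxN pixel area

# Hypothetical camera-specific signatures: each camera is assigned a distinct,
# uniform brightness pattern that is displayed in "its" corner of the screen.
PATTERNS = {
    cam_id: np.full((M, N), brightness, dtype=np.uint8)
    for cam_id, brightness in [(31, 64), (32, 128), (33, 192), (34, 255)]
}

def embed_signature(frame: np.ndarray, cam_id: int, corner: str) -> np.ndarray:
    """Overwrite one corner of the frame with the camera-specific pattern."""
    out = frame.copy()
    pattern = PATTERNS[cam_id]
    if corner == "top_left":
        out[:M, :N] = pattern
    elif corner == "top_right":
        out[:M, -N:] = pattern
    elif corner == "bottom_left":
        out[-M:, :N] = pattern
    elif corner == "bottom_right":
        out[-M:, -N:] = pattern
    return out

frame = np.zeros((480, 640), dtype=np.uint8)  # simulated grayscale image
extended = embed_signature(frame, cam_id=31, corner="top_left")
print(extended[:M, :N].mean())  # 64.0: brightness of the embedded signature
```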
According to another embodiment, four photo sensors are provided, with N2 = 4, wherein the respective output area monitored by one of the four photo sensors is located in one of the four corners of the screen.
Here, the output areas are located in the four corners of the screen, and the four photo sensors are correspondingly arranged in front of these four corners of the screen, so that the center region of the screen is still provided for the display of the images of the cameras and thus for the user.
By using four photo sensors, each arranged in a corner of the screen, four different positions on the screen can be monitored and evaluated. In the example of using four different cameras, respective camera-specific embedded image information can be displayed on the top left, top right, bottom left and bottom right. Thus, as mentioned above, the error event "wrong view" can also be detected by the photo sensors and the downstream validation unit. For example, if the validation reveals that a certain screen is supposed to display the images of a camera A but in fact displays the images of a camera B, the action to be executed may be to lock the weapon that may be coupled to the device.
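Continuing the assumed brightness mapping from the sketch above, a hypothetical "wrong view" check could compare the brightness measured by a corner photo sensor against the value expected for the camera that is supposed to be displayed:

```python
# Assumed mapping of camera IDs to expected signature brightness, and an
# assumed photo sensor tolerance (both purely illustrative).
EXPECTED_BRIGHTNESS = {31: 64, 32: 128, 33: 192, 34: 255}
TOLERANCE = 10

def check_view(expected_cam: int, measured_brightness: float) -> bool:
    """Return True if the screen shows the camera it is supposed to show."""
    expected = EXPECTED_BRIGHTNESS[expected_cam]
    if abs(measured_brightness - expected) > TOLERANCE:
        # Error event "wrong view": e.g., lock the coupled weapon.
        print(f"wrong view: expected camera {expected_cam}, "
              f"measured brightness {measured_brightness}")
        return False
    return True

check_view(expected_cam=31, measured_brightness=64.0)   # True: correct view
check_view(expected_cam=31, measured_brightness=128.0)  # False: other camera shown
```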
According to another embodiment, a plurality of photo sensors is provided, with N2 > 4, wherein the output areas monitored by the photo sensors are located in the vertical and/or horizontal perimeter regions of the screen.
According to another embodiment, a number N3 of temperature sensors for measuring a respective temperature is provided, with N3 = N2. A respective one of the temperature sensors is arranged in front of the screen and adjacent to a respective one of the photo sensors.
According to another embodiment, the respective temperature sensor is adapted to measure a temperature of the MxN pixel area of the output area.
According to another embodiment, the validation unit is adapted to validate the detected signature based on the temperature measured by the N3 temperature sensors.
The temperature sensors are, in particular, adapted to measure the temperature of the respective active area, i.e., the respective MxN pixel area of the output area. This also results, in particular, in a dependency of the delay time on the display pixel temperature. This dependency can additionally be used for validating the delay time. In addition, the same temperature sensors may advantageously be used to control a heating of the display that regulates the screen to a target temperature at which the delay time is, in particular, below the specified threshold or limit value, so that continuous operation is ensured.
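A sketch of such a temperature-dependent validation and heater control; the temperature/delay pairs and the target temperature are purely illustrative assumptions:

```python
# Assumed lookup: at lower pixel temperatures the LCD responds more slowly,
# so a larger delay is still permissible before the safety function triggers.
DELAY_LIMIT_BY_TEMP_S = [  # (minimum temperature in deg C, delay limit in s)
    (20.0, 0.050),
    (0.0, 0.100),
    (-20.0, 0.200),
]
TARGET_TEMP_C = 15.0  # assumed set point for the display heating

def delay_limit(temp_c: float) -> float:
    """Return the delay limit applicable at the measured pixel temperature."""
    for min_temp, limit in DELAY_LIMIT_BY_TEMP_S:
        if temp_c >= min_temp:
            return limit
    return DELAY_LIMIT_BY_TEMP_S[-1][1]  # colder than all entries

def heater_on(temp_c: float) -> bool:
    """Switch the display heating on below the target temperature."""
    return temp_c < TARGET_TEMP_C

print(delay_limit(22.0))   # 0.05 s at room temperature
print(delay_limit(-10.0))  # 0.2 s in the cold
print(heater_on(-10.0))    # True: heat the screen toward the target temperature
```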
According to another embodiment, the safety function comprises outputting a message to a user, switching off the screen and/or locking a weapon coupled to the device.
According to another embodiment, the device comprises a transmission channel connected between the N1 cameras and the screen for transmitting the image data from the N1 cameras to the screen, wherein the integration unit is arranged upstream of the screen in the transmission channel.
The transmission channel between the cameras and the screen comprises, in particular, a multiplexer arranged downstream of the cameras, an image processing unit arranged downstream of the multiplexer, and preferably also the integration unit and the validation unit. Alternatively, the integration unit and the validation unit may also be directly assigned to the monitor or be part of it.
The interfaces used in the transmission channel comprise, for example, DVI and/or Camera Link and/or a wireless interface such as WiFi.
According to another embodiment, the device comprises a control device coupled to the validation unit.
The control device is, in particular, adapted to lock the weapon that can be coupled to the device if the validation indicates the display of a still image, the display of an image of a wrong camera ("wrong view") or the display of a significantly delayed image.
According to another embodiment, the control device is adapted to drive a reliable failure indication depending on the validation, which failure indication indicates a failure of one or more cameras.
This gives the user a reliable indication that a specific camera has failed and preferably which camera has failed. This significantly increases the safety of the entire device.
The respective unit, for example the integration unit or the validation unit, may be implemented as hardware and/or as software. In the case of an implementation as hardware, the respective unit may be embodied as a device or part of a device, for example as a computer, a microprocessor or an FPGA. In the case of an implementation as software, the respective unit may be embodied as a computer program product, as a function, as a routine, as part of a program code or as an executable object.
Furthermore, a vehicle comprising the device for validatable output of images described above is proposed, wherein the N1 cameras are arranged on an outer skin of the vehicle and the screen is arranged in a space in the vehicle.
The vehicle preferably is a military vehicle. The vehicle may be a tank, a truck, a special vehicle, for example, a crane, a loader or excavator, a ship or a flying system such as a drone or a helicopter.
Other possible implementations of the invention also comprise combinations of features or embodiments that are not explicitly mentioned above or described below in the context of the exemplary embodiments. The person skilled in the art will also add individual aspects as improvements or additions to the respective basic form of the invention.
Further advantageous designs and aspects of the invention are subject of the dependent claims as well as the exemplary embodiments of the invention described below. Furthermore, the invention is explained in more detail on the basis of preferred embodiments with reference to the enclosed figures.
Fig. 1 shows a schematic block diagram of an exemplary embodiment of a device for validatable output of images; and
Fig. 2 shows a schematic block diagram of an exemplary embodiment of a screen for a device for validatable output of images.
In the figures, identical or functionally identical elements have been provided with the same reference signs, unless otherwise indicated.
Fig. 1 illustrates a schematic block diagram of a first exemplary embodiment of a device 10 for validatable output of images. Fig. 2 shows a schematic block diagram of an exemplary embodiment of a screen 20 for the device 10 of Fig. 1.
The device 10 of Fig. 1 comprises a number N1 of cameras 31, 32, a screen 20 and a transmission channel 40 connected between the cameras 31, 32 and the screen 20. Without loss of generality, the exemplary embodiment of Fig. 1 shows two cameras (N1 = 2). Furthermore, Fig. 1 shows a single screen 20. Without loss of generality, the above can also be applied to a plurality of screens.
The screen 20 is suitable for outputting images based on image data. The screen 20 is, for example, a display, preferably an LCD display or a TFT display. The respective camera 31, 32 is adapted to output image data BD1, BD2 based on a recording of camera images. The image data BD1, BD2 is transmitted to the transmission channel 40. For this purpose, the transmission channel 40 comprises a multiplexer 41. In addition to the multiplexer 41, according to the exemplary embodiment of Fig. 1, the transmission channel 40 comprises an image processing unit 42, an integration unit 43 and a validation unit 44. Without loss of generality, the transmission channel 40 may comprise additional units.
On its output side, the multiplexer 41 provides one of the received inputs, here the image data BD1 from camera 31 or the image data BD2 from camera 32. In the illustrated example in Fig. 1, the multiplexer 41 provides on its output side the image data BD1 of camera 31 to the image processing unit 42. In the present example, the image processing unit 42 comprises the integration unit 43. The integration unit 43 is adapted to output extended image data eBD1 based on an integration, into the image data BD1 provided by one of the cameras, here camera 31, of a visible signature S in the images displayed by the screen 20.
In addition, the device 10 comprises a number N2 of photo sensors 51, 52 arranged in front of the screen 20. In the example of Fig. 1, two photo sensors 51, 52 are shown. Without loss of generality, a greater number of photo sensors may be used. For example, Fig. 2 shows an exemplary embodiment having four photo sensors 51-54.
The photo sensors 51, 52 are adapted to detect the integrated signature S in the images output by the screen 20. Accordingly, the signature S may also be referred to as embedded additional image information.
The respective photo sensor 51-54 is in particular adapted to monitor a predetermined output area 21-24 (see Fig. 2) of the screen 20 with an MxN pixel area for detecting the signature S. The pixels of the MxN pixel area of the output area 21-24 in particular have a defined brightness for forming the visible signature S, preferably a defined and identical brightness. Alternatively, the pixels of the MxN pixel area may also form a predetermined pattern, for example, a video pattern, which preferably is camera-specific.
Furthermore, Fig. 1 shows that the photo sensors 51, 52 are connected to the validation unit 44, so that the signals P1, P2 from the photo sensors 51, 52 can be transmitted to the validation unit 44. The signals P1, P2 from the photo sensors 51, 52 comprise information indicative of the detected signature S.
The validation unit 44 is adapted to validate the detected signature S, in particular by using the signals received from the photo sensors 51, 52, and to trigger a safety function based on the validated signature S. For this purpose, the validation unit 44 can also transmit a validation result V, generated in dependence on this validation, to the image processing unit 42, which can then execute the triggered safety function.
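As a toy end-to-end simulation tying these units together; the classes are hypothetical stand-ins for the integration unit 43, a photo sensor and the validation unit 44, and the 50 ms limit is an assumption:

```python
import time
from typing import Tuple

class IntegrationUnit:
    """Stand-in for integration unit 43: records the first point in time t1."""
    def integrate(self, image_data: bytes) -> Tuple[bytes, float]:
        t1 = time.monotonic()
        extended = image_data + b"SIG"  # toy stand-in for the visible signature
        return extended, t1

class PhotoSensor:
    """Stand-in for a photo sensor: reports the second point in time t2."""
    def detect(self, t1: float, display_latency_s: float) -> float:
        # Simulation: detection happens display_latency_s after integration.
        return t1 + display_latency_s

class ValidationUnit:
    """Stand-in for validation unit 44: compares t2 - t1 against a limit."""
    def __init__(self, limit_s: float = 0.050):
        self.limit_s = limit_s

    def validate(self, t1: float, t2: float) -> str:
        return "ok" if (t2 - t1) <= self.limit_s else "trigger safety function"

integration, sensor, validation = IntegrationUnit(), PhotoSensor(), ValidationUnit()
eBD1, t1 = integration.integrate(b"BD1")
t2 = sensor.detect(t1, display_latency_s=0.080)  # simulated sluggish display
print(validation.validate(t1, t2))  # -> "trigger safety function"
```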
The validation unit 44 is, in particular, configured to determine a time difference between a second point in time of the detection of the signature S by the N2 photo sensors 51, 52 and a first point in time of the integration of the signature S into the provided image data BD1, BD2, and to induce the safety function if the determined time difference is greater than a predetermined threshold value.
As illustrated in Fig. 1, for this purpose the integration unit 43 can transmit the signature S to the validation unit 44, in particular at the first point in time. Alternatively, a simple trigger signal may be used that is transmitted from the integration unit 43 to the validation unit 44 at the first point in time.
In addition, the signature S may also be embodied as a coded signature which comprises a time stamp that is specific for indicating the first point in time for the integration of the signature S into the image data BD1 provided.
In addition, the coded signature may also comprise source information specific for the camera 31 providing the image data BD1.
The safety function that may be induced or triggered by the validation unit 44 may, for example, comprise outputting a message to a user, switching-off the screen 20 and/or locking a weapon or weapon station coupled with the device 10.
As already mentioned above, Fig. 2 shows an exemplary embodiment of a screen 20 for the device 10 of Fig. 1 comprising four output areas 21-24, wherein the respective output area 21-24 is arranged in one of the four corners of the screen 20, together with the four correspondingly arranged photo sensors 51-54.
That is, photo sensor 51 is arranged in front of output area 21 (top left on the screen 20) and photo sensor 52 is arranged in front of output area 22 (top right on the screen 20). Likewise, photo sensor 53 is arranged in front of output area 23 (bottom left on the screen 20) and photo sensor 54 is arranged in front of output area 24 (bottom right on the screen 20).
In the exemplary embodiment of Fig. 2, four temperature sensors 61-64 are also provided. A respective one of the temperature sensors 61-64 is arranged in front of the screen 20 and adjacent to a respective one of the photo sensors 51-54. The respective temperature sensor 61-64 is adapted to measure a temperature of the respective output area 21-24 of the screen 20. In this case, the validation unit 44 may also use the temperatures measured by the temperature sensors 61-64 for the validation.
Although the present invention has been described using examples, it can be modified in many ways.
List of reference signs

10 Device
20 Screen
21 Output area
22 Output area
23 Output area
24 Output area
31 Camera
32 Camera
40 Transmission channel
41 Multiplexer
42 Image processing unit
43 Integration unit
44 Validation unit
51 Photo sensor
52 Photo sensor
53 Photo sensor
54 Photo sensor
61 Temperature sensor
62 Temperature sensor
63 Temperature sensor
64 Temperature sensor
BD1 Image data
BD2 Image data
eBD1 Extended image data
P1 Signal of photo sensor
P2 Signal of photo sensor
S Signature
V Validation result
Claims (12)
1. A device (10) for validatable output of images, comprising: a screen (20) for outputting images based on image data, a number N1 of cameras (31, 32), with N1 > 1, wherein each of the cameras (31, 32) is adapted to provide image data (BD1, BD2) based on a recording of camera images, an integration unit (43) for outputting extended image data (eBD1) based on an integration of a visible signature (S) in the images displayed by the screen (20) into the image data (BD1, BD2) provided by one of the cameras (31, 32), a number N2 of photo sensors (51-54) arranged in front of the screen (20), with N2 > 1, wherein the photo sensors (51-54) are adapted to detect the integrated signature (S) in the images output by the screen (20), and a validation unit (44) coupled with the photo sensors (51-54), the validation unit being adapted to validate the detected signature (S) and, responsive to the validated signature, trigger a safety function, wherein the validation unit (44) is adapted to determine a time difference between a second point in time of the detection of the signature (S) by the N2 photo sensors (51-54) and a first point in time of the integration of the signature (S) into the image data (BD1, BD2) provided and to induce the safety function if the determined time difference is greater than a predetermined threshold value, wherein the signature (S) is embodied as a coded signature comprising a time stamp specific for the integration of the signature (S) into the image data (BD1, BD2) provided to indicate the first point in time and source information specific for the camera (31, 32) providing the image data (BD1, BD2).
2. The device according to claim 1, characterized in that, the respective photo sensor (51-54) is adapted to monitor a predetermined output area (21-24) of the screen (20) with an MxN pixel area for detecting the signature (S).
3. The device according to claim 2, characterized in that, the pixels of the MxN pixel area of the output area (21-24) have a defined brightness for forming the visible signature (S), in particular, have a defined and identical brightness.
4. The device according to claim 2, characterized in that, the pixels of the MxN pixel area of the output area (21-24) form a predetermined pattern for forming the visible signature (S).
5. The device according to any of claims 2 to 4, characterized in that, four photo sensors (51-54) are provided, with N2 = 4, wherein the respective output area (21-24) monitored by one of the four photo sensors (51-54) is located in one of the four corners of the screen (20).
6. The device according to any of claims 2 to 4, characterized in that, a plurality of photo sensors (51-54) is provided, with N2 > 4, wherein the output areas (21-24) monitored by the photo sensors (51-54) are located in the vertical and/or horizontal perimeter regions of the screen (20).
7. The device according to any of claims 1 to 6, characterized in that, a number N3 of temperature sensors (61-64) is provided for measuring a respective temperature, with N3 = N2, wherein a respective one of the temperature sensors (61-64) is arranged in front of the screen (20) and adjacent to a respective one of the photo sensors (51-54).
8. The device according to claim 7, characterized in that, the respective temperature sensor (61-64) is adapted to measure a temperature of the MxN pixel area of the output area (21-24) of the screen (20).
9. The device according to claim 7 or 8, characterized in that, the validation unit (44) is adapted to validate the detected signature (S) based on the temperature measured by the N3 temperature sensors (61-64).
10. The device according to any of claims 1 to 9, characterized in that, the safety function comprises outputting a message to a user, switching off the screen (20) and/or locking a weapon coupled to the device (10).
11. The device according to any of claims 1 to 10, characterized by a transmission channel (40) connected between the N1 cameras (31, 32) and the screen (20) for transmitting the image data (BD1, BD2) from the N1 cameras to the screen (20), wherein the integration unit (43) is arranged in the transmission channel (40) upstream of the screen (20).
12. A vehicle, in particular a military vehicle, comprising a device (10) for validatable output of images according to any of claims 1 to 11, wherein the N1 cameras (31, 32) are arranged on the outer skin of the vehicle and the screen (20) is arranged in a space of the vehicle.
Applications Claiming Priority (3)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| DE102019111178.1 | 2019-04-30 | | |
| DE102019111178.1A (DE102019111178A1) | 2019-04-30 | 2019-04-30 | Device for the validatable output of images |
| PCT/EP2020/061066 (WO2020221621A1) | 2019-04-30 | 2020-04-21 | Device for a validatable output of images |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| AU2020265721A1 | 2021-12-23 |
| AU2020265721B2 | 2023-03-09 |
Family ID: 70465017

Family Applications (1)

| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| AU2020265721A (granted as AU2020265721B2, active) | Device for a validatable output of images | 2019-04-30 | 2020-04-21 |
Country Status (6)
| Country | Link |
|---|---|
| EP (1) | EP3948512A1 |
| AU (1) | AU2020265721B2 |
| BR (1) | BR112021021854A2 |
| CL (1) | CL2021002861A1 |
| DE (1) | DE102019111178A1 |
| WO (1) | WO2020221621A1 |
Family Cites Families (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| DK1763864T3 | 2004-06-04 | 2009-04-27 | Jakob Hatteland Display As | Still image detector |
| DE102014213456A1 | 2014-07-10 | 2016-01-14 | Continental Automotive Gmbh | Monitoring a signal transmission from a recording device to an output device |
| KR102393370B1 | 2014-10-02 | 2022-05-03 | 삼성디스플레이 주식회사 | Organic light emitting display apparatus and driving method thereof |
| DE102016202620A1 | 2016-02-19 | 2017-08-24 | Siemens Aktiengesellschaft | Device and method for monitoring a spatial area, in particular in the environment or inside a vehicle |
| DE102016203266A1 | 2016-02-29 | 2017-08-31 | Zf Friedrichshafen Ag | Method for operating a display device and display device |
| AT519108B1 | 2016-11-07 | 2018-04-15 | Tb Traxler Gmbh | Method for reading and editing video images in real time |
2019
- 2019-04-30: DE application DE102019111178.1A filed (published as DE102019111178A1; active, pending)

2020
- 2020-04-21: PCT application PCT/EP2020/061066 filed (published as WO2020221621A1)
- 2020-04-21: AU application AU2020265721A filed (granted as AU2020265721B2; active)
- 2020-04-21: EP application EP20721491.7A filed (published as EP3948512A1; active, pending)
- 2020-04-21: BR application BR112021021854A filed (published as BR112021021854A2)

2021
- 2021-10-29: CL application CL2021002861A filed (published as CL2021002861A1)
Also Published As
| Publication number | Publication date |
|---|---|
| WO2020221621A1 | 2020-11-05 |
| BR112021021854A2 | 2021-12-21 |
| AU2020265721B2 | 2023-03-09 |
| DE102019111178A1 | 2020-11-05 |
| CL2021002861A1 | 2022-07-15 |
| EP3948512A1 | 2022-02-09 |
Similar Documents
| Publication | Title |
|---|---|
| CA2819937C | Latency measurement system and method |
| CN102934091A | Method and device for detecting an incorrect representation of image data on a display unit |
| US10397564B2 | Image monitoring apparatus, image display system, and vehicle |
| US9973701B2 | Monitoring system |
| JP6259132B2 | In-vehicle camera device |
| JP6364845B2 | Vibration measuring device |
| US10518701B2 | Image monitoring apparatus, movable object, program and failure determining method |
| JP2008216334A | Detecting method and detecting device for screen display fault |
| KR20130025665A | Driver condition detecting device with IR sensor |
| US20170297489A1 | Method and System for Representing Vehicle Surroundings of a Motor Vehicle on a Display Device Located in the Motor Vehicle |
| WO2018146907A1 | Pneumatic fender monitoring system |
| JP6127558B2 | Imaging device |
| AU2020265721B2 | Device for a validatable output of images |
| US20190092339A1 | Signal processing system and signal processing method for sensors of vehicle |
| EP2472874A2 | Seamless mosaic projection system and method of aligning the same |
| JP5162671B2 | Monitoring device for monitoring a display device |
| KR20110052190A | Real-time monitoring system of digital cluster |
| US8547456B1 | System and method for monitoring inactive pixels in a scene imaging system |
| JP2020068411A5 | Position detection device, display system, and position detection method |
| US11176708B2 | Camera and method for checking an alignment of such a camera |
| KR100766995B1 | 3 dimension camera module device |
| AU2019360351B2 | Apparatus for validatable output of images |
| WO2014132423A1 | Video analysis device, display device, measurement method for display device, video correction method for display device |
| KR102226458B1 | System and method of monitoring device in nuclear power plant |
| KR101956125B1 | Fire sign detection system |
Legal Events
- FGA: Letters patent sealed or granted (standard patent)