US20160189678A1 - Adjusting a transparent display with an image capturing device - Google Patents

Adjusting a transparent display with an image capturing device

Info

Publication number
US20160189678A1
US20160189678A1 (Application US15/064,295)
Authority
US
United States
Prior art keywords
transparent display
display
capturing device
image
module
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US15/064,295
Inventor
Wes A. Nagara
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Visteon Global Technologies Inc
Original Assignee
Visteon Global Technologies Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Visteon Global Technologies Inc filed Critical Visteon Global Technologies Inc
Priority to US15/064,295 priority Critical patent/US20160189678A1/en
Assigned to VISTEON GLOBAL TECHNOLOGIES, INC. reassignment VISTEON GLOBAL TECHNOLOGIES, INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: NAGARA, WES A.
Publication of US20160189678A1 publication Critical patent/US20160189678A1/en
Abandoned legal-status Critical Current

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R16/00Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for
    • B60R16/02Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for electric constitutive elements
    • B60R16/023Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for electric constitutive elements for transmission of signals between vehicle parts or subsystems
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/10Intensity circuits
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/002Specific input/output arrangements not covered by G06F3/01 - G06F3/16
    • G06F3/005Input arrangements through a video camera
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013Eye tracking input arrangements
    • G06K9/00302
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions
    • G06V40/174Facial expression recognition
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2320/00Control of display operating conditions
    • G09G2320/06Adjustment of display parameters
    • G09G2320/066Adjustment of display parameters for control of contrast
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2320/00Control of display operating conditions
    • G09G2320/08Arrangements within a display terminal for setting, manually or automatically, display parameters of the display terminal
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2354/00Aspects of interface with display user
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2380/00Specific applications
    • G09G2380/10Automotive applications

Definitions

  • the image receiving module 210 receives either images or videos from the image/video capturing device 260 .
  • the image receiving module 210 may receive the images or videos at predetermined intervals, in real-time, or when triggered by an operation of the vehicle or by motion detected by the image/video capturing device 260 .
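The capture policy above (predetermined intervals, real-time, or triggered by a vehicle operation or detected motion) can be sketched as a single predicate. The function name, parameters, and the convention that real-time corresponds to a zero interval are illustrative assumptions, not part of the patent:

```python
def should_capture(now_s, last_capture_s, interval_s, motion_detected, vehicle_event):
    """Decide whether the image receiving module should pull a new frame.

    Captures occur on detected motion, on a vehicle event (e.g., shifting
    into reverse), or once the predetermined interval has elapsed.
    Real-time operation corresponds to interval_s == 0.
    """
    if motion_detected or vehicle_event:
        return True
    return now_s - last_capture_s >= interval_s
```

For example, with a 5-second interval and no events, a frame taken at t=0 suppresses capture until t=5, while any detected motion captures immediately.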
  • the image receiving module 210 may store the images or videos in a data store 205 .
  • the interfacing module 220 interfaces with the transparent display 250 .
  • the interfacing module 220 may monitor the current content being displayed on the transparent display 250 .
  • the interfacing module 220 may also interact with a buffer implemented with the transparent display 250 and retrieve content that has already been displayed on the transparent display 250 .
  • the analysis module 230 includes a facial detection module 231 , an error detection module 232 , an environmental detection module 233 , and a gesture detection module 234 . Depending on the implementation of system 200 , various combinations of the elements 231 - 234 may be selectively included. The various elements of the analysis module 230 may interact with the image/video capturing device 260 and the transparent display 250 to perform the various operations described below.
  • the facial detection module 231 analyzes the image/video captured by the image receiving module 210 .
  • the facial detection module 231 determines if the image contains the viewer of the transparent display 250 .
  • the facial detection module 231 may further identify the viewer's face, or even further, identify features on the face. For example, an identified feature may be the viewer's eyes. Once the feature is identified, the facial detection module 231 may determine a state of the viewer's eyes, such as the eyes being closed, squinting, or wide open.
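The patent does not specify how the eye state (closed, squinting, or wide open) is determined. One common heuristic, assumed here purely for illustration, is the eye aspect ratio (EAR) computed from six eye landmark points, which drops toward zero as the lids close:

```python
import math

def eye_aspect_ratio(landmarks):
    """Compute the eye aspect ratio (EAR) from six (x, y) eye landmarks.

    landmarks[0] and landmarks[3] are the horizontal eye corners;
    landmarks[1]/landmarks[5] and landmarks[2]/landmarks[4] are paired
    upper/lower lid points. A low EAR suggests closed or squinting eyes.
    """
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])

    vertical = dist(landmarks[1], landmarks[5]) + dist(landmarks[2], landmarks[4])
    horizontal = dist(landmarks[0], landmarks[3])
    return vertical / (2.0 * horizontal)

def eye_state(landmarks, closed_thresh=0.10, squint_thresh=0.20):
    """Classify the eye as 'closed', 'squinting', or 'open' by EAR thresholds.

    The threshold values are illustrative assumptions and would need tuning
    against the actual camera geometry and landmark detector.
    """
    ear = eye_aspect_ratio(landmarks)
    if ear < closed_thresh:
        return "closed"
    if ear < squint_thresh:
        return "squinting"
    return "open"
```

In practice the six landmarks would come from a facial landmark detector running on the frame received by the image receiving module 210.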
  • the error detection module 232 may determine if the transparent display 250 is currently operating incorrectly. For example, if the transparent display 250 's driving logic contains an error, or the transparent display 250 does not display the correct information, the error detection module 232 may record the error or inconsistency.
  • the error detection module 232 is facilitated by the placement of the image/video capturing device 260 in a position in which it directly views the transparent display 250 .
  • the error detection module 232 may review the refresh/rendering rate on the transparent display 250 . If the error detection module 232 detects that the refresh/rendering rate is not within a predetermined threshold of an acceptable rate, the error detection module 232 may record that an error has occurred.
  • the error detection module 232 may interface with a sensor or another image/video capturing device associated with the location at which the system 200 is implemented. For example, if the system 200 is implemented in a vehicle, an exterior camera/sensor associated with the vehicle may determine that the speed of the vehicle should be, or is at, 70 miles per hour (mph). The image/video capturing device 260 may simultaneously determine that the transparent display 250 indicates a speed of 65 mph. In this situation, the error detection module 232 may detect that the transparent display 250 is displaying incorrect data.
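The speed cross-check described above amounts to a tolerance comparison between the independent sensor reading and the value the camera observes on the display. The tolerance value and the fields of the error record below are illustrative assumptions:

```python
def check_displayed_speed(sensor_mph, displayed_mph, tolerance_mph=2.0):
    """Cross-check the speed shown on the transparent display against an
    independent vehicle sensor.

    Returns an error record when the displayed value deviates beyond the
    tolerance, else None. The 2 mph tolerance is an assumed default.
    """
    deviation = abs(sensor_mph - displayed_mph)
    if deviation > tolerance_mph:
        return {
            "error": "display_mismatch",
            "sensor_mph": sensor_mph,
            "displayed_mph": displayed_mph,
            "deviation_mph": deviation,
        }
    return None
```

With the patent's example values (sensor at 70 mph, display showing 65 mph) this produces a mismatch record that the error indication module could log or display.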
  • the environmental detection module 233 may detect a certain aspect of the environment that may affect the quality of the transparent display 250 .
  • the environmental detection module 233 may determine that the conditions of the environment are influenced by a large amount of ambient light. In another example, the environmental detection module 233 may determine that the outside conditions are affected by an overcast day.
  • the environmental detection module 233 may record this condition in the data store 205 , and update the record at every predetermined interval. For example, if the vehicle in which the system 200 is implemented enters a dark place, such as a tunnel, and then exits the tunnel, the environmental detection module 233 may update its record of the outside environment based on this reading.
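A minimal sketch of the environmental detection and interval-based recording described above, assuming a mean-brightness heuristic over grayscale pixel values and illustrative thresholds (none of which are specified by the patent):

```python
def ambient_light_level(pixels):
    """Classify ambient light from mean pixel brightness (0-255 grayscale).

    Thresholds are illustrative: below 60 is treated as dark (e.g., inside
    a tunnel), above 190 as bright (e.g., excessive sunlight).
    """
    mean = sum(pixels) / len(pixels)
    if mean < 60:
        return "dark"
    if mean > 190:
        return "bright"
    return "normal"

class EnvironmentRecorder:
    """Record the latest environment reading at each predetermined interval,
    standing in for the data store 205 recordation."""

    def __init__(self, interval_s=5.0):
        self.interval_s = interval_s
        self.last_update = None
        self.condition = None

    def update(self, now_s, pixels):
        # Re-record only once the interval has elapsed (or on first reading).
        if self.last_update is None or now_s - self.last_update >= self.interval_s:
            self.condition = ambient_light_level(pixels)
            self.last_update = now_s
        return self.condition
```

A reading taken inside a tunnel would persist as "dark" until the next interval elapses after the vehicle exits, at which point the record flips to the new condition.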
  • the gesture detection module 234 may record a gesture made by the viewer of the transparent display 250 . For example, if the viewer waves his/her hand in front of the image/video capturing device 260 , the gesture detection module 234 may detect this wave, and subsequently translate this motion or action into a command.
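Translating a detected motion into a command can be as simple as a lookup table. The specific gestures and commands below are hypothetical, since the patent does not enumerate them; only the hand wave and the hand-off-wheel case are drawn from its examples:

```python
# Hypothetical gesture-to-command mapping; entries are illustrative.
GESTURE_COMMANDS = {
    "hand_wave": "dismiss_notification",
    "swipe_left": "previous_screen",
    "swipe_right": "next_screen",
    "hand_off_wheel": "show_simple_ui",   # simplify the UI while driving
}

def translate_gesture(gesture):
    """Translate a detected gesture into a display command.

    Returns None for unrecognized gestures, so unknown motions are ignored
    rather than acted upon.
    """
    return GESTURE_COMMANDS.get(gesture)
```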
  • the output module 240 acts on the various monitored data associated with the above described elements, and either signals an error or modifies the transparent display 250 . If an error is signaled, the error may be communicated to a processor associated with the operation of the transparent display 250 , or alternatively, displayed on the transparent display 250 .
  • the output module 240 includes a display modification module 241 and an error indication module 242 .
  • the display modification module 241 modifies the transparent display 250 based on the various data analyses performed by the analysis module 230 . For example, if the facial detection module 231 detects that the eyes of the viewer of the transparent display 250 are squinting, the transparent display 250 may be adjusted accordingly. An example is shown in FIG. 3 .
  • the display modification module 241 may modify the transparent display 250 based on the detected gesture by the gesture detection module 234 .
  • the gesture detection module 234 may determine that a hand is waving in front of the transparent display 250 .
  • the hand gesture may be translated into an operation enacted on the transparent display 250 .
  • when the system 200 is implemented in a vehicle, the gesture detection module 234 may detect that a hand has been removed from a steering wheel.
  • the transparent display 250 may be controlled to display a simpler display.
  • the system 200 becomes cognizant of the fact that the viewer is operating the vehicle, and thus, may be aided by a simpler user interface. An example of this is shown in FIG. 4 .
  • the display modification module 241 may alter the transparent display 250 based on the current environment. For example, based on excessive sunlight, or too little lighting, the transparent display 250 may alter a contrast on the transparent display 250 based on the detection of the environmental detection module 233 . An example of this is shown in FIG. 5 .
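The contrast adjustment described above can be sketched as a clamped step driven by the detected light level: raise contrast in excessive sunlight, lower it in the dark. The step sizes and the 0-100 contrast scale are illustrative assumptions:

```python
def adjust_contrast(current_contrast, light_level):
    """Step the display contrast based on the detected ambient light level.

    light_level is one of "bright", "dark", or "normal". Step sizes are
    illustrative; the result is clamped to an assumed 0-100 range.
    """
    step = {"bright": +15, "dark": -10, "normal": 0}[light_level]
    return max(0, min(100, current_contrast + step))
```

In the system described here, the light level would come from the environmental detection module 233 and the result would be applied through the interfacing module 220.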
  • the error indication module 242 indicates an error based on the detected error by the error detection module 232 .
  • the error indication module 242 may record the error in the data store 205 , or transmit the error to a central processing unit (computer 100 ). Alternatively, the error indication module 242 may instigate the transparent display 250 to display the error.
  • the computing system includes a processor (CPU) and a system bus that couples various system components including a system memory such as read only memory (ROM) and random access memory (RAM), to the processor. Other system memory may be available for use as well.
  • the computing system may include more than one processor or a group or cluster of computing system networked together to provide greater processing capability.
  • the system bus may be any of several types of bus structures including a memory bus or memory controller, a peripheral bus, and a local bus using any of a variety of bus architectures.
  • a basic input/output system (BIOS) stored in the ROM or the like may provide basic routines that help to transfer information between elements within the computing system, such as during start-up.
  • the computing system further includes data stores, which maintain a database according to known database management systems.
  • the data stores may be embodied in many forms, such as a hard disk drive, a magnetic disk drive, an optical disk drive, tape drive, or another type of computer readable media which can store data that are accessible by the processor, such as magnetic cassettes, flash memory cards, digital versatile disks, cartridges, random access memories (RAMs), and read-only memory (ROM).
  • the data stores may be connected to the system bus by a drive interface.
  • the data stores provide nonvolatile storage of computer readable instructions, data structures, program modules and other data for the computing system.
  • the computing system may include an input device, such as a microphone for speech and audio, a touch sensitive screen for gesture or graphical input, keyboard, mouse, motion input, and so forth.
  • An output device can include one or more of a number of output mechanisms.
  • multimodal systems enable a user to provide multiple types of input to communicate with the computing system.
  • a communications interface generally enables the computing system to communicate with one or more other computing devices using various communication and network protocols.
  • Embodiments disclosed herein can be implemented in digital electronic circuitry, or in computer software, firmware, or hardware, including the herein disclosed structures and their equivalents. Some embodiments can be implemented as one or more computer programs, i.e., one or more modules of computer program instructions, encoded on a tangible computer storage medium for execution by one or more processors.
  • a computer storage medium can be, or can be included in, a computer-readable storage device, a computer-readable storage substrate, or a random or serial access memory.
  • the computer storage medium can also be, or can be included in, one or more separate tangible components or media such as multiple CDs, disks, or other storage devices.
  • the computer storage medium does not include a transitory signal.
  • the term processor encompasses all kinds of apparatus, devices, and machines for processing data, including by way of example a programmable processor, a computer, a system on a chip, or multiple ones, or combinations, of the foregoing.
  • the processor can include special purpose logic circuitry, e.g., an FPGA (field programmable gate array) or an ASIC (application-specific integrated circuit).
  • the processor also can include, in addition to hardware, code that creates an execution environment for the computer program in question, e.g., code that constitutes processor firmware, a protocol stack, a database management system, an operating system, a cross-platform runtime environment, a virtual machine, or a combination of one or more of them.
  • a computer program (also known as a program, module, engine, software, software application, script, or code) can be written in any form of programming language, including compiled or interpreted languages, declarative or procedural languages, and the program can be deployed in any form, including as a stand-alone program or as a module, component, subroutine, object, or other unit suitable for use in a computing environment.
  • a computer program may, but need not, correspond to a file in a file system.
  • a program can be stored in a portion of a file that holds other programs or data (e.g., one or more scripts stored in a markup language document), in a single file dedicated to the program in question, or in multiple coordinated files (e.g., files that store one or more modules, sub-programs, or portions of code).
  • a computer program can be deployed to be executed on one computer or on multiple computers that are located at one site or distributed across multiple sites and interconnected by a communication network.
  • Such graphical user interfaces (GUIs) may include interactive features such as pop-up or pull-down menus or lists, selection tabs, scannable features, and other features that can receive human inputs.
  • the computing system disclosed herein can include clients and servers.
  • a client and server are generally remote from each other and typically interact through a communications network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other.
  • a server transmits data (e.g., an HTML page) to a client device (e.g., for purposes of displaying data to and receiving user input from a user interacting with the client device).
  • Data generated at the client device (e.g., a result of the user interaction) can be received from the client device at the server.

Abstract

A system and method for adjusting a display with an image capturing device is provided. The method includes receiving an image, detecting a facial feature, such as eyes squinting, and adjusting a contrast of a display accordingly. As such, by employing a facial feature detection technique, a display may dynamically update its contrast.

Description

    CROSS REFERENCE TO RELATED APPLICATIONS
  • This U.S. patent application is a Continuation application of U.S. patent application Ser. No. 14/276,635, filed May 13, 2014, entitled “Adjusting A Transparent Display With An Image Capturing Device,” now pending, which claims the benefit of U.S. Provisional Patent Application Ser. No. 61/842,783, filed Jul. 3, 2013, entitled “Adjusting A Transparent Display With An Image Capturing Device,” now expired, the entire disclosure of the applications being considered part of the disclosure of this application and hereby incorporated by reference.
  • BACKGROUND
  • Transparent displays, such as a transparent light emitting diode (LED) display, may be provided to augment pre-existing display units. The transparent display allows a viewer to see through the display while simultaneously being presented information on the display.
  • The transparent display may be implemented in a vehicle. The vehicle is ideal for a transparent display because the transparent display allows the operator of the vehicle to view mechanical components disposed at the rear of the display (e.g., gauges), while simultaneously being served information on the transparent display.
  • The transparent display may convey information, such as information directed to road conditions, weather, vehicle status, and the like. Thus, the operator of the vehicle may rely on the display of the transparent display to safely and efficiently operate the vehicle.
  • Vehicles may also incorporate cameras. Cameras, or image capturing devices, may assist the driver in various operations. The camera may be placed in the rear of the vehicle, thereby alerting the vehicle's operator of any obstacles that may be in the vehicle's path while reversing.
  • SUMMARY
  • A system and method for adjusting a transparent display with an image capturing device is provided. The system includes an image receiving module to receive an image from the image capturing device, the image capturing device being situated on a side opposing that on which the transparent display presents content; an interfacing module to interface with the transparent display; an analysis module to analyze the received image with the interfacing module to perform an analysis; and an output module to perform a display modification or an error indication based on the analysis.
  • DESCRIPTION OF THE DRAWINGS
  • The detailed description refers to the following drawings, in which like numerals refer to like items, and in which:
  • FIG. 1 is a block diagram illustrating an example computer.
  • FIG. 2 illustrates an example of a system for adjusting a transparent display with an image capturing device.
  • FIGS. 3-5 illustrate example implementations of the system of FIG. 2.
  • DETAILED DESCRIPTION
  • Transparent displays allow a viewer of the transparent display to see a surrounding environment, while being simultaneously presented information contained on the transparent display. The transparent display may be implemented in various locations. One such example is a vehicle, and in particular, a dashboard area of a vehicle.
  • However, in certain cases, the information presented on the transparent display may be erroneous. For example, the information may mislead the driver (e.g., present the wrong speed of the vehicle, or the wrong physical location). This may occur because the machine driving the transparent display contains software bugs or is not operating correctly.
  • In other cases, the transparent display may be difficult to read. This may be caused by environmental conditions, like an excess or lack of lighting. Thus, the transparent display becomes difficult to see for the operator of the vehicle.
  • Further, the transparent display may be operated by buttons or physical inputs. Thus, the buttons or physical inputs may encourage an operator of the vehicle to reach over and engage any of the buttons or physical inputs. The action of extending one's appendage to perform this task may be burdensome, and in certain cases, unsafe.
  • Disclosed herein are systems and methods for integrating a camera behind a transparent display. The camera integrated with the transparent display, according to aspects disclosed herein, may be achieved by placing the camera behind the transparent display, on a side opposing the viewer of the display. In this way, the camera may view not only the viewer of the display, but also the content being displayed on the transparent display.
  • Further, the aspects disclosed herein may be implemented in a vehicle. For the reasons discussed further below, employing the aspects disclosed herein leads to a safer, more convenient, and enhanced driving experience.
  • FIG. 1 is a block diagram illustrating an example computer 100. The computer 100 includes at least one processor 102 coupled to a chipset 104. The chipset 104 includes a memory controller hub 120 and an input/output (I/O) controller hub 122. A memory 106 and a graphics adapter 112 are coupled to the memory controller hub 120, and a display 118 is coupled to the graphics adapter 112. A storage device 108, keyboard 110, pointing device 114, and network adapter 116 are coupled to the I/O controller hub 122. Other embodiments of the computer 100 may have different architectures.
  • The storage device 108 is a non-transitory computer-readable storage medium such as a hard drive, compact disk read-only memory (CD-ROM), DVD, or a solid-state memory device. The memory 106 holds instructions and data used by the processor 102. The pointing device 114 is a mouse, track ball, or other type of pointing device, and is used in combination with the keyboard 110 to input data into the computer system 100. The graphics adapter 112 displays images and other information on the display 118. The network adapter 116 couples the computer system 100 to one or more computer networks.
  • The computer 100 is adapted to execute computer program modules for providing functionality described herein. As used herein, the term “module” refers to computer program logic used to provide the specified functionality. Thus, a module can be implemented in hardware, firmware, and/or software. In one embodiment, program modules are stored on the storage device 108, loaded into the memory 106, and executed by the processor 102.
  • The types of computers used by the entities and processes disclosed herein can vary depending upon the embodiment and the processing power required by the entity. The computer 100 may be a mobile device, tablet, smartphone, or any sort of computing element with the above-listed elements. For example, a data store, such as a hard disk, solid-state memory, or storage device, might be implemented in a distributed database system comprising multiple blade servers working together to provide the functionality described herein. The computers can lack some of the components described above, such as keyboards 110, graphics adapters 112, and displays 118.
  • FIG. 2 illustrates an example of a system 200 for adjusting a transparent display 250 with an image/video capturing device 260. The system 200 may be incorporated as a device, such as computer 100. The system 200 includes an image receiving module 210, a display interfacing module 220, an analysis module 230, and an output module 240. The image receiving module 210 may be implemented to communicate with an image/video capturing device 260. The image/video capturing device 260 may be any sort of image or video capturing device, such as a digital camera, a digital video recorder, or the like. Alternatively, the system 200 may be incorporated with the image capturing device 260, the transparent display 250, or a combination thereof.
  • FIGS. 3-5 illustrate example implementations of the system 200.
  • The image/video capturing device 260 may be oriented in a position behind the transparent display 250. The image/video capturing device 260 may be oriented in a way to capture the contents being displayed on the transparent display 250, the viewer of the transparent display 250, and any other object in an environment in which the image/video capturing device 260 and the transparent display 250 are implemented. The transparent display 250, the image/video capturing device 260, and the system 200 may be implemented in an environment such as a dashboard of a vehicle.
  • The image receiving module 210 receives either images or videos from the image/video capturing device 260. The image receiving module 210 may receive either the images or videos at predetermined intervals, in real-time, or in response to an operation of the vehicle or motion detected by the image/video capturing device 260. The image receiving module 210 may store the images or videos in a data store 205.
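One illustrative way such an image receiving module might poll a capture device and record frames in a data store can be sketched as follows. The class names, the `grab_frame()` interface, and the in-memory list standing in for data store 205 are all assumptions for illustration, not part of the disclosure:

```python
import time


class ImageReceivingModule:
    """Sketch of an image receiving module that polls a capture device.

    `capture_device` is any object exposing a `grab_frame()` method; that
    interface is an assumption made for this example.
    """

    def __init__(self, capture_device, interval_s=1.0):
        self.capture_device = capture_device
        self.interval_s = interval_s          # predetermined polling interval
        self.data_store = []                  # stands in for data store 205

    def poll_once(self):
        # Grab one frame and record it with a timestamp.
        frame = self.capture_device.grab_frame()
        self.data_store.append((time.time(), frame))
        return frame


class FakeCamera:
    """Stub camera used so the sketch is self-contained."""

    def grab_frame(self):
        return "frame-bytes"


module = ImageReceivingModule(FakeCamera(), interval_s=0.5)
module.poll_once()
```

In a real implementation, `poll_once` would run on a timer or be triggered by a vehicle event, and the data store would be persistent rather than an in-memory list.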
  • The display interfacing module 220 interfaces with the transparent display 250. The display interfacing module 220 may monitor the current content being displayed on the transparent display 250. Alternatively, the display interfacing module 220 may interact with a buffer implemented with the transparent display 250 and retrieve content that has already been displayed on the transparent display 250.
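The buffer of already-displayed content mentioned above might be sketched as a small fixed-capacity ring buffer. The class and method names, and the capacity, are illustrative assumptions only:

```python
from collections import deque


class DisplayBuffer:
    """Illustrative fixed-capacity buffer of recently displayed frames."""

    def __init__(self, capacity=8):
        # deque with maxlen discards the oldest frame when full
        self.frames = deque(maxlen=capacity)

    def push(self, frame):
        self.frames.append(frame)

    def latest(self):
        return self.frames[-1] if self.frames else None


buf = DisplayBuffer(capacity=2)
for frame in ("a", "b", "c"):
    buf.push(frame)
```

With capacity 2, pushing three frames retains only the two most recent, which is the behavior a display interfacing module would rely on when retrieving recently displayed content.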
  • The analysis module 230 includes a facial detection module 231, an error detection module 232, an environmental detection module 233, and a gesture detection module 234. Depending on the implementation of system 200, various combinations of the elements 231-234 may be selectively included. The various elements of the analysis module 230 may interact with the image/video capturing device 260 and the transparent display 250 to perform the various operations described below.
  • The facial detection module 231 analyzes the image/video captured by the image receiving module 210. The facial detection module 231 determines if the image contains the viewer of the transparent display 250. The facial detection module 231 may further identify the viewer's face, or even further, identify features on the face. For example, an identified feature may be the viewer's eyes. Once the feature is identified, the facial detection module 231 may determine a state of the viewer's eyes, such as the eyes being closed, squinting, or wide open.
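The determination of an eye state such as closed, squinting, or wide open might reduce, after feature extraction, to thresholding a normalized eye-opening measurement. The thresholds and the ratio representation below are assumptions for illustration, not values from the disclosure:

```python
def classify_eye_state(eye_opening_ratio):
    """Classify an eye state from a normalized eye-opening ratio in [0, 1].

    The cutoff values are illustrative assumptions.
    """
    if eye_opening_ratio < 0.1:
        return "closed"
    if eye_opening_ratio < 0.3:
        return "squinting"
    return "open"
```

A facial detection module would first locate the face and eyes in the captured image (for example with a standard face-landmark detector) and then compute such a ratio before classifying.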
  • The error detection module 232, based on the content retrieved by the display interfacing module 220, may determine if the transparent display 250 is currently misoperating. For example, if the driving logic of the transparent display 250 contains an error, or the transparent display 250 does not display the correct information, the error detection module 232 may record this error or inconsistency. The error detection module 232 is facilitated by the placement of the image/video capturing device 260 in a position in which it directly views the transparent display 250.
  • In another example, the error detection module 232 may review the refresh/rendering rate on the transparent display 250. If the error detection module 232 detects that the refresh/rendering rate is not within a predetermined threshold of an acceptable rate, the error detection module 232 may record that an error has occurred.
  • In another example, the error detection module 232 may interface with a sensor or another image/video capturing device associated with a location at which the system 200 is implemented. For example, if system 200 is implemented in a vehicle, an exterior camera/sensor associated with the vehicle may determine that the speed of the vehicle should be or is at 70 miles per hour (mph). The image/video capturing device 260 may simultaneously determine that the transparent display 250 indicates a speed of 65 mph. In this situation, the error detection module 232 may detect that the transparent display 250 is indicating erroneous data.
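The speed-mismatch check described above can be sketched as a simple comparison against a tolerance. The function name and the tolerance value are assumptions for illustration:

```python
def detect_display_error(sensor_speed_mph, displayed_speed_mph,
                         tolerance_mph=2.0):
    """Flag an error when the speed shown on the display disagrees with
    the exterior sensor's speed by more than `tolerance_mph`.

    The tolerance is an assumed parameter, not a value from the patent.
    """
    return abs(sensor_speed_mph - displayed_speed_mph) > tolerance_mph
```

In the 70 mph versus 65 mph example from the text, the 5 mph discrepancy exceeds the assumed tolerance and would be recorded as an error.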
  • The environmental detection module 233 may detect a certain aspect of the environment that may affect the quality of the transparent display 250. The environmental detection module 233 may determine that the conditions of the environment are influenced by a large amount of ambient light. In another example, the environmental detection module 233 may determine that the outside conditions are affected by an overcast day. The environmental detection module 233 may record this condition in the data store 205, and periodically update this recordation at every predetermined interval. For example, if the vehicle in which the system 200 is implemented enters a dark place, such as a tunnel, and then egresses the tunnel, the environmental detection module 233 may update its recordation of the outside environment based on this reading.
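One way the environmental detection module might reduce a raw ambient-light reading to a recordable condition label can be sketched as below. The lux thresholds and labels are illustrative assumptions:

```python
def classify_ambient_light(lux):
    """Map a raw ambient-light reading (in lux) to a coarse condition label.

    The threshold values are illustrative assumptions, not values from
    the disclosure.
    """
    if lux < 50:
        return "dark"       # e.g. inside a tunnel at night
    if lux < 10000:
        return "overcast"   # e.g. a cloudy day
    return "bright"         # e.g. direct sunlight
```

The module would store the returned label in the data store and re-evaluate it at each predetermined interval, so entering and then egressing a tunnel produces an updated recordation.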
  • The gesture detection module 234 may record a gesture made by the viewer of the transparent display 250. For example, if the viewer waves his/her hand in front of the image/video capturing device 260, the gesture detection module 234 may detect this wave, and subsequently translate this motion or action into a command.
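The translation of a recognized motion into a command might be implemented as a lookup from gesture labels to display operations. The gesture names and command strings below are hypothetical examples, not gestures enumerated in the disclosure:

```python
# Hypothetical mapping of detected gestures to display commands.
GESTURE_COMMANDS = {
    "wave": "dismiss_notification",
    "swipe_left": "previous_screen",
    "swipe_right": "next_screen",
}


def translate_gesture(gesture):
    """Translate a detected gesture label into a display command.

    Returns None for gestures with no associated command.
    """
    return GESTURE_COMMANDS.get(gesture)
```

The gesture recognition itself (turning raw video into a label such as "wave") would be performed upstream by the gesture detection module's analysis of the captured frames.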
  • The output module 240 translates the various monitored data associated with the above described elements, and either signifies information detecting an error, or modifies the transparent display 250. If an error is signified, the error may be communicated to a processor associated with the operation of the transparent display 250, or alternatively, displayed on the transparent display 250. The output module 240 includes a display modification module 241 and an error indication module 242.
  • The display modification module 241 modifies the transparent display 250 based on the various data analyses performed by the analysis module 230. For example, if the facial detection module 231 detects that the eyes of the viewer of the transparent display 250 are squinting, the transparent display 250 may be adjusted accordingly. An example is shown in FIG. 3.
  • In another example, the display modification module 241 may modify the transparent display 250 based on the detected gesture by the gesture detection module 234. The gesture detection module 234 may determine that a hand is waving in front of the transparent display 250. The hand gesture may be translated into an operation enacted on the transparent display 250.
  • Alternatively, the gesture detection module 234, in response to system 200 being implemented in a vehicle, may detect that a hand has been displaced from a steering wheel. In response to one of the viewer's hands being displaced from the steering wheel, the transparent display 250 may be controlled to display a simpler display. A justification for this is that the system 200 becomes cognizant of the fact that the viewer is operating the vehicle, and thus, may be aided by a simpler user interface. An example of this is shown in FIG. 4.
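The switch to a simpler display when a hand leaves the steering wheel can be sketched as a mode selection keyed on the number of detected hands. The mode names are illustrative assumptions:

```python
def select_display_mode(hands_on_wheel):
    """Choose a display layout from how many of the viewer's hands the
    gesture detection module observes on the steering wheel.

    The mode names "full" and "simplified" are assumptions for
    illustration.
    """
    return "full" if hands_on_wheel >= 2 else "simplified"
```

With both hands detected on the wheel the richer layout is retained; once a hand is displaced, the simplified interface is selected so the operator's attention stays on driving.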
  • In another example, if the system 200 is implemented in a location that experiences dynamic environments, such as a vehicle, the display modification module 241 may alter the transparent display 250 based on the current environment. For example, in response to excessive sunlight or too little lighting, the display modification module 241 may alter a contrast of the transparent display 250 based on the detection of the environmental detection module 233. An example of this is shown in FIG. 5.
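A contrast adjustment driven by the environmental detection might look like the sketch below, raising contrast in bright light and lowering it in dim light. The step size, thresholds, and 0-100 contrast scale are all assumptions for illustration:

```python
def adjust_contrast(current_contrast, ambient_lux):
    """Raise contrast in bright light and lower it in dim light,
    clamped to an assumed 0-100 contrast scale.

    The thresholds and step size are illustrative assumptions.
    """
    if ambient_lux > 10000:       # e.g. direct sunlight
        return min(100, current_contrast + 20)
    if ambient_lux < 50:          # e.g. inside a tunnel
        return max(0, current_contrast - 20)
    return current_contrast       # moderate light: leave unchanged
```

The display modification module would call such a function each time the environmental detection module updates its recordation of the ambient conditions.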
  • The error indication module 242 indicates an error based on the detected error by the error detection module 232. The error indication module 242 may record the error in the data store 205, or transmit the error to a central processing unit (computer 100). Alternatively, the error indication module 242 may instigate the transparent display 250 to display the error.
  • Certain of the devices shown in FIG. 1 include a computing system. The computing system includes a processor (CPU) and a system bus that couples various system components including a system memory such as read only memory (ROM) and random access memory (RAM), to the processor. Other system memory may be available for use as well. The computing system may include more than one processor or a group or cluster of computing systems networked together to provide greater processing capability. The system bus may be any of several types of bus structures including a memory bus or memory controller, a peripheral bus, and a local bus using any of a variety of bus architectures. A basic input/output system (BIOS) stored in the ROM or the like, may provide basic routines that help to transfer information between elements within the computing system, such as during start-up. The computing system further includes data stores, which maintain a database according to known database management systems. The data stores may be embodied in many forms, such as a hard disk drive, a magnetic disk drive, an optical disk drive, tape drive, or another type of computer readable medium which can store data that are accessible by the processor, such as magnetic cassettes, flash memory cards, digital versatile disks, cartridges, random access memories (RAMs), and read only memory (ROM). The data stores may be connected to the system bus by a drive interface. The data stores provide nonvolatile storage of computer readable instructions, data structures, program modules and other data for the computing system.
  • To enable human (and in some instances, machine) user interaction, the computing system may include an input device, such as a microphone for speech and audio, a touch sensitive screen for gesture or graphical input, keyboard, mouse, motion input, and so forth. An output device can include one or more of a number of output mechanisms. In some instances, multimodal systems enable a user to provide multiple types of input to communicate with the computing system. A communications interface generally enables the computing system to communicate with one or more other computing devices using various communication and network protocols.
  • Embodiments disclosed herein can be implemented in digital electronic circuitry, or in computer software, firmware, or hardware, including the herein disclosed structures and their equivalents. Some embodiments can be implemented as one or more computer programs, i.e., one or more modules of computer program instructions, encoded on a tangible computer storage medium for execution by one or more processors. A computer storage medium can be, or can be included in, a computer-readable storage device, a computer-readable storage substrate, or a random or serial access memory. The computer storage medium can also be, or can be included in, one or more separate tangible components or media such as multiple CDs, disks, or other storage devices. The computer storage medium does not include a transitory signal.
  • As used herein, the term processor encompasses all kinds of apparatus, devices, and machines for processing data, including by way of example a programmable processor, a computer, a system on a chip, or multiple ones, or combinations, of the foregoing. The processor can include special purpose logic circuitry, e.g., an FPGA (field programmable gate array) or an ASIC (application-specific integrated circuit). The processor also can include, in addition to hardware, code that creates an execution environment for the computer program in question, e.g., code that constitutes processor firmware, a protocol stack, a database management system, an operating system, a cross-platform runtime environment, a virtual machine, or a combination of one or more of them.
  • A computer program (also known as a program, module, engine, software, software application, script, or code) can be written in any form of programming language, including compiled or interpreted languages, declarative or procedural languages, and the program can be deployed in any form, including as a stand-alone program or as a module, component, subroutine, object, or other unit suitable for use in a computing environment. A computer program may, but need not, correspond to a file in a file system. A program can be stored in a portion of a file that holds other programs or data (e.g., one or more scripts stored in a markup language document), in a single file dedicated to the program in question, or in multiple coordinated files (e.g., files that store one or more modules, sub-programs, or portions of code). A computer program can be deployed to be executed on one computer or on multiple computers that are located at one site or distributed across multiple sites and interconnected by a communication network.
  • To provide for interaction with an individual, the herein disclosed embodiments can be implemented using an interactive display, such as a graphical user interface (GUI). Such GUIs may include interactive features such as pop-up or pull-down menus or lists, selection tabs, scannable features, and other features that can receive human inputs.
  • The computing system disclosed herein can include clients and servers. A client and server are generally remote from each other and typically interact through a communications network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. In some embodiments, a server transmits data (e.g., an HTML page) to a client device (e.g., for purposes of displaying data to and receiving user input from a user interacting with the client device). Data generated at the client device (e.g., a result of the user interaction) can be received from the client device at the server.

Claims (4)

We claim:
1. A method for adjusting a display with an image capturing device, comprising:
receiving an image from the image capturing device, the image capturing device being oriented towards a driver of a vehicle;
detecting, from the received image, a facial feature of eyes squinting; and
adjusting a contrast of the display based on a detection of the eyes squinting.
2. The method according to claim 1, wherein the display is a transparent display.
3. The method according to claim 1, further comprising receiving data from an environmental sensor.
4. The method according to claim 3, wherein the adjusting further comprises employing a combination of the detection of squinting and the received data from the environmental sensor.
US15/064,295 2013-07-03 2016-03-08 Adjusting a transparent display with an image capturing device Abandoned US20160189678A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US15/064,295 US20160189678A1 (en) 2013-07-03 2016-03-08 Adjusting a transparent display with an image capturing device

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US201361842783P 2013-07-03 2013-07-03
US14/276,635 US9741315B2 (en) 2013-07-03 2014-05-13 Adjusting a transparent display with an image capturing device
US15/064,295 US20160189678A1 (en) 2013-07-03 2016-03-08 Adjusting a transparent display with an image capturing device

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US14/276,635 Continuation US9741315B2 (en) 2013-07-03 2014-05-13 Adjusting a transparent display with an image capturing device

Publications (1)

Publication Number Publication Date
US20160189678A1 true US20160189678A1 (en) 2016-06-30

Family

ID=52132460

Family Applications (2)

Application Number Title Priority Date Filing Date
US14/276,635 Expired - Fee Related US9741315B2 (en) 2013-07-03 2014-05-13 Adjusting a transparent display with an image capturing device
US15/064,295 Abandoned US20160189678A1 (en) 2013-07-03 2016-03-08 Adjusting a transparent display with an image capturing device

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US14/276,635 Expired - Fee Related US9741315B2 (en) 2013-07-03 2014-05-13 Adjusting a transparent display with an image capturing device

Country Status (3)

Country Link
US (2) US9741315B2 (en)
JP (1) JP5819488B2 (en)
CN (1) CN104281258B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150138417A1 (en) * 2013-11-18 2015-05-21 Joshua J. Ratcliff Viewfinder wearable, at least in part, by human operator

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140368425A1 (en) * 2013-06-12 2014-12-18 Wes A. Nagara Adjusting a transparent display with an image capturing device
US9514706B1 (en) 2015-05-28 2016-12-06 Chunghwa Picture Tubes, Ltd. Transparent display apparatus and image adjustment method thereof
KR20180074973A (en) * 2016-12-26 2018-07-04 삼성전자주식회사 Electronic device for linking/separating information between digital displays
US10714018B2 (en) * 2017-05-17 2020-07-14 Ignis Innovation Inc. System and method for loading image correction data for displays

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070159470A1 (en) * 2006-01-11 2007-07-12 Industrial Technology Research Institute Apparatus for automatically adjusting display parameters relying on visual performance and method for the same
US20140092121A1 (en) * 2012-09-28 2014-04-03 Hewlett-Packard Development Company, Lp System with content display management

Family Cites Families (25)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7469381B2 (en) * 2007-01-07 2008-12-23 Apple Inc. List scrolling and document translation, scaling, and rotation on a touch-screen display
US8068942B2 (en) * 1999-12-15 2011-11-29 Automotive Technologies International, Inc. Vehicular heads-up display system
JP2001306254A (en) * 2000-02-17 2001-11-02 Seiko Epson Corp Inputting function by slapping sound detection
US6952195B2 (en) * 2000-09-12 2005-10-04 Fuji Photo Film Co., Ltd. Image display device
JP4624577B2 (en) * 2001-02-23 2011-02-02 富士通株式会社 Human interface system with multiple sensors
US6496117B2 (en) * 2001-03-30 2002-12-17 Koninklijke Philips Electronics N.V. System for monitoring a driver's attention to driving
US6822624B2 (en) * 2002-09-10 2004-11-23 Universal Avionics Systems Corporation Display generation system
US7248229B2 (en) * 2003-12-31 2007-07-24 Zerphy Bryron L Dynamic message sign display panel communication error detection and correction
JP4588366B2 (en) * 2004-06-08 2010-12-01 株式会社リコー Image display device
JP4872451B2 (en) * 2006-05-15 2012-02-08 トヨタ自動車株式会社 Vehicle input device
JP4333697B2 (en) * 2006-06-06 2009-09-16 トヨタ自動車株式会社 Vehicle display device
JP4127296B2 (en) * 2006-06-09 2008-07-30 ソニー株式会社 IMAGING DEVICE, IMAGING DEVICE CONTROL METHOD, AND COMPUTER PROGRAM
KR100781651B1 (en) * 2006-11-06 2007-12-03 삼성전자주식회사 Apparatus and method for displaying screen in portable terminal
DE102007035769A1 (en) * 2007-07-27 2009-02-26 Continental Automotive Gmbh Motor vehicle cockpit
JP2009168614A (en) * 2008-01-16 2009-07-30 Alpine Electronics Inc On-vehicle navigation device
JP5211120B2 (en) * 2010-07-30 2013-06-12 株式会社東芝 Information display device and information display method
JP2013545154A (en) * 2010-09-10 2013-12-19 ワイフェアラー・インコーポレーテッド RF fingerprint for content location
US8608319B2 (en) * 2011-04-19 2013-12-17 Igt Multi-layer projection displays
JP5830987B2 (en) * 2011-07-06 2015-12-09 ソニー株式会社 Display control apparatus, display control method, and computer program
EP2587818B1 (en) * 2011-10-27 2016-08-10 Samsung Electronics Co., Ltd. Multi-view device of display apparatus and control method thereof, and display system
JP5799776B2 (en) * 2011-11-29 2015-10-28 株式会社バッファロー Information display device and program
JP2013125985A (en) * 2011-12-13 2013-06-24 Sharp Corp Display system
EP2608546A1 (en) * 2011-12-21 2013-06-26 Thomson Licensing Video processing apparatus and method for detecting a temporal synchronization mismatch
TWI531495B (en) * 2012-12-11 2016-05-01 Automatic Calibration Method and System for Vehicle Display System
US20140204200A1 (en) * 2013-01-24 2014-07-24 Wipro Limited Methods and systems for speed calibration in spectral imaging systems

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070159470A1 (en) * 2006-01-11 2007-07-12 Industrial Technology Research Institute Apparatus for automatically adjusting display parameters relying on visual performance and method for the same
US20140092121A1 (en) * 2012-09-28 2014-04-03 Hewlett-Packard Development Company, Lp System with content display management

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150138417A1 (en) * 2013-11-18 2015-05-21 Joshua J. Ratcliff Viewfinder wearable, at least in part, by human operator
US9491365B2 (en) * 2013-11-18 2016-11-08 Intel Corporation Viewfinder wearable, at least in part, by human operator

Also Published As

Publication number Publication date
CN104281258B (en) 2019-08-06
JP5819488B2 (en) 2015-11-24
CN104281258A (en) 2015-01-14
US9741315B2 (en) 2017-08-22
JP2015015025A (en) 2015-01-22
US20150009126A1 (en) 2015-01-08

Similar Documents

Publication Publication Date Title
US20160189678A1 (en) Adjusting a transparent display with an image capturing device
US9437131B2 (en) Driving a multi-layer transparent display
CN109145680B (en) Method, device and equipment for acquiring obstacle information and computer storage medium
US10931912B2 (en) Detecting anomalous events to trigger the uploading of video to a video storage server
US11363192B2 (en) Method, and apparatus for clock synchronization, device, storage medium and vehicle
CN107666987A (en) Robotic process automates
US9978074B2 (en) Automated experiment scheduling
CN109738904A (en) A kind of method, apparatus of detection of obstacles, equipment and computer storage medium
CN107040574B (en) Screenshot and data processing method and device
US20150187143A1 (en) Rendering a virtual representation of a hand
US9740668B1 (en) Plotting webpage loading speeds and altering webpages and a service based on latency and pixel density
EP3738027B1 (en) Feature usage prediction using shell application feature telemetry
US10209772B2 (en) Hands-free time series or chart-based data investigation
US9805254B2 (en) Preventing display clearing
US11462025B2 (en) Method of and system for determining traffic signal state
US9875019B2 (en) Indicating a transition from gesture based inputs to touch surfaces
US20200081024A1 (en) Method and device for detecting obstacle speed, computer device, and storage medium
WO2020061449A1 (en) Apparatus and method for providing an electronic user manual
US20150185831A1 (en) Switching between gaze tracking and head tracking
Milazzo et al. KIND‐DAMA: A modular middleware for Kinect‐like device data management
CN114740975A (en) Target content acquisition method and related equipment
Liu et al. MobileUTDrive: A portable device platform for in-vehicle driving data collection
DE102014108656B4 (en) Customize a transparent display with an image capture device
CN104243953A (en) Adjusting a transparent display with an image capturing device
CN112000538B (en) Page content display monitoring method, device and equipment and readable storage medium

Legal Events

Date Code Title Description
AS Assignment

Owner name: VISTEON GLOBAL TECHNOLOGIES, INC., MICHIGAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:NAGARA, WES A.;REEL/FRAME:037967/0398

Effective date: 20140513

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION