WO2015026296A1 - System and method for providing illumination to an interior of a vehicle - Google Patents

System and method for providing illumination to an interior of a vehicle

Info

Publication number
WO2015026296A1
WO2015026296A1 (PCT/SG2014/000391)
Authority
WO
WIPO (PCT)
Prior art keywords
illumination
vehicle
image
led
data
Prior art date
Application number
PCT/SG2014/000391
Other languages
English (en)
French (fr)
Inventor
Philipp MUNDHENK
Sebastian STEINHORST
Martin LUKASIEWYCZ
Kai Xiang WANG
Original Assignee
Tum Create Limited
Priority date
Filing date
Publication date
Application filed by Tum Create Limited filed Critical Tum Create Limited
Priority to DE112014003826.1T priority Critical patent/DE112014003826T5/de
Publication of WO2015026296A1 publication Critical patent/WO2015026296A1/en

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60QARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
    • B60Q3/00Arrangement of lighting devices for vehicle interiors; Lighting devices specially adapted for vehicle interiors
    • B60Q3/80Circuits; Control arrangements
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60QARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
    • B60Q2500/00Special features or arrangements of vehicle interior lamps
    • B60Q2500/30Arrangements for illuminating different zones in the vehicle, e.g. front/rear, different seats
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/10Image acquisition
    • G06V10/12Details of acquisition arrangements; Constructional details thereof
    • G06V10/14Optical characteristics of the device performing the acquisition or on the illumination arrangements
    • G06V10/141Control of illumination
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • G06V10/56Extraction of image or video features relating to colour
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/56Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle

Definitions

  • the present invention relates to a system and method for providing illumination to an interior of a vehicle.
  • the present invention aims to provide a new and useful system and method for providing illumination to the interior of a vehicle.
  • the invention proposes that illumination with characteristics varying with the environment around the vehicle is provided to the interior of the vehicle.
  • a first aspect of the invention is an apparatus for providing illumination to an interior of a vehicle, the apparatus comprising:
  • input means configured to provide input data varying according to an environment surrounding the vehicle
  • processing means configured to process the input data to determine characteristics of the illumination to be provided; and output means configured to provide the illumination with the determined characteristics to the interior of the vehicle;
  • the characteristics of the illumination provided by the output means match the characteristics of the environment surrounding the vehicle when the illumination is provided; and wherein the characteristics comprise both colours and illuminance.
  • a second aspect of the invention is a method for providing illumination to an interior of a vehicle, the method comprising:
  • the characteristics of the illumination provided by the output means match the characteristics of the environment surrounding the vehicle when the illumination is provided;
  • the characteristics comprise both colours and illuminance.
  • the invention helps to influence the psychology of the passengers in the vehicle in a positive manner. For example, it can induce specific feelings and moods in the passengers.
  • the invention also enhances the experience of the passengers by offering a more intense and natural perception of the environment surrounding the vehicle, as it allows reproduction of the surrounding lighting conditions inside the vehicle. This perception is otherwise diminished due to small or tinted windows. Therefore, the invention allows projection of information into the field of view of the passengers which is otherwise not visible to them. Configuring the lights such that their colours correspond to the colours of the environment also helps enhance the feel of spaciousness in the confined compartment of a vehicle.
  • the invention achieves a visual perception enlargement effect as the lights provided to the interior of the vehicles adapt in real-time to the characteristics (colour and illuminance) of the environment around the vehicle.
  • the lights may be arranged around a particular window and when the colours of the lights correspond to the view from the window, one perceives the coloured light as an extension of the window and hence, perceives the window as being larger.
  • the apparatus is suitable for operating during daylight.
  • the visual perception enlargement effect is caused by the inhomogeneous field of view of humans. This is due to the distribution of cone cells. In particular, the cone cells are concentrated in a certain area of the eye, the fovea.
  • Fig. 1 shows the visual acuity of the human eye in relation to the distance from the fovea [KGS05].
  • the visual acuity has its maximum of 1 per arc minute in the area where the cone cells are concentrated. Only 10° away, the visual acuity decreases to 0.1 per arc minute [Gre08]. Due to this, the image is blurred on the rest of the retina.
  • the brain processes the signals provided by the eye. While doing so, the brain follows distinct patterns, which are described by the theories of Gestalt Psychology. Thus, areas which are close or have the same colour are grouped together, leading to the enlargement of the perceptual view [Roc97]. It has been proven in many studies [SVK07, KP99] that visual perception enlargement provides several positive effects. One such effect is the reduction of visual discomfort [BA06].
  • the Ambilight TV uses LEDs which are mounted on its back. These LEDs serve to reduce eyestrain and visual discomfort, and enlarge visual perception.
  • One of the recent models, the 6000 series, has 10 LEDs mounted on both sides of the TV. The LEDs illuminate in the same colour as the current image on the television. To achieve this, integrated electronics are used to analyze the input video signal in real-time to determine the colour for each LED.
  • the coloured light is reflected off the wall behind the TV and this creates a perceived enlargement of the TV image [vdH08].
  • the TV is the main source of light in a dark room
  • the LEDs of the Ambilight TV function as another lighting, apart from the actual TV screen, so that the change in brightness becomes smaller.
  • the pupil of the person watching the Ambilight TV which continuously adapts to the light level, is thus not as strained as it would be without the LEDs [vdH08].
  • the enlargement of the visual perception created by this invention can help reduce the power consumed by the air conditioner in the vehicle. This is especially useful in tropical megacities, where the air conditioner in the vehicle is usually turned on throughout the car ride so as to cool the interior of the vehicle. This advantage is explained as follows.
  • a reduction of the actual window size can help lower the heat flow. However, this is generally not feasible since a decreased window size will reduce the view of the passenger. As a result, the passenger's comfort is negatively affected.
  • the actual window size can be reduced and yet, the passenger will not be substantially affected as he or she perceives the window to be bigger than its actual size. Therefore, with the present invention, the window size and hence the power consumption of the air conditioner can be reduced, without negatively affecting the comfort of the passenger.
  • Fig. 1 shows visual acuity of the human eye in relation to the fovea distance
  • Fig. 2 shows an apparatus for providing illumination to an interior of a vehicle according to an embodiment of the present invention
  • Figs. 3(a) - (c) show components of the apparatus of Fig. 2;
  • Fig. 4 shows the relative light intensity of a LED as a function of the scanning angle
  • Fig. 5 shows an example way of installing a LED-strip of the apparatus of Fig. 2 in a car
  • Fig. 6 shows the interfaces between processing means and output means of the apparatus of Fig. 2;
  • Fig. 7 shows a data package according to the RGB-data transmission protocol adopted for colour data transmission in the apparatus of Fig. 2;
  • Fig. 8 shows the encoding of the LED enumeration as single bits of one byte in the transmission protocol of Fig. 7;
  • Fig. 9 shows the basic structure of the single transmission protocol used for transmitting colour data for one LED in the apparatus of Fig. 2;
  • Fig. 10 shows the initialization protocol used in the apparatus of Fig. 2;
  • Fig. 11 shows the delay measurement protocol used in the apparatus of Fig. 2;
  • Fig. 12 shows the tasks performed by the apparatus of Fig. 2 after its configuration and initialization
  • Fig. 13 shows the tasks of Fig. 12 together with two further blocks for configuring and initializing the apparatus
  • Fig. 14 shows information flow in the apparatus of Fig. 2;
  • Fig. 15 shows the schematic of the process structure implemented on the apparatus of Fig. 2;
  • Fig. 16 shows representative colours determined with and without first converting the image data into the HSB space
  • Fig. 17 shows a situation where the colours of the illumination provided by the apparatus of Fig. 2 are mismatched with the environment and a situation where the colours of the illumination provided by the apparatus of Fig. 2 match the environment.
  • Fig. 2 shows an apparatus 200 for providing illumination to an interior of a vehicle according to an embodiment of the present invention.
  • the apparatus 200 may be referred to as a Car Environment Light System (CELS 200).
  • CELS 200 may be described as a model which:
  • i. the CELS 200 may be isolated from its surroundings (i.e. the CELS 200 may be referred to as a super-system)
  • ii. contains relations between different attributes (inputs, outputs, states, etc.)
  • iii. consists of interconnected components or subsystems.
  • Fig. 2 shows the CELS 200 with its input and output. Due to the interaction with its surroundings, the CELS 200 is considered an open system. The CELS 200 adapts to the environment which functions as the input of the CELS 200. Furthermore, by emitting light, the CELS 200 returns output to the surroundings.
  • a scene corresponding to a view seen by a person through the window of the vehicle is captured and in doing so, this scene which corresponds to the environment surrounding the vehicle is digitized into a digital image.
  • This digital image is then processed to obtain context-sensitive lighting to be displayed to an interior of the vehicle.
  • the CELS 200 comprises input means 202, processing means 204 and output means 206.
  • the relations between these three components 202, 204, 206 are illustrated by the arrows in Figs. 3(b) and (c).
  • These components 202, 204, 206 of the CELS 200 communicate with each other by communication systems.
  • the input means 202 are configured to provide input data varying according to an environment surrounding the vehicle. As shown in Fig. 3(a), these input means 202 may comprise colour/light sensors.
  • the processing means 204 are configured to process the input data to determine characteristics of the illumination to be provided.
  • the processing means 204 serve to process the digital representation of the scene received from the input means 202 to transform the input attributes (colours and illuminance) of this scene into desired output attributes (colours and illuminance).
  • the processed information is then communicated to the output means 206 mounted in the vehicle's interior.
  • the output means 206 are configured to provide the illumination with the determined characteristics to the interior of the vehicle. In particular, they display lights (context- sensitive lighting) whose colours and illuminance are based on the processed information.
  • the illumination provided by the output means 206 corresponds to the environment surrounding the vehicle in that the characteristics of the illumination match the characteristics of the environment surrounding the vehicle when the illumination is provided.
  • the vehicle is in the form of a car
  • the output means 206 are positioned along the roof liner length of the car's interior and the information provided by the input means 202 is acquired through windows of the car.
  • the car may be an electric taxi suited for operation in tropical megacities, as its design considers specific challenges such as the constant operation of air-conditioning systems due to the climate conditions [Wit13].
  • the vehicle may be in other forms such as an aeroplane and the output means 206 may be positioned along other parts of the vehicle.
  • the input data can be acquired from any other see-through panel of the aeroplane. This panel is defined by a boundary and the output means may be located along the boundary of the panel.
  • a controller 208 may be further included.
  • This controller 208 can be used to manipulate the determination of the characteristics of the illumination to be projected and/or to overlay additional information.
  • this controller 208 can be in the form of a user-interface to allow the user to adjust the output lighting based on his or her personal preferences.
  • the CELS is used to communicate additional information to the passengers inside the vehicle or subliminally influence the actions of passengers.
  • the output means 206 may be mounted near a vehicle door in which case, it can be used together with the controller 208 to show situation-dependent information (e.g. whether it is safe to alight from the door of the vehicle).
  • the input means 202, processing means 204 and output means 206 of the CELS 200 will now be described in more detail.
  • Possible image sources are cameras (real time) or image collections. The latter usually only provide images of the past and are not suitable for use in a system that considers ongoing changes. However, some image collections can be considered as a real-time data source. This is explained in the following.
  • Image Databases as the Input Means 202
  • image collections can be found on the internet but most of these collections are not stored in a structured and systematic way.
  • there are image databases comprising structured image collections, for example the Street View images provided by Google.
  • Street View images are systematically taken and saved, and each image is associated with geographic parameters (e.g. the GPS coordinates and the cardinal direction) of where the image is taken.
  • the images can be utilized in a deliberate manner [Goo12]. For instance, by using the following unique parameters: Longitude: 103.854189; Latitude: 1.287802; Heading: 110°; Pitch: 0°; Field of View: 110°; Size: 640x640, the corresponding distinct image associated with these parameters can be downloaded.
  • The fact that Street View images only cover areas adjacent to streets does not interfere with the purpose of the CELS 200. Additionally, the Street View database covers most of the streets in many countries including Singapore, hence permitting the possible operation of the CELS 200 using Street View images in these countries. Therefore, it is possible to use the Google Street View image collection (or any other structured image collection with online images of possible environments surrounding the car, e.g. Bing Maps) as the real-time input data for the CELS 200.
  • one evaluation criterion is the amount of integration effort required.
  • the bulk of the integration effort required when using cameras lies in the mechanical domain. In particular, the space, position and fixture of the cameras have to be considered.
  • the bulk of the integration effort when using the Google Street View lies in the software domain. Since the Street View images are stored online, access to the internet is required if the images are to be downloaded from the internet.
  • a device configured to download the Street View images serves as the input means 202. This device provides internet access and may be, for example, an automotive human-machine interface that offers internet access via a 3G connection. However, in times of low connectivity or even complete loss of either the mobile or the GPS signal, the Street View images may not be available. This results in asynchronous data, which can degrade the performance of the CELS 200.
  • the input means 202 comprises a device configured to store the pre-loaded images of possible environments that can surround the vehicle.
  • the device may also be used to download the images from the internet beforehand or alternatively, a different device is used for the download and the images are transferred to the input means 202.
  • Such devices may be either developed or bought if available. Since the Street View images are generally not frequently updated, it is possible to process the images from the complete Street View database in advance. In other words, the processing means can process each pre-loaded image to determine the characteristics of the illumination prior to the vehicle going through the environment shown in the pre-loaded image. Therefore, the necessary computation power for the operation of the CELS 200 can be reduced in this case.
  • a Google Street View image is taken at a specific time (e.g. morning) of the day and under a specific weather condition (e.g. sunny).
  • a pre-loaded image does not include information regarding the intensity of the environment lighting at the time the vehicle is at the location shown in the image.
  • an ambient light sensor for detecting the intensity of the environment lighting surrounding the vehicle can be further incorporated in the CELS 200. For example, on cloudy days, the ambient light sensor detects environment lighting with a lower intensity whereas on sunny days, the ambient light sensor detects environment lighting with a higher intensity. The characteristics of the illumination provided to the interior of the vehicle are then determined based on not just the colours in the pre-loaded Google Street View image but also on the intensity of the environment lighting as detected by the ambient light sensor.
  • the integration of one or several cameras may seem very excessive. Suitable spaces and locations have to be found for the integration of the cameras.
  • the cameras have to be positioned such that they have a clear field of view, without adversely affecting the appearance of the vehicle's interior.
  • the input means 202 in the CELS 200 embodiment is realized with cameras and the input data comprises images of environments surrounding the vehicle.
  • the camera may be used to acquire video data and in this case, images from the recorded video (i.e. frames of the recorded video) can be provided as input data.
  • one or more cameras may be used and other sources apart from cameras may be integrated into the CELS 200. Street View images may also be used together with images and/or videos captured by the camera(s) as the input data.
  • Low light intensity: Since an image is obtained by exposing individual photosites to light, a low light intensity can negatively affect the image quality.
  • the amount of light the photosites are exposed to can be increased by using a lens with a high lens speed.
  • the light sensitivity of the camera can be improved by using an image sensor with a high fill-factor and a bigger image sensor format.
  • the amount of light the photosites are exposed to can be dynamically adjusted during operation by lowering the f-Stop and increasing the exposure time.
  • Wide view coverage: To achieve wide view coverage, the angle of view of the lens is preferably large. This allows a large area of the outside environment to be captured, which in turn allows a smaller area of interest to be chosen for further processing.
  • Fast-changing light conditions are e.g. encountered if the vehicle with the CELS drives into a tunnel.
  • the exposure time may be adjusted.
  • the intensity of light the image sensor in the camera is exposed to can be reduced. This can be done by adjusting the aperture size using a controlled DC-iris lens.
  • an integrated DC-motor can be used to continuously adjust the aperture according to motor-control signals sent by a circuit within the camera.
  • Egomotion: The motion of the camera (due to the motion of the vehicle) or the motion of the object the camera is trying to capture can affect a variety of camera characteristics, which are described in the following.
  • Egomotion causes artifacts due to the Rolling-Shutter-Effect and/or interlaced scanning.
  • a global shutter and progressive scanning can be used instead.
  • Motion blur can be reduced by decreasing the exposure time to the lowest value possible. To compensate for this decrease in exposure time, characteristics which improve the overall light sensitivity, like a bigger aperture or sensor size, are preferred.
  • the moving vehicle causes the camera to vibrate and thus it may be preferable to include an image stabilizer.
  • Output Format: Preferably, the output format is used to evaluate whether applying a particular type of compression is reasonable. For this purpose, the data-rate calculation already conducted is repeated with the recently specified values. Presuming a resolution of 1280x720 pixels at a frame rate of 25 FPS and 24 bits/pixel, the resulting data rate adds up to nearly 66 MByte/s. With the exception of USB 2.0, this data rate could be handled by the communication interfaces. However, considering that the CELS 200 preferably illuminates not only one but two sides, a minimum of three cameras is probably preferable. This can have limiting effects on the computer's data bus and CPU load. Therefore, the application of compression such as H.264 or M-JPEG is preferred. An example of the compression technique used in the CELS 200 embodiment and the associated communication interface is shown in Table 1 below.
  • the processing means may alternatively comprise Field programmable gate arrays (FPGA) or application-specific integrated circuits (ASICS).
  • the output means 206 preferably satisfy the following requirements.
  • Colour Depth: Preferably, the light sources are capable of providing colour fidelity so as to effectively achieve a visual perception enlargement effect.
  • the human eye can distinguish approximately 380,000 different colours. Assuming that the final colour of the emitted light is created by mixing the three primary colours red, green and blue (RGB), the necessary number of shades for each primary colour required to obtain 380,000 different colours can be calculated as ∛380,000 ≈ 73. This requirement implies that the ability to change the illuminance (i.e. to be dimmable) in order to create different shades is a factor to consider when deciding which light source to use.
  • the output means 206 preferably have a high luminous efficacy (the closer the luminous efficacy is to the maximum value of 683 lm/W (100%), the better). This translates to a high efficiency.
  • the luminous efficacy is calculated as the ratio of luminous flux emitted to the power consumed by the light source [JRAA00].
  • the output means 206 are mounted along the roof liner length of the car's interior. To minimize the integration effort, it is preferable to use a light source which requires minimal amount of space for its installation. Therefore, it is preferable to reduce the number of additional devices required.
  • a LED-strip (comprising a series of inorganic LEDs along its length) is selected to serve as the output means 206 as it fulfils the desired characteristics mentioned above.
  • Other possible output means that can be used include any other form of lighting components which can be configured to vary in intensity and colour, for example filament bulbs, cold cathode fluorescent lamps and solid state light sources (e.g. inorganic LEDs, organic LEDs (OLED) and electroluminescence).
  • the LEDs are capable of emitting lights of different colours.
  • the LED-strip in CELS 200 utilizes RGB-LEDs which are able to display up to 16.7 million different colours by mixing the three primary colours red, green and blue.
  • the LEDs of the LED-strip used in the CELS 200 are individually addressable to enable a smooth light colour adaptation over the whole roof liner. Hence, both the brightness and the colour of each LED can be independently controlled.
  • Fig. 5 shows an example way of installing the LED-strip in the car.
  • the LED-strip 1000 is installed along the roof liner of the car, except the portion of the roof liner along the front of the car (this is to avoid distracting the driver).
  • Fig. 6 shows the interfaces between the processing means 204 in the form of a computer 204 and the output means 206 in the form of the LED-strip. As shown in Fig. 6, the information flow between the computer and the output means 206 is facilitated by a communication interface device and a plurality of LED drivers.
  • each LED driver is configured to drive a LED along the LED-strip.
  • Each LED-strip comprises a control interface for communicating with the LED drivers.
  • Colour data i.e. characteristics of the illumination to be provided
  • the LED drivers are implemented using integrated circuits (ICs) in the CELS 200.
  • the LED driver is implemented using the Worldsemi WS2801 LED driver addressed using the SPI interface bus and the communication interface device is a SPI interface device implemented with a microcontroller (MCU) such as an ATmega328.
  • the WS2801 LED driver can support data cascading, allowing multiple drivers and LEDs to be connected in series.
  • the SPI interface device serves as the SPI master device.
  • the SPI master device receives signals from the processing means 204, e.g. via USB, Ethernet or similar, converts the signals to SPI signals and sends the signals to the first LED driver of the LED-strip.
  • each LED driver relays the SPI signals it receives in a previous clock pulse (if any) to a subsequent LED driver. This relay of SPI signals continues for as long as the transmission of data from the processing means 204 continues.
  • every addressed driver (i.e. every driver which has received SPI signals) drives its associated LED with the received colour data.
  • the number of illuminated LEDs depends on the number of addressed drivers which in turn depends on the amount of transmitted data.
  • the first SPI signals sent from the SPI master device are always received by the first LED driver which is associated with the first LED along the LED-strip. Furthermore, the CELS 200 is configured such that no LED driver is skipped over when performing the data transmission. However, these are not necessary and in other embodiments, the first SPI signals may be received by a LED driver associated with a LED further down the LED-strip and/or some LED drivers may be skipped over such that their associated LEDs do not light up.
  • a transmission protocol is developed for the CELS 200 to transmit the colour data so that the computer is able to send the calculated representative colour to the communication interface device which in turn pushes it to the LED strip via the LED drivers.
  • the protocol allows detection of possible transmission errors.
  • the protocol is preferably as short as possible.
  • the processing means 204 in the CELS 200 are configured to transmit the colour data to the output means 206 in parts.
  • Fig. 7 illustrates a data package according to the RGB-data transmission protocol adopted for transmitting colour data in the CELS 200.
  • the transmit header serves to differentiate this data package from other data packages based on other protocol types.
  • the addressing section of the data package comprising the Strip Divisor and the LED Number serves to implement the following two functions.
  • the processing means 204 are configured to transmit the colour data to the output means 206 in parts.
  • the colour data for a portion of the LED strip is transmitted separately from that for other portions of the LED strip (in other words, the LED strip is virtually split into several parts).
  • Strip Divisor: Since each LED requires only 3 bytes of RGB data (1 byte for each of the R, G and B values), the length of a data package can be significantly reduced via the virtual splitting of the LED strip. The transmission frequency is then equal to the Strip Divisor value.
  • each LED is separately addressed by means of the LED number in the addressing section.
  • the LEDs along the LED strip are enumerated. This allows the exclusion of specific LEDs from being updated. In particular, only LEDs whose colours are to be changed are updated and hence, less data (and thus less packages) have to be sent. For example, if all the LEDs in a particular section do not need to be updated, the package for this section does not have to be sent.
  • the enumeration of the LEDs along the particular section of the LED strip is stored under the "LED Number" in the data package for this section. The total size of this information, i.e. Size(LED Number), is variable and is calculated as shown in Equation (1).
  • a specific LED can thus be addressed based on the section of the LED strip it belongs to and its position in this section. For instance, with a total of 96 LEDs along the LED strip and a Strip Divisor of 4, there would be 4 sections of 24 LEDs. The 56th LED would belong to the 3rd section which starts with the 49th LED. Thus, the 56th LED can be addressed through the 7th bit of the data package sent for the 3rd section of the LED strip.
  • Fig. 8 shows the encoding of the individual LEDs.
  • each LED corresponds to a single bit of one byte of the LED number.
  • a 0-bit indicates that there is no colour data for the corresponding LED and it is not necessary to update this particular LED.
  • the number at the bottom of Fig. 8 shows the resulting decimal representation of each byte.
  • the data packages sent according to these protocols comprise distinct headers that distinguish the data packages from other data packages sent based on other protocols.
  • the protocols are described below.
  • the single transmission protocol is also used for the transmission of RGB data but only for one LED.
  • the basic structure of this protocol is similar to the above-described transmission protocol as shown in Fig. 9.
  • This single transmission protocol serves to allow specific tasks to be assigned to certain LEDs.
  • a LED that is part of the CELS 200 can simultaneously serve as a reading lamp.
  • the single transmission protocol is used to address only this specific LED to set its R, G, B values to the values for maximum illuminance.
  • the initialization protocol is used to initialize the communication between the communication controller and the computer.
  • Fig. 10 shows the initialization protocol and indicates the parameters that can be adjusted via this protocol.
  • the delay measurement protocol serves to transmit the delay measured by the communication interface device. Since one byte can only represent values up to 255, the protocol uses a four-byte representation of a signed integer value. Therefore, it is possible to transmit delay values of up to 2.1 billion microseconds.
  • Fig. 11 shows the delay measurement protocol.
  • Acknowledgement protocol: An acknowledgement package based on an acknowledgement protocol is sent from the communication interface device to the computer to confirm the receipt of any packages based on the previously described protocols. The size of each data package sent based on this acknowledgement protocol is fixed to three bytes and only the acknowledgement message changes according to the particular case. Table 2 shows the different acknowledgement messages to be included in each acknowledgement data package. These messages indicate the successful or unsuccessful transmission of a data package.
  • for example, an error acknowledgement is sent when the received RGB-data is not a multiple of 3 bytes
  • Fig. 12 shows the tasks performed by the CELS 200 after the configuration and initialization of the CELS 200.
  • input data in the form of an image is first acquired.
  • This image is then converted into a different format for further processing.
  • the representative colours in the image are then determined and the communication interface device then transmits these representative colours to the LEDs along the LED-strip via the LED drivers as described above.
  • Fig. 13 shows the tasks performed by the CELS 200 of Fig. 12 together with two further blocks 1302 and 1304.
  • the first way includes only block 1302 whereby the CELS 200 is initialized before any image is captured and is not re-configured during the subsequent operation of the CELS 200.
  • the second way includes both blocks 1302 and 1304 whereby the CELS 200 is not only initialized but is also controlled during operation.
  • the process structure of the processing device preferably fulfills the following requirements:
  • a method to enable other applications to access the input data is preferably implemented.
  • the CELS 200 processes the latest information.
  • each succeeding task it is preferable for each succeeding task to have access to the latest data provided from the former task. Moreover, this is preferably fulfilled independent from any discrepancy in processing time between the various tasks.
  • the process structure of the processing device is preferably configured as follows.
  • In order to reduce the delay caused by the processing of the tasks, the CELS 200 preferably operates the tasks at high efficiency. Therefore, a multi-thread framework, where distinct tasks can run concurrently in their own threads, is preferably implemented in the CELS 200. Mechanisms which prevent data corruption due to this concurrency are also preferably implemented. The CELS performance is preferably calibrated against the necessary resources, as described next.
  • the CELS 200 can be integrated in an automotive human-machine interface. Therefore, preferably, the computation power of the CELS 200 is not completely utilized in performing the tasks. In order to allocate the resources, one or several parameters are preferably provided to influence the CELS 200 performance. This is so as to reduce the amount of computation power required to perform the tasks.
  • the colour data of the LED-strip is preferably updated at fixed intervals to ensure a smooth and uniform colour transition.
  • the process structure in CELS 200 is preferably configured to implement a mechanism which refreshes the LED-strip data independently from the processing times of different input sources.
  • the process structure implemented on the processing device is divided into four main parts, each representing a self-contained thread. This helps to fulfill the three requirements described above (as elaborated below).
  • Concurrent thread-safe framework The whole CELS framework is embedded in a concurrent framework comprising four threads. Each thread has its distinct task and shares data with the preceding and succeeding thread. In order to prevent data corruption due to simultaneous data access, a thread-safe data structure is implemented.
  • the Image Capture block and the succeeding Image Conversion block and Representative Colour Determination block are divided into two threads. This enables an effective way to calibrate the CELS 200 performance. Since the processing thread is periodically executed with an adjustable rate, the necessary computation power can be calibrated by changing this rate. Hence, although the processing is independent from the image source frame rate, the execution rate of the resource-consuming image conversion can be calibrated. This calibration can be based on the resources which are available to the CELS 200 (taking into account that a lower execution rate results in a less frequent update of the colour data). Fixed LED-strip data update.
  • the other division which separates the Image Conversion block and the Representative Colour Determination block from the LED Colour Adjustment block and the MCU Communication block, helps in satisfying the requirement of a fixed rate in the LED-strip data update.
  • the LED Colour Adjustment block is also periodically executed. Thus, it is decoupled from the former ImageProcessor class.
  • the execution of the LED Colour Adjustment class is independent from both the number of input sources and the individual image processing times of these input sources. Therefore, the LED Colour Adjustment class processes the latest colour data available as provided by the preceding classes. Thus, the LED-strip is updated with a constant refresh rate.
  • Fig. 15 shows the schematic of the process structure implemented on the communication interface device in the CELS 200.
  • the communication interface device runs the initialization routine.
  • In this initialization routine, several variables are declared, the RS-232 communication is set to 115200/8-N-1, and a watchdog timer is set to two seconds.
  • the communication interface device remains in an infinite while loop. This while loop contains two if-clauses. The first if-clause is satisfied once serial data has been received and the second if-clause is cyclically executed. Both are explained in the following subsections.
  • the program looks through the data the communication interface device receives to find a start byte and a header byte. Once these are found, the processing begins.
  • the RGB-Data Transmission protocol is used for transmitting each data package.
  • the information regarding the strip partition (i.e. segment of the LED strip) and the addressed LED are extracted from the data package.
  • the colour data can be directly written from the serial buffer into the distinct LED data structure.
  • the LED data structure also provides an acknowledgement package upon successful receipt of the whole data package.
  • the second if-clause is cyclically executed and the following is implemented in each cycle.
  • Equation (2) is used to mix the colours.
  • a mix-factor is pre-determined whereby this mix-factor determines the composition of the colours eventually displayed by the LEDs.
  • the "Final Colour Value" is an R, G or B value which is respectively determined by using Equation (2) with the latest R, G or B data written to the communication interface device ("Latest Data") and the previous R, G or B data written to the communication interface device ("Previous Data").
  • the colour displayed by the LEDs is then determined from the Final Colour Values (i.e. R, G and B values) obtained using Equation (2).
  • the mix-factor is set as 0.5.
  • the data to be used may include any number of sets from the most recent predetermined number of (e.g. ten) sets of data written to the communication interface device. If more sets of data are used, the influence (weight) of each set is preferably adjusted such that it reflects when the set of data was written to the communication interface device. Preferably, the later the set of data written, the higher the weight given to the data.
  • the weight for each set of data with a particular age can then be calculated as 1/(age+2).
  • the above is cyclically executed, that is, executed every predetermined period of time (for example, 80ms).
  • a constant LED colour refresh rate can be achieved.
  • the cycle time i.e. the predetermined period of time between changes in the LED colours
  • the cycle time can be adjusted using the initialization protocol as shown in Fig. 10.
  • the cycle time controls the degree of smoothness of the colour transformation.
  • the output means 206 comprise a LED-strip aligned along the roof liner.
  • An image is repeatedly acquired by a camera through each window and is associated with a group of LEDs along the portion of the LED strip over the window.
  • the image shows a view through the window at the instance it is captured.
  • the image is two-dimensional with a width (which corresponds to the dimension of the view parallel to the LED strip over the window) and a height. The top of the image is nearer to the LED strip whereas the bottom of the image is further away from the LED strip.
  • the computer is configured to determine representative colours from each input image acquired. This is elaborated below.
  • Each input image comprises a plurality of pixels, each of which having a pixel value representing its colour.
  • each pixel value is in the RGB space i.e. it comprises R, G, B sub-values indicating brightness of red, green and blue in the pixel.
  • an area of interest is first selected from the input image.
  • the colours of the illumination are then determined from only this area of interest.
  • the input image comprises a plurality of lines and the area of interest is selected by first extracting a certain percentage (e.g. 10%) of the lines at the top of the image to form an initial area of interest (e.g. the initial area of interest may contain lines 0 - 108 of the input image).
  • lines are then cyclically skipped over (for example, only every x-th line in the initial area of interest is selected, where x may be equal to 2) to form a final area of interest.
  • an area of interest may not be selected and the whole image may be processed or a different method may be used to select the area of interest.
  • Selecting an area of interest from the image for further processing is advantageous as the JPEG conversion (i.e. conversion of the JPEG image from the RGB space to the HSB space) to be performed later on requires computation time. Selecting an area of interest helps to reduce the amount of image data that needs to be processed and, due to this reduction, the computation time can be reduced. In effect, the computation time decreases by nearly the same factor by which the considered image data is reduced. This has been experimentally determined.
  • the area of interest is then weighted using a weighting function.
  • the area of interest comprises a plurality of pixels representing respective points in a view captured through a window adjacent the LED-strip and each pixel is given a weight that depends on the proximity of the point it represents to the LED-strip. The nearer the point is to the LED-strip, the greater the weight given to the pixel.
  • colour data from the bottom of the image representing the lower view from the window is taken less into account than that from the top of the image. More specifically, the area of interest is divided into three horizontal sections.
  • this weighting-method does not discard data but instead emphasizes each section of the area of interest with a user-specified weighting.
  • the area of interest is then divided into multiple segments. Each segment is processed independently to determine the colours of the illumination.
  • the number of segments is equal to the number of LEDs corresponding to the image (these LEDs' outputs are to be determined based on the image), so that the illumination characteristics determined for each segment are projected by an associated LED along the LED-strip.
  • Each segment has a height equal to the image height (which depends on the number of lines in the area of interest) and a width equal to the image width divided by the number of corresponding LEDs.
  • the area of interest is divided equally to obtain a number of image tiles with equal widths and heights. But in other embodiments, the area of interest can be divided in an uneven manner to obtain image tiles of different shapes and/or sizes.
  • Each segment in the red, green, blue (RGB) colour space is then converted into the hue, saturation, brightness (HSB) colour space so as to enhance the image data in the segment.
  • each pixel value in the segment is converted to H, S, B sub- values indicating hue, saturation and brightness of the pixel.
  • the colour and the brightness information in the HSB space are separated. Therefore, it is possible to increase the brightness of a particular colour without changing the actual colour.
  • Fig. 16 shows representative colours determined without first converting the area of interest into the HSB space (see colours above the line) and with the conversion into the HSB space (see colours below the line).
  • G-5) Determine Representative Colour for Each Segment
  • the representative colour of each image segment in the HSB space is then determined by determining the median colour of the image segment taking into account the weights of the pixels in the image segment (i.e. by determining the weighted median colour of the image segment).
  • a weighted median H value, a weighted median S value and a weighted median B value are obtained by taking the weighted median of the H, S and B values of the pixels in the image segment.
  • the weighted mean or weighted mode of the pixel values in each image segment may be calculated instead.
  • the weighted median H, S, B values are then converted back into representative R, G, B values.
  • the image data of each segment is condensed into one representative colour which is a mixture of the three R, G, B colours.
  • the computer is configured to take into consideration the delay between the providing of the input data and the providing of the illumination to the interior of the vehicle when determining the characteristics of the illumination.
  • the delay between an image capture and an output from the LED-strip comprises delay from the following processes:
  • Processing Delay: The time the computer needs to receive and process the input data in order to send the colour data to the communication interface device.
  • Communication Processing Delay: The time the communication interface device needs to process the specified data protocol.
  • Communication Delay: The time necessary to communicate with the LED-strip driver and to set the colour.
  • the above delay leads to an undesirable difference between the current view from the window and the colours which the LEDs show. This is illustrated in Fig. 17 (top).
  • the colours can in fact change rapidly. This can cause flicker or harsh colour changes.
  • This asynchronous relation between the displayed colours and the environment deteriorates the visual perception enlargement effect. Therefore, the delay has to be taken into account when determining the representative colour. Taking the delay into account helps to achieve a more synchronous relation between the displayed colours and the environment as shown in Fig. 17 (bottom).

Landscapes

  • Engineering & Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Arrangements Of Lighting Devices For Vehicle Interiors, Mounting And Supporting Thereof, Circuits Therefore (AREA)
PCT/SG2014/000391 2013-08-20 2014-08-20 System and method for providing illumination to an interior of a vehicle WO2015026296A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
DE112014003826.1T DE112014003826T5 (de) 2013-08-20 2014-08-20 System und Vefahren zum Bereitstellen einer Beleuchtung für einen Innenraum eines Fahrzeugs

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201361867771P 2013-08-20 2013-08-20
US61/867,771 2013-08-20

Publications (1)

Publication Number Publication Date
WO2015026296A1 true WO2015026296A1 (en) 2015-02-26

Family

ID=52483966

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/SG2014/000391 WO2015026296A1 (en) 2013-08-20 2014-08-20 System and method for providing illumination to an interior of a vehicle

Country Status (2)

Country Link
DE (1) DE112014003826T5 (de)
WO (1) WO2015026296A1 (de)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102018207848A1 (de) * 2018-05-18 2019-11-21 Bayerische Motoren Werke Aktiengesellschaft System und Verfahren zur Steuerung von Anzeige- und Ausgabeeinheiten im Fahrzeug in Abhängigkeit der Fahrzeugumgebung
DE102018211929A1 (de) * 2018-07-18 2020-01-23 Bayerische Motoren Werke Aktiengesellschaft Verfahren zum Betrieb einer Beleuchtungsvorrichtung in einem Kraftfahrzeug

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5143437A (en) * 1989-06-26 1992-09-01 Nissan Motor Company, Limited Vehicle room illuminating apparatus
US6935763B2 (en) * 2000-02-17 2005-08-30 Volkswagen Ag Interior lighting system of a motor vehicle and a method for controlling the same
US7221264B2 (en) * 2005-04-20 2007-05-22 Honda Motor Co., Ltd. Method for adjusting interior illumination
US20070183163A1 (en) * 2006-02-07 2007-08-09 Joseph Daniel Ambient light based illumination control
US20110084852A1 (en) * 2009-10-13 2011-04-14 Gm Global Technology Operations, Inc. Method and apparatus for communicatively changing interior illumination color in a vehicle
US20110227716A1 (en) * 2010-03-17 2011-09-22 Ford Global Technologies, Llc Ambient lighting to reflect changes in vehicle operating parameters

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10569704B2 (en) 2017-11-07 2020-02-25 Cnh Industrial America Llc Ambient lighting system for an agricultural machine
CN110901524A (zh) * 2019-11-19 2020-03-24 上海博泰悦臻电子设备制造有限公司 车辆、车机设备及其车载氛围灯风格自动调节方法
WO2023094482A1 (en) 2021-11-23 2023-06-01 Atlas Technologies Holding B.V. Interior lighting system for vehicle
NL2029862B1 (en) * 2021-11-23 2023-06-15 Atlas Technologies Holding Bv Vehicle with smart interior lighting system.

Also Published As

Publication number Publication date
DE112014003826T5 (de) 2016-06-02

Similar Documents

Publication Publication Date Title
WO2015026296A1 (en) System and method for providing illumination to an interior of a vehicle
JP4950988B2 (ja) データ送信装置、データ送信方法、視聴環境制御装置、視聴環境制御システム、及び視聴環境制御方法
US11498500B1 (en) Determining comfort settings in vehicles using computer vision
US11528424B2 (en) Imaging device, imaging system, vehicle running control system, and image processing device
US20150239395A1 (en) Illumination System for the Interior of a Motor Vehicle
US10595379B2 (en) Illumination control
JPWO2007122987A1 (ja) データ送信装置、データ送信方法、視聴環境制御装置、視聴環境制御システム、及び視聴環境制御方法
CN107431758B (zh) 成像装置、成像方法、信号处理装置、信号处理方法和记录介质
KR102672973B1 (ko) 차량 및 그 제어방법
JP6533050B2 (ja) 車載カメラシステム
CN110015236A (zh) 一种车辆显示装置、方法及车辆
WO2021070214A1 (ja) 送信方法、送信システム及びシステム制御装置
US20200070981A1 (en) Passenger cabin, lighting arrangement and operating method
JP2009081822A (ja) データ送信装置、データ送信方法、視聴環境制御装置、視聴環境制御システム、及び視聴環境制御方法
US11971559B2 (en) Stereoscopic display using microLED technology
CN116137953A (zh) 图像的基于多帧深度的多相机光照调节
US20040161159A1 (en) Device and method for enhancing vision in motor vehicles
CN116234126A (zh) 车室氛围照明方法和系统
CN105141855B (zh) 一种图像处理方法及电子设备
KR102632092B1 (ko) 차량 및 그 제어방법
JP2009137505A (ja) 車両用照明装置および車両用表示システム
CN112584109A (zh) 车用摄像装置及车用图像处理方法
JP2019047224A (ja) 発光装置、および画像信号供給装置
JP3245467U (ja) 画枠の陰影動的調整装置
US10583778B2 (en) System for generating a moving image in an aircraft

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14838465

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 112014003826

Country of ref document: DE

Ref document number: 1120140038261

Country of ref document: DE

122 Ep: pct application non-entry in european phase

Ref document number: 14838465

Country of ref document: EP

Kind code of ref document: A1