EP3494518A1 - Verification of messages projected by an intelligent audible device - Google Patents

Verification of messages projected by an intelligent audible device

Info

Publication number
EP3494518A1
Authority
EP
European Patent Office
Prior art keywords
message
intended
audible
display
intelligent
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP17837713.1A
Other languages
English (en)
French (fr)
Other versions
EP3494518A4 (de)
Inventor
John Rilum
Paul Atkinson
Edzer Huitema
Current Assignee
Chromera Inc
Original Assignee
Chromera Inc
Priority date
Filing date
Publication date
Application filed by Chromera Inc filed Critical Chromera Inc
Publication of EP3494518A1
Publication of EP3494518A4


Classifications

    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B29/00 Checking or monitoring of signalling or alarm systems; Prevention or correction of operating errors, e.g. preventing unauthorised operation
    • G08B29/12 Checking intermittently signalling or alarm systems
    • G08B29/126 Checking intermittently signalling or alarm systems of annunciator circuits
    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B7/00 Signalling systems according to more than one of groups G08B3/00 - G08B6/00; Personal calling systems according to more than one of groups G08B3/00 - G08B6/00
    • G08B7/06 Signalling systems according to more than one of groups G08B3/00 - G08B6/00; Personal calling systems according to more than one of groups G08B3/00 - G08B6/00 using electric transmission, e.g. involving audible and visible signalling through the use of sound and light sources

Definitions

  • the field of the present invention is the design, manufacture, and use of electronic audible systems with audible output and input devices.
  • the audible system will include an electronic display, such as LCD, OLED, or electrophoretic displays.
  • An intelligent label is associated with a good, and includes one or more electro-optic devices that are used to report the condition of that good at selected points in the movement or usage of that good. These electro-optic devices provide immediate visual information regarding the good without need to interrogate or communicate with the electronics or processor on the intelligent label. In this way, anyone in the shipping or use chain for the good, including the end user consumer, can quickly understand whether the product is meeting shipping and quality standards.
  • the intelligent label may take many forms, such as a tag attached to the good, integrated into the packaging for the good, integrated into the good itself, or may even be an information area on a prepaid card for example.
  • the intelligent label may also include, for example, print information regarding the good, usage or shipping rules, or address and coded information.
  • the intelligent label includes a computer processor for managing the overall electronic and communication processes on the intelligent label.
  • the processor controls any RFID communication, as well as storage of information data.
  • the processor also has a clock, which may be used to accurately identify when the good changed hands in the shipping chain, or when the good failed to meet a quality standard.
  • the intelligent label may also have one or more sensors that can detect a chemical or gaseous composition, an optical or electrical condition, or an environmental condition such as temperature, humidity, altitude, or vibration. If the processor determines that the sensor has a condition that exceeds the safe handling characteristics, then the processor may store information regarding the out-of-specification handling, and may take additional actions as necessary.
  • the processor may cause an electro-optic device such as an electrochromic indicator or display to show a "caution" as to using the product.
  • the processor may determine that the sensor has greatly exceeded the outer specification criteria, and cause an electro-optic indicator to show that the product is spoiled or otherwise unusable.
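  • The threshold behavior described above can be sketched as simple conditional logic. This is an illustrative sketch only; the state names, threshold values, and function names are assumptions and do not appear in the patent.

```python
# Illustrative sketch of the intelligent label's sensor threshold logic.
# State names, thresholds, and function names are hypothetical.

OK, CAUTION, SPOILED = "ok", "caution", "spoiled"

def classify_reading(temp_c, caution_limit=8.0, spoil_limit=15.0):
    """Map a temperature reading to an indicator state."""
    if temp_c >= spoil_limit:
        return SPOILED   # outer criteria greatly exceeded: unusable
    if temp_c >= caution_limit:
        return CAUTION   # safe handling exceeded: show 'caution'
    return OK

def handle_reading(temp_c, event_log, set_indicator):
    """Store out-of-specification events and update the indicator."""
    state = classify_reading(temp_c)
    if state != OK:
        event_log.append((temp_c, state))
        set_indicator(state)
    return state
```

In this sketch, in-specification readings are neither logged nor displayed, mirroring the patent's point that only out-of-specification handling triggers storage and indicator actions.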
  • the term 'display' as used herein is to be understood to encompass indicators and other electro-optic devices capable of displaying visually perceptible states, data, information, patterns, images, shapes, symbols, etc., which are collectively referred to herein as 'messages'.
  • the intelligent label provides a robust, trustworthy, easily usable system for tracking goods from a point of origin to delivery to the consumer.
  • the intelligent label provides important visual alerts, updates and information throughout the shipping process without the need for expensive communication, RFID, or interrogation equipment.
  • the intelligent label facilitates simple and reliable communication of shipping information from a consumer back to a manufacturer or seller, for example, for confirming warranty or replacement information. In this way, a shipping and delivery system having a high degree of trust, and resistance to fraud, is enabled.
  • a particularly difficult problem occurs when an intended message has been sent to the display for the intelligent label, and then something occurs, either external or internal to the good or label, that makes the message imperceptible to the reader, which can be a human or a machine.
  • the intelligent label, and any network to which it communicates has a record that a particular message was displayed to a reader at a particular time.
  • the intended message could not be communicated to the reader. Accordingly, there is a need to detect what was actually displayed to a reader, and to do so in a reliable, compact, and cost efficient manner. It will be appreciated that the need for such message detection would be useful in many display applications other than the use of intelligent labels.
  • the intended message may be an audible message, such as an alarm or human recognizable message.
  • an intelligent label may sound an alarm if a temperature threshold is exceeded.
  • a verifiable display is provided that enables the visual content of the display to be detected and confirmed in a variety of ambient lighting conditions, environments, and operational states.
  • the verifiable display has a display layer that is capable of visually setting an intended message for human or machine reading, with the intended message being set using pixels.
  • the message that is actually displayed and perceivable may vary from the intended message.
  • a light detection layer in the verifiable display detects the illumination state of the pixels, and in that way is able to detect what message is actually being presented by the display layer.
  • An intelligent audible device is provided that is constructed to monitor for an event, such as actual or elapsed time, or a sensor exceeding a threshold. Responsive to the event, a sound input transducer is activated, and an output sound signal representing an intended message is projected into the local environment by a sound output transducer. The sound input transducer captures the actual sound projected into the local environment.
  • the captured actual sound is processed and compared to the output sound signal. In this way it may be confidently determined whether the intended message was actually properly projected into the local environment.
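  • One plausible way to compare the captured sound against the intended output signal is normalized cross-correlation. The sketch below uses NumPy and an invented acceptance threshold; it is an assumption for illustration, not the patent's specified comparison method.

```python
import numpy as np

def verify_projected_sound(intended, captured, threshold=0.8):
    """Return True if the captured audio plausibly contains the intended
    signal. Both signals are normalized to zero mean and unit variance,
    then the peak of their cross-correlation is compared against an
    assumed acceptance threshold."""
    intended = (intended - intended.mean()) / (intended.std() + 1e-12)
    captured = (captured - captured.mean()) / (captured.std() + 1e-12)
    # Slide the intended signal over the captured recording.
    corr = np.correlate(captured, intended, mode="valid") / len(intended)
    return float(corr.max()) >= threshold
```

A correlation peak near 1 indicates the intended signal is present in the captured audio; a low peak suggests the projected message was blocked, muffled, or otherwise not properly emitted.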
  • the verifiable display allows the automated and electronic detection of messages that were actually displayed, and with supporting circuitry and logic, may determine a level of perceptibility. With this information, decisions may be made regarding setting alarms, communicating warnings, or refreshing the intended message, for example. Further, an accurate electronic history of the actual messages may be saved for use in determining whether appropriate actions were taken responsive to the messages actually presented on the verifiable display.
  • FIG. 1 is an illustration of a display in accord with the present invention.
  • FIG. 2 is an illustration of a display in accord with the present invention.
  • FIGs. 3A and 3B are illustrations of a display in accord with the present invention.
  • FIG. 4 is an illustration of a display in accord with the present invention.
  • FIG. 5 is an illustration of a display in accord with the present invention.
  • FIG. 6 is an illustration of a display in accord with the present invention.
  • Fig. 7A is a diagram of an emissive display with the photosensitive detector in front of the display in accord with the present invention.
  • Fig. 7B is a diagram of an emissive display with the photosensitive detector behind the display in accord with the present invention.
  • Fig. 8 is a diagram of an emissive display using a backlight and a shutter (such as an LC layer), with the detector placed on top of the display, in accord with the present invention.
  • Fig. 9 is a block diagram of an intelligent label in accord with the present invention.
  • Fig. 10 illustrates a light-sensing in-cell touch design that integrates optical sensors into the thin film transistor layer in accord with the present invention.
  • Fig. 11 is a cross-section of a readout and photo a-Si TFT with an opening in the black matrix in accord with the present invention.
  • Fig. 12 is a circuit diagram of four LCD pixels and one sensor circuit in accord with the present invention.
  • Fig. 13A is a schematic diagram of an AMOLED pixel circuit in accord with the present invention.
  • Fig. 13B is a timing diagram for an AMOLED pixel circuit in accord with the present invention.
  • Fig. 15 is an a-Si:H optical feedback pixel circuit in accord with the present invention.
  • Fig. 16 is a reflective display using a light source (e.g. a backlight) and an integrated optical sensor in accord with the present invention.
  • Fig. 17 is a reflective display in accord with the present invention.
  • Fig. 18 is an emissive display in accord with the present invention.
  • Fig. 19 is an emissive display in accord with the present invention.
  • Fig. 20 is a shutter display with an integrated optical sensor in accord with the present invention.
  • Fig. 21 is a reflective display in accord with the present invention.
  • Fig. 22 is a reflective display in accord with the present invention.
  • Fig. 23 is a reflective display in accord with the present invention.
  • Fig. 24 is a reflective display with shutter in accord with the present invention.
  • Fig. 25 is a reflective display with shutter in accord with the present invention.
  • Fig. 26 is a reflective display with shutter in accord with the present invention.
  • Fig. 27 is a reflective display with shutter in accord with the present invention.
  • Fig. 28 is a reflective display with shutter in accord with the present invention.
  • Fig. 29 is a reflective display with shutter in accord with the present invention.
  • Fig. 30 is a verifiable display in accord with the present invention.
  • Fig. 31 is an alphanumeric display in accord with the present invention.
  • Fig. 32 is a verifiable display in accord with the present invention.
  • Fig. 33 is a back lit display with a shutter in accord with the present invention.
  • Fig. 34 is a verifiable display in accord with the present invention.
  • Fig. 35 is a verifiable display in accord with the present invention.
  • Fig. 36 is a verifiable display in accord with the present invention.
  • Fig. 37 illustrates measurements of a verifiable display in accord with the present invention.
  • Fig. 38 illustrates measurements of a verifiable display in accord with the present invention.
  • Fig. 39 illustrates measurements of a verifiable display in accord with the present invention.
  • Fig. 40 illustrates measurements of a verifiable display in accord with the present invention.
  • Fig. 41 is a switching curve of a pixel that is switched from white to black and back to white again in accord with the present invention.
  • Fig. 42 is a switching curve of a pixel that is switched from white to black and back to white again in accord with the present invention.
  • Fig. 43 is a block diagram of an intelligent audible device in accord with the present invention.
  • Fig. 44 is a block diagram of an intelligent audible device in accord with the present invention.
  • Fig. 45 is a block diagram of an intelligent audible device in accord with the present invention.
  • Fig. 46 is a block diagram of determinator circuitry for an intelligent audible device in accord with the present invention.
  • Fig. 47 is a block diagram of determinator circuitry for an intelligent audible device in accord with the present invention.
  • Fig. 48 is a flow chart of a method of operating an intelligent audible device in accord with the present invention.
  • Fig. 49 is a flow chart of a method of operating an intelligent audible device in accord with the present invention.
  • Fig. 50 is a flow chart of a method of operating an intelligent audible device in accord with the present invention.
  • bi-stable displays, such as electrophoretic displays manufactured by E Ink and certain LCDs (e.g., zenithal bistable and cholesteric), are to varying degrees stable without the continuous application of power. By design, however, they are reversible, and the displayed messages are therefore subject to accidental or intentional erasure or alteration. It therefore cannot be known with certainty whether the displayed information is as intended or otherwise determined (unlike irreversible displays such as those described in patent US 9,030,724 B2).
  • reflective displays that are illuminated with ambient light and read from the same side in reflection.
  • example displays described herein can be extended to other types of displays including, but not limited to, transmissive, transreflective or emissive (e.g. back or front lit) configurations.
  • the inventions described herein cover determination and verification systems for reflective electrophoretic and reflective bistable liquid crystal displays, however, they are also applicable to other types of bi-stable or multi-stable displays and to electro-optic displays in general.
  • pixels are single addressable visual elements of the display.
  • a pixel may be a 'dot' in some displays, and in others it may be a shape such as a 'segment' used in the formation of a 'seven segment' alphanumeric display.
  • Pixels may also be a variety of shapes, symbols or images that are determined by the surface areas of the electrodes used to signal them. A shape of course may be comprised of multiple pixels.
  • the density, variety and resolution of the displayed messages is not typical of that required for consumer electronics.
  • the messages may be generated using comparatively large pixels in shapes optimized for messages appropriate for the application instead of arrays of much larger numbers of significantly smaller pixels.
  • a message consists of the 'state' of one or more pixels.
  • a pixel typically has at least two intended states, one each of two distinct colors (e.g. black and white) and, depending on the display, a third state which is not one of the distinct colors (e.g., gray or semi-transparent).
  • the intended state of a pixel may be different from its actual displayed state however due to damage, hardware or software malfunction, loss of power, age, radiation, tampering, being subjected to environmental conditions outside of allowed operating or storage conditions, etc.
  • an intended message also may be different from the corresponding displayed message.
  • the visible state of pixels that make up a message depends on available light (intensity, wavelengths etc.).
  • the perceptibility of a visible message further may depend on other variables that affect its understandability or interpretability.
  • the perceptibility of a message for example, may depend on the contrast between the pixels comprising a message and their areas surrounding them. The clarity and sharpness of the pixels, individually and in combination, may also impact the perceptibility of a message.
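  • As one hedged illustration of the contrast dependence above, perceptibility could be scored with the Michelson contrast between message pixels and their surrounding areas. The 0.3 minimum below is an invented threshold, not a value taken from the patent.

```python
def michelson_contrast(pixel_luminance, surround_luminance):
    """Michelson contrast in [0, 1]; higher means more perceptible."""
    denom = pixel_luminance + surround_luminance
    if denom == 0:
        return 0.0
    return abs(pixel_luminance - surround_luminance) / denom

def message_perceptible(pixel_lums, surround_lum, min_contrast=0.3):
    """Treat a message as perceptible only if every one of its pixels
    meets an assumed minimum contrast against the surround."""
    return all(michelson_contrast(p, surround_lum) >= min_contrast
               for p in pixel_lums)
```

Requiring every pixel to clear the threshold reflects the point that the pixels, individually and in combination, affect whether the message as a whole can be interpreted.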
  • a message may have an intended display state, a visible state, and a perceptible state.
  • the displayed state is the state of the message pixels independent of the available light.
  • the displayed state of a message corresponds to what could have been visible to man or machine (observable, seen) if light was available.
  • the visible state is the state of the message pixels visible (by man or machine) with available light.
  • the visible state of a message corresponds to what could be observed (seen) with available light.
  • the perceptible state is the state of a set of message pixels that is understandable or interpretable (by man or machine) with available light.
  • the perceptible state of a message corresponds to what could be understood or interpreted with the available light.
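  • The three message states distinguished above might be modeled as follows; the class and field names, and the simple lighting/legibility inputs, are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class MessageState:
    """Hypothetical model of the three message states described above."""
    displayed: str     # pixel states independent of available light
    visible: str       # what could be observed with available light
    perceptible: bool  # whether the visible state is interpretable

def assess(displayed, ambient_light_ok, legible):
    """Derive the visible and perceptible states from the displayed
    state and (assumed) lighting and legibility inputs."""
    visible = displayed if ambient_light_ok else ""
    return MessageState(displayed, visible, bool(visible) and legible)
```

The model captures the ordering implied above: a message can be displayed but not visible (no light), and visible but not perceptible (insufficiently clear to interpret).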
  • Described herein are devices, methods and systems for verifying and determining displayed messages and their corresponding states, either by human or with automation; and further, for enabling transactions, analytics, monitoring conditions and outcomes, and managing outcomes based on access to, receipt of, or use of information that is verifiable, verified or enhanced by being a product of, a component of, or an outcome of such devices, methods or systems.
  • the terms 'verify' and 'determine' may sometimes be used interchangeably herein.
  • verify typically implies a comparison between a displayed message and a known dataset - e.g. an intended message.
  • the term determine typically implies determining the displayed messages or patterns independent of an intended message. Reference data however may be used to make sense of the patterns.
  • verify typically implies being able to confirm 'what' the user saw (or thought they saw) and was the basis of their decision or action.
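  • The verify/determine distinction can be made concrete in a short sketch; the function names and the tuple encoding of pixel patterns are assumptions for illustration.

```python
def determine_message(detected_pixel_states, reference_patterns):
    """'Determine': recover the displayed message from the detected
    pixel states alone, using reference data to make sense of the
    pattern. Returns None if the pattern is not recognized."""
    return reference_patterns.get(tuple(detected_pixel_states))

def verify_message(detected_pixel_states, intended_pixel_states):
    """'Verify': compare the detected pixel states against a known
    dataset, i.e. the intended message."""
    return list(detected_pixel_states) == list(intended_pixel_states)
```

Verification needs the intended message as input; determination does not, but relies on reference data to interpret whatever pattern was actually displayed.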
  • a display device as defined hereinafter, comprises a display layer and a light detection layer. Devices may also have a light source layer. These functional 'layers' may be configured in different ways and in different combinations depending in part on their respective reflective, transreflective or transmissive properties. They may also share common elements (e.g. common electrodes). The term 'layer' should be construed broadly to encompass configurations other than those where the functions ascribed to the terms above are literally layered. Of particular interest are configurations where the display layer, light detection layer and light source layer, as well as, the assembled device, are flexible. Devices however, and their components, may also be semi-rigid and rigid. Devices may also include electronics, methods and systems described herein.
  • the display layer displays the message and may be any of different types including, but not limited to, electrophoretic, liquid crystal, plasma, OLED, and electrochromic.
  • Display layers may be further distinguished in accordance with their ability to reflect/absorb or pass/block light.
  • electrophoretic displays comprising transparent electrodes where the charged particles may be positioned so that in one state they block light from passing, and in a second state they are moved out of the light path, and allow light to pass.
  • a light detection layer is typically sized appropriately to detect/measure light associated with the state of the display pixels and optionally, other areas such as that for detecting/measuring ambient light.
  • a light detection layer can be made of photovoltaic materials, light harvesting proteins, or other photoactive compounds.
  • Preferred photovoltaic materials include organic photovoltaic materials (OPV) for ease of roll-to-roll manufacturing and optical properties (e.g. high
  • An exemplary embodiment of a light detection layer consists of a transparent electrode layer of ITO, an organic photovoltaic material based on, for example, poly(3-hexylthiophene) (P3HT), and an electrode layer (transparent or non-transparent) such as ITO, PEDOT:PSS, graphene, a metal conductor (e.g. Al), or a combination thereof.
  • organic photovoltaic devices that are near-transparent or semitransparent (see e.g. US Pub. No. US20140084266, "Semi-transparent, transparent, stacked and top-illuminated organic photovoltaic devices," and US20120186623
  • Bioelectronics 19 (2004) 869-874, and included references is a preferred light harvesting protein for the photoactive layer.
  • a light detection layer e.g. photovoltaic photoactive sensor
  • the state change of bistable liquid crystal display layers corresponds to a change in the polarization of the light transmitting through the reflective display.
  • This polarization change is in many configurations converted into a display reflectivity change by means of a linear polarization filter at the front (viewable) side of the display layer.
  • the maximum brightness of such a display, assuming an otherwise ideal display and polarizer, would be only ½ of that of a non-polarizing display.
  • a polarizing display layer 15 would also generate a smaller detected contrast ratio between bright and dark pixels in the light sensing layer 11.
  • a display device may include a light source layer to improve the effectiveness and/or efficiency of light detection or measurement.
  • the light source layer may be a thin film such as an OLED or transparent OLED (T-OLED) that generates light in the viewable area of the device.
  • the source of light in a light source layer may be outside the viewable area although the light is emitted in the viewable area.
  • An exemplary embodiment of such a light source layer is an LED and a lightguide. Other techniques and processes are also known to one skilled in the art.
  • the light source layer is preferably optimized to emit light in wavelengths to which the light detection layer is most sensitive.
  • an LED that outputs light in a wavelength range of approximately 450-600nm for a photovoltaic light detection layer consisting of P3HT.
  • the light source layer and light detection layer may be optimized for, or intentionally limited to, wavelengths outside the visible light spectrum (e.g. to be machine but not human readable).
  • the display layer also may be optimized to absorb/reflect/transmit particular wavelengths of light in conjunction with the light source layer and/or light detection layer to enhance performance (detection, measurement, visibility, power etc.).
  • the ink particles in an electrophoretic display (or the fluid in which they are suspended) for example, may be colored or otherwise optimized for that purpose.
  • An example of an electrophoretic display with ink particles possessing photoluminescence is shown in Figure 4.
  • Display layers, light detection layers and light source layers require electrodes typically configured on the top and the bottom of each layer. Each electrode layer may be configured with multiple electrodes. Depending on the display layer, light detection layer, or light source layer one or both of the electrode layers may be patterned. The pattern determines the shape and addressability of the display pixels, detection pixels and less often, light source pixels (typically the light source consists of two non-patterned electrodes effectively creating a single light pixel or layer).
  • one or both of the electrode layers may be a transparent conductor such as ITO and other transparent conductive oxide, PEDOT:PSS and other conductive polymers, nanoparticle inks etc.).
  • the electrodes in the light detection layer are configured so that they are in electrical contact with the photovoltaic material.
  • electrodes in light source layers consisting of a photoactive layer in the viewing area (e.g. OLED or T- OLED) are typically in electrical contact with the photoactive layer.
  • the electrodes in certain display layers may be positioned on the outward facing surfaces of the display (e.g. on the outward facing surface of a barrier film).
  • an electrode layer can be used in more than one of the display, light detection and light source layers.
  • a single non-patterned electrode layer may be used when setting the display message, and separately used when activating a T-OLED light source layer.
  • a single patterned electrode layer is used when setting the states of the display pixels and separately when sensing/measuring light via the detection pixels.
  • the patterned electrode layer determines the shape, position and addressability of both the display pixels and the detection pixels. Importantly, it ensures they are near-perfectly aligned so that the reflected light from, or transmitted light through, one display pixel corresponds to that detected/measured by the appropriate (paired) light detection pixel.
  • Electrode layers can be configured in a variety of ways and placed in contact with other layers of a device. This allows for simpler devices and considerable flexibility in manufacturing, particularly where different processes are involved (e.g. chemical etching, vapor deposition, printing etc.).
  • a transparent electrode layer is applied to the surface of a lightguide that is then placed in contact with the surface of a display layer (e.g. a barrier film or adhesive layer without an electrode layer of its own).
  • the common electrode layer could be patterned or non-patterned.
  • a photovoltaic material is deposited directly on a transparent electrode layer previously deposited on a lightguide.
  • a separate display layer with an outward facing patterned electrode layer could then be combined to create a device consisting of a display layer, a light detection layer, and a light source layer - and using only three electrode layers.
  • the photovoltaic material is deposited directly on the outward facing transparent electrode layer on the barrier film of display layer to which a light guide with a transparent electrode layer is placed in contact.
  • light detection and light source layers may be separately manufactured and then combined.
  • a shared common patterned electrode manufactured as part of either the display layer or the light detection layer for example would avoid alignment problems common to roll-to-roll manufacturing processes.
  • the component layers that make up the display layer, light detection layer and light source layer may be fabricated advantageously, in part or in whole, directly onto adjacent device layers.
  • Devices may incorporate light absorbing or light reflecting materials to enhance the performance of the light detecting layer and the light source layer.
  • a display device 50 consists of a display layer 51 and a light detection layer 52, where the light detection layer 52 is on the back side of the display layer 51, whose front side 54 faces the viewer and on which ambient light 53 impinges (if present). Further, the display layer 51 is of an electrophoretic micro-cup 57 configuration where each micro-cup 57 corresponds to a single pixel with charged and reflective particles of a single type suspended in a clear liquid 58 (shutter mode).
  • in a first state 61, the charged particles 55 are set along the viewable surface of the micro-cup 57 (through the application of a voltage across the front and appropriate back electrode of the display layer), thus blocking light from reaching the light detection layer.
  • in a second state 63, the charged particles are moved to one side of the micro-cup 57, allowing light to pass through to the light detection layer 52.
  • the display pixel is reflective and from the viewer's perspective 'bright' compared to the second state 63.
  • the display pixel is largely transmissive as the ink particles 56 collect in a corner, and the light detection layer absorbs most of the light. From the viewer's perspective the display pixel appears comparatively 'dark'.
  • the shutter mode of the display layer can also be implemented with other display
  • the color of the charged particle is chosen to maximize the reflectivity of visible light (e.g. 'white') and the composition of the light detection layer (top and bottom electrodes, photovoltaic materials) is chosen to absorb visible light.
  • a light-absorbing material (which may be part of, or separate from and behind, the back electrode 61 of the light detection layer) may be incorporated to maximize the absorption (or reflectivity, in combination with light absorbing ink particles).
  • Figure 3B shows the device 75, similar to the device 50 of Figure 3A, but with the addition of a T-OLED 76 light source layer.
  • when the vertical-to-lateral dimensional ratio of the pixels is high, it is further advantageous to directionalize the typically Lambertian distribution of the OLED emission to minimize any lateral crosstalk from adjacent pixel illumination, which would otherwise reduce state-detection contrast.
  • the normal incident directionality can be enhanced to reduce such crosstalk.
  • Electronics may be integral, proximate or local to a device (or devices), distributed or remote and advantageously include a processor and circuits for receiving signals from the light detection layer, for transmitting signals to the display layer or light source layer.
  • the communications or signaling may be by electrical connection or wireless.
  • the processor may be a microprocessor, and in some cases may be an embedded RFID or other purpose built (fit for use) processor.
  • the processor may also include signal processing units for improved efficiency in processing received signals. Such a signal processing unit may be useful for more efficient determination of messages or patterns, for verifying messages, for determining states of a message, and for determining displayed, visual, and perceptible states.
  • the processor may also be used for monitoring conditions, for example absolute timing or elapsed timing, or for receiving inputs from environmental sensors. In this way, the processor will provide conditional rules for making decisions as to what may be displayed, and possibly what level of perception is needed for the particular environment.
  • the electronics may include memory for storing messages, and processes for determining a subset of critical messages to store to save power and memory space. Electronics may also include various clocks, timers, sensors, antennas, transmitters, and receivers as needed. For particular applications the communication paths may also include encryption and decryption capability.
  • the device may be powered locally by a battery or a capacitor, and may have energy harvesting systems such as RF, optical, thermal, mechanical, or solar.
  • a device may further have a switch, button, toggle or control for scrolling or switching between multiple messages on the same screen.
  • Those methods and systems may be used with electrical signals that correspond to the optical states of display pixels, as conveyed by the intensity of reflected and/or transmitted light, the wavelengths of reflected and/or transmitted light, or the polarization of reflected and/or transmitted light.
  • Those methods and systems may further use measures of ambient light and/or light emitted by a light source layer (e.g. reference pixels, calibrated measurements).
  • Those methods and systems may use electrical signals corresponding to the optical states of display pixels with and without ambient light, pre and post activation of a light source layer or different combinations thereof.
  • electrical signals corresponding to the optical states of display pixels are preferably stored along with the time or period the measurements are taken.
  • optical signals corresponding to the optical states of display pixels are preferably stored along with the time or period the measurements are taken.
  • measurements can be initiated in response to events such as the setting of message pixels, a change in a monitored/detected condition, absolute or elapsed time, or an external signal (e.g. electrical, RF, human- or machine-readable light).
  • the light source layer can be activated in response to a variety of 'events' and as appropriate precede or follow the setting of message pixels.
  • an event first initiates a measurement of ambient light to determine if it is sufficient to effectively detect/measure the optical states of the message pixels. If the ambient light is insufficient (or uncertain), then the light source layer is activated and the optical measurements taken. Further, the output of the light source layer may be regulated in response to the level and composition of the ambient light. In some applications, the light source layer may be activated (e.g. flash) to alert users to a changed condition that warrants their attention (and in low light environments allows them to see an appropriate message).
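The measurement flow in this bullet can be sketched in Python; the threshold value and the `read_ambient`/`read_pixels`/`set_light_source` interfaces are illustrative assumptions, not part of the disclosure:

```python
# Sketch of the event-driven measurement flow: check ambient light first and
# activate the light source layer only when the ambient level is insufficient.
AMBIENT_THRESHOLD = 0.3  # assumed minimum normalized ambient level for a reliable readout

def measure_pixel_states(read_ambient, read_pixels, set_light_source):
    """Return the measured optical states of the message pixels."""
    ambient = read_ambient()
    if ambient < AMBIENT_THRESHOLD:
        set_light_source(True)    # insufficient (or uncertain) ambient: illuminate
        states = read_pixels()
        set_light_source(False)
    else:
        states = read_pixels()    # ambient light alone is sufficient
    return states
```

Regulating the light source output against the measured ambient level, as the bullet suggests, would replace the boolean argument with a drive level.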
  • the detection signals from the light detection layer may be compensated for (e.g., through a calibration procedure) temperature (e.g. the conductivity of many organic polymers increase with higher temperature), supply voltage variation, detector dark current, average ambient light level, uneven light source distribution, pixel or segment size, manufacturing defects, etc. This allows for a more precise determination of the optical state of the pixel/segment
  • the calibration procedure may involve pixels (e.g. stable black and stable white reference pixels) outside of the active display area which may or may not be shielded from receiving any ambient light.
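A minimal sketch of such a reference-pixel calibration, assuming a linear detector response (the linear model and function name are illustrative assumptions):

```python
# Two-point calibration against stable black/white reference pixels read
# under the same conditions as the measured pixel.
def calibrate(raw, ref_black, ref_white):
    """Map a raw detector reading onto a 0..1 optical scale using the
    black and white reference pixel readings."""
    span = ref_white - ref_black
    if span == 0:
        raise ValueError("reference pixels give no contrast")
    return (raw - ref_black) / span
```

Temperature, dark current, and uneven illumination shift both references and the measured pixel together, which is why a same-conditions two-point reference cancels much of the bias.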
  • a set of messages may be displayed in a series, randomly, pseudo randomly, in response to user control (e.g. by scrolling through them) etc.
  • the displayed messages and their states may be individually verified or as a set.
  • the user inputs and timing may be recorded along with the verification data to encourage users to view/perceive the complete message set.
  • the results of message verification can be used to trigger a separate viewable message independent of the first/primary message.
  • the second/separate message, for example, could alert the user to uncertainty regarding the accuracy, visibility, or perceptibility of the primary message, even when that message appears sensible.
  • this "state of the message" message would be simple and thus robust and reliable, and would serve to alert the viewer to a fault with, or uncertainty regarding, the primary message.
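The secondary-indicator logic described above reduces to a simple rule; the message strings and parameter names below are illustrative assumptions:

```python
# Illustrative rule for the secondary "state of the message" indicator: it is
# driven only by the verification result, independently of the primary message.
def state_of_message(verified: bool, perceptible: bool) -> str:
    """Return the text for the simple secondary indicator."""
    if verified and perceptible:
        return "MESSAGE VERIFIED"
    return "CHECK MESSAGE"  # alerts the viewer to a fault or uncertainty
```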
  • Meta systems receive data from devices/electronics/methods/systems capable of verifying displayed messages (e.g. electrically or optically) and combine/use it with data from other sources to transact, analyze, monitor, etc. items, events and outcomes. Knowing that messages (and patterns) can be, or have been, verified/determined increases participation and proper usage, and confidence in the data, outcomes and meta systems.
  • Meta systems typically involve data from multiple, often independent, parties. Some meta systems are typically centered on the item to which the device is attached and associated events or monitored conditions. An insurance or payment system for example may use device data received from the buyer (condition of an item), the seller (customer information) and shipper information (time of delivery). Other meta systems are typically centered on outcomes from the human (or machine) use of device data (as well as the device data itself).
  • Meta systems, for example, can analyze the impact of human (or machine) usage of device data on outcomes. Meta systems can help identify device or system failures vs. those of humans, whether devices have been tampered with, and whether they have been appropriately 'localized' (e.g. messages displayed in languages and date formats appropriate to the location, custodian or user).
  • the outcomes (results) of a clinical trial may depend on displayed messages being not only correct but also used correctly by healthcare professionals and participants.
  • a meta system may therefore analyze outcomes of a clinical trial (e.g.
  • the financial performance of a grocer for example may depend on messages as to the state of perishable foods (e.g. as ordered/acceptable, not as ordered/unacceptable, or not as ordered but acceptable at a discount) being correct, perceptible, etc.
  • a meta system may therefore analyze outcomes such as sales, cost of goods sold, shrinkage or profit figures with action data (rejected shipments or discounts requested) as well as received device data.
  • the meta system may further analyze outcomes involving suppliers (e.g. shipment condition over time, discounts issued etc.) in context of received device data.
  • a detector layer or photoactive thin film sensor 11 consisting of a light sensitive layer 12 sandwiched between two transparent conductive layers, a front layer 13 and a back layer 14, respectively.
  • This photoactive thin film sensor is inserted on the front (i.e., readout side) of a reflective display 15.
  • the light sensitive layer, or photoactive layer may consist of a single compound or many layers, in order to provide an electrical signal (16a, 16b), e.g., a voltage differential, between the respective transparent conductive layers, when ambient light (18a, 18b) impinges onto the photoactive sensor system.
  • the electrical signal depends not only on the ambient lighting (18a, 18b) conditions (intensity over the visible and/or invisible part of the electromagnetic spectrum), but also on the amount of light reflected back from the reflective underlying display pixel (19a, 19b).
  • the ambient light (17a, 17b) passing through the front electrode 13 will act as an electrical bias on the detected electrical signal (16a, 16b) originating from the display pixel.
  • This electrical signal (16a, 16b) can, in a similar way to that of the electrophoretic display described above, be used to verify the state of the display, preferably by first subtracting out the electrical bias signal.
  • the reflective display layer 15 has two pixels, one dark 20a and one bright 20b, with corresponding sensor pixels (21a, 21b).
  • a proper separation 22 between the sensing pixels must be provided in at least one of the transparent electrode layers, i.e. 14 or 13 (e.g. through gaps), in order to measure the states of the desired pixels of the bistable display.
  • the detector layer (photoactive film sensor) 11 can be fabricated with proper alignment directly onto the reflective display layer 15 or onto a supporting carrier film 23 for subsequent transfer onto the display.
  • at least one of the transparent electrodes, e.g. 33 in Fig 2, that drive the display layer (e.g. the photoactive material 12 in Fig. 1 or 31 in Fig.
  • the carrier film 23 may have patterned ITO on both sides, each aligned to the other.
  • the photoactive layer in the above configurations can be made of photovoltaic materials, light harvesting proteins, or other photoactive compounds.
  • Preferred photovoltaic materials include organic photovoltaic materials (OPV) for ease of roll-to-roll manufacturing and for optical properties of high transparency (for configurations shown in Figures 1 and 2) to minimize the impact on display readability.
  • Of particular interest are organic photovoltaic devices that are near transparent or semitransparent developed primarily for automotive and building window applications (see e.g. US Pub. No.
  • the photoactive layer 31 of the light detection layer 35 sandwiched between its front 32 and back 33 electrodes, is polarization sensitive and integrated with the polarizing display layer 34.
  • the polarization sensitive photoactive sensor (light detection layer) 35 is inserted between the polarizer 36 and the front alignment layer 37 (typically glass or polymer film) of the bistable liquid crystal display layer 34.
  • a typical reflective bistable liquid crystal display layer also includes the liquid crystal layer itself 38, a back alignment layer 39 and a reflector 40, which also acts as the back electrode. However, depending on the configuration it may also include additional layers, such as a quarter-wave plate and an additional back polarizer (not shown for simplicity).
  • the pixelated back transparent conductor layer 33 for the sensor signal (41a, 41b) also acts as the pixelated front electrode of the display and is used for the display switching signal (42a, 42b), thus eliminating one transparent conductive layer in the (integrated sensor) display device 51 (or 30).
  • the polarization sensitive film 31 may be made by incorporating nanowire or nano-tube technology, or by preferential photochemical bleaching of
  • the light detection layer 52 is located behind a bistable electrophoretic display layer 51.
  • the electrophoretic display 51 illustrated contains visibly white ink particles (55, 56) in a clear fluid 58 contained in a segmented microcup 57 configuration.
  • In a first state 61, corresponding to a bright segment from the viewing side 54, the white ink particles 55 are distributed at the front surface of the microcup 57 after applying an appropriate switching voltage to the electrodes 59 and front transparent conductor 65 of the display layer 51.
  • the ambient light is reflected by the white ink particles 55 (creating a bright viewable segment) and largely blocked from going through the segment cup 57 and reaching the light detection layer 52.
  • In a second state, the white ink particles 56 are displaced to a smaller lateral region at the side and toward the back of the segment cup 57 after applying an appropriate switching voltage to a smaller area-sized electrode 59 in the back and the front transparent conductor 65 of the display layer 51.
  • most of the ambient light passes through the microcup cell 57 and further onto the light detection layer 52.
  • a visible light absorbing conductor 61 is preferred on the back of the light detection layer 52, in order to yield a higher contrast of the displayed message.
  • the light detection layer 52 is exposed to the complementary light level of the segment state as compared to that viewable by the observer of the display.
  • In Figure 3B, a device 75 similar to the device 50 of Figure 3A is shown.
  • An integral light source layer 76 is included, e.g., as illustrated here a T-OLED (transparent organic light source).
  • the integral light source layer 76 allows for increased detection levels at the light detection layer and ability to discriminate between the states of the display. This exemplary configuration is preferred when the state detection takes place under low ambient lighting conditions or in a dark environment.
  • an electrophoretic display layer 127 comprising a two ink particle system, with the light source layer 129 emitting a shorter wavelength (e.g., UV illumination) and first ink particles 131 (e.g. visibly white) that emit longer wavelength(s) when subjected to the illumination of the light source layer, through phosphorescence or fluorescence.
  • This longer wavelength can further be used to illuminate the display layer 127 (front or back) and enhance the detection by the light detection layer 128.
  • the second ink particles 133 (e.g., visibly black) may absorb the shorter wavelength (e.g., UV) illumination.
  • a display device 175 is shown similar to the devices of Figures 3A/B and 4, previously described, so only the differences will be highlighted.
  • both the light source layer 176 and the light detection layer 177 are situated in front of the display 179 (here illustrated as a microencapsulated electrophoretic display).
  • This configuration allows for optical state detection, with or without the presence of ambient light, from the same side as the observer, and is particularly favorable for reflective displays that do not have a complementary optical state detection capability from the back side of the display.
  • the exemplary light source layer 176 illustrated consists of an LED 181 edge-lit light guide plate 182 (see e.g. Planetech International or FLEx Lighting), which redirects and distributes the light from the LED towards the display layer 179.
  • This particular configuration also allows the light source layer 176 to aid the observer in viewing the display under dark ambient lighting conditions.
  • this front lit configuration also induces undesirable bias light (independent of the display state) onto the light detection layer 177.
  • both the light source layer 176 and the light detection layer 177 must provide significant optical transmission so as not to significantly deteriorate the brightness and contrast of the observed display.
  • the segmented (or patterned) transparent conductor 184 can favorably be used both to switch the state of the display segment and to determine the state of the corresponding segment via the light detection layer.
  • a display device 225 is shown similar to the devices of Figures 3A/B, 4 and 5, previously described, so only the differences will be highlighted.
  • Device 225 has a reversed stack configuration as compared to that in Figure 5, and is shown with a two particle microencapsulated electrophoretic display layer 227.
  • the display performance including brightness and contrast, from the viewer side is uncompromised.
  • the common segmented transparent conductor is on the back side of the display further improving the displayed message, by reducing any potential visual ghosting effects from the (non-ideal) transmission of the conductor.
  • Figures 7A and 7B show two configurations for an emissive display device 250 with a photosensitive detector 251.
  • Detector 251 has the same general structures as already discussed with reference to Figures 3-6, so will not be discussed in detail in this section.
  • Detector 251 and display layer 255 both have their own top and bottom substrates, 252a/b and 256a/b respectively, but it is also possible that they share a substrate or are even integrated without a substrate separating the two.
  • configuration 250 shows the detector layer 251 configured in front of the display layer 255.
  • the top of device 250 is the front side that is positioned toward a viewer, and the bottom of the device 250 is the back side that is positioned away from a viewer.
  • configuration 260 uses the fact that emissive displays in general emit light in both directions. By placing the detector 251 under the display 255 the back emission is detected. The amount of back emission can be tuned by the reflectivity of the back electrode of the emissive display.
  • the additional advantage of this configuration is that the sensor receives less ambient light.
  • the abbreviations in Figures 7A, 7B and 8 are defined as follows: SUB (substrate); DTE (Display Top Electrode); EM (Emissive Layer); PE (Pixel Electrodes); STE (Sensor Top Electrode); PS (Photo Sensitive Layer); CF (Color Filter); SU (Shutter); BL (Backlight); and SBE (Sensor Bottom Electrode).
  • Figure 8 shows an exemplary embodiment of a display device 275 with a backlight 276, a shutter 277 (for example an LC layer with polarizers) and a front detector 279.
  • the middle substrate 281 can again be shared, or the detector 279 and the display 283 can even be integrated without a separating substrate; the color filter 285 is optional.
  • the exemplary embodiments of display devices 250, 260, and 275 require power in order to show the image.
  • An intelligent label that is directly connected to a large power source or to the power grid could operate continuously or for extended periods of time. This could be possible in for example a store setting where the intelligent label is showing the price of an item.
  • the intelligent label can be continuously powered in that case and can show the information continuously.
  • the exemplary embodiments make it possible to also continuously verify if the information is displayed correctly or verify this whenever needed.
  • An intelligent label may have an actuator that activates the display temporarily from time to time responsive to an activation signal, for example a signal from an environmental sensor.
  • the sensor could be a proximity sensor, an (IR) movement sensor, a push button, a touch interface, a bend sensor (strain gage), a microphone or an accelerometer, etc.
  • the message actuator ensures that the display is mostly off in order to conserve power.
  • the display could be activated for a certain amount of time or until the sensor does not detect movement, touch, finger push or bending (movement) or sound for a certain amount of time. Detecting the state of the display now becomes more energy efficient, as the display is only on for certain short periods of time. Detecting the state just at the start of an activation period may be sufficient, instead of detecting the state of the display at various moments in time for a permanent (bistable) display as used in selected other embodiments.
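The activation scheme above can be sketched as follows; the timeout value and the list-based event interface are illustrative assumptions:

```python
# Sketch of the power-saving activation scheme: the display turns on at a
# sensor event and off a fixed time after the last detected event; state
# detection need only run once at the start of each on-period.
def activation_intervals(event_times, timeout=5.0):
    """Return [on, off] display-on intervals from a list of sensor event
    times; events inside a running interval extend it."""
    intervals = []
    for t in sorted(event_times):
        if intervals and t <= intervals[-1][1]:
            intervals[-1][1] = t + timeout      # extend current activation
        else:
            intervals.append([t, t + timeout])  # new activation period
    return intervals
```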
  • a block diagram 300 of the intelligent label 305 with the message actuator 306 is shown in Figure 9.
  • the different elements have the same function as outlined in copending U.S. patent application number 14/586,672, filed December 30, 2014 and entitled "Intelligent Label Device and Method," which is incorporated herein by reference as if set forth in its entirety.
  • the message actuator 306 communicates with the state detector (sensor) 307, as described above, which sends the activation signal to the electronics of the intelligent label to activate the display (i.e. the message indicators 308 and 309) and show the message; it also sends a deactivation signal based on a timer, a sensor deactivation signal, or a combination of the two.
  • Compensating for ambient light with an emissive display is possible by inserting short periods of time where the display is not emitting light. During that time the sensor only senses the ambient light. That measurement can be used to correct for any bias, such as high ambient light intensity or spatially or temporal changes in ambient light intensity over the display.
  • the emission can be turned off by powering off the pixels. In a backlit LC display this can either be done by changing all pixels to the black state or by turning off the backlight.
  • emissive displays such as OLED, LC (with integrated light), or QD can switch very fast.
  • OLED or QD can switch between on and off within microseconds, while modern LC can switch within 1 millisecond.
  • a scheme can thus preferably be implemented for each image frame update (of, for example, 20 ms (50 Hz)) wherein a small portion (e.g., a few milliseconds) is reserved for ambient light sensing. As this can be done very fast, the viewer will not see any flickering.
  • ambient light sensing could be done at the start and/or at the end of displaying the information in case the display is not always on.
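The per-frame compensation described above can be sketched as follows; the `read_sensor`/`set_emission` interfaces are illustrative assumptions:

```python
# Sketch of per-frame ambient compensation for an emissive display: each
# frame (e.g. 20 ms at 50 Hz) reserves a few milliseconds with emission off
# so the sensor samples only the ambient bias, which is then subtracted.
def sense_frame(read_sensor, set_emission):
    """Return the display's own light contribution for one frame."""
    set_emission(False)   # short blanking slot: sensor sees ambient only
    ambient = read_sensor()
    set_emission(True)    # remainder of the frame: display emits
    lit = read_sensor()
    return lit - ambient  # display contribution with ambient bias removed
```

Because OLED/QD pixels switch within microseconds and modern LC within about a millisecond, the blanking slot stays invisible to the viewer.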
  • an emissive display is almost always visible, even in dark environments, as it does not rely on an external light source. Also, the state detection could become easier for a display that only shows the information when activated. Further, due to the fast switching capabilities of most emissive displays, efficient compensation of the ambient light is possible.
  • a backlight 327 is used behind the display 325, usually an LCD, where an object, e.g. a finger 329, on the display 325 reflects the light from the backlight 327 back to a detector 331 that is integrated on the backplane 333 of the LCD.
  • a structure 350 using a photo TFT 351 (thin film transistor) and a readout TFT 352 that is used to read out the photo sensor is shown.
  • the photo TFT 351 can receive reflected light through the opening 355 in the black matrix 357 laterally offset from the color filter 359, while the read-out TFT 352 is under the black matrix 357.
  • the photo TFT 351 typically has a light blocking layer as a first (bottom) layer in order to avoid direct illumination from the back light.
  • Because the photo diodes are typically sensitive to temperature as well, the accuracy of the light sensing can be increased by adding a second diode that measures only the effect of the local temperature (i.e. has both a bottom and a top light blocking layer) and whose signal is subtracted from the photo diode signal.
  • In Figure 12, a backplane circuit 400 for an active-matrix LCD with integrated light sensors is shown.
  • One light sensor is implemented for every 4 pixels, although it is possible to implement more or fewer light sensors as well.
  • the light sensing circuit is a simple 2 TFT circuit as shown in Figure 11.
  • the sensing circuit shares a number of lines with the pixel circuits to simplify the external wiring.
  • the circuit works by first putting a bias on the capacitor Cst2, which leaks away through the photo TFT depending on the light intensity. By reading the remaining bias on the storage capacitor after a certain amount of time (e.g. 20 ms), the average light intensity on the photo TFT can be calculated.
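The readout arithmetic above can be sketched as follows, assuming for illustration a leak rate linear in light intensity (the linear model and the proportionality constant `k` are assumptions, not from the application):

```python
# Sketch of the 2-TFT sensing readout: a bias stored on Cst2 leaks through
# the photo TFT; the residual bias after a fixed interval gives the average
# light intensity over that interval.
def estimate_intensity(v_initial, v_residual, t_ms, k=1.0):
    """Average light intensity inferred from the bias decay over t_ms,
    assuming leak rate = k * intensity (illustrative linear model)."""
    return (v_initial - v_residual) / (k * t_ms)
```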
  • In Figure 13A, a pixel circuit 425 for an AMOLED with an integrated scanner function is shown.
  • the photodiode is made from a p-i-n amorphous silicon diode.
  • Figure 13B illustrates a timing diagram 450 for the circuit of Figure 13A.
  • OLED compensation circuits using optical sensors In Figure 15, an OLED compensation circuit 500 based on optical feedback is shown.
  • the photo TFT is an a-Si NIP diode integrated on the backplane.
  • the photo TFT detects the light coming from the OLED.
  • the drain-source current from the photo TFT determines the amount of time the OLED is on during a frame. This compensates for degradation of the OLED by making the on-time of a degraded OLED longer such that the integrated light output over one frame is equal to that of a fresh OLED.
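The compensation principle above (constant integrated output per frame) reduces to a simple proportion; the function and parameter names are illustrative assumptions:

```python
# Sketch of optical-feedback OLED compensation: scale the on-time per frame
# so that luminance * on_time stays equal to that of a fresh OLED.
def compensated_on_time(fresh_luminance, measured_luminance, nominal_on_ms):
    """On-time (ms) needed so the degraded OLED's integrated light output
    over one frame matches the fresh OLED's output."""
    return nominal_on_ms * fresh_luminance / measured_luminance
```

A pixel that has degraded to 80% of its original luminance would thus be driven 25% longer per frame.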
  • the general implementation consists of integration of or adding a light sensitive element to the display.
  • the optimal solution is to integrate the light sensitive element directly in the active matrix as already proposed for in-cell touch and OLED compensation.
  • the light sensitive element can be incorporated into one of the substrates or can be created on a separate substrate and adhered to the bottom or the top of the display as already proposed for the light sensitive layer in previous embodiments.
  • a light blocking layer is proposed to shield the photo detector from ambient light contributions.
  • This light shielding layer can also be used in various embodiments as previously described in order to improve the signal to noise ratio.
  • Integrated light sensitive element in a back lit reflective display In this embodiment 525, illustrated in Figure 16, a reflective display 526, such as an electrophoretic E Ink display, is used in combination with a backlight 527 as a light source and an integrated optical sensor 528, such as a photo diode or a photo transistor, as the detector.
  • the optical display (from the back side) will scatter the light back onto the light sensor, with a light level indicative of the optical state of the display (pixel).
  • the sensor 528 will sense the inverse image as it is sensing on the backside.
  • If the backside of the display is black, only a fraction of the light impinges on the sensor as compared to a white state. Intermediary grey states can also be detected.
  • the optical sensor 528 can then be implemented as a light sensitive transistor in the same technology as already used for the matrix backplane.
  • the light shield 531 under the sensor 528 can easily be implemented by using one of the metal layers underneath the sensor 528. Of course it is possible to use the sensor 528 without a light shield 531, but the optical contrast will then be much lower.
  • the backlight 527 can also only emit non-visible light, such as IR or UV, in order to avoid light leakage through the reflective display impacting the viewer.
  • the sensor 528 can be tuned to be sensitive to the particular wavelength of the backlight. In this embodiment, vertical separation (e.g. a spacer layer) between the optical sensor 528 and the reflective display 526 is desirable in case larger pixel areas are employed.
  • Separate light sensitive element in a back lit reflective display It is also possible to add the light sensitive element as a separate layer to the display, as shown in Figure 17. This could be useful in case a simple display structure, such as a few segments, is used or when a separate add-on is more economical.
  • the bottom display substrate and electrode structure must be transparent enough to be able to sense the switching state of the display medium through these layers. This can be done by using ITO or other transparent conductors for the pixel electrode.
  • a backlight 601 is used in combination with a light sensor sheet 602.
  • the light sensor sheet 602 can be made with light sensitive transistors or diodes built by photolithography.
  • If the resolution is lower, it is also possible to mount discrete light sensors on a flex foil, as long as the flex foil has enough transparency for the backlight.
  • This embodiment is similar to the embodiment shown in Figure 6, but is now using an optical sensor with a light shielding element instead of a photosensitive layer.
  • a separate sheet 626 with light sources and light sensors in a side-by-side configuration is integrated.
  • This is typically a low resolution solution built with discrete components (e.g. LEDs and photo detectors) on a flex foil, although it is also possible to build such a layer with high resolution OLED with integrated photo diodes or transistors.
  • When an array of light sources and detectors is used, it is possible to switch light sources and detectors sequentially or in groups in order to get the best possible optical contrast for the display state verification.
  • a separate sheet 651 only contains the light sources in a side-by-side configuration, while a photosensitive layer 652 is positioned behind the display and the light source layer.
  • By switching one light source on at a time, the detector will detect the switching state of the illuminated part of the display. This works well for low resolution segmented displays; in case the light sources are made in a high resolution technology, like a matrix OLED array, this could even be used for high resolution matrix displays.
  • the photosensitive layer could contain multiple discrete sensors for a faster response time, like in display 626 or be processed in a grid with row and column electrodes.
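The one-source-at-a-time scan above can be sketched as follows; the `set_source`/`read_detector` interfaces are illustrative assumptions:

```python
# Sketch of the sequential scan: with an unpatterned photosensitive layer,
# lighting a single source isolates the state of one display segment.
def scan_segments(n_sources, set_source, read_detector):
    """Return one detector reading per segment, illuminated one at a time."""
    states = []
    for i in range(n_sources):
        set_source(i, True)            # illuminate only segment i
        states.append(read_detector()) # unpatterned detector sees segment i
        set_source(i, False)
    return states
```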
  • Emissive display (e.g. OLED)
  • an emissive display 700 embodiment with an integrated optical sensor is shown.
  • the emissive layer emits light in all directions.
  • the light that is emitted down is sensed by the optical sensor.
  • the optical sensor can be integrated into the active matrix using the same layers and technology.
  • the optional light shield layer shields the ambient light from the sensor in order to reduce bias.
  • Instead of an absorbing layer, it is also possible to make it a reflective layer, as that increases the amount of light falling on the optical sensor even further, but it will also decrease the display's optical performance for the viewer.
  • Alternatively, the shield layer can be reflective on the back side and absorbing on the front side.
  • The optical sensor can instead be positioned just below the light shielding layer and above the emissive layer, but the disadvantage of that is that the sensor then needs to be processed separately and cannot be made at the same time as the electrodes and transistors on the bottom substrate.
  • In Figure 19, a similar structure 725 is shown, but now with the light sensor implemented in a separate sheet.
  • This embodiment is similar to the embodiment shown in Figure 7A, but is now using an optical sensor with a light shielding element instead of a photosensitive layer.
  • a shutter display 750 with an integrated optical sensor is shown.
  • a shutter display has various degrees of transparency depending on the switching state of the material. For example, in case liquid crystal (LC) is used, the LC can be switched between a semitransparent state and a dark state by sandwiching the LC material between crossed polarizers.
  • If the display has a backlight, it is advantageous to use a light shield layer just below the light sensor to reduce signal bias induced by the backlight.
  • the light sensor can detect the state of the pixels even without ambient light. Further, by using non-visible (IR) light in the front light the optical performance in the visible wavelength range is largely unaffected, while the signal level for the optical detector could be further increased.
  • If the shutter display is a reflective display (with a reflective bottom electrode), a backlight is not functional, but the front light could provide additional visibility for the user and the sensor when the ambient light is poor.
  • an optical shutter is added to the display, such that the photo sensitive layer only receives the reflection, transmission, or emission from one pixel at a time.
  • the advantage is that this allows the photo sensitive layer to be unpatterned (i.e. not have any pixels) which makes it much easier to manufacture.
  • As the shutter can be a simple LC display, the shutter and the display can be made with the same manufacturing infrastructure, which makes it easy to manufacture with matching pixel size and shape. LC displays are now extremely cheap, thus adding only marginally to the cost of the display system. Also, it is possible to make the shutter normally transparent (i.e. normally white) so that the transparent state is the state without any power to the shutter.
  • the photo sensitive layer is preferably made by a solar cell type of technology.
  • the photo sensitive layer does not need to be pixelated anymore, something that is very compatible with the general structure of solar cells.
  • It is also possible to pixelate the photo sensitive layer, such as with photosensitive transistor or diode structures, or even to use discrete photo sensitive components mounted on a flex board, as also previously described.
  • a reflective display device 800 with an exemplary electrophoretic display layer 801, is shown with a photo sensitive layer 802 in front.
  • a shutter 803 is positioned in front of the photo sensitive layer 802.
  • the shutter 803 has a pixelation such that it can pass or block light per pixel of the display.
  • the pixelation of the shutter 803 can be identical to that of the display or it can be different (larger or smaller than one pixel), while still allowing the passing or blocking of the light per (part of a) display pixel.
  • the photosensitive layer 802 is not pixelated and only registers the amount of light that is passing through its light sensitive layer. By switching the shutter from pixel to pixel, the state of each pixel can be registered.
  • the front light 804 and color filter 805 are optional. Substrates can be shared, or some of the components could even be monolithically integrated on top of each other.
  • the user looking at the display will see the shutter 803 blocking part of the image, depending on the speed of the shutter and the way the shutter 803 is driven. This can be addressed by operating the shutter 803 at a high speed, for example 50Hz or higher.
  • each pixel can be opened by the shutter 803 multiple times, for example 50 times. This would result in a total measurement time of 1 second, where each pixel is measured 50 times for short periods of time.
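The per-pixel shutter scan described above can be sketched as a simple loop. The function and variable names below are illustrative stand-ins (the patent does not specify an interface), and the photo-sensor read is simulated; this is a minimal sketch, not the actual device firmware.

```python
# Sketch of the per-pixel shutter scan: the shutter passes light at one
# pixel at a time while an unpatterned photo sensor is read; repeated
# openings per pixel are averaged to reduce noise.

def scan_display(pixels, read_sensor, flashes_per_pixel=50):
    """pixels: iterable of pixel ids; read_sensor(pixel) returns one light
    sample taken while the shutter passes light only at that pixel."""
    readings = {}
    for p in pixels:
        # Open the shutter over this pixel `flashes_per_pixel` times and
        # average the photo-sensor output.
        samples = [read_sensor(p) for _ in range(flashes_per_pixel)]
        readings[p] = sum(samples) / len(samples)
    return readings

# Example: a simulated display where even pixels reflect fully (1.0)
# and odd pixels are black (0.0).
state = {p: 1.0 if p % 2 == 0 else 0.0 for p in range(4)}
result = scan_display(list(state), lambda p: state[p])
```

In the real device the averaged reading per pixel would then be compared against the expected optical state for that pixel.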
  • FIG. 22 An alternative embodiment 810 is shown in Figure 22, where now both the shutter 813 and the photo sensitive layer 812 are pixelated, such that the combination of the two allows a per display pixel measurement of the switching state. Any trade-off is possible between the two layers in order to find the optimal solution.
  • FIG. 23 an alternative embodiment 820 is shown where the shutter 823 is positioned in-between the back light 826 and the photo sensitive layer 822.
  • the photo sensitive layer 822 now senses the switching state of the backside of the display.
  • This embodiment can also be well used for shutter like display effects, such as LC, instead of reflective E Ink. In that case the front light is omitted, but the rest of the stack is the same.
  • FIG. 24 an embodiment 830 using a shutter 833 is shown for a reflective display that is switched between a reflective state and a transparent state, such as a Cholesteric Texture Liquid Crystal (CTLC) display layer 835.
  • the shutter again selects the pixel to be measured.
  • When the display pixel is in its reflective state, the photo sensor will not detect light, while it does detect light when the pixel is in its transparent state.
  • the reflectivity curve 839 for the CTLC display 838 is also illustrated.
  • the shutter embodiment is shown for an emissive display 840.
  • the emissive display is not bi-stable, so it only emits light when it is powered.
  • the emissive display typically emits light in both directions; the light emitted towards the back is used to detect the state of the display.
  • the amount of light that is emitted towards the back can be tuned by optimizing the layer thickness of the back electrodes of the display layer.
  • FIG. 26 a simplified embodiment 850 is shown where the shutter function 853 has been integrated into the emissive display layer.
  • When the emissive layer is showing the image to the viewer, the shutter function 853 can modulate each pixel at a high speed, such that the photo sensitive layer can detect the change in light and thereby the correct switching state of the pixel. This can be done with the same methods described for the drive schemes of the shutter above.
  • FIG. 27 the embodiment 860 of the emissive display with the shutter 863 and photo sensitive layer 862 in front of the display is shown.
  • the advantage of this embodiment is that the emission of the display is unidirectional towards the viewer.
  • the disadvantage is that more layers are now between the display and the viewer including the shutter that needs to be operated.
  • the integrated shutter function into the emissive layer can be used here as well, as shown in device 850.
  • FIG. 28 an embodiment 870 is shown where a shutter 873 display effect is used, both to display the image and to function as the shutter for the photo sensitive layer 872.
  • the backlight could be modulated (or strobed) from two light sources, e.g. one emitting in the visible wavelength range for viewing the emissive display and one emitting at a wavelength range outside of the visible range (e.g. in the IR or UV) for detection purposes with a corresponding wavelength-tuned detector.
  • a similar embodiment 880 is shown, but now using a reflective shutter 883 type display effect.
  • the photo sensor 882 will always be subjected to the bias light from the front light while detecting the pixel state at high speed. This is made possible by the polarization sensitive sensors previously discussed, with the front side polarizer of the display layer placed in front of the detection layer. Accordingly, an unpatterned or coarsely patterned photo sensor can be used in combination with a low-cost off-the-shelf shutter.
  • Display pixel state verification by a detector generally requires a detector that has at least the same resolution as the pixels of the display itself. Especially for high resolution displays this would require an expensive optical detection system. Further, large area optical sensors, such as solar cells, are manufactured with different (low resolution) infrastructure than displays. The applicability of an optical sensor is therefore highest when the resolution requirements on the sensor are low.
  • a lower resolution optical sensor in combination with a consecutive update of the display in matching orthogonal blocks can be employed to determine the optical state of the display pixels.
  • a scanning front or backlight can be used.
  • These systems and methods can be applied to not only bi-stable displays, such as electrophoretic and CTLC displays, but also to non bistable displays, such as LCD, OLED, QD or micro LED. It is applicable to segmented displays, passive matrix displays and active matrix displays.
  • a differential signal is recorded by the sensor, meaning that the pixels are switched to a reference state and the final state, where the difference is recorded for verification of the state of the pixel.
  • the sensor can be a solar cell, a (integrated) transistor sensor, a discrete grid of optical sensors, a capacitive sensor or any other kind of sensor that can record the (change of the) switching state of a pixel or a group of pixels.
  • FIG. 30 an embodiment 900 is shown where the display 901 is updated directionally.
  • the new content is written to the display 901 from left to right, i.e. pixel column by pixel column in this case.
  • the display 901 does not have to be updated from left to right or top to bottom as long as every group of pixels that is updated at the same time only triggers a response on one of the optical detector segments. Therefore, this same approach can also be used for segmented displays or displays with other shapes.
  • An example 910 is shown in Figure 31. Note that in Figure 31, the figure on the left can also be achieved with only three sensor stripes as illustrated in the figure on the right.
  • An example of an alternative 920 approach would be to have an optical sensor array 922 consisting of rectangular pixels that are large enough to overlap with 5x5 display pixels 921, as shown in Figure 32. By updating the display such that in 25 steps every pixel in the 5x5 blocks is updated sequentially, the sensor pixels detect only the change per pixel resulting in a verification of the display state.
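The 5x5 block-sequential update described above can be sketched as follows. The coarse-sensor response is idealized here (each step's differential reading is attributed directly to the single pixel updated per block), and all names are illustrative.

```python
# Sketch of the 25-step block update: a coarse sensor pixel overlaps a 5x5
# block of display pixels, and the display is updated so that the sensor
# sees exactly one pixel change per block per step.
BLOCK = 5

def verify_by_block_update(image):
    """image: 2D list whose dimensions are multiples of BLOCK.
    Returns the per-pixel state reconstructed from coarse sensor deltas."""
    rows, cols = len(image), len(image[0])
    recon = [[None] * cols for _ in range(rows)]
    for di in range(BLOCK):            # 25 update steps (5x5 offsets)
        for dj in range(BLOCK):
            for bi in range(0, rows, BLOCK):      # every block updates the
                for bj in range(0, cols, BLOCK):  # same offset at once
                    # The coarse sensor over this block registers only the
                    # change caused by the one pixel updated in this step.
                    delta = image[bi + di][bj + dj]   # idealized differential
                    recon[bi + di][bj + dj] = delta
    return recon

img = [[(r * 5 + c) % 2 for c in range(5)] for r in range(5)]
assert verify_by_block_update(img) == img
```

The reconstruction equalling the displayed image is the verification result.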
  • a bi-stable display such as an electrophoretic or CTLC display
  • the display is always showing information, even when it is not powered. It is therefore best if the pixels are first switched to a known reference state (e.g. black) followed by switching them to the new state. That way the detector can detect the change in optical signal when the pixels are refreshed. Even when the image is static and does not need to change the information that is displayed, the verification action should trigger this update in order to correctly verify the state of the pixels by detecting a difference per pixel.
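The reference-state refresh for a bi-stable display described above can be sketched as a differential measurement. The sensor and drive interfaces below are idealized stand-ins, not a real driver API.

```python
# Sketch of differential verification for a bi-stable display: each pixel
# is first driven to a known reference (black), then to its target state,
# and the detector checks the per-pixel difference.

REF = 0.0  # reference state: black

def differential_verify(target, sense):
    """target: {pixel: intended reflectance}; sense(pixel, state) returns
    the detector signal for a pixel driven to `state`."""
    ok = {}
    for p, new_state in target.items():
        ref_signal = sense(p, REF)        # drive to reference, measure
        new_signal = sense(p, new_state)  # drive to final state, measure
        measured_delta = new_signal - ref_signal
        # Pass if the measured change matches the intended change.
        ok[p] = abs(measured_delta - (new_state - REF)) < 1e-6
    return ok

target = {"p0": 1.0, "p1": 0.0, "p2": 0.5}
result = differential_verify(target, lambda p, s: s)  # ideal sensor
```

Even a static image triggers this reference-then-final refresh, so that a per-pixel difference is always available to the detector.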
  • a non bi-stable display such as an LCD
  • the display is only showing information when it is powered and scanned. LCDs can either be segmented, passive matrix, or active matrix.
  • Passive matrix LCDs are usually driven by scanning in a certain direction, for example from left to right. During the activation of a certain column of pixels, the pixels are put into a switching state that generates the right grey level for the frame time.
  • Active matrix LCDs use a transistor circuit per pixel in order to generate a substantially constant switching state (i.e. light output) per pixel during a frame time.
  • the pixels are refreshed a row-at-a-time at high speed in order to show moving or static images.
  • it is advantageous to insert a short pixel-off interval (or alternatively a reference pixel switching state) per row during every scan to detect the difference between the off-state and the new state for all the pixels by the simplified detector.
  • This method requires a fast LC switching effect and detector.
  • example display 950 cross sections are shown with either a back light 951 or a front light 952.
  • In these configurations it is possible to combine the resolution of the back light 951 or front light 952 with that of the optical sensor 953 such that the resolution requirement of the sensor is reduced.
  • FIG. 34 an example 975 is shown where the front or back light 976 is scanning from left to right over time, resulting in a simplified structure for the optical sensor 977.
  • the scanning frequency can be so high that the viewer cannot perceive the scanning of the front or back light 976, while the optical sensor 977 can now detect the (change of) light per area of the display 978 that is lit by the front or back light.
  • the combination of the front or back light 976 resolution and the optical detector 977 resolution must be equal to the pixel resolution of the display 978 in order to verify the pixels individually.
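The resolution rule above amounts to a simple product check: the distinguishable (light segment, sensor segment) pairs must cover every display pixel. The numbers below are illustrative, not from the patent.

```python
# Sketch of the resolution rule: a scanning light with L segments combined
# with a sensor of S segments can resolve at most L * S display pixels.

def covers_all_pixels(light_segments, sensor_segments, display_pixels):
    return light_segments * sensor_segments >= display_pixels

# e.g. a front light scanned in 60 vertical stripes combined with a
# 40-segment horizontal sensor can resolve a 60 x 40 = 2400 pixel display.
assert covers_all_pixels(60, 40, 2400)
assert not covers_all_pixels(60, 40, 2401)
```

An unpatterned detector (one segment) therefore needs the light or the update sequence to provide the full pixel resolution on its own.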
  • FIG. 35 An example 980 is shown in Figure 35, where the combination of the scanning front or back light 981 with the consecutive update of the display 982 makes it possible to use an unpatterned optical detector 983.
  • Emissive displays In the case of an emissive display device 990, essentially the front or backlight and the display are integrated into one. By using a fast scanning update scheme, as discussed with reference to Figure 26, it is possible to simplify the optical sensor 991 electrode structure, as shown in Figure 36.
  • the emissive display 992 is showing the image by emitting light from the pixels. Typically, this can be achieved by OLED, QD, or micro LED type of displays.
  • each group of segments is put in its on-state (or in its off-state) sequentially. It is also possible to use another defined grey state instead of the off or on state.
  • the passive matrix emissive addressing scheme with a simplified optical sensor, as shown in Figure 36, the sensor will detect the flashing of every individual pixel during the addressing. Through this method the optical state of every pixel can be verified.
  • Active matrix emissive displays use a transistor circuit per pixel in order to generate a substantially constant light output per pixel during a frame time.
  • the pixels are refreshed row-at-a-time at high speed in order to show moving or static images.
  • it is advantageous to insert a short pixel-off interval (or generally a reference state interval) per row during every scan to detect the difference between the off-state and the new state for all the pixels by the simplified detector.
  • two consecutive measurements 1001, 1002 are made with a reflective display layer 1006 with a front light 1005.
  • the first measurement 1001 is done with the lighting 1005 off, while the second measurement 1002 is done with the lighting 1005 on.
  • the difference between the two signals corresponds to the ambient light contribution, which can thus be compensated for in the state detection signal (by subtracting a bias).
  • the width of the arrow pointing towards the pixels represents the local amount of ambient light falling on the part of the display.
  • the two measurements can be done closely spaced in time, where the front light 1005 is quickly flashed to the off state for the off measurement while it is on the remaining time, or vice versa. Further, it is also possible to use a scanning front light as proposed in Figure 34, so that the measurement in the off- or on-state can be done in a scanning way to please the eye of the viewer. As a front light can generally be switched fast (i.e. 50Hz or higher), the user does not perceive this as flashing, but rather as a continuous light intensity (low when the front light is mostly off, or just below full intensity when it is mostly on).
  • the measurements need to be done closely spaced in time in order to avoid temporal fluctuations in the light intensity to affect the pixel state verification. It is possible to make this multi-step verification measurement more pleasing to the eye of the viewer by doing the consecutive measurements pixel-by-pixel or in certain blocks of pixels in order to make the measurement less visible.
  • Two consecutive measurements with an emissive display In the case of the display device 1075 illustrated in Figure 40, the first measurement 1076 is done while all pixels are off (not emitting). During this measurement the local (pixel) intensity is that of the ambient light alone. The second measurement 1077 is done while showing the image. In this case the composite of the ambient light and the emitted image is measured. By subtracting the two measurements, the ambient light component can be compensated for. As emissive displays can typically be switched fast (i.e. 50Hz or faster), the two measurements can be spaced closely in time. It is also possible to do the two measurements per pixel or per row or column of the display in order to make it more pleasing to the viewer.
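The two-measurement ambient compensation above reduces to a per-pixel subtraction. The sensor values below are illustrative integer counts; the interface is an assumption for the sketch.

```python
# Sketch of ambient compensation: one reading with the light (or emission)
# off captures ambient alone, one with it on captures ambient plus the
# pixel signal; subtracting the two isolates the pixel state.

def compensate_ambient(reading_light_off, reading_light_on):
    """Both arguments: {pixel: sensor count}. Returns the ambient-free
    per-pixel signal."""
    return {p: reading_light_on[p] - reading_light_off[p]
            for p in reading_light_on}

ambient = {"p0": 30, "p1": 80}   # uneven ambient illumination
signal  = {"p0": 100, "p1": 0}   # true pixel contribution
off = ambient
on = {p: ambient[p] + signal[p] for p in signal}
assert compensate_ambient(off, on) == signal
```

The same subtraction also reveals when ambient conditions are too strong or too unstable for a reliable measurement, as discussed next.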
  • the two (or more) measurements that can be used to subtract the ambient light contribution can also be used to detect lighting conditions that are not good enough to do a reliable measurement.
  • multiple actions can be taken.
  • One of them could be to temporarily increase the intensity of the artificial lighting (front, back or self-lighting), in order to reduce the relative contribution from the ambient lighting.
  • optical and electrical verification methods can be manipulated or distorted, resulting in an ambiguous or even a wrong state indication to the tag or backend system, while in fact the display was showing the correct information in a perceivable way.
  • One or more reference pixels can be added that are switched in a predefined way during every verification cycle.
  • a display could have one reference pixel that is switched from white to black and back to white again during every measurement of the pixel state, as shown in Figure 41.
  • the measurement output for this pixel should also behave in a predictable way.
  • the switching curve can be sampled. This should result in a smooth curve (i.e. consecutive measurements should either be increasing or decreasing in value) with a certain minimum and maximum readout when the external conditions are good enough and constant enough for the measurements.
  • the quality of the external environment during the pixel verification can be verified.
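The reference-pixel sanity check above can be sketched as a test on the sampled curve: for a white-to-black-to-white reference pixel, the samples should fall and then rise monotonically between expected bounds. The thresholds below are illustrative assumptions.

```python
# Sketch of the reference-curve check: a smooth fall-then-rise between a
# dark minimum and a bright maximum indicates measurement conditions were
# stable; anything else flags an unreliable environment.

def reference_curve_ok(samples, lo=0.1, hi=0.9):
    turn = samples.index(min(samples))
    falling = all(a >= b for a, b in zip(samples[:turn], samples[1:turn + 1]))
    rising = all(a <= b for a, b in zip(samples[turn:], samples[turn + 1:]))
    return falling and rising and min(samples) <= lo and max(samples) >= hi

assert reference_curve_ok([1.0, 0.7, 0.3, 0.05, 0.4, 0.8, 1.0])
# A glitch (e.g. a flickering ambient source) breaks the monotone shape:
assert not reference_curve_ok([1.0, 0.7, 0.9, 0.05, 0.4, 0.8, 1.0])
```

A failed check would cause the verification cycle to be repeated or reported as inconclusive rather than trusted.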
  • the switching curves of the pixels to be verified can be measured. This is especially useful for displays that are not bi-stable, such as LCD or OLED, as they are continuously driven.
  • the pixels are switched from their current state to a certain reference state and then back to the current state again.
  • the reference state can either be the full on or off state or a small difference compared to the current switching state such that the user can hardly notice the difference.
  • not only the current state is measured, but also the reference state or even states in between the current state and the reference state.
  • when the switching curve is known and smooth, the multiple measurements should result in a predictable relative outcome. When the external environment is fluctuating in time or position, or is in general not good enough to do the measurement reliably, the series of measurements will result in a switching curve that is not as predicted.
  • the measurements can be done optically and/or electrically in ways already disclosed before.
  • Environmental sensors By adding environmental sensors, such as optical sensors, electromagnetic radiation sensors, vibration sensors, acceleration sensors, etc. it is possible to sense if the environment is good enough to perform a reliable pixel verification and if the environment is not fluctuating in time.
  • the sensors can be added to the display system and it is also possible to add multiple sensors of the same type at different locations. The sensors would be read-out before, during and/or after the pixel verification in order to ensure that during the whole verification measurement the environment was good enough and not fluctuating to reliably do the verification.
  • an optical verification system could be used to sense the amount of light reflected, emitted or transmitted per pixel, while an electrical verification system at the same time senses if the (switching or test) voltages put on the electrodes really reach the other end of these electrodes and also measures the capacitance of and/or the current flowing into each pixel. This combined information from multiple sources can make the system extremely robust against tampering.
  • Adding a static or dynamic watermark to the image By adding a certain visible or even better an invisible pattern to the image that is displayed or to the update of the image, it is possible to detect tampering with the system. When the watermark cannot be detected, the system could well be hacked or be tampered with. As a response the system can then shutdown and/or a (error) message could be displayed, stored or sent.
  • the types of unique patterns can be any of:
  • a unique modulation of the electrical signals (e.g. additional high frequency modulation, modulation in frame rate, AC/DC signal added to voltage levels, etc.).
  • the unique patterns or watermarks can be stored in the system upon fabrication or be a generated pseudo random series that uses the unique system ID as seed. Alternatively, the unique pattern could be sent by the backend system to the device using any known way to make a unique one-time sequence. Accordingly, the disclosed embodiments result in a display device where tampering can become virtually impossible during the verification process.
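The seeded pseudo-random watermark mentioned above can be sketched with a stdlib generator. The pattern format (a short bit sequence modulating the update) and the ID strings are assumptions for illustration; both the tag and the backend can derive the same sequence from the shared system ID.

```python
# Sketch of a watermark derived from the unique system ID: the same seed
# yields the same one-time pattern on the tag and on the backend, so a
# mismatch in the detected pattern indicates tampering.

import random

def watermark_bits(system_id, n=16):
    rng = random.Random(system_id)        # unique system ID as the seed
    return [rng.randint(0, 1) for _ in range(n)]

expected = watermark_bits("LABEL-0042")   # computed by the backend
detected = watermark_bits("LABEL-0042")   # recovered from the displayed image
assert detected == expected               # same seed -> same watermark

tampered = expected[:]
tampered[0] ^= 1                          # a manipulated image changes bits
assert tampered != expected
```

On a mismatch the system could shut down, or display, store, or send an error message, as described above.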
  • Placing a mirror that is slightly off-angle in front of the display, in order to create an ambiguous spatial fluctuation in the lighting conditions, can be detected either by using a reference pixel that shows an abnormal response when switching, by measuring pixels in a number of different switching states, by measuring the switching curves of pixels, by external detectors that detect different light intensities at different locations, or by using electrical measurements of the pixel state instead of optical ones.
  • Using a source of electromagnetic radiation to create electrical noise for the measurements can also be overcome by detectors, reference pixels, measuring switching curves, or using an optical detection system. When complemented by watermarking, the complete system can become tamperproof.
  • a device such as an intelligent label
  • users may receive information visually as discussed above, and users may in some cases also receive information audibly. Similar to the need for verifying the visual information, there is also a need for verifying the audible information. Described herein are systems and methods for determining the audible output from intelligent audible devices configured to generate sounds in response to events as determined by intelligence integrated within the device.
  • Audible messages are the audible (acoustic) outputs from intelligent audible devices. Audible messages may be any of many forms including a single simple beep or tone, periodic or random signals, complex signals (e.g. varying frequencies or volumes), recorded messages (e.g. voice or music), artificially generated speech, or any combination of the aforesaid, etc.
  • 'determining' refers to the actions of detecting, converting and interpreting audible messages. Determination is the result of the cumulative actions required to ascertain meaningful information from the audible output. Those actions for example, could include comparing detected audible patterns to reference values or parameters that correlate to meaning (e.g. an audible pattern that is characteristic of a letter, number or word, or a specific alarm pattern). Verification is a subset of determination and includes the actions of comparing the actual audible message to an intended message.
  • Verification could involve comparing a 'digital fingerprint' of an intended audible message (e.g., a prerecorded or digitally generated message) to a digital fingerprint of the actual audible message projected by the intelligent audible device.
  • the digital fingerprint of the actual audible message could be generated by detecting the actual audible message (the one projected) and converting it into a digital fingerprint.
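The fingerprint comparison described above can be sketched by reducing a waveform to a compact per-segment energy signature, so that verification compares two short vectors rather than raw audio. The segmentation scheme and names are illustrative assumptions, not the patent's specific fingerprinting method.

```python
# Sketch of a digital fingerprint: normalized per-segment energy of the
# waveform, quantized into a short tuple. The intended message and the
# captured message are fingerprinted the same way and then compared.

def fingerprint(samples, segments=4):
    """Quantize per-segment energy of a waveform into a short signature."""
    n = len(samples) // segments
    energy = [sum(s * s for s in samples[i * n:(i + 1) * n])
              for i in range(segments)]
    peak = max(energy) or 1.0
    return tuple(round(e / peak, 2) for e in energy)  # normalized, compact

tone = [1.0, -1.0] * 8 + [0.0] * 16      # a short beep followed by silence
assert fingerprint(tone) == fingerprint(list(tone))   # same sound matches
assert fingerprint(tone) != fingerprint([0.0] * 32)   # silence does not
```

Because the fingerprint is far smaller than the recording, the comparison can run on the low-power processor of the label itself.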
  • Circuitry in an intelligent audible device initiates the generation
  • exemplary events are changes in (or exceptions to set thresholds of) monitored environmental or internal conditions, mechanical action, detected sound, location, time (elapsed or absolute), etc. They may include interactions between the intelligent audible device and 'stakeholders', that is, anyone that has an interest in the good or service. Further, they may include events associated with the determination of past and present (concurrent) audible messages as well as conditions relating to their perceptibility (e.g., ambient noise, reflected noise, etc.). They may also include internal events, such as tampering, malfunction, and loss of power.
  • Exemplary intelligent audible devices are intelligent labels and hardware agents such as those described in the U.S. patent applications 14/479,055; 15/228,270; and 15/602,885; all of which are incorporated herein by reference as if set forth in their entirety. Intelligent audible devices, however, may take many different forms.
  • the intelligent audible device is in the form of an intelligent label 1203.
  • the intelligent label 1203 may generally be constructed as set forth in US patent application number 14/479,055. It will be understood that the intelligent label 1203 may take other forms, such as being constructed in the form of a hardware agent as set forth in US patent application number 15/602,885.
  • the intelligent label 1203 has a processor 1205 that controls and operates the functionality within the intelligent label 1203.
  • the processor 1205 typically has a clock 1207 for managing processor functions, as well as providing timing functions that can provide actual and elapsed time.
  • the processor also has storage 1211, which may have a combination of erasable and non-erasable memory.
  • the intelligent label 1203 also has a power source 1213. This power source often will be in the form of a battery, however it will be understood that other sources such as solar photovoltaic cells or RF harvesting circuitry may be used.
  • the intelligent label 1203 also has a message generator 1222.
  • the message generator 1222 is constructed to form messages that are intended to communicate particular information to a listener. For example, the message may be a simple alarm tone, or may be a more complicated human understandable instruction. It will be understood that the message may take many forms.
  • the message generator 1222 may make a message signal for immediate communication, or the messages may be stored locally for use at a later time. In this way, the message generator 1222 may have its own memory, or may use storage 1211 of the processor 1205.
  • An event generator 1224 is constructed to provide a signal upon the occurrence of a particular event. That event can take many forms, such as events internal to the intelligent label 1203, such as an actual or elapsed time, or the event may be external to the intelligent label 1203, such as temperature, location, or shock. The event may also be receiving a message from another device from a wired or wireless connection. It will be understood that the event can take many forms.
  • the event generator may have a set of rules for determining when an event signal is to be sent. Once the event generator has determined that an event has occurred, the event generator generates a signal that causes the message generator 1222 to cause the audio output transducer 1237 to project the message into the local environment.
  • the audio output transducer can take many forms, such as a sound speaker, piezo-electric device, buzzer or other electro-acoustic device. It will be further understood that the message signal to be projected through the audio output transducer may be adjusted according to
  • the event signal is typically used to activate the audio input transducer 1241. It will be understood that in some cases the audio input transducer 1241 may be
  • the audio input transducer 1241 may be for example a microphone in which case it only needs to be active during the time that the intended message is expected to be announced.
  • the audio input transducer may be in close proximity to the audio output transducer, or may be spaced apart. It may consist of a single audio input transducer or multiple audio input transducers, e.g. one directed away from the audio output transducer to detect ambient noise, one directed toward the audio output transducer, and possibly a third to detect audio reflected from the audio output transducer.
  • the audio input transducer may be an array of audio input transducers.
  • the sound captured by the audio input transducer 1241 will typically undergo an analog to digital conversion, and the digitized sound signal is then transferred to a message determinator 1243.
  • the message determinator processes and analyzes the captured sound, and compares it to the intended message.
  • the message determinator 1243 is thereby able to provide an indication of whether or not the intended message was properly projected from the audio output transducer 1237.
  • the communication circuit may provide for wireless communication.
  • the wireless communication would be an RFID or NFC communication radio. It will be understood that the communication circuitry 1239 can take many forms.
  • Intelligent audible device 1250 is similar to intelligent audible device 1200 described above.
  • the circuitry described with reference to intelligent audible device 1200 will not be described again, and only additional circuitry and functionality will be discussed with reference to intelligent audible device 1250.
  • the intelligent audible device 1250 is again illustrated as an intelligent label 1255.
  • the intelligent label 1255 is illustrated showing that the event generator 1224 may have one or more sensors 1226 for sensing environmental conditions. By way of example, these conditions may be temperature, shock, vibration, humidity, lighting conditions, or any other environmental condition. These sensors may be continuously active, or may be activated by the processor at particular times.
  • the event generator is constructed to monitor the sensor 1226 and monitor for the presence of a particular condition, or that a condition has exceeded a predefined threshold. When that happens, the event generator generates an event signal.
  • the event generator may also have its own clock 1228 for providing actual or elapsed time.
  • the event generator 1224 may also have location sensing circuitry 1234, such as a GPS receiver, for determining a particular location. Accordingly, upon an actual or elapsed time, or being in a particular location, the event generator may generate an event signal.
  • the event generator 1224 may also have its own communication circuitry 1232. This communication circuitry 1232 may be a wired connection, or may be a wireless connection such as an RFID or NFC radio. Accordingly, upon receiving a message from the communication circuitry, the event generator may generate an event signal.
  • the intelligent label 1255 is also illustrated with the message determinator 1243 having separate circuitry 1245 for performing message verification.
  • a verification may be performed on the actual intended message as compared to the actual captured sound. However, this may require considerable power and processing capability.
  • the intended message is analyzed and a fingerprint, profile, or signature of that intended message is generated. Typically, the fingerprint will be substantially smaller than the actual intended audible message.
  • the message verification 1245 may then be efficiently performed by comparing the intended fingerprint to the actual fingerprint. Further, the message verification 1245 may additionally provide a confidence value that indicates the closeness in fit between the intended message and the captured message.
  • FIG. 45 another intelligent audible device 1275 is illustrated.
  • the intelligent audible device 1275 is similar to intelligent audible device 1250 described above, so only additional circuitry will be described.
  • Intelligent audible device 1275 again is in the form of an intelligent label 1277.
  • the intelligent label 1277 may have a visual display 1281, this display may be, for example, an LED, LCD, electrophoretic display or other type of display as previously described. It will be understood that the intelligent label may also have visual verification circuitry and function as described previously.
  • the intelligent label 1277 may also have an actuator 1283.
  • the actuator is a device which allows the intelligent label 1277 to operate in a very low power state until a particular action has been taken. On that action, such as pulling a tab and breaking a circuit, or completing a seal and closing a circuit, the processor and the other circuitry on the intelligent label 1277 may be placed in a power-on or activated state.
  • message determinator circuitry 1300 is described.
  • the message determinator circuitry 1300 is illustrated on an intelligent label device 1305. In this way, the message determinator circuitry 1300 is similar to the message determinator 1243 described above.
  • the determinator circuitry 1300 has detection circuitry 1308.
  • This detection circuitry is used to capture the actual sound that was projected from the intelligent label's audio output transducer.
  • This detection circuitry may include an audio input transducer or other type of sound transducer. It may also detect velocity and acceleration of sound waves, or may be an electrical signal sensor. The electrical signal sensor would be used to detect the change in voltage, current, or capacitance of a device that is acting responsive to the sound waves.
  • the detection circuitry 1308 will create an analog sample of the sound that is then passed to the conversion circuitry 1312.
  • Conversion circuitry 1312 generally will convert the analog sample to a digital sample. Although this digital sample could be used directly for comparison, it is more efficient to convert the digital sample into a more compact form.
  • the conversion circuitry 1312 may convert the digital sample into a digital profile, digital fingerprint, or digital signature that represents the digital sample, but is in a far easier to use form.
  • the conversion circuitry 1312 can perform an FFT analysis on the digital sample, which can then be used as a very simple digital profile. It will be understood that many types of digital profiles or other analyses can be used to create a shorthand for the actual digital sample.
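As a rough illustration of this kind of profile, the sketch below (plain Python, with a naive DFT standing in for an optimized FFT; all names are illustrative and not taken from the specification) reduces a block of samples to a short vector of normalized frequency-bin magnitudes:

```python
import math

def dft_magnitudes(samples, n_bins=8):
    """Naive DFT: magnitudes of the first n_bins frequency bins.
    A hand-rolled stand-in for the FFT analysis the conversion
    circuitry might perform; a real device would use an optimized FFT."""
    n = len(samples)
    mags = []
    for k in range(n_bins):
        re = sum(s * math.cos(2 * math.pi * k * i / n) for i, s in enumerate(samples))
        im = sum(-s * math.sin(2 * math.pi * k * i / n) for i, s in enumerate(samples))
        mags.append(math.hypot(re, im))
    return mags

def fingerprint(samples, n_bins=8):
    """Compact digital profile: per-bin magnitudes normalized so the
    strongest bin is 1.0, rounded to keep the profile small."""
    mags = dft_magnitudes(samples, n_bins)
    peak = max(mags) or 1.0
    return [round(m / peak, 3) for m in mags]

# A pure tone with a 16-sample period over 64 samples: 4 full cycles,
# so the energy lands in bin 4 of the profile.
tone = [math.sin(2 * math.pi * i / 16) for i in range(64)]
fp = fingerprint(tone)
```

A real implementation would use larger windows and more bins, but the output has the same character: a short list of numbers that is far cheaper to store and compare than the raw digital sample.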
  • the interpretation circuitry 1317 will be able to use information regarding the start and stop times of the sound, as well as the duration of the sound.
  • the digital sample and the digital fingerprint may then be used by interpretation circuitry 1317 to extract higher order meaning from the captured sound.
  • the interpretation circuitry 1317 may look for particular patterns, variations, frequency changes, cadence changes, or other features that may indicate that the intended message was projected.
  • Verification circuitry 1321 is also used to compare the fingerprint of the captured sound to a fingerprint of the intended sound. This intended sound fingerprint could be generated at the time the message is played; more likely, however, the intended fingerprint would be determined beforehand and stored for later use. Accordingly, the verification circuitry 1321 can use this stored fingerprint and directly compare it to the captured fingerprint. In doing so, the verification circuitry 1321 would correlate the fingerprints by aligning start times, stop times, durations, or embedded marks.
  • the comparison can then indicate whether or not the intended message was actually projected.
  • the verification circuitry 1321 can provide a confidence value, a numerical indication of the closeness of fit between the actual fingerprint and the intended fingerprint. The verification circuitry may thereby require that the confidence value exceed a predefined threshold before it is determined that the captured message perceptibly matched the intended message. Depending on the confidence value, the verification circuitry may optionally repeat the sequence or initiate another action.
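One plausible realization of such a confidence value is a normalized correlation between the two fingerprints, gated by a predefined threshold. The sketch below makes that concrete; the function names and the 0.9 threshold are assumptions for illustration, not values from the specification:

```python
def confidence(actual_fp, intended_fp):
    """Closeness of fit between two equal-length fingerprints,
    expressed as a normalized correlation in [0, 1]
    (1.0 means an identical spectral shape)."""
    dot = sum(a * b for a, b in zip(actual_fp, intended_fp))
    na = sum(a * a for a in actual_fp) ** 0.5
    nb = sum(b * b for b in intended_fp) ** 0.5
    if na == 0 or nb == 0:
        return 0.0
    return max(0.0, dot / (na * nb))

def verify(actual_fp, intended_fp, threshold=0.9):
    """True only when the confidence value exceeds the threshold."""
    return confidence(actual_fp, intended_fp) > threshold

intended = [0.0, 1.0, 0.4, 0.1]
clean    = [0.0, 0.95, 0.42, 0.12]   # captured in a quiet room
noisy    = [0.7, 0.3, 0.6, 0.8]      # heavy background noise
```

With these values, `verify(clean, intended)` passes while `verify(noisy, intended)` fails, mirroring the pass/fail decision the verification circuitry makes before optionally repeating the sequence.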
  • Determinator circuitry 1325 is similar to determinator circuitry 1300 described with reference to Figure 46, so only the differences will be described.
  • input signals 1327 are provided to detection circuitry 1328, conversion circuitry 1331, interpretation circuitry 1333, and verification circuitry 1336. It will be appreciated that the input signals 1327 may be provided to fewer than all of the detection, conversion, interpretation, and verification circuitries.
  • the input signals may be used to adjust the way the determinator circuitry performs its functions depending upon external causes. For example, a sensor may be used, such as the microphone, to detect background ambient noise. Such noise may initiate an event or affect the way the detection, conversion, and interpretation processes may need to be performed.
  • Method 1350 for providing an intelligent audible device.
  • Method 1350 has a first portion 1352, which comprises processes performed before the sound or message is projected into the local environment.
  • Method 1350 also has a second portion 1353, which is performed as or after the sound is being projected into the local environment.
  • Method 1350 starts by generating an intended audible message 1355. This message may be a simple alarm, or may have a higher order meaning, such as language to be understood by a human. It will also be understood that the audible message may be at ranges not intended for human hearing, but for machine or animal hearing. The audible message is then processed as shown in block 1357.
  • the purpose of the processing is to anticipate the actual environment in which the sound will be generated, thereby allowing a certain level of predictability as to what the captured sound should sound like.
  • the message may also be calibrated according to the specific type of hardware that is used on the intelligent audible device, such as the particular audio output transducer and the particular audio input transducer.
  • the intended message may also be processed according to a particular environment, such as anticipated level of background noise.
  • noise cancellation schemes, including passive (e.g. analog or digital filtering) or adaptive (active) noise suppression or cancellation, may be employed, and may incorporate more than one audible input transducer (e.g. an array).
  • the noise cancellation schemes may be specifically tuned to the set of intended audible messages for the device. It will be understood that there are many ways in which the audible message can be processed to anticipate the actual environment. Once the intended message has been fully processed, it is converted into a digital fingerprint, digital profile, or other type of signature, as illustrated in block 1359. This digital fingerprint can then be stored in the intelligent agent as shown in block 1365. In some cases, it may also be useful to store the intended message and the processed message in the intelligent audible device. It will be understood that the intelligent audible device may take many forms, such as the intelligent label and hardware agent as described previously. [0216] At some time an event will occur, either internal or external to the intelligent audible device.
  • When that event happens, the intelligent audible device will cause the intended audible message to be projected into the local environment through an audio output transducer, as shown in block 1367. Accordingly, concurrent with activating the audio output transducer, the intelligent audible device will activate its audio input transducer or capture device to capture the actual audio that is being projected into the environment. The captured audio is used to generate a fingerprint of the actual audio, as illustrated in block 1371.
  • the process for creating the actual fingerprint is similar to the process for generating the intended fingerprint and is described with reference to block 1359.
  • the actual fingerprint may then be compared to the stored intended fingerprint to determine whether or not the actual message was projected, as shown in block 1373. Depending upon the level of correlation or closeness of fit between the actual fingerprint and the intended fingerprint, it may be determined whether or not the intended message was perceptibly projected into the actual environment.
  • Method 1400 has a first part 1402 which is used prior to projecting the sound into the actual environment, and a second part 1403 which is used as or after the sound has been projected.
  • intelligent audible device generates an intended audible message as shown in block 1405.
  • This message may be intended for human perception in the form of an alarm, speech, or message; it may also be intended for non-humans, such as animals, or for a machine. It will be understood that the intended audible message may take a wide variety of forms.
  • the intended audible message is then processed as shown in block 1407.
  • the message is processed according to the specific hardware used for generating and capturing the message, such as the speaker and the microphone. It can also be processed for the specific environment that the audible message will be projected in.
  • a fingerprint, signature, or profile is generated as illustrated in block 1409.
  • a set of reference characteristics 1412 may be used. These reference characteristics may, for example, restrict the fingerprint to only certain frequencies or amplitudes, or may set the particular sampling and algorithmic processes used for generating the fingerprint. In some cases, the reference characteristics may include inaudible sound, such as an inaudible watermark or steganographic mark.
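The role of the reference characteristics can be sketched as a small configuration applied while reducing raw bin magnitudes to a fingerprint. The `bins` and `quant_step` keys below are hypothetical, chosen only to illustrate "using only certain frequencies" and fixing the algorithmic process so that the intended and actual fingerprints are generated the same way:

```python
def fingerprint_with_refs(bin_magnitudes, ref):
    """Apply reference characteristics while reducing raw frequency-bin
    magnitudes to a fingerprint: keep only the selected bins, normalize
    to the loudest kept bin, and quantize to a fixed step."""
    kept = [bin_magnitudes[i] for i in ref["bins"]]
    peak = max(kept) or 1.0
    step = ref["quant_step"]
    return [round((m / peak) / step) * step for m in kept]

# Hypothetical reference characteristics: ignore out-of-band bins 0 and 4,
# quantize the profile in steps of 0.25.
refs = {"bins": [1, 2, 3], "quant_step": 0.25}
raw = [9.0, 1.0, 4.0, 2.0, 7.0]   # bins 0 and 4 hold out-of-band noise
fp = fingerprint_with_refs(raw, refs)
```

Because the same `refs` are stored and reused when the actual sound is captured, out-of-band noise never enters the comparison at all.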
  • Once the fingerprint has been generated, it may be stored in the intelligent audible device as shown in block 1415. The intended message, reference characteristics, and a confidence threshold may also be stored.
  • the sound will be generated and projected into the actual environment as shown in block 1427.
  • the event can be anything from an internal clock to sensing an external event.
  • the intelligent audible device will activate its audio input transducer or capture device to capture the actual audible message as shown in block 1427.
  • This captured audio is then used to generate an actual fingerprint shown in block 1431.
  • the same analytic processes used for creating the intended fingerprint are used, and in most cases, the reference characteristics 1412 that were used to generate the intended fingerprint are also used to generate the actual fingerprint 1431.
  • the intended fingerprint is then compared to the actual fingerprint to generate a confidence value as shown in block 1433.
  • This confidence value is a numeric indication of the closeness of fit between the intended fingerprint and the actual fingerprint.
  • the numeric confidence value is compared to the predefined confidence threshold. If the confidence value exceeds the confidence threshold, then it is determined that the message that was actually projected was the intended message, as illustrated in block 1437.
  • method 1450 is illustrated.
  • the process as described with reference to Figure 49 has created a confidence value, and in block 1452 that confidence value is being compared to a predefined confidence threshold. If the confidence value is below the confidence threshold, then it is determined that the intended message has not been projected into the environment 1453. However, if the confidence value is at or above the confidence threshold, then it is determined that the intended message has been projected into the environment 1468. In the case when the intended message has not been properly or perceptibly projected 1453, then the system may store certain information for later use, such as the actual captured sound, the captured digital signature, and the confidence value, as shown in block 1455.
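The branching of method 1450 can be sketched as follows. The `device` dictionary is a hypothetical stand-in for the intelligent audible device's state, and the block numbers in the comments refer to the figure described above:

```python
def handle_verification(confidence_value, threshold, device):
    """Branch of method 1450: log the result for later use, then on
    failure raise an alarm and update the display; on success update
    the display to show the sound was properly projected."""
    projected = confidence_value >= threshold     # block 1452 comparison
    device["log"].append({"confidence": confidence_value,
                          "projected": projected})  # blocks 1455 / 1470
    if not projected:
        device["alarm"] = True                       # block 1457
        device["display"] = "MESSAGE NOT PROJECTED"  # block 1462
    else:
        device["display"] = "MESSAGE PROJECTED"
    return projected

device = {"log": [], "alarm": False, "display": ""}
ok = handle_verification(0.55, 0.9, device)   # below threshold: failure path
```

A fuller implementation would also communicate to a remote location (block 1459) and adjust the determination process for the next projection (blocks 1464 and 1466); those steps are omitted here to keep the sketch short.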
  • the system may also set off an alarm as shown in block 1457, or communicate a message to a remote location as shown in block 1459.
  • This communication may be immediate through a wired or wireless communication circuitry, or may be stored and communicated at a later time.
  • the intelligent audible device may have a built-in visual display which may be changed to visually indicate that the audible message was not projected properly, as shown in block 1462.
  • the alarm may be activated if the visual display failed verification.
  • the method 1450 may adjust the determination process as shown in block 1464. In this way, the next time a sound is to be projected, the system may be adjusted for improved projection of the intended message, as shown in block 1464.
  • block 1466 shows that the system can adjust the actual message according to the reasons that the intended message was not properly projected. For example, if the system determines that the environment is unusually noisy, then the message may be lengthened or repeated, or enhanced to give the listener a better opportunity to hear the message.
  • the intelligent audible device again may store the captured sound, the captured fingerprint, and the confidence measure for later use, as shown in block 1470.
  • an alarm may be set off as shown in block 1472, or a message may be communicated as illustrated in block 1473. Additionally, if a display is present, the display may be updated to show that the particular sound has been properly projected.
  • Intelligent audio devices may dynamically optimize, localize, or otherwise modify audible messages to facilitate detection, conversion, and interpretation of audible messages (e.g., in response to monitored environments). Intelligent audio devices may insert or combine audible messages with audible and inaudible steganographic marks and watermarks (1) prior to or concurrent with projection of an audible message (e.g. a prerecorded sound file), (2) during generation of an audible message (dynamic insertion), or (3) after detection of an audible message (e.g. to uniquely identify the intelligent audio device, date/time, location, etc. where generated or detected).
  • Intelligent audio devices may include circuitry and devices to detect, convert, and interpret the presence of items proximate the intelligent audible device that interfere with the acoustic path, and thus the perceptibility or detectability of audible messages. Such circuitry may include, but is not limited to, sensor(s) (e.g. light) and, as appropriate to the method, a signal generator (e.g. optic, acoustic, etc.) and rangefinders.
  • intelligent audio devices may include circuitry for detecting relative motion between the intelligent audio device and proximate items (or
  • Intelligent audio devices may contain memory and logic configured appropriately for specific stakeholders to set one or more audible messages (e.g. stored in memory, often immutable once set). Intelligent audio devices may contain logic (optionally immutable) to select from a database of stored audible messages in response to different events. Intelligent audio devices may also contain logic and communication capability to retrieve audible messages via live communication with a remote source.
  • Audible messages may be compressed digital sound files or text files (for synthesized output). Audible messages may be dynamically altered/adapted in anticipation of, or in response to, events and monitored conditions (e.g. intensity, duration, frequency, pattern, etc.).
  • the functions of an intelligent audio device e.g. generation, detection, conversion and interpretation of audible messages, are
  • Detection periods are advantageously initiated to precede the audible period or to span the audible period. They may also be periodic, random or follow set sequences. Elapsed, relative or absolute time, are advantageous in determining: the length of time the intended audible message was actually projected, the time when the audible message was actually projected, and the period of time when the audible message was perceptible.
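The timing quantities above can be derived from four timestamps. A minimal sketch (relative times in seconds; all names are assumed for illustration):

```python
def projection_times(detect_start, audible_start, audible_stop, detect_stop):
    """Derive the quantities named above from four timestamps:
    how long the message was actually projected, when projection
    started, and whether the detection period preceded and spanned
    the audible period."""
    return {
        "projected_for": audible_stop - audible_start,
        "started_at": audible_start,
        "spanned": detect_start <= audible_start and detect_stop >= audible_stop,
    }

# Detection opens at t=0.0, the message plays from 0.2 s to 2.2 s,
# and detection closes at 2.5 s, so the audible period is fully spanned.
t = projection_times(0.0, 0.2, 2.2, 2.5)
```

Opening detection slightly before the message starts, as in this example, also captures a short stretch of pure background noise that later stages can use as a noise reference.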
  • Detection may occur at predetermined or random times. Detection may also be dynamic (e.g. occur in response to 'event's such as those previously described). Detection may be synchronized with the start/initiation of the audible message.
  • the intelligent audio device's actions of projecting and determining audible messages may be concurrent (e.g. both triggered by the same event); however, it is often advantageous to space them temporally. In one embodiment, for example, an event triggers detection, followed by generation of an audible message, followed by termination of the audible message, followed by termination of detection.
  • sensors may be used to monitor the intelligent audio device's acoustic environment and adapt the projecting of the audible message (e.g. increase volume) or aid in the detection process (e.g. noise cancellation).
  • the interpretation process may be used to provide internal feedback to improve the generation and detection of audible messages. It may also be used as a "learning system", either locally/internally or remotely with results from multiple intelligent audio devices.
  • a multitude of intelligent audio devices may also be grouped or nested depending on the application.
  • the intelligent audio devices could have the same configuration and capabilities, but may also be different (e.g., they may have different types of sensors and be able to react to different types of triggers).
  • Some or all of the grouped intelligent audio devices could, for instance, respond to secondary events (yielding secondary projected messages) by re-projecting the message of a specific intelligent audio device, or another specific message, in response to a triggered primary event occurring in the specific intelligent audio device.
  • the secondary projected messages could be optimized for the particular environment (with knowledge of the intended [primary] projected message through a network).
  • an audio file is created containing an audible message (audible message file). Note that the steps below would typically be taken using an appropriately configured external device and application(s); however, they could also be taken by an appropriately configured intelligent audio device. 1. An "intended" audible message is generated and digitally processed to enhance perception and determination, e.g. for personality (voice, language, sound, etc.) and perception (e.g. taking into account psychoacoustics).
  • the intended audible message and the digital reference fingerprint of the intended audible message are set into the intelligent audio device.
  • a "confidence index” parameters corresponding to the level of 'confidence' in the perceptibility of the audible message based on a comparison between the intended audible message and the actual audible message (as determined using their respective digital
  • In response to an event as determined by the intelligent audio device, the intelligent audio device:
  • A. Detects the characteristics of the audible message used to create the reference digital fingerprint (e.g. amplitude of the sound at a specified frequency and time)
  • C. Interprets the actual digital fingerprint by comparing it to the reference digital fingerprint and generates a corresponding value or set of values. E. Those values are then compared to the confidence index, and a determination is made as to whether the intended audible message was perceptibly projected.
  • the intelligent audio device may take a variety of actions, including, for example, storing the result for later access or generating an alarm (visible, audible, or wireless signal).
  • the intelligent audio device processor, in conjunction with an appropriate confidence index, may also include measures of proximity to items that influence the perceptibility of the actual audible message. While audible output is the typical output of interest, in certain applications inaudible output generation and detection are desirable (e.g. outside the human range): silent alarms, ranges for dogs or machines, higher sensitivity, or optimized total power. Concurrent audible messages of different frequency ranges may also be advantageous, e.g. one human perceptible and another machine perceptible, or one human and one animal perceptible. It will be understood that the audible message may be optimized by emphasizing frequencies that are known to be of high perception value, selecting frequencies with a higher likelihood of being perceived in a noisy environment, or for perception in a particular language.
  • Noise cancellation may advantageously be done once the message has been positively confirmed and its intended reference audio stream has been subtracted from the captured message.
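A minimal sketch of that subtraction, assuming the captured stream has already been aligned to the intended reference at a known sample offset (all names illustrative):

```python
def residual_noise(captured, reference, offset):
    """Once the message is positively confirmed, subtract the intended
    reference audio stream (aligned at `offset` samples into the
    capture) from the captured stream, leaving an estimate of the
    ambient noise that accompanied the message."""
    noise = list(captured)
    for i, r in enumerate(reference):
        j = offset + i
        if 0 <= j < len(noise):
            noise[j] -= r
    return noise

# Captured audio = constant ambient noise plus the reference tone
# starting one sample in.
reference = [0.5, -0.5, 0.5, -0.5]
ambient = [0.1] * 6
captured = list(ambient)
for i, r in enumerate(reference):
    captured[1 + i] += r

noise = residual_noise(captured, reference, offset=1)
```

The residual can then feed back into the adaptive schemes described earlier, for example to retune filters or raise the projection volume the next time a message is played.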
  • Applying advanced sequence models such as Hidden Markov and neural network based models may provide advantageous results.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Computer Security & Cryptography (AREA)
  • Control Of Indicators Other Than Cathode Ray Tubes (AREA)
EP17837713.1A 2016-08-03 2017-08-03 Verifizierung von durch eine intelligente hörbare vorrichtung projizierten nachrichten Withdrawn EP3494518A4 (de)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201662370376P 2016-08-03 2016-08-03
PCT/US2017/045354 WO2018027066A1 (en) 2016-08-03 2017-08-03 Verifying messages projected from an intelligent audible device

Publications (2)

Publication Number Publication Date
EP3494518A1 true EP3494518A1 (de) 2019-06-12
EP3494518A4 EP3494518A4 (de) 2020-04-01

Family

ID=61074156

Family Applications (1)

Application Number Title Priority Date Filing Date
EP17837713.1A Withdrawn EP3494518A4 (de) 2016-08-03 2017-08-03 Verifizierung von durch eine intelligente hörbare vorrichtung projizierten nachrichten

Country Status (2)

Country Link
EP (1) EP3494518A4 (de)
WO (1) WO2018027066A1 (de)

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7174293B2 (en) * 1999-09-21 2007-02-06 Iceberg Industries Llc Audio identification system and method
US7360689B2 (en) * 2001-07-10 2008-04-22 American Express Travel Related Services Company, Inc. Method and system for proffering multiple biometrics for use with a FOB
US6982640B2 (en) * 2002-11-21 2006-01-03 Kimberly-Clark Worldwide, Inc. RFID system and method for tracking food freshness
US7222072B2 (en) * 2003-02-13 2007-05-22 Sbc Properties, L.P. Bio-phonetic multi-phrase speaker identity verification
US9471862B2 (en) * 2010-12-30 2016-10-18 Chromera, Inc. Intelligent label device and method
EP2207367A1 (de) * 2009-01-08 2010-07-14 Schweizer Electronic M2S AG Lautsprechergerät und entsprechendes Verfahren zu dessen Verwendung
US20110050397A1 (en) * 2009-08-28 2011-03-03 Cova Nicholas D System for generating supply chain management statistics from asset tracking data
DE202010001322U1 (de) * 2009-11-23 2010-06-24 Pfannenberg Gmbh Vorrichtung zur Überwachung eines Schallgebers, insbesondere eines Alarmschallgebers, und ein entsprechender Schallgeber
WO2014030027A1 (en) * 2012-08-24 2014-02-27 Freescale Semiconductor, Inc. Audio unit and method for generating a safety critical audio signal

Also Published As

Publication number Publication date
WO2018027066A1 (en) 2018-02-08
EP3494518A4 (de) 2020-04-01

Similar Documents

Publication Publication Date Title
US11410585B2 (en) Optically determining messages on a display
US11204331B2 (en) Optically determining the condition of goods
US10468053B2 (en) Verifying messages projected from an intelligent audible device
TWI475445B (zh) 用於使用陰影及反射模式的觸碰感應之方法及觸控面板系統
EP2810147B1 (de) Ultraschall-berührungssensor mit einem anzeigebildschirm
TWI254255B (en) Image read out device
US10147098B2 (en) Symbol verification for an intelligent label device
EP2336857B1 (de) Verfahren zur Ansteuerung einer Anzeigevorrichtung mit Berührungsbildschirm sowie Anzeigevorrichtung und Computerprogrammprodukt zur Durchführung des Verfahrens
US10317335B2 (en) Reflective tag and polarized light sensor for transmitting information
CN101281445B (zh) 显示装置
KR20200085401A (ko) 지문 센서 및 이를 포함하는 표시 장치
US11436901B2 (en) Verifying messages projected from an intelligent audible device
TW201319897A (zh) 位置輸入系統及方法
JP2011003184A (ja) タッチ式入力装置及び該タッチ式入力装置を備えた電子装置
EP3494518A1 (de) Verifizierung von durch eine intelligente hörbare vorrichtung projizierten nachrichten
WO2018136464A1 (en) Optically determining the condition of goods
CN106662936B (zh) 显示器上的位置输入
CN209707880U (zh) 光学检测装置、背光模组、显示装置、和电子设备
JP2012083592A (ja) 光センサ内蔵表示装置
US11475462B2 (en) Symbol verification for an intelligent label device
US10891447B2 (en) Symbol verification for an intelligent label device
EP3465604A1 (de) Symbolverifikation für eine intelligente etikettenvorrichtung

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20190301

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)
A4 Supplementary search report drawn up and despatched

Effective date: 20200303

RIC1 Information provided on ipc code assigned before grant

Ipc: G08B 3/10 20060101ALI20200226BHEP

Ipc: G06K 19/07 20060101ALI20200226BHEP

Ipc: G06K 19/06 20060101AFI20200226BHEP

Ipc: G08B 29/12 20060101ALI20200226BHEP

Ipc: G08B 3/00 20060101ALI20200226BHEP

Ipc: G06K 19/077 20060101ALI20200226BHEP

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20201001