WO2012018092A1 - Area sensor and display device - Google Patents

Area sensor and display device

Info

Publication number
WO2012018092A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
light receiving
light
receiving element
parallax
Prior art date
Application number
PCT/JP2011/067895
Other languages
English (en)
Japanese (ja)
Inventor
伸一 宮崎 (Shinichi Miyazaki)
健吾 高濱 (Kengo Takahama)
Original Assignee
シャープ株式会社 (Sharp Corporation)
Priority date
Filing date
Publication date
Application filed by シャープ株式会社 (Sharp Corporation)
Publication of WO2012018092A1 publication Critical patent/WO2012018092A1/fr

Classifications

    • H ELECTRICITY
    • H01 ELECTRIC ELEMENTS
    • H01L SEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
    • H01L31/02325 Optical elements or arrangements associated with the device, the optical elements not being integrated nor being directly associated with the device
    • H01L31/105 Devices sensitive to infrared, visible or ultraviolet radiation, characterised by only one potential barrier of the PIN type
    • H01L27/14605 Imager structures: structural or functional details relating to the position of the pixel elements, e.g. smaller pixel elements in the center of the imager compared to pixel elements at the periphery
    • H01L27/14623 Imager structures: coatings for optical shielding
    • H01L27/14649 Imager structures: photodiode arrays and MOS imagers; infrared imagers
    • H01L27/14678 Imager structures: contact-type imagers
    • H01L27/14692 Imager structures: thin film technologies, e.g. amorphous, poly, micro- or nanocrystalline silicon
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/042 Digitisers, e.g. for touch screens or touch pads, characterised by opto-electronic transducing means

Definitions

  • The present invention relates to an area sensor provided with light receiving elements (light sensors) and to a display device provided with such an area sensor.
  • Touch panel integrated display devices have been developed, such as liquid crystal display devices with an area sensor function that can detect the touched position when the surface of the display device is touched with an input pen or a human finger.
  • In conventional touch panel integrated display devices, the resistive film method (in which the input position is detected by contact between an upper conductive substrate and a lower conductive substrate when the surface is pressed) and the capacitive method (in which the input position is detected from the change in capacitance at the touched area) have been the mainstream.
  • However, such touch panel integrated display devices have disadvantages such as increased thickness and the need for a separate manufacturing process.
  • To address this, a touch panel integrated liquid crystal display device has been developed in which a light receiving element (light sensor) is provided for each pixel (or for each group of a plurality of pixels) in the image display region of the liquid crystal display device. Since the liquid crystal display device already provides a pixel transistor for each pixel, the light receiving element can be formed between the two substrates in the same manufacturing process.
  • Such a display device, which includes light receiving elements and functions as an area sensor that detects an externally input position, is attracting attention.
  • Unlike a resistive or capacitive touch panel, it can detect the position of an object to be detected, such as an input pen or a human finger, even in a floating (non-contact) state, which is difficult to realize with those touch panels. Moreover, the distance of the floating object from the display surface of the display device can also be detected.
  • Patent Document 1 describes an information input device including a light receiving unit composed of photodiodes, capable of obtaining distance information of a target object such as a hand with high resolution and of extracting the three-dimensional shape of the target object.
  • FIG. 21 is a diagram showing a schematic configuration of the information input device.
  • The light emitted from the light emitting means 105 is reflected by the target object 106 and is imaged on the light receiving surface of the reflected light extracting means 108 by a light receiving optical system 107 such as a lens.
  • The reflected light extraction unit 108 includes a first light receiving unit 109, a second light receiving unit 110, and a difference calculation unit 111, and detects the intensity distribution of the light reflected by the target object 106, that is, a reflected light image.
  • The first light receiving means 109 and the second light receiving means 110 receive light at different timings: the timing control unit 112 controls the operation timing so that the light emitting means 105 emits light while the first light receiving means 109 is receiving light and does not emit light while the second light receiving means 110 is receiving light.
  • As a result, the first light receiving means 109 receives both the reflected light from the target object 106 and other external light such as sunlight and illumination light, while the second light receiving means 110 receives only the external light. Since the two reception timings differ but are close to each other, fluctuations in the external light during this interval can be ignored.
  • The difference calculation unit 111 calculates and outputs the difference between the images received by the first light receiving unit 109 and the second light receiving unit 110, and the reflected light extraction means 108 sequentially outputs the reflected light amount of each pixel of the reflected light image. The output from the reflected light extraction means 108 is amplified by the amplifier 113, converted into digital data by the A/D converter 114, and stored in the memory 115. The stored data is read out at an appropriate timing and processed by the feature information generation means 116; these controls are performed by the timing control means 112.
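The ambient-light cancellation performed by the difference calculation unit 111 can be sketched in a few lines. This is an illustrative model only, not the patent's circuitry; the function and array names are hypothetical, and images are assumed to be lists of pixel rows.

```python
# Ambient-light cancellation by frame differencing: frame_lit is captured
# while the emitter is on (reflected + ambient light), frame_dark while it
# is off (ambient only). Subtracting isolates the reflected-light image.

def reflected_light_image(frame_lit, frame_dark):
    """Per-pixel difference, clipped at zero to suppress noise-driven negatives."""
    return [[max(lit - dark, 0) for lit, dark in zip(row_l, row_d)]
            for row_l, row_d in zip(frame_lit, frame_dark)]
```

To first order this leaves only the reflected-light component, provided the external light is stable between the two closely spaced exposures, which is exactly the assumption stated above.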
  • The reflected light from the target object 106 decreases sharply as the distance between the target object 106 and the first light receiving means 109 increases: the amount of received light per pixel of the reflected light image is inversely proportional to the square of that distance. Therefore, when the target object 106 is placed in front of the information input device, the reflected light from the background becomes small enough to be ignored, and a reflected light image of the target object 106 alone can be obtained.
  • For example, when a hand is held out, a reflected light image of the hand is obtained. The value of each pixel of the reflected light image is the amount of reflected light received by the unit light receiving section corresponding to that pixel.
  • The amount of reflected light is affected by the properties of the target object 106 (whether it specularly reflects, scatters, or absorbs light), the orientation of its surface, its distance, and so on. However, when the entire object scatters light uniformly, the amount of reflected light is closely related to the distance to the object. Since a hand is such an object, the reflected light image obtained when a hand is held out reflects the distance of the hand and its inclination (which makes the distance vary locally). By extracting such feature information, various kinds of information can be input and generated.
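Because the received intensity falls off with the square of the distance, a distance estimate can in principle be inverted from a pixel's intensity. The sketch below assumes the simple model I = k / d**2, where the constant k lumps together emitter power and surface reflectance; as the document notes later, k varies between users and materials, which is exactly why per-user calibration is needed.

```python
import math

def relative_distance(intensity, k=1.0):
    """Invert I = k / d**2 to d = sqrt(k / I).  k lumps together emitter
    power and surface reflectance, so the result is only a relative
    distance unless k has been calibrated for the particular user/material."""
    if intensity <= 0:
        raise ValueError("no reflected light received")
    return math.sqrt(k / intensity)
```

For example, with k = 1 a pixel four times as bright corresponds to an object half as far away.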
  • FIG. 22 is a diagram showing a schematic configuration of the display device described in Patent Document 2.
  • The pixels 140 are arranged continuously in a matrix in the vertical and horizontal directions. The display area 120 in each pixel 140 is formed in a rectangular shape, and one set of the optical sensors 130R and 130L is arranged horizontally below the display area 120.
  • The light shielding member 150 is arranged as indicated by hatching in FIG. 22; it has an opening 152 that matches the shape of the display area 120 and a rectangular opening 154 that corresponds to the optical sensors 130R and 130L.
  • The optical sensors 130R and 130L are arranged alternately, in this order from the left in FIG. 22. The opening 154 of the light shielding member 150 is provided so as to straddle part of the optical sensor 130R and part of the optical sensor 130L, and the light shielding member 150 is positioned so that the boundary between the optical sensor 130R and the optical sensor 130L in each pixel 140 coincides with the center line 138 of the rectangular opening 154.
  • When the surroundings are relatively bright and a detected object such as a finger is present in front of the display panel 100 (toward the driver's seat side or the passenger's seat side), the object casts a shadow, that is, a part darker than the background, which is detected by the optical sensors 130R and 130L. When the surroundings are dark, the irradiation light from the backlight (not shown) is reflected by the detected object and enters the optical sensors 130R and 130L, so the object appears brighter than the background.
  • Whether the surroundings are bright or dark, among the portions where the light amount decreases or increases relative to the background, the portion detected by the optical sensors 130R forms the R image and the portion detected by the optical sensors 130L forms the L image.
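The object detection described above (a part darker than the background in bright surroundings, or brighter in dark surroundings) amounts to thresholding the deviation from a background level. A minimal sketch, with hypothetical names and an illustrative threshold:

```python
def object_mask(image, background, delta=20):
    """Mark pixels whose level differs from the background estimate by more
    than delta: darker pixels correspond to a shadow under bright ambient
    light, brighter pixels to a backlight reflection under dark ambient
    light.  `image` is a list of pixel rows; `background` and `delta` would
    need tuning in a real sensor."""
    return [[abs(p - background) > delta for p in row] for row in image]
```

Applying this to the sensor rows of the 130R sensors yields the R image and to the 130L sensors the L image.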
  • FIG. 23 illustrates, with the object to be detected such as a finger represented as a sphere, the case where the object approaches the display panel 100.
  • Patent Document 2 describes that, by using the parallax between the R image and the L image, a display device can be realized that determines that the finger has performed a touch operation on the display panel 100 when the parallax becomes smaller than a preset threshold.
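The touch criterion of Patent Document 2 reduces to a single comparison: as the object approaches the panel, the R/L parallax shrinks, and a touch is declared once it falls below a preset threshold. The threshold value below is illustrative, not taken from the patent:

```python
def is_touching(parallax_px, threshold_px=2.0):
    """Declare a touch when the R/L parallax (in pixels) falls below a
    preset threshold; at contact the parallax is ideally zero.  The
    threshold here is an illustrative placeholder."""
    return abs(parallax_px) < threshold_px
```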
  • Japanese Patent Laid-Open No. 10-177449 (published June 30, 1998); Japanese Patent Laid-Open No. 2009-9098 (published January 15, 2009); Japanese Patent Laid-Open No. 9-27969 (published January 28, 1997)
  • However, in the configuration of Patent Document 1, the estimation of the distance of the target object 106 relies only on the fact that the amount of received light per pixel of the reflected light image is inversely proportional to the square of the distance, that is, only on the reflection intensity of the target object 106.
  • Since the reflection intensity varies with the reflectance of the target object 106, two objects located at the same distance from the first light receiving unit 109 can nevertheless yield different intensities; for a hand, the reflection characteristics differ with skin color, skin thickness, blood circulation, and so on. Therefore, to estimate the distance with high accuracy, the information input device must be optimized (adjusted) every time the user changes, which is a problem.
  • In addition, the configuration of Patent Document 1 cannot easily cope with momentary fluctuations in external light, which also makes the distance difficult to estimate.
  • In Patent Document 2, as shown in FIG. 22, the opening 154 of one light shielding member 150 is provided so as to straddle part of the optical sensor 130R and part of the optical sensor 130L, so that the two sensors in each pixel 140 have different directivity characteristics. The optical sensors 130R and 130L and the light shielding member 150 are provided in stripes at regular intervals so that the boundary between the optical sensor 130R and the optical sensor 130L in each pixel 140 coincides with the center line 138 of the rectangular opening 154.
  • In each pixel 140, because the two photosensors 130R and 130L share one common opening 154 to obtain their different directivity characteristics, the positions at which the photosensors 130R and 130L can be arranged relative to the opening 154 are limited.
  • Moreover, each pixel 140 is provided only with the photosensor 130R and the photosensor 130L, whose two directivity characteristics point in the upper-left and upper-right directions, so only the left-right parallax can be detected from the R and L images. That is, in the configuration of Patent Document 2, the direction in which parallax can be detected is uniaxial.
  • As shown in FIG. 24A, suppose that a plurality of similarly shaped objects 170a, 170b, and 170c (for example, several fingers) are arranged at equal intervals above the display panel 100 of Patent Document 2, and the parallax between the R image and the L image is calculated. When image matching is performed between the R image and the L image, as shown in the drawing, many ghost peaks occur in addition to the peak indicating the maximum correlation value, and the accuracy of the parallax detection is lowered.
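The ghost-peak problem can be reproduced with a one-dimensional correlation: when the scene contains equally spaced similar objects, several shifts score well, so the true parallax peak becomes ambiguous. A minimal sketch (unnormalized sum-of-products correlation; names are hypothetical):

```python
def correlation_peaks(r_sig, l_sig, max_shift):
    """Sum-of-products correlation of two 1-D signals for each candidate
    shift of l_sig; returns one score per shift.  With a periodic signal,
    secondary (ghost) peaks appear at multiples of the object spacing."""
    scores = []
    for s in range(max_shift + 1):
        overlap = len(l_sig) - s
        scores.append(sum(r_sig[i] * l_sig[i + s] for i in range(overlap)))
    return scores

# Three identical "fingers" spaced 4 samples apart:
periodic = [1, 0, 0, 0] * 3
```

For the periodic signal above, `correlation_peaks(periodic, periodic, 8)` yields ghost peaks at shifts 4 and 8, i.e. at multiples of the object spacing, in addition to the main peak at shift 0.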
  • The present invention has been made in view of the above problems, and an object thereof is to provide an area sensor, and a display device using it, that can estimate a distance (height) with higher accuracy and in which light receiving elements having different directivity characteristics can be arranged relatively freely.
  • In order to solve the above problems, the area sensor of the present invention is an area sensor having a surface on which a plurality of light receiving elements for detecting the amount of incident light are formed, wherein the light receiving elements include a plurality of sets of first and second light receiving elements and a plurality of sets of third and fourth light receiving elements.
  • Each of the first to fourth light receiving elements includes its own light receiving portion (the first to fourth light receiving portions, respectively) and its own light shielding member (the first to fourth light shielding members, respectively) formed in a layer above that light receiving portion to block light.
  • In the first light receiving element, so that the light component from a first direction can be received and the light component from a second direction can be shielded, the second direction being opposite to the direction of the reflected light produced when light from the first direction is specularly reflected by the surface, the opening of the first light shielding member is formed shifted by a predetermined amount toward the first direction so as to partially overlap the first light receiving portion in plan view.
  • In the second light receiving element, so that the light component from the second direction can be received and the light component from the first direction can be shielded, the opening of the second light shielding member is formed shifted by a predetermined amount toward the second direction so as to partially overlap the second light receiving portion in plan view.
  • In the third light receiving element, so that the light component from a third direction different from the first and second directions can be received and the light component from a fourth direction can be shielded, the fourth direction being opposite to the direction of the reflected light produced when light from the third direction is specularly reflected by the surface, the opening of the third light shielding member is formed shifted by a predetermined amount toward the third direction so as to partially overlap the third light receiving portion in plan view.
  • In the fourth light receiving element, so that the light component from the fourth direction can be received and the light component from the third direction can be shielded, the opening of the fourth light shielding member is formed shifted by a predetermined amount toward the fourth direction so as to partially overlap the fourth light receiving portion in plan view.
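The way a shifted opening produces directivity can be checked with a small plan-view geometry model. The sketch below is a simplifying assumption, not taken from the patent (which only specifies a "predetermined" shift): the light receiving portion spans [0, width], the opening has the same width and is offset horizontally by `shift`, and rays descending through the gap of height `depth` drift horizontally by depth * tan(angle).

```python
import math

def exposed_fraction(shift, width, depth, angle_deg):
    """Fraction of a light receiving portion of the given width that is
    illuminated through a same-width opening offset by `shift`, with the
    shield a distance `depth` above the portion.  angle_deg is the
    horizontal drift direction of the descending rays (positive toward +x);
    light coming FROM the +x side drifts toward -x.  All values are
    illustrative, not specified by the patent."""
    dx = depth * math.tan(math.radians(angle_deg))
    lo = max(0.0, shift + dx)
    hi = min(width, shift + width + dx)
    return max(0.0, hi - lo) / width
```

With the opening shifted toward +x, rays drifting toward -x (i.e. light arriving from the +x side) still reach the portion, while light from the -x side is blocked, mirroring the claimed behavior of each shifted opening.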
  • In the configuration of Patent Document 2, by contrast, the opening of one light shielding member is provided so as to straddle part of each of two adjacently formed light receiving elements, and the light receiving elements and the light shielding member are provided at regular intervals so that the boundary between the two light receiving elements coincides with the center line of the opening of that one light shielding member.
  • In the present invention, to give each light receiving element its directivity, the opening of the first light shielding member is formed shifted by a predetermined amount toward the first direction so as to partially overlap the first light receiving portion in plan view; the opening of the second light shielding member is shifted by a predetermined amount toward the second direction so as to partially overlap the second light receiving portion; the opening of the third light shielding member is shifted by a predetermined amount toward the third direction so as to partially overlap the third light receiving portion; and the opening of the fourth light shielding member is shifted by a predetermined amount toward the fourth direction so as to partially overlap the fourth light receiving portion.
  • Because each light receiving element has its own light shielding member, it does not affect the directivity characteristics of the other light receiving elements, and the arrangement position of each light receiving element is not particularly limited: the light receiving elements of a set with different directivity characteristics need not be provided at adjacent positions. An area sensor in which the light receiving elements can be arranged relatively freely can therefore be realized.
  • With the above configuration, the parallax along one axis, for example the left-right direction, can be detected from the two images obtained by the first and second light receiving elements, and the parallax along the other axis, for example the up-down direction, can be detected from the two images obtained by the third and fourth light receiving elements. The height of the object to be detected above the surface on which the first to fourth light receiving elements are formed can thus be detected.
  • That is, based on the parallax of the image of the detected object obtained with the first and second light receiving elements and the parallax obtained with the third and fourth light receiving elements, an area sensor can be realized that detects the distance (height) of the detected object from the surface on which the light receiving elements are formed.
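The final height estimate from a measured parallax can be sketched under a simple symmetric-directivity assumption: each element of a pair looks at +/- tilt degrees from the panel normal, so an object at height h shifts by h * tan(tilt) in each image and the pair's parallax is 2 * h * tan(tilt). This geometry is an illustrative model, not taken from the patent:

```python
import math

def height_from_parallax(parallax, tilt_deg=45.0):
    """Recover height from a measured parallax, assuming the two elements
    of a pair look at symmetric angles +/- tilt_deg from the panel normal,
    so parallax = 2 * h * tan(tilt).  Both the angle and the model are
    simplifying assumptions for illustration."""
    return parallax / (2.0 * math.tan(math.radians(tilt_deg)))
```

Averaging the estimates from the left-right pair and the up-down pair would use both detected parallax axes, as the configuration above allows.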
  • Note that the first and second light receiving elements form sets, as do the third and fourth light receiving elements: the number of first light receiving elements equals the number of second light receiving elements, and the number of third equals the number of fourth, but the number of first (and second) light receiving elements may differ from the number of third (and fourth) light receiving elements.
  • The display device of the present invention includes the above area sensor, wherein a plurality of pixels for displaying an image are formed in a matrix on the surface on which the first to fourth light receiving elements are formed, and the first to fourth light receiving elements are provided according to the positions of the pixels.
  • With this configuration, a display device in which the area sensor is integrated can be realized, so information can be input using the area sensor while viewing the image on the display device.
  • Here, "provided according to the positions of the pixels" includes, for example, the case where the four types of light receiving elements with different directivity characteristics are placed in one pixel, the case where they are distributed over a plurality of pixels, and either of these cases combined with pixels in which no light receiving element is provided, but is not limited to these.
  • As described above, in the area sensor of the present invention, the light receiving elements include a plurality of sets of first and second light receiving elements and a plurality of sets of third and fourth light receiving elements, and each light receiving element includes its own light receiving portion and a light shielding member formed in a layer above that portion to block light. The opening of the first light shielding member is formed shifted by a predetermined amount so as to partially overlap the first light receiving portion in plan view, so that the first light receiving element receives the light component from the first direction and shields the light component from the second direction, which is opposite to the direction of the reflected light produced when light from the first direction is specularly reflected by the surface. The opening of the second light shielding member is shifted by a predetermined amount toward the second direction so that the second light receiving element receives the light component from the second direction and shields that from the first direction. The opening of the third light shielding member is shifted by a predetermined amount toward the third direction, which differs from the first and second directions, so that the third light receiving element receives the light component from the third direction and shields that from the fourth direction, which is opposite to the direction of the reflected light produced when light from the third direction is specularly reflected by the surface. The opening of the fourth light shielding member is shifted by a predetermined amount toward the fourth direction so that the fourth light receiving element receives the light component from the fourth direction and shields that from the third direction.
  • Similarly, the display device of the present invention includes the above area sensor, a plurality of pixels that display an image are formed in a matrix on the surface on which the first to fourth light receiving elements are formed, and the first to fourth light receiving elements are provided in accordance with the positions of the pixels.
  • FIG. 3 is a diagram showing a schematic configuration of an area sensor integrated liquid crystal display device according to an embodiment of the present invention.
  • FIG. 4 shows an example of the formation of the optical sensors provided in the area sensor integrated liquid crystal display device of one embodiment of the present invention.
  • FIG. 5 shows the schematic circuit configuration in each pixel of the area sensor integrated liquid crystal display device of one embodiment of the present invention.
  • FIG. 6 is a schematic block diagram showing the configuration of the area sensor integrated liquid crystal display device of one embodiment of the present invention.
  • FIG. 7 is a diagram for explaining a case where a parallax is calculated by performing correlation calculation using a binarized left image and right image in the area sensor integrated liquid crystal display device according to the embodiment of the present invention.
  • FIG. 8 is a diagram for explaining a case where a parallax is calculated by matching a 256-gradation left image and right image line by line using a normalized correlation method in the area sensor integrated liquid crystal display device according to the embodiment of the present invention.
  • FIG. 9 is a diagram for explaining a case where a parallax is calculated by performing correlation calculation line by line using a left image and a right image in which edges have been detected in the area sensor integrated liquid crystal display device according to the embodiment of the present invention.
  • FIG. 10 is a diagram for explaining a case where matching is performed using the normalized correlation method on the left image and the right image in which the edges illustrated in FIGS. 9B and 9C have been detected.
  • FIG. 11 is a diagram for explaining a case where edge direction images of the left image and the right image are obtained and the parallax is obtained by correlation calculation including a term related to the edge direction.
  • FIG. 12 is a diagram for explaining a case where a parallax is calculated by performing correlation calculation line by line using an upper image and a lower image in which edges have been detected in the area sensor integrated liquid crystal display device according to the embodiment of the present invention.
  • FIG. 13 is a diagram for explaining a case where matching is performed using the normalized correlation method on the upper image and the lower image in which the edges illustrated in FIGS. 12B and 12C have been detected.
  • FIG. 14 is a diagram for explaining a method for obtaining a height h from a parallax d and an incident angle θ in the area sensor integrated liquid crystal display device according to the embodiment of the present invention.
  • FIG. 15 is a diagram for explaining a flow for calculating the parallax.
  • FIG. 16 is a diagram illustrating a reference table for obtaining the value of Wdir, which is a term related to the edge direction, in FIG. 15.
  • In the present embodiment, an area sensor integrated liquid crystal display device including light receiving elements will be described as an example; however, the present invention is not limited thereto, and can also be applied to other display devices provided with the area sensor, such as an organic EL display device, or to an area sensor alone including the light receiving elements.
  • Although not shown, the area sensor integrated liquid crystal display device 1 (hereinafter also simply the liquid crystal display device 1) shown in FIG. 3 is provided with a photosensor for each pixel of the liquid crystal display panel 2, captures an image of the detected object 7 (a finger) placed on the display surface 2a side of the liquid crystal display panel 2, and detects its position.
  • a backlight 3 for irradiating the liquid crystal display panel 2 with light is provided on the back surface 2b side of the liquid crystal display panel 2.
  • the liquid crystal display panel 2 includes an active matrix substrate 4 in which a large number of pixels are arranged in a matrix and a counter substrate 5 disposed so as to face the active matrix substrate 4, with the liquid crystal layer 6 sandwiched between the two substrates 4 and 5.
  • In the present embodiment, the VA mode liquid crystal display panel 2 is used; however, the display mode of the liquid crystal display panel 2 is not particularly limited, and any display mode such as a TN mode or an IPS mode can be applied.
  • a front-side polarizing plate and a back-side polarizing plate are provided outside the liquid crystal display panel 2 so as to sandwich the liquid crystal display panel 2.
  • Each of the polarizing plates serves as a polarizer; in the present embodiment, in which the liquid crystal material of the liquid crystal layer 6 is of the vertical alignment type, a normally black mode liquid crystal display device can be realized by arranging the polarization direction of the front-side polarizing plate and that of the back-side polarizing plate in a crossed Nicols relationship.
  • the active matrix substrate 4 includes a TFT element serving as a switching element for driving each pixel, a pixel electrode electrically connected to the TFT element, the above photosensor, an alignment film, and the like.
  • the counter substrate 5 is formed with a color filter layer, a counter electrode, an alignment film, and the like.
  • the color filter layer can be composed of, for example, colored portions of the respective colors red (R), green (G), and blue (B) and a black matrix.
  • In the present embodiment, in order to detect the detected object 7 with higher accuracy regardless of the external brightness and without affecting the image quality of the display image, a backlight having a function of uniformly emitting infrared light, which is invisible light, is used. Since the infrared light is invisible light, it can be appropriately modulated and pulsed as necessary.
  • Since the infrared light emitted from the backlight 3 is reflected by the detected object 7 and the amount of this light incident on the photosensor is detected, a black matrix made of carbon black or the like provided on the counter substrate 5 is positioned over the region where the photosensor is formed.
  • the black matrix made of carbon black or the like blocks visible light and transmits infrared light; therefore, the light that passes through the black matrix and enters the photosensor is only infrared light, and the amount of infrared light reflected by the detected object 7 and incident on the photosensor can be detected.
  • On the other hand, when the backlight 3 is a backlight that does not contain infrared light and emits visible light for display, the opening of the black matrix made of carbon black or the like provided on the counter substrate 5 is positioned over the region where the photosensor is formed, and the position of the detected object 7 can be detected as follows.
  • Since the amount of external light incident on the photosensor differs between the location where the detected object 7 is present and the location where it is not, the position of the detected object 7 can be detected from this difference.
  • Alternatively, the position of the detected object 7 can be detected by utilizing the fact that light is reflected at the location where the detected object 7 exists.
  • FIG. 1 is a diagram showing a schematic configuration of an optical sensor provided in the liquid crystal display device 1 of the present embodiment.
  • FIG. 1A shows an optical sensor 8a (first light receiving element), provided on the active matrix substrate 4, having a directivity characteristic in the obliquely upward direction on one side in the X direction in the drawing (hereinafter referred to as the upper left direction).
  • FIG. 1B shows an optical sensor 8b (second light receiving element), provided on the active matrix substrate 4, having a directivity characteristic in the obliquely upward direction on the other side in the X direction in the drawing (hereinafter referred to as the upper right direction).
  • FIG. 1C shows an optical sensor 8c (third light receiving element), provided on the active matrix substrate 4, having a directivity characteristic in the obliquely upward direction in the Y direction in the drawing (hereinafter referred to as the obliquely upward direction).
  • FIG. 1D shows an optical sensor 8d (fourth light receiving element), provided on the active matrix substrate 4, having a directivity characteristic in the obliquely downward direction in the Y direction in the drawing (hereinafter referred to as the obliquely downward direction).
  • the X direction and the Y direction are orthogonal to each other, but the present invention is not limited to this.
  • As each of the optical sensors 8a, 8b, 8c, and 8d, a PIN diode having a lateral structure in which the P+ layer 9p+, the I layer 9i (first light receiving portion, second light receiving portion, third light receiving portion, fourth light receiving portion), and the N+ layer 9n+ do not overlap one another is used.
  • However, any photosensor that passes a different current value according to the amount of light received by the light receiving portion provided in the photosensor can also be used.
  • a glass substrate is used as a substrate for constituting the active matrix substrate 4, but the present invention is not limited to this, and a quartz substrate, a plastic substrate, or the like can also be used.
  • Although not shown, a light shielding film is formed on the glass substrate, on the surface on which the pixel TFTs described later and the optical sensors 8a, 8b, 8c, and 8d are to be formed, in order to block the light emitted from the backlight 3 from entering the pixel TFTs and the optical sensors 8a, 8b, 8c, and 8d.
  • a base coat film is formed on the light shielding film so as to cover the entire surface of the light shielding film and the glass substrate.
  • As the base coat film, a film made of an insulating inorganic material such as a silicon oxide film, a silicon nitride film, or a silicon oxynitride film, or a laminated film appropriately combining them, can be used; in the present embodiment, a silicon oxide film was used.
  • these films can be formed by deposition using LPCVD, plasma CVD, sputtering, or the like.
  • Then, the optical sensors 8a, 8b, 8c, and 8d shown in FIGS. 1A to 1D are formed on the base coat film.
  • the process for forming the optical sensors 8a, 8b, 8c and 8d on the base coat film is as follows.
  • a non-single-crystal semiconductor thin film that will later become the polycrystalline semiconductor film 9 is formed by LPCVD, plasma CVD, sputtering, or the like.
  • As the non-single-crystal semiconductor thin film, amorphous silicon, polycrystalline silicon, amorphous germanium, polycrystalline germanium, amorphous silicon germanium, polycrystalline silicon germanium, amorphous silicon carbide, polycrystalline silicon carbide, or the like can be used. In the present embodiment, amorphous silicon is used.
  • the non-single-crystal semiconductor thin film is crystallized to form a polycrystalline semiconductor film 9.
  • a laser beam, an electron beam, or the like can be used.
  • crystallization is performed using a laser beam.
  • the polycrystalline semiconductor film 9 is patterned by photolithography according to the formation region of the light shielding film.
  • Next, an I layer 9i, which is an intrinsic semiconductor layer or a semiconductor layer having a relatively low impurity concentration, is provided at the center of the polycrystalline semiconductor film 9, and a P+ layer 9p+, which is a semiconductor layer having a relatively high P-type impurity concentration, and an N+ layer 9n+, which is a semiconductor layer having a relatively high N-type impurity concentration, are formed on both sides thereof.
  • a gate insulating film 10 made of a silicon oxide film or the like is formed, and the polycrystalline semiconductor film 9 is covered with the gate insulating film 10.
  • contact holes penetrating the gate insulating film 10 are formed on the P + layer 9p + and the N + layer 9n +, respectively.
  • a conductive film is formed on the entire surface by sputtering or the like.
  • As the conductive film, for example, a film made of aluminum or the like can be used; however, the conductive film is not limited to this, and an element selected from Ta, W, Ti, Mo, Al, Cu, Cr, Nd, and the like, or an alloy material or compound material containing such an element as its main component, may be used, and a laminated structure appropriately combining them may be formed as necessary.
  • the conductive film may also be formed using a semiconductor film typified by polycrystalline silicon doped with an impurity such as phosphorus or boron. Note that in the present embodiment, aluminum is used for the conductive film.
  • the conductive film is patterned into a desired shape by etching by photolithography, and becomes metal electrodes (wiring) 11a and 11b electrically connected to the P + layer 9p + and the N + layer 9n +, respectively.
  • a transparent insulating layer 12 is formed on the entire surface so as to cover the gate insulating film 10 and the conductive film.
  • an organic interlayer insulating film made of an acrylic resin is used as the transparent insulating layer 12.
  • Next, as shown in FIGS. 1A to 1D, a shield electrode layer 13 made of ITO (Indium Tin Oxide), IZO (Indium Zinc Oxide), or the like is formed on the transparent insulating layer 12 so as to overlap the P+ layer 9p+, the I layer 9i, and the N+ layer 9n+ in plan view.
  • a light shielding member 14 having an opening 14 a is formed on the shield electrode layer 13.
  • FIG. 2 is a plan view of the optical sensors 8a, 8b, 8c, and 8d shown in FIG.
  • As shown in FIG. 2A, in order to form the optical sensor 8a having directivity in the upper left direction in the figure, the opening 14a of the light shielding member 14 is formed shifted by a predetermined amount in the left direction in the figure so as to partially overlap, in plan view, the I layer 9i of the polycrystalline semiconductor film 9 that is the light receiving portion of the optical sensor 8a, so that the light component from the upper left direction in the figure can be received and the light component from the upper right direction in the figure, which is the direction opposite to the reflected light produced when light from the upper left direction is specularly reflected by the surface on which the optical sensor 8a is formed, can be blocked.
  • Similarly, as shown in FIG. 2B, in order to form the optical sensor 8b having directivity in the upper right direction in the figure, the opening 14a of the light shielding member 14 is formed shifted by a predetermined amount in the right direction in the figure so as to partially overlap, in plan view, the I layer 9i of the polycrystalline semiconductor film 9 that is the light receiving portion of the optical sensor 8b, so that the light component from the upper right direction in the figure can be received and the light component from the upper left direction in the figure, which is the direction opposite to the reflected light produced when light from the upper right direction is specularly reflected by the surface on which the optical sensor 8b is formed, can be blocked.
  • As shown in FIG. 2C, in order to form the optical sensor 8c having directivity in the obliquely upward direction in the figure, the opening 14a of the light shielding member 14 is formed shifted by a predetermined amount in the upward direction in the figure so as to partially overlap, in plan view, the I layer 9i of the polycrystalline semiconductor film 9 that is the light receiving portion of the optical sensor 8c, so that the light component from the obliquely upward direction in the figure can be received and the light component from the obliquely downward direction in the figure, which is the direction opposite to the reflected light produced when light from the obliquely upward direction is specularly reflected by the surface on which the optical sensor 8c is formed, can be blocked.
  • Likewise, as shown in FIG. 2D, in order to form the optical sensor 8d having directivity in the obliquely downward direction in the figure, the opening 14a of the light shielding member 14 is formed shifted by a predetermined amount in the downward direction in the figure so as to partially overlap, in plan view, the I layer 9i of the polycrystalline semiconductor film 9 that is the light receiving portion of the optical sensor 8d, so that the light component from the obliquely downward direction in the figure can be received and the light component from the obliquely upward direction in the figure, which is the direction opposite to the reflected light produced when light from the obliquely downward direction is specularly reflected by the surface on which the optical sensor 8d is formed, can be blocked.
  • The predetermined shift amount in the left direction (in FIG. 2A, the opening 14a of the light shielding member 14 being positioned toward the side on which the N+ layer 9n+ is formed) is preferably 5% or more and 50% or less of the lateral width, that is, the width in the left-right direction, of the opening 14a in FIG. 2A.
  • The predetermined shift amount in the right direction (in FIG. 2B, the opening 14a of the light shielding member 14 being positioned toward the side on which the P+ layer 9p+ is formed) is preferably 5% or more and 50% or less of the lateral width, that is, the width in the left-right direction, of the opening 14a in FIG. 2B.
  • The predetermined shift amount in the upward direction is preferably 5% or more and 50% or less of the vertical width, that is, the width in the up-down direction, of the opening 14a in FIG. 2C.
  • The predetermined shift amount in the downward direction (to such an extent that the opening 14a of the light shielding member 14 and the I layer 9i still partially overlap in plan view in FIG. 2D) is preferably 5% or more and 50% or less of the vertical width, that is, the width in the up-down direction, of the opening 14a in FIG. 2D.
  • In the present embodiment, the opening 14a of the light shielding member 14 is formed shifted by a predetermined amount with respect to the I layer 9i of the polycrystalline semiconductor film 9; however, the present invention is not limited to this, and the opening may instead be formed shifted by a predetermined amount with respect to the portion of the I layer 9i on which no light-impermeable film such as a conductive film is formed, that is, the substantial light receiving portion of the optical sensor.
  • In the present embodiment, the infrared light emitted from the backlight 3 is reflected by the detected object 7 and is incident on the optical sensors 8a, 8b, 8c, and 8d; therefore, a black matrix made of carbon black or the like provided on the counter substrate 5 is positioned over the region where the optical sensors 8a, 8b, 8c, and 8d are formed.
  • the black matrix made of carbon black or the like blocks visible light and transmits infrared light, so the light that passes through the black matrix and enters the optical sensors 8a, 8b, 8c, and 8d is only infrared light. Accordingly, in the present embodiment, any material having a function of blocking infrared light can be used for the light shielding member 14 without particular limitation; examples of usable materials include metal materials, which block infrared light.
  • the light shielding member 14 is not limited to this, and any material having a high reflectance in the wavelength region of the infrared light can also be used.
  • On the other hand, when a black matrix made of carbon black or the like is not used and the influence of oblique light directly incident on the optical sensors 8a, 8b, 8c, and 8d must be considered, for example in an area sensor consisting of the optical sensors 8a, 8b, 8c, and 8d alone, or when the backlight 3 is a backlight that does not contain infrared light and emits visible light for display, it is necessary to select, as the light shielding member 14, a material capable of cutting the wavelength range of light to which the optical sensors 8a, 8b, 8c, and 8d are sensitive.
  • Specifically, a material that absorbs or reflects light in the wavelength range of 1100 nm or less may be selected for the light shielding member 14.
  • In the present embodiment, since the directivity characteristic of each of the optical sensors 8a, 8b, 8c, and 8d can be controlled independently, the arrangement position of one sensor does not affect the directivity characteristics of the other sensors, and no particular restriction arises on the arrangement positions of the optical sensors 8a, 8b, 8c, and 8d.
  • Accordingly, the optical sensors 8a, 8b, 8c, and 8d having different directivity characteristics do not necessarily have to be provided at adjacent positions, and can be arranged relatively freely.
  • FIG. 4 shows an example of forming the optical sensors 8a, 8b, 8c and 8d provided in the liquid crystal display device 1 of the present embodiment.
  • For example, when the pixel P(1,1) is provided with the photosensor 8a having directivity in the upper left direction in the figure, the pixel P(1,2), which is the nearest pixel to the right of the pixel P(1,1), is provided with the optical sensor 8b having directivity in the upper right direction in the figure, and the pixel P(2,1), which is the nearest pixel below the pixel P(1,1), is provided with the optical sensor 8d having directivity in the obliquely downward direction.
  • In this way, the optical sensors 8b, 8c, and 8d other than the optical sensor 8a are arranged in the eight pixels adjacent to the pixel provided with the optical sensor 8a.
  • That is, each optical sensor is surrounded by eight optical sensors whose directivity characteristics differ from its own.
  • Alternatively, the four types of light receiving elements 8a, 8b, 8c, and 8d having different directivity characteristics may be arranged in one pixel.
  • In either case, the light receiving elements 8a, 8b, 8c, and 8d can be arranged regularly or randomly, and the four types of light receiving elements 8a, 8b, 8c, and 8d can also be distributed among a plurality of pixels.
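As one concrete illustration of such a regular arrangement, a repeating 2 x 2 tiling places the other three sensor types in every interior pixel's eight-neighbourhood, with no neighbour repeating the centre type. This is only a sketch of one possible layout permitted by the description (the tiling pattern itself is illustrative, not taken from the embodiment):

```python
def sensor_layout(rows, cols):
    """One possible regular layout: repeat a 2x2 tile of the four
    directivity types 8a, 8b, 8c, 8d over the pixel matrix."""
    tile = [['8a', '8b'], ['8c', '8d']]
    return [[tile[i % 2][j % 2] for j in range(cols)] for i in range(rows)]

def neighbour_types(grid, i, j):
    """Set of sensor types in the eight pixels adjacent to pixel (i, j)."""
    return {grid[i + di][j + dj]
            for di in (-1, 0, 1) for dj in (-1, 0, 1) if (di, dj) != (0, 0)}
```

With this tiling, the eight neighbours of any interior pixel carry exactly the three other directivity types, matching the property described above for the pixel provided with the optical sensor 8a.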
  • In the present embodiment, the optical sensors 8a, 8b, 8c, and 8d are provided only for the pixels of a specific color; however, the present invention is not limited to this, and the optical sensors 8a, 8b, 8c, and 8d having different directivity characteristics can be provided at the resolution required for detection of the detected object, in consideration of the accuracy of the parallax described later.
  • FIG. 5 is a diagram showing a schematic circuit configuration in each pixel of the liquid crystal display device 1 of the present embodiment.
  • As shown in FIG. 5, each pixel is provided with a photosensor circuit in which the transistor 17 serves as the sensor output (output amplifier).
  • A scanning signal line GLn, to which a scanning signal is supplied from a scanning signal line driving circuit (not shown) (see FIG. 6), is formed in the upper portion of each pixel provided with the photosensor circuit, and a data signal line SLn, to which a data signal is supplied from a data signal line driving circuit (not shown) (see FIG. 6), is formed so as to intersect the scanning signal line GLn.
  • A pixel TFT 15 is formed in the vicinity of the intersection of the scanning signal line GLn and the data signal line SLn.
  • FIG. 5 shows, as an example, four adjacent pixels P(n-1, n-1), P(n-1, n), P(n, n-1), and P(n, n) in the liquid crystal display device 1 having n x n pixels shown in FIG. 6.
  • In addition, an auxiliary capacitor Cs is provided; one end of the auxiliary capacitor Cs is electrically connected to the drain electrode of the pixel TFT 15 and the pixel electrode of the liquid crystal capacitor Clc, and the other end is electrically connected to a storage capacitor bus line CSn formed in the corresponding pixel in parallel with the scanning signal line GLn.
  • the photosensor circuit is configured as a 1T (one-transistor) type circuit using only the single transistor 17 that serves as the sensor output, and the transistor 17 functions as a source follower (voltage follower) transistor.
  • the drain electrode of the transistor 17 is connected to the AMP power supply bus line Vsn (n is a natural number indicating the pixel column number), and the source electrode is connected to the photosensor output bus line Von.
  • the AMP power supply bus line Vsn and the optical sensor output bus line Von are connected to a sensor readout circuit (not shown) (see FIG. 6), and the power supply voltage VDD is applied to the AMP power supply bus line Vsn from the sensor readout circuit.
  • the gate electrode of the transistor 17 is connected to one of optical sensors 8a, 8b, 8c, and 8d made of a PIN diode and one end of a boosting capacitor 16.
  • The photosensor row selection signal RWS has the role of selecting a specific row of the photosensor circuits arranged in a matrix and causing the photosensor circuits in the selected row to output their detection signals.
  • First, a high-level reset signal RST is sent from a sensor scanning signal line drive circuit (not shown) (see FIG. 6) to the wiring Vrstn, whereby the boosting capacitor 16 is charged and the potential of the gate electrode of the transistor 17 gradually rises, finally reaching the initialization potential.
  • At this time, the potential of the cathode electrode of the PIN diode provided in each of the photosensors 8a, 8b, 8c, and 8d (the metal electrode (wiring) 11b in FIGS. 1A to 1D) becomes higher than the potential of the anode electrode, so a reverse bias is applied to the PIN diode.
  • Subsequently, after the gate potential has fallen from the initialization potential to a detection potential corresponding to the amount of light received, a high-level row selection signal RWS is applied from a sensor scanning signal line drive circuit (not shown) (see FIG. 6) to the other end of the boosting capacitor 16 via the wiring Vrwn.
  • As a result, the potential of the gate electrode of the transistor 17 is pushed up through the boosting capacitor 16 and becomes the potential obtained by adding the high-level potential of the row selection signal RWS to the detection potential.
  • In this way, detection signals having a level corresponding to the intensity of light received by the optical sensors 8a, 8b, 8c, and 8d are generated for each pixel provided with these optical sensors, so the detection operation can be performed on a detected object placed close to the liquid crystal display device 1.
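The readout sequence just described can be sketched numerically. The following toy model is only an illustration of the principle: the photocurrent of the reverse-biased PIN diode discharges the gate node from its initialization potential by dV = I*t/C, and the RWS pulse then lifts the resulting detection potential before the source follower outputs it. All component values, the assumption of full capacitive coupling by the boosting capacitor 16, and the fixed source-follower threshold drop are hypothetical, not taken from the embodiment:

```python
def sensor_output(i_photo_na, t_int_ms, c_total_pf,
                  v_init=3.0, v_rws=5.0, v_th=1.0):
    """Toy model of the 1T photosensor readout (illustrative values only)."""
    # integration: the reverse-biased PIN diode's photocurrent discharges
    # the gate node by dV = I * t / C
    dv = (i_photo_na * 1e-9) * (t_int_ms * 1e-3) / (c_total_pf * 1e-12)
    v_detect = v_init - dv       # detection potential at the gate of transistor 17
    v_boosted = v_detect + v_rws # RWS pulse coupled through boosting capacitor 16
    return v_boosted - v_th      # source-follower output (simple threshold drop)
```

In this sketch, brighter illumination (a larger photocurrent) yields a lower output level, consistent with the gate potential falling further during integration.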
  • In FIG. 5, the AMP power supply bus line Vsn and the optical sensor output bus line Von are provided for each pixel separately from the data signal line SLn; however, the data signal line SLn and the AMP power supply bus line Vsn can be shared to increase the aperture ratio.
  • For example, the AMP power supply bus line Vsn can be shared with the data signal line SLn-1, and the optical sensor output bus line Von can be shared with the data signal line SLn.
  • In the present embodiment, the optical sensors 8a, 8b, 8c, and 8d, the pixel TFT 15, the auxiliary capacitor Cs, the boosting capacitor 16, the pixel electrode, and the transistor 17 are formed by the same process, but the present invention is not limited thereto.
  • FIG. 6 is a schematic block diagram showing the configuration of the liquid crystal display device 1.
  • As shown in FIG. 6, the liquid crystal display device 1 includes a liquid crystal display panel 2, a scanning signal line driving circuit 18, a data signal line driving circuit / sensor reading circuit 19, a sensor scanning signal line driving circuit 20, a control circuit 21, a sensing image processing unit 22, and an interface circuit 28.
  • In the present embodiment, the data signal line driving circuit / sensor reading circuit 19, which combines the function of a data signal line driving circuit and that of a sensor reading circuit, is used; however, the present invention is not limited to this, and the data signal line driving circuit and the sensor readout circuit can also be provided separately.
  • the scanning signal line driving circuit 18, the data signal line driving circuit / sensor reading circuit 19, and the sensor scanning signal line driving circuit 20 may be separately fabricated LSIs attached to the liquid crystal display panel 2, or may be formed monolithically on the liquid crystal display panel 2.
  • the scanning signal line driving circuit 18 generates a scanning signal for selectively scanning the pixels P (n, n) row by row by using a scanning signal line (not shown).
  • the data signal line driving circuit / sensor reading circuit 19 supplies a data signal to each pixel P (n, n) using a data signal line (not shown).
  • the sensor scanning signal line driving circuit 20 selects and drives the photosensor circuits row by row, and the data signal line driving circuit / sensor reading circuit 19 supplies the power supply voltage VDD having a constant potential to the photosensor circuits using an AMP power supply bus line (not shown) and reads the detection signal of the detected object from the photosensor circuits using a photosensor output bus line (not shown).
  • the control circuit 21 supplies necessary power supply voltages and control signals to the scanning signal line driving circuit 18, the data signal line driving circuit / sensor reading circuit 19, the sensor scanning signal line driving circuit 20, and the sensing image processing unit 22.
  • the sensing image processing unit 22 includes a left image frame memory 23 for storing the image data of the detected object obtained from the optical sensor 8a having directivity in the upper left direction, and a right image frame memory 24 for storing the image data of the detected object obtained from the optical sensor 8b having directivity in the upper right direction.
  • Further, a parallax and height detection circuit 25 is provided, which calculates the correlation between the left image obtained from the left image frame memory 23 and the right image obtained from the right image frame memory 24 and obtains the parallax between the two images and the height (Z coordinate) of the detected object.
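The parallax-to-height conversion can be sketched under a simple assumed geometry: if the left- and right-directivity sensors each pick up reflected light at an incident angle theta measured from the panel normal, the two images of an object at height h separate by d = 2*h*tan(theta), so h = d / (2*tan(theta)). This relation and the numbers below are an illustrative reading of the conversion, not necessarily the exact method of the embodiment:

```python
import math

def height_from_parallax(d_pixels, pixel_pitch_mm, theta_deg):
    """Height h (mm) of the detected object above the panel, assuming the
    left and right images separate by d = 2 * h * tan(theta)."""
    d_mm = d_pixels * pixel_pitch_mm  # parallax distance from the pixel pitch
    return d_mm / (2.0 * math.tan(math.radians(theta_deg)))
```

For instance, a 7-pixel parallax at a hypothetical 0.1 mm pitch and theta = 45 degrees gives h = 0.35 mm.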
  • In addition, a reference image generation circuit 26 that generates a reference image based on the left image obtained from the left image frame memory 23, the right image obtained from the right image frame memory 24, and the parallax obtained from the parallax and height detection circuit 25, and a coordinate extraction circuit 27 that extracts XY coordinates from the reference image, are provided.
  • the sensing image processing unit 22 further includes an upper image frame memory 29 for storing the image data of the detected object obtained from the optical sensor 8c having directivity in the obliquely upward direction, and a lower image frame memory 30 for storing the image data of the detected object obtained from the optical sensor 8d having directivity in the obliquely downward direction.
  • Further, a parallax and height detection circuit 31 is provided, which calculates the correlation between the upper image obtained from the upper image frame memory 29 and the lower image obtained from the lower image frame memory 30 and obtains the parallax between the two images and the height (Z coordinate) of the detected object.
  • In addition, a reference image generation circuit 32 that generates a reference image based on the upper image obtained from the upper image frame memory 29, the lower image obtained from the lower image frame memory 30, and the parallax obtained from the parallax and height detection circuit 31, and a coordinate extraction circuit 33 that extracts XY coordinates from the reference image, are provided.
  • each piece of information obtained by the sensing image processing unit 22 can be taken out to a CPU or the like (not shown) via the interface circuit 28.
  • FIG. 7 is a diagram for explaining a case where a parallax is calculated by performing a correlation calculation using the binarized left image and right image.
  • FIG. 7A shows a case where a finger is present as the detected object 7 on the liquid crystal display panel 2.
  • FIG. 7B shows the left image, in which the image data of the detected object obtained from the optical sensor 8a having the directivity characteristic in the upper left direction has been binarized so that it is black below a certain threshold value and white at or above that threshold value.
  • FIG. 7C shows the right image, in which the image data of the detected object obtained from the optical sensor 8b having the directivity characteristic in the upper right direction has been binarized in the same manner.
  • the left image is shifted leftward from the actual finger position, and the right image is shifted rightward from the actual finger position; therefore, there is a parallax between the left image and the right image.
  • using the normalized correlation method, one of the left image and the right image is set as a reference, and the left image and the right image are matched line by line while shifting the reference image one pixel at a time; the correlation value R becomes maximum when shifted by seven pixels.
  • the parallax therefore corresponds to seven pixels, and the parallax distance can be calculated from the pixel pitch of the liquid crystal display panel 2.
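The line-by-line matching described above can be sketched as follows. This is an illustrative sketch only (the patent discloses no code); the function name and the exact normalized-correlation form are our assumptions:

```python
import numpy as np

def line_parallax(left_row, right_row, max_shift):
    """Estimate the per-line parallax by shifting one row against the
    other and keeping the shift that maximizes the normalized
    correlation value R. Rows are sequences of pixel values."""
    best_j, best_r = 0, -1.0
    n = len(left_row)
    for j in range(max_shift + 1):
        a = np.asarray(left_row[: n - j], dtype=float)   # left row, trimmed
        b = np.asarray(right_row[j:], dtype=float)       # right row, shifted by j
        denom = np.sqrt((a * a).sum() * (b * b).sum())
        r = (a * b).sum() / denom if denom else 0.0      # normalized correlation
        if r > best_r:
            best_r, best_j = r, j
    return best_j, best_r
```

For binarized rows whose patterns are offset by seven pixels, the maximum R is reached at a shift of seven, matching the example in the text.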
  • FIG. 8 is a diagram for explaining a case where the parallax is calculated by matching the left image and the right image of 256 gradations for each line using the normalized correlation method.
  • in this case, the peak of the correlation value R is broad, so it is difficult to calculate the parallax with high accuracy.
  • FIG. 9 is a diagram for explaining a case where the parallax is calculated by performing a correlation operation for each line using the left image and the right image in which the edge is detected.
  • FIG. 9A shows a case where a finger is present as the detected object 7 on the liquid crystal display panel 2.
  • FIG. 9B is an image (edge image) in which the left image of the detected object obtained from the optical sensor 8a having the directivity characteristic in the upper left direction has been filtered using a known filter such as a Sobel filter or a Roberts filter to detect the contour line, that is, the edge, of the finger as the detected object.
  • FIG. 9C is an image (edge image) in which the right image of the detected object obtained from the optical sensor 8b having the directivity characteristic in the upper right direction has been filtered using a known filter such as a Sobel filter or a Roberts filter to detect the contour line, that is, the edge, of the finger as the detected object.
  • the left image is shifted leftward from the actual finger position, and the right image is shifted rightward from the actual finger position; therefore, there is a parallax between the left image and the right image.
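The Sobel filtering named above is a standard operation; as a sketch (the kernel choice and return values are the conventional ones, not taken from the patent), edge magnitude and direction can be computed like this:

```python
import numpy as np

def sobel_edges(img):
    """Return (magnitude, direction) of edges in a 2-D grayscale image
    using the standard 3x3 Sobel kernels. Borders are left at zero."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)  # horizontal gradient
    ky = kx.T                                                         # vertical gradient
    h, w = img.shape
    gx = np.zeros((h, w))
    gy = np.zeros((h, w))
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            patch = img[y - 1:y + 2, x - 1:x + 2]
            gx[y, x] = (patch * kx).sum()
            gy[y, x] = (patch * ky).sum()
    return np.hypot(gx, gy), np.arctan2(gy, gx)
```

On a vertical step edge, the magnitude peaks along the step and is zero in flat regions, which is what makes the subsequent correlation peak sharp.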
  • FIG. 10 is a diagram for explaining a case where the parallax is obtained by matching, using the normalized correlation method, the edge-detected left image and right image illustrated in FIGS. 9B and 9C.
  • FIG. 10A shows a left image in which an edge is detected
  • FIG. 10B shows a right image in which an edge is detected
  • FIG. 10C shows the right image as a reference.
  • the left image and the right image are matched one line at a time while shifting the reference right image one pixel at a time; when shifted by seven pixels, the contour lines of the finger as the detected object in the left image and the right image just match.
  • when the correlation value R is obtained while matching the edge-detected left and right images, no ghost is generated, unlike the case where the binarized images are used, so the parallax can be calculated with relatively high accuracy.
  • FIG. 11 is a diagram for explaining a case where parallax is obtained by obtaining an edge direction image of a left image and a right image and performing a correlation calculation including a term relating to the edge direction.
  • FIG. 11A shows a case where a finger is present as the detected object 7 on the liquid crystal display panel 2.
  • FIG. 11B is an image in which the left image of the detected object obtained from the optical sensor 8a having the directivity characteristic in the upper left direction has been filtered using a known filter such as a Sobel filter or a Roberts filter to detect the contour line, that is, the edge, of the finger as the detected object, and the direction of each edge is also detected.
  • in FIG. 11B, each edge direction is indicated by a number from 1 to 8.
  • FIG. 11C is an image in which the right image of the detected object obtained from the optical sensor 8b having the directivity characteristic in the upper right direction has been filtered using a known filter such as a Sobel filter or a Roberts filter to detect the contour line, that is, the edge, of the finger as the detected object, and the direction of each edge is also detected.
  • in FIG. 11C, each edge direction is indicated by a number from 1 to 8.
  • ILi is the i-th edge intensity of the left image
  • IRi+j is the edge intensity of the right image shifted leftward by j pixels from the i-th position.
  • wdiri is the cosine of the angle formed between the direction of the i-th edge of the left image and the direction of the edge of the right image shifted leftward by j pixels. For example, if the two edge directions are the same, wdiri is 1; if they differ by 90 degrees, wdiri is 0; and if they differ by 180 degrees, wdiri is -1.
  • because the edge intensities are weighted by wdiri, the term related to the edge direction, a high correlation value R is obtained only when the left edge of the left image matches the left edge of the right image and the right edge of the left image matches the right edge of the right image.
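The direction-weighted correlation described above can be sketched as follows; the function name is ours, and the cosine weighting follows the wdiri definition in the text (aligned edges add, opposed edges subtract):

```python
import math

def direction_weighted_corr(l_mag, l_dir, r_mag, r_dir, j):
    """Correlation value for shift j, including the edge-direction
    term: each product of edge intensities ILi * IRi+j is weighted by
    cos(angle between the two edge directions), given in radians."""
    n = len(l_mag)
    r = 0.0
    for i in range(n):
        k = i + j
        if 0 <= k < n:
            wdir = math.cos(l_dir[i] - r_dir[k])  # 1 if same, 0 at 90 deg, -1 at 180 deg
            r += l_mag[i] * r_mag[k] * wdir
    return r
```

With this weighting, an edge pair whose directions agree contributes positively, while a left-edge/right-edge mismatch (directions 180 degrees apart) contributes negatively and suppresses false matches.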
  • in the above, the example has been described in which the parallax is calculated by performing the correlation calculation using the left image of the detected object obtained from the optical sensor 8a having the directivity characteristic in the upper left direction and the right image of the detected object obtained from the optical sensor 8b having the directivity characteristic in the upper right direction.
  • FIG. 12 is a diagram for explaining a case where the parallax is calculated by performing a correlation operation for each line using the upper image and the lower image in which the edge is detected.
  • FIG. 12A shows a case where a finger is present as the detected object 7 on the liquid crystal display panel 2.
  • FIG. 12B is an image (edge image) in which the upper image of the detected object obtained from the optical sensor 8c having the directional characteristic in the obliquely upward direction has been filtered using a known filter such as a Sobel filter or a Roberts filter to detect the contour line, that is, the edge, of the finger as the detected object.
  • FIG. 12C is an image (edge image) in which the lower image of the detected object obtained from the optical sensor 8d having the directional characteristic in the obliquely downward direction has been filtered using a known filter such as a Sobel filter or a Roberts filter to detect the contour line, that is, the edge, of the finger as the detected object.
  • the upper image is shifted upward from the actual finger position, and the lower image is shifted downward from the actual finger position; therefore, there is a parallax between the upper image and the lower image.
  • FIG. 13 is a diagram for explaining a case where the parallax is obtained by matching, using the normalized correlation method, the edge-detected upper image and lower image illustrated in FIGS. 12B and 12C.
  • FIG. 13A shows an upper image in which an edge is detected
  • FIG. 13B shows a lower image in which an edge is detected
  • FIG. 13C shows the lower image as a reference.
  • the upper image and the lower image are matched one line at a time while shifting the reference lower image one pixel at a time; when shifted by seven pixels, the contour lines of the finger as the detected object in the upper image and the lower image just match.
  • when the correlation value R is obtained while matching the edge-detected upper and lower images, no ghost is generated, unlike the case where the binarized images are used, so the parallax can be calculated with relatively high accuracy.
  • in this way, the liquid crystal display device 1 can detect the parallax in the left-right direction using the left image of the detected object obtained from the optical sensor 8a having the directivity characteristic in the upper left direction and the right image of the detected object obtained from the optical sensor 8b having the directivity characteristic in the upper right direction, and can further detect the parallax in the up-down direction using the upper image of the detected object obtained from the optical sensor 8c having the directional characteristic in the obliquely upward direction and the lower image of the detected object obtained from the optical sensor 8d having the directional characteristic in the obliquely downward direction.
  • since the height of the detected object 7 can be detected based on the parallax in two axial directions (the left-right direction and the up-down direction), the distance (height) can be estimated with high accuracy even when a plurality of detected objects 7 of similar shape, for example a plurality of fingers, are linearly arranged at equal intervals.
  • IUi is the i-th edge strength of the upper image
  • IDi+j is the edge intensity of the lower image shifted upward by j pixels from the i-th position.
  • wdiri is the cosine of the angle formed between the direction of the i-th edge of the upper image and the direction of the edge of the lower image shifted upward by j pixels.
  • FIG. 14 is a diagram for explaining a method of calculating the height h of the detected object (finger) 7 from the parallax d calculated as described above and the incident angle θ of the reflected light from the detected object (finger) 7 on the optical sensors 8a, 8b, 8c, and 8d.
  • the height h from the I layer 9i to the detected object (finger) 7 can be obtained from the parallax d obtained for each line and the incident angle θ, which is a set value.
  • the incident angle θ is defined as the angle between the perpendicular drawn from the center of the I layer 9i in the optical sensors 8a, 8b, 8c, and 8d and the line connecting the center of the I layer 9i and the center of the opening 14a of the light shielding member 14.
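One consistent reading of this geometry (our assumption, since the excerpt does not reproduce the formula of FIG. 14): if each of the two sensors views the object at incident angle θ, each image shifts by h·tan θ, so the total parallax is d = 2·h·tan θ, giving h = d / (2·tan θ):

```python
import math

def height_from_parallax(d_pixels, pixel_pitch, theta_deg):
    """Height h of the detected object above the sensor plane from a
    parallax of d_pixels (converted to a distance via the pixel pitch)
    and the set incident angle theta. Assumes d = 2*h*tan(theta)."""
    d = d_pixels * pixel_pitch                       # parallax as a physical distance
    return d / (2.0 * math.tan(math.radians(theta_deg)))
```

For the seven-pixel parallax of the example, a pixel pitch of 0.1 mm and θ = 45° would give h = 0.35 mm under this assumed geometry.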
  • the present invention is not limited to this.
  • the reliability information (reliability) obtained from the parallax and height detection circuits 25 and 31 shown in FIG. 6 can be obtained by the following (Formula 3) and (Formula 4), in which the square root of the maximum correlation value Rjmax is normalized by the edge strength.
  • the liquid crystal display device 1 obtains the height (Z coordinate) and the reliability information (reliability) from each of the parallax and height detection circuits 25 and 31 shown in FIG. 6, and by adopting the more reliable of the two as the final height (Z coordinate), the height of the detected object can be estimated with higher accuracy.
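Since (Formula 3) and (Formula 4) are not reproduced in this excerpt, the following is only a sketch of the stated idea, with our own function shape: the square root of the maximum correlation value, normalized by the edge strength:

```python
import math

def reliability(r_jmax, edge_strength):
    """Reliability of a per-line parallax estimate: sqrt(Rjmax)
    normalized by the edge strength. The guard for zero or negative
    inputs is our addition; the exact formula in the patent may differ."""
    if edge_strength <= 0:
        return 0.0
    return math.sqrt(max(r_jmax, 0.0)) / edge_strength
```

A higher value would then cause that axis's height estimate to be chosen as the final Z coordinate.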
  • the flow in which the parallax and height detection circuit 31 shown in FIG. 6 performs a correlation operation between the upper image obtained from the upper image frame memory 29 and the lower image obtained from the lower image frame memory 30 to obtain the parallax is the same as the flow performed in the parallax and height detection circuit 25 except that the shifting direction is different, and the description thereof is therefore omitted.
  • FIG. 15 is a diagram for explaining a flow for obtaining a correlation function for each pixel.
  • FIG. 16 is a diagram illustrating R (j, k) in FIG. 15.
  • HSZ indicates the number of pixels in the horizontal direction, and VSZ indicates the number of pixels in the vertical direction.
  • FIG. 17 is a reference table for obtaining the value of Wdir, which is a term related to the edge direction.
  • Wdir, which is a cosine value determined by the angle formed by the edge direction in a pixel of the left image and the edge direction in a pixel of the right image, is obtained by referring to the reference table shown in FIG. 17.
  • the edge direction is described by a number from 1 to 8; when the edge direction in a pixel of the left image is the same as the edge direction in a pixel of the right image, the angle between them is 0, so Wdir becomes cos 0 = 1.
  • a correlation function between the upper left pixel (0, 0) of the left image and the upper left pixel (0, 0) of the right image is calculated.
  • until i is determined to be equal to or greater than HSZ, the correlation function between the upper left pixel (0, 0) of the left image and each of the pixels (0, 0) to (0, HSZ-1) in the uppermost row of the right image is calculated.
  • next, the correlation function between each pixel (1, 0) to (1, HSZ-1) obtained by shifting the left image by one row and all the pixels (1, 0) to (1, HSZ-1) in the second row of the right image is calculated.
  • the correlation function between all the left image pixels and all the right image pixels is calculated in the same manner as described above while shifting the left image line by line until it is determined that k is equal to or greater than VSZ.
  • the correlation function for each line of the left image and the right image can be obtained.
  • FIG. 18 is a diagram for explaining a flow for obtaining the maximum value of the correlation function for each line
  • FIG. 19 is a diagram showing R(j, k) and the parallax z(k) in FIG. 18.
  • while shifting the left image one column at a time within the same row, the maximum value of the correlation function with all the pixels (0, 0) to (0, HSZ-1) in the top row of the right image is obtained.
  • next, the maximum value of the correlation function between each pixel (1, 0) to (1, HSZ-1) obtained by shifting the left image by one row and all the pixels (1, 0) to (1, HSZ-1) in the second row of the right image is obtained.
  • until it is determined that k is equal to or greater than VSZ, the maximum value of the correlation function between all the left image pixels and all the right image pixels is obtained in the same manner as described above while shifting the left image line by line.
  • the parallax z (k) can be obtained by obtaining the maximum value of the correlation function for each line.
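The flow of FIGS. 15 to 19 can be condensed into the following sketch. The array handling and the simple product correlation are our simplifications (the patent's flow works per pixel and may use the normalized or direction-weighted form); only the shift-and-argmax structure per line is taken from the text:

```python
import numpy as np

def parallax_per_line(left, right, max_shift):
    """For each line k, compute a correlation value R(j, k) for every
    shift j and keep the argmax as the per-line parallax z(k)."""
    vsz, hsz = left.shape
    z = np.zeros(vsz, dtype=int)
    for k in range(vsz):                     # loop over lines, k < VSZ
        best_j, best_r = 0, -np.inf
        for j in range(max_shift + 1):       # loop over shifts
            a = left[k, : hsz - j].astype(float)
            b = right[k, j:].astype(float)
            r = float((a * b).sum())         # simplified correlation R(j, k)
            if r > best_r:
                best_r, best_j = r, j
        z[k] = best_j                        # parallax for line k
    return z
```

Each entry of the returned array is the shift at which that line's correlation peaks, i.e. the per-line parallax z(k).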
  • in the present embodiment, lenses (optical elements) 34a, 34b, 34c, and 34d that limit the incident angles to the optical sensors 8a, 8b, 8c, and 8d within a certain range are provided.
  • the other configurations are the same as described in the first embodiment.
  • members having the same functions as those shown in the drawings of the first embodiment are given the same reference numerals, and descriptions thereof are omitted.
  • FIG. 20 is a diagram showing a schematic configuration of an optical sensor provided with lenses 34a, 34b, 34c, and 34d that limit the incident angle within a certain range.
  • in each of the optical sensors 8a, 8b, 8c, and 8d, lenses 34a, 34b, 34c, and 34d that limit the incident angle to the I layer 9i serving as the light receiving portion within a certain range are provided in a layer above the I layer 9i, at positions overlapping the opening 14a of the light shielding member 14 in plan view.
  • the lenses 34a, 34b, 34c, and 34d have lens surfaces of different thickness on the left and right or top and bottom, in accordance with the directivity characteristics of the optical sensors 8a, 8b, 8c, and 8d; for example, the lens surface facing the I layer 9i serving as the light receiving portion is formed so that the left side in the figure is thicker.
  • the liquid crystal display device 1 can thereby reduce the blur of the incident light entering the I layer 9i serving as the light receiving portion, and a display device that can estimate the distance of the detected object with high accuracy can be realized.
  • it is preferable that the first image of the detected object detected by the plurality of first light receiving elements and the second image of the detected object detected by the plurality of second light receiving elements are detected at the same timing, and that a second detection circuit is provided which performs a correlation operation between the first image and the second image and detects the height of the detected object from the surface based on the calculated parallax between the first image and the second image.
  • it is preferable that each of the first image, the second image, the third image, and the fourth image is a binarized image, and that the parallax between the first image and the second image and the parallax between the third image and the fourth image are calculated by correlation calculation of the binarized images.
  • compared with the case where the binarized image is used, a multi-gradation image has a larger amount of information, and the amount of calculation processing in the correlation calculation is correspondingly larger, making it necessary to provide hardware and software that can process a huge amount of calculation at high speed.
  • it is preferable that the first image, the second image, the third image, and the fourth image are edge images obtained by extracting the contour line of the detected object, and that the parallax between the first image and the second image and the parallax between the third image and the fourth image are calculated by correlation calculation of the edge images.
  • when a multi-gradation image is used, the portion where the pattern between the images best matches and the correlation value (similarity) is largest is broad, so the accurate parallax cannot be calculated, and the distance (height) of the detected object cannot be estimated with high accuracy.
  • when the edge images obtained by extracting the contour line of the detected object are used for the calculation of the parallax, the portion where the pattern between the images best matches and the correlation value (similarity) is largest can be obtained relatively sharply, so an area sensor that can calculate accurate parallax and can estimate the distance (height) of the detected object with high accuracy can be realized.
  • it is preferable that the first image, the second image, the third image, and the fourth image are edge images obtained by extracting the contour line of the detected object, and that the parallax between the first image and the second image and the parallax between the third image and the fourth image are calculated by a correlation calculation including a term related to the edge direction.
  • each parallax is calculated by a correlation calculation including a term related to the edge direction, which is set so that the correlation value (similarity) becomes higher when the edge directions match.
  • it is preferable to provide a first reference image generation circuit that generates a first reference image based on the first image, the second image, and the parallax between the first image and the second image; a second reference image generation circuit that generates a second reference image based on the third image, the fourth image, and the parallax between the third image and the fourth image; a first coordinate extraction circuit that extracts coordinates from the first reference image; and a second coordinate extraction circuit that extracts coordinates from the second reference image.
  • in this case, not only the distance (height) of the detected object from the surface on which each light receiving element is formed, but also the coordinates (XY coordinates) on that surface can be extracted from the first reference image or the second reference image.
  • the area sensor can be used as information input means for three-dimensional coordinates (XYZ coordinates).
  • it is preferable that, in a layer above the first light receiving portion, the second light receiving portion, the third light receiving portion, and the fourth light receiving portion, optical elements that limit the incident angle of light to the first light receiving portion, the second light receiving portion, the third light receiving portion, and the fourth light receiving portion within a certain range are provided at positions overlapping, in plan view, the openings of the first light shielding member, the second light shielding member, the third light shielding member, and the fourth light shielding member.
  • since the optical element that restricts the incident angle of light to each light receiving portion within a certain range is provided at a position overlapping each opening in plan view, the blur of the incident light entering each light receiving portion can be reduced, and an area sensor that can estimate the distance (height) of the detected object with high accuracy can be realized.
  • it is preferable that the display device of the present invention includes an active matrix substrate having a surface on which the first light receiving element, the second light receiving element, the third light receiving element, and the fourth light receiving element are formed; a counter substrate disposed to face that surface; and a liquid crystal layer sandwiched between the active matrix substrate and the counter substrate.
  • a liquid crystal display device in which the area sensor is integrated can be realized. Therefore, information can be input using the area sensor while viewing the image of the liquid crystal display device.
  • the present invention can be applied to an area sensor including a light receiving element (light sensor) and a display device including such an area sensor.
  • 1 Liquid crystal display device (display device), 2 Liquid crystal display panel, 3 Backlight, 7 Finger (detected object), 8a Optical sensor (first light receiving element), 8b Optical sensor (second light receiving element), 8c Optical sensor (third light receiving element), 8d Optical sensor (fourth light receiving element), 9i Light receiving portion (first, second, third, and fourth light receiving portions), 14 Light shielding member, 14a Opening, 25, 31 Parallax and height detection circuit, 26, 32 Reference image generation circuit, 27, 33 Coordinate extraction circuit, 34a, 34b, 34c, 34d Lens (optical element)

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Power Engineering (AREA)
  • General Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Condensed Matter Physics & Semiconductors (AREA)
  • Computer Hardware Design (AREA)
  • Microelectronics & Electronic Packaging (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Position Input By Displaying (AREA)

Abstract

In the present invention, for optical sensors (8a, 8b, 8c, and 8d), openings (14a) of light shielding members (14) are formed so as to be offset, with respect to each light receiving portion, in the leftward, rightward, upward, and downward directions.
PCT/JP2011/067895 2010-08-05 2011-08-04 Capteur de zone et dispositif d'affichage WO2012018092A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2010176361 2010-08-05
JP2010-176361 2010-08-05

Publications (1)

Publication Number Publication Date
WO2012018092A1 true WO2012018092A1 (fr) 2012-02-09

Family

ID=45559588

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2011/067895 WO2012018092A1 (fr) 2010-08-05 2011-08-04 Capteur de zone et dispositif d'affichage

Country Status (1)

Country Link
WO (1) WO2012018092A1 (fr)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007219486A (ja) * 2006-01-20 2007-08-30 Denso Corp 表示装置
JP2009009098A (ja) * 2007-05-25 2009-01-15 Seiko Epson Corp 表示装置および検出方法
JP2010061639A (ja) * 2008-08-04 2010-03-18 Sony Corp 生体認証装置



Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 11814717

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 11814717

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP