WO2010047256A1 - Imaging Device, Display Imaging Device, and Electronic Apparatus - Google Patents
Imaging Device, Display Imaging Device, and Electronic Apparatus
- Publication number
- WO2010047256A1 (PCT/JP2009/067784)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- light receiving
- image
- light
- receiving element
- display
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/0412—Digitisers structurally integrated in a display
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/0304—Detection arrangements using opto-electronic means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/0416—Control or interface arrangements specially adapted for digitisers
- G06F3/0418—Control or interface arrangements specially adapted for digitisers for error correction or compensation, e.g. based on parallax, calibration or alignment
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/042—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
Definitions
- the present invention relates to an imaging device and a display imaging device that acquire information such as the position of an object in contact with or close to a panel, and an electronic apparatus equipped with such a display imaging device.
- Although there are various types of touch panels, the widely used type detects capacitance. This type detects the position of an object by sensing the change in the panel's surface charge that occurs when a finger touches the touch panel. Such touch panels therefore allow the user to operate intuitively.
- In Patent Document 1 and Patent Document 2, for example, the present applicant has proposed a display device having a display unit (display imaging panel) with both a display function for displaying images and an imaging function (detection function) for imaging (detecting) an object.
- With the display device described in Patent Document 1, when an object such as a finger touches or approaches the display imaging panel, the position of the object can be detected from a captured image formed by the irradiation light that is emitted from the display imaging panel and reflected by the object. By using this display device, the position of an object can therefore be detected with a simple configuration, without adding a separate component such as a touch panel to the display imaging panel.
- In Patent Document 2, the difference between an image obtained in the light-emitting state (an image obtained using the reflection of the irradiation light) and an image obtained in the light-off state is used to remove the above-described influence of external light and of fixed noise. This is considered to make it possible to detect the position of an object without being affected by external light or fixed noise.
- The present invention has been made in view of such problems, and an object thereof is to provide an imaging device, a display imaging device, and an object detection method capable of stably detecting an object regardless of usage conditions, as well as an electronic device equipped with such a display imaging device.
- An imaging device of the present invention includes an imaging panel having a plurality of first and second light receiving elements and an irradiation light source that emits light including detection light in a predetermined wavelength range for detecting a proximity object, and an image processing unit that performs image processing on a signal obtained by imaging the proximity object with the imaging panel and acquires object information including at least one of the position, shape, and size of the proximity object.
- the first light receiving element includes the wavelength range of the detection light in the light receiving wavelength range.
- In the second light receiving element, the light receiving sensitivity in the wavelength region of the detection light is lower than that of the first light receiving element.
- the image processing unit processes the signals obtained from the first and second light receiving elements to acquire object information.
- A display imaging device of the present invention includes a display imaging panel having a plurality of display elements and a plurality of first and second light receiving elements, which emits light including detection light in a predetermined wavelength range for detecting a proximity object, and an image processing unit that performs image processing on a signal obtained by imaging the proximity object with the display imaging panel and acquires object information including at least one of the position, shape, and size of the proximity object.
- the first light receiving element includes the wavelength range of the detection light in the light receiving wavelength range.
- In the second light receiving element, the light receiving sensitivity in the wavelength region of the detection light is lower than that of the first light receiving element.
- the image processing unit processes the signals obtained from the first and second light receiving elements to acquire object information.
- An electronic device of the present invention includes the above display imaging device, which has an image display function and an imaging function.
- the signal obtained by imaging the proximity object by the imaging panel is subjected to image processing to obtain object information of the proximity object.
- object information is obtained using a composite image based on a captured image obtained by the first light receiving element and a captured image obtained by the second light receiving element.
- Since the first light receiving element includes the wavelength region of the detection light in its light receiving wavelength region, when the proximity object is moving on the imaging panel (or display imaging panel), for example, a false signal may appear in the image captured by the first light receiving element in addition to the detection signal of the proximity object.
- In the second light receiving element, the light receiving sensitivity in the wavelength region of the detection light is lower than that of the first light receiving element. Therefore, in the captured image obtained by the second light receiving element, a false signal may be generated as with the first light receiving element, while generation of the detection signal of the proximity object is suppressed. Accordingly, by acquiring the object information using a composite image based on the captured image obtained by the first light receiving element and the captured image obtained by the second light receiving element, generation of false signals in the composite image is suppressed even when the proximity object is moving on the imaging panel (or display imaging panel).
- In the imaging panel (or display imaging panel), a plurality of first light receiving elements whose light receiving wavelength region includes the detection light for detecting a proximity object, and a plurality of second light receiving elements whose light receiving sensitivity in the wavelength region of the detection light is lower than that of the first light receiving elements, are provided. Since the signals obtained from these elements are processed to acquire the object information of the proximity object, generation of false signals can be suppressed even when, for example, the proximity object is moving on the imaging panel (or display imaging panel). It is therefore possible to detect an object stably regardless of usage conditions.
- FIG. 6 is a timing chart explaining the relationship between the on/off state of the backlight and the display state; a subsequent figure is a flowchart of the difference image fingertip extraction process.
- FIG. 9 is a timing chart for explaining the differential image extraction process shown in FIG. 8;
- A photographic diagram describes the difference image fingertip extraction process; further figures illustrate the process when external light is bright, the process when external light is dark, the dynamic range of the light reception signal in the process, and the process when a plurality of fingertips to be detected are present simultaneously.
- FIG. 21 is a perspective view illustrating the appearance of application example 3, and a further perspective view illustrates the appearance of application example 4.
- For application example 5, (A) is a front view in the open state, (B) a side view in the open state, (C) a front view in the closed state, (D) a left side view, (E) a right side view, (F) a top view, and (G) a bottom view.
- FIG. 37 is a circuit diagram illustrating a configuration example of each pixel in the display imaging device illustrated in FIG. 36; a further figure illustrates the difference image fingertip extraction process in the display imaging device shown in FIGS. 36 and 37.
- FIG. 1 shows the overall configuration of a display imaging device according to an embodiment of the present invention.
- the display and imaging apparatus includes an I / O display panel 20, a backlight 15, a display drive circuit 12, a light reception drive circuit 13, an image processing unit 14, and an application program execution unit 11.
- the I / O display panel 20 is configured of a liquid crystal panel (LCD; Liquid Crystal Display) in which a plurality of pixels are arranged in a matrix over the entire surface.
- The I / O display panel 20 has a function (display function) of displaying images such as predetermined figures and characters based on display data, and a function (imaging function) of imaging an object that contacts or approaches the I / O display panel 20 (a proximity object), as described later.
- The backlight 15 serves as a light source for display and detection of the I / O display panel 20, and is formed, for example, by arranging a plurality of light emitting diodes. As described later, the backlight 15 turns on and off at high speed at predetermined timings synchronized with the operation timing of the I / O display panel 20.
- The display drive circuit 12 is a circuit that drives the I / O display panel 20 (drives a line-sequential display operation) so that an image based on display data is displayed on the I / O display panel 20.
- The light reception drive circuit 13 (image generation unit) is a circuit that drives the I / O display panel 20 (drives a line-sequential imaging operation) so that a light reception signal (imaging signal) is obtained from each pixel of the I / O display panel 20, that is, so that an object is imaged.
- the light receiving drive circuit 13 is configured to generate a composite image to be described later by performing predetermined image processing (image generation processing) on the light receiving signal from each pixel.
- the generated composite image is stored, for example, in frame units in the frame memory 13A, and is output to the image processing unit 14 as a captured image. The details of such image generation processing will be described later.
- The image processing unit 14 performs predetermined image processing (arithmetic processing) based on the captured image (composite image) output from the light reception drive circuit 13, and detects and acquires object information on the proximity object (position coordinate data, data on the object's shape and size, etc.). Details of this detection processing will be described later.
- The application program execution unit 11 executes processing according to predetermined application software based on the detection result of the image processing unit 14. One example of such processing is to include the position coordinates of the detected object in the display data and display them on the I / O display panel 20. The display data generated by the application program execution unit 11 is supplied to the display drive circuit 12.
- The I / O display panel 20 has a display area (sensor area) 21, a display H driver 22, a display V driver 23, a sensor readout H driver 25, and a sensor V driver 24.
- the display area 21 is an area for modulating light from the backlight 15 to emit irradiation light and for imaging an object near this area.
- The irradiation light includes display light and detection light (such as infrared light) for detecting a proximity object, the latter emitted by, for example, an infrared light source (not shown); the same applies hereinafter.
- In the display area 21, liquid crystal elements as light emitting elements (display elements) and light receiving sensors (main sensors 32 and auxiliary sensors 33, described later) as light receiving elements (imaging elements) are arranged in a matrix.
- The display H driver 22, together with the display V driver 23, line-sequentially drives the liquid crystal elements of the respective pixels in the display area 21 based on the display drive signal and control clock supplied from the display drive circuit 12.
- the sensor readout H driver 25 line-sequentially drives the light receiving elements of the respective pixels in the sensor area 21 together with the sensor V driver 24 to acquire a light reception signal.
- The pixel 31 is composed of a display pixel (display unit) 31RGB including a liquid crystal element, and an imaging pixel (light receiving unit).
- the display pixel 31RGB is configured of a display pixel 31R for red (R), a display pixel 31G for green (G), and a display pixel 31B for blue (B).
- As the imaging pixels, two types of light receiving sensors are disposed: a main sensor 32 (first light receiving element) and an auxiliary sensor 33 (second light receiving element).
- Although one light receiving sensor is arranged for each display pixel 31RGB in FIG. 3, one light receiving sensor may instead be arranged for a plurality of display pixels 31RGB.
- It is preferable that the main sensors 32 and the auxiliary sensors 33 be alternately arranged at a ratio of one to one in the display area 21.
- Alternatively, the number of auxiliary sensors 33 may be smaller than the number of main sensors 32. In this case, interpolation processing must be performed on the light reception signals from the auxiliary sensors 33, which complicates the processing, and it must be considered whether missed detections cause a problem for the given signal type and application.
- the display pixel 31RGB is not shown for convenience of description.
- FIG. 6 shows an example of the relationship between the emission wavelength region of the detection light source (FIG. 6A) and the detection wavelength regions of the main sensor 32 and the auxiliary sensor 33 (FIGS. 6B and 6C).
- In the main sensor 32, the light reception wavelength region is the region on the long wavelength side, at or above wavelength λ1. The main sensor 32 therefore includes, as its light reception wavelength region, the wavelength region Δλ23 (wavelength λ2 to wavelength λ3) of the detection light for detecting a proximity object, indicated by reference G1 in FIG. 6A, and functions as a light receiving sensor for detecting the proximity object. Furthermore, in the main sensor 32, the light reception sensitivity in the wavelength region Δλ23 of the detection light is higher than the light reception sensitivity in a predetermined wavelength region different from it (here, the region below wavelength λ2).
- In the auxiliary sensor 33, on the other hand, the light reception wavelength region is the region on the short wavelength side, at or below wavelength λ2. That is, the auxiliary sensor 33 has a light reception sensitivity characteristic different from that of the main sensor 32. Therefore, in the auxiliary sensor 33, the light reception sensitivity in the wavelength region Δλ23 of the detection light is lower than that of the main sensor 32 (here, the light reception sensitivity in the wavelength region Δλ23 of the detection light is 0 (zero)). The auxiliary sensor 33 thereby functions as a light receiving sensor for detecting the false signal described later.
- In the auxiliary sensor 33, the light reception sensitivity in the wavelength region Δλ23 of the detection light is lower than the light reception sensitivity in the predetermined wavelength region (here, the region below wavelength λ2).
- The wavelength region Δλ12 (wavelength λ1 to wavelength λ2) is a light reception wavelength region of both the main sensor 32 and the auxiliary sensor 33.
- Specifically, when infrared light is used as the detection light, the main sensor 32 may include the wavelength region of the infrared light in its light reception wavelength region, and the auxiliary sensor 33 may include, for example, the wavelength region of visible light in its light reception wavelength region.
- the relationship between the wavelength region of the detection light and the light reception wavelength regions of the main sensor 32 and the auxiliary sensor 33 is not limited to this.
- For example, green light may be used as the detection light, and the light reception wavelength region of the auxiliary sensor 33 may be set only to the wavelength region of red light.
- The point is that the auxiliary sensor 33 should not include the wavelength region of the detection light in its light reception wavelength region, but should be able to receive external light of the wavelengths that the main sensor 32 can receive. This is because, as described later, the role of the auxiliary sensor 33 is to detect false signals caused by external light incident on the main sensor 32.
- The relationship between the wavelength region of the detection light and the light reception wavelength regions of the main sensor 32 and the auxiliary sensor 33 can be realized by, for example, combining existing color filters or designing the spectral sensitivity of the light receiving sensors.
- Alternatively, the light reception wavelength region of the main sensor 32 may be the one indicated by reference G22 in FIG. 6B, and the light reception wavelength region of the auxiliary sensor 33 may similarly be the one indicated by reference G32 in FIG. 6C.
- In this case, the wavelength region Δλ12 (wavelength λ1 to wavelength λ2) and the wavelength region Δλ34 (wavelength λ3 to wavelength λ4) are light reception wavelength regions of both the main sensor 32 and the auxiliary sensor 33.
- In this display imaging device, the display drive circuit 12 generates a display drive signal based on the display data supplied from the application program execution unit 11. The I / O display panel 20 is driven line-sequentially by this drive signal, and an image is displayed. At this time, the backlight 15 is also driven by the display drive circuit 12, performing on/off operations in synchronization with the I / O display panel 20.
- Specifically, the backlight 15 is extinguished (turned off) in the first half (1/120 second) of each frame period, and no display is performed.
- In the second half of each frame period, the backlight 15 is lit (turned on), a display signal is supplied to each pixel, and the image for that frame period is displayed.
- Thus, the first half of each frame period is a non-lit period during which no irradiation light is emitted from the I / O display panel 20, while the second half of each frame period is a lit period during which irradiation light is emitted from the I / O display panel 20.
- Meanwhile, the light receiving element of each pixel in the I / O display panel 20 is driven line-sequentially by the light reception drive circuit 13, and the proximity object is imaged.
- the light reception signal from each light reception element is supplied to the light reception drive circuit 13.
- the light reception drive circuit 13 accumulates light reception signals of pixels for one frame, and outputs the light reception signals to the image processing unit 14 as a captured image.
- The image processing unit 14 performs the predetermined image processing (arithmetic processing) described below based on the captured image, and detects and acquires information on the object in proximity to the I / O display panel 20 (position coordinate data, data on the object's shape and size, etc.).
- FIG. 8 is a flowchart of the fingertip extraction process (the difference image fingertip extraction process described later) performed by the image processing unit 14, and FIG. 9 is a timing chart of part of this difference image fingertip extraction process.
- First, the I / O display panel 20 performs imaging of the proximity object, whereby image A (a shadow image) is acquired (step S11 of FIG. 8; the period from timing t1 to t2 in FIG. 9).
- Next, the I / O display panel 20 images the proximity object again, whereby image B (a reflected-light image using the reflection of the irradiation light) is acquired (step S12 of FIG. 8; the period from timing t3 to t4 in FIG. 9).
- The image processing unit 14 then generates a difference image C between image B and image A (the shadow image, captured during the period in which the backlight 15 is off, i.e. the no-light period) (step S13 of FIG. 8; the period from timing t3 to t4 in FIG. 9).
- Next, the image processing unit 14 performs arithmetic processing to determine the center of gravity based on the generated difference image C (step S14), and identifies the center of contact (proximity) (step S15). The detection result for the proximity object is then output from the image processing unit 14 to the application program execution unit 11, and the difference image fingertip extraction process by the image processing unit 14 ends.
- In this way, the fingertip extraction process is performed based on the difference image C between image B, which uses the reflection of the irradiation light, and image A, which uses external light (ambient light) without the irradiation light.
- Thereby, the influence of the brightness of the external light is removed, and the proximity object is detected without being affected by that brightness.
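The difference and center-of-gravity steps above (steps S11 to S15) can be sketched as follows. This is a minimal illustration, not the patent's implementation: images are plain 2-D lists, and the function name, pixel values, and threshold are illustrative assumptions.

```python
# Sketch of the difference image fingertip extraction process:
# image A is captured without irradiation light (shadow image),
# image B with it (reflected-light image); C = B - A removes the
# external-light contribution, and the centroid of above-threshold
# pixels gives the contact (proximity) centre.

def extract_fingertip(image_a, image_b, threshold):
    rows, cols = len(image_a), len(image_a[0])
    # Difference image C (step S13): external light cancels out.
    diff = [[image_b[y][x] - image_a[y][x] for x in range(cols)]
            for y in range(rows)]
    # Centre-of-gravity over above-threshold pixels (steps S14-S15).
    total = sum_x = sum_y = 0.0
    for y in range(rows):
        for x in range(cols):
            w = diff[y][x]
            if w >= threshold:
                total += w
                sum_x += w * x
                sum_y += w * y
    centroid = (sum_x / total, sum_y / total) if total else None
    return diff, centroid

# Toy 3x3 example: only the touched centre pixel reflects backlight light.
A = [[5, 5, 5], [5, 1, 5], [5, 5, 5]]   # shadow image (no irradiation)
B = [[5, 5, 5], [5, 9, 5], [5, 5, 5]]   # reflected-light image
C, center = extract_fingertip(A, B, threshold=4)
# center == (1.0, 1.0), the touched pixel.
```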
- For example, when the incident external light is strong, the light reception output voltage Von1 with the backlight 15 lit behaves as follows: except at the part touched by the finger, it takes the voltage value Va corresponding to the brightness of the external light, while at the part touched by the finger it drops to the voltage value Vb corresponding to the reflectance with which the surface of the touching object (the finger) reflects the light from the backlight 15.
- On the other hand, the light reception output voltage Voff1 with the backlight 15 off likewise takes the value Va corresponding to the brightness of the external light except at the part touched by the finger, while at the touched part the external light is blocked and the voltage takes a very low value Vc.
- When the incident external light is weak (dark), the light reception output voltage Von2 with the backlight 15 lit takes the very low voltage value Vc except at the part touched by the finger, since there is no external light, while at the touched part it rises to the voltage value Vb corresponding to the reflectance with which the surface of the touching object (the finger) reflects the light from the backlight 15.
- The light reception output voltage Voff2 with the backlight 15 off remains at the very low value Vc both at the part touched by the finger and elsewhere.
- Comparing these, at parts not in contact with the display area 21 of the panel, the light reception output voltage differs greatly between the cases with and without external light.
- At the part touched by the finger, however, the voltage value Vb with the backlight 15 on and the voltage value Vc with the backlight 15 off are substantially the same regardless of the presence of external light.
- Therefore, by detecting the difference between the voltages with the backlight 15 lit and extinguished, a location where the difference is at least a certain value, such as the difference between the voltage values Vb and Vc, can be determined to be a location of contact or proximity. As a result, contact or proximity is detected well under uniform conditions, whether the external light entering the panel is strong or there is little external light.
- FIG. 13A shows a contact state of the display area 21 of the panel: the panel surface is touched with the finger f, and a roughly circular object m with a reflectance of almost 100% is placed on the display area 21. In this state, the light reception output voltage along a line scanning both the finger f and the object m is as shown in FIG. 13(B), where the voltage Von3 is the light reception output voltage with the backlight 15 lit, and the voltage Voff3 is the light reception output voltage with the backlight 15 extinguished.
- Voltages higher than the voltage Vd detected when the backlight 15 is lit are at the level Vy, which does not need to be observed, and the range Vx at or below that level is the dynamic range required for detection. Signals at the level Vy, which need not be observed, may therefore be allowed to overflow and be regarded as having the same intensity.
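The saturation described above can be sketched as a simple clamp; the sample values and the top of the range Vx are illustrative assumptions, not values from the patent.

```python
# Sketch: samples above the required dynamic range Vx are allowed to
# overflow, i.e. saturated to the same intensity, since levels in the
# unneeded region Vy carry no information for detection.

def clamp_to_dynamic_range(signal, v_max):
    return [min(v, v_max) for v in signal]

# Illustrative scan line: the ~100%-reflectance object m would otherwise
# dwarf the signal range needed to detect the finger f.
line = [10, 12, 250, 255, 11, 40, 38, 12]
clamped = clamp_to_dynamic_range(line, v_max=64)
# clamped == [10, 12, 64, 64, 11, 40, 38, 12]
```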
- the image processing unit 14 acquires object information of a proximity object by using a composite image based on a captured image obtained by the main sensor 32 and a captured image obtained by the auxiliary sensor 33.
- Similarly, a difference image HC is generated on the auxiliary sensor 33 side. That is, since the light reception sensitivity of the auxiliary sensor 33 in the wavelength region of the detection light is lower than that of the first light receiving element (here, it is 0), the false signal F101 is generated in the difference image HC as with the main sensor 32, while generation of the detection signal of the proximity object is suppressed (here, avoided).
- V(HC) = Von(HB) - Voff(HA)
- the light reception drive circuit 13 generates a predetermined mask image E based on the difference image HC obtained by the auxiliary sensor 33. Further, the light reception drive circuit 13 generates a composite image F of these by taking a logical product of the difference image MC obtained by the main sensor 32 and the generated mask image E. Then, using the composite image F, the image processing unit 14 acquires object information of the proximity object. At this time, the light reception drive circuit 13 can generate a mask image E by performing, for example, a binarization process and an image inversion process on the difference image HC.
- In the binarization processing, it is preferable to regard light reception signals at or above a certain value (threshold) in the difference image HC as false signals, and to convert the result into an image that masks the false-signal portions.
- This is because the detection signal may also appear slightly in the difference image HC on the auxiliary sensor 33 side.
- Minimizing the leakage of the detection signal to the auxiliary sensor 33 improves the performance of the present system.
- Specifically, the light reception wavelength region of the main sensor 32 is limited to the wavelength region Δλ23 of the detection light, and in the auxiliary sensor 33 the sensitivity to the wavelength region Δλ23 of the detection light is designed to be as low as possible.
- Since the role of the auxiliary sensor 33 is to detect false signals caused by external light, performance can be improved by making its sensitivity to external light relatively higher than its sensitivity to the wavelength of the detection light.
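The mask-and-composite steps described above (binarize the difference image HC, invert it into mask image E, then take the logical product with the difference image MC) can be sketched as follows; images are plain 2-D lists, and the threshold and pixel values are illustrative assumptions.

```python
# Sketch of composite image generation: the auxiliary-sensor difference
# image HC is binarized (>= threshold -> false signal) and inverted to
# form mask image E, which is then ANDed with the main-sensor
# difference image MC to suppress false signals.

def make_mask(hc, threshold):
    # False-signal pixels become 0 (masked), all others 1.
    return [[0 if v >= threshold else 1 for v in row] for row in hc]

def composite(mc, mask):
    # Logical product of MC and mask image E.
    return [[v * m for v, m in zip(row_mc, row_mask)]
            for row_mc, row_mask in zip(mc, mask)]

# MC holds the real detection signal (9) plus a false signal (7) caused
# by the moving object; HC holds only the false signal (6).
MC = [[0, 9, 0], [0, 0, 7]]
HC = [[0, 0, 0], [0, 0, 6]]
E = make_mask(HC, threshold=4)
F = composite(MC, E)
# F == [[0, 9, 0], [0, 0, 0]]: the false signal is removed.
```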
- As described above, the object information is obtained using the composite image F based on the difference image MC obtained by the main sensors 32 and the difference image HC obtained by the auxiliary sensors 33, so that generation of false signals in the composite image F can be suppressed (or avoided).
- In the present embodiment, the display area 21 of the I / O display panel 20 is provided with a plurality of main sensors 32 whose light reception wavelength region includes the wavelength region Δλ23 of the detection light for detecting a proximity object, and a plurality of auxiliary sensors 33 whose light reception sensitivity in the wavelength region of the detection light is lower than that of the main sensors 32. The object information of the proximity object is then acquired using the composite image F based on the difference image MC obtained by the main sensors 32 and the difference image HC obtained by the auxiliary sensors 33.
- FIG. 22 and FIG. 23 show a differential image fingertip extraction process according to the first modification.
- FIG. 24 shows a differential image fingertip extraction process according to the second modification.
- In this modification, the image processing unit 14 performs the generation of the difference image MC and of the difference image HC by sequential processing in units of individual main sensors 32 and auxiliary sensors 33.
- the main sensors 32 and the auxiliary sensors 33 are alternately arranged at a ratio of one to one on the display area 21.
- the difference image MC and the difference image HC are respectively constituted by a plurality of difference pixel values m0, h1, m2, h3, m4, h5,.
- When the light-reception drive circuit 13 performs the above-described sequential processing, if the difference pixel value obtained by the auxiliary sensor 33 adjacent to a main sensor 32 is equal to or greater than the predetermined threshold value Vth(H), the difference pixel value of that main sensor 32 is regarded as 0 (zero) and output.
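One possible sketch of this per-sensor sequential processing, assuming the interleaved stream m0, h1, m2, h3, ... and a single threshold Vth(H) as described above (the function name and the sample values are illustrative, not taken from the patent):

```python
def sequential_fingertip_extraction(pixels, vth):
    """Sequential (per-sensor) processing in the style of Modification 2.

    `pixels` is the interleaved stream of difference pixel values
    m0, h1, m2, h3, ... from alternately arranged main (even index)
    and auxiliary (odd index) sensors. When the auxiliary value
    adjacent to a main sensor is >= the threshold vth, that main
    sensor's value is output as 0 (treated as a false signal caused
    by external light); otherwise the actual difference result is
    output. Only the main-sensor outputs are returned.
    """
    out = []
    for i in range(0, len(pixels), 2):           # main sensors at even indices
        m = pixels[i]
        h = pixels[i + 1] if i + 1 < len(pixels) else 0.0  # adjacent auxiliary
        out.append(0.0 if h >= vth else m)
    return out

# m0, h1, m2, h3, m4, h5: the auxiliary sensor h3 sees strong external light,
# so the adjacent main value m2 is suppressed.
stream = [0.7, 0.0, 0.8, 0.6, 0.2, 0.1]
print(sequential_fingertip_extraction(stream, vth=0.5))
```

Because only the current pair of values is needed at each step, this style of processing requires no frame memory beyond a couple of sensor values, matching the low-delay property described in the text.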
- FIG. 25 shows a difference-image fingertip extraction process according to Modification 3.
- In this modification, in order to obtain more precise results, the processing of Modification 2 is combined with interpolation processing on both the auxiliary sensor 33 side and the main sensor 32 side.
- Specifically, when performing the above-described sequential processing, the light-reception drive circuit 13 generates by interpolation the difference pixel value of a main sensor 32 at the position corresponding to each auxiliary sensor 33, and likewise generates by interpolation the difference pixel value of an auxiliary sensor 33 at the position corresponding to each main sensor 32. The sequential processing according to the comparison result with the threshold value is then performed taking these interpolated difference pixel values into consideration as well.
- In this way, the auxiliary sensors and the main sensors can be accurately associated with each other, and nearly ideal processing results can be obtained. As in Modification 2, no special frame memory is required; it suffices to hold the processing results for one or two sensors. Moreover, because processing proceeds sensor by sensor, the processing delay is very small, amounting to only a few sensors.
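Under one plausible reading of this interpolation scheme, in the alternating 1:1 layout each missing value is taken as the average of the two immediate neighbours of the other sensor type (one-sided at the edges of a line). The sketch below is illustrative only; the patent does not specify the interpolation formula, and the function names and sample values are assumptions.

```python
def interp_neighbours(pixels, i):
    """Average of the two immediate neighbours in the interleaved stream
    (in the alternating 1:1 layout these belong to the other sensor
    type); one-sided at the edges of the line."""
    left = pixels[i - 1] if i - 1 >= 0 else None
    right = pixels[i + 1] if i + 1 < len(pixels) else None
    if left is not None and right is not None:
        return (left + right) / 2.0
    return left if left is not None else right

def sequential_extraction_with_interpolation(pixels, vth):
    """Modification-3-style sequential processing: at each main-sensor
    position an auxiliary value is interpolated, and at each auxiliary
    position a main value is interpolated, so the threshold comparison
    can be made for every pixel of the output line."""
    out = []
    for i, v in enumerate(pixels):
        if i % 2 == 0:   # main-sensor position: interpolate the auxiliary value
            m, h = v, interp_neighbours(pixels, i)
        else:            # auxiliary position: interpolate the main value
            m, h = interp_neighbours(pixels, i), v
        out.append(0.0 if h >= vth else m)  # suppress pixels flagged as false
    return out

stream = [0.7, 0.0, 0.8, 0.6, 0.2, 0.1]   # m0, h1, m2, h3, m4, h5
print(sequential_extraction_with_interpolation(stream, vth=0.5))
```

Compared with the Modification 2 sketch, every output pixel now has both a main (M) and an auxiliary (H) value available for the threshold comparison, which is how the interpolation tightens the association between the two sensor types.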
- In Modification 3 as well, the main sensors 32 and the auxiliary sensors 33 are preferably arranged alternately at a ratio of 1:1 in the display area 21, as shown, for example, in FIGS. 4A and 4B.
- FIG. 26A shows an example in which the surface of the I/O display panel 20 is touched with a fingertip 61, and the locus of the touched portion is displayed on the screen as a drawing line 611.
- The example shown in FIG. 26B uses gesture recognition based on hand shape. Specifically, the shape of a hand 62 touching (or in proximity to) the I/O display panel 20 is recognized, the recognized hand shape is displayed as an image, and some processing is performed by moving the displayed object 621.
- In another example, a hand 63A in the closed state is changed to a hand 63B in the open state, the I/O display panel 20 recognizes the touch or proximity of the hand in each state, and processing based on this image recognition is performed.
- For example, an instruction such as zooming in can be given in this way.
- Further, by connecting the I/O display panel 20 to a personal computer device, operations such as switching commands on the computer can be input in a more natural form through these image recognitions.
- In addition, a plurality of I/O display panels 20 may be prepared and connected by some transmission means. An image in which contact or proximity is detected may then be transmitted to and displayed on the other party's I/O display panel 20, so that users operating the two display panels can communicate with each other. That is, with two I/O display panels 20 prepared, processing becomes possible such as sending the hand shape 642 recognized by one panel to be displayed on the other panel, or sending the locus 641 displayed by touching one panel with a hand 64 to the other party's panel for display.
- The state of drawing can be transmitted as a moving image, and sending handwritten characters and figures to the other party offers the possibility of a new communication tool.
- As one such application, the I/O display panel 20 may be applied to the display panel of a mobile phone terminal.
- In another example, characters are written on the surface of the I/O display panel 20 using a writing brush 66, and the portions touched by the brush 66 are displayed on the I/O display panel 20 as an image 661, thereby enabling handwriting input with a brush. In this case, even the fine touches of the brush can be recognized and reproduced.
- In conventional handwriting recognition, the tilt of a special pen has been detected by, for example, electric field detection; in this example, by contrast, information can be input in a more realistic and intuitive way because the contact surface of a real brush itself is detected.
- the display-and-image-pickup device according to the above-described embodiment and the like can be applied to electronic devices in various fields such as television devices, digital cameras, notebook personal computers, portable terminal devices such as mobile phones, and video cameras.
- the display-and-image-pickup device according to the above-described embodiment and the like can be applied to electronic devices in any field that displays an externally input video signal or an internally generated video signal as an image or video.
- Besides the electronic devices shown below, applications such as a surveillance camera are also conceivable, taking advantage of the feature of the present invention that only the reflection component of the detection light is extracted.
- FIG. 30 illustrates the appearance of a television set to which the display-and-image-pickup device of the above-described embodiment and the like is applied.
- The television set includes, for example, a video display screen unit 510 including a front panel 511 and a filter glass 512.
- The video display screen unit 510 is constituted by the display-and-image-pickup device according to the above-described embodiment and the like.
- FIG. 31 illustrates an appearance of a digital camera to which the display-and-image-pickup device of the above-described embodiment and the like is applied.
- The digital camera has, for example, a light emitting unit 521 for flash, a display unit 522, a menu switch 523, and a shutter button 524.
- The display unit 522 is constituted by the display-and-image-pickup device according to the above-described embodiment and the like.
- FIG. 32 illustrates the appearance of a notebook personal computer to which the display-and-image-pickup device of the above-described embodiment and the like is applied.
- The notebook personal computer includes, for example, a main body 531, a keyboard 532 for input operations such as characters, and a display unit 533 for displaying an image.
- The display unit 533 is constituted by the display-and-image-pickup device according to the above-described embodiment and the like.
- FIG. 33 illustrates the appearance of a video camera to which the display-and-image-pickup device of the above-described embodiment and the like is applied.
- The video camera has, for example, a main body 541, a lens 542 for photographing a subject provided on the front side of the main body 541, a start/stop switch 543 for photographing, and a display unit 544.
- the display unit 544 is configured by the display and imaging device according to the above-described embodiment and the like.
- FIG. 34 illustrates an appearance of a mobile phone to which the display-and-image-pickup device of the above-described embodiment and the like is applied.
- This mobile phone is, for example, one in which an upper housing 710 and a lower housing 720 are connected by a connecting portion (hinge portion) 730, and it has a display 740, a sub-display 750, a picture light 760, and a camera 770.
- the display 740 or the sub display 750 is configured by the display and imaging device according to the above-described embodiment and the like.
- In the above embodiment, the case has been described where the wavelength region Δλ12 (the wavelength range from wavelength λ1 to wavelength λ2) and the wavelength region Δλ34 (the wavelength range from wavelength λ3 to wavelength λ4) are light-receiving wavelength regions of both the main sensor 32 and the auxiliary sensor 33.
- However, the application of the invention is not limited to this. For example, as shown in FIG. 35(B), the light-receiving wavelength region of the main sensor 32 and the light-receiving wavelength region of the auxiliary sensor 33 may be separated from each other, and this can be said to be preferable.
- In FIG. 35(B), the light-receiving sensitivity of the main sensor 32 is 0 (zero) in wavelength regions other than its light-receiving wavelength region, indicated by the solid arrow, and likewise the light-receiving sensitivity of the auxiliary sensor 33 is 0 (zero) outside its own light-receiving wavelength region.
- That is, in FIG. 35(B), the light-receiving sensitivity of the auxiliary sensor 33 is 0 (zero) in the light-receiving wavelength region of the main sensor 32, the light-receiving sensitivity of the main sensor 32 is 0 (zero) in the light-receiving wavelength region of the auxiliary sensor 33, and as a result the light-receiving wavelength regions of these two sensors are separated (different) from each other as described above.
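The effect of separating the two light-receiving wavelength regions can be illustrated with a toy spectral model (three coarse wavelength bins and made-up sensitivity values, purely for illustration): a sensor's signal is the sensitivity-weighted sum of the incident power per bin, so an auxiliary sensor with zero sensitivity in the detection band reports no detection-light signal at all, while still responding to external light.

```python
def sensor_response(sensitivity, spectrum):
    """Signal of a sensor: sum over wavelength bins of
    (sensitivity in the bin) x (incident power in the bin)."""
    return sum(s * p for s, p in zip(sensitivity, spectrum))

# Three coarse wavelength bins: [visible-1, visible-2, detection band].
main_sens = [0.0, 0.0, 1.0]    # main sensor: sensitive only in the detection band
aux_sens  = [0.8, 0.8, 0.0]    # auxiliary: zero sensitivity in the detection band
detection_light = [0.0, 0.0, 1.0]
external_light  = [0.5, 0.5, 0.0]

# The main sensor sees the detection light; the auxiliary sensor does not,
# so any signal the auxiliary sensor reports must come from external light.
m_det = sensor_response(main_sens, detection_light)
a_det = sensor_response(aux_sens, detection_light)
a_ext = sensor_response(aux_sens, external_light)
print(m_det, a_det, a_ext)
```

With fully separated wavelength regions, the auxiliary sensor's output is a pure external-light (false-signal) channel, which is exactly the property the masking scheme relies on.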
- In the above embodiment, the backlight for display may double as the source of the detection light, or an irradiation light source dedicated to detection may be provided.
- When an irradiation light source dedicated to detection is provided, it is more preferable to use light in a wavelength range other than the visible range (for example, infrared light).
- In the above embodiment, the case has been described where, in the I/O display panel 20, the display elements are liquid crystal elements and the light receiving elements are provided separately; however, the application of the present invention is not limited to this. Specifically, as in the display-and-image-pickup device according to another modification shown in FIGS. 36 to 38, the light emitting operation and the light receiving operation of an organic EL (electroluminescence) element or the like may be performed in a time-division manner.
- In that case, the I/O display panel (I/O display panel 60) may be constituted by such light-emitting-and-receiving elements (display-and-image-pickup elements).
- FIG. 36 is a block diagram showing a configuration example of a display-and-image-pickup apparatus according to the present modification.
- the display and imaging apparatus includes an I / O display panel 60, a display drive circuit 82, a light receiving drive circuit 83 having a frame memory 83A, an image processing unit 84, and an application program execution unit 81.
- Of these, the display drive circuit 82, the frame memory 83A, the light-receiving drive circuit 83, the image processing unit 84, and the application program execution unit 81 perform the same operations as the display drive circuit 12, the frame memory 13A, the light-receiving drive circuit 13, the image processing unit 14, and the application program execution unit 11 described in the above embodiment, so their description is omitted.
- The I/O display panel 60 is configured as an organic EL display using organic EL elements, and a plurality of pixels (display-and-image-pickup elements) are formed in a matrix in its display area (sensor area).
- Pixels each including an organic EL element that also functions as a light-receiving sensor are arranged, for example, in a matrix.
- The signal charge accumulated in accordance with the amount of light received during the light-receiving period is read out by light-receiving driving performed by the light-receiving drive circuit 83.
- FIG. 37 shows a circuit configuration example (a configuration example of a pixel circuit) of each pixel in the display and imaging device shown in FIG.
- This pixel circuit comprises an organic EL element 91, a parasitic capacitance 91A of the organic EL element 91, switches SW1 to SW3, a display data signal line 92, a readout line selection line 93, and a reception data signal line 94.
- When the switch SW1 is set to the on state during the display period (light emission period), display data for image display is supplied from the display data signal line 92 to the organic EL element 91 via the switch SW1.
- the organic EL element 91 performs a light emitting operation.
- FIG. 38 explains the difference-image fingertip extraction process in the display-and-image-pickup device shown in FIGS. 36 and 37. Specifically, it shows an example in which detection processing of an object (finger f) in contact with or in proximity to the I/O display panel 60 using the above-described organic EL elements is performed while an image or the like is being displayed.
- In this example, the light emitting area is constituted by a plurality of specific horizontal lines within one screen. By moving such a light emitting area in the scanning direction indicated by the arrow in the figure within, for example, one field period, the image appears to be displayed over the whole screen owing to the afterimage effect.
- Meanwhile, the read operation is performed sequentially, in conjunction with the movement of the light emitting area, by read lines positioned within the light emitting area and by read lines positioned at a certain vertical distance from it.
- Read lines within the light emitting area can detect the reflected light of the light from the light emitting area, so, as shown in FIG. 38, the read data from these lines constitute the read image in the self-emission ON state (image B4, corresponding to the reflected-light-utilization image B).
- On the other hand, read lines positioned at a certain vertical distance from the light emitting area are not affected by the light emission from that area, so, as shown in FIG. 38, the read data from these lines constitute the read image in the self-emission OFF state (a shadow image based on external light).
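The subtraction that this two-line read scheme enables can be sketched as follows; the pixel values are illustrative, assuming a uniform external-light level and a finger reflection visible only in the emission-ON read.

```python
def difference_image(on_read, off_read):
    """Per-pixel difference between the read taken while the line is in
    the emitting area (reflection + external light) and the read taken
    away from it (external light only). The external-light contribution
    cancels, leaving only the reflection component of the detection
    light."""
    return [[on - off for on, off in zip(r_on, r_off)]
            for r_on, r_off in zip(on_read, off_read)]

# Illustrative values: external light 0.4 everywhere; the finger adds a
# reflection of 0.5 at the centre pixel in the emission-ON read only.
on_read  = [[0.4, 0.9, 0.4]]   # emission ON: reflection where the finger is
off_read = [[0.4, 0.4, 0.4]]   # emission OFF: external light only
print(difference_image(on_read, off_read))  # only the reflection remains
```

This is the same difference principle (reflected-light image minus shadow image) used throughout the document, applied here per read line of the organic EL panel.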
- While the above embodiment and the like have described a display-and-image-pickup device having a display-and-image-pickup panel (I/O display panel 20) including a plurality of display elements and a plurality of image pickup elements, the application of the present invention is not limited to this. Specifically, the present invention can also be applied to, for example, an imaging apparatus having an imaging panel that includes a plurality of image pickup elements without any display elements.
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Position Input By Displaying (AREA)
- Studio Devices (AREA)
- Control Of Indicators Other Than Cathode Ray Tubes (AREA)
- Liquid Crystal (AREA)
- Liquid Crystal Display Device Control (AREA)
- Image Input (AREA)
Abstract
Description
FIG. 1 shows the overall configuration of a display-and-image-pickup device according to an embodiment of the present invention. This display-and-image-pickup device includes an I/O display panel 20, a backlight 15, a display drive circuit 12, a light-receiving drive circuit 13, an image processing unit 14, and an application program execution unit 11.
Next, a detailed configuration example of the I/O display panel 20 will be described with reference to FIG. 2. The I/O display panel 20 has a display area (sensor area) 21, a display H driver 22, a display V driver 23, a sensor-readout H driver 25, and a sensor V driver 24.
Next, a detailed configuration example of each pixel in the display area 21 will be described with reference to FIGS. 3 to 5.
Next, a configuration example of the wavelength regions of the light source and the light-receiving sensors will be described with reference to FIG. 6. FIG. 6 shows an example of the relationship between the emission wavelength region of the detection light source (FIG. 6(A)) and the detection wavelength regions of the main sensor 32 and the auxiliary sensor 33 (FIGS. 6(B) and 6(C)).
First, the basic operation of this display-and-image-pickup device, that is, the image display operation and the object imaging operation, will be described.
Next, the basic operation of the process of extracting an object in proximity to the I/O display panel 20 (for example, a fingertip; the fingertip extraction process) performed by the image processing unit 14 will be described with reference to FIGS. 8 to 14. FIG. 8 is a flowchart of the fingertip extraction process (the difference-image fingertip extraction process described later) performed by the image processing unit 14, and FIG. 9 is a timing chart of a part of this difference-image fingertip extraction process.
Next, with reference to FIGS. 15 to 21, the difference-image fingertip extraction process in cases such as when the proximity object is moving, which is one of the characteristic features of the present invention, will be described in comparison with a comparative example.
FIGS. 22 and 23 show the difference-image fingertip extraction process according to Modification 1.
FIG. 24 shows the difference-image fingertip extraction process according to Modification 2. In this modification, the image processing unit 14 performs the generation of the difference image MC and the generation of the difference image HC by sequential processing in units of the main sensors 32 and the auxiliary sensors 33.
FIG. 25 shows the difference-image fingertip extraction process according to Modification 3. In this modification, in order to obtain more precise results in practice, the processing of Modification 2 is performed in combination with interpolation processing on both the auxiliary sensor 33 side and the main sensor 32 side.
Next, with reference to FIGS. 26 to 29, several examples of application program execution by the application program execution unit 11 will be described that use the position information and the like of an object detected by the fingertip extraction process described so far.
Next, with reference to FIGS. 30 to 34, application examples of the display-and-image-pickup device described in the above embodiment and modifications will be described. The display-and-image-pickup device of the above embodiment and the like can be applied to electronic devices in every field, such as television sets, digital cameras, notebook personal computers, portable terminal devices such as mobile phones, and video cameras. In other words, it can be applied to electronic devices in every field that display an externally input or internally generated video signal as an image or video. Besides the electronic devices shown below, applications such as a surveillance camera are also conceivable, taking advantage of the feature of the present invention that only the reflection component of the detection light is extracted.
FIG. 30 shows the appearance of a television set to which the display-and-image-pickup device of the above embodiment and the like is applied. This television set has, for example, a video display screen unit 510 including a front panel 511 and a filter glass 512, and the video display screen unit 510 is constituted by the display-and-image-pickup device according to the above embodiment and the like.
FIG. 31 shows the appearance of a digital camera to which the display-and-image-pickup device of the above embodiment and the like is applied. This digital camera has, for example, a light emitting unit 521 for flash, a display unit 522, a menu switch 523, and a shutter button 524, and the display unit 522 is constituted by the display-and-image-pickup device according to the above embodiment and the like.
FIG. 32 shows the appearance of a notebook personal computer to which the display-and-image-pickup device of the above embodiment and the like is applied. This notebook personal computer has, for example, a main body 531, a keyboard 532 for input operations such as characters, and a display unit 533 for displaying an image, and the display unit 533 is constituted by the display-and-image-pickup device according to the above embodiment and the like.
FIG. 33 shows the appearance of a video camera to which the display-and-image-pickup device of the above embodiment and the like is applied. This video camera has, for example, a main body 541, a lens 542 for photographing a subject provided on the front side of the main body 541, a start/stop switch 543 for photographing, and a display unit 544, and the display unit 544 is constituted by the display-and-image-pickup device according to the above embodiment and the like.
FIG. 34 shows the appearance of a mobile phone to which the display-and-image-pickup device of the above embodiment and the like is applied. This mobile phone is, for example, one in which an upper housing 710 and a lower housing 720 are connected by a connecting portion (hinge portion) 730, and it has a display 740, a sub-display 750, a picture light 760, and a camera 770. The display 740 or the sub-display 750 is constituted by the display-and-image-pickup device according to the above embodiment and the like.
Claims (20)
- 1. An image pickup device comprising: an imaging panel including a plurality of first and second light receiving elements, and an irradiation light source that emits light including detection light in a predetermined wavelength region for detecting a proximity object; and an image processing unit that performs image processing on a signal obtained by imaging the proximity object with the imaging panel and obtains object information including at least one of the position, shape, and size of the proximity object, wherein the first light receiving elements include the wavelength region of the detection light in their light-receiving wavelength region, the second light receiving elements have lower light-receiving sensitivity in the wavelength region of the detection light than the first light receiving elements, and the image processing unit obtains the object information by processing the signals obtained from the first and second light receiving elements.
- 2. The image pickup device according to claim 1, wherein the second light receiving elements have a light-receiving sensitivity characteristic with respect to received wavelength that differs from that of the first light receiving elements.
- 3. The image pickup device according to claim 2, wherein the first light receiving elements have higher light-receiving sensitivity in the wavelength region of the detection light than in a predetermined wavelength region different therefrom, and the second light receiving elements have lower light-receiving sensitivity in the wavelength region of the detection light than in the predetermined wavelength region.
- 4. The image pickup device according to claim 1, wherein the image processing unit obtains the object information using a composite image based on a captured image obtained by the first light receiving elements and a captured image obtained by the second light receiving elements.
- 5. The image pickup device according to claim 4, further comprising an image generation unit that generates, for each of the first and second light receiving elements, a difference image between a reflected-light-utilization image obtained by imaging the proximity object with the imaging panel using reflected detection light and a shadow image obtained by imaging a shadow of the proximity object with the imaging panel, wherein the image processing unit obtains the object information using a composite image based on a first difference image corresponding to the difference between the reflected-light-utilization image and the shadow image obtained by the first light receiving elements, and a second difference image corresponding to the difference between the reflected-light-utilization image and the shadow image obtained by the second light receiving elements.
- 6. The image pickup device according to claim 5, wherein the image generation unit generates a predetermined mask image based on the second difference image, and the image processing unit obtains the object information using a composite image of the first difference image and the mask image.
- 7. The image pickup device according to claim 5, wherein the image generation unit generates the mask image by performing binarization processing and image inversion processing on the second difference image.
- 8. The image pickup device according to claim 5, wherein the image processing unit obtains the object information using a difference image between the first difference image and the second difference image.
- 9. The image pickup device according to claim 5, wherein the image generation unit performs the generation of the first difference image and the generation of the second difference image by sequential processing in units of the first and second light receiving elements.
- 10. The image pickup device according to claim 9, wherein, in the imaging panel, the first light receiving elements and the second light receiving elements are alternately arranged at a ratio of one to one.
- 11. The image pickup device according to claim 10, wherein the first difference image is composed of a plurality of first difference pixel values and the second difference image is composed of a plurality of second difference pixel values, and, when performing the sequential processing, the image generation unit outputs the first difference pixel value of a first light receiving element as 0 (zero) when the second difference pixel value obtained by the second light receiving element adjacent to that first light receiving element is equal to or greater than a predetermined threshold value, and outputs the actual difference calculation result of that first light receiving element as the first difference pixel value when the second difference pixel value obtained by the adjacent second light receiving element is less than the threshold value.
- 12. The image pickup device according to claim 11, wherein, when performing the sequential processing, the image generation unit generates by interpolation the first difference pixel value at a position corresponding to each second light receiving element as a first interpolated difference pixel value, generates by interpolation the second difference pixel value at a position corresponding to each first light receiving element as a second interpolated difference pixel value, and performs the sequential processing according to the comparison results with the threshold value, also taking the first and second interpolated difference pixel values into consideration.
- 13. The image pickup device according to claim 1, wherein the image processing unit obtains the object information based on a difference image between a reflected-light-utilization image obtained by imaging the proximity object with the plurality of first light receiving elements using reflected detection light and a reflected-light-utilization image obtained by imaging the proximity object with the plurality of second light receiving elements using reflected detection light.
- 14. The image pickup device according to claim 1, wherein the light-receiving sensitivity of the second light receiving elements in the wavelength region of the detection light is 0 (zero).
- 15. The image pickup device according to claim 14, wherein the light-receiving wavelength region of the first light receiving elements and the light-receiving wavelength region of the second light receiving elements are separated from each other.
- 16. The image pickup device according to claim 1, wherein the detection light is infrared light, the first light receiving elements include the wavelength region of the infrared light in their light-receiving wavelength region, and the second light receiving elements include the wavelength region of visible light in their light-receiving wavelength region.
- 17. A display-and-image-pickup device comprising: a display-and-imaging panel that has a plurality of display elements and a plurality of first and second light receiving elements and emits light including detection light in a predetermined wavelength region for detecting a proximity object; and an image processing unit that performs image processing on a signal obtained by imaging the proximity object with the display-and-imaging panel and obtains object information including at least one of the position, shape, and size of the proximity object, wherein the first light receiving elements include the wavelength region of the detection light in their light-receiving wavelength region, the second light receiving elements have lower light-receiving sensitivity in the wavelength region of the detection light than the first light receiving elements, and the image processing unit obtains the object information by processing the signals obtained from the first and second light receiving elements.
- 18. The display-and-image-pickup device according to claim 17, wherein the display-and-imaging panel has an irradiation light source that emits light including the detection light.
- 19. The display-and-image-pickup device according to claim 17, wherein the display elements emit light including the detection light.
- 20. An electronic device comprising a display-and-image-pickup device having an image display function and an imaging function, the display-and-image-pickup device including: a display-and-imaging panel that has a plurality of display elements and a plurality of first and second light receiving elements and emits light including detection light in a predetermined wavelength region for detecting a proximity object; and an image processing unit that performs image processing on a signal obtained by imaging the proximity object with the display-and-imaging panel and obtains object information including at least one of the position, shape, and size of the proximity object, wherein the first light receiving elements include the wavelength region of the detection light in their light-receiving wavelength region, the second light receiving elements have lower light-receiving sensitivity in the wavelength region of the detection light than the first light receiving elements, and the image processing unit obtains the object information by processing the signals obtained from the first and second light receiving elements.
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US12/809,810 US8514201B2 (en) | 2008-10-21 | 2009-10-14 | Image pickup device, display-and-image pickup device, and electronic device |
CN2009801014386A CN101903853B (zh) | 2008-10-21 | 2009-10-14 | 图像拾取装置、显示及图像拾取装置和电子装置 |
BRPI0906069-3A BRPI0906069A2 (pt) | 2008-10-21 | 2009-10-14 | Dispositivo de captação de imagem, de exibição e captação de imagem e eletrônico. |
JP2010534775A JP5300859B2 (ja) | 2008-10-21 | 2009-10-14 | 撮像装置、表示撮像装置および電子機器 |
EP09821953.8A EP2343633A4 (en) | 2008-10-21 | 2009-10-14 | IMAGE CAPTURE DEVICE, IMAGE DISPLAY AND CAPTURE DEVICE, AND ELECTRONIC APPARATUS |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2008-271349 | 2008-10-21 | ||
JP2008271349 | 2008-10-21 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2010047256A1 true WO2010047256A1 (ja) | 2010-04-29 |
Family
ID=42119297
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2009/067784 WO2010047256A1 (ja) | 2008-10-21 | 2009-10-14 | 撮像装置、表示撮像装置および電子機器 |
Country Status (9)
Country | Link |
---|---|
US (1) | US8514201B2 (ja) |
EP (1) | EP2343633A4 (ja) |
JP (1) | JP5300859B2 (ja) |
KR (1) | KR20110074488A (ja) |
CN (1) | CN101903853B (ja) |
BR (1) | BRPI0906069A2 (ja) |
RU (1) | RU2456659C2 (ja) |
TW (1) | TW201027408A (ja) |
WO (1) | WO2010047256A1 (ja) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2012008541A (ja) * | 2010-05-28 | 2012-01-12 | Semiconductor Energy Lab Co Ltd | 光検出装置 |
JP2012044143A (ja) * | 2010-08-20 | 2012-03-01 | Samsung Electronics Co Ltd | センサアレイ基板、これを含む表示装置およびこれの製造方法 |
WO2022085406A1 (ja) * | 2020-10-19 | 2022-04-28 | ソニーセミコンダクタソリューションズ株式会社 | 撮像素子、及び電子機器 |
Families Citing this family (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5093523B2 (ja) * | 2007-02-23 | 2012-12-12 | ソニー株式会社 | 撮像装置、表示撮像装置および撮像処理装置 |
JP2012053840A (ja) * | 2010-09-03 | 2012-03-15 | Toshiba Tec Corp | 光学式タッチデバイス |
GB201110159D0 (en) * | 2011-06-16 | 2011-07-27 | Light Blue Optics Ltd | Touch sensitive display devices |
JP5128699B1 (ja) * | 2011-09-27 | 2013-01-23 | シャープ株式会社 | 配線検査方法および配線検査装置 |
US10019112B2 (en) * | 2011-10-25 | 2018-07-10 | Semiconductor Components Industries, Llc | Touch panels with dynamic zooming and low profile bezels |
RU2548923C2 (ru) * | 2013-02-07 | 2015-04-20 | Открытое акционерное общество "АНГСТРЕМ" | Устройство для измерения координат |
WO2019100329A1 (zh) * | 2017-11-24 | 2019-05-31 | 深圳市汇顶科技股份有限公司 | 背景去除方法、影像模块及光学指纹辨识系统 |
EP3585047B1 (de) * | 2018-06-20 | 2021-03-24 | ZKW Group GmbH | Verfahren und vorrichtung zur erstellung von hochkontrastbildern |
JP2020136361A (ja) * | 2019-02-14 | 2020-08-31 | ファスフォードテクノロジ株式会社 | 実装装置および半導体装置の製造方法 |
WO2020196196A1 (ja) * | 2019-03-26 | 2020-10-01 | ソニー株式会社 | 画像処理装置、画像処理方法及び画像処理プログラム |
CN112001879B (zh) * | 2019-06-18 | 2023-07-14 | 杭州美盛红外光电技术有限公司 | 气体检测装置和气体检测方法 |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2004127272A (ja) | 2002-09-10 | 2004-04-22 | Sony Corp | 情報処理装置および方法、記録媒体、並びにプログラム |
JP2006018219A (ja) * | 2004-05-31 | 2006-01-19 | Toshiba Matsushita Display Technology Co Ltd | 画像取込機能付き表示装置 |
JP2006276223A (ja) | 2005-03-28 | 2006-10-12 | Sony Corp | 表示装置及び表示方法 |
Family Cites Families (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3321053B2 (ja) * | 1996-10-18 | 2002-09-03 | 株式会社東芝 | 情報入力装置及び情報入力方法及び補正データ生成装置 |
JP3967083B2 (ja) * | 2000-03-22 | 2007-08-29 | 株式会社東芝 | 半導体受光素子 |
JP2002176192A (ja) * | 2000-09-12 | 2002-06-21 | Rohm Co Ltd | 照度センサチップ、照度センサ、照度測定装置、および照度測定方法 |
JP2003337657A (ja) * | 2002-03-13 | 2003-11-28 | Toto Ltd | 光学式タッチパネル装置 |
US7265740B2 (en) * | 2002-08-30 | 2007-09-04 | Toshiba Matsushita Display Technology Co., Ltd. | Suppression of leakage current in image acquisition |
JP3715616B2 (ja) * | 2002-11-20 | 2005-11-09 | Necディスプレイソリューションズ株式会社 | 液晶表示装置及び該装置のコモン電圧設定方法 |
JP2005275644A (ja) * | 2004-03-24 | 2005-10-06 | Sharp Corp | 液晶表示装置 |
CN100394463C (zh) * | 2004-05-31 | 2008-06-11 | 东芝松下显示技术有限公司 | 配备图像获取功能的显示装置 |
US7684029B2 (en) * | 2004-10-29 | 2010-03-23 | Avago Technologies General Ip (Singapore) Pte. Ltd. | Method and apparatus for identifying a sensed light environment |
TWI274505B (en) * | 2005-01-14 | 2007-02-21 | Primax Electronics Ltd | Scanning method for scanning apparatus |
JP4645822B2 (ja) * | 2005-04-19 | 2011-03-09 | ソニー株式会社 | 画像表示装置および物体の検出方法 |
US8970501B2 (en) * | 2007-01-03 | 2015-03-03 | Apple Inc. | Proximity and multi-touch sensor detection and demodulation |
JP4932526B2 (ja) * | 2007-02-20 | 2012-05-16 | 株式会社 日立ディスプレイズ | 画面入力機能付き画像表示装置 |
RU2008100299A (ru) * | 2008-01-15 | 2008-08-10 | Александр Виль мович Казакевич (RU) | Способ ввода информации виртуальным стилусом |
-
2009
- 2009-10-14 EP EP09821953.8A patent/EP2343633A4/en not_active Withdrawn
- 2009-10-14 BR BRPI0906069-3A patent/BRPI0906069A2/pt not_active Application Discontinuation
- 2009-10-14 CN CN2009801014386A patent/CN101903853B/zh not_active Expired - Fee Related
- 2009-10-14 JP JP2010534775A patent/JP5300859B2/ja not_active Expired - Fee Related
- 2009-10-14 WO PCT/JP2009/067784 patent/WO2010047256A1/ja active Application Filing
- 2009-10-14 RU RU2010125277/07A patent/RU2456659C2/ru not_active IP Right Cessation
- 2009-10-14 US US12/809,810 patent/US8514201B2/en not_active Expired - Fee Related
- 2009-10-14 KR KR1020107013472A patent/KR20110074488A/ko not_active Application Discontinuation
- 2009-10-21 TW TW098135659A patent/TW201027408A/zh unknown
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2004127272A (ja) | 2002-09-10 | 2004-04-22 | Sony Corp | 情報処理装置および方法、記録媒体、並びにプログラム |
JP2006018219A (ja) * | 2004-05-31 | 2006-01-19 | Toshiba Matsushita Display Technology Co Ltd | 画像取込機能付き表示装置 |
JP2006276223A (ja) | 2005-03-28 | 2006-10-12 | Sony Corp | 表示装置及び表示方法 |
Non-Patent Citations (1)
Title |
---|
See also references of EP2343633A4 |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2012008541A (ja) * | 2010-05-28 | 2012-01-12 | Semiconductor Energy Lab Co Ltd | 光検出装置 |
JP2015181022A (ja) * | 2010-05-28 | 2015-10-15 | 株式会社半導体エネルギー研究所 | 光検出装置、及び、タッチパネル |
JP2016192222A (ja) * | 2010-05-28 | 2016-11-10 | 株式会社半導体エネルギー研究所 | 光検出装置、タッチパネル |
US9846515B2 (en) | 2010-05-28 | 2017-12-19 | Semiconductor Energy Laboratory Co., Ltd. | Photodetector and display device with light guide configured to face photodetector circuit and reflect light from a source |
JP2012044143A (ja) * | 2010-08-20 | 2012-03-01 | Samsung Electronics Co Ltd | センサアレイ基板、これを含む表示装置およびこれの製造方法 |
WO2022085406A1 (ja) * | 2020-10-19 | 2022-04-28 | ソニーセミコンダクタソリューションズ株式会社 | 撮像素子、及び電子機器 |
Also Published As
Publication number | Publication date |
---|---|
CN101903853A (zh) | 2010-12-01 |
JPWO2010047256A1 (ja) | 2012-03-22 |
EP2343633A4 (en) | 2013-06-12 |
US8514201B2 (en) | 2013-08-20 |
RU2456659C2 (ru) | 2012-07-20 |
EP2343633A1 (en) | 2011-07-13 |
RU2010125277A (ru) | 2011-12-27 |
JP5300859B2 (ja) | 2013-09-25 |
TW201027408A (en) | 2010-07-16 |
CN101903853B (zh) | 2013-08-14 |
US20100271336A1 (en) | 2010-10-28 |
BRPI0906069A2 (pt) | 2015-06-30 |
KR20110074488A (ko) | 2011-06-30 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2010047256A1 (ja) | 撮像装置、表示撮像装置および電子機器 | |
JP4915367B2 (ja) | 表示撮像装置および物体の検出方法 | |
US9703403B2 (en) | Image display control apparatus and image display control method | |
JP4835578B2 (ja) | 表示撮像装置、物体検出プログラムおよび物体の検出方法 | |
US8610670B2 (en) | Imaging and display apparatus, information input apparatus, object detection medium, and object detection method | |
TWI470507B (zh) | 具可切換漫射器的互動式平面電腦 | |
JP4270248B2 (ja) | 表示撮像装置、情報入力装置、物体検出プログラムおよび物体の検出方法 | |
JP5424475B2 (ja) | 情報入力装置、情報入力方法、情報入出力装置、情報入力プログラムおよび電子機器 | |
JP5093523B2 (ja) | 撮像装置、表示撮像装置および撮像処理装置 | |
JP5481127B2 (ja) | センサ素子およびその駆動方法、センサ装置、ならびに入力機能付き表示装置および電子機器 | |
TWI387903B (zh) | 顯示裝置 | |
CN106201118B (zh) | 触控及手势控制系统与触控及手势控制方法 | |
US11863855B2 (en) | Terminal device and image capturing method | |
JP4270247B2 (ja) | 表示撮像装置、物体検出プログラムおよび物体の検出方法 | |
EP2717568A2 (en) | Imaging device and method | |
JP5392656B2 (ja) | 情報入力装置、情報入出力装置および電子機器 | |
JP2010119064A (ja) | 色検出装置、色検出プログラム及びコンピュータ読み取り可能な記録媒体、並びに色検出方法 | |
JP4788755B2 (ja) | 表示撮像装置、物体検出プログラムおよび物体の検出方法 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 200980101438.6 Country of ref document: CN |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2010534775 Country of ref document: JP |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 09821953 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2009821953 Country of ref document: EP |
|
ENP | Entry into the national phase |
Ref document number: 20107013472 Country of ref document: KR Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2010125277 Country of ref document: RU Ref document number: 4402/DELNP/2010 Country of ref document: IN |
|
WWE | Wipo information: entry into national phase |
Ref document number: 12809810 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
ENP | Entry into the national phase |
Ref document number: PI0906069 Country of ref document: BR Kind code of ref document: A2 Effective date: 20100614 |