WO2010016411A1 - Image processing device, image processing method, image input device, and image input/output device - Google Patents
Image processing device, image processing method, image input device, and image input/output device
- Publication number
- WO2010016411A1 (PCT/JP2009/063382)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- information
- pixel
- label
- image
- label information
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/042—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
-
- G—PHYSICS
- G02—OPTICS
- G02F—OPTICAL DEVICES OR ARRANGEMENTS FOR THE CONTROL OF LIGHT BY MODIFICATION OF THE OPTICAL PROPERTIES OF THE MEDIA OF THE ELEMENTS INVOLVED THEREIN; NON-LINEAR OPTICS; FREQUENCY-CHANGING OF LIGHT; OPTICAL LOGIC ELEMENTS; OPTICAL ANALOGUE/DIGITAL CONVERTERS
- G02F1/00—Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics
- G02F1/01—Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics for the control of the intensity, phase, polarisation or colour
- G02F1/13—Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics for the control of the intensity, phase, polarisation or colour based on liquid crystals, e.g. single liquid crystal display cells
- G02F1/133—Constructional arrangements; Operation of liquid crystal cells; Circuit arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/0412—Digitisers structurally integrated in a display
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/0416—Control or interface arrangements specially adapted for digitisers
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/0416—Control or interface arrangements specially adapted for digitisers
- G06F3/04166—Details of scanning methods, e.g. sampling time, grouping of sub areas or time sharing with display driving
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/047—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means using sets of wires, e.g. crossed wires
Definitions
- The present invention relates to an image input device having an image pickup function, to an image input/output device having both an image display function and an image pickup function, and to an image processing apparatus and an image processing method applied to labeling processing in such an image input device or image input/output device.
- Some image display devices have a touch panel.
- Touch panels include resistive types that use a change in electric resistance, capacitive types that use a change in capacitance, and optical touch panels that optically detect a finger or the like.
- In an optical touch panel, for example, light from a backlight is modulated by liquid crystal elements to display an image on the display surface, and light that is emitted from the display surface and reflected by an external proximity object such as a finger is received by light receiving elements arrayed on the display surface, so that the position of the proximity object is detected.
- Patent Document 1 describes such an image display apparatus.
- The display device described in Patent Document 1 includes a display section having display means for displaying an image and imaging means for capturing an image of an object.
- The present invention has been made in view of such problems, and an object thereof is to provide an image processing apparatus and an image processing method capable of realizing labeling processing faster than the prior art, as well as an image input device and an image input/output device equipped with such an image processing apparatus.
- The image processing apparatus of the present invention includes a scanning unit that sequentially scans each pixel in an image expressed by binarized pixel data, and an information acquisition unit that, during the sequential scanning, assigns label information indicating an identification number for each connected region in the image according to the pixel data values of the pixel of interest and its surrounding pixels, while updating position information and area information for each connected region corresponding to each piece of label information as needed, so that the label information, position information, and area information for the entire image have been acquired when the sequential scanning is completed.
- Here, a "connected region" means a pixel region that can be regarded as a single connected cluster.
- The image processing method of the present invention sequentially scans each pixel in an image expressed by binarized pixel data and, during the sequential scanning, assigns label information indicating an identification number for each connected region in the image according to the pixel data values of the pixel of interest and its surrounding pixels, while updating position information and area information for each connected region corresponding to each piece of label information as needed, so that the label information, position information, and area information for the entire image have been acquired when the sequential scanning is completed.
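As an illustrative sketch (not the patent's actual circuit implementation), the single-pass scheme described above can be expressed in Python. The function name `label_single_pass` and the per-region statistics layout `[sum, xsum, ysum, xmin, xmax, ymin, ymax]` are hypothetical, and 4-connectivity via the left and upper neighbors is assumed:

```python
def label_single_pass(image):
    """Single-pass connected-component labeling (4-connectivity sketch).

    Returns a dict mapping each surviving label to its statistics:
    [sum, xsum, ysum, xmin, xmax, ymin, ymax]  (hypothetical layout).
    """
    h = len(image)
    w = len(image[0]) if h else 0
    parent = {}            # address-list analogue: label -> representative

    def find(a):           # follow links to the representative label
        while parent[a] != a:
            parent[a] = parent[parent[a]]
            a = parent[a]
        return a

    stats = {}             # label -> [sum, xsum, ysum, xmin, xmax, ymin, ymax]
    prev_row = [0] * w     # line buffer holding labels of the previous line
    next_label = 1
    for y in range(h):
        cur_row = [0] * w
        for x in range(w):
            if image[y][x] == 0:
                continue
            left = cur_row[x - 1] if x > 0 else 0
            up = prev_row[x]
            if left and up:
                la, lb = find(left), find(up)
                if la != lb:              # label integration operation
                    parent[lb] = la
                    sa, sb = stats[la], stats.pop(lb)
                    sa[0] += sb[0]; sa[1] += sb[1]; sa[2] += sb[2]
                    sa[3] = min(sa[3], sb[3]); sa[4] = max(sa[4], sb[4])
                    sa[5] = min(sa[5], sb[5]); sa[6] = max(sa[6], sb[6])
                lab = la
            elif left or up:
                lab = find(left or up)
            else:                         # issue a new label
                lab = next_label
                next_label += 1
                parent[lab] = lab
                stats[lab] = [0, 0, 0, x, x, y, y]
            cur_row[x] = lab
            s = stats[lab]                # update position/area info as needed
            s[0] += 1; s[1] += x; s[2] += y
            s[3] = min(s[3], x); s[4] = max(s[4], x)
            s[5] = min(s[5], y); s[6] = max(s[6], y)
        prev_row = cur_row
    return stats
```

A single row-major scan suffices: a conflict between the left and upper labels is resolved by redirecting one label to the other, mirroring the label integration operation described in the text, so no labeled image is ever materialized.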
- The image input device of the present invention includes: an input panel having a plurality of light receiving elements arranged along an imaging surface that receive light reflected by an external proximity object; a scanning unit that sequentially scans each pixel in a captured image that is obtained on the basis of the light reception signals from the respective light receiving elements and expressed by binarized pixel data; an information acquisition unit that, during the sequential scanning, assigns label information indicating an identification number for each connected region in the captured image according to the pixel data values of the pixel of interest and its surrounding pixels, while updating position information and area information for each connected region corresponding to each piece of label information as needed, so that the label information, position information, and area information for the entire captured image are acquired; and a position detection unit that obtains information on at least one of the position, shape, and size of the external proximity object on the basis of the label information, position information, and area information obtained by the information acquisition unit.
- The first image input/output device of the present invention includes: an input/output panel having a plurality of display elements arranged along a display surface that display an image based on an image signal, and a plurality of light receiving elements arranged along the display surface that receive light reflected by an external proximity object; a scanning unit that sequentially scans each pixel in the captured image; and an information acquisition unit that, during the sequential scanning, assigns label information indicating an identification number for each connected region in the captured image to the pixel of interest according to the pixel data values of the pixel of interest and its surrounding pixels, while updating position information and area information for each connected region corresponding to each piece of label information as needed.
- The second image input/output device of the present invention includes: a display panel having a liquid crystal layer between a first substrate and a second substrate; a position detection unit having a first sensor electrode and a second sensor electrode formed in the display panel that can come into contact through deflection of the second substrate, the position detection unit detecting the deflection position of the second substrate corresponding to the position of an external proximity object by reading a change in potential caused by contact between the first and second sensor electrodes; a scanning unit that sequentially scans each pixel; and an information acquisition unit that, during the sequential scanning, assigns label information indicating an identification number for each connected region in the image to the pixel of interest according to the pixel data values of the pixel of interest and its surrounding pixels, while updating position information and area information for each connected region corresponding to each piece of label information as needed, so that the label information, position information, and area information for the entire image have been acquired when the scanning is completed.
- In the image processing apparatus, image processing method, image input device, and image input/output devices of the present invention, each pixel is sequentially scanned in an image (for example, a captured image) expressed by binarized pixel data.
- During the scanning, label information indicating an identification number for each connected region in the image is assigned to the pixel of interest as needed, and the position information and area information for each connected region corresponding to each piece of label information are updated as needed.
- When the sequential scanning is completed, the label information, position information, and area information for the entire image have thus been obtained. In other words, it is not necessary to create a labeled image as in the prior art; the label information and related data for the entire image can be obtained in a single sequential scan.
- According to the present invention, each pixel in an image expressed by binarized pixel data is sequentially scanned, and during the scanning, label information indicating an identification number for each connected region is assigned as needed according to the pixel data values of the pixel of interest and its surrounding pixels, while the position information and area information for each connected region corresponding to each piece of label information are updated as needed. The label information, position information, and area information for the entire image can therefore be acquired in a single sequential scan, making it possible to realize labeling processing faster than in the prior art.
- FIG. 1 is a block diagram illustrating a configuration of an image input / output device according to a first embodiment of the present invention.
- FIG. 2 is a block diagram showing the configuration of the image input/output device in FIG. 1 in more detail.
- FIG. 3 is an enlarged cross-sectional view of a part of the input/output panel.
- FIG. 4 is a block diagram showing the configuration of the labeling processing unit of FIG. 2 in more detail.
- FIG. 5 is a schematic diagram showing an example of the binarized data, line buffer, address list, and additional information used in the labeling process according to the first embodiment.
- FIG. 6 is a flowchart of the entire image processing performed by the image input/output device.
- FIG. 7 is a flowchart showing the details of the labeling process of the first embodiment.
- FIG. 8 is a schematic diagram for explaining the details of the labeling process of the first embodiment.
- FIG. 9 is a schematic diagram for explaining the details of the labeling process following FIG. 8.
- FIG. 10 is a schematic diagram for explaining the details of the labeling process following FIG. 9.
- FIG. 11 is a schematic diagram for explaining the details of a labeling process following FIG. 10.
- FIG. 12 is a schematic diagram for explaining the details of the labeling process following FIG. 11.
- FIG. 13 is a schematic diagram for explaining the details of the labeling process following FIG. 12.
- FIG. 14 is a schematic diagram for explaining the details of the labeling process following FIG. 13.
- FIG. 15 is a schematic diagram for explaining the details of the labeling process following FIG. 14.
- FIG. 16 is a schematic diagram for explaining the details of the labeling process following FIG. 15.
- FIG. 17 is a schematic diagram for explaining the details of the labeling process following FIG. 16.
- FIG. 18 is a schematic diagram for explaining the details of the labeling process following FIG. 17.
- FIG. 19 is a schematic diagram for explaining the details of the labeling process following FIG. 18.
- FIG. 20 is a schematic diagram for explaining the details of the labeling process following FIG. 19.
- FIG. 21 is a schematic diagram for explaining the details of the labeling process following FIG. 20.
- FIG. 22 is a schematic diagram for explaining the details of the labeling process following FIG. 21.
- FIG. 23 is a schematic diagram for explaining the details of a labeling process following FIG. 22.
- FIG. 27 is a flowchart illustrating the details of the labeling process according to the second embodiment, following FIG. 26.
- FIG. 28 is a schematic diagram for explaining the details of the labeling process of the second embodiment.
- FIG. 29 is a schematic diagram for explaining the details of the labeling process following FIG. 28.
- FIG. 30 is a schematic diagram for explaining the details of the labeling process following FIG. 29.
- FIG. 31 is a schematic diagram for explaining the details of the labeling process following FIG. 30.
- FIG. 32 is a schematic diagram for explaining the details of the labeling process following FIG. 31.
- FIG. 33 is a schematic diagram for explaining the details of the labeling process following FIG. 32.
- FIG. 34 is a schematic diagram for explaining the details of the labeling process following FIG. 33.
- FIG. 35 is a schematic diagram for explaining the details of the labeling process following FIG. 34.
- FIG. 36 is a schematic diagram for explaining the details of the labeling process following FIG. 35.
- FIG. 37 is a schematic diagram for explaining the details of the labeling process following FIG. 36.
- A further figure is a cross-sectional view showing the structure of an input/output panel according to a modification of the present invention.
- FIG. 1 shows a schematic configuration of an image input / output device 1 according to a first embodiment of the present invention.
- FIG. 2 shows a detailed configuration of the image input / output device 1 according to the present embodiment.
- FIG. 3 shows an enlarged cross section of a part of the input / output panel.
- the image input / output device 1 according to the present embodiment includes a display 10 and an electronic device main body 20 that uses the display 10.
- the display 10 includes an input / output panel 11, a display signal processing unit 12, a received light signal processing unit 13, and an image processing unit 14, and the electronic device main body 20 includes a control unit 21.
- The image processing method according to the first embodiment of the present invention is embodied by the image input/output device 1 of the present embodiment and is therefore described together with it below.
- the input / output panel 11 includes a liquid crystal display panel in which a plurality of pixels 16 are arranged in a matrix, and includes a display element 11a and a light receiving element 11b.
- the display element 11a is a liquid crystal element that displays an image such as a figure or a character on the display surface using light emitted from a backlight serving as a light source.
- The light receiving element 11b is, for example, a photodiode that receives light and outputs a corresponding electrical signal.
- The light receiving element 11b receives light that was emitted from the backlight and reflected back by an external proximity object, such as a finger, outside the input/output panel 11, and outputs a light reception signal.
- A light receiving element 11b is arranged in each pixel 16, so that a plurality of them are distributed in the plane.
- The input/output panel 11 is configured by arranging, in a matrix between a pair of transparent substrates 30 and 31, a plurality of light emitting/receiving cells CWR separated from one another by partition walls 32.
- Each light emitting/receiving cell CWR includes a light emitting cell CW (CW1, CW2, CW3, ...) and a plurality of light receiving cells CR (CR1, CR2, CR3, ...) contained within that light emitting cell.
- the light emitting cell CW includes a liquid crystal cell as the display element 11a
- the light receiving cell CR includes a light receiving element PD as the light receiving element 11b.
- A shielding layer 33 is disposed between the backlight-side transparent substrate 30 and the light receiving element PD so that the light LB emitted from the backlight does not enter the element; each light receiving element PD thus detects only light incident from the direction of the transparent substrate 31 opposite the backlight, without being affected by the backlight beam LB.
- The display signal processing unit 12 shown in FIG. 1 is a circuit placed upstream of the input/output panel 11 that drives the panel so that it displays an image based on the display data.
- the display signal processing unit 12 includes a display signal holding control unit 40, a light emitting side scanner 41, a display signal driver 42, and a light receiving side scanner 43.
- The display signal holding control unit 40 stores the display signal output from the display signal generation unit 44 in a field memory composed of, for example, SRAM (Static Random Access Memory), for each screen (each field of display).
- The display signal holding control unit 40 also controls the light emitting side scanner 41 and display signal driver 42, which drive each light emitting cell CW, and the light receiving side scanner 43, which drives each light receiving cell CR, so that they operate in conjunction with one another.
- Specifically, the light emitting side scanner 41 operates based on a light emission timing control signal, the light receiving side scanner 43 operates based on a light reception timing control signal, and the display signal driver 42 outputs a display signal for one horizontal line based on a control signal and the display signal held in the field memory.
- A line-sequential operation is performed using these control signals and display signals.
- The light emitting side scanner 41 has a function of selecting the light emitting cell CW to be driven in accordance with the light emission timing control signal output from the display signal holding control unit 40. Specifically, it supplies a light emission selection signal via a light emission gate line connected to each pixel 16 of the input/output panel 11 to control the light emitting element selection switch. When a voltage that turns on the light emitting element selection switch of a given pixel 16 is applied by the light emission selection signal, that pixel performs a light emitting operation with a luminance corresponding to the voltage supplied from the display signal driver 42.
- the display signal driver 42 has a function of supplying display data to the light emitting cell CW to be driven according to a display signal for one horizontal line output from the display signal holding control unit 40. Specifically, a voltage corresponding to display data is supplied to the pixel 16 selected by the light-emitting side scanner 41 through a data supply line connected to each pixel 16 of the input / output panel 11. The light emitting side scanner 41 and the display signal driver 42 operate line-sequentially to display an image corresponding to arbitrary display data on the input / output panel 11.
- The light receiving side scanner 43 has a function of selecting the light receiving cell CR to be driven according to the light reception timing control signal output from the display signal holding control unit 40. Specifically, it supplies a light reception selection signal through a light receiving gate line connected to each pixel 16 of the input/output panel 11 to control the light receiving element selection switch. As in the operation of the light emitting side scanner 41 described above, when a voltage that turns on the light receiving element selection switch of a given pixel 16 is applied by the light reception selection signal, the light reception signal detected in that pixel is output to the light reception signal receiver 45.
- This allows the light receiving cell CR to detect light that was emitted from a given light emitting cell CW and reflected by an object in contact with or in proximity to the panel.
- the light receiving side scanner 43 outputs a light receiving block control signal to the light receiving signal receiver 45 and the light receiving signal holding unit 46, and has a function of controlling the blocks contributing to the light receiving operation.
- Since the light emission gate lines and light reception gate lines are connected separately to each light emitting/receiving cell CWR, the light emitting side scanner 41 and the light receiving side scanner 43 can operate independently of each other.
- The light reception signal processing unit 13 shown in FIG. 1 is placed downstream of the input/output panel 11; it takes in the light reception signals from the light receiving elements 11b and performs amplification and other processing. As shown in FIG. 2, it includes a light reception signal receiver 45 and a light reception signal holding unit 46.
- the light reception signal receiver 45 has a function of acquiring a light reception signal for one horizontal line output from each light reception cell CR in accordance with a light reception block control signal output from the light reception side scanner 43.
- the light reception signal for one horizontal line acquired by the light reception signal receiver 45 is output to the light reception signal holding unit 46.
- The light reception signal holding unit 46 reconstructs the light reception signals output from the light reception signal receiver 45 into a light reception signal for each screen (each field of display) in accordance with the light reception block control signal output from the light receiving side scanner 43, and stores and holds it in, for example, a field memory composed of SRAM or the like.
- the received light signal data stored in the received light signal holding unit 46 is output to the position detecting unit 47 in the image processing unit 14 (FIG. 1).
- the received light signal holding unit 46 may be formed of a storage element other than the memory.
- the received light signal may be held in the capacitive element as analog data (charge).
- The image processing unit 14 (FIG. 1) is placed downstream of the light reception signal processing unit 13; it takes in the captured image from the light reception signal processing unit 13, performs processing such as binarization, noise removal, and labeling, and obtains point information about the external proximity object, that is, information indicating its center of gravity, center coordinates, and area (size and shape).
- Specifically, the labeling processing unit 14a (image processing apparatus) in the image processing unit 14 performs labeling processing as described below, acquiring the label information for the entire captured image (the identification numbers for each connected region in the captured image) together with the position information and area information for each connected region. That is, as described in detail later, the labeling processing unit 14a sequentially scans each pixel in the captured image expressed by binarized pixel data and, during the scanning, assigns label information as needed according to the pixel data values of the pixel of interest and its surrounding pixels, while updating the position information and area information for each connected region corresponding to each piece of label information as needed, thereby acquiring the label information, position information, and area information.
- the labeling processing unit 14a corresponds to a specific example of “scanning unit” and “information acquisition unit” in the present invention.
- The position detection unit 47 (FIG. 2) in the image processing unit 14 performs signal processing based on the label information, position information, and area information obtained by the labeling processing unit 14a, and specifies the position and other attributes of the object detected by the light receiving cells CR.
- This makes it possible to specify the position of a contacting or approaching finger or the like.
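Assuming per-region statistics accumulated in the hypothetical order [pixel count, xsum, ysum, xmin, xmax, ymin, ymax], the point information (center of gravity and size) could be derived as in the following sketch; the function name and layout are illustrative assumptions, not the circuit described in the patent:

```python
def point_info(stats):
    """Derive centroid, bounding-box size, and area from accumulated
    per-region statistics (hypothetical layout, see lead-in)."""
    count, xsum, ysum, xmin, xmax, ymin, ymax = stats
    centroid = (xsum / count, ysum / count)     # center of gravity
    size = (xmax - xmin + 1, ymax - ymin + 1)   # bounding-box width, height
    return centroid, size, count
```

Because the sums and extrema are accumulated during the single scan, deriving the centroid and size afterwards is constant-time per region.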
- The electronic device main body 20 (FIG. 1) outputs display data to the display signal processing unit 12 of the display 10 and receives point information from the image processing unit 14.
- the control unit 21 changes the display image using the point information.
- the control unit 21 includes a display signal generation unit 44 as shown in FIG.
- The display signal generation unit 44 is configured by, for example, a CPU (Central Processing Unit) (not shown); it generates a display signal for displaying each screen (each field of display) based on the supplied image data and outputs it to the display signal holding control unit 40.
- FIG. 4 is a block diagram showing the detailed configuration of the labeling processing unit 14a.
- FIG. 5 schematically shows an example of binarized data, a line buffer, an address list, and additional information used in the labeling process of the present embodiment.
- the labeling processing unit 14a includes a condition determination circuit 141, a new label number issuing circuit 142, an address list 143, a line buffer 144, a line buffer control circuit 145, and an address list control circuit 146.
- The condition determination circuit 141 sequentially obtains the binarized data Din, that is, binarized pixel data as shown for example in FIG. 5, and determines, according to the pixel data values of the pixel of interest and its surrounding pixels, whether to perform label information assignment and whether to update the position information and area information for each connected region. Specifically, it determines whether the pixel data value of the pixel of interest is valid (here, the effective value "1") or invalid (the value "0"), refers to the label information of the pixels to the left of and above the pixel of interest, issues and assigns an invalid label or a new label (new label information), and issues label integration operation commands. In addition, the condition determination circuit 141 issues a command to reorganize the address list when the pixel of interest is located at the end of a line (here, the right end).
- The new label number issuing circuit 142 issues a new label based on the determination result of the condition determination circuit 141. Specifically, when a new label is required, it issues an unused register number (corresponding to label information) in the address list 143.
- The line buffer 144 stores the register numbers (label information) for one line, as shown for example in FIG. 5. Note that the line buffer (image) 144a shown in FIG. 5 and subsequent figures is drawn for convenience in explaining the labeling process described later; the actual line buffer 144 holds only one line.
- the line buffer control circuit 145 performs control such as writing and reading of register numbers in the line buffer 144.
- The additional information memory 148 stores the additional information shown in FIG. 5, that is, the position information for each connected region corresponding to each piece of label information (xsum, the total of the x coordinate values in the connected region; ysum, the total of the y coordinate values in the connected region; region, the minimum and maximum x and y coordinates in the connected region; and so on) and the area information (sum, the number of pixels in the connected region), each in association with a label number (address number).
- The address list 143 holds, in association with one another, the register numbers (RegNo, corresponding to label information) stored in the line buffer 144, the label numbers (No, address numbers) stored in the additional information memory 148, and a flag (Flag) indicating whether each piece of label information is currently assigned.
- Specifically, each register number is held as a pointer into an array in which the label numbers are stored, and each label number also serves as its own address; a label number is thereby linked to a register number.
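The register-number-to-label-number linkage can be pictured with a small sketch. This is illustrative only; the names, the four-entry size, and the merge procedure are assumptions, not the patent's actual address list format:

```python
# The address list is modeled as an array indexed by register number,
# each entry holding a label number; a flag per label number marks
# whether that label is currently assigned.
address_list = [0, 1, 2, 3]        # register number -> label number
flags = [True, True, True, True]   # label number -> assigned?

def merge_labels(keep, drop):
    """Integrate two labels by redirecting every register that points
    at `drop` so that it points at `keep`, then releasing `drop`."""
    for reg, lab in enumerate(address_list):
        if lab == drop:
            address_list[reg] = keep
    flags[drop] = False
```

With this indirection, a label integration rewrites only the small address list; the labels already written into the line buffer need not be revisited.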
- the address list control circuit 146 performs control such as writing and reading of information in the address list 143.
- The label memory controller 147 performs control such as writing and reading of the additional information in the additional information memory 148, and outputs the label information for the entire captured image together with the position information and area information for each connected region as the labeling data Dout.
- FIG. 6 shows the overall flow of image processing by the image input / output device 1.
- FIG. 7 is a flowchart showing details of the labeling process according to the present embodiment, and FIGS. 8 to 23 schematically illustrate that process.
- Display data output from the electronic device main body 20 is input to the display signal processing unit 12.
- the display signal processing unit 12 drives the input / output panel 11 so that an image is displayed on the input / output panel 11 based on the display data.
- the input / output panel 11 drives the light receiving element 11b while displaying an image on the display element 11a using the light emitted from the backlight.
- When an external proximity object such as a finger contacts or approaches the display element 11a, the image displayed on the display element 11a is reflected by the object, and the reflected light is detected by the light receiving element 11b, which outputs a light reception signal.
- The received light signal processing unit 13 receives the light reception signal, performs processing such as amplification on it (step S10 in FIG. 6), and thereby obtains a captured image.
- The image processing unit 14 receives the captured image from the received light signal processing unit 13 and binarizes it (step S11). That is, the image processing unit 14 holds a preset threshold value and compares the signal intensity of each pixel of the captured image data against that threshold, setting the pixel to "0" when the intensity is below the threshold and to "1" when it is equal to or above it. As a result, the portions that received light reflected by the external proximity object are set to "1", and the other portions are set to "0".
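The thresholding of step S11 amounts to a per-pixel comparison; a minimal sketch (the threshold value in the usage line is an arbitrary example, not one given in the text):

```python
def binarize(image, threshold):
    """Return a 2-D list in which intensities >= threshold become 1, else 0,
    as in step S11 of the labeling flow."""
    return [[1 if v >= threshold else 0 for v in row] for row in image]

# e.g. binarize([[10, 200], [90, 30]], 128) gives [[0, 1], [0, 0]]
```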
- The image processing unit 14 then removes isolated points from the binarized captured image (step S12). That is, it removes noise by clearing any isolated "1" portion that does not correspond to an external proximity object.
- Next, the image processing unit 14 performs labeling in the labeling processing unit 14a (step S13). That is, the labeling processing unit 14a labels the portions set to "1" by the binarization, detects each region of "1" pixels as the region of an external proximity object, and obtains the centroid or center coordinates of that region. This data is output to the control unit 21 as point information (the label information Dout described above).
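Because the additional information accumulates coordinate totals (xsum, ysum) and a pixel count (sum) per connected region, the centroid used for the point information falls out of two divisions; a hypothetical helper illustrating this:

```python
def centroid(info):
    """Centroid of one connected region from its accumulated additional
    information: xsum/ysum are coordinate totals, sum is the pixel count."""
    return info["xsum"] / info["sum"], info["ysum"] / info["sum"]

# a 3-pixel region with coordinate totals (6, 9) has its centroid at (2.0, 3.0)
```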
- The control unit 21 performs the necessary processing, such as changing the displayed image, using the point information input from the image processing unit 14. For example, if an operation menu is displayed on the screen, the control unit 21 detects which button in the menu has been selected by the user's finger and executes the instruction corresponding to that button. This completes the basic operation of the image input/output device 1.
- First, the line buffer 144, the address list 143, and the additional information memory 148 are each initialized.
- The condition determination circuit 141 determines whether the pixel value (pixel data) of the pixel of interest in the captured image composed of the binarized data Din is "1" (a valid value) (step S131 in FIG. 7).
- When the pixel data of the pixel of interest is "0" (an invalid value) (step S131: N), the line buffer control circuit 145 and the address list control circuit 146 do not issue or assign label information to that pixel. Instead, for example as shown in FIG. 10, "z" (an invalid label) is entered in the line buffer 144 and the address list 143 (step S132). The condition determination circuit 141 then determines whether scanning of the line is complete (whether the pixel of interest is at the right end of the line) (step S144).
- When scanning of the line is not complete (step S144: N), the pixel of interest is moved (sequential scanning) to the next pixel in the line, the pixel to its right (step S145), and the process returns to step S131.
- When the pixel data of the pixel of interest is "1" (a valid value) (step S131: Y), the condition determination circuit 141 next determines whether the labels of the surrounding pixels (here, the pixels above and to the left of the pixel of interest) are valid or invalid, that is, whether the pixel data of those surrounding pixels are valid or invalid values, or in other words whether the pixel of interest is an isolated point (step S133).
- When the labels of both the pixel above and the pixel to the left of the pixel of interest are invalid (their pixel data are "0" (invalid values), so the pixel of interest is an isolated point) (step S133: both invalid), the new label number issuing circuit 142 issues and assigns a new label (new label information) to the pixel of interest, for example as shown in FIG. 11 (step S134).
- The line buffer control circuit 145, the address list control circuit 146, and the label memory controller 147 then each update the additional information, for example as shown in FIG. 12 (step S135). After that, in this case, the processes of steps S144 and S145 are repeated, for example as shown in FIG. 13.
- Note that "(1)" and the like shown inside a pixel of the binarized data Din denote the register number (label information) assigned to that pixel.
- When it is determined in step S144 that scanning of the line is complete (step S144: Y), the condition determination circuit 141 next determines whether scanning of all lines of the captured image is complete (step S146).
- When scanning of all lines is not complete (step S146: N), the address list is organized (step S147) and the pixel of interest is moved (sequential scanning) to the first pixel (leftmost pixel) of the next line, for example as shown in FIG. 15 (step S148). In the case shown in FIG. 14, however, there is no address list organization to perform, so the organization of the address list is described later. The process then returns to step S131.
- In the organization of the address list (step S147), the address list control circuit 146 operates as follows. Specifically, for example as shown in FIG. 17, the flag of each register number in the address list 143 that no longer exists in the line buffer 144 is set to "0" (the label information corresponding to that register number is marked as unassigned). As a result, for example as shown in FIG. 18, a register number whose flag is "0" in the address list 143 can thereafter be reused (reuse of label information). The pixel of interest then moves to the first pixel of the next line, as shown in FIG. 18 (step S148), and the process returns to step S131.
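The organization step can be pictured as clearing the flag of every register number that no longer appears on the line buffer; a sketch with illustrative names (the patent's circuits do this in hardware):

```python
def organize_address_list(address_list, line_buffer):
    """Step S147 sketch: mark as unassigned (Flag = 0) every register number
    absent from the current line buffer, making it reusable on later lines."""
    live = set(line_buffer)
    for reg_no, entry in address_list.items():
        if reg_no not in live:
            entry["Flag"] = 0  # label information no longer assigned; reusable
```

This is what bounds the number of register numbers needed: only regions still touching the current line keep their numbers alive.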
- When it is determined in step S131 that the pixel data of the pixel of interest is "1" (a valid value) (step S131: Y), and it is determined in step S133 that only the label of the pixel above the pixel of interest is valid (step S133: only the upper side valid), the processes of steps S136 and S137 described below are performed. That is, for example as shown in FIG. 19, the same label as the pixel above is assigned to the pixel of interest (the already issued label information assigned to the pixel showing a valid value is assigned) (step S136), and the additional information (the position information and area information for each connected region) is updated (step S137).
- Likewise, when it is determined in step S131 that the pixel data of the pixel of interest is "1" (step S131: Y), and it is determined in step S133 that only the label of the pixel to the left of the pixel of interest is valid (step S133: only the left side valid), steps S138 and S139 described below are performed. That is, the same label as the left pixel is assigned to the pixel of interest (step S138), and the additional information is updated (step S139).
- When it is determined in step S131 that the pixel data of the pixel of interest is "1" (step S131: Y), and it is determined in step S133 that the labels of both the pixel above and the pixel to the left of the pixel of interest are valid (step S133: both valid), the condition determination circuit 141 next determines whether the upper pixel and the left pixel have different labels (step S140). When the upper pixel and the left pixel have the same label (step S140: N), the processes of steps S138 and S139 described above are performed.
- When it is determined in step S140 that the upper pixel and the left pixel have different labels (step S140: Y), the address list integration processes of steps S141 and S142 are performed and the additional information is updated (step S143). Specifically, for example as shown in the figures, the line buffer control circuit 145, the address list control circuit 146, and the label memory controller 147 each operate so that a register number (RegNo; corresponding to label information) is selected and the additional information is integrated into the smaller label number (No; corresponding to the address number). As a result, the two connected regions are integrated and assigned the same label.
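The integration of steps S141 to S143 can be sketched as follows: the additional information of the two regions is folded into the smaller label number, and both register numbers are redirected to it through the address list. Names and data layout are illustrative assumptions, not taken from the patent.

```python
def merge_labels(address_list, additional, reg_upper, reg_left):
    """Fold two connected regions into the smaller label number and point
    both register numbers at it (sketch of steps S141-S143)."""
    no_a = address_list[reg_upper]["No"]
    no_b = address_list[reg_left]["No"]
    keep, drop = min(no_a, no_b), max(no_a, no_b)
    if keep == drop:                       # already the same region
        return keep
    src = additional.pop(drop)             # disappearing region's statistics
    for k in ("xsum", "ysum", "sum"):
        additional[keep][k] += src[k]      # sums and counts simply add
    address_list[reg_upper]["No"] = keep   # both register numbers now point
    address_list[reg_left]["No"] = keep    # at the surviving label number
    return keep
```

The region min/max bounds would be merged the same way (elementwise min/max); they are omitted here for brevity.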
- When it is determined in step S146 that scanning of all lines is complete (step S146: Y), the labeling process ends.
- By performing the labeling process in this way, each pixel of the captured image represented by the binarized data Din is scanned sequentially. During the sequential scan, a register number (label information) is assigned to the pixel of interest as needed according to the pixel data values of the pixel of interest and its surrounding pixels, and the additional information (position information and area information) for each connected region corresponding to each piece of label information is updated as needed. At the completion of the sequential scan, the label information for the entire captured image and the position information and area information for each connected region have thus been obtained. That is, there is no need to create a labeled image as in the prior art, and the label information and the like for the entire image are obtained in a single sequential scan.
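The whole single-scan flow (steps S131 to S148) can be condensed into a short software sketch. This is a simplification, assuming unlimited label numbers and a plain merge map instead of the hardware address list with register-number reuse; it is meant only to show how one pass over the image yields the per-region sums.

```python
def label_one_pass(image):
    """One sequential scan over a binarized image, returning per-region
    additional information (xsum, ysum, sum) without building a labeled image."""
    h, w = len(image), len(image[0])
    line = [0] * w        # label of each column on the previous/current line (0 = none)
    parent = {}           # label -> representative label after merges
    info = {}             # label -> accumulated additional information
    next_label = 1

    def find(a):          # follow merge links to the surviving label
        while parent[a] != a:
            a = parent[a]
        return a

    for y in range(h):
        left = 0
        for x in range(w):
            if image[y][x] == 0:                 # invalid pixel: no label
                line[x] = left = 0
                continue
            up = find(line[x]) if line[x] else 0
            if up and left and up != left:       # both valid, different: merge
                keep, drop = min(up, left), max(up, left)
                parent[drop] = keep
                src = info.pop(drop)
                for k in src:
                    info[keep][k] += src[k]
                up = left = keep
            lab = left or up
            if not lab:                          # isolated so far: new label
                lab = next_label
                next_label += 1
                parent[lab] = lab
                info[lab] = {"xsum": 0, "ysum": 0, "sum": 0}
            info[lab]["xsum"] += x               # update additional information
            info[lab]["ysum"] += y
            info[lab]["sum"] += 1
            line[x] = left = lab
    return info
```

Only one line of labels plus the small per-region table is kept, mirroring the line-buffer argument in the text: no frame-sized labeled image is ever materialized.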
- As described above, in the present embodiment, each pixel of the captured image represented by the binarized data Din is scanned sequentially, and during the scan a register number (label information) indicating the identification number of each connected region in the captured image is assigned to the pixel of interest as needed according to the pixel data values of the pixel of interest and its surrounding pixels, while the additional information (position information and area information) for each connected region corresponding to each piece of label information is updated as needed. The label information, the position information, and the area information for the entire captured image can therefore be acquired in a single sequential scan, realizing a labeling process faster than before.
- In addition, the real-time performance of the labeling process is improved over the conventional approach, and streaming processing can be realized.
- Since no labeled image is created, a frame memory for holding such an image is also unnecessary. That is, because the labeling process of this embodiment uses a line buffer, the amount of memory used can be reduced compared with the conventional approach, and the labeling process can therefore be realized easily in hardware.
- The image input/output device of the present embodiment is the image input/output device 1 of the first embodiment shown in FIG. 1 with a labeling processing unit 14b, described later, provided in place of the labeling processing unit 14a. Components identical to those of the first embodiment are given the same reference symbols, and their description is omitted as appropriate.
- FIG. 24 shows a block configuration of the labeling processing unit 14b of the present embodiment.
- The labeling processing unit 14b includes a condition determination circuit 141, a new label number issuing circuit 142, a line buffer 144b, a label memory controller 147, an additional information memory 148, and an empty address information register 149. That is, relative to the labeling processing unit 14a of the first embodiment shown in FIG. 4, an empty address information register 149 is provided in place of the address list 143 and the address list control circuit 146, and a line buffer 144b is provided in place of the line buffer 144 and the line buffer control circuit 145.
- The line buffer 144b stores label numbers (corresponding to label information) for one line, for example as shown in FIG. 25. The line buffer 144b includes a per-pixel controller, so the label numbers of the pixel of interest and its surrounding pixels (here, the pixel above and the pixel to the left) can be referenced, written, and updated. Note that the line buffer (image) 144c shown in FIG. 25 and elsewhere is drawn for convenience in explaining the labeling process described later; the actual line buffer 144b holds only a single line.
- The empty address information register 149 stores whether each label number is in use (an empty list). Together with the new label number issuing circuit 142, the empty address information register 149 manages which label numbers are in use or unused, searches for new label numbers, and so on. Specifically, a newly issued label number is marked as in use, and a label number that disappears through integration is rewritten as unused as appropriate. A label number that has been used can thereby be reused any number of times. Label numbers are used in ascending order.
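The empty address information register behaves like a small free list; a sketch under illustrative names, with issue() playing the role of the new label number issuing circuit's search and release() the return of a number that disappeared through integration:

```python
class EmptyAddressRegister:
    """Free-list sketch of the empty address information register 149
    (names assumed): tracks which label numbers are in use."""

    def __init__(self, max_labels):
        self.in_use = [False] * max_labels

    def issue(self):
        """Search for and claim the lowest free label number
        (label numbers are used in ascending order)."""
        for no, used in enumerate(self.in_use):
            if not used:
                self.in_use[no] = True
                return no
        raise RuntimeError("no free label number")

    def release(self, no):
        """Return a label number that disappeared through integration,
        so it can be reused any number of times."""
        self.in_use[no] = False
```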
- The additional information memory 148 of the present embodiment stores additional information as shown in FIG. 25: the above-described label numbers (No) are stored in association with the position information (xsum, ysum, region) and the area information (sum). When the label memory controller 147 updates the label number of the current label being accessed (the pixel of interest), the corresponding entry in the additional information memory 148 is updated and written.
- FIGS. 26 and 27 are flowcharts showing details of the labeling process according to the present embodiment, and FIGS. 28 to 37 schematically illustrate that process. Since the basic operation of the image input/output device is the same as in the first embodiment, its description is omitted.
- First, the line buffer 144b, the additional information memory 148, and the empty address information register 149 are each initialized.
- The condition determination circuit 141 determines whether the pixel value (pixel data) of the pixel of interest in the captured image composed of the binarized data Din is "1" (a valid value) (step S231 in FIG. 26).
- When the pixel data of the pixel of interest is "0" (an invalid value) (step S231: N), no label number is issued or assigned to that pixel. Specifically, the condition determination circuit 141 next determines whether the label of the pixel to the left of the pixel of interest is "0" (step S232). When the label of the left pixel is not "0" (step S232: N), the line buffer 144b and the label memory controller 147 perform steps S233 and S234 as follows: as shown in FIG. 28, the current label information is recorded in the additional information memory 148 (step S233) and the current label information is deleted from the label memory controller 147 (step S234), after which the process proceeds to step S245. When the label of the left pixel is "0" (step S232: Y), the process proceeds directly to step S245.
- In step S245, the condition determination circuit 141 determines whether scanning of the line is complete (whether the pixel of interest is at the right end of the line) (step S245 in FIG. 27). When scanning of the line is not complete (step S245: N), the pixel of interest is moved (sequential scanning) to the next pixel in the line, the pixel to its right (step S246), and the process returns to step S231.
- When scanning of the line is complete (step S245: Y), the condition determination circuit 141 determines whether the label of the pixel to the left of the pixel of interest is "0" (step S247). When the label of the left pixel is "0" (step S247: Y), the process proceeds to step S250. When the label of the left pixel is not "0" (step S247: N), the line buffer 144b and the label memory controller 147 perform steps S248 and S249 as follows: the current label information is recorded in the additional information memory 148 (step S248) and the current label information is deleted from the label memory controller 147 (step S249), after which the process proceeds to step S250.
- In step S250, the condition determination circuit 141 determines whether scanning of all lines of the captured image is complete (step S250). When scanning of all lines is not complete (step S250: N), the pixel of interest moves (sequential scanning) to the first pixel of the next line (step S251), and the process returns to step S231. Note that because the address list 143 is not provided in the present embodiment, there is no organization of an address list as in the first embodiment.
- When the pixel data of the pixel of interest is "1" (a valid value) (step S231: Y), the condition determination circuit 141 next determines whether the labels of the surrounding pixels (here, the pixels above and to the left of the pixel of interest) are valid or invalid (whether the pixel data of those surrounding pixels are valid or invalid values, or whether the pixel of interest is an isolated point) (step S235). When the labels of both the pixel above and the pixel to the left of the pixel of interest are invalid (their pixel data are "0" (invalid values), so the pixel of interest is an isolated point) (step S235: both invalid), the new label number issuing circuit 142 searches for an empty label number using the empty address information register 149, for example as shown in FIG. 30 (step S236). The line buffer 144b and the label memory controller 147 then each assign a new label (new label information) to the pixel of interest, using the current location information as the current label information (step S237). After that, in this case, the processes of steps S245 and S246 are repeated, for example as shown in FIG. 31.
- Note that "(1)" and the like shown inside a pixel of the binarized data Din denote the label number (label information) assigned to that pixel.
- When it is determined in step S231 that the pixel data of the pixel of interest is "1" (a valid value) (step S231: Y), and it is determined in step S235 that only the label of the pixel above the pixel of interest is valid (step S235: only the upper side valid), the process of step S238 described below is performed. That is, for example as shown in FIG. 32, the line buffer 144b and the label memory controller 147 each assign the same label as the pixel above to the pixel of interest, using (the current location information + the label information of the upper pixel) as the current label information. The additional information (the position information and area information for each connected region) is thereby updated, for example as shown in FIG. 33.
- Likewise, when it is determined in step S231 that the pixel data of the pixel of interest is "1" (step S231: Y), and it is determined in step S235 that only the label of the pixel to the left of the pixel of interest is valid (step S235: only the left side valid), the process of step S239 described below is performed. That is, for example as shown in FIG. 34, the same label as the left pixel is assigned to the pixel of interest by using (the current location information + the label information of the left pixel) as the current label information.
- When it is determined in step S231 that the pixel data of the pixel of interest is "1" (step S231: Y), and it is determined in step S235 that the labels of both the pixel above and the pixel to the left of the pixel of interest are valid (step S235: both valid), the condition determination circuit 141 next determines whether the upper pixel and the left pixel have different labels (step S240). When the upper pixel and the left pixel have the same label (step S240: N), the process of step S239 described above is performed.
- When it is determined in step S240 that the upper pixel and the left pixel have different labels (step S240: Y), steps S241 to S244 described below are performed to integrate the upper pixel and the left pixel and update the additional information. First, the line buffer 144b and the label memory controller 147 each use (the current location information + the label information of the upper pixel + the label information of the left pixel) as the current label information (step S241). Next, the label numbers on the line buffer 144b are updated at once to the destination label number (step S242). Then the larger of the label numbers of the upper pixel and the left pixel is deleted from the additional information memory 148 (step S243), and the empty address information (the empty label numbers) is updated (step S244). The two connected regions are thereby integrated and assigned the same label.
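The bulk rewrite of step S242 can be pictured as a single sweep over the line buffer, replacing every occurrence of the disappearing label number with the surviving one (an illustrative sketch, not the hardware implementation):

```python
def bulk_relabel(line_buffer, old_no, new_no):
    """Step S242 sketch: rewrite all occurrences of old_no on the one-line
    buffer to new_no at once, so no address-list indirection is needed."""
    for i, no in enumerate(line_buffer):
        if no == old_no:
            line_buffer[i] = new_no
```

Because the buffer holds only one line, this sweep stays cheap, which is what lets the second embodiment drop the address list entirely.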
- By performing the labeling process of steps S231 to S251 in this way, the label information for the entire captured image and the position information and area information for each connected region are obtained as the label information Dout, for example as shown in FIG. 37. When it is determined in step S250 that scanning of all lines is complete (step S250: Y), the labeling process ends.
- In the labeling process of this embodiment, as in the first embodiment, each pixel of the captured image represented by the binarized data Din is scanned sequentially. During the sequential scan, a label number (label information) is assigned to the pixel of interest as needed according to the pixel data values of the pixel of interest and its surrounding pixels, and the additional information (position information and area information) for each connected region corresponding to each piece of label information is updated as needed. At the completion of the scan, the label information for the entire captured image and the position information and area information for each connected region have thus been obtained. That is, there is no need to create a labeled image as in the prior art, and the label information and the like for the entire image are obtained in a single sequential scan.
- As described above, the present embodiment achieves the same effects through the same operation as the first embodiment: the label information, the position information, and the area information for the entire captured image can be acquired in a single sequential scan, realizing a labeling process faster than before. Moreover, since the address list 143 of the first embodiment is unnecessary and the label information can be updated directly, real-time performance is further improved over the first embodiment, the labeling process is even easier to realize in hardware, and the amount of memory used can be reduced.
- In the above embodiments, the labeling process uses the pixels above and to the left of the pixel of interest as its surrounding pixels, but it may instead be performed using the pixels in three directions (above, to the left, and to the upper right of the pixel of interest) as the surrounding pixels.
- The above embodiments treat a pixel data value of "1" as the valid value and "0" as the invalid value, but conversely "0" may be treated as the valid value and "1" as the invalid value.
- In the above embodiments, one light receiving cell is provided for each light emitting cell, but one light receiving cell may instead be provided for a plurality of light emitting cells.
- The above embodiments describe a configuration using a liquid crystal display panel as the input/output panel 11, but the image input/output device of the present invention can also be configured using an organic electroluminescence (EL) panel or the like as the input/output panel.
- An organic EL element has the property of emitting light when a forward bias voltage is applied and of generating a current upon receiving light when a reverse bias voltage is applied; it can therefore serve as both the display element 11a and the light receiving element 11b. In this case, the input/output panel 11 is configured by arranging an organic EL element for each pixel 16: an image is displayed by applying a forward bias voltage to each organic EL element according to the display data to make it emit light, while a reverse bias voltage is applied to other organic EL elements so that they receive the reflected light.
- The present invention has been described taking as an example the image input/output device 1 including the input/output panel 11 having a plurality of display elements 11a and a plurality of light receiving elements 11b, but the present invention can also be applied to an image input device (imaging device) including an input panel having a plurality of light receiving elements 11b.
- Furthermore, the image processing apparatus of the present invention can be applied not only to a captured image based on the light reception signals obtained by the light receiving elements 11b but also to images generated by other methods, for example an image generated in an image input/output apparatus including the input/output panel 5 (cross-sectional structure at a pixel Px) shown in FIG. 38.
- The input/output panel 5 includes a first substrate 50 having a glass substrate 50A, a gate insulating film 51A, a first interlayer insulating film 12A, a signal line SL, a second interlayer insulating film 52B, a common electrode 53, a third interlayer insulating film 52C, and a pixel electrode 54 (first sensor electrode); a second substrate 60 having a glass substrate 60A, a color filter 61, and a counter sensor electrode 62 (second sensor electrode); and a liquid crystal layer 70 containing liquid crystal molecules 71. A resistive touch sensor is formed by the pixel electrode 54 and the counter sensor electrode 62.
- The pixel electrode 54 has, for example, a cross-sectional shape including a plurality of edges 54B. The alignment film (not shown) tends to be thin over these edges, so the edges 54B are exposed from the alignment film, and the counter sensor electrode 62 (consisting of slits 62A and patterns 62B) is disposed facing the edges 54B.
- In particular, when the input/output panel 5 is an FFS (Fringe Field Switching) liquid crystal display panel, the pixel electrode 54 originally has a planar shape including a plurality of slits 54A, so detection performance can be improved without lowering the aperture ratio.
- The series of processes described in the above embodiments can be performed by hardware or by software. When it is performed by software, a program constituting the software is installed in a general-purpose computer or the like. Such a program may be recorded in advance on a recording medium built into the computer.
Abstract
Description
FIG. 1 shows the schematic configuration of an image input/output device 1 according to a first embodiment of the present invention, FIG. 2 shows its detailed configuration, and FIG. 3 shows an enlarged cross section of part of the input/output panel. As shown in FIG. 1, the image input/output device 1 according to this embodiment includes a display 10 and an electronic device main body 20 that uses the display 10. The display 10 has an input/output panel 11, a display signal processing unit 12, a received light signal processing unit 13, and an image processing unit 14, and the electronic device main body 20 has a control unit 21. Since the image processing method according to the first embodiment of the present invention is embodied by the image input/output device 1 of this embodiment, it is described together below.
Next, a second embodiment of the present invention is described. The image input/output device of this embodiment is the image input/output device 1 of the first embodiment shown in FIG. 1 with a labeling processing unit 14b, described later, provided in place of the labeling processing unit 14a. Components identical to those of the first embodiment are given the same reference symbols, and their description is omitted as appropriate.
Claims (11)
- 1. An image processing apparatus comprising: a scanning unit that sequentially scans each pixel of an image represented by binarized pixel data; and an information acquisition unit that, while each pixel is sequentially scanned, assigns to the pixel of interest, as needed, label information indicating an identification number for each connected region in the image according to the pixel data values of the pixel of interest and its surrounding pixels, and updates, as needed, position information and area information for each connected region corresponding to each piece of label information, so that the label information, the position information, and the area information for the entire image are each acquired at the completion of the sequential scan.
- 2. The image processing apparatus according to claim 1, wherein, when the pixel data of the pixel of interest indicates a valid value and the pixel data of all of its surrounding pixels indicate invalid values, the information acquisition unit issues and assigns new label information to the pixel of interest.
- 3. The image processing apparatus according to claim 1 or claim 2, wherein, when the pixel data of the pixel of interest indicates a valid value and the pixel data of only one of its surrounding pixels indicates a valid value, the information acquisition unit assigns to the pixel of interest the already issued label information assigned to that one pixel indicating the valid value, thereby updating the position information and the area information for each connected region.
- 4. The image processing apparatus according to claim 1 or claim 2, wherein, when the pixel data of the pixel of interest indicates a valid value and the pixel data of a plurality of its surrounding pixels each indicate a valid value, the information acquisition unit assigns to the pixel of interest the already issued label information assigned to one pixel selected from the plurality of pixels indicating valid values, thereby updating the position information and the area information for each connected region.
- 5. The image processing apparatus according to claim 1 or claim 2, wherein, when the pixel data of the pixel of interest indicates an invalid value, the information acquisition unit neither issues nor assigns label information to the pixel of interest.
- 6. The image processing apparatus according to claim 1 or claim 2, wherein the information acquisition unit includes: a determination unit that determines, according to the pixel data values of the pixel of interest and its surrounding pixels, whether to perform the label information assignment process and the process of updating the position information and area information for each connected region; a label issuing unit that issues new label information based on the determination result of the determination unit; a line buffer that includes a per-pixel controller and stores the label information; an additional information memory that stores the label information in association with the position information and the area information; and an empty address information register that stores whether each piece of label information has been assigned.
- 7. The image processing apparatus according to claim 1 or claim 2, wherein the information acquisition unit includes: a determination unit that determines, according to the pixel data values of the pixel of interest and its surrounding pixels, whether to perform the label information assignment process and the process of updating the position information and area information for each connected region; a label issuing unit that issues new label information based on the determination result of the determination unit; a line buffer that stores the label information; an additional information memory that stores the position information and the area information each in association with an address number; and an address list that stores, in association with one another, the label information stored in the line buffer, the address numbers stored in the additional information memory, and whether each piece of label information has been assigned.
- 8. An image processing method comprising: sequentially scanning each pixel of an image represented by binarized pixel data; and, while each pixel is sequentially scanned, assigning to the pixel of interest, as needed, label information indicating an identification number for each connected region in the image according to the pixel data values of the pixel of interest and its surrounding pixels, and updating, as needed, position information and area information for each connected region corresponding to each piece of label information, so that the label information, the position information, and the area information for the entire image are each acquired at the completion of the sequential scan.
- 9. An image input device comprising: an input panel having a plurality of light receiving elements arranged along an imaging surface to receive light reflected by an external proximity object; a scanning unit that sequentially scans each pixel of a captured image represented by binarized pixel data obtained based on the light reception signals from the light receiving elements; an information acquisition unit that, while each pixel is sequentially scanned, assigns to the pixel of interest, as needed, label information indicating an identification number for each connected region in the captured image according to the pixel data values of the pixel of interest and its surrounding pixels, and updates, as needed, position information and area information for each connected region corresponding to each piece of label information, so that the label information, the position information, and the area information for the entire captured image are each acquired at the completion of the sequential scan; and a position detection unit that acquires information on at least one of the position, shape, and size of the external proximity object based on the label information, the position information, and the area information obtained by the information acquisition unit.
- 10. An image input/output device comprising: an input/output panel having a plurality of display elements arranged along a display surface to display an image based on an image signal, and a plurality of light receiving elements arranged along the display surface to receive light emitted from the display surface and reflected by an external proximity object; a scanning unit that sequentially scans each pixel of a captured image represented by binarized pixel data obtained based on the light reception signals from the light receiving elements; an information acquisition unit that, while each pixel is sequentially scanned, assigns to the pixel of interest, as needed, label information indicating an identification number for each connected region in the captured image according to the pixel data values of the pixel of interest and its surrounding pixels, and updates, as needed, position information and area information for each connected region corresponding to each piece of label information, so that the label information, the position information, and the area information for the entire captured image are each acquired at the completion of the sequential scan; and a position detection unit that acquires information on at least one of the position, shape, and size of the external proximity object based on the label information, the position information, and the area information obtained by the information acquisition unit.
- 11. An image input/output device comprising: an input/output panel including a display panel having a liquid crystal layer between a first substrate and a second substrate, and a position detection unit that is formed in the display panel, has a first sensor electrode and a second sensor electrode that can be brought into contact by flexing of the second substrate, and detects the flexed position of the second substrate corresponding to the position of an external proximity object by reading the change in potential caused by contact between the first sensor electrode and the second sensor electrode; a scanning unit that sequentially scans each pixel of an image represented by binarized pixel data obtained based on a position detection signal from the position detection unit; an information acquisition unit that, while each pixel is sequentially scanned, assigns to the pixel of interest, as needed, label information indicating an identification number for each connected region in the image according to the pixel data values of the pixel of interest and its surrounding pixels, and updates, as needed, position information and area information for each connected region corresponding to each piece of label information, so that the label information, the position information, and the area information for the entire image are each acquired at the completion of the sequential scan; and a position detection unit that acquires information on at least one of the position, shape, and size of the external proximity object based on the label information, the position information, and the area information obtained by the information acquisition unit.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US12/680,567 US8836670B2 (en) | 2008-08-05 | 2009-07-28 | Image processing apparatus, image processing method, image input device and image input/output device |
EP09804895A EP2312421A4 (en) | 2008-08-05 | 2009-07-28 | Image processing device, image processing method, image input device and image input / output device |
CN2009801011091A CN101878465A (zh) | 2008-08-05 | 2009-07-28 | 图像处理装置、图像处理方法、图像输入设备和图像输入/输出设备 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2008-201463 | 2008-08-05 | ||
JP2008201463A JP5027075B2 (ja) | 2008-08-05 | 2008-08-05 | 画像処理装置、画像入力装置および画像入出力装置 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2010016411A1 true WO2010016411A1 (ja) | 2010-02-11 |
Family
ID=41663631
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2009/063382 WO2010016411A1 (ja) | 2008-08-05 | 2009-07-28 | 画像処理装置、画像処理方法、画像入力装置および画像入出力装置 |
Country Status (7)
Country | Link |
---|---|
US (1) | US8836670B2 (ja) |
EP (1) | EP2312421A4 (ja) |
JP (1) | JP5027075B2 (ja) |
KR (1) | KR20110051164A (ja) |
CN (1) | CN101878465A (ja) |
TW (1) | TW201020888A (ja) |
WO (1) | WO2010016411A1 (ja) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2013120598A (ja) * | 2011-12-06 | 2013-06-17 | Lg Display Co Ltd | タッチ領域ラベリング方法及びタッチセンサ駆動装置 |
Families Citing this family (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5224973B2 (ja) * | 2008-08-26 | 2013-07-03 | Japan Display West Inc. | Information input/output device and information input/output method |
JP5366051B2 (ja) * | 2009-04-20 | 2013-12-11 | Japan Display Inc. | Information input device and display device |
JP5382658B2 (ja) * | 2010-02-26 | 2014-01-08 | Japan Display Inc. | Display device with touch sensor, touch panel, touch panel driving method, and electronic apparatus |
TWI433004B (zh) * | 2010-05-14 | 2014-04-01 | Alcor Micro Corp | Method and system for determining touch points on a touch panel |
US8553003B2 (en) * | 2010-08-20 | 2013-10-08 | Chimei Innolux Corporation | Input detection method, input detection device, input detection program and media storing the same |
JP5064552B2 (ja) * | 2010-08-20 | 2012-10-31 | Chimei Innolux Corp. | Input detection method, input detection device, input detection program and recording medium |
CN102456079B (zh) * | 2010-10-18 | 2016-08-03 | 赛恩倍吉科技顾问(深圳)有限公司 | Dimension guiding system and method for offline programming of images |
TWI486547B (zh) * | 2010-10-20 | 2015-06-01 | Hon Hai Prec Ind Co Ltd | Dimension guiding system and method for offline programming of images |
TWI470997B (zh) * | 2011-10-31 | 2015-01-21 | Au Optronics Corp | Stereoscopic display |
KR101885216B1 (ko) * | 2011-12-30 | 2018-08-30 | Samsung Electronics Co., Ltd. | Multi-touch discrimination method for a touch sensor system |
JP6025456B2 (ja) * | 2012-08-28 | 2016-11-16 | Canon Inc. | Object information acquiring apparatus, display method, and program |
US9332167B1 (en) * | 2012-11-20 | 2016-05-03 | Amazon Technologies, Inc. | Multi-directional camera module for an electronic device |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS58151669A (ja) * | 1982-03-03 | 1983-09-08 | Hitachi Ltd | Labeling processing circuit for image processing device |
JPS61145689A (ja) * | 1984-12-18 | 1986-07-03 | Toshiba Corp | Area labeling circuit |
JPH07175925A (ja) * | 1993-12-17 | 1995-07-14 | Mitsubishi Electric Corp | Feature quantity calculation device and feature quantity calculation method |
JP2000242798A (ja) * | 1999-02-19 | 2000-09-08 | Nippon Chemicon Corp | Feature quantity extraction method for binary images |
JP2002164017A (ja) | 2000-11-24 | 2002-06-07 | Matsushita Electric Ind Co Ltd | Fluorescent lamp |
JP2004127272A (ja) | 2002-09-10 | 2004-04-22 | Sony Corp | Information processing apparatus and method, recording medium, and program |
JP2008097172A (ja) * | 2006-10-10 | 2008-04-24 | Sony Corp | Display device and display method |
JP2008146077A (ja) * | 2006-12-08 | 2008-06-26 | Samsung Electronics Co Ltd | Liquid crystal display device and manufacturing method thereof |
Family Cites Families (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS60200379A (ja) * | 1984-03-26 | 1985-10-09 | Hitachi Ltd | Segmentation device for image processing |
US4953224A (en) * | 1984-09-27 | 1990-08-28 | Hitachi, Ltd. | Pattern defects detection method and apparatus |
JP2766053B2 (ja) * | 1990-07-30 | 1998-06-18 | Hitachi, Ltd. | Image data processing method |
JP2891616B2 (ja) * | 1993-09-24 | 1999-05-17 | Fujitsu Ltd. | Provisional label assignment processing method and actual label assignment processing method |
US6483942B1 (en) * | 1999-09-27 | 2002-11-19 | Xerox Corporation | Micro region count image texture characterization |
JP2005339444A (ja) * | 2004-05-31 | 2005-12-08 | Toshiba Matsushita Display Technology Co Ltd | Display device |
JP2007299210A (ja) * | 2006-04-28 | 2007-11-15 | Sharp Corp | Image processing apparatus, image forming apparatus, image reading apparatus and image processing method |
US8121414B2 (en) * | 2007-06-13 | 2012-02-21 | Sharp Kabushiki Kaisha | Image processing method, image processing apparatus, and image forming apparatus |
2008
- 2008-08-05 JP JP2008201463A patent/JP5027075B2/ja active Active

2009
- 2009-07-28 WO PCT/JP2009/063382 patent/WO2010016411A1/ja active Application Filing
- 2009-07-28 US US12/680,567 patent/US8836670B2/en active Active
- 2009-07-28 KR KR1020107007269A patent/KR20110051164A/ko not_active Application Discontinuation
- 2009-07-28 EP EP09804895A patent/EP2312421A4/en not_active Withdrawn
- 2009-07-28 CN CN2009801011091A patent/CN101878465A/zh active Pending
- 2009-08-03 TW TW098126095A patent/TW201020888A/zh unknown
Non-Patent Citations (1)
Title |
---|
See also references of EP2312421A4 * |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2013120598A (ja) * | 2011-12-06 | 2013-06-17 | Lg Display Co Ltd | Touch area labeling method and touch sensor driving apparatus |
US8847901B2 (en) | 2011-12-06 | 2014-09-30 | Lg Display Co., Ltd. | Labeling touch regions of a display device |
US9182848B2 (en) | 2011-12-06 | 2015-11-10 | Lg Display Co., Ltd. | Labeling touch regions of a display device |
Also Published As
Publication number | Publication date |
---|---|
EP2312421A4 (en) | 2013-02-20 |
US20100253642A1 (en) | 2010-10-07 |
CN101878465A (zh) | 2010-11-03 |
EP2312421A1 (en) | 2011-04-20 |
JP2010039732A (ja) | 2010-02-18 |
JP5027075B2 (ja) | 2012-09-19 |
US8836670B2 (en) | 2014-09-16 |
TW201020888A (en) | 2010-06-01 |
KR20110051164A (ko) | 2011-05-17 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5027075B2 (ja) | Image processing device, image input device and image input/output device | |
JP5224973B2 (ja) | Information input/output device and information input/output method | |
JP5780970B2 (ja) | Touch-sensitive display | |
JP5191321B2 (ja) | Information input device, information input method, information input/output device and information input program | |
JP5274507B2 (ja) | Touch motion recognition method and apparatus | |
JP2009163739A (ja) | Position sensor display | |
EP2511801B1 (en) | Optical touch screen | |
JP5424475B2 (ja) | Information input device, information input method, information input/output device, information input program and electronic apparatus | |
US9454260B2 (en) | System and method for enabling multi-display input | |
JP5203070B2 (ja) | Image input/output device, light reception level correction method therefor, and image input method | |
JP2008097172A (ja) | Display device and display method | |
KR20070038430A (ko) | Display device and display method | |
KR20180064631A (ko) | Display device and driving method thereof | |
JP2011521331A (ja) | Interactive input device with optical bezel | |
US20110122096A1 (en) | Method of driving touch screen display apparatus, medium for recording method, and touch screen display apparatus | |
JP2009217461A (ja) | Display device and position detection method | |
JP5322163B2 (ja) | Display device, display method, and display program | |
JP2010003325A (ja) | Display device and display method | |
CN109542276B (zh) | Touch point recognition method and apparatus, and display device | |
JP2009070160A (ja) | Coordinate input device and handwriting input display device | |
JP2009037464A (ja) | Image display device and computer program | |
JP2008276317A (ja) | Image processing program, image processing apparatus, and image processing system | |
JP2000020227A (ja) | Optical touch panel having high-resolution means, and method | |
KR20090071374A (ko) | Position sensing display | |
JP2014203204A (ja) | Scanning touch panel device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 200980101109.1 Country of ref document: CN |
WWE | Wipo information: entry into national phase |
Ref document number: 12680567 Country of ref document: US |
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 09804895 Country of ref document: EP Kind code of ref document: A1 |
WWE | Wipo information: entry into national phase |
Ref document number: 2243/DELNP/2010 Country of ref document: IN |
WWE | Wipo information: entry into national phase |
Ref document number: 2009804895 Country of ref document: EP |
ENP | Entry into the national phase |
Ref document number: 20107007269 Country of ref document: KR Kind code of ref document: A |
NENP | Non-entry into the national phase |
Ref country code: DE |