US20140340340A1 - Visual interface system - Google Patents

Visual interface system

Info

Publication number
US20140340340A1
US20140340340A1
Authority
US
United States
Prior art keywords
information
interface system
visual interface
signal
matrix
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US14/344,462
Other languages
English (en)
Inventor
Hsiung-Kuang Tsai
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Slim Hmi Technology
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual
Publication of US20140340340A1
Assigned to SLIM HMI TECHNOLOGY: ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: TSAI, HSIUNG-KUANG
Legal status: Abandoned

Classifications

    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04B - TRANSMISSION
    • H04B5/00 - Near-field transmission systems, e.g. inductive or capacitive transmission systems
    • H04B5/70 - Near-field transmission systems, e.g. inductive or capacitive transmission systems specially adapted for specific purposes
    • H04B5/72 - Near-field transmission systems, e.g. inductive or capacitive transmission systems specially adapted for specific purposes for local intradevice communication
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 - Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033 - Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0354 - Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • G06F3/03545 - Pens or stylus
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F13/00 - Interconnection of, or transfer of information or other signals between, memories, input/output devices or central processing units
    • G06F13/14 - Handling requests for interconnection or transfer
    • G06F13/20 - Handling requests for interconnection or transfer for access to input/output bus
    • G06F13/28 - Handling requests for interconnection or transfer for access to input/output bus using burst mode transfer, e.g. direct memory access DMA, cycle steal
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 - Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033 - Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/038 - Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 - Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 - Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 - Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 - Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0412 - Digitisers structurally integrated in a display
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 - Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 - Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0416 - Control or interface arrangements specially adapted for digitisers
    • G06F3/04166 - Details of scanning methods, e.g. sampling time, grouping of sub areas or time sharing with display driving
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 - Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 - Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/044 - Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by capacitive means
    • G06F3/0446 - Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by capacitive means using a grid-like structure of electrodes in at least two directions, e.g. using row and column electrodes
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 - Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 - Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/046 - Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by electromagnetic means
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04W - WIRELESS COMMUNICATION NETWORKS
    • H04W4/00 - Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/80 - Services using short range communication, e.g. near-field communication [NFC], radio-frequency identification [RFID] or low energy communication
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00 - Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/041 - Indexing scheme relating to G06F3/041 - G06F3/045
    • G06F2203/04101 - 2.5D-digitiser, i.e. digitiser detecting the X/Y position of the input means, finger or stylus, also when it does not touch, but is proximate to the digitiser's interaction surface and also measures the distance of the input means within a short range in the Z direction, possibly with a separate measurement setup
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04B - TRANSMISSION
    • H04B5/00 - Near-field transmission systems, e.g. inductive or capacitive transmission systems
    • H04B5/20 - Near-field transmission systems, e.g. inductive or capacitive transmission systems characterised by the transmission technique; characterised by the transmission medium
    • H04B5/22 - Capacitive coupling
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04B - TRANSMISSION
    • H04B5/00 - Near-field transmission systems, e.g. inductive or capacitive transmission systems
    • H04B5/20 - Near-field transmission systems, e.g. inductive or capacitive transmission systems characterised by the transmission technique; characterised by the transmission medium
    • H04B5/24 - Inductive coupling

Definitions

  • the present invention relates to a human-machine interface system and, in particular, to a visual interface system.
  • Touch panels have been widely applied to commercial electronic products such as mobile phones, digital cameras, MP3 players, PDAs, GPS devices, tablet PCs, UMPCs, and the like.
  • Conventionally, the touch panel is bonded to a screen to form a touch input display apparatus.
  • a manufacturing method of a conventional touch input display apparatus is to dispose a touch panel on a display panel of a display module.
  • However, this approach increases not only the weight and size of the product but also its cost.
  • An objective of the present invention is to provide a visual interface system that can achieve the touch input function without configuring an additional touch panel, and is equipped with a near field communication (NFC) function.
  • the present invention can be implemented by the following technical proposals.
  • a visual interface system of the invention includes an operation apparatus and a matrix display apparatus.
  • the matrix display apparatus includes a display surface and a matrix substrate including a substrate and a matrix.
  • the matrix is disposed at one side of the substrate while the display surface is located at the other side of the substrate.
  • The encoded signal is capacitively coupled to the operation apparatus from the matrix substrate.
  • the transmission signal is transmitted to the matrix display apparatus or other apparatuses outside the visual interface system.
  • the transmission signal includes touch input information, instruction information, identification information, transaction information, file information or other information.
  • Another apparatus outside the system or the matrix display apparatus processes the transmission signal to obtain an information signal, which may include touch input information, instruction information, identification information, transaction information, file information or other information.
  • the visual interface system further includes at least one relay apparatus for processing the transmission signal to generate a relay process signal.
  • the relay process signal is transmitted to the matrix display apparatus or other apparatuses outside the visual interface system.
  • the transmission signal may include touch input information, instruction information, identification information, transaction information, file information or other information.
  • the relay process signal may also include touch input information, instruction information, identification information, transaction information, file information or other information.
  • Another apparatus outside the system or the matrix display apparatus processes the relay process signal to obtain an information signal, which may include touch input information, instruction information, identification information, transaction information, file information or other information.
  • The visual interface system further includes a mode trigger apparatus for enabling the matrix display apparatus to enter an operating mode and output the encoded signal when the mode trigger apparatus is triggered by a user or the operation apparatus.
  • When the operation apparatus is a user, the visual interface system further includes a sensing apparatus; when the user touches the display surface and the sensing apparatus simultaneously, the transmission signal is transmitted to the sensing apparatus.
  • the sensing apparatus is electrically coupled with other apparatuses outside the visual interface system or the matrix display apparatus.
  • the matrix further includes a plurality of row electrodes and a plurality of column electrodes.
  • the row electrodes and the column electrodes are intersected, and the encoded signal is applied to the row electrodes or the column electrodes.
  • the matrix further includes a plurality of transistors and a plurality of pixel electrodes, and the transistors are electrically connected with the row electrodes, the column electrodes and the pixel electrodes.
  • When the operation apparatus is operated on the display surface, the encoded signal is coupled to the operation apparatus from the matrix substrate, and the operation apparatus receives the encoded signal to generate a transmission signal.
  • the transmission signal can be transmitted to other apparatuses outside the system or at least one relay apparatus, or be sent back to the matrix display apparatus.
  • the transmission signal can be processed by other apparatuses outside the system, and/or at least one relay apparatus, and/or the matrix display apparatus, so as to retrieve the information (e.g. touch input information, instruction information, identification information, transaction information, file information or other information) contained in the encoded signal/signal processed by the relay apparatus/transmission signal.
  • The visual interface system of the invention can be directly applied to systems containing a matrix structure, such as a TFT LCD panel, OLED panel, LED panel, electrophoretic display panel, MEMS display panel, or the like, thereby integrating display, touch input and data transmission functions.
  • the manufactured products can be lighter and thinner and the product cost can be decreased, thereby improving the product competitiveness.
  • Moreover, the encoded signal is coupled to the external operation apparatus instead of being directly read by the matrix substrate, so it is unnecessary to modify the layout of the matrix substrate. For example, for the touch input application, it is unnecessary to add capacitance sensing components in the display panel for detecting changes of external capacitance values. As a result, the present invention can decrease the manufacturing cost and shorten the process time.
  • FIG. 1 is a block diagram of a visual interface system according to a first embodiment of the invention
  • FIG. 2 is a side view of a matrix display apparatus according to the first embodiment of the invention.
  • FIG. 3 is a schematic diagram showing a TFT substrate used in the first embodiment of the invention.
  • FIG. 4 is a schematic diagram showing the signals for two row electrodes and two column electrodes of the TFT substrate of FIG. 3.
  • FIG. 5 is a timing chart of the encoded signal transmitted through each column electrode according to the first embodiment of the invention.
  • FIG. 6 is a perspective view of the matrix display apparatus of the visual interface system according to the first embodiment of the invention.
  • FIG. 7 is a schematic diagram showing the matrix display apparatus and a user, as the operation apparatus, of the visual interface system according to the first embodiment of the invention.
  • FIG. 8 is a block diagram of a visual interface system according to a second embodiment of the invention.
  • FIG. 1 is a block diagram of a visual interface system 1 according to a first embodiment of the invention.
  • the visual interface system 1 includes an operation apparatus 11 and a matrix display apparatus 12 , which are coupled with each other.
  • The operation apparatus 11 is capacitively coupled to the matrix display apparatus 12 for transmitting signals, which is a non-contact signal transmission.
  • the operation apparatus 11 is, for example, a stylus, an IC card, an NFC reading apparatus, or a user (especially the hand of a user).
  • When the operation apparatus 11 is an electronic apparatus, it may include functional circuits such as a process control circuit, a storage circuit, or a transmission circuit.
  • Each of these circuits can be composed of hardware, software, firmware, or combinations thereof.
  • When the operation apparatus 11 is a user, the user can serve as a conductor for transmitting signals.
  • FIG. 2 is a side view of the matrix display apparatus 12 .
  • the matrix display apparatus 12 includes a display surface 121 and a matrix substrate 122 .
  • the matrix substrate 122 includes a substrate 123 and a matrix 124 .
  • the matrix 124 is disposed at one side of the substrate 123 , while the display surface 121 is located at the other side of the substrate 123 .
  • the display surface 121 is the surface of the matrix display apparatus 12 , which is closest to the viewer when the viewer is watching the images displayed on the matrix display apparatus 12 .
  • The matrix display apparatus 12 may further include a protective glass 125 disposed on the side of the substrate 123 opposite to the matrix 124.
  • In this case, the display surface 121 is the surface of the protective glass closest to the viewer.
  • The side of the substrate 123 close to the protective glass 125 may be further configured with other components such as a polarizer, a frame, or the like.
  • The matrix substrate 122 is a substrate configured with a pixel matrix for displaying images, such as the TFT substrate of an LCD panel, OLED panel, LED panel, electrophoretic panel, or MEMS display panel.
  • the matrix 124 includes a plurality of row electrodes, a plurality of column electrodes, and a plurality of pixel electrodes, wherein the row electrodes and the column electrodes are intersected.
  • the matrix 124 can be an active matrix or a passive matrix.
  • the matrix 124 is an active matrix for example.
  • the matrix 124 may further include a plurality of transistors electrically connected with the row electrodes, the column electrodes and the pixel electrodes, respectively.
  • an encoded signal ES is coupled to the operation apparatus 11 from the matrix substrate 122 , and the operation apparatus 11 receives the encoded signal ES so as to generate a transmission signal TS.
  • In this embodiment, the encoded signal ES contains the coordinates of the display screen of the matrix display apparatus 12, so the transmission signal TS contains the corresponding coordinate information.
  • Alternatively, the encoded signal can be composed of any information to be transmitted, such as touch input information, instruction information, identification information, transaction information, or file information (e.g. music, images, text), so that the transmission signal TS contains the corresponding information.
  • In addition to the encoded signal ES, a display data signal is applied to the matrix substrate 122 for displaying images.
  • the encoded signal ES is applied during the blanking time of the display data signals.
  • For example, the encoded signal ES can be applied between two frames, between the scan operations of two row electrodes, or during a gap created by shortening the input time of the display signals.
  • Alternatively, the encoded signal ES can have a higher frequency and be directly superimposed on the display signal.
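  • As an illustration only (not part of the patent), the following Python sketch shows one way the encoded-signal slots could be interleaved with the display data, either during the blanking period between frames or in the gap after each row is scanned; the slot names, the tiny row count and the scheduling details are assumptions.

```python
# Illustrative sketch only (not from the patent): interleaving encoded-signal
# slots with display data. Slot names and the tiny row count are assumptions.

ROWS = 4  # a tiny example matrix

def build_frame_schedule(rows=ROWS, code_in_vblank=True):
    """Return an ordered list of (slot_type, payload) for one frame."""
    schedule = []
    for r in range(rows):
        schedule.append(("scan_row", r))       # scan signal on row electrode r
        schedule.append(("display_data", r))   # display data for that row
        if not code_in_vblank:
            # Alternative mentioned in the description: squeeze a short
            # encoded-signal burst into the gap after each row is scanned.
            schedule.append(("encoded_signal", r))
    if code_in_vblank:
        # Encoded signal applied during the blanking time between frames.
        schedule.append(("encoded_signal", "vertical_blanking"))
    return schedule

if __name__ == "__main__":
    for slot in build_frame_schedule():
        print(slot)
```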
  • the encoded signal ES and the transmission signal TS will be further described hereinbelow, wherein the matrix substrate 122 is a TFT substrate of an LCD apparatus.
  • FIG. 3 is a schematic diagram showing a TFT substrate used in this embodiment.
  • The matrix 124 includes a plurality of row electrodes S1~SM, a plurality of column electrodes D1~DN, and a plurality of pixel electrodes E11~EMN.
  • The row electrodes and the column electrodes intersect; they are substantially perpendicular to each other or form another included angle.
  • The matrix 124 further includes a plurality of transistors electrically connected with the row electrodes S1~SM, the column electrodes D1~DN, and the pixel electrodes E11~EMN.
  • The row electrodes S1~SM are also referred to as scan lines, while the column electrodes D1~DN are also referred to as data lines.
  • The substrate 123 may further be configured with a driving module, which includes a data driving circuit, a scan driving circuit, a timing control circuit (not shown), and a gamma calibration circuit (not shown), for driving the LCD panel to display images. Since the function of the driving module is well known in the art, its detailed description is omitted here.
  • the above-mentioned matrix substrate is for illustrations only and is not to limit the invention.
  • The point of this embodiment is that the encoded signal is transmitted from the matrix substrate to the operation apparatus through the row electrodes S1~SM and/or the column electrodes D1~DN so as to generate the transmission signal.
  • The encoded signal may contain various information for different applications, such as the reference coordinates of the display screen, file information of different formats (e.g. personal data, music, images), and the like.
  • The column electrodes D1~DN can transmit not only the data signals for displaying images but also the encoded signal.
  • The encoded signal can be directly superimposed on the display signal at a higher frequency, or be inserted into the blanking period of the display data signal, such as the period after the scan of all row electrodes S1~SM is finished and before the next scan procedure starts (the blanking period between frames).
  • Alternatively, the encoded signal can be inserted after one row electrode is scanned and before the next row electrode is scanned, or within the scan period of a row electrode by shortening the display data signal period before the display data is sent.
  • The encoded signal can be provided by extending the function of the timing controller (T-con) circuit and the data or scan driving circuits, thereby simplifying the circuit design.
  • FIG. 4 is a schematic diagram showing the signals for two adjacent row electrodes and two adjacent column electrodes.
  • The row electrodes S1~SM transmit the scan signals SS, respectively, for sequentially turning on the transistors of each column.
  • Each of the column electrodes D1~DN transmits the corresponding encoded signal ES and the display data signal DS.
  • The encoded signal ES will be described hereinafter by taking a one-dimensional touch input perpendicular to the row electrodes, sequentially encoded in time, as an example. Of course, this method can also be applied to the other dimension perpendicular to the column electrodes, so that a complete two-dimensional touch input can be built.
  • In this embodiment, as shown in FIG. 4, the encoded signal ES is shown with a level different from that of the display data signal DS; in practice, however, the encoded signal ES and the display data signal DS may have the same level.
  • FIG. 5 is a timing chart of the encoded signal transmitted through each column electrode, wherein the data signals for display are omitted.
  • The column electrodes D1~DN transmit the encoded signals ES1~ESN, respectively.
  • the encoded signals can be transmitted through different column electrodes, respectively, or multiple column electrodes may transmit the same encoded signal.
  • For example, the column electrodes D1~D3 may transmit the same encoded signal ES1.
  • This approach may also be applied to the transmission of encoded signals through the row electrodes.
  • the encoded signals transmitted through the row electrodes and the column electrodes may be independent.
  • When the column electrodes transmit the encoded signals ES, respectively, it is necessary to provide a time reference point so that the positions of the column electrodes can be determined by comparison with that reference point.
  • This reference point can be a specific code transmitted in the same manner. For example, all column electrodes may output the code "1" twice and then transmit their sequential signals.
  • If the encoded signal coupled to the operation apparatus is "110010000", it indicates that the operation apparatus is located on the third column electrode, from which the x-coordinate of the operation apparatus can be determined (see the decoding sketch below).
  • Similarly, the y-coordinate of the operation apparatus can be estimated according to another encoded signal applied to the row electrodes.
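  • A minimal decoding sketch follows, assuming the simple scheme in the example above: every column electrode first outputs the reference code "1" twice, after which column n drives "1" only in the n-th time slot. The same decoding can be applied to the row electrodes to recover the y-coordinate. The function names are illustrative, not from the patent.

```python
# Illustrative decoder (an assumption consistent with the "110010000" example):
# each column electrode first outputs the reference code "1" twice, then a
# one-hot time sequence in which column n drives "1" only in the n-th slot.

def encode_column(n, total_columns):
    """Bit string coupled to the operation apparatus when it sits on column n (1-based)."""
    reference = "11"
    sequential = "".join("1" if i == n else "0" for i in range(1, total_columns + 1))
    return reference + sequential

def decode_column(bits):
    """Recover the column index from the coupled bit string."""
    if not bits.startswith("11"):
        raise ValueError("reference code not found")
    sequential = bits[2:]
    return sequential.index("1") + 1   # 1-based column (x-coordinate)

assert encode_column(3, 7) == "110010000"
assert decode_column("110010000") == 3   # operation apparatus on the third column
```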
  • Since the scan signals from the row electrodes for driving the display are generated sequentially, they can also be used as the encoded signals of the row electrodes.
  • the encoded signals of the row electrodes and the column electrodes may have a common reference point such as the horizontal or vertical sync signals for displaying images.
  • The column electrodes D1~DN and the row electrodes S1~SM may carry more complicated encoded signals, which will not be described in detail here.
  • the duty cycle of the encoded signal of this embodiment is smaller than that of the data signals so as to maintain the display quality.
  • The encoded signal is capacitively coupled from the matrix substrate 122 to the operation apparatus 11.
  • This embodiment takes the column electrodes D1~DN transmitting the encoded signals ES as an example, so a column electrode serves as one of the capacitive coupling electrodes, while the operation apparatus 11 provides the other capacitive coupling electrode.
  • When the operation apparatus 11 is a stylus, a conductor configured at the tip of the stylus functions as the other capacitive coupling electrode.
  • After receiving the encoded signal ES through the capacitive coupling, the operation apparatus 11 processes the received encoded signal ES to generate a transmission signal TS.
  • This process includes amplifying and/or decoding the encoded signal ES so as to determine the touch position, the touch gesture (writing style), the corresponding function instruction, or which column electrode is touched or pressed.
  • The encoded signal ES is capacitively coupled to the operation apparatus 11, and the coupling capacitance depends on the distance between the operation apparatus and the display surface. The amplitude of the coupled signal therefore provides z-axis information, so the operation apparatus 11 can obtain not only the two-dimensional coordinates but also the z-coordinate.
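  • The following sketch illustrates how the z-coordinate might be estimated from the coupled amplitude, assuming a parallel-plate-like coupling capacitance and a simple capacitive divider at the receiver; the divider model and all constants are assumptions for illustration, not values from the patent.

```python
# Illustrative z-axis estimate (assumptions: parallel-plate-like coupling,
# amplitude set by a capacitive divider, constants are made up for the example).

EPSILON_0 = 8.854e-12       # F/m
PLATE_AREA = 25e-6          # m^2, assumed effective overlap of the coupling electrodes
V_DRIVE = 5.0               # V, assumed amplitude of the encoded signal on the electrode

def received_amplitude(distance_m, c_input=10e-12):
    """Amplitude seen by the operation apparatus via a capacitive divider."""
    c_couple = EPSILON_0 * PLATE_AREA / distance_m
    return V_DRIVE * c_couple / (c_couple + c_input)

def estimate_distance(amplitude, c_input=10e-12):
    """Invert the divider to recover an approximate z distance."""
    c_couple = c_input * amplitude / (V_DRIVE - amplitude)
    return EPSILON_0 * PLATE_AREA / c_couple

z = 0.5e-3                                   # stylus hovering 0.5 mm above the surface
a = received_amplitude(z)
print(f"amplitude {a:.3f} V -> z estimate {estimate_distance(a)*1e3:.2f} mm")
```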
  • The transmission signal TS represents the result of processing the encoded signal ES, where the processing ranges from simple amplification to extracting information such as action commands.
  • The operation apparatus 11 can transmit the transmission signal TS to the matrix display apparatus 12, a relay apparatus, or other apparatuses outside the visual interface system through wired/wireless electrical coupling (including capacitive coupling) or optical coupling.
  • the transmission signal TS is directly transmitted to the matrix display apparatus 12 .
  • The information to be transmitted is encoded to generate an encoded signal ES based on a specific coding rule, and the encoded signal ES is then capacitively coupled from the matrix substrate 122 (e.g. configured in a cell phone or tablet computer) to the operation apparatus 11 (e.g. a short-distance wireless reading apparatus attached to a wall).
  • The operation apparatus 11 can process (decode or modify) the encoded signal ES based on the preset coding rule so as to obtain the transmission signal TS, and then use the transmission signal TS in the corresponding application, such as access control, payment, financial transactions, or file transmission.
  • the operation apparatus 11 processes the encoded signal ES to obtain the information contained in the transmission signal TS such as the touch input information, instruction information, identification information, transaction information, file information or other information.
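  • The patent does not specify the coding rule; as one hypothetical example, the sketch below frames the information with a preamble, a length byte and an XOR checksum before it is coupled to the reader, which then decodes it with the same preset rule.

```python
# The coding rule below is an assumption, not the patent's: a simple frame of
# preamble + length byte + payload + XOR checksum.

PREAMBLE = bytes([0xAA, 0x55])

def encode_frame(payload: bytes) -> bytes:
    checksum = 0
    for b in payload:
        checksum ^= b
    return PREAMBLE + bytes([len(payload)]) + payload + bytes([checksum])

def decode_frame(frame: bytes) -> bytes:
    if frame[:2] != PREAMBLE:
        raise ValueError("preamble mismatch")
    length = frame[2]
    payload, checksum = frame[3:3 + length], frame[3 + length]
    calc = 0
    for b in payload:
        calc ^= b
    if calc != checksum:
        raise ValueError("checksum mismatch")
    return payload

# e.g. a hypothetical identification token coupled from the display to a reader
frame = encode_frame(b"USER-1234")
assert decode_frame(frame) == b"USER-1234"
```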
  • the matrix display apparatus 12 may process the transmission signal TS to obtain an information signal, which contains the touch input information, instruction information, identification information, transaction information, file information or other information.
  • That is, the information signal, instead of the transmission signal TS, carries the complete information.
  • The signal processing includes, for example, amplification and decoding, which can be handled by either the operation apparatus 11 or the matrix display apparatus 12, or distributed between these units. Accordingly, the resulting transmission signal TS or information signal can contain touch input information, instruction information, identification information, transaction information, file information or other information.
  • a response signal RS can also be transmitted between the operation apparatus 11 and the matrix display apparatus 12 .
  • The response signal RS can report the receiving status of the operation apparatus 11 to the matrix display apparatus 12, notify the operation apparatus 11 to get ready to receive a signal, or synchronize the operation apparatus 11 and the matrix display apparatus 12.
  • This configuration can create an interactive mechanism between the transmitting and receiving signals.
  • the response signal RS can provide the synchronization function for establishing an information handshaking procedure between the operation apparatus 11 and the matrix display apparatus 12 .
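  • A sketch of the kind of handshake the response signal RS could support is given below; the message names (READY, ACK, NACK, SYNC) and the retry logic are illustrative assumptions rather than the patent's actual protocol.

```python
# Illustrative handshake (message names are assumptions, not from the patent):
# the response signal RS tells the display that the operation apparatus is
# ready, acknowledges reception, or requests re-synchronization.

from enum import Enum, auto

class RS(Enum):
    READY = auto()      # operation apparatus is ready to receive
    ACK = auto()        # encoded signal received and decoded correctly
    NACK = auto()       # reception failed, please resend
    SYNC = auto()       # request a new time reference point

def display_side(send_encoded, wait_response, max_retries=3):
    """Transmit the encoded signal until the operation apparatus acknowledges it."""
    for _ in range(max_retries):
        send_encoded()
        rs = wait_response()
        if rs is RS.ACK:
            return True
        if rs is RS.SYNC:
            continue            # restart from the reference code
    return False

if __name__ == "__main__":
    responses = iter([RS.NACK, RS.ACK])
    ok = display_side(lambda: None, lambda: next(responses))
    print("handshake complete:", ok)
```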
  • FIG. 6 is a schematic drawing of the matrix display apparatus 12 of the visual interface system according to the first embodiment of the invention.
  • the visual interface system further includes a mode trigger apparatus 127 .
  • The mode trigger apparatus 127 can enable the matrix display apparatus 12 to enter an operating mode and output the encoded signal ES.
  • When the user needs the touch input function, he or she activates the mode trigger apparatus 127 so as to switch the matrix display apparatus 12 into the touch input mode.
  • The row electrodes or column electrodes then start to transmit the encoded signal. Otherwise, the matrix display apparatus 12 does not enter the touch input mode and its touch input function may be partially or totally shut down.
  • the operation of the mode trigger apparatus 127 may have different operation modes. For example, after being activated, the mode trigger apparatus 127 may remain in the new state for a while and then return to the original state, or it may change the state while been activated each time, or it remains in the new state only when the activation lasts.
  • the mode trigger apparatus 127 can be configured on the operation apparatus (e.g. a switch on the stylus) as well. In this case, when the mode trigger apparatus 127 is activated, the operation apparatus 11 transmits a trigger signal to the matrix display apparatus 12 to control it to enter the touch input mode.
  • The mode trigger apparatus 127 can switch on the touch input function when it is activated once by the user, or it can require the user to keep activating it in order to maintain the touch input function.
  • the user can trigger the mode trigger apparatus to transmit the encoded signal for authorization or personal identification to the corresponding data receiving device.
  • the mode trigger apparatus 127 can be, for example, a mechanical switch, a touch sensing switch, or the likes.
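  • The trigger behaviours described above can be summarized in the following sketch; the timeout value and the function names are illustrative assumptions, not part of the patent.

```python
# Sketch of the three trigger behaviours described above (timeout value and
# names are illustrative assumptions).

from enum import Enum

class TriggerBehavior(Enum):
    MOMENTARY = "stay active for a while, then return to the original state"
    TOGGLE = "change state on each activation"
    HOLD = "active only while the switch is held"

def touch_mode_active(behavior, pressed, toggled_on, seconds_since_press,
                      timeout_s=5.0):
    """Decide whether the matrix display apparatus should output the encoded signal."""
    if behavior is TriggerBehavior.MOMENTARY:
        return seconds_since_press is not None and seconds_since_press < timeout_s
    if behavior is TriggerBehavior.TOGGLE:
        return toggled_on
    return pressed   # HOLD
```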
  • FIG. 7 is a schematic diagram showing the matrix display apparatus 12 and a user, as the operation apparatus 11 , of the visual interface system 1 according to the first embodiment of the invention.
  • the visual interface system 1 further includes a sensing apparatus 128 , which is electrically coupled with the matrix display apparatus 12 .
  • the transmission signal TS is transmitted to the matrix display apparatus 12 .
  • the user serves as a large conductor for transmitting the transmission signal TS to the matrix display apparatus 12 .
  • For example, the user can use his or her right hand to operate on the display surface 121 while using the left hand to press the sensing apparatus 128.
  • The encoded signal ES thus enters the user's body through the right hand, and the transmission signal TS is output from the left hand.
  • The sensing apparatus 128 may also incorporate the function of the mode trigger apparatus 127. For example, the operating mode is enabled only when the sensing apparatus 128 is pressed by hand. This can substantially reduce the power consumption and the problem of unintentional touch.
  • FIG. 8 is a block diagram of a visual interface system 1 a according to a second embodiment of the invention.
  • the visual interface system 1 a includes an operation apparatus 11 and a matrix display apparatus 12 .
  • the visual interface system 1 a further includes at least one relay apparatus 13 , and the transmission signal TS is transmitted to the matrix display apparatus 12 or other apparatuses outside the visual interface system through the relay apparatus 13 .
  • Transmitting the transmission signal TS through the user's hand to a relay apparatus 13 is used below to describe the implementation of this embodiment.
  • From the transmission signal TS, the relay apparatus 13 generates a relay processed signal IS and transmits it back to the matrix display apparatus 12.
  • The single relay apparatus 13 is only for illustration; multiple relay apparatuses may be configured.
  • When the operation apparatus 11 is a user, the user can transmit the transmission signal TS to the relay apparatus 13 instead of transmitting it back to the matrix display apparatus 12.
  • The relay apparatus 13 can be a portable communication device such as a cell phone. Accordingly, the user can serve as the transmission medium for conducting the encoded signal ES output from the matrix display apparatus 12 to the relay apparatus 13 for the purpose of transmitting file information.
  • the relay apparatus 13 can process the transmission signal TS to generate a relay processed signal IS and then transmit the relay processed signal IS to the matrix display apparatus 12 .
  • The signal is processed by means of, for example, amplification, decoding, modification and/or interpretation, which can be implemented by any one or all of the operation apparatus 11, the matrix display apparatus 12 and the relay apparatus 13.
  • the transmission signal TS, the relay processed signal IS or the information signal can contain the touch input information, instruction information, identification information, transaction information, file information or other information.
  • the response signal RS of the first embodiment can also be applied to the operation apparatus, relay apparatus and/or matrix display apparatus of the second embodiment, thereby creating an interactive mechanism between the transmitting and receiving signals.
  • the response signal RS can provide the synchronization function for establishing an information handshaking procedure between the operation apparatus, relay apparatus and matrix display apparatus.
  • When the operation apparatus is operated on the display surface, the encoded signal is coupled to the operation apparatus from the matrix substrate, and the operation apparatus receives the encoded signal to generate a transmission signal.
  • the transmission signal can be directly or indirectly transmitted to the matrix display apparatus.
  • The transmission signal can be processed by the operation apparatus, at least one relay apparatus, and/or the matrix display apparatus, so that the matrix display apparatus can retrieve the information contained in the encoded signal and the transmission signal, such as touch input information, instruction information, identification information, transaction information, file information or other information.
  • The visual interface system of the invention can be directly applied to systems containing a matrix structure, such as a TFT LCD panel, OLED panel, LED panel, electrophoretic display panel, MEMS display panel, or the like, thereby integrating display, touch input and data transmission functions.
  • the products can be lighter and thinner and the product cost can be decreased, thereby improving the product competitiveness.
  • The encoded signal is coupled to the external operation apparatus instead of being directly read by the matrix substrate, so it is unnecessary to modify the layout of the matrix substrate. For example, regarding the touch input application, it is unnecessary to add capacitance sensing components in the display panel for detecting changes of external capacitance values. As a result, the present invention can decrease the manufacturing cost and time.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Electromagnetism (AREA)
  • Control Of Indicators Other Than Cathode Ray Tubes (AREA)
  • Position Input By Displaying (AREA)
  • Mobile Radio Communication Systems (AREA)
  • Near-Field Transmission Systems (AREA)
  • Telephone Function (AREA)
  • Controls And Circuits For Display Device (AREA)
  • User Interface Of Digital Computer (AREA)
US14/344,462 2011-09-13 2011-09-13 Visual interface system Abandoned US20140340340A1 (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2011/079576 WO2013037103A1 (zh) 2011-09-13 2011-09-13 视觉界面系统

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2011/079576 A-371-Of-International WO2013037103A1 (zh) 2011-09-13 2011-09-13 视觉界面系统

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US15/488,965 Continuation US20170220190A1 (en) 2011-09-13 2017-04-17 Visual interface system

Publications (1)

Publication Number Publication Date
US20140340340A1 true US20140340340A1 (en) 2014-11-20

Family

ID=47882523

Family Applications (4)

Application Number Title Priority Date Filing Date
US14/344,462 Abandoned US20140340340A1 (en) 2011-09-13 2011-09-13 Visual interface system
US14/344,596 Expired - Fee Related US9335849B2 (en) 2011-09-13 2012-07-10 Visual interface system
US14/344,056 Active US9489071B2 (en) 2011-09-13 2012-09-10 Electronic apparatus and data transmission system
US15/488,965 Abandoned US20170220190A1 (en) 2011-09-13 2017-04-17 Visual interface system

Family Applications After (3)

Application Number Title Priority Date Filing Date
US14/344,596 Expired - Fee Related US9335849B2 (en) 2011-09-13 2012-07-10 Visual interface system
US14/344,056 Active US9489071B2 (en) 2011-09-13 2012-09-10 Electronic apparatus and data transmission system
US15/488,965 Abandoned US20170220190A1 (en) 2011-09-13 2017-04-17 Visual interface system

Country Status (7)

Country Link
US (4) US20140340340A1 (ko)
EP (3) EP2757444B1 (ko)
JP (3) JP6142149B2 (ko)
KR (3) KR101662413B1 (ko)
CN (3) CN103797448B (ko)
TW (2) TWI592841B (ko)
WO (3) WO2013037103A1 (ko)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150261331A1 (en) * 2012-11-06 2015-09-17 Hewlett-Packard Development Company, L.P. Interactive Display
US20150301676A1 (en) * 2013-03-13 2015-10-22 Beijing Boe Optoelectronics Technology Co., Ltd. Driving method and driving device of touch control display

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9577712B2 (en) 2012-05-25 2017-02-21 Hsiung-Kuang Tsai Non-display signal encoding method and matrix substrate
CN102916729B (zh) * 2012-09-04 2014-12-10 深圳市汇顶科技股份有限公司 一种触摸屏终端的近场通信方法、系统及触摸屏终端
US9582186B2 (en) * 2013-12-20 2017-02-28 Mediatek Inc. Signature verification between a mobile device and a computing device
US9892628B2 (en) 2014-10-14 2018-02-13 Logitech Europe S.A. Method of controlling an electronic device
JP2020030884A (ja) * 2016-12-26 2020-02-27 コニカミノルタ株式会社 パッシブマトリックス型有機エレクトロルミネッセンスディスプレイ及びタッチ検出方法
US20210014674A1 (en) * 2019-07-11 2021-01-14 Slim Hmi Technology Secure interaction system and communication display device
TWI726488B (zh) 2019-11-19 2021-05-01 元太科技工業股份有限公司 非接觸式智慧卡及其操作方法
KR20220091700A (ko) 2020-12-23 2022-07-01 삼성디스플레이 주식회사 전자 장치

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4841290A (en) * 1986-09-25 1989-06-20 Mitsubishi Denki Kabushiki Kaisha Display unit
US20020186341A1 (en) * 2001-06-08 2002-12-12 Kuni Yamamura IC chip and display device using the same
US20060000893A1 (en) * 2004-07-01 2006-01-05 American Express Travel Related Services Company, Inc. Method for biometric security using a smartcard-reader
US7777719B2 (en) * 2007-01-19 2010-08-17 Nokia Corporation System using a living body as a transmission medium
US20110007037A1 (en) * 2009-07-07 2011-01-13 Panasonic Corporation Electronic pen and electronic pen system

Family Cites Families (52)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH03294919A (ja) * 1990-04-12 1991-12-26 Nippon Telegr & Teleph Corp <Ntt> タブレット機能付き表示装置
US5117071A (en) * 1990-10-31 1992-05-26 International Business Machines Corporation Stylus sensing system
JPH06230884A (ja) * 1993-02-03 1994-08-19 Matsushita Electric Ind Co Ltd 座標入力装置
US5491706A (en) * 1993-04-07 1996-02-13 Sharp Kabushiki Kaisha Display-integrated type tablet device capable of detecting correct coordinates at a tip end of a detection pen by detecting external noise
JP3421078B2 (ja) 1993-05-07 2003-06-30 シャープ株式会社 情報入出力装置
JPH0720421A (ja) * 1993-06-29 1995-01-24 Asahi Glass Co Ltd 液晶表示素子
JP2653014B2 (ja) * 1993-07-26 1997-09-10 日本電気株式会社 アクティブマトリックス液晶ディスプレイ装置
JPH08106358A (ja) * 1994-08-10 1996-04-23 Fujitsu Ltd タブレット機能付き液晶表示装置、アクティブマトリクス型液晶表示装置及びタブレット機能付き液晶表示装置の駆動方法
JP4939682B2 (ja) * 1999-04-27 2012-05-30 エーユー オプトロニクス コーポレイション 表示装置
JP2001075074A (ja) * 1999-08-18 2001-03-23 Internatl Business Mach Corp <Ibm> タッチセンサ一体型液晶表示素子
JP2001144661A (ja) * 1999-11-17 2001-05-25 Sony Corp データ送信装置およびデータ受信装置
JP3319462B2 (ja) * 2000-06-26 2002-09-03 松下電工株式会社 信号伝送経路として人体を利用したデータ伝送システム
JP2004005415A (ja) * 2002-04-19 2004-01-08 Sharp Corp 入力装置および入出力一体型表示装置
JP3966069B2 (ja) * 2002-05-02 2007-08-29 セイコーエプソン株式会社 データ通信装置および携帯時計
JP3794411B2 (ja) * 2003-03-14 2006-07-05 セイコーエプソン株式会社 表示装置および電子機器
US7310779B2 (en) * 2003-06-26 2007-12-18 International Business Machines Corporation Method for creating and selecting active regions on physical documents
JP5008823B2 (ja) * 2004-03-19 2012-08-22 シャープ株式会社 表示装置
JP4547000B2 (ja) * 2004-04-01 2010-09-22 株式会社ワコム パネルとコードレス・トランスデューサのシステム
US7657242B2 (en) * 2004-09-27 2010-02-02 Qualcomm Mems Technologies, Inc. Selectable capacitance circuit
JP2006127190A (ja) * 2004-10-29 2006-05-18 Citizen Watch Co Ltd 入力装置
US20060094411A1 (en) 2004-10-29 2006-05-04 Dupont Pierre B Mobile station telephony service applications for mobile station having integrated transponder readers
CN100374995C (zh) * 2004-11-08 2008-03-12 英华达(南京)科技有限公司 手写笔、手写笔系统以及其控制方法
JP2006195925A (ja) * 2005-01-17 2006-07-27 Nippon Signal Co Ltd:The タッチパネル装置
WO2006087670A1 (en) 2005-02-17 2006-08-24 Koninklijke Philips Electronics N.V. Device capable of being operated within a network, network system, method of operating a device within a network, program element, and computer-readable medium
JP2006238328A (ja) 2005-02-28 2006-09-07 Sony Corp 会議システム及び会議端末装置並びに携帯端末装置
CN2785540Y (zh) * 2005-04-21 2006-06-07 周常安 手持式生理信号测量及无线传输/接收装置
CN101017419B (zh) * 2005-06-30 2010-05-12 智点科技(深圳)有限公司 触控式平板显示器
CN1716018A (zh) * 2005-07-14 2006-01-04 深圳市联思精密机器有限公司 具有触控功能的平板显示器
CN1904812A (zh) * 2005-07-29 2007-01-31 姚华 能用于普通计算机显示屏的触摸笔
KR100862578B1 (ko) * 2006-05-16 2008-10-09 엘지전자 주식회사 플라즈마 디스플레이 장치
US20090275283A1 (en) * 2006-08-29 2009-11-05 Zhao Zhuyan Use of intra-body communication
JP5224546B2 (ja) * 2007-02-14 2013-07-03 カバ・アクチェンゲゼルシャフト 識別信号の送信のためのシステムおよび持ち運び可能な装置
CN201707209U (zh) * 2007-03-09 2011-01-12 营口方舟科技有限公司 腕带显示器
EP2056186A1 (en) 2007-10-26 2009-05-06 Research In Motion Limited Touch screen and electronic device
JP2009146088A (ja) * 2007-12-13 2009-07-02 Hitachi Displays Ltd 静電結合型信号送受信回路
JP2009253478A (ja) * 2008-04-02 2009-10-29 Sony Ericsson Mobilecommunications Japan Inc 情報通信装置、情報通信装置の制御方法
CN102112950B (zh) * 2008-09-12 2015-01-28 奥博特瑞克斯株式会社 电容型触摸面板、显示装置及电容型触摸面板的制造方法
US8482545B2 (en) * 2008-10-02 2013-07-09 Wacom Co., Ltd. Combination touch and transducer input system and method
TWM364503U (en) * 2009-03-23 2009-09-11 Astek Technology Ltd Wireless ring-type physiological detector
US20120113051A1 (en) * 2009-05-27 2012-05-10 Koninklijke Philips Electronics N.V. Touch- or proximity -sensitive interface
JP2010286895A (ja) * 2009-06-09 2010-12-24 Toshiba Tec Corp 情報入力装置及び情報処理装置
KR101562565B1 (ko) * 2009-06-25 2015-10-22 삼성전자주식회사 전기장을 이용한 데이터 송수신 방법 및 이를 위한 장치
TWM371930U (en) * 2009-06-29 2010-01-01 Ming-Chang Wu Prompt system for object sensing
US20110001717A1 (en) 2009-07-06 2011-01-06 Charles Hayes Narrow Border for Capacitive Touch Panels
KR20110027572A (ko) 2009-09-08 2011-03-16 한국전자통신연구원 근접장을 이용하는 통신 장치
US20110059692A1 (en) * 2009-09-08 2011-03-10 Electronics And Telecommunications Research Institute Communications device using near field
TWI428661B (zh) 2009-11-09 2014-03-01 Silicon Integrated Sys Corp 觸碰顯示裝置
KR101351413B1 (ko) * 2009-12-11 2014-01-14 엘지디스플레이 주식회사 터치 패널 및 이를 적용한 터치 패널 일체형 액정 표시 장치
KR101107171B1 (ko) * 2010-02-11 2012-01-25 삼성모바일디스플레이주식회사 접촉 감지 장치, 이를 포함하는 표시 장치 및 그 구동 방법
JP5440376B2 (ja) * 2010-05-18 2014-03-12 日本電気株式会社 タッチセンサ、タッチパネルおよび該タッチセンサを用いた情報伝達方法
CN201828896U (zh) 2010-09-27 2011-05-11 南京点面光电有限公司 一种电容式触摸屏
CN102073409A (zh) 2010-12-29 2011-05-25 广东中显科技有限公司 触摸屏

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4841290A (en) * 1986-09-25 1989-06-20 Mitsubishi Denki Kabushiki Kaisha Display unit
US20020186341A1 (en) * 2001-06-08 2002-12-12 Kuni Yamamura IC chip and display device using the same
US20060000893A1 (en) * 2004-07-01 2006-01-05 American Express Travel Related Services Company, Inc. Method for biometric security using a smartcard-reader
US7777719B2 (en) * 2007-01-19 2010-08-17 Nokia Corporation System using a living body as a transmission medium
US20110007037A1 (en) * 2009-07-07 2011-01-13 Panasonic Corporation Electronic pen and electronic pen system

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150261331A1 (en) * 2012-11-06 2015-09-17 Hewlett-Packard Development Company, L.P. Interactive Display
US10705631B2 (en) * 2012-11-06 2020-07-07 Hewlett-Packard Development Company, L.P. Interactive display
US20150301676A1 (en) * 2013-03-13 2015-10-22 Beijing Boe Optoelectronics Technology Co., Ltd. Driving method and driving device of touch control display
US9958980B2 (en) * 2013-03-13 2018-05-01 Boe Technology Group Co., Ltd. Driving method and driving device of touch control display

Also Published As

Publication number Publication date
EP2757444A4 (en) 2015-03-18
KR101570309B1 (ko) 2015-11-18
KR20140074902A (ko) 2014-06-18
EP2757445B1 (en) 2017-10-04
WO2013037238A1 (zh) 2013-03-21
JP6033308B2 (ja) 2016-11-30
CN103827934A (zh) 2014-05-28
EP2757538A1 (en) 2014-07-23
KR101662413B1 (ko) 2016-10-04
JP5951775B2 (ja) 2016-07-13
EP2757445A4 (en) 2015-07-15
KR101641802B1 (ko) 2016-07-21
TW201312413A (zh) 2013-03-16
JP2014531150A (ja) 2014-11-20
KR20140075688A (ko) 2014-06-19
US20140220892A1 (en) 2014-08-07
WO2013037281A1 (zh) 2013-03-21
EP2757445A1 (en) 2014-07-23
WO2013037103A1 (zh) 2013-03-21
CN103827934B (zh) 2017-10-13
CN103797448A (zh) 2014-05-14
CN103797450A (zh) 2014-05-14
US9489071B2 (en) 2016-11-08
JP6142149B2 (ja) 2017-06-07
TW201312366A (zh) 2013-03-16
EP2757538A4 (en) 2015-05-06
TWI485565B (zh) 2015-05-21
US9335849B2 (en) 2016-05-10
EP2757538B1 (en) 2019-09-11
JP2014529134A (ja) 2014-10-30
CN103797450B (zh) 2018-03-23
CN103797448B (zh) 2018-09-04
KR20140074921A (ko) 2014-06-18
US20140340357A1 (en) 2014-11-20
EP2757444B1 (en) 2017-04-19
JP2014530404A (ja) 2014-11-17
US20170220190A1 (en) 2017-08-03
EP2757444A1 (en) 2014-07-23
TWI592841B (zh) 2017-07-21

Similar Documents

Publication Publication Date Title
US20170220190A1 (en) Visual interface system
US10884619B2 (en) Character input method and display apparatus
US8743021B1 (en) Display device detecting gaze location and method for controlling thereof
US20180081463A1 (en) Touch control liquid crystal display device and electronic apparatus
US9158405B2 (en) Electronic device including touch-sensitive display and method of controlling same
CN103135849A (zh) 在显示模式和触摸模式间切换的显示设备及其方法和系统
KR102062724B1 (ko) 이동 단말기와 그 구동방법
US9141243B2 (en) Electronic device including touch-sensitive display and method of detecting touches
US20200004375A1 (en) Visual interface system
US20180373387A1 (en) Visual interface system
EP4167582A1 (en) Electronic device and control method thereof
KR101668937B1 (ko) 시각적 인터페이스 장치 및 데이터 전송 시스템
EP2946269B1 (en) Electronic device with touch-sensitive display and gesture-detection, method for operating same, and computer-readable storage device
JP2014535088A (ja) ビジョンインタフェースシステムの駆動方法
TWI492110B (zh) 視覺介面系統
US11507218B2 (en) Dual interface smart device
US20150205372A1 (en) Method and apparatus for providing input interface for mobile terminal
US20230280858A1 (en) Electronic apparatus and control method thereof
EP2674838A1 (en) Electronic device including touch-sensitive display and method of controlling same

Legal Events

Date Code Title Description
AS Assignment

Owner name: SLIM HMI TECHNOLOGY, TAIWAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:TSAI, HSIUNG-KUANG;REEL/FRAME:041163/0341

Effective date: 20170113

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION