WO2017028491A1 - Touch display device and touch display method - Google Patents
Touch display device and touch display method
- Publication number
- WO2017028491A1 (application PCT/CN2016/071171)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- touch
- signal
- sound
- display screen
- type
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/0416—Control or interface arrangements specially adapted for digitisers
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/0416—Control or interface arrangements specially adapted for digitisers
- G06F3/0418—Control or interface arrangements specially adapted for digitisers for error correction or compensation, e.g. based on parallax, calibration or alignment
- G06F3/04186—Touch location disambiguation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/0412—Digitisers structurally integrated in a display
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/043—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means using propagating acoustic waves
- G06F3/0433—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means using propagating acoustic waves in which the acoustic waves are either generated by a movable member and propagated within a surface layer or propagated within a surface layer and captured by a movable member
Definitions
- the present disclosure relates to the field of touch technologies, and in particular to a touch display device and a touch display method.
- touch devices based on touch operations have gradually become standard devices for various portable mobile terminals.
- Through the touch screen, the user can directly input coordinate information to the mobile terminal by hand; like the mouse and the keyboard, it is an input device.
- the touch screen has many advantages such as ruggedness, fast response, space saving, and easy communication.
- smartphones with touch screens are becoming more powerful and have larger screens.
- the large screen makes it easy for users to browse the whole webpage and watch videos, which brings good display and viewing comfort.
- touch technologies include resistive, capacitive, infrared, and electromagnetic-pen types.
- the technology ranges from single-touch to multi-touch, and the sensitivity is also significantly improved. Touch technology has gradually become more practical and mature.
- the existing touch technology generally requires the user to perform a touch operation by hand, such as by a finger or a stylus.
- for users who cannot conveniently operate by hand, existing touch technologies often fail to meet their control needs.
- the technical problem to be solved by the present disclosure is to provide a touch display device and a touch display method, which enable a user to perform touch operations using different parts of the body, and then control the display screen to display corresponding screens according to those operations, thereby making the touch display device convenient for the disabled to use.
- a touch display device including:
- a plurality of detecting elements disposed on the display screen for acquiring sound vibration signals generated when different parts of the body contact the display screen;
- a central processing unit coupled to the detecting component, configured to analyze a sound vibration signal transmitted by the detecting component, determine a body touch type, and generate a command signal including the body touch type;
- a control chip connected to the central processor, configured to receive a command signal sent by the central processor and to generate image data and a driving signal according to the command signal;
- an image processor connected to the control chip, configured to receive the image data and the driving signal sent by the control chip, thereby driving the display screen to display a corresponding screen.
- the touch display device further includes:
- a sound feature signal database for storing a correspondence between the sound feature signal and the body touch type
- the central processor is specifically configured to acquire a sound feature signal according to the sound vibration signal, and determine a body touch type corresponding to the sound feature signal according to the correspondence relationship.
- the central processor is further configured to acquire a sound feature signal according to the received sound vibration signal, receive the input body touch type corresponding to the sound vibration signal, and send the sound feature signal and the input body touch type to the sound feature signal database;
- the sound feature signal database is further configured to update a correspondence between the sound feature signal and the body touch type according to the received sound feature signal and the body touch type.
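The update path described above (store a correspondence, overwrite it, or add a new one) can be sketched as a minimal database. This is an illustrative sketch only; the class name, the string-valued feature keys, and the touch-type labels are assumptions, not part of the patent.

```python
class SoundFeatureDatabase:
    """Minimal sketch of the sound feature signal database: it stores
    correspondences between sound feature signals and body touch types,
    and can update existing correspondences or add new ones."""

    def __init__(self):
        self._map = {}  # sound feature signal -> body touch type

    def lookup(self, feature):
        """Return the body touch type for a feature, or None if unknown."""
        return self._map.get(feature)

    def update(self, feature, body_touch_type):
        """Overwrite an existing correspondence or add a new one."""
        self._map[feature] = body_touch_type

db = SoundFeatureDatabase()
db.update("feature_A", "forehead")       # initial correspondence
db.update("feature_A", "forehead+nose")  # update it as needed
db.update("feature_F", "chin+nose")      # add a new correspondence
print(db.lookup("feature_A"))  # -> forehead+nose
```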
- the body touch type includes a single touch and a plurality of touches, the single touch is a single body part touch, and the multiple touch is a combination of at least two body parts touch.
- the central processing unit includes:
- a sampling module configured to receive a sound vibration signal sent by the detecting component
- a signal processing module configured to process the sound vibration signal to obtain a sound parameter of the sound vibration signal
- a feature extraction module configured to perform feature extraction on the sound vibration signal according to the acquired sound parameter, and acquire a sound feature signal
- a comparison module configured to compare the acquired sound feature signal with data in the sound feature signal database to determine a body touch type corresponding to the sound feature signal.
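The four modules above (sampling, signal processing, feature extraction, comparison) amount to a small signal pipeline: transform the raw vibration, take the characteristic frequency and amplitude as the sound feature signal, and match it against the database. The sketch below is illustrative only — it uses a naive DFT in place of whatever transform the device actually implements, and the database frequencies are invented values, not the patent's data.

```python
import math

def extract_sound_feature(samples, sample_rate):
    """Signal processing + feature extraction: a naive DFT that returns
    the characteristic (frequency, amplitude) of the strongest spectral
    component of a sound vibration signal."""
    n = len(samples)
    best_bin, best_mag = 0, 0.0
    for k in range(1, n // 2):  # skip the DC bin, scan positive frequencies
        re = sum(s * math.cos(2 * math.pi * k * i / n) for i, s in enumerate(samples))
        im = sum(s * math.sin(2 * math.pi * k * i / n) for i, s in enumerate(samples))
        mag = math.hypot(re, im)
        if mag > best_mag:
            best_bin, best_mag = k, mag
    return best_bin * sample_rate / n, best_mag

def classify_touch(feature, database, freq_tolerance=20.0):
    """Comparison module: match the extracted sound feature signal against
    the sound feature signal database to find the body touch type."""
    freq, _amplitude = feature
    for entry_freq, touch_type in database:
        if abs(freq - entry_freq) <= freq_tolerance:
            return touch_type
    return None

# Hypothetical database: characteristic frequency (Hz) -> body touch type.
DATABASE = [(220.0, "forehead"), (440.0, "nose"), (660.0, "chin"), (880.0, "toe")]

# Simulate a sampled vibration: a 440 Hz tone, 0.05 s at 8 kHz sampling.
SAMPLE_RATE = 8000
samples = [math.sin(2 * math.pi * 440.0 * i / SAMPLE_RATE) for i in range(400)]
feature = extract_sound_feature(samples, SAMPLE_RATE)
print(classify_touch(feature, DATABASE))  # -> nose
```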
- the touch display device further includes:
- a trigger event database for storing a correspondence between a body touch type and a trigger event
- the control chip is specifically configured to compare the body touch type included in the received command signal with the information in the trigger event database, and determine image data and a driving signal corresponding to the body touch type.
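The control chip's comparison against the trigger event database is, at its simplest, a lookup from body touch type to trigger event. The mapping below follows the examples the text gives (forehead → determine, chin → cancel, nose → double-click, toe → draw); the names are illustrative, not the patent's identifiers.

```python
# Hypothetical trigger event database, following the text's examples.
TRIGGER_EVENTS = {
    "forehead": "determine",
    "chin": "cancel",
    "nose": "double_click",
    "toe": "draw",
}

def handle_command(body_touch_type):
    """Control-chip lookup: map the body touch type carried in a command
    signal to its trigger event; unknown types are ignored (None)."""
    return TRIGGER_EVENTS.get(body_touch_type)

print(handle_command("chin"))  # -> cancel
```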
- the touch display device further includes:
- a plurality of touch sensors disposed on the display screen for acquiring a coordinate position of the touch point when the different parts of the body contact the display screen;
- the central processor is further connected to the touch sensor for generating a command signal including the body touch type and the touch point coordinate position.
- the trigger event database further stores a correspondence relationship between the touch point coordinate position and the body touch type and the trigger event;
- the control chip is specifically configured to compare the body touch type and the touch point position coordinate included in the received command signal with the information in the trigger event database, and determine the image data and driving signal corresponding to the body touch type and the touch point position coordinates.
- the touch display device further includes:
- a data storage unit connected to the control chip, configured to store a command signal of a current frame received by the control chip.
- the touch display device further includes:
- a buffer unit connected to the control chip for buffering an instruction signal of a next frame received by the control chip.
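The storage and buffering arrangement described above can be sketched as follows: the buffer unit pre-holds the next frame's command signal, and the data storage unit records processed command signals as historical data. The class and the dict-shaped command signals are assumptions for illustration.

```python
from collections import deque

class CommandPipeline:
    """Sketch of the storage/buffer arrangement around the control chip:
    the buffer unit holds the command signal of the next frame, and the
    data storage unit records already-processed commands as history."""

    def __init__(self):
        self.buffer = deque()  # buffer unit: command signals awaiting a frame
        self.history = []      # data storage unit: current/past frame commands

    def receive(self, command):
        self.buffer.append(command)

    def process_frame(self):
        """Take the pre-buffered command so the control chip can generate
        image data and a driving signal without waiting for it."""
        if not self.buffer:
            return None
        command = self.buffer.popleft()
        self.history.append(command)
        return command

pipeline = CommandPipeline()
pipeline.receive({"type": "forehead"})
pipeline.receive({"type": "toe"})
print(pipeline.process_frame())  # -> {'type': 'forehead'}
```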
- the embodiment of the present disclosure further provides a touch display method, which is applied to the touch display device as described above, and the method includes:
- the display screen is driven by the image data and the driving signal to display a corresponding picture.
- the method further includes:
- the step of generating the command signal including the body touch type is specifically:
- a command signal is generated that includes the body touch type and the touch point coordinate position.
- a plurality of detecting elements are disposed on the display screen and can acquire the sound vibration signals generated when different parts of the body contact the display screen; the central processor can then determine the body touch type according to the sound vibration signal and generate a corresponding command signal;
- the control chip receives the command signal and generates corresponding image data and driving signals; the image processor receives the image data and the driving signal, thereby driving the display screen to display the corresponding picture.
- the embodiment of the present disclosure can identify the body part contacting the display screen, so that different touch functions can be realized by using different body parts, thereby controlling the display screen to display the corresponding picture.
- the forehead touch can realize the determine function
- the chin touch can realize the cancel function
- the nose touch can realize the double click function
- the toe touch can realize the drawing function
- the display screen can thus be controlled according to the touch operation, which makes it easier for people with disabilities to use touch display devices.
- FIG. 1 is a schematic structural diagram of a touch display device according to some embodiments of the present disclosure
- FIG. 2 is a schematic structural diagram of a central processing unit according to some embodiments of the present disclosure.
- FIG. 3 is a schematic structural diagram of a touch display device according to some embodiments of the present disclosure.
- FIG. 4 is a schematic flow chart of a touch display method according to some embodiments of the present disclosure.
- FIG. 5 is a schematic flowchart of a touch display method according to some embodiments of the present disclosure.
- FIG. 6 is a schematic diagram of a body touch type including a single touch and multiple touches.
- the embodiments of the present disclosure are directed to the problem in the prior art that touch display devices are inconvenient for the disabled, and provide a touch display device and a touch display method, so that the user can perform touch operations using different parts of the body and the display screen is then controlled according to these touch operations to display the corresponding screen, thereby facilitating the use of the touch display device by the disabled.
- the present disclosure provides a touch display device, as shown in FIG. 1, comprising:
- the central processing unit 20 is connected to the detecting component 10 for analyzing the sound vibration signal transmitted by the detecting component 10, determining the body touch type, and generating a command signal including the body touch type;
- the control chip 30 is connected to the central processing unit 20 for receiving a command signal sent by the central processing unit 20, and generating image data and a driving signal according to the command signal;
- the image processor 40 is connected to the control chip 30 for receiving the image data and the driving signal sent by the control chip 30, thereby driving the display screen 50 to display the corresponding picture.
- a plurality of detecting elements are disposed on the display screen and can acquire the sound vibration signals generated when different parts of the body touch the display screen; the central processing unit can then determine the body touch type according to the sound vibration signal and generate a corresponding command signal;
- the control chip receives the command signal and generates corresponding image data and a driving signal; the image processor receives the image data and the driving signal, thereby driving the display screen to display the corresponding picture.
- the embodiment of the present disclosure can identify the body part contacting the display screen, so that different touch functions can be realized by using different body parts, thereby controlling the display screen to display the corresponding picture.
- the forehead touch can realize the determine function
- the chin touch can realize the cancel function, the nose tip touch realizes the double-click function, and the toe touch can realize the drawing function; the display screen is controlled according to the touch operation, thereby facilitating the use of the touch display device by the disabled.
- the touch display device further includes: a sound feature signal database for storing a correspondence between the sound feature signal and the body touch type.
- the body touch type is divided into single touches and multiple touches.
- the single touch is a single body part touch, for example, there may be a forehead touch, a nose touch, a chin touch, a toe touch, and the like.
- a multiple touch is a combination of touches of at least two of the body parts listed above, arranged and combined in the above manner. Different parts of the body emit different sounds when they touch the display, and each body touch type has a unique sound feature signal.
- the sound feature signal A corresponds to the forehead touch
- the sound feature signal B corresponds to the nose touch
- the sound feature signal C corresponds to the chin touch
- the sound feature signal D corresponds to the toe touch
- the sound feature signal E corresponds to the forehead + nose tip touch
- the sound feature signal F corresponds to the chin + nose tip touch and so on.
- the central processing unit 20 is specifically configured to acquire a sound feature signal according to the sound vibration signal, and determine a body touch type corresponding to the sound feature signal according to the correspondence relationship. For example, if the central processing unit 20 acquires the sound feature signal A according to the sound vibration signal, it can be determined that the body touch type is the forehead touch. When the central processing unit 20 acquires the sound characteristic signal B according to the sound vibration signal, it can be determined that the body touch type is the nose touch. The central processor 20 acquires the sound feature signal C according to the sound vibration signal, and then determines that the body touch type is chin touch. The central processor 20 acquires the sound feature signal D according to the sound vibration signal, and then determines that the body touch type is toe touch.
- the central processor 20 acquires the sound feature signal E according to the sound vibration signal, and then determines that the body touch type is the forehead + nose tip touch.
- the central processing unit 20 acquires the sound feature signal F according to the sound vibration signal, it can be determined that the body touch type is chin + nose touch.
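The touch types A–F above are the four single-part touches plus pairwise combinations of parts. Enumerating such types can be sketched with `itertools.combinations`; the part names and the `+` label format are illustrative assumptions.

```python
from itertools import combinations

# Single-part touches named in the text; the identifiers are hypothetical.
PARTS = ["forehead", "nose", "chin", "toe"]

def touch_types(max_parts=2):
    """Enumerate body touch types: every single touch plus every
    combination of at least two body parts (capped at pairs here)."""
    types = list(PARTS)
    for r in range(2, max_parts + 1):
        types += ["+".join(combo) for combo in combinations(PARTS, r)]
    return types

print(touch_types())  # 4 single touches plus 6 two-part combinations
```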
- the central processing unit 20 is further configured to acquire a sound feature signal according to the received sound vibration signal, and receive the input body touch type corresponding to the sound vibration signal, and send the sound feature signal and the input body touch type to Sound signature signal database;
- the sound feature signal database is further configured to update the correspondence between the sound feature signal and the body touch type according to the received sound feature signal and body touch type. In this way, the sound feature signal database can not only store correspondences between sound feature signals and body touch types, but can also update those correspondences as needed and add new correspondences between sound feature signals and body touch types.
- the central processing unit 20 includes:
- the sampling module 21 is configured to receive the sound vibration signal sent by the detecting component 10.
- the detecting component distributed on the display screen generates a sound vibration signal
- the sampling module mainly collects the original sound vibration signals generated when different body parts contact the display screen.
- the body touch type cannot be judged directly from the original sound vibration signal;
- the signal processing module 22 is configured to process the sound vibration signal collected by the sampling module 21 to obtain the sound parameters of the sound vibration signal. Specifically, the signal processing module 22 may perform a Fourier transform or similar processing on the original sound vibration signal to acquire key parameters, such as the characteristic amplitude and characteristic frequency of the sound vibration signal;
- the feature extraction module 23 is configured to perform feature extraction on the sound vibration signal according to the sound parameter acquired by the signal processing module 22, and acquire the sound feature signal;
- the comparison module 24 is configured to compare the acquired sound feature signal with data in the sound feature signal database to determine a body touch type corresponding to the sound feature signal.
- the touch display device further includes: a trigger event database, configured to store a correspondence between the body touch type and the trigger event, for example: the forehead touch corresponds to the determine function, the chin touch to the cancel function, the nose touch to the double-click function, the toe touch to the drawing function, and so on.
- the control chip 30 is specifically configured to compare the body touch type included in the received command signal with the information in the trigger event database, and determine image data and a driving signal corresponding to the body touch type. For example, when the central processing unit 20 determines that the body touch type is forehead touch, the control chip 30 sends the image data and the driving signal corresponding to the determining function to the image processor, thereby driving the display screen to display a screen corresponding to the determining function. When the central processing unit 20 determines that the body touch type is the chin touch, the control chip 30 transmits the image data and the driving signal corresponding to the cancel function to the image processor, thereby driving the display screen to display the screen corresponding to the cancel function.
- when the central processing unit 20 determines that the body touch type is the nose touch, the control chip 30 transmits the image data and the driving signal corresponding to the double-click function to the image processor, thereby driving the display screen to display the screen corresponding to the double-click function.
- when the central processing unit 20 determines that the body touch type is the toe touch, the control chip 30 sends the image data and the driving signal corresponding to the drawing function to the image processor, thereby driving the display screen to display the screen corresponding to the drawing function.
- the touch display device further includes: a data storage unit 70 connected to the control chip 30 for storing the command signal of the current frame received by the control chip 30, so that the data storage unit 70 can record historical data.
- the touch display device further includes a buffer unit 80 connected to the control chip 30 for buffering the instruction signal of the next frame received by the control chip 30.
- the control chip 30 can directly generate the corresponding image data and driving signal from the command signal buffered in the buffer unit 80, thereby speeding up body part recognition and touch triggering.
- the buffer unit 80, the data storage unit 70, and the image processor 40 can be connected to the control chip 30 via a bus.
- the body part contacting the display screen can be identified, so that different touch functions can be realized by using different body parts, and the display screen can then be controlled to display the corresponding picture.
- the forehead touch can realize the determine function
- the chin touch can realize the cancel function
- the nose touch can realize the double click function
- the toe touch can realize the drawing function
- the display screen can thus be controlled according to the touch operation, which makes it easier for people with disabilities to use touch display devices.
- the present disclosure further provides a touch display device, as shown in FIG. 3, including:
- the plurality of touch sensors 60 are disposed on the display screen 50 for acquiring coordinate positions of the touch points when the different parts of the body are in contact with the display screen 50.
- the touch sensors 60 may be arranged in an array on the display screen 50. Correspondingly, the number of touch sensors can be determined according to the required sensing precision: the number of touch sensors is increased when high sensing precision is required, and reduced when the required sensing precision is low;
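The relationship between sensing precision and sensor count can be made concrete with a small sizing sketch. This assumes a uniform grid with one sensor per precision-sized pitch; the screen dimensions and pitch values are invented for illustration.

```python
import math

def sensor_grid(width_mm, height_mm, precision_mm):
    """Illustrative sizing for a uniform touch-sensor grid: one sensor per
    precision_mm of pitch in each direction, so a smaller pitch (higher
    precision) needs more sensors and a larger pitch needs fewer."""
    cols = math.ceil(width_mm / precision_mm)
    rows = math.ceil(height_mm / precision_mm)
    return rows, cols, rows * cols

# A 120 mm x 60 mm screen at 5 mm precision vs 10 mm precision:
print(sensor_grid(120, 60, 5))   # -> (12, 24, 288)
print(sensor_grid(120, 60, 10))  # -> (6, 12, 72)
```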
- the central processing unit 20 is respectively connected to the detecting component 10 and the touch sensor 60, and is configured to analyze the sound vibration signal transmitted by the detecting component 10, determine the body touch type, and generate a command signal including the body touch type and the touch point coordinate position;
- the control chip 30 is connected to the central processing unit 20 for receiving a command signal sent by the central processing unit 20, and generating image data and a driving signal according to the command signal;
- the image processor 40 is connected to the control chip 30 for receiving the image data and the driving signal sent by the control chip 30, thereby driving the display screen 50 to display the corresponding picture.
- a plurality of detecting components and touch sensors are disposed on the display screen, and the sound vibration signal and touch point coordinate position generated when different parts of the body contact the display screen can be acquired; the body touch type can then be determined according to the sound vibration signal;
- image data and a driving signal corresponding to the body touch type and the touch point coordinate position are generated, thereby driving the display screen to display the corresponding picture.
- the embodiment of the present disclosure can identify the body part touching the screen and the coordinate position of the touch point, so that different touch functions can be realized by using different body parts, thereby controlling the display screen to display the corresponding picture.
- using the forehead + touch point coordinate position can realize the determine function for the icon at the forehead touch point; the chin + touch point coordinate position can realize the cancel function for the icon at the chin touch point; the nose tip + touch point coordinate position can realize the double-click function for the icon at the nose tip touch point;
- the toe + touch point coordinate position can realize the zoom function for the screen at the toe touch point. The display screen is then controlled according to the touch operation to display the corresponding screen, which makes it convenient for the disabled to use the touch display device.
- the touch display device further includes: a sound feature signal database for storing a correspondence between the sound feature signal and the body touch type.
- the body touch type is divided into single touches and multiple touches, and a single touch is a single body part touch.
- a multiple touch may be a combination of touches of at least two body parts, arranged in combination in the above manner. Different parts of the body emit different sounds when they touch the display screen.
- Each body touch type has a unique sound feature signal.
- the sound feature signal A corresponds to the forehead touch
- the sound feature signal B corresponds to the nose tip touch.
- the sound characteristic signal C corresponds to the chin touch
- the sound feature signal D corresponds to the toe touch
- the sound feature signal E corresponds to the forehead + nose touch
- the sound feature signal F corresponds to the chin + nose touch
- the central processing unit 20 is configured to acquire a sound feature signal according to the sound vibration signal, and determine the body touch type corresponding to the sound feature signal according to the correspondence relationship. For example, if the central processor 20 obtains the sound feature signal A according to the sound vibration signal, it determines that the body touch type is the forehead touch; with the sound feature signal B, the nose touch; with the sound feature signal C, the chin touch; with the sound feature signal D, the toe touch; with the sound feature signal E, the forehead + nose touch; and with the sound feature signal F, the chin + nose touch.
- the central processing unit 20 is further configured to acquire a sound feature signal according to the received sound vibration signal, and receive the input body touch type corresponding to the sound vibration signal, and send the sound feature signal and the input body touch type to Sound signature signal database.
- the sound feature signal database is further configured to update a correspondence between the sound feature signal and the body touch type according to the received sound feature signal and the body touch type.
- the sound feature signal database can not only store the correspondence between sound feature signals and body touch types, but can also update those correspondences as needed and add new correspondences between sound feature signals and body touch types.
- the central processing unit 20 includes:
- the sampling module 21 is configured to receive the sound vibration signal sent by the detecting component 10.
- the detecting component distributed on the display screen generates a sound vibration signal
- the sampling module mainly collects the original sound vibration signals generated when different body parts contact the display screen.
- the body touch type cannot be judged directly from the original sound vibration signal.
- the signal processing module 22 is configured to process the sound vibration signal collected by the sampling module 21 to obtain the sound parameters of the sound vibration signal. Specifically, the signal processing module may perform processing such as a Fourier transform on the original sound vibration signal to acquire key parameters, such as the characteristic amplitude and characteristic frequency of the sound vibration signal.
- the feature extraction module 23 is configured to perform feature extraction on the sound vibration signal according to the sound parameter acquired by the signal processing module 22, and acquire the sound feature signal;
- the comparison module 24 is configured to compare the acquired sound feature signal with data in the sound feature signal database to determine a body touch type corresponding to the sound feature signal.
- the touch display device further includes: a trigger event database, configured to store a correspondence between the touch point coordinate position plus the body touch type and the trigger event, for example:
- the forehead touch + touch point position coordinate corresponds to realizing the determine function on the icon at the touch point
- the chin touch + touch point position coordinate corresponds to realizing the cancel function on the icon at the touch point
- the nose touch + touch point position coordinate corresponds to realizing the double-click function on the icon at the touch point
- the toe touch + touch point position coordinate corresponds to realizing the zoom function on the screen at the touch point, and so on.
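In this embodiment the trigger event is paired with the touch point coordinate position from the command signal, so the event can be applied to the icon or screen region at that point. The dispatch sketch below follows the text's examples; the event names and the dict-shaped result are illustrative assumptions.

```python
# Hypothetical trigger event database keyed by body touch type, following
# the examples in the text; the event is applied at the touch coordinates.
POSITIONAL_EVENTS = {
    "forehead": "determine_icon",
    "chin": "cancel_icon",
    "nose": "double_click_icon",
    "toe": "zoom_screen",
}

def dispatch(body_touch_type, x, y):
    """Control-chip dispatch: pair the trigger event for a body touch type
    with the touch point coordinate position from the command signal."""
    event = POSITIONAL_EVENTS.get(body_touch_type)
    if event is None:
        return None
    return {"event": event, "at": (x, y)}

print(dispatch("toe", 120, 80))  # -> {'event': 'zoom_screen', 'at': (120, 80)}
```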
- the control chip 30 is specifically configured to compare the body touch type and the touch point position coordinate included in the received command signal with the information in the trigger event database, and determine the image data and driving signal corresponding to the body touch type and the touch point position coordinate.
- for example, when the central processing unit 20 determines that the body touch type is the forehead touch, the control chip 30 sends the image data and the driving signal corresponding to the determine function for the icon at the touch point to the image processor, thereby driving the display screen to display the screen corresponding to the determine function; when the central processing unit 20 determines that the body touch type is the chin touch, the control chip 30 sends the image data and the driving signal corresponding to the cancel function for the icon at the touch point to the image processor,
- thereby driving the display screen to display the screen corresponding to the cancel function; when the central processing unit 20 determines that the body touch type is the nose touch, the control chip 30 sends the image data and the driving signal corresponding to the double-click function for the icon at the touch point to the image processor, thereby driving the display to display the screen corresponding to the double-click function; and when the central processor 20 determines that the body touch type is the toe touch, the control chip 30 sends the image data and the driving signal corresponding to the zoom function for the screen at the touch point to the image processor, thereby driving the display to display the screen corresponding to the zoom function.
- the body part touching the display screen and the position coordinates of the touch point can be identified, so that different touch operations can be performed with different body parts and the display screen can be controlled to show the corresponding screen. For example:
- forehead touch can trigger the confirm function;
- chin touch can trigger the cancel function;
- nose touch can trigger the double-click function;
- toe touch can trigger the zoom function; and so on.
- the display screen is thus controlled by the touch operation, which makes touch display devices easier for people with disabilities to use.
- the touch display device further includes: a data storage unit 70 connected to the control chip 30, used for storing the command signal of the current frame received by the control chip 30, so that the data storage unit 70 can record historical data.
- the touch display device further includes: a buffer unit 80 connected to the control chip 30, used to buffer the command signal of the next frame received by the control chip 30.
- the control chip 30 can then generate the corresponding image data and driving signal directly from the command signal buffered in the buffer unit 80, which speeds up body part recognition and touch triggering.
- the buffer unit 80, the data storage unit 70, and the image processor 40 can be connected to the control chip 30 via a bus.
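One way to picture the data storage unit 70 and buffer unit 80 working together around the control chip is the sketch below; the class shape, storage depth, and method names are assumptions made for illustration.

```python
from collections import deque

class CommandPipeline:
    """Sketch of command-signal flow: buffer next-frame commands,
    log consumed commands as history (the data storage unit's role)."""

    def __init__(self):
        self.history = []      # data storage unit: past command signals
        self.buffer = deque()  # buffer unit: queued next-frame commands

    def receive(self, command):
        """Buffer an incoming command signal for the next frame."""
        self.buffer.append(command)

    def next_frame(self):
        """Pop the buffered command for the next frame and log it."""
        if not self.buffer:
            return None
        command = self.buffer.popleft()
        self.history.append(command)
        return command

pipe = CommandPipeline()
pipe.receive({"touch_type": "forehead"})
current = pipe.next_frame()
```

Buffering ahead of the control chip is what lets the image data and driving signal be generated without waiting on upstream recognition.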
- the present disclosure further provides a touch display method, applied to the touch display device of some embodiments; as shown in FIG. 4, the method includes:
- Step 101 Acquire a sound vibration signal generated when different parts of the body touch the display screen
- the sound vibration sensor can sense the different sound vibration signals generated when different parts of the body, such as the forehead, chin, and nose, touch the display screen.
- Step 102: analyze the sound vibration signal, determine the body touch type, and generate a command signal containing the body touch type;
- the correspondence between the sound feature signal and the body touch type is stored in the sound feature signal database.
- body touch types are divided into single touches and multiple touches.
- a single touch is a touch by a single body part, such as a forehead touch, nose touch, chin touch, or toe touch.
- a multiple touch is a combination of touches by at least two body parts, arranged and combined from the single touches above. Different parts of the body emit different sounds when they touch the display, and each body touch type corresponds to a unique sound feature signal.
- the sound feature signal A corresponds to the forehead touch
- the sound feature signal B corresponds to the nose touch
- the sound feature signal C corresponds to the chin touch
- the sound feature signal D corresponds to the toe touch
- the sound feature signal E corresponds to the forehead + nose tip touch
- the sound feature signal F corresponds to the chin + nose tip touch and so on.
- the central processor collects the original sound vibration signal and performs Fourier transform processing on it to acquire key parameters such as the characteristic amplitude and characteristic frequency of the sound vibration signal, performs feature extraction to acquire the sound feature signal, and determines the body touch type corresponding to that sound feature signal according to the correspondences stored in the sound feature signal database. For example, if the central processing unit acquires sound feature signal A from the sound vibration signal, it can determine that the body touch type is forehead touch; if it acquires sound feature signal B, the body touch type is nose tip touch;
- if the central processor acquires sound feature signal C from the sound vibration signal, it can determine that the body touch type is chin touch; sound feature signal D means the body touch type is toe touch;
- sound feature signal E means the body touch type is forehead + nose touch; and sound feature signal F means the body touch type is chin + nose touch.
- the command signal including the body touch type is sent to the control chip.
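A minimal sketch of matching an extracted sound feature signal against the database might use nearest-neighbour comparison on (frequency, amplitude) vectors. The feature values and the distance scaling below are invented for illustration and are not from the patent.

```python
import math

# Hypothetical feature database: per touch type, a stored
# (characteristic_frequency_hz, characteristic_amplitude) vector.
FEATURE_DB = {
    "forehead": (120.0, 0.80),
    "nose":     (300.0, 0.35),
    "chin":     (180.0, 0.60),
    "toe":      (60.0,  1.10),
}

def classify_touch(freq, amp, db=FEATURE_DB):
    """Return the touch type whose stored feature vector is nearest."""
    def dist(ref):
        # Scale amplitude so both axes carry comparable weight (assumed).
        return math.hypot(freq - ref[0], (amp - ref[1]) * 100)
    return min(db, key=lambda touch_type: dist(db[touch_type]))
```

The recognized touch type would then be packed into the command signal sent to the control chip.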
- Step 103 Generate corresponding image data and a driving signal according to the command signal
- the trigger event database stores correspondences between body touch types and trigger events, such as forehead touch corresponding to the confirm function, chin touch to the cancel function, nose touch to the double-click function, toe touch to the drawing function, and so on.
- the control chip compares the body touch type included in the received command signal with the information in the trigger event database, and determines image data and a driving signal corresponding to the body touch type.
- when the central processing unit determines that the body touch type is forehead touch, the control chip sends the image data and driving signal corresponding to the confirm function to the image processor, thereby driving the display screen to show the screen corresponding to the confirm function;
- when the body touch type is chin touch, the control chip sends the image data and driving signal corresponding to the cancel function to the image processor, thereby driving the display screen to show the screen corresponding to the cancel function;
- when the body touch type is nose tip touch,
- the control chip sends the image data and driving signal corresponding to the double-click function to the image processor, thereby driving the display screen to show the screen corresponding to the double-click function;
- when the central processor determines that the body touch type is toe touch, the control chip sends the image data and driving signal corresponding to the drawing function to the image processor, thereby driving the display screen to show the screen corresponding to the drawing function.
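Step 103's lookup from body touch type to the corresponding screen can be sketched as a dispatch table; the action names below are placeholders, not the patent's identifiers.

```python
# Hypothetical dispatch table: body touch type -> action producing a screen.
ACTIONS = {
    "forehead": lambda: "confirm screen",
    "chin":     lambda: "cancel screen",
    "nose":     lambda: "double-click screen",
    "toe":      lambda: "drawing screen",
}

def handle_command(command, actions):
    """Look up the command's touch type and invoke the matching action."""
    action = actions.get(command["touch_type"])
    if action is None:
        return "ignored"  # no trigger event registered for this type
    return action()
```

In the device, the returned value would stand in for the image data and driving signal sent to the image processor.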
- Step 104 Use the image data and the driving signal to drive the display screen to display the corresponding screen.
- after receiving the image data and driving signal sent by the control chip, the image processor drives the display screen to show the corresponding screen using that image data and driving signal.
- the central processing unit may further acquire a sound feature signal from the received sound vibration signal, receive a user-input body touch type corresponding to that sound vibration signal, and send the sound feature signal and the input body touch type to the sound feature signal database; the sound feature signal database then updates the correspondence between sound feature signals and body touch types according to the received pair.
- the sound feature signal database can thus not only store correspondences between sound feature signals and body touch types, but also update existing correspondences and add new ones as needed.
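The store/update/add behaviour described above could look like this minimal sketch; the class and method names are hypothetical.

```python
class SoundFeatureDB:
    """Sketch of the sound feature signal database: correspondences from
    sound feature signal to body touch type can be stored, overwritten
    by user relabelling, and extended with new entries."""

    def __init__(self):
        self._map = {}

    def lookup(self, feature_signal):
        """Return the stored body touch type, or None if unknown."""
        return self._map.get(feature_signal)

    def update(self, feature_signal, touch_type):
        """Add a new correspondence or overwrite an existing one."""
        self._map[feature_signal] = touch_type

db = SoundFeatureDB()
db.update("signal_A", "forehead")  # new correspondence
db.update("signal_A", "chin")      # user relabels: overwrite
```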
- the body part contacting the display screen can be identified, so that different touch operations can be performed with different body parts, and the display screen can then be controlled to show the corresponding screen. For example:
- forehead touch can trigger the confirm function;
- chin touch can trigger the cancel function;
- nose touch can trigger the double-click function;
- toe touch can trigger the drawing function;
- the display screen is thus controlled by the touch operation, which makes touch display devices easier for people with disabilities to use.
- the present disclosure further provides a touch display method, applied to the touch display device of some embodiments; as shown in FIG. 5, the method includes:
- Step 201 Acquire a sound vibration signal and a touch point position coordinate generated when different parts of the body touch the display screen;
- the sound vibration sensor can sense the different sound vibration signals generated when different parts of the body, such as the forehead, chin, and nose, touch the display screen.
- touch sensors are distributed on the display screen.
- the touch sensors can detect changes of the touch position and sense the coordinate position of the touch point.
- the sensing methods used include, but are not limited to, capacitive, resistive, and optical sensing.
- Step 202 analyzing the sound vibration signal, determining a body touch type, and generating a command signal including a body touch type and a touch point coordinate position;
- the correspondence between the sound feature signal and the body touch type is stored in the sound feature signal database.
- body touch types are divided into single touches and multiple touches; a single touch is a touch by a single body part, such as a forehead touch, nose touch, chin touch, or toe touch.
- a multiple touch is a combination of touches by at least two body parts, arranged and combined from the single touches above. Different parts of the body emit different sounds when they touch the display screen, and each body touch type corresponds to a unique sound feature signal.
- the sound feature signal A corresponds to the forehead touch
- the sound feature signal B corresponds to the nose touch
- the sound feature signal C corresponds to the chin touch and the sound feature signal D corresponds to the toe touch
- the sound characteristic signal E corresponds to the forehead + nose tip touch
- the sound feature signal F corresponds to the chin + nose tip touch and the like.
- the central processor collects the original sound vibration signal and performs Fourier transform processing on it to acquire key parameters such as the characteristic amplitude and characteristic frequency of the sound vibration signal, performs feature extraction to acquire the sound feature signal, and determines the body touch type corresponding to that sound feature signal according to the correspondences stored in the sound feature signal database. For example, if the central processing unit acquires sound feature signal A from the sound vibration signal, it can determine that the body touch type is forehead touch; sound feature signal B means the body touch type is nose tip touch;
- sound feature signal C means the body touch type is chin touch; sound feature signal D means the body touch type is toe touch;
- sound feature signal E means the body touch type is forehead + nose tip touch; and sound feature signal F means the body touch type is chin + nose touch.
- after recognizing the body touch type, the central processor generates a command signal containing the body touch type and the touch point coordinate position, and sends the command signal to the control chip.
- Step 203 Generate corresponding image data and a driving signal according to the command signal
- the trigger event database stores the correspondence between the touch point coordinate position and the body touch type and the trigger event.
- forehead touch + touch point position coordinates correspond to a confirm function for the icon at the touch point;
- chin touch + touch point position coordinates correspond to a cancel function for the icon at the touch point;
- nose touch + touch point position coordinates correspond to a double-click function for the icon at the touch point;
- toe touch + touch point position coordinates correspond to a zoom function for the screen at the touch point; and so on.
- the control chip compares the body touch type and touch point position coordinates contained in the received command signal with the information in the trigger event database, and determines the image data and driving signal corresponding to that body touch type and touch point position. For example, when the central processing unit determines that the body touch type is forehead touch, the control chip sends the image data and driving signal for the confirm function of the icon at the touch point to the image processor, thereby driving the display to show the screen corresponding to the confirm function;
- when the central processor determines that the body touch type is chin touch, the control chip sends the image data and driving signal for the cancel function of the icon at the touch point to the image processor, thereby driving the display to show the screen corresponding to the cancel function; when the body touch type is nose touch, the control chip sends the image data and driving signal for the double-click function of the icon at the touch point to the image processor, thereby driving the display to show the screen corresponding to the double-click function; and when the body touch type is toe touch, the control chip sends the image data and driving signal for the zoom function of the screen at the touch point to the image processor, thereby driving the display to show the screen corresponding to the zoom function.
- Step 204 Use the image data and the driving signal to drive the display screen to display the corresponding screen.
- after receiving the image data and driving signal sent by the control chip, the image processor drives the display screen to show the corresponding screen using that image data and driving signal.
- the central processing unit may further acquire a sound feature signal from the received sound vibration signal, receive a user-input body touch type corresponding to that sound vibration signal, and send the sound feature signal and the input body touch type to the sound feature signal database; the sound feature signal database then updates the correspondence between sound feature signals and body touch types according to the received pair.
- the sound feature signal database can thus not only store correspondences between sound feature signals and body touch types, but also update existing correspondences and add new ones as needed.
- the body part touching the display screen and the position coordinates of the touch point can be identified, so that different touch operations can be performed with different body parts, and the display screen can be controlled to show the corresponding screen. For example:
- forehead touch can trigger the confirm function;
- chin touch can trigger the cancel function;
- nose touch can trigger the double-click function;
- toe touch can trigger the zoom function; and so on.
- the display screen is thus controlled by the touch operation to show the corresponding screen, making touch display devices easier for people with disabilities to use.
- modules and units may be implemented in software for execution by various types of processors.
- an identified module or unit of executable code can comprise one or more physical or logical blocks of computer instructions, which can be constructed, for example, as an object, procedure, or function.
- the executable code of an identified module or unit need not be physically located together, but may comprise different instructions stored in different physical locations which, when logically combined, constitute the module or unit and achieve its stated purpose.
- the executable code of a module or unit may be a single instruction or many instructions, and may even be distributed over several different code segments, among different programs, and across multiple memory devices.
- operational data may be identified within modules, units, and may be implemented in any suitable form and organized within any suitable type of data structure. The operational data may be collected as a single data set, or may be distributed at different locations (including on different storage devices), and may at least partially exist as an electronic signal on a system or network.
- where modules and units can be implemented in software, considering the level of existing hardware technology, a person skilled in the art can, cost aside, build corresponding hardware circuitry to achieve the same functions.
- such hardware circuitry includes conventional Very Large Scale Integration (VLSI) circuits or gate arrays and existing semiconductors such as logic chips and transistors, or other discrete components.
- Modules, units can also be implemented with programmable hardware devices, such as field programmable gate arrays, programmable array logic, programmable logic devices, and the like.
Claims (12)
- A touch display device, comprising: a display screen; a plurality of detection elements disposed on the display screen, configured to acquire sound vibration signals generated when different parts of the body touch the display screen; a central processor connected to the detection elements, configured to analyze the sound vibration signals transmitted by the detection elements, determine a body touch type, and generate a command signal containing the body touch type; a control chip connected to the central processor, configured to receive the command signal sent by the central processor and generate image data and a driving signal according to the command signal; and an image processor connected to the control chip, configured to receive the image data and driving signal sent by the control chip and drive the display screen to display the corresponding screen.
- The touch display device according to claim 1, further comprising: a sound feature signal database configured to store correspondences between sound feature signals and body touch types; wherein the central processor is specifically configured to acquire a sound feature signal from the sound vibration signal and determine the body touch type corresponding to the sound feature signal according to the correspondences.
- The touch display device according to claim 2, wherein the central processor is further configured to acquire a sound feature signal from the received sound vibration signal, receive an input body touch type corresponding to the sound vibration signal, and send the sound feature signal and the input body touch type to the sound feature signal database; and the sound feature signal database is further configured to update the correspondence between sound feature signals and body touch types according to the received sound feature signal and body touch type.
- The touch display device according to claim 1, wherein body touch types comprise single touches and multiple touches, a single touch being a touch by a single body part and a multiple touch being a combination of touches by at least two body parts.
- The touch display device according to claim 2, wherein the central processor comprises: a sampling module configured to receive the sound vibration signal sent by the detection elements; a signal processing module configured to process the sound vibration signal and acquire sound parameters of the sound vibration signal; a feature extraction module configured to perform feature extraction on the sound vibration signal according to the acquired sound parameters and acquire the sound feature signal; and a comparison module configured to compare the acquired sound feature signal with data in the sound feature signal database and determine the body touch type corresponding to the sound feature signal.
- The touch display device according to any one of claims 1-5, further comprising: a trigger event database configured to store correspondences between body touch types and trigger events; wherein the control chip is specifically configured to compare the body touch type contained in the received command signal with the information in the trigger event database and determine the image data and driving signal corresponding to the body touch type.
- The touch display device according to claim 6, further comprising: a plurality of touch sensors disposed on the display screen, configured to acquire touch point coordinate positions when different parts of the body touch the display screen; wherein the central processor is further connected to the touch sensors and configured to generate a command signal containing the body touch type and the touch point coordinate position.
- The touch display device according to claim 7, wherein the trigger event database further stores correspondences between touch point coordinate positions, body touch types, and trigger events; and the control chip is specifically configured to compare the body touch type and touch point position coordinates contained in the received command signal with the information in the trigger event database and determine the image data and driving signal corresponding to the body touch type and touch point position coordinates.
- The touch display device according to claim 1, further comprising: a data storage unit connected to the control chip, configured to store the command signal of the current frame received by the control chip.
- The touch display device according to claim 1, further comprising: a buffer unit connected to the control chip, configured to buffer the command signal of the next frame received by the control chip.
- A touch display method, applied to the touch display device according to any one of claims 1-10, the method comprising: acquiring sound vibration signals generated when different parts of the body touch the display screen; analyzing the sound vibration signals, determining a body touch type, and generating a command signal containing the body touch type; generating corresponding image data and a driving signal according to the command signal; and driving the display screen to display the corresponding screen using the image data and driving signal.
- The touch display method according to claim 11, further comprising: acquiring touch point coordinate positions when different parts of the body touch the display screen; wherein the step of generating a command signal containing the body touch type comprises: generating a command signal containing the body touch type and the touch point coordinate position.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/301,719 US20170177144A1 (en) | 2015-08-20 | 2016-01-18 | Touch display device and touch display method |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201510515461.7A CN105183217B (zh) | 2015-08-20 | 2015-08-20 | 触控显示设备及触控显示方法 |
CN201510515461.7 | 2015-08-20 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2017028491A1 true WO2017028491A1 (zh) | 2017-02-23 |
Family
ID=54905335
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2016/071171 WO2017028491A1 (zh) | 2015-08-20 | 2016-01-18 | 触控显示设备及触控显示方法 |
Country Status (3)
Country | Link |
---|---|
US (1) | US20170177144A1 (zh) |
CN (1) | CN105183217B (zh) |
WO (1) | WO2017028491A1 (zh) |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105183217B (zh) * | 2015-08-20 | 2016-11-09 | 京东方科技集团股份有限公司 | 触控显示设备及触控显示方法 |
CN107704108A (zh) * | 2017-07-06 | 2018-02-16 | 中国科学院合肥物质科学研究院 | 一种基于柔性压力传感器的足部书写板 |
CN111596798A (zh) * | 2020-05-18 | 2020-08-28 | 京东方科技集团股份有限公司 | 显示装置及其发声方法 |
CN113778249B (zh) * | 2020-06-09 | 2024-01-23 | 京东方科技集团股份有限公司 | 触控显示驱动模组、方法和显示装置 |
CN113867158B (zh) * | 2020-06-12 | 2023-10-24 | 青岛海尔电冰箱有限公司 | 厨房电器运行状态判断方法、检测装置、冰箱及可读存储介质 |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2010152693A (ja) * | 2008-12-25 | 2010-07-08 | Nec Corp | 携帯情報端末の報知方法、携帯情報端末およびプログラム |
CN101794173A (zh) * | 2010-03-23 | 2010-08-04 | 浙江大学 | 无手残疾人专用电脑输入装置及其方法 |
CN202334482U (zh) * | 2011-11-25 | 2012-07-11 | 上海天逸电器有限公司 | 声光振一体式触摸开关 |
US20140240262A1 (en) * | 2013-02-27 | 2014-08-28 | Samsung Electronics Co., Ltd. | Apparatus and method for supporting voice service in a portable terminal for visually disabled people |
CN104298397A (zh) * | 2014-09-24 | 2015-01-21 | 合肥鑫晟光电科技有限公司 | 触摸屏及其定位方法 |
CN105183217A (zh) * | 2015-08-20 | 2015-12-23 | 京东方科技集团股份有限公司 | 触控显示设备及触控显示方法 |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9436282B2 (en) * | 2013-03-14 | 2016-09-06 | Immersion Corporation | Contactor-based haptic feedback generation |
US20150035759A1 (en) * | 2013-08-02 | 2015-02-05 | Qeexo, Co. | Capture of Vibro-Acoustic Data Used to Determine Touch Types |
US9740046B2 (en) * | 2013-11-12 | 2017-08-22 | Nvidia Corporation | Method and apparatus to provide a lower power user interface on an LCD panel through localized backlight control |
CN103995587B (zh) * | 2014-05-13 | 2017-09-29 | 联想(北京)有限公司 | 一种信息控制方法及电子设备 |
-
2015
- 2015-08-20 CN CN201510515461.7A patent/CN105183217B/zh active Active
-
2016
- 2016-01-18 US US15/301,719 patent/US20170177144A1/en not_active Abandoned
- 2016-01-18 WO PCT/CN2016/071171 patent/WO2017028491A1/zh active Application Filing
Also Published As
Publication number | Publication date |
---|---|
US20170177144A1 (en) | 2017-06-22 |
CN105183217B (zh) | 2016-11-09 |
CN105183217A (zh) | 2015-12-23 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 15301719 Country of ref document: US |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 16836358 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 16836358 Country of ref document: EP Kind code of ref document: A1 |
|
32PN | Ep: public notification in the ep bulletin as address of the adressee cannot be established |
Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 200918) |
|
Ref document number: 16836358 Country of ref document: EP Kind code of ref document: A1 |