WO2017028491A1 - Touch display device and touch display method - Google Patents

Touch display device and touch display method

Info

Publication number
WO2017028491A1
Authority
WO
WIPO (PCT)
Prior art keywords
touch
signal
sound
display screen
type
Application number
PCT/CN2016/071171
Other languages
English (en)
French (fr)
Inventor
Li Wenbo (李文波)
Original Assignee
BOE Technology Group Co., Ltd. (京东方科技集团股份有限公司)
Application filed by BOE Technology Group Co., Ltd. (京东方科技集团股份有限公司)
Priority to US15/301,719 (published as US20170177144A1)
Publication of WO2017028491A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0412 Digitisers structurally integrated in a display
    • G06F3/0416 Control or interface arrangements specially adapted for digitisers
    • G06F3/0418 Control or interface arrangements specially adapted for digitisers for error correction or compensation, e.g. based on parallax, calibration or alignment
    • G06F3/04186 Touch location disambiguation
    • G06F3/043 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means using propagating acoustic waves
    • G06F3/0433 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means using propagating acoustic waves in which the acoustic waves are either generated by a movable member and propagated within a surface layer or propagated within a surface layer and captured by a movable member

Definitions

  • the present disclosure relates to the field of touch technologies, and in particular to a touch display device and a touch display method.
  • With the development of touch technology, touch devices based on touch operations have gradually become standard equipment on portable mobile terminals.
  • Through the touch screen, the user can input coordinate information to the mobile terminal directly by hand; like the mouse and the keyboard, it is an input device.
  • The touch screen has many advantages, such as ruggedness, fast response, space saving, and ease of interaction.
  • For example, smartphones with touch screens are becoming more powerful and their screens larger.
  • A large screen makes it easy for users to browse full webpages and watch videos, giving a good display effect and viewing comfort.
  • There are many categories of touch technology, commonly resistive, capacitive, infrared, and electromagnetic-pen types.
  • The technology has progressed from single-touch to multi-touch, with markedly improved sensitivity; touch technology has gradually become practical and mature.
  • However, existing touch technologies generally require the user to perform touch operations by hand, for example with a finger or a stylus.
  • For people with limb disabilities, existing touch technologies often fail to meet their control needs.
  • The technical problem to be solved by the present disclosure is to provide a touch display device and a touch display method that enable a user to perform touch operations with different parts of the body and control the display screen to display the corresponding image according to those operations, thereby making touch display devices convenient for people with disabilities.
  • In one aspect, a touch display device is provided, including:
  • a display screen;
  • a plurality of detecting elements disposed on the display screen for acquiring sound vibration signals generated when different parts of the body contact the display screen;
  • a central processing unit coupled to the detecting component, configured to analyze a sound vibration signal transmitted by the detecting component, determine a body touch type, and generate a command signal including the body touch type;
  • a control chip connected to the central processor, configured to receive the command signal sent by the central processor and to generate image data and a driving signal according to the command signal;
  • an image processor connected to the control chip, configured to receive the image data and the driving signal sent by the control chip and thereby drive the display screen to display the corresponding image.
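To make the division of labour concrete, the sketch below wires the four components together in Python. It is only an illustration under assumed names (`Command`, `Frame`, `run_touch_cycle`, and the component objects are all hypothetical); the disclosure describes hardware blocks, not an API.

```python
from dataclasses import dataclass

@dataclass
class Command:
    # The command signal the central processor sends to the control chip.
    touch_type: str  # e.g. "forehead" or "chin + nose tip"

@dataclass
class Frame:
    # Image data plus driving signal produced by the control chip.
    image_data: bytes
    driving_signal: int

def run_touch_cycle(detectors, cpu, control_chip, image_processor, display):
    """One touch cycle: detect -> classify -> render -> display."""
    vibrations = [d.read() for d in detectors]  # sound vibration signals
    command = cpu.analyze(vibrations)           # Command(touch_type=...)
    frame = control_chip.render(command)        # Frame(image_data, driving_signal)
    image_processor.drive(display, frame)       # display shows the image
```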
  • the touch display device further includes:
  • a sound feature signal database for storing the correspondences between sound feature signals and body touch types;
  • the central processor is specifically configured to acquire a sound feature signal according to the sound vibration signal, and determine a body touch type corresponding to the sound feature signal according to the correspondence relationship.
  • The central processor is further configured to acquire a sound feature signal from the received sound vibration signal, receive an input body touch type corresponding to that sound vibration signal, and send the sound feature signal and the input body touch type to the sound feature signal database;
  • the sound feature signal database is further configured to update a correspondence between the sound feature signal and the body touch type according to the received sound feature signal and the body touch type.
  • Body touch types include single touches and multiple touches: a single touch is a touch by a single body part, and a multiple touch is a combination of touches by at least two body parts.
  • the central processing unit includes:
  • a sampling module configured to receive a sound vibration signal sent by the detecting component
  • a signal processing module configured to process the sound vibration signal to obtain a sound parameter of the sound vibration signal
  • a feature extraction module configured to perform feature extraction on the sound vibration signal according to the acquired sound parameter, and acquire a sound feature signal
  • a comparison module configured to compare the acquired sound feature signal with data in the sound feature signal database to determine a body touch type corresponding to the sound feature signal.
  • the touch display device further includes:
  • a trigger event database for storing a correspondence between a body touch type and a trigger event
  • the control chip is specifically configured to compare the body touch type included in the received command signal with the information in the trigger event database, and determine image data and a driving signal corresponding to the body touch type.
  • the touch display device further includes:
  • a plurality of touch sensors disposed on the display screen for acquiring a coordinate position of the touch point when the different parts of the body contact the display screen;
  • the central processor is further connected to the touch sensor for generating a command signal including the body touch type and the touch point coordinate position.
  • The trigger event database further stores correspondences between (touch point coordinate position, body touch type) pairs and trigger events;
  • the control chip is specifically configured to compare the body touch type and touch point position coordinates included in the received command signal with the information in the trigger event database, and to determine the image data and driving signal corresponding to that body touch type and those touch point position coordinates.
  • the touch display device further includes:
  • a data storage unit connected to the control chip, configured to store a command signal of a current frame received by the control chip.
  • the touch display device further includes:
  • a buffer unit connected to the control chip for buffering an instruction signal of a next frame received by the control chip.
  • The embodiment of the present disclosure further provides a touch display method, applied to the touch display device described above. The method includes: acquiring the sound vibration signals generated when different parts of the body contact the display screen; analyzing the sound vibration signals, determining a body touch type, and generating a command signal including the body touch type; generating corresponding image data and a driving signal according to the command signal; and
  • driving the display screen with the image data and the driving signal to display the corresponding image.
  • The method further includes: acquiring the coordinate position of the touch point when different parts of the body contact the display screen;
  • the step of generating the command signal including the body touch type is then specifically:
  • generating a command signal including the body touch type and the touch point coordinate position.
  • In the above solution, a plurality of detecting elements are disposed on the display screen and can acquire the sound vibration signals generated when different parts of the body contact the display screen; the central processor then determines the body touch type from the sound vibration signal and generates a corresponding command signal;
  • the control chip receives the command signal and generates corresponding image data and a driving signal; the image processor receives the image data and the driving signal and drives the display screen to display the corresponding image.
  • The embodiment of the present disclosure can identify which body part contacts the display screen, so different touch operations can be performed with different body parts, which in turn control the display screen to show the corresponding image.
  • For example, a forehead touch can trigger the confirm function,
  • a chin touch the cancel function,
  • a nose-tip touch the double-click function,
  • and a toe touch the drawing function;
  • the display screen is then controlled according to these touch operations, making touch display devices convenient for people with disabilities.
  • FIG. 1 is a schematic structural diagram of a touch display device according to some embodiments of the present disclosure
  • FIG. 2 is a schematic structural diagram of a central processing unit according to some embodiments of the present disclosure.
  • FIG. 3 is a schematic structural diagram of a touch display device according to some embodiments of the present disclosure.
  • FIG. 4 is a schematic flow chart of a touch display method according to some embodiments of the present disclosure.
  • FIG. 5 is a schematic flowchart of a touch display method according to some embodiments of the present disclosure.
  • FIG. 6 is a schematic diagram of body touch types, including single touches and multiple touches.
  • Aimed at the problem in the prior art that touch display devices are inconvenient for people with disabilities, embodiments of the present disclosure provide a touch display device and a touch display method that enable a user to perform touch operations with different parts of the body and control the display screen to display the corresponding image according to those operations, thereby making touch display devices convenient for people with disabilities.
  • In some embodiments, the present disclosure provides a touch display device, as shown in FIG. 1, comprising:
  • a display screen 50;
  • a plurality of detecting elements 10 disposed on the display screen 50 for acquiring the sound vibration signals generated when different parts of the body contact the display screen 50; the detecting elements 10 may be arranged in rows and columns on the display screen 50, their number may be chosen according to the required sensing precision (more elements for higher precision, fewer for lower), and each detecting element may be a sound sensor;
  • the central processing unit 20 is connected to the detecting component 10 for analyzing the sound vibration signal transmitted by the detecting component 10, determining the body touch type, and generating a command signal including the body touch type;
  • the control chip 30 is connected to the central processing unit 20 for receiving a command signal sent by the central processing unit 20, and generating image data and a driving signal according to the command signal;
  • the image processor 40 is connected to the control chip 30 for receiving the image data and the driving signal sent by the control chip 30, thereby driving the display screen 50 to display the corresponding image.
  • In this embodiment, a plurality of detecting elements are disposed on the display screen and can acquire the sound vibration signals generated when different parts of the body touch the display screen; the central processing unit then determines the body touch type from the sound vibration signal and generates a corresponding command signal; the control chip receives the command signal and generates corresponding image data and a driving signal; and the image processor receives the image data and the driving signal and drives the display screen to display the corresponding image.
  • The embodiment of the present disclosure can identify which body part contacts the display screen, so different touch operations can be performed with different body parts, which in turn control the display screen to show the corresponding image.
  • For example, a forehead touch can trigger the confirm function, a chin touch the cancel function, a nose-tip touch the double-click function, and a toe touch the drawing function; the display screen is then controlled according to these touch operations, making touch display devices convenient for people with disabilities.
  • the touch display device further includes: a sound feature signal database for storing a correspondence between the sound feature signal and the body touch type.
  • Body touch types are divided into single touches and multiple touches.
  • As shown in FIG. 6, a single touch is a touch by a single body part, for example a forehead touch, nose-tip touch, chin touch, or toe touch.
  • A multiple touch is a combination of touches by at least two body parts, in any combination of the above. Different parts of the body emit different sounds when they contact the display screen, and each body touch type corresponds to a unique sound feature signal.
  • For example, sound feature signal A corresponds to a forehead touch,
  • sound feature signal B to a nose-tip touch,
  • sound feature signal C to a chin touch,
  • sound feature signal D to a toe touch,
  • sound feature signal E to a forehead + nose tip touch,
  • and sound feature signal F to a chin + nose tip touch, and so on.
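The A-F correspondence above is essentially a lookup table. A minimal sketch in Python, using the patent's example labels (the table and function names are ours):

```python
# Illustrative lookup table for the example correspondences A-F above.
SIGNATURE_TO_TOUCH = {
    "A": "forehead",
    "B": "nose tip",
    "C": "chin",
    "D": "toe",
    "E": "forehead + nose tip",
    "F": "chin + nose tip",
}

def touch_type_for(signature: str) -> str:
    # Fall back to "unknown" when the signature is not yet registered.
    return SIGNATURE_TO_TOUCH.get(signature, "unknown")
```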
  • The central processing unit 20 is specifically configured to acquire a sound feature signal from the sound vibration signal and to determine, according to the stored correspondences, the body touch type matching the sound feature signal. For example, if the central processing unit 20 extracts sound feature signal A from the sound vibration signal, the body touch type is determined to be a forehead touch; signal B, a nose-tip touch; signal C, a chin touch; signal D, a toe touch;
  • signal E, a forehead + nose tip touch;
  • and signal F, a chin + nose tip touch.
  • The central processing unit 20 is further configured to acquire a sound feature signal from the received sound vibration signal, receive an input body touch type corresponding to that sound vibration signal, and send the sound feature signal and the input body touch type to the sound feature signal database;
  • the sound feature signal database is further configured to update the correspondence between sound feature signals and body touch types according to the received sound feature signal and body touch type. In this way the database not only stores the existing correspondences, but can also update them as needed and add new correspondences between sound feature signals and body touch types.
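Updating the database then amounts to writing a new (sound feature signal, body touch type) pair into that table. A hedged sketch of such a training step, reusing the hypothetical SIGNATURE_TO_TOUCH table above:

```python
def register_touch(signature_db: dict, signature: str, touch_type: str) -> None:
    """Store or overwrite the correspondence for one sound signature.

    Called after the user performs a touch and then inputs which body
    part produced it, so the database can learn new correspondences.
    """
    signature_db[signature] = touch_type

# Example: teach the device a new signature "G" for an elbow touch
# ("elbow" is our example, not one given in the patent).
register_touch(SIGNATURE_TO_TOUCH, "G", "elbow")
```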
  • As shown in FIG. 2, the central processing unit 20 includes:
  • the sampling module 21, configured to receive the sound vibration signals sent by the detecting elements 10. When a body part such as the forehead, chin, or nose tip contacts the display screen, the detecting elements distributed on the display screen generate sound vibration signals; the sampling module mainly collects these raw sound vibration signals, from which the body touch type cannot yet be judged directly;
  • the signal processing module 22, configured to process the sound vibration signal collected by the sampling module 21 to obtain its sound parameters. Specifically, the signal processing module 22 may apply a Fourier transform or similar processing to the raw sound vibration signal to extract key parameters such as the characteristic amplitude and characteristic frequency of the signal (see the sketch after this list);
  • the feature extraction module 23 is configured to perform feature extraction on the sound vibration signal according to the sound parameter acquired by the signal processing module 22, and acquire the sound feature signal;
  • the comparison module 24 is configured to compare the acquired sound feature signal with data in the sound feature signal database to determine a body touch type corresponding to the sound feature signal.
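A minimal NumPy sketch of this four-module pipeline, assuming a single raw waveform and the SIGNATURE_TO_TOUCH table from the earlier sketch; the patent names the Fourier transform and the characteristic amplitude/frequency but leaves the exact features open, so the binning below is purely illustrative:

```python
import numpy as np

def sound_parameters(raw: np.ndarray, sample_rate: float) -> dict:
    # Signal processing module: Fourier transform, then read off the
    # characteristic amplitude and characteristic frequency.
    spectrum = np.abs(np.fft.rfft(raw))
    freqs = np.fft.rfftfreq(raw.size, d=1.0 / sample_rate)
    peak = int(np.argmax(spectrum))
    return {"char_amplitude": float(spectrum[peak]),
            "char_frequency": float(freqs[peak])}

def feature_signal(params: dict) -> str:
    # Feature extraction module: quantize the parameters into one of
    # the signature labels A-F (illustrative frequency bands in Hz).
    bands = [(500.0, "A"), (1000.0, "B"), (1500.0, "C"),
             (2000.0, "D"), (2500.0, "E")]
    for upper, label in bands:
        if params["char_frequency"] < upper:
            return label
    return "F"

def classify(raw: np.ndarray, sample_rate: float, signature_db: dict) -> str:
    # Comparison module: map the extracted signature to a body touch type.
    signature = feature_signal(sound_parameters(raw, sample_rate))
    return signature_db.get(signature, "unknown")
```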
  • The touch display device further includes a trigger event database, configured to store the correspondences between body touch types and trigger events, for example a forehead touch corresponding to the confirm function, a chin touch to the cancel function, a nose-tip touch to the double-click function, a toe touch to the drawing function, and so on.
  • The control chip 30 is specifically configured to compare the body touch type contained in the received command signal with the information in the trigger event database and to determine the image data and driving signal corresponding to that body touch type. For example, when the central processing unit 20 determines that the body touch type is a forehead touch, the control chip 30 sends the image data and driving signal for the confirm function to the image processor, which drives the display screen to show the screen corresponding to the confirm function. When the body touch type is a chin touch, the control chip 30 sends the image data and driving signal for the cancel function, and the display shows the cancel screen.
  • When the body touch type is a nose-tip touch, the control chip 30 sends the image data and driving signal for the double-click function, and the display shows the double-click screen.
  • When the body touch type is a toe touch, the control chip 30 sends the image data and driving signal for the drawing function, and the display shows the screen corresponding to the drawing function.
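The trigger event database is thus a second lookup: body touch type to UI action. A sketch with invented handler names (the patent only names the functions, not an interface):

```python
# Illustrative trigger-event table: body touch type -> UI action.
TRIGGER_EVENTS = {
    "forehead": "confirm",
    "chin": "cancel",
    "nose tip": "double_click",
    "toe": "draw",
}

def handle_touch(touch_type: str, ui) -> None:
    # Control chip behaviour: compare the command signal against the
    # trigger event database and fire the matching action.
    action = TRIGGER_EVENTS.get(touch_type)
    if action is not None:
        getattr(ui, action)()  # e.g. ui.confirm(), ui.cancel(), ...
```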
  • As shown in FIG. 3, the touch display device further includes a data storage unit 70 connected to the control chip 30 for storing the command signal of the current frame received by the control chip 30; the data storage unit 70 can thus record historical data.
  • The touch display device further includes a buffer unit 80 connected to the control chip 30 for buffering the command signal of the next frame received by the control chip 30.
  • In this way, at the next frame the control chip 30 can directly use the command signal buffered in the buffer unit 80 to generate the corresponding image data and driving signal, which speeds up body part recognition and touch triggering.
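In effect this is a double-buffering arrangement: the data storage unit archives the current frame's command signal while the buffer unit pre-stages the next frame's. A schematic sketch, with all names hypothetical:

```python
from collections import deque

class CommandPipeline:
    """Schematic double buffering of command signals."""

    def __init__(self):
        self.history = []       # data storage unit: past command signals
        self.pending = deque()  # buffer unit: upcoming command signals

    def receive(self, command) -> None:
        # Incoming command signals are buffered for the next frame.
        self.pending.append(command)

    def next_frame(self):
        # At the frame boundary the control chip consumes the buffered
        # command directly (speeding up recognition and triggering)
        # and archives it as the current frame's history.
        if not self.pending:
            return None
        command = self.pending.popleft()
        self.history.append(command)
        return command
```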
  • the buffer unit 80, the data storage unit 70, and the image processor 40 can be connected to the control chip 30 via a bus.
  • This embodiment can identify which body part contacts the display screen, so different touch operations can be performed with different body parts, and the display screen can be controlled to show the corresponding image. For example, a forehead touch can trigger the confirm function, a chin touch the cancel function, a nose-tip touch the double-click function, and a toe touch the drawing function; the display screen is then controlled according to these touch operations, making touch display devices convenient for people with disabilities.
  • In some embodiments, the present disclosure further provides a touch display device, as shown in FIG. 3, including:
  • a display screen 50;
  • a plurality of detecting elements 10 disposed on the display screen 50 for acquiring the sound vibration signals generated when different parts of the body contact the display screen 50, arranged as in the previous embodiment;
  • a plurality of touch sensors 60 disposed on the display screen 50 for acquiring the coordinate position of the touch point when different parts of the body contact the display screen 50;
  • the touch sensors 60 may be arranged in rows and columns on the display screen 50. Specifically, the number of touch sensors can be chosen according to the required sensing precision: the number is increased when higher precision is required and reduced when lower precision suffices;
  • the central processing unit 20, connected to the detecting elements 10 and the touch sensors 60 respectively, for analyzing the sound vibration signals transmitted by the detecting elements 10, determining the body touch type, and generating a command signal including the body touch type and the touch point coordinate position;
  • the control chip 30 is connected to the central processing unit 20 for receiving a command signal sent by the central processing unit 20, and generating image data and a driving signal according to the command signal;
  • the image processor 40 is connected to the control chip 30 for receiving the image data and the driving signal sent by the control chip 30, thereby driving the display screen 50 to display the corresponding image.
  • In this embodiment, a plurality of detecting elements and touch sensors are disposed on the display screen and can acquire the sound vibration signal and the touch point coordinate position produced when different parts of the body contact the display screen; the body touch type can then be determined from the sound vibration signal,
  • and image data and a driving signal corresponding to the body touch type and the touch point coordinate position are generated, driving the display screen to display the corresponding image.
  • The embodiment of the present disclosure can identify both the body part touching the screen and the touch point coordinates, so different touch operations can be performed with different body parts, which in turn control the display screen to show the corresponding image.
  • For example, forehead + touch point coordinates can confirm the icon at the forehead's touch point; chin + touch point coordinates can cancel the icon at the chin's touch point; nose tip + touch point coordinates
  • can double-click the icon at the nose tip's touch point; and toe + touch point coordinates can zoom the screen at the toe's touch point. The display screen is then controlled according to these touch operations, making touch display devices convenient for people with disabilities.
  • the touch display device further includes: a sound feature signal database for storing a correspondence between the sound feature signal and the body touch type.
  • Body touch types are divided into single touches and multiple touches:
  • a single touch is a touch by a single body part,
  • while a multiple touch is a combination of touches by at least two body parts, in any combination of the above. Different parts of the body emit different sounds when they contact the display screen.
  • Each body touch type corresponds to a unique sound feature signal.
  • For example, sound feature signal A corresponds to a forehead touch,
  • sound feature signal B to a nose-tip touch,
  • sound feature signal C to a chin touch,
  • sound feature signal D to a toe touch,
  • sound feature signal E to a forehead + nose tip touch,
  • and sound feature signal F to a chin + nose tip touch, and so on.
  • The central processing unit 20 is configured to acquire a sound feature signal from the sound vibration signal and to determine, according to the stored correspondences, the matching body touch type. For example, if the central processing unit 20 extracts sound feature signal A, the body touch type is determined to be a forehead touch; signal B, a nose-tip touch; signal C, a chin touch; signal D, a toe touch;
  • signal E, a forehead + nose tip touch; and signal F, a chin + nose tip touch.
  • The central processing unit 20 is further configured to acquire a sound feature signal from the received sound vibration signal, receive an input body touch type corresponding to that sound vibration signal, and send the sound feature signal and the input body touch type to the sound feature signal database.
  • the sound feature signal database is further configured to update a correspondence between the sound feature signal and the body touch type according to the received sound feature signal and the body touch type.
  • the sound feature signal database can not only store the correspondence between the sound feature signal and the body touch type, but also update the correspondence between the sound feature signal and the body touch type as needed, and add a new sound feature signal as needed. Correspondence with body touch types.
  • As shown in FIG. 2, the central processing unit 20 includes:
  • the sampling module 21, configured to receive the sound vibration signals sent by the detecting elements 10. When a body part contacts the display screen,
  • the detecting elements distributed on the display screen generate sound vibration signals;
  • the sampling module mainly collects these raw sound vibration signals,
  • from which the body touch type cannot yet be judged directly;
  • the signal processing module 22, configured to process the sound vibration signal collected by the sampling module 21 to obtain its sound parameters. Specifically, the signal processing module may apply a Fourier transform or similar processing to the raw sound vibration signal to extract key parameters such as the characteristic amplitude and characteristic frequency;
  • the feature extraction module 23 is configured to perform feature extraction on the sound vibration signal according to the sound parameter acquired by the signal processing module 22, and acquire the sound feature signal;
  • the comparison module 24 is configured to compare the acquired sound feature signal with data in the sound feature signal database to determine a body touch type corresponding to the sound feature signal.
  • The touch display device further includes a trigger event database, configured to store the correspondences between (touch point coordinate position, body touch type) pairs and trigger events, for example:
  • forehead touch + touch point position coordinates confirms the icon at the touch point;
  • chin touch + touch point position coordinates cancels the icon at the touch point;
  • nose-tip touch + touch point position coordinates double-clicks the icon at the touch point;
  • toe touch + touch point position coordinates zooms the screen at the touch point; and so on.
  • The control chip 30 is specifically configured to compare the body touch type and touch point position coordinates contained in the received command signal with the information in the trigger event database and to determine the corresponding image data and driving signal.
  • For example, when the central processing unit 20 determines that the body touch type is a forehead touch, the control chip 30 sends the image data and driving signal that confirm the icon at the touch point to the image processor, which drives the display screen to show the confirm screen; when the body touch type is a chin touch, the control chip 30 sends the image data and driving signal that cancel the icon at the touch point,
  • and the display shows the cancel screen; when the body touch type is a nose-tip touch, the control chip 30 sends the image data and driving signal that double-click the icon at the touch point, and the display shows the double-click screen; and when the body touch type is a toe touch, the control chip 30 sends the image data and driving signal that zoom the screen at the touch point, and the display shows the zoomed screen.
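With touch sensors present, the lookup key becomes the pair of body touch type and touch point coordinates. A sketch extending the earlier TRIGGER_EVENTS idea; `widget_at`, `zoom_at`, and the handler methods are invented for illustration:

```python
def handle_positional_touch(touch_type: str, x: int, y: int, ui) -> None:
    # Control chip behaviour when the command signal also carries the
    # touch point coordinates: act on whatever sits at (x, y).
    target = ui.widget_at(x, y)  # hypothetical hit-test helper
    if touch_type == "forehead":
        target.confirm()         # confirm the icon at the touch point
    elif touch_type == "chin":
        target.cancel()          # cancel the icon at the touch point
    elif touch_type == "nose tip":
        target.double_click()    # double-click the icon at the touch point
    elif touch_type == "toe":
        ui.zoom_at(x, y)         # zoom the screen at the touch point
```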
  • This embodiment can identify both the body part touching the screen and the touch point position coordinates, so different touch operations can be performed with different body parts, and the display screen can be controlled to show the corresponding image. For example, a forehead touch can trigger the confirm function, a chin touch the cancel function, a nose-tip touch the double-click function, and a toe touch the zoom function; the display screen is then controlled according to these touch operations, making touch display devices convenient for people with disabilities.
  • The touch display device further includes a data storage unit 70 connected to the control chip 30 for storing the command signal of the current frame received by the control chip 30; the data storage unit 70 can thus record historical data.
  • The touch display device further includes a buffer unit 80 connected to the control chip 30 for buffering the command signal of the next frame received by the control chip 30.
  • In this way, at the next frame the control chip 30 can directly use the command signal buffered in the buffer unit 80 to generate the corresponding image data and driving signal, which speeds up body part recognition and touch triggering.
  • the buffer unit 80, the data storage unit 70, and the image processor 40 can be connected to the control chip 30 via a bus.
  • In some embodiments, the present disclosure further provides a touch display method, applied to the touch display device of the embodiments above. As shown in FIG. 4, the method includes:
  • Step 101: acquire the sound vibration signals generated when different parts of the body touch the display screen.
  • Specifically, sound vibration sensors distributed on the display screen can sense the different sound vibration signals generated when different parts of the body, such as the forehead, chin, or nose tip, touch the display screen.
  • Step 102: analyze the sound vibration signal, determine the body touch type, and generate a command signal including the body touch type;
  • The correspondences between sound feature signals and body touch types are stored in the sound feature signal database.
  • Body touch types are divided into single touches and multiple touches.
  • A single touch is a touch by a single body part, such as a forehead, nose-tip, chin, or toe touch.
  • A multiple touch is a combination of touches by at least two body parts, in any combination of the above. Different parts of the body emit different sounds when they contact the display screen, and each body touch type corresponds to a unique sound feature signal.
  • For example, sound feature signal A corresponds to a forehead touch,
  • sound feature signal B to a nose-tip touch,
  • sound feature signal C to a chin touch,
  • sound feature signal D to a toe touch,
  • sound feature signal E to a forehead + nose tip touch,
  • and sound feature signal F to a chin + nose tip touch, and so on.
  • The central processor collects the raw sound vibration signal, applies a Fourier transform or similar processing to obtain key parameters such as the characteristic amplitude and characteristic frequency, performs feature extraction to obtain the sound feature signal, and determines the corresponding body touch type according to the correspondences stored in the sound feature signal database. For example, if the central processor extracts sound feature signal A, the body touch type is determined to be a forehead touch; signal B, a nose-tip touch;
  • signal C, a chin touch; signal D, a toe touch;
  • signal E, a forehead + nose tip touch; and signal F, a chin + nose tip touch.
  • After recognizing the body touch type, the central processor sends the command signal including the body touch type to the control chip.
  • Step 103: generate corresponding image data and a driving signal according to the command signal.
  • The trigger event database stores the correspondences between body touch types and trigger events, such as the forehead touch mapping to the confirm function, the chin touch to the cancel function, the nose-tip touch to the double-click function, and the toe touch to the drawing function.
  • The control chip compares the body touch type contained in the received command signal with the information in the trigger event database and determines the image data and driving signal corresponding to that body touch type.
  • For example, when the central processor determines that the body touch type is a forehead touch, the control chip sends the image data and driving signal for the confirm function to the image processor, which drives the display screen to show the confirm screen;
  • when the body touch type is a chin touch, the control chip sends the image data and driving signal for the cancel function, and the display shows the cancel screen;
  • when the body touch type is a nose-tip touch,
  • the control chip sends the image data and driving signal for the double-click function, and the display shows the double-click screen;
  • and when the body touch type is a toe touch, the control chip sends the image data and driving signal for the drawing function, and the display shows the screen corresponding to the drawing function.
  • Step 104: drive the display screen with the image data and the driving signal to display the corresponding image.
  • After receiving the image data and the driving signal sent by the control chip, the image processor uses them to drive the display screen to display the corresponding image.
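Putting steps 101 through 104 together, a hedged end-to-end sketch that reuses the hypothetical helpers from the earlier sketches (classify, SIGNATURE_TO_TOUCH, handle_touch):

```python
import numpy as np

def touch_display_method(raw: np.ndarray, sample_rate: float, ui) -> None:
    # Step 101: `raw` is assumed to be the sound vibration signal already
    # sampled from the detecting elements on the display screen.
    # Step 102: analyze the signal and determine the body touch type.
    touch_type = classify(raw, sample_rate, SIGNATURE_TO_TOUCH)
    # Steps 103-104: generate and apply the matching image data and
    # driving signal, collapsed here into firing the trigger event.
    handle_touch(touch_type, ui)
```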
  • In addition, the central processor may further acquire a sound feature signal from the received sound vibration signal, receive the body touch type input by the user for that signal, and send both to the sound feature signal database;
  • the sound feature signal database then updates the correspondence between sound feature signals and body touch types accordingly.
  • In this way, the sound feature signal database not only stores the existing correspondences, but can also update them as needed and add new correspondences between sound feature signals and body touch types.
  • This method can identify the body part contacting the display screen, so different touch operations can be performed with different body parts, and the display screen can be controlled to show the corresponding image. For example, a forehead touch can trigger the confirm function, a chin touch the cancel function, a nose-tip touch the double-click function, and a toe touch the drawing function; the display screen is then controlled according to these touch operations, making touch display devices convenient for people with disabilities.
  • In some embodiments, the present disclosure further provides a touch display method, applied to the touch display device of the embodiments above. As shown in FIG. 5, the method includes:
  • Step 201: acquire the sound vibration signal and the touch point position coordinates generated when different parts of the body touch the display screen.
  • Specifically, sound vibration sensors distributed on the display screen can sense the different sound vibration signals generated when different parts of the body, such as the forehead, chin, or nose tip, touch the display screen.
  • Touch sensors are also distributed on the display screen.
  • The touch sensors register changes at the touch position and sense the coordinate position of the touch point.
  • The sensing methods used include, but are not limited to, capacitive, resistive, and optical sensing.
  • Step 202: analyze the sound vibration signal, determine the body touch type, and generate a command signal including the body touch type and the touch point coordinate position;
  • The correspondences between sound feature signals and body touch types are stored in the sound feature signal database.
  • Body touch types are divided into single touches and multiple touches; a single touch is a touch by a single body part, such as a forehead, nose-tip, chin, or toe touch.
  • A multiple touch is a combination of touches by at least two body parts, in any combination of the above; different parts of the body emit different sounds when they contact the display screen, and each body touch type corresponds to a unique sound feature signal.
  • For example, sound feature signal A corresponds to a forehead touch,
  • sound feature signal B to a nose-tip touch,
  • sound feature signal C to a chin touch, sound feature signal D to a toe touch,
  • sound feature signal E to a forehead + nose tip touch,
  • and sound feature signal F to a chin + nose tip touch, and so on.
  • The central processor collects the raw sound vibration signal, applies a Fourier transform or similar processing to obtain key parameters such as the characteristic amplitude and characteristic frequency, performs feature extraction to obtain the sound feature signal, and determines the corresponding body touch type according to the correspondences stored in the sound feature signal database. For example, if the central processor extracts sound feature signal A, the body touch type is determined to be a forehead touch; signal B, a nose-tip touch;
  • signal C, a chin touch; signal D, a toe touch;
  • signal E, a forehead + nose tip touch; and signal F, a chin + nose tip touch.
  • After recognizing the body touch type, the central processor generates a command signal including the body touch type and the touch point coordinate position and sends it to the control chip.
  • Step 203: generate corresponding image data and a driving signal according to the command signal.
  • The trigger event database stores the correspondences between (touch point coordinate position, body touch type) pairs and trigger events: for example,
  • forehead touch + touch point position coordinates confirms the icon at the touch point;
  • chin touch + touch point position coordinates cancels the icon at the touch point;
  • nose-tip touch + touch point position coordinates double-clicks the icon at the touch point;
  • toe touch + touch point position coordinates zooms the screen at the touch point; and the like.
  • The control chip compares the body touch type and touch point position coordinates contained in the received command signal with the information in the trigger event database and determines the corresponding image data and driving signal. For example, when the central processor determines that the body touch type is a forehead touch, the control chip sends the image data and driving signal that confirm the icon at the touch point to the image processor, which drives the display screen to show the confirm screen;
  • when the body touch type is a chin touch, the control chip sends the image data and driving signal that cancel the icon at the touch point, and the display shows the cancel screen; when the body touch type is a nose-tip touch, the control chip sends the image data and driving signal that double-click the icon at the touch point, and the display shows the double-click screen; and when the body touch type is a toe touch, the control chip sends the image data and driving signal that zoom the screen at the touch point, and the display shows the zoomed screen.
  • Step 204: drive the display screen with the image data and the driving signal to display the corresponding image.
  • After receiving the image data and the driving signal sent by the control chip, the image processor uses them to drive the display screen to display the corresponding image.
  • In addition, the central processor may further acquire a sound feature signal from the received sound vibration signal, receive the body touch type input by the user for that signal, and send both to the sound feature signal database;
  • the sound feature signal database then updates the correspondence between sound feature signals and body touch types accordingly.
  • In this way, the sound feature signal database not only stores the existing correspondences, but can also update them as needed and add new correspondences between sound feature signals and body touch types.
  • This method can identify both the body part touching the screen and the touch point position coordinates, so different touch operations can be performed with different body parts, and the display screen can be controlled to show the corresponding image. For example, a forehead touch can trigger the confirm function, a chin touch the cancel function, a nose-tip touch the double-click function, and a toe touch the zoom function; the display screen is then controlled according to these touch operations to display the corresponding image, making touch display devices convenient for people with disabilities.
  • modules and units may be implemented in software for execution by various types of processors.
  • An identified module or unit of executable code may, for instance, comprise one or more physical or logical blocks of computer instructions, which may be organized as an object, a procedure, or a function.
  • Nevertheless, the executable code of an identified module or unit need not be physically located together; it may comprise disparate instructions stored in different locations which, when joined logically together, constitute the module or unit and achieve its stated purpose.
  • A module or unit of executable code may be a single instruction or many instructions, and may even be distributed over several different code segments, among different programs, and across multiple memory devices.
  • Similarly, operational data may be identified within modules or units, embodied in any suitable form, and organized within any suitable type of data structure. The operational data may be collected as a single data set or distributed over different locations (including over different storage devices), and may exist, at least in part, merely as electronic signals on a system or network.
  • Where a module or unit can be implemented in software, then, taking into account the level of existing hardware technology, a person skilled in the art could, cost aside, also build corresponding hardware circuitry to achieve the same function.
  • Such hardware circuitry includes conventional Very Large Scale Integration (VLSI) circuits or gate arrays, as well as existing semiconductor devices such as logic chips and transistors, or other discrete components.
  • Modules, units can also be implemented with programmable hardware devices, such as field programmable gate arrays, programmable array logic, programmable logic devices, and the like.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • General Physics & Mathematics (AREA)
  • Acoustics & Sound (AREA)
  • User Interface Of Digital Computer (AREA)
  • Measuring And Recording Apparatus For Diagnosis (AREA)

Abstract

A touch display device and a touch display method. The touch display device includes: a display screen (50); a plurality of detecting elements (10) disposed on the display screen (50) for acquiring sound vibration signals generated when different parts of the body contact the display screen (50); a central processing unit (20) connected to the detecting elements (10) for analyzing the sound vibration signals transmitted by the detecting elements (10), determining a body touch type, and generating a command signal including the body touch type; a control chip (30) connected to the central processing unit (20) for receiving the command signal sent by the central processing unit (20) and generating image data and a driving signal according to the command signal; and an image processor (40) connected to the control chip (30) for receiving the image data and the driving signal sent by the control chip (30) and thereby driving the display screen (50) to display the corresponding image.

Description

Touch display device and touch display method
Cross-Reference to Related Application
This application claims priority to Chinese Patent Application No. 201510515461.7, filed in China on August 20, 2015, the entire contents of which are incorporated herein by reference.
Technical Field
The present disclosure relates to the field of touch technology, and in particular to a touch display device and a touch display method.
Background
With the development of touch technology, touch devices based on touch operations have gradually become standard equipment on portable mobile terminals. Through the touch screen, the user can input coordinate information to the mobile terminal directly by hand; like the mouse and the keyboard, it is an input device. The touch screen has many advantages, such as ruggedness, fast response, space saving, and ease of interaction. For example, smartphones with touch screens are becoming more powerful and their screens larger. A large screen makes it easy for users to browse full webpages and watch videos, giving a good display effect and viewing comfort.
There are many categories of touch technology, commonly resistive, capacitive, infrared, and electromagnetic-pen types. The technology has progressed from single-touch to multi-touch, with markedly improved sensitivity, and has gradually become practical and mature.
However, existing touch technologies generally require the user to perform touch operations by hand, for example with a finger or a stylus. For people with limb disabilities, existing touch technologies often fail to meet their control needs.
Summary
The technical problem to be solved by the present disclosure is to provide a touch display device and a touch display method that enable a user to perform touch operations with different parts of the body and control the display screen to display the corresponding image according to those operations, thereby making touch display devices convenient for people with disabilities.
To solve the above technical problem, embodiments of the present disclosure provide the following technical solutions:
In one aspect, a touch display device is provided, including:
a display screen;
a plurality of detecting elements disposed on the display screen for acquiring sound vibration signals generated when different parts of the body contact the display screen;
a central processing unit connected to the detecting elements for analyzing the sound vibration signals transmitted by the detecting elements, determining a body touch type, and generating a command signal including the body touch type;
a control chip connected to the central processing unit for receiving the command signal sent by the central processing unit and generating image data and a driving signal according to the command signal;
an image processor connected to the control chip for receiving the image data and the driving signal sent by the control chip and thereby driving the display screen to display the corresponding image.
Further, the touch display device also includes:
a sound feature signal database for storing the correspondences between sound feature signals and body touch types;
the central processing unit is specifically configured to acquire a sound feature signal from the sound vibration signal and to determine, according to the correspondences, the body touch type corresponding to the sound feature signal.
Further, the central processing unit is also configured to acquire a sound feature signal from the received sound vibration signal, receive an input body touch type corresponding to the sound vibration signal, and send the sound feature signal and the input body touch type to the sound feature signal database;
the sound feature signal database is also configured to update the correspondence between sound feature signals and body touch types according to the received sound feature signal and body touch type.
Further, body touch types include single touches and multiple touches: a single touch is a touch by a single body part, and a multiple touch is a combination of touches by at least two body parts.
Further, the central processing unit includes:
a sampling module for receiving the sound vibration signals sent by the detecting elements;
a signal processing module for processing the sound vibration signal to obtain its sound parameters;
a feature extraction module for performing feature extraction on the sound vibration signal according to the acquired sound parameters to obtain a sound feature signal;
a comparison module for comparing the acquired sound feature signal with the data in the sound feature signal database to determine the body touch type corresponding to the sound feature signal.
Further, the touch display device also includes:
a trigger event database for storing the correspondences between body touch types and trigger events;
the control chip is specifically configured to compare the body touch type contained in the received command signal with the information in the trigger event database and to determine the image data and driving signal corresponding to the body touch type.
Further, the touch display device also includes:
a plurality of touch sensors disposed on the display screen for acquiring the coordinate position of the touch point when different parts of the body contact the display screen;
the central processing unit is also connected to the touch sensors and generates a command signal including the body touch type and the touch point coordinate position.
Further, the trigger event database also stores the correspondences between (touch point coordinate position, body touch type) pairs and trigger events;
the control chip is specifically configured to compare the body touch type and touch point position coordinates contained in the received command signal with the information in the trigger event database and to determine the image data and driving signal corresponding to the body touch type and the touch point position coordinates.
Further, the touch display device also includes:
a data storage unit connected to the control chip for storing the command signal of the current frame received by the control chip.
Further, the touch display device also includes:
a buffer unit connected to the control chip for buffering the command signal of the next frame received by the control chip.
Embodiments of the present disclosure also provide a touch display method applied to the touch display device described above, the method including:
acquiring sound vibration signals generated when different parts of the body contact the display screen;
analyzing the sound vibration signals, determining a body touch type, and generating a command signal including the body touch type;
generating corresponding image data and a driving signal according to the command signal;
driving the display screen with the image data and the driving signal to display the corresponding image.
Further, the method also includes:
acquiring the coordinate position of the touch point when different parts of the body contact the display screen;
the step of generating the command signal including the body touch type is then specifically:
generating a command signal including the body touch type and the touch point coordinate position.
Embodiments of the present disclosure have the following beneficial effects:
In the above solution, a plurality of detecting elements are disposed on the display screen and can acquire the sound vibration signals generated when different parts of the body contact the display screen; the central processing unit can then determine the body touch type from the sound vibration signal and generate a corresponding command signal; the control chip receives the command signal and generates corresponding image data and a driving signal; and the image processor receives the image data and the driving signal and drives the display screen to display the corresponding image. Embodiments of the present disclosure can identify which body part contacts the display screen, so different touch operations can be performed with different body parts and the display screen can be controlled to show the corresponding image. For example, a forehead touch can trigger the confirm function, a chin touch the cancel function, a nose-tip touch the double-click function, and a toe touch the drawing function; the display screen is then controlled according to these touch operations, making touch display devices convenient for people with disabilities.
Brief Description of the Drawings
FIG. 1 is a schematic structural diagram of a touch display device according to some embodiments of the present disclosure;
FIG. 2 is a schematic structural diagram of a central processing unit according to some embodiments of the present disclosure;
FIG. 3 is a schematic structural diagram of a touch display device according to some embodiments of the present disclosure;
FIG. 4 is a schematic flowchart of a touch display method according to some embodiments of the present disclosure;
FIG. 5 is a schematic flowchart of a touch display method according to some embodiments of the present disclosure;
FIG. 6 is a schematic diagram of body touch types, including single touches and multiple touches.
Detailed Description
To make the objectives, technical solutions, and advantages of the embodiments of the present disclosure clearer, the technical solutions of the embodiments are described below clearly and completely with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present disclosure. All other embodiments obtained by a person of ordinary skill in the art based on the described embodiments fall within the scope of protection of the present disclosure.
Unless otherwise defined, technical or scientific terms used herein shall have the ordinary meaning understood by a person of ordinary skill in the art to which the present disclosure belongs. Words such as "first", "second", and the like used in the specification and claims of this application do not denote any order, quantity, or importance, but merely distinguish different components. Likewise, "a", "an", and similar words do not denote a limitation of quantity, but rather the presence of at least one. "Connected", "coupled", and similar words are not limited to physical or mechanical connections, but may include electrical connections, whether direct or indirect. "Upper", "lower", "left", "right", and the like indicate only relative positional relationships, which change correspondingly when the absolute position of the described object changes.
Aimed at the problem in the prior art that touch display devices are inconvenient for people with disabilities, embodiments of the present disclosure provide a touch display device and a touch display method that enable a user to perform touch operations with different parts of the body and control the display screen to display the corresponding image according to those operations, thereby making touch display devices convenient for people with disabilities.
在一些实施例中,本公开提供了一种触控显示设备,如图1所示,包括:
显示屏50;
设置在显示屏50上的多个检测元件10,用于获取在身体不同部位接触显示屏50时所产生的声音振动信号,检测元件10可以成行成列地设置在显示屏50上;具体地,可以根据感应的精度来确定检测元件的个数,在需要的感应精度较高时,增加检测元件的个数,在需要的感应精度较低时,减少检测元件的个数,具体地,检测元件可以为声音传感器;
中央处理器20,与检测元件10连接,用于对检测元件10传递的声音振动信号进行分析,确定身体触控类型,并生成包含有身体触控类型的指令信号;
控制芯片30,与中央处理器20连接,用于接收中央处理器20发送的指令信号,并根据指令信号生成影像数据和驱动信号;
图像处理器40,与控制芯片30连接,用于接收控制芯片30发送的影像数据和驱动信号,进而驱动显示屏50显示相应的画面。
本实施例在显示屏上设置有多个检测元件,能够获取在身体不同部位接触显示屏时所产生的声音振动信号,之后中央处理器可以根据声音振动信号确定身体触控类型,并生成相应的指令信号;控制芯片接收指令信号,并生成相应的影像数据和驱动信号;图像处理器接收影像数据和驱动信号,进而驱动显示屏显示相应的画面。本公开实施例能够识别接触显示屏的身体部位,这样就可以利用不同的身体部位可实现不同的触控操作,进而可以控制显示屏显示相应的画面。例如,用额头触控可实现确定功能,下巴触控可实现取 消功能,鼻尖触控实现双击功能,脚趾触控可实现画画功能等,进而根据这些触控操作来控制显示屏进行相应画面的显示,从而方便残疾人群使用触控显示设备。
Further, the touch display device also includes a sound feature signal database configured to store correspondences between sound feature signals and body touch types. Body touch types are divided into single touch and compound touch. As shown in Fig. 6, a single touch is a touch by a single body part, for example a forehead touch, nose-tip touch, chin touch, or toe touch, while a compound touch is a combination of touches by at least two body parts, i.e. any combination of the above. Different body parts produce different sounds when they contact the display screen, so each body touch type corresponds to a unique sound feature signal. For example, sound feature signal A corresponds to a forehead touch, B to a nose-tip touch, C to a chin touch, D to a toe touch, E to a forehead + nose-tip touch, F to a chin + nose-tip touch, and so on.
The central processor 20 is specifically configured to acquire a sound feature signal from the sound vibration signal and determine, according to the correspondences, the body touch type associated with that sound feature signal: obtaining sound feature signal A indicates a forehead touch, B a nose-tip touch, C a chin touch, D a toe touch, E a forehead + nose-tip touch, and F a chin + nose-tip touch.
Further, the central processor 20 is also configured to acquire a sound feature signal from a received sound vibration signal, receive an input body touch type corresponding to that sound vibration signal, and send the sound feature signal and the input body touch type to the sound feature signal database;
the sound feature signal database is also configured to update the correspondences between sound feature signals and body touch types according to the received sound feature signal and body touch type. In this way the sound feature signal database not only stores correspondences between sound feature signals and body touch types, but can also update them as needed and add new correspondences as needed.
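By way of illustration only, the update path just described can be pictured as a small enrollment interface: the central processor hands the database a freshly extracted sound feature signal together with the user-supplied body touch type, and the database either refreshes an existing correspondence or adds a new one. The following Python sketch is an assumption made for this example; the class and method names are invented, and the disclosure does not prescribe any particular data structure.

    class SoundFeatureDatabase:
        """Toy stand-in for the sound feature signal database:
        it maps a sound feature signal to a body touch type."""

        def __init__(self):
            self._entries = {}  # sound feature signal -> body touch type

        def lookup(self, feature_signal):
            return self._entries.get(feature_signal)

        def update(self, feature_signal, touch_type):
            # Refreshes an existing correspondence or adds a new one,
            # mirroring the enroll/update behaviour described above.
            self._entries[feature_signal] = touch_type

    db = SoundFeatureDatabase()
    db.update("E", "forehead + nose-tip touch")  # add a new compound type
    print(db.lookup("E"))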
Further, as shown in Fig. 2, the central processor 20 includes:
a sampling module 21 configured to receive the sound vibration signals sent by the detection elements 10. When a body part such as the forehead, chin, or nose tip contacts the display screen, the detection elements distributed on the display screen produce sound vibration signals; the sampling module mainly collects the raw sound vibration signals produced when different body parts contact the display screen, and the raw sound vibration signal cannot be used directly to determine the body touch type;
a signal processing module 22 configured to process the sound vibration signals collected by the sampling module 21 and acquire sound parameters of the sound vibration signal. Specifically, the signal processing module 22 may apply a Fourier transform or similar processing to the raw sound vibration signal to acquire its key parameters, such as its characteristic amplitude and characteristic frequency;
a feature extraction module 23 configured to perform feature extraction on the sound vibration signal according to the sound parameters acquired by the signal processing module 22 and acquire a sound feature signal; and
a comparison module 24 configured to compare the acquired sound feature signal with the data in the sound feature signal database and determine the body touch type corresponding to the sound feature signal.
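For concreteness, the chain of sampling, signal processing, feature extraction, and comparison could be modelled as in the Python sketch below. This is illustrative only: the function names, the two-component feature (characteristic amplitude and characteristic frequency), the nearest-neighbour comparison, and the example database entries are all assumptions for this example, since the disclosure specifies Fourier-type processing and comparison against stored feature signals but no particular algorithm.

    import numpy as np

    # Hypothetical sound feature signal database:
    # label -> ((characteristic amplitude, characteristic frequency), touch type).
    FEATURE_DB = {
        "A": ((0.80, 120.0), "forehead touch"),
        "B": ((0.35, 310.0), "nose-tip touch"),
        "C": ((0.60, 180.0), "chin touch"),
        "D": ((0.95, 60.0), "toe touch"),
    }

    def extract_features(signal, sample_rate):
        """Signal processing + feature extraction: Fourier transform, then
        keep the characteristic amplitude and characteristic frequency."""
        spectrum = np.abs(np.fft.rfft(signal))
        freqs = np.fft.rfftfreq(signal.size, d=1.0 / sample_rate)
        peak = int(np.argmax(spectrum[1:])) + 1  # skip the DC bin
        return (spectrum[peak] / spectrum.sum(), freqs[peak])

    def classify_touch(signal, sample_rate=8000.0):
        """Comparison: the nearest stored sound feature signal wins."""
        feat = np.array(extract_features(signal, sample_rate))
        label, (_, touch_type) = min(
            FEATURE_DB.items(),
            key=lambda item: np.linalg.norm(feat - np.array(item[1][0])),
        )
        return label, touch_type

    # Demo: a synthetic 120 Hz burst standing in for a forehead tap.
    t = np.arange(0, 0.1, 1.0 / 8000.0)
    print(classify_touch(np.sin(2 * np.pi * 120.0 * t)))  # ('A', 'forehead touch')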
Further, the touch display device also includes a trigger event database configured to store correspondences between body touch types and trigger events, for example forehead touch → confirm function, chin touch → cancel function, nose-tip touch → double-click function, toe touch → drawing function, and so on.
The control chip 30 is specifically configured to compare the body touch type contained in the received instruction signal with the information in the trigger event database and determine the image data and drive signal corresponding to that body touch type. For example, when the central processor 20 determines the body touch type to be a forehead touch, the control chip 30 sends the image data and drive signal corresponding to the confirm function to the image processor, which drives the display screen to display the picture for the confirm function; a chin touch likewise yields the picture for the cancel function, a nose-tip touch the picture for the double-click function, and a toe touch the picture for the drawing function.
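Purely as a sketch, this correspondence can be thought of as a lookup table that the control chip consults before producing image data and a drive signal; the event names below mirror the examples in the text, while the function name and the placeholder return values are hypothetical and not part of the disclosure.

    TRIGGER_EVENTS = {
        "forehead touch": "confirm",
        "chin touch": "cancel",
        "nose-tip touch": "double_click",
        "toe touch": "draw",
    }

    def build_frame(touch_type):
        """Control-chip step: instruction signal in, (image data, drive signal) out."""
        event = TRIGGER_EVENTS.get(touch_type)
        if event is None:
            return None  # unknown touch type: no screen update
        image_data = f"frame_for_{event}"  # placeholder for real image data
        drive_signal = f"drive_{event}"    # placeholder for real timing signals
        return image_data, drive_signal

    print(build_frame("chin touch"))  # ('frame_for_cancel', 'drive_cancel')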
Further, as shown in Fig. 3, the touch display device also includes a data storage unit 70 connected to the control chip 30 and configured to store the current-frame instruction signal received by the control chip 30, so that historical data can be recorded through the data storage unit 70.
Further, the touch display device also includes a cache unit 80 connected to the control chip 30 and configured to cache the next-frame instruction signal received by the control chip 30. In the next frame, the control chip 30 can then generate the corresponding image data and drive signal directly from the instruction signal cached in the cache unit 80, which speeds up body-part identification and touch triggering.
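One plausible reading of the data storage unit plus cache unit is a pair of buffers in front of the control chip: a history log for current-frame instruction signals and a queue for next-frame ones. The Python sketch below is an assumption about how such buffering might behave, not a circuit-level description; all names are invented for this example.

    from collections import deque

    class InstructionBuffer:
        """history plays the role of the data storage unit (past frames);
        pending plays the role of the cache unit (next-frame signals), so the
        control chip can proceed without waiting on the central processor."""

        def __init__(self):
            self.history = []       # data storage unit
            self.pending = deque()  # cache unit

        def receive(self, instruction):
            self.pending.append(instruction)

        def next_frame(self):
            if not self.pending:
                return None
            instruction = self.pending.popleft()
            self.history.append(instruction)  # archive the served frame
            return instruction

    buf = InstructionBuffer()
    buf.receive({"touch_type": "toe touch"})
    print(buf.next_frame())  # served from the cache, then archived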
The cache unit 80, the data storage unit 70, and the image processor 40 may be connected to the control chip 30 via a bus.
This embodiment can identify the body part contacting the display screen, so different body parts can perform different touch operations and thereby control what the display screen shows. For example, a forehead touch can trigger a confirm function, a chin touch a cancel function, a nose-tip touch a double-click function, and a toe touch a drawing function; the display screen then displays the corresponding picture according to these touch operations, making the touch display device convenient for disabled users.
In some embodiments, the present disclosure also provides a touch display device, as shown in Fig. 3, including:
a display screen 50;
a plurality of detection elements 10 arranged on the display screen 50 and configured to acquire sound vibration signals generated when different body parts contact the display screen 50. The detection elements 10 may be arranged in rows and columns on the display screen 50. Specifically, the number of detection elements may be chosen according to the required sensing precision: more detection elements when higher precision is needed, fewer when lower precision suffices. Specifically, the detection elements may be sound sensors;
a plurality of touch sensors 60 arranged on the display screen 50 and configured to acquire the touch-point coordinate positions at which different body parts contact the display screen 50. The touch sensors 60 may be arranged in rows and columns on the display screen 50. Specifically, the number of touch sensors may be chosen according to the required sensing precision: more touch sensors when higher precision is needed, fewer when lower precision suffices;
a central processor 20 connected to the detection elements 10 and to the touch sensors 60 and configured to analyze the sound vibration signals delivered by the detection elements 10, determine a body touch type, and generate an instruction signal containing the body touch type and the touch-point coordinate position;
a control chip 30 connected to the central processor 20 and configured to receive the instruction signal sent by the central processor 20 and generate image data and a drive signal according to the instruction signal; and
an image processor 40 connected to the control chip 30 and configured to receive the image data and the drive signal sent by the control chip 30 and thereby drive the display screen 50 to display a corresponding picture.
In this embodiment, a plurality of detection elements and touch sensors arranged on the display screen acquire the sound vibration signals and touch-point coordinate positions produced when different body parts contact the display screen; the body touch type can then be determined from the sound vibration signal, image data and a drive signal corresponding to the body touch type and touch-point coordinate position are generated, and the display screen is driven to display the corresponding picture. Because this embodiment can identify both the body part contacting the display screen and the touch-point coordinate position, different body parts can perform different touch operations and thereby control what the display screen shows. For example, forehead + touch-point coordinate position can confirm the icon at the forehead touch point, chin + touch-point coordinate position can cancel the icon at the chin touch point, nose tip + touch-point coordinate position can double-click the icon at the nose-tip touch point, and toe + touch-point coordinate position can zoom in on the picture at the toe touch point; the display screen then displays the corresponding picture according to these touch operations, making the touch display device convenient for disabled users.
Further, the touch display device also includes a sound feature signal database configured to store correspondences between sound feature signals and body touch types. Body touch types are divided into single touch and compound touch: a single touch is a touch by a single body part, for example a forehead touch, nose-tip touch, chin touch, or toe touch, while a compound touch is a combination of touches by at least two body parts, i.e. any combination of the above. Different body parts produce different sounds when they contact the display screen, so each body touch type corresponds to a unique sound feature signal: for example, sound feature signal A corresponds to a forehead touch, B to a nose-tip touch, C to a chin touch, D to a toe touch, E to a forehead + nose-tip touch, F to a chin + nose-tip touch, and so on.
The central processor 20 is specifically configured to acquire a sound feature signal from the sound vibration signal and determine, according to the correspondences, the body touch type associated with that sound feature signal: obtaining sound feature signal A indicates a forehead touch, B a nose-tip touch, C a chin touch, D a toe touch, E a forehead + nose-tip touch, and F a chin + nose-tip touch.
Further, the central processor 20 is also configured to acquire a sound feature signal from a received sound vibration signal, receive an input body touch type corresponding to that sound vibration signal, and send the sound feature signal and the input body touch type to the sound feature signal database.
The sound feature signal database is also configured to update the correspondences between sound feature signals and body touch types according to the received sound feature signal and body touch type. In this way the database not only stores the correspondences but can also update them as needed and add new ones as needed.
Further, as shown in Fig. 2, the central processor 20 includes:
a sampling module 21 configured to receive the sound vibration signals sent by the detection elements 10. When a body part such as the forehead, chin, or nose tip contacts the display screen, the detection elements distributed on the display screen produce sound vibration signals; the sampling module mainly collects the raw sound vibration signals produced when different body parts contact the display screen, and the raw sound vibration signal cannot be used directly to determine the body touch type;
a signal processing module 22 configured to process the sound vibration signals collected by the sampling module 21 and acquire sound parameters of the sound vibration signal. Specifically, the signal processing module may apply a Fourier transform or similar processing to the raw sound vibration signal to acquire its key parameters, such as its characteristic amplitude and characteristic frequency;
a feature extraction module 23 configured to perform feature extraction on the sound vibration signal according to the sound parameters acquired by the signal processing module 22 and acquire a sound feature signal; and
a comparison module 24 configured to compare the acquired sound feature signal with the data in the sound feature signal database and determine the body touch type corresponding to the sound feature signal.
Further, the touch display device also includes a trigger event database configured to store correspondences between trigger events and combinations of touch-point coordinate positions and body touch types, for example forehead touch + touch-point position coordinates → confirm the icon at the touch point, chin touch + touch-point position coordinates → cancel the icon at the touch point, nose-tip touch + touch-point position coordinates → double-click the icon at the touch point, toe touch + touch-point position coordinates → zoom in on the picture at the touch point, and so on.
The control chip 30 is specifically configured to compare the body touch type and touch-point position coordinates contained in the received instruction signal with the information in the trigger event database and determine the corresponding image data and drive signal. For example, when the central processor 20 determines the body touch type to be a forehead touch, the control chip 30 sends the image data and drive signal for confirming the icon at the touch point to the image processor, which drives the display screen to display the picture for the confirm function; a chin touch likewise yields the cancel picture for the icon at the touch point, a nose-tip touch the double-click picture for the icon at the touch point, and a toe touch the zoom-in picture for the area at the touch point.
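To make the combined lookup concrete, one could key the trigger-event table on the body touch type and resolve the touch-point coordinates to the icon underneath them. In the Python sketch below, the hit-testing helper, the icon layout, and all names are illustrative assumptions, not part of the disclosure.

    # Hypothetical icon layout: name -> (x0, y0, x1, y1) bounding box.
    ICONS = {"mail": (0, 0, 100, 100), "camera": (120, 0, 220, 100)}

    EVENTS_BY_TYPE = {
        "forehead touch": "confirm",
        "chin touch": "cancel",
        "nose-tip touch": "double_click",
        "toe touch": "zoom_in",
    }

    def hit_test(x, y):
        """Map a touch-point coordinate to the icon (if any) underneath it."""
        for name, (x0, y0, x1, y1) in ICONS.items():
            if x0 <= x <= x1 and y0 <= y <= y1:
                return name
        return None

    def resolve_event(touch_type, x, y):
        event = EVENTS_BY_TYPE.get(touch_type)
        return (event, hit_test(x, y)) if event else None

    print(resolve_event("nose-tip touch", 150, 40))  # ('double_click', 'camera')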
This embodiment can identify both the body part contacting the display screen and the touch-point position coordinates, so different body parts can perform different touch operations and thereby control what the display screen shows. For example, a forehead touch can trigger a confirm function, a chin touch a cancel function, a nose-tip touch a double-click function, and a toe touch a zoom-in function; the display screen then displays the corresponding picture according to these touch operations, making the touch display device convenient for disabled users.
Further, as shown in Fig. 3, the touch display device also includes a data storage unit 70 connected to the control chip 30 and configured to store the current-frame instruction signal received by the control chip 30, so that historical data can be recorded through the data storage unit 70.
Further, the touch display device also includes a cache unit 80 connected to the control chip 30 and configured to cache the next-frame instruction signal received by the control chip 30. In the next frame, the control chip 30 can then generate the corresponding image data and drive signal directly from the instruction signal cached in the cache unit 80, which speeds up body-part identification and touch triggering.
The cache unit 80, the data storage unit 70, and the image processor 40 may be connected to the control chip 30 via a bus.
In some embodiments, the present disclosure also provides a touch display method applied to the touch display device of some of the above embodiments. As shown in Fig. 4, the method includes:
Step 101: acquiring sound vibration signals generated when different body parts contact the display screen.
A plurality of detection elements are distributed on the display screen. When the user performs a touch operation by contacting the display screen with a body part, the acoustic vibration sensors can sense the different sound vibration signals produced when different body parts, such as the forehead, chin, or nose tip, contact the display screen.
Step 102: analyzing the sound vibration signal, determining the body touch type, and generating an instruction signal containing the body touch type.
The sound feature signal database stores correspondences between sound feature signals and body touch types. Body touch types are divided into single touch and compound touch: a single touch is a touch by a single body part, for example a forehead touch, nose-tip touch, chin touch, or toe touch, while a compound touch is a combination of touches by at least two body parts, i.e. any combination of the above. Different body parts produce different sounds when they contact the display screen, so each body touch type corresponds to a unique sound feature signal: for example, sound feature signal A corresponds to a forehead touch, B to a nose-tip touch, C to a chin touch, D to a toe touch, E to a forehead + nose-tip touch, F to a chin + nose-tip touch, and so on.
The central processor collects the raw sound vibration signal and may apply a Fourier transform or similar processing to it to acquire its key parameters, such as its characteristic amplitude and characteristic frequency; it then performs feature extraction to acquire a sound feature signal and determines the corresponding body touch type according to the correspondences stored in the sound feature signal database. For example, obtaining sound feature signal A indicates a forehead touch, B a nose-tip touch, C a chin touch, D a toe touch, E a forehead + nose-tip touch, and F a chin + nose-tip touch.
After identifying the body touch type, the central processor sends an instruction signal containing the body touch type to the control chip.
Step 103: generating corresponding image data and a drive signal according to the instruction signal.
The trigger event database stores correspondences between body touch types and trigger events, for example forehead touch → confirm function, chin touch → cancel function, nose-tip touch → double-click function, toe touch → drawing function, and so on. The control chip compares the body touch type contained in the received instruction signal with the information in the trigger event database and determines the image data and drive signal corresponding to that body touch type. For example, when the central processor determines the body touch type to be a forehead touch, the control chip sends the image data and drive signal corresponding to the confirm function to the image processor, which drives the display screen to display the picture for the confirm function; a chin touch likewise yields the picture for the cancel function, a nose-tip touch the picture for the double-click function, and a toe touch the picture for the drawing function.
Step 104: driving the display screen to display the corresponding picture using the image data and the drive signal.
After receiving the image data and the drive signal from the control chip, the image processor uses them to drive the display screen to display the corresponding picture.
Further, the central processor may also acquire a sound feature signal from a received sound vibration signal, receive a user-input body touch type corresponding to that sound vibration signal, and send the sound feature signal and the input body touch type to the sound feature signal database, which updates the correspondences between sound feature signals and body touch types accordingly. In this way the database not only stores the correspondences but can also update them as needed and add new ones as needed.
This embodiment can identify the body part contacting the display screen, so different body parts can perform different touch operations and thereby control what the display screen shows. For example, a forehead touch can trigger a confirm function, a chin touch a cancel function, a nose-tip touch a double-click function, and a toe touch a drawing function; the display screen then displays the corresponding picture according to these touch operations, making the touch display device convenient for disabled users.
In some embodiments, the present disclosure also provides a touch display method applied to the touch display device of some of the above embodiments. As shown in Fig. 5, the method includes:
Step 201: acquiring the sound vibration signals and touch-point position coordinates generated when different body parts contact the display screen.
A plurality of detection elements are distributed on the display screen. When the user performs a touch operation by contacting the display screen with a body part, the acoustic vibration sensors can sense the different sound vibration signals produced when different body parts, such as the forehead, chin, or nose tip, contact the display screen.
A plurality of touch sensors are also distributed on the display screen. When the user performs a touch operation by contacting the display screen with a body part, the touch sensors can track changes in the touch position and sense the coordinate position of the touch point. The sensing schemes the touch sensors may adopt include, but are not limited to, capacitive, resistive, and optical sensing.
Step 202: analyzing the sound vibration signal, determining the body touch type, and generating an instruction signal containing the body touch type and the touch-point coordinate position.
The sound feature signal database stores correspondences between sound feature signals and body touch types. Body touch types are divided into single touch and compound touch: a single touch is a touch by a single body part, for example a forehead touch, nose-tip touch, chin touch, or toe touch, while a compound touch is a combination of touches by at least two body parts, i.e. any combination of the above. Different body parts produce different sounds when they contact the display screen, so each body touch type corresponds to a unique sound feature signal: for example, sound feature signal A corresponds to a forehead touch, B to a nose-tip touch, C to a chin touch, D to a toe touch, E to a forehead + nose-tip touch, F to a chin + nose-tip touch, and so on.
The central processor collects the raw sound vibration signal and may apply a Fourier transform or similar processing to it to acquire its key parameters, such as its characteristic amplitude and characteristic frequency; it then performs feature extraction to acquire a sound feature signal and determines the corresponding body touch type according to the correspondences stored in the sound feature signal database. For example, obtaining sound feature signal A indicates a forehead touch, B a nose-tip touch, C a chin touch, D a toe touch, E a forehead + nose-tip touch, and F a chin + nose-tip touch.
After identifying the body touch type, the central processor generates an instruction signal containing the body touch type and the touch-point coordinate position and sends it to the control chip.
Step 203: generating corresponding image data and a drive signal according to the instruction signal.
The trigger event database stores correspondences between trigger events and combinations of touch-point coordinate positions and body touch types, for example forehead touch + touch-point position coordinates → confirm the icon at the touch point, chin touch + touch-point position coordinates → cancel the icon at the touch point, nose-tip touch + touch-point position coordinates → double-click the icon at the touch point, toe touch + touch-point position coordinates → zoom in on the picture at the touch point, and so on.
The control chip compares the body touch type and touch-point position coordinates contained in the received instruction signal with the information in the trigger event database and determines the corresponding image data and drive signal. For example, when the central processor determines the body touch type to be a forehead touch, the control chip sends the image data and drive signal for confirming the icon at the touch point to the image processor, which drives the display screen to display the picture for the confirm function; a chin touch likewise yields the cancel picture for the icon at the touch point, a nose-tip touch the double-click picture for the icon at the touch point, and a toe touch the zoom-in picture for the area at the touch point.
Step 204: driving the display screen to display the corresponding picture using the image data and the drive signal.
After receiving the image data and the drive signal from the control chip, the image processor uses them to drive the display screen to display the corresponding picture.
Further, the central processor may also acquire a sound feature signal from a received sound vibration signal, receive a user-input body touch type corresponding to that sound vibration signal, and send the sound feature signal and the input body touch type to the sound feature signal database, which updates the correspondences between sound feature signals and body touch types accordingly. In this way the database not only stores the correspondences but can also update them as needed and add new ones as needed.
This embodiment can identify both the body part contacting the display screen and the touch-point position coordinates, so different body parts can perform different touch operations and thereby control what the display screen shows. For example, a forehead touch can trigger a confirm function, a chin touch a cancel function, a nose-tip touch a double-click function, and a toe touch a zoom-in function; the display screen then displays the corresponding picture according to these touch operations, making the touch display device convenient for disabled users.
Many of the functional components described in this specification are referred to as modules or units in order to emphasize the independence of their implementations more particularly.
In the embodiments of the present disclosure, modules and units may be implemented in software for execution by various types of processors. An identified module or unit of executable code may, for instance, comprise one or more physical or logical blocks of computer instructions, which may, for example, be organized as an object, procedure, or function. Nevertheless, the executable code of an identified module or unit need not be physically located together, but may comprise disparate instructions stored in different physical locations which, when joined logically together, constitute the module or unit and achieve its stated purpose.
Indeed, a module or unit of executable code may be a single instruction or many instructions, and may even be distributed over several different code segments, among different programs, and across several memory devices. Similarly, operational data may be identified within modules or units, may be embodied in any suitable form, and may be organized within any suitable type of data structure. The operational data may be collected as a single data set or may be distributed over different locations, including over different storage devices, and may exist, at least partially, merely as electronic signals on a system or network.
Where modules or units can be implemented in software, then, given the level of existing hardware technology and setting cost aside, a person skilled in the art can also build corresponding hardware circuits to achieve the corresponding functions. Such hardware circuits include conventional very-large-scale integration (VLSI) circuits or gate arrays, existing semiconductors such as logic chips and transistors, and other discrete components. Modules and units may also be implemented in programmable hardware devices such as field-programmable gate arrays, programmable array logic, and programmable logic devices.
In the method embodiments of the present disclosure, the numbering of the steps does not limit their order; for a person of ordinary skill in the art, changes to the order of the steps made without creative effort also fall within the scope of protection of the present disclosure.
The above are preferred embodiments of the present disclosure. It should be pointed out that a person of ordinary skill in the art may make several improvements and refinements without departing from the principles described in the present disclosure, and these improvements and refinements shall also be regarded as falling within the scope of protection of the present disclosure.

Claims (12)

  1. A touch display device, comprising:
    a display screen;
    a plurality of detection elements arranged on the display screen and configured to acquire sound vibration signals generated when different body parts contact the display screen;
    a central processor connected to the detection elements and configured to analyze the sound vibration signals delivered by the detection elements, determine a body touch type, and generate an instruction signal containing the body touch type;
    a control chip connected to the central processor and configured to receive the instruction signal sent by the central processor and generate image data and a drive signal according to the instruction signal; and
    an image processor connected to the control chip and configured to receive the image data and the drive signal sent by the control chip and thereby drive the display screen to display a corresponding picture.
  2. The touch display device according to claim 1, further comprising:
    a sound feature signal database configured to store correspondences between sound feature signals and body touch types;
    wherein the central processor is specifically configured to acquire a sound feature signal from the sound vibration signal and determine, according to the correspondences, the body touch type corresponding to the sound feature signal.
  3. The touch display device according to claim 2, wherein
    the central processor is further configured to acquire a sound feature signal from a received sound vibration signal, receive an input body touch type corresponding to the sound vibration signal, and send the sound feature signal and the input body touch type to the sound feature signal database;
    the sound feature signal database is further configured to update the correspondences between sound feature signals and body touch types according to the received sound feature signal and body touch type.
  4. The touch display device according to claim 1, wherein the body touch types comprise single touch and compound touch, a single touch being a touch by a single body part and a compound touch being a combination of touches by at least two body parts.
  5. The touch display device according to claim 2, wherein the central processor comprises:
    a sampling module configured to receive the sound vibration signals sent by the detection elements;
    a signal processing module configured to process the sound vibration signal and acquire sound parameters of the sound vibration signal;
    a feature extraction module configured to perform feature extraction on the sound vibration signal according to the acquired sound parameters and acquire a sound feature signal; and
    a comparison module configured to compare the acquired sound feature signal with the data in the sound feature signal database and determine the body touch type corresponding to the sound feature signal.
  6. The touch display device according to any one of claims 1-5, further comprising:
    a trigger event database configured to store correspondences between body touch types and trigger events;
    wherein the control chip is specifically configured to compare the body touch type contained in the received instruction signal with the information in the trigger event database and determine the image data and drive signal corresponding to the body touch type.
  7. The touch display device according to claim 6, further comprising:
    a plurality of touch sensors arranged on the display screen and configured to acquire the touch-point coordinate positions at which different body parts contact the display screen;
    wherein the central processor is further connected to the touch sensors and configured to generate an instruction signal containing the body touch type and the touch-point coordinate position.
  8. The touch display device according to claim 7, wherein
    the trigger event database further stores correspondences between trigger events and combinations of touch-point coordinate positions and body touch types;
    the control chip is specifically configured to compare the body touch type and touch-point position coordinates contained in the received instruction signal with the information in the trigger event database and determine the image data and drive signal corresponding to the body touch type and touch-point position coordinates.
  9. The touch display device according to claim 1, further comprising:
    a data storage unit connected to the control chip and configured to store the current-frame instruction signal received by the control chip.
  10. The touch display device according to claim 1, further comprising:
    a cache unit connected to the control chip and configured to cache the next-frame instruction signal received by the control chip.
  11. A touch display method applied to the touch display device according to any one of claims 1-10, the method comprising:
    acquiring sound vibration signals generated when different body parts contact the display screen;
    analyzing the sound vibration signal, determining a body touch type, and generating an instruction signal containing the body touch type;
    generating corresponding image data and a drive signal according to the instruction signal; and
    driving the display screen to display a corresponding picture using the image data and the drive signal.
  12. The touch display method according to claim 11, further comprising:
    acquiring the touch-point coordinate positions at which different body parts contact the display screen;
    wherein the step of generating an instruction signal containing the body touch type is specifically:
    generating an instruction signal containing the body touch type and the touch-point coordinate position.
PCT/CN2016/071171 2015-08-20 2016-01-18 Touch display device and touch display method WO2017028491A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US15/301,719 US20170177144A1 (en) 2015-08-20 2016-01-18 Touch display device and touch display method

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201510515461.7A CN105183217B (zh) 2015-08-20 2015-08-20 Touch display device and touch display method
CN201510515461.7 2015-08-20

Publications (1)

Publication Number Publication Date
WO2017028491A1 true WO2017028491A1 (zh) 2017-02-23

Family

ID=54905335

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2016/071171 WO2017028491A1 (zh) 2015-08-20 2016-01-18 触控显示设备及触控显示方法

Country Status (3)

Country Link
US (1) US20170177144A1 (zh)
CN (1) CN105183217B (zh)
WO (1) WO2017028491A1 (zh)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105183217B (zh) * 2015-08-20 2016-11-09 BOE Technology Group Co., Ltd. Touch display device and touch display method
CN107704108A (zh) * 2017-07-06 2018-02-16 Hefei Institutes of Physical Science, Chinese Academy of Sciences Foot writing pad based on flexible pressure sensors
CN111596798A (zh) * 2020-05-18 2020-08-28 BOE Technology Group Co., Ltd. Display device and sound production method thereof
CN113778249B (zh) * 2020-06-09 2024-01-23 BOE Technology Group Co., Ltd. Touch display drive module and method, and display device
CN113867158B (zh) * 2020-06-12 2023-10-24 Qingdao Haier Refrigerator Co., Ltd. Method for determining the operating state of a kitchen appliance, detection device, refrigerator, and readable storage medium

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010152693A (ja) * 2008-12-25 2010-07-08 Nec Corp Notification method for a portable information terminal, portable information terminal, and program
CN101794173A (zh) * 2010-03-23 2010-08-04 Zhejiang University Computer input device for handless disabled persons and input method thereof
CN202334482U (zh) * 2011-11-25 2012-07-11 Shanghai Tianyi Electric Co., Ltd. Integrated sound-light-vibration touch switch
US20140240262A1 (en) * 2013-02-27 2014-08-28 Samsung Electronics Co., Ltd. Apparatus and method for supporting voice service in a portable terminal for visually disabled people
CN104298397A (zh) * 2014-09-24 2015-01-21 Hefei Xinsheng Optoelectronics Technology Co., Ltd. Touch screen and positioning method thereof
CN105183217A (zh) * 2015-08-20 2015-12-23 BOE Technology Group Co., Ltd. Touch display device and touch display method

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9436282B2 (en) * 2013-03-14 2016-09-06 Immersion Corporation Contactor-based haptic feedback generation
US20150035759A1 (en) * 2013-08-02 2015-02-05 Qeexo, Co. Capture of Vibro-Acoustic Data Used to Determine Touch Types
US9740046B2 (en) * 2013-11-12 2017-08-22 Nvidia Corporation Method and apparatus to provide a lower power user interface on an LCD panel through localized backlight control
CN103995587B (zh) * 2014-05-13 2017-09-29 Lenovo (Beijing) Co., Ltd. Information control method and electronic device

Also Published As

Publication number Publication date
US20170177144A1 (en) 2017-06-22
CN105183217B (zh) 2016-11-09
CN105183217A (zh) 2015-12-23

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase

Ref document number: 15301719

Country of ref document: US

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16836358

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16836358

Country of ref document: EP

Kind code of ref document: A1

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 200918)