CN115016671A - Touch display control device, touch display device, equipment and method


Info

Publication number
CN115016671A
CN115016671A (application number CN202210524178.0A)
Authority
CN
China
Prior art keywords
touch
image data
display
unit
position information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210524178.0A
Other languages
Chinese (zh)
Inventor
康世振
王浩
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Eswin Computing Technology Co Ltd
Original Assignee
Beijing Eswin Computing Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Eswin Computing Technology Co Ltd filed Critical Beijing Eswin Computing Technology Co Ltd
Priority to CN202210524178.0A priority Critical patent/CN115016671A/en
Publication of CN115016671A publication Critical patent/CN115016671A/en
Pending legal-status Critical Current

Classifications

    • G06F3/0412: Digitisers, e.g. for touch screens or touch pads, structurally integrated in a display
    • G06F3/0414: Digitisers characterised by the transducing means, using force sensing means to determine a position
    • G06F3/041662: Details of scanning methods, e.g. sampling time, grouping of sub areas or time sharing with display driving, using alternate mutual and self-capacitive scanning
    • G06F3/0488: Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/1407: Digital output to display device; general aspects irrespective of display type

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A touch display control device, a touch display device, an electronic device and a touch display control method are provided. The touch display control device comprises a touch drawing buffer unit and a superposition unit. The touch drawing buffer unit is configured to process touch position information to obtain current second image data corresponding to a touch operation in a current processing cycle. The superimposing unit is configured to receive, from the system processing device, previous first image data corresponding to the touch operation in a previous processing period, and superimpose the current second image data with the previous first image data to obtain third image data. By superimposing the image data generated by the touch display control device on the image data generated by the system processing device for display, the delay time from touch to display is shortened, because the image data generated by the touch display control device is not subject to the delay caused by multi-layer data processing in the system processing device.

Description

Touch display control device, touch display device, equipment and method
Technical Field
Embodiments of the present disclosure relate to a touch display control device, a touch display device, an electronic apparatus, and a touch display control method.
Background
With the development of computer technology, computer input has evolved through four stages: paper tape input, keyboard input, mouse input, and touch input. Touch display technology is an interactive input technology that combines the two functions of touch control and display. It is widely applied in many fields, such as handheld consumer electronics, medical application equipment, vending machines/ticket machines/ATMs, and industrial and process control equipment, and has the characteristics of simple operation and flexible use.
Disclosure of Invention
At least one embodiment of the present disclosure provides a touch display control device, including: a touch drawing buffer unit configured to process the touch position information to obtain current second image data corresponding to the touch operation in a current processing cycle; and the superposition unit is configured to receive the prior first image data corresponding to the touch operation in the prior processing period from the system processing device, and superpose the current second image data and the prior first image data to obtain third image data.
For example, in the touch display control device provided in at least one embodiment of the present disclosure, the touch drawing buffer unit includes a touch drawing portion configured to process the touch position information to generate second intermediate image data and write the second intermediate image data into the drawing frame storage portion, and a drawing frame storage portion configured to update at least one frame buffer of the drawing frame storage portion using the second intermediate image data to obtain the current second image data.
For example, the touch display control device provided in at least one embodiment of the present disclosure further includes: the touch sensing unit is configured to receive a touch signal corresponding to touch operation in the current processing period to obtain the touch position information and provide the touch position information to the system processing device; a display driving unit configured to generate a display driving signal for display using the third image data.
At least one embodiment of the present disclosure provides a touch display device, which includes a touch display control device provided in any one embodiment of the present disclosure.
For example, the touch display device provided in at least one embodiment of the present disclosure further includes: a touch display panel configured to detect a touch operation in the current processing cycle to generate a touch signal and configured to receive a display driving signal generated by the touch display control device to display; and the system processing device is configured to receive and process the touch position information corresponding to the previous processing period to obtain the previous first image data.
For example, in the touch display device provided in at least one embodiment of the present disclosure, the touch display panel includes a touch panel and a display panel, the touch panel is configured to detect a touch operation in the current processing cycle to generate the touch signal, and the display panel is configured to receive the display driving signal generated by the touch display control device to perform display.
For example, in the touch display device provided in at least one embodiment of the present disclosure, the system processing device is further configured to issue a drawing control command or an erasing control command according to a drawing application program, so that the touch drawing buffer unit obtains the current second image data or erases a previous second image data.
For example, in the touch display device provided in at least one embodiment of the present disclosure, the system processing device is further configured to issue the erasing control command according to the drawing application program, so that the touch drawing buffer unit erases a portion of the previous second image data equivalent to the previous first image data and a drawing result corresponding to the previous second image data.
For example, in the touch display device provided in at least one embodiment of the present disclosure, the system processing device is further configured to issue a superimposition control command according to the drawing application program, so that the superimposition unit starts or stops a function of superimposing the current second image data and the previous first image data to obtain the third image data.
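The drawing, erasing, and superimposition control commands described above together form a small command interface from the system processing device to the touch display control device. A hedged sketch of such a command dispatch is given below; all class, method, and command names are invented for illustration and are not part of this disclosure.

```python
# Hypothetical command interface between the system processing device and
# the touch display control device; names are illustrative assumptions.
class TouchDisplayController:
    def __init__(self):
        self.second_image = []       # current second image data (stroke pixels)
        self.overlay_enabled = True  # superimposition function on/off

    def handle(self, command, payload=None):
        if command == "DRAW":            # drawing control command
            self.second_image.extend(payload)
        elif command == "ERASE":         # erasing control command
            self.second_image.clear()
        elif command == "OVERLAY_ON":    # start superimposing
            self.overlay_enabled = True
        elif command == "OVERLAY_OFF":   # stop superimposing
            self.overlay_enabled = False
        else:
            raise ValueError(f"unknown command: {command}")

ctrl = TouchDisplayController()
ctrl.handle("DRAW", [(1, 1), (2, 1)])  # drawing application issues a stroke
ctrl.handle("OVERLAY_OFF")             # drawing application disables overlay
```

In this sketch the drawing application never touches the stroke buffer directly; it only issues commands, matching the division of labor between the system processing device and the touch display control device described above.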
For example, in a touch display device provided in at least one embodiment of the present disclosure, the system processing device includes a touch drawing unit, a frame buffer management unit, and a display management unit, the touch drawing unit is configured to receive and process touch position information corresponding to the previous processing cycle to generate first intermediate image data and write the first intermediate image data into the frame buffer management unit, the frame buffer management unit is configured to update at least one frame buffer of the frame buffer management unit using the first intermediate image data to obtain the previous first image data, and the display management unit is configured to output the previous first image data to the overlay unit.
At least one embodiment of the present disclosure further provides an electronic device including the touch display device provided in any one of the embodiments of the present disclosure.
At least one embodiment of the present disclosure further provides a touch display control method, including: processing the touch position information to obtain current second image data corresponding to the touch operation in the current processing period; receiving, from a system processing apparatus, prior first image data corresponding to a touch operation in a prior processing period; and superposing the current second image data and the prior first image data to obtain third image data, wherein the third image data is used for generating display driving signals.
For example, the touch display control method provided by at least one embodiment of the present disclosure further includes receiving a touch signal corresponding to the touch operation in the current processing cycle to obtain the touch position information and providing the touch position information to the system processing device.
For example, in a touch display control method provided in at least one embodiment of the present disclosure, the processing the touch position information to obtain current second image data corresponding to the touch operation in a current processing cycle includes: processing the touch position information to generate second intermediate image data and writing the second intermediate image data into a drawing frame storage section; updating at least one frame buffer of the drawing frame storage section using the second intermediate image data to obtain the current second image data.
For example, the touch display control method provided by at least one embodiment of the present disclosure further includes receiving and processing, by the system processing device, the touch position information corresponding to the previous processing cycle to obtain the previous first image data.
Drawings
To more clearly illustrate the technical solutions of the embodiments of the present disclosure, the drawings of the embodiments will be briefly introduced below. It is apparent that the drawings in the following description relate only to some embodiments of the present disclosure and are not intended to limit the present disclosure.
FIG. 1 is a schematic diagram of a process from touch to display;
FIG. 2A is a schematic illustration of a delay time from touch to display;
FIG. 2B is a diagram illustrating exemplary delay times for different models of handsets in different transmission paths;
FIG. 3 is a diagram illustrating a display process based on a display frame;
fig. 4A is a schematic block diagram of a touch display control device according to at least one embodiment of the present disclosure;
fig. 4B is a schematic block diagram of another touch display control device according to at least one embodiment of the present disclosure;
fig. 5 is a schematic block diagram of a touch display device according to at least one embodiment of the present disclosure;
FIG. 6 is a diagram illustrating a specific example of the touch display device shown in FIG. 5 and a working process thereof;
fig. 7 is a schematic diagram illustrating a delay time from touch to display of a touch display device according to at least one embodiment of the present disclosure;
fig. 8A is a schematic diagram of an N +4 th frame drawing result of a touch display device according to at least one embodiment of the present disclosure;
fig. 8B is a schematic diagram of an N +5 th frame drawing result of a touch display device according to at least one embodiment of the present disclosure;
fig. 9 is a schematic block diagram of an electronic device provided in at least one embodiment of the present disclosure;
fig. 10 is a schematic block diagram of another electronic device provided by at least one embodiment of the present disclosure;
fig. 11 is a flowchart illustrating a touch display control method according to at least one embodiment of the present disclosure; and
fig. 12 is an exemplary flowchart of an example of step S120 in fig. 11.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present disclosure more apparent, the technical solutions of the embodiments of the present disclosure will be described clearly and completely with reference to the drawings of the embodiments of the present disclosure. It is to be understood that the described embodiments are only a few embodiments of the present disclosure, and not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the described embodiments of the disclosure without any inventive step, are within the scope of protection of the disclosure.
Unless otherwise defined, technical or scientific terms used herein shall have the ordinary meaning as understood by one of ordinary skill in the art to which this disclosure belongs. The use of "first," "second," and similar terms in this disclosure is not intended to indicate any order, quantity, or importance, but rather is used to distinguish one element from another. Also, the use of the terms "a," "an," or "the" and similar referents do not denote a limitation of quantity, but rather denote the presence of at least one. The word "comprising" or "comprises", and the like, means that the element or item listed before the word covers the element or item listed after the word and its equivalents, but does not exclude other elements or items. The terms "connected" or "coupled" and the like are not restricted to physical or mechanical connections, but may include electrical connections, whether direct or indirect. "upper", "lower", "left", "right", and the like are used only to indicate relative positional relationships, and when the absolute position of the object being described is changed, the relative positional relationships may also be changed accordingly.
The present disclosure is illustrated by the following specific examples. Detailed descriptions of known functions and known components may be omitted in order to keep the following description of the embodiments of the present disclosure clear and concise. When any component of an embodiment of the present disclosure appears in more than one drawing, that component is represented by the same or similar reference numeral in each drawing.
Touch display is a common application that combines a touch device and a display device. In this application, the touch device extracts the touch position (x and y axes) in the plane where the display image is located. The continuous movement of the touch position represents a user's action command, and the display frame is updated according to the action command. The response displayed by the display device under the action command may be, for example, scrolling, zooming, or paging.
In a touch display application, for example, which performs free drawing (e.g., lines, curves, or characters) on a touch display device using a stylus or the like, strokes drawn on a touch screen are displayed on the display screen in synchronization, in which operation a space corresponding to the touch device and a space corresponding to the display device are aligned with each other. The touch position data is extracted from the touch device and sent to the system processing means. The system processing means will update the drawing (e.g. a line, curve or part of a character) to combine it into the main display frame memory and send the updated drawing to the display driving unit. The display driving unit outputs a control signal to the display device to display the updated drawing.
Fig. 1 is a schematic flow chart from touch to display. For example, as shown in fig. 1, the touch display control device includes a touch sensing unit and a display driving unit, which are implemented, for example, as a Touch Display Drive Integrated (TDDI) chip; the touch display control device is used for driving a touch display device, and the touch display device comprises a touch panel and a display panel which are arranged in an overlapping mode. More specifically, the touch sensing unit is used for touch driving of the touch panel, and the display driving unit is used for display driving of the display panel. The system processing device includes, for example, a processor corresponding to the touch display device, and an operating system, application software, and the like running on the processor.
In the process of drawing and other operations performed by a user on a touch display device (e.g., in the operation interface of an application) using, for example, a stylus or a finger, the touch sensing unit receives a touch signal generated by the touch panel according to the user's touch operation, processes the touch signal (e.g., an analog signal) to obtain touch position information, and reports the touch position information to the system processing device. For example, as shown in FIG. 1, the system processing device includes a window drawing and system processing unit, a frame buffer management unit, and a display management unit to process image data in a variety of different frame buffers. The window drawing and system processing unit converts the touch position information reported by the touch sensing unit into image data related to the action of the touch operation. The frame buffer management unit updates the image data existing in the plurality of frame buffers using this image data, and obtains updated image data, for example, by combining it with the image data of the operation interface. The display management unit outputs the updated image data to the display driving unit of the touch display control device. The display driving unit obtains display data from the image data, outputs the display data to the data driving unit (or source driving unit) of the display panel, and outputs a gate driving signal to the gate driving circuit of the display panel; together these drive the pixel array in the display panel, so that the displayed picture, including the strokes resulting from the touch operation, appears on the display panel.
For example, in the best case, the touch sensing unit reports the touch position information to the system processing device within a certain Nth frame time. At the (N+1)th frame time, the system processing device converts the touch position information into image data. At the (N+2)th frame time, the system processing device outputs the image data updated in the frame buffer management unit for display. Therefore, even in the best case, a delay of at least 3 frames is required from the detection of a touch operation on the touch panel to the display of the corresponding image portion on the display panel.
For example, the delay time from touch to display mainly includes two parts: a touch display control part and a system processing part. The touch display control part samples the touch signal acquired on the touch panel and extracts the touch position by a filtering algorithm; the delay time of this part is denoted N0. N0 is a physical delay and is therefore unavoidable. The system processing part is usually implemented and executed in a complex Operating System (OS), which is usually built as a hierarchical structure, so that delay is introduced by the multi-layer processing of the OS:
(1) the driver layer and the kernel layer contain driver code, OS code, rendering code, composition code, and the like; obtaining input commands from the touch panel, performing system calls for user code, and outputting image data to the display panel together require a delay time N1;
(2) the runtime layer and the framework layer are composed of a number of dynamic Application Programming Interface (API) libraries that integrate input, output and kernel functions; this series of processes requires a delay time N2, and the delay time N2 may differ between operating systems (such as Android or Linux);
(3) the application layer may run multiple Applications (APPs) simultaneously, and these APPs have many interdependencies, so their code is very complex; this part tends to introduce the delay time N3.
As can be seen from the above, the sum of the delay times from touch to display is at least (N0 + N1 + N2 + N3).
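To make this budget concrete, the sketch below adds up the four delay components. All millisecond values are illustrative assumptions, not figures from this disclosure; only the decomposition into N0 through N3 follows the description above.

```python
# Hypothetical touch-to-display latency budget; the component values are
# illustrative placeholders, chosen only to sum to the 58 ms example
# discussed in connection with Fig. 2A.
N0 = 16  # touch sampling and filtering (physical, unavoidable)
N1 = 14  # driver and kernel layers
N2 = 15  # runtime and framework layers
N3 = 13  # application layer

total_ms = N0 + N1 + N2 + N3

# Express the OS-side portion (everything except N0) in display frames,
# assuming a 60 Hz panel (~16.7 ms per frame).
frame_ms = 1000 / 60
os_frames = (N1 + N2 + N3) / frame_ms

print(f"total: {total_ms} ms, OS-side: {os_frames:.1f} frames")
```

With these placeholder values the OS-side delay alone exceeds 40 ms, i.e. roughly 3 display frames, consistent with the discussion of Fig. 2A below.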
Fig. 2A is a schematic diagram of the delay time from touch to display. For example, as can be seen from fig. 2A and the above description of the system processing from touch to display, in the illustrated example the sum of the delay times from touch to display is (N0 + N1 + N2 + N3) = 58 milliseconds (ms). For example, as shown in fig. 2A, since N0 (e.g., the input scan and display change section in fig. 2A) is an indispensable delay for touch display, the sum of the delay times other than N0 (N1 + N2 + N3) exceeds 40 ms, which corresponds to a delay of about 3 display frames.
Fig. 2B is a schematic diagram illustrating the delay times of different models of mobile phones in different transmission paths, where the mobile phones use a capacitive touch display device (i.e., a capacitive touch display screen) and run the Android operating system. For example, as shown in fig. 2B, the delay time from touch to display is divided into a part from touch detection to the system kernel, a part from the system kernel to the callback, and a part from the color setting operation (setColor()) to the screen display. The delay time from touch to display of the model 1 mobile phone is about 78 ms, and that of the model 2 mobile phone is about 88 ms; both exhibit a significant display delay.
Fig. 3 is a schematic diagram describing the display process using display frames as the time base. For example, as shown in fig. 3, taking the touch display device shown in fig. 1 as an example: at the Nth frame time, the touch sensing unit receives a touch signal generated by the touch panel according to the user's touch operation to obtain touch position information; at the (N+1)th frame time, the touch sensing unit reports the touch position information to the system processing device, and the window drawing and system processing unit of the system processing device converts the touch position information into image data; at the (N+2)th and (N+3)th frame times, the frame buffer management unit updates the image data existing in the plurality of frame buffers using this image data; and at the (N+4)th frame time, the display management unit outputs the updated image data to the display driving unit of the touch display control device, and the display driving unit drives the display panel to display. As can be seen from the example shown in fig. 3, a delay of at least 5 frames is typically required from touch to display.
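The frame-by-frame walk-through above can be modeled as a simple pipeline in which each stage occupies whole display frames, so a touch sensed at frame N first appears on screen at frame N+4. The stage names and the per-stage frame counts below are schematic assumptions that mirror the description of Fig. 3.

```python
# Toy model of the conventional touch-to-display pipeline of Fig. 3.
# Each tuple is (stage name, display frames consumed); durations are
# schematic, following the frame-by-frame description above.
PIPELINE = [
    ("touch sensing: sample touch signal",            1),  # frame N
    ("report + window drawing/system processing",     1),  # frame N+1
    ("frame buffer management: update frame buffers", 2),  # frames N+2, N+3
    ("display management -> display driving",         1),  # frame N+4
]

def display_frame(touch_frame: int) -> int:
    """Frame index at which the stroke first becomes visible on screen."""
    frame = touch_frame
    for _stage, frames in PIPELINE:
        frame += frames
    return frame - 1  # the display happens during the final stage's frame

print(display_frame(0))  # a touch sensed at frame 0 is visible at frame 4
```

The proposed device shortens this chain by cutting out the middle two stages for the locally drawn stroke, which is exactly the motivation developed in the following paragraphs.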
The delay time from touch to display affects the user experience for applications that require immediate feedback, such as drawing or writing applications.
At least one embodiment of the present disclosure provides a touch display control device, which includes a touch drawing buffer unit and a superimposing unit. The touch drawing buffer unit is configured to process the touch position information to obtain current second image data corresponding to the touch operation in a current processing cycle. The superimposing unit is configured to receive, from the system processing device, previous first image data corresponding to the touch operation in a previous processing period, and superimpose the current second image data with the previous first image data to obtain third image data.
Some embodiments of the disclosure also provide a touch display device corresponding to the touch display control device, an electronic device, and a touch display control method corresponding to the touch display control device.
At least one embodiment of the present disclosure provides a touch display control device, a touch display device, an electronic apparatus, and a touch display control method, in which a touch drawing portion and a drawing frame storage portion are provided in the touch display control device; for example, the touch drawing portion and the frame buffer storage portion of the system processing device are replicated in the touch display control device. The image data generated by the touch display control device is superimposed by a superimposing unit onto the image data generated by the system processing device to obtain superimposed image data for display. In the same processing period, because the image data generated by the touch display control device does not suffer the delay caused by multi-layer data processing in the system processing device (including the operating system), it can be ahead of the image data generated by the system processing device, and so is the superimposed image data. The delay time from touch to display is thereby shortened, the system processing speed is increased, and the user experience is improved.
At least one embodiment of the present disclosure will be described in detail below with reference to the accompanying drawings. It should be noted that the same reference numerals in different figures will be used to refer to the same elements that have been described.
Fig. 4A is a schematic block diagram of a touch display control device according to at least one embodiment of the present disclosure.
For example, as shown in fig. 4A, the touch display control device 100 includes a touch drawing buffer unit 110 and a superimposing unit 120. The touch drawing buffer unit 110 is configured to process the touch position information to obtain second image data corresponding to the touch operation, for example, to obtain current second image data corresponding to the touch operation in a current processing cycle. The superimposing unit 120 is configured to receive first image data corresponding to a previous touch operation from the system processing apparatus and superimpose the second image data obtained in the touch display control apparatus 100 with the first image data to obtain third image data; for example, it receives previous first image data corresponding to the touch operation in a previous processing cycle from the system processing apparatus and superimposes the current second image data with the previous first image data to obtain third image data, which is used to drive the display panel for display, so that the image portion resulting from the touch operation is displayed promptly. For example, the touch display control apparatus 100 may be implemented by a combination of hardware and software or by firmware, for example, by a driver chip (IC) and a driver running on the driver chip; the embodiments of the disclosure are not limited in this respect. For example, the touch drawing buffer unit 110 and the superimposing unit 120 may be implemented separately and communicatively connected to each other, or integrated in the same hardware (e.g., an integrated circuit).
For example, the image superimposing operation of the superimposing unit 120 may overlay the second image portion to be presented by the second image data at a predetermined position of the first image portion to be presented by the first image data, for example, with the first image portion as the background and the second image portion as the foreground, in the same (virtual) coordinate system shared by the touch operation and the display operation.
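As an illustrative sketch (not the patent's implementation), the background/foreground superimposition described above can be expressed as a per-pixel overwrite in a shared coordinate system; the list-of-rows image representation and the use of `None` for transparent stroke pixels are assumptions made for this example.

```python
# Illustrative sketch of the superimposing operation: the first image
# portion is the background, the second image portion (strokes) is the
# foreground, and None marks a transparent (undrawn) stroke pixel.

def superimpose(first_image, second_image):
    """Overlay foreground stroke pixels onto the background frame."""
    third_image = [row[:] for row in first_image]  # copy the background
    for y, row in enumerate(second_image):
        for x, pixel in enumerate(row):
            if pixel is not None:  # drawn stroke pixel wins over background
                third_image[y][x] = pixel
    return third_image

background = [[0, 0, 0], [0, 0, 0]]               # first image data
strokes = [[None, 255, None], [None, None, 255]]  # second image data
third = superimpose(background, strokes)          # third image data
```

Because only non-transparent stroke pixels overwrite the background, the untouched portions of the first image portion remain visible, matching the foreground-over-background description above.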
Fig. 4B is a schematic block diagram of another touch display control device according to at least one embodiment of the present disclosure.
For example, as shown in fig. 4B, compared with the example shown in fig. 4A, the touch display control device 100 further includes a touch sensing unit 130 and a display driving unit 140. The touch sensing unit 130 is configured to receive a touch signal corresponding to a touch operation to obtain touch position information and to provide the touch position information to the system processing device, for example, to receive the touch signal corresponding to the touch operation in the current processing cycle. The display driving unit 140 is configured to generate a display driving signal for display using the third image data. In this embodiment, for example, the touch drawing buffer unit 110, the superimposing unit 120, the touch sensing unit 130, and the display driving unit 140 may each be implemented separately and communicatively connected to one another, or the touch display control device 100 may be implemented as a Touch and Display Driver Integration (TDDI) chip, thereby integrating the four units into the same hardware (e.g., an integrated circuit).
For example, as shown in fig. 4B, the touch-drawing buffer unit 110 may further include a touch-drawing part 111 and a drawing frame storage part 112. The touch drawing section 111 is configured to process the touch position information to generate second intermediate image data and write the second intermediate image data to the drawing frame storage section 112. The drawing frame storage section 112 is configured to update at least one frame buffer of the drawing frame storage section using the second intermediate image data to obtain current second image data.
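A minimal sketch, under assumed data structures, of how the touch drawing section 111 and the drawing frame storage section 112 described above could cooperate: touch positions become intermediate stroke pixels, which the frame store merges into a persistent buffer holding the current second image data. The class and function names are hypothetical.

```python
# Hypothetical model of touch-drawing buffer unit 110: a drawing part
# produces intermediate image data from touch positions, and a frame
# storage part accumulates it into a retained frame buffer.

class DrawingFrameStore:
    """Sketch of drawing frame storage section 112."""

    def __init__(self, width, height):
        # None marks pixels not yet drawn in the current second image data.
        self.buffer = [[None] * width for _ in range(height)]

    def update(self, intermediate):
        # Merge newly drawn pixels into the retained frame buffer.
        for (x, y), color in intermediate.items():
            self.buffer[y][x] = color
        return self.buffer


def draw_strokes(touch_positions, color=255):
    """Sketch of touch drawing section 111: positions -> stroke pixels."""
    return {pos: color for pos in touch_positions}


store = DrawingFrameStore(4, 2)
frame = store.update(draw_strokes([(0, 0), (1, 1)]))
```

Because the store retains its buffer between updates, strokes from earlier cycles persist until explicitly erased, which is consistent with the erase commands discussed later in the disclosure.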
Fig. 5 is a schematic block diagram of a touch display device according to at least one embodiment of the present disclosure.
For example, as shown in fig. 5, the touch display device 200 includes the touch display control device 100 shown in fig. 4A or fig. 4B. The touch display device 200 further includes a system processing device 210 and a touch display panel 220.
For example, the system processing device 210 includes a processor and software running on the processor, the software including an operating system, applications, and the like; the system processing device 210 may further include a memory and, as needed, a communication device and the like. Embodiments of the present disclosure limit neither the type and configuration of the processor nor the software system running on it. The system processing device 210 is configured to receive and process the touch position information to obtain first image data, for example, to receive and process the touch position information corresponding to a previous processing cycle to obtain the previous first image data.
The touch display panel 220 is configured to detect a touch operation to generate a touch signal and to receive the display driving signal generated by the touch display control device 100 to perform display, for example, to detect the touch operation in the current processing cycle. For example, the display mode of the touch display panel 220 may be liquid crystal display, electronic paper display, organic light-emitting display, quantum dot display, or the like; the touch mode may be resistive, capacitive, infrared, surface acoustic wave, or the like, and the capacitive type may further be self-capacitive or mutual-capacitive; the touch structure and the display structure may be combined in an add-on (On-Cell) manner or an embedded (In-Cell) manner. For example, for the add-on type, the touch display panel may be divided into a touch panel and a display panel, whereas for the embedded type the touch display panel may not be clearly divided into the two. Embodiments of the present disclosure do not limit the implementation manner of the touch display panel 220.
Fig. 6 is a schematic diagram of a specific example of the touch display device in fig. 5 and a work flow thereof.
For example, as shown in fig. 6, the touch display panel 220 includes a display panel 221 and a touch panel 222, for example, the touch panel 222 is stacked on the display panel 221. The touch panel 222 is configured to detect a touch operation in a processing cycle to generate a touch signal, for example, a touch operation in a current processing cycle to generate a touch signal. The display panel 221 is configured to receive a display driving signal generated by the touch display control device 100 for displaying.
For example, as shown in fig. 6, the system processing device 210 includes a touch drawing unit 211, a frame buffer management unit 212, and a display management unit 213. The touch drawing unit 211 is configured to receive and process the touch position information provided by the touch sensing unit 130 to generate first intermediate image data and write the first intermediate image data into the frame buffer management unit 212, for example, receive and process the touch position information corresponding to a previous processing cycle provided by the touch sensing unit 130 to generate first intermediate image data and write the first intermediate image data into the frame buffer management unit 212. The frame buffer management unit 212 is configured to update at least one frame buffer of the frame buffer management unit 212 with the first intermediate image data to obtain the previous first image data. The display management unit 213 is configured to output the previous first image data to the superimposing unit 120.
It should be noted that the at least one frame buffer of the frame buffer management unit 212 may include, for example, the drawing frame buffer 2121, the touch object frame buffer 2122, and the main frame buffer 2123 shown in fig. 6, and in other examples, a combination of other types of frame buffers may also be used, which is not limited in this regard by the embodiments of the present disclosure.
For example, as shown in fig. 6, the touch panel 222 detects a touch operation of a user in a current processing cycle (e.g., N +2 th frame) to generate a touch signal. The touch sensing unit 130 receives the touch signal to obtain touch position information, and provides the touch position information to the system processing device 210. The touch drawing section 111 processes the touch position information to generate second intermediate image data, and writes the second intermediate image data into the drawing frame storage section 112. The drawing frame storage section 112 updates at least one frame buffer of the drawing frame storage section 112 using the second intermediate image data to obtain the current second image data. The current second image data corresponds to a touch operation of the user in a current processing cycle (e.g., N +2 th frame).
For example, as shown in fig. 6, the touch panel 222 detects a touch operation of a user in a previous processing cycle (e.g., a previous nth frame) to generate a touch signal. The touch sensing unit 130 receives the touch signal to obtain touch position information, and provides the touch position information to the system processing device 210. The touch-drawing unit 211 receives and processes the touch position information to generate first intermediate image data, and writes the first intermediate image data into the frame buffer management unit 212. The frame buffer management unit 212 updates at least one frame buffer (e.g., the drawing frame buffer 2121, the touch object frame buffer 2122, and the main frame buffer 2123 shown in FIG. 6) of the frame buffer management unit 212 using the first intermediate image data to obtain the previous first image data. The display management unit 213 outputs the previous first image data to the superimposing unit 120. Thus, the previous first image data corresponds to a touch operation by the user in a previous processing cycle (e.g., nth frame).
For example, as shown in fig. 6, the superimposing unit 120 superimposes the current second image data with the previous first image data to obtain the third image data. The display driving unit 140 generates a display driving signal for display using the third image data. The display panel 221 receives the display driving signal to perform display.
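The data flow just described, from sensing through superimposition to display driving, can be condensed into one hypothetical processing cycle; the sparse-dict image representation and the collapsing of unit boundaries into single statements are assumptions made for illustration only.

```python
# Sketch of one current processing cycle of the device in fig. 6.
# Images are modeled as {(x, y): color} dicts for brevity.

def current_cycle(touch_xy, draw_store, previous_first_image):
    # Touch sensing unit 130: touch signal -> touch position information.
    position_info = touch_xy
    # Touch drawing section 111 + drawing frame storage section 112:
    # position information -> current second image data (stroke pixels).
    draw_store[position_info] = 255
    # Superimposing unit 120: current second image data over the
    # previous first image data -> third image data.
    third = dict(previous_first_image)
    third.update(draw_store)
    return third  # handed to display driving unit 140

store = {}                                       # retained stroke buffer
third = current_cycle((2, 3), store, {(0, 0): 10})
```

The key point the sketch captures is that the third image data combines strokes from the current cycle with a system-rendered frame from an earlier cycle.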
As in the example shown in fig. 6, the touch drawing section 111 may be a copy of the touch drawing unit 211, i.e., the two have the same function and may also be implemented in the same manner; likewise, the drawing frame storage section 112 may be a copy of the frame buffer management unit 212. In the same processing cycle, because the touch drawing buffer unit 110 generates the second image data without the delay caused by multi-layer data processing in the system processing device, the second image data, which covers only the touch operation, is ahead of the first image data, which covers the entire display image (e.g., both the image portion of the strokes caused by the touch operation and the image portion of the user interface of the application program). The third image data generated by the superimposing unit is therefore also ahead of the first image data, thereby shortening the delay time from touch to display.
For example, as shown in fig. 6, a drawing application 214 also runs in the system processing device 210; the drawing application 214 allows a user to write or draw on the user interface through touch operations with a drawing pen (stylus). The system processing device 210 is further configured to issue a command 1 according to the drawing application 214; for example, command 1 is a drawing control command or an erase control command. For example, when command 1 is a drawing control command, the system processing device 210 controls the touch drawing buffer unit 110 to obtain the current second image data. When command 1 is an erase control command, the system processing device 210 may control the touch drawing buffer unit 110 to erase the portion of the previous second image data that is equivalent to the previous first image data, together with the drawing result corresponding to that previous second image data, so as to avoid repeated or overlapping display; alternatively, it may control the touch drawing buffer unit 110 to erase all previously stored second image data related to the touch operation and clear the unit, so that no superimposed display is performed in the superimposing unit 120; alternatively, other previous image data portions related to the touch operation buffered in the system processing device 210 may be further erased according to the content to be erased.
For example, the command interface of the drawing application 214 for command 1 needs to select drawing pens of the same type, color, and thickness, so that the drawing results corresponding to the first image data and the second image data are identical apart from their respective processing cycles. Likewise, the size of the image plane and the color palette need to match between the first image data and the second image data.
For example, since the current second image data is identical to the previous first image data except for being a predetermined number of frames ahead of it, when the received command 1 is an erase command, the portion of the second image data that overlaps with the first image data may be partially or entirely erased. The predetermined number of frames is generally small (for example, 2 or 3 frames) and thus has little influence on the visual effect perceived by the user.
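The erase step can be sketched as follows: once the system processing device has produced first image data covering a given frame, the locally buffered stroke data for that frame is redundant and can be dropped. Tagging buffered strokes by frame number in a dict is an illustrative assumption, not the patent's data layout.

```python
# Hedged sketch of the erase control command: drop buffered second image
# data for frames the system processing device has already rendered into
# first image data, keeping only strokes the system has not yet caught up to.

def erase_caught_up_frames(stroke_buffer, system_frame):
    """Remove stroke frames at or before the system's latest frame."""
    return {frame: pixels
            for frame, pixels in stroke_buffer.items()
            if frame > system_frame}

buffer = {10: {"px": [(0, 0)]}, 12: {"px": [(1, 1)]}}
remaining = erase_caught_up_frames(buffer, system_frame=10)
```

This avoids the repeated or overlapping display mentioned above: each stroke appears either in the locally drawn foreground or in the system-rendered background, never in both.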
For example, as shown in fig. 6, the system processing device 210 is further configured to issue a command 2 according to the drawing application 214, where the command 2 is a superimposition control command. The system processing device 210 controls the superimposition unit 120 to start or stop a function of superimposing the current second image data with the previous first image data to obtain the third image data by command 2.
Fig. 7 is a schematic diagram illustrating a delay time from touch to display of a touch display device according to at least one embodiment of the present disclosure.
For example, as shown in fig. 7, taking the touch display device shown in fig. 6 as an example, at the Nth frame time, the touch panel 222 detects the touch operation (N) of the user in the Nth frame to generate the touch signal (N). At the (N+1)th frame time, the touch sensing unit 130 reports the touch position information (N), obtained by receiving and processing the touch signal (N), to the system processing device 210; meanwhile, the touch drawing part 111 processes the touch position information (N) to generate the second intermediate image data (N) and writes it into the drawing frame storage part 112. At the (N+2)th frame time, the drawing frame storage part 112 updates at least one frame buffer of the drawing frame storage part 112 using the second intermediate image data (N) to obtain the second image data (N). Since the corresponding first image data (N) has not yet been generated by the system processing device 210 at this time, the superimposing unit 120 directly superimposes the second image data (N) with the previous first image data (not shown in the figure; here the first image data (N-2), i.e., the first image corresponding to the touch operation of the (N-2)th frame) to obtain the third image data (N), which is sent to the display driving unit 140 to drive the display panel 221 for display.
For example, as shown in fig. 7, at the (N+2)th frame time, the touch panel 222 detects the touch operation (N+2) of the user to generate the touch signal (N+2). At the (N+3)th frame time, the touch sensing unit 130 reports the touch position information (N+2), obtained by receiving and processing the touch signal (N+2), to the system processing device 210; meanwhile, the touch drawing part 111 processes the touch position information (N+2) to generate the second intermediate image data (N+2) and writes it into the drawing frame storage part 112. At the (N+4)th frame time, the drawing frame storage part 112 updates at least one frame buffer using the second intermediate image data (N+2) to obtain the second image data (N+2). At this time, having reached the (N+4)th frame since receiving the touch position information corresponding to the touch operation (N), the system processing device 210 has generated, for example through multi-layer processing, the first image data (N), which includes an image portion corresponding to the touch operation of the Nth frame (equivalent to the second image data (N)) and the image portions that the other parts of the application need to display. The superimposing unit 120 receives the first image data (N) from the system processing device 210 and superimposes the second image data (N+2) with the first image data (N) to obtain the third image data (N+2), which is sent to the display driving unit 140 to drive the display panel 221 for display.
Since the system processing device 210 has generated the first image data (N) at this time, it issues an erase command to erase the second image data (N) (i.e., the portion equivalent to the first image data (N)) and the corresponding drawing result, thereby avoiding repeated or superimposed display, and the drawing result (N+2) corresponding to the third image data (N+2) is obtained on the display panel 221. Similarly, the display driving unit 140 then receives the sequentially generated third image data (N+3), third image data (N+4), and so on to perform continuous display driving, thereby obtaining the corresponding drawing results (N+3), (N+4), and so on at the display panel 221.
For example, as shown in fig. 7, in the touch display device such as shown in fig. 6, the touch display control device 100 generates the second image data with a delay time of 2 frames less than that of the system processing device 210. It should be noted that, in other examples, the reduced delay time for generating the second image data compared to generating the first image data may also be 1 frame or more than 2 frames, which is not limited by the embodiments of the present disclosure.
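The delay comparison above can be stated as a trivial timing model. The 2-frame local path (sense at N+1, buffer at N+2) and the 4-frame system path (first image data (N) ready at N+4) are taken from the example in fig. 7; as noted, the disclosure does not fix these counts.

```python
# Timing model for the fig. 7 example: the local touch-drawing path makes
# a stroke displayable 2 frames after the touch, while the system path
# needs 4 frames, giving the 2-frame reduction described in the text.

LOCAL_PIPELINE_FRAMES = 2    # sensing (N+1) + local drawing/buffering (N+2)
SYSTEM_PIPELINE_FRAMES = 4   # multi-layer OS processing, ready at N+4

def display_frame(touch_frame, path):
    """Frame at which a touch made in touch_frame becomes displayable."""
    delay = (LOCAL_PIPELINE_FRAMES if path == "local"
             else SYSTEM_PIPELINE_FRAMES)
    return touch_frame + delay

lead = display_frame(0, "system") - display_frame(0, "local")
```

The model makes explicit that the lead is a constant offset per touch: every stroke appears on screen the same number of frames earlier, regardless of when it was made.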
Fig. 8A is a schematic diagram of an N +4 th frame drawing result of a touch display device according to at least one embodiment of the present disclosure; fig. 8B is a schematic diagram of an N +6 th frame drawing result of a touch display device according to at least one embodiment of the present disclosure. Fig. 8A and 8B illustrate an application in which a curve is drawn on a whiteboard by a stylus (drawing pen) as an example.
For example, as shown in fig. 8A, take the touch display device shown in fig. 6 as an example, with the delay times shown in fig. 7. At the (N+4)th frame time, the system processing device 210 generates the first image data (N), whose corresponding drawing result is the first drawing result (N); the touch drawing buffer unit 110 obtains the second image data (N+2), whose corresponding drawing result is the second drawing result (N+2). Here, the first drawing result (N) includes the second drawing result (N) together with the lines drawn before the Nth frame; that is, the first drawing result here omits image portions other than the strokes caused by the touch operation.
For example, as shown in fig. 8A, the second drawing result (N) is the drawing result corresponding to the second image data (N) obtained at the Nth frame time, and the second drawing result (N+2) is the drawing result corresponding to the second image data (N+2) obtained at the (N+2)th frame time, i.e., the portion of the (N+2)th frame, displayed at the (N+4)th frame time, that is updated relative to the drawing result of the Nth frame. At the (N+4)th frame time, for example, the system processing device 210 may issue an erase command to erase the previously displayed second drawing result (N) so as to avoid repeated or overlapping display, and the content displayed on the display panel 221 is then the first drawing result (N) plus the second drawing result (N+2); alternatively, the system processing device 210 does not issue the erase command, and the content displayed on the display panel 221 is the first drawing result (N), the second drawing result (N), and the second drawing result (N+2), provided this does not affect the display effect. For example, as shown in fig. 8A, the drawing result corresponding to the second image data (N+2) is 2 frames ahead of the drawing result corresponding to the first image data (N).
For example, as shown in fig. 8B, again taking the touch display device shown in fig. 6 with the delay times shown in fig. 7, at the (N+6)th frame time, the system processing device 210 generates the first image data (N+2), whose corresponding drawing result includes the previous first drawing result (N) and the corresponding part of the second drawing result (N+2); the touch drawing buffer unit 110 obtains the second image data (N+4), whose corresponding drawing result is the second drawing result (N+4).
For example, as shown in fig. 8B, the first drawing result (N+2) is the drawing result corresponding to the first image data (N+2) generated at the (N+4)th frame time; it includes the entire image up to the (N+2)th frame and is thus the combination of the previous first drawing result (N) and the second drawing result (N+2). The second drawing result (N+4) is the portion of the (N+4)th frame, displayed at the (N+6)th frame time, that is updated relative to the drawing result of the (N+2)th frame. At the (N+6)th frame time, for example, the system processing device 210 may issue an erase command to erase the previously displayed second drawing result (N+2) so as to avoid repeated or overlapping display, and the content displayed on the display panel 221 is then the first drawing result (N+2) plus the second drawing result (N+4); alternatively, the system processing device 210 does not issue the erase command, and the content displayed is the first drawing result (N+2), the second drawing result (N+2), and the second drawing result (N+4), provided this does not affect the display effect. For example, as shown in fig. 8B, the drawing result corresponding to the second image data (N+4) is likewise 2 frames ahead of the drawing result corresponding to the first image data (N+2).
In the above description, the schematic diagram at the (N+5)th frame time is omitted. Similarly, at the (N+5)th frame time, the system processing device 210 may, for example, issue an erase command to erase the previously displayed second drawing result (N+1) so as to avoid repeated or overlapping display, and the content displayed on the display panel 221 is then the first drawing result (N+1) plus the second drawing result (N+3); alternatively, without the erase command, the content displayed is the first drawing result (N+1), the second drawing result (N+1), and the second drawing result (N+3), provided this does not affect the display effect. The drawing result corresponding to the second image data (N+3) is likewise 2 frames ahead of the drawing result corresponding to the first image data (N+1).
In at least one embodiment of the present disclosure, within the same processing cycle, because the touch display control device generates the second image data without the delay caused by multi-layer data processing in the operating system, the second image data is ahead of the first image data; consequently, the third image data generated by the superimposing unit is also ahead of the first image data, thereby shortening the delay time from touch to display.
Fig. 9 is a schematic block diagram of an electronic device according to at least one embodiment of the present disclosure.
For example, as shown in fig. 9, the electronic device 10 includes a touch display device 200, and the touch display device 200 is a touch display device provided in any embodiment of the disclosure, such as the touch display device 200 shown in fig. 5.
For example, the electronic device 10 may be a DDR digital system, or any device such as a mobile phone, a tablet computer, a notebook computer, an electronic book, a game machine, a television, a digital photo frame, or a navigator, or any combination of such electronic devices and hardware; embodiments of the present disclosure are not limited in this respect.
It should be noted that, for clarity and conciseness, not all constituent elements of the electronic device 10 are shown in the embodiments of the present disclosure. To implement the necessary functions of the electronic device, those skilled in the art may provide other components not shown according to specific needs, for example, a communication unit (e.g., a network communication unit) or input and output units (e.g., a keyboard, a speaker, etc.); embodiments of the present disclosure are not limited in this respect. For the related description and technical effects of the electronic device 10, reference may be made to those of the touch display control device provided in the embodiments of the present disclosure, which are not repeated herein.
Fig. 10 is a schematic block diagram of another electronic device provided in at least one embodiment of the present disclosure.
For example, as shown in fig. 10, the electronic device 400 includes, for example, the touch display device provided in the embodiment of the present disclosure. The electronic device 400 may be a terminal device or a server or the like. It should be noted that the electronic device 400 shown in fig. 10 is only one example, and does not bring any limitation to the functions and the scope of use of the embodiments of the present disclosure.
For example, as shown in fig. 10, the electronic device 400 may include a processing device (e.g., a central processing unit, a graphics processor, etc.) 41, which includes, for example, a touch display device according to any embodiment of the present disclosure, and which may perform various appropriate actions and processes according to a program stored in a Read-Only Memory (ROM) 42 or a program loaded from a storage device 48 into a Random Access Memory (RAM) 43. The RAM 43 also stores various programs and data necessary for the operation of the electronic device 400. The processing device 41, the ROM 42, and the RAM 43 are connected to one another via a bus 44, to which an input/output (I/O) interface 45 is also connected. Generally, the following devices may be connected to the I/O interface 45: input devices 46 including, for example, a touch screen, touch pad, keyboard, mouse, camera, microphone, accelerometer, gyroscope, etc.; output devices 47 including, for example, a Liquid Crystal Display (LCD), a speaker, a vibrator, etc.; storage devices 48 including, for example, magnetic tape, hard disk, etc.; and a communication device 49. The communication device 49 may allow the electronic device 400 to communicate wirelessly or by wire with other electronic devices to exchange data.
While fig. 10 illustrates an electronic device 400 having various devices, it is to be understood that not all illustrated devices are required to be implemented or provided; the electronic device 400 may alternatively implement or be provided with more or fewer devices.
For a detailed description and the technical effects of the electronic devices 10/400, reference may be made to the foregoing description; further description is omitted here.
Fig. 11 is a flowchart illustrating a touch display control method according to at least one embodiment of the present disclosure; for example, the touch display control method corresponds to the touch display control device described above.
For example, as shown in fig. 11, in at least one embodiment of the present disclosure, the touch display control method includes the following steps S110 to S140.
Step S110: receiving a touch signal corresponding to touch operation in a current processing period to obtain touch position information and providing the touch position information to a system processing device;
step S120: processing the touch position information to obtain second image data corresponding to the touch operation in the current processing period;
step S130: receiving first image data corresponding to a touch operation in a previous processing period from a system processing device;
step S140: the current second image data is superimposed with the previous first image data to obtain third image data.
For example, in the embodiment shown in fig. 6, for step S110, a touch signal corresponding to a touch operation in the current processing cycle (e.g., N +2 th frame) may be received by the touch sensing unit 130 to obtain touch position information, and the touch position information is provided to the system processing device 210. For step S120, the touch position information may be processed by the touch-drawing buffer unit 110 to obtain second image data corresponding to the touch operation in the current processing cycle (e.g., N +2 th frame). For step S130, the first image data corresponding to the touch operation in the previous processing cycle (e.g., nth frame) may be received from the system processing device 210 through the superimposing unit 120. For step S140, the current second image data may be superimposed with the previous first image data by the superimposing unit 120 to obtain third image data.
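The steps S110 through S140 can be exercised over successive processing cycles to show that each displayed third image combines the current second image data with first image data from an earlier cycle. The 2-cycle system lag is an assumption taken from the fig. 7 example, and the per-cycle dict bookkeeping is illustrative only.

```python
# Sketch of steps S110-S140 run over consecutive processing cycles.
SYSTEM_LAG = 2  # cycles the system processing device trails the local path

def run_cycles(touch_points):
    second, first, shown = {}, {}, []
    for cycle, point in enumerate(touch_points):
        second[cycle] = point                     # S110 + S120: local strokes
        lagged = cycle - SYSTEM_LAG
        if lagged >= 0 and lagged in second:
            first[lagged] = touch_points[lagged]  # S130: system catches up
            del second[lagged]                    # erase duplicated portion
        shown.append({**first, **second})         # S140: superimpose
    return shown

frames = run_cycles(["a", "b", "c"])
```

In every cycle, the union of the two buffers covers all strokes made so far, so the user sees each stroke immediately even though the system-rendered frame lags behind.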
For example, in at least one embodiment of the present disclosure, the touch display control method further includes receiving and processing, by the system processing device, touch position information corresponding to a previous processing cycle to obtain previous first image data. For example, in the embodiment shown in fig. 6, the touch position information corresponding to the previous processing cycle (e.g., nth frame) is received and processed by the touch drawing unit 211 of the system processing device 210 to generate the first intermediate image data, and the first intermediate image data is written into the frame buffer management unit 212. The frame buffer management unit 212 updates at least one frame buffer of the frame buffer management unit 212 using the first intermediate image data to obtain the previous first image data.
Fig. 12 is an exemplary flowchart of an example of step S120 in fig. 11.
For example, as shown in fig. 12, in one example of step S120 in fig. 11, processing the touch position information to obtain second image data corresponding to the touch operation in the current processing cycle includes the following steps S121 to S122.
Step S121: processing the touch position information to generate second intermediate image data and writing the second intermediate image data to the drawing frame storage section;
step S122: at least one frame buffer of the drawing frame storage section is updated with the second intermediate image data to obtain current second image data.
For example, in the embodiment as shown in fig. 6, for step S121, the touch position information may be processed by the touch drawing part 111 to generate second intermediate image data, and the second intermediate image data may be written in the drawing frame storage part 112. For step S122, at least one frame buffer of the drawing frame storage section 112 may be updated by the drawing frame storage section 112 using the second intermediate image data to obtain the current second image data.
In at least one embodiment of the present disclosure, in the same processing cycle, since the second image data is generated by the touch display control device without being subjected to a delay time caused by multi-layer data processing in the system processing device, the second image data is ahead of the first image data, and thus the third image data generated by superimposing by the superimposing unit is also ahead of the first image data, thereby shortening a delay time from touch to display.
For the present disclosure, the following points should be noted:
(1) In the drawings of the embodiments of the present disclosure, only the structures related to the embodiments are shown; other structures may follow general designs.
(2) Features of the same embodiment and of different embodiments of the present disclosure may be combined with each other where no conflict arises.
The above are only specific embodiments of the present disclosure, but the scope of the present disclosure is not limited thereto. Any changes or substitutions that a person skilled in the art could readily conceive within the technical scope of the present disclosure shall be covered by the scope of the present disclosure. Therefore, the protection scope of the present disclosure shall be subject to the protection scope of the claims.

Claims (15)

1. A touch display control device, comprising:
a touch drawing buffer unit configured to process touch position information to obtain current second image data corresponding to a touch operation in a current processing cycle; and
a superimposing unit configured to receive, from a system processing device, previous first image data corresponding to a touch operation in a previous processing cycle, and to superimpose the current second image data with the previous first image data to obtain third image data.
2. The touch display control device according to claim 1, wherein the touch drawing buffer unit includes a touch drawing section and a drawing frame storage section,
the touch drawing section is configured to process the touch position information to generate second intermediate image data and write the second intermediate image data into the drawing frame storage section,
the drawing frame storage section is configured to update at least one frame buffer of the drawing frame storage section with the second intermediate image data to obtain the current second image data.
3. The touch display control device of claim 1, further comprising:
a touch sensing unit configured to receive a touch signal corresponding to a touch operation in the current processing cycle to obtain the touch position information, and to provide the touch position information to the system processing device; and
a display driving unit configured to generate a display driving signal for display using the third image data.
4. A touch display device, comprising:
a touch display control device according to any one of claims 1-3.
5. The touch display device of claim 4, further comprising:
a touch display panel configured to detect a touch operation in the current processing cycle to generate a touch signal, and to receive a display driving signal generated by the touch display control device for display; and
a system processing device configured to receive and process the touch position information corresponding to the previous processing cycle to obtain the previous first image data.
6. The touch display device according to claim 5, wherein the touch display panel comprises a touch panel and a display panel,
the touch panel is configured to detect a touch operation in the current processing cycle to generate the touch signal,
the display panel is configured to receive the display driving signal generated by the touch display control device for display.
7. The touch display device of claim 5, wherein the system processing device is further configured to issue a drawing control command or an erasing control command according to a drawing application program to cause the touch drawing buffer unit to obtain the current second image data or erase the previous second image data.
8. The touch display device of claim 7, wherein the system processing device is further configured to issue the erase control command according to the drawing application program to cause the touch drawing buffer unit to erase a portion of the previous second image data equivalent to the previous first image data and a drawing result corresponding to the previous second image data.
9. The touch display device of claim 5, wherein the system processing device is further configured to issue, according to the drawing application program, a superimposition control command to cause the superimposing unit to start or stop the function of superimposing the current second image data with the previous first image data to obtain the third image data.
10. The touch display device of claim 5, wherein the system processing device comprises a touch drawing unit, a frame buffer management unit, and a display management unit,
the touch drawing unit is configured to receive and process the touch position information corresponding to the previous processing cycle to generate first intermediate image data and write the first intermediate image data into the frame buffer management unit,
the frame buffer management unit is configured to update at least one frame buffer of the frame buffer management unit with the first intermediate image data to obtain the previous first image data,
the display management unit is configured to output the previous first image data to the superimposing unit.
11. An electronic device comprising a touch display device according to any one of claims 4-10.
12. A touch display control method, comprising:
processing touch position information to obtain current second image data corresponding to a touch operation in a current processing cycle;
receiving, from a system processing device, previous first image data corresponding to a touch operation in a previous processing cycle; and
superimposing the current second image data with the previous first image data to obtain third image data, wherein the third image data is used for generating a display driving signal.
13. The touch display control method of claim 12, further comprising:
receiving a touch signal corresponding to the touch operation in the current processing period to obtain the touch position information and providing the touch position information to the system processing device.
14. The touch display control method according to claim 12, wherein the processing the touch position information to obtain current second image data corresponding to the touch operation in a current processing cycle comprises:
processing the touch position information to generate second intermediate image data and writing the second intermediate image data to a drawing frame storage section;
updating at least one frame buffer of the drawing frame storage section using the second intermediate image data to obtain the current second image data.
15. The touch display control method of claim 12, further comprising:
receiving and processing, by the system processing device, the touch position information corresponding to the previous processing cycle to obtain the previous first image data.
CN202210524178.0A 2022-05-13 2022-05-13 Touch display control device, touch display device, equipment and method Pending CN115016671A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210524178.0A CN115016671A (en) 2022-05-13 2022-05-13 Touch display control device, touch display device, equipment and method


Publications (1)

Publication Number Publication Date
CN115016671A true CN115016671A (en) 2022-09-06

Family

ID=83069361

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210524178.0A Pending CN115016671A (en) 2022-05-13 2022-05-13 Touch display control device, touch display device, equipment and method

Country Status (1)

Country Link
CN (1) CN115016671A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI823743B (en) * 2023-01-06 2023-11-21 大陸商北京集創北方科技股份有限公司 Touch driving method, TDDI chip and information processing device



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: 101102 Room 101, 1/F, Building 3, No. 18 Courtyard, Kechuang 10th Street, Beijing Economic and Technological Development Zone, Daxing District, Beijing

Applicant after: Beijing yisiwei Computing Technology Co.,Ltd.

Address before: Room 101, 1st Floor, Building 3, Yard 18, Kechuang 14th Street, Beijing Economic and Technological Development Zone, Daxing District, Beijing 101102

Applicant before: Beijing yisiwei Computing Technology Co.,Ltd.
