WO2022242011A1 - Method and apparatus for presenting handwriting, interactive tablet, and storage medium - Google Patents

Method and apparatus for presenting handwriting, interactive tablet, and storage medium

Info

Publication number
WO2022242011A1
Authority
WO
WIPO (PCT)
Prior art keywords
touch
touch point
information
point
handwriting
Prior art date
Application number
PCT/CN2021/121986
Other languages
English (en)
Chinese (zh)
Inventor
林德熙
吕毅
李少珺
Original Assignee
广州视源电子科技股份有限公司
广州视睿电子科技有限公司
Priority date
Filing date
Publication date
Application filed by 广州视源电子科技股份有限公司 and 广州视睿电子科技有限公司
Publication of WO2022242011A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/0416 Control or interface arrangements specially adapted for digitisers
    • G06F 3/04162 Control or interface arrangements specially adapted for digitisers for exchanging data with external devices, e.g. smart pens, via the digitiser sensing hardware
    • G06F 3/042 Digitisers characterised by opto-electronic transducing means
    • G06F 3/0421 Digitisers using opto-electronic means by interrupting or reflecting a light beam, e.g. optical touch-screen
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures

Definitions

  • the present application relates to the technical field of touch writing of electronic equipment, and in particular to a method and device for presenting handwriting, an interactive tablet and a storage medium.
  • the touch frame is an important hardware component of the interactive tablet, and is mainly used to collect touch information of touch signals generated by the user after a touch operation on the interactive tablet.
  • Most of the touch frames used in interactive panels on the market are not high enough in touch accuracy, i.e. they are non-high-precision touch frames whose accuracy falls below a certain accuracy range.
  • the defects of such non-high-precision touch frames are mainly the following: it is difficult to ensure that the touch area generated by the same writing pen remains the same during writing; it is difficult to determine whether the touch medium is a writing pen, a finger or an eraser; and it is also difficult to determine the touch rotation angle.
  • As a result, the software level of the interactive tablet cannot make full use of the touch information fed back by the touch frame, so touch-related performance on the interactive tablet (such as the presentation of handwriting when writing) is not significantly improved.
  • the embodiment of the present application provides a handwriting display method, device, interactive tablet and storage medium, which improves the display effect of handwriting on the interactive tablet.
  • the embodiment of the present application provides a method for presenting handwriting, which is applied to an interactive tablet, and the touch accuracy of the touch frame equipped on the interactive tablet reaches a set accuracy range, and the method includes:
  • displaying a writing interface through the display screen; when a touch object, which is manipulated by the user, touches the surface of the display screen and moves, obtaining the touch point information fed back through the touch frame;
  • by analyzing each piece of touch point information, presenting on the writing interface handwriting that matches the movement state of the touch object.
  • the moving state is reflected by the magnitude of the pressure sensitivity of the touch object acting on the display screen and the moving direction.
  • the obtaining the touch point information fed back through the touch frame includes:
  • one touch point information corresponds to one touch point
  • the touch point information includes: touch point coordinates and touch point pressure sensitivity.
  • Each touch point information is processed so that each touch point information has a unified unit format and data structure.
  • processing of each touch point information includes:
  • the unit of each data information in the touch point information is converted into a unified set unit format
  • the data structure corresponding to the set unit format is used to record the touch point information.
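  • Purely as an illustration (not part of the patent text), the following Python sketch shows one way such a normalization step could be structured; the raw field names, the sensor-to-pixel conversion factor and the pressure range are all assumptions.

```python
from dataclasses import dataclass

# Assumed conversion factor: how many display pixels one blocked optical
# sensor on the touch frame corresponds to (device-specific in practice).
PIXELS_PER_SENSOR = 8.0

@dataclass
class TouchPoint:
    """Unified, software-level record of one touch point."""
    x_px: float        # touch point coordinates, in pixels
    y_px: float
    width_px: float    # touch area width, in pixels
    height_px: float   # touch area height, in pixels
    pressure: float    # touch point pressure sensitivity, normalized to [0, 1]
    timestamp_ms: int  # generation time of the touch point

def normalize(raw: dict) -> TouchPoint:
    """Convert a raw, frame-specific report (hypothetical keys) into the
    unified unit format and data structure described above."""
    return TouchPoint(
        x_px=raw["x_sensor"] * PIXELS_PER_SENSOR,
        y_px=raw["y_sensor"] * PIXELS_PER_SENSOR,
        width_px=raw["blocked_w"] * PIXELS_PER_SENSOR,
        height_px=raw["blocked_h"] * PIXELS_PER_SENSOR,
        pressure=min(max(raw["pressure_raw"] / 1023.0, 0.0), 1.0),
        timestamp_ms=raw["t_ms"],
    )
```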
  • the handwriting matching the movement state of the touch object is presented on the writing interface, including:
  • the outline of the handwriting is filled and presented on the writing interface.
  • the determination of the movement state information of the touch points generated by the touch object during the movement process by analyzing each piece of touch point information includes:
  • correcting the reference margin values of the touch point in the four directions of up, down, left and right to obtain corrected second margin values;
  • Each of the second margin values is recorded as the moving state information of the touch point.
  • the touch point information also includes: the generation time of the touch point and the moving direction of the touch point;
  • determining the second thickness scaling factor corresponding to each of the touch points in the four directions of up, down, left, and right includes:
  • For each touch point, obtain the generation time of the touch point from the corresponding touch point information, and search for each target touch point included in a set time period before that generation time;
  • based on the touch point coordinates and moving directions of the touch point and of each target touch point, determine the touch offset information corresponding to the touch point and the target distance values of the touch point in the four directions of up, down, left and right;
  • based on each target distance value and the touch pressure sensitivity of the touch point, determine the second thickness scaling factors of the touch point in the four directions of up, down, left and right respectively.
  • Determining, based on the touch point coordinates and moving directions of the touch point and of each target touch point, the touch offset value corresponding to the touch point and the target distance values of the touch point in the four directions of up, down, left and right includes:
  • the positive and negative values of the coordinate difference are determined based on the set horizontal and vertical positive directions.
  • Determining, based on each target distance value and the touch pressure sensitivity of the touch point, the second thickness scaling factors of the touch point in the four directions of up, down, left and right respectively includes:
  • the product of the scaling factor to be corrected and the touch pressure sensitivity is used as the second thickness scaling factor in that direction.
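  • As an illustrative sketch only: the text does not specify how a target distance value is mapped to the "scaling factor to be corrected", so distance_to_factor() below is a placeholder assumption; the set time period and the sign convention for the positive directions are also assumed. The overall structure (look back over a time window, derive signed offsets, obtain one factor per direction, multiply by the touch pressure sensitivity) follows the steps above.

```python
TIME_WINDOW_MS = 50  # assumed "set time period"

def distance_to_factor(d: float) -> float:
    # Placeholder mapping: a larger offset in a direction yields a larger
    # scaling factor to be corrected, capped at 1.0.
    return min(1.0, 0.2 + d / 100.0)

def second_thickness_factors(point, history):
    """point and the entries of history are TouchPoint-like objects with
    x_px, y_px, pressure and timestamp_ms fields (see the earlier sketch)."""
    targets = [p for p in history
               if 0 < point.timestamp_ms - p.timestamp_ms <= TIME_WINDOW_MS]
    up = down = left = right = 0.0
    for t in targets:
        dx = point.x_px - t.x_px  # sign follows the assumed positive x (rightwards)
        dy = point.y_px - t.y_px  # sign follows the assumed positive y (downwards)
        if dx > 0: right = max(right, dx)
        elif dx < 0: left = max(left, -dx)
        if dy > 0: down = max(down, dy)
        elif dy < 0: up = max(up, -dy)
    # Second thickness scaling factor = scaling factor to be corrected * pressure.
    return {name: distance_to_factor(dist) * point.pressure
            for name, dist in (("up", up), ("down", down),
                               ("left", left), ("right", right))}
```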
  • forming a handwriting outline matching the moving state of the touch object includes:
  • based on each current margin value combined with the touch point coordinates of the touch point, determining the stroke points corresponding to the touch point in the four directions of up, down, left and right;
  • the first setting rule is that the area of the formed closed area is the largest.
  • forming a handwriting outline matching the moving state of the touch object includes:
  • based on each current margin value combined with the touch point coordinates of the touch point, determining the asymmetric elliptical area corresponding to the touch point, and determining the tangent points of the asymmetric elliptical area;
  • the asymmetrical ellipse regions are connected along the tangent direction through corresponding tangent points to form a second handwriting outline with closed regions.
  • the determining the asymmetric elliptical area corresponding to the touch point according to each of the margin values combined with the touch point coordinates of the touch point includes:
  • the extracting the effective area from the ellipse includes:
  • An area within the quadrant interval in the ellipse is determined as an effective area.
  • a handwriting outline matching the moving state of the touch object is formed, including:
  • a third handwriting outline is formed based on the first stroke points and the second stroke points corresponding to the touch points.
  • constituting a third handwriting outline based on the first stroke point and the second stroke point corresponding to each of the touch points includes:
  • Each of the circumscribed octagons forms an approximate tangent line through the corresponding approximate tangent point, and connects them along the approximate tangent direction respectively to form a third handwriting outline with a closed area.
  • the embodiment of the present application provides a device for presenting handwriting, which is configured on an interactive tablet, and the touch accuracy of the touch frame equipped on the interactive tablet reaches the set accuracy range, and the device includes:
  • a display module configured to display a writing interface through a display screen
  • An acquisition module configured to acquire touch point information fed back through the touch frame when a touch object touches the surface of the display screen and moves, and the touch object is manipulated by the user;
  • the presentation module is configured to present, on the writing interface, handwriting matching the movement state of the touch object by analyzing the information of each touch point.
  • an interactive tablet including:
  • a touch frame, whose touch accuracy reaches the set accuracy range, configured to collect the touch point information generated when a touch object touches the screen;
  • a display screen, combined with the touch frame to form a touch screen, configured to display interactive content;
  • one or more processors; and a memory configured to store one or more programs;
  • When the one or more programs are executed by the one or more processors, the one or more processors implement the method provided in the first aspect of the present application.
  • the embodiment of the present application further provides a storage medium containing computer-executable instructions, and the computer-executable instructions are used to execute the method as described in the first aspect when executed by a computer processor.
  • In the handwriting presentation method, device, interactive tablet and storage medium provided above, the proposed method can be executed by an interactive tablet, and the touch accuracy of the touch frame equipped on the interactive tablet reaches the set accuracy range; the method can first display the writing interface through the display screen; then, when the touch object touches the surface of the display screen and moves, the touch point information fed back by the touch frame can be obtained; finally, by analyzing each piece of touch point information, handwriting matching the moving state of the touch object can be presented on the writing interface.
  • the use of the configured high-precision touch frame can be optimized at the software application level by the method provided in this embodiment.
  • the method provided in this embodiment ensures that the handwriting presented on the writing interface better matches the movement state of the touch object moved by the user during writing, thereby better presenting handwriting with the user's writing style and improving the handwriting rendering effect on the interactive tablet.
  • FIG. 1 shows a schematic flow chart of a method for presenting handwriting provided in Embodiment 1 of the present application
  • Fig. 1a is a diagram showing the effect of a touch frame responding to a touch object in a method for presenting handwriting provided in Embodiment 1 of the present application;
  • Fig. 1b shows an effect display diagram of a conventional rendering in the handwriting rendering method in the related art
  • Figure 1c shows the effect display diagram of the handwriting rendering method provided by the first embodiment of the present application
  • FIG. 2 shows a schematic flowchart of a method for presenting handwriting provided in Embodiment 2 of the present application
  • Fig. 2a shows one implementation flow chart of determining the moving state information in the handwriting presentation method provided in Embodiment 2 of the present application
  • Figure 2b shows an effect display diagram of the mobile state information determined in Embodiment 2 of the present application
  • Fig. 2c shows one implementation flow chart of handwriting contour determination in the handwriting rendering method provided in Embodiment 2 of the present application;
  • Figure 2d shows the effect of handwriting outline determination in the handwriting rendering method provided by the second embodiment of the present application
  • Fig. 2e shows another implementation flow chart of determining the moving state information in the handwriting presentation method provided in the second embodiment of the present application
  • Fig. 2f shows another implementation flow chart of handwriting outline determination in the handwriting rendering method provided in the second embodiment of the present application
  • Figure 2g shows the effect display diagram of the asymmetric ellipse area determined in the second embodiment of the present application
  • Fig. 2h shows the effect display diagram of the second handwriting outline determined in the second embodiment of the present application
  • Fig. 2i shows another implementation flowchart of handwriting contour determination in the handwriting rendering method provided in Embodiment 2 of the present application;
  • Fig. 2j shows the effect diagram of the third handwriting outline determined in the second embodiment of the present application
  • Figure 2k shows a handwriting display diagram with special effects of the user's writing style presented by the method provided in Embodiment 2 of the present application;
  • FIG. 3 is a structural block diagram of a handwriting display device provided in Embodiment 3 of the present application.
  • FIG. 4 is a schematic structural diagram of an interactive panel provided in Embodiment 4 of the present application.
  • the hardware part of the interactive panel is composed of display screens, intelligent processing systems and other parts, which are combined by integral structural parts and supported by dedicated software systems.
  • the display screen may specifically include a Light Emitting Diode (LED) display screen, an Organic Light-Emitting Diode (OLED) display screen, a Liquid Crystal Display (LCD) screen, and the like.
  • a touch frame can be combined with the display screen to form a touch screen.
  • the optical touch sensor constituting the touch frame can scan the touch object, such as a user's finger, a stylus, etc., on the surface of the display screen using light signals.
  • a cover glass is provided on the surface of the display screen; therefore, in the embodiments of this specification, the surface of the display screen refers to the surface of the cover glass of the display screen.
  • the touch frame can respond to the above touch operations and pass the corresponding touch operation information to the intelligent processing system at the application level, so that the intelligent processing system implements various interactive applications.
  • the optical touch sensor may include an infrared emitter and an infrared receiver.
  • the infrared emitter is used to emit infrared signals
  • the infrared receiver is used to receive infrared signals.
  • the densely distributed infrared signals in different directions are used to form a beam grid to locate the touch point.
  • the display screen is equipped with a frame with a circuit board, which is used to arrange infrared emitters and infrared receivers around the display screen to form a horizontal and vertical beam grid touch frame.
  • When the display screen has the above-mentioned touch frame and the touch object blocks the infrared signals, the light measurement value at the corresponding infrared receivers is weakened, so the position of the touch point on the screen can be determined.
  • The infrared emitters are installed on a first side of the frame of the display screen, and the infrared receivers are installed on a second side of the frame, the first side being opposite the second side; that is, each infrared receiver is on the side opposite its infrared emitter, so that the infrared signal emitted by the infrared emitter is received by the infrared receiver.
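  • The following is a minimal, purely illustrative sketch of how a touch position could be derived from which beams of such a grid are blocked; the beam pitch and the index-based data layout are assumptions, not details from the patent.

```python
BEAM_PITCH_MM = 5.0  # assumed spacing between adjacent emitter/receiver pairs

def locate_touch(blocked_x, blocked_y):
    """blocked_x / blocked_y: indices of the vertical / horizontal beams whose
    received light is weakened. Returns (x, y, width, height) in millimetres,
    or None when no beam is blocked."""
    if not blocked_x or not blocked_y:
        return None
    x = (min(blocked_x) + max(blocked_x)) / 2 * BEAM_PITCH_MM
    y = (min(blocked_y) + max(blocked_y)) / 2 * BEAM_PITCH_MM
    width = (max(blocked_x) - min(blocked_x) + 1) * BEAM_PITCH_MM
    height = (max(blocked_y) - min(blocked_y) + 1) * BEAM_PITCH_MM
    return x, y, width, height
```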
  • The shape of the display screen may vary (rectangle, hexagon, circle, etc.), the shape of the frame varies with the shape of the display screen accordingly, and the arrangement of the infrared emitters and infrared receivers in each infrared module also differs accordingly.
  • When a conventional touch frame configured on an interactive panel responds to the touch signal of a touch object, the touch precision usually falls within a conventional range; such a conventional touch frame can be recorded as a non-high-precision touch frame.
  • For a non-high-precision touch frame whose touch accuracy is in the conventional range, it may be difficult to recognize the size of the touch area of the touch object on the display screen. Therefore, in touch writing mode, it is difficult to judge what type of touch object the user writes with, or what touch medium (finger or writing pen) the user touches the screen with.
  • it is difficult for a non-high-precision touch frame to ensure that the same type of touch object presents the same touch area during the touch process.
  • For the interactive tablet in this embodiment, the touch frame adopted can optionally be a high-precision touch frame with high touch accuracy; so-called high touch accuracy can be understood as touch accuracy that reaches the set accuracy range, wherein the accuracy limit of the set accuracy range is higher than that of the conventional accuracy range.
  • the touch frame adopted in this embodiment can provide more detailed touch information to the upper application level, such as the touch area of the touch object, more accurate coordinates of the touch point, and the rotation angle of the touch object during the touch process.
  • the intelligent processing system in the interactive tablet can include a host processor, which is a processor of the interactive panel.
  • the host processor is a computing module with higher performance.
  • the host processor can be an Android module with the Android system installed, configured with a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), RAM (Random Access Memory), ROM (Read-Only Memory) and other components; for example, for an Android 7.0 version, the CPU is a dual-core A72 plus quad-core A53, the GPU is a Mali T860, the RAM is 4 GB, and the ROM is 32 GB.
  • the host processor can also be a PC (personal computer) module configured with components such as a CPU, GPU, memory, and hard disk; for example, the CPU is an Intel Core i5/i7, the GPU is Intel HD Graphics, the memory is 8 GB/16 GB DDR4, and the hard disk is 128 GB/256 GB.
  • FIG. 1 shows a schematic flowchart of a method for presenting handwriting provided in Embodiment 1 of the present application.
  • This embodiment is applicable to the situation of presenting handwriting on a writing interface.
  • the method can be executed by a handwriting presentation device, which can be implemented by software and/or hardware and configured in an interactive panel, in particular in a processor of the interactive panel, which can be the host processor in the intelligent processing system; the touch frame equipped in the interactive panel has a touch accuracy that reaches the set accuracy range, and the touch frame is also electrically connected to the display screen.
  • a method for presenting handwriting provided in Embodiment 1 of the present application specifically includes the following steps:
  • the execution subject of the method provided in this embodiment is also provided with a graphics processor (Graphics Processing Unit, GPU), which can provide video processing functions.
  • frame data information is placed into the frame memory, and the serial display data and scan control timing required by the display are generated for the video signal according to the partition driving method.
  • the display screen set on the interactive panel can play frame data information according to the serial display data and scan control timing, thereby displaying various pictures on the display screen.
  • the writing interface can be regarded as the interface displayed on the display screen after the user enters the writing mode through a trigger operation.
  • an interface can be displayed on the display screen, and the user can select handwriting attribute parameters, such as handwriting color and handwriting thickness, and can present handwriting on the interface according to the configuration parameters.
  • the writing interface can be an independent interface.
  • the interactive tablet provides an electronic whiteboard, and the user triggers a control operation to display the electronic whiteboard in the interactive tablet.
  • the interactive tablet receives the control operation and displays the electronic whiteboard as a writing interface.
  • the user can trigger a touch operation on the electronic whiteboard, and the touch operation is expressed in the form of a trajectory. Therefore, through the analysis of the generated trajectory by the upper layer application of the interactive panel, the trajectory can be displayed on the screen of the interactive panel.
  • the control operations on the electronic whiteboard include but not limited to touch operations, keyboard operations, mouse operations, and physical button operations.
  • the writing interface can also be an interface with a background.
  • For example, the interactive tablet displays local courseware, or displays data such as screen images transmitted by a screen transfer device (e.g. a USB dongle) that belongs to a source device (such as a notebook computer).
  • the user triggers an annotation operation on the interactive tablet, and the interactive tablet receives the annotation operation, freezes data such as the courseware and screen images so that they become the background, that is, it maintains the current frame of data displaying the courseware, screen images, etc., and generates a mask on top of that data to act as the writing interface.
  • the user can trigger a touch operation on the screen of the interactive tablet, and the touch operation is represented in the form of a track, and the interactive tablet can display handwriting corresponding to the touch track on the mask.
  • the so-called courseware can refer to course documents made according to teaching requirements, after the teaching objectives have been determined, the teaching content and tasks analyzed, and the structure of teaching activities and the interface designed.
  • the courseware can be files in common formats such as Word documents, PowerPoint (PPT) files, presentations, or whiteboard documents for courseware presentation, and may also be custom pages composed of text, tables, pictures and other elements, which is not limited in this embodiment.
  • the touch object may specifically be a user's finger, an active stylus or a passive stylus, etc., and the user may manipulate the touch object to move on the display surface of the interactive panel; when the touch object moves, the resulting movement state can be used for the rendering of handwriting.
  • the interactive panel is further configured with a touch frame combined with the display screen, wherein the touch frame may specifically be a frame formed by an optical touch sensor nested at the edge of the display screen.
  • the touch frame can generate a touch signal based on the included optical touch sensor when the touch object moves on the display screen, and identify corresponding touch point information through a response to the touch signal.
  • FIG. 1a is a diagram showing the effect of a touch frame responding to a touch object in a method for presenting handwriting provided in Embodiment 1 of the present application.
  • one or more optical touch sensors 120 are installed on both sides of the edge of the display screen 110 of the interactive panel, forming a touch frame.
  • the movement state of the touch object (such as a finger) manipulated by the user on the display screen 110 can be presented by using finger states 131 to 135 .
  • the processor can activate the optical touch sensor 120; the optical touch sensor 120 scans light signals over the display surface of the interactive tablet and detects, according to the transmission of the light signals, whether a touch object appears on the display surface.
  • when a touch object is detected, a corresponding touch signal is generated in real time during the movement of the touch object.
  • the touch frame can respond to the generated touch signal, so as to feed back the touch point data identified after the response to the upper layer of the interactive panel (such as the main processor in the intelligent processing system).
  • the touch point data fed back is recorded as the touch point information.
  • the touch frame used in this embodiment is a high-precision touch frame.
  • the touch point information fed back by the touch frame is superior to the touch point information fed back by the conventional touch frame in terms of accuracy and information detail.
  • the touch point information includes at least coordinate information of the touch point, pressure sensitivity information of the touch point, and the like.
  • the user touches to enter the writing mode, and it can be known that the user's writing intention is to write text, or to mark and identify important content through the presented handwriting.
  • When the user writes text, a better writing experience is that the handwriting can be presented according to the user's writing style when writing on paper, for example by presenting the user's unique strokes, or the user may prefer that a smoother and more coherent written stroke be presented as a marker.
  • Since this embodiment aims to realize personalized presentation of the user's handwriting, it needs to obtain data information representing the user's personalized characteristics, and this data information can then be processed to present an effect that meets the user's needs.
  • the user's writing process is equivalent to the process of the user controlling the movement of the touch object on the surface of the display screen, and the user's personalized style is mainly reflected in the movement control of the touch object.
  • the touch objects controlled by different users have different moving states on the display screen.
  • the different moving states of the touch object are reflected in the fed-back touch point information; further, the moving state can be characterized by the magnitude of the pressure with which the touch object acts on the display screen and by its moving direction, both of which can be extracted from the touch point information.
  • the handwriting matching the moving state of the touch object can be presented in the writing interface.
  • If the presented handwriting is text, the text can show the user's writing style;
  • if the presented handwriting is a marker such as a marking line, the marking line can mark the content the user wants to mark more smoothly and accurately.
  • the position of each touch point required for determining the handwriting can also be determined first through the touch point information gathered during the moving process, then the points to be connected, the lines to be connected and the required connection methods diverging from each touch point when the handwriting outline is presented based on each touch point can be analyzed, and finally, based on the information analyzed above, handwriting matching the movement of the touch object manipulated by the user can be presented.
  • FIG. 1 b shows a display diagram of a conventional rendering effect in the handwriting rendering method in the related art
  • FIG. 1 c shows an effect display diagram of the handwriting rendering method provided in Embodiment 1 of the present application.
  • the handwriting shown in Fig. 1b and Fig. 1c is mainly written by the user.
  • the conventional handwriting presentation method is adopted, and the displayed text is only conventional handwriting presentation, which does not reflect the user's writing characteristics.
  • the presented text can be displayed in a personalized manner, such as rough or neat, round or sharp when the user writes.
  • the handwriting presentation method provided in Embodiment 1 of the present application can be executed by an interactive tablet, and the touch accuracy of the touch frame equipped on the interactive tablet reaches the set precision range; the method can first display the writing interface through the display screen; then, when the touch object touches the surface of the display screen and moves, the touch point information fed back through the touch frame can be obtained; finally, by analyzing each piece of touch point information, handwriting matching the moving state of the touch object can be displayed on the writing interface.
  • As the execution subject of the method, the interactive tablet is equipped with a high-precision touch frame in its hardware structure, and the method provided in this embodiment can realize functional optimization of the configured high-precision touch frame at the software application level.
  • the method provided in Embodiment 1 ensures that the handwriting presented on the writing interface better matches the movement state of the touch object moved by the user during writing, thereby better presenting handwriting with the user's writing style, which in turn improves the rendering effect of handwriting on the interactive tablet.
  • this optional embodiment may further include: processing each piece of touch point information, so that each piece of touch point information has a unified unit format and data structure.
  • the display operation of handwriting is mainly performed by the intelligent processing system in the upper layer of the interactive tablet, and can specifically be executed by the host processor; the touch point information required for the display of handwriting mainly comes from the feedback provided by the touch frame at the hardware level of the interactive tablet.
  • the touch point information fed back by the touch frame can be regarded as the input information required by the upper layer.
  • the execution parameters of different touch frames also differ, which may lead to differences in the representation of the touch information they feed back and affect the normal execution of the handwriting presentation method.
  • the information processing operation proposed in this optional embodiment is added on the basis of the first embodiment above.
  • this optional embodiment can analyze the production information and batch information of the touch frame, determine the original information format of the touch point information fed back by the touch frame, and then process the unit format and data structure of the touch point information, Ensure that the data input to the upper layer of the interactive panel has a unified information format.
  • the processed touch point information removes the unit format related to the touch frame manufacturer or batch.
  • In the original information format, the touch area unit fed back in the touch point information is basically based on the number of blocked optical sensors on the touch frame, which is used as the touch width unit and touch height unit; in this optional embodiment, this can be converted into a unified abstract unit at the software level, such as a pixel unit.
  • each touch point information can be embodied as:
  • the unit of each data information in the touch point information is converted into a unified set unit format
  • the data structure corresponding to the set unit format is used to record the touch point information.
  • this optional embodiment can uniformly convert data information in the original information format, such as the coordinates of the touch point, the height and width of the touch point, or the vertices of the geometric figure formed when the touch frame performs identification, into more abstract unit values at the software level, such as coordinate points, width values or height values expressed in pixels.
  • the high-precision touch frame can also capture the rotation of the touch object during the touch process and determine the rotation angle of that touch rotation; in this case, through the processing method of this optional embodiment, the initially obtained rotation angle can also be converted into a unified radian unit.
  • the above optional embodiment of Embodiment 1 of the present application specifically adds a processing operation on the touch point information fed back by the touch frame; through this processing operation, unified input of the touch point information can be realized, incompatibility of the touch point information in the subsequent execution process caused by the different attribute parameters of different touch frames is avoided, and the execution efficiency of handwriting presentation is effectively improved.
  • FIG. 2 shows a schematic flow chart of a method for presenting handwriting provided in Embodiment 2 of the present application.
  • This embodiment is optimized on the basis of the above-mentioned embodiment.
  • Obtaining the touch point information fed back through the touch frame is specifically optimized as follows: each touch signal is identified through the hardware circuit in the touch frame, the touch signal being generated when the touch object moves on the display screen; the touch point information fed back by the touch frame for each touch signal is then obtained through the Human Interface Device (HID) standard protocol, wherein one piece of touch point information corresponds to one touch point, and the touch point information includes the touch point coordinates and the touch point pressure sensitivity.
  • Presenting, on the writing interface, handwriting that matches the movement state of the touch object by analyzing each piece of touch point information is specifically optimized as follows: the movement state information of the touch points generated by the touch object during the movement process is determined by analyzing each piece of touch point information; a handwriting outline matching the movement state of the touch object is formed by analyzing each piece of movement state information; and the handwriting outline is filled and presented on the writing interface.
  • a method for presenting handwriting provided in Embodiment 2 of the present application specifically includes the following operations:
  • the writing button can be triggered by the user to enter the writing interface, or the editing function can be triggered to enter the writing interface in editing mode in some associated scenarios (such as the courseware display scenario).
  • the optical touch sensor can be regarded as a core component constituting the touch frame.
  • the optical touch sensors (such as infrared emitters set on one side and infrared receivers set on the opposite side) arranged at the edge of the display screen can detect in real time whether there is a touch object on the surface of the display screen, based on whether the beam grid formed by the densely distributed infrared signals in different directions is blocked.
  • a corresponding touch signal can be generated at the corresponding position when the touch object blocks the normally emitted infrared signals; after that, the hardware circuit set in the touch frame can identify the touch signal, for example by identifying its high and low levels, to determine the coordinate information of the position of the touch signal represented by the data at the hardware level, the corresponding width and height information when the touch object blocks the beam grid, and even the touch area information and rotation information of the touch object.
  • a group of touch signals can be generated correspondingly, and the hardware circuit on the touch frame can effectively identify the relevant touch information of each touch signal in the group; at the same time, the pressure sensitivity information of the touch object at each touch point can be determined through the pressure with which the touch object acts on the hardware circuit of the touch frame.
  • Since the touch frame is a hardware structure on the interactive panel, the touch point information identified by the hardware circuit on the touch frame for each touch point is difficult to input directly to the upper-layer software processing module; therefore, this step can use the dedicated human-computer interaction HID standard protocol to obtain converted touch point information readable at the software level, the object of the conversion being the touch point information recognized at the hardware level.
  • each piece of touch point information fed back by the touch frame represents one touch point triggered by the touch object, and the required touch point information must at least include the touch point coordinates and the touch point pressure sensitivity.
  • the touch point coordinates are the basic information of touch, and the touch point pressure sensitivity can be used to represent the touch pressure of the user when operating the touch object to move, and this information can be used to indirectly reflect the user's writing style.
  • the touch point information fed back by the touch frame can be obtained in real time during the movement of the touch object. Therefore, this embodiment can realize the presentation of handwriting on the writing interface through the following S204 to S206.
  • this step can be used to first determine the information related to the moving state of the touch object during the moving process.
  • the determination of the movement state of the touch object can be converted into the determination of the movement state information of each touch point generated during the movement process.
  • the generation of a touch point is related to the touch frame's sensing of the touch object; in that sensing, the touch signal can be generated by the touch object blocking the beam grid.
  • When the hardware circuit of the touch frame responds to the touch signal, the corresponding blocked position on the touch frame can be determined; this blocked position can be regarded as the generation position of the touch point, and once the coordinates of the blocked position are obtained, the coordinate information of the touch point is obtained.
  • each touch point generated is discrete.
  • In the related art, adjacent touch points can simply be connected, and the line generated after the connection is used as one of the traces in the handwriting.
  • the displayed handwriting is preset with attribute information such as thickness or color; therefore, simply connecting touch points to form line handwriting cannot produce handwriting that matches the user.
  • In the related art, the Bezier curve algorithm is also often used, that is, each touch point is treated as a control point and a virtual point is added between two touch points to make a smooth curve; however, it is difficult to use this algorithm for personalized presentation of different users' handwriting, and the thickness of the handwriting also remains uniform and unchanged.
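  • To make the related-art approach mentioned above concrete (it is not the method of this application), the sketch below smooths a touch point sequence with quadratic Bezier segments, treating each touch point as a control point and using the midpoints between consecutive touch points as the added virtual points; these details are a common rendering convention, not taken from the patent.

```python
def quad_bezier(p0, p1, p2, steps=16):
    """Sample a quadratic Bezier curve from p0 to p2 with control point p1."""
    pts = []
    for i in range(steps + 1):
        t = i / steps
        x = (1 - t) ** 2 * p0[0] + 2 * (1 - t) * t * p1[0] + t ** 2 * p2[0]
        y = (1 - t) ** 2 * p0[1] + 2 * (1 - t) * t * p1[1] + t ** 2 * p2[1]
        pts.append((x, y))
    return pts

def smooth_polyline(touch_points):
    """Each touch point acts as a control point; the midpoints between touch
    points act as virtual points, giving a smooth but uniformly thick curve."""
    if len(touch_points) < 3:
        return list(touch_points)
    mids = [((a[0] + b[0]) / 2, (a[1] + b[1]) / 2)
            for a, b in zip(touch_points, touch_points[1:])]
    curve = [touch_points[0]]
    for i in range(1, len(touch_points) - 1):
        curve += quad_bezier(mids[i - 1], touch_points[i], mids[i])
    curve.append(touch_points[-1])
    return curve
```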
  • In this embodiment, each touch point can be taken as a center and its extension distance in the four directions of up, down, left and right determined; the extension distance of the touch point in each direction can reflect the pressure applied by the user at that touch point and the moving direction, and can be specifically determined based on the moving direction of the touch point and the touch pressure sensitivity.
  • Since the moving state of the touch object is related to the pressure and the moving direction with which the user controls the touch object, in order to obtain handwriting matching the user's writing style, the extension distances of the touch point in the various directions can first be used as the moving state information representing the moving state of the touch object.
  • the extension distance of the touch point in each direction can be recorded as the margin value of the touch point in each direction, and uniformly regarded as the moving state information corresponding to the touch point .
  • the margin values of the touch point in each direction may be the same data value or different data values.
  • this embodiment records the margin values when they have the same data value as first margin values, and records the margin values when they have different data values as second margin values.
  • the first margin value and the second margin value of the touch point in each direction may be determined in different ways, and the different ways of determining the margin values directly reflect the different ways of determining the moving state information.
  • the margin value can be the calculated value of the preset handwriting thickness value and a coefficient to be determined.
  • the specific values of the coefficients to be determined in each direction can be the same;
  • the specific values of the undetermined coefficients in each direction can be different.
  • FIG. 2a shows an implementation flowchart of determining the moving state information in the handwriting presentation method provided in the second embodiment of the present application. As shown in Figure 2a, this optional embodiment further determines the movement state information of the touch point generated by the touch object during the movement process by analyzing the information of each touch point, which is embodied in the following steps:
  • the handwriting thickness value, which serves as the thickness basis of the handwriting, can be considered an attribute parameter value preset when the user configures the handwriting attribute information after entering the writing mode triggered by the user.
  • S2042 to S2044 are implemented relative to the determination of the movement state information of a single touch point, and the movement state information of each touch point can be determined through the following steps.
  • this step can directly extract the touch point pressure included in the touch point information.
  • the relationship between the touch pressure sensitivity and the thickness scaling coefficient can be established in advance.
  • the value range of the thickness scaling coefficient is (0, 1); for touch pressure sensitivity values of different magnitudes, a corresponding thickness scaling coefficient value within (0, 1) can be determined, so as to form a pressure sensitivity coefficient association table.
  • a matching thickness scaling factor can be found from the pressure sensitivity coefficient association table.
  • this embodiment marks the thickness scaling factor as the first thickness scaling factor.
  • the combination of the thickness scaling factor and the handwriting thickness value can be used to determine the actual thickness of the touch point on the writing interface.
  • the product value of the handwriting thickness value and the first thickness scaling factor may be regarded as the reference value on which the actual thickness is displayed on the writing interface.
  • the above-mentioned determined product value can be regarded as the extension distance of the handwriting to be presented that takes the touch point as the intersection point and extends to the surroundings.
  • the extension distance of the point in the four directions of up, down, left, and right can be regarded as the margin value of the touch point in the four directions of up, down, left, and right.
  • In this embodiment these are recorded as first margin values, and the four margin values taken together can be recorded as the movement state information of the touch point.
  • FIG. 2b shows an effect display diagram of the mobile state information determined in Embodiment 2 of the present application.
  • points A, B, C, and D can be used as touch points generated during the movement of the touch object, and each touch point can form two intersecting lines in the horizontal and vertical directions with each touch point as the intersection point.
  • the extension in four directions of up, down, left, and right relative to the touch point is realized, and the above-mentioned determined margin values in each direction can be obtained and marked with endpoints.
  • the 16 endpoints A1-A4, B1-B4, C1-C4 and D1-D4 shown in FIG. 2b represent the margin values of touch points A, B, C and D in the four directions of up, down, left and right, respectively.
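  • A minimal sketch of this determination follows, purely for illustration: the entries of the pressure sensitivity coefficient association table and the preset handwriting thickness value are placeholder assumptions, and screen coordinates are assumed to grow downwards.

```python
# Placeholder pressure-sensitivity coefficient association table: each entry maps
# an upper bound on pressure (normalized to [0, 1]) to a first thickness scaling
# factor in (0, 1). Real values would be tuned for the device.
PRESSURE_TABLE = [(0.25, 0.3), (0.5, 0.5), (0.75, 0.7), (1.01, 0.9)]
HANDWRITING_THICKNESS = 12.0  # preset handwriting thickness value, in pixels

def first_margin_endpoints(x, y, pressure):
    """Return the first margin value and the four endpoints (up, down, left,
    right) around the touch point (x, y), like points A1..A4 in Fig. 2b."""
    factor = next(f for bound, f in PRESSURE_TABLE if pressure < bound)
    margin = HANDWRITING_THICKNESS * factor  # same value in all four directions
    endpoints = {
        "up": (x, y - margin), "down": (x, y + margin),
        "left": (x - margin, y), "right": (x + margin, y),
    }
    return margin, endpoints
```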
  • the characteristics of the presented handwriting are mainly matched with the moving state of the touch object, and the user's writing style can be displayed through the presented handwriting.
  • Since the moving state of the touch object is determined by the touch pressure and moving direction when the user operates the touch object, in order to present user-style handwriting on the writing interface, it is necessary to first determine, through the above steps, the movement state information of that moving state.
  • the thickness of the handwriting that each touch point can present on the writing interface can be determined.
  • In this embodiment, it is necessary to ensure that the touch points generated during the movement are connected to each other, and that after the connection a smooth, user-style handwriting outline can be presented.
  • In one implementation, the margin values in the four directions of up, down, left and right included in the movement state information can be considered; the endpoint of each margin value is used as a stroke point corresponding to the touch point; the connection between two adjacent touch points can then be established through the determined stroke points; and the closed curve obtained after the connection is used as the handwriting outline matching the moving state of the touch object.
  • In another implementation, the margin values in the four directions of up, down, left and right included in the movement state information can be considered; the two margin values in adjacent directions are combined to build an ellipse, and finally the elliptical areas of the four ellipses formed are stitched together to form an asymmetric ellipse area around the touch point.
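  • A purely illustrative sketch of such an asymmetric ellipse area follows: each quadrant is the quarter of an ellipse whose semi-axes are the margin values of the two adjacent directions, and the four quarters are stitched together; the sampling scheme and the screen-downwards y axis are assumptions.

```python
import math

def asymmetric_ellipse_points(x, y, up, down, left, right, steps=64):
    """Sample the boundary of the asymmetric ellipse area around (x, y) built
    from four quarter-ellipses, one per quadrant."""
    pts = []
    for i in range(steps):
        a = 2 * math.pi * i / steps
        rx = right if math.cos(a) >= 0 else left  # semi-axis for this quadrant
        ry = down if math.sin(a) >= 0 else up     # screen y grows downwards
        pts.append((x + rx * math.cos(a), y + ry * math.sin(a)))
    return pts
```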
  • In yet another implementation, the margin values in the four directions of up, down, left and right included in the movement state information can be used to determine the margin values of the touch point in the four diagonal directions of upper left, upper right, lower left and lower right, and the endpoints of these margin values are connected sequentially to form an octagon corresponding to the touch point.
  • there is an octagon corresponding to each touch point formed in this way, and two adjacent touch points can be connected through the corresponding octagons and the approximate tangent points on the octagons.
  • the line outline formed after all touch points are connected is used as the handwriting outline.
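  • The octagon construction can be sketched as follows, for illustration only: the text does not spell out how the diagonal margin values are derived from the four axis-direction ones, so the sketch simply averages the neighbouring margins; the vertex layout is therefore an assumption.

```python
import math

def octagon_vertices(x, y, up, down, left, right):
    """Eight vertices (clockwise from the top, screen y growing downwards) of
    an octagon surrounding the touch point (x, y)."""
    ur = (right + up) / 2   # assumed diagonal margin values
    lr = (right + down) / 2
    ll = (left + down) / 2
    ul = (left + up) / 2
    d = 1 / math.sqrt(2)    # project each diagonal margin onto the x/y axes
    return [
        (x, y - up),                  # up
        (x + ur * d, y - ur * d),     # upper right
        (x + right, y),               # right
        (x + lr * d, y + lr * d),     # lower right
        (x, y + down),                # down
        (x - ll * d, y + ll * d),     # lower left
        (x - left, y),                # left
        (x - ul * d, y - ul * d),     # upper left
    ]
```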
  • the handwriting in this embodiment is generated in real time relative to the movement of the touch object, and the latest touch point information fed back during the movement of the touch object can be regarded as the last touch point in the touch point sequence.
  • FIG. 2c shows a flow chart of one implementation of handwriting outline determination in the method for presenting handwriting provided in the second embodiment of the present application.
  • this optional embodiment by analyzing the moving state information, forming a handwriting outline matching the moving state of the touch object is embodied in the following steps:
  • For a touch point, first margin values, in which the margin values in the four directions have the same data value, may be obtained; alternatively, second margin values, in which the margin values in the four directions have different data values, may be obtained.
  • the current margin value used in this step is specifically related to the method used to determine the margin value, and may be the value recorded as the first margin value or the value recorded as the second margin value.
  • this step and the following S2052 are operations relative to each touch point.
  • the coordinate values of the endpoints corresponding to each current margin value can be determined in the same coordinate system, and the determined coordinate values are used as the coordinate values of the stroke points of the touch point in the four directions of up, down, left and right.
  • At a certain moment, this embodiment can receive the touch point information, fed back by the touch frame, of each touch point generated within one cycle, and at that moment handwriting presentation can be performed for each received touch point according to the method provided in this embodiment.
  • the stroke points of each touch point fed back in one cycle can be obtained; by connecting the stroke points, adjacent touch points are connected through those connections, and finally a handwriting outline satisfying the first setting rule can be obtained, which is recorded as the first handwriting outline in this embodiment, wherein the first setting rule is that the area of the formed closed region is the largest.
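  • One way to read "the closed region with the largest area" over a set of stroke points is to take their convex hull; the sketch below uses that reading, which is an interpretation rather than a rule stated in the text.

```python
def convex_hull(points):
    """Monotone-chain convex hull: returns, in order, the closed outline whose
    enclosed area is the largest obtainable by connecting the given points."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts

    def cross(o, a, b):
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]

# Example: the first handwriting outline for two adjacent touch points could be
# the hull of their eight stroke points (see first_margin_endpoints above).
```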
  • Fig. 2d is a diagram showing the effect of handwriting contour determination in the method for presenting handwriting provided in Embodiment 2 of the present application.
  • the stroke points corresponding to the touch points A, B, C, and D can be determined.
  • the above-mentioned determined outline of the handwriting is a closed area, and the closed area can be filled with the previously set handwriting color to form the handwriting presented on the writing interface.
  • the method provided in this embodiment can be performed in real time as the touch point information is fed back by the touch frame, because the touch point information is fed back periodically; that is, this embodiment can present, based on a certain number of touch points fed back in one cycle, the piece of handwriting corresponding to that cycle.
  • This embodiment simultaneously considers the connection between the last touch point in the previous cycle and the first touch point in the next cycle.
  • the connection method between two adjacent touch points within the same cycle given in this embodiment can be used to realize this connection, for example by connecting the stroke points directly, by the asymmetric ellipses formed for the two adjacent touch points and the corresponding tangent points, or by the octagons formed for the two adjacent touch points and the corresponding approximate tangent points.
  • the method provided by this embodiment can ensure a smooth connection between touch points fed back by the touch frame.
  • the second embodiment of the present application provides a method for presenting handwriting, which embodies the feedback form of touch point information, and also embodies the presentation manner of handwriting.
  • the implementation of the method is based on the premise that the touch frame equipped on the interactive panel has a touch precision within a set precision range.
  • the high-precision touch frame can feed back more accurate touch point information, containing more effective information, to the application layer.
  • another implementation is given here for the above S204, that is, for determining, by analyzing the information of each touch point, the movement state information of the touch points generated during the movement of the touch object.
  • this third optional embodiment provides another implementation for determining the moving state information. By this implementation, it can be ensured that the margin values determined for a touch point in the four directions of up, down, left and right have different data values.
  • FIG. 2e shows another implementation flow chart of determining the moving state information in the method for presenting handwriting provided in Embodiment 2 of the present application.
  • the determination of the mobile state information specifically includes the following steps:
  • This step is equivalent to a preprocessing step, and a reference margin value can be given for each touch point in the four directions of up, down, left, and right.
  • in order to realize different margin values of the touch point in the four directions of up, down, left and right, the key is to determine the thickness scaling factor that the touch point should have in each of these four directions. Considering that the movement directions sensed at the touch points differ as the touch object moves, it can be known that the pressure applied by the user in the four directions of up, down, left and right when controlling the movement of the touch object also differs.
  • this step specifically considers the offset of each touch point in the horizontal and vertical directions relative to a certain number of touch points generated before it, and finally takes into account the effect of this offset on the thickness scaling factors.
  • in this third optional embodiment the touch point information is further optimized to also include the generation time of the touch point and the moving direction of the touch point; determining, by analyzing each piece of touch point information, the second thickness scaling factor corresponding to each touch point in the four directions of up, down, left and right is embodied as:
  • for each touch point, obtain the generation time of the touch point from the corresponding touch point information, and search for each target touch point included in a set time period before that generation time.
  • the touch point information fed back by the touch frame also includes the generation time of the touch point during the touch movement.
  • this step can be used to determine how many target touch points are included in a set time period.
  • according to the touch point coordinates and moving directions of the touch point and of each target touch point, determine the touch offset information corresponding to the touch point and the target distance values of the touch point in the four directions of up, down, left and right.
  • each touch point can be used as the current touch point to be calculated for the margin value, and the target touch point selected relative to it can be used as a reference point for margin value calculation.
  • the offset of the current touch point relative to the target touch points in the horizontal and vertical directions can be determined, so that the thickness scaling factors of the current touch point can be further determined.
  • the steps of determining the touch offset information and the target distance value can be described as:
  • suppose the current touch point whose margin values are to be calculated is touch point A;
  • there are 4 target touch points before touch point A, which are recorded as touch points B, C, D and E in order of generation time.
  • the time difference between the generation times of touch point B and touch point A is the largest, so touch point B is recorded as the core touch point.
  • the moving direction of touch point A can be selected as the positive direction in the horizontal and vertical directions, and the horizontal and vertical coordinates of touch point A are then used to subtract the corresponding coordinates of the core touch point B;
  • the obtained coordinate difference can be recorded as the horizontal offset distance X1 and the vertical offset distance Y1 respectively.
  • the difference between the horizontal offset distance X1 and the horizontal distance value x1 is used as the corrected horizontal offset distance value, and the difference between the vertical offset distance Y1 and the vertical distance value y1 is used as the corrected vertical offset distance value.
  • this step is equivalent to an optional step, and when there are no touch points with different moving directions, this step can be skipped and step b13 can be directly executed.
  • the positive and negative values of the coordinate difference are determined based on the set horizontal and vertical positive directions.
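  • A hedged sketch of the offset step described above follows: the target touch points inside a set time window before the current point are selected, the earliest of them is taken as the core point, and the horizontal/vertical offsets X1/Y1 are derived. The rule used for the target distance values (counting target points per direction) is an assumption chosen only to reproduce the worked example (4 in the up and right directions, 0 in the down and left directions), not the patent's exact formula; field names and the window length are also assumed.
```python
from dataclasses import dataclass

@dataclass
class TouchPoint:
    x: float
    y: float
    t: float  # generation time in ms

def touch_offsets(current: TouchPoint, history: list, window_ms: float):
    """Return (X1, Y1, per-direction target distance values) for the current point."""
    targets = [p for p in history if 0 < current.t - p.t <= window_ms]
    if not targets:
        return 0.0, 0.0, {"up": 0, "down": 0, "left": 0, "right": 0}
    core = min(targets, key=lambda p: p.t)  # largest time difference -> core touch point
    x1, y1 = current.x - core.x, current.y - core.y
    dist = {
        "up": sum(1 for p in targets if current.y > p.y),
        "down": sum(1 for p in targets if current.y < p.y),
        "right": sum(1 for p in targets if current.x > p.x),
        "left": sum(1 for p in targets if current.x < p.x),
    }
    return x1, y1, dist

a = TouchPoint(10, 8, 100)
history = [TouchPoint(2, 1, 60), TouchPoint(4, 3, 70), TouchPoint(6, 5, 80), TouchPoint(8, 7, 90)]
print(touch_offsets(a, history, window_ms=50))  # -> X1=8, Y1=7, up/right=4, down/left=0
```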
  • according to each target distance value and the touch pressure sensitivity of the touch point, the second thickness scaling factors of the touch point in the four directions of up, down, left and right are determined respectively.
  • the following sub-steps can be used to describe the process of determining the second thickness scaling factor in step c1:
  • the directions in this step specifically include four directions: up, down, left, and right.
  • the target offset distance corresponding to the up and down direction should be the vertical offset distance Y1
  • the target offset distance in the left and right direction should be the horizontal offset distance X1.
  • c12. Determine the difference between the target distance in the direction and the target offset distance, and record the quotient of the difference and the set touch constant as the scaling factor to be corrected.
  • for example, the target distance values of touch point A relative to touch points B to E are 4 in the up and right directions and 0 in the down and left directions respectively.
  • the scaling factor to be corrected in the upward direction can then be Y1-k1 and in the right direction X1-k1; the scaling factor to be corrected in the downward direction can be Y1/k and in the left direction X1/k, where k1 is 4/k, 4 is the target distance value determined above, and k is a set touch constant.
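  • One hypothetical reading of the expressions above is sketched below, with the to-be-corrected factor in each direction taken as (offset distance minus target distance) divided by the touch constant k; the grouping of the terms is an assumption where the translated text is ambiguous, and all numeric inputs are illustrative.
```python
# Illustrative "scaling factor to be corrected" computation for touch point A.
def scaling_factors_to_correct(x1, y1, target_up, target_down, target_left, target_right, k):
    return {
        "up": (y1 - target_up) / k,       # one reading of "Y1 - k1" with k1 = 4/k
        "right": (x1 - target_right) / k, # one reading of "X1 - k1"
        "down": (y1 - target_down) / k,   # target distance 0 -> Y1/k
        "left": (x1 - target_left) / k,   # target distance 0 -> X1/k
    }

print(scaling_factors_to_correct(x1=6.0, y1=8.0, target_up=4, target_down=0,
                                 target_left=0, target_right=4, k=2.0))
```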
  • the touch pressure sensitivity of the touch point can also be obtained, and in this step the scaling factor to be corrected in each of the up, down, left and right directions of the touch point can be multiplied directly by the touch pressure sensitivity; the calculated product value is regarded as the second thickness scaling factor in the corresponding direction.
  • the reference margin value in the corresponding direction can be corrected directly based on the second thickness scaling factor of the touch point in each of the up, down, left and right directions.
  • the correction process can be understood as taking the product of the reference margin value in a direction and the corresponding second thickness scaling factor, and the resulting product value is used as the margin value of the touch point in that direction, recorded in this embodiment as the second margin value.
  • the second margin values in each direction can be collectively recorded as the moving state information of the touch point (a small sketch follows below).
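  • The sketch below ties these pieces together: half of the preset handwriting thickness is the reference margin in each direction, the second thickness scaling factor is the to-be-corrected factor times the touch pressure sensitivity, and the second margin value is their product with the reference margin. All numbers are assumed, illustrative values.
```python
# Sketch of the margin correction step described above.
def second_margins(handwriting_thickness, factors_to_correct, pressure):
    """Return the corrected (second) margin values per direction."""
    reference = handwriting_thickness / 2.0
    second_scaling = {d: f * pressure for d, f in factors_to_correct.items()}
    return {d: reference * s for d, s in second_scaling.items()}

print(second_margins(handwriting_thickness=6.0,
                     factors_to_correct={"up": 2.0, "down": 4.0, "left": 3.0, "right": 1.0},
                     pressure=0.8))
```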
  • this optional embodiment provides another implementation of the above S205, that is, of forming a handwriting outline matching the moving state of the touch object by analyzing each piece of moving state information.
  • this fourth optional embodiment provides another way of determining the handwriting outline.
  • the method of forming the outline of the handwriting provided by this alternative embodiment can improve the smoothness of the outline of the handwriting.
  • FIG. 2f shows another implementation flowchart of handwriting outline determination in the handwriting presentation method provided in Embodiment 2 of the present application.
  • the determination of the handwriting outline specifically includes the following steps:
  • for each touch point, the first margin value with the same data value in the four directions can be obtained, or the second margin values with different data values in the four directions can be obtained.
  • the current margin value used in this step can also be the value recorded as the first margin value, or the value recorded as the second margin value.
  • S2502. Determine an asymmetric ellipse area corresponding to the touch point according to each current margin value combined with the touch point coordinates of the touch point, and determine a tangent point of the asymmetric ellipse area.
  • This optional embodiment introduces the concept of determining the outline of handwriting based on ellipses. Considering the difference in margin values in each direction, this optional embodiment considers constructing the handwriting outline through an asymmetrical ellipse area. Wherein, the asymmetrical ellipse area may evolve from a conventional ellipse, which is specifically related to the current margin value of the touch point in each direction.
  • the situation that the touch point has the same margin value in all directions can be regarded as a special case where the second margin value is used in each direction.
  • this fourth optional embodiment further concretizes determining the asymmetric elliptical area corresponding to the touch point, according to each current margin value combined with the touch point coordinates of the touch point, as the following steps:
  • the obtained four groups can be respectively (q, e), (q, r), (w, e) and (w, r).
  • the specific realization of extracting the effective area from an ellipse can be described as: determining the quadrant interval corresponding to the two fixed values associated with the construction of the ellipse in the virtual coordinate system, and determining the portion of the ellipse lying in that quadrant interval as an effective area.
  • FIG. 2g shows an effect display diagram of the asymmetric ellipse area determined in Embodiment 2 of the present application.
  • the asymmetric elliptical area includes four effective areas, which are the first effective area 01, the second effective area 02, the third effective area 03 and the fourth effective area 04.
  • the four effective areas come respectively from four different ellipses; each ellipse can be determined by the method of step b2 above, and each effective area can also be determined by the method of determining the effective area described above.
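  • A hedged sketch of such an asymmetric ellipse area follows: each quadrant of the boundary is a quarter of a different ellipse whose semi-axes are the two margin values bounding that quadrant (right+up, left+up, left+down, right+down). The pairing of margins to quadrants mirrors the (q, e)/(q, r)/(w, e)/(w, r) grouping above and is otherwise an assumption.
```python
import math

def asymmetric_ellipse_boundary(cx, cy, up, down, left, right, samples_per_quadrant=16):
    """Sample the boundary of an asymmetric ellipse built from four margin values."""
    boundary = []
    total = 4 * samples_per_quadrant
    for i in range(total):
        theta = 2 * math.pi * i / total
        a = right if math.cos(theta) >= 0 else left  # horizontal semi-axis for this quadrant
        b = up if math.sin(theta) >= 0 else down     # vertical semi-axis for this quadrant
        boundary.append((cx + a * math.cos(theta), cy + b * math.sin(theta)))
    return boundary

pts = asymmetric_ellipse_boundary(0, 0, up=3, down=1, left=2, right=4)
print(pts[:4])
```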
  • Figure 2h shows the effect display diagram of the second handwriting outline determined in the second embodiment of the present application.
  • the first closed area 22 is formed after the two are connected, and the first closed area 22 can be used as the second handwriting outline of the touch point.
  • Fig. 2h is only a schematic diagram, and the tangents given to each asymmetrical ellipse in the figure are not strict, but in practical applications, the connection method adopted can be connected strictly according to the tangents of the tangent points.
  • the above steps S2501 and S2502 can be used to determine the corresponding asymmetric ellipse area for each touch point; the tangent points of two adjacent asymmetric ellipse areas can then be determined according to the tangent point determination method for ellipses (choosing the two tangent points whose corresponding tangent lines are farthest apart), and the asymmetric ellipses can be connected through the determined tangent points, following the adjacency relationship of the touch points and connecting along the tangent direction in turn; the enclosed area finally formed by these connections serves as the second handwriting outline, and it can be seen that every touch point is included in the second handwriting outline (an approximate sketch follows below).
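  • The snippet below gives a rough, hedged approximation of the tangent-point step for two adjacent areas given as sampled boundaries: along the direction perpendicular to the line joining their centres, the extreme boundary points of each area play the role of the tangent points whose tangent lines are farthest apart. A production implementation would solve the tangency analytically; the sampled boundaries and centres here are assumptions.
```python
import math

def ellipse_boundary(cx, cy, a, b, n=64):
    """Sample an axis-aligned ellipse boundary (stand-in for an asymmetric ellipse area)."""
    return [(cx + a * math.cos(2 * math.pi * i / n),
             cy + b * math.sin(2 * math.pi * i / n)) for i in range(n)]

def approximate_tangent_points(boundary_a, center_a, boundary_b, center_b):
    """Pick extreme points perpendicular to the centre line as approximate tangent points."""
    dx, dy = center_b[0] - center_a[0], center_b[1] - center_a[1]
    nx, ny = -dy, dx  # perpendicular direction
    key = lambda p: p[0] * nx + p[1] * ny
    return ((max(boundary_a, key=key), max(boundary_b, key=key)),
            (min(boundary_a, key=key), min(boundary_b, key=key)))

pa, pb = (0.0, 0.0), (6.0, 2.0)
upper, lower = approximate_tangent_points(ellipse_boundary(*pa, 4, 3), pa,
                                          ellipse_boundary(*pb, 3, 2), pb)
print(upper, lower)
```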
  • this optional embodiment provides yet another implementation of the above S205, that is, of forming a handwriting outline matching the moving state of the touch object by analyzing each piece of moving state information.
  • this fifth optional embodiment differs from the above fourth optional embodiment and optimizes its calculation performance: the calculation of the ellipse tangent points in the fourth optional embodiment is relatively complicated and can consume the computing resources of the process, so the implementation provided by this fifth optional embodiment may be used to optimize computing performance.
  • FIG. 2i shows another implementation flowchart of handwriting outline determination in the handwriting presentation method provided in Embodiment 2 of the present application.
  • the determination of the handwriting outline specifically includes the following steps:
  • the method of determining the effective margin value in this embodiment can be described as:
  • the way of determining the asymmetric elliptical area can refer to the method of the fourth optional embodiment above;
  • the effective margin values in the four directions of upper left, lower right, upper right and lower left each have a margin value endpoint, that is, the intersection point of the above-described ray and the asymmetric ellipse.
  • the end point of each margin value can be used as the stroke point in the corresponding direction, which is recorded as the first stroke point in this embodiment.
  • each touch point can have its corresponding movement state information determined by the method given above in this embodiment, and the movement state information specifically includes the margin values (which may be the first margin value or the second margin values); the margin endpoints corresponding to the margin values in each direction may also be used as the corresponding stroke points, recorded in this embodiment as the second stroke points.
  • each touch point correspondingly has 8 stroke points, and these 8 stroke points can be connected to form an octagon; therefore, each touch point corresponds to an octagon.
  • the octagons of each touch point can be connected to obtain the final handwriting outline, which is recorded as the third handwriting outline in this embodiment.
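  • A hedged sketch of the per-point octagon follows: the four second stroke points come from the up/down/left/right margin values and the four first stroke points from the effective margin values along the diagonals, and the eight points are connected in angular order. The 1/sqrt(2) projection of the diagonal margins is an assumption.
```python
from math import sqrt, atan2

def touch_point_octagon(x, y, axis_margins, diagonal_margins):
    """Build the octagon of one touch point from 4 axis and 4 diagonal margin values."""
    up, down, left, right = axis_margins
    ul, ur, ll, lr = diagonal_margins
    k = 1 / sqrt(2)
    pts = [
        (x, y + up), (x, y - down), (x - left, y), (x + right, y),   # second stroke points
        (x - ul * k, y + ul * k), (x + ur * k, y + ur * k),          # first stroke points
        (x - ll * k, y - ll * k), (x + lr * k, y - lr * k),
    ]
    # Connect the eight stroke points in angular order around the touch point.
    return sorted(pts, key=lambda p: atan2(p[1] - y, p[0] - x))

print(touch_point_octagon(0, 0, axis_margins=(2, 1, 1.5, 2.5),
                          diagonal_margins=(1.8, 2.2, 1.2, 2.0)))
```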
  • this fifth optional embodiment further concretizes the formation of the third handwriting outline based on the first and second stroke points corresponding to the touch points as follows:
  • the condition for being an approximate tangent point is as follows: two points on a circumscribed octagon are each connected with the touch point to form a corresponding line segment; the lines that pass through these two points and are perpendicular to the corresponding line segments are used as the approximate tangents; if the distance between the two approximate tangents is the largest, the two points can be considered the approximate tangent points of the circumscribed octagon.
  • FIG. 2j shows an effect display diagram of the third handwriting outline determined in Embodiment 2 of the present application. As shown in FIG. 2j, it includes the second closed area 23 formed by connecting the circumscribed octagon of each touch point through the corresponding approximate tangent points and the formed approximate tangent lines; the second closed area 23 can be used as the third handwriting outline of the touch points. It should also be noted that FIG. 2j is only a schematic diagram, and the approximate tangents drawn for the circumscribed octagons in the figure are not rigorous, or their positions are difficult to identify clearly; in practical applications, the connection can be made strictly along the tangent lines through the approximate tangent points.
  • FIG. 2k shows a handwriting display diagram with special effects of the user's writing style presented by using the method provided in Embodiment 2 of the present application. As shown in FIG. 2k , it specifically shows a brush stroke profile 24 formed when a user writes in a brush style.
  • the effect diagram also better illustrates that the method provided by this embodiment can ensure that the handwriting presented on the writing interface can better match the movement state of the user's moving touch object during the writing process, which is beneficial to the improvement of user experience.
  • Fig. 3 is a structural block diagram of a handwriting display device provided in Embodiment 3 of the present application.
  • the handwriting display device can be integrated in an interactive tablet, wherein the touch accuracy of the touch frame equipped in the interactive tablet is within a set accuracy range.
  • the device may specifically include the following modules:
  • a display module 31, configured to display a writing interface through a display screen
  • an obtaining module 32, configured to obtain touch point information fed back through the touch frame when a touch object touches the surface of the display screen and moves, wherein the touch object is controlled by the user;
  • the presentation module 33 is configured to present, on the writing interface, handwriting matching the movement state of the touch object by analyzing the information of each touch point.
  • Embodiment 3 of the present application provides a device for presenting handwriting.
  • the interactive tablet, as the executive body of the device, is equipped with a high-precision touch frame in its hardware structure, and the configured high-precision touch frame can be functionally optimized at the software application level by the method provided in this embodiment.
  • compared with interactive tablets in the related art that are not optimized at the software level, the third embodiment integrates a handwriting display device on the interactive tablet, which can ensure that the handwriting presented on the writing interface better matches the moving state of the touch object moved by the user during the writing process, so as to better present handwriting with the user's writing style and thereby improve the handwriting rendering effect on the interactive tablet.
  • the moving state is reflected by the pressure sensitivity value and moving direction of the touch object acting on the display screen.
  • the acquisition module 32 can specifically be used for:
  • each touch signal is identified through the hardware circuit in the touch frame, and the touch signal is generated when the touch object moves on the display screen;
  • one touch point information corresponds to one touch point
  • the touch point information includes: touch point coordinates and touch point pressure sensitivity.
  • the device also includes an input processing module,
  • the input processing module may be configured to process each touch point information after obtaining the touch point information fed back by the touch frame, so that each touch point information has a unified unit format and data structure.
  • the specific implementation of the input processing module processing the touch point information may include:
  • the unit of each data information in the touch point information is converted into a unified set unit format
  • the data structure corresponding to the set unit format is used to record the touch point information.
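  • The snippet below illustrates, as a sketch, how raw touch point information could be normalized into a single unit format and data structure as the input processing module above describes; the field names, the millimetre target unit and the conversion table are assumptions, not part of the patent.
```python
from dataclasses import dataclass

@dataclass
class NormalizedTouchPoint:
    x_mm: float
    y_mm: float
    pressure: float  # normalised to 0..1
    time_ms: float

UNIT_TO_MM = {"mm": 1.0, "cm": 10.0, "px": 0.25}  # assumed device pixel pitch

def normalize(raw: dict) -> NormalizedTouchPoint:
    """Convert one raw touch point record into the unified unit format and structure."""
    scale = UNIT_TO_MM[raw["unit"]]
    return NormalizedTouchPoint(
        x_mm=raw["x"] * scale,
        y_mm=raw["y"] * scale,
        pressure=min(max(raw["pressure"] / raw.get("pressure_max", 1023), 0.0), 1.0),
        time_ms=float(raw["t"]),
    )

print(normalize({"x": 120, "y": 48, "unit": "px",
                 "pressure": 512, "pressure_max": 1023, "t": 1000}))
```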
  • the presentation module 33 may specifically include:
  • the first determination unit is configured to determine the movement state information of the touch point generated by the touch object during the movement process by analyzing the information of each touch point;
  • the second determining unit is configured to form a handwriting outline matching the moving state of the touch object by analyzing the moving state information
  • the handwriting presenting unit is configured to fill the outline of the handwriting and present it on the writing interface.
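  • As a minimal sketch, the three units could be wired together as below (determine movement state, form handwriting outline, fill and present). The class and function names, and the toy callables used for wiring, are placeholders, not the patent's actual module interfaces.
```python
class PresentationModule:
    """Toy composition of the three units of the presentation module."""
    def __init__(self, determine_state, form_outline, fill_and_present):
        self._determine_state = determine_state    # first determining unit
        self._form_outline = form_outline          # second determining unit
        self._fill_and_present = fill_and_present  # handwriting presenting unit

    def present(self, touch_point_infos):
        states = [self._determine_state(info) for info in touch_point_infos]
        outline = self._form_outline(states)
        self._fill_and_present(outline)

# Toy wiring: equal margins of 1 around each point, outline as the raw point list,
# and "presentation" as printing.
module = PresentationModule(
    determine_state=lambda info: {"xy": info, "margins": (1, 1, 1, 1)},
    form_outline=lambda states: [s["xy"] for s in states],
    fill_and_present=print,
)
module.present([(0, 0), (4, 1), (8, 3)])
```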
  • the first determining unit can be specifically used for:
  • the first determining unit can also be used for:
  • the preprocessing subunit is used to obtain a preset handwriting thickness value, and use half of the handwriting thickness value as the reference margin value of each touch point in the four directions of up, down, left and right;
  • the coefficient determining subunit is configured to determine the second thickness scaling coefficient corresponding to each of the touch points in the four directions of up, down, left and right by analyzing the information of each of the touch points;
  • the margin determination subunit is configured to, for each touch point, correct the reference margin values in the four directions of up, down, left and right according to the corresponding second thickness scaling factors, to obtain the corrected second margin values;
  • the information recording subunit is configured to record each of the second margin values as the moving state information of the touch point.
  • the touch point information also includes: the generation time of the touch point and the moving direction of the touch point;
  • the coefficient determining subunit can specifically be used for:
  • for each touch point, obtaining the generation time of the touch point from the corresponding touch point information, and searching for each target touch point included in the set time period before the generation time;
  • determining, according to the touch point coordinates and moving directions of the touch point and of each target touch point, the touch offset information corresponding to the touch point and the target distance values of the touch point in the four directions of up, down, left and right;
  • determining, according to each target distance value and the touch pressure sensitivity of the touch point, the second thickness scaling factors of the touch point in the four directions of up, down, left and right respectively.
  • determining, according to the touch point coordinates and moving directions of the touch point and of each target touch point, the touch offset information corresponding to the touch point and the target distance values of the touch point in the four directions of up, down, left and right, includes:
  • the positive and negative values of the coordinate difference are determined based on the set horizontal and vertical positive directions.
  • determining, according to each target distance value and the touch pressure sensitivity of the touch point, the second thickness scaling factors of the touch point in the four directions of up, down, left and right respectively can specifically include:
  • the product of the scaling factor to be corrected and the touch pressure sensitivity is used as the second thickness scaling factor in the direction.
  • the second determining unit may specifically be used for:
  • determining, according to each current margin value combined with the touch point coordinates of the touch point, the stroke points corresponding to the touch point in the four directions of up, down, left and right;
  • the first setting rule is that the area of the formed closed area is the largest.
  • the second determining unit may specifically include:
  • the first determination subunit is configured to, for each touch point, extract the current margin value of the touch point in the four directions of up, down, left and right from the moving state information, the current margin value is the first margin value or the second margin value;
  • the second determining subunit is configured to determine the asymmetric elliptical area corresponding to the touch point according to each of the current margin values combined with the touch point coordinates of the touch point, and determine the tangent point of the asymmetric elliptical area;
  • the third determining subunit is configured to connect the asymmetrical elliptical regions through corresponding tangent points along a tangential direction to form a second handwriting outline with closed regions.
  • the second determining subunit may specifically be used for:
  • An area within the quadrant interval in the ellipse is determined as an effective area.
  • the second determining unit may specifically include:
  • the fourth determining subunit is configured to determine, for each touch point, the effective margin values of the touch point in the four directions of upper left, lower right, upper right and lower left using a preset margin value determination strategy;
  • the fifth determining subunit is used to determine the first stroke point corresponding to the touch point in the four directions of upper left, lower right, upper right and lower left according to each effective margin value;
  • the sixth determination subunit is used to determine the second stroke point corresponding to the touch point in the four directions of up, down, left and right according to the corresponding movement state information;
  • the seventh determining subunit is configured to form a third handwriting outline based on the first stroke points and the second stroke points corresponding to the touch points.
  • the seventh determination subunit can be specifically used for:
  • Each of the circumscribed octagons forms an approximate tangent line through the corresponding approximate tangent point, and connects them along the approximate tangent direction respectively to form a third handwriting outline with a closed area.
  • FIG. 4 is a schematic structural diagram of an interactive panel provided in Embodiment 4 of the present application.
  • the interactive panel includes: a processor 40 , a memory 41 , a display screen 42 , an input device 43 , an output device 44 , and a touch frame 45 .
  • the number of processors 40 in the interactive panel may be one or more, and one processor 40 is taken as an example in FIG. 4 .
  • the number of memory 41 in the interactive panel can be one or more, and one memory 41 is taken as an example in FIG. 4 .
  • the processor 40 , memory 41 , display screen 42 , input device 43 , output device 44 and touch frame 45 of the interactive panel can be connected through a bus or in other ways. In FIG. 4 , connection through a bus is taken as an example.
  • the memory 41 can be used to store software programs, computer-executable programs and modules, such as the program instructions/modules corresponding to the interactive panel described in any embodiment of the present application (for example, in the handwriting display device)
  • the memory 41 can mainly include a program storage area and a data storage area, wherein the program storage area can store an operating system and at least one application required by a function; the data storage area can store data created according to the use of the device, etc.
  • the memory 41 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage devices. In some instances, the memory 41 may further include memory located remotely relative to the processor 40, and these remote memories may be connected to the device through a network. Examples of the aforementioned networks include, but are not limited to, the Internet, intranets, local area networks, mobile communication networks, and combinations thereof.
  • the display screen 42 covers the touch frame 45 and is used for displaying interactive content. Generally speaking, the display screen 42 is used to display data according to the instructions of the processor 40, and is also used to receive touch operations acting on the display screen 42 , and send corresponding signals to the processor 40 or other devices.
  • the input device 43 can be used to receive input digital or character information, and generate key signal input related to user settings and function control of the display device, and can also be a camera for capturing images and a sound pickup device for capturing audio data.
  • the output device 44 may include an audio device such as a speaker. It should be noted that the specific composition of the input device 43 and the output device 44 can be set according to actual conditions.
  • the touch frame 45 has a touch precision that reaches a set precision range, and is used to collect touch point information generated when a touch object performs a touch operation.
  • the processor 40 executes various functional applications and data processing of the device by running the software programs, instructions and modules stored in the memory 41 , that is, realizes the above-mentioned handwriting presentation method.
  • the interactive panel provided above can be used to implement the method for presenting handwriting provided in any of the above embodiments, and has corresponding functions and beneficial effects.
  • Embodiment 5 of the present application also provides a storage medium containing computer-executable instructions, the computer-executable instructions are used to execute a handwriting presentation method when executed by a computer processor, including:
  • the handwriting matching the movement state of the touch object is presented on the writing interface.
  • in the storage medium containing computer-executable instructions provided by the embodiments of the present application, the computer-executable instructions are not limited to the operations of the handwriting presentation method described above, and can also execute relevant operations in the handwriting presentation method provided by any embodiment of the present application, with corresponding functions and beneficial effects.
  • each part of the present application may be realized by hardware, software, firmware or a combination thereof.
  • various steps or methods may be implemented by software or firmware stored in memory and executed by a suitable instruction execution system.
  • for example, if implemented in hardware, as in another embodiment, it can be implemented by any one or a combination of the following techniques known in the art: discrete logic circuits, ASICs with suitable combinational logic gates, programmable gate arrays (PGA), field-programmable gate arrays (FPGA), etc.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Handwriting presentation method and apparatus, interactive tablet, and storage medium. The touch accuracy of a touch frame provided on the interactive tablet is within a set accuracy range. The method comprises: displaying a writing interface by means of a display screen; when a touch object touches the surface of the display screen and moves, obtaining touch point information fed back by the touch frame, the touch object being manipulated by a user; and, by analyzing the various pieces of touch point information, presenting, on the writing interface, handwriting that matches the movement state of the touch object. By means of the method, it is ensured that the handwriting presented on a writing interface can better match the movement state of a touch object moved by a user during a writing process, so that handwriting with the user's writing style is better presented, thereby improving the handwriting presentation effect on an interactive tablet.
PCT/CN2021/121986 2021-05-20 2021-09-30 Procédé et appareil de présentation d'écriture manuscrite, et tablette d'interaction et support d'enregistrement WO2022242011A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110553073.3 2021-05-20
CN202110553073.3A CN115373534A (zh) 2021-05-20 2021-05-20 书写笔迹的呈现方法、装置、交互平板及存储介质

Publications (1)

Publication Number Publication Date
WO2022242011A1 true WO2022242011A1 (fr) 2022-11-24

Family

ID=84058925

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/121986 WO2022242011A1 (fr) 2021-05-20 2021-09-30 Procédé et appareil de présentation d'écriture manuscrite, et tablette d'interaction et support d'enregistrement

Country Status (2)

Country Link
CN (1) CN115373534A (fr)
WO (1) WO2022242011A1 (fr)


Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116974400B (zh) * 2023-09-14 2024-01-16 深圳市磐鼎科技有限公司 屏幕触显识别方法、装置、设备及存储介质


Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106155540A (zh) * 2015-04-03 2016-11-23 北大方正集团有限公司 电子毛笔笔形处理方法和装置
CN106293276A (zh) * 2016-10-18 2017-01-04 青岛海信电器股份有限公司 基于红外触控的笔锋确定方法、装置和触摸屏系统
CN110275646A (zh) * 2019-06-27 2019-09-24 深圳市康冠商用科技有限公司 一种应用于红外触摸屏的书写方法和相关装置
CN111142770A (zh) * 2019-12-23 2020-05-12 江苏欧帝电子科技有限公司 一种生成笔锋的方法及处理装置

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117058688A (zh) * 2023-08-14 2023-11-14 北京东舟技术股份有限公司 一种书写轨迹相似度评估方法及处理设备
CN117058688B (zh) * 2023-08-14 2024-04-05 北京东舟技术股份有限公司 一种书写轨迹相似度评估方法及处理设备

Also Published As

Publication number Publication date
CN115373534A (zh) 2022-11-22

Similar Documents

Publication Publication Date Title
US9471192B2 (en) Region dynamics for digital whiteboard
WO2022242011A1 (fr) Procédé et appareil de présentation d'écriture manuscrite, et tablette d'interaction et support d'enregistrement
US8860675B2 (en) Drawing aid system for multi-touch devices
US9465434B2 (en) Toolbar dynamics for digital whiteboard
WO2019140987A1 (fr) Procédé de commande de tableau, dispositif, appareil et support d'informations
US20140111483A1 (en) Monitoring interactions between two or more objects within an environment
WO2019041653A1 (fr) Procédé, dispositif et appareil d'affichage de carte heuristique et support de stockage
WO2021203724A1 (fr) Procédé et appareil de sélection d'écriture manuscrite, dispositif informatique et support d'enregistrement
CN111801641A (zh) 采用物理操纵的对象创建
US10761721B2 (en) Systems and methods for interactive image caricaturing by an electronic device
CN111758122A (zh) 用于混合现实系统的浏览器
US20140351718A1 (en) Information processing device, information processing method, and computer-readable medium
WO2021227628A1 (fr) Dispositif électronique et son procédé d'interaction
US11669204B2 (en) Data processing method and apparatus, and smart interaction device
US10129335B2 (en) Method and system for dynamic group creation in a collaboration framework
WO2021077539A1 (fr) Procédé et appareil d'ajustement graphique, dispositif et support de stockage
WO2014121209A2 (fr) Fonctionnalité de dessin de ligne pour un tableau blanc numérique
CN114690930A (zh) 一种书写笔迹处理方法、装置、交互平板及存储介质
US11137903B2 (en) Gesture-based transitions between modes for mixed mode digital boards
US11809701B2 (en) Handwriting forming method and apparatus, and electronic device
US20240004537A1 (en) Method and device for erasing handwriting, interactive board and storage medium
US20200225787A1 (en) Stroke-based object selection for digital board applications
WO2017205987A1 (fr) Formatage et manipulation de frappes d'encre numérique
WO2023065939A1 (fr) Procédé et appareil de réponse tactile, panneau interactif, et support de stockage
WO2024113271A1 (fr) Dispositif d'affichage intelligent d'écriture manuscrite, procédé d'affichage intelligent d'écriture manuscrite et dispositif électronique

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21940457

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21940457

Country of ref document: EP

Kind code of ref document: A1