WO2021189706A1 - Writing interaction method, intelligent interactive display device, and writing interaction system - Google Patents

Writing interaction method, intelligent interactive display device, and writing interaction system

Info

Publication number
WO2021189706A1
Authority
WO
WIPO (PCT)
Prior art keywords
writing
touch event
generated
smart pen
handwriting
Prior art date
Application number
PCT/CN2020/100379
Other languages
English (en)
French (fr)
Inventor
于子鹏
Original Assignee
深圳市鸿合创新信息技术有限责任公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳市鸿合创新信息技术有限责任公司
Priority to US17/913,853 priority Critical patent/US11861160B2/en
Priority to EP20927976.9A priority patent/EP4123439A4/en
Publication of WO2021189706A1 publication Critical patent/WO2021189706A1/zh


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0354 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • G06F3/03545 Pens or stylus
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/038 Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04886 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/038 Indexing scheme relating to G06F3/038
    • G06F2203/0382 Plural input, i.e. interface arrangements in which a plurality of input device of the same type are in communication with a PC
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048 Indexing scheme relating to G06F3/048
    • G06F2203/04803 Split screen, i.e. subdividing the display area or the window area into separate subareas
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048 Indexing scheme relating to G06F3/048
    • G06F2203/04808 Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously; e.g. using several fingers or a combination of fingers and pen
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V30/00 Character recognition; Recognising digital ink; Document-oriented image-based pattern recognition
    • G06V30/10 Character recognition
    • G06V30/14 Image acquisition
    • G06V30/142 Image acquisition using hand-held instruments; Constructional details of the instruments
    • G06V30/1423 Image acquisition using hand-held instruments; Constructional details of the instruments the instrument generating sequences of position coordinates corresponding to handwriting

Definitions

  • This application relates to the technical field of interaction between a user and a computer, and in particular to a writing interactive method, an intelligent interactive display device, and a writing interactive system.
  • Intelligent interactive display devices, such as smart interactive tablets, are increasingly used in fields such as education, corporate meetings, and commercial presentation.
  • To better realize interactive functions, existing intelligent interactive display devices are also used together with a smart pen, and more writing control can be achieved through the communication connection between the smart pen and the intelligent interactive display device.
  • When several people write on the board at the same time, however, the multiple smart pens actually share the same writing area, which causes problems: because each person's board writing, writing habits, and layout differ, some people lay out their writing casually, encroach on others' writing space, or mix part of their own content into others' writing space or the gaps between it, so the overall page layout becomes chaotic and a viewer may not even be able to tell who wrote which part.
  • In view of this, the main purpose of this application is to provide a writing interaction method that improves the user experience when multiple people operate smart pens to write.
  • To this end, the present application provides a writing interaction method applied to the interaction between an intelligent interactive display device and a smart pen, the writing interaction method including:
  • detecting a first touch event of a smart pen and obtaining the identifier of the corresponding smart pen; generating handwriting according to the first touch event, and determining, according to the identifiers of the smart pens, whether the generated handwriting corresponds to multiple smart pens;
  • if the handwriting corresponds to multiple smart pens, generating multiple non-overlapping writing areas, wherein the multiple writing areas correspond to the multiple smart pens one to one, each writing area covers the handwriting of its corresponding smart pen, and each writing area generates handwriting only in response to touch events of its corresponding smart pen.
  • The present application also provides an intelligent interactive display device for interacting with a smart pen.
  • The intelligent interactive display device includes a touch event matching module, a touch event response module, and a writing area generation module, wherein:
  • the touch event matching module includes a touch detection unit and a touch matching unit, the touch detection unit being configured to detect a first touch event of the smart pen, and the touch matching unit being configured to obtain the identifier of the smart pen and match it to the corresponding touch event;
  • the touch event response module includes a handwriting generation unit and a quantity monitoring unit, the handwriting generation unit being configured to generate handwriting corresponding to the first touch event, and the quantity monitoring unit being configured to determine, according to the identifiers of the smart pens, whether the generated handwriting corresponds to multiple smart pens;
  • the writing area generation module includes a matching generation unit and a response control unit, the matching generation unit being configured to generate multiple non-overlapping writing areas when the generated handwriting corresponds to multiple smart pens, the multiple writing areas corresponding one to one to the multiple smart pens and each writing area covering the handwriting of its corresponding smart pen; the response control unit being configured to control each writing area to generate handwriting only in response to touch events of its corresponding smart pen.
  • The present application also provides an intelligent interactive display device including a capacitive touch screen, a processor, and a computer-readable storage medium, the computer-readable storage medium storing a writing interaction program which, when executed, implements the writing interaction method described above.
  • The present application also provides a writing interaction system including a smart pen and the above intelligent interactive display device.
  • The intelligent interactive display device includes a touch event matching module, a touch event response module, and a writing area generation module, wherein:
  • the touch event matching module includes a touch detection unit and a touch matching unit, the touch detection unit being configured to detect a first touch event of the smart pen, and the touch matching unit being configured to obtain the identifier of the smart pen and match it to the corresponding touch event;
  • the touch event response module includes a handwriting generation unit and a quantity monitoring unit, the handwriting generation unit being configured to generate handwriting corresponding to the first touch event, and the quantity monitoring unit being configured to determine, according to the identifiers of the smart pens, whether the generated handwriting corresponds to multiple smart pens;
  • the writing area generation module includes a matching generation unit and a response control unit, the matching generation unit being configured to generate multiple non-overlapping writing areas when the generated handwriting corresponds to multiple smart pens, the multiple writing areas corresponding one to one to the multiple smart pens and each writing area covering the handwriting of its corresponding smart pen; the response control unit being configured to control each writing area to generate handwriting only in response to touch events of its corresponding smart pen.
  • The present application also provides a writing interaction system including a smart pen and the above smart interactive display device, the smart interactive display device including a capacitive touch screen, a processor, and a computer-readable storage medium, the computer-readable storage medium storing a writing interaction program which, when executed, implements the writing interaction method described above.
  • In the writing interaction method, smart interactive display device, and writing interaction system, the source of the first touch event, that is, the smart pen, is identified, and when the written notes are detected to come from multiple smart pens, a corresponding number of writing areas is generated automatically.
  • Besides covering the notes of its corresponding smart pen, each writing area generates handwriting only in response to touch events of that smart pen, which ensures that when multiple people use multiple pens to write, the generated writing areas are independent of one another, the content produced by different smart pens is not mixed up, and the written content stays clear, thereby improving the user experience when multiple people operate smart pens to write.
  • In addition, because each writing area is generated by matching the initial handwriting of its smart pen, in other words, the writing area is created along with the smart pen's initial notes and positioned according to that initial handwriting, a more flexible area layout can be achieved, better adapting to different application scenarios.
  • Fig. 1 is a flowchart of a first implementation of the writing interaction method according to the present application.
  • Fig. 2 is a flowchart of a second implementation of the writing interaction method according to the present application.
  • Fig. 3 is a flowchart of a third implementation of the writing interaction method according to the present application.
  • Fig. 4 is a flowchart of a fourth implementation of the writing interaction method according to the present application.
  • Fig. 5 is a flowchart of a fifth implementation of the writing interaction method according to the present application.
  • Fig. 6 is a flowchart of a sixth implementation of the writing interaction method according to the present application.
  • Fig. 7 is a flowchart of a seventh implementation of the writing interaction method according to the present application.
  • Fig. 8 is a flowchart of an eighth implementation of the writing interaction method according to the present application.
  • Fig. 9 is a schematic diagram of a first state in the implementation process of the writing interaction method according to the present application.
  • Fig. 10 is a schematic diagram of a second state in the implementation process of the writing interaction method according to the present application.
  • Fig. 11 shows the module structure of an implementation of the writing interaction system according to the present application.
  • Fig. 1 is a flowchart of the first implementation of the writing interaction method according to the application, which is applied to the interaction between an intelligent interactive display device and a smart pen. The method includes:
  • Step 100: detect a first touch event of a smart pen and obtain the identifier of the corresponding smart pen.
  • In this step, the smart pen can produce touch events on the screen of the intelligent interactive display device; these include the above first touch event, and the type of touch event may be capacitive touch, infrared touch, and so on.
  • Meanwhile, a communication connection is established between the smart pen and the intelligent interactive device, over which the identifier of the smart pen can be transmitted.
  • For example, an active capacitive pen establishes a communication connection with the touch controller board of the capacitive touch screen through Microsoft's MPP (Microsoft Pen Protocol) or Wacom's proprietary AES (Active Electrostatic Solution) and sends its own identifier. It is understandable that the smart pen may also send its identifier through electromagnetic-induction input technology or infrared communication technology.
  • Touch events of different smart pens can occur simultaneously or one after another in time. In practice, although multiple people operate smart pens at the same time, the moments at which each person's pen first touches down when writing generally differ; technically, touch events that occur one after another in time are easier to match to the corresponding smart pen identifiers. As shown in Fig. 9, the initial stroke of the characters "一种" is a horizontal stroke and the initial stroke of the characters "设计" is a dot; the initial points of the two strokes may be produced at the same time or one after the other.
  • The writing interaction method further includes step 102: generate handwriting according to the first touch event, and determine, according to the identifiers of the smart pens, whether the generated handwriting corresponds to multiple smart pens.
  • Although the handwriting-generation step 102A is arranged before the step 102B of determining the number of smart pens, determining the number of smart pens is actually a matter of determining which smart pen a given first touch event belongs to and then counting the smart pen identifiers. On the surface the handwriting is associated with the smart pen identifier; in essence the first touch event is associated with it. The order of step 102A and step 102B therefore need not be limited.
  • Step 102B can also be performed after the first touch event has occurred. For example, in Fig. 9 the determination of step 102B is completed only after the characters "一种" and "设计" have been written.
  • The writing interaction method further includes step 104: if the handwriting corresponds to multiple smart pens, generate multiple non-overlapping writing areas, wherein the multiple writing areas correspond to the multiple smart pens one to one, each writing area covers the handwriting of its corresponding smart pen, and each writing area generates handwriting only in response to touch events of its corresponding smart pen.
  • The size and position of each writing area can be determined according to a customary size and the location where the initial first touch event occurred; the writing areas need not completely fill the display area.
  • For example, the two writing areas generated in Fig. 10 are both placed toward the top, and, following the default left-aligned typesetting mode, the left boundary of each writing area is placed close to the position where its first touch event occurred, that is, where the initial handwriting was produced.
  • The left and right writing areas in Fig. 10 do not overlap each other, but to make full use of the display area the two writing areas can share a section of boundary. It is understandable that a writing area can be delimited in the form of a graphical window or merely defined by colored lines.
  • Each writing area is independent of the others. Suppose the left writing area in Fig. 10 belongs to the smart pen with identifier 010 and the right writing area to the smart pen with identifier 011; then when the pen with identifier 011 writes in the left writing area, its input is not responded to by displaying handwriting. It is understandable that if the handwriting corresponds to only one smart pen, a default writing area such as the display area of the entire screen is used, or only one writing area of suitable size is generated for that smart pen.
  • In the writing interaction method, smart interactive display device, and writing interaction system, the source of the first touch event, that is, the smart pen, is identified, and when the written notes are detected to come from multiple smart pens, a corresponding number of writing areas is generated automatically.
  • Besides covering the notes of its corresponding smart pen, each writing area generates handwriting only in response to touch events of that smart pen, which ensures that when multiple people use multiple pens to write, the generated writing areas are independent of one another, the content produced by different smart pens is not mixed up, and the written content stays clear, thereby improving the user experience when multiple people operate smart pens to write.
  • In addition, because each writing area is generated by matching the initial handwriting of its smart pen, in other words, the writing area is created along with the smart pen's initial notes and positioned according to that initial handwriting, a more flexible area layout can be achieved, better adapting to different application scenarios.
  • Fig. 2 is a flowchart of the second implementation of the writing interaction method according to the present application.
  • After step 104, the writing interaction method further includes:
  • Step 206: detect a second touch event of a smart pen and obtain the identifier of the corresponding smart pen.
  • In this step, after a writing area was most recently generated, a touch event of a smart pen is detected again, that is, a second touch event is detected, and the smart pen identifier corresponding to the second touch event is matched. Associating every touch event with the corresponding smart pen identifier provides a data basis for subsequent intelligent control.
  • The writing interaction method further includes step 208: determine, according to the identifier of the smart pen, whether the second touch event was produced by a newly added smart pen.
  • Since each touch event is associated with a smart pen identifier, querying the smart pen identifier data makes it possible to determine whether a new identifier has appeared, and thus to determine indirectly whether a smart pen, or a smart pen operator, has been added.
  • The writing interaction method further includes step 210: if so, determine, according to the location where the second touch event was produced, whether the second touch event was produced within an already generated writing area.
  • Because the smart pen identifier corresponding to the second touch event is a newly added identifier, there is no existing writing area that matches the second touch event, so it must first be established whether the event lies within an existing writing area or in the remaining part of the display area.
  • The writing interaction method further includes step 212: if so, do not generate handwriting according to the second touch event.
  • The writing interaction method further includes step 214: if not, generate handwriting according to the second touch event and generate a new writing area for the newly added smart pen.
  • The new writing area does not overlap the existing writing areas, the new writing area covers the handwriting of the newly added smart pen, and the new writing area generates handwriting only in response to touch events of the newly added smart pen.
  • Fig. 3 is a flowchart of the third implementation of the writing interaction method according to the present application. After step 104, the writing interaction method further includes:
  • Step 306: detect a third touch event of a smart pen and obtain the identifier of the corresponding smart pen.
  • Step 308: determine, according to the identifier of the smart pen, whether the third touch event was produced by a newly added smart pen.
  • Step 310: if so, determine, according to the location where the third touch event was produced, whether the third touch event was produced within an already generated writing area.
  • The writing interaction method further includes step 312: if so, regenerate multiple non-overlapping writing areas, the multiple writing areas corresponding to the multiple smart pens one to one, each writing area covering the touch trajectory of its corresponding smart pen, and each writing area generating handwriting only in response to touch events of its corresponding smart pen.
  • The writing interaction method further includes step 314: if not, generate a new writing area for the newly added smart pen while keeping the existing writing areas unchanged.
  • The new writing area does not overlap the existing writing areas, the new writing area covers the handwriting of the newly added smart pen, and the new writing area generates handwriting only in response to touch events of the newly added smart pen.
  • If the third touch event of the newly added smart pen falls in the remaining part of the display area, a new writing area is generated for the newly added pen and the existing writing areas remain unchanged.
  • In this way the third touch event of the newly added smart pen can be responded to flexibly while the independence of the existing writing areas is maintained. For example, if the third touch event of the newly added pen occurs in the lower-right corner of the display area, a new writing area is generated for that pen in the lower-right corner while the original left and right writing areas are kept unchanged.
  • Fig. 4 is a flowchart of the fourth implementation of the writing interaction method according to the present application, and the writing interaction method further includes:
  • Step 406: detect a fourth touch event of a smart pen and obtain the identifier of the corresponding smart pen.
  • Step 408: determine, according to the identifier of the smart pen, whether the fourth touch event was produced by a newly added smart pen.
  • Step 410: if the fourth touch event was not produced by a newly added smart pen, determine, according to the location where the fourth touch event was produced, whether the fourth touch event was produced within an already generated writing area.
  • Whether produced by an existing or a newly added smart pen, the fourth touch event may fall inside or outside the generated writing areas; when considering whether to expand a generated writing area, however, under the principle of not overlapping other writing areas, it is in particular the touch events falling in the remaining part of the display area that need to be examined.
  • The writing interaction method further includes step 412: if not, tentatively expand the writing area corresponding to the smart pen according to a preset rule so as to cover the trajectory corresponding to the fourth touch event.
  • The writing area is usually rectangular, but in some embodiments it may also take other shapes, such as an ellipse.
  • The preset rule is to expand the corresponding writing area so that it covers the trajectory of the fourth touch event.
  • For example, the right writing area in Fig. 10 belongs to the smart pen with identifier 011. When that pen produces a fourth touch event near the lower edge of the writing area, the right writing area can be tentatively expanded downward; when the pen produces a fourth touch event near the left edge of the writing area, the right writing area can be tentatively expanded to the left.
  • The writing interaction method further includes step 414: determine whether the tentatively expanded writing area overlaps another writing area.
  • The writing interaction method further includes step 416: if it does, cancel the expansion of the writing area corresponding to the smart pen and do not generate handwriting according to the fourth touch event.
  • For example, continuing the example in step 412, expanding to the left would overlap the left writing area, so the expansion of the right writing area is cancelled and no handwriting is generated from the fourth touch event.
  • The writing interaction method further includes step 418: if it does not, expand the writing area corresponding to the smart pen and generate handwriting according to the fourth touch event.
  • Continuing the example in step 412, if the expansion is downward, for example the right writing area expands downward until its lower edge is flush with the lower edge of the left area, the writing area corresponding to the smart pen is actually expanded and the fourth touch event is responded to by generating handwriting.
  • The layout of the initially generated writing area may not be sufficient for writing; therefore, when a touch event of the corresponding smart pen falls outside the original writing area, the original area is expanded through an automatic tentative expansion, on the premise of not overlapping other writing areas, so as to adapt more flexibly to the user's needs.
  • The step of generating multiple non-overlapping writing areas specifically includes: obtaining the position of the leftmost coordinate point of the handwriting corresponding to each smart pen; determining whether the distance between two adjacent leftmost coordinate points is greater than or equal to a first preset distance; and, if so, generating multiple non-overlapping writing areas on the screen display area.
  • Generating multiple non-overlapping writing areas on the screen display area includes:
  • setting vertical dividing lines with reference to the leftmost coordinate points, other than the one closest to the left side of the screen display area, to form the writing areas, wherein the first preset distance is equal to the product of δ and L, L being the horizontal length of the screen display area and δ being less than 0.5; or
  • taking the vertical center line of the screen display area as the dividing line to form the writing areas, wherein the first preset distance is equal to L/2 and L is the horizontal length of the screen display area.
  • Fig. 5 is a flowchart of a fifth implementation of the writing interaction method according to the present application, in which generating multiple non-overlapping writing areas specifically includes:
  • Step 541: obtain the position of the leftmost coordinate point of the handwriting corresponding to each smart pen.
  • Since left alignment is the most common typesetting mode, the leftmost coordinate point of each smart pen's handwriting is obtained as a reference point for generating the writing areas.
  • For example, in Fig. 9 the leftmost point of the characters "一种" is P2,
  • with coordinates (c, d),
  • and the leftmost point of the characters "设计" is P1,
  • with coordinates (a, b).
  • Generating multiple non-overlapping writing areas further includes step 543: determine whether the distance between two adjacent leftmost coordinate points is greater than or equal to the product of δ and L, where L is the horizontal length of the screen display area and δ is less than 0.5.
  • Continuing the example, with δ = 1/3 it is determined whether the distance between P1 and P2, that is, the value of a - c, is greater than or equal to L/3. This distance can cover the possible horizontal length of the left-hand one of two horizontally adjacent pieces of handwriting, thereby ensuring that a vertical dividing line can be placed between the two.
  • The length of a smart pen's initial handwriting is generally determined by the time needed each time the number of smart pens is judged, and is usually less than L/3; this time can of course be extended somewhat beyond the time needed for the judgment, thereby extending the length occupied by each smart pen's initial handwriting.
  • Generating multiple non-overlapping writing areas further includes step 545: if so, set vertical dividing lines with reference to the leftmost coordinate points, other than the one closest to the left side of the screen display area, to form the writing areas.
  • For the leftmost coordinate point closest to the left side of the screen display area, the writing area corresponding to its handwriting can be bounded by the left side of the display area, like the left display area in Fig. 10.
  • For the handwriting of the remaining smart pens, vertical dividing lines are set in turn with reference to their leftmost end points to form the corresponding writing areas.
  • The touch screen of an intelligent interactive display device is usually a wide screen, for example a 16:9 screen, and when multiple people operate it they also stand side by side in the horizontal direction. Therefore, when multiple people write at the same time, it is appropriate to divide the display area only in the horizontal direction to generate the writing areas, and the algorithm required by this arrangement is comparatively simple.
  • Fig. 6 is a flowchart of the sixth implementation of the writing interaction method according to the present application, in which step 104 specifically includes:
  • Step 642: when the generated handwriting corresponds to two smart pens, obtain the position of the leftmost coordinate point of the handwriting corresponding to each smart pen.
  • As in step 541, since left alignment is the most common typesetting mode, the leftmost coordinate point of each smart pen's handwriting is obtained as a reference point for generating the writing areas in order to suit this mode.
  • Step 104 further includes step 644: determine whether the distance between the two adjacent leftmost coordinate points is greater than or equal to L/2, where L is the horizontal length of the screen display area.
  • That is, it is determined whether the distance between P1 and P2, the value of a - c, is greater than or equal to L/2,
  • which indicates that the distance between the two is large enough to cover the left-hand handwriting.
  • Step 104 further includes step 646: if so, take the vertical center line of the screen display area as the dividing line to form the writing areas.
  • In this case the vertical center line of the display area is used as the boundary to form two writing areas of the same size.
  • That is, a vertical dividing line no longer needs to be set with reference to the leftmost coordinate points of the handwriting to form the writing areas corresponding to the different pieces of handwriting; instead, the vertical center line of the display area serves as the boundary between two equal-sized writing areas.
  • Fig. 7 is a flowchart of the seventh implementation of the writing interaction method according to the present application.
  • In this implementation, step 104 includes:
  • Step 703A: if the handwriting corresponds to multiple smart pens, pop up an instruction window asking the user to confirm whether to generate multiple writing areas.
  • A pop-up window is provided to request the user's confirmation.
  • Step 104 also includes step 703B: monitor the instruction command input by the user in the instruction window.
  • The input instruction command can be a touch event, a voice command, or a character command entered from a keyboard.
  • Step 104 also includes step 703C: if a user command confirming the generation of multiple writing areas is received, generate multiple non-overlapping writing areas, the multiple writing areas corresponding to the multiple smart pens one to one, each writing area covering the handwriting of its corresponding smart pen, and each writing area generating handwriting only in response to touch events of its corresponding smart pen.
  • The writing interaction method further includes:
  • Step 705: if a user command cancelling the generation of multiple writing areas is received, keep the original writing area.
  • In that case the original writing area is kept, for example with multiple smart pens writing mixed together in the same writing area. It is understandable that, optionally, if a newly added smart pen writes in one of the already generated writing areas and the user chooses to cancel the instruction to regenerate multiple writing areas, the identifier of the newly added smart pen can be incorporated into that writing area, that is, the writing area can accept writing input from two smart pens.
  • By requesting confirmation in this way, the intelligent interactive display device can meet the user's needs more precisely and thereby provide a better user experience. (A brief illustrative sketch of this confirmation handling, together with the partition-cancellation command of the eighth implementation, appears after this list.)
  • Each writing area includes an editing area and a menu bar area; the editing area of each writing area covers the notes of the corresponding smart pen and responds to the touch events of that smart pen.
  • The menu bar area includes a writing main menu, and the sub-menus of the main menu include color, eraser, and stroke thickness.
  • For example, a menu bar area is provided at the upper part of each of the left and right writing areas.
  • The square patterns in the menu bars in the figure represent graphical function buttons or icons, and the rest of each writing area is the editing area.
  • The menu bar area can also be set to be hidden and called up when needed.
  • The user can set the color of the handwriting by invoking the color command, erase generated handwriting by invoking the eraser, and set the thickness of the handwriting by invoking the stroke-thickness command.
  • In this way the user can edit the handwriting in each writing area independently.
  • Fig. 8 is a flowchart of the eighth implementation of the writing interaction method according to the present application.
  • In this implementation, the menu bar area includes a cancel-partition main menu, and the writing interaction method further includes:
  • Step 806: monitor the user's input command in the menu bar.
  • The input instruction command may be a touch event, a voice command, or a character command entered from a keyboard.
  • The writing interaction method further includes step 810: when a cancel-partition command is detected, restore the writing areas to the original single writing area and delete the content in each writing area.
  • When the user wants to cancel the partition, he or she can simply enter the corresponding trigger command in the menu bar to restore the original single writing area and delete the content, thereby quickly clearing the screen.
  • For example, a math class is about to end and the next class is English.
  • The display area of the intelligent interactive display device holds multiple writing areas and all operators have finished writing; the teacher or any operator can trigger the cancel-partition command to quickly clear the screen, making it convenient to use the writing interaction method of the intelligent interactive display device of this application during the English class.
  • In short, when the partition is no longer needed, the cancel-partition main menu can be used to achieve the goal quickly.
  • The present application also provides an intelligent interactive display device for interacting with a smart pen; please refer to Fig. 11.
  • The intelligent interactive display device includes a touch event matching module, a touch event response module, and a writing area generation module, wherein:
  • the touch event matching module includes a touch detection unit and a touch matching unit, the touch detection unit being configured to detect a first touch event of the smart pen, and the touch matching unit being configured to obtain the identifier of the smart pen and match it to the corresponding first touch event;
  • the touch event response module includes a handwriting generation unit and a quantity monitoring unit, the handwriting generation unit being configured to generate handwriting corresponding to the first touch event, and the quantity monitoring unit being configured to determine, according to the identifiers of the smart pens, whether the generated handwriting corresponds to multiple smart pens;
  • the writing area generation module includes a matching generation unit and a response control unit, the matching generation unit being configured to generate multiple non-overlapping writing areas when the generated handwriting corresponds to multiple smart pens, the multiple writing areas corresponding one to one to the multiple smart pens and each writing area covering the handwriting of its corresponding smart pen; the response control unit being configured to control each writing area to generate handwriting only in response to touch events of its corresponding smart pen.
  • Further, after the matching generation unit generates the multiple non-overlapping writing areas:
  • the touch detection unit is further configured to detect a second touch event of a smart pen,
  • and the touch matching unit is further configured to obtain the identifier of the smart pen and match it to the corresponding second touch event;
  • the quantity monitoring unit is further configured to determine, according to the identifier of the smart pen, whether the second touch event was produced by a newly added smart pen;
  • the response control unit is further configured to determine, according to the location where the second touch event was produced, whether the second touch event was produced within an already generated writing area;
  • if so, the handwriting generation unit is configured not to generate handwriting according to the second touch event, that is, the handwriting generation unit does not respond by generating handwriting;
  • if not, the handwriting generation unit is configured to generate handwriting according to the second touch event, and the matching generation unit is further configured to generate a new writing area for the newly added smart pen,
  • the new writing area not overlapping the existing writing areas, the new writing area covering the handwriting of the newly added smart pen,
  • and the new writing area generating handwriting only in response to touch events of the newly added smart pen.
  • Further, after the matching generation unit generates the multiple non-overlapping writing areas:
  • the touch detection unit is further configured to detect a third touch event of a smart pen, and the touch matching unit is further configured to obtain the identifier of the smart pen and match it to the corresponding third touch event;
  • the quantity monitoring unit is further configured to determine, according to the identifier of the smart pen, whether the third touch event was produced by a newly added smart pen;
  • the response control unit is further configured to determine, according to the location where the third touch event was produced, whether the third touch event was produced within an already generated writing area;
  • if so, the matching generation unit is further configured to regenerate multiple non-overlapping writing areas, the multiple writing areas corresponding to the multiple smart pens one to one, each writing area covering the touch trajectory of its corresponding smart pen, and each writing area generating handwriting only in response to touch events of its corresponding smart pen;
  • if not, the matching generation unit is further configured to generate a new writing area for the newly added smart pen,
  • the new writing area not overlapping the existing writing areas, the new writing area covering the handwriting of the newly added smart pen, and the new writing area generating handwriting only in response to touch events of the newly added smart pen.
  • Further, after the matching generation unit generates the multiple non-overlapping writing areas:
  • the touch detection unit is further configured to detect a fourth touch event of a smart pen,
  • and the touch matching unit is further configured to obtain the identifier of the smart pen and match it to the corresponding fourth touch event;
  • the quantity monitoring unit is further configured to determine, according to the identifier of the smart pen, whether the fourth touch event was produced by a newly added smart pen;
  • the response control unit is further configured to determine, according to the location where the fourth touch event was produced, whether the fourth touch event was produced within an already generated writing area;
  • if not, the matching generation unit is further configured to tentatively expand the writing area corresponding to the smart pen according to a preset rule so as to cover the trajectory corresponding to the fourth touch event, and to determine whether the tentatively expanded writing area overlaps another writing area;
  • if it does, the matching generation unit is further configured to cancel the expansion of the writing area corresponding to the smart pen, and the handwriting generation unit is configured not to generate handwriting according to the fourth touch event;
  • if it does not, the matching generation unit is further configured to expand the writing area corresponding to the smart pen, and the handwriting generation unit is configured to generate handwriting according to the fourth touch event.
  • Further, the matching generation unit includes a reference point acquisition unit, a distance judgment unit, and an area generation unit, wherein:
  • the reference point acquisition unit is configured to obtain the position of the leftmost coordinate point of the handwriting corresponding to each smart pen when the generated handwriting corresponds to multiple smart pens;
  • the distance judgment unit is configured to determine whether the distance between two adjacent leftmost coordinate points is greater than or equal to a first preset distance;
  • the area generation unit is configured to generate multiple non-overlapping writing areas on the screen display area.
  • Specifically, the area generation unit is configured to:
  • set vertical dividing lines with reference to the leftmost coordinate points, other than the one closest to the left side of the screen display area, to form the writing areas, wherein the first preset distance is equal to the product of δ and L, L being the horizontal length of the screen display area and δ being less than 0.5; or
  • take the vertical center line of the screen display area as the dividing line to form the writing areas, wherein the first preset distance is equal to L/2 and L is the horizontal length of the screen display area.
  • Further, the writing area generation module also includes a partition pop-up unit and a command monitoring unit, wherein:
  • the partition pop-up unit is configured to pop up an instruction window asking the user to confirm whether to generate multiple writing areas when the generated handwriting corresponds to multiple smart pens;
  • the command monitoring unit monitors the instruction command input by the user in the instruction window;
  • when a user command confirming the generation of multiple writing areas is received, the matching generation unit is further configured to generate multiple non-overlapping writing areas, the multiple writing areas corresponding to the multiple smart pens one to one, each writing area covering the handwriting of its corresponding smart pen, and each writing area generating handwriting only in response to touch events of its corresponding smart pen;
  • when a user command cancelling the generation of multiple writing areas is received, the matching generation unit is further configured to keep the original writing area.
  • Each writing area includes an editing area and a menu bar area; the editing area of each writing area covers the notes of the corresponding smart pen and responds to the touch events of that smart pen.
  • The menu bar area includes a writing main menu, and the sub-menus of the main menu include color, eraser, and stroke thickness.
  • The writing area generation module further includes a command monitoring unit that monitors the user's input command in the menu bar; when the command monitoring unit detects a cancel-partition command, the matching generation unit is further configured to restore the writing areas to the original single writing area and delete the content in each writing area.
  • The present application also provides an intelligent interactive display device including a capacitive touch screen, a processor, and a computer-readable storage medium.
  • The computer-readable storage medium stores a writing interaction program which, when executed, implements the writing interaction method described above.
  • For the specific steps of the writing interaction method, refer to the above embodiments. Since the intelligent interactive display device adopts all the technical solutions of all the above embodiments, it has at least all the beneficial effects brought by those technical solutions, which are not repeated here one by one.
  • The present application also provides a writing interaction system including a smart pen,
  • and the above smart interactive display device comprising the various modules.
  • Since the smart interactive display device adopts all the technical solutions of all the above embodiments, the system has at least all the beneficial effects brought by those technical solutions, which are not repeated here.
  • The present application also provides a writing interaction system including an active capacitive smart pen and an intelligent interactive display device including a computer storage medium as described above.
  • For the specific structure of the intelligent interactive display device and the steps implemented when the writing interaction program is executed, refer to the foregoing embodiments. Since this writing interaction system adopts all the technical solutions of all the foregoing embodiments, it has at least all the beneficial effects brought by those technical solutions, which are not repeated here.
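
As referenced above in the seventh implementation, both the confirmation prompt (steps 703A to 705) and the cancel-partition command of the eighth implementation boil down to monitoring a user instruction and either re-partitioning or restoring a single cleared writing area. The Python sketch below only illustrates that control flow; `PartitionController`, `request_partition`, and the callback signatures are hypothetical names introduced for this example and are not part of the application.

```python
from dataclasses import dataclass, field
from typing import Callable, Dict, List, Tuple

Rect = Tuple[float, float, float, float]   # (left, top, right, bottom)


@dataclass
class WritingArea:
    pen_id: str
    rect: Rect
    strokes: List[Tuple[float, float]] = field(default_factory=list)


class PartitionController:
    """Hypothetical controller for the confirm-partition / cancel-partition commands."""

    def __init__(self, screen_rect: Rect):
        self.screen_rect = screen_rect
        # Initially a single shared writing area covering the whole screen.
        self.areas: Dict[str, WritingArea] = {"shared": WritingArea("*", screen_rect)}

    def request_partition(self, pen_ids: List[str],
                          confirm: Callable[[], bool],
                          build_areas: Callable[[List[str]], Dict[str, WritingArea]]) -> None:
        # Steps 703A-703C: pop up an instruction window (confirm) and partition
        # only if the user agrees; step 705: otherwise keep the original area.
        if confirm():
            self.areas = build_areas(pen_ids)

    def cancel_partition(self) -> None:
        # Eighth implementation: restore the original single writing area and
        # delete the content of every per-pen area (a quick "clear screen").
        self.areas = {"shared": WritingArea("*", self.screen_rect)}
```

How the confirmation is actually presented (touch, voice, or keyboard input) is left open by the text, so the sketch deliberately takes it as an abstract callback.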

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A writing interaction method, an intelligent interactive display device, and a writing interaction system. The writing interaction method includes: detecting a first touch event of a smart pen and obtaining the identifier of the corresponding smart pen (S100); generating handwriting according to the first touch event (S102A), and determining, according to the identifiers of the smart pens, whether the generated handwriting corresponds to multiple smart pens (S102B); if so, generating multiple mutually non-overlapping writing areas, the multiple writing areas corresponding to the multiple smart pens one to one, each writing area covering the handwriting of its corresponding smart pen, and each writing area generating handwriting only in response to touch events of its corresponding smart pen (S104). By matching the initial handwriting of each smart pen, the method generates distinct, mutually independent writing areas, ensuring that the content produced by different smart pens is presented clearly and thereby improving the user experience of multi-person writing.

Description

Writing interaction method, intelligent interactive display device, and writing interaction system
Cross-Reference to Related Application
This application claims priority to Chinese patent application No. 202010213966.9, filed on March 24, 2020 and entitled "Writing interaction method, intelligent interactive display device, and writing interaction system", the entire contents of which are incorporated herein by reference.
Technical Field
This application relates to the technical field of interaction between a user and a computer, and in particular to a writing interaction method, an intelligent interactive display device, and a writing interaction system.
Background
Intelligent interactive display devices, such as smart interactive tablets, are used more and more widely in fields such as education and teaching, corporate meetings, and commercial presentation. To better realize interactive functions, existing intelligent interactive display devices are also used together with a smart pen, and more writing control can be achieved through the communication connection between the smart pen and the intelligent interactive display device. However, when board-writing is provided for several people at the same time, the multiple smart pens actually share one and the same writing area, which causes the following problems. For example, when several people each need to hold a smart pen and write or take board notes at the same time, their board writing, writing habits, and layouts differ; some people lay out their writing rather casually, which results in a disordered layout and encroaches on other people's writing space, or even, because of the disordered layout, wastes their own writing space so that part of their content gets mixed into other people's writing space or into the gaps within it. The writing layout of the whole page becomes chaotic, and a viewer may not even be able to tell which part was written by whom. It can thus be seen that current intelligent interactive display devices offer a poor user experience when multiple people operate smart pens to write.
Summary
In view of the above situation, the main purpose of this application is to provide a writing interaction method to improve the user experience when multiple people operate smart pens to write.
To achieve the above purpose, this application provides a writing interaction method applied to the interaction between an intelligent interactive display device and a smart pen, the writing interaction method including:
detecting a first touch event of a smart pen and obtaining the identifier of the corresponding smart pen;
generating handwriting according to the first touch event, and determining, according to the identifiers of the smart pens, whether the generated handwriting corresponds to multiple smart pens;
if the handwriting corresponds to multiple smart pens, generating multiple mutually non-overlapping writing areas, the multiple writing areas corresponding to the multiple smart pens one to one, each writing area covering the handwriting of its corresponding smart pen, and each writing area generating handwriting only in response to touch events of its corresponding smart pen.
This application also provides an intelligent interactive display device for interacting with a smart pen, the intelligent interactive display device including a touch event matching module, a touch event response module, and a writing area generation module, wherein:
the touch event matching module includes a touch detection unit and a touch matching unit, the touch detection unit being configured to detect a first touch event of a smart pen, and the touch matching unit being configured to obtain the identifier of the smart pen and match it to the corresponding touch event;
the touch event response module includes a handwriting generation unit and a quantity monitoring unit, the handwriting generation unit being configured to generate handwriting corresponding to the first touch event, and the quantity monitoring unit being configured to determine, according to the identifiers of the smart pens, whether the generated handwriting corresponds to multiple smart pens;
the writing area generation module includes a matching generation unit and a response control unit, the matching generation unit being configured to generate multiple mutually non-overlapping writing areas when the generated handwriting corresponds to multiple smart pens, the multiple writing areas corresponding to the multiple smart pens one to one and each writing area covering the handwriting of its corresponding smart pen; the response control unit being configured to control each writing area to generate handwriting only in response to touch events of its corresponding smart pen.
This application also provides an intelligent interactive display device including a capacitive touch screen, a processor, and a computer-readable storage medium, the computer-readable storage medium storing a writing interaction program which, when executed, implements the writing interaction method described above.
This application also provides a writing interaction system including a smart pen and the intelligent interactive display device described above, the intelligent interactive display device including a touch event matching module, a touch event response module, and a writing area generation module, wherein:
the touch event matching module includes a touch detection unit and a touch matching unit, the touch detection unit being configured to detect a first touch event of a smart pen, and the touch matching unit being configured to obtain the identifier of the smart pen and match it to the corresponding touch event;
the touch event response module includes a handwriting generation unit and a quantity monitoring unit, the handwriting generation unit being configured to generate handwriting corresponding to the first touch event, and the quantity monitoring unit being configured to determine, according to the identifiers of the smart pens, whether the generated handwriting corresponds to multiple smart pens;
the writing area generation module includes a matching generation unit and a response control unit, the matching generation unit being configured to generate multiple mutually non-overlapping writing areas when the generated handwriting corresponds to multiple smart pens, the multiple writing areas corresponding to the multiple smart pens one to one and each writing area covering the handwriting of its corresponding smart pen; the response control unit being configured to control each writing area to generate handwriting only in response to touch events of its corresponding smart pen.
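The modules named above are functional divisions rather than prescribed software components. Purely as an illustration of how such a division might look in code, the following Python sketch mirrors the three modules; every class and method name here is an assumption made for the example, not terminology from the application.

```python
from dataclasses import dataclass
from typing import Dict, List, Tuple


@dataclass
class TouchEvent:
    pen_id: str                    # identifier reported by the smart pen
    point: Tuple[float, float]     # touch coordinates on the screen


class TouchEventMatchingModule:
    """Detects a touch event and matches it to the reporting pen's identifier."""

    def detect(self, raw: dict) -> TouchEvent:
        return TouchEvent(pen_id=raw["pen_id"], point=(raw["x"], raw["y"]))


class TouchEventResponseModule:
    """Generates handwriting and monitors how many pens have produced it."""

    def __init__(self):
        self.strokes: Dict[str, List[Tuple[float, float]]] = {}

    def generate_handwriting(self, ev: TouchEvent) -> None:
        self.strokes.setdefault(ev.pen_id, []).append(ev.point)

    def multiple_pens(self) -> bool:
        return len(self.strokes) > 1


class WritingAreaGenerationModule:
    """Holds non-overlapping per-pen writing areas and filters responses."""

    def __init__(self):
        self.areas: Dict[str, Tuple[float, float, float, float]] = {}

    def responds(self, ev: TouchEvent) -> bool:
        area = self.areas.get(ev.pen_id)
        if area is None:
            return True        # no partition yet: a default shared area responds
        left, top, right, bottom = area
        x, y = ev.point
        return left <= x <= right and top <= y <= bottom
```

A real device would drive these objects from the touch controller's event stream; the sketch only reflects the unit boundaries named in the text.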
This application also provides a writing interaction system including a smart pen and the intelligent interactive display device described above, the intelligent interactive display device including a capacitive touch screen, a processor, and a computer-readable storage medium, the computer-readable storage medium storing a writing interaction program which, when executed, implements the writing interaction method described above.
In the writing interaction method, intelligent interactive display device, and writing interaction system provided by this application, the source of the first touch event, that is, the smart pen, is identified, and when it is detected that the written notes come from multiple smart pens, a corresponding number of writing areas is generated automatically. Besides covering the notes of its corresponding smart pen, each writing area generates handwriting only in response to touch events of that smart pen. This ensures that, when multiple people write with multiple pens, the generated writing areas are independent of one another, the content produced by different smart pens is not mixed up, and the written content remains clear, thereby improving the user experience when multiple people operate smart pens to write. In addition, because the writing areas are generated by matching the initial handwriting of each smart pen, in other words, a writing area is created along with a smart pen's initial notes and positioned according to that initial handwriting, a more flexible area layout can be achieved, better adapting to different application scenarios.
Other beneficial effects of this application will be set forth in the detailed description through the introduction of specific technical features and technical solutions, from which those skilled in the art should be able to understand the beneficial technical effects brought about by those technical features and technical solutions.
Brief Description of the Drawings
Preferred implementations of the writing interaction method, intelligent interactive display device, and writing interaction system according to this application are described below with reference to the accompanying drawings, in which:
Fig. 1 is a flowchart of a first implementation of the writing interaction method according to this application;
Fig. 2 is a flowchart of a second implementation of the writing interaction method according to this application;
Fig. 3 is a flowchart of a third implementation of the writing interaction method according to this application;
Fig. 4 is a flowchart of a fourth implementation of the writing interaction method according to this application;
Fig. 5 is a flowchart of a fifth implementation of the writing interaction method according to this application;
Fig. 6 is a flowchart of a sixth implementation of the writing interaction method according to this application;
Fig. 7 is a flowchart of a seventh implementation of the writing interaction method according to this application;
Fig. 8 is a flowchart of an eighth implementation of the writing interaction method according to this application;
Fig. 9 is a schematic diagram of a first state in the implementation process of the writing interaction method according to this application;
Fig. 10 is a schematic diagram of a second state in the implementation process of the writing interaction method according to this application;
Fig. 11 shows the module structure of an implementation of the writing interaction system according to this application.
Detailed Description
This application provides a writing interaction method applied to the interaction between an intelligent interactive display device and a smart pen. Please refer to Fig. 1, which is a flowchart of a first implementation of the writing interaction method according to this application. The writing interaction method includes:
Step 100: detecting a first touch event of a smart pen and obtaining the identifier of the corresponding smart pen.
In this step, the smart pen can produce touch events on the screen of the intelligent interactive display device; these touch events include the above first touch event, and the type of touch event may be capacitive touch, infrared touch, and so on. Meanwhile, a communication connection is established between the smart pen and the intelligent interactive device, over which the identifier of the smart pen can be transmitted. For example, an active capacitive pen establishes a communication connection with the touch controller board of the capacitive touch screen through Microsoft's MPP (Microsoft Pen Protocol) or Wacom's proprietary AES (Active Electrostatic Solution) and sends its own identifier. It is understandable that the smart pen may also send its identifier through electromagnetic-induction input technology or infrared communication technology. Touch events of different smart pens may occur simultaneously or one after another in time. From an application point of view, although several people operate smart pens at the same time, when actually writing, the moments at which each person's pen first touches down generally differ. Technically, touch events that occur one after another in time are easier to match to the corresponding smart pen identifiers. As shown in Fig. 9, the initial stroke of the characters "一种" is a horizontal stroke and the initial stroke of the characters "设计" is a dot; the initial points of the two strokes may be produced at the same time or one after the other.
The writing interaction method further includes step 102: generating handwriting according to the first touch event, and determining, according to the identifiers of the smart pens, whether the generated handwriting corresponds to multiple smart pens.
In this step, although in the example shown in Fig. 1 the handwriting-generation step 102A is arranged before the step 102B of determining the number of smart pens, determining the number of smart pens is actually a matter of determining which smart pen a given first touch event belongs to and then determining the number of smart pens from the smart pen identifiers. On the surface the handwriting is associated with the smart pen identifier; in essence it is the first touch event that is associated with the smart pen identifier. The order of step 102A and step 102B therefore need not be limited. In addition, step 102B may also be performed after the first touch event has occurred; for example, in Fig. 9 the determination of step 102B is completed only after the characters "一种" and "设计" have been written.
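As a concrete illustration of steps 100 and 102, the Python fragment below keeps the generated handwriting grouped by the reported pen identifier and checks whether more than one identifier has produced handwriting so far. The identifier values and coordinates are illustrative only, and how a pen actually reports its identifier (MPP, AES, electromagnetic induction, or infrared) is outside the scope of this sketch.

```python
from collections import defaultdict
from typing import Dict, List, Tuple

# Strokes collected so far, keyed by the identifier reported by each smart pen.
# The identifiers (e.g. "010", "011") are illustrative values only.
strokes_by_pen: Dict[str, List[Tuple[float, float]]] = defaultdict(list)


def on_first_touch_event(pen_id: str, x: float, y: float) -> bool:
    """Record the touch point as handwriting (step 102A) and report whether
    the handwriting generated so far corresponds to more than one pen (102B)."""
    strokes_by_pen[pen_id].append((x, y))
    return len(strokes_by_pen) > 1


# Example: two people writing "一种" and "设计" with different pens.
on_first_touch_event("010", 120.0, 300.0)                 # one pen so far -> False
needs_split = on_first_touch_event("011", 900.0, 310.0)   # two pens -> True
```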
The writing interaction method further includes step 104: if the handwriting corresponds to multiple smart pens, generating multiple mutually non-overlapping writing areas, wherein the multiple writing areas correspond to the multiple smart pens one to one, each writing area covers the handwriting of its corresponding smart pen, and each writing area generates handwriting only in response to touch events of its corresponding smart pen.
In this step, when the multiple writing areas are generated, they need not completely fill the entire display area of the intelligent interactive display device. Optionally, the size and position of a writing area may be determined according to a customary size and the location where the initial first touch event occurred. For example, the two writing areas generated in Fig. 10 are both placed toward the top, and, following the default left-aligned typesetting mode, the left boundary of each writing area is placed close to the position where the initial first touch event occurred, that is, the position where the initial handwriting was produced. The left and right writing areas in Fig. 10 do not overlap each other, but in order to make full use of the display area the two writing areas may share a section of boundary. It is understandable that a writing area may be delimited in the form of a graphical window, or merely defined by colored lines. After the multiple writing areas are generated, each writing area is independent of the others. Suppose the left writing area in Fig. 10 belongs to the smart pen with identifier 010 and the right writing area belongs to the smart pen with identifier 011; then, when the smart pen with identifier 011 writes in the left writing area, its input will not be responded to by displaying handwriting. It is understandable that, if the handwriting corresponds to only one smart pen, a default writing area such as the display area of the entire screen is used, or only one writing area of suitable size is generated for that smart pen.
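A minimal sketch of the behaviour described for step 104, assuming rectangular writing areas and the illustrative identifiers 010 and 011 from Fig. 10; the class below is hypothetical and only shows the "respond only to the corresponding pen" rule.

```python
from dataclasses import dataclass
from typing import Tuple


@dataclass
class WritingArea:
    pen_id: str
    rect: Tuple[float, float, float, float]   # (left, top, right, bottom)

    def contains(self, x: float, y: float) -> bool:
        left, top, right, bottom = self.rect
        return left <= x <= right and top <= y <= bottom

    def respond(self, pen_id: str, x: float, y: float) -> bool:
        # A writing area only responds (generates handwriting) to touch
        # events of its own pen that fall inside it.
        return pen_id == self.pen_id and self.contains(x, y)


# Illustrative layout mirroring Fig. 10: two side-by-side, non-overlapping areas.
left_area = WritingArea("010", (0.0, 0.0, 960.0, 1080.0))
right_area = WritingArea("011", (960.0, 0.0, 1920.0, 1080.0))

assert not left_area.respond("011", 100.0, 200.0)   # pen 011 in the left area: ignored
assert right_area.respond("011", 1000.0, 200.0)     # pen 011 in its own area: handwriting
```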
In the writing interaction method, intelligent interactive display device, and writing interaction system provided by this application, the source of the first touch event, that is, the smart pen, is identified, and when it is detected that the written notes come from multiple smart pens, a corresponding number of writing areas is generated automatically. Besides covering the notes of its corresponding smart pen, each writing area generates handwriting only in response to touch events of that smart pen. This ensures that, when multiple people write with multiple pens, the generated writing areas are independent of one another, the content produced by different smart pens is not mixed up, and the written content remains clear, thereby improving the user experience when multiple people operate smart pens to write. In addition, because the writing areas are generated by matching the initial handwriting of each smart pen, in other words, a writing area is created along with a smart pen's initial notes and positioned according to that initial handwriting, a more flexible area layout can be achieved, better adapting to different application scenarios.
Further, referring to FIG. 2, which is a flowchart of the second embodiment of the writing interaction method according to the present application, after step 104 the writing interaction method further includes:
Step 206: detecting a second touch event of a smart pen and obtaining the identifier of the corresponding smart pen.
In this step, after the most recent generation of writing regions, a touch event of a smart pen is detected again, i.e. the second touch event of a smart pen is detected, and the identifier of the smart pen corresponding to this second touch event is matched. Associating every touch event of a smart pen with the corresponding pen identifier provides the data basis for the subsequent smart control.
The writing interaction method further includes step 208: determining, according to the identifier of the smart pen, whether the second touch event is produced by a newly added smart pen.
In this step, although the second touch event was detected in the previous step, it does not have to be responded to immediately by displaying the corresponding handwriting. Since every touch event is associated with a pen identifier, querying the recorded pen identifiers makes it possible to determine whether a new pen identifier has appeared, and thus to determine indirectly whether a smart pen, or an operator of a smart pen, has been added.
The writing interaction method further includes step 210: if so, determining, according to the position where the second touch event was produced, whether the second touch event was produced within an already generated writing region.
In this step, since the pen identifier corresponding to the second touch event belongs to a newly added smart pen, there is no existing writing region that matches this second touch event. To determine the appropriate response, it is first necessary to know whether the position where the second touch event was produced lies in the remaining part of the display area outside the writing regions or falls within one of the existing writing regions.
The writing interaction method further includes step 212: if so, generating no handwriting according to the second touch event.
In this step, under the principle that existing writing regions take priority and their sizes are not adjusted, if the second touch event is produced within an existing writing region, then, to preserve the independence of the existing writing regions with respect to the newly added smart pen, the second touch event of the newly added pen likewise does not trigger the generation of handwriting. For example, referring to FIG. 10, if a third smart pen with identifier 001 is added, no handwriting is generated when the pen with identifier 001 writes in either the left or the right writing region in FIG. 10.
The writing interaction method further includes step 214: if not, generating handwriting according to the second touch event and generating a new writing region for the newly added smart pen, the new writing region not overlapping the existing writing regions, covering the handwriting of the newly added smart pen, and responding by generating handwriting only to touch events of the newly added smart pen.
In this step, under the principle that existing writing regions take priority and their sizes are not adjusted, if the second touch event is produced in the remaining part of the display area, then, in order to respond to the second touch event of the newly added pen and generate the corresponding handwriting, a new writing region is created for that pen. For example, referring to FIG. 10, if a third smart pen with identifier 001 is added and its second touch event is produced near the lower edge of the display area, outside the existing left and right writing regions, a new writing region is generated.
In this embodiment, by establishing the principle that existing writing regions take priority, a high degree of independence of the existing writing regions is guaranteed, while in the remaining part of the display area a new writing region can still be created for the second touch event of a newly added smart pen, which preserves a certain degree of flexibility.
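A minimal sketch of this second-embodiment policy, reusing the hypothetical TouchEvent and WritingRegion types from the sketches above; make_region stands for whatever placement rule the device applies around the new stroke.

```python
from typing import Callable, List, Set

def on_second_touch(event: TouchEvent,
                    regions: List[WritingRegion],
                    known_pens: Set[str],
                    make_region: Callable[[TouchEvent], WritingRegion]) -> List[WritingRegion]:
    """Existing regions take priority and keep their size: a newly added pen
    gets its own region only when its touch lands in the free part of the
    display; inside an existing region it draws nothing."""
    if event.pen_id in known_pens:
        return regions                      # handled by the normal per-region response
    if any(r.contains(event.x, event.y) for r in regions):
        return regions                      # new pen inside an existing region: ignore
    return regions + [make_region(event)]   # new region placed around the new stroke
```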
Further, referring to FIG. 3, which is a flowchart of the third embodiment of the writing interaction method according to the present application, after step 104 the writing interaction method further includes:
Step 306: detecting a third touch event of a smart pen and obtaining the identifier of the corresponding smart pen.
Step 308: determining, according to the identifier of the smart pen, whether the third touch event is produced by a newly added smart pen.
Step 310: if so, determining, according to the position where the third touch event was produced, whether the third touch event was produced within an already generated writing region.
In this step, once the response strategy for newly added smart pens has been established, in order to carry out that strategy it is first necessary to know whether the position where the third touch event was produced lies in the remaining part of the display area outside the writing regions or falls within one of the existing writing regions.
The writing interaction method further includes step 312: if so, regenerating multiple non-overlapping writing regions, the multiple writing regions corresponding one-to-one to the multiple smart pens, each writing region covering the touch trace of its corresponding smart pen, and each writing region responding by generating handwriting only to touch events of its corresponding smart pen.
In this step, since the strategy is that a newly added smart pen takes priority over the existing writing regions, the multiple writing regions need to be regenerated each time a smart pen is added. For example, if the third touch event of the newly added pen falls in the left or the right writing region, the writing regions are regenerated in either case. As an illustration, if the newly added pen writes "交互" between the characters "一种" and "设计", the regenerated layout may consist of three writing regions, left, middle and right, with "交互" located in the middle one.
The writing interaction method further includes step 314: if not, generating a new writing region for the newly added smart pen while keeping the existing writing regions unchanged, the new writing region not overlapping the existing writing regions, covering the handwriting of the newly added smart pen, and responding by generating handwriting only to touch events of the newly added smart pen.
In this step, if the third touch event of the newly added pen falls in the remaining part of the display area, a new writing region is generated for the newly added pen and the existing writing regions remain unchanged. This responds flexibly to the third touch event of the newly added pen while preserving the independence of the existing writing regions. For example, if the third touch event of the newly added pen occurs in the lower right corner of the display area, a new writing region is generated for that pen in the lower right corner while the original left and right writing regions are kept unchanged.
In this embodiment, by establishing the principle that newly added smart pens take priority, it is guaranteed that a writing region can always be generated for a newly added pen; it can be understood that, under this principle, the sizes of the existing writing regions may be adjusted if necessary.
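A corresponding sketch of the third-embodiment policy, again using the hypothetical types above; relayout stands for the full regeneration of writing regions for all known pens.

```python
from typing import Callable, List, Set

def on_third_touch(event: TouchEvent,
                   regions: List[WritingRegion],
                   known_pens: Set[str],
                   relayout: Callable[[Set[str]], List[WritingRegion]],
                   make_region: Callable[[TouchEvent], WritingRegion]) -> List[WritingRegion]:
    """The newly added pen takes priority: writing inside an occupied region
    triggers a full re-layout for all pens; writing in the remaining area just
    adds one more region and keeps the existing regions unchanged."""
    if event.pen_id in known_pens:
        return regions
    if any(r.contains(event.x, event.y) for r in regions):
        return relayout(known_pens | {event.pen_id})   # regenerate every region
    return regions + [make_region(event)]              # keep existing regions as-is
```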
Further, referring to FIG. 4, which is a flowchart of the fourth embodiment of the writing interaction method according to the present application, the writing interaction method further includes:
Step 406: detecting a fourth touch event of a smart pen and obtaining the identifier of the corresponding smart pen.
Step 408: determining, according to the identifier of the smart pen, whether the fourth touch event is produced by a newly added smart pen.
Step 410: if the fourth touch event is not produced by a newly added smart pen, determining, according to the position where the fourth touch event was produced, whether the fourth touch event was produced within an already generated writing region.
In this step, it can be understood that in some application scenarios the fourth touch event, whether produced by an existing smart pen or by a newly added one, may fall either inside or outside the already generated writing regions. When considering whether to expand an already generated writing region, however, under the principle that writing regions must not overlap one another, it is in particular touch events falling in the remaining part of the display area that need to be examined.
The writing interaction method further includes step 412: if not, tentatively expanding, according to a preset rule, the writing region corresponding to the smart pen so that it covers the trace of the fourth touch event.
In this step, a writing region is usually rectangular, although in some embodiments it may also take other shapes such as an ellipse. The preset rule means expanding the corresponding writing region according to a predefined rule so that it encloses the trace of the fourth touch event. For example, referring to FIG. 10, the right writing region belongs to the smart pen with identifier 011: when this pen produces a fourth touch event near the lower edge of that writing region, the right writing region may be tentatively expanded downwards; when it produces a fourth touch event near the left edge of that writing region, the right writing region may be tentatively expanded to the left.
The writing interaction method further includes step 414: determining whether the tentatively expanded writing region overlaps any other writing region.
In this step, under the principle of preserving the independence of the existing writing regions, not every tentative expansion is acceptable: the expanded region must still not overlap any other writing region.
The writing interaction method further includes step 416: if it does, cancelling the expansion of the writing region corresponding to the smart pen and generating no handwriting according to the fourth touch event.
In this step, continuing the example of step 412, expanding to the left would overlap the left writing region, so the expansion of the right writing region is cancelled and no handwriting is generated according to the fourth touch event.
The writing interaction method further includes step 418: if it does not, expanding the writing region corresponding to the smart pen and generating handwriting according to the fourth touch event.
In this step, continuing the example of step 412, when expanding downwards, for example expanding the right writing region downwards until its lower edge is level with the lower edge of the left writing region, the writing region corresponding to the smart pen is actually expanded and the corresponding fourth touch event is responded to by generating handwriting.
In this embodiment, considering that the layout of the initially generated writing region may not be large enough for writing, when a touch event of the corresponding smart pen falls outside its original writing region the original region is tentatively expanded automatically, and is actually expanded provided it does not overlap any other writing region, thereby adapting more flexibly to the user's needs.
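The trial-expansion check of this embodiment can be sketched as an axis-aligned rectangle union followed by an overlap test; this assumes rectangular regions and reuses the hypothetical Point and WritingRegion types from the earlier sketch.

```python
from typing import List, Tuple

def try_expand(region: WritingRegion,
               stroke: List[Point],
               others: List[WritingRegion]) -> Tuple[WritingRegion, bool]:
    """Tentatively grow a rectangular region so it covers a stroke written just
    outside it; keep the expansion only if the grown rectangle overlaps no
    other writing region."""
    xs = [x for x, _ in stroke]
    ys = [y for _, y in stroke]
    candidate = WritingRegion(region.owner_pen_id,
                              min(region.left, *xs), min(region.top, *ys),
                              max(region.right, *xs), max(region.bottom, *ys))

    def overlaps(a: WritingRegion, b: WritingRegion) -> bool:
        return not (a.right <= b.left or b.right <= a.left or
                    a.bottom <= b.top or b.bottom <= a.top)

    if any(overlaps(candidate, other) for other in others):
        return region, False    # cancel the expansion; the stroke is not drawn
    return candidate, True      # expansion accepted; the stroke is drawn
```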
Further, the step of generating multiple non-overlapping writing regions if the handwriting corresponds to multiple smart pens specifically includes:
obtaining the position of the leftmost coordinate point of the handwriting corresponding to each smart pen;
determining whether the distance between two adjacent leftmost coordinate points is greater than or equal to a first preset distance;
if the distance between two adjacent leftmost coordinate points is greater than or equal to the first preset distance, generating multiple non-overlapping writing regions on the screen display area.
Further, generating multiple non-overlapping writing regions on the screen display area if the distance between two adjacent leftmost coordinate points is greater than or equal to the first preset distance includes:
if the distance between two adjacent leftmost coordinate points is greater than or equal to the first preset distance, setting vertical dividing lines with reference to the leftmost coordinate points other than the leftmost coordinate point close to the left side of the screen display area, so as to form the writing regions, where the first preset distance equals the product of δ and L, L is the horizontal length of the screen display area and δ is less than 0.5; or
if the distance between two adjacent leftmost coordinate points is greater than or equal to the first preset distance, using the vertical centre line of the screen display area as the dividing line to form the writing regions, where the first preset distance equals L/2 and L is the horizontal length of the screen display area.
Further, referring to FIG. 5, which is a flowchart of the fifth embodiment of the writing interaction method according to the present application, generating multiple non-overlapping writing regions specifically includes:
Step 541: obtaining the position of the leftmost coordinate point of the handwriting corresponding to each smart pen.
In this step, since left alignment is the most common layout, to accommodate it the leftmost coordinate point of each smart pen's handwriting is obtained as the reference point for generating the writing regions. For example, in FIG. 9 the leftmost point of the characters "一种" is P2 with coordinates (c, d), and the leftmost point of the characters "设计" is P1 with coordinates (a, b).
Generating multiple non-overlapping writing regions further includes step 543: determining whether the distance between two adjacent leftmost coordinate points is greater than or equal to the product of δ and L, where L is the horizontal length of the screen display area and δ is less than 0.5.
In this step, continuing the example of step 541 with δ = 1/3, it is determined whether the distance between P1 and P2 is greater than or equal to L/3, i.e. whether the value of a - c is greater than or equal to L/3. This distance can cover the possible horizontal length of the left-hand one of two horizontally adjacent pieces of handwriting, which guarantees that a vertical dividing line can be set between them. The horizontal length of each piece of handwriting is generally determined by the time needed for each determination of the number of smart pens and is usually less than L/3; of course, the determination time may also be extended by a certain period, which lengthens the span occupied by each pen's initial handwriting.
Generating multiple non-overlapping writing regions further includes step 545: if so, setting vertical dividing lines with reference to the leftmost coordinate points other than the leftmost coordinate point close to the left side of the screen display area, so as to form the writing regions.
In this step, the writing region corresponding to the handwriting whose leftmost coordinate point is close to the left side of the screen display area may take the left side of the display area as its boundary, like the left writing region in FIG. 10. For the handwriting of the remaining smart pens, vertical dividing lines are set in turn with reference to their leftmost points to form the corresponding writing regions. It can be understood that the most basic principle of "with reference to" is that handwriting must not be cut; thus, for example, the most extreme value of the dividing line X of the left writing region in FIG. 10 is X = a, where a is the abscissa of P1. On this basis, the dividing line X may also be shifted a certain distance to the left, i.e. X = a - a1, where a1 is a constant, for example a1 = 5 cm, which in practice may vary with the screen size.
In this embodiment, since the touch screen of a smart interactive display device is usually a wide screen, for example a 16:9 screen, and multiple operators also stand side by side along the horizontal direction, it is appropriate, when several people write at the same time, to consider only horizontal division of the display area to generate the writing regions, and the algorithm required by this arrangement is also comparatively simple.
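A sketch of the divider placement in this fifth embodiment, assuming the leftmost x-coordinates of each pen's initial handwriting have already been collected; the function name and the margin parameter (the constant a1 above) are illustrative only. The returned x-positions, together with 0 and L, bound the writing regions from left to right.

```python
from typing import List, Optional

def vertical_dividers(leftmost_xs: List[float],
                      screen_w: float,
                      delta: float = 1/3,
                      margin: float = 0.0) -> Optional[List[float]]:
    """When every pair of adjacent leftmost points is at least delta * L apart
    (delta < 0.5), place one vertical divider per pen except the pen nearest
    the left edge, referenced to that pen's leftmost point and optionally
    shifted left by `margin` so that no handwriting is cut."""
    xs = sorted(leftmost_xs)                         # e.g. [c, a] for P2 and P1 in FIG. 9
    for left, right in zip(xs, xs[1:]):
        if right - left < delta * screen_w:
            return None                              # points too close together: do not split
    return [max(x - margin, 0.0) for x in xs[1:]]    # dividers such as X = a - a1
```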
Further, referring to FIG. 6, which is a flowchart of the sixth embodiment of the writing interaction method according to the present application, step 104 specifically includes:
Step 642: if the handwriting corresponds to multiple smart pens, obtaining, when the generated handwriting corresponds to two smart pens, the position of the leftmost coordinate point of the handwriting corresponding to each smart pen.
In this step, as in step 541, since left alignment is the most common layout, the leftmost coordinate point of each smart pen's handwriting is obtained as the reference point for generating the writing regions.
Step 104 further includes step 644: determining whether the distance between the two adjacent leftmost coordinate points is greater than or equal to L/2, where L is the horizontal length of the screen display area.
In this step, referring to FIG. 9, if the distance between P1 and P2 is greater than or equal to L/2, i.e. the value of a - c is greater than or equal to L/2, the distance between the two is large enough to fully cover the possible horizontal length of the left-hand one of two horizontally adjacent pieces of handwriting, so there is greater flexibility in placing a vertical dividing line between them.
Step 104 further includes step 646: if so, using the vertical centre line of the screen display area as the dividing line to form the writing regions.
In this step, considering that there is ample flexibility in placing a vertical dividing line between the two pieces of handwriting, the vertical centre line of the display area is used as the boundary to form two writing regions of equal size, which makes it easier to lay out the two pieces of handwriting.
In this embodiment, when the generated handwriting corresponds to two smart pens and the two pieces of handwriting are sufficiently far apart, i.e. the distance between them reaches L/2 or more, then, for better layout and coordinated use of the display area, the vertical dividing line no longer needs to be set with reference to the leftmost coordinate points of the handwriting; instead, the vertical centre line of the display area is used as the boundary to form two writing regions of equal size.
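For the two-pen case of this sixth embodiment, the centre-line split can be sketched as follows; the function is illustrative and, when the leftmost points are closer than L/2, falls back to the δ·L divider rule of the previous embodiment.

```python
from typing import List, Optional, Tuple

def two_pen_split(x1: float, x2: float,
                  screen_w: float) -> Optional[List[Tuple[float, float]]]:
    """With exactly two pens whose leftmost points are at least L/2 apart,
    split at the vertical centre line into two regions of equal size."""
    if abs(x1 - x2) >= screen_w / 2:
        mid = screen_w / 2
        return [(0.0, mid), (mid, screen_w)]   # (left, right) bounds of each region
    return None                                # otherwise use the delta * L divider rule
```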
Further, referring to FIG. 7, which is a flowchart of the seventh embodiment of the writing interaction method according to the present application, step 104 includes:
Step 703A: if the handwriting corresponds to multiple smart pens, popping up an instruction window requesting the user to confirm whether multiple writing regions should be generated.
In this step, to give the user the choice of whether to generate multiple writing regions, a pop-up window requesting the user's confirmation is provided.
Step 104 further includes step 703B: monitoring the instruction command entered by the user in the instruction window.
In this step, the instruction command entered may be a touch event, a voice command or a character command entered via a keyboard. Monitoring this command makes it convenient for the machine to perform the subsequent actions according to the user's instruction.
Step 104 further includes step 703C: if a command from the user confirming the generation of multiple writing regions is received, generating multiple non-overlapping writing regions, the multiple writing regions corresponding one-to-one to the multiple smart pens, each writing region covering the handwriting of its corresponding smart pen, and each writing region responding by generating handwriting only to touch events of its corresponding smart pen.
In this step, the multiple writing regions are generated only when the confirmation command is received, so that the user's needs are met more accurately.
After the monitoring of the instruction command entered by the user in the instruction window, the writing interaction method further includes:
Step 705: if a command from the user cancelling the generation of multiple writing regions is received, keeping the original writing region.
In this step, if the cancel command is received, the original writing region is kept, for example multiple smart pens write together in the same writing region. It can be understood that, optionally, if a newly added smart pen writes in one of the already generated writing regions and the user enters the instruction command cancelling the regeneration of multiple writing regions, the identifier of one more smart pen may be added to that writing region, i.e. that writing region may then accept writing input from two smart pens.
In this embodiment, by providing a user confirmation flow before deciding whether to generate multiple writing regions, the smart interactive display device can meet the user's needs more accurately and thus deliver a better user experience.
Further, each writing region includes an editing area and a menu bar area; the editing area of each writing region covers the handwriting of its corresponding smart pen and responds to touch events of that smart pen. The menu bar area includes a writing main menu whose submenus include colour, eraser and stroke thickness.
In this embodiment, as shown in FIG. 10, a menu bar area is provided at the upper part of both the left and the right writing region; the square patterns in the menu bars in the figure represent graphical function buttons or icons, and everything outside the menu bar is the editing area. Of course, in some embodiments the menu bar area may also be hidden and called up only when needed. Specifically, the user can set the colour of the handwriting by invoking the colour command, erase generated handwriting by invoking the eraser, and set the thickness of the handwriting by invoking the stroke thickness command. Providing a menu bar area in every writing region allows the user to edit the handwriting in each writing region independently.
Further, referring to FIG. 8, which is a flowchart of the eighth embodiment of the writing interaction method according to the present application, the menu bar area includes a cancel-partition main menu, and the writing interaction method further includes:
Step 806: monitoring the command entered by the user in the menu bar.
In this step, similarly to step 703B, the command entered may be a touch event, a voice command or a character command entered via a keyboard. Monitoring this command makes it convenient for the machine to perform the subsequent actions according to the user's instruction.
The writing interaction method further includes step 810: when the cancel-partition command is detected, restoring the writing regions to the original single writing region and deleting the content in each of the writing regions.
In this step, when the user wants to cancel the partition, the corresponding trigger command can conveniently be entered in the menu bar to restore the writing regions to the original single region and delete their content, so that the screen is cleared quickly. For example, in a classroom scenario where a mathematics lesson is about to end and the next lesson is an English lesson, the display area of the smart interactive display device contains multiple writing regions and all operators have finished writing; the teacher or any operator can trigger the cancel-partition command to clear the screen quickly, making it convenient to use the writing interaction method of the smart interactive display device of the present application in the English lesson.
In this embodiment, when the user wants to cancel the multiple writing regions that have been generated and clear the screen, the cancel-partition main menu allows this to be done quickly.
The present application further provides a smart interactive display device for interacting with a smart pen. Referring to FIG. 11, the smart interactive display device includes a touch event matching module, a touch event response module and a writing region generation module, wherein
the touch event matching module includes a touch detection unit and a touch matching unit, the touch detection unit is configured to detect a first touch event of a smart pen, and the touch matching unit is configured to obtain the identifier of the smart pen and match it to the corresponding first touch event;
the touch event response module includes a handwriting generation unit and a quantity monitoring unit, the handwriting generation unit is configured to generate handwriting corresponding to the first touch event, and the quantity monitoring unit is configured to determine, according to the identifiers of the smart pens, whether the generated handwriting corresponds to multiple smart pens;
the writing region generation module includes a matching generation unit and a response control unit, the matching generation unit is configured to generate multiple non-overlapping writing regions when the generated handwriting corresponds to multiple smart pens, the multiple writing regions corresponding one-to-one to the multiple smart pens and each writing region covering the handwriting of its corresponding smart pen, and the response control unit is configured to control each writing region so that it responds by generating handwriting only to touch events of its corresponding smart pen.
Further, after the matching generation unit has generated the multiple non-overlapping writing regions,
the touch detection unit is further configured to detect a second touch event of a smart pen, and the touch matching unit is further configured to obtain the identifier of the smart pen and match it to the corresponding second touch event;
the quantity monitoring unit is further configured to determine, according to the identifier of the smart pen, whether the second touch event is produced by a newly added smart pen;
the response control unit is further configured to determine, according to the position where the second touch event was produced, whether the second touch event was produced within an already generated writing region;
if the second touch event was produced within an already generated writing region, the handwriting generation unit is configured to generate no handwriting according to the second touch event, i.e. the handwriting generation unit does not respond by generating handwriting;
if the second touch event was not produced within an already generated writing region, the handwriting generation unit is configured to generate handwriting according to the second touch event, and the matching generation unit is further configured to generate a new writing region for the newly added smart pen, the new writing region not overlapping the existing writing regions, covering the handwriting of the newly added smart pen, and responding by generating handwriting only to touch events of the newly added smart pen.
Further, after the matching generation unit has generated the multiple non-overlapping writing regions,
the touch detection unit is further configured to detect a third touch event of a smart pen, and the touch matching unit is further configured to obtain the identifier of the smart pen and match it to the corresponding third touch event;
the quantity monitoring unit is further configured to determine, according to the identifier of the smart pen, whether the third touch event is produced by a newly added smart pen;
if the third touch event is produced by a newly added smart pen, the response control unit is further configured to determine, according to the position where the third touch event was produced, whether the third touch event was produced within an already generated writing region;
if the third touch event was produced within an already generated writing region, the matching generation unit is further configured to regenerate multiple non-overlapping writing regions, the multiple writing regions corresponding one-to-one to the multiple smart pens, each writing region covering the touch trace of its corresponding smart pen, and each writing region responding by generating handwriting only to touch events of its corresponding smart pen;
if the third touch event was not produced within an already generated writing region, the matching generation unit is further configured to generate a new writing region for the newly added smart pen, the new writing region not overlapping the existing writing regions, covering the handwriting of the newly added smart pen, and responding by generating handwriting only to touch events of the newly added smart pen.
Further, after the matching generation unit has generated the multiple non-overlapping writing regions,
the touch detection unit is further configured to detect a fourth touch event of a smart pen, and the touch matching unit is further configured to obtain the identifier of the smart pen and match it to the corresponding fourth touch event;
the quantity monitoring unit is further configured to determine, according to the identifier of the smart pen, whether the fourth touch event is produced by a newly added smart pen;
if the fourth touch event is not produced by a newly added smart pen, the response control unit is further configured to determine, according to the position where the fourth touch event was produced, whether the fourth touch event was produced within an already generated writing region;
if the fourth touch event was not produced within an already generated writing region, the matching generation unit is further configured to tentatively expand, according to a preset rule, the writing region corresponding to the smart pen so that it covers the trace of the fourth touch event, and to determine whether the tentatively expanded writing region overlaps any other writing region;
if the tentatively expanded writing region overlaps another writing region, the matching generation unit is further configured to cancel the expansion of the writing region corresponding to the smart pen, and the handwriting generation unit is configured to generate no handwriting according to the fourth touch event;
if the tentatively expanded writing region does not overlap any other writing region, the matching generation unit is further configured to expand the writing region corresponding to the smart pen, and the handwriting generation unit is configured to generate handwriting according to the fourth touch event.
Further, the matching generation unit includes a reference point acquisition unit, a distance determination unit and a region generation unit, wherein
the reference point acquisition unit is configured to obtain, when the generated handwriting corresponds to multiple smart pens, the position of the leftmost coordinate point of the handwriting corresponding to each smart pen;
the distance determination unit is configured to determine whether the distance between two adjacent leftmost coordinate points is greater than or equal to a first preset distance;
if the result of the distance determination unit is yes, the region generation unit is configured to generate multiple non-overlapping writing regions on the screen display area.
Further, the region generation unit is specifically configured to:
if the distance between two adjacent leftmost coordinate points is greater than or equal to the first preset distance, set vertical dividing lines with reference to the leftmost coordinate points other than the leftmost coordinate point close to the left side of the screen display area, so as to form the writing regions, wherein the first preset distance equals the product of δ and L, L is the horizontal length of the screen display area and δ is less than 0.5; or,
if the distance between two adjacent leftmost coordinate points is greater than or equal to the first preset distance, use the vertical centre line of the screen display area as the dividing line to form the writing regions, wherein the first preset distance equals L/2 and L is the horizontal length of the screen display area.
Further, the writing region generation module further includes a partition pop-up unit and a command monitoring unit, wherein
the partition pop-up unit is configured to pop up, when the generated handwriting corresponds to multiple smart pens, an instruction window requesting the user to confirm whether multiple writing regions should be generated;
the command monitoring unit monitors the instruction command entered by the user in the instruction window;
when a command from the user confirming the generation of multiple writing regions is received, the matching generation unit is further configured to generate multiple non-overlapping writing regions, the multiple writing regions corresponding one-to-one to the multiple smart pens, each writing region covering the handwriting of its corresponding smart pen, and each writing region responding by generating handwriting only to touch events of its corresponding smart pen;
when a command from the user cancelling the generation of multiple writing regions is received, the matching generation unit is further configured to keep the original writing region.
Further, each writing region includes an editing area and a menu bar area; the editing area of each writing region covers the handwriting of its corresponding smart pen and responds to touch events of that smart pen. The menu bar area includes a writing main menu whose submenus include colour, eraser and stroke thickness.
Further, the writing region generation module further includes a command monitoring unit that monitors the command entered by the user in the menu bar; when the command monitoring unit detects the cancel-partition command, the matching generation unit is further configured to restore the writing regions to the original single writing region and delete the content in each of the writing regions.
For detailed descriptions of the modules and units of the smart interactive display device involved in the embodiments of the present application, reference may be made to the detailed descriptions of the corresponding embodiments of the writing interaction method, which are not repeated here.
The present application further provides a smart interactive display device including a capacitive touch screen, a processor and a computer-readable storage medium, wherein the computer-readable storage medium stores a writing interaction program, and the writing interaction program, when executed, implements the writing interaction method described above. For the specific steps of the writing interaction method, reference is made to the above embodiments; since this smart interactive display device adopts all the technical solutions of all the above embodiments, it has at least all the beneficial effects brought about by those technical solutions, which are not repeated here one by one.
The present application further provides a writing interaction system including a smart pen and the smart interactive display device described above that includes the various modules. For the specific structure of the smart interactive display device, reference is made to the above embodiments; since this smart interactive display device adopts all the technical solutions of all the above embodiments, it has at least all the beneficial effects brought about by those technical solutions, which are not repeated here one by one.
The present application further provides a writing interaction system including an active capacitive smart pen and the smart interactive display device described above that includes the computer storage medium. For the specific structure of the smart interactive display device, and for the steps that can be implemented when the writing interaction program is executed, reference is made to the above embodiments; since this writing interaction system adopts all the technical solutions of all the above embodiments, it has at least all the beneficial effects brought about by those technical solutions, which are not repeated here one by one.
It should be noted that, as used herein, the terms "include", "comprise" or any other variant thereof are intended to cover a non-exclusive inclusion, so that a process, method, article or system that includes a series of elements includes not only those elements but also other elements not explicitly listed, or elements inherent to such a process, method, article or system. Without further limitation, an element defined by the phrase "including a ..." does not exclude the presence of additional identical elements in the process, method, article or system that includes that element.
Herein, relational terms such as first and second are used only to distinguish one entity or operation from another and do not necessarily require or imply any actual relationship or order between these entities or operations.
The above serial numbers of the embodiments of the present application are for description only and do not represent the superiority or inferiority of the embodiments.
From the description of the above embodiments, those skilled in the art can clearly understand that the methods of the above embodiments can be implemented by means of software plus a necessary general-purpose hardware platform, and of course also by hardware or by software alone, but in many cases the former is the better implementation. Based on this understanding, the technical solution of the present application, in essence or in the part contributing to the prior art, can be embodied in the form of a software product: the computer software product is stored in a storage medium as described above (such as a ROM/RAM, a magnetic disk or an optical disc) and includes several instructions for causing a terminal device (which may be a mobile phone, a computer, a server, an air conditioner, a network device, or the like) to perform the methods described in the embodiments of the present application.
Those skilled in the art can understand that, provided there is no conflict, the above solutions can be freely combined and superimposed.
It should be understood that the above embodiments are merely exemplary and not restrictive; various obvious or equivalent modifications or replacements of the above details that can be made by those skilled in the art without departing from the basic principles of the present application shall all fall within the scope of the claims of the present application.

Claims (10)

  1. A writing interaction method, comprising:
    detecting a first touch event of a smart pen and obtaining an identifier of the corresponding smart pen;
    generating handwriting according to the first touch event, and determining, according to the identifiers of the smart pens, whether the generated handwriting corresponds to multiple smart pens;
    if the handwriting corresponds to multiple smart pens, generating multiple non-overlapping writing regions, wherein the multiple writing regions correspond one-to-one to the multiple smart pens, each writing region covers the handwriting of its corresponding smart pen, and each writing region responds by generating handwriting only to touch events of its corresponding smart pen.
  2. The writing interaction method according to claim 1, further comprising:
    detecting a second touch event of a smart pen and obtaining the identifier of the corresponding smart pen;
    determining, according to the identifier of the smart pen, whether the second touch event is produced by a newly added smart pen;
    if the second touch event is produced by a newly added smart pen, determining, according to the position where the second touch event was produced, whether the second touch event was produced within an already generated writing region;
    if the second touch event was produced within an already generated writing region, generating no handwriting according to the second touch event;
    if the second touch event was not produced within an already generated writing region, generating handwriting according to the second touch event and generating a new writing region for the newly added smart pen, wherein the new writing region does not overlap the existing writing regions, covers the handwriting of the newly added smart pen, and responds by generating handwriting only to touch events of the newly added smart pen.
  3. The writing interaction method according to claim 1, further comprising:
    detecting a third touch event of a smart pen and obtaining the identifier of the corresponding smart pen;
    determining, according to the identifier of the smart pen, whether the third touch event is produced by a newly added smart pen;
    if the third touch event is produced by a newly added smart pen, determining, according to the position where the third touch event was produced, whether the third touch event was produced within an already generated writing region;
    if the third touch event was produced within an already generated writing region, regenerating multiple non-overlapping writing regions, wherein the multiple writing regions correspond one-to-one to the multiple smart pens, each writing region covers the touch trace of its corresponding smart pen, and each writing region responds by generating handwriting only to touch events of its corresponding smart pen;
    if the third touch event was not produced within an already generated writing region, generating a new writing region for the newly added smart pen, wherein the new writing region does not overlap the existing writing regions, covers the handwriting of the newly added smart pen, and responds by generating handwriting only to touch events of the newly added smart pen.
  4. The writing interaction method according to claim 1, further comprising:
    detecting a fourth touch event of a smart pen and obtaining the identifier of the corresponding smart pen;
    determining, according to the identifier of the smart pen, whether the fourth touch event is produced by a newly added smart pen;
    if the fourth touch event is not produced by a newly added smart pen, determining, according to the position where the fourth touch event was produced, whether the fourth touch event was produced within an already generated writing region;
    if the fourth touch event was not produced within an already generated writing region, tentatively expanding, according to a preset rule, the writing region corresponding to the smart pen so that it covers the trace of the fourth touch event;
    determining whether the tentatively expanded writing region overlaps any other writing region;
    if the tentatively expanded writing region overlaps another writing region, cancelling the expansion of the writing region corresponding to the smart pen and generating no handwriting according to the fourth touch event;
    if the tentatively expanded writing region does not overlap any other writing region, expanding the writing region corresponding to the smart pen and generating handwriting according to the fourth touch event.
  5. The writing interaction method according to claim 1, wherein the step of generating multiple non-overlapping writing regions if the handwriting corresponds to multiple smart pens specifically comprises:
    obtaining the position of the leftmost coordinate point of the handwriting corresponding to each smart pen;
    determining whether the distance between two adjacent leftmost coordinate points is greater than or equal to a first preset distance;
    if the distance between two adjacent leftmost coordinate points is greater than or equal to the first preset distance, generating multiple non-overlapping writing regions on the screen display area.
  6. The writing interaction method according to claim 5, wherein generating multiple non-overlapping writing regions on the screen display area if the distance between two adjacent leftmost coordinate points is greater than or equal to the first preset distance comprises:
    if the distance between two adjacent leftmost coordinate points is greater than or equal to the first preset distance, setting vertical dividing lines with reference to the leftmost coordinate points other than the leftmost coordinate point close to the left side of the screen display area, so as to form the writing regions, wherein the first preset distance equals the product of δ and L, L is the horizontal length of the screen display area and δ is less than 0.5; or
    if the distance between two adjacent leftmost coordinate points is greater than or equal to the first preset distance, using the vertical centre line of the screen display area as the dividing line to form the writing regions, wherein the first preset distance equals L/2 and L is the horizontal length of the screen display area.
  7. The writing interaction method according to claim 1, wherein the step of generating multiple non-overlapping writing regions if the handwriting corresponds to multiple smart pens comprises:
    if the handwriting corresponds to multiple smart pens, popping up an instruction window requesting the user to confirm whether multiple writing regions should be generated;
    monitoring the instruction command entered by the user in the instruction window;
    if a command from the user confirming the generation of multiple writing regions is received, generating multiple non-overlapping writing regions, wherein the multiple writing regions correspond one-to-one to the multiple smart pens, each writing region covers the handwriting of its corresponding smart pen, and each writing region responds by generating handwriting only to touch events of its corresponding smart pen,
    wherein, after the monitoring of the instruction command entered by the user in the instruction window, the method further comprises:
    if a command from the user cancelling the generation of multiple writing regions is received, keeping the original writing region.
  8. A smart interactive display device for interacting with a smart pen, the smart interactive display device comprising a touch event matching module, a touch event response module and a writing region generation module, wherein
    the touch event matching module includes a touch detection unit and a touch matching unit, the touch detection unit is configured to detect a first touch event of a smart pen, and the touch matching unit is configured to obtain the identifier of the smart pen and match it to the corresponding first touch event;
    the touch event response module includes a handwriting generation unit and a quantity monitoring unit, the handwriting generation unit is configured to generate handwriting corresponding to the first touch event, and the quantity monitoring unit is configured to determine, according to the identifiers of the smart pens, whether the generated handwriting corresponds to multiple smart pens;
    the writing region generation module includes a matching generation unit and a response control unit, the matching generation unit is configured to generate multiple non-overlapping writing regions when the generated handwriting corresponds to multiple smart pens, the multiple writing regions corresponding one-to-one to the multiple smart pens and each writing region covering the handwriting of its corresponding smart pen, and the response control unit is configured to control each writing region so that it responds by generating handwriting only to touch events of its corresponding smart pen.
  9. A smart interactive display device comprising a capacitive touch screen, a processor and a computer-readable storage medium, wherein the computer-readable storage medium stores a writing interaction program, and the writing interaction program, when executed, implements the writing interaction method according to any one of claims 1-7.
  10. A writing interaction system comprising a smart pen and the smart interactive display device according to claim 8 or 9.
PCT/CN2020/100379 2020-03-24 2020-07-06 书写交互方法、智能交互显示设备以及书写交互系统 WO2021189706A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US17/913,853 US11861160B2 (en) 2020-03-24 2020-07-06 Writing interaction method, smart interactive display device and writing interaction system
EP20927976.9A EP4123439A4 (en) 2020-03-24 2020-07-06 WRITING INTERACTION METHOD, INTELLIGENT INTERACTIVE DISPLAY DEVICE AND WRITING INTERACTION SYSTEM

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010213966.9 2020-03-24
CN202010213966.9A CN111352570B (zh) 2020-03-24 2020-03-24 书写交互方法、智能交互显示设备以及书写交互系统

Publications (1)

Publication Number Publication Date
WO2021189706A1 true WO2021189706A1 (zh) 2021-09-30

Family

ID=71194555

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/100379 WO2021189706A1 (zh) 2020-03-24 2020-07-06 书写交互方法、智能交互显示设备以及书写交互系统

Country Status (4)

Country Link
US (1) US11861160B2 (zh)
EP (1) EP4123439A4 (zh)
CN (1) CN111352570B (zh)
WO (1) WO2021189706A1 (zh)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113934323A (zh) * 2021-10-19 2022-01-14 河北师达教育科技有限公司 基于智能黑板的多点显示方法、装置和终端设备
CN115937861A (zh) * 2022-11-30 2023-04-07 广州市保伦电子有限公司 一种基于触摸屏的多人同步书写识别方法及系统

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111352570B (zh) 2020-03-24 2021-06-01 深圳市鸿合创新信息技术有限责任公司 书写交互方法、智能交互显示设备以及书写交互系统
CN112578987A (zh) * 2020-12-25 2021-03-30 广州壹创电子科技有限公司 屏外交互式触摸一体机及其交互方法
CN113434064B (zh) * 2021-07-01 2022-03-29 掌阅科技股份有限公司 手写阅读器笔锋切换方法、电子设备和存储介质
WO2024113271A1 (zh) * 2022-11-30 2024-06-06 京东方科技集团股份有限公司 智能手写显示设备、智能手写显示方法、电子设备

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130050101A1 (en) * 2011-08-24 2013-02-28 Dexin Corporation Wireless transmission method for touch pen with wireless storage and forwarding capability and system thereof
CN104571815A (zh) * 2014-12-15 2015-04-29 联想(北京)有限公司 一种显示窗口的匹配方法及电子设备
CN107515690A (zh) * 2017-07-06 2017-12-26 广州视源电子科技股份有限公司 电磁屏书写操作方法及电磁屏
CN108829327A (zh) * 2018-05-07 2018-11-16 广州视源电子科技股份有限公司 交互智能设备的书写方法和装置
CN109840046A (zh) * 2017-11-29 2019-06-04 鸿合科技股份有限公司 触摸屏书写处理方法及装置
CN111352570A (zh) * 2020-03-24 2020-06-30 深圳市鸿合创新信息技术有限责任公司 书写交互方法、智能交互显示设备以及书写交互系统

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3994183B2 (ja) * 1998-07-28 2007-10-17 キヤノン株式会社 表示制御装置、表示制御方法、及び記憶媒体
JP5664303B2 (ja) * 2011-02-09 2015-02-04 大日本印刷株式会社 コンピュータ装置、入力システム、及びプログラム
KR20130138880A (ko) * 2012-06-11 2013-12-20 삼성전자주식회사 단말기의 터치 입력 제어장치 및 방법
CN105320390A (zh) * 2014-06-20 2016-02-10 鸿合科技有限公司 基于电磁白板的双人手写识别方法、装置及电磁笔
KR102523154B1 (ko) * 2016-04-22 2023-04-21 삼성전자주식회사 터치 스크린 장치, 입력 장치 및 그 제어 방법
CN106339135A (zh) 2016-08-30 2017-01-18 科盟(福州)电子科技有限公司 一种支持多人独立操作的红外电子白板a/b分屏方法
WO2018042583A1 (ja) * 2016-09-01 2018-03-08 株式会社ワコム スタイラス、センサコントローラ、及び電子定規
CN106547402A (zh) * 2016-10-31 2017-03-29 广州华欣电子科技有限公司 一种触控方法、触摸框和智能笔
CN106775314A (zh) 2016-12-09 2017-05-31 珠海市魅族科技有限公司 分屏显示方法及分屏显示装置
CN107491210B (zh) * 2017-08-14 2020-06-09 广州视源电子科技股份有限公司 多电磁笔书写区分方法、装置及电子设备
CN108319391B (zh) * 2018-01-31 2021-03-02 海信视像科技股份有限公司 一种边写边擦实现方法、装置及终端设备
CN108919983B (zh) * 2018-06-15 2021-10-26 广州视源电子科技股份有限公司 基于智能笔操作的书写启动方法及系统
JP7299754B2 (ja) * 2019-05-22 2023-06-28 シャープ株式会社 情報処理装置、情報処理方法、及び情報処理プログラム

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130050101A1 (en) * 2011-08-24 2013-02-28 Dexin Corporation Wireless transmission method for touch pen with wireless storage and forwarding capability and system thereof
CN104571815A (zh) * 2014-12-15 2015-04-29 联想(北京)有限公司 一种显示窗口的匹配方法及电子设备
CN107515690A (zh) * 2017-07-06 2017-12-26 广州视源电子科技股份有限公司 电磁屏书写操作方法及电磁屏
CN109840046A (zh) * 2017-11-29 2019-06-04 鸿合科技股份有限公司 触摸屏书写处理方法及装置
CN108829327A (zh) * 2018-05-07 2018-11-16 广州视源电子科技股份有限公司 交互智能设备的书写方法和装置
CN111352570A (zh) * 2020-03-24 2020-06-30 深圳市鸿合创新信息技术有限责任公司 书写交互方法、智能交互显示设备以及书写交互系统

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP4123439A4 *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113934323A (zh) * 2021-10-19 2022-01-14 河北师达教育科技有限公司 基于智能黑板的多点显示方法、装置和终端设备
CN113934323B (zh) * 2021-10-19 2023-12-29 河北师达教育科技有限公司 基于智能黑板的多点显示方法、装置和终端设备
CN115937861A (zh) * 2022-11-30 2023-04-07 广州市保伦电子有限公司 一种基于触摸屏的多人同步书写识别方法及系统
CN115937861B (zh) * 2022-11-30 2023-09-08 广东保伦电子股份有限公司 一种基于触摸屏的多人同步书写识别方法及系统

Also Published As

Publication number Publication date
CN111352570B (zh) 2021-06-01
EP4123439A4 (en) 2024-04-24
US20230315282A1 (en) 2023-10-05
CN111352570A (zh) 2020-06-30
US11861160B2 (en) 2024-01-02
EP4123439A1 (en) 2023-01-25

Similar Documents

Publication Publication Date Title
WO2021189706A1 (zh) 书写交互方法、智能交互显示设备以及书写交互系统
CN108829327B (zh) 交互智能设备的书写方法和装置
CN109284059A (zh) 笔迹绘制方法、装置、交互智能平板和存储介质
CN106445442B (zh) 一种三屏同步显示方法
CN110609654B (zh) 数据同步显示方法、装置、设备以及远程会议系统
JP2008118301A (ja) 電子黒板システム
CN108469945A (zh) 一种显示终端及其控制方法、装置和存储介质
CN108604173A (zh) 图像处理装置、图像处理系统和图像处理方法
CN108595401A (zh) 批注同步系统、方法、装置、设备和存储介质
CN109697004B (zh) 用于触摸设备书写批注的方法、装置、设备及存储介质
CN110442264A (zh) 一种触摸数据处理方法、装置、设备及存储介质
CN110109635A (zh) 投屏显示桌、投屏显示方法、装置、控制器、设备和介质
CN111580903B (zh) 实时投票方法、装置、终端设备和存储介质
US9285962B2 (en) Display with shared control panel for different input sources
CN109062491A (zh) 交互智能设备的笔迹处理方法和装置
KR20040043454A (ko) 펜 컴퓨팅 시스템에서의 펜 입력 방법 및 장치
CN108845757A (zh) 一种智能交互平板的触控输入方法及装置、计算机可读存储介质、智能交互平板
CN107770253A (zh) 远程控制方法及系统
CN106648432A (zh) 用于大屏显示设备的控制方法和控制装置
CN107679219B (zh) 匹配方法及装置、交互智能平板及存储介质
JP2015203989A (ja) 表示制御プログラム、表示制御装置及び表示制御方法
US20190243521A1 (en) Information processing apparatus and information processing method
CN109491732A (zh) 一种虚拟控件显示方法、装置及车载显示屏
CN104731451B (zh) 信息处理方法及电子设备
JP2018077921A (ja) 表示制御プログラム及び表示制御装置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20927976

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2020927976

Country of ref document: EP

Effective date: 20221024