CN116137915A - Cross-device drawing system - Google Patents

Cross-device drawing system

Info

Publication number
CN116137915A
CN116137915A
Authority
CN
China
Prior art keywords
color
texture
target area
information
user operation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202280002664.4A
Other languages
Chinese (zh)
Inventor
肖冬 (Xiao Dong)
Current Assignee
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Publication of CN116137915A
Legal status: Pending

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03: Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041: Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/044: Digitisers, e.g. for touch screens or touch pads, characterised by capacitive transducing means
    • G06F 3/0441: Digitisers using active external devices, e.g. active pens, for receiving changes in electrical potential transmitted by the digitiser, e.g. tablet driving signals
    • G06F 3/0447: Position sensing using the local deformation of sensor cells
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 11/00: 2D [Two Dimensional] image generation
    • G06T 11/001: Texturing; Colouring; Generation of texture or colour

Abstract

Embodiments of this application provide a cross-device drawing system comprising a first device and a second device. While drawing handwriting on the second device, a user can select a color (and optionally a texture) on the first device; the first device sends the selected color to the second device, and the second device then displays the handwriting in that color. In this way, the user can pick up a color across devices with a stylus and draw handwriting in that color, so the drawing process is no longer limited to the colors available on the second device. This enriches the styles in which an electronic device can display handwriting and can improve the user experience.

Description

Cross-device drawing system

Technical Field
Embodiments of this application relate to smart-device technology, and in particular to a cross-device drawing system.
Background
With the development of electronic devices, touch-screen electronic devices paired with styluses have become common tools for drawing handwriting and performing operations. A user may use a stylus to draw handwriting on an electronic device or to operate controls displayed on its interface. When drawing handwriting with the stylus, the user can tap a writing-color control displayed on the electronic device to change the color of the drawn handwriting.
At present, electronic devices in the prior art offer only a limited range of handwriting display styles.
Disclosure of Invention
Embodiments of this application provide a cross-device drawing system that can enrich the styles in which electronic devices display handwriting.
In a first aspect, embodiments of this application provide a cross-device drawing system that includes a first device and a second device.
The first device is configured to: display a first graphical interface in response to a first user operation on the display screen of the first device; display a second graphical interface, different from the first graphical interface, in response to a second user operation on the display screen of the first device; and select a first target area on the second graphical interface in response to a third user operation on the display screen of the first device, where the first target area includes a first color.
The second device is configured to: display handwriting in the first color in response to a fourth user operation on the display screen of the second device.
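As a minimal sketch of this first-aspect flow, the color information for a selected target area could be derived by averaging the pixels inside it. The `ColorInfo` type and `pick_color` function below are illustrative names, not part of the claimed system, and the patent does not specify how the first color is computed from the target area.

```python
from dataclasses import dataclass

@dataclass
class ColorInfo:
    """Color information the first device would send to the second device."""
    r: int
    g: int
    b: int

def pick_color(pixels, x0, y0, x1, y1):
    """Average the RGB values inside the user-selected target area.

    `pixels` is a row-major list of rows of (r, g, b) tuples; the
    rectangle [x0, x1) x [y0, y1) is the selected target area.
    """
    total = [0, 0, 0]
    count = 0
    for y in range(y0, y1):
        for x in range(x0, x1):
            for i in range(3):
                total[i] += pixels[y][x][i]
            count += 1
    return ColorInfo(*(c // count for c in total))
```

The second device would then render subsequent strokes using the received `ColorInfo` rather than a color chosen from its own palette.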
In one possible implementation, the second device is further configured to: select a second target area on the graphical interface displayed by the second device in response to a fifth user operation on the display screen of the second device, where the second target area includes a second color.
Accordingly, the first device is further configured to: display handwriting in the second color in response to a sixth user operation on the display screen of the first device.
In one possible implementation, the first device is further configured to: detect the third user operation on the display screen of the first device in response to receiving a first instruction from a stylus, where the first instruction is sent by the stylus upon detecting that the stylus performs a first preset action; or detect the third user operation on the display screen of the first device in response to detecting that the stylus performs a second preset action; or detect the third user operation on the display screen of the first device in response to detecting that the first device performs a third preset action.
In one possible implementation, the first target area further includes a first texture.
The second device is further configured to: display handwriting combining the first color and the first texture in response to the fourth user operation on the display screen of the second device.
In one possible implementation, the first device is further configured to: acquire texture information of the first texture after the first target area is selected on the second graphical interface.
In one possible implementation, the first device is specifically configured to: detect whether the first target area contains a repeated pattern. If the first target area contains a repeated pattern, capture a screenshot of one instance of the pattern to obtain an image of the pattern, acquire vector data of the pattern based on the image of the pattern, and use the vector data of the pattern as the texture information of the first texture. If the first target area does not contain a repeated pattern, capture a screenshot of the first target area to obtain an image of the first target area, acquire vector data of the first target area based on the image of the first target area, and use the vector data of the first target area as the texture information of the first texture.
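The branch above (capture one instance of a repeated pattern, otherwise capture the whole target area) can be sketched as follows. `capture_for_texture` and its `tile_size` parameter are hypothetical names, and the subsequent raster-to-vector step described in the patent is out of scope here.

```python
def capture_for_texture(region, tile_size=None):
    """Choose what gets screenshotted for texture extraction.

    `region` is the selected target area as a list of pixel rows. When
    pattern detection has found a repeating tile, `tile_size` gives its
    (height, width) and only one instance of the tile is captured;
    otherwise the entire target area is captured.
    """
    if tile_size is None:
        return [row[:] for row in region]      # no repeat: whole area
    th, tw = tile_size
    return [row[:tw] for row in region[:th]]   # one instance of the pattern
```

Capturing a single tile keeps the texture information compact while still describing the whole area, since the tile repeats across it.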
In one possible implementation, the first device is specifically configured to: divide the first target area into a plurality of grid cells, each of a first preset size; obtain a first similarity between the patterns in every pair of grid cells; and, if at least one first similarity is greater than or equal to a preset similarity and the proportion of pairs whose first similarity is greater than or equal to the preset similarity is greater than or equal to a preset proportion, determine that the first target area contains a repeated pattern.
In one possible implementation, the first device is specifically configured to: if no first similarity is greater than or equal to the preset similarity, or if the proportion is smaller than the preset proportion, increase the size of the grid cells and obtain a second similarity between the patterns in every pair of the enlarged grid cells; if at least one second similarity is greater than or equal to the preset similarity and the proportion of pairs whose second similarity is greater than or equal to the preset similarity is greater than or equal to the preset proportion, determine that the first target area contains a repeated pattern; and if no second similarity is greater than or equal to the preset similarity, or if that proportion is smaller than the preset proportion, continue increasing the grid-cell size until it reaches a second preset size.
In one possible implementation, the first device is further configured to: select a third target area on the second graphical interface in response to a seventh user operation on the display screen of the first device, where the third target area includes a third color.
The second device is further configured to: display handwriting in the color obtained by fusing the first color and the third color, in response to the fourth user operation on the display screen of the second device.
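The patent does not fix a fusion rule for the first and third colors; a per-channel average is one plausible sketch (the function name is illustrative, and real implementations might blend with weights or in another color space):

```python
def fuse_colors(c1, c2):
    """Fuse two picked (r, g, b) colors into one by per-channel averaging.

    This is an assumed fusion rule; the patent only requires that the
    resulting fused color information be sent to the second device.
    """
    return tuple((a + b) // 2 for a, b in zip(c1, c2))
```

For example, fusing a picked red with a picked blue yields a purple that the second device would use for subsequent handwriting.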
In one possible implementation, the first device is further configured to: select a third target area on the second graphical interface in response to a seventh user operation on the display screen of the first device, where the third target area includes a third texture.
Accordingly, the second device is further configured to: display handwriting combining the first color and the third texture in response to the fourth user operation on the display screen of the second device.
In one possible implementation, the first target area further includes a first texture, and the first device is further configured to: after the first target area is selected on the second graphical interface, display a color control and a texture control to be selected on the second graphical interface; and detect an eighth user operation selecting the color control and/or the texture control.
Accordingly, the second device is configured to: display handwriting in the first color and/or with the first texture in response to the fourth user operation on the display screen of the second device.
In one possible implementation, the first device is further configured to: select a third target area on the second graphical interface in response to a seventh user operation on the display screen of the first device, where the third target area includes a third color and a third texture; display a color control and a texture control to be selected on the second graphical interface; and detect a ninth user operation selecting the color control and/or the texture control.
Accordingly, the second device is further configured to: display handwriting combining the first color, the first texture, the third color, and/or the third texture in response to the fourth user operation on the display screen of the second device.
In a second aspect, an embodiment of this application provides a handwriting drawing method, which may be executed by a first device or by a chip in the first device, and the method may include: displaying a first graphical interface in response to a first user operation on the display screen of the first device; displaying a second graphical interface, different from the first graphical interface, in response to a second user operation on the display screen of the first device; selecting a first target area on the second graphical interface in response to a third user operation on the display screen of the first device, where the first target area includes a first color; and sending color information of the first color to a second device, where the color information of the first color instructs the second device to display handwriting in the first color.
In embodiments of this application, the user can pick up a color across devices with the stylus and then draw handwriting in that color, so the drawing process is no longer limited to the colors available on the second device. This enriches the styles in which an electronic device can display handwriting and can improve the user experience.
In one possible implementation, the method further includes: displaying handwriting in a second color in response to a sixth user operation on the display screen of the first device, where the second color is the color of a second target area selected on the graphical interface of the second device.
In one possible implementation, before responding to the third user operation on the display screen of the first device, the method further includes: detecting the third user operation on the display screen of the first device in response to receiving a first instruction from a stylus, where the first instruction is sent by the stylus upon detecting that the stylus performs a first preset action; or detecting the third user operation on the display screen of the first device in response to detecting that the stylus performs a second preset action; or detecting the third user operation on the display screen of the first device in response to detecting that the first device performs a third preset action.
In one possible implementation, the first target area further includes a first texture, and sending the color information of the first color to the second device includes: sending the color information of the first color and texture information of the first texture to the second device.
In one possible implementation, after the first target area is selected on the second graphical interface, the method further includes: acquiring the texture information of the first texture.
In one possible implementation, acquiring the texture information of the first texture includes: detecting whether the first target area contains a repeated pattern; if it does, capturing a screenshot of one instance of the pattern to obtain an image of the pattern, acquiring vector data of the pattern based on the image of the pattern, and using the vector data of the pattern as the texture information of the first texture; if it does not, capturing a screenshot of the first target area to obtain an image of the first target area, acquiring vector data of the first target area based on the image of the first target area, and using the vector data of the first target area as the texture information of the first texture.
In one possible implementation, detecting whether the first target area contains a repeated pattern includes: dividing the first target area into a plurality of grid cells, each of a first preset size; obtaining a first similarity between the patterns in every pair of grid cells; and, if at least one first similarity is greater than or equal to a preset similarity and the proportion of pairs whose first similarity is greater than or equal to the preset similarity is greater than or equal to a preset proportion, determining that the first target area contains a repeated pattern.
In one possible implementation, the method further includes: if no first similarity is greater than or equal to the preset similarity, or if the proportion is smaller than the preset proportion, increasing the size of the grid cells and obtaining a second similarity between the patterns in every pair of the enlarged grid cells; if at least one second similarity is greater than or equal to the preset similarity and the proportion of pairs whose second similarity is greater than or equal to the preset similarity is greater than or equal to the preset proportion, determining that the first target area contains a repeated pattern; and if no second similarity is greater than or equal to the preset similarity, or if that proportion is smaller than the preset proportion, continuing to increase the grid-cell size until it reaches a second preset size.
In one possible implementation, after the first target area is selected on the second graphical interface, the method further includes: selecting a third target area on the second graphical interface in response to a seventh user operation on the display screen of the first device, where the third target area includes a third color; and fusing the first color and the third color to obtain fused color information. Sending the color information of the first color to the second device then includes: sending the fused color information to the second device, where the fused color information instructs the second device to display handwriting in the color obtained by fusing the first color and the third color.
In one possible implementation, after the first target area is selected on the second graphical interface, the method further includes: selecting a third target area on the second graphical interface in response to a seventh user operation on the display screen of the first device, where the third target area includes a third texture; and fusing the first color and the third texture to obtain fusion information.
Sending the color information of the first color to the second device then includes: sending the fusion information to the second device, where the fusion information instructs the second device to display handwriting combining the first color and the third texture.
In one possible implementation, the first target area further includes a first texture, and after the first target area is selected on the second graphical interface, the method further includes: displaying a color control and a texture control to be selected on the second graphical interface; and detecting an eighth user operation selecting the color control and/or the texture control.
In one possible implementation, after detecting the eighth user operation selecting the color control and/or the texture control, the method further includes: selecting a third target area on the second graphical interface in response to a seventh user operation on the display screen of the first device, where the third target area includes a third color and a third texture; displaying a color control and a texture control to be selected on the second graphical interface; detecting a ninth user operation selecting the color control and/or the texture control; and fusing first information indicated by the eighth user operation with second information indicated by the ninth user operation to obtain fusion information, where the first information is color information of the first color and/or texture information of the first texture, and the second information is color information of the third color and/or texture information of the third texture.
Sending the color information of the first color to the second device then includes: sending the fusion information to the second device, where the fusion information instructs the second device to display handwriting combining the first color, the first texture, the third color, and/or the third texture.
In a third aspect, an embodiment of this application provides a handwriting drawing method, applied to a second device, where the method includes: receiving information of a first color from a first device; and displaying handwriting in the first color in response to a fourth user operation on the display screen of the second device.
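On the receiving side, applying the received color information to a drawn stroke might look like the following sketch; the data shapes (a list of touch samples, an (r, g, b) tuple, a list of renderable point records) are assumptions, not part of the claimed method.

```python
def render_stroke(points, color_info):
    """On the second device, attach the received color to a stroke.

    `points` is the list of (x, y) touch samples captured during the
    fourth user operation; `color_info` is the (r, g, b) value received
    from the first device. Every sample is rendered in that color.
    """
    return [{"pos": p, "color": color_info} for p in points]
```

The key point is that the stroke's color comes from the cross-device message rather than from the second device's own palette.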
In one possible implementation, the method further includes: selecting a second target area on the graphical interface displayed by the second device in response to a fifth user operation on the display screen of the second device, where the second target area includes a second color; and sending color information of the second color to the first device, where the color information of the second color instructs the first device to display handwriting in the second color.
In one possible implementation, receiving the information of the first color from the first device includes: receiving color information of the first color and texture information of a first texture from the first device; and displaying handwriting combining the first color and the first texture in response to the fourth user operation on the display screen of the second device.
In one possible implementation, receiving the information of the first color from the first device includes: receiving fused color information from the first device, where the fused color information is obtained by fusing a first color of a first target area and a third color of a third target area on the first device; and displaying handwriting in the color obtained by fusing the first color and the third color in response to the fourth user operation on the display screen of the second device.
In one possible implementation, receiving the information of the first color from the first device includes: receiving fusion information from the first device, where the fusion information is obtained by fusing a first color of a first target area and a third texture of a third target area on the first device; and displaying handwriting combining the first color and the third texture in response to the fourth user operation on the display screen of the second device.
In one possible implementation, receiving the information of the first color from the first device includes: receiving fusion information from the first device, where the fusion information is obtained by fusing a first color and a first texture of a first target area with a third color and/or a third texture of a third target area on the first device; and displaying handwriting combining the first color, the first texture, the third color, and/or the third texture in response to the fourth user operation on the display screen of the second device.
In a fourth aspect, an embodiment of this application provides a handwriting drawing method, applied to a stylus, where the method includes: receiving texture information from a first device, and transmitting the texture information to a second device.
In one possible implementation, the method further includes: sending a first instruction to the first device in response to detecting that the stylus performs a first preset action, where the first instruction instructs the first device to acquire texture information of a target area selected by the user on an interface of the first device.
In one possible implementation, after receiving the texture information from the first device, the method further includes: displaying the texture represented by the texture information.
In a fifth aspect, an embodiment of the present application provides a handwriting drawing apparatus, the handwriting drawing apparatus being a first device or a chip in the first device, the handwriting drawing apparatus including:
a display module, configured to display a first graphical interface in response to a first user operation on the display screen of the first device, and to display a second graphical interface, different from the first graphical interface, in response to a second user operation on the display screen of the first device;
a processing module, configured to select a first target area on the second graphical interface in response to a third user operation on the display screen of the first device, where the first target area includes a first color; and
a transceiver module, configured to send color information of the first color to the second device, where the color information of the first color instructs the second device to display handwriting in the first color.
In one possible implementation, the display module is further configured to display handwriting in a second color in response to a sixth user operation on the display screen of the first device, where the second color is the color of a second target area selected on the graphical interface of the second device.
In one possible implementation, the processing module is further configured to detect the third user operation on the display screen of the first device in response to receiving a first instruction from the stylus, where the first instruction is sent by the stylus upon detecting that the stylus performs a first preset action; or detect the third user operation on the display screen of the first device in response to detecting that the stylus performs a second preset action; or detect the third user operation on the display screen of the first device in response to detecting that the first device performs a third preset action.
In one possible implementation, the first target area further includes a first texture.
The transceiver module is specifically configured to send the color information of the first color and texture information of the first texture to the second device.
In one possible implementation, the processing module is further configured to acquire the texture information of the first texture.
In one possible implementation, the processing module is specifically configured to detect whether the first target area contains a repeated pattern; if it does, capture a screenshot of one instance of the pattern to obtain an image of the pattern, acquire vector data of the pattern based on the image of the pattern, and use the vector data of the pattern as the texture information of the first texture; if it does not, capture a screenshot of the first target area to obtain an image of the first target area, acquire vector data of the first target area based on the image of the first target area, and use the vector data of the first target area as the texture information of the first texture.
In one possible implementation, the processing module is specifically configured to divide the first target area into a plurality of grid cells, each of a first preset size; obtain a first similarity between the patterns in every pair of grid cells; and, if at least one first similarity is greater than or equal to a preset similarity and the proportion of pairs whose first similarity is greater than or equal to the preset similarity is greater than or equal to a preset proportion, determine that the first target area contains a repeated pattern.
In one possible implementation, the processing module is specifically configured to: if no first similarity is greater than or equal to the preset similarity, or if the proportion is smaller than the preset proportion, increase the size of the grid cells and obtain a second similarity between the patterns in every pair of the enlarged grid cells; if at least one second similarity is greater than or equal to the preset similarity and the proportion of pairs whose second similarity is greater than or equal to the preset similarity is greater than or equal to the preset proportion, determine that the first target area contains a repeated pattern; and if no second similarity is greater than or equal to the preset similarity, or if that proportion is smaller than the preset proportion, continue increasing the grid-cell size until it reaches a second preset size.
In one possible implementation, the processing module is further configured to select a third target area on the second graphical interface in response to a seventh user operation on the display screen of the first device, where the third target area includes a third color, and to fuse the first color and the third color to obtain fused color information.
The transceiver module is specifically configured to send the fused color information to the second device, where the fused color information instructs the second device to display handwriting in the color obtained by fusing the first color and the third color.
In one possible implementation, the processing module is further configured to select a third target area on the second graphical interface in response to a seventh user operation on the display screen of the first device, where the third target area includes a third texture, and to fuse the first color and the third texture to obtain fusion information.
The transceiver module is specifically configured to send the fusion information to the second device, where the fusion information instructs the second device to display handwriting combining the first color and the third texture.
In one possible implementation, the first target area further includes a first texture. The display module is further configured to display a color control and a texture control to be selected on the second graphical interface, and the processing module is further configured to detect an eighth user operation selecting the color control and/or the texture control.
In one possible implementation, the processing module is further configured to select a third target area on the second graphical interface in response to a seventh user operation on the display screen of the first device, where the third target area includes a third color and a third texture.
The display module is further configured to display a color control and a texture control to be selected on the second graphical interface.
The processing module is further configured to detect a ninth user operation selecting the color control and/or the texture control, and to fuse first information indicated by the eighth user operation with second information indicated by the ninth user operation to obtain fusion information, where the first information is color information of the first color and/or texture information of the first texture, and the second information is color information of the third color and/or texture information of the third texture.
The transceiver module is specifically configured to send the fusion information to the second device, where the fusion information instructs the second device to display handwriting combining the first color, the first texture, the third color, and/or the third texture.
In a sixth aspect, an embodiment of the present application provides a handwriting drawing apparatus, where the handwriting drawing apparatus is a second device or a chip in the second device, and the handwriting drawing apparatus includes:
A transceiver module, configured to receive information of the first color from the first device.
A display module, configured to display handwriting of the first color in response to a fourth user operation on a display screen of the second device.
In one possible implementation, the display module is further configured to select a second target area on a graphical interface displayed by the second device in response to a fifth user operation on the display screen of the second device, where the second target area includes a second color.
The transceiver module is configured to send color information of the second color to the first device, where the color information of the second color is used to instruct the first device to display handwriting of the second color.
In a possible implementation manner, the transceiver module is specifically configured to receive color information of the first color and texture information of a first texture from the first device.
The display module is specifically configured to display, in response to the fourth user operation on the display screen of the second device, handwriting that combines the first color and the first texture.
In one possible implementation manner, the transceiver module is specifically configured to receive the fused color information from the first device, where the fused color information is information obtained by fusing a first color of a first target area and a third color of a third target area on the first device.
The display module is specifically configured to display, in response to the fourth user operation on the display screen of the second device, handwriting of the color obtained by fusing the first color and the third color.
In one possible implementation manner, the transceiver module is specifically configured to receive fusion information from the first device, where the fusion information is information obtained by fusing a first color of a first target area and a third texture of a third target area on the first device.
The display module is specifically configured to display, in response to the fourth user operation on the display screen of the second device, handwriting that combines the first color and the third texture.
In one possible implementation manner, the transceiver module is specifically configured to receive fusion information from the first device, where the fusion information is obtained by fusing a first color and a first texture of a first target area on the first device with a third color and/or a third texture of a third target area.
The display module is specifically configured to display, in response to the fourth user operation on the display screen of the second device, handwriting that combines the first color, the first texture, the third color, and/or the third texture.
In a seventh aspect, an embodiment of the present application provides a handwriting drawing device, which may be a stylus or a chip in the stylus, including:
A transceiver module, configured to receive texture information from the first device and send the texture information to the second device.
In one possible implementation, the handwriting drawing device further includes a processing module configured to detect an action of the stylus.
The transceiver module is further configured to send a first instruction to the first device in response to the processing module detecting that the stylus performs a first preset action, where the first instruction is used to instruct the first device to acquire texture information of a target area selected by a user on an interface of the first device.
In one possible implementation, the handwriting drawing device further includes a display module configured to display the texture represented by the texture information.
In an eighth aspect, embodiments of the present application provide an electronic device, which may be the first device of the second aspect, the second device of the third aspect, or the stylus of the fourth aspect. The electronic device may include a processor and a memory. The memory is configured to store computer-executable program code, where the program code includes instructions; when the instructions are executed by the processor, the electronic device performs the methods in the second to fourth aspects.
In one embodiment, an electronic device may include a display.
In a ninth aspect, an embodiment of the present application provides an electronic device, where the electronic device may be the handwriting drawing device of the fourth aspect, the handwriting drawing device of the fifth aspect, or the handwriting drawing device of the sixth aspect. The electronic device may comprise means, modules or circuits for performing the methods provided in the second, third, fourth aspects above.
In a tenth aspect, embodiments of the present application provide a computer program product comprising instructions which, when run on a computer, cause the computer to perform the method of the second, third and fourth aspects described above.
In an eleventh aspect, embodiments of the present application provide a computer-readable storage medium having instructions stored therein, which when run on a computer, cause the computer to perform the methods in the second, third, and fourth aspects described above.
For the advantages of the foregoing first aspect and of the possible implementations of the third to eleventh aspects, refer to the advantages of the foregoing second aspect; details are not repeated here.
Drawings
FIG. 1 is a schematic diagram of an interface of a conventional electronic device;
FIG. 2A is a schematic diagram of a cross-device drawing system provided by an embodiment of the present application;
FIG. 2B is a schematic diagram of interactions in a cross-device drawing system provided in an embodiment of the present application;
FIG. 2C is another schematic diagram of a cross-device drawing system provided by an embodiment of the present application;
FIG. 3 is a schematic diagram of interaction between an electronic device and a stylus according to an embodiment of the present application;
FIG. 4 is another schematic diagram of interaction between an electronic device and a stylus according to an embodiment of the present application;
fig. 5 is a schematic diagram of changes in the capacitance sampling value at positions corresponding to the TP sensor provided in an embodiment of the present application;
FIG. 6 is another interaction diagram in a cross-device drawing system provided in an embodiment of the present application;
fig. 7 is a schematic view of a scenario provided in an embodiment of the present application;
FIG. 8 is a schematic diagram of an interface according to an embodiment of the present application;
FIG. 9 is a schematic diagram of another interface provided in an embodiment of the present application;
FIG. 10 is another interaction diagram in a cross-device drawing system provided in an embodiment of the present application;
FIG. 11 is another interaction diagram in a cross-device drawing system provided in an embodiment of the present application;
FIG. 12 is another interaction diagram in a cross-device drawing system provided in an embodiment of the present application;
FIG. 13 is a schematic view of another scenario provided in an embodiment of the present application;
FIG. 14 is a schematic diagram of texture overlay according to an embodiment of the present application;
FIG. 15 is a schematic diagram of another texture overlay provided in an embodiment of the present application;
FIG. 16 is a schematic view of another scenario provided in an embodiment of the present application;
FIG. 17 is a schematic view of another scenario provided in an embodiment of the present application;
fig. 18 is a schematic structural diagram of a first device according to an embodiment of the present application;
fig. 19 is a schematic structural diagram of a second device according to an embodiment of the present application.
Detailed Description
Fig. 1 is a schematic diagram of an interface of a conventional electronic device. Referring to fig. 1, taking a tablet computer as an example of the electronic device, a color taking area 10 may be displayed on the electronic device, where the color taking area includes selectable color patches 11 of different colors. When a user draws handwriting on the electronic device using a stylus, the user may operate the stylus to select a color from the color patches 11 to change the color of the handwriting drawn by the stylus. At present, the colors of the color taking area 10 are fixed and limited, leaving the user little choice. It should be appreciated that different colors are represented in fig. 1 by different gray scales. In addition, when a garment designer draws a draft of a garment, the designer needs to draw not only the color of the garment but also its pattern; however, a conventional electronic device offers no patterns for the designer to select, so the designer must draw patterns manually, which is inefficient. In summary, in the prior art, the style of the handwriting displayed by an electronic device is limited. It should be understood that "interface" and "graphical interface" in the embodiments of the present application are understood to be graphical user interfaces (graphical user interface, GUI) of electronic devices.
When users go out, they often take many attractive pictures containing appealing patterns, or they see appealing patterns while browsing web pages. If the user could promptly extract such patterns from other devices for use on the electronic device, the user could draw handwriting without being limited to the colors built into the electronic device, which would improve the user experience. Therefore, the embodiments of the present application provide a cross-device drawing system in which a user can select a required color and/or texture on a first device and extract it to a second device, so that the user can use the color and/or texture directly on the second device. This enriches the handwriting styles of the electronic device and improves the user's drawing efficiency.
Fig. 2A is a schematic diagram of a cross-device drawing system according to an embodiment of the present application. Referring to fig. 2A, the cross-device drawing system may include a first device and a second device, and there may be a plurality of second devices. Fig. 2A and the following embodiments are illustrated with one second device as an example, where the first device is a mobile phone and the second device is a tablet computer (portable android device, PAD). The first device and the second device may be interconnected through a communication network. The communication network may be, but is not limited to: a WI-FI hotspot network, a WI-FI peer-to-peer (P2P) network, a Bluetooth network, a ZigBee network, or a near field communication (near field communication, NFC) network.
Based on the cross-device rendering system shown in fig. 2A, the following describes functions of the first device and the second device in conjunction with fig. 2B:
S201, in response to a first user operation on the first device display screen, displaying a first graphical interface.
In the embodiments of the present application, a user operation on the first device display screen or the second device display screen may include, but is not limited to: an operation performed on the display screen with a stylus, a user's finger, a mouse, a keyboard, or the like. "First", "second", and so on are used to distinguish multiple operations of a user on the display screen of the first device.
The first user operation may include, but is not limited to: a click, a slide, a long press, and the like.
S202, responding to a second user operation on the display screen of the first device, and displaying a second graphical interface, wherein the second graphical interface is different from the first graphical interface.
The second user operation may include, but is not limited to: a click, a slide, a long press, and the like.
S201 and S202 above, in which the electronic device displays a first graphical interface in response to a first user operation on the first device display screen and a second graphical interface in response to a second user operation, are intended to indicate that the electronic device may display different graphical interfaces; that is, the user may select the first target area on any graphical interface displayed by the electronic device.
For example, if the first user operation is an operation of opening a document, the first graphical interface may be the first page of the document, and if the second user operation is a swipe or a page turn, the second graphical interface may be the second page of the document. As another example, if the second user operation is an operation of switching to an album, the second graphical interface may be an interface displaying an image.
S203, responding to a third user operation on the display screen of the first device, and selecting a first target area on the second graphical interface, wherein the first target area comprises a first color.
The third user operation may be an operation by which the user selects the first target area on the second graphical interface. For example, the third user operation may be the user circling on the first device display screen to delineate the first target area, in which case the electronic device may select the area inside the circle as the first target area based on the circling operation.
In one embodiment, the first device may display the user's handwriting on the second graphical interface so that the user can select the first target area on the second graphical interface. The first device may acquire a first color of the first target area. When the first target area has only one color, the first color is that color. When the first target area has a plurality of colors, the first color is a color obtained by fusing the plurality of colors; the fused color may be, for example, the average of the RGB values of the plurality of colors.
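The fusion just described (averaging the RGB values of the colors in the target area) can be sketched as follows. This is an illustrative Python sketch under the stated assumption that fusion is a per-channel arithmetic mean; the function name is hypothetical:

```python
def fuse_colors(colors):
    """Fuse several RGB colors into one by averaging each channel.

    `colors` is a non-empty list of (R, G, B) tuples, components in 0-255.
    """
    if not colors:
        raise ValueError("at least one color is required")
    n = len(colors)
    # Average each channel independently and round to the nearest integer.
    return tuple(round(sum(c[i] for c in colors) / n) for i in range(3))

# A target area containing pure red and pure blue fuses to a purple tone.
fused = fuse_colors([(255, 0, 0), (0, 0, 255)])  # (128, 0, 128)
```

A single-color target area is the degenerate case: `fuse_colors([(10, 20, 30)])` returns the color unchanged, matching the "only one color" branch above.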
In one embodiment, a first device may send color information of a first color to a second device. The color information includes RGB values of the first color.
S204, displaying handwriting of the first color in response to a fourth user operation on the display screen of the second device.
The fourth user operation may be an operation of drawing handwriting for the user, such as drawing a line on a memo interface of the second device by the user. The second device may display the handwriting of the first color in response to a fourth user operation on a display screen of the second device.
In this embodiment of the present application, the user can select a color on the first device and draw handwriting in that color on the second device, without being limited to the colors provided by the second device, thereby enriching the styles of handwriting displayed by the electronic device.
Referring to the illustration in fig. 2B, in the embodiment of the present application, the user may also select the target area on the second device, and draw handwriting on the first device by using the color of the target area on the second device, in other words, the cross-device drawing system provided in the embodiment of the present application may further include:
S205, in response to a fifth user operation on the display screen of the second device, selecting a second target area on the graphical interface displayed by the second device, where the second target area includes a second color.
It will be appreciated that for the fifth user operation, refer to the above description of the third user operation. It should be noted that S205 may be performed after S204, or may be performed after the second device performs steps similar to S201 and S202 of the first device.
S206, displaying the handwriting of the second color in response to a sixth user operation on the display screen of the first device.
For the sixth user operation, refer to the related description of the fourth user operation above. It should be understood that S205 and S206 are not shown in fig. 2B.
That is, the first device and the second device in the embodiments of the present application may synchronize colors with each other, and one device may draw handwriting using the color of the target area on the other device. The following description will be given taking, as an example, selection of a target area on a first device and drawing of handwriting on a second device, and further, taking, as an example, selection of a target area on a first device by a user using a stylus and drawing of handwriting on a second device.
Fig. 2C is another schematic diagram of a cross-device rendering system provided in an embodiment of the present application. Referring to fig. 2C, the scene includes a first device, a second device, and a stylus.
In one embodiment, the stylus and the first device may be interconnected through a communication network, and the stylus and the second device may be interconnected through a communication network. The communication network may be, but is not limited to: a WI-FI hotspot network, a WI-FI peer-to-peer (P2P) network, a Bluetooth network, a ZigBee network, or a near field communication (near field communication, NFC) network. In such an embodiment, the stylus may select a color on the first device and synchronize the color to the second device, as shown by a in fig. 2C; for details, refer to the related description of fig. 6.
In one embodiment, the stylus and the first device may be interconnected through a communication network, the stylus and the second device may be interconnected through a communication network, and the first device and the second device may be interconnected through a communication network, where the communication network may be as described above. In this embodiment, the stylus may select a texture on the first device, and the texture may be synchronized directly from the first device to the second device, as shown by b in fig. 2C; for details, refer to the related description of fig. 10.
The first device and the second device in the embodiments of the present application may be electronic devices including a touch screen. Such an electronic device may be referred to as user equipment (user equipment, UE), a terminal (terminal), or the like; for example, the electronic device may be a mobile phone, a tablet computer (portable android device, PAD), a personal digital assistant (personal digital assistant, PDA), a handheld device with a wireless communication function, a computing device, a vehicle-mounted device, a mobile device, a virtual reality (virtual reality, VR) terminal device, an augmented reality (augmented reality, AR) terminal device, a wireless terminal in industrial control (industrial control), a wireless terminal in a smart home (smart home), or the like. The form of the electronic device is not specifically limited in the embodiments of the present application.
In the embodiments of the present application, the touch screens of the first device and the second device may have the same structure. Taking interaction between a stylus and an electronic device as an example, the following describes the principle by which the stylus interacts with the first device and with the second device:
fig. 3 is a schematic diagram of interaction between an electronic device and a stylus according to an embodiment of the present application. Referring to fig. 3, the electronic device includes: a touch panel, a display panel, a graphics processor (graphics processing unit, GPU), an application processor (application processor, AP), and a second communication module. The touch panel includes: a touch sensor (TP sensor) and a touch processing module. The display panel includes: a display screen and a display integrated circuit (IC) chip. The touch panel may be understood as the touch screen of the electronic device, which may also be referred to as a touchscreen. In one embodiment, the display panel and the touch panel may be collectively referred to as a screen or a display screen.
The stylus comprises: a micro-processing unit (micro controller unit, MCU), a first communication module, a transmit module (TX) and a receive module (RX).
The first communication module and the second communication module may be, for example, a bluetooth module, a wireless local area network module, a WI-FI module, or the like, for implementing communication between the electronic device and the stylus. It should be appreciated that the stylus and the electronic device may establish a wireless path through the first communication module and the second communication module, the wireless path being for transmitting wireless signals.
In an electronic device:
the touch sensor is composed of an electrode array that includes a plurality of electrodes arranged in rows and columns. The touch sensor is configured to collect touch data, which may include data of a stylus touching the touch screen and data of a user's finger touching the touch screen. The following embodiments are described by taking the example in which the touch data includes data of a stylus touching the touch screen.
The touch processing module is configured to determine the position of the stylus on the touch screen based on the touch data collected by the touch sensor and to send that position to the application processor; how the touch processing module determines the position is described with reference to fig. 4 and 5. In one embodiment, the touch processing module may be a touch IC chip, which may also be referred to as a touch chip and is represented in fig. 3 as a touch chip.
The display IC chip is configured to control the display screen to display an interface so that the user can see the interface of the electronic device.
The graphics processor is configured to process an image to obtain its color and texture; details are described in the following embodiments.
The application processor is configured to perform a corresponding operation based on the position of the stylus on the touch screen received from the touch chip.
In the touch pen:
the MCU is connected to the first communication module, the sending module, and the receiving module. The sending module may include a first electrode and a driving circuit, where the first electrode is connected to the driving circuit, and the driving circuit is connected to the MCU. The receiving module includes a second electrode and a decoding circuit, where the second electrode is connected to the decoding circuit, and the decoding circuit is connected to the MCU.
The MCU is configured to generate a pulse width modulation (pulse width modulation, PWM) signal and send the PWM signal to the driving circuit. The driving circuit may drive the first electrode to transmit a signal based on the PWM signal. The first electrode may be referred to as a transmitting electrode (TX) and may be disposed near the tip of the stylus.
The second electrode is configured to receive a signal from the TP sensor in the electronic device and transmit the signal to the decoding circuit. The decoding circuit is configured to decode the signal from the electronic device and send the decoded signal to the MCU. The second electrode may be referred to as a receiving electrode (RX). It should be appreciated that both the signal sent by the stylus through the first electrode and the signal sent by the electronic device through the TP sensor are square wave signals.
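Both directions use square wave signals, as noted above. As a rough illustration only (the patent does not specify the modulation parameters, so the period, sample resolution, and function name below are assumptions), an idealized square wave with a given duty cycle, such as the PWM signal the MCU sends to the driving circuit, can be modeled as:

```python
def pwm_samples(duty_cycle, period_samples=100, cycles=1):
    """Return an idealized PWM square wave as a list of 0/1 samples.

    `duty_cycle` is the fraction of each period spent high, in [0, 1].
    """
    if not 0.0 <= duty_cycle <= 1.0:
        raise ValueError("duty_cycle must be within [0, 1]")
    high = int(round(duty_cycle * period_samples))
    # One period: the high portion followed by the low portion.
    period = [1] * high + [0] * (period_samples - high)
    return period * cycles
```

For example, `pwm_samples(0.3)` yields 100 samples of which 30 are high; a real driving circuit would shape such a waveform into the signal emitted by the transmitting electrode.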
Referring to fig. 3, in one embodiment, the stylus may further include: a charging module and a sensor module. The charging module is used for charging the touch pen. The sensor module may include, but is not limited to: pressure sensor, acceleration sensor (accelerometer sensor, G-sensor), gyroscope, etc., which are not described in detail in this embodiment. The sensor module may be connected to the MCU.
It should be understood that the structure of the stylus shown in fig. 3 is an example. In one embodiment, two electrodes may be disposed in the stylus, where one electrode is TX, and the other electrode may be switched between TX and RX, which may be described with reference to the related art, and the number and principles of the electrodes in the stylus are not limited in this embodiment of the present application.
The structure of the electronic device and the stylus and the functions of their modules are described above. Referring to fig. 4, the tip of the stylus is provided with an electrode, and the touch sensor in the electronic device includes an electrode array. Insulating substances (such as air and cover glass) exist between the tip of the stylus and the electrodes of the touch sensor, so a capacitance can be formed between them; through this capacitance, the tip of the stylus and the touch sensor in the electronic device can be coupled to establish a circuit, and the path between them may be referred to as a circuit path. The stylus and the electronic device may exchange signals through the circuit path. It should be understood that in fig. 4, the first communication module in the stylus and the second communication module in the electronic device are both Bluetooth modules, and a Bluetooth path is established between the stylus and the electronic device.
When the tip of the stylus approaches the touch screen of the electronic device, the capacitance sampling value of the TP sensor in the touch screen changes, and the closer the tip is to the touch screen, the larger the change in the capacitance sampling value. Referring to fig. 5, the peaks in fig. 5 represent changes in the capacitance sampling value at the corresponding positions of the TP sensor. The touch chip in the touch screen may determine the position of the stylus on the touch screen based on the changes in the capacitance sampling values on the TP sensor; for example, the touch chip may take the position where the change in the capacitance sampling value is largest as the position of the stylus. It should be appreciated that fig. 5 marks the location where the stylus contacts the touch screen with a black dot.
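The positioning rule described above (treat the electrode position whose capacitance sampling value changed the most as the stylus position) might be sketched as follows. This is a hedged illustration: the grid representation, the noise floor, and the function name are assumptions, not the patent's implementation:

```python
def locate_stylus(delta_grid, noise_floor=5):
    """Return (row, col) of the TP-sensor electrode with the largest
    change in capacitance sampling value, or None if every change is
    at or below the noise floor (i.e. no stylus near the screen)."""
    best, best_pos = noise_floor, None
    for r, row in enumerate(delta_grid):
        for c, delta in enumerate(row):
            if delta > best:
                best, best_pos = delta, (r, c)
    return best_pos

# The peak (42) mimics the peak shown in fig. 5 at the stylus position.
grid = [
    [0, 1, 0, 0],
    [1, 9, 42, 3],
    [0, 2, 7, 1],
]
position = locate_stylus(grid)  # (1, 2)
```

The noise floor models the fact that small sampling fluctuations occur even without a stylus; only a change exceeding it is treated as a contact.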
Based on the above electronic device and the structure of the stylus, the cross-device drawing system provided in the embodiments of the present application is described below with reference to specific embodiments. The following embodiments may be combined with each other, and some embodiments may not be repeated for the same or similar concepts or processes.
In one embodiment, the first device and the second device are both wirelessly connected to the stylus. Fig. 6 is another interaction schematic diagram in the cross-device drawing system provided in the embodiment of the present application. Referring to fig. 6, the interaction flow of the first device and the second device may include:
S601, the first device receives a first instruction, where the first instruction is used to instruct the first device to obtain texture information of a target area selected by the stylus.
In one embodiment, based on the embodiment shown in fig. 2B above, the texture information in the embodiment shown in fig. 6 may be replaced with color information, which is the color information of the first color of the target area (i.e., the first target area on the first device as described in fig. 2B above).
In one embodiment, when a user needs to acquire a texture on the first device, the user may trigger the stylus to send the first instruction to the first device. The stylus may send the first instruction to the first device through the wireless path or the circuit path described above.
First, the user can hold the stylus and perform a first preset action to trigger the stylus to send the first instruction to the first device. In this embodiment, the stylus may detect its own action, and the stylus may send the first instruction to the first device in response to detecting that the user holds the stylus to perform the first preset action. It should be appreciated that the stylus may be configured with a G-sensor, a gyroscope, and the like, and the stylus may detect the action of the user holding the stylus based on data collected by the G-sensor, the gyroscope, and the like, to determine whether the user holds the stylus to perform the first preset action. The first preset action may include, but is not limited to: shaking, circling in the air, inverting the stylus, and the like. Inverting the stylus can be understood as holding the stylus so that its tail is closer to the ground than its tip. How the stylus detects its action based on the data collected by the G-sensor, the gyroscope, and the like is not repeated in this embodiment of the present application.
In one embodiment, a key may be disposed on the stylus, and the key may be a mechanical key or a touch key. When the stylus detects that the user operates the key, a first instruction may be sent to the first device.
In one embodiment, the first device determines that the first instruction from the stylus is received if the first device detects that the stylus performs a second preset action.
First, the second preset action may be, but is not limited to: the stylus double-clicking the touch screen of the first device, or the stylus long-pressing the touch screen of the first device.
The second preset action may also be: the stylus drawing a preset track on the touch screen of the first device. For example, the preset track may be a preset character, a preset letter, a preset shape, or the like. To help the first device distinguish a track drawn when the stylus writes from the preset track, the second preset action may be: the stylus drawing the preset track in a preset area of the touch screen of the first device, where the preset area may be, but is not limited to, the central area of the touch screen. For example, if the first device detects that the stylus draws the preset letter m in the central area of the touch screen, the first device may determine that the first instruction from the stylus is detected; if the first device detects that the stylus draws the letter m in another (non-preset) area of the touch screen, the first device may determine that the stylus is writing a note and display m at the corresponding position in that area. In the embodiments of the present application, the manner in which the first device distinguishes the track drawn when the stylus writes from the preset track, and the choice of the preset area, are examples.
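The disambiguation above (a recognized preset letter counts as an instruction only when drawn inside the preset area) could be sketched like this. The "middle third of the screen" definition of the central area, the function name, and the use of the stroke centroid are illustrative assumptions:

```python
def classify_stroke(points, screen_w, screen_h, matches_preset):
    """Classify a completed stroke as a command or ordinary ink.

    `points` is the stroke's list of (x, y) samples; `matches_preset`
    is whether a recognizer already matched the stroke to the preset
    track (e.g. the letter m). A matching stroke is a command only if
    its centroid lies in the central region of the screen.
    """
    if not matches_preset:
        return "ink"
    cx = sum(x for x, _ in points) / len(points)
    cy = sum(y for _, y in points) / len(points)
    in_center = (screen_w / 3 <= cx <= 2 * screen_w / 3
                 and screen_h / 3 <= cy <= 2 * screen_h / 3)
    return "command" if in_center else "ink"
```

An m drawn mid-screen would thus trigger the first instruction, while the same m drawn near an edge would simply be displayed as handwriting.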
Second, a preset control, such as a texture-taking control, may be displayed on the interface of the first device, and the first device may determine that the first instruction is received in response to detecting that the stylus operates the preset control.
In one embodiment, the first device may determine that the first instruction is received if it detects that the user performs the second preset action with a finger or a finger joint, or detects that the user operates the preset control with a finger or a finger joint. That is, in the embodiments of the present application, the first instruction may be input to the first device by the user using a finger, a knuckle, or the like.
In one embodiment, the first device may detect whether the first device performs a third preset action, and may determine that the first instruction is received in response to detecting that the first device performs the third preset action. The third preset action may include, but is not limited to: shaking, drawing a circle in the air, etc.
S602, the first device detects a target area selected by the stylus on an interface of the first device in response to receiving the first instruction.
It should be appreciated that the selection operation of the stylus on the interface of the first device may be referred to as a third user operation. The "interface of the first device" in S602 may be referred to as a second graphical interface.
The first device may detect a target area selected by the stylus on an interface of the first device in response to receiving the first instruction. As described above in relation to fig. 3-5, the first device may obtain the position of the stylus on the interface of the first device, and thus the first device may detect the target area selected by the stylus on the interface of the first device. It should be understood that the interface of the first device may be any interface, for example, an interface where the first device displays an image in an album of the first device, or an interface where the first device displays a web page, which is not limited in this embodiment of the present application, that is, in this embodiment of the present application, the target area may be selected by the stylus on any interface of the first device.
In this embodiment of the present application, the user may select the target area by using a manner of clicking the stylus on the interface of the first device, or drawing a preset shape, or the like. For example, when a user selects a target area in a click manner on an interface of the first device using a stylus, the first device may take as the target area a position clicked by the stylus. For example, when the user selects the target area with the stylus in such a manner as to draw a preset shape, the first device may determine that the area within the preset shape is the target area in response to detecting that the trajectory of the stylus is the preset shape.
It should be understood that S602 may be replaced with: the first device detects a target area selected by a user on an interface of the first device in response to receiving the first instruction. The user may select the target area on the interface of the first device by using a stylus, or may select the target area on the interface of the first device by using a finger, a knuckle, or the like. The following embodiments are described taking the example where the first device detects that the stylus selects the target area on the interface of the first device.
S602 can be understood as: the touch chip in the first device may obtain a target area selected by the touch pen on the interface of the first device based on the variation of the capacitance sampling value at the position corresponding to the TP sensor, and send the coordinates corresponding to the target area to the AP in the first device.
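The selection flow of S602 can be sketched in simplified form. The snippet below is an illustrative assumption, not the claimed implementation: it reduces the stylus trajectory reported by the touch chip to a rectangular target area by taking the bounding box of the sampled coordinates; matching the trajectory against a preset shape would be more elaborate.

```python
# Hypothetical sketch of S602: the touch chip reports sampled (y, x)
# coordinates of the stylus trajectory, and the target area is taken as
# the bounding rectangle of that trajectory. The coordinate convention
# and the bounding-box rule are assumptions for illustration.

def target_area_from_trajectory(points):
    """Return (top, left, bottom, right) bounds covering the drawn trajectory."""
    ys = [y for y, _ in points]
    xs = [x for _, x in points]
    return (min(ys), min(xs), max(ys) + 1, max(xs) + 1)
```

The resulting rectangle would be what the touch chip hands to the AP as "the coordinates corresponding to the target area".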
S603, the first device acquires texture information of the target area.
The first device may determine the content in the target area based on the content displayed by the interface of the first device and the location of the target area at the interface of the first device. The first device may analyze texture information of the target area based on content in the target area.
S603 can be understood as: and the AP extracts the content in the target area based on the coordinates corresponding to the target area and the content displayed on the interface of the first device. The AP may send the content in the target area to the GPU, which analyzes texture information of the target area in response to the content in the target area. The AP may intercept the content in the target area on the interface of the first device in a screenshot manner, and further send a screenshot including the content of the target area to the GPU. The following description uses the first device as an execution body for acquiring texture information of the target area, and describes a process of acquiring the texture information of the target area by the first device:
If the target area contains a plurality of identical patterns, those identical patterns can represent the texture of the target area. The first device may therefore draw a grid over the target area and examine the pattern in each grid cell to detect whether the target area contains identical patterns, and acquire the pattern if it does. Specifically, the first device may obtain a first similarity between the patterns in every two grid cells, and detect whether any first similarity is greater than or equal to a preset similarity. If so, the first device further detects the proportion of first similarities greater than or equal to the preset similarity among all first similarities; if the proportion is greater than or equal to a preset proportion, the first device determines that the target area contains identical patterns, and takes the identical pattern in the grid cells whose similarity is greater than or equal to the preset similarity as the repeated pattern of the target area.
It should be appreciated that the grid cells drawn by the first device have a first preset size, which in one embodiment may be 1 pixel. If, after drawing the grid, no first similarity is greater than or equal to the preset similarity, or the proportion of first similarities greater than or equal to the preset similarity is smaller than the preset proportion, the first device may increase the size of the grid cells and obtain a second similarity between the patterns in every two cells, so as to detect whether the target area contains identical patterns. If no second similarity is greater than or equal to the preset similarity, or the proportion of second similarities greater than or equal to the preset similarity is smaller than the preset proportion, the first device may again increase the size of the grid cells and obtain a third similarity between the patterns in every two cells, and so on, until the size of a grid cell reaches a second preset size, which is half of the area of the target area; the manner in which the first device increases the grid size is not limited. It should be understood that if the grid size reaches the second preset size and the first device still obtains no similarity greater than or equal to the preset similarity, or the proportion of such similarities is smaller than the preset proportion, the first device may determine that the target area does not contain identical patterns.
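The grid-growing search described above can be sketched as follows. This is a minimal illustration under assumptions: pixels are plain comparable values, similarity is the fraction of matching pixels, and the grid grows by one pixel per round; none of these choices are fixed by the embodiment beyond the 1-pixel start size and the half-area stop size.

```python
# Hypothetical sketch of the repeating-pattern search. Cell comparison,
# thresholds, and the growth policy are assumptions for illustration.

def cell_similarity(a, b):
    """Fraction of matching pixels between two equally sized cells."""
    flat_a = [p for row in a for p in row]
    flat_b = [p for row in b for p in row]
    same = sum(1 for x, y in zip(flat_a, flat_b) if x == y)
    return same / len(flat_a)

def extract_cells(area, size):
    """Split the target area into non-overlapping size x size cells."""
    h, w = len(area), len(area[0])
    return [
        [row[x:x + size] for row in area[y:y + size]]
        for y in range(0, h - size + 1, size)
        for x in range(0, w - size + 1, size)
    ]

def find_repeating_pattern(area, sim_threshold=0.9, ratio_threshold=0.5):
    """Grow the grid from 1 pixel until a repeated cell is found or the
    cell area would exceed half the target area."""
    h, w = len(area), len(area[0])
    max_size = int((h * w / 2) ** 0.5)   # cell area <= half the target area
    size = 1
    while size <= max_size:
        cells = extract_cells(area, size)
        if len(cells) < 2:
            break
        sims = [cell_similarity(cells[i], cells[j])
                for i in range(len(cells)) for j in range(i + 1, len(cells))]
        hits = [s for s in sims if s >= sim_threshold]
        if hits and len(hits) / len(sims) >= ratio_threshold:
            return cells[0]              # a representative repeated cell
        size += 1                        # growth step is an assumption
    return None                          # no repeating pattern found
```

If `find_repeating_pattern` returns `None`, the first device would fall back to treating the whole target area as the texture, as described below.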
Thus, in one embodiment, if the first device is able to acquire the repeated pattern of the target area, the first device may capture an image of the repeated pattern, obtaining scalar data of that image. If the first device is unable to acquire a repeated pattern of the target area, the first device may capture an image of the whole target area, obtaining scalar data of that image. In the embodiment of the present application, so that the texture remains clear when handwriting is drawn with the texture of the target area, the first device may convert the obtained scalar data into vector data. In one embodiment, the texture information includes the vector data.
Fig. 7 takes as an example the stylus sending the first instruction to the first device in response to detecting that the user holds the stylus and performs a preset shake action. Referring to a in fig. 7, the user draws handwriting on a second device (such as a tablet computer). If the user wants to use a texture on the first device (a mobile phone), the user shakes the stylus, and the stylus sends the first instruction to the first device, see b in fig. 7. Referring to c in fig. 7, the interface of the first device is an interface displaying an image; the user draws a circle on the interface using the stylus, and the first device may take the area within the circle as the target area. Illustratively, if the repeated pattern of the target area is a triangle, the texture information obtained by the first device based on the description in S603 includes: vector data of the triangle.
In one embodiment, the user may edit the texture of the target area in the first device, where the texture information obtained by the first device is texture information obtained after the user edits the texture of the target area. Illustratively, the editing process may be to adjust the depth of the texture of the target region, scale the texture of the target region, or the like.
Illustratively, in c of fig. 7, after the user draws a circle on the interface using the stylus, the first device may display a texture editing interface, as shown in fig. 8. The repeated pattern 81 of the target area, a depth editing control 82, and a scaling control 83 may be displayed on the editing interface. In fig. 8, the depth editing control 82 and the scaling control 83 are each represented by a progress bar, and the user adjusts the progress bars to adjust the depth of the texture of the target area and to scale the texture of the target area, respectively. It should be appreciated that adjusting the depth of the texture of the target area may be understood as: adjusting the contrast between the pattern of the target area and the background of the target area. For example, the user can adjust the depth progress bar to reduce the contrast between the triangle and the background, and adjust the scaling progress bar to enlarge the triangle.
In this embodiment, the first device may acquire the user's edit processing parameters for the texture of the target area, and correspondingly, the texture information obtained by the first device may include the edit processing parameters. Referring to fig. 8, the texture information may include a depth parameter (80%) and a scaling parameter (120%) in addition to the vector data of the triangle.
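A possible shape for texture information that carries the edit processing parameters of fig. 8 is sketched below; the field names, and the way the scaling parameter is applied to the vector data, are assumptions for illustration only.

```python
# Hypothetical layout of the texture information of S603 with the
# editing parameters of Fig. 8. Field names are assumptions; the depth
# parameter is stored for display-time contrast and not applied here.

def apply_edits(texture_info):
    """Scale the pattern's vector points by the stored scaling parameter."""
    scale = texture_info["scale"] / 100.0
    return [(x * scale, y * scale) for x, y in texture_info["vector_data"]]

texture_info = {
    "vector_data": [(0.0, 0.0), (10.0, 0.0), (5.0, 8.0)],  # triangle outline
    "depth": 80,    # contrast of pattern vs. background, in percent
    "scale": 120,   # zoom applied to the pattern, in percent
}
```

With the 120% scaling parameter of fig. 8, each point of the triangle is enlarged by a factor of 1.2 before the handwriting is rendered.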
S604, the first device sends texture information to the stylus.
In an embodiment of the present application, a storage module may be included in the stylus, where the storage module is configured to store texture information from the first device.
In one embodiment, if the stylus is provided with a display screen, after the stylus receives the texture information from the first device, the texture characterized by the texture information may be displayed on the display screen. Illustratively, referring to d in fig. 7, a triangle may be displayed on the display screen of the stylus.
S605, the stylus sends texture information to the second device.
In one embodiment, the stylus may send texture information to the second device in response to receiving the texture information from the first device.
In one embodiment, the tip of the stylus may be provided with a pressure sensor that may collect pressure data when the tip of the stylus contacts the touch screen of the second device. In this embodiment, in response to detecting that the pressure sensor collects pressure data, the stylus may determine that a tip of the stylus contacts the touch screen of the second device, which characterizes that a user needs to use the stylus to operate on the second device, and further the stylus may send texture information to the second device when contacting the touch screen of the second device.
In this manner, the second device may store texture information. In one embodiment, the second device includes a texture information memory therein, and the second device may store texture information in the texture information memory.
S606, the second device displays handwriting in the texture characterized by the texture information based on the position of the stylus on the touch screen.
When the stylus draws handwriting on the second device, the operation of drawing handwriting may be referred to as a fourth user operation, and the second device may detect a position of the stylus on a touch screen of the second device, and further display the handwriting on a corresponding position using a texture characterized by texture information from the stylus. After c in fig. 7 or fig. 8, if the user holds the stylus to draw handwriting on the second device, the second device may display the handwriting of the stylus in a triangle based on the position of the stylus on the touch screen, as shown with reference to d in fig. 7.
In one embodiment, upon receiving the texture information, the second device may display the texture characterized by the texture information in a brush tool of the second device for the user to query and select. Illustratively, an interface of an application program for drawing handwriting displayed on the second device is shown as a in fig. 9; a brush tool 91, a "last step" control 92, a "history" control 93, and a cancel control 94, among others, are displayed on the interface. In response to receiving the texture information, the second device may store a "triangle" texture in the brush tool 91; when the user operates the brush tool 91 with the stylus, the second device may display a plurality of selectable textures (including a triangle texture 911, a square texture 912, etc.), as shown at b in fig. 9. It should be understood that b in fig. 9 represents the controls as text; the second device may also represent the controls as pictures, symbols, or the like, which is not limited in the embodiment of the present application. The stylus is not shown in a, b, d, and e of fig. 9 for clarity of illustration of the interface changes of the second device.
In one embodiment, the user may also select a texture and edit the selected texture. In b in fig. 9, the second device may also display the depth editing control 82, the scaling control 83, a resize control 95, an inversion control 96, and the like. For the depth editing control 82 and the scaling control 83, reference may be made to the related description of fig. 7. The user may operate the resize control 95 so that the second device adjusts the size of the texture based on the thickness of the handwriting of the stylus; for example, the second device may enlarge the texture when the handwriting of the stylus is thickened. The user operates the inversion control 96 to invert the foreground and the background: illustratively, if the texture is black and the background is white, then after inversion the texture is white and the background is black. In this way, the user can edit the selected texture, and correspondingly, the second device can display the handwriting of the stylus with the edited texture based on the user's edit processing parameters. It should be understood that the above editing processes are exemplary illustrations, and the manner of editing the texture is not limited in the embodiment of the present application.
The second device may store the used textures; when the user operates the "history" control 93 with the stylus, the second device may display the used textures, which may include squares, circles, etc., as shown by d in fig. 9. It should be understood that textures are characterized by shape in the embodiments of the present application for purposes of illustration.
When the user operates the "last step" control 92 with the stylus, the second device may query the used texture stored in the "history" and display the handwriting of the stylus with the last used texture. Referring to c in fig. 9, if the texture last used by the second device is square, the second device may display handwriting in square.
The cancel control 94 is used to cancel the currently drawn handwriting, and illustratively, when the user operates the cancel control 94 with the stylus, the second device may delete the handwriting of the stylus displayed in a triangle, as shown by e in fig. 9.
In one embodiment, S606 may be replaced with: the second device displays the handwriting in a texture characterized by texture information based on the position of the user's finger or knuckle on the touch screen.
It will be appreciated that the user may draw handwriting on the touch screen of the second device using a finger or knuckle, and in such an embodiment the second device may detect the position of the user's finger or knuckle on the touch screen and further display the handwriting in texture characterized by texture information based on the position of the user's finger or knuckle on the touch screen.
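The display step of S606 can be illustrated with a minimal sketch: the second device samples stylus (or finger) positions on the touch screen and stamps the pattern characterized by the texture information at each sampled position. The canvas and pattern representations below are assumptions, not the claimed rendering pipeline.

```python
# Hypothetical sketch of S606: stamping the repeated pattern at each
# sampled touch position so the handwriting is displayed "in the
# texture". Bitmaps are lists of rows; 0 pixels are transparent.

def stamp_texture(canvas, pattern, positions):
    """Copy the pattern bitmap onto the canvas at each touch position."""
    ph, pw = len(pattern), len(pattern[0])
    for (y, x) in positions:
        for dy in range(ph):
            for dx in range(pw):
                if pattern[dy][dx]:              # skip transparent pixels
                    canvas[y + dy][x + dx] = pattern[dy][dx]
    return canvas

canvas = stamp_texture([[0] * 8 for _ in range(4)],   # blank 4x8 canvas
                       [[1, 0], [1, 1]],              # tiny pattern stamp
                       [(0, 0), (1, 3)])              # sampled positions
```

A real second device would interpolate between samples and rasterize the vector data at the stroke's thickness, but the stamping idea is the same.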
In this embodiment, when a user draws handwriting on the second device with the stylus, the user may select a texture on the first device with the stylus, and the first device may send the selected texture to the second device through the stylus. With the texture processing mode in the embodiment of the present application, a user can acquire textures across devices using the stylus and then draw handwriting with those textures, so that drawing handwriting with the stylus is no longer limited to the limited set of colors on the second device; this enriches the styles in which the electronic device displays handwriting and can improve the user experience.
In the above embodiment, the first device sends the texture information to the second device through the stylus. In one embodiment, the first device may be wirelessly connected with the second device, and after obtaining the texture information, the first device may send it directly to the second device.
In the embodiment of the application, the first device and the second device are both in wireless connection with the touch pen, and the first device and the second device are in wireless connection. Referring to fig. 10, S604 to S605 described above may be replaced with S604A:
S604A, the first device transmits texture information to the second device.
In this embodiment of the present application, because the first device is wirelessly connected with the second device, after the first device obtains texture information, the first device may directly send the texture information to the second device, which can improve transmission efficiency, and also can achieve the purpose that the second device displays handwriting with the texture corresponding to the texture information.
In an embodiment, the first device is further wirelessly connected to the third device and/or the second device is wirelessly connected to the third device, in which embodiment the first device, after obtaining the texture information, may send the texture information to the third device (or to the third device via the second device), so that the texture information may also be obtained and stored in the third device.
For example, if the user draws handwriting on both a tablet computer and a notebook computer, the user can operate the stylus to select the target area on the mobile phone, and the mobile phone can send the texture information of the target area to the tablet computer and the notebook computer, so that the user can draw handwriting with the texture on both devices; reference may be made to the description in S606. It should be noted that when the first device is not wirelessly connected to the third device, the second device may send the texture information to a third device wirelessly connected to the second device in response to receiving the texture information from the first device.
That is, synchronization of texture information may be established among the first device, the second device, and the third device, and the first device may send the texture information to the second device and the third device after acquiring the texture information, and the second device and the third device may update the stored texture information.
It should be appreciated that in one embodiment, the first device, the second device, and the third device each include: texture information memory. In other words, when a device acquires new texture information, the device can synchronize the acquired new texture information with the texture information memory in other devices through the texture information memory in the device, so that other devices can use the texture information.
In this embodiment, when the first device acquires the texture information, the texture information may be stored in the texture information memory of the first device, and the texture information memory of the first device may synchronize the texture information acquired by the first device with the texture information memory of the second device and the texture information memory of the third device.
It should be noted that the manner in which the first device synchronizes the texture information to the second device and the third device by means of the texture information memory is one manner of "the first device synchronizes the texture information".
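The synchronization among the texture information memories can be sketched as follows. The class and method names are assumptions, and the relay rule (a device pushes new texture information to its wirelessly connected peers, which push it onward until every memory holds it) is one possible reading of the embodiment.

```python
# Hypothetical sketch of the texture information memories of the first,
# second and third devices staying synchronized. Names and the relay
# rule are assumptions, not claimed structure.

class TextureMemory:
    def __init__(self):
        self.textures = {}
        self.peers = []      # memories on wirelessly connected devices

    def connect(self, other):
        self.peers.append(other)
        other.peers.append(self)

    def store(self, name, info):
        """Store new texture info locally, then push it to every peer."""
        self.textures[name] = info
        for peer in self.peers:
            if peer.textures.get(name) != info:
                peer.store(name, info)   # peers relay onward (e.g. 2nd -> 3rd)

first, second, third = TextureMemory(), TextureMemory(), TextureMemory()
first.connect(second)    # phone <-> tablet
second.connect(third)    # tablet <-> laptop; no direct phone <-> laptop link
first.store("triangle", {"vector_data": [(0, 0), (2, 0), (1, 2)]})
```

Note the equality check before relaying, which stops the push from echoing back and forth between two connected memories.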
In the embodiment of the present application, if the user usually draws handwriting at home on a computer that is inconvenient to carry, the user can select a target area on the mobile phone with the stylus, and the mobile phone can send the texture information of the target area to both that computer and a portable notebook computer. In this way, when the user goes out carrying the notebook computer, the user can still draw handwriting using the textures stored on the notebook computer, thereby improving the user experience.
In one embodiment, the user may select not only the texture but also the color on the first device using the stylus. In this case, referring to fig. 11, S603-S606 may be replaced with S603A-S606A:
S603A, the first device acquires texture information and color information of the target area.
The first device may acquire the texture information of the target area with reference to the related description in S603. The color information may include RGB values, where the R value represents red, the G value represents green, and the B value represents blue.
In one embodiment, the color information and the texture information may be referred to as attribute information, i.e., in the embodiment of the present application, the attribute information may include: texture information and/or color information. In one embodiment, the texture information may include: texture information, or texture information and color information.
Unlike the above embodiment, in which the color of the texture is not considered (for example, the texture is a triangle and the triangle's color is ignored), in this embodiment, when the first device acquires the repeated pattern in the target area, the color of the repeated pattern needs to be considered: not only must the shapes of the patterns be the same, but their colors must also be the same, and the first device takes identical patterns with identical colors as the repeated pattern (i.e., the same pattern) of the target area. Correspondingly, the color information in this embodiment is the color information of the repeated pattern, which includes the RGB value of each position in the repeated pattern.
In an embodiment, if the first device determines that the target area does not include the repetitive pattern, the texture information of the target area is vector data of an image of the target area, and the color information of the target area includes: RGB values for each location in the target area.
The first device acquiring color information in S603A may be understood as: based on the coordinates corresponding to the target area, the AP acquires the image of the content in the target area on the interface of the first device in a screenshot mode, and then sends the image of the content in the target area to the GPU, and the GPU can extract RGB values from the image of the content in the target area to obtain color information.
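The color extraction of S603A, in which the GPU reads the RGB value of each position from a screenshot of the target area, can be sketched as follows; the screenshot layout (a row-major grid of (R, G, B) tuples) and the rectangle convention are assumptions for illustration.

```python
# Hypothetical sketch of the color-information extraction in S603A:
# reading the RGB value of every position inside the target rectangle
# of a screenshot. Layout and coordinate conventions are assumptions.

def extract_color_info(screenshot, target):
    """Return the RGB value of each position inside the target rectangle."""
    y0, x0, y1, x1 = target          # top-left corner, bottom-right corner
    return {
        (y, x): screenshot[y][x]
        for y in range(y0, y1)
        for x in range(x0, x1)
    }

screenshot = [
    [(255, 0, 0), (0, 255, 0)],
    [(0, 0, 255), (255, 255, 255)],
]
```

When a repeated pattern was found in S603, the same extraction would be applied to the pattern's image instead of the whole target area.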
S604A, the first device sends texture information and color information to the stylus.
S605A, the stylus sends texture information and color information to the second device.
S606A, the second device displays the handwriting in the texture characterized by the texture information and the color characterized by the color information based on the position of the stylus on the touch screen.
In one embodiment, S604A-S605A may be replaced with: the first device transmits texture information and color information to the second device.
In the embodiment of the application, the user can select not only the texture but also the color on the first device by using the touch pen, so that the second device displays the handwriting of the touch pen by using the texture and the color of the target area, and the style of displaying the handwriting of the electronic device is further enriched.
In order to further enrich the style of the handwriting displayed by the electronic device, the cross-device drawing system provided in the embodiment of the present application may enable a user to select textures and/or colors in a plurality of regions of an interface of a first device, and after the first device fuses the textures and/or colors in the plurality of regions selected by the user, synchronize the texture information and the color information to a second device. In this embodiment, referring to fig. 12, the interaction procedure of the first device and the second device may include:
s1201, the first device receives a first instruction, where the first instruction is used to instruct the first device to obtain texture information and/or color information of the target area selected by the stylus.
S1201 may refer to the description in S601. Unlike S601, the first instruction is used to instruct the first device to acquire texture information and/or color information of the target area selected by the stylus, rather than texture information alone.
S1202, the first device detects a first target area and a third target area selected by the stylus on an interface of the first device in response to receiving the first instruction.
It should be appreciated that the operation of the stylus selecting the first target area on the interface of the first device may be referred to as a third user operation and the operation of the stylus selecting the third target area on the interface of the first device may be referred to as a seventh user operation. The "interface of the first device" in S1202 may be referred to as a second graphical interface.
In the embodiment of the present application, the stylus may select a plurality of target areas on the interface of the first device. In one embodiment, c in fig. 7 may be replaced with a in fig. 13. Referring to a in fig. 13, a "determine" control 131 may be displayed on the interface of the first device, and the user may operate the "determine" control 131 after selecting the target areas. As shown in a in fig. 13, the user selects the first target area 132 and the third target area 133 on the interface of the first device, and then selects the "determine" control 131 to indicate that the selection is complete. Correspondingly, the first device determines that the user has selected the first target area and the third target area in response to detecting that the user selected the "determine" control 131.
S1203, the first device acquires first texture information of the first target region and third texture information of the third target region.
The manner in which the first device acquires the first texture information of the first target area and the third texture information of the third target area may refer to the description in which the first device acquires the texture information of the target area in S603.
And S1204, the first equipment performs fusion processing on the first texture information and the third texture information to obtain fusion texture information.
In one embodiment, the first device fusing the first texture information and the third texture information may be: the first device superimposes the first texture represented by the first texture information and the third texture represented by the third texture information to obtain a superimposed texture. The fused texture information may include: image vector data of the superimposed texture. Referring to a in fig. 14, if the first texture is a triangle and the third texture is a square, the superimposed texture is the square superimposed on the triangle.
In one embodiment, the relative positions of the first texture and the third texture may be preset, for example by arranging the later-selected third texture at a preset relative position (e.g., right side, upper side, etc.) of the earlier-selected first texture, based on the order in which the user selected the target areas. Referring to b in fig. 14, if the first texture is a triangle and the third texture is a square, the fused texture is: the triangle arranged on the right side of the square.
In one embodiment, the first device may fuse the first texture and the third texture in multiple ways. For example, the manner in which the first device fuses the first texture and the third texture may include, but is not limited to: superimposing the first texture and the third texture, or arranging the first texture to the right of the third texture, etc.
The first device may display a plurality of fused textures on its interface for the user to choose from, and in response to detecting the fused texture selected by the user, the first device takes the texture information corresponding to that fused texture as the fused texture information.
In one embodiment, the user may customize the relative positions of the first texture and the third texture so as to fuse them. When the first device obtains the first texture information and the third texture information, it may display an editing interface including the first texture and the third texture. Accordingly, the interface shown in fig. 8 may be replaced with the interface shown in fig. 15. Referring to a in fig. 15, the editing interface displays the first texture of the first target area and the third texture of the third target area; the user may drag either texture (with a finger or the stylus) to change their relative positions. Referring to b in fig. 15, the user may drag the square so that the square and the triangle are superimposed, and the fused texture is the superposition of the square and the triangle.
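The two fusion modes of S1204 (superimposing the two textures, or arranging one at a preset relative position of the other) can be sketched on point-list textures; the point-list representation and the horizontal offset are assumptions for illustration only.

```python
# Hypothetical sketch of the fusion processing in S1204 on textures
# represented as lists of (x, y) vector points. The representation and
# the side-by-side offset are assumptions.

def fuse_overlay(first, third):
    """Superimpose: union of both textures' vector points."""
    return sorted(set(first) | set(third))

def fuse_side_by_side(first, third, width):
    """Arrange the later-selected texture to the right of the earlier one."""
    shifted = [(x + width, y) for (x, y) in third]
    return sorted(set(first) | set(shifted))

triangle = [(0, 0), (2, 0), (1, 2)]
square = [(0, 0), (1, 0), (0, 1), (1, 1)]
```

Either result would then be stored as the image vector data of the fused texture and synchronized to the second device.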
S1205, the first device synchronizes the fused texture information to the second device.
In one embodiment, a first device may send fused texture information to a stylus, which in response to receiving the fused texture information may send the fused texture information to a second device. Alternatively, in one embodiment, the first device may send the fused texture information directly to the second device.
S1206, the second device displays handwriting in the texture characterized by the fused texture information based on the position of the stylus on the touch screen.
Referring to b in fig. 13, when a user draws handwriting on the second device using the stylus, the second device may display the handwriting with a texture characterized by the fused texture information (i.e., the fused texture), and b in fig. 13 is illustrated by taking the superposition of the fused texture as a square and a triangle as an example.
In the embodiment of the application, the user can select at least two target areas on the first device using the stylus; the first device can fuse the textures in the at least two target areas to obtain fused texture information and then synchronize the fused texture information to the second device. The user can thus fuse any desired plurality of textures, further enriching the styles in which the electronic device displays handwriting.
The embodiment shown in fig. 12 above illustrates that the first device may fuse the first texture of the first target area with the third texture of the third target area. In one embodiment, the first device may instead fuse the first texture of the first target area with the third color of the third target area. In such a scenario, if the user likes the texture of one area and the color of another area in the interface displayed by the first device, the user may select the two areas, causing the first device to fuse the texture and color of those areas.
In this embodiment, S1203-S1206 in fig. 12 above may be replaced with S1203A-S1206A:
S1203A, the first device acquires first texture information of the first target area and third color information of the third target area.
In one embodiment, whether the first device acquires texture information or color information of a target area may be preset based on the order in which the user selects target areas. For example, "texture" and "color" may alternate by selection order: if the user selects the first target area first, the first device obtains the first texture information of the first target area; when the user then selects the third target area, the first device obtains the third color information of the third target area; and so on, the first device sequentially obtains texture, color, texture, color … of the target areas selected by the user.
Alternatively, the first device may be set to obtain texture information of the first n target areas selected by the user, color information of the subsequent m target areas selected by the user, and so on, where, for example, n and m are integers greater than or equal to 1.
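The two preset orderings above reduce to a simple mapping from a selection's position to the kind of information acquired. The following is an illustrative sketch only; the function names are invented for this example, and selections are indexed from 0.

```python
def info_kind_alternating(selection_index):
    """Alternating preset: texture, color, texture, color, ... by selection order."""
    return "texture" if selection_index % 2 == 0 else "color"

def info_kind_first_n(selection_index, n):
    """First-n preset: texture for the first n selections, color afterwards."""
    return "texture" if selection_index < n else "color"

print([info_kind_alternating(i) for i in range(4)])
print([info_kind_first_n(i, n=2) for i in range(4)])
```

Either rule lets the first device decide, without further user input, which kind of information to extract from each newly selected target area.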
S1204A, the first device performs fusion processing on the first texture information and the third color information to obtain fusion information.
The first device may fuse the first texture information and the third color information as follows: the first device superimposes the color represented by the third color information on the texture represented by the first texture information to obtain a color-superimposed texture. The first device may then obtain the fusion information based on the color-superimposed texture, where the fusion information includes vector data of the color-superimposed texture; reference may be made to the related description in the above embodiments.
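The color-on-texture superposition described above can be sketched as painting every filled texture cell with the selected color. This is a minimal sketch under assumed representations (a texture as a 2D mask, a color as an RGB triple); the names and the white background are illustrative, not from the patent.

```python
def superimpose_color(mask, rgb):
    """Return an RGB image where filled mask cells take the given color."""
    white = (255, 255, 255)  # assumed background for empty cells
    return [[rgb if cell else white for cell in row] for row in mask]

# Illustrative triangle texture and gray color, echoing fig. 16.
triangle_mask = [
    [1, 0, 0],
    [1, 1, 0],
    [1, 1, 1],
]
gray = (128, 128, 128)
colored = superimpose_color(triangle_mask, gray)
print(colored[0])  # first row: one gray cell, two background cells
```

The resulting color-superimposed texture is what would then be vectorized into the fusion information.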
S1205A, the first device synchronizes the fusion information to the second device.
S1206A, the second device displays the handwriting in the texture and color characterized by the fusion information, based on the position of the stylus on the touch screen.
For example, fig. 13 may be replaced by fig. 16. As shown in a in fig. 16, the user draws a circle on the interface of the first device using the stylus; the first device may use the area inside the circle as the first target area and acquire the first texture information of the first target area: triangles. The user draws another circle on the interface of the first device using the stylus, and the first device may use the area inside that circle as the third target area and acquire the third color information of the third target area: gray (it should be understood that colors are represented in grayscale in the examples of the present application). Based on the description in S1204A-S1205A, the second device may obtain the fusion information; referring to b in fig. 16, when the user draws handwriting on the second device using the stylus, the second device may display the handwriting in the texture and color characterized by the fusion information.
In one embodiment, S1203A may also be replaced with: the first device obtains first color information of a first target area and third color information of a third target area.
Accordingly, S1204A may be replaced with: and the first device performs fusion processing on the first color information and the third color information to obtain fused color information. S1205A may be replaced with: the first device synchronizes the fused color information to the second device. S1206A may be replaced with: the second device displays handwriting in the fused color based on the position of the stylus on the touch screen.
Wherein, in one embodiment, the color in the first target area may be referred to as a first color, the texture in the first target area may be referred to as a first texture, the information of the first color is referred to as first color information, and the information of the first texture is referred to as first texture information. The color in the third target area may be referred to as a third color, the texture in the third target area may be referred to as a third texture, information of the third color is referred to as third color information, and information of the third texture is referred to as third texture information.
In one embodiment, when selecting a target area, the user can autonomously choose whether to acquire the texture information or the color information of that target area, which can improve the user experience. The following description takes as an example the user selecting the texture information of the first target area and the color information of the third target area.
In such an embodiment, S1202-S1206 above may be replaced with S1202B-S1207B:
S1202B, the first device detects a first target area selected by the stylus on an interface of the first device in response to receiving the first instruction.
S1203B, the first device displays a color control and a texture control to be selected in response to detecting the first target area.
And S1204B, the first device responds to detection of the operation of the texture control by the user, and acquires first texture information of the first target area.
In this embodiment, the texture control may be understood as a first target control. In one embodiment, the selection operation of the color control and/or texture control after the first target region selection may be referred to as an eighth user operation.
S1205B, the first device detects a third target area selected by the stylus on the interface of the first device.
S1206B, the first device displays a color control and a texture control to be selected in response to detecting the third target area.
S1207B, the first device obtains third color information of the third target area in response to detecting that the user operates the color control.
In this embodiment, the color control may be understood as a second target control. In one embodiment, the selection operation of the color control and/or texture control after the second target region selection may be referred to as a ninth user operation.
After S1207B, S1205A-S1206A may also be performed.
For example, in this embodiment, fig. 13 may be replaced with fig. 17. As shown in a in fig. 17, when a circle is drawn on the interface of the first device using the stylus, the first device may display a user-selectable color control 161 and texture control 162 on the interface and may use the area inside the circle as the first target area. The user may select the color control 161 or the texture control 162 to trigger the first device to obtain the color information or the texture information of the first target area; for example, referring to a in fig. 17, when the user selects the texture control 162, the first device may obtain the first texture information of the first target area in response to detecting that selection.
Similarly, referring to b in fig. 17, when the user draws a circle again on another area of the interface of the first device using the stylus, the first device may display a user selectable color control 161 and texture control 162 on the interface, and the first device may take the area in the circle as the third target area. Referring to b in fig. 17, when the user selects the color control 161, the first device may acquire color information of the third target area in response to detecting that the user selects the color control 161.
It should be appreciated that, in one embodiment, referring to b in fig. 17, the user may select both the color control 161 and the texture control 162, thereby triggering the first device to acquire both the first color information and the first texture information of the first target area. The first device may then fuse the first color information and the first texture information of the first target area with the third color information (or the third texture information) of the third target area; for the manner of texture fusion, reference may be made to the description of fig. 12.
The colors may be fused as follows: the first device sums and averages the R values in the first color information and the third color information to obtain a fused R value; likewise, it sums and averages the G values to obtain a fused G value, and the B values to obtain a fused B value. The fused color information comprises the fused R value, the fused G value, and the fused B value.
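The per-channel sum-and-average rule above is a one-liner in code. This sketch uses integer division for the mean, which is an assumption (the patent does not specify rounding), and the color values are illustrative.

```python
def fuse_colors(rgb_a, rgb_b):
    """Fuse two RGB colors: each fused channel is the mean of the two inputs."""
    return tuple((a + b) // 2 for a, b in zip(rgb_a, rgb_b))

# Illustrative inputs: fusing pure red and pure blue.
red = (255, 0, 0)
blue = (0, 0, 255)
print(fuse_colors(red, blue))  # a purple midway between the two
```

The fused triple is then the "fused color information" synchronized to the second device in S1205A.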
Referring to c in fig. 17, when the user draws handwriting on the second device using the stylus, the second device may display the handwriting in the texture and color characterized by the fusion information.
The above embodiments describe examples in which the user selects a color and/or texture on the first device for synchronized use on the second device; in one embodiment, the user may likewise select a color and/or texture on the second device for synchronized use on the first device.
In the embodiment of the application, the user can select a plurality of target areas on the first device. The first device may be preset with which color and/or texture to acquire from each target area, or the user may choose, when selecting each target area, whether to acquire its color information and/or texture information. The first device then fuses the color information and/or texture information of the plurality of target areas to obtain fusion information, so that the second device can display the handwriting of the stylus in the color and/or texture characterized by the fusion information, further enriching the styles in which the electronic device displays handwriting.
FIG. 18 is a schematic structural diagram of a handwriting drawing apparatus according to an embodiment of the present application. The handwriting drawing apparatus according to this embodiment may be the first device described above, or may be a chip in the first device. The handwriting drawing apparatus may be adapted to perform the actions of the first device in the method embodiments described above. As shown in fig. 18, the handwriting drawing apparatus 1800 may include: a display module 1801, a processing module 1802, and a transceiver module 1803.
The display module 1801 is configured to display a first graphical interface in response to a first user operation on the first device display screen, and to display a second graphical interface in response to a second user operation on the first device display screen, where the second graphical interface is different from the first graphical interface.
A processing module 1802 is configured to select a first target area on the second graphical interface in response to a third user operation on the first device display, the first target area including a first color therein.
The transceiver module 1803 is configured to send color information of a first color to the second device, where the color information of the first color is used to instruct the second device to display handwriting of the first color.
In one possible implementation, the display module 1801 is further configured to display, in response to a sixth user operation on the display screen of the first device, handwriting of a second color, where the second color is a color of the second target area selected on the graphical interface of the second device.
In one possible implementation, the processing module 1802 is further configured to detect, on the first device display screen, the third user operation in response to receiving a first instruction from the stylus, the first instruction being sent by the stylus upon detecting that the stylus performs a first preset action; or, in response to detecting that the stylus performs a second preset action, detect the third user operation on the first device display screen; or, in response to detecting that the first device performs a third preset action, detect the third user operation on the first device display screen.
In one possible implementation, the first target area further comprises a first texture.
The transceiver module 1803 is specifically configured to send color information of the first color and texture information of the first texture to the second device.
In one possible implementation, the processing module 1802 is further configured to obtain texture information of a first texture.
In one possible implementation, the processing module 1802 is specifically configured to detect whether the first target area includes the same pattern; if yes, screenshot is carried out on the pattern, and an image of the pattern is obtained; acquiring vector data of the pattern based on the image of the pattern; taking vector data of the pattern as texture information of a first texture; if not, capturing a first target area to obtain an image of the first target area; acquiring vector data of a first target area based on an image of the first target area; and taking the vector data of the first target area as texture information of the first texture.
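The two branches recited above (vectorize one repeated pattern tile if the area repeats a pattern, otherwise vectorize the whole area) can be sketched as follows. All helper names here are invented stand-ins, passed in as parameters precisely because the patent does not define their implementations.

```python
def texture_info(region, contains_same_pattern, crop_pattern, vectorize):
    """Return vector texture data following the two branches described above."""
    if contains_same_pattern(region):
        tile = crop_pattern(region)   # screenshot of one repeated pattern
        return vectorize(tile)        # vector data of the pattern only
    return vectorize(region)          # vector data of the whole target area

# Illustrative stubs: a "region" is just a string tag in this sketch.
info = texture_info(
    "striped-region",
    contains_same_pattern=lambda r: "striped" in r,
    crop_pattern=lambda r: "one-stripe-tile",
    vectorize=lambda img: {"source": img, "paths": []},
)
print(info["source"])
```

Vectorizing only one tile of a repeating pattern keeps the texture information compact while still characterizing the whole area.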
In one possible implementation, the processing module 1802 is specifically configured to divide the first target area into a plurality of grids, each grid having a first preset size; acquire a first similarity of the patterns in every two grids; and if there is a first similarity greater than or equal to a preset similarity, and the proportion of first similarities greater than or equal to the preset similarity is greater than or equal to a preset proportion, determine that the first target area contains the same pattern.
In one possible implementation, the processing module 1802 is specifically configured to: if there is no first similarity greater than or equal to the preset similarity, or the proportion is smaller than the preset proportion, increase the size of the grids and acquire a second similarity of the patterns in every two grids after the size is increased; if there is a second similarity greater than or equal to the preset similarity, and the proportion of second similarities greater than or equal to the preset similarity is greater than or equal to the preset proportion, determine that the first target area contains the same pattern; and if there is no second similarity greater than or equal to the preset similarity, or the proportion of second similarities greater than or equal to the preset similarity is smaller than the preset proportion, continue to increase the size of the grids until the grid size reaches a second preset size.
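The grow-the-grid loop can be sketched as below. This is an illustrative reduction, not the patent's implementation: exact tile equality stands in for the unspecified similarity measure, the size doubles each round as an assumed growth step, and the thresholds are arbitrary example values.

```python
def tiles(img, size):
    """Split a 2D image (list of rows) into non-overlapping size x size tiles."""
    h, w = len(img), len(img[0])
    return [tuple(tuple(img[y + dy][x + dx] for dx in range(size))
                  for dy in range(size))
            for y in range(0, h - size + 1, size)
            for x in range(0, w - size + 1, size)]

def contains_same_pattern(img, size, max_size, min_proportion=0.5):
    """Grow the grid until enough tile pairs match, or max_size is reached."""
    while size <= max_size:
        ts = tiles(img, size)
        pairs = [(a, b) for i, a in enumerate(ts) for b in ts[i + 1:]]
        if pairs:
            matches = sum(1 for a, b in pairs if a == b)  # equality as "similarity"
            if matches / len(pairs) >= min_proportion:
                return True
        size *= 2  # assumed growth step: double the grid size and retry
    return False

# A 4x4 checkerboard repeats the same 2x2 pattern everywhere.
checker = [[(x + y) % 2 for x in range(4)] for y in range(4)]
print(contains_same_pattern(checker, size=2, max_size=4))
```

In the checkerboard example every 2x2 tile is identical, so the proportion of matching pairs reaches 100% on the first pass.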
In one possible implementation, the processing module 1802 is further configured to select a third target area on the second graphical interface in response to a seventh user operation on the first device display, where the third target area includes a third color; and fusing the first color and the third color to obtain fused color information.
The transceiver module 1803 is specifically configured to send, to the second device, fused color information, where the fused color information is used to instruct the second device to display handwriting of the color fused by the first color and the third color.
In one possible implementation, the processing module 1802 is further configured to select a third target area on the second graphical interface in response to a seventh user operation on the first device display, where the third target area includes a third texture; and fusing the first color and the third texture to obtain fused information.
The transceiver module 1803 is specifically configured to send fusion information to the second device, where the fusion information is used to instruct the second device to display handwriting that is a combination of the first color and the third texture.
In one possible implementation, the first target area further comprises a first texture. The display module 1801 is further configured to display a color control and a texture control to be selected on the second graphical interface. The processing module 1802 is further configured to detect an eighth user operation of selecting a color control and/or a texture control.
In one possible implementation, the processing module 1802 is further configured to select a third target area on the second graphical interface in response to a seventh user operation on the first device display, the third target area including a third color and a third texture.
The display module 1801 is further configured to display a color control and a texture control to be selected on the second graphical interface.
The processing module 1802 is further configured to detect a ninth user operation for selecting the color control and/or the texture control, fuse first information indicated by the eighth user operation with second information indicated by the ninth user operation, and obtain fused information, where the first information is color information of a first color and/or texture information of a first texture, and the second information is color information of a third color and/or texture information of a third texture.
The transceiver module 1803 is specifically configured to send fusion information to the second device, where the fusion information is used to instruct the second device to display handwriting of the first color, the first texture, the third color, and/or the third texture combination.
The handwriting drawing device provided in the embodiment of the present application may perform the action of the first device in the embodiment of the method, and its implementation principle and technical effect are similar, and are not described herein again.
Fig. 19 is a schematic structural diagram of a handwriting drawing apparatus according to an embodiment of the present application. The handwriting drawing apparatus according to this embodiment may be the aforementioned second device, or may be a chip in the second device. The handwriting drawing apparatus may be adapted to perform the actions of the second device in the method embodiments described above. As shown in fig. 19, the handwriting drawing apparatus 1900 may include: a transceiver module 1901 and a display module 1902.
The transceiver module 1901 is configured to receive information of a first color from a first device.
A display module 1902 for displaying handwriting of the first color in response to a fourth user operation on the second device display.
In one possible implementation, the display module 1902 is further configured to select, in response to a fifth user operation on a display screen of the second device, a second target area on a graphical interface displayed by the second device, the second target area including a second color therein.
The transceiver module 1901 is configured to send color information of a second color to the first device, where the color information of the second color is used to instruct the first device to display handwriting of the second color.
In one possible implementation, the transceiver module 1901 is specifically configured to receive color information of the first color and texture information of the first texture from the first device.
The display module 1902 is specifically configured to display handwriting combined by the first color and the first texture in response to a fourth user operation on the second device display.
In one possible implementation, the transceiver module 1901 is specifically configured to receive fused color information from the first device, where the fused color information is obtained by fusing the first color of the first target area and the third color of the third target area on the first device.
The display module 1902 is specifically configured to display handwriting of a color obtained by fusing the first color and the third color in response to a fourth user operation on the display screen of the second device.
In one possible implementation, the transceiver module 1901 is specifically configured to receive fusion information from the first device, where the fusion information is obtained by fusing the first color of the first target area and the third texture of the third target area on the first device.
The display module 1902 is specifically configured to display handwriting with a combination of the first color and the third texture in response to a fourth user operation on the second device display.
In one possible implementation, the transceiver module 1901 is specifically configured to receive fusion information from the first device, where the fusion information is obtained by fusing the first color and/or the first texture of the first target area on the first device with the third color and/or the third texture of the third target area.
The display module 1902 is specifically configured to display handwriting of the first color, the first texture, the third color, and/or the third texture combination in response to a fourth user operation on the second device display.
The handwriting drawing device provided in the embodiment of the present application may perform the action of the second device in the embodiment of the method, and its implementation principle and technical effect are similar, and are not described herein again.
In an embodiment, the embodiment of the application further provides an electronic device, which may be the first device, the second device, or the stylus in the above embodiments. The electronic device may include: a processor (e.g., a CPU) and a memory. The memory may comprise a random-access memory (RAM) and may further comprise a non-volatile memory (NVM), such as at least one disk memory, in which various instructions may be stored for performing various processing functions and implementing the method steps of the present application. Optionally, the electronic device related to the present application may further include: a power supply, a communication bus, and a communication port. The communication port is used for realizing connection and communication between the electronic device and other peripheral devices. In an embodiment of the present application, the memory is used to store computer executable program code, the program code including instructions; when the processor executes the instructions, the instructions cause the processor of the electronic device to execute the actions in the above method embodiments; the implementation principle and technical effects are similar and are not described herein again.
It should be noted that the modules or components in the above embodiments may be one or more integrated circuits configured to implement the above methods, for example: one or more application specific integrated circuits (application specific integrated circuit, ASIC), one or more digital signal processors (digital signal processor, DSP), or one or more field programmable gate arrays (field programmable gate array, FPGA), or the like. For another example, when one of the above modules is implemented in the form of a processing element invoking program code, the processing element may be a general purpose processor, such as a central processing unit (central processing unit, CPU), or another processor that can invoke the program code, such as a controller. For another example, the modules may be integrated together and implemented in the form of a system-on-a-chip (SOC).
The above embodiments may be implemented in whole or in part by software, hardware, firmware, or any combination thereof. When implemented in software, they may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, the processes or functions in accordance with embodiments of the present application are produced in whole or in part. The computer may be a general purpose computer, a special purpose computer, a computer network, or other programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another by wired (e.g., coaxial cable, optical fiber, digital subscriber line (DSL)) or wireless (e.g., infrared, radio, microwave) means. The computer-readable storage medium can be any available medium that can be accessed by a computer, or a data storage device, such as a server or data center, integrating one or more available media. The available medium may be a magnetic medium (e.g., floppy disk, hard disk, magnetic tape), an optical medium (e.g., DVD), or a semiconductor medium (e.g., solid state disk (SSD)), etc.
The term "plurality" herein refers to two or more. The term "and/or" herein merely describes an association relationship between associated objects, indicating that three relationships may exist; for example, A and/or B may represent: A exists alone, A and B exist together, or B exists alone. In addition, the character "/" herein generally indicates that the associated objects before and after it are in an "or" relationship; in formulas, the character "/" indicates a "division" relationship between the associated objects before and after it. It should also be understood that in the description of this application, the words "first," "second," and the like are used merely to distinguish between the things they describe and do not indicate or imply relative importance or order.
It will be appreciated that the various numerical numbers referred to in the embodiments of the present application are merely for ease of description and are not intended to limit the scope of the embodiments of the present application.
It should be understood that, in the embodiments of the present application, the sequence numbers of the above processes do not imply an order of execution; the execution order of the processes should be determined by their functions and internal logic, and the sequence numbers should not constitute any limitation on the implementation of the embodiments of the present application.
Examples:
1. a cross-device rendering system, wherein the system comprises a first device and a second device,
the first device is configured to:
responsive to a first user operation on the first device display, displaying a first graphical interface;
responsive to a second user operation on the first device display screen, displaying a second graphical interface, the second graphical interface being different from the first graphical interface;
responsive to a third user operation on the first device display screen, selecting a first target area on the second graphical interface, the first target area including a first color therein;
the second device is configured to:
and displaying the handwriting of the first color in response to a fourth user operation on the second device display screen.
2. The system of embodiment 1, wherein,
the second device is further configured to:
responsive to a fifth user operation on the second device display screen, selecting a second target area on the graphical interface displayed by the second device, the second target area including a second color therein;
the first device is further configured to:
and displaying the handwriting of the second color in response to a sixth user operation on the first device display screen.
3. The system of embodiment 1 or 2, wherein,
the first device is further configured to:
detecting the third user operation on the first device display screen in response to receiving a first instruction from a stylus, the first instruction being sent by the stylus upon detecting that the stylus performs a first preset action; or,
detecting the third user operation on the first device display screen in response to detecting that the stylus performs a second preset action; or,
detecting the third user operation on the first device display screen in response to detecting that the first device performs a third preset action.
4. The system of embodiment 1 or 2 wherein the first target region further comprises a first texture;
the second device is further configured to:
displaying handwriting combined by the first color and the first texture in response to the fourth user operation on the second device display screen.
5. The system of embodiment 4, wherein the first device is further configured to:
and after the first target area is selected on the second graphical interface, acquiring texture information of the first texture.
6. The system of embodiment 5, wherein the first device is specifically configured to:
detecting whether the first target area contains the same pattern;
if yes, screenshot is carried out on the pattern, and an image of the pattern is obtained;
acquiring vector data of the pattern based on the image of the pattern;
taking vector data of the pattern as texture information of the first texture;
if not, capturing a picture of the first target area to obtain an image of the first target area;
acquiring vector data of the first target area based on the image of the first target area;
and taking the vector data of the first target area as texture information of the first texture.
7. The system of embodiment 6, wherein the first device is specifically configured to:
dividing the first target area into a plurality of grids, wherein each grid has a first preset size;
acquiring first similarity of patterns in every two grids;
and if there is a first similarity greater than or equal to a preset similarity, and the proportion of first similarities greater than or equal to the preset similarity is greater than or equal to a preset proportion, determining that the first target area contains the same pattern.
8. The system of embodiment 7, wherein the first device is specifically configured to:
if there is no first similarity greater than or equal to the preset similarity, or the proportion is smaller than the preset proportion, increasing the size of the grids, and acquiring a second similarity of the patterns in every two grids after the size is increased;
if there is a second similarity greater than or equal to the preset similarity, and the proportion of second similarities greater than or equal to the preset similarity is greater than or equal to the preset proportion, determining that the first target area contains the same pattern;
if there is no second similarity greater than or equal to the preset similarity, or the proportion of second similarities greater than or equal to the preset similarity is smaller than the preset proportion, continuing to increase the size of the grids until the grid size reaches a second preset size.
9. The system of any of embodiments 1-8, wherein the first device is further configured to:
responsive to a seventh user operation on the first device display screen, selecting a third target area on the second graphical interface, the third target area including a third color therein;
The second device is further configured to:
displaying, in response to the fourth user operation on the second device display screen, handwriting of the color obtained by fusing the first color and the third color.
10. The system of any of embodiments 1-8, wherein the first device is further configured to:
responsive to a seventh user operation on the first device display screen, selecting a third target area on the second graphical interface, the third target area including a third texture therein;
the second device is further configured to:
displaying handwriting combining the first color and the third texture in response to the fourth user operation on the second device display screen.
11. The system of any of embodiments 1-8, wherein the first target area further comprises a first texture, the first device further configured to:
after a first target area is selected on the second graphical interface, displaying a color control and a texture control to be selected on the second graphical interface;
detecting an eighth user operation of selection of the color control and/or texture control;
the second device is configured to:
displaying handwriting combining the first color and/or the first texture in response to the fourth user operation on the second device display screen.
12. The system of embodiment 11, wherein the first device is further configured to:
responsive to a seventh user operation on the first device display screen, selecting a third target area on the second graphical interface, the third target area including a third color and a third texture therein;
displaying a color control and a texture control to be selected on the second graphical interface;
detecting a ninth user operation of the selection of the color control and/or the texture control;
the second device is further configured to:
displaying handwriting combining the first color, the first texture, the third color, and/or the third texture in response to the fourth user operation on the second device display screen.
13. A handwriting drawing method applied to a cross-device drawing system, characterized in that the method is applied to a first device, the method comprising:
responsive to a first user operation on the first device display, displaying a first graphical interface;
responsive to a second user operation on the first device display screen, displaying a second graphical interface, the second graphical interface being different from the first graphical interface;
Responsive to a third user operation on the first device display screen, selecting a first target area on the second graphical interface, the first target area including a first color therein;
and sending the color information of the first color to a second device, wherein the color information of the first color is used for indicating the second device to display handwriting of the first color.
14. The method of embodiment 13, further comprising:
displaying, in response to a sixth user operation on the first device display screen, handwriting of a second color, wherein the second color is the color of a second target area selected on the graphical interface of the second device.
15. The method of embodiment 13 or 14, wherein before the responding to the third user operation on the first device display screen, the method further comprises:
detecting the third user operation on the first device display screen in response to receiving a first instruction from a stylus, the first instruction being sent by the stylus upon detecting that the stylus performs a first preset action; or
detecting the third user operation on the first device display screen in response to detecting that the stylus performs a second preset action; or
detecting the third user operation on the first device display screen in response to detecting that the first device performs a third preset action.
16. The method of any of embodiments 13-15, wherein the first target region further comprises a first texture;
the sending the color information of the first color to a second device includes:
and sending the color information of the first color and the texture information of the first texture to the second device.
17. The method according to embodiment 16, further comprising, after selecting the first target area on the second graphical interface:
and obtaining texture information of the first texture.
18. The method of embodiment 17 wherein the obtaining texture information for the first texture includes:
detecting whether the first target area contains the same pattern;
if yes, capturing a screenshot of the pattern to obtain an image of the pattern;
acquiring vector data of the pattern based on the image of the pattern;
taking vector data of the pattern as texture information of the first texture;
if not, capturing a picture of the first target area to obtain an image of the first target area;
Acquiring vector data of the first target area based on the image of the first target area;
and taking the vector data of the first target area as texture information of the first texture.
19. The method of embodiment 18 wherein the detecting whether the first target area contains the same pattern comprises:
dividing the first target area into a plurality of grids, wherein each grid has a first preset size;
acquiring first similarity of patterns in every two grids;
and if a first similarity greater than or equal to a preset similarity exists, and the proportion of first similarities greater than or equal to the preset similarity is greater than or equal to a preset proportion, determining that the first target area contains the same pattern.
20. The method of embodiment 19, further comprising:
if no first similarity greater than or equal to the preset similarity exists, or the proportion is smaller than the preset proportion, increasing the size of the grids, and acquiring a second similarity of the patterns in every two grids after the size is increased;
if a second similarity greater than or equal to the preset similarity exists, and the proportion of second similarities greater than or equal to the preset similarity is greater than or equal to the preset proportion, determining that the first target area contains the same pattern;
if no second similarity greater than or equal to the preset similarity exists, or the proportion of second similarities greater than or equal to the preset similarity is smaller than the preset proportion, continuing to increase the size of the grids until the size of the grids reaches a second preset size.
21. The method according to any one of embodiments 13-20, further comprising, after selecting a first target area on the second graphical interface:
responsive to a seventh user operation on the first device display screen, selecting a third target area on the second graphical interface, the third target area including a third color therein;
fusing the first color and the third color to obtain fused color information;
the sending the color information of the first color to a second device includes:
and sending the fusion color information to the second device, wherein the fusion color information is used for indicating the second device to display the handwriting of the color obtained by fusing the first color and the third color.
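Embodiment 21 fuses the first and third sampled colors before sending the result to the second device. One plausible fusion rule, assuming the colors are RGB triples (the patent does not fix a particular formula), is per-channel averaging:

```python
def fuse_colors(first_rgb, third_rgb):
    """Fuse two sampled colors by per-channel integer averaging.

    This is one plausible fusion rule for illustration; the patent
    leaves the exact fusion processing unspecified.
    """
    return tuple((a + b) // 2 for a, b in zip(first_rgb, third_rgb))
```

For example, fusing pure red with pure blue yields a purple mid-tone.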
22. The method according to any one of embodiments 13-20, further comprising, after selecting a first target area on the second graphical interface:
Responsive to a seventh user operation on the first device display screen, selecting a third target area on the second graphical interface, the third target area including a third texture therein;
fusing the first color and the third texture to obtain fusion information;
the sending the color information of the first color to a second device includes:
and sending the fusion information to the second device, wherein the fusion information is used for indicating the second device to display handwriting combining the first color and the third texture.
23. The method according to any one of embodiments 13-20, wherein the first target area further comprises a first texture, and wherein after selecting the first target area on the second graphical interface, further comprising:
displaying a color control and a texture control to be selected on the second graphical interface;
an eighth user operation of selection of the color control and/or texture control is detected.
24. The method according to embodiment 23, wherein after the eighth user operation for detecting the selection of the color control and/or texture control, further comprises:
responsive to a seventh user operation on the first device display screen, selecting a third target area on the second graphical interface, the third target area including a third color and a third texture therein;
Displaying a color control and a texture control to be selected on the second graphical interface;
detecting a ninth user operation of the selection of the color control and/or the texture control;
fusing the first information indicated by the eighth user operation and the second information indicated by the ninth user operation to obtain fusion information, wherein the first information is color information of a first color and/or texture information of a first texture, and the second information is color information of a third color and/or texture information of a third texture;
the sending the color information of the first color to a second device includes:
and sending the fusion information to the second device, wherein the fusion information is used for indicating the second device to display handwriting combining the first color, the first texture, the third color, and/or the third texture.
25. A handwriting drawing method applied to a cross-device drawing system, characterized in that it is applied to a second device, the method comprising:
receiving information of a first color from a first device;
and displaying the handwriting of the first color in response to a fourth user operation on the second device display screen.
26. The method of embodiment 25, further comprising:
selecting a second target area on a graphical interface displayed by the second device in response to a fifth user operation on the second device display screen, the second target area including a second color therein;
and sending color information of the second color to the first device, wherein the color information of the second color is used for indicating the first device to display handwriting of the second color.
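Embodiments 25 and 26 describe a symmetric exchange: either device can sample a color from a target area on its own interface and push the color information to the peer, which then renders handwriting in that color. A toy model of that round trip (class and method names are illustrative only; selecting a target area is reduced to passing its sampled color):

```python
class Device:
    """Toy model of the two-way color exchange between two devices."""

    def __init__(self, name):
        self.name = name
        self.peer = None          # the wirelessly connected device
        self.brush_color = None   # color used for the next stroke

    def pick_color_and_send(self, rgb):
        # user selects a target area; its sampled color goes to the peer
        self.peer.receive_color(rgb)

    def receive_color(self, rgb):
        # subsequent handwriting on this device is drawn in this color
        self.brush_color = rgb

tablet, phone = Device("tablet"), Device("phone")
tablet.peer, phone.peer = phone, tablet
tablet.pick_color_and_send((10, 20, 30))   # first device -> second device
phone.pick_color_and_send((200, 0, 0))     # second device -> first device
```

After the exchange, each device draws with the color sampled on the other.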
27. The method of embodiment 25 or 26 wherein the receiving information of the first color from the first device comprises:
receiving color information of the first color and texture information of a first texture from the first device;
displaying handwriting combining the first color and the first texture in response to the fourth user operation on the second device display screen.
28. The method of embodiment 25 or 26 wherein the receiving information of the first color from the first device comprises:
receiving fusion color information from the first device, wherein the fusion color information is information obtained by fusing a first color of a first target area and a third color of a third target area on the first device;
displaying, in response to the fourth user operation on the second device display screen, handwriting of the color obtained by fusing the first color and the third color.
29. The method of embodiment 25 or 26 wherein the receiving information of the first color from the first device comprises:
receiving fusion information from the first device, wherein the fusion information is information obtained by fusing a first color of a first target area on the first device and a third texture of a third target area;
displaying handwriting combining the first color and the third texture in response to the fourth user operation on the second device display screen.
30. The method of embodiment 25 or 26 wherein the receiving information of the first color from the first device comprises:
receiving fusion information from the first device, wherein the fusion information is obtained by fusing the first color and/or first texture of a first target area on the first device with the third color and/or third texture of a third target area;
displaying handwriting combining the first color, the first texture, the third color, and/or the third texture in response to the fourth user operation on the second device display screen.
31. A handwriting drawing device applied to a cross-device drawing system is characterized in that,
a display module for:
responsive to a first user operation on a first device display screen, displaying a first graphical interface;
responsive to a second user operation on the first device display screen, displaying a second graphical interface, the second graphical interface being different from the first graphical interface;
a processing module, configured to select a first target area on the second graphical interface in response to a third user operation on the first device display screen, where the first target area includes a first color;
a transceiver module, configured to send the color information of the first color to the second device, wherein the color information of the first color is used for indicating the second device to display handwriting of the first color.
32. A handwriting drawing device applied to a cross-device drawing system is characterized in that,
a transceiver module for receiving information of a first color from a first device;
and the display module is used for responding to a fourth user operation on the display screen of the second device and displaying the handwriting of the first color.
33. An electronic device, comprising: a processor and a memory;
The memory stores computer-executable instructions;
the processor executing computer-executable instructions stored in the memory, causing the processor to perform the method of any one of embodiments 13-30.
34. A computer readable storage medium, characterized in that the computer readable storage medium has stored therein a computer program or instructions, which when executed, implement the method according to any of embodiments 13-30.
35. A computer program product comprising a computer program or instructions which, when executed by a processor, implements the method of any one of embodiments 13-30.
36. A handwriting drawing method, which is applied to a first device, wherein the first device is wirelessly connected with at least one second device, and each of the first device and each of the second devices comprises: a texture information store, the method comprising:
receiving a first instruction, wherein the first instruction is used for indicating to acquire attribute information of a target area selected by a user on an interface of the first device, and the attribute information comprises: texture information, or texture information and color information;
Detecting a target area selected by the user on an interface of the first device;
acquiring attribute information of the target area;
storing the attribute information of the target area into a texture information memory of the first device, wherein the texture information memory of the first device is used for synchronizing the attribute information of the target area to the texture information memory of the at least one second device, and the attribute information of the target area in the texture information memory of the second device is used for displaying handwriting by the texture represented by the attribute information or displaying handwriting by the texture and color represented by the attribute information.
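Method 36's texture information memory, whose writes on the first device are synchronized to the stores of every connected second device, might look like the following sketch (a simple push model with invented names; the patent does not specify the synchronization transport):

```python
class TextureStore:
    """Per-device texture information memory.

    Writing attribute information (texture, or texture and color) into
    the first device's store propagates the entry to the store of every
    connected second device, so any of them can draw handwriting with it.
    """

    def __init__(self):
        self.entries = {}        # target-area key -> attribute info
        self.synced_stores = []  # stores on connected second devices

    def put(self, key, attributes):
        self.entries[key] = attributes
        for store in self.synced_stores:  # push to peer devices
            store.entries[key] = attributes

first = TextureStore()
second_a, second_b = TextureStore(), TextureStore()
first.synced_stores = [second_a, second_b]
first.put("area1", {"texture": "stripes", "color": (0, 128, 255)})
```

After the `put`, both second devices hold the same attribute information as the first device and can render handwriting with that texture and color.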
37. The method of embodiment 36 wherein the receiving the first instruction comprises:
receiving the first instruction from a stylus, the first instruction being sent by the stylus upon detecting that the stylus performs a first preset action, wherein the first device is wirelessly connected to the stylus; or
determining that the first instruction is received in response to detecting that the stylus performs a second preset action; or
determining that the first instruction is received in response to detecting that the first device performs a third preset action.
38. The method of embodiment 36 wherein the detecting the target area selected by the user on the interface of the first device comprises:
detecting a position of the user on a touch screen of the first device;
and acquiring the target area based on the position of the user on the touch screen of the first device.
39. The method of embodiment 38 wherein the detecting the user's location on the touch screen of the first device comprises:
detecting the position of the user's finger or finger joint on the touch screen of the first device, or detecting the position of a stylus used by the user on the touch screen of the first device, wherein the first device and the stylus are connected wirelessly.
40. The method according to embodiment 36, wherein the attribute information includes texture information, and acquiring the texture information of the target area includes:
detecting whether the target area contains the same pattern;
if yes, capturing a screenshot of the pattern to obtain an image of the pattern;
acquiring vector data of the pattern based on the image of the pattern;
taking vector data of the pattern as texture information of the target area;
If not, capturing a screenshot of the target area to obtain an image of the target area;
acquiring vector data of the target area based on the image of the target area;
and taking the vector data of the target area as texture information of the target area.
41. The method of embodiment 40 wherein the detecting whether the target area contains the same pattern comprises:
drawing a plurality of grids in the target area, wherein the grids have a first preset size;
acquiring first similarity of patterns in every two grids;
and if a first similarity greater than or equal to a preset similarity exists, and the proportion of first similarities greater than or equal to the preset similarity is greater than or equal to a preset proportion, determining that the target area contains the same pattern.
42. The method of embodiment 41, further comprising:
if no first similarity greater than or equal to the preset similarity exists, or the proportion is smaller than the preset proportion, increasing the size of the grids, and acquiring a second similarity of the patterns in every two grids after the size is increased;
if a second similarity greater than or equal to the preset similarity exists, and the proportion of second similarities greater than or equal to the preset similarity is greater than or equal to the preset proportion, determining that the target area contains the same pattern;
if no second similarity greater than or equal to the preset similarity exists, or the proportion of second similarities greater than or equal to the preset similarity is smaller than the preset proportion, continuing to increase the size of the grids until the size of the grids reaches a second preset size.
43. The method according to embodiment 36, wherein the target area includes a first target area and a second target area, and the acquiring attribute information of the target area includes:
acquiring first texture information of the first target area and second texture information of the second target area;
and carrying out fusion processing on the first texture information and the second texture information to obtain fusion texture information.
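Method 43's fusion of two texture samples, assuming the textures are equal-size grayscale tiles (an illustrative simplification; the claim leaves the exact fusion processing unspecified), could be per-pixel averaging:

```python
def fuse_textures(tile_a, tile_b):
    """Fuse two equal-size texture tiles by averaging overlapping
    pixel values. One simple fusion rule for illustration only."""
    return [[(a + b) // 2 for a, b in zip(row_a, row_b)]
            for row_a, row_b in zip(tile_a, tile_b)]
```

The fused tile would then be stored as the fusion texture information in the texture information memory.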

Claims (24)

  1. A cross-device rendering system, wherein the system comprises a first device and a second device,
    the first device is configured to:
    responsive to a first user operation on the first device display, displaying a first graphical interface;
    responsive to a second user operation on the first device display screen, displaying a second graphical interface, the second graphical interface being different from the first graphical interface;
    Responsive to a third user operation on the first device display screen, selecting a first target area on the second graphical interface, the first target area including a first color therein;
    the second device is configured to:
    and displaying the handwriting of the first color in response to a fourth user operation on the second device display screen.
  2. The system of claim 1, wherein
    the second device is further configured to:
    responsive to a fifth user operation on the second device display screen, selecting a second target area on the graphical interface displayed by the second device, the second target area including a second color therein;
    the first device is further configured to:
    and displaying the handwriting of the second color in response to a sixth user operation on the first device display screen.
  3. The system according to claim 1 or 2, wherein,
    the first device is further configured to:
    detecting the third user operation on the first device display screen in response to receiving a first instruction from a stylus, the first instruction being sent by the stylus upon detecting that the stylus performs a first preset action; or
    detecting the third user operation on the first device display screen in response to detecting that the stylus performs a second preset action; or
    detecting the third user operation on the first device display screen in response to detecting that the first device performs a third preset action.
  4. The system of claim 1 or 2, wherein the first target region further comprises a first texture;
    the second device is further configured to:
    displaying handwriting combining the first color and the first texture in response to the fourth user operation on the second device display screen.
  5. The system of claim 4, wherein the first device is further configured to:
    and after the first target area is selected on the second graphical interface, acquiring texture information of the first texture.
  6. The system of claim 5, wherein the first device is specifically configured to:
    detecting whether the first target area contains the same pattern;
    if yes, capturing a screenshot of the pattern to obtain an image of the pattern;
    acquiring vector data of the pattern based on the image of the pattern;
    Taking vector data of the pattern as texture information of the first texture;
    if not, capturing a picture of the first target area to obtain an image of the first target area;
    acquiring vector data of the first target area based on the image of the first target area;
    and taking the vector data of the first target area as texture information of the first texture.
  7. The system of claim 6, wherein the first device is specifically configured to:
    dividing the first target area into a plurality of grids, wherein each grid has a first preset size;
    acquiring first similarity of patterns in every two grids;
    and if a first similarity greater than or equal to a preset similarity exists, and the proportion of first similarities greater than or equal to the preset similarity is greater than or equal to a preset proportion, determining that the first target area contains the same pattern.
  8. The system of claim 7, wherein the first device is specifically configured to:
    if no first similarity greater than or equal to the preset similarity exists, or the proportion is smaller than the preset proportion, increasing the size of the grids, and acquiring a second similarity of the patterns in every two grids after the size is increased;
    if a second similarity greater than or equal to the preset similarity exists, and the proportion of second similarities greater than or equal to the preset similarity is greater than or equal to the preset proportion, determining that the first target area contains the same pattern;
    if no second similarity greater than or equal to the preset similarity exists, or the proportion of second similarities greater than or equal to the preset similarity is smaller than the preset proportion, continuing to increase the size of the grids until the size of the grids reaches a second preset size.
  9. The system of any of claims 1-8, wherein the first device is further configured to:
    responsive to a seventh user operation on the first device display screen, selecting a third target area on the second graphical interface, the third target area including a third color therein;
    the second device is further configured to:
    displaying, in response to the fourth user operation on the second device display screen, handwriting of the color obtained by fusing the first color and the third color.
  10. The system of any of claims 1-8, wherein the first device is further configured to:
    Responsive to a seventh user operation on the first device display screen, selecting a third target area on the second graphical interface, the third target area including a third texture therein;
    the second device is further configured to:
    displaying handwriting combining the first color and the third texture in response to the fourth user operation on the second device display screen.
  11. The system of any of claims 1-8, wherein the first target area further comprises a first texture, the first device further configured to:
    after a first target area is selected on the second graphical interface, displaying a color control and a texture control to be selected on the second graphical interface;
    detecting an eighth user operation of selection of the color control and/or texture control;
    the second device is configured to:
    displaying handwriting combining the first color and/or the first texture in response to the fourth user operation on the second device display screen.
  12. The system of claim 11, wherein the first device is further configured to:
    responsive to a seventh user operation on the first device display screen, selecting a third target area on the second graphical interface, the third target area including a third color and a third texture therein;
    Displaying a color control and a texture control to be selected on the second graphical interface;
    detecting a ninth user operation of the selection of the color control and/or the texture control;
    the second device is further configured to:
    displaying handwriting combining the first color, the first texture, the third color, and/or the third texture in response to the fourth user operation on the second device display screen.
  13. A handwriting drawing method applied to a cross-device drawing system, characterized in that the method is applied to a first device, the method comprising:
    responsive to a first user operation on the first device display, displaying a first graphical interface;
    responsive to a second user operation on the first device display screen, displaying a second graphical interface, the second graphical interface being different from the first graphical interface;
    responsive to a third user operation on the first device display screen, selecting a first target area on the second graphical interface, the first target area including a first color therein;
    and sending the color information of the first color to a second device, wherein the color information of the first color is used for indicating the second device to display handwriting of the first color.
  14. The method of claim 13, wherein the method further comprises:
    displaying, in response to a sixth user operation on the first device display screen, handwriting of a second color, wherein the second color is the color of a second target area selected on the graphical interface of the second device.
  15. The method of claim 13 or 14, wherein the responding to the third user operation on the first device display screen is preceded by:
    detecting the third user operation on the first device display screen in response to receiving a first instruction from a stylus, the first instruction being sent by the stylus upon detecting that the stylus performs a first preset action; or
    detecting the third user operation on the first device display screen in response to detecting that the stylus performs a second preset action; or
    detecting the third user operation on the first device display screen in response to detecting that the first device performs a third preset action.
  16. The method of any of claims 13-15, wherein the first target area further comprises a first texture;
    The sending the color information of the first color to a second device includes:
    and sending the color information of the first color and the texture information of the first texture to the second device.
  17. The method of claim 16, wherein after selecting the first target area on the second graphical interface, further comprising:
    and obtaining texture information of the first texture.
  18. The method of claim 17, wherein the obtaining texture information for the first texture comprises:
    detecting whether the first target area contains the same pattern;
    if yes, capturing a screenshot of the pattern to obtain an image of the pattern;
    acquiring vector data of the pattern based on the image of the pattern;
    taking vector data of the pattern as texture information of the first texture;
    if not, capturing a picture of the first target area to obtain an image of the first target area;
    acquiring vector data of the first target area based on the image of the first target area;
    and taking the vector data of the first target area as texture information of the first texture.
  19. The method of claim 18, wherein the detecting whether the first target region contains the same pattern comprises:
    Dividing the first target area into a plurality of grids, wherein each grid has a first preset size;
    acquiring first similarity of patterns in every two grids;
    and if a first similarity greater than or equal to a preset similarity exists, and the proportion of first similarities greater than or equal to the preset similarity is greater than or equal to a preset proportion, determining that the first target area contains the same pattern.
  20. The method of claim 19, wherein the method further comprises:
    if no first similarity is greater than or equal to the preset similarity, or the proportion is smaller than the preset proportion, increasing the size of the grids, and acquiring a second similarity between the patterns in every two grids after the size is increased;
    if there is a second similarity greater than or equal to the preset similarity, and the proportion of second similarities greater than or equal to the preset similarity is greater than or equal to the preset proportion, determining that the first target area contains the same pattern;
    if no second similarity is greater than or equal to the preset similarity, or the proportion of second similarities greater than or equal to the preset similarity is smaller than the preset proportion, continuing to increase the size of the grids until the size of the grids reaches a second preset size.
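The repeated-pattern check in claims 19 and 20 can be sketched as follows. This is a hypothetical illustration, not the patented implementation: grid contents are modeled as flat greyscale tuples and compared with a toy mean-absolute-difference similarity (a real system would compare image tiles, e.g. by normalized cross-correlation), and the function names, thresholds, and step size are all assumptions.

```python
# Sketch of claims 19-20: divide the area into grids, compare every pair
# of grids, and declare "same pattern" when enough pairs exceed a
# similarity threshold; otherwise retry with larger grids (claim 20).
from itertools import combinations

def similarity(a, b):
    """Toy similarity in [0, 1]: 1 minus the mean absolute channel difference."""
    diff = sum(abs(x - y) for x, y in zip(a, b)) / len(a)
    return 1.0 - diff / 255.0

def contains_same_pattern(grids, preset_similarity=0.9, preset_proportion=0.5):
    """Claim 19's test: enough grid pairs must be at least preset_similarity alike."""
    pairs = list(combinations(grids, 2))
    if not pairs:
        return False
    sims = [similarity(a, b) for a, b in pairs]
    matching = [s for s in sims if s >= preset_similarity]
    if not matching:
        return False
    return len(matching) / len(sims) >= preset_proportion

def detect_with_growing_grids(grids_at_size, first_size, second_size, step=8):
    """Claim 20's fallback: grow the grid size until a match or the second preset size."""
    size = first_size
    while size <= second_size:
        if contains_same_pattern(grids_at_size(size)):
            return True
        size += step
    return False
```

With four identical tiles every pair matches, so the proportion is 1.0 and the check succeeds; with alternating black and white tiles only a third of the pairs match, which falls below the default proportion.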
  21. The method of any of claims 13-20, wherein after selecting a first target area on the second graphical interface, further comprising:
    responsive to a seventh user operation on the first device display screen, selecting a third target area on the second graphical interface, the third target area including a third color therein;
    fusing the first color and the third color to obtain fused color information;
    the sending the color information of the first color to a second device includes:
    and sending the fused color information to the second device, wherein the fused color information is used for instructing the second device to display handwriting in the color obtained by fusing the first color and the third color.
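Claim 21 does not specify a blend rule for the two colors, so a minimal sketch can assume a simple per-channel average of the first and third colors before the fused RGB value is sent to the second device; the function name and the averaging rule are assumptions, not part of the claim.

```python
# Hypothetical color fusion for claim 21: per-channel integer average
# of two (R, G, B) tuples selected from the two target areas.
def fuse_colors(first, third):
    """Average two RGB tuples channel by channel."""
    return tuple((a + b) // 2 for a, b in zip(first, third))

fused = fuse_colors((255, 0, 0), (0, 0, 255))  # red fused with blue
# fused == (127, 0, 127), a purple the second device would use for the handwriting
```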
  22. The method of any of claims 13-20, wherein after selecting a first target area on the second graphical interface, further comprising:
    responsive to a seventh user operation on the first device display screen, selecting a third target area on the second graphical interface, the third target area including a third texture therein;
    fusing the first color and the third texture to obtain fusion information;
    the sending the color information of the first color to a second device includes:
    and sending the fusion information to the second device, wherein the fusion information is used for instructing the second device to display handwriting combining the first color and the third texture.
  23. The method of any of claims 13-20, wherein the first target area further comprises a first texture, the method further comprising, after selecting the first target area on the second graphical interface:
    displaying a color control and a texture control to be selected on the second graphical interface;
    detecting an eighth user operation selecting the color control and/or the texture control.
  24. The method of claim 23, wherein after the eighth user operation selecting the color control and/or texture control is detected, the method further comprises:
    responsive to a seventh user operation on the first device display screen, selecting a third target area on the second graphical interface, the third target area including a third color and a third texture therein;
    displaying a color control and a texture control to be selected on the second graphical interface;
    detecting a ninth user operation selecting the color control and/or the texture control;
    fusing first information indicated by the eighth user operation with second information indicated by the ninth user operation to obtain fusion information, wherein the first information is the color information of the first color and/or the texture information of the first texture, and the second information is the color information of the third color and/or the texture information of the third texture;
    the sending the color information of the first color to a second device includes:
    and sending the fusion information to the second device, wherein the fusion information is used for instructing the second device to display handwriting that combines the first color, the first texture, the third color, and/or the third texture.
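The fusion information of claim 24 can be pictured as a small payload that carries only the attributes (color and/or texture) the user operations selected from each target area. This is a sketch under assumptions: the field names, the dict shapes, and the JSON transport are illustrative inventions, not part of the claim.

```python
# Hypothetical fusion-information payload for claim 24: merge the
# user-selected attributes of two target areas into one message for
# the second device.
import json

def build_fusion_info(first, third, pick_first, pick_third):
    """Merge selected attributes of two target areas.

    first/third: dicts that may hold 'color' (RGB list) and 'texture' (vector data).
    pick_first/pick_third: subsets of {'color', 'texture'} chosen by the
    eighth and ninth user operations.
    """
    info = {}
    for source, picks, tag in ((first, pick_first, "first"), (third, pick_third, "third")):
        for key in picks:
            info[f"{tag}_{key}"] = source[key]
    return json.dumps(info)

payload = build_fusion_info(
    {"color": [255, 0, 0], "texture": "stripes"},
    {"color": [0, 0, 255], "texture": "dots"},
    pick_first={"color"},
    pick_third={"texture"},
)
```

Here the eighth operation picked only the first area's color and the ninth picked only the third area's texture, so the payload instructs the second device to draw handwriting in the first color with the third texture.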
CN202280002664.4A 2021-09-16 2022-08-04 Cross-device drawing system Pending CN116137915A (en)

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
CN202111089486 2021-09-16
CN2021110894867 2021-09-16
CN202111613488.1A CN114816135B (en) 2021-09-16 2021-12-27 Cross-device drawing system
CN2021116134881 2021-12-27
PCT/CN2022/110186 WO2023040505A1 (en) 2021-09-16 2022-08-04 Cross-device drawing system

Publications (1)

Publication Number Publication Date
CN116137915A 2023-05-19

Family

Family ID: 82527722

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202111613488.1A Active CN114816135B (en) 2021-09-16 2021-12-27 Cross-device drawing system
CN202280002664.4A Pending CN116137915A (en) 2021-09-16 2022-08-04 Cross-device drawing system

Family Applications Before (1)

Application Number Title Priority Date Filing Date
CN202111613488.1A Active CN114816135B (en) 2021-09-16 2021-12-27 Cross-device drawing system

Country Status (2)

Country Link
CN (2) CN114816135B (en)
WO (1) WO2023040505A1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114816135B (en) * 2021-09-16 2023-11-03 华为技术有限公司 Cross-device drawing system
CN117762606A (en) * 2022-09-23 2024-03-26 华为技术有限公司 Equipment control method and electronic equipment

Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102306099A (en) * 2011-08-23 2012-01-04 上海网达软件有限公司 Cross-platform graphic display method and graphic display system on handheld terminal equipment
KR20150018237A (en) * 2013-08-09 2015-02-23 삼성전자주식회사 Method of providing user customized writing in electronic device, and electronic device for performing the same
KR20150071971A (en) * 2013-12-19 2015-06-29 삼성전자주식회사 Electronic Device And Method For Providing Graphical User Interface Of The Same
CN104796455A (en) * 2015-03-12 2015-07-22 安徽讯飞皆成软件技术有限公司 Cross-platform multi-screen interacting method, device and system
US10739988B2 (en) * 2016-11-04 2020-08-11 Microsoft Technology Licensing, Llc Personalized persistent collection of customized inking tools
CN106775374A (en) * 2016-11-17 2017-05-31 广州视源电子科技股份有限公司 Color acquisition methods and device based on touch-screen
CN107422974B (en) * 2017-07-21 2020-01-07 广州视源电子科技股份有限公司 Handwriting writing display method and system based on dual systems, storage medium and equipment
CN110083324A (en) * 2019-04-30 2019-08-02 华为技术有限公司 Method, apparatus, electronic equipment and the computer storage medium of Image Rendering
CN110187810B (en) * 2019-05-27 2020-10-16 维沃移动通信有限公司 Drawing method and terminal equipment
CN112083867A (en) * 2020-07-29 2020-12-15 华为技术有限公司 Cross-device object dragging method and device
CN113362410B (en) * 2021-05-31 2023-04-14 维沃移动通信(杭州)有限公司 Drawing method, drawing device, electronic apparatus, and medium
CN114816135B (en) * 2021-09-16 2023-11-03 华为技术有限公司 Cross-device drawing system

Also Published As

Publication number Publication date
CN114816135A (en) 2022-07-29
CN114816135B (en) 2023-11-03
WO2023040505A1 (en) 2023-03-23

Similar Documents

Publication Publication Date Title
CN106775313B (en) Split screen operation control method and mobile terminal
US11158057B2 (en) Device, method, and graphical user interface for processing document
CN114816135B (en) Cross-device drawing system
CN103729055A (en) Multi display apparatus, input pen, multi display apparatus controlling method, and multi display system
KR20130088104A (en) Mobile apparatus and method for providing touch-free interface
EP3721327B1 (en) Dynamic interaction adaptation of a digital inking device
RU2699236C2 (en) Telephone panel
CN115705130A (en) Multitask management method and terminal equipment
WO2020046452A1 (en) Computationally efficient human-computer interface for collaborative modification of content
CN111127595A (en) Image processing method and electronic device
CN111596817A (en) Icon moving method and electronic equipment
CN109542307B (en) Image processing method, device and computer readable storage medium
US20190369935A1 (en) Electronic whiteboard, electronic whiteboard system and control method thereof
US20160132478A1 (en) Method of displaying memo and device therefor
CN112416199A (en) Control method and device and electronic equipment
KR102077203B1 (en) Electronic apparatus and the controlling method thereof
CN114115691B (en) Electronic equipment and interaction method and medium thereof
CN106933463A (en) A kind of method and apparatus that object is shown in screen
EP2943039A1 (en) User terminal device for generating playable object, and interaction method therefor
WO2022001542A1 (en) Information processing method and apparatus, and storage medium, and electronic device
CN113872849A (en) Message interaction method and device and electronic equipment
CN111143300A (en) File compression method and electronic equipment
US10496241B2 (en) Cloud-based inter-application interchange of style information
CN115134317B (en) Message display method, device, storage medium and electronic device
JP2015005088A (en) Image processing apparatus, and program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination