CN116700554A - Information display method, electronic device and readable storage medium - Google Patents


Info

Publication number
CN116700554A
CN116700554A
Authority
CN
China
Prior art keywords: information, drawn, hand, interface, target
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202211301237.4A
Other languages
Chinese (zh)
Other versions
CN116700554B (en)
Inventor
黄奔
范明超
Current Assignee
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date
Filing date
Publication date
Application filed by Honor Device Co Ltd filed Critical Honor Device Co Ltd
Priority to CN202211301237.4A priority Critical patent/CN116700554B/en
Priority to CN202410552358.9A priority patent/CN118502625A/en
Publication of CN116700554A publication Critical patent/CN116700554A/en
Application granted granted Critical
Publication of CN116700554B publication Critical patent/CN116700554B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G: PHYSICS
        • G06: COMPUTING; CALCULATING OR COUNTING
            • G06F: ELECTRIC DIGITAL DATA PROCESSING
                • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
                    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
                        • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
                            • G06F3/0481: Interaction techniques based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
                                • G06F3/04817: Interaction techniques using icons
                            • G06F3/0484: Interaction techniques for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
                                • G06F3/0486: Drag-and-drop
                            • G06F3/0487: Interaction techniques using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
                                • G06F3/0488: Interaction techniques using a touch-screen or digitiser, e.g. input of commands through traced gestures
                                    • G06F3/04883: Interaction techniques for inputting data by handwriting, e.g. gesture or text
                • G06F9/00: Arrangements for program control, e.g. control units
                    • G06F9/06: Arrangements for program control using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
                        • G06F9/44: Arrangements for executing specific programs
                            • G06F9/451: Execution arrangements for user interfaces

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application discloses an information display method, an electronic device, and a readable storage medium, belonging to the technical field of terminals. The method is applied to an electronic device and includes the following steps: while a first interface of a first application program and a second interface of a second application program are displayed, in response to a selection operation on target hand-drawn information in the first interface, displaying a selected frame at the region where the target hand-drawn information is located, where the target hand-drawn information lies inside the selected frame and is any hand-drawn information displayed in the first interface; and, in response to a target operation, displaying the target hand-drawn information in the second interface, where the target operation is an operation that acts on the region of the selected frame and then drags the target hand-drawn information into the second interface, or a scribing operation from the region of the selected frame into the second interface. Because the selected target hand-drawn information can be brought into the second interface by a single drag or scribing operation, the information display efficiency is improved.

Description

Information display method, electronic device and readable storage medium
Technical Field
The present application relates to the field of terminal technologies, and in particular, to an information display method, an electronic device, and a readable storage medium.
Background
With the development of terminal technology, electronic devices offer increasingly rich note-taking functions: a user can record various kinds of note information, such as meeting minutes, reading notes, and classroom notes, in a note application in a hand-drawn manner. Because hand-drawn note information can be stored in picture form, the user can also control the electronic device to display a picture corresponding to the hand-drawn note information in other application programs.
However, to display hand-drawn note information as a picture in another application program, the user must first save the hand-drawn note information as a picture and then select that picture again in the other application program, so that the electronic device displays the reselected picture there. This process is cumbersome, and the display efficiency of the information is low.
Disclosure of Invention
The application provides an information display method, an electronic device, and a readable storage medium, which can solve the problem in the related art that the information display process is cumbersome and the display efficiency is therefore low. The technical solution is as follows:
In a first aspect, a method for displaying information is provided, and the method is applied to an electronic device, and includes:
while a first interface of a first application program and a second interface of a second application program are displayed, in response to a selection operation on target hand-drawn information in the first interface, displaying a selected frame at the region where the target hand-drawn information is located, where the target hand-drawn information lies inside the selected frame and is any hand-drawn information displayed in the first interface;
and, in response to a target operation, displaying the target hand-drawn information in the second interface, where the target operation is an operation that acts on the region of the selected frame and then drags the target hand-drawn information into the second interface, or a scribing operation from the region of the selected frame into the second interface.
In this way, when the user needs to display target hand-drawn information from the first interface of the first application program in the second interface of the second application program, the user can simply select the target hand-drawn information and drag it into the second interface. The user neither has to save the target hand-drawn information as a picture manually nor perform a picture selection operation in the other application interface, so the operation of displaying the target hand-drawn information in other application interfaces is simplified and its display efficiency is improved.
As an example of the present application, in the case of displaying a first interface of a first application program and a second interface of a second application program, displaying, in response to a selection operation on target hand-drawn information in the first interface, a selected frame at the region where the target hand-drawn information is located includes:
receiving a user's selection operation on a lasso tool in the first interface while the first interface and the second interface are displayed;
receiving the user's box selection operation on the target hand-drawn information in the first interface using the lasso tool;
and displaying the selected frame at the region where the target hand-drawn information is located based on the box selection region of the box selection operation.
Performing the box selection operation on the target hand-drawn information with the lasso tool allows the target hand-drawn information to be selected precisely, improving the accuracy of information selection.
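As a rough illustration of the box selection described above, the sketch below models strokes as lists of points and treats a stroke as selected when all of its points fall inside the bounding region of the lasso track. The function names and data shapes are hypothetical; the patent does not specify an implementation.

```python
def lasso_bounds(lasso_track):
    """Bounding rectangle (x0, y0, x1, y1) of the user's lasso trace."""
    xs = [p[0] for p in lasso_track]
    ys = [p[1] for p in lasso_track]
    return min(xs), min(ys), max(xs), max(ys)

def select_strokes(strokes, lasso_track):
    """Return the strokes whose points all lie inside the lasso's bounds.

    A real lasso would test against the traced polygon; a bounding
    rectangle keeps this sketch short.
    """
    x0, y0, x1, y1 = lasso_bounds(lasso_track)
    return [s for s in strokes
            if all(x0 <= x <= x1 and y0 <= y <= y1 for x, y in s)]
```

A stroke partly outside the lasso is left unselected here; whether the actual method clips or excludes such strokes is not stated in the text.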
As an example of the present application, displaying the selected frame at the region where the target hand-drawn information is located based on the box selection region of the box selection operation includes:
in response to the box selection operation, displaying a primary selection frame according to the track of the box selection operation;
identifying the position of the hand-drawn handwriting of the target hand-drawn information;
and adjusting the shape of the primary selection frame according to the position of the hand-drawn handwriting of the target hand-drawn information, so that the adjusted selected frame has the same shape as the outline of the hand-drawn handwriting.
This edge-shrinking of the primary selection frame narrows its selection range and thus reduces the amount of computation later needed to acquire the pixel information of the hand-drawn handwriting.
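The edge-shrinking step can be pictured as tightening the primary selection frame to the outline of the selected handwriting. The following sketch, with hypothetical names and a simple rectangular outline standing in for the true contour, shows how the shrunk frame bounds only the stroke points plus a small padding.

```python
def shrink_border(selected_strokes, padding=2):
    """Tighten the initial lasso border to the extent of the selected
    handwriting, reducing the region later scanned for pixel data.

    Returns (x0, y0, x1, y1). A rectangle approximates the handwriting
    outline; the described method matches the contour shape exactly.
    """
    pts = [p for s in selected_strokes for p in s]
    xs = [p[0] for p in pts]
    ys = [p[1] for p in pts]
    return (min(xs) - padding, min(ys) - padding,
            max(xs) + padding, max(ys) + padding)
```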
As an example of the present application, the target operation is an operation that acts on the region of the selected frame and then drags the target hand-drawn information into the second interface; displaying the target hand-drawn information in the second interface in response to the target operation includes:
in response to a long-press operation on the target hand-drawn information, generating a drag picture of the target hand-drawn information;
in response to a drag operation on the drag picture of the target hand-drawn information, moving the drag picture along the track of the drag operation;
and displaying the target hand-drawn information in the second interface when the drag operation ends in the second interface.
By displaying the drag picture and moving it along the track of the drag operation, the user can clearly follow the progress of the current drag while the target hand-drawn information is transferred to the second interface.
As an example of the present application, generating the drag picture of the target hand-drawn information in response to a long-press operation on the target hand-drawn information includes:
in response to the long-press operation on the target hand-drawn information, generating a first layer in the first interface according to the extent of the region where the target hand-drawn information is located;
acquiring the pixel information of the hand-drawn handwriting of the target hand-drawn information;
and displaying the pixel information of the hand-drawn handwriting in the first layer to obtain the drag picture of the target hand-drawn information.
Because the drag picture is generated from the first layer and the pixels of the hand-drawn handwriting, it accurately reflects whether the dragged content is the hand-drawn information to be displayed in the second interface, improving the accuracy of information display.
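A minimal model of the drag-picture generation: a fresh layer is allocated for the extent of the selected region, and only handwriting pixels are copied into it, leaving the rest transparent. The 2D-list canvas, the `None`-means-transparent convention, and the names are illustrative assumptions, not the patent's implementation.

```python
def make_drag_picture(canvas, border):
    """Copy the handwriting pixels inside the selected border into a new
    layer (the drag picture); untouched cells stay transparent (None)."""
    x0, y0, x1, y1 = border
    layer = [[None] * (x1 - x0) for _ in range(y1 - y0)]
    for y in range(y0, y1):
        for x in range(x0, x1):
            if canvas[y][x] is not None:   # a handwriting pixel
                layer[y - y0][x - x0] = canvas[y][x]
    return layer
```

The returned layer is what would be moved along the drag track while the original handwriting stays in place in the first interface.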
As one example of the application, the electronic device includes a lasso module and a view module;
generating, in response to the long-press operation on the target hand-drawn information, a first layer in the first interface according to the extent of the region where the target hand-drawn information is located includes:
the lasso module, in response to the long-press operation on the target hand-drawn information, sends a long-press message to the view module;
the view module, upon receiving the long-press message, generates the first layer in the first interface according to the extent of the region where the target hand-drawn information is located;
acquiring the pixel information of the hand-drawn handwriting of the target hand-drawn information includes:
the view module acquires the pixel information of the hand-drawn handwriting;
the view module returns the pixel information of the hand-drawn handwriting to the lasso module;
displaying the pixel information of the hand-drawn handwriting in the first layer to obtain the drag picture of the target hand-drawn information includes:
the lasso module, upon receiving the pixel information of the hand-drawn handwriting, displays it in the first layer to obtain the drag picture of the target hand-drawn information.
Assigning different operations to different modules makes the executor of each operation clearer and more specific.
As an example of the present application, before displaying the target hand-drawn information in the second interface when the drag operation ends in the second interface, the method further includes:
caching the hand-drawn handwriting of the target hand-drawn information as a target picture when the data processing modules corresponding to the second application program and the first application program are different, where a data processing module is used to process the information of its corresponding application program;
acquiring storage address information of the target picture;
adding the storage address information to the drag picture;
and displaying the target hand-drawn information in the second interface when the drag operation ends in the second interface includes:
acquiring the storage address information from the drag picture when the drag operation ends in the second interface;
deleting the drag picture;
acquiring the target picture based on the storage address information;
and displaying the target picture in the second interface.
Thus, when the drag operation on the drag picture ends, the second application program can display the target picture according to its storage address information. The user neither has to save the target hand-drawn information as a picture manually nor perform a picture selection operation in the second interface, which improves the efficiency of displaying the target hand-drawn information in the second interface.
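The cross-application path described above can be sketched as follows: the handwriting is cached as a picture, only the storage address travels with the drag payload, and the second application resolves that address at drop time. The in-memory `picture_store`, the path format, and the function names are all hypothetical stand-ins for the real file cache.

```python
picture_store = {}  # stands in for the file system / picture cache

def render_to_picture(strokes):
    # Placeholder rendering: a flat tuple of all stroke points.
    return tuple(p for s in strokes for p in s)

def start_cross_app_drag(strokes):
    """First app: cache the handwriting as a picture and attach only its
    storage address to the drag payload."""
    address = f"cache/drag_{len(picture_store)}.png"  # hypothetical path
    picture_store[address] = render_to_picture(strokes)
    return {"type": "picture", "address": address}

def drop_in_second_app(payload):
    """Second app: resolve the address and display the cached picture."""
    return picture_store[payload["address"]]
```

Carrying an address instead of the pixel data keeps the drag payload small; the second application loads the cached picture only once the drop actually lands in its interface.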
As an example of the present application, before displaying the target hand-drawn information in the second interface when the drag operation ends in the second interface, the method further includes:
sending the hand-drawn lattice information of the target hand-drawn information to the data processing module if the data processing modules corresponding to the second application program and the first application program are the same, where the hand-drawn lattice information represents the hand-drawn track of the target hand-drawn information and the data processing module is used to process the information of its corresponding application program;
and displaying the target hand-drawn information in the second interface when the drag operation ends in the second interface includes:
deleting the drag picture when the drag operation ends in the second interface;
acquiring the hand-drawn lattice information from the data processing module;
and displaying the target hand-drawn information in the second interface according to the hand-drawn lattice information.
Thus, when the data processing modules corresponding to the second application program and the first application program are the same, sending the hand-drawn lattice information of the handwriting to that module allows the hand-drawn handwriting of the target hand-drawn information displayed in the second interface to be edited again, improving the flexibility of displaying the target hand-drawn information.
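By contrast with the cross-application case, when both applications share a data processing module the payload can carry the hand-drawn lattice (point) data itself, so the strokes arrive as editable handwriting rather than a flat picture. The sketch below is a hypothetical illustration of that difference; the names and payload shape are not from the patent.

```python
def start_same_module_drag(strokes):
    """Same data-processing module on both sides: hand over the raw
    lattice (point) data so the strokes stay individually editable."""
    return {"type": "lattice", "points": [list(s) for s in strokes]}

def drop_same_module(payload, canvas_strokes):
    """Second interface re-inserts the strokes as editable handwriting
    rather than displaying a flat picture."""
    canvas_strokes.extend(payload["points"])
    return canvas_strokes
```

Because the second interface receives strokes, not pixels, the user can reshape or erase individual lines after the drop, which a cached picture would not allow.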
In a second aspect, an electronic device is provided. The electronic device includes a processor and a memory, where the memory is configured to store a program that supports the electronic device in executing the information display method provided in the first aspect, as well as the data involved in implementing that method. The processor is configured to execute the program stored in the memory. The electronic device may further include a communication bus for establishing a connection between the processor and the memory. The processor is configured to:
while a first interface of a first application program and a second interface of a second application program are displayed, in response to a selection operation on target hand-drawn information in the first interface, display a selected frame at the region where the target hand-drawn information is located, where the target hand-drawn information lies inside the selected frame and is any hand-drawn information displayed in the first interface;
and, in response to a target operation, display the target hand-drawn information in the second interface, where the target operation is an operation that acts on the region of the selected frame and then drags the target hand-drawn information into the second interface, or a scribing operation from the region of the selected frame into the second interface.
As one example of the application, the processor is configured to:
receiving a user selection operation of a lasso tool in the first interface under the condition that the first interface and the second interface are displayed;
receiving a box selection operation of the user in the first interface for the target hand-drawn information by using the lasso tool;
and displaying the selected frame at the region where the target hand-drawn information is located based on the frame selection region of the frame selection operation.
As one example of the application, the processor is configured to:
responding to the frame selection operation, and displaying a primary selection frame according to a frame selection track of the frame selection operation;
identifying the position of the hand-drawn handwriting of the target hand-drawn information;
and adjusting the shape of the primary selection border according to the position of the hand-drawn handwriting of the target hand-drawn information so that the shape of the selected border obtained after adjustment is the same as the outline shape of the hand-drawn handwriting of the target hand-drawn information.
As one example of the application, the processor is configured to:
responding to long-press operation of the target hand-drawn information, and generating a drag picture of the target hand-drawn information;
responding to the dragging operation of the dragging picture of the target hand-drawn information, and moving the dragging picture of the target hand-drawn information according to the dragging track of the dragging operation;
And displaying the target hand-drawn information in the second interface when the drag operation is finished in the second interface.
As one example of the application, the processor is configured to:
responding to long-press operation of the target hand-drawn information, and generating a first layer in the first interface according to the area range of the area where the target hand-drawn information is located;
acquiring pixel information of hand-drawn handwriting of the target hand-drawn information;
and displaying the pixel information of the hand-drawn handwriting in the first layer to obtain a drag picture of the target hand-drawn information.
As one example of the application, the electronic device includes a lasso module and a view module;
the processor is configured to:
the lasso module responds to the long-press operation of the target hand-drawn information and sends a long-press message to the view module;
the view module generates the first layer in the first interface according to the area range of the area where the target hand-drawn information is located under the condition that the long-press message is received;
the processor is configured to:
the view module acquires pixel information of the hand-drawn handwriting;
The view module returns pixel information of the hand-drawn handwriting to the lasso module;
the processor is configured to:
and the lasso module, upon receiving the pixel information of the hand-drawn handwriting, displays it in the first layer to obtain the drag picture of the target hand-drawn information.
As an example of the present application, the processor is further configured to:
caching the hand-drawn handwriting of the target hand-drawn information into a target picture under the condition that the data processing modules corresponding to the second application program and the first application program are different, wherein the data processing modules are used for processing the information of the corresponding application program;
acquiring storage address information of the target picture;
adding the storage address information to the dragging picture;
the processor is configured to:
acquiring the storage address information from the drag picture under the condition that the drag operation is finished in the second interface;
deleting the dragging picture;
acquiring the target picture based on the storage address information;
and displaying the target picture in the second interface.
As an example of the present application, the processor is further configured to:
if the data processing modules corresponding to the second application program and the first application program are the same, send the hand-drawn lattice information of the target hand-drawn information to the data processing module, where the hand-drawn lattice information represents the hand-drawn track of the target hand-drawn information and the data processing module is used to process the information of its corresponding application program;
the processor is configured to:
deleting the drag picture when the drag operation is finished in the second interface;
acquiring the hand-drawn lattice information from the data processing module;
and displaying the target hand-drawn information in the second interface according to the hand-drawn lattice information.
In a third aspect, a computer-readable storage medium is provided, storing one or more programs configured to be executed by one or more processors, the one or more programs including instructions that, when run on a computer, cause the computer to perform the information display method of the first aspect.
In a fourth aspect, a computer program product containing instructions is provided; when the instructions run on a computer, they cause the computer to perform the information display method of the first aspect.
The technical effects of the second, third, and fourth aspects are similar to those of the corresponding technical means in the first aspect and are not repeated here.
Drawings
FIG. 1 is a schematic diagram of an application scenario shown in accordance with an exemplary embodiment;
FIG. 2 is a schematic diagram of an application scenario illustrated in accordance with another exemplary embodiment;
FIG. 3 is a schematic diagram of an application scenario illustrated in accordance with another exemplary embodiment;
FIG. 4 is a schematic diagram of an application scenario illustrated in accordance with another exemplary embodiment;
FIG. 5 is a schematic diagram of an application scenario illustrated in accordance with another exemplary embodiment;
FIG. 6 is a schematic diagram of an application scenario illustrated in accordance with another exemplary embodiment;
FIG. 7 is a schematic diagram of an application scenario illustrated in accordance with another exemplary embodiment;
FIG. 8 is a schematic diagram of an application scenario illustrated in accordance with another exemplary embodiment;
FIG. 9 is a schematic diagram of a software architecture of an electronic device, according to an example embodiment;
FIG. 10 is a flow chart of a method of displaying information according to an exemplary embodiment;
FIG. 11 is a flow chart of a method of displaying information according to another exemplary embodiment;
FIG. 12 is a flow chart of a method of displaying information according to another exemplary embodiment;
FIG. 13 is a flow chart of a method of displaying information according to another exemplary embodiment;
FIG. 14 is a schematic structural diagram of an electronic device according to another exemplary embodiment.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the present application more apparent, embodiments of the present application will be described in further detail below with reference to the accompanying drawings.
It should be understood that references to "a plurality" in this disclosure mean two or more. In the description of the present application, "/" means "or" unless otherwise indicated; for example, A/B may represent A or B. "And/or" herein merely describes an association relationship between associated objects and indicates that three relationships may exist; for example, A and/or B may mean: A exists alone, A and B exist together, or B exists alone. In addition, to describe the technical solution of the present application clearly, the words "first", "second", etc. are used to distinguish between identical or similar items having substantially the same function and purpose. Those skilled in the art will appreciate that the words "first", "second", and the like do not limit the number of items or the order of execution, and that items so labeled are not necessarily different.
Reference in the specification to "one embodiment" or "some embodiments" or the like means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," and the like in the specification are not necessarily all referring to the same embodiment, but mean "one or more but not all embodiments" unless expressly specified otherwise. The terms "comprising," "including," "having," and variations thereof mean "including but not limited to," unless expressly specified otherwise.
In one possible scenario, where the electronic device is a tablet computer, after a user draws a pattern by hand in a note application, the user may want to apply the hand drawing in other applications, for example sending the hand drawing as mail content. Illustratively, the user draws the hand-drawn pattern 1 in a hand-drawn manner in the note interface of the note application. In the case where the user wants to send the hand-drawn pattern 1 as mail content, referring to (a) of fig. 1, the user may perform a screenshot operation on the hand-drawn pattern 1 in the note interface, which may be a multi-finger upward sliding operation in the note interface; referring to (b) of fig. 1, in response to the screenshot operation on the hand-drawn pattern 1, the tablet computer may capture the hand-drawn pattern 1 to obtain a first picture 2 including the hand-drawn pattern 1, and store the first picture 2 in the photo album; alternatively, the user directly stores the hand-drawn pattern 1 and selects a picture format as the storage format, thereby obtaining the first picture 2 including the hand-drawn pattern 1; thereafter, referring to (c) of fig. 1, the user may slide a certain distance upward from the bottom of the note interface; in response to the user sliding a certain distance upward from the bottom of the note interface, the electronic device moves the note application to run in the background and displays the desktop of the tablet computer as in (d) of fig. 1. Thereafter, for the operations by which the user sends the first picture 2 as mail content, reference may be made to the scenario shown in fig. 2 described below.
Referring to (a) of fig. 2, the user may then click on the application icon of the mailbox application in the desktop; in response to the click operation on the application icon of the mailbox application, if a mail account is logged in to the mailbox application, the tablet computer displays the mailbox interface shown in (b) of fig. 2, and the user may click the new mail control 3 in the mailbox interface; in response to the user's click operation on the new mail control 3, the tablet computer displays the mail editing interface shown in (c) of fig. 2, and the user may click the insert picture control 4 in the mail editing interface; in response to the click operation on the insert picture control 4, the tablet computer may display the picture selection interface shown in (d) of fig. 2, in which recently stored pictures, including photos, may be displayed, and the user may select the first picture 2 to be sent in the picture selection interface. After the selection is completed, referring to (e) of fig. 2, the user may click on the "complete" control; in response to the click operation on the "complete" control, referring to (f) of fig. 2, the tablet computer may display the first picture 2 in the mail editing interface, and the user may perform other editing operations in the mail editing interface, such as editing the recipient, the mail subject, and the like, and, after the editing is completed, send the edited content and the first picture 2 as mail content.
In the above approach, to display hand-drawn note information as a picture in the application interface of another application program, the user must first store the hand-drawn note information as a picture and then reselect that picture in the application interface of the other application program, so that the electronic device displays the reselected picture there. The process is cumbersome, and the information display efficiency is low.
In order to improve the efficiency of displaying hand-drawn note information in the application interfaces of other application programs, an embodiment of the present application provides an information display method. When a user needs to display, on a second interface of a second application program, any piece of hand-drawn information displayed in a first interface of a first application program, the user can directly select that hand-drawn information, for example, select target hand-drawn information, and then drag the target hand-drawn information to the second interface; alternatively, directly after selecting the target hand-drawn information, the user performs a line drawing operation from the region where the target hand-drawn information is located to the second interface, so that the target hand-drawn information is displayed in the second interface. Because the user neither needs to manually store the target hand-drawn information as a picture nor needs to perform a picture selection operation in the other application interface in order to display the target hand-drawn information there, the operation of displaying hand-drawn information in other application interfaces is simplified, and the display efficiency is improved.
For easy understanding, before describing the method provided by the embodiment of the present application in detail, an application scenario related to the embodiment of the present application is described next by taking an example that the electronic device is a tablet computer and the tablet computer has a hand-drawn note function.
Referring to fig. 3, fig. 3 is a schematic diagram illustrating an application scenario according to an exemplary embodiment. In one possible scenario, in the case where the first note interface of the first note application and the mail editing interface of the mailbox application can be displayed in a split screen manner in the tablet computer, if the user needs to send the hand-drawn pattern 1 displayed in the first note interface as mail content, referring to (a) of fig. 3, the user can click on the lasso tool control 5 in the first note interface; referring then to (b) of fig. 3, the user may use the lasso tool to frame the hand-drawn pattern 1 in the first note interface; in response to the user's frame selection operation on the hand-drawn pattern 1, the tablet computer may display an initially selected border 6 as shown in (c) of fig. 3 at the region where the hand-drawn pattern 1 is located, and the user may long-press the region where the hand-drawn pattern 1 is located; in response to the long press operation on the hand-drawn pattern 1, the tablet computer generates a drag picture 7 as shown in (d) of fig. 3, in which the hand-drawn pattern 1 may be displayed, and the user may then continue to drag the drag picture 7; referring to (e) of fig. 3, the user may drag the drag picture into the mail editing interface and end the drag operation; in the case where the user's drag operation ends in the mail editing interface, referring to (f) of fig. 3, the tablet computer may display the first picture 2 containing the hand-drawn pattern 1 in the mail editing interface.
It should be noted that, in the case where the first picture 2 is displayed in the mail editing interface, the tablet computer may adaptively adjust the size of the first picture 2 according to the size of the mail editing interface or the size of the pictures the mail editing interface allows to be displayed. Of course, the first picture 2 may also be displayed at its original size; this is not particularly limited in the embodiments of the present application.
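The adaptive size adjustment described above can be sketched as a plain scale-to-fit computation. The class and method names below are illustrative assumptions rather than part of the embodiment; the sketch only shows shrinking a picture to the maximum size a target interface allows while preserving its aspect ratio:

```java
// Hypothetical sketch: fit a picture inside the region a target interface
// allows for pictures, preserving the aspect ratio. Names and the integer
// pixel convention are illustrative assumptions.
public class PictureFitter {
    // Returns {width, height} scaled so the picture fits within maxW x maxH.
    public static int[] fitToInterface(int picW, int picH, int maxW, int maxH) {
        if (picW <= maxW && picH <= maxH) {
            // Already fits: keep the original size.
            return new int[] {picW, picH};
        }
        // Shrink by the tighter of the two constraints.
        double scale = Math.min((double) maxW / picW, (double) maxH / picH);
        return new int[] {(int) Math.floor(picW * scale),
                          (int) Math.floor(picH * scale)};
    }
}
```

A picture already within the allowed bounds keeps its original size, which matches the alternative of displaying the first picture 2 at its original size.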
As an example, in the process of moving the drag picture 7 through the drag operation, the tablet computer may delete the initially selected border in the first note interface, or may continue to display it and delete it when a cancel display operation is received.
Referring to fig. 4, fig. 4 is a schematic diagram illustrating an application scenario according to another exemplary embodiment. In one possible scenario, in the case where the note interface of the first note application and the mail editing interface of the mailbox application are displayed in a split screen manner in the tablet computer, referring to (a) of fig. 4, the user may use the lasso tool to frame the hand-drawn pattern 1 in the note interface; in response to the user's frame selection operation on the hand-drawn pattern 1, after displaying the initially selected border 6 shown in (b) of fig. 4 at the region where the hand-drawn pattern 1 is located, the tablet computer may adaptively adjust the shape of the initially selected border 6 according to the outline of the hand-drawn pattern 1 and, after the adjustment, display the selected border 8 along the outline of the hand-drawn pattern 1, where the shape of the selected border 8 is the same as the outline shape of the hand-drawn pattern 1.
It should be noted that the tablet computer may first display the initially selected border 6 and then adjust it to obtain the selected border 8; alternatively, it may automatically adjust the initially selected border 6 without displaying it, and directly display the selected border 8.
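As a minimal sketch of this adjustment step, assuming the hand-drawn pattern is stored as sampled stroke points, the coarse initially selected border can at least be tightened to the bounding box of those points. The actual selected border 8 follows the full outline shape of the pattern, which this hypothetical helper does not attempt:

```java
// Hypothetical sketch: tighten a coarse selection down to the bounding
// box of the stroke points it encloses. Each point is {x, y}; the result
// is {minX, minY, maxX, maxY}. Assumes at least one point.
public class BorderAdjuster {
    public static int[] tightBounds(int[][] strokePoints) {
        int minX = Integer.MAX_VALUE, minY = Integer.MAX_VALUE;
        int maxX = Integer.MIN_VALUE, maxY = Integer.MIN_VALUE;
        for (int[] p : strokePoints) {
            minX = Math.min(minX, p[0]);
            maxX = Math.max(maxX, p[0]);
            minY = Math.min(minY, p[1]);
            maxY = Math.max(maxY, p[1]);
        }
        return new int[] {minX, minY, maxX, maxY};
    }
}
```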
Referring to fig. 5, fig. 5 is a schematic diagram illustrating an application scenario according to another exemplary embodiment. In one possible scenario, in the case where the first note interface of the first note application and the second note interface of the second note application are displayed in a split screen manner in the tablet computer, if the user needs to display the hand-drawn pattern 1 displayed in the first note interface in the second note interface, referring to (a) of fig. 5, the user may click on the lasso tool control 5 in the first note interface; referring then to (b) of fig. 5, the user may use the lasso tool to frame the hand-drawn pattern 1 in the first note interface; in response to the user's frame selection operation on the hand-drawn pattern 1, the tablet computer may display an initially selected border 6 as shown in (c) of fig. 5 at the region where the hand-drawn pattern 1 is located, and the user may long-press the hand-drawn pattern 1; in response to the long press operation on the hand-drawn pattern 1, the tablet computer generates a drag picture 7 as shown in (d) of fig. 5, in which the hand-drawn pattern 1 may be displayed, and the user may then continue to drag the drag picture 7; referring to (e) of fig. 5, the user may drag the drag picture into the second note interface and end the drag operation; in the case where the user's drag operation ends in the second note interface, referring to (f) of fig. 5, the tablet computer may display the hand-drawn pattern 1 in the second note interface, where the hand-drawn pattern 1 is not in picture form but in an editable hand-drawn form.
It should be noted that the first note application program and the second note application program have the same data processing module, and the data processing module is used for processing information of the corresponding application program.
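The contrast between figs. 3 and 5 — a picture when dropped into the mailbox application, but an editable hand-drawn form when dropped into the second note application — can be modeled as a decision keyed on whether the source and target applications share a data processing module. The module identifiers and the enum below are assumptions for illustration:

```java
// Hypothetical sketch of the drop-target decision: applications that share
// the same data processing module can exchange the hand-drawn dot matrix
// and keep it editable; otherwise the content is inserted as a picture.
public class DropHandler {
    public enum Form { EDITABLE_STROKES, PICTURE }

    public static Form displayForm(String sourceModule, String targetModule) {
        return sourceModule.equals(targetModule)
                ? Form.EDITABLE_STROKES
                : Form.PICTURE;
    }
}
```

For example, two note applications built on the same stylus engine would resolve to the editable form, while a mailbox application with its own rendering path would receive a picture.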
Referring to fig. 6, fig. 6 is a schematic diagram illustrating an application scenario according to another exemplary embodiment. In one possible scenario, in the case where the first note interface of the first note application is displayed in full screen in the tablet computer, a floating window is displayed in the first note interface, and the mail editing interface is displayed in the floating window, referring to (a) of fig. 6, the user may likewise click on the lasso tool control 5 in the first note interface; referring then to (b) of fig. 6, the user may use the lasso tool to frame the hand-drawn pattern 1 in the first note interface; in response to the user's frame selection operation on the hand-drawn pattern 1, the tablet computer may display an initially selected border 6 as shown in (c) of fig. 6 at the region where the hand-drawn pattern 1 is located, and the user may long-press the region where the hand-drawn pattern 1 is located and then perform a drag operation; in response to the long press operation on the hand-drawn pattern 1, the tablet computer generates a drag picture 7 as shown in (d) of fig. 6, in which the hand-drawn pattern 1 may be displayed, and the user may then continue to drag the drag picture 7; referring to (e) of fig. 6, the user may drag the drag picture into the mail editing interface in the floating window and end the drag operation; in the case where the user's drag operation ends in the mail editing interface displayed in the floating window, referring to (f) of fig. 6, the tablet computer may display the first picture 2 containing the hand-drawn pattern 1 in the mail editing interface.
Referring to fig. 7, fig. 7 is a schematic diagram illustrating an application scenario according to another exemplary embodiment. In one possible scenario, in the case where the first note interface of the first note application is displayed in full screen in the tablet computer, a floating window is displayed in the first note interface, and the second note interface of the second note application is displayed in the floating window, if the user needs to display the hand-drawn pattern 1 displayed in the first note interface in the second note interface, referring to (a) of fig. 7, the user may click on the lasso tool control 5 in the first note interface; referring then to (b) of fig. 7, the user may use the lasso tool to frame the hand-drawn pattern 1 in the first note interface; in response to the user's frame selection operation on the hand-drawn pattern 1, the tablet computer may display an initially selected border 6 as shown in (c) of fig. 7 at the region where the hand-drawn pattern 1 is located, and the user may long-press the hand-drawn pattern 1 and then perform a drag operation; in response to the long press operation on the hand-drawn pattern 1, the tablet computer generates a drag picture 7 as shown in (d) of fig. 7, in which the hand-drawn pattern 1 may be displayed, and the user may then continue to drag the drag picture 7; referring to (e) of fig. 7, the user may drag the drag picture into the second note interface and end the drag operation; in the case where the user's drag operation ends in the second note interface in the floating window, referring to (f) of fig. 7, the tablet computer may display the hand-drawn pattern 1 in the second note interface, where the hand-drawn pattern 1 is not in picture form but in an editable hand-drawn form.
Referring to fig. 8, fig. 8 is a schematic diagram illustrating an application scenario according to another exemplary embodiment. In one possible scenario, the first note interface of the first note application and the second note interface of the second note application are displayed in a split screen manner in the tablet computer, a floating window is displayed in the second note interface, and the mail editing interface of the mailbox application is displayed in the floating window. If the user needs to display the hand-drawn pattern 1 displayed in the first note interface in the second note interface or in the mail editing interface, referring to (a) of fig. 8, the user may click on the lasso tool control 5 in the first note interface; referring then to (b) of fig. 8, the user may use the lasso tool to frame the hand-drawn pattern 1 in the first note interface; in response to the user's frame selection operation on the hand-drawn pattern 1, the tablet computer may display an initially selected border 6 as shown in (c) of fig. 8 at the region where the hand-drawn pattern 1 is located, and the user may long-press the hand-drawn pattern 1 and then perform a drag operation; in response to the long press operation on the hand-drawn pattern 1, the tablet computer generates a drag picture 7 as shown in (d) of fig. 8, in which the hand-drawn pattern 1 may be displayed, and the user may then continue to drag the drag picture 7. In the case where the user's drag operation ends in the mail editing interface in the floating window, referring to (e) of fig. 8, the tablet computer may display the first picture 2 including the hand-drawn pattern 1 in the mail editing interface. In the case where the user's drag operation ends in the second note interface, referring to (f) of fig. 8, the tablet computer may display the hand-drawn pattern 1 in the second note interface, where the hand-drawn pattern 1 is not in picture form but in an editable hand-drawn form.
It should be noted that the embodiments of the present application are described merely by taking the application scenarios shown in fig. 2 to fig. 8 as examples, and the embodiments of the present application are not limited thereto.
The software system of the electronic device 100 will be described next.
The software system of the electronic device 100 may employ a layered architecture, an event driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture. In the embodiment of the application, an Android (Android) system with a layered architecture is taken as an example, and a software system of the electronic device 100 is illustrated.
Fig. 9 is a block diagram of a software system of the electronic device 100 according to an embodiment of the present application. Referring to fig. 9, the layered architecture divides the software into several layers, each with clear roles and divisions of labor, and the layers communicate with each other through software interfaces. In some embodiments, the Android system is divided into four layers, which are, from top to bottom, an application layer, an application framework layer, an Android runtime (Android runtime) and system library layer, and a kernel layer.
The application layer may include a series of application packages. As shown in fig. 9, the application package may include a first notes application, a second notes application, a mailbox application, a gallery, and the like.
As an example, the application layer may further include a specified interface for enabling the first note application and the second note application, respectively, to communicate with the application framework layer. The specified interface may be added in a stylus engine software development kit (Software Development Kit, SDK).
The application framework layer provides an application programming interface (application programming interface, API) and a programming framework for the application programs of the application layer, and includes a number of predefined functions. As shown in fig. 9, the application framework layer may include a window manager, a content provider, a view system, a telephony manager, a resource manager, a notification manager, and the like. The window manager is used to manage window programs; it can acquire the size of the display screen, judge whether a status bar exists, lock the screen, capture the screen, and the like. The content provider is used to store and retrieve data and make the data accessible to applications; the data may include videos, images, audio, calls made and received, browsing history and bookmarks, phone books, and the like. The view system includes visual controls, such as controls for displaying text and controls for displaying pictures, and may be used to construct the display interface of an application, which may be composed of one or more views, for example a view displaying a text notification icon, a view displaying text, and a view displaying a picture. The telephony manager is used to provide the communication functions of the electronic device 100, such as management of call status (including connected, hung up, etc.). The resource manager provides various resources for applications, such as localized strings, icons, pictures, layout files, and video files. The notification manager allows an application to display notification information in the status bar; it can be used to convey notification-type messages, which can automatically disappear after a short stay without requiring user interaction. For example, the notification manager is used to notify that a download is complete, to give a message alert, and so on.
The notification manager may also present a notification in the top status bar of the system in the form of a graph or scroll bar text, such as a notification of an application running in the background, or present a notification on the screen in the form of a dialog window, for example prompting a text message in the status bar, sounding a prompt tone, vibrating the electronic device, or flashing an indicator light.
As one example, the application framework layer may further include a lasso module and a view module, where the lasso module is configured to select the hand-drawn pattern framed by the user in the first note interface or the second note interface, and the view module is used to form the drag picture in the process of displaying the hand-drawn pattern in the application interfaces of other application programs.
As an example, the application framework layer may include a data processing module, where the data processing module is configured to process information in the first note application and the second note application, for example, the data processing module may obtain, through a specified interface, hand-drawn dot matrix information sent by the first note application or the second note application, and store the hand-drawn dot matrix information. In addition, the data processing module can edit the hand-painted information, predict the dot matrix position, recognize the hand-painted information and the like.
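A hypothetical sketch of the data processing module's storage of hand-drawn dot matrix information is given below, with a naive linear extrapolation standing in for dot position prediction; the embodiment does not specify the prediction algorithm, so both the class shape and the extrapolation are assumptions:

```java
// Hypothetical sketch: store hand-drawn dot matrix information as sampled
// stylus points and predict the next dot position. Names are illustrative.
public class DotMatrixStore {
    // One sampled stylus point of a hand-drawn stroke.
    public static class Dot {
        final int x, y;
        final long timestampMs;
        Dot(int x, int y, long t) { this.x = x; this.y = y; this.timestampMs = t; }
    }

    private final java.util.List<Dot> dots = new java.util.ArrayList<>();

    public void record(int x, int y, long timestampMs) {
        dots.add(new Dot(x, y, timestampMs));
    }

    public int size() { return dots.size(); }

    // Naive linear extrapolation from the last two recorded dots; a stand-in
    // for the module's dot position prediction. Requires at least two dots.
    public int[] predictNext() {
        Dot a = dots.get(dots.size() - 2), b = dots.get(dots.size() - 1);
        return new int[] {2 * b.x - a.x, 2 * b.y - a.y};
    }
}
```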
It should be further noted that the lasso module and the view module may be sub-modules in the data processing module, and of course, the lasso module and the view module may also be modules independent of the data processing module, and the operations of the lasso module and the view module are under the monitoring of the data processing module. In the drawings of the embodiments of the present application, a lasso module and a view module are described as sub-modules belonging to a data processing module.
The Android runtime includes a core library and a virtual machine, and is responsible for scheduling and management of the Android system. The core library consists of two parts: one part is the functions that the java language needs to call, and the other part is the core library of Android. The application layer and the application framework layer run in the virtual machine. The virtual machine executes the java files of the application layer and the application framework layer as binary files, and is used to perform functions such as object lifecycle management, stack management, thread management, security and exception management, and garbage collection.
The system library may include a plurality of functional modules, such as: surface manager (surface manager), media Libraries (Media Libraries), three-dimensional graphics processing Libraries (e.g., openGL ES), 2D graphics engines (e.g., SGL), etc. The surface manager is used to manage the display subsystem and provides a fusion of 2D and 3D layers for multiple applications. Media libraries support a variety of commonly used audio, video format playback and recording, still image files, and the like. The media library may support a variety of audio and video encoding formats, such as: MPEG4, h.264, MP3, AAC, AMR, JPG, PNG, etc. The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like. The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is a layer between hardware and software. The inner core layer at least comprises a display driver, a camera driver, an audio driver and a sensor driver.
The workflow of the software and hardware of the electronic device 100 is illustrated below in connection with a photo capture scenario.
When the touch sensor 180K receives a touch operation, a corresponding hardware interrupt is issued to the kernel layer. The kernel layer processes the touch operation into a raw input event (including information such as the touch coordinates and the timestamp of the touch operation) and stores the raw input event. The application framework layer acquires the raw input event from the kernel layer and identifies the control corresponding to the raw input event. Taking as an example that the touch operation is a click operation and the corresponding control is the camera application icon, the camera application calls an interface of the application framework layer to start the camera application, then calls the kernel layer to start the camera driver, and captures a still image or video through the camera 193.
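The event flow above can be sketched as a raw input event carrying touch coordinates and a timestamp, plus a framework-layer lookup that maps the coordinates to a control. The control names and rectangular bounds below are illustrative assumptions:

```java
// Hypothetical sketch of the input pipeline described above.
public class InputPipeline {
    // Kernel-layer encapsulation of a touch as a raw input event.
    public static class RawInputEvent {
        final float touchX, touchY;
        final long timestampMs;
        public RawInputEvent(float x, float y, long t) {
            touchX = x; touchY = y; timestampMs = t;
        }
    }

    // Framework-layer lookup: map event coordinates to the control whose
    // bounds contain them. Controls map name -> {left, top, right, bottom}.
    public static String resolveControl(RawInputEvent e,
                                        java.util.Map<String, int[]> controls) {
        for (java.util.Map.Entry<String, int[]> c : controls.entrySet()) {
            int[] b = c.getValue();
            if (e.touchX >= b[0] && e.touchX < b[2]
                    && e.touchY >= b[1] && e.touchY < b[3]) {
                return c.getKey();
            }
        }
        return null; // No control under the touch point.
    }
}
```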
Based on the execution subject and the application scenarios provided by the above embodiments, the information display method provided by the embodiments of the present application is described next. Referring to fig. 10, fig. 10 is a flowchart illustrating an information display method according to an exemplary embodiment. By way of example and not limitation, the method is applied to an electronic device provided with a first note application and a mailbox application, is illustrated as being implemented through interaction of the plurality of modules shown in fig. 9, and may include some or all of the following steps:
Step 1001: the first note interface of the first note application and the mail editing interface of the mailbox application are simultaneously displayed in a screen of the electronic device.
It should be noted that hand-drawn information may be recorded in the first note interface, and thus, in order to facilitate displaying the hand-drawn information of the first note interface in the mail editing interface of the mailbox application, the first note interface and the mail editing interface may be simultaneously displayed in the screen of the electronic device. The cases in which the two interfaces are simultaneously displayed include: the first note interface and the mail editing interface are displayed in a split screen manner, for example, the application scenario shown in (a) of fig. 3; the first note interface is displayed in full screen together with a floating window in which the mail editing interface is displayed, for example, the application scenario shown in (a) of fig. 6 or (a) of fig. 8; or the mail editing interface is displayed in full screen together with a floating window in which the first note interface is displayed. The embodiments of the present application are not particularly limited thereto.
It should be further noted that, the functions implemented by the first note application program and the mailbox application program are different, and the operations executed by the first note application program and the mailbox application program are also different, so that the data processing modules corresponding to the first note application program and the mailbox application program are different.
Step 1002: the first note application receives a selection operation of the lasso tool in the first note interface.
Step 1003: the first note application initiates a lasso module and a view module.
It should be noted that the lasso tool is used to select any piece of hand-drawn information in the first note interface. Since the lasso tool is a handwriting tool provided by the lasso module, the lasso module may be started in the case where the first note application receives a selection operation of the lasso tool in the first note interface. The user's selection of the lasso tool indicates that the user likely intends to display the selected target hand-drawn information in the application interface of another application program, and the function of the view module is needed during that display process, so the view module also needs to be started.
As an example, if the module used to display the hand-drawn information in the first application and the view module of the subsequent display layer are the same module, the view module is already started when the first application is started, and therefore need not be started again in step 1003. If they are different modules, the view module needs to be started in step 1003.
Step 1004: the lasso module receives a framing operation for the target hand drawn information in the first note interface by a user using a lasso tool.
It should be noted that, the target hand-drawn information is any hand-drawn information displayed in the first note interface, and the target hand-drawn information may be at least one of a hand-drawn text and a hand-drawn pattern.
Because the user performs the framing operation in the first note interface after selecting the lasso tool, the lasso module may receive the user's framing operation, performed with the lasso tool, on the target hand-drawn information in the first note interface.
It should be noted that the frame selection operation may be an operation in which the user draws a line around the target hand-drawn information in the first note interface using a stylus, so as to enclose the target hand-drawn information within the drawn line and thereby select it. Alternatively, the frame selection operation may be another operation; for example, it may be an operation in which the user draws from one position to another in the first note interface, where the track of the operation is a line segment, the line segment may serve as the diagonal of a rectangle, and the rectangle frames the target hand-drawn information.
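This second form of frame selection can be sketched as deriving a rectangle from the diagonal line segment and testing whether it frames the target region. The helper names and the {left, top, right, bottom} convention are assumptions for illustration:

```java
// Hypothetical sketch of rectangle-based frame selection.
public class FrameSelection {
    // Builds the rectangular frame whose diagonal is the line segment
    // drawn from (x1, y1) to (x2, y2), as {left, top, right, bottom}.
    public static int[] rectFromDiagonal(int x1, int y1, int x2, int y2) {
        return new int[] {Math.min(x1, x2), Math.min(y1, y2),
                          Math.max(x1, x2), Math.max(y1, y2)};
    }

    // True when the whole target region lies inside the frame.
    public static boolean frames(int[] frame, int[] region) {
        return region[0] >= frame[0] && region[1] >= frame[1]
                && region[2] <= frame[2] && region[3] <= frame[3];
    }
}
```

Either endpoint of the diagonal may come first; the frame is the same rectangle in both cases.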
Step 1005: and the lasso module responds to the frame selection operation and displays a primary selection frame in the first note interface according to the frame selection track of the frame selection operation.
As one example, the frame selection track differs depending on the frame selection operation, and the displayed initially selected border differs accordingly. Illustratively, in the case where the frame selection operation is an operation in which the user draws a line around the target hand-drawn information in the first note interface with a stylus, the frame selection track is the track of the drawn line, and the initially selected border may be the border of the closed figure obtained from the line drawing; for example, it may be the initially selected border in the application scenario shown in (c) of fig. 3. In the case where the frame selection operation is an operation in which the user draws a line from one position to another in the first note interface, the frame selection track is the diagonal of a rectangle, and the initially selected border may be the four sides of that rectangle.
It should be noted that, because the frame selection operation targets the hand-drawn information, after the lasso module displays the initially selected border in the first note interface, the target hand-drawn information is displayed within the initially selected border.
Step 1006: the lasso module receives a long press operation on the target hand-drawn information in the initially selected border.
In the case where the user needs to display the target hand-drawn information in the mail editing interface of the mailbox application, the user can long-press the target hand-drawn information in the initially selected border, and accordingly the lasso module can receive the long press operation on the target hand-drawn information in the initially selected border.
Step 1007: the lasso module sends a long-press message to the view module, wherein the long-press message carries the area range of the primary frame.
It should be noted that, after the target hand-drawn information is displayed in the mail editing interface, the original target hand-drawn information remains displayed in the first note interface; to display the target hand-drawn information in both interfaces, the lasso module and the view module need to process it. Therefore, upon receiving the long-press operation on the target hand-drawn information within the initially selected border, the lasso module sends a long-press message to the view module, where the long-press message carries the area range of the initially selected border.
Because the target hand-drawn information is displayed in the primary frame, the area range of the primary frame is the area range of the area where the target hand-drawn information is located.
Step 1008: the view module receives a long press message.
Step 1009: and the view module generates a first layer in the first interface according to the region range of the region where the target hand-drawn information is located.
Because the long press message carries the region range of the primary frame, the view module can generate the first layer in the first interface according to the region range of the region where the target hand-drawn information is located under the condition that the long press message is received. This first layer may also be referred to as a first view. In addition, the first layer may be a hidden layer, i.e., a layer that is not visible to the user.
As an example, the view module may generate, in the first interface, a first layer having the same size as the region range of the region in which the target hand-drawn information is located according to the region range of the region in which the target hand-drawn information is located; or generating a first layer with the area range being N times as large as the area range of the target hand-drawn information in the first interface.
It should be noted that N is a value preset as required, and N may be greater than 0 and less than 1; for example, N may be 0.7, 0.5, 0.3, or 0.2.
It should be further noted that the shape of the first layer may be the same as or similar to the shape of the outer contour of the target hand-drawn information, or the first layer may be rectangular, square, or circular.
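As an illustrative sketch only (the patent does not specify how the N-times area is positioned relative to the region, so centered scaling is assumed here), the two layer-sizing options described above might be computed as:

```python
def first_layer_rect(region, n=None):
    """Compute the bounds of the hidden first layer from the region of the
    target hand-drawn information, given as (left, top, right, bottom).
    With n=None the layer matches the region exactly; otherwise its area
    is n times the region's area, kept centered on the region (an
    assumption, since the patent does not fix the layer's position)."""
    if n is None:
        return region
    left, top, right, bottom = region
    w, h = right - left, bottom - top
    # Scale each side by sqrt(n) so the area scales by exactly n.
    s = n ** 0.5
    cx, cy = (left + right) / 2, (top + bottom) / 2
    return (cx - w * s / 2, cy - h * s / 2, cx + w * s / 2, cy + h * s / 2)
```

For example, with `n=0.25` both sides are halved, so a 10x10 region yields a layer of area 25.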
Step 1010: the view module obtains pixel information of the hand-drawn handwriting.
It should be noted that the pixel information of the hand-drawn handwriting may be represented by a bitmap, where each bit in the bitmap may be used to mark the pixel information of a corresponding pixel point.
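A minimal Python sketch of such a one-bit-per-pixel bitmap follows (illustrative only; a real bitmap would also carry color and stride information):

```python
def strokes_to_bitmap(width, height, stroke_points):
    """Mark the hand-drawn stroke pixels in a 1-bit-per-pixel bitmap,
    stored row by row in a bytearray; each set bit flags one pixel of
    the hand-drawn handwriting."""
    bits = bytearray((width * height + 7) // 8)
    for x, y in stroke_points:
        idx = y * width + x
        bits[idx // 8] |= 1 << (idx % 8)
    return bits

def pixel_set(bits, width, x, y):
    """Query whether the pixel at (x, y) belongs to the handwriting."""
    idx = y * width + x
    return bool(bits[idx // 8] & (1 << (idx % 8)))
```

This is the shape of information the view module can hand back to the lasso module: a compact map of which pixels in the selected region carry handwriting.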
Step 1011: the view module returns pixel information of the hand-drawn handwriting to the lasso module.
It should be noted that the view module may return the pixel information of the hand-drawn handwriting, represented by the bitmap, to the lasso module.
Step 1012: the lasso module receives pixel information of the hand-drawn handwriting.
Step 1013: and the lasso module displays the pixel information of the hand-drawn handwriting in the first layer to obtain a drag picture of the target hand-drawn information.
As an example, since the pixel information of the hand-drawn handwriting can be represented by a bitmap, the lasso module can fill the bitmap of the hand-drawn handwriting in the first layer, so as to obtain a drag picture of the target hand-drawn information. For example, the application scenario may refer to the application scenario shown in fig. 3 (d), fig. 5 (d), fig. 6 (d), fig. 7 (d), or fig. 8 (d).
As one example, the lasso module may fill the bitmap of the hand-drawn handwriting into the first layer through a drawing function, which may be onDraw(Canvas canvas), and a bitmap drawing function, which may be canvas.drawBitmap(). The lasso module may first override the onDraw(Canvas canvas) function, and then draw the bitmap of the hand-drawn handwriting on the first layer using canvas.drawBitmap().
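Conceptually, drawing the handwriting bitmap into the first layer is a blit. The following is a hypothetical Python stand-in (not the patent's implementation) for what the Android bitmap drawing call performs inside an overridden draw callback, with both buffers simplified to flat lists of pixel values:

```python
def blit(layer, layer_w, bitmap, bmp_w, dest_x, dest_y):
    """Copy a handwriting bitmap into a layer buffer at (dest_x, dest_y).
    Both buffers are row-major flat lists; the bitmap's height follows
    from its length and width."""
    bmp_h = len(bitmap) // bmp_w
    for y in range(bmp_h):
        for x in range(bmp_w):
            layer[(dest_y + y) * layer_w + (dest_x + x)] = bitmap[y * bmp_w + x]
    return layer
```

Once the pixels land in the layer buffer, the layer's contents are exactly the drag picture described in step 1013.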
Since the pixel information of the hand-drawn handwriting is visible after the pixel information of the hand-drawn handwriting is displayed in the first layer, the drag picture is displayed in the first note interface when the drag picture is obtained.
Step 1014: and the lasso module caches the hand-drawn handwriting of the target hand-drawn information into a target picture.
As one example, the operation of the lasso module caching the hand-drawn handwriting of the target hand-drawn information as the target picture includes: the lasso module performs a screenshot operation on the hand-drawn handwriting of the target hand-drawn information to obtain the target picture, and caches the target picture. Or, the lasso module saves the hand-drawn handwriting of the target hand-drawn information in a picture format, thereby caching the hand-drawn handwriting of the target hand-drawn information as the target picture.
It should be noted that the execution order of step 1013 and step 1014 is not limited in the embodiments of the present application; that is, the lasso module may execute step 1013 first and then step 1014, or execute step 1014 first and then step 1013, or execute step 1013 and step 1014 simultaneously.
Step 1015: and the lasso module acquires the storage address information of the target picture.
The target picture is obtained by caching the hand-drawn handwriting of the target hand-drawn information by the lasso module, so that the lasso module can acquire the storage address information of the target picture.
It should be noted that the storage address information may be a storage path of the target picture.
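Steps 1014-1015 can be sketched as follows (illustrative Python only; the file name and .png format are assumptions, not part of the patent):

```python
import os
import tempfile

def cache_as_target_picture(stroke_bytes, cache_dir):
    """Save the hand-drawn handwriting in a picture-format file and
    return its storage path, i.e. the storage address information the
    lasso module later attaches to the drag picture."""
    path = os.path.join(cache_dir, "target_picture.png")
    with open(path, "wb") as f:
        f.write(stroke_bytes)
    return path

# Usage: a temporary directory stands in for the application cache.
cache_dir = tempfile.mkdtemp()
picture_path = cache_as_target_picture(b"\x89PNG...", cache_dir)
```

The returned path is all the receiving application needs in step 1021 to load the target picture.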
Step 1016: the lasso module adds the storage address information to the drag picture.
In one possible implementation, the lasso module may directly add the storage address information to the drag picture.
In another possible implementation, the lasso module may set a drag shadow for the drag picture and add storage address information in the drag shadow.
For example, the lasso module may create clip data (ClipData) for the target picture, where the ClipData is an object that encapsulates data and carries the storage address information of the target picture, and then generate the drag shadow according to the ClipData.
Step 1017: the lasso module receives a drag operation of a drag picture of the target hand-drawn information.
It should be noted that, the long-press operation of the user on the target hand-drawn information and the drag operation of the user on the drag picture of the target hand-drawn information are continuous, that is, after the user performs the long-press operation on the target hand-drawn information, the finger of the user does not leave the first note interface, but continues to perform the drag operation after the drag picture is displayed on the first note interface.
Step 1018: the lasso module moves the drag picture of the target hand-drawn information according to the drag track of the drag operation.
In order to let the user clearly know whether the drag picture has been moved into the mail editing interface, the lasso module may move the drag picture of the target hand-drawn information along the drag track of the drag operation.
Step 1019: and under the condition that the drag operation is finished in the mail editing interface, the mailbox application program acquires the storage address information from the drag picture.
In order to smoothly display the target picture on the mail editing interface, the mailbox application may acquire the storage address information from the drag picture.
Step 1020: the lasso module deletes the dragged picture.
As an example, in the case where the drag operation ends in the mail editing interface, the lasso module may determine the area of the drag picture within the mail editing interface: if the drag picture is completely displayed in the mail editing interface under the action of the drag operation, or if the portion of the drag picture displayed in the mail editing interface has an area of at least M times the area of the drag picture, the lasso module deletes the drag picture.
It should be noted that M is a value preset as required, and M may be greater than 0 and less than 1; for example, M may be 0.3, 0.5, or 0.7.
As one example, in the case where the drag operation ends in the mail editing interface, if the area of the drag picture displayed in the mail editing interface under the action of the drag operation is less than M times the area of the drag picture, the lasso module deletes the drag picture and stops executing the following steps. Or, in the case where the drag operation ends in the first note interface, the lasso module deletes the drag picture and stops executing the following steps.
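The M-times acceptance rule above amounts to an area-overlap check. A hypothetical Python sketch, with rectangles given as (left, top, right, bottom):

```python
def drop_accepted(drag_rect, target_rect, m):
    """Return True if the fraction of the drag picture's area that lies
    inside the target interface reaches the preset threshold m; a full
    overlap gives fraction 1.0 and is always accepted."""
    dl, dt, dr, db = drag_rect
    tl, tt, tr, tb = target_rect
    # Width and height of the intersection (clamped at zero).
    iw = max(0, min(dr, tr) - max(dl, tl))
    ih = max(0, min(db, tb) - max(dt, tt))
    drag_area = (dr - dl) * (db - dt)
    return (iw * ih) / drag_area >= m
```

For example, with m = 0.5, a drag picture half inside the mail editing interface is accepted, while one only a fifth inside is not.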
It should be noted that the execution order of step 1019 and step 1020 is not limited in the embodiments of the present application; that is, the lasso module may execute step 1019 first and then step 1020, or execute step 1020 first and then step 1019, or execute step 1019 and step 1020 simultaneously.
Step 1021: the mailbox application obtains the target picture based on the storage address information.
Because the storage address information may be a storage path of the target picture, the mailbox application may obtain the target picture through the storage path of the target picture.
Step 1022: the mailbox application displays the target picture in the mail editing interface.
As an example, the mailbox application may display the target picture at the position indicated by the cursor in the mail editing interface, may display the target picture at the position where the drag operation ends in the mail editing interface, and the like, which is not particularly limited in the embodiment of the present application. For example, the scene may refer to the application scene shown in the above-described (f) diagram in fig. 3, the (f) diagram in fig. 6, or the (e) diagram in fig. 8.
In the embodiment of the application, when the user needs to display the target hand-drawn information displayed in the first interface of the first application program on the second interface of the second application program, the user can drag the target hand-drawn information into the second interface after selecting the target hand-drawn information. Because the user does not need to store the target hand-drawn information as a picture manually, and does not need to perform picture selection operation in other application interfaces, the target hand-drawn information can be displayed in other application interfaces, so that the operation of displaying the target hand-drawn information in other application interfaces is simplified, and the display efficiency of displaying the target hand-drawn information in other application interfaces is improved.
In the above embodiment, the lasso module does not make any adjustment to the initially selected frame after displaying the initially selected frame, and the following description will proceed taking the adjustment to the initially selected frame as an example. Referring to fig. 11, fig. 11 is a flowchart illustrating a method for displaying information according to another exemplary embodiment. By way of example, and not limitation, the method is applied to an electronic device provided with a first note application program and a mailbox application program, the electronic device is illustrated by using a plurality of module interaction implementations shown in fig. 9, and the method may include some or all of the following:
the operations of steps 1101 to 1105 may refer to the operations of steps 1001 to 1005, which are not described in detail in the embodiments of the present application.
Step 1106: the lasso module sends a lasso selection message to the view module, wherein the lasso selection message carries the frame selection range of the frame selection operation.
The frame selection range of the frame selection operation is the area range of the primary selection frame.
Since the view module subsequently needs to identify all pixel information within the range of the initially selected border in order to acquire the pixel information of the target hand-drawn information, a relatively large border range increases the amount of calculation required to acquire that pixel information. Therefore, in order to reduce the calculation amount of the subsequent view module, and also to make the displayed border more aesthetically pleasing, the lasso module may send the lasso selection message to the view module.
Step 1107: the view module receives a lasso selection message.
Step 1108: the view module obtains the positions of hand-drawn notes of the target hand-drawn information within the frame selection range.
In order to facilitate the lasso module to adjust the primary selection frame, the view module may acquire the position of the hand-drawn note of the target hand-drawn information within the frame selection range.
Step 1109: the view module sends the position of the hand-drawn handwriting of the target hand-drawn information to the lasso module.
Step 1110: the lasso module receives the position of the hand-drawn handwriting of the target hand-drawn information.
Step 1111: and the lasso module adjusts the shape of the primary selected frame according to the position of the hand-drawn handwriting of the target hand-drawn information to obtain the selected frame.
It should be noted that the shape of the selected frame obtained after adjustment is the same as the outline shape of the hand-drawn handwriting of the target hand-drawn information. For example, the scenario may refer to the application scenario shown in fig. 4 described above.
As an example, the lasso module may perform an edge-shrinking operation on the initially selected border according to the position of the hand-drawn handwriting of the target hand-drawn information. After the edge-shrinking operation, the distance between any point on the adjusted selected border and the nearest hand-drawn handwriting is a preset distance, and the shape of the selected border is the same as or similar to the outer contour shape of the hand-drawn handwriting of the target hand-drawn information.
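As a simplified illustration of the edge-shrinking operation (rectangular rather than contour-following, which is an assumption made for brevity), the adjusted border can be taken as the strokes' bounding box padded by the preset distance:

```python
def shrink_border(stroke_points, preset_distance):
    """Shrink an initially selected border down to the hand-drawn
    handwriting: the result is the bounding box of the stroke points
    expanded on each side by the preset distance, returned as
    (left, top, right, bottom)."""
    xs = [p[0] for p in stroke_points]
    ys = [p[1] for p in stroke_points]
    d = preset_distance
    return (min(xs) - d, min(ys) - d, max(xs) + d, max(ys) + d)
```

However large the user's original lasso track was, the selected border now hugs the handwriting at the preset distance, which is what reduces the later pixel-scanning cost.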
Step 1112: the lasso module displays the selected border in the first note interface.
In order to make the user aware of the selected region, the lasso module may display the selected border in the first note interface.
The operations of steps 1113 to 1129 may refer to the operations of steps 1006 to 1022, which are not described in detail in the embodiments of the present application.
In the embodiment of the application, when the user needs to display the target hand-drawn information displayed in the first interface of the first application program on the second interface of the second application program, the user can drag the target hand-drawn information into the second interface after selecting the target hand-drawn information. Because the user does not need to store the target hand-drawn information as a picture manually, and does not need to perform picture selection operation in other application interfaces, the target hand-drawn information can be displayed in other application interfaces, so that the operation of displaying the target hand-drawn information in other application interfaces is simplified, and the display efficiency of displaying the target hand-drawn information in other application interfaces is improved.
The above description is given by taking two applications having different data processing modules as examples, and the following description is given by taking two applications having the same data processing module as examples. Referring to fig. 12, fig. 12 is a flowchart of a method for displaying information according to another exemplary embodiment. By way of example, and not limitation, the method is applied to an electronic device provided with a first note application program and a second note application program, the electronic device is illustrated by using a plurality of module interaction implementations shown in fig. 9, and the method may include some or all of the following:
Step 1201: the first note interface of the first note application and the second note interface of the second note application are displayed simultaneously in a screen of the electronic device.
It should be noted that, the functions implemented by the first note application program and the second note application program are the same, the executed operations are also basically the same, and the first note application program and the second note application program both support the hand-drawn note function, so the data processing modules corresponding to the first note application program and the second note application program may be the same.
Operations of steps 1201-1213 may refer to operations of steps 1001-1013, and operations of steps 1214-1215 may refer to operations of steps 1017-1018, which are not described in detail herein.
Step 1216: and the lasso module sends the hand-drawn dot matrix information of the target hand-drawn information to the data processing module in the process of moving the dragging picture.
It should be noted that the hand-drawn lattice information is used to represent the hand-drawn track of the target hand-drawn information.
Step 1217: the data processing module receives the hand-drawn dot matrix information sent by the lasso module.
Step 1218: the data processing module stores the hand-painted dot matrix information.
As one example, the data processing module may store the hand-drawn lattice information in a preset location.
Step 1219: in the event that the drag operation ends in the second interface, the lasso module deletes the drag picture.
It should be noted that, the operation of step 1219 may refer to the operation of step 1019, which is not described in detail in the embodiments of the present application.
Step 1220: the second note application sends an information acquisition request to the data processing module.
Since the second note application and the first note application use the same data processing module, the second note application can send an information acquisition request to the data processing module.
Step 1221: the data processing module receives an information acquisition request.
Step 1222: the data processing module obtains the hand-drawn lattice information from the data processing module to the second note application program.
Step 1223: the second note application displays the target hand-drawn information in the second note interface according to the hand-drawn lattice information.
As one example, the second note application may perform an interface rendering operation according to the hand-drawn lattice information, thereby displaying the target hand-drawn information in the second note interface. For example, the scene may refer to the application scene shown in the above-described (f) diagram in fig. 5, the (f) diagram in fig. 7, or the (f) diagram in fig. 8.
It should be noted that, since the target hand-drawn information is rendered by the second note application program according to the hand-drawn lattice information, it is not displayed in the form of a picture in the second note interface. Therefore, the state of the target hand-drawn information displayed in the second note interface is the same as its state in the first note interface; that is, when the target hand-drawn information is displayed in the second note interface, its hand-drawn lattice information can still be edited.
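Because the lattice information keeps the strokes as editable point lists rather than a flat picture, re-rendering is a matter of replaying the points. An illustrative Python sketch (the data layout, one list of (x, y) points per stroke, is an assumption):

```python
def render_lattice(lattice, canvas_w, canvas_h):
    """Re-render target hand-drawn information from its dot-matrix
    (lattice) information onto a character canvas: each stroke is a
    list of (x, y) points, and every point marks one cell."""
    canvas = [[" "] * canvas_w for _ in range(canvas_h)]
    for stroke in lattice:
        for x, y in stroke:
            canvas[y][x] = "*"
    return ["".join(row) for row in canvas]
```

Since the stroke lists survive the transfer, the second note application can later move or delete individual strokes, which a pasted picture would not allow.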
In the embodiment of the application, when the user needs to display the target hand-drawn information displayed in the first interface of the first application program on the second interface of the second application program, the user can drag the target hand-drawn information into the second interface after selecting the target hand-drawn information. Because the user does not need to store the target hand-drawn information as a picture manually, and does not need to perform picture selection operation in other application interfaces, the target hand-drawn information can be displayed in other application interfaces, so that the operation of displaying the target hand-drawn information in other application interfaces is simplified, and the display efficiency of displaying the target hand-drawn information in other application interfaces is improved.
It should be noted that the foregoing describes the method for displaying information as implemented by the electronic device through the interaction of the plurality of modules shown in fig. 9. Next, to further aid understanding of the embodiments of the present application, the method is described as implemented by an electronic device with a hand-drawn note function. Referring to fig. 13, fig. 13 is a flowchart illustrating a method for displaying information according to another exemplary embodiment. By way of example and not limitation, the method may include some or all of the following:
step 1301: and under the condition that a first interface of the first application program and a second interface of the second application program are displayed, responding to the selection operation of the target hand-drawn information in the first interface, and displaying a selected frame at the area where the target hand-drawn information is located.
It should be noted that, the target hand-drawn information is located in the selected frame, and the target hand-drawn information is any hand-drawn information displayed in the first interface. The first application may be a note-taking application that supports a hand-drawn note function. The second application program may be a note-type application program supporting the hand-drawing note function, or may be a note-type application program not supporting the hand-drawing note function, or may be another type application program other than the note-type application program, and the second application program supports the picture display function.
Because the user may wish to display the hand-drawn information displayed in the first interface in the application interfaces of other application programs, in this case, the user may perform a selection operation on the target hand-drawn information in the first interface, and the electronic device may display the selected frame at the area where the target hand-drawn information is located in response to the selection operation on the target hand-drawn information. The target hand-drawn information selected by the selection operation is the hand-drawn information which the user wants to display in the application interfaces of other application programs.
In order to make the user clearly know whether the target hand-drawn information selected by the selection operation is the hand-drawn information desired to be displayed, the electronic device may display the selected frame at the area where the target hand-drawn information is located.
As an example, in the case where the electronic device displays the first interface of the first application program and the second interface of the second application program, the operation of displaying the selected border at the region where the target hand-drawn information is located in response to the selection operation on the target hand-drawn information in the first interface includes: receiving the user's selection operation on a lasso tool in the first interface while the first interface and the second interface are displayed; receiving the user's frame selection operation, performed with the lasso tool, on the target hand-drawn information in the first interface; and displaying the selected border at the region where the target hand-drawn information is located based on the frame selection region of the frame selection operation.
It should be noted that the lasso tool is the most basic selection tool; it can select an irregularly shaped region as well as a regularly shaped region. If the first application program also supports other selection tools, the user may select those instead.
It is worth noting that performing the frame selection operation on the target hand-drawn information with the lasso tool allows the target hand-drawn information to be selected precisely, improving the accuracy of information selection.
Since the user performs the frame selection operation in the first interface with the lasso tool, the range of the frame selection track may be relatively large, which may increase the amount of calculation for subsequent pixel information acquisition. As an example, the operation of the electronic device displaying the selected border at the region where the target hand-drawn information is located based on the frame selection region of the frame selection operation includes: displaying an initially selected border according to the frame selection track in response to the frame selection operation; identifying the position of the hand-drawn handwriting of the target hand-drawn information; and adjusting the shape of the initially selected border according to the position of the hand-drawn handwriting of the target hand-drawn information, so that the shape of the adjusted selected border is the same as the outer contour shape of the hand-drawn handwriting of the target hand-drawn information.
It should be noted that, the operation of displaying the selected frame at the region where the target hand-drawn information is located by the electronic device based on the frame selection region of the frame selection operation may refer to the operations in the steps 1105-1111, which are not described in detail in the embodiment of the present application.
It is worth noting that, because the edge-shrinking operation can be performed on the initially selected border, the frame selection range of the initially selected border is reduced, which reduces the amount of calculation needed to subsequently acquire the pixel information of the hand-drawn handwriting.
As an example, the case of simultaneously displaying the first interface and the second interface includes a split screen display of the first interface and the second interface, or one of the interfaces is displayed in a floating window, and the floating window is displayed in the other interface.
Step 1302: in response to the target operation, target hand-drawn information is displayed in the second interface.
It should be noted that, the target operation may be an operation of dragging the target hand-drawn information into the second interface after acting on the area where the selected frame is located, or the target operation may be a scribing operation from the area where the selected frame is located to the second interface.
In some embodiments, in a case where the target operation is an operation of dragging the target hand-drawn information into the second interface after the region where the frame is selected, the operation of displaying the target hand-drawn information in the second interface by the electronic device in response to the target operation includes: responding to long-press operation of the target hand-drawn information, and generating a drag picture of the target hand-drawn information; responding to the dragging operation of the dragging picture of the target hand-drawn information, and moving the dragging picture of the target hand-drawn information according to the dragging track of the dragging operation; and displaying the target hand-drawn information in the second interface when the drag operation is finished in the second interface.
It is worth noting that, in the process of displaying the target hand-drawn information in the second interface, displaying the drag picture and moving it along the drag track of the drag operation lets the user clearly know the progress of the current drag operation.
In some embodiments, the electronic device generating a drag picture of the target hand-drawn information in response to a long press operation on the target hand-drawn information includes: responding to long-press operation of target hand-drawn information, and generating a first layer in a first interface according to the area range of the area where the target hand-drawn information is located; acquiring pixel information of hand-drawn handwriting of target hand-drawn information; and displaying pixel information of the hand-drawn handwriting in the first layer to obtain a drag picture of the target hand-drawn information.
It is worth noting that the drag picture is generated from the first layer and the pixels of the hand-drawn handwriting, so it accurately reflects whether the target hand-drawn information corresponding to the drag picture is the hand-drawn information to be displayed in the second interface, improving the accuracy of information display.
From the above, the electronic device includes a lasso module and a view module. The operation of the electronic device generating, in response to the long-press operation on the target hand-drawn information, the first layer in the first interface according to the area range of the region where the target hand-drawn information is located includes: the lasso module sends a long-press message to the view module in response to the long-press operation on the target hand-drawn information; and upon receiving the long-press message, the view module generates the first layer in the first interface according to the area range of the region where the target hand-drawn information is located. The operation of the electronic device acquiring the pixel information of the hand-drawn handwriting of the target hand-drawn information includes: the view module acquires the pixel information of the hand-drawn handwriting and returns it to the lasso module. In addition, the operation of the electronic device displaying the pixel information of the hand-drawn handwriting in the first layer to obtain the drag picture of the target hand-drawn information includes: upon receiving the pixel information of the hand-drawn handwriting, the lasso module displays it in the first layer, thereby obtaining the drag picture of the target hand-drawn information.
It should be noted that, the series of operations performed by the modules may refer to the operations from step 1008 to step 1013, which are not described in detail in the embodiment of the present application.
It is worth noting that assigning different operations to different modules makes the execution body of each operation clearer and more targeted.
Because the data processing modules corresponding to the first application program and the second application program may be the same or different, the operations of the electronic device for displaying the target hand-drawn information in the second interface are different for the two different situations, and the data processing modules are modules for processing the information of the corresponding application programs.
In one possible implementation, when the data processing modules corresponding to the second application program and the first application program are different, the electronic device may cache the hand-drawn handwriting of the target hand-drawn information as a target picture; acquire storage address information of the target picture; and add the storage address information to the drag picture.
Correspondingly, when the drag operation ends in the second interface, the electronic device displays the target hand-drawn information in the second interface as follows: when the drag operation ends in the second interface, the electronic device acquires the storage address information from the drag picture; deletes the drag picture; acquires the target picture based on the storage address information; and displays the target picture in the second interface.
It should be noted that the operations in which the electronic device caches the hand-drawn handwriting of the target hand-drawn information as the target picture and adds the storage address information to the drag picture may refer to the operations of step 1014 to step 1016, and the operations in which the electronic device acquires the storage address information from the drag picture, deletes the drag picture, acquires the target picture based on the storage address information, and displays the target picture in the second interface may refer to the operations of step 1019 to step 1022, which are not described in detail again in this embodiment of the present application.
It is worth noting that, by caching the hand-drawn handwriting as the target picture, the second application program can display the target picture according to the storage address information of the target picture when the drag operation on the drag picture ends. The user neither needs to manually save the target hand-drawn information as a picture nor needs to perform a picture selection operation in the second interface in order to display the target hand-drawn information there, which improves the efficiency of displaying the target hand-drawn information in the second interface.
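The cross-module flow just described — cache the handwriting as a target picture, carry its storage address inside the drag picture, and resolve the address when the drag ends — can be sketched as follows. The function names and the in-memory store are illustrative assumptions, not the embodiment's actual implementation:

```python
# Illustrative sketch of the flow when the two applications use different
# data processing modules. All names are hypothetical.

picture_store = {}  # stands in for on-device picture storage


def start_drag(stroke_pixels):
    # Cache the hand-drawn handwriting as the target picture ...
    address = f"cache/pic_{len(picture_store)}"
    picture_store[address] = stroke_pixels
    # ... and add the storage address information to the drag picture.
    return {"preview": stroke_pixels, "address": address}


def end_drag(drag_picture):
    # When the drag ends in the second interface: read the storage address,
    # delete the drag picture, then fetch the target picture for display.
    address = drag_picture["address"]
    drag_picture.clear()            # "delete the drag picture"
    target = picture_store[address]
    return target                   # displayed in the second interface
```

The key point of the design is that the drag picture is only a transient carrier: the durable artifact is the cached target picture, which the receiving side locates purely through the embedded address.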
In another possible implementation, if the data processing module corresponding to the second application program is the same as that corresponding to the first application program, the electronic device may send hand-drawn dot matrix information of the target hand-drawn information to the data processing module, where the hand-drawn dot matrix information is used to represent the hand-drawn track of the target hand-drawn information.
Correspondingly, when the drag operation ends in the second interface, the electronic device displays the target hand-drawn information in the second interface as follows: when the drag operation ends in the second interface, the electronic device deletes the drag picture; acquires the hand-drawn dot matrix information from the data processing module; and displays the target hand-drawn information in the second interface according to the hand-drawn dot matrix information.
It should be noted that the operations in which the electronic device, when the drag operation ends in the second interface, deletes the drag picture, acquires the hand-drawn dot matrix information from the data processing module, and displays the target hand-drawn information in the second interface according to the hand-drawn dot matrix information may refer to the operations of step 1219 to step 1223, which are not described in detail again in this embodiment of the present application.
It is worth noting that, when the data processing module corresponding to the second application program is the same as that corresponding to the first application program, sending the hand-drawn dot matrix information of the hand-drawn handwriting to the data processing module allows the hand-drawn handwriting of the target hand-drawn information displayed in the second interface to be edited again, which improves the flexibility of displaying the target hand-drawn information.
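The same-module path can be sketched similarly. Here the shared data processing module carries the dot matrix information (the stroke track points) in transit, so the receiving interface can rebuild editable strokes rather than a flat bitmap. The class and field names are hypothetical:

```python
# Hypothetical sketch of the flow when both applications share one data
# processing module. Names are illustrative assumptions.

class DataProcessingModule:
    """Shared by both applications; holds the dot matrix info in transit."""

    def __init__(self):
        self._dot_matrix = None

    def put_dot_matrix(self, points):
        # The first application sends the hand-drawn track points here.
        self._dot_matrix = points

    def get_dot_matrix(self):
        return self._dot_matrix


def drop_in_second_interface(module):
    # When the drag ends: (drag-picture deletion not modelled) fetch the dot
    # matrix and rebuild editable strokes from the track points, so the
    # handwriting can be edited again in the second interface.
    points = module.get_dot_matrix()
    return [{"x": x, "y": y, "editable": True} for x, y in points]
```

The contrast with the previous path is that the receiver gets structured stroke data instead of a rendered picture, which is what makes secondary editing possible.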
In some embodiments, when the target operation is a scribing operation from the area where the selected border is located to the second interface, and the data processing modules corresponding to the first application program and the second application program are different, the electronic device may, in response to the scribing operation, generate a drag picture and a target picture based on the hand-drawn handwriting of the target hand-drawn information during the scribing operation and add the storage address information of the target picture to the drag picture; when the scribing operation ends in the second interface, the electronic device acquires the storage address information from the drag picture; deletes the drag picture; acquires the target picture based on the storage address information; and displays the target picture in the second interface.
It should be noted that the operations in which the electronic device generates the drag picture and the target picture and adds the storage address information of the target picture to the drag picture may refer to the related description above, which is not specifically limited in this embodiment of the present application.
In some embodiments, when the target operation is a scribing operation from the area where the selected border is located to the second interface, and the data processing modules corresponding to the first application program and the second application program are the same, the electronic device may, in response to the scribing operation, send the hand-drawn dot matrix information of the hand-drawn handwriting to the data processing module during the scribing operation; delete the drag picture when the scribing operation ends in the second interface; acquire the hand-drawn dot matrix information from the data processing module; and display the target hand-drawn information in the second interface according to the hand-drawn dot matrix information.
In this embodiment of the present application, when the user needs to display, in the second interface of the second application program, the target hand-drawn information displayed in the first interface of the first application program, the user can directly drag the target hand-drawn information into the second interface after selecting it, or perform a scribing operation from the area where the target hand-drawn information is located to the second interface, so that the target hand-drawn information is displayed in the second interface. Because the user neither needs to manually save the target hand-drawn information as a picture nor needs to perform a picture selection operation in another application interface in order to display the target hand-drawn information there, the operation of displaying the target hand-drawn information in other application interfaces is simplified, and the display efficiency is improved.
Next, an electronic device according to an embodiment of the present application will be described.
The method provided by the embodiments of the present application may be executed by an electronic device that has a hand-drawn note function. Further, application programs capable of realizing a hand-drawn note function, such as a memo or a cloud note application, and application programs capable of displaying pictures, such as a mailbox application or a social application, may be installed in the electronic device. By way of example and not limitation, the electronic device may be a tablet computer, a desktop computer, a laptop computer, a handheld computer, a notebook computer, an in-vehicle device, an ultra-mobile personal computer (UMPC), a netbook, a cellular telephone, a personal digital assistant (PDA), an augmented reality (AR)/virtual reality (VR) device, a smart appliance, or the like, which is not limited in the embodiments of the present application.
Fig. 14 is a schematic structural diagram of an electronic device according to an embodiment of the present application. Referring to fig. 14, the electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (universal serial bus, USB) interface 130, a charge management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, keys 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a subscriber identification module (subscriber identification module, SIM) card interface 195, and the like. The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It should be understood that the illustrated structure of the embodiment of the present application does not constitute a specific limitation on the electronic device 100. In other embodiments of the application, electronic device 100 may include more or fewer components than shown, or certain components may be combined, or certain components may be split, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units, such as: the processor 110 may include an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a memory, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, and/or a neural network processor (neural-network processing unit, NPU), etc. Wherein the different processing units may be separate devices or may be integrated in one or more processors.
The controller may be the nerve center and command center of the electronic device 100. The controller can generate operation control signals according to the instruction operation codes and the timing signals to complete the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that the processor 110 has just used or used cyclically. If the processor 110 needs to use the instructions or data again, it can call them directly from the memory, which avoids repeated accesses, reduces the waiting time of the processor 110, and thereby improves the efficiency of the system.
In some embodiments, the processor 110 may include one or more interfaces, for example, an integrated circuit (inter-integrated circuit, I2C) interface, an integrated circuit built-in audio (inter-integrated circuit sound, I2S) interface, a pulse code modulation (pulse code modulation, PCM) interface, a universal asynchronous receiver transmitter (universal asynchronous receiver/transmitter, UART) interface, a mobile industry processor interface (mobile industry processor interface, MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (subscriber identity module, SIM) interface, and/or a universal serial bus (universal serial bus, USB) interface.
It should be understood that the interfacing relationship between the modules illustrated in the embodiments of the present application is only illustrative, and is not meant to limit the structure of the electronic device 100. In other embodiments of the present application, the electronic device 100 may also employ different interfacing manners in the above embodiments, or a combination of multiple interfacing manners.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The electronic device 100 implements display functions through a GPU, a display screen 194, an application processor, and the like. The GPU is a microprocessor for image processing, and is connected to the display 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 194 is used to display images, videos, and the like. The display screen 194 includes a display panel. The display panel may employ a liquid crystal display (liquid crystal display, LCD), an organic light-emitting diode (organic light-emitting diode, OLED), an active-matrix organic light-emitting diode (active-matrix organic light emitting diode, AMOLED), a flexible light-emitting diode (flex light-emitting diode, FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (quantum dot light emitting diodes, QLED), or the like. In some embodiments, the electronic device 100 may include 1 or N display screens 194, N being an integer greater than 1.
The electronic device 100 may implement photographing functions through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to enable expansion of the memory capabilities of the electronic device 100. The external memory card communicates with the processor 110 through an external memory interface 120 to implement data storage functions. Such as storing files of music, video, etc. in an external memory card.
The internal memory 121 may be used to store computer-executable program code that includes instructions. The processor 110 executes various functional applications of the electronic device 100 and data processing by executing instructions stored in the internal memory 121. The internal memory 121 may include a storage program area and a storage data area. The storage program area may store an application program (such as a sound playing function, an image playing function, etc.) required for at least one function of the operating system, etc. The storage data area may store data (e.g., audio data, phonebook, etc.) created by the electronic device 100 during use, and so forth. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (universal flash storage, UFS), and the like.
The electronic device 100 may implement audio functions such as music playing, recording, etc. through the audio module 170, speaker 170A, receiver 170B, microphone 170C, headphone interface 170D, and application processor, etc.
The pressure sensor 180A is used to sense a pressure signal and may convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. There are various types of pressure sensors 180A, such as resistive pressure sensors, inductive pressure sensors, and capacitive pressure sensors. A capacitive pressure sensor may include at least two parallel plates of conductive material. When a force is applied to the pressure sensor 180A, the capacitance between the electrodes changes, and the electronic device 100 determines the strength of the pressure from the change in capacitance. When a touch operation acts on the display screen 194, the electronic device 100 detects the intensity of the touch operation according to the pressure sensor 180A. The electronic device 100 may also calculate the location of the touch based on the detection signal of the pressure sensor 180A. In some embodiments, touch operations that act on the same touch location but with different touch operation intensities may correspond to different operation instructions. For example, when a touch operation whose intensity is less than a pressure threshold acts on the short message application icon, an instruction for viewing the short message is executed; when a touch operation whose intensity is greater than or equal to the pressure threshold acts on the short message application icon, an instruction for creating a new short message is executed.
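The intensity-dependent dispatch on the short message icon can be sketched as follows; the threshold value and instruction names are illustrative assumptions, not values from the embodiment:

```python
# Minimal sketch of pressure-threshold dispatch: the same touch location
# triggers different instructions depending on touch operation intensity.
# The threshold and action names are hypothetical.

PRESSURE_THRESHOLD = 0.5


def dispatch_touch_on_sms_icon(intensity):
    # Below the threshold: view the short message;
    # at or above it: create a new short message.
    if intensity < PRESSURE_THRESHOLD:
        return "view_message"
    return "new_message"
```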
The gyro sensor 180B may be used to determine a motion gesture of the electronic device 100. In some embodiments, the angular velocity of the electronic device 100 about three axes (i.e., the x, y, and z axes) may be determined by the gyro sensor 180B. The gyro sensor 180B may be used for photographing anti-shake. For example, when the shutter is pressed, the gyro sensor 180B detects the shake angle of the electronic device 100, calculates the distance the lens module needs to compensate according to the angle, and makes the lens counteract the shake of the electronic device 100 through a reverse motion, thereby realizing anti-shake. The gyro sensor 180B may also be used in navigation and somatosensory game scenarios.
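The anti-shake compensation computed from the shake angle can be sketched with a small lens-shift model. This is an assumed simplification for illustration; the actual calculation in the device may differ:

```python
# Rough sketch of anti-shake compensation: the gyroscope reports a shake
# angle, and the lens moves in the opposite direction by the distance
# needed to cancel the resulting image shift. The geometric model
# (shift ~ focal_length * tan(angle)) is an assumption.

import math


def compensation_offset(shake_angle_rad, focal_length_mm):
    # Image shift caused by a small rotation of the device,
    # approximated as focal_length * tan(angle).
    shift = focal_length_mm * math.tan(shake_angle_rad)
    # The lens moves by the negative of that shift to counteract it.
    return -shift
```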
The acceleration sensor 180E may detect the magnitude of acceleration of the electronic device 100 in various directions (typically three axes). The magnitude and direction of gravity may be detected when the electronic device 100 is stationary. The acceleration sensor 180E may also be used to identify the gesture of the electronic device 100, and may be used in applications such as landscape switching, pedometers, and the like.
The fingerprint sensor 180H is used to collect a fingerprint. The electronic device 100 may utilize the collected fingerprint feature to unlock the fingerprint, access the application lock, photograph the fingerprint, answer the incoming call, etc.
The touch sensor 180K is also referred to as a "touch panel". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, also called a "touch screen". The touch sensor 180K is used to detect a touch operation acting on or near it. The touch sensor 180K may pass the detected touch operation to the application processor to determine the touch event type. Visual output related to the touch operation may be provided through the display screen 194. In other embodiments, the touch sensor 180K may also be disposed on the surface of the electronic device 100 at a location different from that of the display screen 194.
The indicator 192 may be an indicator light, and may be used to indicate a charging state, a change in battery level, a message, a missed call, a notification, and the like.
The above embodiments may be implemented in whole or in part by software, hardware, firmware, or any combination thereof. When implemented in software, they may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the processes or functions described in the embodiments of the present application are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center by wired (e.g., coaxial cable, optical fiber, digital subscriber line (Digital Subscriber Line, DSL)) or wireless (e.g., infrared, radio, microwave) means. The computer-readable storage medium may be any available medium that can be accessed by a computer, or a data storage device such as a server or data center that integrates one or more available media. The available medium may be a magnetic medium (e.g., a floppy disk, a hard disk, or a magnetic tape), an optical medium (e.g., a digital versatile disc (Digital Versatile Disc, DVD)), or a semiconductor medium (e.g., a solid state disk (Solid State Disk, SSD)), or the like.
The above embodiments are not intended to limit the present application, and any modifications, equivalent substitutions, improvements, etc. within the technical scope of the present application should be included in the scope of the present application.

Claims (10)

1. A method for displaying information, which is applied to an electronic device, the method comprising:
under the condition that a first interface of a first application program and a second interface of a second application program are displayed, responding to a selection operation of target hand-drawn information in the first interface, displaying a selected frame at a region where the target hand-drawn information is located, wherein the target hand-drawn information is located in the selected frame, and the target hand-drawn information is any hand-drawn information displayed in the first interface;
and responding to a target operation, wherein the target hand-drawn information is displayed in the second interface, and the target operation is an operation of dragging the target hand-drawn information into the second interface after acting on the area where the selected frame is located, or is a scribing operation from the area where the selected frame is located to the second interface.
2. The method of claim 1, wherein the displaying, in the case of displaying the first interface of the first application program and the second interface of the second application program, the selected border at the region where the target hand-drawn information is located in response to the selection operation of the target hand-drawn information in the first interface comprises:
Receiving a user selection operation of a lasso tool in the first interface under the condition that the first interface and the second interface are displayed;
receiving a box selection operation of the user in the first interface for the target hand-drawn information by using the lasso tool;
and displaying the selected frame at the region where the target hand-drawn information is located based on the frame selection region of the frame selection operation.
3. The method of claim 2, wherein the displaying the selected border at the region where the target hand-drawn information is located based on the frame selection region of the frame selection operation comprises:
responding to the frame selection operation, and displaying a primary selection frame according to a frame selection track of the frame selection operation;
identifying the position of the hand-drawn handwriting of the target hand-drawn information;
and adjusting the shape of the primary selection border according to the position of the hand-drawn handwriting of the target hand-drawn information so that the shape of the selected border obtained after adjustment is the same as the outline shape of the hand-drawn handwriting of the target hand-drawn information.
4. The method of any one of claims 1-3, wherein the target operation is an operation of dragging the target hand-drawn information into the second interface after acting on an area where the selected border is located; the displaying the target hand-drawn information in the second interface in response to a target operation includes:
Responding to long-press operation of the target hand-drawn information, and generating a drag picture of the target hand-drawn information;
responding to the dragging operation of the dragging picture of the target hand-drawn information, and moving the dragging picture of the target hand-drawn information according to the dragging track of the dragging operation;
and displaying the target hand-drawn information in the second interface when the drag operation is finished in the second interface.
5. The method of claim 4, wherein the generating a drag picture of the target hand-drawn information in response to a long press operation on the target hand-drawn information comprises:
responding to long-press operation of the target hand-drawn information, and generating a first layer in the first interface according to the area range of the area where the target hand-drawn information is located;
acquiring pixel information of hand-drawn handwriting of the target hand-drawn information;
and displaying the pixel information of the hand-drawn handwriting in the first layer to obtain a drag picture of the target hand-drawn information.
6. The method of claim 5, wherein the electronic device comprises a lasso module and a view module;
the responding to the long-press operation of the target hand-drawn information generates a first image layer in the first interface according to the area range of the area where the target hand-drawn information is located, and the method comprises the following steps:
The lasso module responds to the long-press operation of the target hand-drawn information and sends a long-press message to the view module;
the view module generates the first layer in the first interface according to the area range of the area where the target hand-drawn information is located under the condition that the long-press message is received;
the obtaining the pixel information of the hand-drawn handwriting of the target hand-drawn information includes:
the view module acquires pixel information of the hand-drawn handwriting;
the view module returns pixel information of the hand-drawn handwriting to the lasso module;
displaying the pixel information of the hand-drawn handwriting in the first layer to obtain a drag picture of the target hand-drawn information, including:
and the lasso module displays the pixel information of the hand-drawn handwriting in the first layer when the pixel information of the hand-drawn handwriting is received, so as to obtain the drag picture of the target hand-drawn information.
7. The method of any of claims 4-6, wherein before the displaying the target hand-drawn information in the second interface when the drag operation ends in the second interface, the method further comprises:
Caching the hand-drawn handwriting of the target hand-drawn information into a target picture under the condition that the data processing modules corresponding to the second application program and the first application program are different, wherein the data processing modules are used for processing the information of the corresponding application program;
acquiring storage address information of the target picture;
adding the storage address information to the dragging picture;
and displaying the target hand-drawn information in the second interface when the drag operation is finished in the second interface, wherein the method comprises the following steps:
acquiring the storage address information from the drag picture under the condition that the drag operation is finished in the second interface;
deleting the dragging picture;
acquiring the target picture based on the storage address information;
and displaying the target picture in the second interface.
8. The method of any of claims 4-6, wherein before the displaying the target hand-drawn information in the second interface when the drag operation ends in the second interface, the method further comprises:
if the data processing module corresponding to the second application program is the same as that corresponding to the first application program, sending hand-drawn dot matrix information of the target hand-drawn information to the data processing module, wherein the hand-drawn dot matrix information is used for representing the hand-drawn track of the target hand-drawn information, and the data processing module is a module for processing the information of the corresponding application program;
and the displaying the target hand-drawn information in the second interface when the drag operation ends in the second interface comprises:
deleting the drag picture when the drag operation ends in the second interface;
acquiring the hand-drawn dot matrix information from the data processing module;
and displaying the target hand-drawn information in the second interface according to the hand-drawn dot matrix information.
9. An electronic device, comprising: a processor and a memory for storing one or more programs, the one or more programs comprising instructions which, when executed by the processor, cause the electronic device to perform the method for displaying information of any of claims 1-8.
10. A computer-readable storage medium storing one or more programs, the one or more programs being configured to be executed by one or more processors and comprising instructions that cause an electronic device to perform the method for displaying information of any of claims 1-8.
CN202211301237.4A 2022-10-24 2022-10-24 Information display method, electronic device and readable storage medium Active CN116700554B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202211301237.4A CN116700554B (en) 2022-10-24 2022-10-24 Information display method, electronic device and readable storage medium
CN202410552358.9A CN118502625A (en) 2022-10-24 2022-10-24 Information display method, electronic device and readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211301237.4A CN116700554B (en) 2022-10-24 2022-10-24 Information display method, electronic device and readable storage medium

Related Child Applications (1)

Application Number Title Priority Date Filing Date
CN202410552358.9A Division CN118502625A (en) 2022-10-24 2022-10-24 Information display method, electronic device and readable storage medium

Publications (2)

Publication Number Publication Date
CN116700554A true CN116700554A (en) 2023-09-05
CN116700554B CN116700554B (en) 2024-05-24

Family

ID=87826389

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202410552358.9A Pending CN118502625A (en) 2022-10-24 2022-10-24 Information display method, electronic device and readable storage medium
CN202211301237.4A Active CN116700554B (en) 2022-10-24 2022-10-24 Information display method, electronic device and readable storage medium

Family Applications Before (1)

Application Number Title Priority Date Filing Date
CN202410552358.9A Pending CN118502625A (en) 2022-10-24 2022-10-24 Information display method, electronic device and readable storage medium

Country Status (1)

Country Link
CN (2) CN118502625A (en)


Patent Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102830963A (en) * 2012-06-28 2012-12-19 Beijing Qihoo Technology Co., Ltd. Method and system for matching screenshot
US20160117072A1 (en) * 2014-10-24 2016-04-28 Google Inc. Drag-and-drop on a mobile device
US20180088784A1 (en) * 2016-09-29 2018-03-29 Beijing Xiaomi Mobile Software Co., Ltd. Method and Device for Sharing Content
CN108509142A (en) * 2018-04-08 2018-09-07 Guangzhou Shiyuan Electronic Technology Co., Ltd. Writing software interaction method and device, terminal equipment and storage medium
CN109462692A (en) * 2018-10-29 2019-03-12 Nubia Technology Co., Ltd. Split-screen display operation method, mobile terminal and computer readable storage medium
US20210350122A1 (en) * 2020-05-11 2021-11-11 Apple Inc. Stroke based control of handwriting input
WO2022052677A1 (en) * 2020-09-09 2022-03-17 Huawei Technologies Co., Ltd. Interface display method and electronic device
WO2022089208A1 (en) * 2020-10-31 2022-05-05 Huawei Technologies Co., Ltd. File dragging method, and electronic device
CN114527901A (en) * 2020-10-31 2022-05-24 Huawei Technologies Co., Ltd. File dragging method and electronic device
CN114860142A (en) * 2021-01-20 2022-08-05 Huawei Technologies Co., Ltd. Drag processing method and device
CN114548040A (en) * 2022-02-28 2022-05-27 Zhangyue Technology Co., Ltd. Note processing method, electronic device and storage medium

Also Published As

Publication number Publication date
CN116700554B (en) 2024-05-24
CN118502625A (en) 2024-08-16

Similar Documents

Publication Publication Date Title
US9767359B2 (en) Method for recognizing a specific object inside an image and electronic device thereof
US9852491B2 (en) Objects in screen images
EP3964937B1 (en) Method for generating user profile photo, and electronic device
CN108463799B (en) Flexible display of electronic device and operation method thereof
CN114816167B (en) Application icon display method, electronic device and readable storage medium
JP2011526707A (en) Motion control view on mobile computing devices
CN116095413B (en) Video processing method and electronic equipment
CN115801943B (en) Display method, electronic device and storage medium
CN116826892B (en) Charging method, charging device, electronic apparatus, and readable storage medium
CN116664734B (en) Method for displaying ring chart, electronic device and readable storage medium
CN116700554B (en) Information display method, electronic device and readable storage medium
CN114461312B (en) Display method, electronic device and storage medium
CN112711636B (en) Data synchronization method, device, equipment and medium
CN116204254A (en) Annotating page generation method, electronic equipment and storage medium
CN116661645B (en) Method for displaying application card, electronic device and readable storage medium
CN116048373B (en) Display method of suspension ball control, electronic equipment and storage medium
CN118093067A (en) Method for displaying card, electronic device and readable storage medium
CN116935504B (en) Card punching method, electronic device and computer readable storage medium
WO2024125301A1 (en) Display method and electronic device
CN116700535B (en) Suspension bar display method based on note application, electronic equipment and storage medium
WO2023072113A1 (en) Display method and electronic device
EP4296840A1 (en) Method and apparatus for scrolling to capture screenshot
WO2023160455A1 (en) Object deletion method and electronic device
CN116719459A (en) Annotation frame display method, electronic device and readable storage medium
CN118363513A (en) Display method, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant