CN113407077A - Information extraction method and electronic equipment - Google Patents

Information extraction method and electronic equipment

Info

Publication number
CN113407077A
CN113407077A
Authority
CN
China
Prior art keywords
input
stylus
target area
pen body
movement
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110629204.1A
Other languages
Chinese (zh)
Inventor
袁青青
刘由之
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd filed Critical Vivo Mobile Communication Co Ltd
Priority to CN202110629204.1A priority Critical patent/CN113407077A/en
Publication of CN113407077A publication Critical patent/CN113407077A/en
Pending legal-status Critical Current

Classifications

    • G06F ELECTRIC DIGITAL DATA PROCESSING (G PHYSICS; G06 COMPUTING; CALCULATING OR COUNTING)
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04162 Control or interface arrangements specially adapted for digitisers for exchanging data with external devices, e.g. smart pens, via the digitiser sensing hardware
    • G06F3/0485 Scrolling or panning
    • G06F3/04883 Interaction techniques using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text
    • G06F40/166 Text processing; Editing, e.g. inserting or deleting

Abstract

The application discloses an information extraction method and an electronic device, belonging to the field of communication. The information extraction method includes the following steps: receiving a first input from the body of a stylus, where the first input includes at least one of a movement input and a rotation input; in response to the first input, determining a target area of the electronic device's display screen corresponding to the range covered by the movement and/or rotation of the stylus body; and extracting the content displayed in the target area.

Description

Information extraction method and electronic equipment
Technical Field
The application belongs to the technical field of communication, and particularly relates to an information extraction method and electronic equipment.
Background
As electronic devices have become widespread, their functionality has grown ever more complete. An electronic device can be controlled through hardware buttons, software operations, or gestures to extract content displayed on its screen, such as text and pictures.
However, when a user wants to extract only part of the content displayed on the screen, or more content than the screen currently shows, the user must first select all of the displayed content and then pick the target content out of the selection.
Disclosure of Invention
Embodiments of the present application aim to provide an information extraction method and an electronic device that solve the problem that extracting content displayed on a screen is a cumbersome and time-consuming operation.
In a first aspect, an embodiment of the present application provides an information extraction method for an electronic device communicatively connected to a stylus. The method includes:
receiving a first input from the stylus body, where the first input includes at least one of a movement input and a rotation input;
in response to the first input, determining a target area of the electronic device's display screen corresponding to the range covered by the movement and/or rotation of the stylus body; and
extracting the content displayed in the target area.
In a second aspect, an embodiment of the present application provides an information extraction apparatus, including:
a receiving module configured to receive a first input from a stylus body, where the first input includes at least one of a movement input and a rotation input;
a processing module configured to determine, in response to the first input, a target area of the electronic device's display screen corresponding to the range covered by the movement and/or rotation of the stylus body; and
an extraction module configured to extract the content displayed in the target area.
In a third aspect, an embodiment of the present application provides an electronic device, which includes a processor, a memory, and a program or instructions stored on the memory and executable on the processor, and when executed by the processor, the program or instructions implement the steps of the method according to the first aspect.
In a fourth aspect, embodiments of the present application provide a readable storage medium, on which a program or instructions are stored, which when executed by a processor implement the steps of the method according to the first aspect.
In a fifth aspect, an embodiment of the present application provides a chip, where the chip includes a processor and a communication interface, where the communication interface is coupled to the processor, and the processor is configured to execute a program or instructions to implement the method according to the first aspect.
In embodiments of the present application, after the electronic device is connected to the stylus, it may receive a first input from the stylus body, where the first input may include at least one of a movement input and a rotation input; the auxiliary operation of the stylus body effectively improves the stability and accuracy of determining the target area. After receiving the first input, the electronic device takes the area of its display screen corresponding to the range covered by the movement and/or rotation of the stylus body as the target area, and extracts the content displayed in that area.
Drawings
Fig. 1 is a schematic flowchart of an information extraction method provided in an embodiment of the present application;
FIG. 2 is a schematic diagram of a movement input provided by an embodiment of the present application;
FIG. 3 is a schematic diagram of another movement input provided by embodiments of the present application;
FIG. 4 is a schematic diagram of yet another movement input provided by an embodiment of the present application;
FIG. 5 is a schematic diagram of a method for determining a target area according to an embodiment of the present disclosure;
FIG. 6 is a schematic diagram of another method for determining a target area provided by an embodiment of the present application;
FIG. 7 is a schematic diagram of yet another method for determining a target area provided by an embodiment of the present application;
FIG. 8 is a schematic diagram of generating a screenshot provided by an embodiment of the present application;
fig. 9 is a schematic structural diagram of an information extraction apparatus provided in an embodiment of the present application;
fig. 10 is a schematic structural diagram of an electronic device provided in an embodiment of the present application;
fig. 11 is a hardware configuration diagram of another electronic device provided in an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application are described clearly below with reference to the accompanying drawings. The described embodiments are plainly only some, and not all, of the embodiments of the present application. All other embodiments that a person of ordinary skill in the art can derive from the embodiments given herein fall within the scope of the present application.
The terms "first", "second", and the like in the description and claims are used to distinguish between similar elements and do not necessarily describe a particular sequential or chronological order. It should be understood that terms so used are interchangeable under appropriate circumstances, so that embodiments of the application can be practiced in sequences other than those illustrated or described herein. Moreover, "first", "second", and the like are used in a generic sense and do not limit the number of elements; for example, a first element can be one element or more than one. In addition, "and/or" in the specification and claims denotes at least one of the connected objects, and the character "/" generally indicates an "or" relationship between the preceding and succeeding objects.
To address the problems described in the background, the embodiments of the present application provide an information extraction method and an electronic device. After the electronic device is connected to a stylus, it may receive a first input from the stylus body, where the first input may include at least one of a movement input and a rotation input, and the area of the display screen corresponding to the range covered by the movement and/or rotation of the stylus body is taken as the target area; the auxiliary operation of the stylus body thus effectively improves the stability and accuracy of determining the target area. The content displayed in the target area is then extracted. The whole process is simple to operate, which brings convenience to the user and effectively improves the user experience.
The information extraction method provided by the embodiments of the present application is described in detail below through specific embodiments and their application scenarios, with reference to the accompanying drawings.
Referring to fig. 1, fig. 1 is a schematic flow chart of an information extraction method provided in an embodiment of the present application, and may include steps 110 to 130.
Step 110, a first input of a stylus body is received.
In particular, the first input may comprise at least one of a movement input and a rotation input.
In this embodiment, the electronic device may be communicatively connected to a stylus, and a detection element may be provided on the stylus or the electronic device. When the electronic device receives the first input from the stylus, the stylus body may rest against the display screen of the electronic device, or it may remain within the sensing range of the detection element while a gap exists between the stylus body and the display screen; this is not specifically limited here.
Based on this sensing between the electronic device and the stylus, the electronic device may receive at least one of a movement input and a rotation input from the stylus. The electronic device may include, but is not limited to, a smartphone, a tablet computer, a notebook computer, a palmtop computer, a vehicle-mounted terminal, a wearable device, and the like.
Step 120: in response to the first input, determine a target area of the electronic device's display screen corresponding to the range covered by the movement and/or rotation of the stylus body.
In some embodiments, a movement input refers to the stylus moving in some direction. When movement of the stylus relative to the display screen is detected, the shape of the area covered by the movement may be, for example, a rectangle or a parallelogram, or may correspond to another customized shape. A rotation input refers to the stylus rotating about a fixed point; when rotation of the stylus relative to the display screen is detected, the covered shape may be, for example, a sector or a circle, or another customized shape. It will be appreciated that the coverage of a single first input may also be determined by a combination of movements and rotations.
After the electronic device detects that the first input is complete, it takes the area of the display screen corresponding to the coverage range as the target area; in this way, the auxiliary operation of the stylus body effectively improves the stability and accuracy of determining the target area.
Optionally, the user may trigger an operation that ends the first input. For example, a touch element for stopping acquisition of the first input may be provided on the stylus or the electronic device, and after the user taps it, the electronic device determines that the first input is complete. Alternatively, the first input may end when the stylus body leaves the sensing range of the detection element in the electronic device; this is not specifically limited here.
After the target area is determined, step 130 may be performed next.
Step 130, extracting the content displayed in the target area.
In some embodiments, the content displayed in the target area may include text information, images, and the like. Extracting the content displayed in the target area may mean generating a screenshot corresponding to the target area. When the content displayed in the target area includes text information, extracting the content may also mean extracting the text information itself, for example through operations such as cutting, copying, and selecting; this is not specifically limited here.
With the embodiments of the present application, a user can quickly extract target content from what is displayed on the screen simply by moving and/or rotating the stylus relative to the display screen of the electronic device. This simplifies the operation of determining the target area, makes extracting the target content more convenient, and improves the user experience.
As a specific example, when the first input includes a movement input, step 120 may specifically be: in response to the movement input, determining the target area of the display screen corresponding to the movement coverage of the stylus body according to the displacement of the stylus body and the length of the stylus body.
Take generating a screenshot of the target area as an example. When determining the movement coverage of the stylus body, as shown in fig. 2(a), the stylus moves from its initial position in the direction of the arrow. To help the user see the coverage more intuitively, the boundary of the covered area may be displayed while the stylus moves; this boundary represents the range covered by the stylus body's movement across the display screen. Optionally, the position of the stylus tip serves as the left boundary of the covered area, and, when the stylus body does not extend beyond the screen, the position of the tail of the body serves as the right boundary. When the electronic device detects that the movement input has ended, it generates a screenshot of the content displayed in the target area corresponding to the movement coverage, based on the displacement of the stylus body. It is understood that the movement direction in fig. 2(a) is only an example and does not limit the present application.
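As an illustrative sketch (not part of the patent), the geometry above, in which the tip displacement gives the vertical extent of the target area and the pen-body length gives the horizontal extent, could be expressed as follows. All names, and the assumption that the pen body lies horizontally while moving vertically, are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class Rect:
    left: float
    top: float
    right: float
    bottom: float

def coverage_from_movement(start_tip, end_tip, body_length, screen_width):
    """Rectangle swept by a horizontally held pen body moving vertically.

    The tip traces the left boundary; the tail (tip x + body length,
    clamped to the screen edge) traces the right boundary, as in Fig. 2.
    """
    left = min(start_tip[0], end_tip[0])
    right = min(left + body_length, screen_width)  # clamp tail to screen
    top = min(start_tip[1], end_tip[1])
    bottom = max(start_tip[1], end_tip[1])
    return Rect(left, top, right, bottom)
```

A screenshot of the target area would then simply crop the framebuffer to this rectangle.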
To make the screenshot easy to view, it may optionally be displayed as a floating window in a preset area of the screen, as shown in fig. 2(b). If the user wants to see the details of the screenshot or edit it immediately, the user can tap the screenshot in the floating window directly, so that the generated screenshot is processed quickly, improving the user experience.
In some embodiments, the content displayed in the screen may include textual information. The text information displayed in the screen may be from a document page opened by the user, a web page browsed by the user, or a chat session interface, and the like, and is not particularly limited herein.
Take displaying a document page and extracting the text information in the target area as an example. The user may cover the desired text with the stylus body, optionally holding the body parallel to the text lines. As shown in fig. 3(a), the stylus moves in the direction of the arrow; the initial position of the movement marks the starting line, and when the electronic device detects that the movement input has ended, the end position marks the ending line. The selected text information may be as shown in fig. 3(b). In this way, the text displayed on the screen can be selected accurately, improving the accuracy of text selection.
To make text processing more convenient, before the text information in the target area is extracted, preset text processing controls may be displayed on the screen so that the user can choose a target extraction manner, for example: cut, copy, encrypt, delete, generate picture, enlarge, reduce, and the like, as shown in fig. 3(b); this is not specifically limited here. The content displayed in the target area is then extracted according to the extraction manner the user selects, which makes choosing an extraction manner convenient and improves efficiency for the user.
Continuing with the document page example, fig. 4 is another schematic diagram of determining a target area provided in an embodiment of the present application. As shown in fig. 4(a), the stylus body may cover multiple lines of text, including the starting and ending lines of the content the user wants to select; the stylus moves in the direction of the arrow, and after the movement input is detected to have ended, the text information to be extracted is obtained, as shown in fig. 4(b). Optionally, to help the user process the text displayed in the target area, preset word processing controls may be displayed on the screen before the content is extracted, as shown in fig. 4(b), so that the user can choose a target extraction manner.
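As a minimal illustrative sketch (not from the patent), mapping the swept vertical band to whole text lines, as in figs. 3 and 4, might look like this; the uniform line height and all names are assumptions:

```python
def covered_line_indices(top, bottom, line_height, num_lines):
    """Indices of text lines whose vertical extent intersects the band
    [top, bottom] swept by the pen body (starting line through ending line)."""
    first = max(0, int(top // line_height))
    last = min(num_lines - 1, int(bottom // line_height))
    return list(range(first, last + 1))

def select_text(lines, top, bottom, line_height):
    """Text selected by a pen body held parallel to the text lines."""
    idx = covered_line_indices(top, bottom, line_height, len(lines))
    return "\n".join(lines[i] for i in idx)
```

The selected string could then be handed to whichever processing control (cut, copy, etc.) the user picks.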
As a specific example, when the first input includes both a movement input and a rotation input, step 120 may specifically be:
in response to the movement input, determining a first area of the display screen corresponding to the movement coverage of the stylus body according to the displacement and length of the stylus body;
in response to the rotation input, moving the target boundary of the first area in a first direction according to the rotation angle of the stylus body; and
when the stylus stops rotating, stopping the movement of the target boundary to obtain the target area.
That is, if the coverage obtained through the movement input does not meet the user's needs, the user can adjust it by rotating the stylus until a satisfactory target area is obtained.
The rotation direction of the rotation input may include, for example, clockwise and counterclockwise rotation. Optionally, the first direction in which the target boundary moves may be set by the user as needed. Thus, when the user wants to extract only part of the interface, the user merely has to rotate the stylus to adjust the size of the covered area and flexibly determine the target area.
Take generating a screenshot of the target area as an example. First, as in the process of fig. 2, the stylus moves by a certain displacement to determine the movement coverage; the stylus can then adjust the coverage by rotating. As shown in fig. 5(a), when the stylus rotates clockwise, the target boundary moves to the left and the coverage shrinks. When the stylus stops rotating, the target boundary stops moving, and when the electronic device detects that the first input has ended, it generates a screenshot of the content displayed in the target area corresponding to the adjusted coverage, thereby quickly extracting the displayed content.
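A hedged sketch of this boundary adjustment; the pixels-per-degree gain, the clamping behavior, and the clockwise-shrinks convention are assumptions for illustration, not specified by the patent:

```python
def adjust_right_boundary(left, right, screen_width, rotation_deg, gain=2.0):
    """Move the target (right) boundary per the pen-body rotation angle:
    clockwise (positive degrees) shrinks the coverage, counter-clockwise
    grows it. The result is clamped to [left, screen_width]. `gain` is an
    assumed tuning constant in pixels per degree."""
    new_right = right - gain * rotation_deg
    return max(left, min(new_right, screen_width))
```

Called once per rotation sample while the stylus rotates, this yields the adjusted coverage whose contents are captured when the first input ends.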
To make the screenshot easy to view, it may optionally be displayed as a floating window in a preset area of the screen, as shown in fig. 5(b); the user can tap the screenshot in the floating window directly to view or edit it, so that the generated screenshot is processed quickly, improving the user experience.
In some embodiments, when the first input includes a movement input and a rotation input and the content displayed in the first area includes text information, the target boundary is the end position of the text information. After the first area is determined, the target area may then be determined according to the following steps:
in response to the rotation input, moving the end position of the text information in a second direction according to the rotation angle of the stylus body; and
when the stylus stops rotating, stopping the movement of the end position of the text information to obtain the target area and the text to be extracted within it.
For example, to help the user identify the selected text, the text information displayed in the first area may optionally be shown with a shading, such as dark gray. Here, the end position of the text information is the last selected character, which may be a character, a letter, a symbol, and the like; this is not specifically limited here. A marker for the end position may further be displayed, as shown in fig. 6(b).
The second direction in which the end position of the text information moves can be set by the user as needed. Thus, when the user wants to select a partial line, or to adjust which line the end position falls on, the user merely has to rotate the stylus to adjust the end position of the text information and flexibly determine the target area.
Continuing with the document page example, fig. 6 is a schematic diagram of determining a target area according to an embodiment of the present application. First, as in fig. 3(a), the stylus moves by a certain displacement to obtain a first area. Then, in response to the rotation input, the end position of the text information moves in the direction indicated by the dotted arrow in fig. 6(a). When the stylus is detected to stop rotating, the end position stops moving, yielding the target area and the text to be extracted; the text to be extracted and the end position may be as shown in fig. 6(b). In this way, the text displayed on the screen can be selected accurately, improving both the efficiency and the accuracy of text selection.
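A minimal sketch of the end-position adjustment described above; the direction convention (clockwise retracts the selection) and the characters-per-degree sensitivity are illustrative assumptions:

```python
def move_end_position(text, end_index, rotation_deg, chars_per_degree=0.5):
    """Shift the selection end point through the text in proportion to the
    pen-body rotation angle: clockwise (positive degrees) retracts the
    selection, counter-clockwise extends it, clamped to the text bounds."""
    shift = round(rotation_deg * chars_per_degree)
    return max(0, min(len(text), end_index - shift))
```

The text to be extracted would then be `text[start_index:end_index]` with the adjusted end index.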
Optionally, to help the user process the text displayed in the target area, preset word processing controls may be displayed on the screen before the text to be extracted is extracted, as shown in fig. 6(b), so that the user can choose a target extraction manner; this improves efficiency and the flexibility of text processing.
As another example, fig. 7 shows a further schematic diagram of determining a target area according to an embodiment of the present application. As in fig. 4(a), the stylus first moves by a certain displacement to obtain a first area. Then, in response to the rotation input, the end position of the text information moves in the direction indicated by the dotted arrow in fig. 7(a). When the stylus is detected to stop rotating, the end position stops moving, yielding the target area and the text to be extracted; the text to be extracted and the end position may be as shown in fig. 7(b).
Optionally, to help the user process the text displayed in the target area, preset word processing controls may be displayed on the screen before the content is extracted, as shown in fig. 7(b), so that the user can choose a target extraction manner; this improves efficiency, the flexibility of text processing, and the user experience.
To meet users' needs for diversified information extraction, the method for extracting the content displayed in the target area further includes:
when it is detected that the stylus body has moved to a preset position, scrolling the displayed content in a third direction, where the third direction is opposite to the movement direction of the movement input; and
when it is detected that the stylus has left the preset position, extracting the content displayed in the target area during a first time period, where the first time period runs from when the stylus starts moving until it leaves the preset position.
The preset position is any position in the display screen, for example, may be set at an edge of the screen, and the preset position may also be set at any position in the display screen, which is not limited herein. And under the condition that the stylus reaches the preset position, the content displayed in the display screen can be scrolled in a third direction opposite to the moving input direction, so that the content displayed in the target area is increased.
When it is detected that the stylus has left the preset position, the content displayed in the target area during the period from when the stylus starts moving to when it leaves the preset position may be extracted. For example, a long screenshot corresponding to the content displayed in the target area may be generated; or, when the content displayed in the target area includes text information, operations such as cutting, copying, and selecting may be performed to extract the text information.
In the embodiment of the application, when a long screenshot needs to be generated, the user only needs to move the stylus to the preset position in the display screen to quickly increase the content displayed in the target area. This makes the long-screenshot function convenient and fast, and improves the use experience.
Taking the preset position as the lower edge of the screen as an example, fig. 8 is a schematic diagram of another screenshot generation provided in the embodiment of the present application. The stylus slides from top to bottom, i.e. in the direction indicated by the solid arrow in fig. 8(a), and the area corresponding to the coverage of the stylus movement is the area in the dashed frame shown in fig. 8(a). When the stylus is detected at the lower edge of the screen, the content displayed in the screen scrolls upwards, i.e. in the direction indicated by the dashed arrow in fig. 8(a), so that the content displayed in the dashed frame increases. Then, when it is detected that the stylus has left the preset position and the movement input has ended, the content displayed in the target area during the first time period is captured to obtain a long screenshot. Optionally, the screenshot is displayed in a preset area of the screen, as shown in fig. 8(b). According to the embodiment of the present application, the operation process of generating a screenshot can be simplified, and the user experience improved.
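The capture step — collecting everything shown in the target area over the first time period — can be sketched as stitching successive snapshots of the target area and dropping the overlap between consecutive ones. The rows-of-text model below is a simplification of what a real implementation would do with pixel rows; it is an illustration, not the embodiment's algorithm.

```python
def stitch_frames(frames):
    """Concatenate successive snapshots of the target area into one
    long capture. Each frame is a list of rows; consecutive frames
    may overlap because scrolling is continuous."""
    result = []
    for frame in frames:
        # The longest suffix of `result` that is a prefix of `frame`
        # is the overlap to drop before appending.
        overlap = 0
        for n in range(min(len(result), len(frame)), 0, -1):
            if result[-n:] == frame[:n]:
                overlap = n
                break
        result.extend(frame[overlap:])
    return result
```

Applied to the frames captured between the start of the movement and the moment the stylus leaves the preset position, this yields the long screenshot shown in fig. 8(b).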
It should be noted that the execution subject of the information extraction method provided in the embodiment of the present application may be an information extraction device, or a control module in the information extraction device for executing the information extraction method. In the embodiment of the present application, an information extraction device executing the information extraction method is taken as an example to describe the information extraction device provided in the embodiment of the present application.
Fig. 9 is a schematic structural diagram of an information extraction apparatus provided in the present application. As shown in fig. 9, the information extraction apparatus 900 may include: a receiving module 910, a processing module 920, and an extracting module 930.
A receiving module 910, configured to receive a first input of a stylus body, where the first input includes at least one of a movement input and a rotation input;
the processing module 920 is configured to determine, in response to the first input, a target area corresponding to a pen body movement and/or rotation coverage in a display screen of the electronic device;
an extracting module 930, configured to extract the content displayed in the target area.
In the embodiment of the application, after the electronic device is connected to the stylus, the electronic device may receive a first input of the stylus body, where the first input may include at least one of a movement input and a rotation input; the auxiliary operation of the stylus body can effectively improve the stability and accuracy of determining the target area. After the electronic device receives the first input, the area in the display screen of the electronic device corresponding to the coverage range of the pen body's movement and/or rotation is taken as the target area, and the content displayed in the target area is extracted.
In a possible embodiment, in a case that the first input includes a movement input, the processing module 920 is further configured to determine, in response to the movement input, a target area in the display screen corresponding to a movement coverage of the pen body according to the displacement of the pen body and the length of the pen body.
In this way, the complexity of determining the target area can be reduced and the extraction of target content made more convenient, which brings convenience to the user and improves the user experience.
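The computation hinted at here — a target area derived from the displacement of the pen body and the pen body's length — can be sketched as follows. This is a minimal illustration assuming the pen body lies horizontally on the screen and is dragged vertically, so the swept rectangle's width is the pen length and its height is the displacement; the coordinate conventions are assumptions.

```python
from dataclasses import dataclass

@dataclass
class Rect:
    x: float       # left edge
    y: float       # top edge
    width: float
    height: float

def target_area_from_movement(pen_left_x: float, start_y: float,
                              end_y: float, pen_length: float) -> Rect:
    """Target area corresponding to the movement coverage of the pen
    body: one side of the rectangle is the pen length, the other the
    displacement of the movement input."""
    displacement = abs(end_y - start_y)
    top = min(start_y, end_y)
    return Rect(x=pen_left_x, y=top, width=pen_length, height=displacement)
```

Dragging a 400-unit pen from y=100 to y=300 would, under these assumptions, sweep a 400×200 rectangle anchored at the pen's starting edge.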
In a possible embodiment, in the case that the first input includes a movement input and a rotation input, the processing module 920 is further configured to determine, in response to the movement input, a first area in the display screen corresponding to the movement coverage of the pen body according to the displacement of the pen body and the length of the pen body;
the processing module 920 is further configured to, in response to the rotation input, move the target boundary of the first area to the first direction according to the rotation angle of the pen body;
the processing module 920 is further configured to stop moving the target boundary to obtain the target area when it is detected that the stylus stops rotating.
In this way, the user can adjust the coverage range by rotating the stylus to obtain a target area that meets the user's requirements; the operation is simple, and the user experience can be improved.
In a possible embodiment, in a case where the content displayed in the first area includes text information, the target boundary is an end position of the text information;
the processing module 920 is further configured to, in response to the rotation input, move the end position of the text information in the second direction according to the rotation angle of the pen body;
the processing module 920 is further configured to stop moving the end position of the text information when it is detected that the stylus stops rotating, so as to obtain the target area and the text to be extracted in the target area.
In this way, when the user wants to select text that does not span whole lines, or to adjust the line in which the end position lies, the user only needs to rotate the stylus to adjust the end position of the text information, so that the target area can be determined flexibly.
In one possible embodiment, the apparatus further includes a display module configured to scroll the target content in a third direction when it is detected that the stylus body has moved to the preset position, where the third direction is opposite to the movement direction of the movement input;
the extracting module 930 is further configured to, in a case that it is detected that the stylus pen leaves the preset position, extract content displayed in the target area within a first time period, where the first time period is a time period from when the stylus pen starts to move to when the stylus pen leaves the preset position.
In this way, users' diverse information extraction requirements can be met.
The information extraction device in the embodiment of the present application may be a device, or may be a component, an integrated circuit, or a chip in a terminal. The device can be a mobile electronic device or a non-mobile electronic device. By way of example, the mobile electronic device may be a mobile phone, a tablet computer, a notebook computer, a palmtop computer, a vehicle-mounted electronic device, a wearable device, an ultra-mobile personal computer (UMPC), a netbook, or a personal digital assistant (PDA), and the non-mobile electronic device may be a server, a network attached storage (NAS), a personal computer (PC), a television (TV), a teller machine, or a self-service machine, which are not specifically limited in the embodiments of the present application.
The information extraction device in the embodiment of the present application may be a device having an operating system. The operating system may be an Android operating system, an iOS operating system, or another possible operating system, which is not specifically limited in the embodiments of the present application.
The information extraction device provided in the embodiment of the present application can implement each process implemented by the method embodiments of fig. 1 to 9, and is not described here again to avoid repetition.
Optionally, as shown in fig. 10, an electronic device 1000 is further provided in this embodiment of the present application, and includes a processor 1001, a memory 1002, and a program or an instruction stored in the memory 1002 and executable on the processor 1001, where the program or the instruction is executed by the processor 1001 to implement each process of the above-mentioned embodiment of the information extraction method, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here.
It should be noted that the electronic device in the embodiment of the present application includes the mobile electronic device and the non-mobile electronic device described above.
Fig. 11 is a schematic diagram of a hardware structure of an electronic device implementing an embodiment of the present application.
The electronic device 1100 includes, but is not limited to: a radio frequency unit 1101, a network module 1102, an audio output unit 1103, an input unit 1104, a sensor 1105, a display unit 1106, a user input unit 1107, an interface unit 1108, a memory 1109, a processor 1110, and the like.
Those skilled in the art will appreciate that the electronic device 1100 may further include a power source (e.g., a battery) for supplying power to the various components, and the power source may be logically connected to the processor 1110 via a power management system, so as to manage charging, discharging, and power consumption management functions via the power management system. The electronic device structure shown in fig. 11 does not constitute a limitation of the electronic device, and the electronic device may include more or less components than those shown, or combine some components, or arrange different components, and thus, the description is not repeated here.
A user input unit 1107 for receiving a first input of the stylus body, wherein the first input includes at least one of a movement input and a rotation input;
a processor 1110 for determining a target area corresponding to a body movement and/or rotation coverage in a display screen of the electronic device in response to the first input;
the processor 1110 is further configured to extract content displayed in the target area.
In the embodiment of the application, after the electronic device is connected to the stylus, the electronic device may receive a first input of the stylus body, where the first input may include at least one of a movement input and a rotation input; the auxiliary operation of the stylus body can effectively improve the stability and accuracy of determining the target area. After the electronic device receives the first input, the area in the display screen of the electronic device corresponding to the coverage range of the pen body's movement and/or rotation is taken as the target area, and the content displayed in the target area is extracted.
Optionally, where the first input comprises a movement input,
the processor 1110 is further configured to determine, in response to the movement input, a target area in the display screen corresponding to the movement coverage of the pen body according to the displacement of the pen body and the length of the pen body.
In this way, the complexity of determining the target area can be reduced and the extraction of target content made more convenient, which brings convenience to the user and improves the user experience.
Optionally, in case the first input comprises a movement input and a rotation input,
the processor 1110 is further configured to determine, in response to the movement input, a first area corresponding to a movement coverage range of the pen body in the display screen according to the displacement of the pen body and the length of the pen body;
a processor 1110, further configured to respond to a rotational input, move a target boundary of the first region to a first direction according to a rotation angle of the pen body;
the processor 1110 is further configured to stop moving the target boundary to obtain the target area when it is detected that the stylus stops rotating.
In this way, the user can adjust the coverage range by rotating the stylus to obtain a target area that meets the user's requirements; the operation is simple, and the user experience can be improved.
Optionally, when the content displayed in the first area includes text information, the target boundary is an end position of the text information;
the processor 1110 is further configured to, in response to the rotation input, move the end position of the text information in the second direction according to the rotation angle of the pen body;
the processor 1110 is further configured to, when it is detected that the stylus pen stops rotating, stop moving the end position of the text information, and obtain the target area and the text to be extracted in the target area.
In this way, when the user wants to select text that does not span whole lines, or to adjust the line in which the end position lies, the user only needs to rotate the stylus to adjust the end position of the text information, so that the target area can be determined flexibly.
Optionally, the display unit 1106 is configured to, in a case that it is detected that the stylus pen body moves to the preset position, scroll and display the target content in a third direction, where the third direction is opposite to a moving direction of the movement input;
the extracting module 1130 is further configured to, in a case that it is detected that the stylus pen leaves the preset position, extract content displayed in the target area within a first time period, where the first time period is a time period from when the stylus pen starts to move to when the stylus pen leaves the preset position.
In this way, users' diverse information extraction requirements can be met.
It should be understood that, in the embodiment of the present application, the input unit 1104 may include a graphics processing unit (GPU) 11041 and a microphone 11042. The graphics processor 11041 processes image data of still pictures or videos obtained by an image capture device (such as a camera) in a video capture mode or an image capture mode. The display unit 1106 may include a display panel 11061, and the display panel 11061 may be configured in the form of a liquid crystal display, an organic light-emitting diode, or the like. The user input unit 1107 includes a touch panel 11071 and other input devices 11072. The touch panel 11071, also called a touch screen, may include two portions: a touch detection device and a touch controller. Other input devices 11072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys, switch keys), a trackball, a mouse, and a joystick, which are not described in detail here. The memory 1109 may be used for storing software programs and various data, including but not limited to application programs and an operating system. The processor 1110 may integrate an application processor, which mainly handles the operating system, user interface, and applications, and a modem processor, which mainly handles wireless communications. It should be appreciated that the modem processor may not be integrated into the processor 1110.
The embodiment of the present application further provides a readable storage medium, where a program or an instruction is stored on the readable storage medium, and when the program or the instruction is executed by a processor, the program or the instruction implements each process of the above-mentioned information extraction method embodiment, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here.
The processor is the processor in the electronic device described in the above embodiment. The readable storage medium includes a computer readable storage medium, such as a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and so on.
The embodiment of the present application further provides a chip, where the chip includes a processor and a communication interface, the communication interface is coupled to the processor, and the processor is configured to run a program or an instruction to implement each process of the above information extraction method embodiment, and can achieve the same technical effect, and in order to avoid repetition, the description is omitted here.
It should be understood that the chips mentioned in the embodiments of the present application may also be referred to as system-on-chip, system-on-chip or system-on-chip, etc.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element. Further, it should be noted that the scope of the methods and apparatus of the embodiments of the present application is not limited to performing the functions in the order illustrated or discussed, but may include performing the functions in a substantially simultaneous manner or in a reverse order based on the functions involved, e.g., the methods described may be performed in an order different than that described, and various steps may be added, omitted, or combined. In addition, features described with reference to certain examples may be combined in other examples.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present application may be embodied in the form of a computer software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, or a network device) to execute the method according to the embodiments of the present application.
While the present embodiments have been described with reference to the accompanying drawings, it is to be understood that the invention is not limited to the precise embodiments described above, which are meant to be illustrative and not restrictive, and that various changes may be made therein by those skilled in the art without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (12)

1. An information extraction method, wherein an electronic device is in communication connection with a stylus, the method comprising:
receiving a first input of the stylus body, wherein the first input comprises at least one of a movement input and a rotation input;
in response to the first input, determining a target area corresponding to the pen body movement and/or rotation coverage in a display screen of the electronic equipment;
and extracting the content displayed in the target area.
2. The method of claim 1, wherein, in a case that the first input comprises a movement input, the determining a target area corresponding to the pen body movement and/or rotation coverage in a display screen of the electronic device in response to the first input comprises:
and responding to the movement input, and determining a target area corresponding to the movement coverage range of the pen body in the display screen according to the displacement of the pen body and the length of the pen body.
3. The method of claim 1, wherein in a case that the first input comprises a movement input and a rotation input, the determining a target area in a display screen of the electronic device corresponding to the body movement and/or rotation coverage in response to the first input comprises:
responding to the movement input, and determining a first area corresponding to the movement coverage range of the pen body in the display screen according to the displacement of the pen body and the length of the pen body;
in response to the rotation input, moving a target boundary of the first region to a first direction according to a rotation angle of the pen body;
and under the condition that the stylus stops rotating, stopping moving the target boundary to obtain the target area.
4. The method according to claim 3, wherein in a case where the content displayed in the first area includes text information, the target boundary is an end position of the text information; the responding to the rotation input, moving the target boundary of the first area to a first direction according to the rotation angle of the pen body, and including:
responding to the rotation input, and moving the end point position of the text information to a second direction according to the rotation angle of the pen body;
the stopping moving the target boundary to obtain the target area when the stylus is detected to stop rotating includes:
and under the condition that the stylus stops rotating, stopping moving the end point position of the text information to obtain the target area and the text to be extracted in the target area.
5. The method of claim 1, wherein the extracting the content displayed in the target region comprises:
under the condition that the stylus pen body is detected to move to a preset position, displaying target content in a rolling mode in a third direction, wherein the third direction is opposite to the moving direction of the moving input;
under the condition that the stylus is detected to leave the preset position, extracting content displayed in the target area within a first time period, wherein the first time period is a time period from the time when the stylus starts to move to the time when the stylus leaves the preset position.
6. An information extraction apparatus characterized by comprising:
a receiving module, configured to receive a first input of a stylus body, wherein the first input includes at least one of a movement input and a rotation input;
the processing module is used for responding to the first input and determining a target area corresponding to the pen body moving and/or rotating coverage range in a display screen of the electronic equipment;
and the extraction module is used for extracting the content displayed in the target area.
7. The apparatus of claim 6, wherein in a case that the first input comprises a movement input, the processing module is further configured to determine, in response to the movement input, a target area corresponding to a movement coverage of the pen body in the display screen according to the displacement of the pen body and the length of the pen body.
8. The apparatus of claim 6, wherein, where the first input comprises a movement input and a rotation input,
the processing module is further configured to determine, in response to the movement input, a first area corresponding to a movement coverage area of the pen body in the display screen according to the displacement of the pen body and the length of the pen body;
the processing module is further configured to respond to the rotation input, and move a target boundary of the first area to a first direction according to the rotation angle of the pen body;
the processing module is further configured to stop moving the target boundary to obtain the target area when it is detected that the stylus stops rotating.
9. The apparatus according to claim 8, wherein in a case where the content displayed in the first area includes text information, the target boundary is an end position of the text information;
the processing module is further configured to, in response to the rotation input, move the end position of the text information in a second direction according to the rotation angle of the pen body;
the processing module is further configured to stop moving the end position of the text information to obtain the target area and the text to be extracted in the target area when it is detected that the stylus stops rotating.
10. The apparatus of claim 6, further comprising:
the display module is used for scrolling and displaying target content to a third direction under the condition that the stylus body is detected to move to a preset position, wherein the third direction is opposite to the moving direction of the movement input;
the extracting module is further configured to extract content displayed in the target area within a first time period when the stylus is detected to leave the preset position, where the first time period is a time period from when the stylus starts to move to when the stylus leaves the preset position.
11. An electronic device comprising a processor, a memory, and a program or instructions stored on the memory and executable on the processor, the program or instructions when executed by the processor implementing the steps of the information extraction method of any one of claims 1-5.
12. A readable storage medium, characterized in that it stores thereon a program or instructions which, when executed by a processor, implement the steps of the information extraction method according to any one of claims 1 to 5.
CN202110629204.1A 2021-06-07 2021-06-07 Information extraction method and electronic equipment Pending CN113407077A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110629204.1A CN113407077A (en) 2021-06-07 2021-06-07 Information extraction method and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110629204.1A CN113407077A (en) 2021-06-07 2021-06-07 Information extraction method and electronic equipment

Publications (1)

Publication Number Publication Date
CN113407077A true CN113407077A (en) 2021-09-17

Family

ID=77676520

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110629204.1A Pending CN113407077A (en) 2021-06-07 2021-06-07 Information extraction method and electronic equipment

Country Status (1)

Country Link
CN (1) CN113407077A (en)

Similar Documents

Publication Publication Date Title
US11681866B2 (en) Device, method, and graphical user interface for editing screenshot images
US8595645B2 (en) Device, method, and graphical user interface for marquee scrolling within a display area
US8274536B2 (en) Smart keyboard management for a multifunction device with a touch screen display
US20140372889A1 (en) Device, method, and graphical user interface with content display modes and display rotation heuristics
US20110163967A1 (en) Device, Method, and Graphical User Interface for Changing Pages in an Electronic Document
US20120064946A1 (en) Resizable filmstrip view of images
JP2020516994A (en) Text editing method, device and electronic device
EP2399186B1 (en) Method and apparatus for displaying additional information items
CN107479818B (en) Information interaction method and mobile terminal
CN112433693B (en) Split screen display method and device and electronic equipment
CN115357158A (en) Message processing method and device, electronic equipment and storage medium
CN112099714B (en) Screenshot method and device, electronic equipment and readable storage medium
CN112181252B (en) Screen capturing method and device and electronic equipment
CN112399010B (en) Page display method and device and electronic equipment
CN113783995A (en) Display control method, display control device, electronic apparatus, and medium
WO2023284640A9 (en) Picture processing method and electronic device
CN111796736B (en) Application sharing method and device and electronic equipment
CN112162689B (en) Input method and device and electronic equipment
CN113407077A (en) Information extraction method and electronic equipment
CN113311982A (en) Information selection method and device
CN113835578A (en) Display method and device and electronic equipment
US10019423B2 (en) Method and apparatus for creating electronic document in mobile terminal
CN106502515B (en) Picture input method and mobile terminal
CN112765500A (en) Information searching method and device
CN111949322A (en) Information display method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination