CN110427139B - Text processing method and device, computer storage medium and electronic equipment


Info

Publication number
CN110427139B
CN110427139B (application CN201811407692.6A)
Authority
CN
China
Prior art keywords
text
point
determining
interactive interface
control object
Prior art date
Legal status
Active
Application number
CN201811407692.6A
Other languages
Chinese (zh)
Other versions
CN110427139A (en)
Inventor
薛源
Current Assignee
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd
Priority to CN201811407692.6A
Publication of CN110427139A
Application granted
Publication of CN110427139B

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484: Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The disclosure provides a text processing method and device, a storage medium, and electronic equipment, relating to the technical field of human-computer interaction. The method comprises the following steps: detecting a non-contact operation of a control object acting within a first preset distance range in front of an interactive interface; determining a text selected area according to the non-contact operation; detecting an interactive operation between the control object and the interactive interface; and determining the text content in the text selected area according to the interactive operation. On one hand, the method and device can obtain a text selected area containing the required text through non-contact operation, improving the efficiency of text selection; on the other hand, the text content can be determined through a simple interactive operation, shortening the text processing flow and improving the user experience.

Description

Text processing method and device, computer storage medium and electronic equipment
Technical Field
The present disclosure relates to the field of human-computer interaction technologies, and in particular, to a text processing method and apparatus, a computer storage medium, and an electronic device.
Background
With the development of technology, intelligent terminal devices have gradually entered people's daily lives, making activities such as shopping, reading, and gaming convenient via mobile phones, tablet computers, and similar devices.
At present, users generally browse text on a touch terminal and occasionally need to edit it, but processing text on a touch terminal is problematic. For example, quickly editing text is time-consuming because the characters are small; to copy part of a long article, the user must long-press the target text, drag handles to frame the copy area, and then tap Copy before pasting. Because these steps are numerous and disjointed, the operation is not smooth and the user experience is poor.
Therefore, there is a need in the art for a new method of text processing.
It is to be noted that the information disclosed in the above background section is only for enhancement of understanding of the background of the present disclosure, and thus may include information that does not constitute prior art known to those of ordinary skill in the art.
Disclosure of Invention
An object of the present disclosure is to provide a text processing method and apparatus, a storage medium, and an electronic device, thereby overcoming, at least to some extent, one or more problems due to limitations and disadvantages of the related art.
According to one aspect of the present disclosure, a text processing method is provided, which is applied to a touch terminal capable of presenting an interactive interface, and includes:
detecting non-contact operation of an operation object acting on the front of the interactive interface within a first preset distance range;
determining a text selected area according to the non-contact operation;
detecting an interactive operation between the control object and the interactive interface;
and determining the text content in the selected text area according to the interactive operation.
In an exemplary embodiment of the present disclosure, the non-contact operation is a line operation, and the determining a text selected area according to the non-contact operation includes:
and determining the selected text area according to the linear track formed by the linear operation.
In an exemplary embodiment of the present disclosure, the determining the text selection area according to the line-shaped trajectory formed by the line operation includes:
determining a mapping track of the line operation on the interactive interface according to a line track formed by the line operation;
and determining the selected text area according to the mapping track.
In an exemplary embodiment of the present disclosure, the determining a mapping track of the line operation on the interactive interface according to the line-shaped track formed by the line operation includes:
determining a starting point of the linear track according to a projection point of the reference point of the control object on the interactive interface;
acquiring a linear track formed by the sliding of the control object from the starting point to the end point;
and determining the mapping track according to the projection of the linear track on the interactive interface.
In an exemplary embodiment of the present disclosure, the mapping track is a straight line;
the determining the selected region of text according to the mapping track comprises:
and forming a box area by taking the mapping track as a diagonal line, and taking the box area as the text selected area.
In an exemplary embodiment of the present disclosure, the acquiring a linear trajectory formed by the manipulation object sliding from the starting point to the ending point includes:
when the control object is detected to slide from the starting point, the text corresponding to the sliding track of the control object in the interactive interface presents a marked state;
when the control object is detected to slide to the end point, determining the linear track according to the text with the marking state.
In an exemplary embodiment of the present disclosure, the acquiring a linear trajectory formed by the manipulation object sliding from the starting point to the ending point includes:
and detecting a track formed by the suspended sliding of the control object from the starting point to the end point along the peripheral path of the text, and taking the track as the linear track.
In an exemplary embodiment of the present disclosure, the non-contact operation is a point operation, and the determining a text selected area according to the non-contact operation includes:
and determining the text selected area according to the position of the operating point of the point operation.
In an exemplary embodiment of the present disclosure, the determining the text selection area according to the position of the operating point of the point operation includes:
determining a mapping point of the point operation on the interactive interface according to the position of the operating point of the point operation;
and determining the selected text area according to the mapping points.
In an exemplary embodiment of the present disclosure, the determining the selected region of text from the mapping points includes:
and determining the central point of the selected area of the text according to the mapping point.
In an exemplary embodiment of the present disclosure, the determining the text selection area according to the position of the operating point of the point operation includes:
and determining the size of the selected text area according to the distance between the operating point of the point operation and the interactive interface.
In an exemplary embodiment of the present disclosure, the determining the size of the text selected area according to the distance between the operating point of the point operation and the interactive interface includes:
the distance is positively correlated with the size of the text selected area.
In an exemplary embodiment of the present disclosure, the determining the selected region of text from the mapping points includes:
and determining the size of the selected text area according to the pressing force of the control object on the interactive interface by taking the central point as a datum point.
In an exemplary embodiment of the present disclosure, the method further comprises:
and generating the text selected area according to the pressing operation of the control object on the interactive interface within a preset time period.
In an exemplary embodiment of the present disclosure, the determining a size of the selected area of the text according to a degree of pressing the manipulation object against the interactive interface with the central point as a reference point includes:
the pressing degree of the control object is positively correlated with the size of the selected text area.
In an exemplary embodiment of the present disclosure, the detecting an interactive operation between the manipulation object and the interactive interface includes:
when the non-contact operation is line operation, detecting the click operation of the control object in the text selected area;
when the non-contact operation is a point operation, detecting the sliding operation of the control object within the first preset distance range in front of the text selected area.
In an exemplary embodiment of the present disclosure, the determining the text content in the selected area of the text according to the interaction operation includes:
and copying, cutting or pasting the text content in the selected text area according to the interactive operation.
In an exemplary embodiment of the present disclosure, the method further comprises:
the shape of the selected area of text is switched by a virtual button.
In an exemplary embodiment of the present disclosure, the method further comprises:
judging whether the selection of the text selected area is cancelled or not according to the distance between the control object and the interactive interface;
and when the distance between the control object and the interactive interface is greater than a second preset distance, canceling the selection of the text selected area, wherein the second preset distance is greater than or equal to the first preset distance.
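As a minimal sketch of this cancellation rule (the function name and the millimetre defaults are illustrative assumptions, not part of the disclosure, which only requires the second preset distance to be at least the first):

```python
def should_cancel_selection(distance_mm: float,
                            first_preset_mm: float = 10.0,
                            second_preset_mm: float = 15.0) -> bool:
    """Cancel the text selected area when the control object moves
    farther from the interactive interface than the second preset
    distance, which must be >= the first preset distance."""
    if second_preset_mm < first_preset_mm:
        raise ValueError("second preset distance must be >= first preset distance")
    return distance_mm > second_preset_mm
```

A caller would poll the sensed distance and drop the selection as soon as this returns true.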
According to an aspect of the present disclosure, a text processing apparatus is provided, which is applied to a touch terminal capable of presenting an interactive interface, and includes:
the first detection module is used for detecting non-contact operation of an operation object acting on the front of the interactive interface within a first preset distance range;
the area determining module is used for determining a text selected area according to the non-contact operation;
the second detection module is used for detecting the interactive operation between the control object and the interactive interface;
and the text processing module is used for determining the text content in the selected text area according to the interactive operation.
According to an aspect of the present disclosure, there is provided a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements a text processing method as recited in any one of the above.
According to an aspect of the present disclosure, there is provided an electronic device including:
a processor; and
a memory for storing executable instructions of the processor;
wherein the processor is configured to perform the text processing method of any one of the above via execution of the executable instructions.
The text processing method of the present disclosure can determine the text selected area according to different non-contact operations of the control object, and finally determine the text content in the text selected area according to the detected interactive operation between the control object and the interactive interface. With this method, on one hand, a text selected area containing the required text can be obtained through non-contact operation, improving the efficiency of text selection; on the other hand, the text content can be determined and processed through a simple interactive operation, shortening the text processing flow and improving the user experience.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The above and other features and advantages of the present disclosure will become more apparent by describing in detail exemplary embodiments thereof with reference to the attached drawings. It is to be understood that the drawings in the following description are merely exemplary of the disclosure, and that other drawings may be derived from those drawings by one of ordinary skill in the art without the exercise of inventive faculty. In the drawings:
FIG. 1 is a flow chart of a method of text processing in an exemplary embodiment of the present disclosure;
FIG. 2 is a schematic flow chart illustrating the determination of a selected area of text based on a linear trajectory in an exemplary embodiment of the present disclosure;
FIG. 3 is a schematic flow chart illustrating the determination of a selected region of text based on point operations in an exemplary embodiment of the present disclosure;
FIGS. 4A-4B are schematic illustrations of altering the shape of the selected area in an exemplary embodiment of the present disclosure;
FIG. 5 is an interface diagram of an interactive interface displaying text manipulation options in an exemplary embodiment of the present disclosure;
FIG. 6 is a schematic structural diagram of a text processing apparatus in an exemplary embodiment of the present disclosure;
FIG. 7 is a block diagram of an electronic device in an exemplary embodiment of the disclosure;
FIG. 8 is a schematic diagram illustrating a program product in an exemplary embodiment of the present disclosure.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. Example embodiments may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of example embodiments to those skilled in the art. The same reference numerals denote the same or similar parts in the drawings, and thus, a repetitive description thereof will be omitted.
Furthermore, the described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to give a thorough understanding of embodiments of the disclosure. One skilled in the relevant art will recognize, however, that the embodiments of the disclosure can be practiced without one or more of the specific details, or with other methods, components, materials, devices, steps, and so forth. In other instances, well-known structures, methods, devices, implementations, materials, or operations are not shown or described in detail to avoid obscuring aspects of the disclosure.
The block diagrams shown in the figures are functional entities only and do not necessarily correspond to physically separate entities. That is, these functional entities may be implemented in software, in one or more hardware modules or integrated circuits, or in different network and/or processor devices and/or microcontroller devices.
To solve the problems in the related art, the exemplary embodiment first discloses a text processing method, which is applied to a touch terminal capable of presenting an interactive interface, where the touch terminal may be, for example, a mobile phone, a tablet computer, a notebook computer, a navigation device, a PDA, and various electronic devices with screens. The word processing application can control the screen of the touch terminal to present texts or editable objects and the like through an application program interface of the touch terminal. The interactive interface may be the whole area of the screen or a partial area of the screen, which is not particularly limited in this exemplary embodiment.
In the touch terminal disclosed in this embodiment, the screen adopts a floating (hover) touch technology. When the control object is sensed clicking on the currently displayed interface and then, after leaving the screen, remaining within the region where it can still be sensed, or when the control object is sensed hovering over the currently displayed interface, a corresponding operation function of the touch terminal is triggered; then, once an operation action of the control object is sensed, the corresponding operation processing of that function can be executed according to the sensed action.
In an exemplary embodiment of the disclosure, fig. 1 shows a text processing flow diagram, which may be executed by a server, and as shown in fig. 1, the text processing method may include the following steps:
s110, detecting non-contact operation of a control object acting on the front of the interactive interface within a first preset distance range;
s120, determining a text selection area according to the non-contact operation;
s130, detecting interactive operation between the control object and the interactive interface;
and S140, determining the text content in the selected text area according to the interactive operation.
According to the text processing method in the present exemplary embodiment, the text selected area may be determined according to different non-contact operations; and after the text selected area is determined, determining the text content of the text selected area according to the interactive operation between the control object and the interactive interface so as to perform corresponding operation on the selected text. According to the text processing method, on one hand, a text selection area comprising the required text can be obtained in a non-contact operation mode, and the selection efficiency is improved; on the other hand, the text content can be determined through simple interactive operation, so that the text processing flow is reduced, and the user experience is improved.
Next, the text processing method in the present exemplary embodiment will be further explained with reference to fig. 1.
In step S110, a non-contact operation of a manipulation object acting within a first preset distance range in front of the interactive interface is detected.
In an exemplary embodiment of the present disclosure, the control object may be a user's finger, a finger cot that exchanges signals with the screen of the touch terminal, or the like. Since users usually operate content in the interactive interface by touching the screen directly with a finger, for ease of understanding the present disclosure is described below with a finger as the control object.
In an exemplary embodiment of the present disclosure, the distance between a finger and the screen of the touch terminal may be compared with a first preset distance, and the state of the control object determined from the comparison result. Further, a certain point on the finger may be set as a reference point, and the state determined by comparing the distance between the reference point and the screen with the first preset distance. Specifically, the reference point may be the lowest point of the finger, i.e. the point whose vertical distance to the screen is smallest; other points on the finger facing the screen may also serve as the reference point, but to detect the finger-to-screen distance accurately and thereby determine the finger's state precisely, the lowest point of the finger is preferably used. The touch terminal can only sense the finger within a certain range; beyond that range the finger cannot be sensed, so the first preset distance is the critical distance at which the touch terminal can still sense the finger's reference point. In an exemplary embodiment of the present disclosure, the state of the finger may be classified into three types: a pressed state, a hovering state, and an invalid state. When the distance between the lowest point of the finger and the screen is zero, the finger is in the pressed state; when the distance is greater than zero and not more than the first preset distance, the finger is in the hovering state; and when the distance is greater than the first preset distance, the finger is in the invalid state.
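The three-state classification above can be sketched as follows; the function name and the 10 mm default threshold are illustrative assumptions only:

```python
def finger_state(distance_mm: float, first_preset_mm: float = 10.0) -> str:
    """Classify the finger's state from the vertical distance between
    its lowest point (the reference point) and the screen."""
    if distance_mm < 0:
        raise ValueError("distance cannot be negative")
    if distance_mm == 0:
        return "pressed"     # reference point touches the screen
    if distance_mm <= first_preset_mm:
        return "hovering"    # within the sensable hover range
    return "invalid"         # beyond the first preset distance
```

The boundary at exactly the first preset distance is still the hovering state, matching the "not more than" wording above.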
In an exemplary embodiment of the disclosure, a non-contact operation of the finger acting within the first preset distance range in front of the interactive interface may be detected, and the text selected area then determined according to that operation. The non-contact operation may take two forms, a line operation and a point operation: a line operation occurs when the finger hovers within the first preset distance range and slides above the interactive interface; a point operation occurs when the finger hovers within the first preset distance range, the position of the operating point is determined, and the finger then moves towards the interactive interface. In the embodiment of the present disclosure, the first preset distance may be set according to actual needs, for example to 10 mm or 15 mm from the screen, which is not specifically limited by the present disclosure.
In this exemplary embodiment, a screen distance sensing device may be disposed in the touch terminal, and configured to detect a distance between a lowest point of the finger and a screen of the touch terminal, where the screen distance sensing device may include one or more distance sensors, and when the distance sensors include a plurality of distance sensors, the distance sensors may be dispersedly disposed below the screen according to a certain structure, for example, the distance sensors may be arranged in a matrix, a concentric circle, or the like.
In step S120, a text selection area is determined according to the non-contact operation.
In the present exemplary embodiment, after determining the non-contact operation of the user's finger, the text selection area may be determined according to the non-contact operation, and the text selection area may be determined in different manners for different types of non-contact operations.
In an exemplary embodiment of the present disclosure, when the non-contact operation is a line operation, the text selected area may be determined according to the linear trajectory of the line operation. Further, when the finger slides within the first preset distance range and forms a linear trajectory, a corresponding mapping trajectory can be displayed in the interactive interface, and the text selected area then determined from that mapping trajectory. Fig. 2 is a schematic flow chart of determining the text selected area according to the linear trajectory. As shown in fig. 2, in step S201, the starting point of the linear trajectory is determined according to the projected point of the control object's reference point on the interactive interface. When the finger hovers within the first preset distance range in front of the interactive interface, the touch terminal senses its presence and presents a projection point at the position in the interactive interface opposite the finger's reference point. The projection point may take various forms, such as different shapes (circle, triangle, cross) and different colors (red, yellow, blue), among others not described again here. To help the user judge whether the selected characters are the ones required, the projection point may be rendered with a see-through effect, so that even when it covers a character it does not hinder recognition of that character.
In step S202, a linear trajectory formed by the control object sliding from the starting point to the end point is acquired. The end point may be any point in the same horizontal plane as the starting point but distinct from it, or any point in a different horizontal plane within the first preset distance range; in either case the coordinate positions of the starting point and the end point differ. When the finger slides from the starting point to the end point, a linear trajectory is formed within the first preset distance range. In step S203, a mapping trajectory is determined from the linear trajectory: since a projection point is generated in the interactive interface opposite the finger's reference point while the finger hovers, a mapping trajectory corresponding to the linear trajectory is generated in the interactive interface as the finger slides. In step S204, the text selected area is determined according to the mapping trajectory. The mapping trajectory may be a straight line or a curve: when it is a straight line, a box area may be formed with the mapping trajectory as its diagonal, and that box area is the text selected area; when it is a curve, the text range enclosed by the curve may be used as the text selected area.
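When the mapping trajectory is a straight line, the box area with that trajectory as its diagonal is an axis-aligned rectangle; a minimal sketch, assuming pixel coordinates with the origin at the top-left:

```python
def box_from_diagonal(start: tuple, end: tuple) -> tuple:
    """Return (left, top, right, bottom) of the rectangular text
    selected area whose diagonal runs from `start` to `end`
    (screen coordinates in pixels)."""
    (x0, y0), (x1, y1) = start, end
    # Taking min/max per axis makes the result independent of the
    # direction in which the finger slid.
    return (min(x0, x1), min(y0, y1), max(x0, x1), max(y0, y1))
```

Any text glyphs whose bounding boxes fall inside this rectangle would then belong to the selection.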
Further, in embodiments of the present disclosure, the linear trajectory may be formed in two ways. In the first, when the control object is detected sliding from the starting point, the text corresponding to its sliding track in the interactive interface presents a marked state; when the control object is detected sliding to the end point, the linear trajectory is determined according to the text in the marked state. The marked state may be a change in the color of the text region along the sliding track so that it contrasts with the surrounding region; specifically, the region may appear as if highlighted by a marker pen, and other marked states are of course possible and are not described again here. As the finger slides from the starting point to the end point, part of the text on the interactive interface takes on the marked state, and the linear trajectory can be determined from that marked text. This approach is particularly advantageous for selecting a small range of text, since only the required text is marked by the sliding finger. In the second way, the trajectory formed by the control object hover-sliding from the starting point to the end point along a path around the periphery of the text is detected and used as the linear trajectory. For a large range of text, the user can hover-slide a finger from the starting point along the periphery of the text range to form the linear trajectory, and the text selected area is determined from it. For example, if the desired text range measures 60px × 60px, the user can hover-slide a frame of at least 60px × 60px to ensure the desired text lies within the selection frame; that frame is the linear trajectory.
In an exemplary embodiment of the present disclosure, when the non-contact operation is a point operation, the text selected area may be determined according to the point operation. Fig. 3 is a schematic flow chart of determining a text selected area according to a point operation. As shown in fig. 3, in step S301, the position of the operating point of the point operation is determined: the user hovers a finger over a target point within the first preset distance range in front of the interactive interface, and the coordinate position of that target point is the position of the operating point. In step S302, the mapping point of the point operation in the interactive interface is determined according to the position of the operating point; the mapping point may specifically be the vertical projection of the operating point onto the interactive interface. In step S303, the center point of the text selected area is determined according to the mapping point. In step S304, the size of the text selected area is determined according to the distance between the operating point and the interactive interface. In the embodiment of the disclosure, this distance is positively correlated with the size of the text selected area: the closer the operating point is to the interactive interface, the smaller the text selected area; the farther it is (within the first preset distance range), the larger the area. Of course, the distance and the size may instead be set to be negatively correlated, in which case the closer the operating point, the larger the area, and the farther the operating point (within the first preset distance range), the smaller the area.
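The correlation between hover distance and selection size can be sketched as a linear mapping. The pixel bounds and the linearity are assumptions here; the disclosure only specifies the direction of correlation:

```python
def selection_side_px(distance_mm: float,
                      first_preset_mm: float = 10.0,
                      min_px: int = 20,
                      max_px: int = 200,
                      positive: bool = True) -> float:
    """Map the operating point's distance from the interactive interface
    to the side length of the (square) text selected area."""
    d = max(0.0, min(distance_mm, first_preset_mm))  # clamp to the hover range
    t = d / first_preset_mm                          # 0.0 (touching) .. 1.0 (at the limit)
    if not positive:
        t = 1.0 - t                                  # negatively correlated variant
    return min_px + t * (max_px - min_px)
```

With `positive=True` the area grows as the finger rises, matching the default behaviour described above; `positive=False` gives the alternative negatively correlated setting.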
In an exemplary embodiment of the present disclosure, in addition to changing the size of the text selection area according to the distance between the control object and the interactive interface, the size may also be changed according to the degree of pressing between the control object and the interactive interface. Before pressing the screen of the touch terminal, the control object may be located within the first preset distance range in front of the interactive interface, for example, within a spatial range 15mm from the interactive interface. The control object is then slid until it is aligned with the center point of the target text selection area, and after alignment is confirmed, the control object is moved down to press the screen. When the screen of the touch terminal is pressed, a selection area is displayed in the interactive interface, and its size can be adjusted by changing the pressing force of the control object. For example, if the pressing force of the control object is set to be positively correlated with the size of the text selection area, the selection area enlarges as the pressing force increases and shrinks as it decreases, so the user can adjust the pressing force until the selection area is scaled to a suitable size; the selection area finally formed is the text selection area containing the required text.
Further, the text selection area may be generated according to a pressing operation of the control object on the interactive interface within a preset time period. When the duration for which the control object presses the interactive interface reaches the preset time period, the text selection area as it stands at the end of that period is taken as the final text selection area; within the preset time period, the user may change the size of the text selection area by varying the pressing force. The preset time period may be set according to actual needs, for example, to 30s or 1min, which is not specifically limited by the present disclosure.
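The press-force behaviour of the two paragraphs above can be modelled as a small state machine: the selection size tracks the force until the preset time period elapses, after which the selection is frozen. A sketch under assumed constants (force normalized to [0, 1], a 30s period); all names are illustrative:

```python
class PressSelection:
    """Size-adjustable selection driven by press force, frozen once a
    preset time period elapses (illustrative model, not the patent's code)."""

    def __init__(self, center, base_size_px=40.0, gain_px_per_unit=100.0,
                 preset_period_s=30.0):
        self.center = center
        self.base = base_size_px
        self.gain = gain_px_per_unit
        self.period = preset_period_s
        self.size = base_size_px
        self.finalized = False

    def on_press(self, force, elapsed_s):
        """force in [0, 1]; elapsed_s since the press began.
        Returns the current selection size in pixels."""
        if self.finalized:
            return self.size
        if elapsed_s >= self.period:
            # selection at the end of the period becomes the final one
            self.finalized = True
            return self.size
        # press force positively correlated with selection size
        self.size = self.base + self.gain * max(0.0, min(1.0, force))
        return self.size
```

A harder press while the period is running enlarges the selection; once `elapsed_s` reaches the period, further force changes are ignored.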
In an exemplary embodiment of the present disclosure, the selection area may be a radial selection area of various regular shapes, where "radial" means adjustable in size: for example, a rectangular, square, or circular radial selection area, though other shapes are possible, which is not specifically limited by the present disclosure. In the present disclosure, the shape of the radial selection area may be adapted to different application scenarios. For example, for scripts with regular character boundaries, such as Chinese characters or Korean, the text may be framed by a square or rectangular radial selection area; but for scripts with irregular boundaries, such as Arabic or Uyghur, a square or rectangular radial selection area may miss part of the characters or frame more characters than the user requires. In such cases the shape of the radial selection area needs to be changed to obtain the text the user requires. Figs. 4A-4B are schematic diagrams illustrating a change of the shape of the radial selection area. By default, a square radial selection area is generated when the screen is pressed; while the square radial selection area is displayed in the interactive interface, a button appears at its upper left corner. The shape of the radial selection area can be changed to the shape the user requires by tapping the button: after the user taps the button and selects a circle, the square radial selection area is changed to a circular radial selection area, as shown in Fig. 4A.
In the present disclosure, the button may be located anywhere around the periphery of the radial selection area, including but not limited to its upper left corner.
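The button-driven shape change, together with the per-script defaults suggested above, might be sketched as follows. The cycling order and the script-to-shape mapping are illustrative heuristics, not behaviour specified by the patent:

```python
SHAPES = ("square", "rectangle", "circle")  # supported radial shapes

def next_shape(current):
    """Cycle to the next shape each time the corner button is tapped."""
    return SHAPES[(SHAPES.index(current) + 1) % len(SHAPES)]

def default_shape_for_script(script):
    """Pick a default shape per script family (assumed heuristic):
    regular-boundary scripts get a square frame, others a circle."""
    regular_boundary = {"chinese", "korean"}
    return "square" if script.lower() in regular_boundary else "circle"
```

A real implementation might instead pop up a shape picker when the button is tapped; the cycle is just the simplest stand-in for "change the shape to the one the user requires".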
In step S130, an interaction operation between the manipulation object and the interactive interface is detected.
In an exemplary embodiment of the present disclosure, after the text selection area is determined, the control object may interact with the interactive interface to trigger a corresponding operation on the text in the text selection area. The interaction between the control object and the interactive interface differs for different non-contact operations. When the non-contact operation is a line operation, a click of the control object at any position within the text selection area may serve as the interaction, and the corresponding operation on the text is triggered by detecting that click. When the non-contact operation is a point operation, a sliding operation of the control object within the first preset distance range in front of the text selection area may serve as the interaction, and the corresponding operation on the text is triggered by detecting that sliding operation.
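The dispatch just described — a tap inside the area for selections made by line operation, a hover slide within the first preset distance range for selections made by point operation — can be sketched as a single predicate. The event shape and the 15mm default are assumptions for illustration:

```python
def triggers_text_operation(selection_mode, event, first_preset_mm=15.0):
    """Return True if the input event should trigger the text operation.

    selection_mode: "line" or "point" (how the selection was made).
    event: dict with "type" ("tap" or "hover_slide"), optionally
    "inside_selection" for taps and "distance_mm" for hover events.
    """
    if selection_mode == "line":
        # a click at any position inside the selected area triggers it
        return event["type"] == "tap" and event.get("inside_selection", False)
    if selection_mode == "point":
        # a hover slide within the first preset distance range triggers it
        return (event["type"] == "hover_slide"
                and 0.0 < event.get("distance_mm", -1.0) <= first_preset_mm)
    return False
```

The predicate is deliberately mode-dependent: the gesture that confirms a selection is the one the user is not already using to size it.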
In step S140, the text content in the selected text area is determined according to the interaction operation.
In an exemplary embodiment of the present disclosure, after the interaction between the control object and the interactive interface is detected, the text content in the text selection area may be determined and a corresponding shortcut operation performed on it. In the embodiment of the present disclosure, the shortcut operation may be triggered directly once the interaction is detected: for example, after the text selection area is determined by a line operation, clicking any position in the text selection area triggers copying of the text in it. Alternatively, after the interaction is detected, corresponding text operation options may be generated near the text selection area, and the operation selected by the user is then performed on the text. For example, after the text selection area is determined by a point operation, the user may lift the control object and hover-slide it within the first preset distance range to trigger the interactive interface to display the text operation options; optionally, the hover slide may be upward or downward. Of course, the interactive interface may also be triggered to display the text operation options in other ways, which is not specifically limited by the present disclosure.
In an exemplary embodiment of the present disclosure, the text operation options may be commonly used text editing options, such as copy, cut, paste, select all, bold, and underline, and may also be other operation options, which is not specifically limited by the present disclosure. Fig. 5 is a schematic diagram of text operation options displayed on the interactive interface. As shown in Fig. 5, when an interaction between the user's control object and the interactive interface is detected, a list of text operation options is displayed at the upper right corner of the text selection area A. The list includes several options, such as copy, cut, and bold, as well as an extension item "…"; further options, such as delete or change font color, can be obtained by clicking "…".
In an exemplary embodiment of the present disclosure, after the text selection area is determined according to the non-contact operation, if the text in it is not the text the user requires, the user may raise the control object so that the distance between the control object and the screen is greater than a second preset distance, thereby abandoning the operation. The second preset distance may be any value greater than or equal to the first preset distance. After sensing the distance between the control object and the screen, the distance sensor in the touch terminal compares it with the first preset distance and the second preset distance respectively, and performs the corresponding operation according to the comparison result. Furthermore, when the control object is pressing the screen of the touch terminal, if the text in the text selection area is not the text the user requires, the user may slide the control object to change the position of the press center, so that the system cancels the text selection area in response to the user's action.
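The distance-sensor comparison against the two preset distances reduces to three bands. A sketch with illustrative threshold values (the patent only requires the second preset distance to be greater than or equal to the first):

```python
def classify_hover(distance_mm, first_preset_mm=15.0, second_preset_mm=25.0):
    """Classify a sensed hover distance against both preset distances.

    second_preset_mm must be >= first_preset_mm, as the description
    requires. Returns one of "select", "hold", or "cancel".
    """
    if distance_mm <= first_preset_mm:
        return "select"   # within sensing range: selection stays active
    if distance_mm > second_preset_mm:
        return "cancel"   # lifted past the second preset: abandon it
    return "hold"         # between the two thresholds: no change yet
```

The "hold" band between the two thresholds gives the user room to lift slightly without accidentally discarding the selection.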
It should be noted that although the various steps of the methods of the present disclosure are depicted in the drawings in a particular order, this does not require or imply that these steps must be performed in this particular order, or that all of the depicted steps must be performed, to achieve desirable results. Additionally or alternatively, certain steps may be omitted, multiple steps combined into one step execution, and/or one step broken down into multiple step executions, etc.
In an exemplary embodiment of the present disclosure, there is also provided a text processing apparatus applied to a touch terminal capable of presenting an interactive interface, as shown in fig. 6, the text processing apparatus 600 may include: a first detection module 601, a region determination module 602, a second detection module 603, and a text processing module 604.
Specifically, the first detecting module 601 is configured to detect a non-contact operation of a control object acting on the interactive interface within a first preset distance range; an area determination module 602, configured to determine a text selected area according to the non-contact operation; a second detecting module 603, configured to detect an interaction operation between the control object and the interactive interface; and a text processing module 604, configured to determine text content in the selected text area according to the interaction operation.
The specific details of each text processing device module are already described in detail in the corresponding text processing method, and therefore are not described herein again.
It should be noted that although several modules or units of the apparatus for performing actions are mentioned in the above detailed description, such a division is not mandatory. Indeed, according to embodiments of the present disclosure, the features and functionality of two or more modules or units described above may be embodied in one module or unit. Conversely, the features and functions of one module or unit described above may be further divided so as to be embodied by a plurality of modules or units.
In an exemplary embodiment of the present disclosure, an electronic device capable of implementing the above method is also provided.
As will be appreciated by one skilled in the art, various aspects of the present disclosure may be embodied as a system, method, or program product. Accordingly, various aspects of the present disclosure may be embodied in the form of: an entirely hardware embodiment, an entirely software embodiment (including firmware, microcode, etc.), or an embodiment combining hardware and software aspects, which may all generally be referred to herein as a "circuit," "module," or "system."
An electronic device 700 according to this embodiment of the disclosure is described below with reference to fig. 7. The electronic device 700 shown in fig. 7 is only an example and should not bring any limitation to the functions and the scope of use of the embodiments of the present disclosure.
As shown in fig. 7, electronic device 700 is embodied in the form of a general purpose computing device. The components of the electronic device 700 may include, but are not limited to: the at least one processing unit 710, the at least one memory unit 720, a bus 730 connecting different system components (including the memory unit 720 and the processing unit 710), and a display unit 740.
The storage unit stores program code executable by the processing unit 710, causing the processing unit 710 to perform the steps according to various exemplary embodiments of the present disclosure described in the "Exemplary Methods" section above of this specification. For example, the processing unit 710 may perform step S110 shown in Fig. 1, detecting a non-contact operation of a control object within a first preset distance range in front of the interactive interface; step S120, determining a text selection area according to the non-contact operation; step S130, detecting an interaction between the control object and the interactive interface; and step S140, determining the text content in the text selection area according to the interaction.
The storage unit 720 may include readable media in the form of volatile memory units, such as a random access memory (RAM) unit 7201 and/or a cache memory unit 7202, and may further include a read-only memory (ROM) unit 7203.
The storage unit 720 may also include a program/utility 7204 having a set (at least one) of program modules 7205, such program modules 7205 including, but not limited to: an operating system, one or more application programs, other program modules, and program data, each of which, or some combination thereof, may comprise an implementation of a network environment.
Bus 730 may represent one or more of several types of bus structures, including a memory unit bus or memory unit controller, a peripheral bus, an accelerated graphics port, a processing unit, or a local bus using any of a variety of bus architectures.
The electronic device 700 may also communicate with one or more external devices 1500 (e.g., keyboard, pointing device, bluetooth device, etc.), with one or more devices that enable a user to interact with the electronic device 700, and/or with any devices (e.g., router, modem, etc.) that enable the electronic device 700 to communicate with one or more other computing devices. Such communication may occur via an input/output (I/O) interface 750. Also, the electronic device 700 may communicate with one or more networks (e.g., a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network such as the internet) via the network adapter 760. As shown, the network adapter 760 communicates with the other modules of the electronic device 700 via the bus 730. It should be appreciated that although not shown in the figures, other hardware and/or software modules may be used in conjunction with the electronic device 700, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage systems, among others.
Through the above description of the embodiments, those skilled in the art will readily understand that the exemplary embodiments described herein may be implemented by software, or by software in combination with necessary hardware. Therefore, the technical solution according to the embodiments of the present disclosure may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (which may be a CD-ROM, a usb disk, a removable hard disk, etc.) or on a network, and includes several instructions to enable a computing device (which may be a personal computer, a server, a terminal device, or a network device, etc.) to execute the method according to the embodiments of the present disclosure.
In an exemplary embodiment of the present disclosure, there is also provided a computer-readable storage medium having stored thereon a program product capable of implementing the above-described method of the present specification. In some possible embodiments, various aspects of the disclosure may also be implemented in the form of a program product comprising program code for causing a terminal device to perform the steps according to various exemplary embodiments of the disclosure described in the "exemplary methods" section above of this specification, when the program product is run on the terminal device.
Referring to fig. 8, a program product 800 for implementing the above method according to an embodiment of the present disclosure is described, which may employ a portable compact disc read only memory (CD-ROM) and include program code, and may be run on a terminal device, such as a personal computer. However, the program product of the present disclosure is not limited thereto, and in this document, a readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
A computer readable signal medium may include a propagated data signal with readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A readable signal medium may also be any readable medium that is not a readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Program code for carrying out operations of the present disclosure may be written in any combination of one or more programming languages, including object-oriented programming languages such as Java or C++, as well as conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server. In the case of a remote computing device, the remote computing device may be connected to the user's computing device through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computing device (for example, through the internet using an internet service provider).
Furthermore, the above-described figures are merely schematic illustrations of processes included in methods according to exemplary embodiments of the present disclosure, and are not intended to be limiting. It will be readily understood that the processes shown in the above figures are not intended to indicate or limit the chronological order of the processes. In addition, it is also readily understood that these processes may be performed synchronously or asynchronously, e.g., in multiple modules.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This disclosure is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is to be limited only by the terms of the appended claims.

Claims (16)

1. A text processing method is applied to a touch terminal capable of presenting an interactive interface, and is characterized by comprising the following steps:
detecting non-contact operation of an operation object acting on the front of the interactive interface within a first preset distance range;
when the non-contact operation is a line operation, determining a starting point of a linear track according to a projection point of the reference point of the control object on the interactive interface;
acquiring a linear track formed by the sliding of the control object from the starting point to the end point;
determining a mapping track according to the projection of the linear track on the interactive interface;
determining a text selected area according to the mapping track;
when the non-contact operation is a point operation, determining the text selected area according to the distance from the position of the point operation to the interactive interface; wherein the text selected area is a radial selection area having any of various regular shapes; the distance is positively correlated with the size of the text selected area: the closer the operation point is to the interactive interface, the smaller the text selected area; the farther the operation point is from the interactive interface within the first preset distance range, the larger the text selected area;
detecting an interactive operation between the control object and the interactive interface;
and determining the text content in the selected area of the text according to the interactive operation, and carrying out shortcut operation on the text content.
2. The text processing method of claim 1, wherein the mapping track is a straight line;
the determining the selected region of text according to the mapping track comprises:
and forming a box area by taking the mapping track as a diagonal line, and taking the box area as the text selected area.
3. The text processing method according to claim 1, wherein the obtaining of the linear trajectory formed by the sliding of the manipulation object from the starting point to the ending point comprises:
when the control object is detected sliding from the starting point, presenting a marked state on the text corresponding to the sliding track of the control object in the interactive interface;
when the control object is detected sliding to the end point, determining the linear track according to the text in the marked state.
4. The text processing method according to claim 1, wherein the obtaining of the linear trajectory formed by the sliding of the manipulation object from the starting point to the ending point comprises:
and detecting a track formed by the suspended sliding of the control object from the starting point to the end point along the peripheral path of the text, and taking the track as the linear track.
5. The method of claim 1, wherein determining the selected region of text according to the location of the operating point of the point operation comprises:
determining a mapping point of the point operation on the interactive interface according to the position of the operating point of the point operation;
and determining the selected text area according to the mapping points.
6. The method of claim 5, wherein determining the selected region of text from the mapped points comprises:
and determining the central point of the selected area of the text according to the mapping point.
7. The method of claim 6, wherein said determining the selected region of text from the mapped points comprises:
and determining the size of the selected text area according to the pressing force of the control object on the interactive interface by taking the central point as a datum point.
8. The text processing method of claim 7, wherein the method further comprises:
and generating the text selected area according to the pressing operation of the control object on the interactive interface within a preset time period.
9. The method according to claim 7, wherein the determining the size of the selected region of the text according to the degree of pressing force of the control object on the interactive interface with the central point as a reference point comprises:
the pressing degree of the control object is positively correlated with the size of the selected text area.
10. The text processing method according to any one of claims 1 to 9, wherein the detecting an interactive operation between the manipulation object and the interactive interface comprises:
when the non-contact operation is line operation, detecting the click operation of the control object in the text selected area;
when the non-contact operation is a point operation, the sliding operation of the control object in the first preset distance range in front of the text selected area is detected.
11. The method of claim 1, wherein determining text content in the selected region of text based on the interaction comprises:
and copying, cutting or pasting the text content in the selected text area according to the interactive operation.
12. The text processing method of claim 7, wherein the method further comprises:
the shape of the selected area of text is switched by a virtual button.
13. The text processing method of claim 1, wherein the method further comprises:
judging whether the selection of the text selected area is cancelled or not according to the distance between the control object and the interactive interface;
and when the distance between the control object and the interactive interface is greater than a second preset distance, canceling the selection of the text selected area, wherein the second preset distance is greater than or equal to the first preset distance.
14. A text processing device is applied to a touch terminal capable of presenting an interactive interface, and is characterized by comprising:
the first detection module is used for detecting non-contact operation of an operation object acting on the front of the interactive interface within a first preset distance range;
the area determining module is used for determining a starting point of a linear track according to a projection point of the reference point of the control object on the interactive interface when the non-contact operation is a line operation; acquiring a linear track formed by the sliding of the control object from the starting point to the end point; determining a mapping track according to the projection of the linear track on the interactive interface; determining a text selected area according to the mapping track; and, when the non-contact operation is a point operation, determining the text selected area according to the distance from the position of the point operation to the interactive interface; wherein the text selected area is a radial selection area having any of various regular shapes; and the distance is positively correlated with the size of the text selected area;
the second detection module is used for detecting the interactive operation between the control object and the interactive interface;
and the text processing module is used for determining the text content in the selected region of the text according to the interactive operation and carrying out shortcut operation on the text content.
15. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out a text processing method according to any one of claims 1 to 13.
16. An electronic device, comprising:
a processor; and
a memory for storing executable instructions of the processor;
wherein the processor is configured to perform the text processing method of any one of claims 1-13 via execution of the executable instructions.
CN201811407692.6A 2018-11-23 2018-11-23 Text processing method and device, computer storage medium and electronic equipment Active CN110427139B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811407692.6A CN110427139B (en) 2018-11-23 2018-11-23 Text processing method and device, computer storage medium and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811407692.6A CN110427139B (en) 2018-11-23 2018-11-23 Text processing method and device, computer storage medium and electronic equipment

Publications (2)

Publication Number Publication Date
CN110427139A CN110427139A (en) 2019-11-08
CN110427139B true CN110427139B (en) 2022-03-04

Family

ID=68407346

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811407692.6A Active CN110427139B (en) 2018-11-23 2018-11-23 Text processing method and device, computer storage medium and electronic equipment

Country Status (1)

Country Link
CN (1) CN110427139B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111338540B (en) * 2020-02-11 2022-02-18 Oppo广东移动通信有限公司 Picture text processing method and device, electronic equipment and storage medium
CN112190922A (en) * 2020-10-22 2021-01-08 网易(杭州)网络有限公司 Virtual article processing method and device, storage medium and electronic device
CN115909342B (en) * 2023-01-03 2023-05-23 湖北瑞云智联科技有限公司 Image mark recognition system and method based on contact movement track

Citations (7)

Publication number Priority date Publication date Assignee Title
CN104007897A (en) * 2014-06-19 2014-08-27 中科创达软件股份有限公司 Method and system for displaying toolbar for text editing
CN104375756A (en) * 2013-08-16 2015-02-25 北京三星通信技术研究有限公司 Touch operation method and touch operation device
CN105190520A (en) * 2013-03-13 2015-12-23 微软技术许可有限责任公司 Hover gestures for touch-enabled devices
CN105573492A (en) * 2015-11-25 2016-05-11 小米科技有限责任公司 Interactive type screen control method and apparatus
CN205427823U (en) * 2015-03-19 2016-08-03 苹果公司 Electronic equipment and device that is used for carrying out text select an action
CN106527729A (en) * 2016-11-17 2017-03-22 科大讯飞股份有限公司 Non-contact type input method and device
CN108228041A (en) * 2016-12-13 2018-06-29 中兴通讯股份有限公司 A kind of control amplification shows content and method, device and mobile terminal

Family Cites Families (7)

Publication number Priority date Publication date Assignee Title
KR20110058623A (en) * 2009-11-24 2011-06-01 Samsung Electronics Co Ltd Method of providing a GUI for guiding the initial position of a user operation, and digital device using the same
US20120268485A1 (en) * 2011-04-22 2012-10-25 Panasonic Corporation Visualization of Query Results in Relation to a Map
CN103677568A (en) * 2013-12-10 2014-03-26 Huawei Technologies Co Ltd Method and device for magnifying a clicked object based on hover touch
CN104932755B (en) * 2014-03-18 2017-11-10 KYE Systems Corp Input system and its operating method
US20170153798A1 (en) * 2015-11-30 2017-06-01 International Business Machines Corporation Changing context and behavior of a UI component
CN106909289B (en) * 2017-03-31 2019-12-03 Vivo Mobile Communication Co Ltd Operation method for application controls and mobile terminal
CN108416018A (en) * 2018-03-06 2018-08-17 Beijing Baidu Netcom Science and Technology Co Ltd Screenshot search method, device and intelligent terminal

Also Published As

Publication number Publication date
CN110427139A (en) 2019-11-08

Similar Documents

Publication Publication Date Title
US11487426B2 (en) Apparatus and method for cursor control and text selection and editing based on gesture-based touch inputs received in a virtual keyboard display area
US9400567B2 (en) Explicit touch selection and cursor placement
KR102214437B1 (en) Method for copying contents in a computing device, method for pasting contents in a computing device, and the computing device
CN110058782B (en) Touch operation method and system based on interactive electronic whiteboard
US10416777B2 (en) Device manipulation using hover
US9292161B2 (en) Pointer tool with touch-enabled precise placement
US9401099B2 (en) Dedicated on-screen closed caption display
US20130215018A1 (en) Touch position locating method, text selecting method, device, and electronic equipment
CN111475097B (en) Handwriting selection method and device, computer equipment and storage medium
CN110427139B (en) Text processing method and device, computer storage medium and electronic equipment
CN102902480A (en) Control area for a touch screen
KR20140038568A (en) Multi-touch uses, gestures, and implementation
KR20110081040A (en) Method and apparatus for operating content in a portable terminal having transparent display panel
AU2013223015A1 (en) Method and apparatus for moving contents in terminal
CN108553894B (en) Display control method and device, electronic equipment and storage medium
WO2014147716A1 (en) Electronic device and handwritten document processing method
US20140123036A1 (en) Touch screen display process
WO2014121626A1 (en) Displaying method, device and storage medium of mobile terminal shortcuts
KR20160033547A (en) Apparatus and method for styling a content
US10146424B2 (en) Display of objects on a touch screen and their selection
CN108492349A (en) Processing method, device, equipment and the storage medium of stroke writing
KR101447886B1 (en) Method and apparatus for selecting contents through a touch-screen display
JP6100013B2 (en) Electronic device and handwritten document processing method
KR102078748B1 (en) Method for inputting for character in flexible display an electronic device thereof
KR20130080218A (en) Method for moving the cursor of text editor using motion sensor, and computer-readable recording medium with moving program of the cursor of text editor using motion sensor

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant