WO2020224158A1 - Editing method, computer-readable storage medium and terminal - Google Patents

Editing method, computer-readable storage medium and terminal

Info

Publication number
WO2020224158A1
Authority
WO
WIPO (PCT)
Prior art keywords
edited
area
touch
editing method
touch track
Prior art date
Application number
PCT/CN2019/106310
Other languages
English (en)
French (fr)
Inventor
洪帆
Original Assignee
深圳传音控股股份有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳传音控股股份有限公司
Publication of WO2020224158A1 publication Critical patent/WO2020224158A1/zh

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F40/00Handling natural language data
    • G06F40/10Text processing
    • G06F40/166Editing, e.g. inserting or deleting

Definitions

  • The present invention relates to the technical field of information processing, and in particular to an editing method, a computer-readable storage medium and a terminal.
  • With the rapid development of terminal technology, more and more people use terminals such as mobile phones or tablet computers to read news and e-books or to browse websites.
  • The existing editing method obtains the object to be edited by controlling the positions of cursors at the two ends, which is time-consuming, laborious and inflexible.
  • The present invention provides an editing method that acquires the area to be edited in the object to be edited according to the touch track, with simple operation and high flexibility.
  • The editing method provided by the present invention includes: receiving a touch operation of a user; acquiring a touch track according to the touch operation; acquiring an area to be edited in the object to be edited according to the touch track; and editing the area to be edited.
  • In one embodiment, the object to be edited is a text box, a text file, or an edit box.
  • In one embodiment, the step of acquiring the area to be edited in the object to be edited according to the touch track includes: when the touch track is closed, determining the area to be edited according to the objects surrounded and/or covered by the touch track.
  • In one embodiment, the step of acquiring the area to be edited in the object to be edited according to the touch track includes: when the touch track is not closed, determining the area to be edited according to the objects covered by the touch track, or determining the area to be edited according to the straight line formed by connecting the two ends of the touch track.
  • In one embodiment, determining the area to be edited according to the straight line formed by connecting the two ends of the touch track includes: generating a closed figure associated with the straight line formed by connecting the two ends of the touch track.
  • The figure associated with the straight line includes at least one of a rectangular frame with the straight line as its diagonal, a circle with the straight line as its diameter, and a rectangular frame in which the straight line connects the midpoints of its long sides; the area to be edited is determined according to the objects surrounded and/or covered by the closed figure associated with the straight line.
  • In one embodiment, before the step of editing the area to be edited, the method includes: marking the area to be edited in the object to be edited.
  • In one embodiment, before the step of editing the area to be edited, the method includes: moving the area to be edited according to a drag instruction; and/or expanding the area to be edited according to an add instruction; and/or shrinking the area to be edited according to a delete instruction.
  • In one embodiment, the step of acquiring the touch track according to the touch operation includes: when a trigger instruction to start acquiring is received, starting to acquire and store the coordinates of touch points according to the touch operation; when a trigger instruction to stop acquiring is received, stopping acquiring the coordinates of the touch points; and acquiring the touch track according to the stored coordinates of the touch points.
  • The present invention also provides a computer-readable storage medium in which a computer program is stored; when the computer program is executed by a processor, the steps of the above editing method are implemented.
  • The present invention also provides a terminal, which includes the above computer-readable storage medium.
  • Fig. 1 shows a schematic flowchart of an editing method according to a first embodiment of the present invention
  • FIG. 2 shows a schematic flowchart of the editing method of the second embodiment of the present invention
  • FIG. 3 shows a schematic diagram of the effect of an area to be edited obtained by applying the editing method shown in FIG. 2;
  • FIG. 4 shows a schematic flowchart of the editing method of the third embodiment of the present invention.
  • FIG. 5 shows a schematic diagram of the effect of an area to be edited obtained by applying the editing method shown in FIG. 4;
  • FIG. 6 shows a schematic structural diagram of a computer-readable storage medium according to a fourth embodiment of the present invention.
  • FIG. 7 shows a schematic structural diagram of a terminal according to the fifth embodiment of the present invention.
  • The terms "or" and "and/or" used herein are to be interpreted as inclusive, meaning any one or any combination; therefore, "A, B or C" or "A, B and/or C" means "any of the following: A; B; C; A and B; A and C; B and C; A, B and C".
  • An exception to this definition will only occur when the combination of elements, functions, steps or operations is inherently mutually exclusive in some way.
  • Fig. 1 shows a flowchart of the editing method of the first embodiment of the present invention.
  • The editing method can be applied to, but is not limited to, a terminal or a server.
  • The terminal can be an electronic device with an editing function, such as a mobile phone, a computer or a television.
  • the method includes the following steps:
  • Step S11 receiving a user's touch operation
  • the user's touch operation can be, but is not limited to, a touch operation for the displayed object to be edited.
  • the object to be edited may be an editable object such as a text box, or a text file, or an edit box.
  • the content of the object to be edited may specifically include characters such as numbers, letters, text, etc., or other editable objects such as images, or any combination thereof.
  • Step S12 Obtain a touch track according to the touch operation
  • In one embodiment, step S12, acquiring the touch track according to the touch operation, includes: when a trigger instruction to start acquiring is received, starting to acquire and store the coordinates of touch points according to the touch operation.
  • Specifically, it is determined that the trigger instruction to start acquiring is received when multiple long-press events are detected simultaneously; in other embodiments, it may also be determined that the trigger instruction is received when a single long-press event or multiple consecutive short-press events are detected, or when a voice instruction containing keywords such as "start acquiring" is received.
  • In one embodiment, after starting to acquire and store the coordinates of the touch points according to the touch information when the trigger instruction to start acquiring is received, the method may include, but is not limited to: marking the stored touch points in real time according to their coordinates, and displaying in real time the object to be edited with the touch points marked.
  • In one embodiment, step S12, acquiring the touch track according to the touch operation, may also include, but is not limited to: displaying in real time the object to be edited with the touch track marked. Specifically, for example, a solid line may be used to mark the touch track in real time.
  • Step S13 Obtain the area to be edited in the object to be edited according to the touch track;
  • In one embodiment, before step S13, acquiring the area to be edited in the object to be edited according to the touch track, the method includes: determining that a trigger instruction to stop acquiring is received when no touch information is detected within a preset time period; and, when the trigger instruction to stop acquiring is received, stopping acquiring the coordinates of the touch points and acquiring the area to be edited according to the touch track.
  • the area to be edited in the object to be edited can also be obtained in real time according to the touch track.
  • step S13: obtaining the area to be edited in the object to be edited according to the touch track may include, but is not limited to, marking the area to be edited. Specifically, for example, the area to be edited can be marked in real time using a dashed frame.
  • Step S14 Edit the area to be edited.
  • In one embodiment, step S14, editing the area to be edited, may include, but is not limited to: editing the characters or pictures in the area to be edited, for example re-typesetting them, deleting a character or picture in the area to be edited, or inserting a character or picture into the area to be edited; it may also include copying or pasting the area to be edited as a whole.
  • The editing method of this embodiment acquires the area to be edited in the object to be edited according to the touch track; the operation is simple and the flexibility is high.
  • The editing method of this embodiment can display in real time the object to be edited with the touch points marked, which helps the user select the target objects more accurately.
  • Fig. 2 shows a schematic flowchart of the editing method of the second embodiment of the present invention. As shown in Fig. 2, the editing method of this embodiment includes the following steps:
  • Step S21 receiving a user's touch operation
  • Step S22 Obtain a touch track according to the touch operation
  • Step S23 when the touch trajectory is closed, determine the area to be edited according to the objects surrounded and/or covered by the touch trajectory;
  • Specifically, step S23, determining the area to be edited according to the objects surrounded and/or covered by the touch track when the touch track is closed, includes: when the touch track covers a second object, expanding the touch track outward to the state in which it surrounds the second object while enclosing the fewest objects; and deforming the expanded touch track according to the layout rule of the object to be edited.
  • the second object may be a character, for example.
  • The deformation process can also be performed before the outward expansion process, or simultaneously with it.
  • FIG. 3 shows an effect diagram of an area to be edited obtained by applying the editing method shown in FIG. 2.
  • As shown in FIG. 3, because the objects to be edited are laid out in a matrix, the boundary of the area to be edited is also composed of horizontal and vertical lines; however, the present invention is not limited to this, and the boundary of the area to be edited may also be arc-shaped, wave-shaped and so on.
  • In other embodiments, step S23, determining the area to be edited according to the objects surrounded and/or covered by the touch track when the touch track is closed, includes: acquiring a first object surrounded by the touch track and a second object covered by the touch track; and determining the area of the object to be edited that contains the first object and the second object as the area to be edited. Alternatively, only the area of the object to be edited that contains the first object may be determined as the area to be edited, or the second object covered by the touch track may be extracted to generate the area to be edited.
  • the typesetting manner of the second object in the area to be edited and the typesetting manner of the second object in the object to be edited may be the same or different.
  • the first object and/or the second object may be characters, pictures or other types of objects.
  • Step S24 mark the area to be edited in the object to be edited
  • a dotted line can be used to mark the boundary of the area to be edited, but the present invention is not limited to this.
  • Step S25 Edit the area to be edited.
  • Before step S25, editing the area to be edited, the method may also include, but is not limited to: moving the area to be edited according to a drag instruction; and/or expanding the area to be edited according to an add instruction; and/or shrinking the area to be edited according to a delete instruction. Specifically, for example, when an add instruction is received, two areas to be edited are merged to expand the area to be edited; when a delete instruction is received, a sub-area in the area to be edited is deleted to shrink the area to be edited.
  • the editing method of this embodiment can obtain the area to be edited in the object to be edited according to the closed touch track, without sequentially selecting target objects (for example, characters), and has simple operation and high flexibility.
  • FIG. 4 shows a schematic flowchart of the editing method of the third embodiment of the present invention. As shown in Fig. 4, the editing method of this embodiment includes the following steps:
  • Step S41 Display the object to be edited
  • Step S42 receiving a user's touch operation
  • Step S43 Obtain a touch track according to the touch operation
  • In one embodiment, after step S43, acquiring the touch track according to the touch operation, the method may also include, but is not limited to: marking the touch track in real time (please refer to FIG. 5, in which a solid line is used to mark the touch track, but the present invention is not limited to this).
  • Step S44 when the touch track is not closed, the area to be edited is determined according to the straight line formed by the two ends of the touch track;
  • Specifically, step S44, determining the area to be edited according to the straight line formed by connecting the two ends of the touch track when the touch track is not closed, may include, but is not limited to: generating a figure associated with the straight line according to the straight line formed by connecting the two ends of the touch track; and determining the area to be edited according to the objects surrounded and/or covered by the figure associated with the straight line.
  • The figure associated with the straight line includes at least one of a rectangular frame with the straight line as its diagonal, a circle with the straight line as its diameter, a rectangular frame in which the straight line connects the midpoints of its long sides, and a rectangular frame with the straight line as its long side.
  • The figure associated with the straight line can also be another closed figure set by the system, such as a rhombus, or a non-closed figure, such as a semicircle with the straight line as its diameter.
  • In other embodiments, when the distance between the two ends of the touch track is less than a threshold, the touch track may be treated as a closed figure to determine the area to be edited.
  • When the touch track is treated as a closed figure, the method of determining the area to be edited can refer to the related description in the second embodiment, which will not be repeated here.
  • FIG. 5 shows an effect diagram of the area to be edited obtained by applying the editing method shown in FIG. 4.
  • Determining the area to be edited according to the objects surrounded and/or covered by the figure associated with the straight line may include, but is not limited to: displaying the object to be edited marked with the figure associated with the straight line (a rectangle is shown in FIG. 5), the content of which includes multiple characters; determining the area to be edited as the area containing the characters surrounded by the figure associated with the straight line and the characters covered by it; and marking the area to be edited in the object to be edited (a dashed line is used in FIG. 5).
  • the area to be edited may also be determined according to the objects covered by the touch track, such as characters. That is, the objects covered by the touch track, such as characters, are extracted to generate the area to be edited.
  • Step S45 Edit the area to be edited.
  • The editing method of this embodiment can acquire the area to be edited in the object to be edited according to a non-closed touch track, without selecting target objects (such as characters) one by one, and can acquire the corresponding characters within a short touch operation time; the operation is simple and the flexibility is high.
  • Fig. 6 shows a schematic structural diagram of a computer-readable storage medium according to a fourth embodiment of the present invention.
  • a computer program 60 is stored in the computer readable storage medium as shown in FIG. 6, and when the computer program is executed by a processor, the steps of the above-mentioned editing method are realized.
  • the aforementioned computer-readable storage medium is, for example, a non-volatile memory such as an optical disk, a hard disk, or a flash memory.
  • FIG. 7 shows a schematic structural diagram of a terminal according to the fifth embodiment of the present invention.
  • The terminal 10 of this embodiment includes a camera module 102, a memory 104, a storage controller 106, one or more processors 108 (only one is shown in the figure), a screen 110, and a peripheral interface 112. These components communicate with one another through one or more communication buses/signal lines.
  • FIG. 7 is only illustrative, and the terminal 10 may also include more or fewer components than those shown in FIG. 7, or have a configuration different from that shown in FIG. 7.
  • the components shown in FIG. 7 can be implemented by hardware, software, or a combination thereof.
  • the camera module 102 is used to take photos or videos.
  • the captured photos or videos can be stored in the memory 104.
  • The memory 104 can be used to store software programs and modules, such as the program instructions/modules corresponding to the file editing method and apparatus in the embodiments of the present invention.
  • The processor 108 executes various functional applications and data processing by running the software programs and modules stored in the memory 104, thereby implementing the above-mentioned editing method.
  • the memory 104 may include a high-speed random access memory, and may also include a non-volatile memory, such as one or more magnetic storage devices, flash memory, or other non-volatile solid-state memory.
  • the memory 104 may further include a memory remotely provided with respect to the processor 108, and these remote memories may be connected to the terminal 10 through a network.
  • Examples of the aforementioned networks include but are not limited to the Internet, corporate intranets, local area networks, mobile communication networks, and combinations thereof.
  • The processor 108 and other possible components can access the memory 104 under the control of the storage controller 106.
  • The peripheral interface 112 couples various input/output devices to the processor 108 and the memory 104.
  • The processor 108, the storage controller 106, and the peripheral interface 112 may be implemented in a single chip; in some other instances, they may each be implemented by an independent chip.
  • the screen 110 provides an output interface between the terminal 10 and the user. Specifically, the screen 110 displays video output to the user, and the content of the video output may include characters, graphics, videos, and any combination thereof.
  • When the trigger instruction to start acquiring is received, the editing method, computer-readable storage medium and terminal of the present invention start to acquire and store the coordinates of touch points according to the touch information, and after acquiring the touch track according to the stored coordinates of the touch points, acquire the area to be edited in the object to be edited according to the touch track; the operation is simple and the flexibility is high.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • Human Computer Interaction (AREA)
  • Artificial Intelligence (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • Computational Linguistics (AREA)
  • General Health & Medical Sciences (AREA)
  • User Interface Of Digital Computer (AREA)
  • Processing Or Creating Images (AREA)

Abstract

An editing method, a computer-readable storage medium and a terminal. The editing method comprises: receiving a touch operation of a user (S11); acquiring a touch track according to the touch operation (S12); acquiring an area to be edited in an object to be edited according to the touch track (S13); and editing the area to be edited (S14). The editing method, the computer-readable storage medium and the terminal acquire the area to be edited in the object to be edited according to the touch track; the operation is simple and the flexibility is high.

Description

Editing method, computer-readable storage medium and terminal
This application claims priority to Chinese Patent Application No. 201910369076.4, filed on May 5, 2019 by 深圳传音控股股份有限公司 and entitled "Editing method, computer-readable storage medium and terminal", the entire contents of which are incorporated into this application by reference.
Technical Field
The present invention relates to the technical field of information processing, and in particular to an editing method, a computer-readable storage medium and a terminal.
Background Art
With the rapid development of terminal technology, more and more people use terminals such as mobile phones or tablet computers to read news and e-books or to browse websites.
When reading news or e-books, a user may need to edit the displayed characters and the like.
Technical Problem
However, the existing editing method obtains the object to be edited by controlling the positions of cursors at the two ends, which is time-consuming, laborious and inflexible.
Technical Solution
In view of this, the present invention provides an editing method that acquires the area to be edited in the object to be edited according to a touch track, with simple operation and high flexibility.
The editing method provided by the present invention includes: receiving a touch operation of a user; acquiring a touch track according to the touch operation; acquiring the area to be edited in the object to be edited according to the touch track; and editing the area to be edited.
In one embodiment, the object to be edited is a text box, a text file, or an edit box.
In one embodiment, the step of acquiring the area to be edited in the object to be edited according to the touch track includes: when the touch track is closed, determining the area to be edited according to the objects surrounded and/or covered by the touch track.
In one embodiment, the step of acquiring the area to be edited in the object to be edited according to the touch track includes: when the touch track is not closed, determining the area to be edited according to the objects covered by the touch track, or determining the area to be edited according to the straight line formed by connecting the two ends of the touch track.
In one embodiment, determining the area to be edited according to the straight line formed by connecting the two ends of the touch track includes: generating, according to the straight line formed by connecting the two ends of the touch track, a closed figure associated with the straight line, the figure associated with the straight line including at least one of a rectangular frame with the straight line as its diagonal, a circle with the straight line as its diameter, and a rectangular frame in which the straight line connects the midpoints of its long sides; and determining the area to be edited according to the objects surrounded and/or covered by the closed figure associated with the straight line.
In one embodiment, before the step of editing the area to be edited, the method includes: marking the area to be edited in the object to be edited.
In one embodiment, before the step of editing the area to be edited, the method includes: moving the area to be edited according to a drag instruction; and/or expanding the area to be edited according to an add instruction; and/or shrinking the area to be edited according to a delete instruction.
In one embodiment, the step of acquiring the touch track according to the touch operation includes: when a trigger instruction to start acquiring is received, starting to acquire and store the coordinates of touch points according to the touch operation; when a trigger instruction to stop acquiring is received, stopping acquiring the coordinates of the touch points; and acquiring the touch track according to the stored coordinates of the touch points.
The present invention also provides a computer-readable storage medium in which a computer program is stored; when the computer program is executed by a processor, the steps of the above editing method are implemented.
The present invention also provides a terminal, which includes the above computer-readable storage medium.
Beneficial Effects
After acquiring the touch track according to the touch operation, the editing method, computer-readable storage medium and terminal of the present application acquire the area to be edited in the object to be edited according to the touch track; the operation is simple and the flexibility is high.
Brief Description of the Drawings
Fig. 1 is a schematic flowchart of the editing method of the first embodiment of the present invention;
Fig. 2 is a schematic flowchart of the editing method of the second embodiment of the present invention;
Fig. 3 is a schematic diagram of the effect of an area to be edited obtained by applying the editing method shown in Fig. 2;
Fig. 4 is a schematic flowchart of the editing method of the third embodiment of the present invention;
Fig. 5 is a schematic diagram of the effect of an area to be edited obtained by applying the editing method shown in Fig. 4;
Fig. 6 is a schematic structural diagram of the computer-readable storage medium of the fourth embodiment of the present invention;
Fig. 7 is a schematic structural diagram of the terminal of the fifth embodiment of the present invention.
Embodiments of the Present Invention
The implementation of the present application is illustrated below by specific embodiments; those familiar with this technology can easily understand the other advantages and effects of the present application from the contents disclosed in this specification.
In the following description, reference is made to the accompanying drawings, which describe several embodiments of the present application. It should be understood that other embodiments may also be used, and mechanical, structural, electrical and operational changes may be made without departing from the spirit and scope of the present application. The following detailed description should not be regarded as limiting, and the scope of the embodiments of the present application is limited only by the claims of the granted patent. The terms used here are only for describing specific embodiments and are not intended to limit the present application. Spatially related terms, such as "upper", "lower", "left", "right", "below", "beneath", "lower part", "above" and "upper part", may be used in the text to facilitate explaining the relationship between one element or feature shown in the figures and another element or feature.
Although in some instances the terms first, second, etc. are used herein to describe various elements, these elements should not be limited by these terms. These terms are only used to distinguish one element from another.
Furthermore, as used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context indicates otherwise. It should be further understood that the terms "comprise" and "include" indicate the presence of the stated features, steps, operations, elements, components, items, categories and/or groups, but do not exclude the presence, occurrence or addition of one or more other features, steps, operations, elements, components, items, categories and/or groups. The terms "or" and "and/or" used here are interpreted as inclusive, meaning any one or any combination. Therefore, "A, B or C" or "A, B and/or C" means "any of the following: A; B; C; A and B; A and C; B and C; A, B and C". An exception to this definition occurs only when the combination of elements, functions, steps or operations is inherently mutually exclusive in some way.
Fig. 1 shows a flowchart of the editing method of the first embodiment of the present invention. As shown in Fig. 1, the editing method can be applied to, but is not limited to, a terminal or a server; the terminal can be an electronic device with an editing function, such as a mobile phone, a computer or a television. The method includes the following steps:
Step S11: receiving a touch operation of a user;
The touch operation of the user can be, but is not limited to, a touch operation on the displayed object to be edited.
Specifically, the object to be edited may be an editable object such as a text box, a text file, or an edit box. The content of the object to be edited may specifically include characters such as numbers, letters and text, other editable objects such as images, or any combination thereof.
Step S12: acquiring a touch track according to the touch operation;
In one embodiment, step S12, acquiring the touch track according to the touch operation, includes: when a trigger instruction to start acquiring is received, starting to acquire and store the coordinates of touch points according to the touch operation.
Specifically, it is determined that the trigger instruction to start acquiring is received when multiple long-press events are detected simultaneously. In other embodiments, it may also be determined that the trigger instruction to start acquiring is received when a single long-press event or multiple consecutive short-press events are detected, or when a voice instruction containing keywords such as "start acquiring" is received.
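Purely as an illustration of how such a trigger decision could look in code, the Python sketch below assumes a hypothetical event representation, long-press threshold and keyword; none of these details come from the disclosure itself.

```python
# Illustrative sketch (not from the disclosure): deciding that the trigger
# instruction to "start acquiring" has been received.

LONG_PRESS_MS = 500                 # assumed long-press threshold
START_KEYWORD = "start acquiring"   # assumed voice keyword

def start_trigger_received(touch_events, voice_text=None, min_long_presses=2):
    """touch_events: list of dicts such as {"duration_ms": 620}.

    Returns True when several long-press events are present (simultaneity is
    not checked in this simplified sketch) or when the voice text contains
    the start keyword."""
    long_presses = [e for e in touch_events if e["duration_ms"] >= LONG_PRESS_MS]
    if len(long_presses) >= min_long_presses:
        return True
    if voice_text and START_KEYWORD in voice_text.lower():
        return True
    return False
```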
In one embodiment, after starting to acquire and store the coordinates of the touch points according to the touch information when the trigger instruction to start acquiring is received, the method may include, but is not limited to: marking the stored touch points in real time according to the coordinates of the touch points, and displaying in real time the object to be edited with the touch points marked.
In one embodiment, step S12, acquiring the touch track according to the touch operation, may also include, but is not limited to: displaying in real time the object to be edited with the touch track marked. Specifically, for example, a solid line may be used to mark the touch track in real time.
Step S13: acquiring the area to be edited in the object to be edited according to the touch track;
Specifically, in one embodiment, before step S13, acquiring the area to be edited in the object to be edited according to the touch track, the method includes: determining that a trigger instruction to stop acquiring is received when no touch information is detected within a preset time period; when the trigger instruction to stop acquiring is received, stopping acquiring the coordinates of the touch points and acquiring the area to be edited according to the touch track. In other embodiments, the area to be edited in the object to be edited may also be acquired in real time according to the touch track.
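The following sketch illustrates one possible reading of steps S12-S13 as just described: touch-point coordinates are stored after the start trigger, and the absence of touch information for a preset period is treated as the stop trigger. The class name, timings and method signatures are assumptions made for illustration, not part of the patent.

```python
import time

class TouchTrackRecorder:
    """Illustrative sketch: store touch-point coordinates between the start
    trigger and the stop trigger; a pause longer than idle_timeout seconds with
    no touch information plays the role of the stop trigger."""

    def __init__(self, idle_timeout=0.8):     # assumed preset time period
        self.idle_timeout = idle_timeout
        self.points = []                      # stored (x, y) touch-point coordinates
        self.recording = False
        self._last_touch = None

    def start(self):
        """Called when the trigger instruction to start acquiring is received."""
        self.points.clear()
        self.recording = True
        self._last_touch = time.monotonic()

    def on_touch_point(self, x, y):
        """Called for every reported touch point while recording."""
        if self.recording:
            self.points.append((x, y))
            self._last_touch = time.monotonic()

    def poll(self):
        """Return the finished touch track once no touch information has been
        seen for idle_timeout seconds; return None while still recording."""
        if self.recording and time.monotonic() - self._last_touch > self.idle_timeout:
            self.recording = False
            return list(self.points)
        return None
```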
In one embodiment, after step S13, acquiring the area to be edited in the object to be edited according to the touch track, the method may include, but is not limited to: marking the area to be edited. Specifically, for example, a dashed frame may be used to mark the area to be edited in real time.
Step S14: editing the area to be edited.
Specifically, in one embodiment, step S14, editing the area to be edited, may include, but is not limited to: editing the characters or pictures in the area to be edited, for example re-typesetting them, deleting a character or picture in the area to be edited, or inserting a character or picture into the area to be edited; it may also include copying or pasting the area to be edited as a whole.
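As a purely illustrative model of the editing operations in step S14, the sketch below treats the area to be edited as a (start, end) index range in a plain string; the real object to be edited may of course also contain pictures and richer layout.

```python
# Illustrative sketch only: the area to be edited is modelled as a character
# index range (start, end) in a plain string.

def delete_region(text, region):
    """Delete the selected span, e.g. removing a character from the area."""
    start, end = region
    return text[:start] + text[end:]

def insert_into_region(text, region, fragment):
    """Insert a character (or a picture placeholder) at the start of the area."""
    start, _ = region
    return text[:start] + fragment + text[start:]

def copy_region(text, region):
    """Copy the area to be edited as a whole, for a later paste."""
    start, end = region
    return text[start:end]
```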
The editing method of this embodiment acquires the area to be edited in the object to be edited according to the touch track; the operation is simple and the flexibility is high. In addition, the editing method of this embodiment can display in real time the object to be edited with the touch points marked, which helps the user select the target objects more accurately.
Fig. 2 shows a schematic flowchart of the editing method of the second embodiment of the present invention. As shown in Fig. 2, the editing method of this embodiment includes the following steps:
Step S21: receiving a touch operation of a user;
Step S22: acquiring a touch track according to the touch operation;
Step S23: when the touch track is closed, determining the area to be edited according to the objects surrounded and/or covered by the touch track;
Specifically, in this embodiment, step S23, determining the area to be edited according to the objects surrounded and/or covered by the touch track when the touch track is closed, includes: when the touch track covers a second object, expanding the touch track outward to the state in which it surrounds the second object while enclosing the fewest objects; and deforming the expanded touch track according to the layout rule of the object to be edited. The second object may be, for example, a character. Of course, those skilled in the art can understand that the deformation process may also be performed before the outward expansion process, or simultaneously with it. Fig. 3 shows the effect of an area to be edited obtained by applying the editing method shown in Fig. 2. As shown in Fig. 3, because the objects to be edited are laid out in a matrix, the boundary of the area to be edited is also composed of horizontal and vertical lines; however, the present invention is not limited to this, and the boundary of the area to be edited may also be arc-shaped, wave-shaped and so on.
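One way to picture the selection logic for a closed touch track is sketched below: characters laid out in a matrix are taken when their centre lies inside the track (surrounded) or when a track point falls inside their bounding box (covered); taking covered characters whole stands in for the outward expansion to whole objects described above. The data layout and the simplified "covered" test are assumptions made for illustration.

```python
def point_in_polygon(pt, poly):
    """Ray-casting test; poly is the closed touch track as (x, y) vertices."""
    x, y = pt
    inside = False
    n = len(poly)
    for i in range(n):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % n]
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x_cross > x:
                inside = not inside
    return inside

def select_with_closed_track(track, char_boxes):
    """char_boxes: list of (row, col, (x0, y0, x1, y1)) for characters laid out
    in a matrix. Returns the set of (row, col) cells forming the area to be
    edited; because whole cells are selected, the resulting boundary follows
    horizontal and vertical lines, as in Fig. 3."""
    selected = set()
    for row, col, (x0, y0, x1, y1) in char_boxes:
        cx, cy = (x0 + x1) / 2, (y0 + y1) / 2
        surrounded = point_in_polygon((cx, cy), track)
        covered = any(x0 <= px <= x1 and y0 <= py <= y1 for px, py in track)
        if surrounded or covered:
            selected.add((row, col))
    return selected
```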
In other embodiments, step S23, determining the area to be edited according to the objects surrounded and/or covered by the touch track when the touch track is closed, includes: acquiring a first object surrounded by the touch track and a second object covered by the touch track; and determining the area of the object to be edited that contains the first object and the second object as the area to be edited. Alternatively, only the area of the object to be edited that contains the first object may be determined as the area to be edited, or the second object covered by the touch track may be extracted to generate the area to be edited. The typesetting of the second object in the area to be edited may be the same as or different from the typesetting of the second object in the object to be edited. The first object and/or the second object may be characters, pictures or other types of objects.
Step S24: marking the area to be edited in the object to be edited;
Specifically, as shown in Fig. 3, a dashed line can be used to mark the boundary of the area to be edited, but the present invention is not limited to this.
Step S25: editing the area to be edited.
In one embodiment, before step S25, editing the area to be edited, the method may also include, but is not limited to: moving the area to be edited according to a drag instruction; and/or expanding the area to be edited according to an add instruction; and/or shrinking the area to be edited according to a delete instruction. Specifically, for example, when an add instruction is received, two areas to be edited are merged to expand the area to be edited; when a delete instruction is received, a sub-area in the area to be edited is deleted to shrink the area to be edited.
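If the area to be edited is modelled as a set of (row, col) character cells, as in the earlier sketch for step S23, the add and delete instructions reduce to simple set operations; the following lines are only an illustration of that idea.

```python
def expand_area(area_a, area_b):
    """Add instruction: merge two areas to be edited into one larger area."""
    return area_a | area_b

def shrink_area(area, sub_area):
    """Delete instruction: remove a sub-area from the area to be edited."""
    return area - sub_area

# Example: merging the selections {(0, 1), (0, 2)} and {(1, 0)} yields an
# expanded area covering all three cells.
```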
The editing method of this embodiment can acquire the area to be edited in the object to be edited according to a closed touch track, without selecting target objects (for example, characters) one by one; the operation is simple and the flexibility is high.
Fig. 4 shows a schematic flowchart of the editing method of the third embodiment of the present invention. As shown in Fig. 4, the editing method of this embodiment includes the following steps:
Step S41: displaying the object to be edited;
Step S42: receiving a touch operation of a user;
Step S43: acquiring a touch track according to the touch operation;
In one embodiment, after step S43, acquiring the touch track according to the touch operation, the method may also include, but is not limited to: marking the touch track in real time (please refer to Fig. 5, in which a solid line is used to mark the touch track, but the present invention is not limited to this).
Step S44: when the touch track is not closed, determining the area to be edited according to the straight line formed by connecting the two ends of the touch track;
Specifically, step S44, determining the area to be edited according to the straight line formed by connecting the two ends of the touch track when the touch track is not closed, may include, but is not limited to: generating a figure associated with the straight line according to the straight line formed by connecting the two ends of the touch track; and determining the area to be edited according to the objects surrounded and/or covered by the figure associated with the straight line. The figure associated with the straight line includes at least one of a rectangular frame with the straight line as its diagonal, a circle with the straight line as its diameter, a rectangular frame in which the straight line connects the midpoints of its long sides, and a rectangular frame with the straight line as its long side. Those skilled in the art can understand that the figure associated with the straight line may also be another closed figure set by the system, such as a rhombus, a non-closed figure, such as a semicircle with the straight line as its diameter, or another closed figure generated according to the layout rule of the objects in the object to be edited. In other embodiments, when the distance between the two ends of the touch track is less than a threshold, the touch track may be treated as a closed figure to determine the area to be edited; in that case the method of determining the area to be edited can refer to the related description of the second embodiment, which will not be repeated here.
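The sketch below illustrates, under the same assumed data layout as before, how a non-closed track could be handled: if the gap between the two ends is below a threshold the track is simply treated as closed; otherwise a closed figure is built from the straight line between the two ends, here a rectangle with that line as its diagonal or a circle with it as its diameter, and the characters falling inside the figure form the area to be edited. The threshold value and helper names are assumptions.

```python
import math

CLOSE_THRESHOLD = 20.0   # assumed gap below which the track counts as closed

def is_effectively_closed(track):
    (x0, y0), (x1, y1) = track[0], track[-1]
    return math.hypot(x1 - x0, y1 - y0) < CLOSE_THRESHOLD

def rect_from_diagonal(p, q):
    """Closed figure option: rectangle whose diagonal is the end-to-end line."""
    (x0, y0), (x1, y1) = p, q
    return min(x0, x1), min(y0, y1), max(x0, x1), max(y0, y1)

def circle_from_diameter(p, q):
    """Closed figure option: circle whose diameter is the end-to-end line."""
    (x0, y0), (x1, y1) = p, q
    return (x0 + x1) / 2, (y0 + y1) / 2, math.hypot(x1 - x0, y1 - y0) / 2

def select_with_open_track(track, char_boxes):
    """Select the characters whose centres fall inside the rectangle built from
    the two ends of a non-closed touch track."""
    rx0, ry0, rx1, ry1 = rect_from_diagonal(track[0], track[-1])
    selected = set()
    for row, col, (x0, y0, x1, y1) in char_boxes:
        cx, cy = (x0 + x1) / 2, (y0 + y1) / 2
        if rx0 <= cx <= rx1 and ry0 <= cy <= ry1:
            selected.add((row, col))
    return selected
```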
Fig. 5 shows the effect of an area to be edited obtained by applying the editing method shown in Fig. 4. Determining the area to be edited according to the objects surrounded and/or covered by the figure associated with the straight line may include, but is not limited to: displaying the object to be edited marked with the figure associated with the straight line (a rectangle is shown in Fig. 5), the content of which includes multiple characters; determining the area to be edited as the area containing the characters surrounded by the figure associated with the straight line and the characters covered by it; and marking the area to be edited in the object to be edited (a dashed line is used in Fig. 5).
In other embodiments, when the touch track is not closed, the area to be edited may also be determined according to the objects, such as characters, covered by the touch track; that is, the objects, such as characters, covered by the touch track are extracted to generate the area to be edited.
Step S45: editing the area to be edited.
The editing method of this embodiment can acquire the area to be edited in the object to be edited according to a non-closed touch track, without selecting target objects (for example, characters) one by one, and can acquire the corresponding characters within a short touch operation time; the operation is simple and the flexibility is high.
Fig. 6 shows a schematic structural diagram of the computer-readable storage medium of the fourth embodiment of the present invention. A computer program 60 is stored in the computer-readable storage medium shown in Fig. 6; when the computer program is executed by a processor, the steps of the above editing method are implemented. The above computer-readable storage medium is, for example, a non-volatile memory such as an optical disc, a hard disk, or a flash memory.
Fig. 7 shows a schematic structural diagram of the terminal of the fifth embodiment of the present invention. As shown in Fig. 7, the terminal 10 of this embodiment includes a camera module 102, a memory 104, a storage controller 106, one or more processors 108 (only one is shown in the figure), a screen 110 and a peripheral interface 112. These components communicate with one another through one or more communication buses/signal lines.
It can be understood that the structure shown in Fig. 7 is only illustrative, and the terminal 10 may also include more or fewer components than those shown in Fig. 7, or have a configuration different from that shown in Fig. 7. The components shown in Fig. 7 can be implemented in hardware, software, or a combination thereof.
The camera module 102 is used to take photos or videos. The photos or videos taken can be stored in the memory 104. The memory 104 can be used to store software programs and modules, such as the program instructions/modules corresponding to the file editing method and apparatus in the embodiments of the present invention; the processor 108 executes various functional applications and data processing by running the software programs and modules stored in the memory 104, thereby implementing the above editing method. The memory 104 may include a high-speed random access memory, and may also include a non-volatile memory, such as one or more magnetic storage devices, flash memory, or other non-volatile solid-state memory. In some instances, the memory 104 may further include memory arranged remotely with respect to the processor 108, and such remote memory may be connected to the terminal 10 through a network. Examples of the above network include, but are not limited to, the Internet, corporate intranets, local area networks, mobile communication networks, and combinations thereof. Access to the memory 104 by the processor 108 and other possible components is performed under the control of the storage controller 106.
In some embodiments, the peripheral interface 112 couples various input/output devices to the processor 108 and the memory 104. The processor 108, the storage controller 106 and the peripheral interface 112 may be implemented in a single chip. In some other instances, they may each be implemented by an independent chip.
The screen 110 provides an output interface between the terminal 10 and the user. Specifically, the screen 110 displays video output to the user, and the content of the video output may include characters, graphics, video, and any combination thereof.
Industrial Applicability
When the trigger instruction to start acquiring is received, the editing method, computer-readable storage medium and terminal of the present invention start to acquire and store the coordinates of touch points according to the touch information, and after acquiring the touch track according to the stored coordinates of the touch points, acquire the area to be edited in the object to be edited according to the touch track; the operation is simple and the flexibility is high.

Claims (10)

  1. An editing method, characterized in that the editing method comprises:
    receiving a touch operation of a user;
    acquiring a touch track according to the touch operation;
    acquiring an area to be edited in an object to be edited according to the touch track;
    editing the area to be edited.
  2. The editing method according to claim 1, characterized in that the object to be edited is a text box, a text file, or an edit box.
  3. The editing method according to claim 1, characterized in that the step of acquiring the area to be edited in the object to be edited according to the touch track comprises:
    when the touch track is closed, determining the area to be edited according to the objects surrounded and/or covered by the touch track.
  4. The editing method according to claim 1, characterized in that the step of acquiring the area to be edited in the object to be edited according to the touch track comprises:
    when the touch track is not closed, determining the area to be edited according to the objects covered by the touch track, or determining the area to be edited according to the straight line formed by connecting the two ends of the touch track.
  5. The editing method according to claim 4, characterized in that the step of determining the area to be edited according to the straight line formed by connecting the two ends of the touch track comprises:
    generating, according to the straight line formed by connecting the two ends of the touch track, a closed figure associated with the straight line, the figure associated with the straight line comprising at least one of a rectangular frame with the straight line as its diagonal, a circle with the straight line as its diameter, and a rectangular frame in which the straight line connects the midpoints of its long sides;
    determining the area to be edited according to the objects surrounded and/or covered by the closed figure associated with the straight line.
  6. The editing method according to claim 1, characterized in that, before the step of editing the area to be edited, the method further comprises:
    marking the area to be edited in the object to be edited.
  7. The editing method according to claim 6, characterized in that, before the step of editing the area to be edited, the method further comprises:
    moving the area to be edited according to a drag instruction; and/or
    expanding the area to be edited according to an add instruction; and/or
    shrinking the area to be edited according to a delete instruction.
  8. The editing method according to claim 1, characterized in that the step of acquiring the touch track according to the touch operation comprises:
    when a trigger instruction to start acquiring is received, starting to acquire and store the coordinates of touch points according to the touch operation;
    when a trigger instruction to stop acquiring is received, stopping acquiring the coordinates of the touch points;
    acquiring the touch track according to the stored coordinates of the touch points.
  9. A computer-readable storage medium, characterized in that a computer program is stored in the computer-readable storage medium, and when the computer program is executed by a processor, the steps of the editing method according to any one of claims 1 to 8 are implemented.
  10. A terminal, characterized in that the terminal comprises the computer-readable storage medium according to claim 9.
PCT/CN2019/106310 2019-05-05 2019-09-18 Editing method, computer-readable storage medium and terminal WO2020224158A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910369076.4A CN110222322A (zh) 2019-05-05 2019-05-05 Editing method, computer-readable storage medium and terminal
CN201910369076.4 2019-05-05

Publications (1)

Publication Number Publication Date
WO2020224158A1 true WO2020224158A1 (zh) 2020-11-12

Family

ID=67820510

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/106310 WO2020224158A1 (zh) 2019-05-05 2019-09-18 编辑方法、计算机可读存储介质及终端

Country Status (2)

Country Link
CN (1) CN110222322A (zh)
WO (1) WO2020224158A1 (zh)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110222322A (zh) * 2019-05-05 2019-09-10 深圳传音控股股份有限公司 编辑方法、计算机可读存储介质及终端

Citations (7)

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102609404A (zh) * 2012-02-08 2012-07-25 刘津立 一种利用两点触控技术实现的文档编辑方法
US20130205201A1 (en) * 2012-02-08 2013-08-08 Phihong Technology Co.,Ltd. Touch Control Presentation System and the Method thereof
CN103116461A (zh) * 2013-01-25 2013-05-22 中兴通讯股份有限公司 一种基于触摸屏的字符处理方法及终端
CN104007914A (zh) * 2013-02-26 2014-08-27 北京三星通信技术研究有限公司 对输入字符进行操作的方法及装置
CN104423869A (zh) * 2013-09-09 2015-03-18 中兴通讯股份有限公司 文本擦除方法及装置
CN107632773A (zh) * 2017-10-17 2018-01-26 北京百度网讯科技有限公司 用于获取信息的方法及装置
CN110222322A (zh) * 2019-05-05 2019-09-10 深圳传音控股股份有限公司 编辑方法、计算机可读存储介质及终端

Also Published As

Publication number Publication date
CN110222322A (zh) 2019-09-10

Similar Documents

Publication Publication Date Title
WO2020103218A1 (zh) Live stream processing method in WebRTC and stream-pushing client
CN110100251B (zh) Device, method, and computer-readable storage medium for processing documents
WO2017024964A1 (zh) Method and device for quickly previewing pictures associated with an item
CN109121000A (zh) Video processing method and client
WO2021147461A1 (zh) Method and apparatus for displaying subtitle information, electronic device, and computer-readable medium
WO2020024580A1 (zh) Graphics drawing method and apparatus, device, and storage medium
WO2021254502A1 (zh) Target object display method and apparatus, and electronic device
WO2022022689A1 (zh) Interaction method and apparatus, and electronic device
KR20230049691A (ko) Video processing method, terminal, and storage medium
US20240119082A1 (en) Method, apparatus, device, readable storage medium and product for media content processing
US11190653B2 (en) Techniques for capturing an image within the context of a document
CN113076048A (zh) Video display method and apparatus, electronic device, and storage medium
CN103870197A (zh) Method and device for processing pictures
WO2017101390A1 (zh) Picture display method and device
CN114339363B (zh) Picture switching processing method and apparatus, computer device, and storage medium
CN107861711B (zh) Page adaptation method and device
CN107390986A (zh) Image cropping control method for a mobile terminal, storage device, and mobile terminal
WO2022134390A1 (zh) Annotation method and apparatus, electronic device, and storage medium
CN107071574A (zh) Smart TV page jump method
WO2020224158A1 (zh) Editing method, computer-readable storage medium and terminal
WO2023174297A1 (zh) Screen control method and apparatus, and electronic device
CN114125297B (зh) Video shooting method and apparatus, electronic device, and storage medium
JP6917149B2 (ja) Computer program stored in a storage medium for executing a content providing method, and the method and apparatus therefor
CN112740161A (зh) Terminal, method for controlling the terminal, and recording medium in which a program for implementing the method is recorded
CN112202958B (зh) Screenshot method and apparatus, and electronic device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19928082

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19928082

Country of ref document: EP

Kind code of ref document: A1